Mirror of https://github.com/fail2ban/fail2ban
Merge branch '0.10' into 0.10-full
commit 99634638ba

ChangeLog (135 lines changed)

@@ -13,6 +13,15 @@ TODO: implementing of options resp. other tasks from PR #1346

### Fixes
* [Grave] memory leaks fixed (gh-1277, gh-1234)
* [Grave] Misleading date patterns defined more precisely (using extended syntax
  `%Ex[mdHMS]` for an exact two-digit match, or e.g. `%ExY` as a more precise year
  pattern, within the same century of the last year and the next 3 years)
* [Grave] extends the date detector template with a distance (position of the match in
  the log line), to prevent grave collisions when using a (re)ordered template list (e.g.
  find-spot of a wrong date match inside foreign input, misleading date patterns
  caused by ambiguous formats, etc.)
* Distance collision check always prefers the template with the shortest distance
  (left to right) if the date pattern is not anchored
* Tricky bug fix: the last position of a log file would never be retrieved (gh-795),
  because with CASCADE all log entries were deleted from the logs table together with the jail
  when an "INSERT OR REPLACE" statement was used

@@ -31,6 +40,22 @@ TODO: implementing of options resp. other tasks from PR #1346

* Pyinotify-backend: stability fix for sporadic errors in a multi-threaded
  environment (without lock)
* Fixed sporadic error in testCymruInfoNxdomain, caused by unsorted values
* Misleading errors logged from ignorecommand in the success case on retcode 1 (gh-1194)
* fail2ban.service - systemd service updated (gh-1618):
  - starts the service in normal mode (without forking)
  - does not restart if the service exited normally (exit-code 0, e.g. stopped via fail2ban-client)
  - does not restart if the service cannot start (exit-code 255, e.g. wrong configuration, etc.)
  - service can additionally be started/stopped with commands (fail2ban-client, fail2ban-server)
  - automatically creates the `/var/run/fail2ban` directory before starting fail2ban
    (systems with a virtual resp. memory-based FS for `/var/run`), see gh-1531
  - if fail2ban runs as a systemd service, `logtarget` can be set to STDOUT in order to
    log to the systemd journal (see the sketch after this list)
  - the value of `logtarget` for system targets is also allowed in lowercase (stdout, stderr, syslog, etc.)
* Fixed UTC/GMT named time zone, using `%Z` and `%z` patterns
  (special case with 0 zone offset, see gh-1575)
* `filter.d/freeswitch.conf`
  - Optional prefixes (server, daemon, dual time) if systemd daemon logs are used (gh-1548)
  - User part rewritten to accept IPv6 resp. a domain after "@" (gh-1548)
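
A minimal sketch of the journal-friendly logging setup mentioned in the fail2ban.service item above (assuming the stock `/etc/fail2ban/fail2ban.conf` layout):

```ini
# fail2ban.conf (sketch): when running under systemd, log to stdout so that
# messages end up in the journal (view them with `journalctl -u fail2ban`)
[Definition]
logtarget = STDOUT
```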

### New Features
* IPv6 support:

@@ -60,6 +85,8 @@ TODO: implementing of options resp. other tasks from PR #1346

    banned in this jail, if option `--unban` is specified
  - `unban --all` - unbans all IP addresses (in all jails and the database)
  - `unban <IP> ... <IP>` - unbans \<IP\> (in all jails and the database) (see gh-1388)
  - introduced new option `-t` or `--test` to test the configuration resp. start the server only
    if the configuration is clean (fails on wrongly configured jails if option `-t` is specified);
    see the example after this list
* New command action parameter `actionrepair` - command executed in order to restore a
  sane environment in the error case of `actioncheck` (sketch after this list).
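
Illustrative invocations of the new client commands described above (a sketch; the addresses are documentation examples):

```sh
# remove all bans in all jails and in the database
fail2ban-client unban --all
# unban specific addresses in all jails and in the database
fail2ban-client unban 192.0.2.10 2001:db8::1
# only start the server if the configuration is clean
fail2ban-client -t start
```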
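
A hypothetical sketch of how `actionrepair` could be wired into an iptables-style action (the chain name and commands are illustrative, not taken from a stock action file):

```ini
# if actioncheck fails (e.g. the chain disappeared), fail2ban can run actionrepair
# to restore a sane environment instead of treating every ban as an error
actioncheck  = iptables -n -L f2b-<name> >/dev/null
actionrepair = iptables -N f2b-<name>
```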

@@ -124,6 +151,51 @@ fail2ban-client set loglevel INFO

  - new replacement `<ADDR>`, in opposition to `<HOST>`, for separate
    usage of the 2 address groups only (regardless of `usedns`), `ip4` and `ip6`
    together, without host (dns); see the example after this section
* Misconfigured jails don't prevent fail2ban from starting; the server starts
  nevertheless, as long as at least one jail was successfully configured (gh-1619).
  A message about the wrong jail configuration is logged in the client log (stdout, systemd
  journal etc.) and in the server log with error level
* More precise date template handling (WARNING: theoretically possible incompatibilities):
  - datedetector rewritten to be stricter than before;
  - default templates can be specified more exactly using prefix/suffix syntax (via `datepattern`);
  - more than one date pattern can now be specified using option `datepattern`
    (new-line separated);
  - some default options like `datepattern` can be specified directly in
    section `[Definition]`, which avoids the otherwise unnecessary `[Init]`
    section, for performance reasons (each extra section costs time);
  - option `datepattern` can also be specified in a jail (e.g. jails without filters
    or with a custom log format, new-line separated for multiple patterns); see the
    sketch after this section;
  - if a first unnamed group is specified in the pattern, only this will be cut out from
    the searched log line (e.g. `^date:[({DATE})]` will cut out only the datetime match
    pattern and leaves `date:[] ...` for searching in the filter);
  - faster matching and less searching of appropriate templates
    (DateDetector.matchTime now calls DateTemplate.matchDate less often);
  - several standard filters extended with exact prefixed or anchored date templates;
* Added the possibility to recognize the restored state of tickets (see gh-1669).
  New option `norestored` introduced, to ignore restored tickets (after restart).
  To avoid execution of ban/unban for restored tickets, `norestored = true`
  can be added in the definition section of an action.
  For conditional usage in shell-based actions the interpolation `<restored>`
  can also be used. E.g. it is enough to add the following script piece at the beginning
  of `actionban` (or `actionunban`) to prevent execution (see the sketch after this section):
  `if [ '<restored>' = '1' ]; then exit 0; fi;`
  Several actions are now extended with the `norestored` option:
  - complain.conf
  - dshield.conf
  - mail-buffered.conf
  - mail-whois-lines.conf
  - mail-whois.conf
  - mail.conf
  - sendmail-buffered.conf
  - sendmail-geoip-lines.conf
  - sendmail-whois-ipjailmatches.conf
  - sendmail-whois-ipmatches.conf
  - sendmail-whois-lines.conf
  - sendmail-whois-matches.conf
  - sendmail-whois.conf
  - sendmail.conf
  - smtp.py
  - xarf-login-attack.conf
* fail2ban-testcases:
  - `assertLogged` extended with parameter wait (to wait up to a specified timeout
    before we throw the assert exception) + test cases rewritten using that
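
To illustrate the `<ADDR>` replacement above (a hypothetical failregex, not taken from a stock filter): it matches IPv4/IPv6 literals only, where `<HOST>` would also accept a hostname.

```ini
# hypothetical filter line: accept only literal addresses, never DNS names
failregex = ^myapp\[\d+\]: login failed for .+ from <ADDR>$
```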
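
The jail-level `datepattern` and the cut-out behaviour of a first unnamed group can be sketched as follows (the jail name and log format are illustrative; the `^date:[({DATE})]` pattern itself is the one cited above):

```ini
# hypothetical jail with a custom log format and two date patterns (newline-separated);
# only the ({DATE}) group is removed from the line before failregex is applied
[myapp]
enabled     = true
logpath     = /var/log/myapp.log
datepattern = ^date:\[({DATE})\]
              {^LN-BEG}
```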
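
A minimal sketch of an action definition that skips restored tickets, using both mechanisms described above (the declarative `norestored` flag, or the `<restored>` guard inside a shell-based command; the notify command is hypothetical):

```ini
[Definition]
# declarative variant: never execute ban/unban for restored tickets
norestored = 1
# shell variant: bail out early inside the command itself
actionban = if [ '<restored>' = '1' ]; then exit 0; fi;
            /usr/local/bin/notify-ban <ip>
```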

@@ -131,29 +203,69 @@ fail2ban-client set loglevel INFO

  - new `with_foreground_server_thread` decorator to test several client/server commands


ver. 0.9.6 (2016/XX/XX) - wanna-be-released
ver. 0.9.x (2016/??/??) - wanna-be-released
-----------

0.9.x line is no longer heavily developed. If you are interested in
new features (e.g. IPv6 support), please consider the 0.10 branch and its
releases.

### Fixes
* Fixed systemd-journal handling in fail2ban-regex (gh-1657)
* filter.d/sshd.conf
  - Fixed non-anchored part of failregex (misleading match of a colon inside an
    IPv6 address instead of `: ` in the reason part due to a missing space, gh-1658)
    (0.10 resp. IPv6 relevant only, amend for gh-1479)
* config/pathes-freebsd.conf
  - Fixed filenames for apache and nginx log files (gh-1667)
* filter.d/sshd.conf
  - new aggressive rules (gh-864):
    - Connection reset by peer (multi-line rule during authorization process)
    - No supported authentication methods available
  - single-line and multi-line expressions optimized, added optional prefixes
    and suffix (logged from several ssh versions), according to gh-1206;
  - fixed expression "received disconnect ... auth fail" (optional space after the port
    part, gh-1652)
* filter.d/suhosin.conf
  - greedy catch-all before `<HOST>` fixed (potential vulnerability)
* Filter tests extended with a check of all config regexps that contain a greedy catch-all
  before `<HOST>` which is hard-anchored at the end or has a precise sub-expression after `<HOST>`

### New Features
* New Actions:
  - action.d/netscaler: Block IPs on a Citrix Netscaler ADC (gh-1663)

* New Filters:
  - filter.d/domino-smtp: IBM Domino SMTP task (gh-1603)

### Enhancements


ver. 0.9.6 (2016/12/10) - stretch-is-coming
-----------

### Fixes
* Misleading add resp. enable of an (already available) jail in the database, which
  induced a subsequent error: the last position of the log file would never be retrieved (gh-795)
* Fixed a distribution-related bug within testReadStockJailConfForceEnabled
  (e.g. test-case faults on Fedora, see gh-1353)
* Fixed pythonic filters and test scripts (running via wrong python version,
  uses "fail2ban-python" now);
* Fixed test case "testSetupInstallRoot" for a non-default python version (also
  using a direct call, out of virtualenv);
* Fixed ambiguous wrongly recognized date pattern resp. its optional parts (see gh-1512);
* FIPS compliant, use sha1 instead of md5 if md5 is not allowed (see gh-1540)
* Monit config: scripting is not supported in path (gh-1556)
* `filter.d/apache-modsecurity.conf`
  - Fixed for newer version (one space, gh-1626), optimized: non-greedy catch-all
    replaced by a safer match, unneeded catch-all anchoring removed, non-capturing
* `filter.d/asterisk.conf`
  - Fixed to match a different asterisk log prefix (source file: method:)
* `filter.d/dovecot.conf`
  - Fixed failregex ignoring failures because of some irrelevant info (gh-1623)
* `filter.d/ignorecommands/apache-fakegooglebot`
  - Fixed error within apache-fakegooglebot, which would be called
    with the wrong python version (gh-1506)
* `filter.d/assp.conf`
  - Extended failregex and test cases to handle ASSP V1 and V2 (gh-1494)

@@ -161,18 +273,27 @@ releases.

  - Allow for having no trailing space after 'failed:' (gh-1497)
* `filter.d/vsftpd.conf`
  - Optional reason part in message after FAIL LOGIN (gh-1543)

* `filter.d/sendmail-reject.conf`
  - removed mandatory double space (if dns-host available, gh-1579)
* filter.d/sshd.conf
  - recognized "Failed publickey for" (gh-1477);
  - optimized failregex to match all of "Failed any-method for ... from <HOST>" (gh-1479)
  - eliminated possible complex injections (on user-name resp. auth-info, see gh-1479)
  - optional port part after host (see gh-1533, gh-1581)

### New Features
* New Actions:
  - `action.d/npf.conf` for NPF, the latest packet filter for NetBSD
* New Filters:
  - `filter.d/mongodb-auth.conf` for MongoDB (document-oriented NoSQL database engine)
    (gh-1586, gh-1606 and gh-1607)

### Enhancements
* DateTemplate regexp extended with the word-end boundary, in addition to the
  word-start boundary
* Introduces new command "fail2ban-python", an automatically created symlink to the
  python executable with which fail2ban is currently installed (resp. where its modules are located):
  - allows external scripts to use the same version fail2ban is currently running, e.g.
    just by replacing python with fail2ban-python:
    ```diff
    -#!/usr/bin/env python
    ```

FILTERS (2 lines changed)

@@ -227,7 +227,7 @@ Regular expressions (failregex, ignoreregex) assume that the date/time has been

removed from the log line (this is just how fail2ban works internally ATM).

If the format is like '<date...> error 1.2.3.4 is evil' then you need to match
the < at the start so regex should be similar to '^<> <HOST> is evil$' using
the <> at the start so regex should be similar to '^<> error <HOST> is evil$' using
<HOST> where the IP/domain name appears in the log line.
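
Wrapped into a minimal filter definition, the example regex above would look like this (illustrative only):

```ini
[Definition]
failregex   = ^<> error <HOST> is evil$
ignoreregex =
```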

The following general rules apply to regular expressions:

MANIFEST (6 lines changed)

@@ -41,6 +41,7 @@ config/action.d/mynetwatchman.conf
config/action.d/nftables-allports.conf
config/action.d/nftables-common.conf
config/action.d/nftables-multiport.conf
config/action.d/npf.conf
config/action.d/nsupdate.conf
config/action.d/osx-afctl.conf
config/action.d/osx-ipfw.conf

@@ -100,6 +101,7 @@ config/filter.d/horde.conf
config/filter.d/ignorecommands/apache-fakegooglebot
config/filter.d/kerio.conf
config/filter.d/lighttpd-auth.conf
config/filter.d/mongodb-auth.conf
config/filter.d/monit.conf
config/filter.d/murmur.conf
config/filter.d/mysqld-auth.conf

@@ -154,6 +156,7 @@ config/paths-opensuse.conf
config/paths-osx.conf
CONTRIBUTING.md
COPYING
.coveragerc
DEVELOP
fail2ban-2to3
fail2ban/client/actionreader.py

@@ -214,6 +217,7 @@ fail2ban/tests/clientbeautifiertestcase.py
fail2ban/tests/clientreadertestcase.py
fail2ban/tests/config/action.d/brokenaction.conf
fail2ban/tests/config/fail2ban.conf
fail2ban/tests/config/filter.d/common.conf
fail2ban/tests/config/filter.d/simple.conf
fail2ban/tests/config/filter.d/test.conf
fail2ban/tests/config/filter.d/test.local

@@ -289,6 +293,7 @@ fail2ban/tests/files/logs/haproxy-http-auth
fail2ban/tests/files/logs/horde
fail2ban/tests/files/logs/kerio
fail2ban/tests/files/logs/lighttpd-auth
fail2ban/tests/files/logs/mongodb-auth
fail2ban/tests/files/logs/monit
fail2ban/tests/files/logs/murmur
fail2ban/tests/files/logs/mysqld-auth

@@ -389,6 +394,7 @@ man/fail2ban-testcases.1
man/fail2ban-testcases.h2m
man/generate-man
man/jail.conf.5
.pylintrc
README.md
README.Solaris
RELEASE

@@ -17,6 +17,9 @@ Though Fail2Ban is able to reduce the rate of incorrect authentications
attempts, it cannot eliminate the risk that weak authentication presents.
Configure services to use only two factor or public/private authentication
mechanisms if you really want to protect services.

<img src="http://www.worldipv6launch.org/wp-content/themes/ipv6/downloads/World_IPv6_launch_logo.svg" height="52pt"/> | Since v0.10 fail2ban supports the matching of the IPv6 addresses.
------|------

This README is a quick introduction to Fail2ban. More documentation, FAQ, HOWTOs
are available in fail2ban(1) manpage and on the website http://www.fail2ban.org

RELEASE (8 lines changed)

@@ -53,7 +53,7 @@ Preparation

  or an alternative for comparison with previous release

    git diff 0.9.5 | grep -B2 'index 0000000..' | grep -B1 'new file mode' | sed -n -e '/^diff /s,.* b/,,gp' >> MANIFEST
    git diff 0.10.0 | grep -B2 'index 0000000..' | grep -B1 'new file mode' | sed -n -e '/^diff /s,.* b/,,gp' >> MANIFEST
    sort MANIFEST | uniq | sponge MANIFEST

* Run::

@@ -70,7 +70,7 @@ Preparation

* clean up current directory::

    diff -rul --exclude \*.pyc . /tmp/fail2ban-0.9.5/
    diff -rul --exclude \*.pyc . /tmp/fail2ban-0.10.0/

* Only differences should be files that you don't want distributed.

@@ -83,7 +83,7 @@ Preparation

* To generate a list of committers use e.g.::

    git shortlog -sn 0.9.5.. | sed -e 's,^[ 0-9\t]*,,g' | tr '\n' '\|' | sed -e 's:|:, :g'
    git shortlog -sn 0.10.0.. | sed -e 's,^[ 0-9\t]*,,g' | tr '\n' '\|' | sed -e 's:|:, :g'

* Ensure the top of the ChangeLog has the right version and current date.
* Ensure the top entry of the ChangeLog has the right version and current date.

@@ -106,7 +106,7 @@ Preparation

* Tag the release by using a signed (and annotated) tag. Cut/paste
  release ChangeLog entry as tag annotation::

    git tag -s 0.9.5
    git tag -s 0.10.0

Pre Release
===========

THANKS (2 lines changed)

@@ -110,6 +110,7 @@ SATO Kentaro
Sean DuBois
Sebastian Arcus
Serg G. Brester (sebres)
Sergey Safarov
Sireyessire
silviogarbes
Stefan Tatschner

@@ -120,6 +121,7 @@ Thomas Mayer
Tom Pike
Tom Hendrikx
Tomas Pihl
Thomas Skierlo (phaleas)
Tony Lawrence
Tomasz Ciolek
Tyler

@@ -28,8 +28,15 @@
#

[INCLUDES]

before = helpers-common.conf

[Definition]

# bypass ban/unban for restored tickets
norestored = 1

# Option: actionstart
# Notes.: command executed once at the start of Fail2Ban.
# Values: CMD

@@ -54,10 +61,16 @@ actioncheck =
# Tags: See jail.conf(5) man page
# Values: CMD
#
actionban = oifs=${IFS}; IFS=.;SEP_IP=( <ip> ); set -- ${SEP_IP}; ADDRESSES=$(dig +short -t txt -q $4.$3.$2.$1.abuse-contacts.abusix.org); IFS=${oifs}
            IP=<ip>
actionban = oifs=${IFS};
            IFS=.; SEP_IP=( <ip> ); set -- ${SEP_IP}; ADDRESSES=$(dig +short -t txt -q $4.$3.$2.$1.abuse-contacts.abusix.org);
            IFS=,; ADDRESSES=$(echo $ADDRESSES)
            IFS=${oifs}
            IP=<ip>
            if [ ! -z "$ADDRESSES" ]; then
                (printf %%b "<message>\n"; date '+Note: Local timezone is %%z (%%Z)'; grep -E '(^|[^0-9])<ip>([^0-9]|$)' <logpath>) | <mailcmd> "Abuse from <ip>" <mailargs> ${ADDRESSES//,/\" \"}
                ( printf %%b "<message>\n"; date '+Note: Local timezone is %%z (%%Z)';
                  printf %%b "\nLines containing failures of <ip> (max <grepmax>)\n";
                  %(_grep_logs)s;
                ) | <mailcmd> "Abuse from <ip>" <mailargs> $ADDRESSES
            fi

# Option: actionunban

@@ -92,3 +105,7 @@ mailcmd = mail -s
#
mailargs =

# Number of log lines to include in the email
#
#grepmax = 1000
#grepopts = -m <grepmax>

@@ -28,6 +28,9 @@

[Definition]

# bypass ban/unban for restored tickets
norestored = 1

# Option: actionstart
# Notes.: command executed once at the start of Fail2Ban.
# Values: CMD

@@ -35,7 +35,7 @@ actioncheck =
# service name example:
# firewall-cmd --zone=<zone> --add-rich-rule="rule family='<family>' source address='<ip>' service name='<service>' log prefix='f2b-<name>' level='<level>' limit value='<rate>/m' <rich-blocktype>"
#
# Because rich rules can only handle single or a range of ports we must split ports and execute the command for each port. Ports can be single and ranges seperated by a comma or space for an example: http, https, 22-60, 18 smtp
# Because rich rules can only handle single or a range of ports we must split ports and execute the command for each port. Ports can be single and ranges separated by a comma or space for an example: http, https, 22-60, 18 smtp

actionban = ports="<port>"; for p in $(echo $ports | tr ", " " "); do firewall-cmd --add-rich-rule="rule family='<family>' source address='<ip>' port port='$p' protocol='<protocol>' log prefix='f2b-<name>' level='<level>' limit value='<rate>/m' <rich-blocktype>"; done

@@ -33,7 +33,7 @@ actioncheck =
# service name example:
# firewall-cmd --zone=<zone> --add-rich-rule="rule family='ipv4' source address='<ip>' service name='<service>' <rich-blocktype>"
#
# Because rich rules can only handle single or a range of ports we must split ports and execute the command for each port. Ports can be single and ranges seperated by a comma or space for an example: http, https, 22-60, 18 smtp
# Because rich rules can only handle single or a range of ports we must split ports and execute the command for each port. Ports can be single and ranges separated by a comma or space for an example: http, https, 22-60, 18 smtp

actionban = ports="<port>"; for p in $(echo $ports | tr ", " " "); do firewall-cmd --add-rich-rule="rule family='<family>' source address='<ip>' port port='$p' protocol='<protocol>' <rich-blocktype>"; done

@@ -0,0 +1,16 @@
[DEFAULT]

# Usage:
# _grep_logs_args = 'test'
# (printf %%b "Log-excerpt contains 'test':\n"; %(_grep_logs)s; printf %%b "Log-excerpt contains 'test':\n") | mail ...
#
_grep_logs = logpath="<logpath>"; grep <grepopts> -E %(_grep_logs_args)s $logpath | <greplimit>
_grep_logs_args = '(^|[^0-9])<ip>([^0-9]|$)'

# Used for actions, that should not by executed if ticket was restored:
_bypass_if_restored = if [ '<restored>' = '1' ]; then exit 0; fi;

[Init]
greplimit = tail -n <grepmax>
grepmax = 1000
grepopts = -m <grepmax>

@@ -6,6 +6,9 @@

[Definition]

# bypass ban/unban for restored tickets
norestored = 1

# Option: actionstart
# Notes.: command executed once at the start of Fail2Ban.
# Values: CMD

@@ -7,9 +7,13 @@
[INCLUDES]

before = mail-whois-common.conf
         helpers-common.conf

[Definition]

# bypass ban/unban for restored tickets
norestored = 1

# Option: actionstart
# Notes.: command executed once at the start of Fail2Ban.
# Values: CMD

@@ -17,7 +21,7 @@ before = mail-whois-common.conf
actionstart = printf %%b "Hi,\n
              The jail <name> has been started successfully.\n
              Regards,\n
              Fail2Ban"|mail -s "[Fail2Ban] <name>: started on `uname -n`" <dest>
              Fail2Ban" | <mailcmd> -s "[Fail2Ban] <name>: started on `uname -n`" <dest>

# Option: actionstop
# Notes.: command executed once at the end of Fail2Ban

@@ -26,7 +30,7 @@ actionstart = printf %%b "Hi,\n
actionstop = printf %%b "Hi,\n
             The jail <name> has been stopped.\n
             Regards,\n
             Fail2Ban"|mail -s "[Fail2Ban] <name>: stopped on `uname -n`" <dest>
             Fail2Ban" | <mailcmd> -s "[Fail2Ban] <name>: stopped on `uname -n`" <dest>

# Option: actioncheck
# Notes.: command executed once before each actionban command

@@ -40,15 +44,19 @@ actioncheck =
# Tags: See jail.conf(5) man page
# Values: CMD
#
actionban = printf %%b "Hi,\n
_ban_mail_content = ( printf %%b "Hi,\n
                    The IP <ip> has just been banned by Fail2Ban after
                    <failures> attempts against <name>.\n\n
                    Here is more information about <ip> :\n
                    `%(_whois_command)s`\n\n
                    Lines containing IP:<ip> in <logpath>\n
                    `grep -E <grepopts> '(^|[^0-9])<ip>([^0-9]|$)' <logpath>`\n\n
                    Here is more information about <ip> :\n"
                    %(_whois_command)s;
                    printf %%b "\nLines containing failures of <ip> (max <grepmax>)\n";
                    %(_grep_logs)s;
                    printf %%b "\n
                    Regards,\n
                    Fail2Ban"|mail -s "[Fail2Ban] <name>: banned <ip> from `uname -n`" <dest>
                    Fail2Ban" )

actionban = %(_ban_mail_content)s | <mailcmd> "[Fail2Ban] <name>: banned <ip> from `uname -n`" <dest>

# Option: actionunban
# Notes.: command executed when unbanning an IP. Take care that the

@@ -60,6 +68,12 @@ actionunban =

[Init]

# Option: mailcmd
# Notes.: Your system mail command. Is passed 2 args: subject and recipient
# Values: CMD
#
mailcmd = mail -s

# Default name of the chain
#
name = default

@@ -74,4 +88,5 @@ logpath = /dev/null

# Number of log lines to include in the email
#
grepopts = -m 1000
#grepmax = 1000
#grepopts = -m <grepmax>

@@ -10,6 +10,9 @@ before = mail-whois-common.conf

[Definition]

# bypass ban/unban for restored tickets
norestored = 1

# Option: actionstart
# Notes.: command executed once at the start of Fail2Ban.
# Values: CMD

@@ -6,6 +6,9 @@

[Definition]

# bypass ban/unban for restored tickets
norestored = 1

# Option: actionstart
# Notes.: command executed once at the start of Fail2Ban.
# Values: CMD

@@ -0,0 +1,33 @@
# Fail2ban Citrix Netscaler Action
# by Juliano Jeziorny
# juliano@jeziorny.eu
#
# The script will add offender IPs to a dataset on netscaler, the dataset can then be used to block the IPs at a cs/vserver or global level
# This dataset is then used to block IPs using responder policies on the netscaler.
#
# The script assumes using HTTPS with unsecure certificate to access the netscaler,
# if you have a valid certificate installed remove the -k from the curl lines, or if you want http change it accordingly (and remove the -k)
#
# This action depends on curl
#
# You need to populate the 3 options inside Init
#
# ns_host: IP or hostname of netslcaer appliance
# ns_auth: username:password, suggest base64 encoded for a little added security (echo -n "username:password" | base64)
# ns_dataset: Name of the netscaler dataset holding the IPs to be blocked.
#
# For further details on how to use it please check http://blog.ckzone.eu/2017/01/fail2ban-action-for-citrix-netscaler.html

[Init]
ns_host =
ns_auth =
ns_dataset =

[Definition]
actionstart = curl -kH 'Authorization: Basic <ns_auth>' https://<ns_host>/nitro/v1/config

actioncheck =

actionban = curl -k -H 'Authorization: Basic <ns_auth>' -X PUT -d '{"policydataset_value_binding":{"name":"<ns_dataset>","value":"<ip>"}}' https://<ns_host>/nitro/v1/config/

actionunban = curl -H 'Authorization: Basic <ns_auth>' -X DELETE -k "https://<ns_host>/nitro/v1/config/policydataset_value_binding/<ns_dataset>?args=value:<ip>"
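
A hypothetical `jail.local` fragment using the new netscaler action (host, credentials and dataset name are placeholders that must match your NetScaler setup, as described in the action's [Init] section):

```ini
[sshd]
enabled = true
action  = netscaler[ns_host="ns.example.com", ns_auth="dXNlcjpwYXNz", ns_dataset="f2b_blocklist"]
```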

@@ -10,6 +10,9 @@ before = sendmail-common.conf

[Definition]

# bypass ban/unban for restored tickets
norestored = 1

# Option: actionstart
# Notes.: command executed once at the start of Fail2Ban.
# Values: CMD

@@ -7,9 +7,13 @@
[INCLUDES]

before = sendmail-common.conf
         helpers-common.conf

[Definition]

# bypass ban/unban for restored tickets
norestored = 1

# Option: actionban
# Notes.: Command executed when banning an IP. Take care that the
# command is executed with Fail2Ban user rights.

@@ -19,7 +23,7 @@ before = sendmail-common.conf
# Tags: See jail.conf(5) man page
# Values: CMD
#
actionban = printf %%b "Subject: [Fail2Ban] <name>: banned <ip> from `uname -n`
actionban = ( printf %%b "Subject: [Fail2Ban] <name>: banned <ip> from `uname -n`
            Date: `LC_ALL=C date +"%%a, %%d %%h %%Y %%T %%z"`
            From: <sendername> <<sender>>
            To: <dest>\n

@@ -33,10 +37,11 @@ actionban = printf %%b "Subject: [Fail2Ban] <name>: banned <ip> from `uname -n`
            Country:`geoiplookup -f /usr/share/GeoIP/GeoIP.dat "<ip>" | cut -d':' -f2-`
            AS:`geoiplookup -f /usr/share/GeoIP/GeoIPASNum.dat "<ip>" | cut -d':' -f2-`
            hostname: `host -t A <ip> 2>&1`\n\n
            Lines containing IP:<ip> in <logpath>\n
            `grep -E <grepopts> '(^|[^0-9])<ip>([^0-9]|$)' <logpath>`\n\n
            Lines containing failures of <ip>\n";
            %(_grep_logs)s;
            printf %%b "\n
            Regards,\n
            Fail2Ban" | /usr/sbin/sendmail -f <sender> <dest>
            Fail2Ban" ) | /usr/sbin/sendmail -f <sender> <dest>

[Init]

@@ -50,4 +55,5 @@ logpath = /dev/null

# Number of log lines to include in the email
#
grepopts = -m 1000
#grepmax = 1000
#grepopts = -m <grepmax>

@@ -10,6 +10,9 @@ before = sendmail-common.conf

[Definition]

# bypass ban/unban for restored tickets
norestored = 1

# Option: actionban
# Notes.: command executed when banning an IP. Take care that the
# command is executed with Fail2Ban user rights.

@@ -10,6 +10,9 @@ before = sendmail-common.conf

[Definition]

# bypass ban/unban for restored tickets
norestored = 1

# Option: actionban
# Notes.: command executed when banning an IP. Take care that the
# command is executed with Fail2Ban user rights.

@@ -7,16 +7,20 @@
[INCLUDES]

before = sendmail-common.conf
         helpers-common.conf

[Definition]

# bypass ban/unban for restored tickets
norestored = 1

# Option: actionban
# Notes.: command executed when banning an IP. Take care that the
# command is executed with Fail2Ban user rights.
# Tags: See jail.conf(5) man page
# Values: CMD
#
actionban = printf %%b "Subject: [Fail2Ban] <name>: banned <ip> from `uname -n`
actionban = ( printf %%b "Subject: [Fail2Ban] <name>: banned <ip> from `uname -n`
            Date: `LC_ALL=C date +"%%a, %%d %%h %%Y %%T %%z"`
            From: <sendername> <<sender>>
            To: <dest>\n

@@ -25,10 +29,11 @@ actionban = printf %%b "Subject: [Fail2Ban] <name>: banned <ip> from `uname -n`
            <failures> attempts against <name>.\n\n
            Here is more information about <ip> :\n
            `/usr/bin/whois <ip> || echo missing whois program`\n\n
            Lines containing IP:<ip> in <logpath>\n
            `grep -E <grepopts> '(^|[^0-9])<ip>([^0-9]|$)' <logpath>`\n\n
            Lines containing failures of <ip>\n";
            %(_grep_logs)s;
            printf %%b "\n
            Regards,\n
            Fail2Ban" | /usr/sbin/sendmail -f <sender> <dest>
            Fail2Ban" ) | /usr/sbin/sendmail -f <sender> <dest>

[Init]

@@ -42,4 +47,5 @@ logpath = /dev/null

# Number of log lines to include in the email
#
grepopts = -m 1000
#grepmax = 1000
#grepopts = -m <grepmax>

@@ -10,6 +10,9 @@ before = sendmail-common.conf

[Definition]

# bypass ban/unban for restored tickets
norestored = 1

# Option: actionban
# Notes.: command executed when banning an IP. Take care that the
# command is executed with Fail2Ban user rights.

@@ -10,6 +10,9 @@ before = sendmail-common.conf

[Definition]

# bypass ban/unban for restored tickets
norestored = 1

# Option: actionban
# Notes.: command executed when banning an IP. Take care that the
# command is executed with Fail2Ban user rights.

@@ -10,6 +10,9 @@ before = sendmail-common.conf

[Definition]

# bypass ban/unban for restored tickets
norestored = 1

# Option: actionban
# Notes.: command executed when banning an IP. Take care that the
# command is executed with Fail2Ban user rights.

@@ -126,6 +126,9 @@ class SMTPAction(ActionBase):
            bantime = self._jail.actions.getBanTime,
            )

        # bypass ban/unban for restored tickets
        self.norestored = 1

    def _sendMessage(self, subject, text):
        """Sends message based on arguments and instance's properties.

@@ -211,6 +214,8 @@ class SMTPAction(ActionBase):
            Dictionary which includes information in relation to
            the ban.
        """
        if aInfo.get('restored'):
            return
        aInfo.update(self.message_values)
        message = "".join([
            messages['ban']['head'],

@@ -32,6 +32,9 @@

[Definition]

# bypass ban/unban for restored tickets
norestored = 1

actionstart =

actionstop =

@@ -9,6 +9,8 @@ failregex = ^\s[+-]\d{4} \S+ \d{3}0[1-9] \S+ <HOST>:\d+ [\d.]+:\d+ \d+ \d+ \d+\s

ignoreregex =

datepattern = {^LN-BEG}

# DEV Notes:
# http://www.3proxy.ru/howtoe.asp#ERRORS indicates that 01-09 are
# all authentication problems (%E field)

@@ -14,6 +14,9 @@ failregex = ^<HOST> -.*"(GET|POST|HEAD).*HTTP.*"(?:%(badbots)s|%(badbotscustom)s

ignoreregex =

datepattern = ^[^\[]*\[({DATE})
              {^LN-BEG}

# DEV Notes:
# List of bad bots fetched from http://www.user-agents.org
# Generated on Thu Nov 7 14:23:35 PST 2013 by files/gen_badbots.

@@ -10,6 +10,8 @@ after = apache-common.local

_apache_error_client = \[\] \[(:?error|\S+:\S+)\]( \[pid \d+(:\S+ \d+)?\])? \[client <HOST>(:\d{1,5})?\]

datepattern = {^LN-BEG}

# Common prefix for [error] apache messages which also would include <HOST>
# Depending on the version it could be
# 2.2: [Sat Jun 01 11:23:08 2013] [error] [client 1.2.3.4]

@@ -6,6 +6,8 @@ failregex = ^<HOST> .*Googlebot.*$

ignoreregex =

datepattern = ^[^\[]*\[({DATE})
              {^LN-BEG}

# DEV Notes:
#

@@ -10,9 +10,10 @@ before = apache-common.conf
[Definition]

failregex = ^%(_apache_error_client)s ModSecurity: (\[.*?\] )*Access denied with code [45]\d\d.*$
failregex = ^%(_apache_error_client)s ModSecurity:\s+(?:\[(?:\w+ \"[^\"]*\"|[^\]]*)\]\s*)*Access denied with code [45]\d\d

ignoreregex =

# https://github.com/SpiderLabs/ModSecurity/wiki/ModSecurity-2-Data-Formats
# Author: Daniel Black
# Sergey G. Brester aka sebres (review, optimization)

@@ -3,16 +3,15 @@
#
# The knocking request must have a referer.

[INCLUDES]

before = apache-common.conf

[Definition]

failregex = ^<HOST> - \w+ \[\] "GET <knocking_url> HTTP/1\.[01]" 200 \d+ ".*" "[^-].*"$

ignoreregex =

datepattern = ^[^\[]*\[({DATE})
              {^LN-BEG}

[Init]

knocking_url = /knocking/

@@ -8,7 +8,7 @@
#

[Definition]
# Note: First three failregex matches below are for ASSP V1 with the remaining being designed for V2. Deleting the V1 regex is recommended but I left it in for compatibilty reasons.
# Note: First three failregex matches below are for ASSP V1 with the remaining being designed for V2. Deleting the V1 regex is recommended but I left it in for compatibility reasons.

__assp_actions = (?:dropping|refusing)

@@ -20,6 +20,9 @@ failregex = ^(:? \[SSL-out\])? <HOST> max sender authentication errors \(\d{,3}\

ignoreregex =

datepattern = {^LN-BEG}%%b-%%d-%%Exy %%H:%%M:%%S
              {^LN-BEG}

# DEV Notes:
# V1 Examples matches:
# Apr-27-13 02:33:09 Blocking 217.194.197.97 - too much AUTH errors (41);

@@ -31,6 +31,7 @@ failregex = ^%(__prefix_line)s%(log_prefix)s Registration from '[^']*' failed fo

ignoreregex =

datepattern = {^LN-BEG}

# Author: Xavier Devlamynck / Daniel Black
#

@@ -61,4 +61,7 @@ __prefix_line = %(__date_ambit)s?\s*(?:%(__bsd_syslog_verbose)s\s+)?(?:%(__hostn
# pam_ldap
__pam_auth = pam_unix

# standardly all formats using prefix have line-begin anchored date:
datepattern = {^LN-BEG}

# Author: Yaroslav Halchenko

@@ -8,8 +8,6 @@ failregex = ^: Bad Rcon: "rcon \d+ "\S+" sv_contact ".*?"" from "<HOST>:\d+"$

ignoreregex =

[Init]

datepattern = ^L %%d/%%m/%%Y - %%H:%%M:%%S

@@ -15,5 +15,7 @@ failregex = ^%(__prefix_line)sLOGIN FAILED, user=.*, ip=\[<HOST>\]$

ignoreregex =

datepattern = {^LN-BEG}

# Author: Christoph Haas
# Modified by: Cyril Jaquier

@@ -13,7 +13,6 @@ failregex = ^: \'<HOST>\' \d{1,3} failed login attempt(s)?. \s*

ignoreregex =

[Init]
datepattern = ^%%Y:%%m:%%d-%%H:%%M:%%S

#

@@ -0,0 +1,47 @@
# Fail2Ban configuration file for IBM Domino SMTP Server TASK to detect failed login attempts
#
# Author: Christian Brandlehner
#
# $Revision: 003 $
#
# Configuration:
# Set the following Domino Server parameters in notes.ini:
# console_log_enabled=1
# log_sessions=2
# You also have to use a date and time format supported by fail2ban. Recommended notes.ini configuration is:
# DateOrder=DMY
# DateSeparator=-
# ClockType=24_Hour
# TimeSeparator=:
#
# Depending on your locale you might have to tweak the date and time format so fail2ban can read the log

#[INCLUDES]
# Read common prefixes. If any customizations available -- read them from
# common.local
#before = common.conf

[Definition]
# Option: failregex
# Notes.: regex to match the password failure messages in the logfile. The
# host must be matched by a group named "host". The tag "<HOST>" can
# be used for standard IP/hostname matching and is only an alias for
# (?:::f{4,6}:)?(?P<host>\S+)
# Values: TEXT
#
# Sample log entries (used different time formats and an extra sample with process info in front of date)
# 01-23-2009 19:54:51 SMTP Server: Authentication failed for user postmaster ; connecting host 1.2.3.4
# [28325:00010-3735542592] 22-06-2014 09:56:12 smtp: postmaster [1.2.3.4] authentication failure using internet password
# 08-09-2014 06:14:27 smtp: postmaster [1.2.3.4] authentication failure using internet password
# 08-09-2014 06:14:27 SMTP Server: Authentication failed for user postmaster ; connecting host 1.2.3.4

__prefix = (?:\[[^\]]+\])?\s+
failregex = ^%(__prefix)sSMTP Server: Authentication failed for user .*? \; connecting host <HOST>$
            ^%(__prefix)ssmtp: (?:[^\[]+ )*\[<HOST>\] authentication failure using internet password\s*$

# Option: ignoreregex
# Notes.: regex to ignore. If this regex matches, the line is ignored.
# Values: TEXT
#
ignoreregex =
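
A hypothetical jail for the new domino-smtp filter (the console log path is a placeholder and depends on the Domino data directory and the notes.ini settings listed above):

```ini
[domino-smtp]
enabled = true
port    = smtp,ssmtp
logpath = /path/to/domino/console.log
```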

@@ -9,18 +9,19 @@ before = common.conf

_daemon = (auth|dovecot(-auth)?|auth-worker)

failregex = ^%(__prefix_line)s(%(__pam_auth)s(\(dovecot:auth\))?:)?\s+authentication failure; logname=\S* uid=\S* euid=\S* tty=dovecot ruser=\S* rhost=<HOST>(\s+user=\S*)?\s*$
            ^%(__prefix_line)s(pop3|imap)-login: (Info: )?(Aborted login|Disconnected)(: Inactivity)? \(((auth failed, \d+ attempts)( in \d+ secs)?|tried to use (disabled|disallowed) \S+ auth)\):( user=<\S*>,)?( method=\S+,)? rip=<HOST>(, lip=(\d{1,3}\.){3}\d{1,3})?(, TLS( handshaking(: SSL_accept\(\) failed: error:[\dA-F]+:SSL routines:[TLS\d]+_GET_CLIENT_HELLO:unknown protocol)?)?(: Disconnected)?)?(, session=<\S+>)?\s*$
            ^%(__prefix_line)s(Info|dovecot: auth\(default\)|auth-worker\(\d+\)): pam\(\S+,<HOST>\): pam_authenticate\(\) failed: (User not known to the underlying authentication module: \d+ Time\(s\)|Authentication failure \(password mismatch\?\))\s*$
            ^%(__prefix_line)s(auth|auth-worker\(\d+\)): (pam|passwd-file)\(\S+,<HOST>\): unknown user\s*$
            ^%(__prefix_line)s(auth|auth-worker\(\d+\)): Info: ldap\(\S*,<HOST>,\S*\): invalid credentials\s*$
failregex = ^%(__prefix_line)s(?:%(__pam_auth)s(?:\(dovecot:auth\))?:)?\s+authentication failure; logname=\S* uid=\S* euid=\S* tty=dovecot ruser=\S* rhost=<HOST>(?:\s+user=\S*)?\s*$
            ^%(__prefix_line)s(?:pop3|imap)-login: (?:Info: )?(?:Aborted login|Disconnected)(?::(?: [^ \(]+)+)? \((?:auth failed, \d+ attempts( in \d+ secs)?|tried to use (disabled|disallowed) \S+ auth)\):( user=<[^>]+>,)?( method=\S+,)? rip=<HOST>(?:, lip=\S+)?(?:, TLS(?: handshaking(?:: SSL_accept\(\) failed: error:[\dA-F]+:SSL routines:[TLS\d]+_GET_CLIENT_HELLO:unknown protocol)?)?(: Disconnected)?)?(, session=<\S+>)?\s*$
            ^%(__prefix_line)s(?:Info|dovecot: auth\(default\)|auth-worker\(\d+\)): pam\(\S+,<HOST>\): pam_authenticate\(\) failed: (User not known to the underlying authentication module: \d+ Time\(s\)|Authentication failure \(password mismatch\?\))\s*$
            ^%(__prefix_line)s(?:auth|auth-worker\(\d+\)): (?:pam|passwd-file)\(\S+,<HOST>\): unknown user\s*$
            ^%(__prefix_line)s(?:auth|auth-worker\(\d+\)): Info: ldap\(\S*,<HOST>,\S*\): invalid credentials\s*$

ignoreregex =

[Init]

journalmatch = _SYSTEMD_UNIT=dovecot.service

datepattern = {^LN-BEG}TAI64N
              {^LN-BEG}

# DEV Notes:
# * the first regex is essentially a copy of pam-generic.conf
# * Probably doesn't do dovecot sql/ldap backends properly (resolved in edit 21/03/2016)

@@ -30,3 +31,4 @@ journalmatch = _SYSTEMD_UNIT=dovecot.service
# Author: Martin Waschbuesch
# Daniel Black (rewrote with begin and end anchors)
# Martin O'Neal (added LDAP authentication failure regex)
# Sergey G. Brester aka sebres (reviewed, optimized, IPv6-compatibility)

@@ -25,8 +25,6 @@ failregex = ^=INFO REPORT==== ===\nI\(<0\.\d+\.0>:ejabberd_c2s:\d+\) : \([^)]+\
#
ignoreregex =

[Init]

# "maxlines" is number of log lines to buffer for multi-line regex searches
maxlines = 2

@@ -35,3 +33,8 @@ maxlines = 2
# Values: TEXT
#
journalmatch =

#datepattern = ^(?:=[^=]+={3,} )?({DATE})
# explicit time format using prefix =...==== and no date in second string begins with I(...)...
datepattern = ^(?:=[^=]+={3,} )?(%%ExY(?P<_sep>[-/.])%%m(?P=_sep)%%d[T ]%%H:%%M:%%S(?:[.,]%%f)?(?:\s*%%z)?)
              ^I\(()**

@@ -8,13 +8,26 @@
# IP addresses on your LAN.
#

[INCLUDES]

# Read common prefixes. If any customizations available -- read them from
# common.local
before = common.conf

[Definition]

failregex = ^\.\d+ \[WARNING\] sofia_reg\.c:\d+ SIP auth (failure|challenge) \((REGISTER|INVITE)\) on sofia profile \'[^']+\' for \[.*\] from ip <HOST>$
            ^\.\d+ \[WARNING\] sofia_reg\.c:\d+ Can't find user \[\d+@\d+\.\d+\.\d+\.\d+\] from <HOST>$
_daemon = freeswitch

# Prefix contains common prefix line (server, daemon, etc.) and 2 datetimes if used systemd backend
_pref_line = ^%(__prefix_line)s(?:\d+-\d+-\d+ \d+:\d+:\d+\.\d+)?

failregex = %(_pref_line)s \[WARNING\] sofia_reg\.c:\d+ SIP auth (failure|challenge) \((REGISTER|INVITE)\) on sofia profile \'[^']+\' for \[[^\]]*\] from ip <HOST>$
            %(_pref_line)s \[WARNING\] sofia_reg\.c:\d+ Can't find user \[[^@]+@[^\]]+\] from <HOST>$

ignoreregex =

datepattern = {^LN-BEG}

# Author: Rupa SChomaker, soapee01, Daniel Black
# https://freeswitch.org/confluence/display/FREESWITCH/Fail2Ban
# Thanks to Jim on mailing list of samples and guidance

@@ -17,6 +17,9 @@ failregex = ^.*\nWARNING: Authentication attempt from <HOST> for user "[^"]*" fa
#
ignoreregex =

[Init]
# "maxlines" is number of log lines to buffer for multi-line regex searches
maxlines = 2

datepattern = ^%%b %%d, %%ExY %%I:%%M:%%S %%p
              ^WARNING:()**
              {^LN-BEG}

@@ -9,8 +9,6 @@ failregex = ^ SMTP Spam attack detected from <HOST>,

ignoreregex =

[Init]

datepattern = ^\[%%d/%%b/%%Y %%H:%%M:%%S\]

# DEV NOTES:

@@ -0,0 +1,49 @@
# Fail2Ban filter for unsuccesfull MongoDB authentication attempts
#
# Logfile /var/log/mongodb/mongodb.log
#
# add setting in /etc/mongodb.conf
# logpath=/var/log/mongodb/mongodb.log
#
# and use of the authentication
# auth = true
#

[Definition]
#failregex = ^\s+\[initandlisten\] connection accepted from <HOST>:\d+ \#(?P<__connid>\d+) \(1 connection now open\)<SKIPLINES>\s+\[conn(?P=__connid)\] Failed to authenticate\s+
failregex = ^\s+\[conn(?P<__connid>\d+)\] Failed to authenticate [^\n]+<SKIPLINES>\s+\[conn(?P=__connid)\] end connection <HOST>

ignoreregex =


[Init]
maxlines = 10

# DEV Notes:
#
# Regarding the multiline regex:
#
# There can be a nunber of non-related lines between the first and second part
# of this regex maxlines of 10 is quite generious.
#
# Note the capture __connid, includes the connection ID, used in second part of regex.
#
# The first regex is commented out (but will match also), because it is better to use
# the host from "end connection" line (uncommented above):
# - it has the same prefix, searching begins directly with failure message
# (so faster, because ignores success connections at all)
# - it is not so vulnerable in case of possible race condition
#
# Log example:
# 2016-10-20T09:54:27.108+0200 [initandlisten] connection accepted from 127.0.0.1:53276 #1 (1 connection now open)
# 2016-10-20T09:54:27.109+0200 [conn1] authenticate db: test { authenticate: 1, nonce: "xxx", user: "root", key: "xxx" }
# 2016-10-20T09:54:27.110+0200 [conn1] Failed to authenticate root@test with mechanism MONGODB-CR: AuthenticationFailed UserNotFound Could not find user root@test
# 2016-11-09T09:54:27.894+0100 [conn1] end connection 127.0.0.1:53276 (0 connections now open)
# 2016-11-09T11:55:58.890+0100 [initandlisten] connection accepted from 127.0.0.1:54266 #1510 (1 connection now open)
# 2016-11-09T11:55:58.892+0100 [conn1510] authenticate db: admin { authenticate: 1, nonce: "xxx", user: "root", key: "xxx" }
# 2016-11-09T11:55:58.892+0100 [conn1510] Failed to authenticate root@admin with mechanism MONGODB-CR: AuthenticationFailed key mismatch
# 2016-11-09T11:55:58.894+0100 [conn1510] end connection 127.0.0.1:54266 (0 connections now open)
#
# Authors: Alexander Finkhäuser
# Sergey G. Brester (sebres)
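
A hypothetical jail for the new mongodb-auth filter (the log path matches the comment above; the port is the MongoDB default and is an assumption here):

```ini
[mongodb-auth]
enabled = true
port    = 27017
logpath = /var/log/mongodb/mongodb.log
```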

@@ -13,7 +13,7 @@ before = common.conf
_daemon = monit

# Regexp for previous (accessing monit httpd) and new (access denied) versions
failregex = ^\[[A-Z]+\s+\]\s*error\s*:\s*Warning:\s+Client '<HOST>' supplied (?:unknown user '[^']+'|wrong password for user '[^']*') accessing monit httpd$
failregex = ^\[\s*\]\s*error\s*:\s*Warning:\s+Client '<HOST>' supplied (?:unknown user '[^']+'|wrong password for user '[^']*') accessing monit httpd$
            ^%(__prefix_line)s\w+: access denied -- client <HOST>: (?:unknown user '[^']+'|wrong password for user '[^']*'|empty password)$

# Ignore login with empty user (first connect, no user specified)

@@ -15,13 +15,14 @@ _daemon = murmurd
# variable in your server config file (murmur.ini / mumble-server.ini).
_usernameregex = [^>]+

_prefix = <W>[\n\s]*(\.\d{3})?\s+\d+ => <\d+:%(_usernameregex)s\(-1\)> Rejected connection from <HOST>:\d+:
_prefix = \s+\d+ => <\d+:%(_usernameregex)s\(-1\)> Rejected connection from <HOST>:\d+:

failregex = ^%(_prefix)s Invalid server password$
            ^%(_prefix)s Wrong certificate or password for existing user$

ignoreregex =

datepattern = ^<W>{DATE}

# DEV Notes:
#

@@ -13,6 +13,9 @@ failregex = ^<HOST> \- \S+ \[\] \"(GET|POST|HEAD) \/<block> \S+\" 404 .+$

ignoreregex =

datepattern = {^LN-BEG}%%ExY(?P<_sep>[-/.])%%m(?P=_sep)%%d[T ]%%H:%%M:%%S(?:[.,]%%f)?(?:\s*%%z)?
              ^[^\[]*\[({DATE})
              {^LN-BEG}

# DEV Notes:
# Based on apache-botsearch filter

@@ -8,6 +8,8 @@ failregex = ^ \[error\] \d+#\d+: \*\d+ user "\S+":? (password mismatch|was not f

ignoreregex =

datepattern = {^LN-BEG}

# DEV NOTES:
# Based on samples in https://github.com/fail2ban/fail2ban/pull/43/files
# Extensive search of all nginx auth failures not done yet.

@@ -43,3 +43,4 @@ failregex = ^\s*\[error\] \d+#\d+: \*\d+ limiting requests, excess: [\d\.]+ by z

ignoreregex =

datepattern = {^LN-BEG}

@@ -26,3 +26,6 @@ failregex = ^%(__prefix_line)sinfo: ratelimit block .* query <HOST> TYPE255$
            ^%(__prefix_line)sinfo: .* <HOST> refused, no acl matches\.$

ignoreregex =

datepattern = {^LN-BEG}Epoch
              {^LN-BEG}

@@ -9,7 +9,6 @@
[Definition]
failregex = ^<HOST>\s+-\s+-\s+\[\]\s+"[A-Z]+ .*" 401 \d+\s*$

[Init]
datepattern = %%d/%%b[^/]*/%%Y:%%H:%%M:%%S %%z

@@ -52,10 +52,12 @@ before = common.conf
# Note that you MUST have LOG_FORMAT=4 for this to work!
#

failregex = ^.*tr="[A-Z]+\|[0-9.]+\|\d+\|<HOST>\|\d+" ap="[^"]*" mi="Bad password" us="[^"]*" di="535 5.7.8 Bad username or password( \(Authentication failed\))?\."/>$
failregex = tr="[A-Z]+\|[0-9.]+\|\d+\|<HOST>\|\d+" ap="[^"]*" mi="Bad password" us="[^"]*" di="535 5.7.8 Bad username or password( \(Authentication failed\))?\."/>$

# Option: ignoreregex
# Notes.: regex to ignore. If this regex matches, the line is ignored.
# Values: TEXT
#
ignoreregex =

datepattern = ^<co ts="{DATE}"\s+

@@ -18,3 +18,6 @@ ignoreregex =
# http://blogs.buanzo.com.ar/2009/04/fail2ban-filter-for-php-injection-attacks.html#comment-1489
#
# Author: Arturo 'Buanzo' Busleiman <buanzo@buanzo.com.ar>

datepattern = ^[^\[]*\[({DATE})
              {^LN-BEG}

@@ -8,5 +8,8 @@ failregex = \/<HOST> Port\: [0-9]+ (TCP|UDP) Blocked$

ignoreregex =

datepattern = {^LN-BEG}Epoch
              {^LN-BEG}

# Author: Pacop <pacoparu@gmail.com>

@@ -18,4 +18,6 @@ failregex = ^type=%(_type)s msg=audit\(:\d+\): (user )?pid=\d+ uid=%(_uid)s auid

ignoreregex =

datepattern = EPOCH

# Author: Daniel Black

@@ -23,9 +23,9 @@ _daemon = (?:(sm-(mta|acceptingconnections)|sendmail))

failregex = ^%(__prefix_line)s\w{14}: ruleset=check_rcpt, arg1=(?P<email><\S+@\S+>), relay=(\S+ )?\[<HOST>\]( \(may be forged\))?, reject=(550 5\.7\.1 (?P=email)\.\.\. Relaying denied\. (IP name possibly forged \[(\d+\.){3}\d+\]|Proper authentication required\.|IP name lookup failed \[(\d+\.){3}\d+\])|553 5\.1\.8 (?P=email)\.\.\. Domain of sender address \S+ does not exist|550 5\.[71]\.1 (?P=email)\.\.\. (Rejected: .*|User unknown))$
            ^%(__prefix_line)sruleset=check_relay, arg1=(?P<dom>\S+), arg2=<HOST>, relay=((?P=dom) )?\[(\d+\.){3}\d+\]( \(may be forged\))?, reject=421 4\.3\.2 (Connection rate limit exceeded\.|Too many open connections\.)$
            ^%(__prefix_line)s\w{14}: rejecting commands from (\S+ )?\[<HOST>\] due to pre-greeting traffic after \d+ seconds$
            ^%(__prefix_line)s\w{14}: rejecting commands from (\S* )?\[<HOST>\] due to pre-greeting traffic after \d+ seconds$
            ^%(__prefix_line)s\w{14}: (\S+ )?\[<HOST>\]: ((?i)expn|vrfy) \S+ \[rejected\]$
            ^(?P<__prefix>%(__prefix_line)s\w+: )<[^@]+@[^>]+>\.\.\. No such user here<SKIPLINES>(?P=__prefix)from=<[^@]+@[^>]+>, size=\d+, class=\d+, nrcpts=\d+, bodytype=\w+, proto=E?SMTP, daemon=MTA, relay=\S+ \[<HOST>\]$
            ^(?P<__prefix>%(__prefix_line)s\w+: )<[^@]+@[^>]+>\.\.\. No such user here$<SKIPLINES>^(?P=__prefix)from=<[^@]+@[^>]+>, size=\d+, class=\d+, nrcpts=\d+, bodytype=\w+, proto=E?SMTP, daemon=MTA, relay=\S+ \[<HOST>\]$

ignoreregex =

@@ -6,7 +6,12 @@

failregex = ^ sogod \[\d+\]: SOGoRootPage Login from '<HOST>' for user '.*' might not have worked( - password policy: \d* grace: -?\d* expire: -?\d* bound: -?\d*)?\s*$

ignoreregex =
ignoreregex = "^<ADDR>"

datepattern = {^LN-BEG}%%ExY(?P<_sep>[-/.])%%m(?P=_sep)%%d[T ]%%H:%%M:%%S(?:[.,]%%f)?(?:\s*%%z)?
              {^LN-BEG}(?:%%a )?%%b %%d %%H:%%M:%%S(?:\.%%f)?(?: %%ExY)?
              ^[^\[]*\[({DATE})
              {^LN-BEG}

#
# DEV Notes:

@@ -9,5 +9,8 @@ failregex = ^\s+\d\s<HOST>\s+[A-Z_]+_DENIED/403 .*$

ignoreregex =

datepattern = {^LN-BEG}Epoch
              {^LN-BEG}

# Author: Daniel Black

@@ -5,8 +5,6 @@ failregex = ^ \[LOGIN_ERROR\].*from <HOST>: Unknown user or password incorrect\.

ignoreregex =

[Init]

datepattern = ^%%m/%%d/%%Y %%H:%%M:%%S

# DEV NOTES:

@@ -0,0 +1,11 @@
# Fail2Ban aggressive ssh filter for at attempted exploit
#
# Includes failregex of both sshd and sshd-ddos filters
#
[INCLUDES]

before = sshd.conf

[Definition]

mode = %(aggressive)s
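
A hypothetical jail fragment switching the sshd jail to the new aggressive rule set (which, per the include above, combines the normal and ddos expressions):

```ini
[sshd]
enabled = true
filter  = sshd-aggressive
```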

@@ -10,20 +10,8 @@

[INCLUDES]

# Read common prefixes. If any customizations available -- read them from
# common.local
before = common.conf
before = sshd.conf

[Definition]

_daemon = sshd

failregex = ^%(__prefix_line)sDid not receive identification string from <HOST>\s*$

ignoreregex =

[Init]

journalmatch = _SYSTEMD_UNIT=sshd.service + _COMM=sshd

# Author: Yaroslav Halchenko
mode = %(ddos)s

@@ -14,37 +14,64 @@
# common.local
before = common.conf

[Definition]
[DEFAULT]

_daemon = sshd

failregex = ^%(__prefix_line)s(?:error: PAM: )?[aA]uthentication (?:failure|error|failed) for .* from <HOST>( via \S+)?\s*$
            ^%(__prefix_line)s(?:error: PAM: )?User not known to the underlying authentication module for .* from <HOST>\s*$
            ^%(__prefix_line)sFailed \S+ for .*? from <HOST>(?: port \d*)?(?: ssh\d*)?(: (ruser .*|(\S+ ID \S+ \(serial \d+\) CA )?\S+ %(__md5hex)s(, client user ".*", client host ".*")?))?\s*$
            ^%(__prefix_line)sROOT LOGIN REFUSED.* FROM <HOST>\s*$
            ^%(__prefix_line)s[iI](?:llegal|nvalid) user .* from <HOST>\s*$
            ^%(__prefix_line)sUser .+ from <HOST> not allowed because not listed in AllowUsers\s*$
            ^%(__prefix_line)sUser .+ from <HOST> not allowed because listed in DenyUsers\s*$
            ^%(__prefix_line)sUser .+ from <HOST> not allowed because not in any group\s*$
            ^%(__prefix_line)srefused connect from \S+ \(<HOST>\)\s*$
            ^%(__prefix_line)s(?:error: )?Received disconnect from <HOST>: 3: .*: Auth fail(?: \[preauth\])?$
            ^%(__prefix_line)sUser .+ from <HOST> not allowed because a group is listed in DenyGroups\s*$
            ^%(__prefix_line)sUser .+ from <HOST> not allowed because none of user's groups are listed in AllowGroups\s*$
            ^(?P<__prefix>%(__prefix_line)s)User .+ not allowed because account is locked<SKIPLINES>(?P=__prefix)(?:error: )?Received disconnect from <HOST>: 11: .+ \[preauth\]$
            ^(?P<__prefix>%(__prefix_line)s)Disconnecting: Too many authentication failures for .+? \[preauth\]<SKIPLINES>(?P=__prefix)(?:error: )?Connection closed by <HOST> \[preauth\]$
            ^(?P<__prefix>%(__prefix_line)s)Connection from <HOST> port \d+(?: on \S+ port \d+)?<SKIPLINES>(?P=__prefix)Disconnecting: Too many authentication failures for .+? \[preauth\]$
            ^%(__prefix_line)s(error: )?maximum authentication attempts exceeded for .* from <HOST>(?: port \d*)?(?: ssh\d*)? \[preauth\]$
            ^%(__prefix_line)spam_unix\(sshd:auth\):\s+authentication failure;\s*logname=\S*\s*uid=\d*\s*euid=\d*\s*tty=\S*\s*ruser=\S*\s*rhost=<HOST>\s.*$
# optional prefix (logged from several ssh versions) like "error: ", "error: PAM: " or "fatal: "
__pref = (?:(?:error|fatal): (?:PAM: )?)?
# optional suffix (logged from several ssh versions) like " [preauth]"
__suff = (?: \[preauth\])?\s*
__on_port_opt = (?: port \d+)?(?: on \S+(?: port \d+)?)?

# single line prefix:
__prefix_line_sl = %(__prefix_line)s%(__pref)s
# multi line prefixes (for first and second lines):
__prefix_line_ml1 = (?P<__prefix>%(__prefix_line)s)%(__pref)s
__prefix_line_ml2 = %(__suff)s$<SKIPLINES>^(?P=__prefix)%(__pref)s

mode = %(normal)s

normal = ^%(__prefix_line_sl)s[aA]uthentication (?:failure|error|failed) for .* from <HOST>( via \S+)?\s*%(__suff)s$
         ^%(__prefix_line_sl)sUser not known to the underlying authentication module for .* from <HOST>\s*%(__suff)s$
         ^%(__prefix_line_sl)sFailed \S+ for (?P<cond_inv>invalid user )?(?P<user>(?P<cond_user>\S+)|(?(cond_inv)(?:(?! from ).)*?|[^:]+)) from <HOST>%(__on_port_opt)s(?: ssh\d*)?(?(cond_user): |(?:(?:(?! from ).)*)$)
         ^%(__prefix_line_sl)sROOT LOGIN REFUSED.* FROM <HOST>\s*%(__suff)s$
         ^%(__prefix_line_sl)s[iI](?:llegal|nvalid) user .*? from <HOST>%(__on_port_opt)s\s*$
         ^%(__prefix_line_sl)sUser .+ from <HOST> not allowed because not listed in AllowUsers\s*%(__suff)s$
         ^%(__prefix_line_sl)sUser .+ from <HOST> not allowed because listed in DenyUsers\s*%(__suff)s$
         ^%(__prefix_line_sl)sUser .+ from <HOST> not allowed because not in any group\s*%(__suff)s$
         ^%(__prefix_line_sl)srefused connect from \S+ \(<HOST>\)\s*%(__suff)s$
         ^%(__prefix_line_sl)sReceived disconnect from <HOST>%(__on_port_opt)s:\s*3: .*: Auth fail%(__suff)s$
         ^%(__prefix_line_sl)sUser .+ from <HOST> not allowed because a group is listed in DenyGroups\s*%(__suff)s$
         ^%(__prefix_line_sl)sUser .+ from <HOST> not allowed because none of user's groups are listed in AllowGroups\s*%(__suff)s$
         ^%(__prefix_line_sl)spam_unix\(sshd:auth\):\s+authentication failure;\s*logname=\S*\s*uid=\d*\s*euid=\d*\s*tty=\S*\s*ruser=\S*\s*rhost=<HOST>\s.*%(__suff)s$
         ^%(__prefix_line_sl)s(error: )?maximum authentication attempts exceeded for .* from <HOST>%(__on_port_opt)s(?: ssh\d*)? \[preauth\]$
         ^%(__prefix_line_ml1)sUser .+ not allowed because account is locked%(__prefix_line_ml2)sReceived disconnect from <HOST>: 11: .+%(__suff)s$
         ^%(__prefix_line_ml1)sDisconnecting: Too many authentication failures for .+?%(__prefix_line_ml2)sConnection closed by <HOST>%(__suff)s$
         ^%(__prefix_line_ml1)sConnection from <HOST>%(__on_port_opt)s%(__prefix_line_ml2)sDisconnecting: Too many authentication failures for .+%(__suff)s$

ddos = ^%(__prefix_line_sl)sDid not receive identification string from <HOST>%(__suff)s$
       ^%(__prefix_line_sl)sReceived disconnect from <HOST>%(__on_port_opt)s:\s*14: No supported authentication methods available%(__suff)s$
       ^%(__prefix_line_sl)sUnable to negotiate with <HOST>%(__on_port_opt)s: no matching (?:cipher|key exchange method) found.
       ^%(__prefix_line_ml1)sConnection from <HOST>%(__on_port_opt)s%(__prefix_line_ml2)sUnable to negotiate a (?:cipher|key exchange method)%(__suff)s$
       ^%(__prefix_line_ml1)sSSH: Server;Ltype: (?:Authname|Version|Kex);Remote: <HOST>-\d+;[A-Z]\w+:.*%(__prefix_line_ml2)sRead from socket failed: Connection reset by peer%(__suff)s$

aggressive = %(normal)s
             %(ddos)s

[Definition]
|
||||
|
||||
failregex = %(mode)s
|
||||
|
||||
ignoreregex =
|
||||
|
||||
[Init]
|
||||
|
||||
# "maxlines" is number of log lines to buffer for multi-line regex searches
|
||||
maxlines = 10
|
||||
|
||||
journalmatch = _SYSTEMD_UNIT=sshd.service + _COMM=sshd
|
||||
|
||||
datepattern = {^LN-BEG}
|
||||
|
||||
# DEV Notes:
|
||||
#
|
||||
# "Failed \S+ for .*? from <HOST>..." failregex uses non-greedy catch-all because
|
||||
|
|
|
@ -17,7 +17,7 @@ _daemon = (?:lighttpd|suhosin)
|
|||
|
||||
_lighttpd_prefix = (?:\(mod_fastcgi\.c\.\d+\) FastCGI-stderr:\s)
|
||||
|
||||
failregex = ^%(__prefix_line)s%(_lighttpd_prefix)s?ALERT - .* \(attacker '<HOST>', file '.*'(?:, line \d+)?\)$
|
||||
failregex = ^%(__prefix_line)s%(_lighttpd_prefix)s?ALERT - .*? \(attacker '<HOST>', file '[^']*'(?:, line \d+)?\)$
|
||||
|
||||
ignoreregex =
|
||||
|
||||
|
|
|
@ -10,6 +10,9 @@ failregex = ^[\da-f]{5,} [\da-f]{5,} (-- none --|.*?)( \d+(\.\d+)?(h|m|s|ms)){0
|
|||
|
||||
ignoreregex =
|
||||
|
||||
datepattern = ^[^-]+ -- [^-]+ -- - ({DATE})
|
||||
{^LN-BEG}
|
||||
|
||||
# Author: Mika (mkl) from Tine20.org forum: https://www.tine20.org/forum/viewtopic.php?f=2&t=15688&p=54766
|
||||
# Editor: Daniel Black
|
||||
# Advisor: Lars Kneschke
|
||||
|
|
|
@ -261,6 +261,8 @@ action = %(action_)s
|
|||
|
||||
[sshd]
|
||||
|
||||
# To use more aggressive sshd filter (inclusive sshd-ddos failregex):
|
||||
#filter = sshd-aggressive
|
||||
port = ssh
|
||||
logpath = %(sshd_log)s
|
||||
backend = %(sshd_backend)s
|
||||
|
@ -769,6 +771,13 @@ logpath = %(mysql_log)s
|
|||
backend = %(mysql_backend)s
|
||||
|
||||
|
||||
# Log wrong MongoDB auth (for details see filter 'filter.d/mongodb-auth.conf')
|
||||
[mongodb-auth]
|
||||
# change port when running with "--shardsvr" or "--configsvr" runtime operation
|
||||
port = 27017
|
||||
logpath = /var/log/mongodb/mongodb.log
|
||||
|
||||
|
||||
# Jail for more extended banning of persistent abusers
|
||||
# !!! WARNINGS !!!
|
||||
# 1. Make sure that your loglevel specified in fail2ban.conf/.local
|
||||
|
@ -848,8 +857,9 @@ maxretry = 1
|
|||
[pass2allow-ftp]
|
||||
# this pass2allow example allows FTP traffic after successful HTTP authentication
|
||||
port = ftp,ftp-data,ftps,ftps-data
|
||||
# knocking_url variable must be overridden to some secret value in filter.d/apache-pass.local
|
||||
filter = apache-pass
|
||||
# knocking_url variable must be overridden to some secret value in jail.local
|
||||
knocking_url = /knocking/
|
||||
filter = apache-pass[knocking_url="%(knocking_url)s"]
|
||||
# access log of the website with HTTP auth
|
||||
logpath = %(apache_access_log)s
|
||||
blocktype = RETURN
|
||||
|
@ -883,3 +893,8 @@ logpath = /var/log/haproxy.log
|
|||
port = ldap,ldaps
|
||||
filter = slapd
|
||||
logpath = /var/log/slapd.log
|
||||
|
||||
[domino-smtp]
|
||||
port = smtp,ssmtp
|
||||
filter = domino-smtp
|
||||
logpath = /home/domino01/data/IBM_TECHNICAL_SUPPORT/console.log
|
||||
|
|
|
@ -34,13 +34,13 @@ auditd_log = /dev/null
|
|||
# http://svnweb.freebsd.org/ports/head/www/apache24/files/patch-config.layout
|
||||
# http://svnweb.freebsd.org/ports/head/www/apache22/files/patch-config.layout
|
||||
|
||||
apache_error_log = /usr/local/www/logs/*error[_.]log
|
||||
apache_error_log = /var/log/httpd-error.log
|
||||
|
||||
apache_access_log = /usr/local/www/logs/*access[_.]log
|
||||
apache_access_log = /var/log/httpd-access.log
|
||||
|
||||
# http://svnweb.freebsd.org/ports/head/www/nginx/Makefile?view=markup
|
||||
|
||||
nginx_error_log = /var/log/nginx-error.log
|
||||
nginx_error_log = /var/log/nginx/error.log
|
||||
|
||||
nginx_access_log = /var/log/nginx-access.log
|
||||
nginx_access_log = /var/log/nginx/access.log
|
||||
|
||||
|
|
|
@ -36,3 +36,15 @@ mysql_log = /var/log/mysql/mysqld.log
|
|||
roundcube_errors_log = /srv/www/roundcubemail/logs/errors
|
||||
|
||||
solidpop3d_log = %(syslog_mail)s
|
||||
|
||||
# These services will log to the journal via syslog, so use the journal by
|
||||
# default.
|
||||
syslog_backend = systemd
|
||||
sshd_backend = systemd
|
||||
dropbear_backend = systemd
|
||||
proftpd_backend = systemd
|
||||
pureftpd_backend = systemd
|
||||
wuftpd_backend = systemd
|
||||
postfix_backend = systemd
|
||||
dovecot_backend = systemd
|
||||
mysql_backend = systemd
|
||||
|
|
|
@ -43,10 +43,15 @@ class ActionReader(DefinitionInitConfigReader):
|
|||
"actionrepair": ["string", None],
|
||||
"actionban": ["string", None],
|
||||
"actionunban": ["string", None],
|
||||
"norestored": ["string", None],
|
||||
}
|
||||
|
||||
def __init__(self, file_, jailName, initOpts, **kwargs):
|
||||
self._name = initOpts.get("actname", file_)
|
||||
actname = initOpts.get("actname")
|
||||
if actname is None:
|
||||
actname = file_
|
||||
initOpts["actname"] = actname
|
||||
self._name = actname
|
||||
DefinitionInitConfigReader.__init__(
|
||||
self, file_, jailName, initOpts, **kwargs)
|
||||
|
||||
|
@ -64,16 +69,22 @@ class ActionReader(DefinitionInitConfigReader):
|
|||
return self._name
|
||||
|
||||
def convert(self):
|
||||
opts = self.getCombined(ignore=('timeout', 'bantime'))
|
||||
# type-convert only after combined (otherwise boolean converting prevents substitution):
|
||||
if opts.get('norestored'):
|
||||
opts['norestored'] = self._convert_to_boolean(opts['norestored'])
|
||||
# stream-convert:
|
||||
head = ["set", self._jailName]
|
||||
stream = list()
|
||||
stream.append(head + ["addaction", self._name])
|
||||
multi = []
|
||||
for opt, optval in self._opts.iteritems():
|
||||
for opt, optval in opts.iteritems():
|
||||
if opt in self._configOpts:
|
||||
multi.append([opt, optval])
|
||||
if self._initOpts:
|
||||
for opt, optval in self._initOpts.iteritems():
|
||||
multi.append([opt, optval])
|
||||
if opt not in self._configOpts:
|
||||
multi.append([opt, optval])
|
||||
if len(multi) > 1:
|
||||
stream.append(["multi-set", self._jailName, "action", self._name, multi])
|
||||
elif len(multi):
|
||||
|
|
|
@ -29,7 +29,7 @@ import re
|
|||
import sys
|
||||
from ..helpers import getLogger
|
||||
|
||||
if sys.version_info >= (3,2): # pragma: no cover
|
||||
if sys.version_info >= (3,2):
|
||||
|
||||
# SafeConfigParser deprecated from Python 3.2 (renamed to ConfigParser)
|
||||
from configparser import ConfigParser as SafeConfigParser, \
|
||||
|
|
|
@ -28,13 +28,26 @@ import glob
|
|||
import os
|
||||
from ConfigParser import NoOptionError, NoSectionError
|
||||
|
||||
from .configparserinc import SafeConfigParserWithIncludes, logLevel
|
||||
from .configparserinc import sys, SafeConfigParserWithIncludes, logLevel
|
||||
from ..helpers import getLogger
|
||||
from ..server.action import CommandAction
|
||||
|
||||
# Gets the instance of the logger.
|
||||
logSys = getLogger(__name__)
|
||||
|
||||
|
||||
# if sys.version_info >= (3,5):
|
||||
# def _merge_dicts(x, y):
|
||||
# return {**x, **y}
|
||||
# else:
|
||||
def _merge_dicts(x, y):
|
||||
r = x
|
||||
if y:
|
||||
r = x.copy()
|
||||
r.update(y)
|
||||
return r
|
||||
|
||||
|
||||
class ConfigReader():
|
||||
"""Generic config reader class.
|
||||
|
||||
|
@ -127,9 +140,9 @@ class ConfigReader():
|
|||
return self._cfg.options(*args)
|
||||
return {}
|
||||
|
||||
def get(self, sec, opt):
|
||||
def get(self, sec, opt, raw=False, vars={}):
|
||||
if self._cfg is not None:
|
||||
return self._cfg.get(sec, opt)
|
||||
return self._cfg.get(sec, opt, raw=raw, vars=vars)
|
||||
return None
|
||||
|
||||
def getOptions(self, *args, **kwargs):
|
||||
|
@ -210,6 +223,8 @@ class ConfigReaderUnshared(SafeConfigParserWithIncludes):
|
|||
|
||||
def getOptions(self, sec, options, pOptions=None, shouldExist=False):
|
||||
values = dict()
|
||||
if pOptions is None:
|
||||
pOptions = {}
|
||||
for optname in options:
|
||||
if isinstance(options, (list,tuple)):
|
||||
if len(optname) > 2:
|
||||
|
@ -218,15 +233,15 @@ class ConfigReaderUnshared(SafeConfigParserWithIncludes):
|
|||
(opttype, optname), optvalue = optname, None
|
||||
else:
|
||||
opttype, optvalue = options[optname]
|
||||
if optname in pOptions:
|
||||
continue
|
||||
try:
|
||||
if opttype == "bool":
|
||||
v = self.getboolean(sec, optname)
|
||||
elif opttype == "int":
|
||||
v = self.getint(sec, optname)
|
||||
else:
|
||||
v = self.get(sec, optname)
|
||||
if not pOptions is None and optname in pOptions:
|
||||
continue
|
||||
v = self.get(sec, optname, vars=pOptions)
|
||||
values[optname] = v
|
||||
except NoSectionError as e:
|
||||
if shouldExist:
|
||||
|
@ -289,6 +304,12 @@ class DefinitionInitConfigReader(ConfigReader):
|
|||
return SafeConfigParserWithIncludes.read(self._cfg, self._file)
|
||||
|
||||
def getOptions(self, pOpts):
|
||||
# overwrite static definition options with init values, supplied as
|
||||
# direct parameters from jail-config via action[xtra1="...", xtra2=...]:
|
||||
if self._initOpts:
|
||||
if not pOpts:
|
||||
pOpts = dict()
|
||||
pOpts = _merge_dicts(pOpts, self._initOpts)
|
||||
self._opts = ConfigReader.getOptions(
|
||||
self, "Definition", self._configOpts, pOpts)
|
||||
|
||||
|
@ -299,6 +320,28 @@ class DefinitionInitConfigReader(ConfigReader):
|
|||
self._initOpts['known/'+opt] = v
|
||||
if not opt in self._initOpts:
|
||||
self._initOpts[opt] = v
|
||||
|
||||
def _convert_to_boolean(self, value):
|
||||
return value.lower() in ("1", "yes", "true", "on")
|
||||
|
||||
def getCombined(self, ignore=()):
|
||||
combinedopts = self._opts
|
||||
ignore = set(ignore).copy()
|
||||
if self._initOpts:
|
||||
combinedopts = _merge_dicts(self._opts, self._initOpts)
|
||||
if not len(combinedopts):
|
||||
return {}
|
||||
# ignore conditional options:
|
||||
for n in combinedopts:
|
||||
cond = SafeConfigParserWithIncludes.CONDITIONAL_RE.match(n)
|
||||
if cond:
|
||||
n, cond = cond.groups()
|
||||
ignore.add(n)
|
||||
# substiture options already specified direct:
|
||||
opts = CommandAction.substituteRecursiveTags(combinedopts, ignore=ignore)
|
||||
if not opts:
|
||||
raise ValueError('recursive tag definitions unable to be resolved')
|
||||
return opts
|
||||
|
||||
def convert(self):
|
||||
raise NotImplementedError
|
||||
|
|
|
@ -72,9 +72,9 @@ class Configurator:
|
|||
def getEarlyOptions(self):
|
||||
return self.__fail2ban.getEarlyOptions()
|
||||
|
||||
def getOptions(self, jail=None, updateMainOpt=None):
|
||||
def getOptions(self, jail=None, updateMainOpt=None, ignoreWrong=True):
|
||||
self.__fail2ban.getOptions(updateMainOpt)
|
||||
return self.__jails.getOptions(jail)
|
||||
return self.__jails.getOptions(jail, ignoreWrong=ignoreWrong)
|
||||
|
||||
def convertToProtocol(self):
|
||||
self.__streams["general"] = self.__fail2ban.convert()
|
||||
|
|
|
@ -125,7 +125,7 @@ class Fail2banClient(Fail2banCmdLine, Thread):
|
|||
if client:
|
||||
try :
|
||||
client.close()
|
||||
except Exception as e:
|
||||
except Exception as e: # pragma: no cover
|
||||
if showRet or self._conf["verbose"] > 1:
|
||||
logSys.debug(e)
|
||||
if showRet or c[0] == 'echo':
|
||||
|
|
|
@ -47,6 +47,7 @@ class Fail2banCmdLine():
|
|||
def __init__(self):
|
||||
self._argv = self._args = None
|
||||
self._configurator = None
|
||||
self.cleanConfOnly = False
|
||||
self.resetConf()
|
||||
|
||||
def resetConf(self):
|
||||
|
@ -101,6 +102,7 @@ class Fail2banCmdLine():
|
|||
output(" --logtarget <FILE>|STDOUT|STDERR|SYSLOG")
|
||||
output(" --syslogsocket auto|<FILE>")
|
||||
output(" -d dump configuration. For debugging")
|
||||
output(" -t, --test test configuration (can be also specified with start parameters)")
|
||||
output(" -i interactive mode")
|
||||
output(" -v increase verbosity")
|
||||
output(" -q decrease verbosity")
|
||||
|
@ -136,6 +138,9 @@ class Fail2banCmdLine():
|
|||
self._conf[ o[2:] ] = opt[1]
|
||||
elif o == "-d":
|
||||
self._conf["dump"] = True
|
||||
elif o == "-t" or o == "--test":
|
||||
self.cleanConfOnly = True
|
||||
self._conf["test"] = True
|
||||
elif o == "-v":
|
||||
self._conf["verbose"] += 1
|
||||
elif o == "-q":
|
||||
|
@ -173,8 +178,8 @@ class Fail2banCmdLine():
|
|||
|
||||
# Reads the command line options.
|
||||
try:
|
||||
cmdOpts = 'hc:s:p:xfbdviqV'
|
||||
cmdLongOpts = ['loglevel=', 'logtarget=', 'syslogsocket=', 'async', 'timeout=', 'help', 'version']
|
||||
cmdOpts = 'hc:s:p:xfbdtviqV'
|
||||
cmdLongOpts = ['loglevel=', 'logtarget=', 'syslogsocket=', 'test', 'async', 'timeout=', 'help', 'version']
|
||||
optList, self._args = getopt.getopt(self._argv[1:], cmdOpts, cmdLongOpts)
|
||||
except getopt.GetoptError:
|
||||
self.dispUsage()
|
||||
|
@ -225,13 +230,30 @@ class Fail2banCmdLine():
|
|||
logSys.info("Using pid file %s, [%s] logging to %s",
|
||||
self._conf["pidfile"], logging.getLevelName(llev), self._conf["logtarget"])
|
||||
|
||||
readcfg = True
|
||||
if self._conf.get("dump", False):
|
||||
ret, stream = self.readConfig()
|
||||
if readcfg:
|
||||
ret, stream = self.readConfig()
|
||||
readcfg = False
|
||||
self.dumpConfig(stream)
|
||||
return ret
|
||||
if not self._conf.get("test", False):
|
||||
return ret
|
||||
|
||||
if self._conf.get("test", False):
|
||||
if readcfg:
|
||||
readcfg = False
|
||||
ret, stream = self.readConfig()
|
||||
if not ret:
|
||||
raise ServerExecutionException("ERROR: test configuration failed")
|
||||
# exit after test if no commands specified (test only):
|
||||
if not len(self._args):
|
||||
output("OK: configuration test is successful")
|
||||
return ret
|
||||
|
||||
# Nothing to do here, process in client/server
|
||||
return None
|
||||
except ServerExecutionException:
|
||||
raise
|
||||
except Exception as e:
|
||||
output("ERROR: %s" % (e,))
|
||||
if verbose > 2:
|
||||
|
@ -246,7 +268,8 @@ class Fail2banCmdLine():
|
|||
try:
|
||||
self.configurator.Reload()
|
||||
self.configurator.readAll()
|
||||
ret = self.configurator.getOptions(jail, self._conf)
|
||||
ret = self.configurator.getOptions(jail, self._conf,
|
||||
ignoreWrong=not self.cleanConfOnly)
|
||||
self.configurator.convertToProtocol()
|
||||
stream = self.configurator.getConfigStream()
|
||||
except Exception as e:
|
||||
|
@ -274,6 +297,7 @@ class Fail2banCmdLine():
|
|||
def exit(code=0):
|
||||
logSys.debug("Exit with code %s", code)
|
||||
# because of possible buffered output in python, we should flush it before exit:
|
||||
logging.shutdown()
|
||||
sys.stdout.flush()
|
||||
sys.stderr.flush()
|
||||
# exit
|
||||
|
|
|
@ -41,12 +41,12 @@ from optparse import OptionParser, Option
|
|||
from ConfigParser import NoOptionError, NoSectionError, MissingSectionHeaderError
|
||||
|
||||
try: # pragma: no cover
|
||||
from systemd import journal
|
||||
from ..server.filtersystemd import FilterSystemd
|
||||
except ImportError:
|
||||
journal = None
|
||||
FilterSystemd = None
|
||||
|
||||
from ..version import version
|
||||
from .jailreader import JailReader
|
||||
from .filterreader import FilterReader
|
||||
from ..server.filter import Filter, FileContainer
|
||||
from ..server.failregex import RegexException
|
||||
|
@ -80,7 +80,7 @@ def pprint_list(l, header=None):
|
|||
s = ''
|
||||
output( s + "| " + "\n| ".join(l) + '\n`-' )
|
||||
|
||||
def journal_lines_gen(myjournal): # pragma: no cover
|
||||
def journal_lines_gen(flt, myjournal): # pragma: no cover
|
||||
while True:
|
||||
try:
|
||||
entry = myjournal.get_next()
|
||||
|
@ -88,7 +88,7 @@ def journal_lines_gen(myjournal): # pragma: no cover
|
|||
continue
|
||||
if not entry:
|
||||
break
|
||||
yield FilterSystemd.formatJournalEntry(entry)
|
||||
yield flt.formatJournalEntry(entry)
|
||||
|
||||
def get_opt_parser():
|
||||
# use module docstring for help output
|
||||
|
@ -122,15 +122,15 @@ Report bugs to https://github.com/fail2ban/fail2ban/issues
|
|||
p.add_options([
|
||||
Option("-d", "--datepattern",
|
||||
help="set custom pattern used to match date/times"),
|
||||
Option("-e", "--encoding",
|
||||
Option("-e", "--encoding", default=PREFER_ENC,
|
||||
help="File encoding. Default: system locale"),
|
||||
Option("-r", "--raw", action='store_true',
|
||||
Option("-r", "--raw", action='store_true', default=False,
|
||||
help="Raw hosts, don't resolve dns"),
|
||||
Option("--usedns", action='store', default=None,
|
||||
help="DNS specified replacement of tags <HOST> in regexp "
|
||||
"('yes' - matches all form of hosts, 'no' - IP addresses only)"),
|
||||
Option("-L", "--maxlines", type=int, default=0,
|
||||
help="maxlines for multi-line regex"),
|
||||
help="maxlines for multi-line regex."),
|
||||
Option("-m", "--journalmatch",
|
||||
help="journalctl style matches overriding filter file. "
|
||||
"\"systemd-journal\" only"),
|
||||
|
@ -143,6 +143,8 @@ Report bugs to https://github.com/fail2ban/fail2ban/issues
|
|||
help="Increase verbosity"),
|
||||
Option("--verbosity", action="store", dest="verbose", type=int,
|
||||
help="Set numerical level of verbosity (0..4)"),
|
||||
Option("--verbose-date", "--VD", action='store_true',
|
||||
help="Verbose date patterns/regex in output"),
|
||||
Option("-D", "--debuggex", action='store_true',
|
||||
help="Produce debuggex.com urls for debugging there"),
|
||||
Option("--print-no-missed", action='store_true',
|
||||
|
@ -215,14 +217,8 @@ class LineStats(object):
|
|||
class Fail2banRegex(object):
|
||||
|
||||
def __init__(self, opts):
|
||||
self._verbose = opts.verbose
|
||||
self._debuggex = opts.debuggex
|
||||
self._maxlines = 20
|
||||
self._print_no_missed = opts.print_no_missed
|
||||
self._print_no_ignored = opts.print_no_ignored
|
||||
self._print_all_matched = opts.print_all_matched
|
||||
self._print_all_missed = opts.print_all_missed
|
||||
self._print_all_ignored = opts.print_all_ignored
|
||||
# set local protected memebers from given options:
|
||||
self.__dict__.update(dict(('_'+o,v) for o,v in opts.__dict__.iteritems()))
|
||||
self._maxlines_set = False # so we allow to override maxlines in cmdline
|
||||
self._datepattern_set = False
|
||||
self._journalmatch = None
|
||||
|
@ -236,23 +232,23 @@ class Fail2banRegex(object):
|
|||
|
||||
if opts.maxlines:
|
||||
self.setMaxLines(opts.maxlines)
|
||||
else:
|
||||
self._maxlines = 20
|
||||
if opts.journalmatch is not None:
|
||||
self.setJournalMatch(opts.journalmatch.split())
|
||||
if opts.datepattern:
|
||||
self.setDatePattern(opts.datepattern)
|
||||
if opts.encoding:
|
||||
self.encoding = opts.encoding
|
||||
else:
|
||||
self.encoding = PREFER_ENC
|
||||
self.raw = True if opts.raw else False
|
||||
if opts.usedns:
|
||||
self._filter.setUseDns(opts.usedns)
|
||||
self._filter.returnRawHost = opts.raw
|
||||
self._filter.checkFindTime = False
|
||||
self._filter.checkAllRegex = True
|
||||
|
||||
def decode_line(self, line):
|
||||
return FileContainer.decode_line('<LOG>', self.encoding, line)
|
||||
return FileContainer.decode_line('<LOG>', self._encoding, line)
|
||||
|
||||
def encode_line(self, line):
|
||||
return line.encode(self.encoding, 'ignore')
|
||||
return line.encode(self._encoding, 'ignore')
|
||||
|
||||
def setDatePattern(self, pattern):
|
||||
if not self._datepattern_set:
|
||||
|
@ -350,7 +346,8 @@ class Fail2banRegex(object):
|
|||
orgLineBuffer = self._filter._Filter__lineBuffer
|
||||
fullBuffer = len(orgLineBuffer) >= self._filter.getMaxLines()
|
||||
try:
|
||||
line, ret = self._filter.processLine(line, date, checkAllRegex=True, returnRawHost=self.raw)
|
||||
ret = self._filter.processLine(line, date)
|
||||
line = self._filter.processedLine()
|
||||
for match in ret:
|
||||
# Append True/False flag depending if line was matched by
|
||||
# more than one regex
|
||||
|
@ -479,8 +476,12 @@ class Fail2banRegex(object):
|
|||
out = []
|
||||
for template in self._filter.dateDetector.templates:
|
||||
if self._verbose or template.hits:
|
||||
out.append("[%d] %s" % (
|
||||
template.hits, template.name))
|
||||
out.append("[%d] %s" % (template.hits, template.name))
|
||||
if self._verbose_date:
|
||||
out.append(" # weight: %.3f (%.3f), pattern: %s" % (
|
||||
template.weight, template.template.weight,
|
||||
getattr(template, 'pattern', ''),))
|
||||
out.append(" # regex: %s" % (getattr(template, 'regex', ''),))
|
||||
pprint_list(out, "[# of hits] date format")
|
||||
|
||||
output( "\nLines: %s" % self._line_stats, )
|
||||
|
@ -518,30 +519,27 @@ class Fail2banRegex(object):
|
|||
try:
|
||||
hdlr = open(cmd_log, 'rb')
|
||||
output( "Use log file : %s" % cmd_log )
|
||||
output( "Use encoding : %s" % self.encoding )
|
||||
output( "Use encoding : %s" % self._encoding )
|
||||
test_lines = self.file_lines_gen(hdlr)
|
||||
except IOError as e:
|
||||
output( e )
|
||||
return False
|
||||
elif cmd_log == "systemd-journal": # pragma: no cover
|
||||
if not journal:
|
||||
elif cmd_log.startswith("systemd-journal"): # pragma: no cover
|
||||
if not FilterSystemd:
|
||||
output( "Error: systemd library not found. Exiting..." )
|
||||
return False
|
||||
myjournal = journal.Reader(converters={'__CURSOR': lambda x: x})
|
||||
output( "Use systemd journal" )
|
||||
output( "Use encoding : %s" % self._encoding )
|
||||
backend, beArgs = JailReader.extractOptions(cmd_log)
|
||||
flt = FilterSystemd(None, **beArgs)
|
||||
flt.setLogEncoding(self._encoding)
|
||||
myjournal = flt.getJournalReader()
|
||||
journalmatch = self._journalmatch
|
||||
self.setDatePattern(None)
|
||||
if journalmatch:
|
||||
try:
|
||||
for element in journalmatch:
|
||||
if element == "+":
|
||||
myjournal.add_disjunction()
|
||||
else:
|
||||
myjournal.add_match(element)
|
||||
except ValueError:
|
||||
output( "Error: Invalid journalmatch: %s" % shortstr(" ".join(journalmatch)) )
|
||||
return False
|
||||
flt.addJournalMatch(journalmatch)
|
||||
output( "Use journal match : %s" % " ".join(journalmatch) )
|
||||
test_lines = journal_lines_gen(myjournal)
|
||||
test_lines = journal_lines_gen(flt, myjournal)
|
||||
else:
|
||||
output( "Use single line : %s" % shortstr(cmd_log) )
|
||||
test_lines = [ cmd_log ]
|
||||
|
|
|
@ -144,27 +144,27 @@ class Fail2banServer(Fail2banCmdLine):
|
|||
return cli
|
||||
|
||||
def start(self, argv):
|
||||
# Command line options
|
||||
ret = self.initCmdLine(argv)
|
||||
if ret is not None:
|
||||
return ret
|
||||
|
||||
# Commands
|
||||
args = self._args
|
||||
|
||||
cli = None
|
||||
# Just start:
|
||||
if len(args) == 1 and args[0] == 'start' and not self._conf.get("interactive", False):
|
||||
pass
|
||||
else:
|
||||
# If client mode - whole processing over client:
|
||||
if len(args) or self._conf.get("interactive", False):
|
||||
cli = self._Fail2banClient()
|
||||
return cli.start(argv)
|
||||
|
||||
# Start the server:
|
||||
server = None
|
||||
try:
|
||||
# Command line options
|
||||
ret = self.initCmdLine(argv)
|
||||
if ret is not None:
|
||||
return ret
|
||||
|
||||
# Commands
|
||||
args = self._args
|
||||
|
||||
cli = None
|
||||
# Just start:
|
||||
if len(args) == 1 and args[0] == 'start' and not self._conf.get("interactive", False):
|
||||
pass
|
||||
else:
|
||||
# If client mode - whole processing over client:
|
||||
if len(args) or self._conf.get("interactive", False):
|
||||
cli = self._Fail2banClient()
|
||||
return cli.start(argv)
|
||||
|
||||
# Start the server:
|
||||
from ..server.utils import Utils
|
||||
# background = True, if should be new process running in background, otherwise start in foreground
|
||||
# process will be forked in daemonize, inside of Server module.
|
||||
|
|
|
@ -28,7 +28,6 @@ import os
|
|||
import shlex
|
||||
|
||||
from .configreader import DefinitionInitConfigReader
|
||||
from ..server.action import CommandAction
|
||||
from ..helpers import getLogger
|
||||
|
||||
# Gets the instance of the logger.
|
||||
|
@ -40,6 +39,9 @@ class FilterReader(DefinitionInitConfigReader):
|
|||
_configOpts = {
|
||||
"ignoreregex": ["string", None],
|
||||
"failregex": ["string", ""],
|
||||
"maxlines": ["int", None],
|
||||
"datepattern": ["string", None],
|
||||
"journalmatch": ["string", None],
|
||||
}
|
||||
|
||||
def setFile(self, fileName):
|
||||
|
@ -49,15 +51,6 @@ class FilterReader(DefinitionInitConfigReader):
|
|||
def getFile(self):
|
||||
return self.__file
|
||||
|
||||
def getCombined(self):
|
||||
combinedopts = dict(list(self._opts.items()) + list(self._initOpts.items()))
|
||||
if not len(combinedopts):
|
||||
return {}
|
||||
opts = CommandAction.substituteRecursiveTags(combinedopts)
|
||||
if not opts:
|
||||
raise ValueError('recursive tag definitions unable to be resolved')
|
||||
return opts
|
||||
|
||||
def convert(self):
|
||||
stream = list()
|
||||
opts = self.getCombined()
|
||||
|
@ -65,6 +58,7 @@ class FilterReader(DefinitionInitConfigReader):
|
|||
return stream
|
||||
for opt, value in opts.iteritems():
|
||||
if opt in ("failregex", "ignoreregex"):
|
||||
if value is None: continue
|
||||
multi = []
|
||||
for regex in value.split('\n'):
|
||||
# Do not send a command if the rule is empty.
|
||||
|
@ -74,16 +68,17 @@ class FilterReader(DefinitionInitConfigReader):
|
|||
stream.append(["multi-set", self._jailName, "add" + opt, multi])
|
||||
elif len(multi):
|
||||
stream.append(["set", self._jailName, "add" + opt, multi[0]])
|
||||
if self._initOpts:
|
||||
if 'maxlines' in self._initOpts:
|
||||
elif opt == 'maxlines':
|
||||
# We warn when multiline regex is used without maxlines > 1
|
||||
# therefore keep sure we set this option first.
|
||||
stream.insert(0, ["set", self._jailName, "maxlines", self._initOpts["maxlines"]])
|
||||
if 'datepattern' in self._initOpts:
|
||||
stream.append(["set", self._jailName, "datepattern", self._initOpts["datepattern"]])
|
||||
stream.insert(0, ["set", self._jailName, "maxlines", value])
|
||||
elif opt == 'datepattern':
|
||||
stream.append(["set", self._jailName, "datepattern", value])
|
||||
# Do not send a command if the match is empty.
|
||||
if self._initOpts.get("journalmatch", '') != '':
|
||||
for match in self._initOpts["journalmatch"].split("\n"):
|
||||
elif opt == 'journalmatch':
|
||||
if value is None: continue
|
||||
for match in value.split("\n"):
|
||||
if match == '': continue
|
||||
stream.append(
|
||||
["set", self._jailName, "addjournalmatch"] +
|
||||
shlex.split(match))
|
||||
|
|
|
@ -43,13 +43,13 @@ logSys = getLogger(__name__)
|
|||
class JailReader(ConfigReader):
|
||||
|
||||
# regex, to extract list of options:
|
||||
optionCRE = re.compile("^((?:\w|-|_|\.)+)(?:\[(.*)\])?$")
|
||||
optionCRE = re.compile(r"^([\w\-_\.]+)(?:\[(.*)\])?\s*$", re.DOTALL)
|
||||
# regex, to iterate over single option in option list, syntax:
|
||||
# `action = act[p1="...", p2='...', p3=...]`, where the p3=... not contains `,` or ']'
|
||||
# since v0.10 separator extended with `]\s*[` for support of multiple option groups, syntax
|
||||
# `action = act[p1=...][p2=...]`
|
||||
optionExtractRE = re.compile(
|
||||
r'([\w\-_\.]+)=(?:"([^"]*)"|\'([^\']*)\'|([^,\]]*))(?:,|\]\s*\[|$)')
|
||||
r'([\w\-_\.]+)=(?:"([^"]*)"|\'([^\']*)\'|([^,\]]*))(?:,|\]\s*\[|$)', re.DOTALL)
|
||||
|
||||
def __init__(self, name, force_enable=False, **kwargs):
|
||||
ConfigReader.__init__(self, **kwargs)
|
||||
|
@ -119,39 +119,46 @@ class JailReader(ConfigReader):
|
|||
["string", "ignorecommand", None],
|
||||
["string", "ignoreip", None],
|
||||
["string", "filter", ""],
|
||||
["string", "datepattern", None],
|
||||
["string", "action", ""]]
|
||||
|
||||
# Before interpolation (substitution) add static options always available as default:
|
||||
defsec = self._cfg.get_defaults()
|
||||
defsec["fail2ban_version"] = version
|
||||
|
||||
# Read first options only needed for merge defaults ('known/...' from filter):
|
||||
self.__opts = ConfigReader.getOptions(self, self.__name, opts1st, shouldExist=True)
|
||||
if not self.__opts:
|
||||
return False
|
||||
try:
|
||||
|
||||
# Read first options only needed for merge defaults ('known/...' from filter):
|
||||
self.__opts = ConfigReader.getOptions(self, self.__name, opts1st, shouldExist=True)
|
||||
if not self.__opts: # pragma: no cover
|
||||
raise JailDefError("Init jail options failed")
|
||||
|
||||
if self.isEnabled():
|
||||
if not self.isEnabled():
|
||||
return True
|
||||
|
||||
# Read filter
|
||||
if self.__opts["filter"]:
|
||||
filterName, filterOpt = JailReader.extractOptions(
|
||||
self.__opts["filter"])
|
||||
flt = self.__opts["filter"]
|
||||
if flt:
|
||||
filterName, filterOpt = JailReader.extractOptions(flt)
|
||||
if not filterName:
|
||||
raise JailDefError("Invalid filter definition %r" % flt)
|
||||
self.__filter = FilterReader(
|
||||
filterName, self.__name, filterOpt, share_config=self.share_config, basedir=self.getBaseDir())
|
||||
filterName, self.__name, filterOpt,
|
||||
share_config=self.share_config, basedir=self.getBaseDir())
|
||||
ret = self.__filter.read()
|
||||
# merge options from filter as 'known/...':
|
||||
self.__filter.getOptions(self.__opts)
|
||||
ConfigReader.merge_section(self, self.__name, self.__filter.getCombined(), 'known/')
|
||||
if not ret:
|
||||
logSys.error("Unable to read the filter")
|
||||
return False
|
||||
raise JailDefError("Unable to read the filter %r" % filterName)
|
||||
else:
|
||||
self.__filter = None
|
||||
logSys.warning("No filter set for jail %s" % self.__name)
|
||||
|
||||
# Read second all options (so variables like %(known/param) can be interpolated):
|
||||
self.__opts = ConfigReader.getOptions(self, self.__name, opts)
|
||||
if not self.__opts:
|
||||
return False
|
||||
if not self.__opts: # pragma: no cover
|
||||
raise JailDefError("Read jail options failed")
|
||||
|
||||
# cumulate filter options again (ignore given in jail):
|
||||
if self.__filter:
|
||||
|
@ -163,6 +170,8 @@ class JailReader(ConfigReader):
|
|||
if not act: # skip empty actions
|
||||
continue
|
||||
actName, actOpt = JailReader.extractOptions(act)
|
||||
if not actName:
|
||||
raise JailDefError("Invalid action definition %r" % act)
|
||||
if actName.endswith(".py"):
|
||||
self.__actions.append([
|
||||
"set",
|
||||
|
@ -182,13 +191,22 @@ class JailReader(ConfigReader):
|
|||
action.getOptions(self.__opts)
|
||||
self.__actions.append(action)
|
||||
else:
|
||||
raise AttributeError("Unable to read action")
|
||||
raise JailDefError("Unable to read action %r" % actName)
|
||||
except JailDefError:
|
||||
raise
|
||||
except Exception as e:
|
||||
logSys.error("Error in action definition " + act)
|
||||
logSys.debug("Caught exception: %s" % (e,))
|
||||
return False
|
||||
logSys.debug("Caught exception: %s", e, exc_info=True)
|
||||
raise ValueError("Error in action definition %r: %r" % (act, e))
|
||||
if not len(self.__actions):
|
||||
logSys.warning("No actions were defined for %s" % self.__name)
|
||||
|
||||
except JailDefError as e:
|
||||
e = str(e)
|
||||
logSys.error(e)
|
||||
if not self.__opts:
|
||||
self.__opts = dict()
|
||||
self.__opts['config-error'] = e
|
||||
return False
|
||||
return True
|
||||
|
||||
def convert(self, allow_no_files=False):
|
||||
|
@ -202,6 +220,12 @@ class JailReader(ConfigReader):
|
|||
"""
|
||||
|
||||
stream = []
|
||||
e = self.__opts.get('config-error')
|
||||
if e:
|
||||
stream.extend([['config-error', "Jail '%s' skipped, because of wrong configuration: %s" % (self.__name, e)]])
|
||||
return stream
|
||||
if self.__filter:
|
||||
stream.extend(self.__filter.convert())
|
||||
for opt, value in self.__opts.iteritems():
|
||||
if opt == "logpath" and \
|
||||
not self.__opts.get('backend', None).startswith("systemd"):
|
||||
|
@ -223,19 +247,9 @@ class JailReader(ConfigReader):
|
|||
stream.append(["set", self.__name, "logencoding", value])
|
||||
elif opt == "backend":
|
||||
backend = value
|
||||
elif opt == "maxretry":
|
||||
stream.append(["set", self.__name, "maxretry", value])
|
||||
elif opt == "ignoreip":
|
||||
for ip in splitwords(value):
|
||||
stream.append(["set", self.__name, "addignoreip", ip])
|
||||
elif opt == "findtime":
|
||||
stream.append(["set", self.__name, "findtime", value])
|
||||
elif opt == "bantime":
|
||||
stream.append(["set", self.__name, "bantime", value])
|
||||
elif opt.startswith("bantime."):
|
||||
stream.append(["set", self.__name, opt, self.__opts[opt]])
|
||||
elif opt == "usedns":
|
||||
stream.append(["set", self.__name, "usedns", value])
|
||||
elif opt in ("failregex", "ignoreregex"):
|
||||
multi = []
|
||||
for regex in value.split('\n'):
|
||||
|
@ -246,10 +260,8 @@ class JailReader(ConfigReader):
|
|||
stream.append(["multi-set", self.__name, "add" + opt, multi])
|
||||
elif len(multi):
|
||||
stream.append(["set", self.__name, "add" + opt, multi[0]])
|
||||
elif opt == "ignorecommand":
|
||||
stream.append(["set", self.__name, "ignorecommand", value])
|
||||
if self.__filter:
|
||||
stream.extend(self.__filter.convert())
|
||||
elif opt not in ('action', 'filter', 'enabled'):
|
||||
stream.append(["set", self.__name, opt, value])
|
||||
for action in self.__actions:
|
||||
if isinstance(action, (ConfigReaderUnshared, ConfigReader)):
|
||||
stream.extend(action.convert())
|
||||
|
@ -273,3 +285,7 @@ class JailReader(ConfigReader):
|
|||
val for val in optmatch.group(2,3,4) if val is not None][0]
|
||||
option_opts[opt.strip()] = value.strip()
|
||||
return option_name, option_opts
|
||||
|
||||
|
||||
class JailDefError(Exception):
|
||||
pass
|
||||
|
|
|
@ -54,7 +54,7 @@ class JailsReader(ConfigReader):
|
|||
self.__jails = list()
|
||||
return ConfigReader.read(self, "jail")
|
||||
|
||||
def getOptions(self, section=None):
|
||||
def getOptions(self, section=None, ignoreWrong=True):
|
||||
"""Reads configuration for jail(s) and adds enabled jails to __jails
|
||||
"""
|
||||
opts = []
|
||||
|
@ -66,7 +66,7 @@ class JailsReader(ConfigReader):
|
|||
sections = [ section ]
|
||||
|
||||
# Get the options of all jails.
|
||||
parse_status = True
|
||||
parse_status = 0
|
||||
for sec in sections:
|
||||
if sec == 'INCLUDES':
|
||||
continue
|
||||
|
@ -77,12 +77,16 @@ class JailsReader(ConfigReader):
|
|||
ret = jail.getOptions()
|
||||
if ret:
|
||||
if jail.isEnabled():
|
||||
# at least one jail was successful:
|
||||
parse_status |= 1
|
||||
# We only add enabled jails
|
||||
self.__jails.append(jail)
|
||||
else:
|
||||
logSys.error("Errors in jail %r. Skipping..." % sec)
|
||||
parse_status = False
|
||||
return parse_status
|
||||
logSys.error("Errors in jail %r.%s", sec, " Skipping..." if ignoreWrong else "")
|
||||
self.__jails.append(jail)
|
||||
# at least one jail was invalid:
|
||||
parse_status |= 2
|
||||
return ((ignoreWrong and parse_status & 1) or not (parse_status & 2))
|
||||
|
||||
def convert(self, allow_no_files=False):
|
||||
"""Convert read before __opts and jails to the commands stream
|
||||
|
@ -95,15 +99,13 @@ class JailsReader(ConfigReader):
|
|||
"""
|
||||
|
||||
stream = list()
|
||||
for opt in self.__opts:
|
||||
if opt == "":
|
||||
stream.append([])
|
||||
# Convert jails
|
||||
for jail in self.__jails:
|
||||
stream.extend(jail.convert(allow_no_files=allow_no_files))
|
||||
# Start jails
|
||||
for jail in self.__jails:
|
||||
stream.append(["start", jail.getName()])
|
||||
if not jail.options.get('config-error'):
|
||||
stream.append(["start", jail.getName()])
|
||||
|
||||
return stream
|
||||
|
||||
|
|
|
@ -219,9 +219,9 @@ class CommandAction(ActionBase):
|
|||
self.timeout = 60
|
||||
## Command executed in order to initialize the system.
|
||||
self.actionstart = ''
|
||||
## Command executed when an IP address gets banned.
|
||||
## Command executed when ticket gets banned.
|
||||
self.actionban = ''
|
||||
## Command executed when an IP address gets removed.
|
||||
## Command executed when ticket gets removed.
|
||||
self.actionunban = ''
|
||||
## Command executed in order to check requirements.
|
||||
self.actioncheck = ''
|
||||
|
@ -365,7 +365,7 @@ class CommandAction(ActionBase):
|
|||
return self._executeOperation('<actionreload>', 'reloading')
|
||||
|
||||
@classmethod
|
||||
def substituteRecursiveTags(cls, inptags, conditional=''):
|
||||
def substituteRecursiveTags(cls, inptags, conditional='', ignore=()):
|
||||
"""Sort out tag definitions within other tags.
|
||||
Since v.0.9.2 supports embedded interpolation (see test cases for examples).
|
||||
|
||||
|
@ -387,21 +387,26 @@ class CommandAction(ActionBase):
|
|||
# copy return tags dict to prevent modifying of inptags:
|
||||
tags = inptags.copy()
|
||||
t = TAG_CRE
|
||||
ignore = set(ignore)
|
||||
done = cls._escapedTags.copy() | ignore
|
||||
# repeat substitution while embedded-recursive (repFlag is True)
|
||||
done = cls._escapedTags.copy()
|
||||
while True:
|
||||
repFlag = False
|
||||
# substitute each value:
|
||||
for tag in tags.iterkeys():
|
||||
# ignore escaped or already done:
|
||||
# ignore escaped or already done (or in ignore list):
|
||||
if tag in done: continue
|
||||
value = str(tags[tag])
|
||||
value = orgval = str(tags[tag])
|
||||
# search and replace all tags within value, that can be interpolated using other tags:
|
||||
m = t.search(value)
|
||||
refCounts = {}
|
||||
#logSys.log(5, 'TAG: %s, value: %s' % (tag, value))
|
||||
while m:
|
||||
found_tag = m.group(1)
|
||||
# don't replace tags that should be currently ignored (pre-replacement):
|
||||
if found_tag in ignore:
|
||||
m = t.search(value, m.end())
|
||||
continue
|
||||
#logSys.log(5, 'found: %s' % found_tag)
|
||||
if found_tag == tag or refCounts.get(found_tag, 1) > MAX_TAG_REPLACE_COUNT:
|
||||
# recursive definitions are bad
|
||||
|
@ -429,7 +434,7 @@ class CommandAction(ActionBase):
|
|||
m = t.search(value, m.start())
|
||||
#logSys.log(5, 'TAG: %s, newvalue: %s' % (tag, value))
|
||||
# was substituted?
|
||||
if tags[tag] != value:
|
||||
if orgval != value:
|
||||
# check still contains any tag - should be repeated (possible embedded-recursive substitution):
|
||||
if t.search(value):
|
||||
repFlag = True
|
||||
|
@ -584,7 +589,7 @@ class CommandAction(ActionBase):
|
|||
return self.executeCmd(realCmd, self.timeout)
|
||||
|
||||
@staticmethod
|
||||
def executeCmd(realCmd, timeout=60):
|
||||
def executeCmd(realCmd, timeout=60, **kwargs):
|
||||
"""Executes a command.
|
||||
|
||||
Parameters
|
||||
|
@ -612,4 +617,4 @@ class CommandAction(ActionBase):
|
|||
return True
|
||||
|
||||
with _cmd_lock:
|
||||
return Utils.executeCmd(realCmd, timeout, shell=True, output=False)
|
||||
return Utils.executeCmd(realCmd, timeout, shell=True, output=False, **kwargs)
|
||||
|
|
|
@ -194,7 +194,7 @@ class Actions(JailThread, Mapping):
|
|||
def setBanTime(self, value):
|
||||
value = MyTime.str2seconds(value)
|
||||
self.__banManager.setBanTime(value)
|
||||
logSys.info("Set banTime = %s" % value)
|
||||
logSys.info(" banTime: %s" % value)
|
||||
|
||||
##
|
||||
# Get the ban time.
|
||||
|
@ -357,6 +357,8 @@ class Actions(JailThread, Mapping):
|
|||
aInfo["failures"] = bTicket.getAttempt()
|
||||
aInfo["time"] = bTicket.getTime()
|
||||
aInfo["matches"] = "\n".join(bTicket.getMatches())
|
||||
# to bypass actions, that should not be executed for restored tickets
|
||||
aInfo["restored"] = 1 if ticket.restored else 0
|
||||
# retarded merge info via twice lambdas : once for merge, once for matches/failures:
|
||||
if self._jail.database is not None:
|
||||
mi4ip = lambda overalljails=False, self=self, \
|
||||
|
@ -375,6 +377,8 @@ class Actions(JailThread, Mapping):
|
|||
# do actions :
|
||||
for name, action in self._actions.iteritems():
|
||||
try:
|
||||
if ticket.restored and getattr(action, 'norestored', False):
|
||||
continue
|
||||
action.ban(aInfo.copy())
|
||||
except Exception as e:
|
||||
logSys.error(
|
||||
|
@ -466,10 +470,14 @@ class Actions(JailThread, Mapping):
|
|||
aInfo["failures"] = ticket.getAttempt()
|
||||
aInfo["time"] = ticket.getTime()
|
||||
aInfo["matches"] = "".join(ticket.getMatches())
|
||||
# to bypass actions, that should not be executed for restored tickets
|
||||
aInfo["restored"] = 1 if ticket.restored else 0
|
||||
if actions is None:
|
||||
logSys.notice("[%s] Unban %s", self._jail.name, aInfo["ip"])
|
||||
for name, action in unbactions.iteritems():
|
||||
try:
|
||||
if ticket.restored and getattr(action, 'norestored', False):
|
||||
continue
|
||||
logSys.debug("[%s] action %r: unban %s", self._jail.name, name, aInfo["ip"])
|
||||
action.unban(aInfo.copy())
|
||||
except Exception as e:
|
||||
|
|
|
@ -241,7 +241,7 @@ class AsyncServer(asyncore.dispatcher):
|
|||
def _remove_sock(self):
|
||||
try:
|
||||
os.remove(self.__sock)
|
||||
except OSError as e:
|
||||
except OSError as e: # pragma: no cover
|
||||
if e.errno != errno.ENOENT:
|
||||
raise
|
||||
|
||||
|
|
|
@ -21,11 +21,13 @@ __author__ = "Cyril Jaquier and Fail2Ban Contributors"
|
|||
__copyright__ = "Copyright (c) 2004 Cyril Jaquier"
|
||||
__license__ = "GPL"
|
||||
|
||||
import copy
|
||||
import time
|
||||
|
||||
from threading import Lock
|
||||
|
||||
from .datetemplate import DatePatternRegex, DateTai64n, DateEpoch
|
||||
from .datetemplate import re, DateTemplate, DatePatternRegex, DateTai64n, DateEpoch
|
||||
from .utils import Utils
|
||||
from ..helpers import getLogger
|
||||
|
||||
# Gets the instance of the logger.
|
||||
|
@ -33,8 +35,52 @@ logSys = getLogger(__name__)
|
|||
|
||||
logLevel = 6
|
||||
|
||||
RE_DATE_PREMATCH = re.compile("\{DATE\}", re.IGNORECASE)
|
||||
DD_patternCache = Utils.Cache(maxCount=1000, maxTime=60*60)
|
||||
|
||||
|
||||
def _getPatternTemplate(pattern, key=None):
|
||||
if key is None:
|
||||
key = pattern
|
||||
if '%' not in pattern:
|
||||
key = pattern.upper()
|
||||
template = DD_patternCache.get(key)
|
||||
|
||||
if not template:
|
||||
if key in ("EPOCH", "{^LN-BEG}EPOCH", "^EPOCH"):
|
||||
template = DateEpoch(lineBeginOnly=(key != "EPOCH"))
|
||||
elif key in ("TAI64N", "{^LN-BEG}TAI64N", "^TAI64N"):
|
||||
template = DateTai64n(wordBegin=('start' if key != "TAI64N" else False))
|
||||
else:
|
||||
template = DatePatternRegex(pattern)
|
||||
|
||||
DD_patternCache.set(key, template)
|
||||
return template
|
||||
|
||||
def _getAnchoredTemplate(template, wrap=lambda s: '{^LN-BEG}' + s):
|
||||
# wrap name:
|
||||
name = wrap(template.name)
|
||||
# try to find in cache (by name):
|
||||
template2 = DD_patternCache.get(name)
|
||||
if not template2:
|
||||
# wrap pattern (or regexp if not pattern template):
|
||||
regex = wrap(getattr(template, 'pattern', template.regex))
|
||||
if hasattr(template, 'pattern'):
|
||||
# try to find in cache (by pattern):
|
||||
template2 = DD_patternCache.get(regex)
|
||||
# make duplicate and set new anchored regex:
|
||||
if not template2:
|
||||
if not hasattr(template, 'pattern'):
|
||||
template2 = _getPatternTemplate(name)
|
||||
else:
|
||||
template2 = _getPatternTemplate(regex)
|
||||
return template2
|
||||
|
||||
|
||||
|
||||
class DateDetectorCache(object):
|
||||
"""Implements the caching of the default templates list.
|
||||
"""
|
||||
def __init__(self):
|
||||
self.__lock = Lock()
|
||||
self.__templates = list()
|
||||
|
@ -43,71 +89,115 @@ class DateDetectorCache(object):
|
|||
def templates(self):
|
||||
"""List of template instances managed by the detector.
|
||||
"""
|
||||
if self.__templates:
|
||||
return self.__templates
|
||||
with self.__lock:
|
||||
if self.__templates:
|
||||
if self.__templates: # pragma: no cover - race-condition + multi-threaded environment only
|
||||
return self.__templates
|
||||
self._addDefaultTemplate()
|
||||
return self.__templates
|
||||
|
||||
def _cacheTemplate(self, template):
|
||||
"""Cache Fail2Ban's default template.
|
||||
|
||||
"""
|
||||
if isinstance(template, str):
|
||||
template = DatePatternRegex(template)
|
||||
self.__templates.append(template)
|
||||
# exact given template with word begin-end boundary:
|
||||
template = _getPatternTemplate(template)
|
||||
# if not already line-begin anchored, additional template, that prefers datetime
|
||||
# at start of a line (safety+performance feature):
|
||||
name = template.name
|
||||
if not name.startswith('{^LN-BEG}') and not name.startswith('^') and hasattr(template, 'regex'):
|
||||
template2 = _getAnchoredTemplate(template)
|
||||
# prevent to add duplicates:
|
||||
if template2.name != name:
|
||||
# increase weight of such templates, because they should be always
|
||||
# preferred in template sorting process (bubble up):
|
||||
template2.weight = 100.0
|
||||
self.__tmpcache[0].append(template2)
|
||||
# add template:
|
||||
self.__tmpcache[1].append(template)
|
||||
|
||||
def _addDefaultTemplate(self):
|
||||
"""Add resp. cache Fail2Ban's default set of date templates.
|
||||
"""
|
||||
self.__tmpcache = [], []
|
||||
# ISO 8601, simple date, optional subsecond and timezone:
|
||||
# 2005-01-23T21:59:59.981746, 2005-01-23 21:59:59
|
||||
# simple date: 2005/01/23 21:59:59
|
||||
# custom for syslog-ng 2006.12.21 06:43:20
|
||||
self._cacheTemplate("%ExY(?P<_sep>[-/.])%m(?P=_sep)%d[T ]%H:%M:%S(?:[.,]%f)?(?:\s*%z)?")
|
||||
# asctime with optional day, subsecond and/or year:
|
||||
# Sun Jan 23 21:59:59.011 2005
|
||||
self._cacheTemplate("(?:%a )?%b %d %H:%M:%S(?:\.%f)?(?: %Y)?")
|
||||
self._cacheTemplate("(?:%a )?%b %d %H:%M:%S(?:\.%f)?(?: %ExY)?")
|
||||
# asctime with optional day, subsecond and/or year coming after day
|
||||
# http://bugs.debian.org/798923
|
||||
# Sun Jan 23 2005 21:59:59.011
|
||||
self._cacheTemplate("(?:%a )?%b %d %Y %H:%M:%S(?:\.%f)?")
|
||||
# simple date, optional subsecond (proftpd):
|
||||
# 2005-01-23 21:59:59
|
||||
# simple date: 2005/01/23 21:59:59
|
||||
# custom for syslog-ng 2006.12.21 06:43:20
|
||||
self._cacheTemplate("%Y(?P<_sep>[-/.])%m(?P=_sep)%d %H:%M:%S(?:,%f)?")
|
||||
self._cacheTemplate("(?:%a )?%b %d %ExY %H:%M:%S(?:\.%f)?")
|
||||
# simple date too (from x11vnc): 23/01/2005 21:59:59
|
||||
# and with optional year given by 2 digits: 23/01/05 21:59:59
|
||||
# (See http://bugs.debian.org/537610)
|
||||
# 17-07-2008 17:23:25
|
||||
self._cacheTemplate("%d(?P<_sep>[-/])%m(?P=_sep)(?:%Y|%y) %H:%M:%S")
|
||||
self._cacheTemplate("%d(?P<_sep>[-/])%m(?P=_sep)(?:%ExY|%Exy) %H:%M:%S")
|
||||
# Apache format optional time zone:
|
||||
# [31/Oct/2006:09:22:55 -0000]
|
||||
# 26-Jul-2007 15:20:52
|
||||
self._cacheTemplate("%d(?P<_sep>[-/])%b(?P=_sep)%Y[ :]?%H:%M:%S(?:\.%f)?(?: %z)?")
|
||||
# CPanel 05/20/2008:01:57:39
|
||||
self._cacheTemplate("%m/%d/%Y:%H:%M:%S")
|
||||
# named 26-Jul-2007 15:20:52.252
|
||||
# named 26-Jul-2007 15:20:52.252
|
||||
# roundcube 26-Jul-2007 15:20:52 +0200
|
||||
self._cacheTemplate("%d(?P<_sep>[-/])%b(?P=_sep)%ExY[ :]?%H:%M:%S(?:\.%f)?(?: %z)?")
|
||||
# CPanel 05/20/2008:01:57:39
|
||||
self._cacheTemplate("%m/%d/%ExY:%H:%M:%S")
|
||||
# 01-27-2012 16:22:44.252
|
||||
# subseconds explicit to avoid possible %m<->%d confusion
|
||||
# with previous
|
||||
self._cacheTemplate("%m-%d-%Y %H:%M:%S\.%f")
|
||||
# TAI64N
|
||||
template = DateTai64n()
|
||||
template.name = "TAI64N"
|
||||
self._cacheTemplate(template)
|
||||
# with previous ("%d-%m-%ExY %H:%M:%S" by "%d(?P<_sep>[-/])%m(?P=_sep)(?:%ExY|%Exy) %H:%M:%S")
|
||||
self._cacheTemplate("%m-%d-%ExY %H:%M:%S(?:\.%f)?")
|
||||
# Epoch
|
||||
template = DateEpoch()
|
||||
template.name = "Epoch"
|
||||
self._cacheTemplate(template)
|
||||
# ISO 8601
|
||||
self._cacheTemplate("%Y-%m-%d[T ]%H:%M:%S(?:\.%f)?(?:%z)?")
|
||||
self._cacheTemplate('EPOCH')
|
||||
# Only time information in the log
|
||||
self._cacheTemplate("^%H:%M:%S")
|
||||
self._cacheTemplate("{^LN-BEG}%H:%M:%S")
|
||||
# <09/16/08@05:03:30>
|
||||
self._cacheTemplate("^<%m/%d/%y@%H:%M:%S>")
|
||||
self._cacheTemplate("^<%m/%d/%Exy@%H:%M:%S>")
|
||||
# MySQL: 130322 11:46:11
|
||||
self._cacheTemplate("^%y%m%d ?%H:%M:%S")
|
||||
self._cacheTemplate("%Exy%Exm%Exd ?%H:%M:%S")
|
||||
# Apache Tomcat
|
||||
self._cacheTemplate("%b %d, %Y %I:%M:%S %p")
|
||||
self._cacheTemplate("%b %d, %ExY %I:%M:%S %p")
|
||||
# ASSP: Apr-27-13 02:33:06
|
||||
self._cacheTemplate("^%b-%d-%y %H:%M:%S")
|
||||
self._cacheTemplate("^%b-%d-%Exy %H:%M:%S")
|
||||
# 20050123T215959, 20050123 215959
|
||||
self._cacheTemplate("%ExY%Exm%Exd[T ]%ExH%ExM%ExS(?:[.,]%f)?(?:\s*%z)?")
|
||||
# prefixed with optional named time zone (monit):
|
||||
# PDT Apr 16 21:05:29
|
||||
self._cacheTemplate("(?:%Z )?(?:%a )?%b %d %H:%M:%S(?:\.%f)?(?: %ExY)?")
|
||||
# +00:00 Jan 23 21:59:59.011 2005
|
||||
self._cacheTemplate("(?:%z )?(?:%a )?%b %d %H:%M:%S(?:\.%f)?(?: %ExY)?")
|
||||
# TAI64N
|
||||
self._cacheTemplate("TAI64N")
|
||||
#
|
||||
self.__templates = self.__tmpcache[0] + self.__tmpcache[1]
|
||||
del self.__tmpcache
|
||||
|
||||
|
||||
class DateDetectorTemplate(object):
|
||||
"""Used for "shallow copy" of the template object.
|
||||
|
||||
Prevents collectively usage of hits/lastUsed in cached templates
|
||||
"""
|
||||
__slots__ = ('template', 'hits', 'lastUsed', 'distance')
|
||||
def __init__(self, template):
|
||||
self.template = template
|
||||
self.hits = 0
|
||||
self.lastUsed = 0
|
||||
# the last distance to date-match within the log file:
|
||||
self.distance = 0x7fffffff
|
||||
|
||||
@property
|
||||
def weight(self):
|
||||
return self.hits * self.template.weight / max(1, self.distance)
|
||||
|
||||
def __getattr__(self, name):
|
||||
""" Returns attribute of template (called for parameters not in slots)
|
||||
"""
|
||||
return getattr(self.template, name)
|
||||
|
||||
|
||||
class DateDetector(object):
|
||||
|
@ -120,19 +210,27 @@ class DateDetector(object):
|
|||
_defCache = DateDetectorCache()
|
||||
|
||||
def __init__(self):
|
||||
self.__lock = Lock()
|
||||
self.__templates = list()
|
||||
self.__known_names = set()
|
||||
# time the template was long unused (currently 300 == 5m):
|
||||
self.__unusedTime = 300
|
||||
# last known distance (bypass one char collision) and end position:
|
||||
self.__lastPos = 1, None
|
||||
self.__lastEndPos = 0x7fffffff, None
|
||||
self.__lastTemplIdx = 0x7fffffff
|
||||
# first free place:
|
||||
self.__firstUnused = 0
|
||||
# pre-match pattern:
|
||||
self.__preMatch = None
|
||||
|
||||
def _appendTemplate(self, template):
|
||||
def _appendTemplate(self, template, ignoreDup=False):
|
||||
name = template.name
|
||||
if name in self.__known_names:
|
||||
if ignoreDup: return
|
||||
raise ValueError(
|
||||
"There is already a template with name %s" % name)
|
||||
self.__known_names.add(name)
|
||||
self.__templates.append(template)
|
||||
self.__templates.append(DateDetectorTemplate(template))
|
||||
|
||||
def appendTemplate(self, template):
|
||||
"""Add a date template to manage and use in search of dates.
|
||||
|
@ -150,15 +248,45 @@ class DateDetector(object):
|
|||
If a template already exists with the same name.
|
||||
"""
|
||||
if isinstance(template, str):
|
||||
template = DatePatternRegex(template)
|
||||
self._appendTemplate(template)
|
||||
key = pattern = template
|
||||
if '%' not in pattern:
|
||||
key = pattern.upper()
|
||||
template = DD_patternCache.get(key)
|
||||
if not template:
|
||||
if key in ("{^LN-BEG}", "{DEFAULT}"):
|
||||
flt = \
|
||||
lambda template: template.flags & DateTemplate.LINE_BEGIN if key == "{^LN-BEG}" else None
|
||||
self.addDefaultTemplate(flt)
|
||||
return
|
||||
elif "{DATE}" in key:
|
||||
self.addDefaultTemplate(
|
||||
lambda template: not template.flags & DateTemplate.LINE_BEGIN, pattern)
|
||||
return
|
||||
else:
|
||||
template = _getPatternTemplate(pattern, key)
|
||||
|
||||
def addDefaultTemplate(self):
|
||||
DD_patternCache.set(key, template)
|
||||
|
||||
self._appendTemplate(template)
|
||||
logSys.info(" date pattern `%r`: `%s`",
|
||||
getattr(template, 'pattern', ''), template.name)
|
||||
logSys.debug(" date pattern regex for %r: %s",
|
||||
getattr(template, 'pattern', ''), template.regex)
|
||||
|
||||
def addDefaultTemplate(self, filterTemplate=None, preMatch=None):
|
||||
"""Add Fail2Ban's default set of date templates.
|
||||
"""
|
||||
with self.__lock:
|
||||
for template in DateDetector._defCache.templates:
|
||||
self._appendTemplate(template)
|
||||
ignoreDup = len(self.__templates) > 0
|
||||
for template in DateDetector._defCache.templates:
|
||||
# filter if specified:
|
||||
if filterTemplate is not None and not filterTemplate(template): continue
|
||||
# if exact pattern available - create copy of template, contains replaced {DATE} with default regex:
|
||||
if preMatch is not None:
|
||||
# get cached or create a copy with modified name/pattern, using preMatch replacement for {DATE}:
|
||||
template = _getAnchoredTemplate(template,
|
||||
wrap=lambda s: RE_DATE_PREMATCH.sub(s, preMatch))
|
||||
# append date detector template (ignore duplicate if some was added before default):
|
||||
self._appendTemplate(template, ignoreDup=ignoreDup)
|
||||
|
||||
@property
|
||||
def templates(self):
|
||||
|
@ -184,22 +312,115 @@ class DateDetector(object):
|
|||
The regex match returned from the first successfully matched
|
||||
template.
|
||||
"""
|
||||
i = 0
|
||||
with self.__lock:
|
||||
for template in self.__templates:
|
||||
# if no templates specified - default templates should be used:
|
||||
if not len(self.__templates):
|
||||
self.addDefaultTemplate()
|
||||
logSys.log(logLevel-1, "try to match time for line: %.120s", line)
|
||||
match = None
|
||||
# first try to use last template with same start/end position:
|
||||
ignoreBySearch = 0x7fffffff
|
||||
i = self.__lastTemplIdx
|
||||
if i < len(self.__templates):
|
||||
ddtempl = self.__templates[i]
|
||||
template = ddtempl.template
|
||||
if template.flags & (DateTemplate.LINE_BEGIN|DateTemplate.LINE_END):
|
||||
if logSys.getEffectiveLevel() <= logLevel-1: # pragma: no cover - very-heavy debug
|
||||
logSys.log(logLevel-1, " try to match last anchored template #%02i ...", i)
|
||||
match = template.matchDate(line)
|
||||
if not match is None:
|
||||
ignoreBySearch = i
|
||||
else:
|
||||
distance, endpos = self.__lastPos[0], self.__lastEndPos[0]
|
||||
if logSys.getEffectiveLevel() <= logLevel-1:
|
||||
logSys.log(logLevel-1, " try to match last template #%02i (from %r to %r): ...%r==%r %s %r==%r...",
|
||||
i, distance, endpos,
|
||||
line[distance-1:distance], self.__lastPos[1],
|
||||
line[distance:endpos],
|
||||
line[endpos:endpos+1], self.__lastEndPos[1])
|
||||
# check same boundaries left/right, otherwise possible collision/pattern switch:
|
||||
if (line[distance-1:distance] == self.__lastPos[1] and
|
||||
line[endpos:endpos+1] == self.__lastEndPos[1]
|
||||
):
|
||||
match = template.matchDate(line, distance, endpos)
|
||||
if match:
|
||||
distance = match.start()
|
||||
endpos = match.end()
|
||||
# if different position, possible collision/pattern switch:
|
||||
if (
|
||||
template.flags & (DateTemplate.LINE_BEGIN|DateTemplate.LINE_END) or
|
||||
(distance == self.__lastPos[0] and endpos == self.__lastEndPos[0])
|
||||
):
|
||||
logSys.log(logLevel, " matched last time template #%02i", i)
|
||||
else:
|
||||
logSys.log(logLevel, " ** last pattern collision - pattern change, search ...")
|
||||
match = None
|
||||
else:
|
||||
logSys.log(logLevel, " ** last pattern not found - pattern change, search ...")
|
||||
# search template and better match:
|
||||
if not match:
|
||||
logSys.log(logLevel, " search template (%i) ...", len(self.__templates))
|
||||
found = None, 0x7fffffff, 0x7fffffff, -1
|
||||
i = 0
|
||||
for ddtempl in self.__templates:
|
||||
if logSys.getEffectiveLevel() <= logLevel-1:
|
||||
logSys.log(logLevel-1, " try template #%02i: %s", i, ddtempl.name)
|
||||
if i == ignoreBySearch:
|
||||
i += 1
|
||||
continue
|
||||
template = ddtempl.template
|
||||
match = template.matchDate(line)
|
||||
if match:
|
||||
distance = match.start()
|
||||
endpos = match.end()
|
||||
if logSys.getEffectiveLevel() <= logLevel:
|
||||
logSys.log(logLevel, "Matched time template %s", template.name)
|
||||
template.hits += 1
|
||||
template.lastUsed = time.time()
|
||||
# if not first - try to reorder current template (bubble up), they will be not sorted anymore:
|
||||
if i:
|
||||
self._reorderTemplate(i)
|
||||
# return tuple with match and template reference used for parsing:
|
||||
return (match, template)
|
||||
logSys.log(logLevel, " matched time template #%02i (at %r <= %r, %r) %s",
|
||||
i, distance, ddtempl.distance, self.__lastPos[0], template.name)
|
||||
## last (or single) template - fast stop:
|
||||
if i+1 >= len(self.__templates):
|
||||
break
|
||||
## if line-begin/end anchored - stop searching:
|
||||
if template.flags & (DateTemplate.LINE_BEGIN|DateTemplate.LINE_END):
|
||||
break
|
||||
## stop searching if next template still unused, but we had already hits:
|
||||
if (distance == 0 and ddtempl.hits) and not self.__templates[i+1].template.hits:
|
||||
break
|
||||
## [grave] if distance changed, possible date-match was found somewhere
|
||||
## in body of message, so save this template, and search further:
|
||||
if distance > ddtempl.distance or distance > self.__lastPos[0]:
|
||||
logSys.log(logLevel, " ** distance collision - pattern change, reserve")
|
||||
## shortest of both:
|
||||
if distance < found[1]:
|
||||
found = match, distance, endpos, i
|
||||
## search further:
|
||||
match = None
|
||||
i += 1
|
||||
continue
|
||||
## winner - stop search:
|
||||
break
|
||||
i += 1
|
||||
# check other template was found (use this one with shortest distance):
|
||||
if not match and found[0]:
|
||||
match, distance, endpos, i = found
|
||||
logSys.log(logLevel, " use best time template #%02i", i)
|
||||
ddtempl = self.__templates[i]
|
||||
template = ddtempl.template
|
||||
# we've winner, incr hits, set distance, usage, reorder, etc:
|
||||
if match:
|
||||
ddtempl.hits += 1
|
||||
ddtempl.lastUsed = time.time()
|
||||
ddtempl.distance = distance
|
||||
if self.__firstUnused == i:
|
||||
self.__firstUnused += 1
|
||||
self.__lastPos = distance, line[distance-1:distance]
|
||||
self.__lastEndPos = endpos, line[endpos:endpos+1]
|
||||
# if not first - try to reorder current template (bubble up), they will be not sorted anymore:
|
||||
if i and i != self.__lastTemplIdx:
|
||||
i = self._reorderTemplate(i)
|
||||
self.__lastTemplIdx = i
|
||||
# return tuple with match and template reference used for parsing:
|
||||
return (match, template)
|
||||
|
||||
# not found:
|
||||
logSys.log(logLevel, " no template.")
|
||||
return (None, None)
|
||||
|
||||
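# Sketch (not part of the diff) of the behaviour described in the comments above: a
# date-like token inside the message body must not displace a timestamp found earlier
# in the line, because templates with the shortest match distance are preferred. The
# log line is invented for illustration.
from fail2ban.server.datedetector import DateDetector

dd = DateDetector()
dd.addDefaultTemplate()
line = "2005-01-23 21:59:59 server sshd[123]: client reported version 20.05 01 23"
match, template = dd.matchTime(line)
# the leftmost/anchored template should win, i.e. the match starts at line begin:
assert match is not None and match.start() == 0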
def getTime(self, line, timeMatch=None):
|
||||
|
@ -221,31 +442,22 @@ class DateDetector(object):
|
|||
The Unix timestamp returned from the first successfully matched
|
||||
template or None if not found.
|
||||
"""
|
||||
if timeMatch:
|
||||
template = timeMatch[1]
|
||||
if template is not None:
|
||||
try:
|
||||
date = template.getDate(line, timeMatch[0])
|
||||
if date is not None:
|
||||
if logSys.getEffectiveLevel() <= logLevel:
|
||||
logSys.log(logLevel, "Got time %f for %r using template %s",
|
||||
date[0], date[1].group(), template.name)
|
||||
return date
|
||||
except ValueError:
|
||||
return None
|
||||
with self.__lock:
|
||||
for template in self.__templates:
|
||||
try:
|
||||
date = template.getDate(line)
|
||||
if date is None:
|
||||
continue
|
||||
if logSys.getEffectiveLevel() <= logLevel:
|
||||
logSys.log(logLevel, "Got time %f for %r using template %s",
|
||||
date[0], date[1].group(), template.name)
|
||||
# search match for all specified templates:
|
||||
if timeMatch is None:
|
||||
timeMatch = self.matchTime(line)
|
||||
# convert:
|
||||
template = timeMatch[1]
|
||||
if template is not None:
|
||||
try:
|
||||
date = template.getDate(line, timeMatch[0])
|
||||
if date is not None:
|
||||
if logSys.getEffectiveLevel() <= logLevel: # pragma: no cover - heavy debug
|
||||
logSys.log(logLevel, " got time %f for %r using template %s",
|
||||
date[0], date[1].group(1), template.name)
|
||||
return date
|
||||
except ValueError: # pragma: no cover
|
||||
pass
|
||||
return None
|
||||
except ValueError:
|
||||
pass
|
||||
return None
|
||||
|
||||
def _reorderTemplate(self, num):
|
||||
"""Reorder template (bubble up) in template list if hits grows enough.
|
||||
|
@ -257,18 +469,39 @@ class DateDetector(object):
|
|||
"""
|
||||
if num:
|
||||
templates = self.__templates
|
||||
template = templates[num]
|
||||
ddtempl = templates[num]
|
||||
if logSys.getEffectiveLevel() <= logLevel:
|
||||
logSys.log(logLevel, " -> reorder template #%02i, hits: %r", num, ddtempl.hits)
|
||||
## current hits and time the template was long unused:
|
||||
untime = template.lastUsed - self.__unusedTime
|
||||
hits = template.hits
|
||||
untime = ddtempl.lastUsed - self.__unusedTime
|
||||
weight = ddtempl.weight
|
||||
## try to move faster (first if unused available, or half of part to current template position):
|
||||
pos = self.__firstUnused if self.__firstUnused < num else num // 2
|
||||
## don't move too often (multiline logs resp. log's with different date patterns),
|
||||
## if template not used too long, replace it also :
|
||||
if hits > templates[num-1].hits + 5 or templates[num-1].lastUsed < untime:
|
||||
## try to move faster (half of part to current template):
|
||||
pos = num // 2
|
||||
## if not larger - move slow (exact 1 position):
|
||||
if hits <= templates[pos].hits or templates[pos].lastUsed < untime:
|
||||
pos = num-1
|
||||
templates[pos], templates[num] = template, templates[pos]
|
||||
|
||||
|
||||
def _moveable():
|
||||
pweight = templates[pos].weight
|
||||
if logSys.getEffectiveLevel() <= logLevel:
|
||||
logSys.log(logLevel, " -> compare template #%02i & #%02i, weight %.3f > %.3f, hits %r > %r",
|
||||
num, pos, weight, pweight, ddtempl.hits, templates[pos].hits)
|
||||
return weight > pweight or untime > templates[pos].lastUsed
|
||||
##
|
||||
## if not moveable (smaller weight or target position recently used):
|
||||
if not _moveable():
|
||||
## try to move slow (exact 1 position):
|
||||
if pos == num-1:
|
||||
return num
|
||||
pos = num-1
|
||||
## if still smaller and template at position used, don't move:
|
||||
if not _moveable():
|
||||
return num
|
||||
## move:
|
||||
del templates[num]
|
||||
templates[pos:0] = [ddtempl]
|
||||
## correct first unused:
|
||||
while self.__firstUnused < len(templates) and templates[self.__firstUnused].hits:
|
||||
self.__firstUnused += 1
|
||||
if logSys.getEffectiveLevel() <= logLevel:
|
||||
logSys.log(logLevel, " -> moved template #%02i -> #%02i", num, pos)
|
||||
return pos
|
||||
return num
|
||||
|
|
|
@@ -24,14 +24,28 @@ __author__ = "Cyril Jaquier"
__copyright__ = "Copyright (c) 2004 Cyril Jaquier"
__license__ = "GPL"
import re
import re, time
from abc import abstractmethod
from .strptime import reGroupDictStrptime, timeRE
from .strptime import reGroupDictStrptime, timeRE, getTimePatternRE
from ..helpers import getLogger
logSys = getLogger(__name__)
# check already grouped contains "(", but ignores char "\(" and conditional "(?(id)...)":
RE_GROUPED = re.compile(r'(?<!(?:\(\?))(?<!\\)\((?!\?)')
RE_GROUP = ( re.compile(r'^((?:\(\?\w+\))?\^?(?:\(\?\w+\))?)(.*?)(\$?)$'), r"\1(\2)\3" )
RE_EXLINE_BOUND_BEG = re.compile(r'^\{\^LN-BEG\}')
RE_NO_WRD_BOUND_BEG = re.compile(r'^\(*(?:\(\?\w+\))?(?:\^|\(*\*\*|\(\?:\^)')
RE_NO_WRD_BOUND_END = re.compile(r'(?<!\\)(?:\$\)?|\*\*\)*)$')
RE_DEL_WRD_BOUNDS = ( re.compile(r'^\(*(?:\(\?\w+\))?\(*\*\*|(?<!\\)\*\*\)*$'),
lambda m: m.group().replace('**', '') )
RE_LINE_BOUND_BEG = re.compile(r'^(?:\(\?\w+\))?(?:\^|\(\?:\^(?!\|))')
RE_LINE_BOUND_END = re.compile(r'(?<![\\\|])(?:\$\)?)$')
RE_ALPHA_PATTERN = re.compile(r'(?<!\%)\%[aAbBpc]')
class DateTemplate(object):
"""A template which searches for and returns a date from a log line.
|
||||
|
@ -45,22 +59,19 @@ class DateTemplate(object):
|
|||
regex
|
||||
"""
|
||||
|
||||
LINE_BEGIN = 8
|
||||
LINE_END = 4
|
||||
WORD_BEGIN = 2
|
||||
WORD_END = 1
|
||||
|
||||
def __init__(self):
|
||||
self._name = ""
|
||||
self.name = ""
|
||||
self.weight = 1.0
|
||||
self.flags = 0
|
||||
self.hits = 0
|
||||
self.time = 0
|
||||
self._regex = ""
|
||||
self._cRegex = None
|
||||
self.hits = 0
|
||||
self.lastUsed = 0
|
||||
|
||||
@property
|
||||
def name(self):
|
||||
"""Name assigned to template.
|
||||
"""
|
||||
return self._name
|
||||
|
||||
@name.setter
|
||||
def name(self, name):
|
||||
self._name = name
|
||||
|
||||
def getRegex(self):
|
||||
return self._regex
|
||||
|
@ -75,10 +86,12 @@ class DateTemplate(object):
|
|||
wordBegin : bool
|
||||
Defines whether the regex should be modified to search at beginning of a
|
||||
word, by adding special boundary r'(?=^|\b|\W)' to start of regex.
|
||||
Can be disabled with specifying of ** at front of regex.
|
||||
Default True.
|
||||
wordEnd : bool
|
||||
Defines whether the regex should be modified to search at end of a word,
|
||||
by adding special boundary r'(?=\b|\W|$)' to end of regex.
|
||||
Can be disabled with specifying of ** at end of regex.
|
||||
Default True.
|
||||
|
||||
Raises
|
||||
|
@@ -86,23 +99,62 @@
re.error
If regular expression fails to compile
"""
# Warning: don't use lookahead for line-begin boundary,
# (e. g. r"^(?:\W{0,2})?" is much faster as r"(?:^|(?<=^\W)|(?<=^\W{2}))")
# because it may be very slow in negative case (by long log-lines not matching pattern)
regex = regex.strip()
if wordBegin and not re.search(r'^\^', regex):
regex = r'(?=^|\b|\W)' + regex
if wordEnd and not re.search(r'\$$', regex):
boundBegin = wordBegin and not RE_NO_WRD_BOUND_BEG.search(regex)
boundEnd = wordEnd and not RE_NO_WRD_BOUND_END.search(regex)
# if no group add it now, should always have a group(1):
if not RE_GROUPED.search(regex):
regex = RE_GROUP[0].sub(RE_GROUP[1], regex)
self.flags = 0
# if word or line start boundary:
if boundBegin:
self.flags |= DateTemplate.WORD_BEGIN if wordBegin != 'start' else DateTemplate.LINE_BEGIN
if wordBegin != 'start':
regex = r'(?:^|\b|\W)' + regex
else:
regex = r"^(?:\W{0,2})?" + regex
if not self.name.startswith('{^LN-BEG}'):
self.name = '{^LN-BEG}' + self.name
# if word end boundary:
if boundEnd:
self.flags |= DateTemplate.WORD_END
regex += r'(?=\b|\W|$)'
if RE_LINE_BOUND_BEG.search(regex): self.flags |= DateTemplate.LINE_BEGIN
if RE_LINE_BOUND_END.search(regex): self.flags |= DateTemplate.LINE_END
# remove possible special pattern "**" in front and end of regex:
regex = RE_DEL_WRD_BOUNDS[0].sub(RE_DEL_WRD_BOUNDS[1], regex)
self._regex = regex
logSys.debug(' constructed regex %s', regex)
self._cRegex = None
regex = property(getRegex, setRegex, doc=
"""Regex used to search for date.
""")
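# Sketch (not part of the diff): effect of the boundary handling above, demonstrated
# with DatePatternRegex (defined further below in this module). A pattern prefixed
# with {^LN-BEG} becomes line-anchored and is flagged accordingly; the sample pattern
# and log line are examples, not taken from the change.
from fail2ban.server.datetemplate import DatePatternRegex, DateTemplate

t = DatePatternRegex("{^LN-BEG}%Y-%m-%d %H:%M:%S")
assert t.flags & DateTemplate.LINE_BEGIN      # anchored via wordBegin='start'
assert t.name.startswith("{^LN-BEG}")         # name keeps the explicit anchor marker
m = t.matchDate("2005-01-23 21:59:59 sshd[123]: error")
assert m and m.group(1) == "2005-01-23 21:59:59"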
def matchDate(self, line):
|
||||
def _compileRegex(self):
|
||||
"""Compile regex by first usage.
|
||||
"""
|
||||
if not self._cRegex:
|
||||
try:
|
||||
# print('*'*10 + (' compile - %-30.30s -- %s' % (getattr(self, 'pattern', self.regex), self.name)))
|
||||
self._cRegex = re.compile(self.regex)
|
||||
except Exception as e:
|
||||
logSys.error('Compile %r failed, expression %r', self.name, self.regex)
|
||||
raise e
|
||||
|
||||
def matchDate(self, line, *args):
|
||||
"""Check if regex for date matches on a log line.
|
||||
"""
|
||||
if not self._cRegex:
|
||||
self._cRegex = re.compile(self.regex, re.UNICODE | re.IGNORECASE)
|
||||
dateMatch = self._cRegex.search(line)
|
||||
self._compileRegex()
|
||||
dateMatch = self._cRegex.search(line, *args); # pos, endpos
|
||||
if dateMatch:
|
||||
self.hits += 1
|
||||
# print('*'*10 + ('[%s] - %-30.30s -- %s' % ('*' if dateMatch else ' ', getattr(self, 'pattern', self.regex), self.name)))
|
||||
return dateMatch
|
||||
|
||||
@abstractmethod
|
||||
|
@ -138,9 +190,15 @@ class DateEpoch(DateTemplate):
|
|||
regex
|
||||
"""
|
||||
|
||||
def __init__(self):
|
||||
def __init__(self, lineBeginOnly=False):
|
||||
DateTemplate.__init__(self)
|
||||
self.regex = r"(?:^|(?P<square>(?<=^\[))|(?P<selinux>(?<=audit\()))\d{10,11}\b(?:\.\d{3,6})?(?:(?(selinux)(?=:\d+\)))|(?(square)(?=\])))"
|
||||
self.name = "Epoch"
|
||||
if not lineBeginOnly:
|
||||
regex = r"((?:^|(?P<square>(?<=^\[))|(?P<selinux>(?<=\baudit\()))\d{10,11}\b(?:\.\d{3,6})?)(?:(?(selinux)(?=:\d+\)))|(?(square)(?=\])))"
|
||||
self.setRegex(regex, wordBegin=False) ;# already line begin resp. word begin anchored
|
||||
else:
|
||||
regex = r"((?P<square>(?<=^\[))?\d{10,11}\b(?:\.\d{3,6})?)(?(square)(?=\]))"
|
||||
self.setRegex(regex, wordBegin='start', wordEnd=True)
|
||||
|
||||
def getDate(self, line, dateMatch=None):
|
||||
"""Method to return the date for a log line.
|
||||
|
@ -160,8 +218,7 @@ class DateEpoch(DateTemplate):
|
|||
dateMatch = self.matchDate(line)
|
||||
if dateMatch:
|
||||
# extract part of format which represents seconds since epoch
|
||||
return (float(dateMatch.group()), dateMatch)
|
||||
return None
|
||||
return (float(dateMatch.group(1)), dateMatch)
|
||||
|
||||
|
||||
class DatePatternRegex(DateTemplate):
|
||||
|
@ -178,21 +235,15 @@ class DatePatternRegex(DateTemplate):
|
|||
regex
|
||||
pattern
|
||||
"""
|
||||
_patternRE = re.compile(r"%%(%%|[%s])" % "".join(timeRE.keys()))
|
||||
_patternName = {
|
||||
'a': "DAY", 'A': "DAYNAME", 'b': "MON", 'B': "MONTH", 'd': "Day",
|
||||
'H': "24hour", 'I': "12hour", 'j': "Yearday", 'm': "Month",
|
||||
'M': "Minute", 'p': "AMPM", 'S': "Second", 'U': "Yearweek",
|
||||
'w': "Weekday", 'W': "Yearweek", 'y': 'Year2', 'Y': "Year", '%': "%",
|
||||
'z': "Zone offset", 'f': "Microseconds", 'Z': "Zone name"}
|
||||
for _key in set(timeRE) - set(_patternName): # may not have them all...
|
||||
_patternName[_key] = "%%%s" % _key
|
||||
|
||||
_patternRE, _patternName = getTimePatternRE()
|
||||
_patternRE = re.compile(_patternRE)
|
||||
|
||||
def __init__(self, pattern=None):
|
||||
def __init__(self, pattern=None, **kwargs):
|
||||
super(DatePatternRegex, self).__init__()
|
||||
self._pattern = None
|
||||
if pattern is not None:
|
||||
self.pattern = pattern
|
||||
self.setRegex(pattern, **kwargs)
|
||||
|
||||
@property
|
||||
def pattern(self):
|
||||
|
@ -208,17 +259,23 @@ class DatePatternRegex(DateTemplate):
|
|||
|
||||
@pattern.setter
|
||||
def pattern(self, pattern):
|
||||
self.setRegex(pattern)
|
||||
|
||||
def setRegex(self, pattern, wordBegin=True, wordEnd=True):
|
||||
# original pattern:
|
||||
self._pattern = pattern
|
||||
# if explicit given {^LN-BEG} - remove it from pattern and set 'start' in wordBegin:
|
||||
if wordBegin and RE_EXLINE_BOUND_BEG.search(pattern):
|
||||
pattern = RE_EXLINE_BOUND_BEG.sub('', pattern)
|
||||
wordBegin = 'start'
|
||||
# wrap to regex:
|
||||
fmt = self._patternRE.sub(r'%(\1)s', pattern)
|
||||
self._name = fmt % self._patternName
|
||||
super(DatePatternRegex, self).setRegex(fmt % timeRE)
|
||||
|
||||
def setRegex(self, value):
|
||||
raise NotImplementedError("Regex derived from pattern")
|
||||
|
||||
@DateTemplate.name.setter
|
||||
def name(self, value):
|
||||
raise NotImplementedError("Name derived from pattern")
|
||||
self.name = fmt % self._patternName
|
||||
regex = fmt % timeRE
|
||||
# if expected add (?iu) for "ignore case" and "unicode":
|
||||
if RE_ALPHA_PATTERN.search(pattern):
|
||||
regex = r'(?iu)' + regex
|
||||
super(DatePatternRegex, self).setRegex(regex, wordBegin, wordEnd)
|
||||
|
||||
def getDate(self, line, dateMatch=None):
|
||||
"""Method to return the date for a log line.
|
||||
|
@ -240,11 +297,7 @@ class DatePatternRegex(DateTemplate):
|
|||
if not dateMatch:
|
||||
dateMatch = self.matchDate(line)
|
||||
if dateMatch:
|
||||
groupdict = dict(
|
||||
(key, value)
|
||||
for key, value in dateMatch.groupdict().iteritems()
|
||||
if value is not None)
|
||||
return reGroupDictStrptime(groupdict), dateMatch
|
||||
return reGroupDictStrptime(dateMatch.groupdict()), dateMatch
|
||||
|
||||
|
||||
class DateTai64n(DateTemplate):
|
||||
|
@ -256,11 +309,11 @@ class DateTai64n(DateTemplate):
|
|||
regex
|
||||
"""
|
||||
|
||||
def __init__(self):
|
||||
def __init__(self, wordBegin=False):
|
||||
DateTemplate.__init__(self)
|
||||
self.name = "TAI64N"
|
||||
# We already know the format for TAI64N
|
||||
# yoh: we should not add an additional front anchor
|
||||
self.setRegex("@[0-9a-f]{24}", wordBegin=False)
|
||||
self.setRegex("@[0-9a-f]{24}", wordBegin=wordBegin)
|
||||
|
||||
def getDate(self, line, dateMatch=None):
|
||||
"""Method to return the date for a log line.
|
||||
|
@ -280,8 +333,7 @@ class DateTai64n(DateTemplate):
|
|||
dateMatch = self.matchDate(line)
|
||||
if dateMatch:
|
||||
# extract part of format which represents seconds since epoch
|
||||
value = dateMatch.group()
|
||||
value = dateMatch.group(1)
|
||||
seconds_since_epoch = value[2:17]
|
||||
# convert seconds from HEX into local time stamp
|
||||
return (int(seconds_since_epoch, 16), dateMatch)
|
||||
return None
|
||||
|
|
|
@ -36,7 +36,6 @@ from .observer import Observers
|
|||
from .ticket import FailTicket
|
||||
from .jailthread import JailThread
|
||||
from .datedetector import DateDetector
|
||||
from .datetemplate import DatePatternRegex, DateEpoch, DateTai64n
|
||||
from .mytime import MyTime
|
||||
from .failregex import FailRegex, Regex, RegexException
|
||||
from .action import CommandAction
|
||||
|
@ -91,11 +90,16 @@ class Filter(JailThread):
|
|||
## Error counter (protected, so can be used in filter implementations)
|
||||
## if it reached 100 (at once), run-cycle will go idle
|
||||
self._errors = 0
|
||||
## return raw host (host is not dns):
|
||||
self.returnRawHost = False
|
||||
## check each regex (used for test purposes):
|
||||
self.checkAllRegex = False
|
||||
## if true ignores obsolete failures (failure time < now - findTime):
|
||||
self.checkFindTime = True
|
||||
## Ticks counter
|
||||
self.ticks = 0
|
||||
|
||||
self.dateDetector = DateDetector()
|
||||
self.dateDetector.addDefaultTemplate()
|
||||
logSys.debug("Created %s" % self)
|
||||
|
||||
def __repr__(self):
|
||||
|
@@ -258,20 +262,13 @@ class Filter(JailThread):
if pattern is None:
self.dateDetector = None
return
elif pattern.upper() == "EPOCH":
template = DateEpoch()
template.name = "Epoch"
elif pattern.upper() == "TAI64N":
template = DateTai64n()
template.name = "TAI64N"
else:
template = DatePatternRegex(pattern)
self.dateDetector = DateDetector()
self.dateDetector.appendTemplate(template)
logSys.info(" date pattern `%r`: `%s`",
pattern, template.name)
logSys.debug(" date pattern regex for %r: %s",
pattern, template.regex)
dd = DateDetector()
if not isinstance(pattern, (list, tuple)):
pattern = filter(bool, map(str.strip, re.split('\n+', pattern)))
for pattern in pattern:
dd.appendTemplate(pattern)
self.dateDetector = dd
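# Sketch (not part of the diff): the rewritten setDatePattern accepts several
# newline-separated patterns at once, each becoming its own template. "flt" stands
# for an existing Filter instance (hypothetical here); the patterns follow the test
# configuration shipped further below (single "%" at API level, "%%" only in configs).
flt.setDatePattern(  # flt: hypothetical Filter instance
	r"{^LN-BEG}%ExY(?P<_sep>[-/.])%m(?P=_sep)%d[T ]%H:%M:%S(?:[.,]%f)?(?:\s*%z)?" "\n"
	r"EPOCH"
)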
##
|
||||
# Get the date detector pattern, or Default Detectors if not changed
|
||||
|
@ -281,14 +278,16 @@ class Filter(JailThread):
|
|||
def getDatePattern(self):
|
||||
if self.dateDetector is not None:
|
||||
templates = self.dateDetector.templates
|
||||
if len(templates) > 1:
|
||||
# lazy template init, by first match
|
||||
if not len(templates) or len(templates) > 2:
|
||||
return None, "Default Detectors"
|
||||
elif len(templates) == 1:
|
||||
elif len(templates):
|
||||
if hasattr(templates[0], "pattern"):
|
||||
pattern = templates[0].pattern
|
||||
else:
|
||||
pattern = None
|
||||
return pattern, templates[0].name
|
||||
return None
|
||||
|
||||
##
|
||||
# Set the maximum retry value.
|
||||
|
@ -450,14 +449,14 @@ class Filter(JailThread):
|
|||
if self.__ignoreCommand:
|
||||
command = CommandAction.replaceTag(self.__ignoreCommand, { 'ip': ip } )
|
||||
logSys.debug('ignore command: ' + command)
|
||||
ret_ignore = CommandAction.executeCmd(command)
|
||||
ret, ret_ignore = CommandAction.executeCmd(command, success_codes=(0, 1))
|
||||
ret_ignore = ret and ret_ignore == 0
|
||||
self.logIgnoreIp(ip, log_ignore and ret_ignore, ignore_source="command")
|
||||
return ret_ignore
|
||||
|
||||
return False
|
||||
|
||||
def processLine(self, line, date=None, returnRawHost=False,
|
||||
checkAllRegex=False, checkFindTime=False):
|
||||
def processLine(self, line, date=None):
|
||||
"""Split the time portion from log msg and return findFailures on them
|
||||
"""
|
||||
if date:
|
||||
|
@ -469,22 +468,23 @@ class Filter(JailThread):
|
|||
(timeMatch, template) = self.dateDetector.matchTime(l)
|
||||
if timeMatch:
|
||||
tupleLine = (
|
||||
l[:timeMatch.start()],
|
||||
l[timeMatch.start():timeMatch.end()],
|
||||
l[timeMatch.end():],
|
||||
l[:timeMatch.start(1)],
|
||||
l[timeMatch.start(1):timeMatch.end(1)],
|
||||
l[timeMatch.end(1):],
|
||||
(timeMatch, template)
|
||||
)
|
||||
else:
|
||||
tupleLine = (l, "", "", None)
|
||||
|
||||
return "".join(tupleLine[::2]), self.findFailure(
|
||||
tupleLine, date, returnRawHost, checkAllRegex, checkFindTime)
|
||||
# save last line (lazy convert of process line tuple to string on demand):
|
||||
self.processedLine = lambda: "".join(tupleLine[::2])
|
||||
return self.findFailure(tupleLine, date)
|
||||
|
||||
def processLineAndAdd(self, line, date=None):
|
||||
"""Processes the line for failures and populates failManager
|
||||
"""
|
||||
try:
|
||||
for element in self.processLine(line, date, checkFindTime=True)[1]:
|
||||
for element in self.processLine(line, date):
|
||||
ip = element[1]
|
||||
unixTime = element[2]
|
||||
lines = element[3]
|
||||
|
@ -543,10 +543,10 @@ class Filter(JailThread):
|
|||
# to find the logging time.
|
||||
# @return a dict with IP and timestamp.
|
||||
|
||||
def findFailure(self, tupleLine, date=None, returnRawHost=False,
|
||||
checkAllRegex=False, checkFindTime=False):
|
||||
def findFailure(self, tupleLine, date=None):
|
||||
failList = list()
|
||||
|
||||
returnRawHost = self.returnRawHost
|
||||
cidr = IPAddr.CIDR_UNSPEC
|
||||
if self.__useDns == "raw":
|
||||
returnRawHost = True
|
||||
|
@ -581,7 +581,7 @@ class Filter(JailThread):
|
|||
timeText = self.__lastTimeText or "".join(tupleLine[::2])
|
||||
date = self.__lastDate
|
||||
|
||||
if checkFindTime and date is not None and date < MyTime.time() - self.getFindTime():
|
||||
if self.checkFindTime and date is not None and date < MyTime.time() - self.getFindTime():
|
||||
logSys.log(5, "Ignore line since time %s < %s - %s",
|
||||
date, MyTime.time(), self.getFindTime())
|
||||
return failList
|
||||
|
@ -602,7 +602,7 @@ class Filter(JailThread):
|
|||
# The ignoreregex matched. Remove ignored match.
|
||||
self.__lineBuffer = failRegex.getUnmatchedTupleLines()
|
||||
logSys.log(7, "Matched ignoreregex and was ignored")
|
||||
if not checkAllRegex:
|
||||
if not self.checkAllRegex:
|
||||
break
|
||||
else:
|
||||
continue
|
||||
|
@ -645,7 +645,7 @@ class Filter(JailThread):
|
|||
ip = IPAddr(fid, IPAddr.CIDR_RAW)
|
||||
failList.append([failRegexIndex, ip, date,
|
||||
failRegex.getMatchedLines(), fail])
|
||||
if not checkAllRegex:
|
||||
if not self.checkAllRegex:
|
||||
break
|
||||
else:
|
||||
ips = DNSUtils.textToIp(host, self.__useDns)
|
||||
|
@ -653,7 +653,7 @@ class Filter(JailThread):
|
|||
for ip in ips:
|
||||
failList.append([failRegexIndex, ip, date,
|
||||
failRegex.getMatchedLines(), fail])
|
||||
if not checkAllRegex:
|
||||
if not self.checkAllRegex:
|
||||
break
|
||||
except RegexException as e: # pragma: no cover - unsure if reachable
|
||||
logSys.error(e)
|
||||
|
|
|
@ -178,6 +178,14 @@ class FilterSystemd(JournalFilter): # pragma: systemd no cover
|
|||
def getJournalMatch(self):
|
||||
return self.__matches
|
||||
|
||||
##
|
||||
# Get journal reader
|
||||
#
|
||||
# @return journal reader
|
||||
|
||||
def getJournalReader(self):
|
||||
return self.__journal
|
||||
|
||||
##
|
||||
# Format journal log entry into syslog style
|
||||
#
|
||||
|
|
|
@@ -41,6 +41,21 @@ class MyTime:
"""
myTime = None
alternateNowTime = None
alternateNow = None
@staticmethod
def setAlternateNow(t):
"""Set current time.
Use None in order to always get the real current time.
@param t the time to set or None
"""
MyTime.alternateNowTime = t
MyTime.alternateNow = \
datetime.datetime.fromtimestamp(t) if t is not None else None
@staticmethod
def setTime(t):
@@ -84,8 +99,9 @@ class MyTime:
"""
if MyTime.myTime is None:
return datetime.datetime.now()
else:
return datetime.datetime.fromtimestamp(MyTime.myTime)
if MyTime.myTime == MyTime.alternateNowTime:
return MyTime.alternateNow
return datetime.datetime.fromtimestamp(MyTime.myTime)
@staticmethod
def localtime(x=None):
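# Sketch (not part of the diff): how the alternate "now" cooperates with the frozen
# clock, as the test helpers (setUpMyTime in the test modules below) use it. The
# timestamp is an example value; the reset via setTime(None) is an assumption.
from fail2ban.server.mytime import MyTime

MyTime.setAlternateNow(1106513999.0)         # remember a prepared datetime for this instant
MyTime.setTime(1106513999.0)                 # freeze the clock at the same instant
assert MyTime.now() is MyTime.alternateNow   # now() hands back the prepared object
MyTime.setTime(None)                         # assumed reset to the real clock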
@ -547,17 +547,19 @@ class Server:
|
|||
# @param target the logging target
|
||||
|
||||
def setLogTarget(self, target):
|
||||
# check reserved targets in uppercase, don't change target, because it can be file:
|
||||
systarget = target.upper()
|
||||
with self.__loggingLock:
|
||||
# don't set new handlers if already the same
|
||||
# or if "INHERITED" (foreground worker of the test cases, to prevent stop logging):
|
||||
if self.__logTarget == target:
|
||||
return True
|
||||
if target == "INHERITED":
|
||||
if systarget == "INHERITED":
|
||||
self.__logTarget = target
|
||||
return True
|
||||
# set a format which is simpler for console use
|
||||
fmt = "%(asctime)s %(name)-24s[%(process)d]: %(levelname)-7s %(message)s"
|
||||
if target == "SYSLOG":
|
||||
if systarget == "SYSLOG":
|
||||
# Syslog daemons already add date to the message.
|
||||
fmt = "%(name)s[%(process)d]: %(levelname)s %(message)s"
|
||||
facility = logging.handlers.SysLogHandler.LOG_DAEMON
|
||||
|
@ -576,9 +578,9 @@ class Server:
|
|||
"Syslog socket file: %s does not exists"
|
||||
" or is not a socket" % self.__syslogSocket)
|
||||
return False
|
||||
elif target == "STDOUT":
|
||||
elif systarget == "STDOUT":
|
||||
hdlr = logging.StreamHandler(sys.stdout)
|
||||
elif target == "STDERR":
|
||||
elif systarget == "STDERR":
|
||||
hdlr = logging.StreamHandler(sys.stderr)
|
||||
else:
|
||||
# Target should be a file
|
||||
|
|
|
@@ -26,10 +26,59 @@ from .mytime import MyTime
locale_time = LocaleTime()
timeRE = TimeRE()
timeRE['z'] = r"(?P<z>Z|[+-]\d{2}(?::?[0-5]\d)?)"
def _getYearCentRE(cent=(0,3), distance=3, now=(MyTime.now(), MyTime.alternateNow)):
""" Build century regex for last year and the next years (distance).
Thereby respect possible run in the test-cases (alternate date used there)
"""
cent = lambda year, f=cent[0], t=cent[1]: str(year)[f:t]
exprset = set( cent(now[0].year + i) for i in (-1, distance) )
if len(now) and now[1]:
exprset |= set( cent(now[1].year + i) for i in (-1, distance) )
return "(?:%s)" % "|".join(exprset) if len(exprset) > 1 else "".join(exprset)
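# Sketch (not part of the diff): what the century helper yields. _getYearCentRE is an
# internal helper; the explicit datetime is passed only for illustration (by default
# the current MyTime clock is used). With "now" in 2005 the window 2004..2008 shares
# the prefix "200", so %ExY is narrowed to (?P<Y>200\d).
import datetime
from fail2ban.server.strptime import _getYearCentRE

cent = _getYearCentRE(cent=(0, 3), distance=3, now=(datetime.datetime(2005, 1, 23), None))
assert cent == "200"
year_re = r"(?P<Y>%s\d)" % cent   # the form stored as timeRE['ExY'] below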
def reGroupDictStrptime(found_dict):
#todo: implement literal time zone support like CET, PST, PDT, etc (via pytz):
#timeRE['z'] = r"%s?(?P<z>Z|[+-]\d{2}(?::?[0-5]\d)?|[A-Z]{3})?" % timeRE['Z']
timeRE['Z'] = r"(?P<Z>[A-Z]{3,5})"
timeRE['z'] = r"(?P<z>Z|UTC|GMT|[+-]\d{2}(?::?[0-5]\d)?)"
# Extend build-in TimeRE with some exact patterns
# exact two-digit patterns:
timeRE['Exd'] = r"(?P<d>3[0-1]|[1-2]\d|0[1-9])"
timeRE['Exm'] = r"(?P<m>1[0-2]|0[1-9])"
timeRE['ExH'] = r"(?P<H>2[0-3]|[0-1]\d)"
timeRE['ExM'] = r"(?P<M>[0-5]\d)"
timeRE['ExS'] = r"(?P<S>6[0-1]|[0-5]\d)"
# more precise year patterns, within same century of last year and
# the next 3 years (for possible long uptime of fail2ban); thereby
# respect possible run in the test-cases (alternate date used there):
timeRE['ExY'] = r"(?P<Y>%s\d)" % _getYearCentRE(cent=(0,3), distance=3)
timeRE['Exy'] = r"(?P<y>%s\d)" % _getYearCentRE(cent=(2,3), distance=3)
def getTimePatternRE():
keys = timeRE.keys()
patt = (r"%%(%%|%s|[%s])" % (
"|".join([k for k in keys if len(k) > 1]),
"".join([k for k in keys if len(k) == 1]),
))
names = {
'a': "DAY", 'A': "DAYNAME", 'b': "MON", 'B': "MONTH", 'd': "Day",
'H': "24hour", 'I': "12hour", 'j': "Yearday", 'm': "Month",
'M': "Minute", 'p': "AMPM", 'S': "Second", 'U': "Yearweek",
'w': "Weekday", 'W': "Yearweek", 'y': 'Year2', 'Y': "Year", '%': "%",
'z': "Zone offset", 'f': "Microseconds", 'Z': "Zone name",
}
for key in set(keys) - set(names): # may not have them all...
if key.startswith('Ex'):
kn = names.get(key[2:])
if kn:
names[key] = "Ex" + kn
continue
names[key] = "%%%s" % key
return (patt, names)
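# Sketch (not part of the diff): how the pattern/name tables returned above are used
# by DatePatternRegex in datetemplate.py to derive a readable template name. The
# sample pattern string is an example.
import re
from fail2ban.server.strptime import getTimePatternRE

patt, names = getTimePatternRE()
fmt = re.compile(patt).sub(r'%(\1)s', "%ExY-%Exm-%Exd %ExH:%ExM:%ExS")
name = fmt % names   # e.g. "ExYear-ExMonth-ExDay Ex24hour:ExMinute:ExSecond"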
def reGroupDictStrptime(found_dict, msec=False):
|
||||
"""Return time from dictionary of strptime fields
|
||||
|
||||
This is tweaked from python built-in _strptime.
|
||||
|
@ -58,14 +107,15 @@ def reGroupDictStrptime(found_dict):
|
|||
# weekday and julian defaulted to -1 so as to signal need to calculate
|
||||
# values
|
||||
weekday = julian = -1
|
||||
for group_key in found_dict.keys():
|
||||
for key, val in found_dict.iteritems():
|
||||
if val is None: continue
|
||||
# Directives not explicitly handled below:
|
||||
# c, x, X
|
||||
# handled by making out of other directives
|
||||
# U, W
|
||||
# worthless without day of the week
|
||||
if group_key == 'y':
|
||||
year = int(found_dict['y'])
|
||||
if key == 'y':
|
||||
year = int(val)
|
||||
# Open Group specification for strptime() states that a %y
|
||||
#value in the range of [00, 68] is in the century 2000, while
|
||||
#[69,99] is in the century 1900
|
||||
|
@ -73,20 +123,20 @@ def reGroupDictStrptime(found_dict):
|
|||
year += 2000
|
||||
else:
|
||||
year += 1900
|
||||
elif group_key == 'Y':
|
||||
year = int(found_dict['Y'])
|
||||
elif group_key == 'm':
|
||||
month = int(found_dict['m'])
|
||||
elif group_key == 'B':
|
||||
month = locale_time.f_month.index(found_dict['B'].lower())
|
||||
elif group_key == 'b':
|
||||
month = locale_time.a_month.index(found_dict['b'].lower())
|
||||
elif group_key == 'd':
|
||||
day = int(found_dict['d'])
|
||||
elif group_key == 'H':
|
||||
hour = int(found_dict['H'])
|
||||
elif group_key == 'I':
|
||||
hour = int(found_dict['I'])
|
||||
elif key == 'Y':
|
||||
year = int(val)
|
||||
elif key == 'm':
|
||||
month = int(val)
|
||||
elif key == 'B':
|
||||
month = locale_time.f_month.index(val.lower())
|
||||
elif key == 'b':
|
||||
month = locale_time.a_month.index(val.lower())
|
||||
elif key == 'd':
|
||||
day = int(val)
|
||||
elif key == 'H':
|
||||
hour = int(val)
|
||||
elif key == 'I':
|
||||
hour = int(val)
|
||||
ampm = found_dict.get('p', '').lower()
|
||||
# If there was no AM/PM indicator, we'll treat this like AM
|
||||
if ampm in ('', locale_time.am_pm[0]):
|
||||
|
@ -101,38 +151,39 @@ def reGroupDictStrptime(found_dict):
|
|||
# 12 noon == 12 PM == hour 12
|
||||
if hour != 12:
|
||||
hour += 12
|
||||
elif group_key == 'M':
|
||||
minute = int(found_dict['M'])
|
||||
elif group_key == 'S':
|
||||
second = int(found_dict['S'])
|
||||
elif group_key == 'f':
|
||||
s = found_dict['f']
|
||||
# Pad to always return microseconds.
|
||||
s += "0" * (6 - len(s))
|
||||
fraction = int(s)
|
||||
elif group_key == 'A':
|
||||
weekday = locale_time.f_weekday.index(found_dict['A'].lower())
|
||||
elif group_key == 'a':
|
||||
weekday = locale_time.a_weekday.index(found_dict['a'].lower())
|
||||
elif group_key == 'w':
|
||||
weekday = int(found_dict['w'])
|
||||
elif key == 'M':
|
||||
minute = int(val)
|
||||
elif key == 'S':
|
||||
second = int(val)
|
||||
elif key == 'f':
|
||||
if msec:
|
||||
s = val
|
||||
# Pad to always return microseconds.
|
||||
s += "0" * (6 - len(s))
|
||||
fraction = int(s)
|
||||
elif key == 'A':
|
||||
weekday = locale_time.f_weekday.index(val.lower())
|
||||
elif key == 'a':
|
||||
weekday = locale_time.a_weekday.index(val.lower())
|
||||
elif key == 'w':
|
||||
weekday = int(val)
|
||||
if weekday == 0:
|
||||
weekday = 6
|
||||
else:
|
||||
weekday -= 1
|
||||
elif group_key == 'j':
|
||||
julian = int(found_dict['j'])
|
||||
elif group_key in ('U', 'W'):
|
||||
week_of_year = int(found_dict[group_key])
|
||||
if group_key == 'U':
|
||||
elif key == 'j':
|
||||
julian = int(val)
|
||||
elif key in ('U', 'W'):
|
||||
week_of_year = int(val)
|
||||
if key == 'U':
|
||||
# U starts week on Sunday.
|
||||
week_of_year_start = 6
|
||||
else:
|
||||
# W starts week on Monday.
|
||||
week_of_year_start = 0
|
||||
elif group_key == 'z':
|
||||
z = found_dict['z']
|
||||
if z == "Z":
|
||||
elif key == 'z':
|
||||
z = val
|
||||
if z in ("Z", "UTC", "GMT"):
|
||||
tzoffset = 0
|
||||
else:
|
||||
tzoffset = int(z[1:3]) * 60 # Hours...
|
||||
|
@ -140,6 +191,10 @@ def reGroupDictStrptime(found_dict):
|
|||
tzoffset += int(z[-2:]) # ...and minutes
|
||||
if z.startswith("-"):
|
||||
tzoffset = -tzoffset
|
||||
elif key == 'Z':
|
||||
z = val
|
||||
if z in ("UTC", "GMT"):
|
||||
tzoffset = 0
|
||||
|
||||
# Fail2Ban will assume it's this year
|
||||
assume_year = False
|
||||
|
@ -176,7 +231,7 @@ def reGroupDictStrptime(found_dict):
|
|||
# Actully create date
|
||||
date_result = datetime.datetime(
|
||||
year, month, day, hour, minute, second, fraction)
|
||||
if gmtoff:
|
||||
if gmtoff is not None:
|
||||
date_result = date_result - datetime.timedelta(seconds=gmtoff)
|
||||
|
||||
if date_result > now and assume_today:
|
||||
|
@ -189,7 +244,9 @@ def reGroupDictStrptime(found_dict):
|
|||
year=year-1, month=month, day=day)
|
||||
|
||||
if gmtoff is not None:
|
||||
return calendar.timegm(date_result.utctimetuple())
|
||||
tm = calendar.timegm(date_result.utctimetuple())
|
||||
else:
|
||||
return time.mktime(date_result.timetuple())
|
||||
|
||||
tm = time.mktime(date_result.timetuple())
|
||||
if msec:
|
||||
tm += fraction/1000000.0
|
||||
return tm
|
||||
|
|
|
@ -131,6 +131,9 @@ class Transmitter:
|
|||
return self.status(command[1:])
|
||||
elif command[0] == "version":
|
||||
return version.version
|
||||
elif command[0] == "config-error":
|
||||
logSys.error(command[1])
|
||||
return None
|
||||
raise Exception("Invalid command")
|
||||
|
||||
def __commandSet(self, command, multiple=False):
|
||||
|
@ -308,7 +311,7 @@ class Transmitter:
|
|||
actionvalue = command[4]
|
||||
setattr(action, actionkey, actionvalue)
|
||||
return getattr(action, actionkey)
|
||||
raise Exception("Invalid command (no set action or not yet implemented)")
|
||||
raise Exception("Invalid command %r (no set action or not yet implemented)" % (command[1],))
|
||||
|
||||
def __commandGet(self, command):
|
||||
name = command[0]
|
||||
|
|
|
@ -110,7 +110,7 @@ class Utils():
|
|||
return flags
|
||||
|
||||
@staticmethod
|
||||
def executeCmd(realCmd, timeout=60, shell=True, output=False, tout_kill_tree=True):
|
||||
def executeCmd(realCmd, timeout=60, shell=True, output=False, tout_kill_tree=True, success_codes=(0,)):
|
||||
"""Executes a command.
|
||||
|
||||
Parameters
|
||||
|
@ -170,7 +170,7 @@ class Utils():
|
|||
time.sleep(Utils.DEFAULT_SLEEP_INTERVAL)
|
||||
retcode = popen.poll()
|
||||
#logSys.debug("%s -- killed %s ", realCmd, retcode)
|
||||
if retcode is None and not Utils.pid_exists(pgid):
|
||||
if retcode is None and not Utils.pid_exists(pgid): # pragma: no cover
|
||||
retcode = signal.SIGKILL
|
||||
except OSError as e:
|
||||
stderr = "%s -- failed with %s" % (realCmd, e)
|
||||
|
@ -178,7 +178,7 @@ class Utils():
|
|||
if not popen:
|
||||
return False if not output else (False, stdout, stderr, retcode)
|
||||
|
||||
std_level = retcode == 0 and logging.DEBUG or logging.ERROR
|
||||
std_level = logging.DEBUG if retcode in success_codes else logging.ERROR
|
||||
# if we need output (to return or to log it):
|
||||
if output or std_level >= logSys.getEffectiveLevel():
|
||||
# if was timeouted (killed/terminated) - to prevent waiting, set std handles to non-blocking mode.
|
||||
|
@@ -208,8 +208,8 @@
popen.stderr.close()
success = False
if retcode == 0:
logSys.debug("%-.40s -- returned successfully", realCmd)
if retcode in success_codes:
logSys.debug("%-.40s -- returned successfully %i", realCmd, retcode)
success = True
elif retcode is None:
logSys.error("%-.40s -- unable to kill PID %i", realCmd, popen.pid)
@@ -223,7 +223,9 @@
logSys.error("%-.40s -- returned %i", realCmd, retcode)
if msg:
logSys.info("HINT on %i: %s", retcode, msg % locals())
return success if not output else (success, stdout, stderr, retcode)
if output:
return success, stdout, stderr, retcode
return success if len(success_codes) == 1 else (success, retcode)
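# Sketch (not part of the diff): the extended success_codes as used by the
# ignorecommand handling in filter.py above; with more than one accepted code the
# call returns (success, retcode). The command string is an arbitrary example.
from fail2ban.server.utils import Utils

ret, retcode = Utils.executeCmd("test -e /etc/hosts", success_codes=(0, 1))
ignored = ret and retcode == 0   # retcode 1 means "executed fine, but do not ignore"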
@staticmethod
|
||||
def wait_for(cond, timeout, interval=None):
|
||||
|
|
|
@ -32,6 +32,7 @@ if sys.version_info >= (2,7): # pragma: no cover - may be unavailable
|
|||
|
||||
def setUp(self):
|
||||
"""Call before every test case."""
|
||||
super(BadIPsActionTest, self).setUp()
|
||||
unittest.F2B.SkipIfNoNetwork()
|
||||
|
||||
self.jail = DummyJail()
|
||||
|
|
|
@ -21,6 +21,7 @@ import os
|
|||
import smtpd
|
||||
import threading
|
||||
import unittest
|
||||
import re
|
||||
import sys
|
||||
if sys.version_info >= (3, 3):
|
||||
import importlib
|
||||
|
@ -41,7 +42,9 @@ class TestSMTPServer(smtpd.SMTPServer):
|
|||
self.peer = peer
|
||||
self.mailfrom = mailfrom
|
||||
self.rcpttos = rcpttos
|
||||
self.data = data
|
||||
self.org_data = data
|
||||
# replace new line (with tab or space) for possible mime translations (word wrap):
|
||||
self.data = re.sub(r"\n[\t ]", " ", data)
|
||||
self.ready = True
|
||||
|
||||
|
||||
|
@ -49,6 +52,7 @@ class SMTPActionTest(unittest.TestCase):
|
|||
|
||||
def setUp(self):
|
||||
"""Call before every test case."""
|
||||
super(SMTPActionTest, self).setUp()
|
||||
self.jail = DummyJail()
|
||||
pythonModule = os.path.join(CONFIG_DIR, "action.d", "smtp.py")
|
||||
pythonModuleName = os.path.basename(pythonModule.rstrip(".py"))
|
||||
|
@ -99,23 +103,28 @@ class SMTPActionTest(unittest.TestCase):
|
|||
"Subject: [Fail2Ban] %s: stopped" %
|
||||
self.jail.name in self.smtpd.data)
|
||||
|
||||
def testBan(self):
|
||||
def _testBan(self, restored=False):
|
||||
aInfo = {
|
||||
'ip': "127.0.0.2",
|
||||
'failures': 3,
|
||||
'matches': "Test fail 1\n",
|
||||
'ipjailmatches': "Test fail 1\nTest Fail2\n",
|
||||
'ipmatches': "Test fail 1\nTest Fail2\nTest Fail3\n",
|
||||
}
|
||||
}
|
||||
if restored:
|
||||
aInfo['restored'] = 1
|
||||
|
||||
self._exec_and_wait(lambda: self.action.ban(aInfo))
|
||||
if restored: # no mail, should raises attribute error:
|
||||
self.assertRaises(AttributeError, lambda: self.smtpd.mailfrom)
|
||||
return
|
||||
self.assertEqual(self.smtpd.mailfrom, "fail2ban")
|
||||
self.assertEqual(self.smtpd.rcpttos, ["root"])
|
||||
subject = "Subject: [Fail2Ban] %s: banned %s" % (
|
||||
self.jail.name, aInfo['ip'])
|
||||
self.assertIn(subject, self.smtpd.data.replace("\n", ""))
|
||||
self.assertTrue(
|
||||
"%i attempts" % aInfo['failures'] in self.smtpd.data)
|
||||
self.assertIn(subject, self.smtpd.data)
|
||||
self.assertIn(
|
||||
"%i attempts" % aInfo['failures'], self.smtpd.data)
|
||||
|
||||
self.action.matches = "matches"
|
||||
self._exec_and_wait(lambda: self.action.ban(aInfo))
|
||||
|
@ -128,6 +137,12 @@ class SMTPActionTest(unittest.TestCase):
|
|||
self.action.matches = "ipmatches"
|
||||
self._exec_and_wait(lambda: self.action.ban(aInfo))
|
||||
self.assertIn(aInfo['ipmatches'], self.smtpd.data)
|
||||
|
||||
def testBan(self):
|
||||
self._testBan()
|
||||
|
||||
def testNOPByRestored(self):
|
||||
self._testBan(restored=True)
|
||||
|
||||
def testOptions(self):
|
||||
self._exec_and_wait(self.action.start)
|
||||
|
|
|
@ -34,6 +34,7 @@ from ..server.ticket import BanTicket
|
|||
class AddFailure(unittest.TestCase):
|
||||
def setUp(self):
|
||||
"""Call before every test case."""
|
||||
super(AddFailure, self).setUp()
|
||||
setUpMyTime()
|
||||
self.__ticket = BanTicket('193.168.0.128', 1167605999.0)
|
||||
self.__banManager = BanManager()
|
||||
|
@ -155,6 +156,7 @@ class AddFailure(unittest.TestCase):
|
|||
class StatusExtendedCymruInfo(unittest.TestCase):
|
||||
def setUp(self):
|
||||
"""Call before every test case."""
|
||||
super(StatusExtendedCymruInfo, self).setUp()
|
||||
unittest.F2B.SkipIfNoNetwork()
|
||||
setUpMyTime()
|
||||
self.__ban_ip = "93.184.216.34"
|
||||
|
|
|
@ -32,6 +32,7 @@ class BeautifierTest(unittest.TestCase):
|
|||
|
||||
def setUp(self):
|
||||
""" Call before every test case """
|
||||
super(BeautifierTest, self).setUp()
|
||||
self.b = Beautifier()
|
||||
|
||||
def tearDown(self):
|
||||
|
|
|
@ -55,6 +55,7 @@ class ConfigReaderTest(unittest.TestCase):
|
|||
|
||||
def setUp(self):
|
||||
"""Call before every test case."""
|
||||
super(ConfigReaderTest, self).setUp()
|
||||
self.d = tempfile.mkdtemp(prefix="f2b-temp")
|
||||
self.c = ConfigReaderUnshared(basedir=self.d)
|
||||
|
||||
|
@ -193,13 +194,15 @@ class JailReaderTest(LogCaptureTestCase):
|
|||
self.assertTrue(jail.read())
|
||||
self.assertFalse(jail.getOptions())
|
||||
self.assertTrue(jail.isEnabled())
|
||||
self.assertLogged('Error in action definition joho[foo')
|
||||
# This unittest has been deactivated for some time...
|
||||
# self.assertLogged(
|
||||
# 'Caught exception: While reading action joho[foo we should have got 1 or 2 groups. Got: 0')
|
||||
# let's test for what is actually logged and handle changes in the future
|
||||
self.assertLogged(
|
||||
"Caught exception: 'NoneType' object has no attribute 'endswith'")
|
||||
self.assertLogged("Invalid action definition 'joho[foo'")
|
||||
|
||||
def testJailFilterBrokenDef(self):
|
||||
jail = JailReader('brokenfilterdef', basedir=IMPERFECT_CONFIG,
|
||||
share_config=IMPERFECT_CONFIG_SHARE_CFG)
|
||||
self.assertTrue(jail.read())
|
||||
self.assertFalse(jail.getOptions())
|
||||
self.assertTrue(jail.isEnabled())
|
||||
self.assertLogged("Invalid filter definition 'flt[test'")
|
||||
|
||||
if STOCK:
|
||||
def testStockSSHJail(self):
|
||||
|
@ -344,7 +347,7 @@ class FilterReaderTest(unittest.TestCase):
|
|||
['set', 'testcase01', 'addjournalmatch',
|
||||
"FIELD= with spaces ", "+", "AFIELD= with + char and spaces"],
|
||||
['set', 'testcase01', 'datepattern', "%Y %m %d %H:%M:%S"],
|
||||
['set', 'testcase01', 'maxlines', "1"], # Last for overide test
|
||||
['set', 'testcase01', 'maxlines', 1], # Last for overide test
|
||||
]
|
||||
filterReader = FilterReader("testcase01", "testcase01", {})
|
||||
filterReader.setBaseDir(TEST_FILES_DIR)
|
||||
|
@ -496,7 +499,7 @@ class JailsReaderTest(LogCaptureTestCase):
|
|||
def testReadTestJailConf(self):
|
||||
jails = JailsReader(basedir=IMPERFECT_CONFIG, share_config=IMPERFECT_CONFIG_SHARE_CFG)
|
||||
self.assertTrue(jails.read())
|
||||
self.assertFalse(jails.getOptions())
|
||||
self.assertFalse(jails.getOptions(ignoreWrong=False))
|
||||
self.assertRaises(ValueError, jails.convert)
|
||||
comm_commands = jails.convert(allow_no_files=True)
|
||||
self.maxDiff = None
|
||||
|
@ -514,19 +517,27 @@ class JailsReaderTest(LogCaptureTestCase):
|
|||
['add', 'brokenaction', 'auto'],
|
||||
['set', 'brokenaction', 'addfailregex', '<IP>'],
|
||||
['set', 'brokenaction', 'addaction', 'brokenaction'],
|
||||
['set',
|
||||
'brokenaction',
|
||||
'action',
|
||||
'brokenaction',
|
||||
'actionban',
|
||||
'hit with big stick <ip>'],
|
||||
['multi-set', 'brokenaction', 'action', 'brokenaction', [
|
||||
['actionban', 'hit with big stick <ip>'],
|
||||
['actname', 'brokenaction']
|
||||
]],
|
||||
['add', 'parse_to_end_of_jail.conf', 'auto'],
|
||||
['set', 'parse_to_end_of_jail.conf', 'addfailregex', '<IP>'],
|
||||
['start', 'emptyaction'],
|
||||
['start', 'missinglogfiles'],
|
||||
['start', 'brokenaction'],
|
||||
['start', 'parse_to_end_of_jail.conf'],]))
|
||||
self.assertLogged("Errors in jail 'missingbitsjail'. Skipping...")
|
||||
['start', 'parse_to_end_of_jail.conf'],
|
||||
['config-error',
|
||||
"Jail 'brokenactiondef' skipped, because of wrong configuration: Invalid action definition 'joho[foo'"],
|
||||
['config-error',
|
||||
"Jail 'brokenfilterdef' skipped, because of wrong configuration: Invalid filter definition 'flt[test'"],
|
||||
['config-error',
|
||||
"Jail 'missingaction' skipped, because of wrong configuration: Unable to read action 'noactionfileforthisaction'"],
|
||||
['config-error',
|
||||
"Jail 'missingbitsjail' skipped, because of wrong configuration: Unable to read the filter 'catchallthebadies'"],
|
||||
]))
|
||||
self.assertLogged("Errors in jail 'missingbitsjail'.")
|
||||
self.assertNotLogged("Skipping...")
|
||||
self.assertLogged("No file(s) found for glob /weapons/of/mass/destruction")
|
||||
|
||||
if STOCK:
|
||||
|
@ -535,7 +546,10 @@ class JailsReaderTest(LogCaptureTestCase):
|
|||
actionName = os.path.basename(actionConfig).replace('.conf', '')
|
||||
actionReader = ActionReader(actionName, "TEST", {}, basedir=CONFIG_DIR)
|
||||
self.assertTrue(actionReader.read())
|
||||
actionReader.getOptions({}) # populate _opts
|
||||
try:
|
||||
actionReader.getOptions({}) # populate _opts
|
||||
except Exception as e: # pragma: no cover
|
||||
self.fail("action %r\n%s: %s" % (actionName, type(e).__name__, e))
|
||||
if not actionName.endswith('-common'):
|
||||
self.assertIn('Definition', actionReader.sections(),
|
||||
msg="Action file %r is lacking [Definition] section" % actionConfig)
|
||||
|
@ -614,7 +628,7 @@ class JailsReaderTest(LogCaptureTestCase):
|
|||
# grab all filter names
|
||||
filters = set(os.path.splitext(os.path.split(a)[1])[0]
|
||||
for a in glob.glob(os.path.join('config', 'filter.d', '*.conf'))
|
||||
if not a.endswith('common.conf'))
|
||||
if not (a.endswith('common.conf') or a.endswith('-aggressive.conf')))
|
||||
# get filters of all jails (filter names without options inside filter[...])
|
||||
filters_jail = set(
|
||||
JailReader.extractOptions(jail.options['filter'])[0] for jail in jails.jails
|
||||
|
|
|
@ -1,64 +0,0 @@
|
|||
# Generic configuration items (to be used as interpolations) in other
|
||||
# filters or actions configurations
|
||||
#
|
||||
|
||||
[INCLUDES]
|
||||
|
||||
# Load customizations if any available
|
||||
after = common.local
|
||||
|
||||
|
||||
[DEFAULT]
|
||||
|
||||
# Daemon definition is to be specialized (if needed) in .conf file
|
||||
_daemon = \S*
|
||||
|
||||
#
|
||||
# Shortcuts for easier comprehension of the failregex
|
||||
#
|
||||
# PID.
|
||||
# EXAMPLES: [123]
|
||||
__pid_re = (?:\[\d+\])
|
||||
|
||||
# Daemon name (with optional source_file:line or whatever)
|
||||
# EXAMPLES: pam_rhosts_auth, [sshd], pop(pam_unix)
|
||||
__daemon_re = [\[\(]?%(_daemon)s(?:\(\S+\))?[\]\)]?:?
|
||||
|
||||
# extra daemon info
|
||||
# EXAMPLE: [ID 800047 auth.info]
|
||||
__daemon_extra_re = \[ID \d+ \S+\]
|
||||
|
||||
# Combinations of daemon name and PID
|
||||
# EXAMPLES: sshd[31607], pop(pam_unix)[4920]
|
||||
__daemon_combs_re = (?:%(__pid_re)s?:\s+%(__daemon_re)s|%(__daemon_re)s%(__pid_re)s?:?)
|
||||
|
||||
# Some messages have a kernel prefix with a timestamp
|
||||
# EXAMPLES: kernel: [769570.846956]
|
||||
__kernel_prefix = kernel: \[ *\d+\.\d+\]
|
||||
|
||||
__hostname = \S+
|
||||
|
||||
# A MD5 hex
|
||||
# EXAMPLES: 07:06:27:55:b0:e3:0c:3c:5a:28:2d:7c:7e:4c:77:5f
|
||||
__md5hex = (?:[\da-f]{2}:){15}[\da-f]{2}
|
||||
|
||||
# bsdverbose is where syslogd is started with -v or -vv and results in <4.3> or
|
||||
# <auth.info> appearing before the host as per testcases/files/logs/bsd/*.
|
||||
__bsd_syslog_verbose = <[^.]+\.[^.]+>
|
||||
|
||||
__vserver = @vserver_\S+
|
||||
|
||||
__date_ambit = (?:\[\])
|
||||
|
||||
# Common line prefixes (beginnings) which could be used in filters
|
||||
#
|
||||
# [bsdverbose]? [hostname] [vserver tag] daemon_id spaces
|
||||
#
|
||||
# This can be optional (for instance if we match named native log files)
|
||||
__prefix_line = %(__date_ambit)s?\s*(?:%(__bsd_syslog_verbose)s\s+)?(?:%(__hostname)s\s+)?(?:%(__kernel_prefix)s\s+)?(?:%(__vserver)s\s+)?(?:%(__daemon_combs_re)s\s+)?(?:%(__daemon_extra_re)s\s+)?
|
||||
|
||||
# PAM authentication mechanism check for failures, e.g.: pam_unix, pam_sss,
|
||||
# pam_ldap
|
||||
__pam_auth = pam_unix
|
||||
|
||||
# Author: Yaroslav Halchenko
|
|
@ -8,7 +8,7 @@
|
|||
# Read common prefixes. If any customizations available -- read them from
|
||||
# common.local. common.conf is a symlink to the original common.conf and
|
||||
# should be copied (dereferenced) during installation
|
||||
before = common.conf
|
||||
before = ../../../../config/filter.d/common.conf
|
||||
|
||||
[Definition]
|
||||
|
||||
|
@ -20,3 +20,8 @@ failregex = ^%(__prefix_line)sF2B: failure from <HOST>$
|
|||
# just to test multiple ignoreregex:
|
||||
ignoreregex = ^%(__prefix_line)sF2B: error from 192.0.2.251$
|
||||
^%(__prefix_line)sF2B: error from 192.0.2.252$
|
||||
|
||||
# specify only exact date patterns, +1 with %%Y to test usage of last known date by wrong dates like 0000-00-00...
|
||||
datepattern = {^LN-BEG}%%ExY(?P<_sep>[-/.])%%m(?P=_sep)%%d[T ]%%H:%%M:%%S(?:[.,]%%f)?(?:\s*%%z)?
|
||||
{^LN-BEG}(?:%%a )?%%b %%d %%H:%%M:%%S(?:\.%%f)?(?: %%ExY)?
|
||||
{^LN-BEG}%%Y(?P<_sep>[-/.])%%m(?P=_sep)%%d[T ]%%H:%%M:%%S(?:[.,]%%f)?(?:\s*%%z)?
|
||||
|
|
|
@ -27,10 +27,18 @@ logpath = /weapons/of/mass/destruction
|
|||
enabled = true
|
||||
action = joho[foo
|
||||
|
||||
[brokenfilterdef]
|
||||
enabled = true
|
||||
filter = flt[test
|
||||
|
||||
[brokenaction]
|
||||
enabled = true
|
||||
action = brokenaction
|
||||
|
||||
[missingaction]
|
||||
enabled = true
|
||||
action = noactionfileforthisaction
|
||||
|
||||
[missingbitsjail]
|
||||
enabled = true
|
||||
filter = catchallthebadies
|
||||
|
|
|
@ -30,7 +30,7 @@ import datetime
|
|||
|
||||
from ..server.datedetector import DateDetector
|
||||
from ..server import datedetector
|
||||
from ..server.datetemplate import DateTemplate
|
||||
from ..server.datetemplate import DatePatternRegex, DateTemplate
|
||||
from .utils import setUpMyTime, tearDownMyTime, LogCaptureTestCase
|
||||
from ..helpers import getLogger
|
||||
|
||||
|
@ -42,35 +42,40 @@ class DateDetectorTest(LogCaptureTestCase):
|
|||
def setUp(self):
|
||||
"""Call before every test case."""
|
||||
LogCaptureTestCase.setUp(self)
|
||||
self.__old_eff_level = datedetector.logLevel
|
||||
datedetector.logLevel = logSys.getEffectiveLevel()
|
||||
setUpMyTime()
|
||||
self.__datedetector = DateDetector()
|
||||
self.__datedetector.addDefaultTemplate()
|
||||
self.__datedetector = None
|
||||
|
||||
def tearDown(self):
|
||||
"""Call after every test case."""
|
||||
LogCaptureTestCase.tearDown(self)
|
||||
datedetector.logLevel = self.__old_eff_level
|
||||
tearDownMyTime()
|
||||
|
||||
@property
|
||||
def datedetector(self):
|
||||
if self.__datedetector is None:
|
||||
self.__datedetector = DateDetector()
|
||||
self.__datedetector.addDefaultTemplate()
|
||||
return self.__datedetector
|
||||
|
||||
def testGetEpochTime(self):
|
||||
self.__datedetector = DateDetector()
|
||||
self.__datedetector.appendTemplate('EPOCH')
|
||||
# correct epoch time, using all variants:
|
||||
for dateUnix in (1138049999, 32535244799):
|
||||
for date in ("%s", "[%s]", "[%s.555]", "audit(%s.555:101)"):
|
||||
date = date % dateUnix
|
||||
log = date + " [sshd] error: PAM: Authentication failure"
|
||||
datelog = self.__datedetector.getTime(log)
|
||||
datelog = self.datedetector.getTime(log)
|
||||
self.assertTrue(datelog, "Parse epoch time for %s failed" % (date,))
|
||||
( datelog, matchlog ) = datelog
|
||||
self.assertEqual(int(datelog), dateUnix)
|
||||
self.assertIn(matchlog.group(), (str(dateUnix), str(dateUnix)+'.555'))
|
||||
self.assertIn(matchlog.group(1), (str(dateUnix), str(dateUnix)+'.555'))
|
||||
# wrong, no epoch time (< 10 digits, more as 11 digits, begin/end of word) :
|
||||
for dateUnix in ('123456789', '9999999999999999', '1138049999A', 'A1138049999'):
|
||||
for date in ("%s", "[%s]", "[%s.555]", "audit(%s.555:101)"):
|
||||
date = date % dateUnix
|
||||
log = date + " [sshd] error: PAM: Authentication failure"
|
||||
datelog = self.__datedetector.getTime(log)
|
||||
datelog = self.datedetector.getTime(log)
|
||||
self.assertFalse(datelog)
|
||||
|
||||
def testGetTime(self):
|
||||
|
@@ -80,109 +85,331 @@ class DateDetectorTest(LogCaptureTestCase):
        # is not correctly determined atm, since year is not present
        # in the log entry. Since this doesn't affect the operation
        # of fail2ban -- we just ignore incorrect day of the week
        ( datelog, matchlog ) = self.__datedetector.getTime(log)
        ( datelog, matchlog ) = self.datedetector.getTime(log)
        self.assertEqual(datelog, dateUnix)
        self.assertEqual(matchlog.group(), 'Jan 23 21:59:59')
        self.assertEqual(matchlog.group(1), 'Jan 23 21:59:59')

    def testVariousTimes(self):
        """Test detection of various common date/time formats f2b should understand
        """
        dateUnix = 1106513999.0

        for anchored, sdate in (
            (False, "Jan 23 21:59:59"),
            (False, "Sun Jan 23 21:59:59 2005"),
            (False, "Sun Jan 23 21:59:59"),
            (False, "Sun Jan 23 2005 21:59:59"),
            (False, "2005/01/23 21:59:59"),
            (False, "2005.01.23 21:59:59"),
            (False, "23/01/2005 21:59:59"),
            (False, "23/01/05 21:59:59"),
            (False, "23/Jan/2005:21:59:59"),
            (False, "23/Jan/2005:21:59:59 +0100"),
            (False, "01/23/2005:21:59:59"),
            (False, "2005-01-23 21:59:59"),
            (False, "2005-01-23 21:59:59,000"), # proftpd
            (False, "23-Jan-2005 21:59:59"),
            (False, "23-Jan-2005 21:59:59.02"),
            (False, "23-Jan-2005 21:59:59 +0100"),
            (False, "23-01-2005 21:59:59"),
            (True, "1106513999"), # Portsentry
            (False, "01-23-2005 21:59:59.252"), # reported on f2b, causes Feb29 fix to break
            (False, "@4000000041f4104f00000000"), # TAI64N
            (False, "2005-01-23T20:59:59.252Z"), # ISO 8601 (UTC)
            (False, "2005-01-23T15:59:59-05:00"), # ISO 8601 with TZ
            (False, "2005-01-23T21:59:59"), # ISO 8601 no TZ, assume local
            (True, "<01/23/05@21:59:59>"),
            (True, "050123 21:59:59"), # MySQL
            (True, "Jan-23-05 21:59:59"), # ASSP like
            (False, "Jan 23, 2005 9:59:59 PM"), # Apache Tomcat
            (True, "1106513999"), # Regular epoch
            (True, "1106513999.000"), # Regular epoch with millisec
            (False, "audit(1106513999.000:987)"), # SELinux
        # anchored - matching expression (pattern) is anchored
        # bound - pattern can be tested using a word boundary (e.g. False if an optional part precedes it)
        # sdate - date string used in the test log-line
        # rdate - if specified, the expected match, where it differs from sdate
        for anchored, bound, sdate, rdate in (
            (False, True, "Jan 23 21:59:59", None),
            (False, False, "Sun Jan 23 21:59:59 2005", None),
            (False, False, "Sun Jan 23 21:59:59", None),
            (False, False, "Sun Jan 23 2005 21:59:59", None),
            (False, True, "2005/01/23 21:59:59", None),
            (False, True, "2005.01.23 21:59:59", None),
            (False, True, "23/01/2005 21:59:59", None),
            (False, True, "23/01/05 21:59:59", None),
            (False, True, "23/Jan/2005:21:59:59", None),
            (False, True, "23/Jan/2005:21:59:59 +0100", None),
            (False, True, "01/23/2005:21:59:59", None),
            (False, True, "2005-01-23 21:59:59", None),
            (False, True, "2005-01-23 21:59:59,000", None), # proftpd
            (False, True, "23-Jan-2005 21:59:59", None),
            (False, True, "23-Jan-2005 21:59:59.02", None),
            (False, True, "23-Jan-2005 21:59:59 +0100", None),
            (False, True, "23-01-2005 21:59:59", None),
            (True, True, "1106513999", None), # Portsentry
            (False, True, "01-23-2005 21:59:59.252", None), # reported on f2b, causes Feb29 fix to break
            (False, False, "@4000000041f4104f00000000", None), # TAI64N
            (False, True, "2005-01-23T20:59:59.252Z", None), # ISO 8601 (UTC)
            (False, True, "2005-01-23T15:59:59-05:00", None), # ISO 8601 with TZ
            (False, True, "2005-01-23 21:59:59", None), # ISO 8601 no TZ, assume local
            (False, True, "20050123T215959", None), # Short ISO with T
            (False, True, "20050123 215959", None), # Short ISO with space
            (True, True, "<01/23/05@21:59:59>", None),
            (False, True, "050123 21:59:59", None), # MySQL
            (True, True, "Jan-23-05 21:59:59", None), # ASSP like
            (False, True, "Jan 23, 2005 9:59:59 PM", None), # Apache Tomcat
            (True, True, "1106513999", None), # Regular epoch
            (True, True, "1106513999.000", None), # Regular epoch with millisec
            (True, True, "[1106513999.000]", "1106513999.000"), # epoch in square brackets (brackets are not in match)
            (False, True, "audit(1106513999.000:987)", "1106513999.000"), # SELinux
            (True, True, "no date line", None), # no date in string
        ):
            if rdate is None and sdate != "no date line": rdate = sdate
            logSys.debug('== test %r', (anchored, bound, sdate, rdate))
            for should_match, prefix in (
                (rdate is not None, ""),
                (not anchored, "bogus-prefix "),
                (False, "word-boundary")
            ):
            for should_match, prefix in ((True, ""),
                                         (not anchored, "bogus-prefix ")):
                log = prefix + sdate + "[sshd] error: PAM: Authentication failure"

                # if the boundary test is not allowed:
                if not bound and prefix == "word-boundary": continue
                logSys.debug(' -- test %-5s for %r', should_match, log)
                # with getTime:
                logtime = self.__datedetector.getTime(log)
                logtime = self.datedetector.getTime(log)
                if should_match:
                    self.assertNotEqual(logtime, None, "getTime retrieved nothing: failure for %s, anchored: %r, log: %s" % ( sdate, anchored, log))
                    self.assertNotEqual(logtime, None,
                        "getTime retrieved nothing: failure for %s by prefix %r, anchored: %r, log: %s" % ( sdate, prefix, anchored, log))
                    ( logUnix, logMatch ) = logtime
                    self.assertEqual(logUnix, dateUnix, "getTime comparison failure for %s: \"%s\" is not \"%s\"" % (sdate, logUnix, dateUnix))
                    if sdate.startswith('audit('):
                        # yes, special case, the group only matches the number
                        self.assertEqual(logMatch.group(), '1106513999.000')
                    else:
                        self.assertEqual(logMatch.group(), sdate)
                    self.assertEqual(logUnix, dateUnix,
                        "getTime comparison failure for %s: by prefix %r \"%s\" is not \"%s\"" % (sdate, prefix, logUnix, dateUnix))
                    self.assertEqual(logMatch.group(1), rdate)
                else:
                    self.assertEqual(logtime, None, "getTime should have not matched for %r Got: %s" % (sdate, logtime))
                    self.assertEqual(logtime, None,
                        "getTime should have not matched for %r by prefix %r Got: %s" % (sdate, prefix, logtime))
                # with getTime(matchTime) - this combination is used in the filter:
                matchTime = self.__datedetector.matchTime(log)
                logtime = self.__datedetector.getTime(log, matchTime)
                (timeMatch, template) = matchTime = self.datedetector.matchTime(log)
                logtime = self.datedetector.getTime(log, matchTime)
                logSys.debug(' -- found - %r', template.name if timeMatch else False)
                if should_match:
                    self.assertNotEqual(logtime, None, "getTime retrieved nothing: failure for %s, anchored: %r, log: %s" % ( sdate, anchored, log))
                    self.assertNotEqual(logtime, None,
                        "getTime retrieved nothing: failure for %s by prefix %r, anchored: %r, log: %s" % ( sdate, prefix, anchored, log))
                    ( logUnix, logMatch ) = logtime
                    self.assertEqual(logUnix, dateUnix, "getTime comparison failure for %s: \"%s\" is not \"%s\"" % (sdate, logUnix, dateUnix))
                    if sdate.startswith('audit('):
                        # yes, special case, the group only matches the number
                        self.assertEqual(logMatch.group(), '1106513999.000')
                    else:
                        self.assertEqual(logMatch.group(), sdate)
                    self.assertEqual(logUnix, dateUnix,
                        "getTime comparison failure for %s by prefix %r: \"%s\" is not \"%s\"" % (sdate, prefix, logUnix, dateUnix))
                    self.assertEqual(logMatch.group(1), rdate)
                else:
                    self.assertEqual(logtime, None, "getTime should have not matched for %r Got: %s" % (sdate, logtime))
                    self.assertEqual(logtime, None,
                        "getTime should have not matched for %r by prefix %r Got: %s" % (sdate, prefix, logtime))
            logSys.debug(' -- OK')
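
The matchTime()/getTime() pairing checked above is the combination the filter itself uses: matchTime() locates the matching template once, and its result can be fed back to getTime() so the template list is not scanned twice. A minimal sketch, assuming a fail2ban 0.10 tree (the log line is illustrative):

from fail2ban.server.datedetector import DateDetector

dd = DateDetector()
dd.addDefaultTemplate()

line = "2005-01-23 21:59:59 [sshd] error: PAM: Authentication failure"
timeMatch, template = matchTime = dd.matchTime(line)  # returns a (match, template) pair
if timeMatch:
    timestamp, match = dd.getTime(line, matchTime)     # reuse the already-found match
    print(template.name, match.group(1), timestamp)
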
    def testAllUniqueTemplateNames(self):
        self.assertRaises(ValueError, self.__datedetector.appendTemplate,
            self.__datedetector.templates[0])
        self.assertRaises(ValueError, self.datedetector.appendTemplate,
            self.datedetector.templates[0])

    def testFullYearMatch_gh130(self):
        # see https://github.com/fail2ban/fail2ban/pull/130
        # yoh: unfortunately this test is not really effective to reproduce the
        # situation but left in place to assure consistent behavior
        mu = time.mktime(datetime.datetime(2012, 10, 11, 2, 37, 17).timetuple())
        logdate = self.__datedetector.getTime('2012/10/11 02:37:17 [error] 18434#0')
        logdate = self.datedetector.getTime('2012/10/11 02:37:17 [error] 18434#0')
        self.assertNotEqual(logdate, None)
        ( logTime, logMatch ) = logdate
        self.assertEqual(logTime, mu)
        self.assertEqual(logMatch.group(), '2012/10/11 02:37:17')
        self.assertEqual(logMatch.group(1), '2012/10/11 02:37:17')
        # confuse it with year being at the end
        for i in xrange(10):
            ( logTime, logMatch ) = self.__datedetector.getTime('11/10/2012 02:37:17 [error] 18434#0')
            ( logTime, logMatch ) = self.datedetector.getTime('11/10/2012 02:37:17 [error] 18434#0')
            self.assertEqual(logTime, mu)
            self.assertEqual(logMatch.group(), '11/10/2012 02:37:17')
            self.assertEqual(logMatch.group(1), '11/10/2012 02:37:17')
        # and now back to the original
        ( logTime, logMatch ) = self.__datedetector.getTime('2012/10/11 02:37:17 [error] 18434#0')
        ( logTime, logMatch ) = self.datedetector.getTime('2012/10/11 02:37:17 [error] 18434#0')
        self.assertEqual(logTime, mu)
        self.assertEqual(logMatch.group(), '2012/10/11 02:37:17')
        self.assertEqual(logMatch.group(1), '2012/10/11 02:37:17')

    def testDateTemplate(self):
        t = DateTemplate()
        t.setRegex('^a{3,5}b?c*$')
        self.assertEqual(t.getRegex(), '^a{3,5}b?c*$')
        self.assertRaises(Exception, t.getDate, '')
        self.assertEqual(t.matchDate('aaaac').group(), 'aaaac')
        t = DateTemplate()
        t.setRegex('^a{3,5}b?c*$')
        self.assertEqual(t.regex, '^(a{3,5}b?c*)$')
        self.assertRaises(Exception, t.getDate, '')
        self.assertEqual(t.matchDate('aaaac').group(1), 'aaaac')

        ## no word boundaries left and right:
        t = DatePatternRegex()
        t.pattern = '(?iu)**time:%ExY%Exm%ExdT%ExH%ExM%ExS**'
        # ** was removed from end-regex:
        self.assertFalse('**' in t.regex)
        # match date:
        dt = 'TIME:20050102T010203'
        self.assertEqual(t.matchDate('X' + dt + 'X').group(1), dt)
        self.assertEqual(t.matchDate(dt).group(1), dt)
        # wrong year (for exact %ExY):
        dt = 'TIME:50050102T010203'
        self.assertFalse(t.matchDate(dt))

        ## start boundary left and word boundary right (automatically if not **):
        t = DatePatternRegex()
        t.pattern = '{^LN-BEG}time:%ExY%Exm%ExdT%ExH%ExM%ExS'
        self.assertTrue('^' in t.regex)
        # try match date:
        dt = 'time:20050102T010203'
        self.assertFalse(t.matchDate('X' + dt))
        self.assertFalse(t.matchDate(dt + 'X'))
        self.assertEqual(t.matchDate('##' + dt + '...').group(1), dt)
        self.assertEqual(t.matchDate(dt).group(1), dt)
        # case sensitive:
        dt = 'TIME:20050102T010203'
        self.assertFalse(t.matchDate(dt))

        ## auto-switching "ignore case" and "unicode"
        t = DatePatternRegex()
        t.pattern = '^%Y %b %d'
        self.assertTrue('(?iu)' in t.regex)
        dt = '2005 jun 03'; self.assertEqual(t.matchDate(dt).group(1), dt)
        dt = '2005 Jun 03'; self.assertEqual(t.matchDate(dt).group(1), dt)
        dt = '2005 JUN 03'; self.assertEqual(t.matchDate(dt).group(1), dt)
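
testDateTemplate above exercises the extended pattern syntax: exact `%Ex…` directives (so a bogus year such as 5005 is rejected), `{^LN-BEG}` to anchor a pattern at the line start, and `**` to suppress the implicit word boundaries. A minimal sketch of using such a pattern directly, assuming a fail2ban 0.10 tree:

from fail2ban.server.datetemplate import DatePatternRegex

t = DatePatternRegex()
t.pattern = '{^LN-BEG}time:%ExY%Exm%ExdT%ExH%ExM%ExS'

m = t.matchDate('time:20050102T010203 some log message')
print(m.group(1) if m else None)             # expected: 'time:20050102T010203'
print(t.matchDate('X' + 'time:20050102T010203'))  # expected: None - not at line start
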
    def testAmbiguousInOrderedTemplates(self):
        dd = self.datedetector
        for (debit, line, cnt) in (
            # shortest distance to datetime should win:
            ("030324 0:03:59", "some free text 030324 0:03:59 -- 2003-03-07 17:05:01 ...", 1),
            # some free text with datetime:
            ("2003-03-07 17:05:01", "some free text 2003-03-07 17:05:01 test ...", 15),
            # distance collision detection (date from foreign input should not be found):
            ("030324 0:04:00", "server mysqld[1000]: 030324 0:04:00 [Warning] Access denied ..."
                " foreign-input just some free text 2003-03-07 17:05:01 test", 10),
            # distance collision detection (first date should be found):
            ("Sep 16 21:30:26", "server mysqld[1020]: Sep 16 21:30:26 server mysqld: 030916 21:30:26 [Warning] Access denied", 15),
            # just to test sorting:
            ("2005-10-07 06:09:42", "server mysqld[5906]: 2005-10-07 06:09:42 5907 [Warning] Access denied", 20),
            ("2005-10-08T15:26:18.237955", "server mysqld[5906]: 2005-10-08T15:26:18.237955 6 [Note] Access denied", 20),
            # date format changed again:
            ("051009 10:05:30", "server mysqld[1000]: 051009 10:05:30 [Warning] Access denied ...", 50),
        ):
            logSys.debug('== test: %r', (debit, line, cnt))
            for i in range(cnt):
                logSys.debug('Line: %s', line)
                match, template = dd.matchTime(line)
                self.assertTrue(match)
                self.assertEqual(match.group(1), debit)
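
testAmbiguousInOrderedTemplates exercises the distance-based collision handling: when several templates could match, the one whose match lies closest to the start of the line wins, so a date embedded in quoted foreign input does not displace the real log timestamp. A minimal sketch of observing this behaviour, assuming a fail2ban 0.10 tree:

from fail2ban.server.datedetector import DateDetector

dd = DateDetector()
dd.addDefaultTemplate()

# the real timestamp (MySQL style, near the line start) should win over the
# ISO-like date that only appears inside the quoted/foreign part of the line:
line = ("server mysqld[1000]: 030324 0:04:00 [Warning] Access denied ..."
        " foreign-input just some free text 2003-03-07 17:05:01 test")
match, template = dd.matchTime(line)
print(match.group(1))    # expected: '030324 0:04:00'
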
    def testLowLevelLogging(self):
        # test coverage for the deep (heavy) debug messages:
        try:
            self.__old_eff_level = datedetector.logLevel
            if datedetector.logLevel < logSys.getEffectiveLevel()+1:
                datedetector.logLevel = logSys.getEffectiveLevel()+1
            dd = self.datedetector
            i = 0
            for (line, cnt) in (
                ("server mysqld[5906]: 2005-10-07 06:09:%02i 5907 [Warning] Access denied", 2),
                ("server mysqld[5906]: 051007 06:10:%02i 5907 [Warning] Access denied", 5),
                ("server mysqld[5906]: 2005-10-07 06:09:%02i 5907 [Warning] Access denied", 10),
            ):
                for i in range(i, i+cnt+1):
                    logSys.debug('== test: %r', (line % i, cnt))
                    match, template = dd.matchTime(line % i)
                    self.assertTrue(match)
        finally:
            datedetector.logLevel = self.__old_eff_level

    def testWrongTemplate(self):
        t = DatePatternRegex('(%ExY%Exm%Exd')
        # lazy compiling used, so try to match:
        self.assertRaises(Exception, t.matchDate, '(20050101')
        self.assertLogged("Compile %r failed" % t.name)
        # abstract:
        t = DateTemplate()
        self.assertRaises(Exception, t.getDate, 'no date line')

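
Because compilation is lazy, an invalid date pattern only surfaces on first use, which is what testWrongTemplate verifies. A minimal sketch of handling that case defensively, assuming a fail2ban 0.10 tree (the unbalanced parenthesis is deliberate):

from fail2ban.server.datetemplate import DatePatternRegex

t = DatePatternRegex('(%ExY%Exm%Exd')   # note the unbalanced '(' - an invalid regex
try:
    t.matchDate('(20050101')            # compilation happens here, not in the constructor
except Exception as e:                  # per the test, "Compile ... failed" is logged and the error propagates
    print("bad date pattern: %s" % e)
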
iso8601 = DatePatternRegex("%Y-%m-%d[T ]%H:%M:%S(?:\.%f)?%z")


class CustomDateFormatsTest(unittest.TestCase):

    def testIso8601(self):
        date = datetime.datetime.utcfromtimestamp(
            iso8601.getDate("2007-01-25T12:00:00Z")[0])
        self.assertEqual(
            date,
            datetime.datetime(2007, 1, 25, 12, 0))
        self.assertRaises(TypeError, iso8601.getDate, None)
        self.assertRaises(TypeError, iso8601.getDate, date)

        self.assertEqual(iso8601.getDate(""), None)
        self.assertEqual(iso8601.getDate("Z"), None)

        self.assertEqual(iso8601.getDate("2007-01-01T120:00:00Z"), None)
        self.assertEqual(iso8601.getDate("2007-13-01T12:00:00Z"), None)
        date = datetime.datetime.utcfromtimestamp(
            iso8601.getDate("2007-01-25T12:00:00+0400")[0])
        self.assertEqual(
            date,
            datetime.datetime(2007, 1, 25, 8, 0))
        date = datetime.datetime.utcfromtimestamp(
            iso8601.getDate("2007-01-25T12:00:00+04:00")[0])
        self.assertEqual(
            date,
            datetime.datetime(2007, 1, 25, 8, 0))
        date = datetime.datetime.utcfromtimestamp(
            iso8601.getDate("2007-01-25T12:00:00-0400")[0])
        self.assertEqual(
            date,
            datetime.datetime(2007, 1, 25, 16, 0))
        date = datetime.datetime.utcfromtimestamp(
            iso8601.getDate("2007-01-25T12:00:00-04")[0])
        self.assertEqual(
            date,
            datetime.datetime(2007, 1, 25, 16, 0))
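
The iso8601 template above shows how a single %z directive covers the accepted offset spellings (Z, +0400, +04:00, -04) and how getDate() converts the stamp to a UTC epoch. A minimal sketch of the same conversion, assuming a fail2ban 0.10 tree:

import datetime
from fail2ban.server.datetemplate import DatePatternRegex

iso8601 = DatePatternRegex("%Y-%m-%d[T ]%H:%M:%S(?:\.%f)?%z")

for stamp in ("2007-01-25T12:00:00Z", "2007-01-25T12:00:00+04:00", "2007-01-25T12:00:00-0400"):
    epoch = iso8601.getDate(stamp)[0]   # first element of the getDate() result is the epoch
    print(stamp, datetime.datetime.utcfromtimestamp(epoch))
# expected: 12:00, 08:00 and 16:00 UTC respectively, as asserted in testIso8601
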
    def testAmbiguousDatePattern(self):
        defDD = DateDetector()
        defDD.addDefaultTemplate()
        for (matched, dp, line) in (
            # positive case:
            ('Jan 23 21:59:59', None, 'Test failure Jan 23 21:59:59 for 192.0.2.1'),
            # ambiguous "unbound" patterns (missed):
            (False, None, 'Test failure TestJan 23 21:59:59.011 2015 for 192.0.2.1'),
            (False, None, 'Test failure Jan 23 21:59:59123456789 for 192.0.2.1'),
            # ambiguous "no optional year" patterns (matched):
            ('Aug 8 11:25:50', None, 'Aug 8 11:25:50 20030f2329b8 Authentication failed from 192.0.2.1'),
            ('Aug 8 11:25:50', None, '[Aug 8 11:25:50] 20030f2329b8 Authentication failed from 192.0.2.1'),
            ('Aug 8 11:25:50 2014', None, 'Aug 8 11:25:50 2014 20030f2329b8 Authentication failed from 192.0.2.1'),
            # directly specified patterns:
            ('20:00:00 01.02.2003', r'%H:%M:%S %d.%m.%Y$', '192.0.2.1 at 20:00:00 01.02.2003'),
            ('[20:00:00 01.02.2003]', r'\[%H:%M:%S %d.%m.%Y\]', '192.0.2.1[20:00:00 01.02.2003]'),
            ('[20:00:00 01.02.2003]', r'\[%H:%M:%S %d.%m.%Y\]', '[20:00:00 01.02.2003]192.0.2.1'),
            ('[20:00:00 01.02.2003]', r'\[%H:%M:%S %d.%m.%Y\]$', '192.0.2.1[20:00:00 01.02.2003]'),
            ('[20:00:00 01.02.2003]', r'^\[%H:%M:%S %d.%m.%Y\]', '[20:00:00 01.02.2003]192.0.2.1'),
            ('[17/Jun/2011 17:00:45]', r'^\[%d/%b/%Y %H:%M:%S\]', '[17/Jun/2011 17:00:45] Attempt, IP address 192.0.2.1'),
            ('[17/Jun/2011 17:00:45]', r'\[%d/%b/%Y %H:%M:%S\]', 'Attempt [17/Jun/2011 17:00:45] IP address 192.0.2.1'),
            ('[17/Jun/2011 17:00:45]', r'\[%d/%b/%Y %H:%M:%S\]', 'Attempt IP address 192.0.2.1, date: [17/Jun/2011 17:00:45]'),
            # directly specified patterns (begin/end, missed):
            (False, r'%H:%M:%S %d.%m.%Y', '192.0.2.1x20:00:00 01.02.2003'),
            (False, r'%H:%M:%S %d.%m.%Y', '20:00:00 01.02.2003x192.0.2.1'),
            # directly specified unbound patterns (no begin/end boundary):
            ('20:00:00 01.02.2003', r'**%H:%M:%S %d.%m.%Y**', '192.0.2.1x20:00:00 01.02.2003'),
            ('20:00:00 01.02.2003', r'**%H:%M:%S %d.%m.%Y**', '20:00:00 01.02.2003x192.0.2.1'),
            # pattern enclosed with stars (in comparison to the example above):
            ('*20:00:00 01.02.2003*', r'\**%H:%M:%S %d.%m.%Y\**', 'test*20:00:00 01.02.2003*test'),
            # directly specified patterns (begin/end, matched):
            ('20:00:00 01.02.2003', r'%H:%M:%S %d.%m.%Y', '192.0.2.1 20:00:00 01.02.2003'),
            ('20:00:00 01.02.2003', r'%H:%M:%S %d.%m.%Y', '20:00:00 01.02.2003 192.0.2.1'),
            # wrong year in the 1st date, so conversion with the non-precise year pattern fails
            # (the filter uses the last known date); in the 2nd and 3rd tests (with precise year)
            # the correct 2nd date should be found:
            (None, r'%Y-%Exm-%Exd %ExH:%ExM:%ExS', "0000-12-30 00:00:00 - 2003-12-30 00:00:00"),
            ('2003-12-30 00:00:00', r'%ExY-%Exm-%Exd %ExH:%ExM:%ExS', "0000-12-30 00:00:00 - 2003-12-30 00:00:00"),
            ('2003-12-30 00:00:00', None, "0000-12-30 00:00:00 - 2003-12-30 00:00:00"),
            # wrong date recognized from short month/day (unbounded date pattern without separator between parts);
            # in the 2nd and 3rd tests (with precise month and day) the correct 2nd date should be found:
            ('200333 010203', r'%Y%m%d %H%M%S', "text:200333 010203 | date:20031230 010203"),
            ('20031230 010203', r'%ExY%Exm%Exd %ExH%ExM%ExS', "text:200333 010203 | date:20031230 010203"),
            ('20031230 010203', None, "text:200333 010203 | date:20031230 010203"),
            # Explicit bound at the start of the line using the {^LN-BEG} key:
            # (negative) in the 1st case without line-begin boundary - a wrong date may be found,
            # (positive) in the 2nd case with line-begin boundary - unexpected date / log line (not found),
            # (positive) and in the 3rd case with line-begin boundary - the correct date is found:
            ("20030101 000000", "%ExY%Exm%Exd %ExH%ExM%ExS", "00001230 010203 - 20030101 000000"),
            (None, "{^LN-BEG}%ExY%Exm%Exd %ExH%ExM%ExS", "00001230 010203 - 20030101 000000"),
            ("20031230 010203", "{^LN-BEG}%ExY%Exm%Exd %ExH%ExM%ExS", "20031230 010203 - 20030101 000000"),
            # Explicit bound at the start of the line using the {^LN-BEG} key,
            # up to 2 non-alphanumeric chars in front, ** - no word boundary on the right:
            ("20031230010203", "{^LN-BEG}%ExY%Exm%Exd%ExH%ExM%ExS**", "2003123001020320030101000000"),
            ("20031230010203", "{^LN-BEG}%ExY%Exm%Exd%ExH%ExM%ExS**", "#2003123001020320030101000000"),
            ("20031230010203", "{^LN-BEG}%ExY%Exm%Exd%ExH%ExM%ExS**", "##2003123001020320030101000000"),
            ("20031230010203", "{^LN-BEG}%ExY%Exm%Exd%ExH%ExM%ExS", "[20031230010203]20030101000000"),
            # UTC/GMT time zone offset (with %z and %Z):
            (1072746123.0 - 3600, "{^LN-BEG}%ExY-%Exm-%Exd %ExH:%ExM:%ExS(?: %z)?", "[2003-12-30 01:02:03] server ..."),
            (1072746123.0 - 3600, "{^LN-BEG}%ExY-%Exm-%Exd %ExH:%ExM:%ExS(?: %Z)?", "[2003-12-30 01:02:03] server ..."),
            (1072746123.0, "{^LN-BEG}%ExY-%Exm-%Exd %ExH:%ExM:%ExS(?: %z)?", "[2003-12-30 01:02:03 UTC] server ..."),
            (1072746123.0, "{^LN-BEG}%ExY-%Exm-%Exd %ExH:%ExM:%ExS(?: %Z)?", "[2003-12-30 01:02:03 UTC] server ..."),
        ):
            logSys.debug('== test: %r', (matched, dp, line))
            if dp is None:
                dd = defDD
            else:
                dd = DateDetector()
                dd.appendTemplate(dp)
            date = dd.getTime(line)
            if matched:
                self.assertTrue(date)
                if isinstance(matched, basestring):
                    self.assertEqual(matched, date[1].group(1))
                else:
                    self.assertEqual(matched, date[0])
            else:
                self.assertEqual(date, None)


    # def testDefaultTempate(self):
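
testAmbiguousDatePattern drives DateDetector with explicitly supplied patterns, which is essentially what a jail's custom datepattern does. A minimal sketch of appending one such pattern and reading a line with it (pattern and log line taken from the test data above; assumes a fail2ban 0.10 tree):

from fail2ban.server.datedetector import DateDetector

dd = DateDetector()
dd.appendTemplate(r'^\[%d/%b/%Y %H:%M:%S\]')   # date must stand in brackets at line start

res = dd.getTime('[17/Jun/2011 17:00:45] Attempt, IP address 192.0.2.1')
if res:
    epoch, match = res
    print(match.group(1))    # expected: '[17/Jun/2011 17:00:45]'
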