Merge branch '0.10' into 0.10-full

pull/1460/head
sebres 2017-02-28 14:34:32 +01:00
commit 28b5262976
45 changed files with 1345 additions and 602 deletions


@ -10,6 +10,45 @@ ver. 0.10.0 (2016/XX/XXX) - gonna-be-released-some-time-shining
-----------
TODO: implement options and other tasks from PR #1346
documentation should be extended (new options, etc.)
### Fixes
* `filter.d/pam-generic.conf`:
- [grave] fixed injection of user name into host
* `action.d/complain.conf`
- fixed using new tag `<ip-rev>` (sh/dash compliant now)
### New Features
* New Actions:
* New Filters:
### Enhancements
* Introduced new filter option `prefregex` for pre-filtering using single regular expression (gh-1698);
* Many times faster and less CPU-hungry, because parsing with `maxlines=1` avoids
line buffering (scrolling of the buffer window).
Combination of tags `<F-MLFID>` and `<F-NOFAIL>` can be used now to process multi-line logs
using single-line expressions:
- tag `<F-MLFID>`: used to identify and store failure info for groups of log lines with the same
identifier (e.g. combined failure info for the same conn-id via `<F-MLFID>(?:conn-id)</F-MLFID>`,
see sshd.conf for an example)
- tag `<F-NOFAIL>`: used to mark no-failure lines (a helper to accumulate common failure info,
e.g. from lines that contain the IP address);
* Several filters optimized with pre-filtering via the new option `prefregex`, and with multi-line
filtering via the `<F-MLFID>` + `<F-NOFAIL>` combination;
* Filter group captures are now exposed to actions (non-recursive interpolation of `<F-...>` tags,
see gh-1698, gh-1110)
* Some filters extended to capture the user name (can be used in gh-1243 to distinguish IP and user,
e.g. to remove only the user-related failures after a successful login);
* Safer, more stable and faster `replaceTag` interpolation (switched from a loop over all tags
to a single `re.sub` with a callable)
* `substituteRecursiveTags` optimized and moved into the helpers module (since it is now used
by both server and client)
* New tag `<ip-rev>` provides the PTR-reversed representation of an IP address
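The `prefregex` enhancement above splits matching into two stages: one cheap pre-filter per line, whose `<F-CONTENT>` capture is then scanned by the (now much shorter) failure expressions. A minimal Python sketch of the idea (log format and expressions are illustrative, not fail2ban's actual implementation):

```python
import re

# Illustrative log format and expressions: the pre-filter runs once per line
# and its "content" capture (fail2ban's <F-CONTENT> tag) is then scanned by
# the much shorter failure expressions.
PREFREGEX = re.compile(
    r"^\[\w+ \w+ [\d:. ]+\] \[error\] \[client (?P<host>\S+)\] (?P<content>.+)$")
FAILREGEX = [
    re.compile(r"^user \S+ not found"),
    re.compile(r"^client denied by server configuration"),
]

def find_failure(line):
    m = PREFREGEX.match(line)
    if not m:                     # cheap rejection of unrelated lines
        return None
    content = m.group("content")  # only the interesting tail is re-scanned
    for fr in FAILREGEX:
        if fr.match(content):
            return m.group("host")
    return None

print(find_failure(
    "[Mon Feb 27 10:00:00 2017] [error] [client 192.0.2.1] user bob not found"))
# -> 192.0.2.1
```

Because the expensive prefix (date, pid, daemon name) is matched exactly once per line, adding more failure patterns stays cheap.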
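The `replaceTag` change above replaces a per-tag substitution loop with a single `re.sub` whose replacement argument is a callable; a simplified sketch (tag syntax as in fail2ban, function name hypothetical):

```python
import re

TAG_RE = re.compile(r"<([\w\-.]+)>")

def replace_tags(template, tags):
    # One pass over the string: the callable resolves each <tag> as it is
    # encountered, instead of one full-string substitution per known tag.
    # Unknown tags are left untouched rather than dropped.
    return TAG_RE.sub(lambda m: str(tags.get(m.group(1), m.group(0))), template)

print(replace_tags("ban <ip> for <bantime>s", {"ip": "192.0.2.1", "bantime": 600}))
# -> ban 192.0.2.1 for 600s
```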
ver. 0.10.0-alpha-1 (2016/07/14) - ipv6-support-etc
-----------
### Fixes
* [Grave] memory leak's fixed (gh-1277, gh-1234)


@ -34,6 +34,9 @@ before = helpers-common.conf
[Definition]
# Used in test cases for coverage internal transformations
debug = 0
# bypass ban/unban for restored tickets
norestored = 1
@ -61,9 +64,11 @@ actioncheck =
# Tags: See jail.conf(5) man page
# Values: CMD
#
actionban = oifs=${IFS};
IFS=.; SEP_IP=( <ip> ); set -- ${SEP_IP}; ADDRESSES=$(dig +short -t txt -q $4.$3.$2.$1.abuse-contacts.abusix.org);
IFS=,; ADDRESSES=$(echo $ADDRESSES)
actionban = oifs=${IFS};
RESOLVER_ADDR="%(addr_resolver)s"
if [ "<debug>" -gt 0 ]; then echo "try to resolve $RESOLVER_ADDR"; fi
ADDRESSES=$(dig +short -t txt -q $RESOLVER_ADDR | tr -d '"')
IFS=,; ADDRESSES=$(echo $ADDRESSES)
IFS=${oifs}
IP=<ip>
if [ ! -z "$ADDRESSES" ]; then
@ -81,7 +86,12 @@ actionban = oifs=${IFS};
#
actionunban =
[Init]
# Server used as resolver in the dig command
#
addr_resolver = <ip-rev>abuse-contacts.abusix.org
# Default message used for abuse content
#
message = Dear Sir/Madam,\n\nWe have detected abuse from the IP address $IP, which according to abusix.com is on your network. We would appreciate if you would investigate and take action as appropriate.\n\nLog lines are given below, but please ask if you require any further information.\n\n(If you are not the correct person to contact about this please accept our apologies - your e-mail address was extracted from the whois record by an automated process.)\n\n This mail was generated by Fail2Ban.\nThe recipient address of this report was provided by the Abuse Contact DB by abusix.com. abusix.com does not maintain the content of the database. All information which we pass out, derives from the RIR databases and is processed for ease of use. If you want to change or report non working abuse contacts please contact the appropriate RIR. If you have any further question, contact abusix.com directly via email (info@abusix.com). Information about the Abuse Contact Database can be found here: https://abusix.com/global-reporting/abuse-contact-db\nabusix.com is neither responsible nor liable for the content or accuracy of this message.\n
# Path to the log files which contain relevant lines for the abuser IP
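The `<ip-rev>` tag interpolated into `addr_resolver` above is the PTR-style reversed form of the banned address, replacing the old `$4.$3.$2.$1` shell juggling in `actionban`. A minimal IPv4-only sketch (function name hypothetical):

```python
def ip_rev(ip):
    # 192.0.2.1 -> "1.2.0.192." : octets reversed with a trailing dot,
    # as used for in-addr.arpa-style (PTR) lookups.
    return ".".join(reversed(ip.split("."))) + "."

print(ip_rev("192.0.2.1") + "abuse-contacts.abusix.org")
# -> 1.2.0.192.abuse-contacts.abusix.org
```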


@ -123,7 +123,7 @@ class SMTPAction(ActionBase):
self.message_values = CallingMap(
jailname = self._jail.name,
hostname = socket.gethostname,
bantime = self._jail.actions.getBanTime,
bantime = lambda: self._jail.actions.getBanTime(),
)
# bypass ban/unban for restored tickets
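The `bantime` change in the hunk above swaps a bound method for a lambda inside fail2ban's `CallingMap`, a mapping that calls callable values on access; the lambda additionally defers the `self._jail.actions` attribute lookup to access time. A simplified sketch of such a map (illustrative, not the actual class):

```python
import socket

class CallingMap(dict):
    """Minimal sketch of a lazily evaluating map: callable values are
    invoked on every lookup, so they always reflect the current state."""
    def __getitem__(self, key):
        value = dict.__getitem__(self, key)
        return value() if callable(value) else value

m = CallingMap(jailname="sshd", hostname=socket.gethostname)
print(m["jailname"])   # plain values pass through unchanged
print(m["hostname"])   # resolved at access time, not at construction
```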


@ -9,20 +9,24 @@ before = apache-common.conf
[Definition]
prefregex = ^%(_apache_error_client)s (?:AH\d+: )?<F-CONTENT>.+</F-CONTENT>$
failregex = ^%(_apache_error_client)s (AH(01797|01630): )?client denied by server configuration: (uri )?\S*(, referer: \S+)?\s*$
^%(_apache_error_client)s (AH01617: )?user .*? authentication failure for "\S*": Password Mismatch(, referer: \S+)?$
^%(_apache_error_client)s (AH01618: )?user .*? not found(: )?\S*(, referer: \S+)?\s*$
^%(_apache_error_client)s (AH01614: )?client used wrong authentication scheme: \S*(, referer: \S+)?\s*$
^%(_apache_error_client)s (AH\d+: )?Authorization of user \S+ to access \S* failed, reason: .*$
^%(_apache_error_client)s (AH0179[24]: )?(Digest: )?user .*?: password mismatch: \S*(, referer: \S+)?\s*$
^%(_apache_error_client)s (AH0179[01]: |Digest: )user `.*?' in realm `.+' (not found|denied by provider): \S*(, referer: \S+)?\s*$
^%(_apache_error_client)s (AH01631: )?user .*?: authorization failure for "\S*":(, referer: \S+)?\s*$
^%(_apache_error_client)s (AH01775: )?(Digest: )?invalid nonce .* received - length is not \S+(, referer: \S+)?\s*$
^%(_apache_error_client)s (AH01788: )?(Digest: )?realm mismatch - got `.*?' but expected `.+'(, referer: \S+)?\s*$
^%(_apache_error_client)s (AH01789: )?(Digest: )?unknown algorithm `.*?' received: \S*(, referer: \S+)?\s*$
^%(_apache_error_client)s (AH01793: )?invalid qop `.*?' received: \S*(, referer: \S+)?\s*$
^%(_apache_error_client)s (AH01777: )?(Digest: )?invalid nonce .*? received - user attempted time travel(, referer: \S+)?\s*$
# auth_type = ((?:Digest|Basic): )?
auth_type = ([A-Z]\w+: )?
failregex = ^client denied by server configuration: (uri )?\S*(, referer: \S+)?\s*$
^user .*? authentication failure for "\S*": Password Mismatch(, referer: \S+)?$
^user .*? not found(: )?\S*(, referer: \S+)?\s*$
^client used wrong authentication scheme: \S*(, referer: \S+)?\s*$
^Authorization of user \S+ to access \S* failed, reason: .*$
^%(auth_type)suser .*?: password mismatch: \S*(, referer: \S+)?\s*$
^%(auth_type)suser `.*?' in realm `.+' (not found|denied by provider): \S*(, referer: \S+)?\s*$
^user .*?: authorization failure for "\S*":(, referer: \S+)?\s*$
^%(auth_type)sinvalid nonce .* received - length is not \S+(, referer: \S+)?\s*$
^%(auth_type)srealm mismatch - got `.*?' but expected `.+'(, referer: \S+)?\s*$
^%(auth_type)sunknown algorithm `.*?' received: \S*(, referer: \S+)?\s*$
^invalid qop `.*?' received: \S*(, referer: \S+)?\s*$
^%(auth_type)sinvalid nonce .*? received - user attempted time travel(, referer: \S+)?\s*$
ignoreregex =
@ -53,4 +57,4 @@ ignoreregex =
# referer is always in error log messages if it exists, added as per the log_error_core function in server/log.c
#
# Author: Cyril Jaquier
# Major edits by Daniel Black
# Major edits by Daniel Black and Sergey Brester (sebres)


@ -23,14 +23,13 @@ before = apache-common.conf
[Definition]
failregex = ^%(_apache_error_client)s ((AH001(28|30): )?File does not exist|(AH01264: )?script not found or unable to stat): <webroot><block>(, referer: \S+)?\s*$
^%(_apache_error_client)s script '<webroot><block>' not found or unable to stat(, referer: \S+)?\s*$
prefregex = ^%(_apache_error_client)s (?:AH\d+: )?<F-CONTENT>.+</F-CONTENT>$
failregex = ^(?:File does not exist|script not found or unable to stat): <webroot><block>(, referer: \S+)?\s*$
^script '<webroot><block>' not found or unable to stat(, referer: \S+)?\s*$
ignoreregex =
[Init]
# Webroot represents the webroot on which all other files are based
webroot = /var/www/


@ -9,8 +9,10 @@ before = apache-common.conf
[Definition]
failregex = ^%(_apache_error_client)s (AH01215: )?/bin/(ba)?sh: warning: HTTP_.*?: ignoring function definition attempt(, referer: \S+)?\s*$
^%(_apache_error_client)s (AH01215: )?/bin/(ba)?sh: error importing function definition for `HTTP_.*?'(, referer: \S+)?\s*$
prefregex = ^%(_apache_error_client)s (AH01215: )?/bin/([bd]a)?sh: <F-CONTENT>.+</F-CONTENT>$
failregex = ^warning: HTTP_[^:]+: ignoring function definition attempt(, referer: \S+)?\s*$
^error importing function definition for `HTTP_[^']+'(, referer: \S+)?\s*$
ignoreregex =


@ -18,16 +18,18 @@ iso8601 = \d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}\.\d+[+-]\d{4}
# All Asterisk log messages begin like this:
log_prefix= (?:NOTICE|SECURITY|WARNING)%(__pid_re)s:?(?:\[C-[\da-f]*\])? [^:]+:\d*(?:(?: in)? \w+:)?
failregex = ^%(__prefix_line)s%(log_prefix)s Registration from '[^']*' failed for '<HOST>(:\d+)?' - (Wrong password|Username/auth name mismatch|No matching peer found|Not a local domain|Device does not match ACL|Peer is not supposed to register|ACL error \(permit/deny\)|Not a local domain)$
^%(__prefix_line)s%(log_prefix)s Call from '[^']*' \(<HOST>:\d+\) to extension '[^']*' rejected because extension not found in context
^%(__prefix_line)s%(log_prefix)s Host <HOST> failed to authenticate as '[^']*'$
^%(__prefix_line)s%(log_prefix)s No registration for peer '[^']*' \(from <HOST>\)$
^%(__prefix_line)s%(log_prefix)s Host <HOST> failed MD5 authentication for '[^']*' \([^)]+\)$
^%(__prefix_line)s%(log_prefix)s Failed to authenticate (user|device) [^@]+@<HOST>\S*$
^%(__prefix_line)s%(log_prefix)s hacking attempt detected '<HOST>'$
^%(__prefix_line)s%(log_prefix)s SecurityEvent="(FailedACL|InvalidAccountID|ChallengeResponseFailed|InvalidPassword)",EventTV="([\d-]+|%(iso8601)s)",Severity="[\w]+",Service="[\w]+",EventVersion="\d+",AccountID="(\d*|<unknown>)",SessionID=".+",LocalAddress="IPV[46]/(UDP|TCP|WS)/[\da-fA-F:.]+/\d+",RemoteAddress="IPV[46]/(UDP|TCP|WS)/<HOST>/\d+"(,Challenge="[\w/]+")?(,ReceivedChallenge="\w+")?(,Response="\w+",ExpectedResponse="\w*")?(,ReceivedHash="[\da-f]+")?(,ACLName="\w+")?$
^%(__prefix_line)s%(log_prefix)s "Rejecting unknown SIP connection from <HOST>"$
^%(__prefix_line)s%(log_prefix)s Request (?:'[^']*' )?from '[^']*' failed for '<HOST>(?::\d+)?'\s\(callid: [^\)]*\) - (?:No matching endpoint found|Not match Endpoint(?: Contact)? ACL|(?:Failed|Error) to authenticate)\s*$
prefregex = ^%(__prefix_line)s%(log_prefix)s <F-CONTENT>.+</F-CONTENT>$
failregex = ^Registration from '[^']*' failed for '<HOST>(:\d+)?' - (Wrong password|Username/auth name mismatch|No matching peer found|Not a local domain|Device does not match ACL|Peer is not supposed to register|ACL error \(permit/deny\)|Not a local domain)$
^Call from '[^']*' \(<HOST>:\d+\) to extension '[^']*' rejected because extension not found in context
^Host <HOST> failed to authenticate as '[^']*'$
^No registration for peer '[^']*' \(from <HOST>\)$
^Host <HOST> failed MD5 authentication for '[^']*' \([^)]+\)$
^Failed to authenticate (user|device) [^@]+@<HOST>\S*$
^hacking attempt detected '<HOST>'$
^SecurityEvent="(FailedACL|InvalidAccountID|ChallengeResponseFailed|InvalidPassword)",EventTV="([\d-]+|%(iso8601)s)",Severity="[\w]+",Service="[\w]+",EventVersion="\d+",AccountID="(\d*|<unknown>)",SessionID=".+",LocalAddress="IPV[46]/(UDP|TCP|WS)/[\da-fA-F:.]+/\d+",RemoteAddress="IPV[46]/(UDP|TCP|WS)/<HOST>/\d+"(,Challenge="[\w/]+")?(,ReceivedChallenge="\w+")?(,Response="\w+",ExpectedResponse="\w*")?(,ReceivedHash="[\da-f]+")?(,ACLName="\w+")?$
^"Rejecting unknown SIP connection from <HOST>"$
^Request (?:'[^']*' )?from '[^']*' failed for '<HOST>(?::\d+)?'\s\(callid: [^\)]*\) - (?:No matching endpoint found|Not match Endpoint(?: Contact)? ACL|(?:Failed|Error) to authenticate)\s*$
ignoreregex =


@ -12,8 +12,10 @@ before = common.conf
_daemon = courieresmtpd
failregex = ^%(__prefix_line)serror,relay=<HOST>,.*: 550 User (<.*> )?unknown\.?$
^%(__prefix_line)serror,relay=<HOST>,msg="535 Authentication failed\.",cmd:( AUTH \S+)?( [0-9a-zA-Z\+/=]+)?(?: \S+)$
prefregex = ^%(__prefix_line)serror,relay=<HOST>,<F-CONTENT>.+</F-CONTENT>$
failregex = ^[^:]*: 550 User (<.*> )?unknown\.?$
^msg="535 Authentication failed\.",cmd:( AUTH \S+)?( [0-9a-zA-Z\+/=]+)?(?: \S+)$
ignoreregex =


@ -7,13 +7,16 @@ before = common.conf
[Definition]
_daemon = (auth|dovecot(-auth)?|auth-worker)
_auth_worker = (?:dovecot: )?auth(?:-worker)?
_daemon = (dovecot(-auth)?|auth)
failregex = ^%(__prefix_line)s(?:%(__pam_auth)s(?:\(dovecot:auth\))?:)?\s+authentication failure; logname=\S* uid=\S* euid=\S* tty=dovecot ruser=\S* rhost=<HOST>(?:\s+user=\S*)?\s*$
^%(__prefix_line)s(?:pop3|imap)-login: (?:Info: )?(?:Aborted login|Disconnected)(?::(?: [^ \(]+)+)? \((?:auth failed, \d+ attempts( in \d+ secs)?|tried to use (disabled|disallowed) \S+ auth)\):( user=<[^>]+>,)?( method=\S+,)? rip=<HOST>(?:, lip=\S+)?(?:, TLS(?: handshaking(?:: SSL_accept\(\) failed: error:[\dA-F]+:SSL routines:[TLS\d]+_GET_CLIENT_HELLO:unknown protocol)?)?(: Disconnected)?)?(, session=<\S+>)?\s*$
^%(__prefix_line)s(?:Info|dovecot: auth\(default\)|auth-worker\(\d+\)): pam\(\S+,<HOST>\): pam_authenticate\(\) failed: (User not known to the underlying authentication module: \d+ Time\(s\)|Authentication failure \(password mismatch\?\))\s*$
^%(__prefix_line)s(?:auth|auth-worker\(\d+\)): (?:pam|passwd-file)\(\S+,<HOST>\): unknown user\s*$
^%(__prefix_line)s(?:auth|auth-worker\(\d+\)): Info: ldap\(\S*,<HOST>,\S*\): invalid credentials\s*$
prefregex = ^%(__prefix_line)s(%(_auth_worker)s(?:\([^\)]+\))?: )?(?:%(__pam_auth)s(?:\(dovecot:auth\))?: |(?:pop3|imap)-login: )?(?:Info: )?<F-CONTENT>.+</F-CONTENT>$
failregex = ^authentication failure; logname=\S* uid=\S* euid=\S* tty=dovecot ruser=\S* rhost=<HOST>(?:\s+user=\S*)?\s*$
^(?:Aborted login|Disconnected)(?::(?: [^ \(]+)+)? \((?:auth failed, \d+ attempts( in \d+ secs)?|tried to use (disabled|disallowed) \S+ auth)\):( user=<[^>]+>,)?( method=\S+,)? rip=<HOST>(?:, lip=\S+)?(?:, TLS(?: handshaking(?:: SSL_accept\(\) failed: error:[\dA-F]+:SSL routines:[TLS\d]+_GET_CLIENT_HELLO:unknown protocol)?)?(: Disconnected)?)?(, session=<\S+>)?\s*$
^pam\(\S+,<HOST>\): pam_authenticate\(\) failed: (User not known to the underlying authentication module: \d+ Time\(s\)|Authentication failure \(password mismatch\?\))\s*$
^(?:pam|passwd-file)\(\S+,<HOST>\): unknown user\s*$
^ldap\(\S*,<HOST>,\S*\): invalid credentials\s*$
ignoreregex =


@ -23,9 +23,11 @@ before = common.conf
_daemon = dropbear
failregex = ^%(__prefix_line)s[Ll]ogin attempt for nonexistent user ('.*' )?from <HOST>:\d+$
^%(__prefix_line)s[Bb]ad (PAM )?password attempt for .+ from <HOST>(:\d+)?$
^%(__prefix_line)s[Ee]xit before auth \(user '.+', \d+ fails\): Max auth tries reached - user '.+' from <HOST>:\d+\s*$
prefregex = ^%(__prefix_line)s<F-CONTENT>(?:[Ll]ogin|[Bb]ad|[Ee]xit).+</F-CONTENT>$
failregex = ^[Ll]ogin attempt for nonexistent user ('.*' )?from <HOST>:\d+$
^[Bb]ad (PAM )?password attempt for .+ from <HOST>(:\d+)?$
^[Ee]xit before auth \(user '.+', \d+ fails\): Max auth tries reached - user '.+' from <HOST>:\d+\s*$
ignoreregex =


@ -9,7 +9,9 @@ after = exim-common.local
[Definition]
host_info = (?:H=([\w.-]+ )?(?:\(\S+\) )?)?\[<HOST>\](?::\d+)? (?:I=\[\S+\](:\d+)? )?(?:U=\S+ )?(?:P=e?smtp )?
host_info_pre = (?:H=([\w.-]+ )?(?:\(\S+\) )?)?
host_info_suf = (?::\d+)?(?: I=\[\S+\](:\d+)?)?(?: U=\S+)?(?: P=e?smtp)?(?: F=(?:<>|[^@]+@\S+))?\s
host_info = %(host_info_pre)s\[<HOST>\]%(host_info_suf)s
pid = (?: \[\d+\])?
# DEV Notes:


@ -13,14 +13,17 @@ before = exim-common.conf
[Definition]
# Pre-filtering via "prefregex" is currently inactive here because the failure syntax in the exim log varies too much (testing needed):
#prefregex = ^%(pid)s <F-CONTENT>\b(?:\w+ authenticator failed|([\w\-]+ )?SMTP (?:(?:call|connection) from|protocol(?: synchronization)? error)|no MAIL in|(?:%(host_info_pre)s\[[^\]]+\]%(host_info_suf)s(?:sender verify fail|rejected RCPT|dropped|AUTH command))).+</F-CONTENT>$
failregex = ^%(pid)s %(host_info)ssender verify fail for <\S+>: (?:Unknown user|Unrouteable address|all relevant MX records point to non-existent hosts)\s*$
^%(pid)s \w+ authenticator failed for (\S+ )?\(\S+\) \[<HOST>\](?::\d+)?(?: I=\[\S+\](:\d+)?)?: 535 Incorrect authentication data( \(set_id=.*\)|: \d+ Time\(s\))?\s*$
^%(pid)s %(host_info)sF=(?:<>|[^@]+@\S+) rejected RCPT [^@]+@\S+: (?:relay not permitted|Sender verify failed|Unknown user)\s*$
^%(pid)s %(host_info)srejected RCPT [^@]+@\S+: (?:relay not permitted|Sender verify failed|Unknown user)\s*$
^%(pid)s SMTP protocol synchronization error \([^)]*\): rejected (?:connection from|"\S+") %(host_info)s(?:next )?input=".*"\s*$
^%(pid)s SMTP call from \S+ %(host_info)sdropped: too many nonmail commands \(last was "\S+"\)\s*$
^%(pid)s SMTP protocol error in "AUTH \S*(?: \S*)?" %(host_info)sAUTH command used when not advertised\s*$
^%(pid)s no MAIL in SMTP connection from (?:\S* )?(?:\(\S*\) )?%(host_info)sD=\d+s(?: C=\S*)?\s*$
^%(pid)s \S+ SMTP connection from (?:\S* )?(?:\(\S*\) )?%(host_info)sclosed by DROP in ACL\s*$
^%(pid)s ([\w\-]+ )?SMTP connection from (?:\S* )?(?:\(\S*\) )?%(host_info)sclosed by DROP in ACL\s*$
ignoreregex =


@ -25,8 +25,11 @@ _daemon = Froxlor
# (?:::f{4,6}:)?(?P<host>[\w\-.^_]+)
# Values: TEXT
#
failregex = ^%(__prefix_line)s\[Login Action <HOST>\] Unknown user \S* tried to login.$
^%(__prefix_line)s\[Login Action <HOST>\] User \S* tried to login with wrong password.$
prefregex = ^%(__prefix_line)s\[Login Action <HOST>\] <F-CONTENT>.+</F-CONTENT>$
failregex = ^Unknown user \S* tried to login.$
^User \S* tried to login with wrong password.$
# Option: ignoreregex


@ -17,8 +17,10 @@ _usernameregex = [^>]+
_prefix = \s+\d+ => <\d+:%(_usernameregex)s\(-1\)> Rejected connection from <HOST>:\d+:
failregex = ^%(_prefix)s Invalid server password$
^%(_prefix)s Wrong certificate or password for existing user$
prefregex = ^%(_prefix)s <F-CONTENT>.+</F-CONTENT>$
failregex = ^Invalid server password$
^Wrong certificate or password for existing user$
ignoreregex =


@ -34,9 +34,11 @@ __daemon_combs_re=(?:%(__pid_re)s?:\s+%(__daemon_re)s|%(__daemon_re)s%(__pid_re)
# this can be optional (for instance if we match named native log files)
__line_prefix=(?:\s\S+ %(__daemon_combs_re)s\s+)?
failregex = ^%(__line_prefix)s( error:)?\s*client <HOST>#\S+( \([\S.]+\))?: (view (internal|external): )?query(?: \(cache\))? '.*' denied\s*$
^%(__line_prefix)s( error:)?\s*client <HOST>#\S+( \([\S.]+\))?: zone transfer '\S+/AXFR/\w+' denied\s*$
^%(__line_prefix)s( error:)?\s*client <HOST>#\S+( \([\S.]+\))?: bad zone transfer request: '\S+/IN': non-authoritative zone \(NOTAUTH\)\s*$
prefregex = ^%(__line_prefix)s( error:)?\s*client <HOST>#\S+( \([\S.]+\))?: <F-CONTENT>.+</F-CONTENT>$
failregex = ^(view (internal|external): )?query(?: \(cache\))? '.*' denied\s*$
^zone transfer '\S+/AXFR/\w+' denied\s*$
^bad zone transfer request: '\S+/IN': non-authoritative zone \(NOTAUTH\)\s*$
ignoreregex =


@ -16,7 +16,12 @@ _ttys_re=\S*
__pam_re=\(?%(__pam_auth)s(?:\(\S+\))?\)?:?
_daemon = \S+
failregex = ^%(__prefix_line)s%(__pam_re)s\s+authentication failure; logname=\S* uid=\S* euid=\S* tty=%(_ttys_re)s ruser=\S* rhost=<HOST>(?:\s+user=.*)?\s*$
prefregex = ^%(__prefix_line)s%(__pam_re)s\s+authentication failure; logname=\S* uid=\S* euid=\S* tty=%(_ttys_re)s <F-CONTENT>.+</F-CONTENT>$
failregex = ^ruser=<F-USER>\S*</F-USER> rhost=<HOST>\s*$
^ruser= rhost=<HOST>\s+user=<F-USER>\S*</F-USER>\s*$
^ruser= rhost=<HOST>\s+user=<F-USER>.*?</F-USER>\s*$
^ruser=<F-USER>.*?</F-USER> rhost=<HOST>\s*$
ignoreregex =


@ -12,13 +12,15 @@ before = common.conf
_daemon = postfix(-\w+)?/(?:submission/|smtps/)?smtp[ds]
failregex = ^%(__prefix_line)sNOQUEUE: reject: RCPT from \S+\[<HOST>\]: 554 5\.7\.1 .*$
^%(__prefix_line)sNOQUEUE: reject: RCPT from \S+\[<HOST>\]: 450 4\.7\.1 Client host rejected: cannot find your hostname, (\[\S*\]); from=<\S*> to=<\S+> proto=ESMTP helo=<\S*>$
^%(__prefix_line)sNOQUEUE: reject: RCPT from \S+\[<HOST>\]: 450 4\.7\.1 : Helo command rejected: Host not found; from=<> to=<> proto=ESMTP helo= *$
^%(__prefix_line)sNOQUEUE: reject: EHLO from \S+\[<HOST>\]: 504 5\.5\.2 <\S+>: Helo command rejected: need fully-qualified hostname;
^%(__prefix_line)sNOQUEUE: reject: VRFY from \S+\[<HOST>\]: 550 5\.1\.1 .*$
^%(__prefix_line)sNOQUEUE: reject: RCPT from \S+\[<HOST>\]: 450 4\.1\.8 <\S*>: Sender address rejected: Domain not found; from=<\S*> to=<\S+> proto=ESMTP helo=<\S*>$
^%(__prefix_line)simproper command pipelining after \S+ from [^[]*\[<HOST>\]:?$
prefregex = ^%(__prefix_line)s(?:NOQUEUE: reject:|improper command pipelining) <F-CONTENT>.+</F-CONTENT>$
failregex = ^RCPT from \S+\[<HOST>\]: 554 5\.7\.1
^RCPT from \S+\[<HOST>\]: 450 4\.7\.1 Client host rejected: cannot find your hostname, (\[\S*\]); from=<\S*> to=<\S+> proto=ESMTP helo=<\S*>$
^RCPT from \S+\[<HOST>\]: 450 4\.7\.1 : Helo command rejected: Host not found; from=<> to=<> proto=ESMTP helo= *$
^EHLO from \S+\[<HOST>\]: 504 5\.5\.2 <\S+>: Helo command rejected: need fully-qualified hostname;
^VRFY from \S+\[<HOST>\]: 550 5\.1\.1
^RCPT from \S+\[<HOST>\]: 450 4\.1\.8 <\S*>: Sender address rejected: Domain not found; from=<\S*> to=<\S+> proto=ESMTP helo=<\S*>$
^after \S+ from [^[]*\[<HOST>\]:?$
ignoreregex =


@ -16,10 +16,14 @@ _daemon = proftpd
__suffix_failed_login = (User not authorized for login|No such user found|Incorrect password|Password expired|Account disabled|Invalid shell: '\S+'|User in \S+|Limit (access|configuration) denies login|Not a UserAlias|maximum login length exceeded).?
failregex = ^%(__prefix_line)s%(__hostname)s \(\S+\[<HOST>\]\)[: -]+ USER .*: no such user found from \S+ \[\S+\] to \S+:\S+ *$
^%(__prefix_line)s%(__hostname)s \(\S+\[<HOST>\]\)[: -]+ USER .* \(Login failed\): %(__suffix_failed_login)s\s*$
^%(__prefix_line)s%(__hostname)s \(\S+\[<HOST>\]\)[: -]+ SECURITY VIOLATION: .* login attempted\. *$
^%(__prefix_line)s%(__hostname)s \(\S+\[<HOST>\]\)[: -]+ Maximum login attempts \(\d+\) exceeded *$
prefregex = ^%(__prefix_line)s%(__hostname)s \(\S+\[<HOST>\]\)[: -]+ <F-CONTENT>(?:USER|SECURITY|Maximum).+</F-CONTENT>$
failregex = ^USER .*: no such user found from \S+ \[\S+\] to \S+:\S+ *$
^USER .* \(Login failed\): %(__suffix_failed_login)s\s*$
^SECURITY VIOLATION: .* login attempted\. *$
^Maximum login attempts \(\d+\) exceeded *$
ignoreregex =


@ -24,37 +24,37 @@ __pref = (?:(?:error|fatal): (?:PAM: )?)?
__suff = (?: \[preauth\])?\s*
__on_port_opt = (?: port \d+)?(?: on \S+(?: port \d+)?)?
# single line prefix:
__prefix_line_sl = %(__prefix_line)s%(__pref)s
# multi line prefixes (for first and second lines):
__prefix_line_ml1 = (?P<__prefix>%(__prefix_line)s)%(__pref)s
__prefix_line_ml2 = %(__suff)s$<SKIPLINES>^(?P=__prefix)%(__pref)s
prefregex = ^<F-MLFID>%(__prefix_line)s</F-MLFID>%(__pref)s<F-CONTENT>.+</F-CONTENT>$
mode = %(normal)s
normal = ^%(__prefix_line_sl)s[aA]uthentication (?:failure|error|failed) for .* from <HOST>( via \S+)?\s*%(__suff)s$
^%(__prefix_line_sl)sUser not known to the underlying authentication module for .* from <HOST>\s*%(__suff)s$
^%(__prefix_line_sl)sFailed \S+ for (?P<cond_inv>invalid user )?(?P<user>(?P<cond_user>\S+)|(?(cond_inv)(?:(?! from ).)*?|[^:]+)) from <HOST>%(__on_port_opt)s(?: ssh\d*)?(?(cond_user): |(?:(?:(?! from ).)*)$)
^%(__prefix_line_sl)sROOT LOGIN REFUSED.* FROM <HOST>\s*%(__suff)s$
^%(__prefix_line_sl)s[iI](?:llegal|nvalid) user .*? from <HOST>%(__on_port_opt)s\s*$
^%(__prefix_line_sl)sUser .+ from <HOST> not allowed because not listed in AllowUsers\s*%(__suff)s$
^%(__prefix_line_sl)sUser .+ from <HOST> not allowed because listed in DenyUsers\s*%(__suff)s$
^%(__prefix_line_sl)sUser .+ from <HOST> not allowed because not in any group\s*%(__suff)s$
^%(__prefix_line_sl)srefused connect from \S+ \(<HOST>\)\s*%(__suff)s$
^%(__prefix_line_sl)sReceived disconnect from <HOST>%(__on_port_opt)s:\s*3: .*: Auth fail%(__suff)s$
^%(__prefix_line_sl)sUser .+ from <HOST> not allowed because a group is listed in DenyGroups\s*%(__suff)s$
^%(__prefix_line_sl)sUser .+ from <HOST> not allowed because none of user's groups are listed in AllowGroups\s*%(__suff)s$
^%(__prefix_line_sl)spam_unix\(sshd:auth\):\s+authentication failure;\s*logname=\S*\s*uid=\d*\s*euid=\d*\s*tty=\S*\s*ruser=\S*\s*rhost=<HOST>\s.*%(__suff)s$
^%(__prefix_line_sl)s(error: )?maximum authentication attempts exceeded for .* from <HOST>%(__on_port_opt)s(?: ssh\d*)? \[preauth\]$
^%(__prefix_line_ml1)sUser .+ not allowed because account is locked%(__prefix_line_ml2)sReceived disconnect from <HOST>: 11: .+%(__suff)s$
^%(__prefix_line_ml1)sDisconnecting: Too many authentication failures for .+?%(__prefix_line_ml2)sConnection closed by <HOST>%(__suff)s$
^%(__prefix_line_ml1)sConnection from <HOST>%(__on_port_opt)s%(__prefix_line_ml2)sDisconnecting: Too many authentication failures for .+%(__suff)s$
normal = ^[aA]uthentication (?:failure|error|failed) for <F-USER>.*</F-USER> from <HOST>( via \S+)?\s*%(__suff)s$
^User not known to the underlying authentication module for <F-USER>.*</F-USER> from <HOST>\s*%(__suff)s$
^Failed \S+ for (?P<cond_inv>invalid user )?<F-USER>(?P<cond_user>\S+)|(?(cond_inv)(?:(?! from ).)*?|[^:]+)</F-USER> from <HOST>%(__on_port_opt)s(?: ssh\d*)?(?(cond_user): |(?:(?:(?! from ).)*)$)
^<F-USER>ROOT</F-USER> LOGIN REFUSED.* FROM <HOST>\s*%(__suff)s$
^[iI](?:llegal|nvalid) user <F-USER>.*?</F-USER> from <HOST>%(__on_port_opt)s\s*$
^User <F-USER>.+</F-USER> from <HOST> not allowed because not listed in AllowUsers\s*%(__suff)s$
^User <F-USER>.+</F-USER> from <HOST> not allowed because listed in DenyUsers\s*%(__suff)s$
^User <F-USER>.+</F-USER> from <HOST> not allowed because not in any group\s*%(__suff)s$
^refused connect from \S+ \(<HOST>\)\s*%(__suff)s$
^Received disconnect from <HOST>%(__on_port_opt)s:\s*3: .*: Auth fail%(__suff)s$
^User <F-USER>.+</F-USER> from <HOST> not allowed because a group is listed in DenyGroups\s*%(__suff)s$
^User <F-USER>.+</F-USER> from <HOST> not allowed because none of user's groups are listed in AllowGroups\s*%(__suff)s$
^pam_unix\(sshd:auth\):\s+authentication failure;\s*logname=\S*\s*uid=\d*\s*euid=\d*\s*tty=\S*\s*ruser=<F-USER>\S*</F-USER>\s*rhost=<HOST>\s.*%(__suff)s$
^(error: )?maximum authentication attempts exceeded for <F-USER>.*</F-USER> from <HOST>%(__on_port_opt)s(?: ssh\d*)? \[preauth\]$
^User <F-USER>.+</F-USER> not allowed because account is locked%(__suff)s
^Disconnecting: Too many authentication failures for <F-USER>.+?</F-USER>%(__suff)s
^<F-NOFAIL>Received disconnect</F-NOFAIL> from <HOST>: 11:
^<F-NOFAIL>Connection closed</F-NOFAIL> by <HOST>%(__suff)s$
ddos = ^%(__prefix_line_sl)sDid not receive identification string from <HOST>%(__suff)s$
^%(__prefix_line_sl)sReceived disconnect from <HOST>%(__on_port_opt)s:\s*14: No supported authentication methods available%(__suff)s$
^%(__prefix_line_sl)sUnable to negotiate with <HOST>%(__on_port_opt)s: no matching (?:cipher|key exchange method) found.
^%(__prefix_line_ml1)sConnection from <HOST>%(__on_port_opt)s%(__prefix_line_ml2)sUnable to negotiate a (?:cipher|key exchange method)%(__suff)s$
^%(__prefix_line_ml1)sSSH: Server;Ltype: (?:Authname|Version|Kex);Remote: <HOST>-\d+;[A-Z]\w+:.*%(__prefix_line_ml2)sRead from socket failed: Connection reset by peer%(__suff)s$
ddos = ^Did not receive identification string from <HOST>%(__suff)s$
^Received disconnect from <HOST>%(__on_port_opt)s:\s*14: No supported authentication methods available%(__suff)s$
^Unable to negotiate with <HOST>%(__on_port_opt)s: no matching (?:cipher|key exchange method) found.
^Unable to negotiate a (?:cipher|key exchange method)%(__suff)s$
^<F-NOFAIL>SSH: Server;Ltype:</F-NOFAIL> (?:Authname|Version|Kex);Remote: <HOST>-\d+;[A-Z]\w+:
^Read from socket failed: Connection reset by peer \[preauth\]
common = ^<F-NOFAIL>Connection from</F-NOFAIL> <HOST>
aggressive = %(normal)s
%(ddos)s
@ -62,11 +62,11 @@ aggressive = %(normal)s
[Definition]
failregex = %(mode)s
%(common)s
ignoreregex =
# "maxlines" is number of log lines to buffer for multi-line regex searches
maxlines = 10
maxlines = 1
journalmatch = _SYSTEMD_UNIT=sshd.service + _COMM=sshd


@ -14,8 +14,10 @@ before = common.conf
_daemon = xinetd
failregex = ^%(__prefix_line)sFAIL: \S+ address from=<HOST>$
^%(__prefix_line)sFAIL: \S+ libwrap from=<HOST>$
prefregex = ^%(__prefix_line)sFAIL: <F-CONTENT>.+</F-CONTENT>$
failregex = ^\S+ address from=<HOST>$
^\S+ libwrap from=<HOST>$
ignoreregex =


@ -28,6 +28,7 @@ import os
from .configreader import DefinitionInitConfigReader
from ..helpers import getLogger
from ..server.action import CommandAction
# Gets the instance of the logger.
logSys = getLogger(__name__)
@ -69,7 +70,8 @@ class ActionReader(DefinitionInitConfigReader):
return self._name
def convert(self):
opts = self.getCombined(ignore=('timeout', 'bantime'))
opts = self.getCombined(
ignore=CommandAction._escapedTags | set(('timeout', 'bantime')))
# type-convert only after combined (otherwise boolean converting prevents substitution):
if opts.get('norestored'):
opts['norestored'] = self._convert_to_boolean(opts['norestored'])


@ -29,8 +29,7 @@ import os
from ConfigParser import NoOptionError, NoSectionError
from .configparserinc import sys, SafeConfigParserWithIncludes, logLevel
from ..helpers import getLogger
from ..server.action import CommandAction
from ..helpers import getLogger, substituteRecursiveTags
# Gets the instance of the logger.
logSys = getLogger(__name__)
@ -225,6 +224,7 @@ class ConfigReaderUnshared(SafeConfigParserWithIncludes):
values = dict()
if pOptions is None:
pOptions = {}
# Get only specified options:
for optname in options:
if isinstance(options, (list,tuple)):
if len(optname) > 2:
@ -280,6 +280,8 @@ class DefinitionInitConfigReader(ConfigReader):
self.setFile(file_)
self.setJailName(jailName)
self._initOpts = initOpts
self._pOpts = dict()
self._defCache = dict()
def setFile(self, fileName):
self._file = fileName
@ -312,7 +314,7 @@ class DefinitionInitConfigReader(ConfigReader):
pOpts = _merge_dicts(pOpts, self._initOpts)
self._opts = ConfigReader.getOptions(
self, "Definition", self._configOpts, pOpts)
self._pOpts = pOpts
if self.has_section("Init"):
for opt in self.options("Init"):
v = self.get("Init", opt)
@ -324,6 +326,22 @@ class DefinitionInitConfigReader(ConfigReader):
def _convert_to_boolean(self, value):
return value.lower() in ("1", "yes", "true", "on")
def getCombOption(self, optname):
"""Get combined definition option (as string) using pre-set and init
options as preselection (values with higher precedence as specified in section).
Can be used only after calling of getOptions.
"""
try:
return self._defCache[optname]
except KeyError:
try:
v = self.get("Definition", optname, vars=self._pOpts)
except (NoSectionError, NoOptionError, ValueError):
v = None
self._defCache[optname] = v
return v
def getCombined(self, ignore=()):
combinedopts = self._opts
ignore = set(ignore).copy()
@ -338,7 +356,8 @@ class DefinitionInitConfigReader(ConfigReader):
n, cond = cond.groups()
ignore.add(n)
# substitute options already specified directly:
opts = CommandAction.substituteRecursiveTags(combinedopts, ignore=ignore)
opts = substituteRecursiveTags(combinedopts,
ignore=ignore, addrepl=self.getCombOption)
if not opts:
raise ValueError('recursive tag definitions unable to be resolved')
return opts
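`getCombOption` above is passed as the `addrepl` callback to `substituteRecursiveTags`, which expands `<tag>` references in option values until a fixed point. A simplified sketch of such a resolver (illustrative, not fail2ban's implementation):

```python
import re

TAG_RE = re.compile(r"<([\w\-.]+)>")

def substitute_recursive_tags(tags, ignore=(), addrepl=None, max_depth=10):
    """Expand <tag> references inside the values of `tags` until a fixed
    point; unknown tags may be resolved via the `addrepl` callback, and
    tags listed in `ignore` are kept verbatim for later interpolation."""
    done = dict(tags)
    for _ in range(max_depth):
        changed = False
        for name, value in done.items():
            if name in ignore or not isinstance(value, str):
                continue
            def repl(m, _name=name):
                t = m.group(1)
                if t in ignore or t == _name:   # keep ignored/self references
                    return m.group(0)
                if t in done:
                    return str(done[t])
                extra = addrepl(t) if addrepl else None
                return extra if extra is not None else m.group(0)
            new = TAG_RE.sub(repl, value)
            if new != value:
                done[name] = new
                changed = True
        if not changed:
            return done
    raise ValueError("recursive tag definitions unable to be resolved")

opts = substitute_recursive_tags(
    {"greet": "hello <who>", "who": "<name>"},
    addrepl={"name": "world"}.get)   # addrepl supplies tags missing from the dict
print(opts["greet"])
# -> hello world
```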


@ -61,7 +61,7 @@ def debuggexURL(sample, regex):
'flavor': 'python' })
return 'https://www.debuggex.com/?' + q
def output(args):
def output(args): # pragma: no cover (overridden in test cases)
print(args)
def shortstr(s, l=53):
@ -235,7 +235,7 @@ class Fail2banRegex(object):
else:
self._maxlines = 20
if opts.journalmatch is not None:
self.setJournalMatch(opts.journalmatch.split())
self.setJournalMatch(shlex.split(opts.journalmatch))
if opts.datepattern:
self.setDatePattern(opts.datepattern)
if opts.usedns:
@@ -243,6 +243,7 @@ class Fail2banRegex(object):
self._filter.returnRawHost = opts.raw
self._filter.checkFindTime = False
self._filter.checkAllRegex = True
self._opts = opts
def decode_line(self, line):
return FileContainer.decode_line('<LOG>', self._encoding, line)
@@ -265,69 +266,81 @@ class Fail2banRegex(object):
output( "Use maxlines : %d" % self._filter.getMaxLines() )
def setJournalMatch(self, v):
if self._journalmatch is None:
self._journalmatch = v
self._journalmatch = v
def readRegex(self, value, regextype):
assert(regextype in ('fail', 'ignore'))
regex = regextype + 'regex'
if os.path.isfile(value) or os.path.isfile(value + '.conf'):
if regextype == 'fail' and (os.path.isfile(value) or os.path.isfile(value + '.conf')):
if os.path.basename(os.path.dirname(value)) == 'filter.d':
## within filter.d folder - use standard loading algorithm to load filter completely (with .local etc.):
basedir = os.path.dirname(os.path.dirname(value))
value = os.path.splitext(os.path.basename(value))[0]
output( "Use %11s filter file : %s, basedir: %s" % (regex, value, basedir) )
reader = FilterReader(value, 'fail2ban-regex-jail', {}, share_config=self.share_config, basedir=basedir)
if not reader.read():
if not reader.read(): # pragma: no cover
output( "ERROR: failed to load filter %s" % value )
return False
else:
else: # pragma: no cover
## foreign file - readexplicit this file and includes if possible:
output( "Use %11s file : %s" % (regex, value) )
reader = FilterReader(value, 'fail2ban-regex-jail', {}, share_config=self.share_config)
reader.setBaseDir(None)
if not reader.readexplicit():
if not reader.readexplicit():
output( "ERROR: failed to read %s" % value )
return False
reader.getOptions(None)
readercommands = reader.convert()
regex_values = [
RegexStat(m[3])
for m in filter(
lambda x: x[0] == 'set' and x[2] == "add%sregex" % regextype,
readercommands)
] + [
RegexStat(m)
for mm in filter(
lambda x: x[0] == 'multi-set' and x[2] == "add%sregex" % regextype,
readercommands)
for m in mm[3]
]
# Read out and set possible value of maxlines
for command in readercommands:
if command[2] == "maxlines":
maxlines = int(command[3])
try:
self.setMaxLines(maxlines)
except ValueError:
output( "ERROR: Invalid value for maxlines (%(maxlines)r) " \
"read from %(value)s" % locals() )
return False
elif command[2] == 'addjournalmatch':
journalmatch = command[3:]
self.setJournalMatch(journalmatch)
elif command[2] == 'datepattern':
datepattern = command[3]
self.setDatePattern(datepattern)
regex_values = {}
for opt in readercommands:
if opt[0] == 'multi-set':
optval = opt[3]
elif opt[0] == 'set':
optval = opt[3:]
else: # pragma: no cover
continue
try:
if opt[2] == "prefregex":
for optval in optval:
self._filter.prefRegex = optval
elif opt[2] == "addfailregex":
stor = regex_values.get('fail')
if not stor: stor = regex_values['fail'] = list()
for optval in optval:
stor.append(RegexStat(optval))
#self._filter.addFailRegex(optval)
elif opt[2] == "addignoreregex":
stor = regex_values.get('ignore')
if not stor: stor = regex_values['ignore'] = list()
for optval in optval:
stor.append(RegexStat(optval))
#self._filter.addIgnoreRegex(optval)
elif opt[2] == "maxlines":
for optval in optval:
self.setMaxLines(optval)
elif opt[2] == "datepattern":
for optval in optval:
self.setDatePattern(optval)
elif opt[2] == "addjournalmatch": # pragma: no cover
if self._opts.journalmatch is None:
self.setJournalMatch(optval)
except ValueError as e: # pragma: no cover
output( "ERROR: Invalid value for %s (%r) " \
"read from %s: %s" % (opt[2], optval, value, e) )
return False
else:
output( "Use %11s line : %s" % (regex, shortstr(value)) )
regex_values = [RegexStat(value)]
regex_values = {regextype: [RegexStat(value)]}
setattr(self, "_" + regex, regex_values)
for regex in regex_values:
getattr(
self._filter,
'add%sRegex' % regextype.title())(regex.getFailRegex())
for regextype, regex_values in regex_values.iteritems():
regex = regextype + 'regex'
setattr(self, "_" + regex, regex_values)
for regex in regex_values:
getattr(
self._filter,
'add%sRegex' % regextype.title())(regex.getFailRegex())
return True
def testIgnoreRegex(self, line):
@@ -337,7 +350,7 @@ class Fail2banRegex(object):
if ret is not None:
found = True
regex = self._ignoreregex[ret].inc()
except RegexException as e:
except RegexException as e: # pragma: no cover
output( 'ERROR: %s' % e )
return False
return found
@@ -355,7 +368,7 @@ class Fail2banRegex(object):
regex = self._failregex[match[0]]
regex.inc()
regex.appendIP(match)
except RegexException as e:
except RegexException as e: # pragma: no cover
output( 'ERROR: %s' % e )
return False
for bufLine in orgLineBuffer[int(fullBuffer):]:
@@ -502,14 +515,14 @@ class Fail2banRegex(object):
for line in hdlr:
yield self.decode_line(line)
def start(self, opts, args):
def start(self, args):
cmd_log, cmd_regex = args[:2]
try:
if not self.readRegex(cmd_regex, 'fail'):
if not self.readRegex(cmd_regex, 'fail'): # pragma: no cover
return False
if len(args) == 3 and not self.readRegex(args[2], 'ignore'):
if len(args) == 3 and not self.readRegex(args[2], 'ignore'): # pragma: no cover
return False
except RegexException as e:
output( 'ERROR: %s' % e )
@@ -521,7 +534,7 @@ class Fail2banRegex(object):
output( "Use log file : %s" % cmd_log )
output( "Use encoding : %s" % self._encoding )
test_lines = self.file_lines_gen(hdlr)
except IOError as e:
except IOError as e: # pragma: no cover
output( e )
return False
elif cmd_log.startswith("systemd-journal"): # pragma: no cover
@@ -595,5 +608,5 @@ def exec_command_line(*args):
logSys.addHandler(stdout)
fail2banRegex = Fail2banRegex(opts)
if not fail2banRegex.start(opts, args):
if not fail2banRegex.start(args):
sys.exit(-1)


@@ -37,6 +37,7 @@ logSys = getLogger(__name__)
class FilterReader(DefinitionInitConfigReader):
_configOpts = {
"prefregex": ["string", None],
"ignoreregex": ["string", None],
"failregex": ["string", ""],
"maxlines": ["int", None],
@@ -68,12 +69,11 @@ class FilterReader(DefinitionInitConfigReader):
stream.append(["multi-set", self._jailName, "add" + opt, multi])
elif len(multi):
stream.append(["set", self._jailName, "add" + opt, multi[0]])
elif opt == 'maxlines':
# We warn when multiline regex is used without maxlines > 1
# therefore make sure we set this option first.
stream.insert(0, ["set", self._jailName, "maxlines", value])
elif opt == 'datepattern':
stream.append(["set", self._jailName, "datepattern", value])
elif opt in ('maxlines', 'prefregex'):
# Be sure we set these options first.
stream.insert(0, ["set", self._jailName, opt, value])
elif opt == 'datepattern':
stream.append(["set", self._jailName, opt, value])
# Do not send a command if the match is empty.
elif opt == 'journalmatch':
if value is None: continue


@@ -200,6 +200,108 @@ else:
raise
return uni_decode(x, enc, 'replace')
#
# Following facilities used for safe recursive interpolation of
# tags (<tag>) in tagged options.
#
# max tag replacement count:
MAX_TAG_REPLACE_COUNT = 10
# compiled RE for tag name (replacement name)
TAG_CRE = re.compile(r'<([^ <>]+)>')
def substituteRecursiveTags(inptags, conditional='',
ignore=(), addrepl=None
):
"""Sort out tag definitions within other tags.
Since v.0.9.2 supports embedded interpolation (see test cases for examples).
so: becomes:
a = 3 a = 3
b = <a>_3 b = 3_3
Parameters
----------
inptags : dict
Dictionary of tags(keys) and their values.
Returns
-------
dict
Dictionary of tags(keys) and their values, with tags
within the values recursively replaced.
"""
#logSys = getLogger("fail2ban")
tre_search = TAG_CRE.search
# copy return tags dict to prevent modifying of inptags:
tags = inptags.copy()
# init:
ignore = set(ignore)
done = set()
# repeat substitution while embedded-recursive (repFlag is True)
while True:
repFlag = False
# substitute each value:
for tag in tags.iterkeys():
# ignore escaped or already done (or in ignore list):
if tag in ignore or tag in done: continue
value = orgval = str(tags[tag])
# search and replace all tags within value, that can be interpolated using other tags:
m = tre_search(value)
refCounts = {}
#logSys.log(5, 'TAG: %s, value: %s' % (tag, value))
while m:
# found replacement tag:
rtag = m.group(1)
# don't replace tags that should be currently ignored (pre-replacement):
if rtag in ignore:
m = tre_search(value, m.end())
continue
#logSys.log(5, 'found: %s' % rtag)
if rtag == tag or refCounts.get(rtag, 1) > MAX_TAG_REPLACE_COUNT:
# recursive definitions are bad
#logSys.log(5, 'recursion fail tag: %s value: %s' % (tag, value) )
raise ValueError(
"properties contain self referencing definitions "
"and cannot be resolved, fail tag: %s, found: %s in %s, value: %s" %
(tag, rtag, refCounts, value))
repl = None
if conditional:
repl = tags.get(rtag + '?' + conditional)
if repl is None:
repl = tags.get(rtag)
# try to find tag using additional replacement (callable):
if repl is None and addrepl is not None:
repl = addrepl(rtag)
if repl is None:
# Missing tags - just continue on searching after end of match
# Missing tags are ok - cInfo can contain aInfo elements like <HOST> and valid shell
# constructs like <STDIN>.
m = tre_search(value, m.end())
continue
value = value.replace('<%s>' % rtag, repl)
#logSys.log(5, 'value now: %s' % value)
# increment reference count:
refCounts[rtag] = refCounts.get(rtag, 0) + 1
# the next match for replace:
m = tre_search(value, m.start())
#logSys.log(5, 'TAG: %s, newvalue: %s' % (tag, value))
# was substituted?
if orgval != value:
# check still contains any tag - should be repeated (possible embedded-recursive substitution):
if tre_search(value):
repFlag = True
tags[tag] = value
# no more sub tags (and no possible composite), add this tag to done set (just to be faster):
if '<' not in value: done.add(tag)
# stop interpolation, if no replacements anymore:
if not repFlag:
break
return tags
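The replace-until-stable loop above can be illustrated with a trimmed-down sketch (hypothetical `subst_recursive_tags`; it drops the conditional suffixes, the `addrepl` callback and the ignore/done sets, keeping only the core algorithm):

```python
import re

MAX_TAG_REPLACE_COUNT = 10
TAG_CRE = re.compile(r'<([^ <>]+)>')

def subst_recursive_tags(inptags):
    # trimmed-down sketch: no conditional suffixes, no addrepl callback,
    # no ignore/done sets -- only the core replace-until-stable loop
    tags = inptags.copy()
    while True:
        rep_flag = False
        for tag in tags:
            value = orgval = str(tags[tag])
            ref_counts = {}
            m = TAG_CRE.search(value)
            while m:
                rtag = m.group(1)
                if rtag == tag or ref_counts.get(rtag, 1) > MAX_TAG_REPLACE_COUNT:
                    raise ValueError("self referencing definition: %s" % tag)
                repl = tags.get(rtag)
                if repl is None:
                    # unknown tag: skip it, keep searching after the match
                    m = TAG_CRE.search(value, m.end())
                    continue
                value = value.replace('<%s>' % rtag, str(repl))
                ref_counts[rtag] = ref_counts.get(rtag, 0) + 1
                m = TAG_CRE.search(value, m.start())
            if orgval != value:
                if TAG_CRE.search(value):
                    rep_flag = True  # still tags left: repeat outer pass
                tags[tag] = value
        if not rep_flag:
            break
    return tags
```

Unknown tags such as `<HOST>` survive interpolation untouched, matching the comment above about `aInfo` elements and shell constructs.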
class BgService(object):
"""Background servicing


@@ -32,10 +32,11 @@ import time
from abc import ABCMeta
from collections import MutableMapping
from .failregex import mapTag2Opt
from .ipdns import asip
from .mytime import MyTime
from .utils import Utils
from ..helpers import getLogger
from ..helpers import getLogger, substituteRecursiveTags, TAG_CRE, MAX_TAG_REPLACE_COUNT
# Gets the instance of the logger.
logSys = getLogger(__name__)
@@ -46,14 +47,17 @@ _cmd_lock = threading.Lock()
# Todo: make it configurable resp. automatically set, ex.: `[ -f /proc/net/if_inet6 ] && echo 'yes' || echo 'no'`:
allowed_ipv6 = True
# max tag replacement count:
MAX_TAG_REPLACE_COUNT = 10
# capture groups from filter for map to ticket data:
FCUSTAG_CRE = re.compile(r'<F-([A-Z0-9_\-]+)>'); # currently uppercase only
# compiled RE for tag name (replacement name)
TAG_CRE = re.compile(r'<([^ <>]+)>')
# New line, space
ADD_REPL_TAGS = {
"br": "\n",
"sp": " "
}
class CallingMap(MutableMapping):
class CallingMap(MutableMapping, object):
"""A Mapping type which returns the result of callable values.
`CallingMap` behaves similar to a standard python dictionary,
@@ -70,23 +74,64 @@ class CallingMap(MutableMapping):
The dictionary data which can be accessed to obtain items uncalled
"""
# immutable=True saves content between actions, without interim copying (save original on demand, recoverable via reset)
__slots__ = ('data', 'storage', 'immutable', '__org_data')
def __init__(self, *args, **kwargs):
self.storage = dict()
self.immutable = True
self.data = dict(*args, **kwargs)
def reset(self, immutable=True):
self.storage = dict()
try:
self.data = self.__org_data
except AttributeError:
pass
self.immutable = immutable
def __repr__(self):
return "%s(%r)" % (self.__class__.__name__, self.data)
return "%s(%r)" % (self.__class__.__name__, self._asdict())
def _asdict(self):
try:
return dict(self)
except:
return dict(self.data, **self.storage)
def __getitem__(self, key):
value = self.data[key]
try:
value = self.storage[key]
except KeyError:
value = self.data[key]
if callable(value):
return value()
else:
return value
# check arguments can be supplied to callable (for backwards compatibility):
value = value(self) if hasattr(value, '__code__') and value.__code__.co_argcount else value()
self.storage[key] = value
return value
def __setitem__(self, key, value):
self.data[key] = value
# mutate to copy:
if self.immutable:
self.storage = self.storage.copy()
self.__org_data = self.data
self.data = self.data.copy()
self.immutable = False
self.storage[key] = value
def __unavailable(self, key):
raise KeyError("Key %r was deleted" % key)
def __delitem__(self, key):
# mutate to copy:
if self.immutable:
self.storage = self.storage.copy()
self.__org_data = self.data
self.data = self.data.copy()
self.immutable = False
try:
del self.storage[key]
except KeyError:
pass
del self.data[key]
def __iter__(self):
@@ -95,7 +140,7 @@ class CallingMap(MutableMapping):
def __len__(self):
return len(self.data)
def copy(self):
def copy(self): # pragma: no cover
return self.__class__(self.data.copy())
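The lazy-evaluation contract of `__getitem__` above (invoke callable values on first access, cache the result in `storage`, leave `data` untouched) can be sketched with a hypothetical `LazyMap`:

```python
class LazyMap(dict):
    # sketch of CallingMap's lazy __getitem__: callable values are invoked
    # on first access and the result is cached in a separate storage dict,
    # so the underlying data (the callables) stays untouched
    def __init__(self, *args, **kwargs):
        dict.__init__(self, *args, **kwargs)
        self.storage = {}
        self.calls = 0  # instrumentation for this example only

    def __getitem__(self, key):
        try:
            return self.storage[key]
        except KeyError:
            value = dict.__getitem__(self, key)
        if callable(value):
            self.calls += 1
            value = value()
        self.storage[key] = value
        return value

m = LazyMap(ip='192.0.2.1', failures=lambda: 3 + 2)
```

A second lookup of `failures` hits the cache, so the lambda runs only once per map lifetime.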
@@ -259,6 +304,16 @@ class CommandAction(ActionBase):
# set:
self.__dict__[name] = value
def __delattr__(self, name):
if not name.startswith('_'):
# parameters changed - clear properties and substitution cache:
self.__properties = None
self.__substCache.clear()
#self._logSys.debug("Unset action %r %s", self._name, name)
self._logSys.debug(" Unset %s", name)
# del:
del self.__dict__[name]
@property
def _properties(self):
"""A dictionary of the actions properties.
@@ -364,88 +419,6 @@ class CommandAction(ActionBase):
"""
return self._executeOperation('<actionreload>', 'reloading')
@classmethod
def substituteRecursiveTags(cls, inptags, conditional='', ignore=()):
"""Sort out tag definitions within other tags.
Since v.0.9.2 supports embedded interpolation (see test cases for examples).
so: becomes:
a = 3 a = 3
b = <a>_3 b = 3_3
Parameters
----------
inptags : dict
Dictionary of tags(keys) and their values.
Returns
-------
dict
Dictionary of tags(keys) and their values, with tags
within the values recursively replaced.
"""
# copy return tags dict to prevent modifying of inptags:
tags = inptags.copy()
t = TAG_CRE
ignore = set(ignore)
done = cls._escapedTags.copy() | ignore
# repeat substitution while embedded-recursive (repFlag is True)
while True:
repFlag = False
# substitute each value:
for tag in tags.iterkeys():
# ignore escaped or already done (or in ignore list):
if tag in done: continue
value = orgval = str(tags[tag])
# search and replace all tags within value, that can be interpolated using other tags:
m = t.search(value)
refCounts = {}
#logSys.log(5, 'TAG: %s, value: %s' % (tag, value))
while m:
found_tag = m.group(1)
# don't replace tags that should be currently ignored (pre-replacement):
if found_tag in ignore:
m = t.search(value, m.end())
continue
#logSys.log(5, 'found: %s' % found_tag)
if found_tag == tag or refCounts.get(found_tag, 1) > MAX_TAG_REPLACE_COUNT:
# recursive definitions are bad
#logSys.log(5, 'recursion fail tag: %s value: %s' % (tag, value) )
raise ValueError(
"properties contain self referencing definitions "
"and cannot be resolved, fail tag: %s, found: %s in %s, value: %s" %
(tag, found_tag, refCounts, value))
repl = None
if found_tag not in cls._escapedTags:
repl = tags.get(found_tag + '?' + conditional)
if repl is None:
repl = tags.get(found_tag)
if repl is None:
# Escaped or missing tags - just continue on searching after end of match
# Missing tags are ok - cInfo can contain aInfo elements like <HOST> and valid shell
# constructs like <STDIN>.
m = t.search(value, m.end())
continue
value = value.replace('<%s>' % found_tag, repl)
#logSys.log(5, 'value now: %s' % value)
# increment reference count:
refCounts[found_tag] = refCounts.get(found_tag, 0) + 1
# the next match for replace:
m = t.search(value, m.start())
#logSys.log(5, 'TAG: %s, newvalue: %s' % (tag, value))
# was substituted?
if orgval != value:
# check still contains any tag - should be repeated (possible embedded-recursive substitution):
if t.search(value):
repFlag = True
tags[tag] = value
# no more sub tags (and no possible composite), add this tag to done set (just to be faster):
if '<' not in value: done.add(tag)
# stop interpolation, if no replacements anymore:
if not repFlag:
break
return tags
@staticmethod
def escapeTag(value):
"""Escape characters which may be used for command injection.
@@ -488,33 +461,73 @@ class CommandAction(ActionBase):
str
`query` string with tags replaced.
"""
if '<' not in query: return query
# use cache if allowed:
if cache is not None:
ckey = (query, conditional)
string = cache.get(ckey)
if string is not None:
return string
# replace:
string = query
aInfo = cls.substituteRecursiveTags(aInfo, conditional)
for tag in aInfo:
if "<%s>" % tag in query:
value = aInfo.get(tag + '?' + conditional)
if value is None:
value = aInfo.get(tag)
value = str(value) # assure string
if tag in cls._escapedTags:
# That one needs to be escaped since its content is
# out of our control
value = cls.escapeTag(value)
string = string.replace('<' + tag + '>', value)
# New line, space
string = reduce(lambda s, kv: s.replace(*kv), (("<br>", '\n'), ("<sp>", " ")), string)
# cache if properties:
try:
return cache[ckey]
except KeyError:
pass
# first try get cached tags dictionary:
subInfo = csubkey = None
if cache is not None:
cache[ckey] = string
csubkey = ('subst-tags', id(aInfo), conditional)
try:
subInfo = cache[csubkey]
except KeyError:
pass
# interpolation of dictionary:
if subInfo is None:
subInfo = substituteRecursiveTags(aInfo, conditional, ignore=cls._escapedTags)
# cache if possible:
if csubkey is not None:
cache[csubkey] = subInfo
# substitution callable, used by interpolation of each tag
repeatSubst = {0: 0}
def substVal(m):
tag = m.group(1) # tagname from match
value = None
if conditional:
value = subInfo.get(tag + '?' + conditional)
if value is None:
value = subInfo.get(tag)
if value is None:
# fallback (no or default replacement)
return ADD_REPL_TAGS.get(tag, m.group())
value = str(value) # assure string
if tag in cls._escapedTags:
# That one needs to be escaped since its content is
# out of our control
value = cls.escapeTag(value)
# possible contains tags:
if '<' in value:
repeatSubst[0] = 1
return value
# interpolation of query:
count = MAX_TAG_REPLACE_COUNT + 1
while True:
repeatSubst[0] = 0
value = TAG_CRE.sub(substVal, query)
# possible recursion ?
if not repeatSubst[0] or value == query: break
query = value
count -= 1
if count <= 0:
raise ValueError(
"unexpected too long replacement interpolation, "
"possible self referencing definitions in query: %s" % (query,))
# cache if possible:
if cache is not None:
cache[ckey] = value
#
return string
return value
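The re.sub-with-callable approach from the changelog can be reduced to a minimal single-pass sketch (hypothetical `replace_tags_once`; no conditionals, escaping, caching, or recursion guard):

```python
import re

TAG_CRE = re.compile(r'<([^ <>]+)>')
ADD_REPL_TAGS = {"br": "\n", "sp": " "}

def replace_tags_once(query, info):
    # one interpolation pass: a single re.sub with a callable resolves every
    # <tag>; unknown tags fall back to ADD_REPL_TAGS or are kept verbatim
    def subst_val(m):
        value = info.get(m.group(1))
        if value is None:
            return ADD_REPL_TAGS.get(m.group(1), m.group())
        return str(value)
    return TAG_CRE.sub(subst_val, query)
```

Compared with looping over all known tags and calling `str.replace` per tag, one compiled-regex pass touches each character of the query once, which is where the speed-up comes from.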
def _processCmd(self, cmd, aInfo=None, conditional=''):
"""Executes a command with preliminary checks and substitutions.
@@ -580,9 +593,21 @@ class CommandAction(ActionBase):
realCmd = self.replaceTag(cmd, self._properties,
conditional=conditional, cache=self.__substCache)
# Replace tags
# Replace dynamic tags (don't use cache here)
if aInfo is not None:
realCmd = self.replaceTag(realCmd, aInfo, conditional=conditional)
# Replace ticket options (filter capture groups) non-recursive:
if '<' in realCmd:
tickData = aInfo.get("F-*")
if not tickData: tickData = {}
def substTag(m):
tn = mapTag2Opt(m.groups()[0])
try:
return str(tickData[tn])
except KeyError:
return ""
realCmd = FCUSTAG_CRE.sub(substTag, realCmd)
else:
realCmd = cmd


@@ -287,44 +287,86 @@ class Actions(JailThread, Mapping):
self.stopActions()
return True
def __getBansMerged(self, mi, overalljails=False):
"""Gets bans merged once, a helper for lambda(s), prevents stop of executing action by any exception inside.
class ActionInfo(CallingMap):
This function never returns None for ainfo lambdas - always a ticket (merged or single one)
and prevents any errors through merging (to guarantee ban actions will be executed).
[TODO] move merging to observer - here we could wait for merge and read already merged info from a database
AI_DICT = {
"ip": lambda self: self.__ticket.getIP(),
"ip-rev": lambda self: self['ip'].getPTR(''),
"fid": lambda self: self.__ticket.getID(),
"failures": lambda self: self.__ticket.getAttempt(),
"time": lambda self: self.__ticket.getTime(),
"matches": lambda self: "\n".join(self.__ticket.getMatches()),
# to bypass actions that should not be executed for restored tickets
"restored": lambda self: (1 if self.__ticket.restored else 0),
# extra-interpolation - all match-tags (captured from the filter):
"F-*": lambda self, tag=None: self.__ticket.getData(tag),
# merged info:
"ipmatches": lambda self: "\n".join(self._mi4ip(True).getMatches()),
"ipjailmatches": lambda self: "\n".join(self._mi4ip().getMatches()),
"ipfailures": lambda self: self._mi4ip(True).getAttempt(),
"ipjailfailures": lambda self: self._mi4ip().getAttempt(),
}
Parameters
----------
mi : dict
merge info, initial for lambda should contains {ip, ticket}
overalljails : bool
switch to get a merged bans :
False - (default) bans merged for current jail only
True - bans merged for all jails of current ip address
__slots__ = CallingMap.__slots__ + ('__ticket', '__jail', '__mi4ip')
def __init__(self, ticket, jail=None, immutable=True, data=AI_DICT):
self.__ticket = ticket
self.__jail = jail
self.storage = dict()
self.immutable = immutable
self.data = data
def copy(self): # pragma: no cover
return self.__class__(self.__ticket, self.__jail, self.immutable, self.data.copy())
def _mi4ip(self, overalljails=False):
"""Gets bans merged once, a helper for lambda(s), prevents stop of executing action by any exception inside.
This function never returns None for ainfo lambdas - always a ticket (merged or single one)
and prevents any errors through merging (to guarantee ban actions will be executed).
[TODO] move merging to observer - here we could wait for merge and read already merged info from a database
Parameters
----------
overalljails : bool
switch to get a merged bans :
False - (default) bans merged for current jail only
True - bans merged for all jails of current ip address
Returns
-------
BanTicket
merged or self ticket only
"""
if not hasattr(self, '__mi4ip'):
self.__mi4ip = {}
mi = self.__mi4ip
idx = 'all' if overalljails else 'jail'
if idx in mi:
return mi[idx] if mi[idx] is not None else self.__ticket
try:
jail = self.__jail
ip = self['ip']
mi[idx] = None
if not jail.database: # pragma: no cover
return self.__ticket
if overalljails:
mi[idx] = jail.database.getBansMerged(ip=ip)
else:
mi[idx] = jail.database.getBansMerged(ip=ip, jail=jail)
except Exception as e:
logSys.error(
"Failed to get %s bans merged, jail '%s': %s",
idx, jail.name, e,
exc_info=logSys.getEffectiveLevel()<=logging.DEBUG)
return mi[idx] if mi[idx] is not None else self.__ticket
def __getActionInfo(self, ticket):
ip = ticket.getIP()
aInfo = Actions.ActionInfo(ticket, self._jail)
return aInfo
Returns
-------
BanTicket
merged or self ticket only
"""
idx = 'all' if overalljails else 'jail'
if idx in mi:
return mi[idx] if mi[idx] is not None else mi['ticket']
try:
jail=self._jail
ip=mi['ip']
mi[idx] = None
if overalljails:
mi[idx] = jail.database.getBansMerged(ip=ip)
else:
mi[idx] = jail.database.getBansMerged(ip=ip, jail=jail)
except Exception as e:
logSys.error(
"Failed to get %s bans merged, jail '%s': %s",
idx, jail.name, e,
exc_info=logSys.getEffectiveLevel()<=logging.DEBUG)
return mi[idx] if mi[idx] is not None else mi['ticket']
def __checkBan(self):
"""Check for IP address to ban.
@@ -342,7 +384,6 @@ class Actions(JailThread, Mapping):
ticket = self._jail.getFailTicket()
if not ticket:
break
aInfo = CallingMap()
bTicket = BanManager.createBanTicket(ticket)
btime = ticket.getBanTime()
if btime is not None:
@@ -353,20 +394,7 @@ class Actions(JailThread, Mapping):
if ticket.restored:
bTicket.restored = True
ip = bTicket.getIP()
aInfo["ip"] = ip
aInfo["failures"] = bTicket.getAttempt()
aInfo["time"] = bTicket.getTime()
aInfo["matches"] = "\n".join(bTicket.getMatches())
# to bypass actions that should not be executed for restored tickets
aInfo["restored"] = 1 if ticket.restored else 0
# delayed merge info via two lambdas: one for merge, one for matches/failures:
if self._jail.database is not None:
mi4ip = lambda overalljails=False, self=self, \
mi={'ip':ip, 'ticket':bTicket}: self.__getBansMerged(mi, overalljails)
aInfo["ipmatches"] = lambda: "\n".join(mi4ip(True).getMatches())
aInfo["ipjailmatches"] = lambda: "\n".join(mi4ip().getMatches())
aInfo["ipfailures"] = lambda: mi4ip(True).getAttempt()
aInfo["ipjailfailures"] = lambda: mi4ip().getAttempt()
aInfo = self.__getActionInfo(bTicket)
reason = {}
if self.__banManager.addBanTicket(bTicket, reason=reason):
cnt += 1
@@ -379,7 +407,8 @@ class Actions(JailThread, Mapping):
try:
if ticket.restored and getattr(action, 'norestored', False):
continue
action.ban(aInfo.copy())
if not aInfo.immutable: aInfo.reset()
action.ban(aInfo)
except Exception as e:
logSys.error(
"Failed to execute ban jail '%s' action '%s' "
@@ -465,21 +494,17 @@ class Actions(JailThread, Mapping):
unbactions = self._actions
else:
unbactions = actions
aInfo = dict()
aInfo["ip"] = ticket.getIP()
aInfo["failures"] = ticket.getAttempt()
aInfo["time"] = ticket.getTime()
aInfo["matches"] = "".join(ticket.getMatches())
# to bypass actions that should not be executed for restored tickets
aInfo["restored"] = 1 if ticket.restored else 0
ip = ticket.getIP()
aInfo = self.__getActionInfo(ticket)
if actions is None:
logSys.notice("[%s] Unban %s", self._jail.name, aInfo["ip"])
for name, action in unbactions.iteritems():
try:
if ticket.restored and getattr(action, 'norestored', False):
continue
logSys.debug("[%s] action %r: unban %s", self._jail.name, name, aInfo["ip"])
action.unban(aInfo.copy())
logSys.debug("[%s] action %r: unban %s", self._jail.name, name, ip)
if not aInfo.immutable: aInfo.reset()
action.unban(aInfo)
except Exception as e:
logSys.error(
"Failed to execute unban jail '%s' action '%s' "


@@ -483,7 +483,7 @@ class Fail2BanDb(object):
cur.execute(
"INSERT OR REPLACE INTO bips(ip, jail, timeofban, bantime, bancount, data) VALUES(?, ?, ?, ?, ?, ?)",
(ip, jail.name, int(round(ticket.getTime())), ticket.getBanTime(jail.actions.getBanTime()), ticket.getBanCount(),
{"matches": ticket.getMatches(), "failures": ticket.getAttempt()}))
ticket.getData()))
@commitandrollback
def delBan(self, cur, jail, ip):


@@ -27,6 +27,68 @@ import sys
from .ipdns import IPAddr
FTAG_CRE = re.compile(r'</?[\w\-]+/?>')
FCUSTNAME_CRE = re.compile(r'^(/?)F-([A-Z0-9_\-]+)$'); # currently uppercase only
R_HOST = [
# separated ipv4:
r"""(?:::f{4,6}:)?(?P<ip4>%s)""" % (IPAddr.IP_4_RE,),
# separated ipv6:
r"""(?P<ip6>%s)""" % (IPAddr.IP_6_RE,),
# place-holder for ipv6 enclosed in optional [] (used in addr-, host-regex)
"",
# separated dns:
r"""(?P<dns>[\w\-.^_]*\w)""",
# place-holder for ADDR tag-replacement (joined):
"",
# place-holder for HOST tag replacement (joined):
""
]
RI_IPV4 = 0
RI_IPV6 = 1
RI_IPV6BR = 2
RI_DNS = 3
RI_ADDR = 4
RI_HOST = 5
R_HOST[RI_IPV6BR] = r"""\[?%s\]?""" % (R_HOST[RI_IPV6],)
R_HOST[RI_ADDR] = "(?:%s)" % ("|".join((R_HOST[RI_IPV4], R_HOST[RI_IPV6BR])),)
R_HOST[RI_HOST] = "(?:%s)" % ("|".join((R_HOST[RI_IPV4], R_HOST[RI_IPV6BR], R_HOST[RI_DNS])),)
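How the `R_HOST` table is assembled into the `<ADDR>`/`<HOST>` alternations can be shown with simplified stand-in patterns (the real `IPAddr.IP_4_RE`/`IPAddr.IP_6_RE` are much stricter; only the assembly scheme is mirrored here):

```python
import re

# simplified stand-ins for IPAddr.IP_4_RE / IPAddr.IP_6_RE; only the
# assembly scheme mirrors the R_HOST table
ip4 = r"(?:::f{4,6}:)?(?P<ip4>\d{1,3}(?:\.\d{1,3}){3})"
ip6 = r"(?P<ip6>[0-9a-fA-F:]*:[0-9a-fA-F:]+)"
dns = r"(?P<dns>[\w\-.^_]*\w)"
ip6br = r"\[?%s\]?" % ip6                      # ipv6, optionally in []
addr = "(?:%s)" % "|".join((ip4, ip6br))       # <ADDR>: addresses only
host = "(?:%s)" % "|".join((ip4, ip6br, dns))  # <HOST>: addresses or dns

host_re = re.compile(host)
```

Because `<ADDR>` omits the `dns` alternative, it never matches a hostname, which is exactly the "regardless of `usedns`" distinction made in the comments above.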
RH4TAG = {
# separated ipv4 (self closed, closed):
"IP4": R_HOST[RI_IPV4],
"F-IP4/": R_HOST[RI_IPV4],
# separated ipv6 (self closed, closed):
"IP6": R_HOST[RI_IPV6],
"F-IP6/": R_HOST[RI_IPV6],
# 2 address groups instead of <ADDR> - in opposition to `<HOST>`,
# for separate usage of 2 address groups only (regardless of `usedns`), `ip4` and `ip6` together
"ADDR": R_HOST[RI_ADDR],
"F-ADDR/": R_HOST[RI_ADDR],
# separated dns (self closed, closed):
"DNS": R_HOST[RI_DNS],
"F-DNS/": R_HOST[RI_DNS],
# default failure-id as no space tag:
"F-ID/": r"""(?P<fid>\S+)""",
# default failure port, like 80 or http :
"F-PORT/": r"""(?P<fport>\w+)""",
}
# default failure groups map for customizable expressions (with different group-id):
R_MAP = {
"ID": "fid",
"PORT": "fport",
}
def mapTag2Opt(tag):
try: # if should be mapped:
return R_MAP[tag]
except KeyError:
return tag.lower()
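`mapTag2Opt` is equivalent to a dictionary lookup with a lowercase fallback; a one-line sketch:

```python
R_MAP = {"ID": "fid", "PORT": "fport"}

def map_tag_to_opt(tag):
    # tags with a dedicated default group are remapped; any other custom
    # tag (e.g. <F-USER>) is simply lowercased to its ticket-data key
    return R_MAP.get(tag, tag.lower())
```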
##
# Regular expression class.
#
@@ -71,38 +133,44 @@ class Regex:
@staticmethod
def _resolveHostTag(regex, useDns="yes"):
# separated ipv4:
r_host = []
r = r"""(?:::f{4,6}:)?(?P<ip4>%s)""" % (IPAddr.IP_4_RE,)
regex = regex.replace("<IP4>", r); # self closed
regex = regex.replace("<F-IP4/>", r); # closed
r_host.append(r)
# separated ipv6:
r = r"""(?P<ip6>%s)""" % (IPAddr.IP_6_RE,)
regex = regex.replace("<IP6>", r); # self closed
regex = regex.replace("<F-IP6/>", r); # closed
r_host.append(r"""\[?%s\]?""" % (r,)); # enclose ipv6 in optional [] in host-regex
# 2 address groups instead of <ADDR> - in opposition to `<HOST>`,
# for separate usage of 2 address groups only (regardless of `usedns`), `ip4` and `ip6` together
regex = regex.replace("<ADDR>", "(?:%s)" % ("|".join(r_host),))
# separated dns:
r = r"""(?P<dns>[\w\-.^_]*\w)"""
regex = regex.replace("<DNS>", r); # self closed
regex = regex.replace("<F-DNS/>", r); # closed
if useDns not in ("no",):
r_host.append(r)
# 3 groups instead of <HOST> - separated ipv4, ipv6 and host (dns)
regex = regex.replace("<HOST>", "(?:%s)" % ("|".join(r_host),))
# default failure-id as no space tag:
regex = regex.replace("<F-ID/>", r"""(?P<fid>\S+)"""); # closed
# default failure port, like 80 or http :
regex = regex.replace("<F-PORT/>", r"""(?P<port>\w+)"""); # closed
# default failure groups (begin / end tag) for customizable expressions:
for o,r in (('IP4', 'ip4'), ('IP6', 'ip6'), ('DNS', 'dns'), ('ID', 'fid'), ('PORT', 'fport')):
regex = regex.replace("<F-%s>" % o, "(?P<%s>" % r); # open tag
regex = regex.replace("</F-%s>" % o, ")"); # close tag
return regex
openTags = dict()
# tag interpolation callable:
def substTag(m):
tag = m.group()
tn = tag[1:-1]
# 3 groups instead of <HOST> - separated ipv4, ipv6 and host (dns)
if tn == "HOST":
return R_HOST[RI_HOST if useDns not in ("no",) else RI_ADDR]
# static replacement from RH4TAG:
try:
return RH4TAG[tn]
except KeyError:
pass
# (begin / end tag) for customizable expressions, additionally used as
# user custom tags (match will be stored in ticket data, can be used in actions):
m = FCUSTNAME_CRE.match(tn)
if m: # match F-...
m = m.groups()
tn = m[1]
# close tag:
if m[0]:
# check it was already open:
if openTags.get(tn):
return ")"
return tag; # tag not opened, use original
# open tag:
openTags[tn] = 1
# if should be mapped:
tn = mapTag2Opt(tn)
return "(?P<%s>" % (tn,)
# original, no replacement:
return tag
# substitute tags:
return FTAG_CRE.sub(substTag, regex)
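The custom-tag part of this substitution (a paired `<F-...>`/`</F-...>` becoming a named capture group) can be sketched on its own (hypothetical `resolve_custom_tags`; it skips the host tags, the `RH4TAG` statics and the `R_MAP` remapping, lowercasing the name directly):

```python
import re

FTAG_CRE = re.compile(r'</?[\w\-]+/?>')
FCUSTNAME_CRE = re.compile(r'^(/?)F-([A-Z0-9_\-]+)$')

def resolve_custom_tags(regex):
    # paired <F-NAME>...</F-NAME> tags become a named capture group;
    # a closing tag without a preceding opening tag is kept literally
    open_tags = {}
    def subst(m):
        tn = m.group()[1:-1]
        fm = FCUSTNAME_CRE.match(tn)
        if fm:
            closing, name = fm.groups()
            if closing:
                if open_tags.get(name):
                    return ")"
                return m.group()  # close without open: keep as-is
            open_tags[name] = 1
            return "(?P<%s>" % name.lower()
        return m.group()          # not a custom tag: no replacement
    return FTAG_CRE.sub(subst, regex)
```

The resulting pattern is an ordinary named group, so the match lands in `groupdict()` and from there in the ticket data that actions can interpolate.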
##
# Gets the regular expression.
@@ -121,40 +189,45 @@ class Regex:
# method of this object.
# @param a list of tuples. The tuples are ( prematch, datematch, postdatematch )
def search(self, tupleLines):
def search(self, tupleLines, orgLines=None):
self._matchCache = self._regexObj.search(
"\n".join("".join(value[::2]) for value in tupleLines) + "\n")
if self.hasMatched():
# Find start of the first line where the match was found
try:
self._matchLineStart = self._matchCache.string.rindex(
"\n", 0, self._matchCache.start() +1 ) + 1
except ValueError:
self._matchLineStart = 0
# Find end of the last line where the match was found
try:
self._matchLineEnd = self._matchCache.string.index(
"\n", self._matchCache.end() - 1) + 1
except ValueError:
self._matchLineEnd = len(self._matchCache.string)
if self._matchCache:
if orgLines is None: orgLines = tupleLines
# if single-line:
if len(orgLines) <= 1:
self._matchedTupleLines = orgLines
self._unmatchedTupleLines = []
else:
# Find start of the first line where the match was found
try:
matchLineStart = self._matchCache.string.rindex(
"\n", 0, self._matchCache.start() +1 ) + 1
except ValueError:
matchLineStart = 0
# Find end of the last line where the match was found
try:
matchLineEnd = self._matchCache.string.index(
"\n", self._matchCache.end() - 1) + 1
except ValueError:
matchLineEnd = len(self._matchCache.string)
lineCount1 = self._matchCache.string.count(
"\n", 0, self._matchLineStart)
lineCount2 = self._matchCache.string.count(
"\n", 0, self._matchLineEnd)
self._matchedTupleLines = tupleLines[lineCount1:lineCount2]
self._unmatchedTupleLines = tupleLines[:lineCount1]
n = 0
for skippedLine in self.getSkippedLines():
for m, matchedTupleLine in enumerate(
self._matchedTupleLines[n:]):
if "".join(matchedTupleLine[::2]) == skippedLine:
self._unmatchedTupleLines.append(
self._matchedTupleLines.pop(n+m))
n += m
break
self._unmatchedTupleLines.extend(tupleLines[lineCount2:])
lineCount1 = self._matchCache.string.count(
"\n", 0, matchLineStart)
lineCount2 = self._matchCache.string.count(
"\n", 0, matchLineEnd)
self._matchedTupleLines = orgLines[lineCount1:lineCount2]
self._unmatchedTupleLines = orgLines[:lineCount1]
n = 0
for skippedLine in self.getSkippedLines():
for m, matchedTupleLine in enumerate(
self._matchedTupleLines[n:]):
if "".join(matchedTupleLine[::2]) == skippedLine:
self._unmatchedTupleLines.append(
self._matchedTupleLines.pop(n+m))
n += m
break
self._unmatchedTupleLines.extend(orgLines[lineCount2:])
# Checks if the previous call to search() matched.
#
@ -166,6 +239,13 @@ class Regex:
else:
return False
##
# Returns all matched groups.
#
def getGroups(self):
return self._matchCache.groupdict()
##
# Returns skipped lines.
#
@ -243,6 +323,10 @@ class RegexException(Exception):
#
FAILURE_ID_GROPS = ("fid", "ip4", "ip6", "dns")
# Additionally allows multi-line failure-id (used for wrapping e. g. conn-id to host)
#
FAILURE_ID_PRESENTS = FAILURE_ID_GROPS + ("mlfid",)
##
# Regular expression class.
#
@ -257,20 +341,16 @@ class FailRegex(Regex):
# avoid construction of invalid object.
# @param value the regular expression
def __init__(self, regex, **kwargs):
def __init__(self, regex, prefRegex=None, **kwargs):
# Initializes the parent.
Regex.__init__(self, regex, **kwargs)
# Check for group "dns", "ip4", "ip6", "fid"
if not [grp for grp in FAILURE_ID_GROPS if grp in self._regexObj.groupindex]:
if (not [grp for grp in FAILURE_ID_PRESENTS if grp in self._regexObj.groupindex]
and (prefRegex is None or
not [grp for grp in FAILURE_ID_PRESENTS if grp in prefRegex._regexObj.groupindex])
):
raise RegexException("No failure-id group in '%s'" % self._regex)
##
# Returns all matched groups.
#
def getGroups(self):
return self._matchCache.groupdict()
##
# Returns the matched failure id.
#

View File

@ -39,6 +39,7 @@ from .datedetector import DateDetector
from .mytime import MyTime
from .failregex import FailRegex, Regex, RegexException
from .action import CommandAction
from .utils import Utils
from ..helpers import getLogger, PREFER_ENC
# Gets the instance of the logger.
@ -66,6 +67,8 @@ class Filter(JailThread):
self.jail = jail
## The failures manager.
self.failManager = FailManager()
## Regular expression for pre-filtering before failure matching.
self.__prefRegex = None
## The regular expression list matching the failures.
self.__failRegex = list()
## The regular expression list with expressions to ignore.
@ -87,6 +90,8 @@ class Filter(JailThread):
self.__ignoreCommand = False
## Default or preferred encoding (to decode bytes from file or journal):
self.__encoding = PREFER_ENC
## Cache temporarily holds failure info (used by multi-line parsing to wrap e.g. conn-id to host):
self.__mlfidCache = None
## Error counter (protected, so can be used in filter implementations)
## if it reached 100 (at once), run-cycle will go idle
self._errors = 0
@ -100,7 +105,7 @@ class Filter(JailThread):
self.ticks = 0
self.dateDetector = DateDetector()
logSys.debug("Created %s" % self)
logSys.debug("Created %s", self)
def __repr__(self):
return "%s(%r)" % (self.__class__.__name__, self.jail)
@ -130,6 +135,23 @@ class Filter(JailThread):
self.delLogPath(path)
delattr(self, '_reload_logs')
@property
def mlfidCache(self):
if self.__mlfidCache:
return self.__mlfidCache
self.__mlfidCache = Utils.Cache(maxCount=100, maxTime=5*60)
return self.__mlfidCache
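The `mlfidCache` above is created lazily with `maxCount=100, maxTime=5*60`. A minimal sketch of such a count- and time-bounded cache (the real `Utils.Cache` semantics are assumed here; `MiniCache` is an illustrative name) could look like:

```python
import time
from collections import OrderedDict

class MiniCache:
    """Minimal sketch of a count- and time-bounded cache
    (Utils.Cache-like; the real eviction details are assumed)."""
    def __init__(self, maxCount=100, maxTime=300):
        self.maxCount, self.maxTime = maxCount, maxTime
        self._cache = OrderedDict()
    def get(self, k, default=None):
        v = self._cache.get(k)
        if v is None:
            return default
        value, expires = v
        if expires < time.time():            # entry expired by maxTime
            del self._cache[k]
            return default
        return value
    def set(self, k, value):
        while len(self._cache) >= self.maxCount:
            self._cache.popitem(last=False)  # evict oldest entry
        self._cache[k] = (value, time.time() + self.maxTime)
    def unset(self, k):
        self._cache.pop(k, None)
```

Bounding both count and age keeps per-connection failure info from accumulating when a multi-line conversation never completes.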
@property
def prefRegex(self):
return self.__prefRegex
@prefRegex.setter
def prefRegex(self, value):
if value:
self.__prefRegex = Regex(value, useDns=self.__useDns)
else:
self.__prefRegex = None
##
# Add a regular expression which matches the failure.
#
@ -139,7 +161,7 @@ class Filter(JailThread):
def addFailRegex(self, value):
try:
regex = FailRegex(value, useDns=self.__useDns)
regex = FailRegex(value, prefRegex=self.__prefRegex, useDns=self.__useDns)
self.__failRegex.append(regex)
if "\n" in regex.getRegex() and not self.getMaxLines() > 1:
logSys.warning(
@ -159,7 +181,7 @@ class Filter(JailThread):
del self.__failRegex[index]
except IndexError:
logSys.error("Cannot remove regular expression. Index %d is not "
"valid" % index)
"valid", index)
##
# Get the regular expression which matches the failure.
@ -197,7 +219,7 @@ class Filter(JailThread):
del self.__ignoreRegex[index]
except IndexError:
logSys.error("Cannot remove regular expression. Index %d is not "
"valid" % index)
"valid", index)
##
# Get the regular expression which matches the failure.
@ -220,9 +242,9 @@ class Filter(JailThread):
value = value.lower() # must be a string by now
if not (value in ('yes', 'warn', 'no', 'raw')):
logSys.error("Incorrect value %r specified for usedns. "
"Using safe 'no'" % (value,))
"Using safe 'no'", value)
value = 'no'
logSys.debug("Setting usedns = %s for %s" % (value, self))
logSys.debug("Setting usedns = %s for %s", value, self)
self.__useDns = value
##
@ -335,7 +357,7 @@ class Filter(JailThread):
encoding = PREFER_ENC
codecs.lookup(encoding) # Raise LookupError if invalid codec
self.__encoding = encoding
logSys.info(" encoding: %s" % encoding)
logSys.info(" encoding: %s", encoding)
return encoding
##
@ -380,7 +402,7 @@ class Filter(JailThread):
if not isinstance(ip, IPAddr):
ip = IPAddr(ip)
if self.inIgnoreIPList(ip):
logSys.warning('Requested to manually ban an ignored IP %s. User knows best. Proceeding to ban it.' % ip)
logSys.warning('Requested to manually ban an ignored IP %s. User knows best. Proceeding to ban it.', ip)
unixTime = MyTime.time()
self.failManager.addFailure(FailTicket(ip, unixTime), self.failManager.getMaxRetry())
@ -424,7 +446,7 @@ class Filter(JailThread):
def logIgnoreIp(self, ip, log_ignore, ignore_source="unknown source"):
if log_ignore:
logSys.info("[%s] Ignore %s by %s" % (self.jailName, ip, ignore_source))
logSys.info("[%s] Ignore %s by %s", self.jailName, ip, ignore_source)
def getIgnoreIP(self):
return self.__ignoreIpList
@ -448,7 +470,7 @@ class Filter(JailThread):
if self.__ignoreCommand:
command = CommandAction.replaceTag(self.__ignoreCommand, { 'ip': ip } )
logSys.debug('ignore command: ' + command)
logSys.debug('ignore command: %s', command)
ret, ret_ignore = CommandAction.executeCmd(command, success_codes=(0, 1))
ret_ignore = ret and ret_ignore == 0
self.logIgnoreIp(ip, log_ignore and ret_ignore, ignore_source="command")
@ -487,10 +509,7 @@ class Filter(JailThread):
for element in self.processLine(line, date):
ip = element[1]
unixTime = element[2]
lines = element[3]
fail = {}
if len(element) > 4:
fail = element[4]
fail = element[3]
logSys.debug("Processing line with time:%s and ip:%s",
unixTime, ip)
if self.inIgnoreIPList(ip, log_ignore=True):
@ -498,7 +517,7 @@ class Filter(JailThread):
logSys.info(
"[%s] Found %s - %s", self.jailName, ip, datetime.datetime.fromtimestamp(unixTime).strftime("%Y-%m-%d %H:%M:%S")
)
tick = FailTicket(ip, unixTime, lines, data=fail)
tick = FailTicket(ip, unixTime, data=fail)
self.failManager.addFailure(tick)
# report to observer - failure was found, for possibly increasing of it retry counter (asynchronous)
if Observers.Main is not None:
@ -536,6 +555,29 @@ class Filter(JailThread):
return ignoreRegexIndex
return None
def _mergeFailure(self, mlfid, fail, failRegex):
mlfidFail = self.mlfidCache.get(mlfid) if self.__mlfidCache else None
if mlfidFail:
mlfidGroups = mlfidFail[1]
# if current line not failure, but previous was failure:
if fail.get('nofail') and not mlfidGroups.get('nofail'):
del fail['nofail'] # remove nofail flag - was already marked as failure
self.mlfidCache.unset(mlfid) # remove cache entry
# if current line is failure, but previous was not:
elif not fail.get('nofail') and mlfidGroups.get('nofail'):
del mlfidGroups['nofail'] # remove nofail flag
self.mlfidCache.unset(mlfid) # remove cache entry
fail2 = mlfidGroups.copy()
fail2.update(fail)
fail2["matches"] = fail.get("matches", []) + failRegex.getMatchedTupleLines()
fail = fail2
elif fail.get('nofail'):
fail["matches"] = failRegex.getMatchedTupleLines()
mlfidFail = [self.__lastDate, fail]
self.mlfidCache.set(mlfid, mlfidFail)
return fail
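`_mergeFailure` combines the capture groups of log lines that share one multi-line failure id. A standalone sketch of the flag logic (a plain dict in place of the real cache, without the matched-lines bookkeeping; the name `merge_failure` is illustrative):

```python
def merge_failure(cache, mlfid, fail):
    """Merge new capture groups `fail` with groups cached under `mlfid`.
    A real failure cancels a previously cached <F-NOFAIL> marker (and
    vice versa), after which the cache entry is dropped."""
    cached = cache.get(mlfid)
    if cached is not None:
        if fail.get('nofail') and not cached.get('nofail'):
            del fail['nofail']        # earlier line already marked failure
            cache.pop(mlfid, None)
        elif not fail.get('nofail') and cached.get('nofail'):
            del cached['nofail']      # current line is the real failure
            cache.pop(mlfid, None)
        merged = dict(cached)
        merged.update(fail)           # current line's groups win
        return merged
    if fail.get('nofail'):            # no failure yet: remember the info
        cache[mlfid] = fail
    return fail
```

So a first `<F-NOFAIL>` line can contribute e.g. the IP address, and a later failure line for the same conn-id inherits it.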
##
# Finds the failure in a line given split into time and log parts.
#
@ -568,7 +610,7 @@ class Filter(JailThread):
dateTimeMatch = self.dateDetector.getTime(timeText, tupleLine[3])
if dateTimeMatch is None:
logSys.error("findFailure failed to parse timeText: " + timeText)
logSys.error("findFailure failed to parse timeText: %s", timeText)
date = self.__lastDate
else:
@ -586,77 +628,118 @@ class Filter(JailThread):
date, MyTime.time(), self.getFindTime())
return failList
self.__lineBuffer = (
self.__lineBuffer + [tupleLine[:3]])[-self.__lineBufferSize:]
logSys.log(5, "Looking for failregex match of %r" % self.__lineBuffer)
if self.__lineBufferSize > 1:
orgBuffer = self.__lineBuffer = (
self.__lineBuffer + [tupleLine[:3]])[-self.__lineBufferSize:]
else:
orgBuffer = self.__lineBuffer = [tupleLine[:3]]
logSys.log(5, "Looking for failregex match of %r", self.__lineBuffer)
# Pre-filter fail regex (if available):
preGroups = {}
if self.__prefRegex:
self.__prefRegex.search(self.__lineBuffer)
if not self.__prefRegex.hasMatched():
return failList
preGroups = self.__prefRegex.getGroups()
logSys.log(7, "Pre-filter matched %s", preGroups)
repl = preGroups.get('content')
# Content replacement:
if repl:
del preGroups['content']
self.__lineBuffer = [('', '', repl)]
# Iterates over all the regular expressions.
for failRegexIndex, failRegex in enumerate(self.__failRegex):
failRegex.search(self.__lineBuffer)
if failRegex.hasMatched():
# The failregex matched.
logSys.log(7, "Matched %s", failRegex)
# Checks if we must ignore this match.
if self.ignoreLine(failRegex.getMatchedTupleLines()) \
is not None:
# The ignoreregex matched. Remove ignored match.
self.__lineBuffer = failRegex.getUnmatchedTupleLines()
logSys.log(7, "Matched ignoreregex and was ignored")
if not self.checkAllRegex:
break
else:
continue
if date is None:
logSys.warning(
"Found a match for %r but no valid date/time "
"found for %r. Please try setting a custom "
"date pattern (see man page jail.conf(5)). "
"If format is complex, please "
"file a detailed issue on"
" https://github.com/fail2ban/fail2ban/issues "
"in order to get support for this format."
% ("\n".join(failRegex.getMatchedLines()), timeText))
failRegex.search(self.__lineBuffer, orgBuffer)
if not failRegex.hasMatched():
continue
# The failregex matched.
logSys.log(7, "Matched %s", failRegex)
# Checks if we must ignore this match.
if self.ignoreLine(failRegex.getMatchedTupleLines()) \
is not None:
# The ignoreregex matched. Remove ignored match.
self.__lineBuffer = failRegex.getUnmatchedTupleLines()
logSys.log(7, "Matched ignoreregex and was ignored")
if not self.checkAllRegex:
break
else:
self.__lineBuffer = failRegex.getUnmatchedTupleLines()
# retrieve failure-id, host, etc from failure match:
raw = returnRawHost
try:
fail = failRegex.getGroups()
# failure-id:
fid = fail.get('fid')
# ip-address or host:
host = fail.get('ip4') or fail.get('ip6')
if host is not None:
raw = True
else:
host = fail.get('dns')
if host is None:
continue
if date is None:
logSys.warning(
"Found a match for %r but no valid date/time "
"found for %r. Please try setting a custom "
"date pattern (see man page jail.conf(5)). "
"If format is complex, please "
"file a detailed issue on"
" https://github.com/fail2ban/fail2ban/issues "
"in order to get support for this format.",
"\n".join(failRegex.getMatchedLines()), timeText)
continue
self.__lineBuffer = failRegex.getUnmatchedTupleLines()
# retrieve failure-id, host, etc from failure match:
try:
raw = returnRawHost
if preGroups:
fail = preGroups.copy()
fail.update(failRegex.getGroups())
else:
fail = failRegex.getGroups()
# first try to check we have mlfid case (caching of connection id by multi-line):
mlfid = fail.get('mlfid')
if mlfid is not None:
fail = self._mergeFailure(mlfid, fail, failRegex)
else:
# matched lines:
fail["matches"] = fail.get("matches", []) + failRegex.getMatchedTupleLines()
# failure-id:
fid = fail.get('fid')
# ip-address or host:
host = fail.get('ip4')
if host is not None:
cidr = IPAddr.FAM_IPv4
raw = True
else:
host = fail.get('ip6')
if host is not None:
cidr = IPAddr.FAM_IPv6
raw = True
if host is None:
host = fail.get('dns')
if host is None:
# first try to check we have mlfid case (cache connection id):
if fid is None:
if mlfid:
fail = self._mergeFailure(mlfid, fail, failRegex)
else:
# if no failure-id also (obscure case, wrong regex), throw error inside getFailID:
if fid is None:
fid = failRegex.getFailID()
host = fid
cidr = IPAddr.CIDR_RAW
# if raw - add single ip or failure-id,
# otherwise expand host to multiple ips using dns (or ignore it if not valid):
if raw:
ip = IPAddr(host, cidr)
# check host equal failure-id, if not - failure with complex id:
if fid is not None and fid != host:
ip = IPAddr(fid, IPAddr.CIDR_RAW)
failList.append([failRegexIndex, ip, date,
failRegex.getMatchedLines(), fail])
if not self.checkAllRegex:
break
else:
ips = DNSUtils.textToIp(host, self.__useDns)
if ips:
for ip in ips:
failList.append([failRegexIndex, ip, date,
failRegex.getMatchedLines(), fail])
if not self.checkAllRegex:
break
except RegexException as e: # pragma: no cover - unsure if reachable
logSys.error(e)
fid = failRegex.getFailID()
host = fid
cidr = IPAddr.CIDR_RAW
# if mlfid case (not failure):
if host is None:
if not self.checkAllRegex: # or fail.get('nofail'):
return failList
ips = [None]
# if raw - add single ip or failure-id,
# otherwise expand host to multiple ips using dns (or ignore it if not valid):
elif raw:
ip = IPAddr(host, cidr)
# check host equal failure-id, if not - failure with complex id:
if fid is not None and fid != host:
ip = IPAddr(fid, IPAddr.CIDR_RAW)
ips = [ip]
# otherwise, try to use dns conversion:
else:
ips = DNSUtils.textToIp(host, self.__useDns)
# append failure with match to the list:
for ip in ips:
failList.append([failRegexIndex, ip, date, fail])
if not self.checkAllRegex:
break
except RegexException as e: # pragma: no cover - unsure if reachable
logSys.error(e)
return failList
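The pre-filter branch in `findFailure` above can be reduced to a two-stage sketch (function and variable names are illustrative, not the actual fail2ban flow): the `prefregex` runs first, its `<content>` group replaces the buffer, and only then does the failregex list run against the pre-extracted content:

```python
import re

# Simplified two-stage matching sketch: prefregex runs once per line and
# its <content> group replaces the buffer, so the failregex list only
# ever sees the pre-extracted content (cheaper than N full-line regexes).
def find_failure(line, prefregex, failregexes):
    m = re.search(prefregex, line)
    if not m:                          # pre-filter did not match: no failure
        return None
    content = m.groupdict().get('content', line)
    for fre in failregexes:
        fm = re.search(fre, content)
        if fm:                         # merge pre-filter and failregex groups
            groups = dict(m.groupdict(), **fm.groupdict())
            groups.pop('content', None)
            return groups
    return None
```

This is where the speedup claimed in the changelog comes from: the common line prefix is matched once instead of once per failregex.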
def status(self, flavor="basic"):
@ -720,7 +803,7 @@ class FileFilter(Filter):
db = self.jail.database
if db is not None:
db.updateLog(self.jail, log)
logSys.info("Removed logfile: %r" % path)
logSys.info("Removed logfile: %r", path)
self._delLogPath(path)
return
@ -785,7 +868,7 @@ class FileFilter(Filter):
def getFailures(self, filename):
log = self.getLog(filename)
if log is None:
logSys.error("Unable to get failures in " + filename)
logSys.error("Unable to get failures in %s", filename)
return False
# We should always close log (file), otherwise may be locked (log-rotate, etc.)
try:
@ -794,11 +877,11 @@ class FileFilter(Filter):
has_content = log.open()
# see http://python.org/dev/peps/pep-3151/
except IOError as e:
logSys.error("Unable to open %s" % filename)
logSys.error("Unable to open %s", filename)
logSys.exception(e)
return False
except OSError as e: # pragma: no cover - requires race condition to trigger this
logSys.error("Error opening %s" % filename)
logSys.error("Error opening %s", filename)
logSys.exception(e)
return False
except Exception as e: # pragma: no cover - Requires implementation error in FileContainer to generate
@ -1019,7 +1102,7 @@ class FileContainer:
## sys.stdout.flush()
# Compare hash and inode
if self.__hash != myHash or self.__ino != stats.st_ino:
logSys.info("Log rotation detected for %s" % self.__filename)
logSys.info("Log rotation detected for %s", self.__filename)
self.__hash = myHash
self.__ino = stats.st_ino
self.__pos = 0

View File

@ -64,16 +64,19 @@ class DNSUtils:
if ips is not None:
return ips
# retrieve ips
try:
ips = list()
for result in socket.getaddrinfo(dns, None, 0, 0, socket.IPPROTO_TCP):
ip = IPAddr(result[4][0])
if ip.isValid:
ips.append(ip)
except socket.error as e:
# todo: make configurable the expired time of cache entry:
logSys.warning("Unable to find a corresponding IP address for %s: %s", dns, e)
ips = list()
ips = list()
saveerr = None
for fam, ipfam in ((socket.AF_INET, IPAddr.FAM_IPv4), (socket.AF_INET6, IPAddr.FAM_IPv6)):
try:
for result in socket.getaddrinfo(dns, None, fam, 0, socket.IPPROTO_TCP):
ip = IPAddr(result[4][0], ipfam)
if ip.isValid:
ips.append(ip)
except socket.error as e:
saveerr = e
if not ips and saveerr:
logSys.warning("Unable to find a corresponding IP address for %s: %s", dns, saveerr)
DNSUtils.CACHE_nameToIp.set(dns, ips)
return ips
@ -140,6 +143,8 @@ class IPAddr(object):
CIDR_RAW = -2
CIDR_UNSPEC = -1
FAM_IPv4 = CIDR_RAW - socket.AF_INET
FAM_IPv6 = CIDR_RAW - socket.AF_INET6
def __new__(cls, ipstr, cidr=CIDR_UNSPEC):
# check already cached as IPAddr
@ -191,7 +196,11 @@ class IPAddr(object):
self._raw = ipstr
# if not raw - recognize family, set addr, etc.:
if cidr != IPAddr.CIDR_RAW:
for family in [socket.AF_INET, socket.AF_INET6]:
if cidr is not None and cidr < IPAddr.CIDR_RAW:
family = [IPAddr.CIDR_RAW - cidr]
else:
family = [socket.AF_INET, socket.AF_INET6]
for family in family:
try:
binary = socket.inet_pton(family, ipstr)
self._family = family
@ -346,7 +355,7 @@ class IPAddr(object):
return socket.inet_ntop(self._family, binary) + add
def getPTR(self, suffix=""):
def getPTR(self, suffix=None):
""" return the DNS PTR string of the provided IP address object
If "suffix" is provided it will be appended as the second and top
@ -356,11 +365,11 @@ class IPAddr(object):
"""
if self.isIPv4:
exploded_ip = self.ntoa.split(".")
if not suffix:
if suffix is None:
suffix = "in-addr.arpa."
elif self.isIPv6:
exploded_ip = self.hexdump
if not suffix:
if suffix is None:
suffix = "ip6.arpa."
else:
return ""

View File

@ -402,6 +402,14 @@ class Server:
def getIgnoreCommand(self, name):
return self.__jails[name].filter.getIgnoreCommand()
def setPrefRegex(self, name, value):
flt = self.__jails[name].filter
logSys.debug(" prefregex: %r", value)
flt.prefRegex = value
def getPrefRegex(self, name):
return self.__jails[name].filter.prefRegex
def addFailRegex(self, name, value, multiple=False):
flt = self.__jails[name].filter
if not multiple: value = (value,)

View File

@ -54,7 +54,9 @@ class Ticket(object):
self._time = time if time is not None else MyTime.time()
self._data = {'matches': matches or [], 'failures': 0}
if data is not None:
self._data.update(data)
for k,v in data.iteritems():
if v is not None:
self._data[k] = v
if ticket:
# ticket available - copy whole information from ticket:
self.__dict__.update(i for i in ticket.__dict__.iteritems() if i[0] in self.__dict__)
@ -135,7 +137,8 @@ class Ticket(object):
self._data['matches'] = matches or []
def getMatches(self):
return self._data.get('matches', [])
return [(line if isinstance(line, basestring) else "".join(line)) \
for line in self._data.get('matches', ())]
@property
def restored(self):
@ -232,7 +235,11 @@ class FailTicket(Ticket):
self.__retry += count
self._data['failures'] += attempt
if matches:
self._data['matches'] += matches
# we should duplicate "matches", because possibly referenced to multiple tickets:
if self._data['matches']:
self._data['matches'] = self._data['matches'] + matches
else:
self._data['matches'] = matches
def setLastTime(self, value):
if value > self._time:

View File

@ -221,6 +221,10 @@ class Transmitter:
value = command[2:]
self.__server.delJournalMatch(name, value)
return self.__server.getJournalMatch(name)
elif command[1] == "prefregex":
value = command[2]
self.__server.setPrefRegex(name, value)
return self.__server.getPrefRegex(name)
elif command[1] == "addfailregex":
value = command[2]
self.__server.addFailRegex(name, value, multiple=multiple)
@ -346,6 +350,8 @@ class Transmitter:
return self.__server.getIgnoreIP(name)
elif command[1] == "ignorecommand":
return self.__server.getIgnoreCommand(name)
elif command[1] == "prefregex":
return self.__server.getPrefRegex(name)
elif command[1] == "failregex":
return self.__server.getFailRegex(name)
elif command[1] == "ignoreregex":

View File

@ -98,6 +98,12 @@ class Utils():
cache.popitem()
cache[k] = (v, t + self.maxTime)
def unset(self, k):
try:
del self._cache[k]
except KeyError: # pragma: no cover
pass
@staticmethod
def setFBlockMode(fhandle, value):

View File

@ -29,7 +29,7 @@ import tempfile
import time
import unittest
from ..server.action import CommandAction, CallingMap
from ..server.action import CommandAction, CallingMap, substituteRecursiveTags
from ..server.actions import OrderedDict
from ..server.utils import Utils
@ -40,12 +40,20 @@ class CommandActionTest(LogCaptureTestCase):
def setUp(self):
"""Call before every test case."""
self.__action = CommandAction(None, "Test")
LogCaptureTestCase.setUp(self)
self.__action = CommandAction(None, "Test")
# prevent executing stop if start failed (or was never started at all):
self.__action_started = False
orgstart = self.__action.start
def _action_start():
self.__action_started = True
return orgstart()
self.__action.start = _action_start
def tearDown(self):
"""Call after every test case."""
self.__action.stop()
if self.__action_started:
self.__action.stop()
LogCaptureTestCase.tearDown(self)
def testSubstituteRecursiveTags(self):
@ -56,30 +64,30 @@ class CommandActionTest(LogCaptureTestCase):
}
# Recursion is bad
self.assertRaises(ValueError,
lambda: CommandAction.substituteRecursiveTags({'A': '<A>'}))
lambda: substituteRecursiveTags({'A': '<A>'}))
self.assertRaises(ValueError,
lambda: CommandAction.substituteRecursiveTags({'A': '<B>', 'B': '<A>'}))
lambda: substituteRecursiveTags({'A': '<B>', 'B': '<A>'}))
self.assertRaises(ValueError,
lambda: CommandAction.substituteRecursiveTags({'A': '<B>', 'B': '<C>', 'C': '<A>'}))
lambda: substituteRecursiveTags({'A': '<B>', 'B': '<C>', 'C': '<A>'}))
# Unresolvable substitution
self.assertRaises(ValueError,
lambda: CommandAction.substituteRecursiveTags({'A': 'to=<B> fromip=<IP>', 'C': '<B>', 'B': '<C>', 'D': ''}))
lambda: substituteRecursiveTags({'A': 'to=<B> fromip=<IP>', 'C': '<B>', 'B': '<C>', 'D': ''}))
self.assertRaises(ValueError,
lambda: CommandAction.substituteRecursiveTags({'failregex': 'to=<honeypot> fromip=<IP>', 'sweet': '<honeypot>', 'honeypot': '<sweet>', 'ignoreregex': ''}))
lambda: substituteRecursiveTags({'failregex': 'to=<honeypot> fromip=<IP>', 'sweet': '<honeypot>', 'honeypot': '<sweet>', 'ignoreregex': ''}))
# We need here an ordered, because the sequence of iteration is very important for this test
if OrderedDict:
# No cyclic recursion, just multiple replacement of tag <T>, should be successful:
self.assertEqual(CommandAction.substituteRecursiveTags( OrderedDict(
self.assertEqual(substituteRecursiveTags( OrderedDict(
(('X', 'x=x<T>'), ('T', '1'), ('Z', '<X> <T> <Y>'), ('Y', 'y=y<T>')))
), {'X': 'x=x1', 'T': '1', 'Y': 'y=y1', 'Z': 'x=x1 1 y=y1'}
)
# No cyclic recursion, just multiple replacement of tag <T> in composite tags, should be successful:
self.assertEqual(CommandAction.substituteRecursiveTags( OrderedDict(
self.assertEqual(substituteRecursiveTags( OrderedDict(
(('X', 'x=x<T> <Z> <<R1>> <<R2>>'), ('R1', 'Z'), ('R2', 'Y'), ('T', '1'), ('Z', '<T> <Y>'), ('Y', 'y=y<T>')))
), {'X': 'x=x1 1 y=y1 1 y=y1 y=y1', 'R1': 'Z', 'R2': 'Y', 'T': '1', 'Z': '1 y=y1', 'Y': 'y=y1'}
)
# No cyclic recursion, just multiple replacement of same tags, should be successful:
self.assertEqual(CommandAction.substituteRecursiveTags( OrderedDict((
self.assertEqual(substituteRecursiveTags( OrderedDict((
('actionstart', 'ipset create <ipmset> hash:ip timeout <bantime> family <ipsetfamily>\n<iptables> -I <chain> <actiontype>'),
('ipmset', 'f2b-<name>'),
('name', 'any'),
@ -111,42 +119,42 @@ class CommandActionTest(LogCaptureTestCase):
))
)
# Cyclic recursion by composite tag creation, tags "create" another tag, that closes cycle:
self.assertRaises(ValueError, lambda: CommandAction.substituteRecursiveTags( OrderedDict((
self.assertRaises(ValueError, lambda: substituteRecursiveTags( OrderedDict((
('A', '<<B><C>>'),
('B', 'D'), ('C', 'E'),
('DE', 'cycle <A>'),
)) ))
self.assertRaises(ValueError, lambda: CommandAction.substituteRecursiveTags( OrderedDict((
self.assertRaises(ValueError, lambda: substituteRecursiveTags( OrderedDict((
('DE', 'cycle <A>'),
('A', '<<B><C>>'),
('B', 'D'), ('C', 'E'),
)) ))
# missing tags are ok
self.assertEqual(CommandAction.substituteRecursiveTags({'A': '<C>'}), {'A': '<C>'})
self.assertEqual(CommandAction.substituteRecursiveTags({'A': '<C> <D> <X>','X':'fun'}), {'A': '<C> <D> fun', 'X':'fun'})
self.assertEqual(CommandAction.substituteRecursiveTags({'A': '<C> <B>', 'B': 'cool'}), {'A': '<C> cool', 'B': 'cool'})
self.assertEqual(substituteRecursiveTags({'A': '<C>'}), {'A': '<C>'})
self.assertEqual(substituteRecursiveTags({'A': '<C> <D> <X>','X':'fun'}), {'A': '<C> <D> fun', 'X':'fun'})
self.assertEqual(substituteRecursiveTags({'A': '<C> <B>', 'B': 'cool'}), {'A': '<C> cool', 'B': 'cool'})
# Escaped tags should be ignored
self.assertEqual(CommandAction.substituteRecursiveTags({'A': '<matches> <B>', 'B': 'cool'}), {'A': '<matches> cool', 'B': 'cool'})
self.assertEqual(substituteRecursiveTags({'A': '<matches> <B>', 'B': 'cool'}), {'A': '<matches> cool', 'B': 'cool'})
# Multiple stuff on same line is ok
self.assertEqual(CommandAction.substituteRecursiveTags({'failregex': 'to=<honeypot> fromip=<IP> evilperson=<honeypot>', 'honeypot': 'pokie', 'ignoreregex': ''}),
self.assertEqual(substituteRecursiveTags({'failregex': 'to=<honeypot> fromip=<IP> evilperson=<honeypot>', 'honeypot': 'pokie', 'ignoreregex': ''}),
{ 'failregex': "to=pokie fromip=<IP> evilperson=pokie",
'honeypot': 'pokie',
'ignoreregex': '',
})
# rest is just cool
self.assertEqual(CommandAction.substituteRecursiveTags(aInfo),
self.assertEqual(substituteRecursiveTags(aInfo),
{ 'HOST': "192.0.2.0",
'ABC': '123 192.0.2.0',
'xyz': '890 123 192.0.2.0',
})
# obscure embedded case
self.assertEqual(CommandAction.substituteRecursiveTags({'A': '<<PREF>HOST>', 'PREF': 'IPV4'}),
self.assertEqual(substituteRecursiveTags({'A': '<<PREF>HOST>', 'PREF': 'IPV4'}),
{'A': '<IPV4HOST>', 'PREF': 'IPV4'})
self.assertEqual(CommandAction.substituteRecursiveTags({'A': '<<PREF>HOST>', 'PREF': 'IPV4', 'IPV4HOST': '1.2.3.4'}),
self.assertEqual(substituteRecursiveTags({'A': '<<PREF>HOST>', 'PREF': 'IPV4', 'IPV4HOST': '1.2.3.4'}),
{'A': '1.2.3.4', 'PREF': 'IPV4', 'IPV4HOST': '1.2.3.4'})
# more embedded within a string and two interpolations
self.assertEqual(CommandAction.substituteRecursiveTags({'A': 'A <IP<PREF>HOST> B IP<PREF> C', 'PREF': 'V4', 'IPV4HOST': '1.2.3.4'}),
self.assertEqual(substituteRecursiveTags({'A': 'A <IP<PREF>HOST> B IP<PREF> C', 'PREF': 'V4', 'IPV4HOST': '1.2.3.4'}),
{'A': 'A 1.2.3.4 B IPV4 C', 'PREF': 'V4', 'IPV4HOST': '1.2.3.4'})
def testReplaceTag(self):
@ -186,7 +194,7 @@ class CommandActionTest(LogCaptureTestCase):
# Callable
self.assertEqual(
self.__action.replaceTag("09 <matches> 11",
CallingMap(matches=lambda: str(10))),
CallingMap(matches=lambda self: str(10))),
"09 10 11")
def testReplaceNoTag(self):
@ -194,7 +202,27 @@ class CommandActionTest(LogCaptureTestCase):
# Will raise ValueError if it is
self.assertEqual(
self.__action.replaceTag("abc",
CallingMap(matches=lambda: int("a"))), "abc")
CallingMap(matches=lambda self: int("a"))), "abc")
def testReplaceTagSelfRecursion(self):
setattr(self.__action, 'a', "<a")
setattr(self.__action, 'b', "c>")
setattr(self.__action, 'b?family=inet6', "b>")
setattr(self.__action, 'ac', "<a><b>")
setattr(self.__action, 'ab', "<ac>")
setattr(self.__action, 'x?family=inet6', "")
# self-referencing properties should raise an exception:
self.assertRaisesRegexp(ValueError, r"properties contain self referencing definitions",
lambda: self.__action.replaceTag("<a><b>",
self.__action._properties, conditional="family=inet4")
)
# remove self-referencing in props:
delattr(self.__action, 'ac')
# self-referencing query should raise an exception:
self.assertRaisesRegexp(ValueError, r"possible self referencing definitions in query",
lambda: self.__action.replaceTag("<x<x<x<x<x<x<x<x<x<x<x<x<x<x<x<x<x<x<x<x<x>>>>>>>>>>>>>>>>>>>>>",
self.__action._properties, conditional="family=inet6")
)
def testReplaceTagConditionalCached(self):
setattr(self.__action, 'abc', "123")
@ -217,10 +245,10 @@ class CommandActionTest(LogCaptureTestCase):
self.__action.replaceTag("<banaction> '<abc>'", self.__action._properties,
conditional="family=inet6", cache=cache),
"Text 890-567 text 567 '567'")
self.assertEqual(len(cache) if cache is not None else -1, 3)
self.assertTrue(len(cache) >= 3)
# set one parameter - internal properties and cache should be reset:
setattr(self.__action, 'xyz', "000-<abc>")
self.assertEqual(len(cache) if cache is not None else -1, 0)
self.assertEqual(len(cache), 0)
# test again, should have 000 instead of 890:
for i in range(2):
self.assertEqual(
@ -235,7 +263,7 @@ class CommandActionTest(LogCaptureTestCase):
self.__action.replaceTag("<banaction> '<abc>'", self.__action._properties,
conditional="family=inet6", cache=cache),
"Text 000-567 text 567 '567'")
self.assertEqual(len(cache), 3)
self.assertTrue(len(cache) >= 3)
def testExecuteActionBan(self):
@ -301,13 +329,24 @@ class CommandActionTest(LogCaptureTestCase):
self.assertEqual(self.__action.ROST,"192.0.2.0")
def testExecuteActionUnbanAinfo(self):
aInfo = {
aInfo = CallingMap({
'ABC': "123",
}
self.__action.actionban = "touch /tmp/fail2ban.test.123"
self.__action.actionunban = "rm /tmp/fail2ban.test.<ABC>"
'ip': '192.0.2.1',
'F-*': lambda self: {
'fid': 111,
'fport': 222,
'user': "tester"
}
})
self.__action.actionban = "touch /tmp/fail2ban.test.123; echo 'failure <F-ID> of <F-USER> -<F-TEST>- from <ip>:<F-PORT>'"
self.__action.actionunban = "rm /tmp/fail2ban.test.<ABC>; echo 'user <F-USER> unbanned'"
self.__action.ban(aInfo)
self.__action.unban(aInfo)
self.assertLogged(
" -- stdout: 'failure 111 of tester -- from 192.0.2.1:222'",
" -- stdout: 'user tester unbanned'",
all=True
)
def testExecuteActionStartEmpty(self):
self.__action.actionstart = ""
@ -403,7 +442,7 @@ class CommandActionTest(LogCaptureTestCase):
"stderr: 'The rain in Spain stays mainly in the plain'\n")
def testCallingMap(self):
mymap = CallingMap(callme=lambda: str(10), error=lambda: int('a'),
mymap = CallingMap(callme=lambda self: str(10), error=lambda self: int('a'),
dontcallme= "string", number=17)
# Should work fine
@ -412,3 +451,43 @@ class CommandActionTest(LogCaptureTestCase):
"10 okay string 17")
# Error will now trip, demonstrating delayed call
self.assertRaises(ValueError, lambda x: "%(error)i" % x, mymap)
def testCallingMapModify(self):
m = CallingMap({
'a': lambda self: 2 + 3,
'b': lambda self: self['a'] + 6,
'c': 'test',
})
# test reset (without modifications):
m.reset()
# do modifications:
m['a'] = 4
del m['c']
# test set and delete:
self.assertEqual(len(m), 2)
self.assertNotIn('c', m)
self.assertEqual((m['a'], m['b']), (4, 10))
# reset to original and test again:
m.reset()
s = repr(m)
self.assertEqual(len(m), 3)
self.assertIn('c', m)
self.assertEqual((m['a'], m['b'], m['c']), (5, 11, 'test'))
def testCallingMapRep(self):
m = CallingMap({
'a': lambda self: 2 + 3,
'b': lambda self: self['a'] + 6,
'c': ''
})
s = repr(m)
self.assertIn("'a': 5", s)
self.assertIn("'b': 11", s)
self.assertIn("'c': ''", s)
m['c'] = lambda self: self['xxx'] + 7 # unresolvable
s = repr(m)
self.assertIn("'a': 5", s)
self.assertIn("'b': 11", s)
self.assertIn("'c': ", s) # presents as callable
self.assertNotIn("'c': ''", s) # but not empty
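The `CallingMap` behaviour exercised by the tests above (callables now receive the map itself as `self`, enabling entries that reference other entries) can be sketched with a minimal lazy dict. `LazyMap` is an illustrative name, not the real class, which additionally supports `reset()` and caching:

```python
class LazyMap(dict):
    """Minimal sketch of a CallingMap-like lazy dict: callable values are
    resolved on access and receive the map itself (assumed semantics;
    the real CallingMap also caches results and supports reset())."""
    def __getitem__(self, key):
        value = dict.__getitem__(self, key)
        if callable(value):
            value = value(self)   # pass the map, so entries can self-reference
        return value
```

An entry like `'b': lambda self: self['a'] + 6` is therefore only evaluated (and only fails) when `m['b']` is actually read, matching the delayed-call behaviour demonstrated in `testCallingMap`.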

View File

@ -0,0 +1,76 @@
# Fail2Ban obsolete multi-line example and test filter (previously sshd.conf)
#
[INCLUDES]
# Read common prefixes. If any customizations are available, read them from
# common.local
before = ../../../../config/filter.d/common.conf
[DEFAULT]
_daemon = sshd
# optional prefix (logged from several ssh versions) like "error: ", "error: PAM: " or "fatal: "
__pref = (?:(?:error|fatal): (?:PAM: )?)?
# optional suffix (logged from several ssh versions) like " [preauth]"
__suff = (?: \[preauth\])?\s*
__on_port_opt = (?: port \d+)?(?: on \S+(?: port \d+)?)?
# single line prefix:
__prefix_line_sl = %(__prefix_line)s%(__pref)s
# multi line prefixes (for first and second lines):
__prefix_line_ml1 = (?P<__prefix>%(__prefix_line)s)%(__pref)s
__prefix_line_ml2 = %(__suff)s$<SKIPLINES>^(?P=__prefix)%(__pref)s
mode = %(normal)s
normal = ^%(__prefix_line_sl)s[aA]uthentication (?:failure|error|failed) for .* from <HOST>( via \S+)?\s*%(__suff)s$
^%(__prefix_line_sl)sUser not known to the underlying authentication module for .* from <HOST>\s*%(__suff)s$
^%(__prefix_line_sl)sFailed \S+ for (?P<cond_inv>invalid user )?(?P<user>(?P<cond_user>\S+)|(?(cond_inv)(?:(?! from ).)*?|[^:]+)) from <HOST>%(__on_port_opt)s(?: ssh\d*)?(?(cond_user): |(?:(?:(?! from ).)*)$)
^%(__prefix_line_sl)sROOT LOGIN REFUSED.* FROM <HOST>\s*%(__suff)s$
^%(__prefix_line_sl)s[iI](?:llegal|nvalid) user .*? from <HOST>%(__on_port_opt)s\s*$
^%(__prefix_line_sl)sUser .+ from <HOST> not allowed because not listed in AllowUsers\s*%(__suff)s$
^%(__prefix_line_sl)sUser .+ from <HOST> not allowed because listed in DenyUsers\s*%(__suff)s$
^%(__prefix_line_sl)sUser .+ from <HOST> not allowed because not in any group\s*%(__suff)s$
^%(__prefix_line_sl)srefused connect from \S+ \(<HOST>\)\s*%(__suff)s$
^%(__prefix_line_sl)sReceived disconnect from <HOST>%(__on_port_opt)s:\s*3: .*: Auth fail%(__suff)s$
^%(__prefix_line_sl)sUser .+ from <HOST> not allowed because a group is listed in DenyGroups\s*%(__suff)s$
^%(__prefix_line_sl)sUser .+ from <HOST> not allowed because none of user's groups are listed in AllowGroups\s*%(__suff)s$
^%(__prefix_line_sl)spam_unix\(sshd:auth\):\s+authentication failure;\s*logname=\S*\s*uid=\d*\s*euid=\d*\s*tty=\S*\s*ruser=\S*\s*rhost=<HOST>\s.*%(__suff)s$
^%(__prefix_line_sl)s(error: )?maximum authentication attempts exceeded for .* from <HOST>%(__on_port_opt)s(?: ssh\d*)? \[preauth\]$
^%(__prefix_line_ml1)sUser .+ not allowed because account is locked%(__prefix_line_ml2)sReceived disconnect from <HOST>: 11: .+%(__suff)s$
^%(__prefix_line_ml1)sDisconnecting: Too many authentication failures for .+?%(__prefix_line_ml2)sConnection closed by <HOST>%(__suff)s$
^%(__prefix_line_ml1)sConnection from <HOST>%(__on_port_opt)s%(__prefix_line_ml2)sDisconnecting: Too many authentication failures for .+%(__suff)s$
ddos = ^%(__prefix_line_sl)sDid not receive identification string from <HOST>%(__suff)s$
^%(__prefix_line_sl)sReceived disconnect from <HOST>%(__on_port_opt)s:\s*14: No supported authentication methods available%(__suff)s$
^%(__prefix_line_sl)sUnable to negotiate with <HOST>%(__on_port_opt)s: no matching (?:cipher|key exchange method) found.
^%(__prefix_line_ml1)sConnection from <HOST>%(__on_port_opt)s%(__prefix_line_ml2)sUnable to negotiate a (?:cipher|key exchange method)%(__suff)s$
^%(__prefix_line_ml1)sSSH: Server;Ltype: (?:Authname|Version|Kex);Remote: <HOST>-\d+;[A-Z]\w+:.*%(__prefix_line_ml2)sRead from socket failed: Connection reset by peer%(__suff)s$
aggressive = %(normal)s
%(ddos)s
[Definition]
failregex = %(mode)s
ignoreregex =
# "maxlines" is the number of log lines to buffer for multi-line regex searches
maxlines = 10
journalmatch = _SYSTEMD_UNIT=sshd.service + _COMM=sshd
datepattern = {^LN-BEG}
# DEV Notes:
#
# "Failed \S+ for .*? from <HOST>..." failregex uses a non-greedy catch-all because
# it comes before <HOST>, which is not hard-anchored at the end either,
# and later catch-alls could contain user-provided input, which needs to be
# greedily matched away first.
#
# Author: Cyril Jaquier, Yaroslav Halchenko, Petr Voralek, Daniel Black
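To illustrate the greediness concern from the DEV note above with a hypothetical pattern (not the filter's actual failregex): when user-supplied text may itself contain " from ", the greediness of the catch-all decides which occurrence binds the host.

```python
import re

# Hypothetical log line and patterns for illustration only; the injected
# "user from 192.0.2.1" part is attacker-controlled text.
line = "Failed password for user from 192.0.2.1 from 203.0.113.9"
greedy = re.search(r"Failed \S+ for .* from (?P<host>\S+)", line)
lazy = re.search(r"Failed \S+ for .*? from (?P<host>\S+)", line)
# greedy catch-all binds <host> to the LAST " from ", non-greedy to the FIRST
print(greedy.group('host'), lazy.group('host'))
```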

View File

@ -769,9 +769,10 @@ class Fail2banServerTest(Fail2banClientServerBase):
"[Definition]",
"norestored = %(_exec_once)s",
"restore = ",
"info = ",
"actionstart = echo '[%(name)s] %(actname)s: ** start'", start,
"actionreload = echo '[%(name)s] %(actname)s: .. reload'", reload,
"actionban = echo '[%(name)s] %(actname)s: ++ ban <ip> %(restore)s'", ban,
"actionban = echo '[%(name)s] %(actname)s: ++ ban <ip> %(restore)s%(info)s'", ban,
"actionunban = echo '[%(name)s] %(actname)s: -- unban <ip>'", unban,
"actionstop = echo '[%(name)s] %(actname)s: __ stop'", stop,
)
@ -785,28 +786,28 @@ class Fail2banServerTest(Fail2banClientServerBase):
"usedns = no",
"maxretry = 3",
"findtime = 10m",
"failregex = ^\s*failure (401|403) from <HOST>",
"failregex = ^\s*failure <F-ERRCODE>401|403</F-ERRCODE> from <HOST>",
"datepattern = {^LN-BEG}EPOCH",
"",
"[test-jail1]", "backend = " + backend, "filter =",
"action = ",
" test-action1[name='%(__name__)s']" \
if 1 in actions else "",
" test-action2[name='%(__name__)s', restore='restored: <restored>']" \
" test-action2[name='%(__name__)s', restore='restored: <restored>', info=', err-code: <F-ERRCODE>']" \
if 2 in actions else "",
" test-action2[name='%(__name__)s', actname=test-action3, _exec_once=1, restore='restored: <restored>']" \
if 3 in actions else "",
"logpath = " + test1log,
" " + test2log if 2 in enabled else "",
" " + test3log if 2 in enabled else "",
"failregex = ^\s*failure (401|403) from <HOST>",
" ^\s*error (401|403) from <HOST>" \
"failregex = ^\s*failure <F-ERRCODE>401|403</F-ERRCODE> from <HOST>",
" ^\s*error <F-ERRCODE>401|403</F-ERRCODE> from <HOST>" \
if 2 in enabled else "",
"enabled = true" if 1 in enabled else "",
"",
"[test-jail2]", "backend = " + backend, "filter =",
"action = ",
" test-action2[name='%(__name__)s', restore='restored: <restored>']" \
" test-action2[name='%(__name__)s', restore='restored: <restored>', info=', err-code: <F-ERRCODE>']" \
if 2 in actions else "",
" test-action2[name='%(__name__)s', actname=test-action3, _exec_once=1, restore='restored: <restored>']" \
if 3 in actions else "",
@ -845,7 +846,7 @@ class Fail2banServerTest(Fail2banClientServerBase):
"stdout: '[test-jail1] test-action2: ** start'", all=True)
# test restored is 0 (both actions available):
self.assertLogged(
"stdout: '[test-jail1] test-action2: ++ ban 192.0.2.1 restored: 0'",
"stdout: '[test-jail1] test-action2: ++ ban 192.0.2.1 restored: 0, err-code: 401'",
"stdout: '[test-jail1] test-action3: ++ ban 192.0.2.1 restored: 0'",
all=True, wait=MID_WAITTIME)
@ -968,8 +969,8 @@ class Fail2banServerTest(Fail2banClientServerBase):
)
# test restored is 1 (only test-action2):
self.assertLogged(
"stdout: '[test-jail2] test-action2: ++ ban 192.0.2.4 restored: 1'",
"stdout: '[test-jail2] test-action2: ++ ban 192.0.2.8 restored: 1'",
"stdout: '[test-jail2] test-action2: ++ ban 192.0.2.4 restored: 1, err-code: 401'",
"stdout: '[test-jail2] test-action2: ++ ban 192.0.2.8 restored: 1, err-code: 401'",
all=True, wait=MID_WAITTIME)
# test test-action3 not executed at all (norestored check):
self.assertNotLogged(

View File

@ -27,17 +27,18 @@ import os
import sys
from ..client import fail2banregex
from ..client.fail2banregex import Fail2banRegex, get_opt_parser, exec_command_line, output
from ..client.fail2banregex import Fail2banRegex, get_opt_parser, exec_command_line, output, str2LogLevel
from .utils import setUpMyTime, tearDownMyTime, LogCaptureTestCase, logSys
from .utils import CONFIG_DIR
fail2banregex.logSys = logSys
def _test_output(*args):
logSys.info(args[0])
logSys.notice(args[0])
fail2banregex.output = _test_output
TEST_CONFIG_DIR = os.path.join(os.path.dirname(__file__), "config")
TEST_FILES_DIR = os.path.join(os.path.dirname(__file__), "files")
DEV_NULL = None
@ -45,6 +46,9 @@ DEV_NULL = None
def _Fail2banRegex(*args):
parser = get_opt_parser()
(opts, args) = parser.parse_args(list(args))
# lower the log level if requested, to suppress excessive debug messages:
if opts.log_level in ("notice", "warning"):
logSys.setLevel(str2LogLevel(opts.log_level))
return (opts, args, Fail2banRegex(opts))
class ExitException(Exception):
@ -80,7 +84,13 @@ class Fail2banRegexTest(LogCaptureTestCase):
FILENAME_02 = os.path.join(TEST_FILES_DIR, "testcase02.log")
FILENAME_WRONGCHAR = os.path.join(TEST_FILES_DIR, "testcase-wrong-char.log")
FILENAME_SSHD = os.path.join(TEST_FILES_DIR, "logs", "sshd")
FILTER_SSHD = os.path.join(CONFIG_DIR, 'filter.d', 'sshd.conf')
FILENAME_ZZZ_SSHD = os.path.join(TEST_FILES_DIR, 'zzz-sshd-obsolete-multiline.log')
FILTER_ZZZ_SSHD = os.path.join(TEST_CONFIG_DIR, 'filter.d', 'zzz-sshd-obsolete-multiline.conf')
FILENAME_ZZZ_GEN = os.path.join(TEST_FILES_DIR, "logs", "zzz-generic-example")
FILTER_ZZZ_GEN = os.path.join(TEST_CONFIG_DIR, 'filter.d', 'zzz-generic-example.conf')
def setUp(self):
"""Call before every test case."""
@ -96,7 +106,7 @@ class Fail2banRegexTest(LogCaptureTestCase):
(opts, args, fail2banRegex) = _Fail2banRegex(
"test", r".** from <HOST>$"
)
self.assertFalse(fail2banRegex.start(opts, args))
self.assertFalse(fail2banRegex.start(args))
self.assertLogged("Unable to compile regular expression")
def testWrongIngnoreRE(self):
@ -104,7 +114,7 @@ class Fail2banRegexTest(LogCaptureTestCase):
"--datepattern", "{^LN-BEG}EPOCH",
"test", r".*? from <HOST>$", r".**"
)
self.assertFalse(fail2banRegex.start(opts, args))
self.assertFalse(fail2banRegex.start(args))
self.assertLogged("Unable to compile regular expression")
def testDirectFound(self):
@ -114,7 +124,7 @@ class Fail2banRegexTest(LogCaptureTestCase):
"Dec 31 11:59:59 [sshd] error: PAM: Authentication failure for kevin from 192.0.2.0",
r"Authentication failure for .*? from <HOST>$"
)
self.assertTrue(fail2banRegex.start(opts, args))
self.assertTrue(fail2banRegex.start(args))
self.assertLogged('Lines: 1 lines, 0 ignored, 1 matched, 0 missed')
def testDirectNotFound(self):
@ -123,7 +133,7 @@ class Fail2banRegexTest(LogCaptureTestCase):
"Dec 31 11:59:59 [sshd] error: PAM: Authentication failure for kevin from 192.0.2.0",
r"XYZ from <HOST>$"
)
self.assertTrue(fail2banRegex.start(opts, args))
self.assertTrue(fail2banRegex.start(args))
self.assertLogged('Lines: 1 lines, 0 ignored, 0 matched, 1 missed')
def testDirectIgnored(self):
@ -133,7 +143,7 @@ class Fail2banRegexTest(LogCaptureTestCase):
r"Authentication failure for .*? from <HOST>$",
r"kevin from 192.0.2.0$"
)
self.assertTrue(fail2banRegex.start(opts, args))
self.assertTrue(fail2banRegex.start(args))
self.assertLogged('Lines: 1 lines, 1 ignored, 0 matched, 0 missed')
def testDirectRE_1(self):
@ -143,7 +153,7 @@ class Fail2banRegexTest(LogCaptureTestCase):
Fail2banRegexTest.FILENAME_01,
Fail2banRegexTest.RE_00
)
self.assertTrue(fail2banRegex.start(opts, args))
self.assertTrue(fail2banRegex.start(args))
self.assertLogged('Lines: 19 lines, 0 ignored, 13 matched, 6 missed')
self.assertLogged('Error decoding line');
@ -159,7 +169,7 @@ class Fail2banRegexTest(LogCaptureTestCase):
Fail2banRegexTest.FILENAME_01,
Fail2banRegexTest.RE_00
)
self.assertTrue(fail2banRegex.start(opts, args))
self.assertTrue(fail2banRegex.start(args))
self.assertLogged('Lines: 19 lines, 0 ignored, 16 matched, 3 missed')
def testDirectRE_1raw_noDns(self):
@ -169,7 +179,7 @@ class Fail2banRegexTest(LogCaptureTestCase):
Fail2banRegexTest.FILENAME_01,
Fail2banRegexTest.RE_00
)
self.assertTrue(fail2banRegex.start(opts, args))
self.assertTrue(fail2banRegex.start(args))
self.assertLogged('Lines: 19 lines, 0 ignored, 13 matched, 6 missed')
def testDirectRE_2(self):
@ -179,7 +189,7 @@ class Fail2banRegexTest(LogCaptureTestCase):
Fail2banRegexTest.FILENAME_02,
Fail2banRegexTest.RE_00
)
self.assertTrue(fail2banRegex.start(opts, args))
self.assertTrue(fail2banRegex.start(args))
self.assertLogged('Lines: 13 lines, 0 ignored, 5 matched, 8 missed')
def testVerbose(self):
@ -189,18 +199,69 @@ class Fail2banRegexTest(LogCaptureTestCase):
Fail2banRegexTest.FILENAME_02,
Fail2banRegexTest.RE_00
)
self.assertTrue(fail2banRegex.start(opts, args))
self.assertTrue(fail2banRegex.start(args))
self.assertLogged('Lines: 13 lines, 0 ignored, 5 matched, 8 missed')
self.assertLogged('141.3.81.106 Sun Aug 14 11:53:59 2005')
self.assertLogged('141.3.81.106 Sun Aug 14 11:54:59 2005')
def testWronChar(self):
def testVerboseFullSshd(self):
(opts, args, fail2banRegex) = _Fail2banRegex(
"-l", "notice", # lower log level to suppress excessive debug messages
"-v", "--verbose-date", "--print-all-matched",
Fail2banRegexTest.FILENAME_SSHD, Fail2banRegexTest.FILTER_SSHD
)
self.assertTrue(fail2banRegex.start(args))
# test that the failure line and the non-failure lines are both present:
self.assertLogged("[29116]: User root not allowed because account is locked",
"[29116]: Received disconnect from 1.2.3.4", all=True)
def testFastSshd(self):
(opts, args, fail2banRegex) = _Fail2banRegex(
"-l", "notice", # lower log level to suppress excessive debug messages
"--print-all-matched",
Fail2banRegexTest.FILENAME_ZZZ_SSHD, Fail2banRegexTest.FILTER_SSHD
)
self.assertTrue(fail2banRegex.start(args))
# test that the failure line and all non-failure lines are present:
self.assertLogged(
"[29116]: Connection from 192.0.2.4",
"[29116]: User root not allowed because account is locked",
"[29116]: Received disconnect from 192.0.2.4", all=True)
def testMultilineSshd(self):
# incidentally also tests handling of missed lines in multi-line mode (`for bufLine in orgLineBuffer[int(fullBuffer):]`)
(opts, args, fail2banRegex) = _Fail2banRegex(
"-l", "notice", # lower log level to suppress excessive debug messages
"--print-all-matched", "--print-all-missed",
Fail2banRegexTest.FILENAME_ZZZ_SSHD, Fail2banRegexTest.FILTER_ZZZ_SSHD
)
self.assertTrue(fail2banRegex.start(args))
# test that the "failure" line is present (2nd part only, since multi-line matching is less precise):
self.assertLogged(
"[29116]: Received disconnect from 192.0.2.4", all=True)
def testFullGeneric(self):
# incidentally also tests ignoreregex (specified in the filter file)
(opts, args, fail2banRegex) = _Fail2banRegex(
"-l", "notice", # lower log level to suppress excessive debug messages
Fail2banRegexTest.FILENAME_ZZZ_GEN, Fail2banRegexTest.FILTER_ZZZ_GEN
)
self.assertTrue(fail2banRegex.start(args))
def _reset(self):
# reset global warn-counter:
from ..server.filter import _decode_line_warn
_decode_line_warn.clear()
def testWronChar(self):
self._reset()
(opts, args, fail2banRegex) = _Fail2banRegex(
"-l", "notice", # lower log level to suppress excessive debug messages
"--datepattern", "^(?:%a )?%b %d %H:%M:%S(?:\.%f)?(?: %ExY)?",
Fail2banRegexTest.FILENAME_WRONGCHAR, Fail2banRegexTest.FILTER_SSHD
)
self.assertTrue(fail2banRegex.start(opts, args))
self.assertTrue(fail2banRegex.start(args))
self.assertLogged('Lines: 4 lines, 0 ignored, 2 matched, 2 missed')
self.assertLogged('Error decoding line')
@ -210,12 +271,15 @@ class Fail2banRegexTest(LogCaptureTestCase):
self.assertLogged('Nov 8 00:16:12 main sshd[32547]: pam_succeed_if(sshd:auth): error retrieving information about user llinco')
def testWronCharDebuggex(self):
self._reset()
(opts, args, fail2banRegex) = _Fail2banRegex(
"-l", "notice", # lower log level to suppress excessive debug messages
"--datepattern", "^(?:%a )?%b %d %H:%M:%S(?:\.%f)?(?: %ExY)?",
"--debuggex", "--print-all-matched",
Fail2banRegexTest.FILENAME_WRONGCHAR, Fail2banRegexTest.FILTER_SSHD
)
self.assertTrue(fail2banRegex.start(opts, args))
self.assertTrue(fail2banRegex.start(args))
self.assertLogged('Error decoding line')
self.assertLogged('Lines: 4 lines, 0 ignored, 2 matched, 2 missed')
self.assertLogged('https://')

View File

@ -1,17 +1,23 @@
# failJSON: { "time": "2005-02-07T15:10:42", "match": true , "host": "192.168.1.1" }
# failJSON: { "time": "2005-02-07T15:10:42", "match": true , "host": "192.168.1.1", "user": "sample-user" }
Feb 7 15:10:42 example pure-ftpd: (pam_unix) authentication failure; logname= uid=0 euid=0 tty=pure-ftpd ruser=sample-user rhost=192.168.1.1
# failJSON: { "time": "2005-05-12T09:47:54", "match": true , "host": "71-13-115-12.static.mdsn.wi.charter.com" }
# failJSON: { "time": "2005-05-12T09:47:54", "match": true , "host": "71-13-115-12.static.mdsn.wi.charter.com", "user": "root" }
May 12 09:47:54 vaio sshd[16004]: (pam_unix) authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=71-13-115-12.static.mdsn.wi.charter.com user=root
# failJSON: { "time": "2005-05-12T09:48:03", "match": true , "host": "71-13-115-12.static.mdsn.wi.charter.com" }
May 12 09:48:03 vaio sshd[16021]: (pam_unix) authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=71-13-115-12.static.mdsn.wi.charter.com
# failJSON: { "time": "2005-05-15T18:02:12", "match": true , "host": "66.232.129.62" }
# failJSON: { "time": "2005-05-15T18:02:12", "match": true , "host": "66.232.129.62", "user": "mark" }
May 15 18:02:12 localhost proftpd: (pam_unix) authentication failure; logname= uid=0 euid=0 tty= ruser= rhost=66.232.129.62 user=mark
# linux-pam messages before commit f0f9c4479303b5a9c37667cf07f58426dc081676 (release 0.99.2.0) - no longer supported
# failJSON: { "time": "2004-11-25T17:12:13", "match": false }
Nov 25 17:12:13 webmail pop(pam_unix)[4920]: authentication failure; logname= uid=0 euid=0 tty= ruser= rhost=192.168.10.3 user=mailuser
# failJSON: { "time": "2005-07-19T18:11:26", "match": true , "host": "www.google.com" }
# failJSON: { "time": "2005-07-19T18:11:26", "match": true , "host": "www.google.com", "user": "an8767" }
Jul 19 18:11:26 srv2 vsftpd: pam_unix(vsftpd:auth): authentication failure; logname= uid=0 euid=0 tty=ftp ruser=an8767 rhost=www.google.com
# failJSON: { "time": "2005-07-19T18:11:26", "match": true , "host": "www.google.com" }
Jul 19 18:11:26 srv2 vsftpd: pam_unix: authentication failure; logname= uid=0 euid=0 tty=ftp ruser=an8767 rhost=www.google.com
# failJSON: { "time": "2005-07-19T18:11:50", "match": true , "host": "192.0.2.1", "user": "test rhost=192.0.2.151", "desc": "Injecting on username"}
Jul 19 18:11:50 srv2 daemon: pam_unix(auth): authentication failure; logname= uid=0 euid=0 tty=xxx ruser=test rhost=192.0.2.151 rhost=192.0.2.1
# failJSON: { "time": "2005-07-19T18:11:52", "match": true , "host": "192.0.2.2", "user": "test rhost=192.0.2.152", "desc": "Injecting on username after host"}
Jul 19 18:11:52 srv2 daemon: pam_unix(auth): authentication failure; logname= uid=0 euid=0 tty=xxx ruser= rhost=192.0.2.2 user=test rhost=192.0.2.152
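The two injection samples above show why the host must come from the last `rhost=` the daemon logs: an attacker can smuggle an `rhost=...` token inside the user name. A minimal illustration with simplified patterns (not the actual pam-generic failregex):

```python
import re

# Attacker logs in as user "test rhost=192.0.2.151"; PAM appends the real
# rhost afterwards, so the genuine address is the LAST rhost= token.
line = ("authentication failure; logname= uid=0 euid=0 tty=xxx "
        "ruser=test rhost=192.0.2.151 rhost=192.0.2.1")
# naive: first rhost= wins -> attacker-injected address
naive = re.search(r"rhost=(?P<host>\S+)", line)
# greedy prefix: last rhost= wins -> address the daemon actually logged
fixed = re.search(r".*rhost=(?P<host>\S+)", line)
print(naive.group('host'), fixed.group('host'))
```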

View File

@ -0,0 +1,2 @@
# test sshd file:
# addFILE: "sshd"

View File

@ -0,0 +1,4 @@
Apr 27 13:02:01 host sshd[29116]: Connection from 192.0.2.4 port 55555
Apr 27 13:02:02 host sshd[29116]: User root not allowed because account is locked
Apr 27 13:02:03 host sshd[29116]: input_userauth_request: invalid user root [preauth]
Apr 27 13:02:04 host sshd[29116]: Received disconnect from 192.0.2.4: 11: Normal Shutdown, Thank you for playing [preauth]

View File

@ -337,6 +337,11 @@ class IgnoreIP(LogCaptureTestCase):
for ip in ipList:
self.filter.addIgnoreIP(ip)
self.assertFalse(self.filter.inIgnoreIPList(ip))
if not unittest.F2B.no_network: # pragma: no cover
self.assertLogged(
'Unable to find a corresponding IP address for 999.999.999.999',
'Unable to find a corresponding IP address for abcdef.abcdef',
'Unable to find a corresponding IP address for 192.168.0.', all=True)
def testIgnoreIPCIDR(self):
self.filter.addIgnoreIP('192.168.1.0/25')
@ -1426,6 +1431,7 @@ class GetFailures(LogCaptureTestCase):
('no', output_no),
('warn', output_yes)
):
self.pruneLog("[test-phase useDns=%s]" % useDns)
jail = DummyJail()
filter_ = FileFilter(jail, useDns=useDns)
filter_.active = True

View File

@ -102,7 +102,9 @@ def testSampleRegexsFactory(name, basedir):
else:
continue
for optval in optval:
if opt[2] == "addfailregex":
if opt[2] == "prefregex":
self.filter.prefRegex = optval
elif opt[2] == "addfailregex":
self.filter.addFailRegex(optval)
elif opt[2] == "addignoreregex":
self.filter.addIgnoreRegex(optval)
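The `prefregex` option handled above (gh-1698) pre-filters every line with one cheap expression before the full failregex list runs. Conceptually (a sketch with made-up patterns, not the server's code):

```python
import re

# One prefregex rejects non-matching lines cheaply; only its extracted
# content is then run against the (potentially many) failregexes.
prefregex = re.compile(r"^\S+ sshd\[\d+\]: (?P<content>.+)$")
failregexes = [re.compile(r"Failed password for \S+ from (?P<host>\S+)")]

def find_failure(line):
    m = prefregex.search(line)
    if not m:  # fast path: most lines stop here
        return None
    content = m.group('content')
    for fre in failregexes:
        f = fre.search(content)
        if f:
            return f.group('host')
    return None

print(find_failure("host1 sshd[123]: Failed password for root from 192.0.2.7"))
```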
@ -126,7 +128,7 @@ def testSampleRegexsFactory(name, basedir):
# test regexp contains greedy catch-all before <HOST>, that is
# not hard-anchored at end or has not precise sub expression after <HOST>:
for fr in self.filter.getFailRegex():
if RE_WRONG_GREED.search(fr): #pragma: no cover
if RE_WRONG_GREED.search(fr): # pragma: no cover
raise AssertionError("Following regexp of \"%s\" contains greedy catch-all before <HOST>, "
"that is not hard-anchored at end or has not precise sub expression after <HOST>:\n%s" %
(name, str(fr).replace(RE_HOST, '<HOST>')))
@ -148,23 +150,34 @@ def testSampleRegexsFactory(name, basedir):
else:
faildata = {}
ret = self.filter.processLine(line)
if not ret:
# Check line is flagged as none match
self.assertFalse(faildata.get('match', True),
"Line not matched when should have: %s:%i %r" %
(logFile.filename(), logFile.filelineno(), line))
elif ret:
# Check line is flagged to match
self.assertTrue(faildata.get('match', False),
"Line matched when shouldn't have: %s:%i %r" %
(logFile.filename(), logFile.filelineno(), line))
self.assertEqual(len(ret), 1, "Multiple regexs matched %r - %s:%i" %
(map(lambda x: x[0], ret),logFile.filename(), logFile.filelineno()))
try:
ret = self.filter.processLine(line)
if not ret:
# Check line is flagged as none match
self.assertFalse(faildata.get('match', True),
"Line not matched when should have")
continue
# Verify timestamp and host as expected
failregex, host, fail2banTime, lines, fail = ret[0]
self.assertEqual(host, faildata.get("host", None))
failregex, fid, fail2banTime, fail = ret[0]
# Bypass no-failure helper regexps:
if not faildata.get('match', False) and (fid is None or fail.get('nofail')):
regexsUsed.add(failregex)
continue
# Check line is flagged to match
self.assertTrue(faildata.get('match', False),
"Line matched when shouldn't have")
self.assertEqual(len(ret), 1,
"Multiple regexs matched %r" % (map(lambda x: x[0], ret)))
# Fallback for backwards compatibility (previously no fid, was host only):
if faildata.get("host", None) is not None and fail.get("host", None) is None:
fail["host"] = fid
# Verify match captures (at least fid/host) and timestamp as expected
for k, v in faildata.iteritems():
if k not in ("time", "match", "desc"):
fv = fail.get(k, None)
self.assertEqual(fv, v)
t = faildata.get("time", None)
try:
@ -177,12 +190,15 @@ def testSampleRegexsFactory(name, basedir):
jsonTime += jsonTimeLocal.microsecond / 1000000
self.assertEqual(fail2banTime, jsonTime,
"UTC Time mismatch fail2ban %s (%s) != failJson %s (%s) (diff %.3f seconds) on: %s:%i %r:" %
"UTC Time mismatch %s (%s) != %s (%s) (diff %.3f seconds)" %
(fail2banTime, time.strftime("%Y-%m-%dT%H:%M:%S", time.gmtime(fail2banTime)),
jsonTime, time.strftime("%Y-%m-%dT%H:%M:%S", time.gmtime(jsonTime)),
fail2banTime - jsonTime, logFile.filename(), logFile.filelineno(), line ) )
fail2banTime - jsonTime) )
regexsUsed.add(failregex)
except AssertionError as e: # pragma: no cover
raise AssertionError("%s on: %s:%i, line:\n%s" % (
e, logFile.filename(), logFile.filelineno(), line))
for failRegexIndex, failRegex in enumerate(self.filter.getFailRegex()):
self.assertTrue(

View File

@ -1654,9 +1654,10 @@ class ServerConfigReaderTests(LogCaptureTestCase):
# replace pipe to mail with pipe to cat:
realCmd = re.sub(r'\)\s*\|\s*mail\b([^\n]*)',
r' echo mail \1 ) | cat', realCmd)
# replace abuse retrieving (possible no-network):
realCmd = re.sub(r'[^\n]+\bADDRESSES=\$\(dig\s[^\n]+',
lambda m: 'ADDRESSES="abuse-1@abuse-test-server, abuse-2@abuse-test-server"', realCmd)
# replace abuse-contact retrieval (network may be unavailable); replace only the first occurrence of 'dig ...':
realCmd = re.sub(r'\bADDRESSES=\$\(dig\s[^\n]+',
lambda m: 'ADDRESSES="abuse-1@abuse-test-server, abuse-2@abuse-test-server"',
realCmd, 1)
# execute action:
return _actions.CommandAction.executeCmd(realCmd, timeout=timeout)
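The `count=1` limit in the fix above matters when the action text mentions `dig` more than once: only the first `ADDRESSES=$(dig ...)` assignment gets stubbed. A standalone illustration with made-up shell text (not the real complain action):

```python
import re

text = ('ADDRESSES=$(dig +short -t txt 1.2.3.4.abuse.example)\n'
        'echo other line\n'
        'ADDRESSES=$(dig +short -t txt 5.6.7.8.abuse.example)\n')
out = re.sub(r'\bADDRESSES=\$\(dig\s[^\n]+',
             lambda m: 'ADDRESSES="abuse-1@abuse-test-server"',
             text, count=1)  # count=1: stub only the first occurrence
print(out)
```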
@ -1686,18 +1687,29 @@ class ServerConfigReaderTests(LogCaptureTestCase):
('j-complain-abuse',
'complain['
'name=%(__name__)s, grepopts="-m 1", grepmax=2, mailcmd="mail -s",' +
# test reverse ip:
'debug=1,' +
# 2 logs to test grep from multiple logs:
'logpath="' + os.path.join(TEST_FILES_DIR, "testcase01.log") + '\n' +
' ' + os.path.join(TEST_FILES_DIR, "testcase01a.log") + '", '
']',
{
'ip4-ban': (
# test reverse ip:
'try to resolve 10.124.142.87.abuse-contacts.abusix.org',
'Lines containing failures of 87.142.124.10 (max 2)',
'testcase01.log:Dec 31 11:59:59 [sshd] error: PAM: Authentication failure for kevin from 87.142.124.10',
'testcase01a.log:Dec 31 11:55:01 [sshd] error: PAM: Authentication failure for test from 87.142.124.10',
# both abuse addresses should be separated with a space:
'mail -s Abuse from 87.142.124.10 abuse-1@abuse-test-server abuse-2@abuse-test-server',
),
'ip6-ban': (
# test reverse ip:
'try to resolve 1.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.8.b.d.0.1.0.0.2.abuse-contacts.abusix.org',
'Lines containing failures of 2001:db8::1 (max 2)',
# both abuse addresses should be separated with a space:
'mail -s Abuse from 2001:db8::1 abuse-1@abuse-test-server abuse-2@abuse-test-server',
),
}),
)
server = TestServer()
@ -1718,6 +1730,8 @@ class ServerConfigReaderTests(LogCaptureTestCase):
jails = server._Server__jails
ipv4 = IPAddr('87.142.124.10')
ipv6 = IPAddr('2001:db8::1');
for jail, act, tests in testJailsActions:
# print(jail, jails[jail])
for a in jails[jail].actions:
@ -1728,8 +1742,10 @@ class ServerConfigReaderTests(LogCaptureTestCase):
# wrap default command processor:
action.executeCmd = self._executeMailCmd
# test ban :
self.pruneLog('# === ban ===')
action.ban({'ip': IPAddr('87.142.124.10'),
'failures': 100,
})
self.assertLogged(*tests['ip4-ban'], all=True)
for (test, ip) in (('ip4-ban', ipv4), ('ip6-ban', ipv6)):
if not tests.get(test): continue
self.pruneLog('# === %s ===' % test)
ticket = _actions.CallingMap({
'ip': ip, 'ip-rev': lambda self: self['ip'].getPTR(''), 'failures': 100,})
action.ban(ticket)
self.assertLogged(*tests[test], all=True)
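The `ip-rev` values asserted above (`10.124.142.87...` for IPv4, the nibble-reversed form for IPv6) follow standard reverse-DNS ordering. The stdlib can reproduce them; this is a sketch, since fail2ban's `IPAddr.getPTR` is its own implementation:

```python
import ipaddress

def ip_rev(ip):
    # reverse_pointer yields e.g. "10.124.142.87.in-addr.arpa"; strip the
    # arpa suffix to get the bare reversed form used as <ip-rev>
    rp = ipaddress.ip_address(ip).reverse_pointer
    return rp.replace('.in-addr.arpa', '').replace('.ip6.arpa', '')

print(ip_rev('87.142.124.10'))
print(ip_rev('2001:db8::1'))
```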