Merge branch '0.10' into 0.10-full

pull/1460/head
sebres 2017-02-28 14:34:32 +01:00
commit 28b5262976
45 changed files with 1345 additions and 602 deletions

View File

@ -10,6 +10,45 @@ ver. 0.10.0 (2016/XX/XXX) - gonna-be-released-some-time-shining
-----------
TODO: implementing of options resp. other tasks from PR #1346
documentation should be extended (new options, etc)
### Fixes
* `filter.d/pam-generic.conf`:
- [grave] fixed injection of user name into host
* `action.d/complain.conf`:
- fixed by using the new tag `<ip-rev>` (sh/dash compliant now)
### New Features
* New Actions:
* New Filters:
### Enhancements
* Introduced new filter option `prefregex` for pre-filtering using a single regular expression (gh-1698);
  many times faster and far less CPU-hungry, because parsing runs with `maxlines=1` and therefore without
  line buffering (scrolling of the buffer window); a short sketch of the prefregex mechanics follows this changelog entry.
  The combination of the tags `<F-MLFID>` and `<F-NOFAIL>` can now be used to process multi-line logs
  with single-line expressions:
  - tag `<F-MLFID>`: identifies resp. stores failure info for groups of log lines with the same
    identifier (e.g. combined failure info for the same conn-id via `<F-MLFID>(?:conn-id)</F-MLFID>`,
    see sshd.conf for an example)
  - tag `<F-NOFAIL>`: marks a no-failure match (helper to accumulate common failure info,
    e.g. from lines that contain the IP address);
* Several filters optimized with pre-filtering using the new option `prefregex`, and with multi-line filtering
  using the `<F-MLFID>` + `<F-NOFAIL>` combination;
* Exposes filter group captures to actions (non-recursive interpolation of the `<F-...>` tags,
  see gh-1698, gh-1110);
* Some filters extended with the user name (can be used in gh-1243 to distinguish IP and user,
  resp. to remove only the user-related failures after a successful login);
* Safer, more stable and faster replaceTag interpolation (switched from a cycle over all tags
  to re.sub with a callable);
* substituteRecursiveTags optimized and moved into the helpers facilities (because it is now used
  commonly in server and in client);
* Provides new tag `<ip-rev>` for the PTR-reversed representation of the IP address
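
A minimal sketch of the `prefregex` idea referenced above; the regexes, sample log line and helper below are invented for illustration and are not fail2ban's implementation:

```python
import re

# Hypothetical pre-filter: one cheap regex strips the common prefix and
# captures the interesting remainder (fail2ban's <F-CONTENT> group).
prefregex = re.compile(r"^\S+ sshd\[\d+\]: (?P<content>.+)$")

# Per-failure expressions then only need to match the captured content.
failregexes = [
    re.compile(r"^Failed password for (?P<user>\S+) from (?P<host>\S+)"),
    re.compile(r"^Invalid user (?P<user>\S+) from (?P<host>\S+)"),
]

def match_line(line):
    m = prefregex.match(line)
    if not m:                      # cheap rejection of unrelated lines
        return None
    content = m.group("content")
    for fre in failregexes:
        fm = fre.match(content)    # the expensive expressions run on the remainder only
        if fm:
            return fm.groupdict()
    return None

print(match_line("host1 sshd[123]: Failed password for root from 192.0.2.1 port 22"))
```
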
ver. 0.10.0-alpha-1 (2016/07/14) - ipv6-support-etc
-----------
### Fixes
* [Grave] memory leaks fixed (gh-1277, gh-1234)

View File

@ -34,6 +34,9 @@ before = helpers-common.conf
[Definition]

+# Used in test cases for coverage of internal transformations
+debug = 0

# bypass ban/unban for restored tickets
norestored = 1
@ -62,7 +65,9 @@ actioncheck =
# Values: CMD
#
actionban = oifs=${IFS};
-            IFS=.; SEP_IP=( <ip> ); set -- ${SEP_IP}; ADDRESSES=$(dig +short -t txt -q $4.$3.$2.$1.abuse-contacts.abusix.org);
+            RESOLVER_ADDR="%(addr_resolver)s"
+            if [ "<debug>" -gt 0 ]; then echo "try to resolve $RESOLVER_ADDR"; fi
+            ADDRESSES=$(dig +short -t txt -q $RESOLVER_ADDR | tr -d '"')
            IFS=,; ADDRESSES=$(echo $ADDRESSES)
            IFS=${oifs}
            IP=<ip>
@ -81,7 +86,12 @@ actionban = oifs=${IFS};
#
actionunban =

-[Init]
+# Server as resolver used in dig command
+#
+addr_resolver = <ip-rev>abuse-contacts.abusix.org
+
+# Default message used for abuse content
+#
message = Dear Sir/Madam,\n\nWe have detected abuse from the IP address $IP, which according to a abusix.com is on your network. We would appreciate if you would investigate and take action as appropriate.\n\nLog lines are given below, but please ask if you require any further information.\n\n(If you are not the correct person to contact about this please accept our apologies - your e-mail address was extracted from the whois record by an automated process.)\n\n This mail was generated by Fail2Ban.\nThe recipient address of this report was provided by the Abuse Contact DB by abusix.com. abusix.com does not maintain the content of the database. All information which we pass out, derives from the RIR databases and is processed for ease of use. If you want to change or report non working abuse contacts please contact the appropriate RIR. If you have any further question, contact abusix.com directly via email (info@abusix.com). Information about the Abuse Contact Database can be found here: https://abusix.com/global-reporting/abuse-contact-db\nabusix.com is neither responsible nor liable for the content or accuracy of this message.\n

# Path to the log files which contain relevant lines for the abuser IP
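
A rough illustration of what the new `<ip-rev>` tag contributes here, assuming an IPv4 address and that the tag expands to the PTR-style reversed address with a trailing dot (the helper below is hypothetical, not fail2ban's code):

```python
# Hypothetical helper mirroring what <ip-rev> + addr_resolver produce for IPv4.
def reverse_ipv4(ip):
    # "192.0.2.123" -> "123.2.0.192." (PTR-style, trailing dot included)
    return ".".join(reversed(ip.split("."))) + "."

addr_resolver = reverse_ipv4("192.0.2.123") + "abuse-contacts.abusix.org"
print(addr_resolver)  # 123.2.0.192.abuse-contacts.abusix.org
# The action then queries roughly: dig +short -t txt -q <addr_resolver>
```
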

View File

@ -123,7 +123,7 @@ class SMTPAction(ActionBase):
		self.message_values = CallingMap(
			jailname = self._jail.name,
			hostname = socket.gethostname,
-			bantime = self._jail.actions.getBanTime,
+			bantime = lambda: self._jail.actions.getBanTime(),
			)

# bypass ban/unban for restored tickets
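
A plausible reading of this change (an assumption, not stated in the commit): the reworked `CallingMap` calls a value with the map itself when the callable declares arguments, so passing the bound method `getBanTime` directly would hand it an unwanted argument; the zero-argument lambda keeps the old no-argument call. A standalone sketch of that dispatch rule, with invented names:

```python
# Minimal sketch of the dispatch rule used by the reworked CallingMap
# (simplified; the real class lives in fail2ban/server/action.py).
def call_value(value, mapping):
    if callable(value):
        # callables that accept an argument get the mapping itself
        if hasattr(value, '__code__') and value.__code__.co_argcount:
            return value(mapping)
        return value()
    return value

class Actions(object):
    def getBanTime(self):           # bound method: co_argcount == 1 (self)
        return 600

actions = Actions()
# call_value(actions.getBanTime, {}) would raise TypeError (extra argument);
# wrapping in a zero-argument lambda keeps the call-with-no-args behaviour:
print(call_value(lambda: actions.getBanTime(), {}))   # 600
```
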

View File

@ -9,20 +9,24 @@ before = apache-common.conf
[Definition]

+prefregex = ^%(_apache_error_client)s (?:AH\d+: )?<F-CONTENT>.+</F-CONTENT>$
+
+# auth_type = ((?:Digest|Basic): )?
+auth_type = ([A-Z]\w+: )?

-failregex = ^%(_apache_error_client)s (AH(01797|01630): )?client denied by server configuration: (uri )?\S*(, referer: \S+)?\s*$
-            ^%(_apache_error_client)s (AH01617: )?user .*? authentication failure for "\S*": Password Mismatch(, referer: \S+)?$
-            ^%(_apache_error_client)s (AH01618: )?user .*? not found(: )?\S*(, referer: \S+)?\s*$
-            ^%(_apache_error_client)s (AH01614: )?client used wrong authentication scheme: \S*(, referer: \S+)?\s*$
-            ^%(_apache_error_client)s (AH\d+: )?Authorization of user \S+ to access \S* failed, reason: .*$
-            ^%(_apache_error_client)s (AH0179[24]: )?(Digest: )?user .*?: password mismatch: \S*(, referer: \S+)?\s*$
-            ^%(_apache_error_client)s (AH0179[01]: |Digest: )user `.*?' in realm `.+' (not found|denied by provider): \S*(, referer: \S+)?\s*$
-            ^%(_apache_error_client)s (AH01631: )?user .*?: authorization failure for "\S*":(, referer: \S+)?\s*$
-            ^%(_apache_error_client)s (AH01775: )?(Digest: )?invalid nonce .* received - length is not \S+(, referer: \S+)?\s*$
-            ^%(_apache_error_client)s (AH01788: )?(Digest: )?realm mismatch - got `.*?' but expected `.+'(, referer: \S+)?\s*$
-            ^%(_apache_error_client)s (AH01789: )?(Digest: )?unknown algorithm `.*?' received: \S*(, referer: \S+)?\s*$
-            ^%(_apache_error_client)s (AH01793: )?invalid qop `.*?' received: \S*(, referer: \S+)?\s*$
-            ^%(_apache_error_client)s (AH01777: )?(Digest: )?invalid nonce .*? received - user attempted time travel(, referer: \S+)?\s*$
+failregex = ^client denied by server configuration: (uri )?\S*(, referer: \S+)?\s*$
+            ^user .*? authentication failure for "\S*": Password Mismatch(, referer: \S+)?$
+            ^user .*? not found(: )?\S*(, referer: \S+)?\s*$
+            ^client used wrong authentication scheme: \S*(, referer: \S+)?\s*$
+            ^Authorization of user \S+ to access \S* failed, reason: .*$
+            ^%(auth_type)suser .*?: password mismatch: \S*(, referer: \S+)?\s*$
+            ^%(auth_type)suser `.*?' in realm `.+' (not found|denied by provider): \S*(, referer: \S+)?\s*$
+            ^user .*?: authorization failure for "\S*":(, referer: \S+)?\s*$
+            ^%(auth_type)sinvalid nonce .* received - length is not \S+(, referer: \S+)?\s*$
+            ^%(auth_type)srealm mismatch - got `.*?' but expected `.+'(, referer: \S+)?\s*$
+            ^%(auth_type)sunknown algorithm `.*?' received: \S*(, referer: \S+)?\s*$
+            ^invalid qop `.*?' received: \S*(, referer: \S+)?\s*$
+            ^%(auth_type)sinvalid nonce .*? received - user attempted time travel(, referer: \S+)?\s*$

ignoreregex =
@ -53,4 +57,4 @@ ignoreregex =
# referer is always in error log messages if it exists added as per the log_error_core function in server/log.c
#
# Author: Cyril Jaquier
-# Major edits by Daniel Black
+# Major edits by Daniel Black and Sergey Brester (sebres)

View File

@ -23,14 +23,13 @@ before = apache-common.conf
[Definition]

-failregex = ^%(_apache_error_client)s ((AH001(28|30): )?File does not exist|(AH01264: )?script not found or unable to stat): <webroot><block>(, referer: \S+)?\s*$
-            ^%(_apache_error_client)s script '<webroot><block>' not found or unable to stat(, referer: \S+)?\s*$
+prefregex = ^%(_apache_error_client)s (?:AH\d+: )?<F-CONTENT>.+</F-CONTENT>$
+
+failregex = ^(?:File does not exist|script not found or unable to stat): <webroot><block>(, referer: \S+)?\s*$
+            ^script '<webroot><block>' not found or unable to stat(, referer: \S+)?\s*$

ignoreregex =

-[Init]
-
# Webroot represents the webroot on which all other files are based
webroot = /var/www/

View File

@ -9,8 +9,10 @@ before = apache-common.conf
[Definition]

-failregex = ^%(_apache_error_client)s (AH01215: )?/bin/(ba)?sh: warning: HTTP_.*?: ignoring function definition attempt(, referer: \S+)?\s*$
-            ^%(_apache_error_client)s (AH01215: )?/bin/(ba)?sh: error importing function definition for `HTTP_.*?'(, referer: \S+)?\s*$
+prefregex = ^%(_apache_error_client)s (AH01215: )?/bin/([bd]a)?sh: <F-CONTENT>.+</F-CONTENT>$
+
+failregex = ^warning: HTTP_[^:]+: ignoring function definition attempt(, referer: \S+)?\s*$
+            ^error importing function definition for `HTTP_[^']+'(, referer: \S+)?\s*$

ignoreregex =

View File

@ -18,16 +18,18 @@ iso8601 = \d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}\.\d+[+-]\d{4}
# All Asterisk log messages begin like this:
log_prefix= (?:NOTICE|SECURITY|WARNING)%(__pid_re)s:?(?:\[C-[\da-f]*\])? [^:]+:\d*(?:(?: in)? \w+:)?

-failregex = ^%(__prefix_line)s%(log_prefix)s Registration from '[^']*' failed for '<HOST>(:\d+)?' - (Wrong password|Username/auth name mismatch|No matching peer found|Not a local domain|Device does not match ACL|Peer is not supposed to register|ACL error \(permit/deny\)|Not a local domain)$
-            ^%(__prefix_line)s%(log_prefix)s Call from '[^']*' \(<HOST>:\d+\) to extension '[^']*' rejected because extension not found in context
-            ^%(__prefix_line)s%(log_prefix)s Host <HOST> failed to authenticate as '[^']*'$
-            ^%(__prefix_line)s%(log_prefix)s No registration for peer '[^']*' \(from <HOST>\)$
-            ^%(__prefix_line)s%(log_prefix)s Host <HOST> failed MD5 authentication for '[^']*' \([^)]+\)$
-            ^%(__prefix_line)s%(log_prefix)s Failed to authenticate (user|device) [^@]+@<HOST>\S*$
-            ^%(__prefix_line)s%(log_prefix)s hacking attempt detected '<HOST>'$
-            ^%(__prefix_line)s%(log_prefix)s SecurityEvent="(FailedACL|InvalidAccountID|ChallengeResponseFailed|InvalidPassword)",EventTV="([\d-]+|%(iso8601)s)",Severity="[\w]+",Service="[\w]+",EventVersion="\d+",AccountID="(\d*|<unknown>)",SessionID=".+",LocalAddress="IPV[46]/(UDP|TCP|WS)/[\da-fA-F:.]+/\d+",RemoteAddress="IPV[46]/(UDP|TCP|WS)/<HOST>/\d+"(,Challenge="[\w/]+")?(,ReceivedChallenge="\w+")?(,Response="\w+",ExpectedResponse="\w*")?(,ReceivedHash="[\da-f]+")?(,ACLName="\w+")?$
-            ^%(__prefix_line)s%(log_prefix)s "Rejecting unknown SIP connection from <HOST>"$
-            ^%(__prefix_line)s%(log_prefix)s Request (?:'[^']*' )?from '[^']*' failed for '<HOST>(?::\d+)?'\s\(callid: [^\)]*\) - (?:No matching endpoint found|Not match Endpoint(?: Contact)? ACL|(?:Failed|Error) to authenticate)\s*$
+prefregex = ^%(__prefix_line)s%(log_prefix)s <F-CONTENT>.+</F-CONTENT>$
+
+failregex = ^Registration from '[^']*' failed for '<HOST>(:\d+)?' - (Wrong password|Username/auth name mismatch|No matching peer found|Not a local domain|Device does not match ACL|Peer is not supposed to register|ACL error \(permit/deny\)|Not a local domain)$
+            ^Call from '[^']*' \(<HOST>:\d+\) to extension '[^']*' rejected because extension not found in context
+            ^Host <HOST> failed to authenticate as '[^']*'$
+            ^No registration for peer '[^']*' \(from <HOST>\)$
+            ^Host <HOST> failed MD5 authentication for '[^']*' \([^)]+\)$
+            ^Failed to authenticate (user|device) [^@]+@<HOST>\S*$
+            ^hacking attempt detected '<HOST>'$
+            ^SecurityEvent="(FailedACL|InvalidAccountID|ChallengeResponseFailed|InvalidPassword)",EventTV="([\d-]+|%(iso8601)s)",Severity="[\w]+",Service="[\w]+",EventVersion="\d+",AccountID="(\d*|<unknown>)",SessionID=".+",LocalAddress="IPV[46]/(UDP|TCP|WS)/[\da-fA-F:.]+/\d+",RemoteAddress="IPV[46]/(UDP|TCP|WS)/<HOST>/\d+"(,Challenge="[\w/]+")?(,ReceivedChallenge="\w+")?(,Response="\w+",ExpectedResponse="\w*")?(,ReceivedHash="[\da-f]+")?(,ACLName="\w+")?$
+            ^"Rejecting unknown SIP connection from <HOST>"$
+            ^Request (?:'[^']*' )?from '[^']*' failed for '<HOST>(?::\d+)?'\s\(callid: [^\)]*\) - (?:No matching endpoint found|Not match Endpoint(?: Contact)? ACL|(?:Failed|Error) to authenticate)\s*$

ignoreregex =

View File

@ -12,8 +12,10 @@ before = common.conf
_daemon = courieresmtpd

-failregex = ^%(__prefix_line)serror,relay=<HOST>,.*: 550 User (<.*> )?unknown\.?$
-            ^%(__prefix_line)serror,relay=<HOST>,msg="535 Authentication failed\.",cmd:( AUTH \S+)?( [0-9a-zA-Z\+/=]+)?(?: \S+)$
+prefregex = ^%(__prefix_line)serror,relay=<HOST>,<F-CONTENT>.+</F-CONTENT>$
+
+failregex = ^[^:]*: 550 User (<.*> )?unknown\.?$
+            ^msg="535 Authentication failed\.",cmd:( AUTH \S+)?( [0-9a-zA-Z\+/=]+)?(?: \S+)$

ignoreregex =

View File

@ -7,13 +7,16 @@ before = common.conf
[Definition]

-_daemon = (auth|dovecot(-auth)?|auth-worker)
+_auth_worker = (?:dovecot: )?auth(?:-worker)?
+_daemon = (dovecot(-auth)?|auth)

-failregex = ^%(__prefix_line)s(?:%(__pam_auth)s(?:\(dovecot:auth\))?:)?\s+authentication failure; logname=\S* uid=\S* euid=\S* tty=dovecot ruser=\S* rhost=<HOST>(?:\s+user=\S*)?\s*$
-            ^%(__prefix_line)s(?:pop3|imap)-login: (?:Info: )?(?:Aborted login|Disconnected)(?::(?: [^ \(]+)+)? \((?:auth failed, \d+ attempts( in \d+ secs)?|tried to use (disabled|disallowed) \S+ auth)\):( user=<[^>]+>,)?( method=\S+,)? rip=<HOST>(?:, lip=\S+)?(?:, TLS(?: handshaking(?:: SSL_accept\(\) failed: error:[\dA-F]+:SSL routines:[TLS\d]+_GET_CLIENT_HELLO:unknown protocol)?)?(: Disconnected)?)?(, session=<\S+>)?\s*$
-            ^%(__prefix_line)s(?:Info|dovecot: auth\(default\)|auth-worker\(\d+\)): pam\(\S+,<HOST>\): pam_authenticate\(\) failed: (User not known to the underlying authentication module: \d+ Time\(s\)|Authentication failure \(password mismatch\?\))\s*$
-            ^%(__prefix_line)s(?:auth|auth-worker\(\d+\)): (?:pam|passwd-file)\(\S+,<HOST>\): unknown user\s*$
-            ^%(__prefix_line)s(?:auth|auth-worker\(\d+\)): Info: ldap\(\S*,<HOST>,\S*\): invalid credentials\s*$
+prefregex = ^%(__prefix_line)s(%(_auth_worker)s(?:\([^\)]+\))?: )?(?:%(__pam_auth)s(?:\(dovecot:auth\))?: |(?:pop3|imap)-login: )?(?:Info: )?<F-CONTENT>.+</F-CONTENT>$
+
+failregex = ^authentication failure; logname=\S* uid=\S* euid=\S* tty=dovecot ruser=\S* rhost=<HOST>(?:\s+user=\S*)?\s*$
+            ^(?:Aborted login|Disconnected)(?::(?: [^ \(]+)+)? \((?:auth failed, \d+ attempts( in \d+ secs)?|tried to use (disabled|disallowed) \S+ auth)\):( user=<[^>]+>,)?( method=\S+,)? rip=<HOST>(?:, lip=\S+)?(?:, TLS(?: handshaking(?:: SSL_accept\(\) failed: error:[\dA-F]+:SSL routines:[TLS\d]+_GET_CLIENT_HELLO:unknown protocol)?)?(: Disconnected)?)?(, session=<\S+>)?\s*$
+            ^pam\(\S+,<HOST>\): pam_authenticate\(\) failed: (User not known to the underlying authentication module: \d+ Time\(s\)|Authentication failure \(password mismatch\?\))\s*$
+            ^(?:pam|passwd-file)\(\S+,<HOST>\): unknown user\s*$
+            ^ldap\(\S*,<HOST>,\S*\): invalid credentials\s*$

ignoreregex =

View File

@ -23,9 +23,11 @@ before = common.conf
_daemon = dropbear

-failregex = ^%(__prefix_line)s[Ll]ogin attempt for nonexistent user ('.*' )?from <HOST>:\d+$
-            ^%(__prefix_line)s[Bb]ad (PAM )?password attempt for .+ from <HOST>(:\d+)?$
-            ^%(__prefix_line)s[Ee]xit before auth \(user '.+', \d+ fails\): Max auth tries reached - user '.+' from <HOST>:\d+\s*$
+prefregex = ^%(__prefix_line)s<F-CONTENT>(?:[Ll]ogin|[Bb]ad|[Ee]xit).+</F-CONTENT>$
+
+failregex = ^[Ll]ogin attempt for nonexistent user ('.*' )?from <HOST>:\d+$
+            ^[Bb]ad (PAM )?password attempt for .+ from <HOST>(:\d+)?$
+            ^[Ee]xit before auth \(user '.+', \d+ fails\): Max auth tries reached - user '.+' from <HOST>:\d+\s*$

ignoreregex =

View File

@ -9,7 +9,9 @@ after = exim-common.local
[Definition]

-host_info = (?:H=([\w.-]+ )?(?:\(\S+\) )?)?\[<HOST>\](?::\d+)? (?:I=\[\S+\](:\d+)? )?(?:U=\S+ )?(?:P=e?smtp )?
+host_info_pre = (?:H=([\w.-]+ )?(?:\(\S+\) )?)?
+host_info_suf = (?::\d+)?(?: I=\[\S+\](:\d+)?)?(?: U=\S+)?(?: P=e?smtp)?(?: F=(?:<>|[^@]+@\S+))?\s
+host_info = %(host_info_pre)s\[<HOST>\]%(host_info_suf)s
pid = (?: \[\d+\])?

# DEV Notes:

View File

@ -13,14 +13,17 @@ before = exim-common.conf
[Definition]

+# Pre-filter via "prefregex" is currently inactive because of too different failure syntax in exim-log (testing needed):
+#prefregex = ^%(pid)s <F-CONTENT>\b(?:\w+ authenticator failed|([\w\-]+ )?SMTP (?:(?:call|connection) from|protocol(?: synchronization)? error)|no MAIL in|(?:%(host_info_pre)s\[[^\]]+\]%(host_info_suf)s(?:sender verify fail|rejected RCPT|dropped|AUTH command))).+</F-CONTENT>$

failregex = ^%(pid)s %(host_info)ssender verify fail for <\S+>: (?:Unknown user|Unrouteable address|all relevant MX records point to non-existent hosts)\s*$
            ^%(pid)s \w+ authenticator failed for (\S+ )?\(\S+\) \[<HOST>\](?::\d+)?(?: I=\[\S+\](:\d+)?)?: 535 Incorrect authentication data( \(set_id=.*\)|: \d+ Time\(s\))?\s*$
-           ^%(pid)s %(host_info)sF=(?:<>|[^@]+@\S+) rejected RCPT [^@]+@\S+: (?:relay not permitted|Sender verify failed|Unknown user)\s*$
+           ^%(pid)s %(host_info)srejected RCPT [^@]+@\S+: (?:relay not permitted|Sender verify failed|Unknown user)\s*$
            ^%(pid)s SMTP protocol synchronization error \([^)]*\): rejected (?:connection from|"\S+") %(host_info)s(?:next )?input=".*"\s*$
            ^%(pid)s SMTP call from \S+ %(host_info)sdropped: too many nonmail commands \(last was "\S+"\)\s*$
            ^%(pid)s SMTP protocol error in "AUTH \S*(?: \S*)?" %(host_info)sAUTH command used when not advertised\s*$
            ^%(pid)s no MAIL in SMTP connection from (?:\S* )?(?:\(\S*\) )?%(host_info)sD=\d+s(?: C=\S*)?\s*$
-           ^%(pid)s \S+ SMTP connection from (?:\S* )?(?:\(\S*\) )?%(host_info)sclosed by DROP in ACL\s*$
+           ^%(pid)s ([\w\-]+ )?SMTP connection from (?:\S* )?(?:\(\S*\) )?%(host_info)sclosed by DROP in ACL\s*$

ignoreregex =

View File

@ -25,8 +25,11 @@ _daemon = Froxlor
# (?:::f{4,6}:)?(?P<host>[\w\-.^_]+)
# Values: TEXT
#
-failregex = ^%(__prefix_line)s\[Login Action <HOST>\] Unknown user \S* tried to login.$
-            ^%(__prefix_line)s\[Login Action <HOST>\] User \S* tried to login with wrong password.$
+prefregex = ^%(__prefix_line)s\[Login Action <HOST>\] <F-CONTENT>.+</F-CONTENT>$
+
+failregex = ^Unknown user \S* tried to login.$
+            ^User \S* tried to login with wrong password.$
# Option: ignoreregex # Option: ignoreregex

View File

@ -17,8 +17,10 @@ _usernameregex = [^>]+
_prefix = \s+\d+ => <\d+:%(_usernameregex)s\(-1\)> Rejected connection from <HOST>:\d+:

-failregex = ^%(_prefix)s Invalid server password$
-            ^%(_prefix)s Wrong certificate or password for existing user$
+prefregex = ^%(_prefix)s <F-CONTENT>.+</F-CONTENT>$
+
+failregex = ^Invalid server password$
+            ^Wrong certificate or password for existing user$

ignoreregex =

View File

@ -34,9 +34,11 @@ __daemon_combs_re=(?:%(__pid_re)s?:\s+%(__daemon_re)s|%(__daemon_re)s%(__pid_re)
# this can be optional (for instance if we match named native log files)
__line_prefix=(?:\s\S+ %(__daemon_combs_re)s\s+)?

-failregex = ^%(__line_prefix)s( error:)?\s*client <HOST>#\S+( \([\S.]+\))?: (view (internal|external): )?query(?: \(cache\))? '.*' denied\s*$
-            ^%(__line_prefix)s( error:)?\s*client <HOST>#\S+( \([\S.]+\))?: zone transfer '\S+/AXFR/\w+' denied\s*$
-            ^%(__line_prefix)s( error:)?\s*client <HOST>#\S+( \([\S.]+\))?: bad zone transfer request: '\S+/IN': non-authoritative zone \(NOTAUTH\)\s*$
+prefregex = ^%(__line_prefix)s( error:)?\s*client <HOST>#\S+( \([\S.]+\))?: <F-CONTENT>.+</F-CONTENT>$
+
+failregex = ^(view (internal|external): )?query(?: \(cache\))? '.*' denied\s*$
+            ^zone transfer '\S+/AXFR/\w+' denied\s*$
+            ^bad zone transfer request: '\S+/IN': non-authoritative zone \(NOTAUTH\)\s*$

ignoreregex =

View File

@ -16,7 +16,12 @@ _ttys_re=\S*
__pam_re=\(?%(__pam_auth)s(?:\(\S+\))?\)?:?
_daemon = \S+

-failregex = ^%(__prefix_line)s%(__pam_re)s\s+authentication failure; logname=\S* uid=\S* euid=\S* tty=%(_ttys_re)s ruser=\S* rhost=<HOST>(?:\s+user=.*)?\s*$
+prefregex = ^%(__prefix_line)s%(__pam_re)s\s+authentication failure; logname=\S* uid=\S* euid=\S* tty=%(_ttys_re)s <F-CONTENT>.+</F-CONTENT>$
+
+failregex = ^ruser=<F-USER>\S*</F-USER> rhost=<HOST>\s*$
+            ^ruser= rhost=<HOST>\s+user=<F-USER>\S*</F-USER>\s*$
+            ^ruser= rhost=<HOST>\s+user=<F-USER>.*?</F-USER>\s*$
+            ^ruser=<F-USER>.*?</F-USER> rhost=<HOST>\s*$

ignoreregex =

View File

@ -12,13 +12,15 @@ before = common.conf
_daemon = postfix(-\w+)?/(?:submission/|smtps/)?smtp[ds]

-failregex = ^%(__prefix_line)sNOQUEUE: reject: RCPT from \S+\[<HOST>\]: 554 5\.7\.1 .*$
-            ^%(__prefix_line)sNOQUEUE: reject: RCPT from \S+\[<HOST>\]: 450 4\.7\.1 Client host rejected: cannot find your hostname, (\[\S*\]); from=<\S*> to=<\S+> proto=ESMTP helo=<\S*>$
-            ^%(__prefix_line)sNOQUEUE: reject: RCPT from \S+\[<HOST>\]: 450 4\.7\.1 : Helo command rejected: Host not found; from=<> to=<> proto=ESMTP helo= *$
-            ^%(__prefix_line)sNOQUEUE: reject: EHLO from \S+\[<HOST>\]: 504 5\.5\.2 <\S+>: Helo command rejected: need fully-qualified hostname;
-            ^%(__prefix_line)sNOQUEUE: reject: VRFY from \S+\[<HOST>\]: 550 5\.1\.1 .*$
-            ^%(__prefix_line)sNOQUEUE: reject: RCPT from \S+\[<HOST>\]: 450 4\.1\.8 <\S*>: Sender address rejected: Domain not found; from=<\S*> to=<\S+> proto=ESMTP helo=<\S*>$
-            ^%(__prefix_line)simproper command pipelining after \S+ from [^[]*\[<HOST>\]:?$
+prefregex = ^%(__prefix_line)s(?:NOQUEUE: reject:|improper command pipelining) <F-CONTENT>.+</F-CONTENT>$
+
+failregex = ^RCPT from \S+\[<HOST>\]: 554 5\.7\.1
+            ^RCPT from \S+\[<HOST>\]: 450 4\.7\.1 Client host rejected: cannot find your hostname, (\[\S*\]); from=<\S*> to=<\S+> proto=ESMTP helo=<\S*>$
+            ^RCPT from \S+\[<HOST>\]: 450 4\.7\.1 : Helo command rejected: Host not found; from=<> to=<> proto=ESMTP helo= *$
+            ^EHLO from \S+\[<HOST>\]: 504 5\.5\.2 <\S+>: Helo command rejected: need fully-qualified hostname;
+            ^VRFY from \S+\[<HOST>\]: 550 5\.1\.1
+            ^RCPT from \S+\[<HOST>\]: 450 4\.1\.8 <\S*>: Sender address rejected: Domain not found; from=<\S*> to=<\S+> proto=ESMTP helo=<\S*>$
+            ^after \S+ from [^[]*\[<HOST>\]:?$

ignoreregex =

View File

@ -16,10 +16,14 @@ _daemon = proftpd
__suffix_failed_login = (User not authorized for login|No such user found|Incorrect password|Password expired|Account disabled|Invalid shell: '\S+'|User in \S+|Limit (access|configuration) denies login|Not a UserAlias|maximum login length exceeded).?

-failregex = ^%(__prefix_line)s%(__hostname)s \(\S+\[<HOST>\]\)[: -]+ USER .*: no such user found from \S+ \[\S+\] to \S+:\S+ *$
-            ^%(__prefix_line)s%(__hostname)s \(\S+\[<HOST>\]\)[: -]+ USER .* \(Login failed\): %(__suffix_failed_login)s\s*$
-            ^%(__prefix_line)s%(__hostname)s \(\S+\[<HOST>\]\)[: -]+ SECURITY VIOLATION: .* login attempted\. *$
-            ^%(__prefix_line)s%(__hostname)s \(\S+\[<HOST>\]\)[: -]+ Maximum login attempts \(\d+\) exceeded *$
+prefregex = ^%(__prefix_line)s%(__hostname)s \(\S+\[<HOST>\]\)[: -]+ <F-CONTENT>(?:USER|SECURITY|Maximum).+</F-CONTENT>$
+
+failregex = ^USER .*: no such user found from \S+ \[\S+\] to \S+:\S+ *$
+            ^USER .* \(Login failed\): %(__suffix_failed_login)s\s*$
+            ^SECURITY VIOLATION: .* login attempted\. *$
+            ^Maximum login attempts \(\d+\) exceeded *$

ignoreregex =

View File

@ -24,37 +24,37 @@ __pref = (?:(?:error|fatal): (?:PAM: )?)?
__suff = (?: \[preauth\])?\s*
__on_port_opt = (?: port \d+)?(?: on \S+(?: port \d+)?)?

-# single line prefix:
-__prefix_line_sl = %(__prefix_line)s%(__pref)s
-# multi line prefixes (for first and second lines):
-__prefix_line_ml1 = (?P<__prefix>%(__prefix_line)s)%(__pref)s
-__prefix_line_ml2 = %(__suff)s$<SKIPLINES>^(?P=__prefix)%(__pref)s
+prefregex = ^<F-MLFID>%(__prefix_line)s</F-MLFID>%(__pref)s<F-CONTENT>.+</F-CONTENT>$

mode = %(normal)s

-normal = ^%(__prefix_line_sl)s[aA]uthentication (?:failure|error|failed) for .* from <HOST>( via \S+)?\s*%(__suff)s$
-         ^%(__prefix_line_sl)sUser not known to the underlying authentication module for .* from <HOST>\s*%(__suff)s$
-         ^%(__prefix_line_sl)sFailed \S+ for (?P<cond_inv>invalid user )?(?P<user>(?P<cond_user>\S+)|(?(cond_inv)(?:(?! from ).)*?|[^:]+)) from <HOST>%(__on_port_opt)s(?: ssh\d*)?(?(cond_user): |(?:(?:(?! from ).)*)$)
-         ^%(__prefix_line_sl)sROOT LOGIN REFUSED.* FROM <HOST>\s*%(__suff)s$
-         ^%(__prefix_line_sl)s[iI](?:llegal|nvalid) user .*? from <HOST>%(__on_port_opt)s\s*$
-         ^%(__prefix_line_sl)sUser .+ from <HOST> not allowed because not listed in AllowUsers\s*%(__suff)s$
-         ^%(__prefix_line_sl)sUser .+ from <HOST> not allowed because listed in DenyUsers\s*%(__suff)s$
-         ^%(__prefix_line_sl)sUser .+ from <HOST> not allowed because not in any group\s*%(__suff)s$
-         ^%(__prefix_line_sl)srefused connect from \S+ \(<HOST>\)\s*%(__suff)s$
-         ^%(__prefix_line_sl)sReceived disconnect from <HOST>%(__on_port_opt)s:\s*3: .*: Auth fail%(__suff)s$
-         ^%(__prefix_line_sl)sUser .+ from <HOST> not allowed because a group is listed in DenyGroups\s*%(__suff)s$
-         ^%(__prefix_line_sl)sUser .+ from <HOST> not allowed because none of user's groups are listed in AllowGroups\s*%(__suff)s$
-         ^%(__prefix_line_sl)spam_unix\(sshd:auth\):\s+authentication failure;\s*logname=\S*\s*uid=\d*\s*euid=\d*\s*tty=\S*\s*ruser=\S*\s*rhost=<HOST>\s.*%(__suff)s$
-         ^%(__prefix_line_sl)s(error: )?maximum authentication attempts exceeded for .* from <HOST>%(__on_port_opt)s(?: ssh\d*)? \[preauth\]$
-         ^%(__prefix_line_ml1)sUser .+ not allowed because account is locked%(__prefix_line_ml2)sReceived disconnect from <HOST>: 11: .+%(__suff)s$
-         ^%(__prefix_line_ml1)sDisconnecting: Too many authentication failures for .+?%(__prefix_line_ml2)sConnection closed by <HOST>%(__suff)s$
-         ^%(__prefix_line_ml1)sConnection from <HOST>%(__on_port_opt)s%(__prefix_line_ml2)sDisconnecting: Too many authentication failures for .+%(__suff)s$
+normal = ^[aA]uthentication (?:failure|error|failed) for <F-USER>.*</F-USER> from <HOST>( via \S+)?\s*%(__suff)s$
+         ^User not known to the underlying authentication module for <F-USER>.*</F-USER> from <HOST>\s*%(__suff)s$
+         ^Failed \S+ for (?P<cond_inv>invalid user )?<F-USER>(?P<cond_user>\S+)|(?(cond_inv)(?:(?! from ).)*?|[^:]+)</F-USER> from <HOST>%(__on_port_opt)s(?: ssh\d*)?(?(cond_user): |(?:(?:(?! from ).)*)$)
+         ^<F-USER>ROOT</F-USER> LOGIN REFUSED.* FROM <HOST>\s*%(__suff)s$
+         ^[iI](?:llegal|nvalid) user <F-USER>.*?</F-USER> from <HOST>%(__on_port_opt)s\s*$
+         ^User <F-USER>.+</F-USER> from <HOST> not allowed because not listed in AllowUsers\s*%(__suff)s$
+         ^User <F-USER>.+</F-USER> from <HOST> not allowed because listed in DenyUsers\s*%(__suff)s$
+         ^User <F-USER>.+</F-USER> from <HOST> not allowed because not in any group\s*%(__suff)s$
+         ^refused connect from \S+ \(<HOST>\)\s*%(__suff)s$
+         ^Received disconnect from <HOST>%(__on_port_opt)s:\s*3: .*: Auth fail%(__suff)s$
+         ^User <F-USER>.+</F-USER> from <HOST> not allowed because a group is listed in DenyGroups\s*%(__suff)s$
+         ^User <F-USER>.+</F-USER> from <HOST> not allowed because none of user's groups are listed in AllowGroups\s*%(__suff)s$
+         ^pam_unix\(sshd:auth\):\s+authentication failure;\s*logname=\S*\s*uid=\d*\s*euid=\d*\s*tty=\S*\s*ruser=<F-USER>\S*</F-USER>\s*rhost=<HOST>\s.*%(__suff)s$
+         ^(error: )?maximum authentication attempts exceeded for <F-USER>.*</F-USER> from <HOST>%(__on_port_opt)s(?: ssh\d*)? \[preauth\]$
+         ^User <F-USER>.+</F-USER> not allowed because account is locked%(__suff)s
+         ^Disconnecting: Too many authentication failures for <F-USER>.+?</F-USER>%(__suff)s
+         ^<F-NOFAIL>Received disconnect</F-NOFAIL> from <HOST>: 11:
+         ^<F-NOFAIL>Connection closed</F-NOFAIL> by <HOST>%(__suff)s$

-ddos = ^%(__prefix_line_sl)sDid not receive identification string from <HOST>%(__suff)s$
-       ^%(__prefix_line_sl)sReceived disconnect from <HOST>%(__on_port_opt)s:\s*14: No supported authentication methods available%(__suff)s$
-       ^%(__prefix_line_sl)sUnable to negotiate with <HOST>%(__on_port_opt)s: no matching (?:cipher|key exchange method) found.
-       ^%(__prefix_line_ml1)sConnection from <HOST>%(__on_port_opt)s%(__prefix_line_ml2)sUnable to negotiate a (?:cipher|key exchange method)%(__suff)s$
-       ^%(__prefix_line_ml1)sSSH: Server;Ltype: (?:Authname|Version|Kex);Remote: <HOST>-\d+;[A-Z]\w+:.*%(__prefix_line_ml2)sRead from socket failed: Connection reset by peer%(__suff)s$
+ddos = ^Did not receive identification string from <HOST>%(__suff)s$
+       ^Received disconnect from <HOST>%(__on_port_opt)s:\s*14: No supported authentication methods available%(__suff)s$
+       ^Unable to negotiate with <HOST>%(__on_port_opt)s: no matching (?:cipher|key exchange method) found.
+       ^Unable to negotiate a (?:cipher|key exchange method)%(__suff)s$
+       ^<F-NOFAIL>SSH: Server;Ltype:</F-NOFAIL> (?:Authname|Version|Kex);Remote: <HOST>-\d+;[A-Z]\w+:
+       ^Read from socket failed: Connection reset by peer \[preauth\]

+common = ^<F-NOFAIL>Connection from</F-NOFAIL> <HOST>

aggressive = %(normal)s
             %(ddos)s
@ -62,11 +62,11 @@ aggressive = %(normal)s
[Definition]

failregex = %(mode)s
+            %(common)s

ignoreregex =

-# "maxlines" is number of log lines to buffer for multi-line regex searches
-maxlines = 10
+maxlines = 1

journalmatch = _SYSTEMD_UNIT=sshd.service + _COMM=sshd
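
A rough illustration of the `<F-MLFID>`/`<F-NOFAIL>` approach used by the reworked sshd filter above, assuming a simplified model in which the group id is the whole syslog prefix; the patterns and names are invented and this is not fail2ban's actual engine:

```python
import re

# Toy patterns: group id = process prefix ("host sshd[123]"); a "Connection from"
# line carries the host but is itself not a failure (NOFAIL); the failure line
# carries the user and is combined with the previously collected data.
mlfid_re = re.compile(r"^(?P<mlfid>\S+ sshd\[\d+\]): (?P<content>.+)$")
nofail_re = re.compile(r"^Connection from (?P<host>\S+)")
fail_re = re.compile(r"^Disconnecting: Too many authentication failures for (?P<user>\S+)")

pending = {}   # mlfid -> info accumulated from earlier lines

def process(line):
    m = mlfid_re.match(line)
    if not m:
        return None
    mlfid, content = m.group("mlfid"), m.group("content")
    info = pending.setdefault(mlfid, {})
    nf = nofail_re.match(content)
    if nf:                              # no-failure line: only collect data
        info.update(nf.groupdict())
        return None
    f = fail_re.match(content)
    if f:                               # failure line: report combined info
        info.update(f.groupdict())
        return pending.pop(mlfid)
    return None

process("host sshd[123]: Connection from 192.0.2.1 port 2222")
print(process("host sshd[123]: Disconnecting: Too many authentication failures for root"))
# -> {'host': '192.0.2.1', 'user': 'root'}
```
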

View File

@ -14,8 +14,10 @@ before = common.conf
_daemon = xinetd

-failregex = ^%(__prefix_line)sFAIL: \S+ address from=<HOST>$
-            ^%(__prefix_line)sFAIL: \S+ libwrap from=<HOST>$
+prefregex = ^%(__prefix_line)sFAIL: <F-CONTENT>.+</F-CONTENT>$
+
+failregex = ^\S+ address from=<HOST>$
+            ^\S+ libwrap from=<HOST>$

ignoreregex =

View File

@ -28,6 +28,7 @@ import os
from .configreader import DefinitionInitConfigReader
from ..helpers import getLogger
+from ..server.action import CommandAction

# Gets the instance of the logger.
logSys = getLogger(__name__)
@ -69,7 +70,8 @@ class ActionReader(DefinitionInitConfigReader):
		return self._name

	def convert(self):
-		opts = self.getCombined(ignore=('timeout', 'bantime'))
+		opts = self.getCombined(
+			ignore=CommandAction._escapedTags | set(('timeout', 'bantime')))
		# type-convert only after combined (otherwise boolean converting prevents substitution):
		if opts.get('norestored'):
			opts['norestored'] = self._convert_to_boolean(opts['norestored'])

View File

@ -29,8 +29,7 @@ import os
from ConfigParser import NoOptionError, NoSectionError

from .configparserinc import sys, SafeConfigParserWithIncludes, logLevel
-from ..helpers import getLogger
-from ..server.action import CommandAction
+from ..helpers import getLogger, substituteRecursiveTags

# Gets the instance of the logger.
logSys = getLogger(__name__)
@ -225,6 +224,7 @@ class ConfigReaderUnshared(SafeConfigParserWithIncludes):
		values = dict()
		if pOptions is None:
			pOptions = {}
+		# Get only specified options:
		for optname in options:
			if isinstance(options, (list,tuple)):
				if len(optname) > 2:
@ -280,6 +280,8 @@ class DefinitionInitConfigReader(ConfigReader):
		self.setFile(file_)
		self.setJailName(jailName)
		self._initOpts = initOpts
+		self._pOpts = dict()
+		self._defCache = dict()

	def setFile(self, fileName):
		self._file = fileName
@ -312,7 +314,7 @@ class DefinitionInitConfigReader(ConfigReader):
			pOpts = _merge_dicts(pOpts, self._initOpts)
		self._opts = ConfigReader.getOptions(
			self, "Definition", self._configOpts, pOpts)
+		self._pOpts = pOpts
		if self.has_section("Init"):
			for opt in self.options("Init"):
				v = self.get("Init", opt)
@ -324,6 +326,22 @@ class DefinitionInitConfigReader(ConfigReader):
	def _convert_to_boolean(self, value):
		return value.lower() in ("1", "yes", "true", "on")
def getCombOption(self, optname):
"""Get combined definition option (as string) using pre-set and init
options as preselection (values with higher precedence as specified in section).
Can be used only after calling of getOptions.
"""
try:
return self._defCache[optname]
except KeyError:
try:
v = self.get("Definition", optname, vars=self._pOpts)
except (NoSectionError, NoOptionError, ValueError):
v = None
self._defCache[optname] = v
return v
	def getCombined(self, ignore=()):
		combinedopts = self._opts
		ignore = set(ignore).copy()
@ -338,7 +356,8 @@ class DefinitionInitConfigReader(ConfigReader):
				n, cond = cond.groups()
				ignore.add(n)
		# substitute options already specified directly:
-		opts = CommandAction.substituteRecursiveTags(combinedopts, ignore=ignore)
+		opts = substituteRecursiveTags(combinedopts,
+			ignore=ignore, addrepl=self.getCombOption)
		if not opts:
			raise ValueError('recursive tag definitions unable to be resolved')
		return opts

View File

@ -61,7 +61,7 @@ def debuggexURL(sample, regex):
'flavor': 'python' }) 'flavor': 'python' })
return 'https://www.debuggex.com/?' + q return 'https://www.debuggex.com/?' + q
-def output(args):
+def output(args): # pragma: no cover (overriden in test-cases)
	print(args)

def shortstr(s, l=53):
@ -235,7 +235,7 @@ class Fail2banRegex(object):
		else:
			self._maxlines = 20
		if opts.journalmatch is not None:
-			self.setJournalMatch(opts.journalmatch.split())
+			self.setJournalMatch(shlex.split(opts.journalmatch))
		if opts.datepattern:
			self.setDatePattern(opts.datepattern)
		if opts.usedns:
@ -243,6 +243,7 @@ class Fail2banRegex(object):
		self._filter.returnRawHost = opts.raw
		self._filter.checkFindTime = False
		self._filter.checkAllRegex = True
+		self._opts = opts

	def decode_line(self, line):
		return FileContainer.decode_line('<LOG>', self._encoding, line)
@ -265,23 +266,22 @@ class Fail2banRegex(object):
output( "Use maxlines : %d" % self._filter.getMaxLines() ) output( "Use maxlines : %d" % self._filter.getMaxLines() )
def setJournalMatch(self, v): def setJournalMatch(self, v):
if self._journalmatch is None:
self._journalmatch = v self._journalmatch = v
	def readRegex(self, value, regextype):
		assert(regextype in ('fail', 'ignore'))
		regex = regextype + 'regex'
-		if os.path.isfile(value) or os.path.isfile(value + '.conf'):
+		if regextype == 'fail' and (os.path.isfile(value) or os.path.isfile(value + '.conf')):
			if os.path.basename(os.path.dirname(value)) == 'filter.d':
				## within filter.d folder - use standard loading algorithm to load filter completely (with .local etc.):
				basedir = os.path.dirname(os.path.dirname(value))
				value = os.path.splitext(os.path.basename(value))[0]
				output( "Use %11s filter file : %s, basedir: %s" % (regex, value, basedir) )
				reader = FilterReader(value, 'fail2ban-regex-jail', {}, share_config=self.share_config, basedir=basedir)
-				if not reader.read():
+				if not reader.read(): # pragma: no cover
					output( "ERROR: failed to load filter %s" % value )
					return False
-			else:
+			else: # pragma: no cover
				## foreign file - readexplicit this file and includes if possible:
				output( "Use %11s file : %s" % (regex, value) )
				reader = FilterReader(value, 'fail2ban-regex-jail', {}, share_config=self.share_config)
@ -291,38 +291,51 @@ class Fail2banRegex(object):
return False return False
reader.getOptions(None) reader.getOptions(None)
readercommands = reader.convert() readercommands = reader.convert()
regex_values = [
RegexStat(m[3]) regex_values = {}
for m in filter( for opt in readercommands:
lambda x: x[0] == 'set' and x[2] == "add%sregex" % regextype, if opt[0] == 'multi-set':
readercommands) optval = opt[3]
] + [ elif opt[0] == 'set':
RegexStat(m) optval = opt[3:]
for mm in filter( else: # pragma: no cover
lambda x: x[0] == 'multi-set' and x[2] == "add%sregex" % regextype, continue
readercommands)
for m in mm[3]
]
# Read out and set possible value of maxlines
for command in readercommands:
if command[2] == "maxlines":
maxlines = int(command[3])
try: try:
self.setMaxLines(maxlines) if opt[2] == "prefregex":
except ValueError: for optval in optval:
output( "ERROR: Invalid value for maxlines (%(maxlines)r) " \ self._filter.prefRegex = optval
"read from %(value)s" % locals() ) elif opt[2] == "addfailregex":
stor = regex_values.get('fail')
if not stor: stor = regex_values['fail'] = list()
for optval in optval:
stor.append(RegexStat(optval))
#self._filter.addFailRegex(optval)
elif opt[2] == "addignoreregex":
stor = regex_values.get('ignore')
if not stor: stor = regex_values['ignore'] = list()
for optval in optval:
stor.append(RegexStat(optval))
#self._filter.addIgnoreRegex(optval)
elif opt[2] == "maxlines":
for optval in optval:
self.setMaxLines(optval)
elif opt[2] == "datepattern":
for optval in optval:
self.setDatePattern(optval)
elif opt[2] == "addjournalmatch": # pragma: no cover
if self._opts.journalmatch is None:
self.setJournalMatch(optval)
except ValueError as e: # pragma: no cover
output( "ERROR: Invalid value for %s (%r) " \
"read from %s: %s" % (opt[2], optval, value, e) )
return False return False
elif command[2] == 'addjournalmatch':
journalmatch = command[3:]
self.setJournalMatch(journalmatch)
elif command[2] == 'datepattern':
datepattern = command[3]
self.setDatePattern(datepattern)
		else:
			output( "Use %11s line : %s" % (regex, shortstr(value)) )
-			regex_values = [RegexStat(value)]
+			regex_values = {regextype: [RegexStat(value)]}

+		for regextype, regex_values in regex_values.iteritems():
+			regex = regextype + 'regex'
			setattr(self, "_" + regex, regex_values)
			for regex in regex_values:
				getattr(
@ -337,7 +350,7 @@ class Fail2banRegex(object):
			if ret is not None:
				found = True
				regex = self._ignoreregex[ret].inc()
-		except RegexException as e:
+		except RegexException as e: # pragma: no cover
			output( 'ERROR: %s' % e )
			return False
		return found
@ -355,7 +368,7 @@ class Fail2banRegex(object):
				regex = self._failregex[match[0]]
				regex.inc()
				regex.appendIP(match)
-		except RegexException as e:
+		except RegexException as e: # pragma: no cover
			output( 'ERROR: %s' % e )
			return False
		for bufLine in orgLineBuffer[int(fullBuffer):]:
@ -502,14 +515,14 @@ class Fail2banRegex(object):
		for line in hdlr:
			yield self.decode_line(line)

-	def start(self, opts, args):
+	def start(self, args):

		cmd_log, cmd_regex = args[:2]

		try:
-			if not self.readRegex(cmd_regex, 'fail'):
+			if not self.readRegex(cmd_regex, 'fail'): # pragma: no cover
				return False
-			if len(args) == 3 and not self.readRegex(args[2], 'ignore'):
+			if len(args) == 3 and not self.readRegex(args[2], 'ignore'): # pragma: no cover
				return False
		except RegexException as e:
			output( 'ERROR: %s' % e )
@ -521,7 +534,7 @@ class Fail2banRegex(object):
output( "Use log file : %s" % cmd_log ) output( "Use log file : %s" % cmd_log )
output( "Use encoding : %s" % self._encoding ) output( "Use encoding : %s" % self._encoding )
test_lines = self.file_lines_gen(hdlr) test_lines = self.file_lines_gen(hdlr)
except IOError as e: except IOError as e: # pragma: no cover
output( e ) output( e )
return False return False
elif cmd_log.startswith("systemd-journal"): # pragma: no cover elif cmd_log.startswith("systemd-journal"): # pragma: no cover
@ -595,5 +608,5 @@ def exec_command_line(*args):
	logSys.addHandler(stdout)

	fail2banRegex = Fail2banRegex(opts)
-	if not fail2banRegex.start(opts, args):
+	if not fail2banRegex.start(args):
		sys.exit(-1)

View File

@ -37,6 +37,7 @@ logSys = getLogger(__name__)
class FilterReader(DefinitionInitConfigReader):

	_configOpts = {
+		"prefregex": ["string", None],
		"ignoreregex": ["string", None],
		"failregex": ["string", ""],
		"maxlines": ["int", None],
@ -68,12 +69,11 @@ class FilterReader(DefinitionInitConfigReader):
stream.append(["multi-set", self._jailName, "add" + opt, multi]) stream.append(["multi-set", self._jailName, "add" + opt, multi])
elif len(multi): elif len(multi):
stream.append(["set", self._jailName, "add" + opt, multi[0]]) stream.append(["set", self._jailName, "add" + opt, multi[0]])
elif opt == 'maxlines': elif opt in ('maxlines', 'prefregex'):
# We warn when multiline regex is used without maxlines > 1 # Be sure we set this options first.
# therefore keep sure we set this option first. stream.insert(0, ["set", self._jailName, opt, value])
stream.insert(0, ["set", self._jailName, "maxlines", value]) elif opt in ('datepattern'):
elif opt == 'datepattern': stream.append(["set", self._jailName, opt, value])
stream.append(["set", self._jailName, "datepattern", value])
# Do not send a command if the match is empty. # Do not send a command if the match is empty.
elif opt == 'journalmatch': elif opt == 'journalmatch':
if value is None: continue if value is None: continue

View File

@ -200,6 +200,108 @@ else:
				raise
		return uni_decode(x, enc, 'replace')
#
# Following facilities used for safe recursive interpolation of
# tags (<tag>) in tagged options.
#
# max tag replacement count:
MAX_TAG_REPLACE_COUNT = 10
# compiled RE for tag name (replacement name)
TAG_CRE = re.compile(r'<([^ <>]+)>')
def substituteRecursiveTags(inptags, conditional='',
ignore=(), addrepl=None
):
"""Sort out tag definitions within other tags.
Since v.0.9.2 supports embedded interpolation (see test cases for examples).
so: becomes:
a = 3 a = 3
b = <a>_3 b = 3_3
Parameters
----------
inptags : dict
Dictionary of tags(keys) and their values.
Returns
-------
dict
Dictionary of tags(keys) and their values, with tags
within the values recursively replaced.
"""
#logSys = getLogger("fail2ban")
tre_search = TAG_CRE.search
# copy return tags dict to prevent modifying of inptags:
tags = inptags.copy()
# init:
ignore = set(ignore)
done = set()
# repeat substitution while embedded-recursive (repFlag is True)
while True:
repFlag = False
# substitute each value:
for tag in tags.iterkeys():
# ignore escaped or already done (or in ignore list):
if tag in ignore or tag in done: continue
value = orgval = str(tags[tag])
# search and replace all tags within value, that can be interpolated using other tags:
m = tre_search(value)
refCounts = {}
#logSys.log(5, 'TAG: %s, value: %s' % (tag, value))
while m:
# found replacement tag:
rtag = m.group(1)
# don't replace tags that should be currently ignored (pre-replacement):
if rtag in ignore:
m = tre_search(value, m.end())
continue
#logSys.log(5, 'found: %s' % rtag)
if rtag == tag or refCounts.get(rtag, 1) > MAX_TAG_REPLACE_COUNT:
# recursive definitions are bad
#logSys.log(5, 'recursion fail tag: %s value: %s' % (tag, value) )
raise ValueError(
"properties contain self referencing definitions "
"and cannot be resolved, fail tag: %s, found: %s in %s, value: %s" %
(tag, rtag, refCounts, value))
repl = None
if conditional:
repl = tags.get(rtag + '?' + conditional)
if repl is None:
repl = tags.get(rtag)
# try to find tag using additional replacement (callable):
if repl is None and addrepl is not None:
repl = addrepl(rtag)
if repl is None:
# Missing tags - just continue on searching after end of match
# Missing tags are ok - cInfo can contain aInfo elements like <HOST> and valid shell
# constructs like <STDIN>.
m = tre_search(value, m.end())
continue
value = value.replace('<%s>' % rtag, repl)
#logSys.log(5, 'value now: %s' % value)
# increment reference count:
refCounts[rtag] = refCounts.get(rtag, 0) + 1
# the next match for replace:
m = tre_search(value, m.start())
#logSys.log(5, 'TAG: %s, newvalue: %s' % (tag, value))
# was substituted?
if orgval != value:
# check still contains any tag - should be repeated (possible embedded-recursive substitution):
if tre_search(value):
repFlag = True
tags[tag] = value
# no more sub tags (and no possible composite), add this tag to done set (just to be faster):
if '<' not in value: done.add(tag)
# stop interpolation, if no replacements anymore:
if not repFlag:
break
return tags
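
A brief usage sketch of the function above, assuming a Python 2 environment in which this fail2ban tree is importable; the tag names are made up:

```python
from fail2ban.helpers import substituteRecursiveTags

# tags referencing other tags are resolved recursively:
tags = {
    "knocking_url": "/knocking/",
    "fwcmd": "iptables",
    "blocktype": "REJECT",
    "rule": "<fwcmd> -j <blocktype>  # hit <knocking_url>",
}
print(substituteRecursiveTags(tags)["rule"])
# iptables -j REJECT  # hit /knocking/

# addrepl supplies values for tags not present in the dict itself:
tags2 = {"cmd": "ban <ip>"}
print(substituteRecursiveTags(
    tags2, addrepl=lambda tag: "192.0.2.7" if tag == "ip" else None)["cmd"])
# ban 192.0.2.7
```
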
class BgService(object):
	"""Background servicing

View File

@ -32,10 +32,11 @@ import time
from abc import ABCMeta
from collections import MutableMapping

+from .failregex import mapTag2Opt
from .ipdns import asip
from .mytime import MyTime
from .utils import Utils
-from ..helpers import getLogger
+from ..helpers import getLogger, substituteRecursiveTags, TAG_CRE, MAX_TAG_REPLACE_COUNT

# Gets the instance of the logger.
logSys = getLogger(__name__)
@ -46,14 +47,17 @@ _cmd_lock = threading.Lock()
# Todo: make it configurable resp. automatically set, ex.: `[ -f /proc/net/if_inet6 ] && echo 'yes' || echo 'no'`:
allowed_ipv6 = True

-# max tag replacement count:
-MAX_TAG_REPLACE_COUNT = 10
-# compiled RE for tag name (replacement name)
-TAG_CRE = re.compile(r'<([^ <>]+)>')
+# capture groups from filter for map to ticket data:
+FCUSTAG_CRE = re.compile(r'<F-([A-Z0-9_\-]+)>'); # currently uppercase only
+
+# New line, space
+ADD_REPL_TAGS = {
+	"br": "\n",
+	"sp": " "
+}

-class CallingMap(MutableMapping):
+class CallingMap(MutableMapping, object):
	"""A Mapping type which returns the result of callable values.

	`CallingMap` behaves similar to a standard python dictionary,
@ -70,23 +74,64 @@ class CallingMap(MutableMapping):
	The dictionary data which can be accessed to obtain items uncalled
	"""
+	# immutable=True saves content between actions, without interim copying (save original on demand, recoverable via reset)
+	__slots__ = ('data', 'storage', 'immutable', '__org_data')

	def __init__(self, *args, **kwargs):
+		self.storage = dict()
+		self.immutable = True
		self.data = dict(*args, **kwargs)
def reset(self, immutable=True):
self.storage = dict()
try:
self.data = self.__org_data
except AttributeError:
pass
self.immutable = immutable
	def __repr__(self):
-		return "%s(%r)" % (self.__class__.__name__, self.data)
+		return "%s(%r)" % (self.__class__.__name__, self._asdict())
def _asdict(self):
try:
return dict(self)
except:
return dict(self.data, **self.storage)
	def __getitem__(self, key):
-		value = self.data[key]
-		if callable(value):
-			return value()
-		else:
-			return value
+		try:
+			value = self.storage[key]
+		except KeyError:
+			value = self.data[key]
+			if callable(value):
+				# check arguments can be supplied to callable (for backwards compatibility):
+				value = value(self) if hasattr(value, '__code__') and value.__code__.co_argcount else value()
+			self.storage[key] = value
+		return value
	def __setitem__(self, key, value):
-		self.data[key] = value
+		# mutate to copy:
+		if self.immutable:
+			self.storage = self.storage.copy()
+			self.__org_data = self.data
+			self.data = self.data.copy()
+			self.immutable = False
+		self.storage[key] = value
def __unavailable(self, key):
raise KeyError("Key %r was deleted" % key)
	def __delitem__(self, key):
+		# mutate to copy:
+		if self.immutable:
+			self.storage = self.storage.copy()
+			self.__org_data = self.data
+			self.data = self.data.copy()
+			self.immutable = False
+		try:
+			del self.storage[key]
+		except KeyError:
+			pass
		del self.data[key]

	def __iter__(self):
@ -95,7 +140,7 @@ class CallingMap(MutableMapping):
	def __len__(self):
		return len(self.data)

-	def copy(self):
+	def copy(self): # pargma: no cover
		return self.__class__(self.data.copy())
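
A small usage sketch of the lazy, caching lookup added to `CallingMap` above, assuming a Python 2 environment with this fail2ban tree importable; the keys and the counter are only there to show that a callable value runs once and is then served from `storage`:

```python
from fail2ban.server.action import CallingMap   # module path as in this diff

calls = {"n": 0}
def expensive():
    calls["n"] += 1
    return "computed"

m = CallingMap(host="192.0.2.1", info=expensive)
print(m["info"])   # computed
print(m["info"])   # computed (cached; the callable is not invoked again)
print(calls["n"])  # 1

m["info"] = "overridden"   # mutating copies the data first (immutable=True by default)
print(m["info"])           # overridden
```
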
@ -259,6 +304,16 @@ class CommandAction(ActionBase):
			# set:
			self.__dict__[name] = value
def __delattr__(self, name):
if not name.startswith('_'):
# parameters changed - clear properties and substitution cache:
self.__properties = None
self.__substCache.clear()
#self._logSys.debug("Unset action %r %s", self._name, name)
self._logSys.debug(" Unset %s", name)
# del:
del self.__dict__[name]
	@property
	def _properties(self):
		"""A dictionary of the actions properties.
@ -364,88 +419,6 @@ class CommandAction(ActionBase):
""" """
return self._executeOperation('<actionreload>', 'reloading') return self._executeOperation('<actionreload>', 'reloading')
@classmethod
def substituteRecursiveTags(cls, inptags, conditional='', ignore=()):
"""Sort out tag definitions within other tags.
Since v.0.9.2 supports embedded interpolation (see test cases for examples).
so: becomes:
a = 3 a = 3
b = <a>_3 b = 3_3
Parameters
----------
inptags : dict
Dictionary of tags(keys) and their values.
Returns
-------
dict
Dictionary of tags(keys) and their values, with tags
within the values recursively replaced.
"""
# copy return tags dict to prevent modifying of inptags:
tags = inptags.copy()
t = TAG_CRE
ignore = set(ignore)
done = cls._escapedTags.copy() | ignore
# repeat substitution while embedded-recursive (repFlag is True)
while True:
repFlag = False
# substitute each value:
for tag in tags.iterkeys():
# ignore escaped or already done (or in ignore list):
if tag in done: continue
value = orgval = str(tags[tag])
# search and replace all tags within value, that can be interpolated using other tags:
m = t.search(value)
refCounts = {}
#logSys.log(5, 'TAG: %s, value: %s' % (tag, value))
while m:
found_tag = m.group(1)
# don't replace tags that should be currently ignored (pre-replacement):
if found_tag in ignore:
m = t.search(value, m.end())
continue
#logSys.log(5, 'found: %s' % found_tag)
if found_tag == tag or refCounts.get(found_tag, 1) > MAX_TAG_REPLACE_COUNT:
# recursive definitions are bad
#logSys.log(5, 'recursion fail tag: %s value: %s' % (tag, value) )
raise ValueError(
"properties contain self referencing definitions "
"and cannot be resolved, fail tag: %s, found: %s in %s, value: %s" %
(tag, found_tag, refCounts, value))
repl = None
if found_tag not in cls._escapedTags:
repl = tags.get(found_tag + '?' + conditional)
if repl is None:
repl = tags.get(found_tag)
if repl is None:
# Escaped or missing tags - just continue on searching after end of match
# Missing tags are ok - cInfo can contain aInfo elements like <HOST> and valid shell
# constructs like <STDIN>.
m = t.search(value, m.end())
continue
value = value.replace('<%s>' % found_tag, repl)
#logSys.log(5, 'value now: %s' % value)
# increment reference count:
refCounts[found_tag] = refCounts.get(found_tag, 0) + 1
# the next match for replace:
m = t.search(value, m.start())
#logSys.log(5, 'TAG: %s, newvalue: %s' % (tag, value))
# was substituted?
if orgval != value:
# check still contains any tag - should be repeated (possible embedded-recursive substitution):
if t.search(value):
repFlag = True
tags[tag] = value
# no more sub tags (and no possible composite), add this tag to done set (just to be faster):
if '<' not in value: done.add(tag)
# stop interpolation, if no replacements anymore:
if not repFlag:
break
return tags
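The class method removed above is replaced by a module-level `substituteRecursiveTags` helper (used at the call site further below and imported by the tests). As a reminder of what it does, here is a minimal sketch of the idea only, not the fail2ban implementation, with much simpler cycle handling:

import re

TAG = re.compile(r'<([\w\-.]+)>')

def subst_recursive(tags, max_passes=25):
    # repeat substitution until no value changes anymore (embedded/recursive case)
    tags = dict(tags)
    for _ in range(max_passes):
        changed = False
        for name, value in tags.items():
            new = TAG.sub(lambda m: str(tags.get(m.group(1), m.group(0))), str(value))
            if new != value:
                tags[name] = new
                changed = True
        if not changed:
            return tags
    raise ValueError("possible self referencing definitions")

# e.g. {'a': '3', 'b': '<a>_3'} resolves to {'a': '3', 'b': '3_3'}
print(subst_recursive({'a': '3', 'b': '<a>_3'}))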
@staticmethod @staticmethod
def escapeTag(value): def escapeTag(value):
"""Escape characters which may be used for command injection. """Escape characters which may be used for command injection.
@ -488,33 +461,73 @@ class CommandAction(ActionBase):
str str
`query` string with tags replaced. `query` string with tags replaced.
""" """
if '<' not in query: return query
# use cache if allowed: # use cache if allowed:
if cache is not None: if cache is not None:
ckey = (query, conditional) ckey = (query, conditional)
string = cache.get(ckey) try:
if string is not None: return cache[ckey]
return string except KeyError:
# replace: pass
string = query
aInfo = cls.substituteRecursiveTags(aInfo, conditional) # first try get cached tags dictionary:
for tag in aInfo: subInfo = csubkey = None
if "<%s>" % tag in query: if cache is not None:
value = aInfo.get(tag + '?' + conditional) csubkey = ('subst-tags', id(aInfo), conditional)
try:
subInfo = cache[csubkey]
except KeyError:
pass
# interpolation of dictionary:
if subInfo is None:
subInfo = substituteRecursiveTags(aInfo, conditional, ignore=cls._escapedTags)
# cache if possible:
if csubkey is not None:
cache[csubkey] = subInfo
# substitution callable, used by interpolation of each tag
repeatSubst = {0: 0}
def substVal(m):
tag = m.group(1) # tagname from match
value = None
if conditional:
value = subInfo.get(tag + '?' + conditional)
if value is None: if value is None:
value = aInfo.get(tag) value = subInfo.get(tag)
if value is None:
# fallback (no or default replacement)
return ADD_REPL_TAGS.get(tag, m.group())
value = str(value) # assure string value = str(value) # assure string
if tag in cls._escapedTags: if tag in cls._escapedTags:
# That one needs to be escaped since its content is # That one needs to be escaped since its content is
# out of our control # out of our control
value = cls.escapeTag(value) value = cls.escapeTag(value)
string = string.replace('<' + tag + '>', value) # possible contains tags:
# New line, space if '<' in value:
string = reduce(lambda s, kv: s.replace(*kv), (("<br>", '\n'), ("<sp>", " ")), string) repeatSubst[0] = 1
# cache if properties: return value
# interpolation of query:
count = MAX_TAG_REPLACE_COUNT + 1
while True:
repeatSubst[0] = 0
value = TAG_CRE.sub(substVal, query)
# possible recursion ?
if not repeatSubst or value == query: break
query = value
count -= 1
if count <= 0:
raise ValueError(
"unexpected too long replacement interpolation, "
"possible self referencing definitions in query: %s" % (query,))
# cache if possible:
if cache is not None: if cache is not None:
cache[ckey] = string cache[ckey] = value
# #
return string return value
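The rewritten `replaceTag` above resolves every `<tag>` in one `re.sub` pass with a callable and caches both the interpolated tag dictionary and the final string. A hedged sketch of that pattern (names such as `render` are illustrative, not the fail2ban API):

import re

TAG_CRE = re.compile(r'<([\w\-.]+)>')

def render(query, info, cache=None):
    # return cached result for identical queries
    if cache is not None and query in cache:
        return cache[query]
    def subst(m):
        val = info.get(m.group(1))
        return str(val) if val is not None else m.group(0)  # keep unknown tags as-is
    out = TAG_CRE.sub(subst, query)
    if cache is not None:
        cache[query] = out
    return out

cache = {}
print(render("ban <ip> after <failures> failures", {'ip': '192.0.2.1', 'failures': 3}, cache))
# a second call with the same query is served from the cache
print(render("ban <ip> after <failures> failures", {'ip': '192.0.2.1', 'failures': 3}, cache))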
def _processCmd(self, cmd, aInfo=None, conditional=''): def _processCmd(self, cmd, aInfo=None, conditional=''):
"""Executes a command with preliminary checks and substitutions. """Executes a command with preliminary checks and substitutions.
@ -580,9 +593,21 @@ class CommandAction(ActionBase):
realCmd = self.replaceTag(cmd, self._properties, realCmd = self.replaceTag(cmd, self._properties,
conditional=conditional, cache=self.__substCache) conditional=conditional, cache=self.__substCache)
# Replace tags # Replace dynamical tags (don't use cache here)
if aInfo is not None: if aInfo is not None:
realCmd = self.replaceTag(realCmd, aInfo, conditional=conditional) realCmd = self.replaceTag(realCmd, aInfo, conditional=conditional)
# Replace ticket options (filter capture groups) non-recursive:
if '<' in realCmd:
tickData = aInfo.get("F-*")
if not tickData: tickData = {}
def substTag(m):
tn = mapTag2Opt(m.groups()[0])
try:
return str(tickData[tn])
except KeyError:
return ""
realCmd = FCUSTAG_CRE.sub(substTag, realCmd)
else: else:
realCmd = cmd realCmd = cmd
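The `FCUSTAG_CRE` substitution above fills `<F-...>` tags from the ticket's captured filter groups without recursion. A small illustrative sketch (the regex and the `tick_data` layout here are assumptions, not the exact fail2ban structures):

import re

FCUSTAG = re.compile(r'<F-([A-Z0-9_\-]+)>')
R_MAP = {"ID": "fid", "PORT": "fport"}   # custom tag name -> stored option name

def subst_ticket_tags(cmd, tick_data):
    def subst(m):
        name = R_MAP.get(m.group(1), m.group(1).lower())
        return str(tick_data.get(name, ""))  # missing data becomes an empty string
    return FCUSTAG.sub(subst, cmd)

print(subst_ticket_tags("echo 'failure <F-ID> of <F-USER> on port <F-PORT>'",
                        {"fid": 111, "user": "tester", "fport": 222}))
# -> echo 'failure 111 of tester on port 222'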

View File

@ -287,7 +287,39 @@ class Actions(JailThread, Mapping):
self.stopActions() self.stopActions()
return True return True
def __getBansMerged(self, mi, overalljails=False): class ActionInfo(CallingMap):
AI_DICT = {
"ip": lambda self: self.__ticket.getIP(),
"ip-rev": lambda self: self['ip'].getPTR(''),
"fid": lambda self: self.__ticket.getID(),
"failures": lambda self: self.__ticket.getAttempt(),
"time": lambda self: self.__ticket.getTime(),
"matches": lambda self: "\n".join(self.__ticket.getMatches()),
# to bypass actions, that should not be executed for restored tickets
"restored": lambda self: (1 if self.__ticket.restored else 0),
# extra-interpolation - all match-tags (captured from the filter):
"F-*": lambda self, tag=None: self.__ticket.getData(tag),
# merged info:
"ipmatches": lambda self: "\n".join(self._mi4ip(True).getMatches()),
"ipjailmatches": lambda self: "\n".join(self._mi4ip().getMatches()),
"ipfailures": lambda self: self._mi4ip(True).getAttempt(),
"ipjailfailures": lambda self: self._mi4ip().getAttempt(),
}
__slots__ = CallingMap.__slots__ + ('__ticket', '__jail', '__mi4ip')
def __init__(self, ticket, jail=None, immutable=True, data=AI_DICT):
self.__ticket = ticket
self.__jail = jail
self.storage = dict()
self.immutable = immutable
self.data = data
def copy(self): # pragma: no cover
return self.__class__(self.__ticket, self.__jail, self.immutable, self.data.copy())
def _mi4ip(self, overalljails=False):
"""Gets bans merged once, a helper for lambda(s), prevents stop of executing action by any exception inside. """Gets bans merged once, a helper for lambda(s), prevents stop of executing action by any exception inside.
This function never returns None for ainfo lambdas - always a ticket (merged or single one) This function never returns None for ainfo lambdas - always a ticket (merged or single one)
@ -296,8 +328,6 @@ class Actions(JailThread, Mapping):
Parameters Parameters
---------- ----------
mi : dict
merge info, initial for lambda should contains {ip, ticket}
overalljails : bool overalljails : bool
switch to get a merged bans : switch to get a merged bans :
False - (default) bans merged for current jail only False - (default) bans merged for current jail only
@ -308,13 +338,18 @@ class Actions(JailThread, Mapping):
BanTicket BanTicket
merged or self ticket only merged or self ticket only
""" """
if not hasattr(self, '__mi4ip'):
self.__mi4ip = {}
mi = self.__mi4ip
idx = 'all' if overalljails else 'jail' idx = 'all' if overalljails else 'jail'
if idx in mi: if idx in mi:
return mi[idx] if mi[idx] is not None else mi['ticket'] return mi[idx] if mi[idx] is not None else self.__ticket
try: try:
jail=self._jail jail = self.__jail
ip=mi['ip'] ip = self['ip']
mi[idx] = None mi[idx] = None
if not jail.database: # pragma: no cover
return self.__ticket
if overalljails: if overalljails:
mi[idx] = jail.database.getBansMerged(ip=ip) mi[idx] = jail.database.getBansMerged(ip=ip)
else: else:
@ -324,7 +359,14 @@ class Actions(JailThread, Mapping):
"Failed to get %s bans merged, jail '%s': %s", "Failed to get %s bans merged, jail '%s': %s",
idx, jail.name, e, idx, jail.name, e,
exc_info=logSys.getEffectiveLevel()<=logging.DEBUG) exc_info=logSys.getEffectiveLevel()<=logging.DEBUG)
return mi[idx] if mi[idx] is not None else mi['ticket'] return mi[idx] if mi[idx] is not None else self.__ticket
def __getActionInfo(self, ticket):
ip = ticket.getIP()
aInfo = Actions.ActionInfo(ticket, self._jail)
return aInfo
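`ActionInfo` builds on `CallingMap`: values may be callables that receive the map itself and are evaluated lazily on access, so expensive lookups (database merges via `_mi4ip`, the PTR reversal for `ip-rev`) only run when an action actually references the tag. A rough sketch of that behaviour, without the immutability/reset handling of the real class:

class LazyMap(dict):
    # evaluate callable values on access, passing the map itself
    def __getitem__(self, key):
        value = dict.__getitem__(self, key)
        return value(self) if callable(value) else value

ticket = {"ip": "192.0.2.1", "attempts": 3}
info = LazyMap({
    "ip": lambda self: ticket["ip"],
    "failures": lambda self: ticket["attempts"],
    "summary": lambda self: "%s (%d failures)" % (self["ip"], self["failures"]),
})
print(info["summary"])   # computed on demand -> "192.0.2.1 (3 failures)"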
def __checkBan(self): def __checkBan(self):
"""Check for IP address to ban. """Check for IP address to ban.
@ -342,7 +384,6 @@ class Actions(JailThread, Mapping):
ticket = self._jail.getFailTicket() ticket = self._jail.getFailTicket()
if not ticket: if not ticket:
break break
aInfo = CallingMap()
bTicket = BanManager.createBanTicket(ticket) bTicket = BanManager.createBanTicket(ticket)
btime = ticket.getBanTime() btime = ticket.getBanTime()
if btime is not None: if btime is not None:
@ -353,20 +394,7 @@ class Actions(JailThread, Mapping):
if ticket.restored: if ticket.restored:
bTicket.restored = True bTicket.restored = True
ip = bTicket.getIP() ip = bTicket.getIP()
aInfo["ip"] = ip aInfo = self.__getActionInfo(bTicket)
aInfo["failures"] = bTicket.getAttempt()
aInfo["time"] = bTicket.getTime()
aInfo["matches"] = "\n".join(bTicket.getMatches())
# to bypass actions, that should not be executed for restored tickets
aInfo["restored"] = 1 if ticket.restored else 0
# retarded merge info via twice lambdas : once for merge, once for matches/failures:
if self._jail.database is not None:
mi4ip = lambda overalljails=False, self=self, \
mi={'ip':ip, 'ticket':bTicket}: self.__getBansMerged(mi, overalljails)
aInfo["ipmatches"] = lambda: "\n".join(mi4ip(True).getMatches())
aInfo["ipjailmatches"] = lambda: "\n".join(mi4ip().getMatches())
aInfo["ipfailures"] = lambda: mi4ip(True).getAttempt()
aInfo["ipjailfailures"] = lambda: mi4ip().getAttempt()
reason = {} reason = {}
if self.__banManager.addBanTicket(bTicket, reason=reason): if self.__banManager.addBanTicket(bTicket, reason=reason):
cnt += 1 cnt += 1
@ -379,7 +407,8 @@ class Actions(JailThread, Mapping):
try: try:
if ticket.restored and getattr(action, 'norestored', False): if ticket.restored and getattr(action, 'norestored', False):
continue continue
action.ban(aInfo.copy()) if not aInfo.immutable: aInfo.reset()
action.ban(aInfo)
except Exception as e: except Exception as e:
logSys.error( logSys.error(
"Failed to execute ban jail '%s' action '%s' " "Failed to execute ban jail '%s' action '%s' "
@ -465,21 +494,17 @@ class Actions(JailThread, Mapping):
unbactions = self._actions unbactions = self._actions
else: else:
unbactions = actions unbactions = actions
aInfo = dict() ip = ticket.getIP()
aInfo["ip"] = ticket.getIP() aInfo = self.__getActionInfo(ticket)
aInfo["failures"] = ticket.getAttempt()
aInfo["time"] = ticket.getTime()
aInfo["matches"] = "".join(ticket.getMatches())
# to bypass actions, that should not be executed for restored tickets
aInfo["restored"] = 1 if ticket.restored else 0
if actions is None: if actions is None:
logSys.notice("[%s] Unban %s", self._jail.name, aInfo["ip"]) logSys.notice("[%s] Unban %s", self._jail.name, aInfo["ip"])
for name, action in unbactions.iteritems(): for name, action in unbactions.iteritems():
try: try:
if ticket.restored and getattr(action, 'norestored', False): if ticket.restored and getattr(action, 'norestored', False):
continue continue
logSys.debug("[%s] action %r: unban %s", self._jail.name, name, aInfo["ip"]) logSys.debug("[%s] action %r: unban %s", self._jail.name, name, ip)
action.unban(aInfo.copy()) if not aInfo.immutable: aInfo.reset()
action.unban(aInfo)
except Exception as e: except Exception as e:
logSys.error( logSys.error(
"Failed to execute unban jail '%s' action '%s' " "Failed to execute unban jail '%s' action '%s' "

View File

@ -483,7 +483,7 @@ class Fail2BanDb(object):
cur.execute( cur.execute(
"INSERT OR REPLACE INTO bips(ip, jail, timeofban, bantime, bancount, data) VALUES(?, ?, ?, ?, ?, ?)", "INSERT OR REPLACE INTO bips(ip, jail, timeofban, bantime, bancount, data) VALUES(?, ?, ?, ?, ?, ?)",
(ip, jail.name, int(round(ticket.getTime())), ticket.getBanTime(jail.actions.getBanTime()), ticket.getBanCount(), (ip, jail.name, int(round(ticket.getTime())), ticket.getBanTime(jail.actions.getBanTime()), ticket.getBanCount(),
{"matches": ticket.getMatches(), "failures": ticket.getAttempt()})) ticket.getData()))
@commitandrollback @commitandrollback
def delBan(self, cur, jail, ip): def delBan(self, cur, jail, ip):

View File

@ -27,6 +27,68 @@ import sys
from .ipdns import IPAddr from .ipdns import IPAddr
FTAG_CRE = re.compile(r'</?[\w\-]+/?>')
FCUSTNAME_CRE = re.compile(r'^(/?)F-([A-Z0-9_\-]+)$'); # currently uppercase only
R_HOST = [
# separated ipv4:
r"""(?:::f{4,6}:)?(?P<ip4>%s)""" % (IPAddr.IP_4_RE,),
# separated ipv6:
r"""(?P<ip6>%s)""" % (IPAddr.IP_6_RE,),
# place-holder for ipv6 enclosed in optional [] (used in addr-, host-regex)
"",
# separated dns:
r"""(?P<dns>[\w\-.^_]*\w)""",
# place-holder for ADDR tag-replacement (joined):
"",
# place-holder for HOST tag replacement (joined):
""
]
RI_IPV4 = 0
RI_IPV6 = 1
RI_IPV6BR = 2
RI_DNS = 3
RI_ADDR = 4
RI_HOST = 5
R_HOST[RI_IPV6BR] = r"""\[?%s\]?""" % (R_HOST[RI_IPV6],)
R_HOST[RI_ADDR] = "(?:%s)" % ("|".join((R_HOST[RI_IPV4], R_HOST[RI_IPV6BR])),)
R_HOST[RI_HOST] = "(?:%s)" % ("|".join((R_HOST[RI_IPV4], R_HOST[RI_IPV6BR], R_HOST[RI_DNS])),)
RH4TAG = {
# separated ipv4 (self closed, closed):
"IP4": R_HOST[RI_IPV4],
"F-IP4/": R_HOST[RI_IPV4],
# separated ipv6 (self closed, closed):
"IP6": R_HOST[RI_IPV6],
"F-IP6/": R_HOST[RI_IPV6],
# 2 address groups instead of <ADDR> - in contrast to `<HOST>`,
# for separate usage of the 2 address groups only (regardless of `usedns`), `ip4` and `ip6` together
"ADDR": R_HOST[RI_ADDR],
"F-ADDR/": R_HOST[RI_ADDR],
# separated dns (self closed, closed):
"DNS": R_HOST[RI_DNS],
"F-DNS/": R_HOST[RI_DNS],
# default failure-id as no space tag:
"F-ID/": r"""(?P<fid>\S+)""",
# default failure port, like 80 or http :
"F-PORT/": r"""(?P<fport>\w+)""",
}
# default failure groups map for customizable expressions (with different group-id):
R_MAP = {
"ID": "fid",
"PORT": "fport",
}
def mapTag2Opt(tag):
try: # if should be mapped:
return R_MAP[tag]
except KeyError:
return tag.lower()
## ##
# Regular expression class. # Regular expression class.
# #
@ -71,38 +133,44 @@ class Regex:
@staticmethod @staticmethod
def _resolveHostTag(regex, useDns="yes"): def _resolveHostTag(regex, useDns="yes"):
# separated ipv4:
r_host = []
r = r"""(?:::f{4,6}:)?(?P<ip4>%s)""" % (IPAddr.IP_4_RE,)
regex = regex.replace("<IP4>", r); # self closed
regex = regex.replace("<F-IP4/>", r); # closed
r_host.append(r)
# separated ipv6:
r = r"""(?P<ip6>%s)""" % (IPAddr.IP_6_RE,)
regex = regex.replace("<IP6>", r); # self closed
regex = regex.replace("<F-IP6/>", r); # closed
r_host.append(r"""\[?%s\]?""" % (r,)); # enclose ipv6 in optional [] in host-regex
# 2 address groups instead of <ADDR> - in opposition to `<HOST>`,
# for separate usage of 2 address groups only (regardless of `usedns`), `ip4` and `ip6` together
regex = regex.replace("<ADDR>", "(?:%s)" % ("|".join(r_host),))
# separated dns:
r = r"""(?P<dns>[\w\-.^_]*\w)"""
regex = regex.replace("<DNS>", r); # self closed
regex = regex.replace("<F-DNS/>", r); # closed
if useDns not in ("no",):
r_host.append(r)
# 3 groups instead of <HOST> - separated ipv4, ipv6 and host (dns)
regex = regex.replace("<HOST>", "(?:%s)" % ("|".join(r_host),))
# default failure-id as no space tag:
regex = regex.replace("<F-ID/>", r"""(?P<fid>\S+)"""); # closed
# default failure port, like 80 or http :
regex = regex.replace("<F-PORT/>", r"""(?P<port>\w+)"""); # closed
# default failure groups (begin / end tag) for customizable expressions:
for o,r in (('IP4', 'ip4'), ('IP6', 'ip6'), ('DNS', 'dns'), ('ID', 'fid'), ('PORT', 'fport')):
regex = regex.replace("<F-%s>" % o, "(?P<%s>" % r); # open tag
regex = regex.replace("</F-%s>" % o, ")"); # close tag
return regex openTags = dict()
# tag interpolation callable:
def substTag(m):
tag = m.group()
tn = tag[1:-1]
# 3 groups instead of <HOST> - separated ipv4, ipv6 and host (dns)
if tn == "HOST":
return R_HOST[RI_HOST if useDns not in ("no",) else RI_ADDR]
# static replacement from RH4TAG:
try:
return RH4TAG[tn]
except KeyError:
pass
# (begin / end tag) for customizable expressions, additionally used as
# user custom tags (match will be stored in ticket data, can be used in actions):
m = FCUSTNAME_CRE.match(tn)
if m: # match F-...
m = m.groups()
tn = m[1]
# close tag:
if m[0]:
# check it was already open:
if openTags.get(tn):
return ")"
return tag; # tag not opened, use original
# open tag:
openTags[tn] = 1
# if should be mapped:
tn = mapTag2Opt(tn)
return "(?P<%s>" % (tn,)
# original, no replacement:
return tag
# substitute tags:
return FTAG_CRE.sub(substTag, regex)
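`_resolveHostTag` now handles all tags in a single `FTAG_CRE.sub` pass: static tags come from `RH4TAG`, and `<F-NAME>`/`</F-NAME>` pairs open and close named groups. A simplified sketch of that substitution (the address pattern below is deliberately crude, for illustration only):

import re

FTAG = re.compile(r'</?[\w\-]+/?>')
R_ADDR = r'(?:(?P<ip4>\d{1,3}(?:\.\d{1,3}){3})|(?P<dns>[\w\-.]+\w))'

def resolve(regex):
    def subst(m):
        tag = m.group()
        if tag == '<HOST>':
            return R_ADDR
        cm = re.match(r'^<(/?)F-([A-Z0-9_\-]+)>$', tag)
        if cm:   # closing tag -> ')', opening tag -> named group
            return ')' if cm.group(1) else '(?P<%s>' % cm.group(2).lower()
        return tag   # unknown tags kept verbatim
    return FTAG.sub(subst, regex)

rx = re.compile(resolve(r'Failed login for <F-USER>\S+</F-USER> from <HOST>'))
m = rx.search('Failed login for root from 192.0.2.1')
print(m.group('user'), m.group('ip4'))   # -> root 192.0.2.1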
## ##
# Gets the regular expression. # Gets the regular expression.
@ -121,30 +189,35 @@ class Regex:
# method of this object. # method of this object.
# @param a list of tuples. The tuples are ( prematch, datematch, postdatematch ) # @param a list of tuples. The tuples are ( prematch, datematch, postdatematch )
def search(self, tupleLines): def search(self, tupleLines, orgLines=None):
self._matchCache = self._regexObj.search( self._matchCache = self._regexObj.search(
"\n".join("".join(value[::2]) for value in tupleLines) + "\n") "\n".join("".join(value[::2]) for value in tupleLines) + "\n")
if self.hasMatched(): if self._matchCache:
if orgLines is None: orgLines = tupleLines
# if single-line:
if len(orgLines) <= 1:
self._matchedTupleLines = orgLines
self._unmatchedTupleLines = []
else:
# Find start of the first line where the match was found # Find start of the first line where the match was found
try: try:
self._matchLineStart = self._matchCache.string.rindex( matchLineStart = self._matchCache.string.rindex(
"\n", 0, self._matchCache.start() +1 ) + 1 "\n", 0, self._matchCache.start() +1 ) + 1
except ValueError: except ValueError:
self._matchLineStart = 0 matchLineStart = 0
# Find end of the last line where the match was found # Find end of the last line where the match was found
try: try:
self._matchLineEnd = self._matchCache.string.index( matchLineEnd = self._matchCache.string.index(
"\n", self._matchCache.end() - 1) + 1 "\n", self._matchCache.end() - 1) + 1
except ValueError: except ValueError:
self._matchLineEnd = len(self._matchCache.string) matchLineEnd = len(self._matchCache.string)
lineCount1 = self._matchCache.string.count( lineCount1 = self._matchCache.string.count(
"\n", 0, self._matchLineStart) "\n", 0, matchLineStart)
lineCount2 = self._matchCache.string.count( lineCount2 = self._matchCache.string.count(
"\n", 0, self._matchLineEnd) "\n", 0, matchLineEnd)
self._matchedTupleLines = tupleLines[lineCount1:lineCount2] self._matchedTupleLines = orgLines[lineCount1:lineCount2]
self._unmatchedTupleLines = tupleLines[:lineCount1] self._unmatchedTupleLines = orgLines[:lineCount1]
n = 0 n = 0
for skippedLine in self.getSkippedLines(): for skippedLine in self.getSkippedLines():
for m, matchedTupleLine in enumerate( for m, matchedTupleLine in enumerate(
@ -154,7 +227,7 @@ class Regex:
self._matchedTupleLines.pop(n+m)) self._matchedTupleLines.pop(n+m))
n += m n += m
break break
self._unmatchedTupleLines.extend(tupleLines[lineCount2:]) self._unmatchedTupleLines.extend(orgLines[lineCount2:])
# Checks if the previous call to search() matched. # Checks if the previous call to search() matched.
# #
@ -166,6 +239,13 @@ class Regex:
else: else:
return False return False
##
# Returns all matched groups.
#
def getGroups(self):
return self._matchCache.groupdict()
## ##
# Returns skipped lines. # Returns skipped lines.
# #
@ -243,6 +323,10 @@ class RegexException(Exception):
# #
FAILURE_ID_GROPS = ("fid", "ip4", "ip6", "dns") FAILURE_ID_GROPS = ("fid", "ip4", "ip6", "dns")
# Additionally allows a multi-line failure-id (used e.g. to tie a conn-id to a host across lines)
#
FAILURE_ID_PRESENTS = FAILURE_ID_GROPS + ("mlfid",)
## ##
# Regular expression class. # Regular expression class.
# #
@ -257,20 +341,16 @@ class FailRegex(Regex):
# avoid construction of invalid object. # avoid construction of invalid object.
# @param value the regular expression # @param value the regular expression
def __init__(self, regex, **kwargs): def __init__(self, regex, prefRegex=None, **kwargs):
# Initializes the parent. # Initializes the parent.
Regex.__init__(self, regex, **kwargs) Regex.__init__(self, regex, **kwargs)
# Check for group "dns", "ip4", "ip6", "fid" # Check for group "dns", "ip4", "ip6", "fid"
if not [grp for grp in FAILURE_ID_GROPS if grp in self._regexObj.groupindex]: if (not [grp for grp in FAILURE_ID_PRESENTS if grp in self._regexObj.groupindex]
and (prefRegex is None or
not [grp for grp in FAILURE_ID_PRESENTS if grp in prefRegex._regexObj.groupindex])
):
raise RegexException("No failure-id group in '%s'" % self._regex) raise RegexException("No failure-id group in '%s'" % self._regex)
##
# Returns all matched groups.
#
def getGroups(self):
return self._matchCache.groupdict()
## ##
# Returns the matched failure id. # Returns the matched failure id.
# #

View File

@ -39,6 +39,7 @@ from .datedetector import DateDetector
from .mytime import MyTime from .mytime import MyTime
from .failregex import FailRegex, Regex, RegexException from .failregex import FailRegex, Regex, RegexException
from .action import CommandAction from .action import CommandAction
from .utils import Utils
from ..helpers import getLogger, PREFER_ENC from ..helpers import getLogger, PREFER_ENC
# Gets the instance of the logger. # Gets the instance of the logger.
@ -66,6 +67,8 @@ class Filter(JailThread):
self.jail = jail self.jail = jail
## The failures manager. ## The failures manager.
self.failManager = FailManager() self.failManager = FailManager()
## Regular expression pre-filtering matching the failures.
self.__prefRegex = None
## The regular expression list matching the failures. ## The regular expression list matching the failures.
self.__failRegex = list() self.__failRegex = list()
## The regular expression list with expressions to ignore. ## The regular expression list with expressions to ignore.
@ -87,6 +90,8 @@ class Filter(JailThread):
self.__ignoreCommand = False self.__ignoreCommand = False
## Default or preferred encoding (to decode bytes from file or journal): ## Default or preferred encoding (to decode bytes from file or journal):
self.__encoding = PREFER_ENC self.__encoding = PREFER_ENC
## Cache temporarily holds failure info (used by multi-line parsing, e.g. to tie a conn-id to a host):
self.__mlfidCache = None
## Error counter (protected, so can be used in filter implementations) ## Error counter (protected, so can be used in filter implementations)
## if it reached 100 (at once), run-cycle will go idle ## if it reached 100 (at once), run-cycle will go idle
self._errors = 0 self._errors = 0
@ -100,7 +105,7 @@ class Filter(JailThread):
self.ticks = 0 self.ticks = 0
self.dateDetector = DateDetector() self.dateDetector = DateDetector()
logSys.debug("Created %s" % self) logSys.debug("Created %s", self)
def __repr__(self): def __repr__(self):
return "%s(%r)" % (self.__class__.__name__, self.jail) return "%s(%r)" % (self.__class__.__name__, self.jail)
@ -130,6 +135,23 @@ class Filter(JailThread):
self.delLogPath(path) self.delLogPath(path)
delattr(self, '_reload_logs') delattr(self, '_reload_logs')
@property
def mlfidCache(self):
if self.__mlfidCache:
return self.__mlfidCache
self.__mlfidCache = Utils.Cache(maxCount=100, maxTime=5*60)
return self.__mlfidCache
@property
def prefRegex(self):
return self.__prefRegex
@prefRegex.setter
def prefRegex(self, value):
if value:
self.__prefRegex = Regex(value, useDns=self.__useDns)
else:
self.__prefRegex = None
## ##
# Add a regular expression which matches the failure. # Add a regular expression which matches the failure.
# #
@ -139,7 +161,7 @@ class Filter(JailThread):
def addFailRegex(self, value): def addFailRegex(self, value):
try: try:
regex = FailRegex(value, useDns=self.__useDns) regex = FailRegex(value, prefRegex=self.__prefRegex, useDns=self.__useDns)
self.__failRegex.append(regex) self.__failRegex.append(regex)
if "\n" in regex.getRegex() and not self.getMaxLines() > 1: if "\n" in regex.getRegex() and not self.getMaxLines() > 1:
logSys.warning( logSys.warning(
@ -159,7 +181,7 @@ class Filter(JailThread):
del self.__failRegex[index] del self.__failRegex[index]
except IndexError: except IndexError:
logSys.error("Cannot remove regular expression. Index %d is not " logSys.error("Cannot remove regular expression. Index %d is not "
"valid" % index) "valid", index)
## ##
# Get the regular expression which matches the failure. # Get the regular expression which matches the failure.
@ -197,7 +219,7 @@ class Filter(JailThread):
del self.__ignoreRegex[index] del self.__ignoreRegex[index]
except IndexError: except IndexError:
logSys.error("Cannot remove regular expression. Index %d is not " logSys.error("Cannot remove regular expression. Index %d is not "
"valid" % index) "valid", index)
## ##
# Get the regular expression which matches the failure. # Get the regular expression which matches the failure.
@ -220,9 +242,9 @@ class Filter(JailThread):
value = value.lower() # must be a string by now value = value.lower() # must be a string by now
if not (value in ('yes', 'warn', 'no', 'raw')): if not (value in ('yes', 'warn', 'no', 'raw')):
logSys.error("Incorrect value %r specified for usedns. " logSys.error("Incorrect value %r specified for usedns. "
"Using safe 'no'" % (value,)) "Using safe 'no'", value)
value = 'no' value = 'no'
logSys.debug("Setting usedns = %s for %s" % (value, self)) logSys.debug("Setting usedns = %s for %s", value, self)
self.__useDns = value self.__useDns = value
## ##
@ -335,7 +357,7 @@ class Filter(JailThread):
encoding = PREFER_ENC encoding = PREFER_ENC
codecs.lookup(encoding) # Raise LookupError if invalid codec codecs.lookup(encoding) # Raise LookupError if invalid codec
self.__encoding = encoding self.__encoding = encoding
logSys.info(" encoding: %s" % encoding) logSys.info(" encoding: %s", encoding)
return encoding return encoding
## ##
@ -380,7 +402,7 @@ class Filter(JailThread):
if not isinstance(ip, IPAddr): if not isinstance(ip, IPAddr):
ip = IPAddr(ip) ip = IPAddr(ip)
if self.inIgnoreIPList(ip): if self.inIgnoreIPList(ip):
logSys.warning('Requested to manually ban an ignored IP %s. User knows best. Proceeding to ban it.' % ip) logSys.warning('Requested to manually ban an ignored IP %s. User knows best. Proceeding to ban it.', ip)
unixTime = MyTime.time() unixTime = MyTime.time()
self.failManager.addFailure(FailTicket(ip, unixTime), self.failManager.getMaxRetry()) self.failManager.addFailure(FailTicket(ip, unixTime), self.failManager.getMaxRetry())
@ -424,7 +446,7 @@ class Filter(JailThread):
def logIgnoreIp(self, ip, log_ignore, ignore_source="unknown source"): def logIgnoreIp(self, ip, log_ignore, ignore_source="unknown source"):
if log_ignore: if log_ignore:
logSys.info("[%s] Ignore %s by %s" % (self.jailName, ip, ignore_source)) logSys.info("[%s] Ignore %s by %s", self.jailName, ip, ignore_source)
def getIgnoreIP(self): def getIgnoreIP(self):
return self.__ignoreIpList return self.__ignoreIpList
@ -448,7 +470,7 @@ class Filter(JailThread):
if self.__ignoreCommand: if self.__ignoreCommand:
command = CommandAction.replaceTag(self.__ignoreCommand, { 'ip': ip } ) command = CommandAction.replaceTag(self.__ignoreCommand, { 'ip': ip } )
logSys.debug('ignore command: ' + command) logSys.debug('ignore command: %s', command)
ret, ret_ignore = CommandAction.executeCmd(command, success_codes=(0, 1)) ret, ret_ignore = CommandAction.executeCmd(command, success_codes=(0, 1))
ret_ignore = ret and ret_ignore == 0 ret_ignore = ret and ret_ignore == 0
self.logIgnoreIp(ip, log_ignore and ret_ignore, ignore_source="command") self.logIgnoreIp(ip, log_ignore and ret_ignore, ignore_source="command")
@ -487,10 +509,7 @@ class Filter(JailThread):
for element in self.processLine(line, date): for element in self.processLine(line, date):
ip = element[1] ip = element[1]
unixTime = element[2] unixTime = element[2]
lines = element[3] fail = element[3]
fail = {}
if len(element) > 4:
fail = element[4]
logSys.debug("Processing line with time:%s and ip:%s", logSys.debug("Processing line with time:%s and ip:%s",
unixTime, ip) unixTime, ip)
if self.inIgnoreIPList(ip, log_ignore=True): if self.inIgnoreIPList(ip, log_ignore=True):
@ -498,7 +517,7 @@ class Filter(JailThread):
logSys.info( logSys.info(
"[%s] Found %s - %s", self.jailName, ip, datetime.datetime.fromtimestamp(unixTime).strftime("%Y-%m-%d %H:%M:%S") "[%s] Found %s - %s", self.jailName, ip, datetime.datetime.fromtimestamp(unixTime).strftime("%Y-%m-%d %H:%M:%S")
) )
tick = FailTicket(ip, unixTime, lines, data=fail) tick = FailTicket(ip, unixTime, data=fail)
self.failManager.addFailure(tick) self.failManager.addFailure(tick)
# report to observer - failure was found, for possibly increasing of it retry counter (asynchronous) # report to observer - failure was found, for possibly increasing of it retry counter (asynchronous)
if Observers.Main is not None: if Observers.Main is not None:
@ -536,6 +555,29 @@ class Filter(JailThread):
return ignoreRegexIndex return ignoreRegexIndex
return None return None
def _mergeFailure(self, mlfid, fail, failRegex):
mlfidFail = self.mlfidCache.get(mlfid) if self.__mlfidCache else None
if mlfidFail:
mlfidGroups = mlfidFail[1]
# if current line not failure, but previous was failure:
if fail.get('nofail') and not mlfidGroups.get('nofail'):
del fail['nofail'] # remove nofail flag - was already marked as failure
self.mlfidCache.unset(mlfid) # remove cache entry
# if current line is failure, but previous was not:
elif not fail.get('nofail') and mlfidGroups.get('nofail'):
del mlfidGroups['nofail'] # remove nofail flag
self.mlfidCache.unset(mlfid) # remove cache entry
fail2 = mlfidGroups.copy()
fail2.update(fail)
fail2["matches"] = fail.get("matches", []) + failRegex.getMatchedTupleLines()
fail = fail2
elif fail.get('nofail'):
fail["matches"] = failRegex.getMatchedTupleLines()
mlfidFail = [self.__lastDate, fail]
self.mlfidCache.set(mlfid, mlfidFail)
return fail
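`_mergeFailure` lets single-line expressions cooperate on multi-line logs: entries sharing the same `mlfid` (e.g. a connection id) are accumulated in a small cache, so a later line carrying the actual failure can be combined with an earlier `<F-NOFAIL>` line that only supplied the host. A rough sketch of the merge logic (data layout simplified):

mlfid_cache = {}

def merge_failure(mlfid, fail):
    prev = mlfid_cache.get(mlfid)
    if prev is not None:
        merged = dict(prev)
        merged.update(fail)
        if not fail.get('nofail'):
            merged.pop('nofail', None)    # earlier info now completes a real failure
            mlfid_cache.pop(mlfid, None)
        return merged
    if fail.get('nofail'):
        mlfid_cache[mlfid] = fail         # remember e.g. the host for later lines
    return fail

merge_failure('conn-42', {'nofail': True, 'ip4': '192.0.2.1'})
print(merge_failure('conn-42', {'user': 'root'}))   # -> {'ip4': '192.0.2.1', 'user': 'root'}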
## ##
# Finds the failure in a line given split into time and log parts. # Finds the failure in a line given split into time and log parts.
# #
@ -568,7 +610,7 @@ class Filter(JailThread):
dateTimeMatch = self.dateDetector.getTime(timeText, tupleLine[3]) dateTimeMatch = self.dateDetector.getTime(timeText, tupleLine[3])
if dateTimeMatch is None: if dateTimeMatch is None:
logSys.error("findFailure failed to parse timeText: " + timeText) logSys.error("findFailure failed to parse timeText: %s", timeText)
date = self.__lastDate date = self.__lastDate
else: else:
@ -586,14 +628,32 @@ class Filter(JailThread):
date, MyTime.time(), self.getFindTime()) date, MyTime.time(), self.getFindTime())
return failList return failList
self.__lineBuffer = ( if self.__lineBufferSize > 1:
orgBuffer = self.__lineBuffer = (
self.__lineBuffer + [tupleLine[:3]])[-self.__lineBufferSize:] self.__lineBuffer + [tupleLine[:3]])[-self.__lineBufferSize:]
logSys.log(5, "Looking for failregex match of %r" % self.__lineBuffer) else:
orgBuffer = self.__lineBuffer = [tupleLine[:3]]
logSys.log(5, "Looking for failregex match of %r", self.__lineBuffer)
# Pre-filter fail regex (if available):
preGroups = {}
if self.__prefRegex:
self.__prefRegex.search(self.__lineBuffer)
if not self.__prefRegex.hasMatched():
return failList
preGroups = self.__prefRegex.getGroups()
logSys.log(7, "Pre-filter matched %s", preGroups)
repl = preGroups.get('content')
# Content replacement:
if repl:
del preGroups['content']
self.__lineBuffer = [('', '', repl)]
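The new `prefregex` is evaluated once per buffer before the failregex list: if it does not match, the per-regex loop is skipped entirely, and its optional `(?P<content>...)` group replaces the text handed to the failregexes. A hedged sketch of the flow with invented patterns:

import re

prefregex = re.compile(r'^\S+ sshd\[\d+\]: (?P<content>.+)$')
failregex = re.compile(r'^Failed password for (?P<user>\S+) from (?P<ip4>[\d.]+)')

line = 'host1 sshd[123]: Failed password for root from 192.0.2.1 port 2222'
pre = prefregex.search(line)
if pre:                                         # cheap common check first
    m = failregex.search(pre.group('content'))  # failregex sees only the payload
    if m:
        print(m.group('ip4'), m.group('user'))  # -> 192.0.2.1 root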
# Iterates over all the regular expressions. # Iterates over all the regular expressions.
for failRegexIndex, failRegex in enumerate(self.__failRegex): for failRegexIndex, failRegex in enumerate(self.__failRegex):
failRegex.search(self.__lineBuffer) failRegex.search(self.__lineBuffer, orgBuffer)
if failRegex.hasMatched(): if not failRegex.hasMatched():
continue
# The failregex matched. # The failregex matched.
logSys.log(7, "Matched %s", failRegex) logSys.log(7, "Matched %s", failRegex)
# Checks if we must ignore this match. # Checks if we must ignore this match.
@ -614,45 +674,68 @@ class Filter(JailThread):
"If format is complex, please " "If format is complex, please "
"file a detailed issue on" "file a detailed issue on"
" https://github.com/fail2ban/fail2ban/issues " " https://github.com/fail2ban/fail2ban/issues "
"in order to get support for this format." "in order to get support for this format.",
% ("\n".join(failRegex.getMatchedLines()), timeText)) "\n".join(failRegex.getMatchedLines()), timeText)
else: continue
self.__lineBuffer = failRegex.getUnmatchedTupleLines() self.__lineBuffer = failRegex.getUnmatchedTupleLines()
# retrieve failure-id, host, etc from failure match: # retrieve failure-id, host, etc from failure match:
raw = returnRawHost
try: try:
raw = returnRawHost
if preGroups:
fail = preGroups.copy()
fail.update(failRegex.getGroups())
else:
fail = failRegex.getGroups() fail = failRegex.getGroups()
# first check whether this is an mlfid case (connection id cached across multiple lines):
mlfid = fail.get('mlfid')
if mlfid is not None:
fail = self._mergeFailure(mlfid, fail, failRegex)
else:
# matched lines:
fail["matches"] = fail.get("matches", []) + failRegex.getMatchedTupleLines()
# failure-id: # failure-id:
fid = fail.get('fid') fid = fail.get('fid')
# ip-address or host: # ip-address or host:
host = fail.get('ip4') or fail.get('ip6') host = fail.get('ip4')
if host is not None: if host is not None:
cidr = IPAddr.FAM_IPv4
raw = True raw = True
else: else:
host = fail.get('ip6')
if host is not None:
cidr = IPAddr.FAM_IPv6
raw = True
if host is None:
host = fail.get('dns') host = fail.get('dns')
if host is None: if host is None:
# if no failure-id also (obscure case, wrong regex), throw error inside getFailID: # first try to check we have mlfid case (cache connection id):
if fid is None: if fid is None:
if mlfid:
fail = self._mergeFailure(mlfid, fail, failRegex)
else:
# if there is also no failure-id (obscure case, wrong regex), getFailID throws an error:
fid = failRegex.getFailID() fid = failRegex.getFailID()
host = fid host = fid
cidr = IPAddr.CIDR_RAW cidr = IPAddr.CIDR_RAW
# if mlfid case (not failure):
if host is None:
if not self.checkAllRegex: # or fail.get('nofail'):
return failList
ips = [None]
# if raw - add single ip or failure-id, # if raw - add single ip or failure-id,
# otherwise expand host to multiple ips using dns (or ignore it if not valid): # otherwise expand host to multiple ips using dns (or ignore it if not valid):
if raw: elif raw:
ip = IPAddr(host, cidr) ip = IPAddr(host, cidr)
# check host equal failure-id, if not - failure with complex id: # check host equal failure-id, if not - failure with complex id:
if fid is not None and fid != host: if fid is not None and fid != host:
ip = IPAddr(fid, IPAddr.CIDR_RAW) ip = IPAddr(fid, IPAddr.CIDR_RAW)
failList.append([failRegexIndex, ip, date, ips = [ip]
failRegex.getMatchedLines(), fail]) # otherwise, try to use dns conversion:
if not self.checkAllRegex:
break
else: else:
ips = DNSUtils.textToIp(host, self.__useDns) ips = DNSUtils.textToIp(host, self.__useDns)
if ips: # append failure with match to the list:
for ip in ips: for ip in ips:
failList.append([failRegexIndex, ip, date, failList.append([failRegexIndex, ip, date, fail])
failRegex.getMatchedLines(), fail])
if not self.checkAllRegex: if not self.checkAllRegex:
break break
except RegexException as e: # pragma: no cover - unsure if reachable except RegexException as e: # pragma: no cover - unsure if reachable
@ -720,7 +803,7 @@ class FileFilter(Filter):
db = self.jail.database db = self.jail.database
if db is not None: if db is not None:
db.updateLog(self.jail, log) db.updateLog(self.jail, log)
logSys.info("Removed logfile: %r" % path) logSys.info("Removed logfile: %r", path)
self._delLogPath(path) self._delLogPath(path)
return return
@ -785,7 +868,7 @@ class FileFilter(Filter):
def getFailures(self, filename): def getFailures(self, filename):
log = self.getLog(filename) log = self.getLog(filename)
if log is None: if log is None:
logSys.error("Unable to get failures in " + filename) logSys.error("Unable to get failures in %s", filename)
return False return False
# We should always close log (file), otherwise may be locked (log-rotate, etc.) # We should always close log (file), otherwise may be locked (log-rotate, etc.)
try: try:
@ -794,11 +877,11 @@ class FileFilter(Filter):
has_content = log.open() has_content = log.open()
# see http://python.org/dev/peps/pep-3151/ # see http://python.org/dev/peps/pep-3151/
except IOError as e: except IOError as e:
logSys.error("Unable to open %s" % filename) logSys.error("Unable to open %s", filename)
logSys.exception(e) logSys.exception(e)
return False return False
except OSError as e: # pragma: no cover - requires race condition to trigger this except OSError as e: # pragma: no cover - requires race condition to trigger this
logSys.error("Error opening %s" % filename) logSys.error("Error opening %s", filename)
logSys.exception(e) logSys.exception(e)
return False return False
except Exception as e: # pragma: no cover - Requires implementation error in FileContainer to generate except Exception as e: # pragma: no cover - Requires implementation error in FileContainer to generate
@ -1019,7 +1102,7 @@ class FileContainer:
## sys.stdout.flush() ## sys.stdout.flush()
# Compare hash and inode # Compare hash and inode
if self.__hash != myHash or self.__ino != stats.st_ino: if self.__hash != myHash or self.__ino != stats.st_ino:
logSys.info("Log rotation detected for %s" % self.__filename) logSys.info("Log rotation detected for %s", self.__filename)
self.__hash = myHash self.__hash = myHash
self.__ino = stats.st_ino self.__ino = stats.st_ino
self.__pos = 0 self.__pos = 0

View File

@ -64,16 +64,19 @@ class DNSUtils:
if ips is not None: if ips is not None:
return ips return ips
# retrieve ips # retrieve ips
try:
ips = list() ips = list()
for result in socket.getaddrinfo(dns, None, 0, 0, socket.IPPROTO_TCP): saveerr = None
ip = IPAddr(result[4][0]) for fam, ipfam in ((socket.AF_INET, IPAddr.FAM_IPv4), (socket.AF_INET6, IPAddr.FAM_IPv6)):
try:
for result in socket.getaddrinfo(dns, None, fam, 0, socket.IPPROTO_TCP):
ip = IPAddr(result[4][0], ipfam)
if ip.isValid: if ip.isValid:
ips.append(ip) ips.append(ip)
except socket.error as e: except socket.error as e:
# todo: make configurable the expired time of cache entry: saveerr = e
logSys.warning("Unable to find a corresponding IP address for %s: %s", dns, e) if not ips and saveerr:
ips = list() logSys.warning("Unable to find a corresponding IP address for %s: %s", dns, saveerr)
DNSUtils.CACHE_nameToIp.set(dns, ips) DNSUtils.CACHE_nameToIp.set(dns, ips)
return ips return ips
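The rewritten lookup above queries each address family separately instead of a single unspecified `getaddrinfo` call, collecting whatever resolves and only warning if nothing was found at all. A simplified sketch (logging reduced to a print):

import socket

def text_to_ips(name):
    ips, err = [], None
    for fam in (socket.AF_INET, socket.AF_INET6):
        try:
            for res in socket.getaddrinfo(name, None, fam, 0, socket.IPPROTO_TCP):
                ips.append(res[4][0])
        except socket.error as e:
            err = e
    if not ips and err:
        print("Unable to find a corresponding IP address for %s: %s" % (name, err))
    return ips

print(text_to_ips("localhost"))   # typically ['127.0.0.1', '::1']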
@ -140,6 +143,8 @@ class IPAddr(object):
CIDR_RAW = -2 CIDR_RAW = -2
CIDR_UNSPEC = -1 CIDR_UNSPEC = -1
FAM_IPv4 = CIDR_RAW - socket.AF_INET
FAM_IPv6 = CIDR_RAW - socket.AF_INET6
def __new__(cls, ipstr, cidr=CIDR_UNSPEC): def __new__(cls, ipstr, cidr=CIDR_UNSPEC):
# check already cached as IPAddr # check already cached as IPAddr
@ -191,7 +196,11 @@ class IPAddr(object):
self._raw = ipstr self._raw = ipstr
# if not raw - recognize family, set addr, etc.: # if not raw - recognize family, set addr, etc.:
if cidr != IPAddr.CIDR_RAW: if cidr != IPAddr.CIDR_RAW:
for family in [socket.AF_INET, socket.AF_INET6]: if cidr is not None and cidr < IPAddr.CIDR_RAW:
family = [IPAddr.CIDR_RAW - cidr]
else:
family = [socket.AF_INET, socket.AF_INET6]
for family in family:
try: try:
binary = socket.inet_pton(family, ipstr) binary = socket.inet_pton(family, ipstr)
self._family = family self._family = family
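The new `FAM_IPv4`/`FAM_IPv6` constants encode an address family as a value below `CIDR_RAW`, so the single `cidr` argument can double as a family hint and the family is recovered as `CIDR_RAW - cidr`. A worked example of that arithmetic (the printed numbers depend on the platform's `AF_*` values):

import socket

CIDR_RAW = -2
FAM_IPv4 = CIDR_RAW - socket.AF_INET    # e.g. -4 where AF_INET == 2
FAM_IPv6 = CIDR_RAW - socket.AF_INET6   # e.g. -12 where AF_INET6 == 10

for cidr in (FAM_IPv4, FAM_IPv6):
    assert cidr < CIDR_RAW              # recognizably a family hint, not a prefix length
    print(CIDR_RAW - cidr)              # -> socket.AF_INET, then socket.AF_INET6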
@ -346,7 +355,7 @@ class IPAddr(object):
return socket.inet_ntop(self._family, binary) + add return socket.inet_ntop(self._family, binary) + add
def getPTR(self, suffix=""): def getPTR(self, suffix=None):
""" return the DNS PTR string of the provided IP address object """ return the DNS PTR string of the provided IP address object
If "suffix" is provided it will be appended as the second and top If "suffix" is provided it will be appended as the second and top
@ -356,11 +365,11 @@ class IPAddr(object):
""" """
if self.isIPv4: if self.isIPv4:
exploded_ip = self.ntoa.split(".") exploded_ip = self.ntoa.split(".")
if not suffix: if suffix is None:
suffix = "in-addr.arpa." suffix = "in-addr.arpa."
elif self.isIPv6: elif self.isIPv6:
exploded_ip = self.hexdump exploded_ip = self.hexdump
if not suffix: if suffix is None:
suffix = "ip6.arpa." suffix = "ip6.arpa."
else: else:
return "" return ""

View File

@ -402,6 +402,14 @@ class Server:
def getIgnoreCommand(self, name): def getIgnoreCommand(self, name):
return self.__jails[name].filter.getIgnoreCommand() return self.__jails[name].filter.getIgnoreCommand()
def setPrefRegex(self, name, value):
flt = self.__jails[name].filter
logSys.debug(" prefregex: %r", value)
flt.prefRegex = value
def getPrefRegex(self, name):
return self.__jails[name].filter.prefRegex
def addFailRegex(self, name, value, multiple=False): def addFailRegex(self, name, value, multiple=False):
flt = self.__jails[name].filter flt = self.__jails[name].filter
if not multiple: value = (value,) if not multiple: value = (value,)

View File

@ -54,7 +54,9 @@ class Ticket(object):
self._time = time if time is not None else MyTime.time() self._time = time if time is not None else MyTime.time()
self._data = {'matches': matches or [], 'failures': 0} self._data = {'matches': matches or [], 'failures': 0}
if data is not None: if data is not None:
self._data.update(data) for k,v in data.iteritems():
if v is not None:
self._data[k] = v
if ticket: if ticket:
# ticket available - copy whole information from ticket: # ticket available - copy whole information from ticket:
self.__dict__.update(i for i in ticket.__dict__.iteritems() if i[0] in self.__dict__) self.__dict__.update(i for i in ticket.__dict__.iteritems() if i[0] in self.__dict__)
@ -135,7 +137,8 @@ class Ticket(object):
self._data['matches'] = matches or [] self._data['matches'] = matches or []
def getMatches(self): def getMatches(self):
return self._data.get('matches', []) return [(line if isinstance(line, basestring) else "".join(line)) \
for line in self._data.get('matches', ())]
@property @property
def restored(self): def restored(self):
@ -232,7 +235,11 @@ class FailTicket(Ticket):
self.__retry += count self.__retry += count
self._data['failures'] += attempt self._data['failures'] += attempt
if matches: if matches:
self._data['matches'] += matches # we should duplicate "matches", because it may be referenced by multiple tickets:
if self._data['matches']:
self._data['matches'] = self._data['matches'] + matches
else:
self._data['matches'] = matches
def setLastTime(self, value): def setLastTime(self, value):
if value > self._time: if value > self._time:

View File

@ -221,6 +221,10 @@ class Transmitter:
value = command[2:] value = command[2:]
self.__server.delJournalMatch(name, value) self.__server.delJournalMatch(name, value)
return self.__server.getJournalMatch(name) return self.__server.getJournalMatch(name)
elif command[1] == "prefregex":
value = command[2]
self.__server.setPrefRegex(name, value)
return self.__server.getPrefRegex(name)
elif command[1] == "addfailregex": elif command[1] == "addfailregex":
value = command[2] value = command[2]
self.__server.addFailRegex(name, value, multiple=multiple) self.__server.addFailRegex(name, value, multiple=multiple)
@ -346,6 +350,8 @@ class Transmitter:
return self.__server.getIgnoreIP(name) return self.__server.getIgnoreIP(name)
elif command[1] == "ignorecommand": elif command[1] == "ignorecommand":
return self.__server.getIgnoreCommand(name) return self.__server.getIgnoreCommand(name)
elif command[1] == "prefregex":
return self.__server.getPrefRegex(name)
elif command[1] == "failregex": elif command[1] == "failregex":
return self.__server.getFailRegex(name) return self.__server.getFailRegex(name)
elif command[1] == "ignoreregex": elif command[1] == "ignoreregex":

View File

@ -98,6 +98,12 @@ class Utils():
cache.popitem() cache.popitem()
cache[k] = (v, t + self.maxTime) cache[k] = (v, t + self.maxTime)
def unset(self, k):
try:
del self._cache[k]
except KeyError: # pragma: no cover
pass
@staticmethod @staticmethod
def setFBlockMode(fhandle, value): def setFBlockMode(fhandle, value):

View File

@ -29,7 +29,7 @@ import tempfile
import time import time
import unittest import unittest
from ..server.action import CommandAction, CallingMap from ..server.action import CommandAction, CallingMap, substituteRecursiveTags
from ..server.actions import OrderedDict from ..server.actions import OrderedDict
from ..server.utils import Utils from ..server.utils import Utils
@ -40,11 +40,19 @@ class CommandActionTest(LogCaptureTestCase):
def setUp(self): def setUp(self):
"""Call before every test case.""" """Call before every test case."""
self.__action = CommandAction(None, "Test")
LogCaptureTestCase.setUp(self) LogCaptureTestCase.setUp(self)
self.__action = CommandAction(None, "Test")
# prevent executing stop if start failed (or was not even started at all):
self.__action_started = False
orgstart = self.__action.start
def _action_start():
self.__action_started = True
return orgstart()
self.__action.start = _action_start
def tearDown(self): def tearDown(self):
"""Call after every test case.""" """Call after every test case."""
if self.__action_started:
self.__action.stop() self.__action.stop()
LogCaptureTestCase.tearDown(self) LogCaptureTestCase.tearDown(self)
@ -56,30 +64,30 @@ class CommandActionTest(LogCaptureTestCase):
} }
# Recursion is bad # Recursion is bad
self.assertRaises(ValueError, self.assertRaises(ValueError,
lambda: CommandAction.substituteRecursiveTags({'A': '<A>'})) lambda: substituteRecursiveTags({'A': '<A>'}))
self.assertRaises(ValueError, self.assertRaises(ValueError,
lambda: CommandAction.substituteRecursiveTags({'A': '<B>', 'B': '<A>'})) lambda: substituteRecursiveTags({'A': '<B>', 'B': '<A>'}))
self.assertRaises(ValueError, self.assertRaises(ValueError,
lambda: CommandAction.substituteRecursiveTags({'A': '<B>', 'B': '<C>', 'C': '<A>'})) lambda: substituteRecursiveTags({'A': '<B>', 'B': '<C>', 'C': '<A>'}))
# Unresolvable substitution # Unresolvable substitution
self.assertRaises(ValueError, self.assertRaises(ValueError,
lambda: CommandAction.substituteRecursiveTags({'A': 'to=<B> fromip=<IP>', 'C': '<B>', 'B': '<C>', 'D': ''})) lambda: substituteRecursiveTags({'A': 'to=<B> fromip=<IP>', 'C': '<B>', 'B': '<C>', 'D': ''}))
self.assertRaises(ValueError, self.assertRaises(ValueError,
lambda: CommandAction.substituteRecursiveTags({'failregex': 'to=<honeypot> fromip=<IP>', 'sweet': '<honeypot>', 'honeypot': '<sweet>', 'ignoreregex': ''})) lambda: substituteRecursiveTags({'failregex': 'to=<honeypot> fromip=<IP>', 'sweet': '<honeypot>', 'honeypot': '<sweet>', 'ignoreregex': ''}))
# We need here an ordered, because the sequence of iteration is very important for this test # We need here an ordered, because the sequence of iteration is very important for this test
if OrderedDict: if OrderedDict:
# No cyclic recursion, just multiple replacement of tag <T>, should be successful: # No cyclic recursion, just multiple replacement of tag <T>, should be successful:
self.assertEqual(CommandAction.substituteRecursiveTags( OrderedDict( self.assertEqual(substituteRecursiveTags( OrderedDict(
(('X', 'x=x<T>'), ('T', '1'), ('Z', '<X> <T> <Y>'), ('Y', 'y=y<T>'))) (('X', 'x=x<T>'), ('T', '1'), ('Z', '<X> <T> <Y>'), ('Y', 'y=y<T>')))
), {'X': 'x=x1', 'T': '1', 'Y': 'y=y1', 'Z': 'x=x1 1 y=y1'} ), {'X': 'x=x1', 'T': '1', 'Y': 'y=y1', 'Z': 'x=x1 1 y=y1'}
) )
# No cyclic recursion, just multiple replacement of tag <T> in composite tags, should be successful: # No cyclic recursion, just multiple replacement of tag <T> in composite tags, should be successful:
self.assertEqual(CommandAction.substituteRecursiveTags( OrderedDict( self.assertEqual(substituteRecursiveTags( OrderedDict(
(('X', 'x=x<T> <Z> <<R1>> <<R2>>'), ('R1', 'Z'), ('R2', 'Y'), ('T', '1'), ('Z', '<T> <Y>'), ('Y', 'y=y<T>'))) (('X', 'x=x<T> <Z> <<R1>> <<R2>>'), ('R1', 'Z'), ('R2', 'Y'), ('T', '1'), ('Z', '<T> <Y>'), ('Y', 'y=y<T>')))
), {'X': 'x=x1 1 y=y1 1 y=y1 y=y1', 'R1': 'Z', 'R2': 'Y', 'T': '1', 'Z': '1 y=y1', 'Y': 'y=y1'} ), {'X': 'x=x1 1 y=y1 1 y=y1 y=y1', 'R1': 'Z', 'R2': 'Y', 'T': '1', 'Z': '1 y=y1', 'Y': 'y=y1'}
) )
# No cyclic recursion, just multiple replacement of same tags, should be successful: # No cyclic recursion, just multiple replacement of same tags, should be successful:
self.assertEqual(CommandAction.substituteRecursiveTags( OrderedDict(( self.assertEqual(substituteRecursiveTags( OrderedDict((
('actionstart', 'ipset create <ipmset> hash:ip timeout <bantime> family <ipsetfamily>\n<iptables> -I <chain> <actiontype>'), ('actionstart', 'ipset create <ipmset> hash:ip timeout <bantime> family <ipsetfamily>\n<iptables> -I <chain> <actiontype>'),
('ipmset', 'f2b-<name>'), ('ipmset', 'f2b-<name>'),
('name', 'any'), ('name', 'any'),
@ -111,42 +119,42 @@ class CommandActionTest(LogCaptureTestCase):
)) ))
) )
# Cyclic recursion by composite tag creation, tags "create" another tag, that closes cycle: # Cyclic recursion by composite tag creation, tags "create" another tag, that closes cycle:
self.assertRaises(ValueError, lambda: CommandAction.substituteRecursiveTags( OrderedDict(( self.assertRaises(ValueError, lambda: substituteRecursiveTags( OrderedDict((
('A', '<<B><C>>'), ('A', '<<B><C>>'),
('B', 'D'), ('C', 'E'), ('B', 'D'), ('C', 'E'),
('DE', 'cycle <A>'), ('DE', 'cycle <A>'),
)) )) )) ))
self.assertRaises(ValueError, lambda: CommandAction.substituteRecursiveTags( OrderedDict(( self.assertRaises(ValueError, lambda: substituteRecursiveTags( OrderedDict((
('DE', 'cycle <A>'), ('DE', 'cycle <A>'),
('A', '<<B><C>>'), ('A', '<<B><C>>'),
('B', 'D'), ('C', 'E'), ('B', 'D'), ('C', 'E'),
)) )) )) ))
# missing tags are ok # missing tags are ok
self.assertEqual(CommandAction.substituteRecursiveTags({'A': '<C>'}), {'A': '<C>'}) self.assertEqual(substituteRecursiveTags({'A': '<C>'}), {'A': '<C>'})
self.assertEqual(CommandAction.substituteRecursiveTags({'A': '<C> <D> <X>','X':'fun'}), {'A': '<C> <D> fun', 'X':'fun'}) self.assertEqual(substituteRecursiveTags({'A': '<C> <D> <X>','X':'fun'}), {'A': '<C> <D> fun', 'X':'fun'})
self.assertEqual(CommandAction.substituteRecursiveTags({'A': '<C> <B>', 'B': 'cool'}), {'A': '<C> cool', 'B': 'cool'}) self.assertEqual(substituteRecursiveTags({'A': '<C> <B>', 'B': 'cool'}), {'A': '<C> cool', 'B': 'cool'})
# Escaped tags should be ignored # Escaped tags should be ignored
self.assertEqual(CommandAction.substituteRecursiveTags({'A': '<matches> <B>', 'B': 'cool'}), {'A': '<matches> cool', 'B': 'cool'}) self.assertEqual(substituteRecursiveTags({'A': '<matches> <B>', 'B': 'cool'}), {'A': '<matches> cool', 'B': 'cool'})
# Multiple stuff on same line is ok # Multiple stuff on same line is ok
self.assertEqual(CommandAction.substituteRecursiveTags({'failregex': 'to=<honeypot> fromip=<IP> evilperson=<honeypot>', 'honeypot': 'pokie', 'ignoreregex': ''}), self.assertEqual(substituteRecursiveTags({'failregex': 'to=<honeypot> fromip=<IP> evilperson=<honeypot>', 'honeypot': 'pokie', 'ignoreregex': ''}),
{ 'failregex': "to=pokie fromip=<IP> evilperson=pokie", { 'failregex': "to=pokie fromip=<IP> evilperson=pokie",
'honeypot': 'pokie', 'honeypot': 'pokie',
'ignoreregex': '', 'ignoreregex': '',
}) })
# rest is just cool # rest is just cool
self.assertEqual(CommandAction.substituteRecursiveTags(aInfo), self.assertEqual(substituteRecursiveTags(aInfo),
{ 'HOST': "192.0.2.0", { 'HOST': "192.0.2.0",
'ABC': '123 192.0.2.0', 'ABC': '123 192.0.2.0',
'xyz': '890 123 192.0.2.0', 'xyz': '890 123 192.0.2.0',
}) })
# obscure embedded case # obscure embedded case
self.assertEqual(CommandAction.substituteRecursiveTags({'A': '<<PREF>HOST>', 'PREF': 'IPV4'}), self.assertEqual(substituteRecursiveTags({'A': '<<PREF>HOST>', 'PREF': 'IPV4'}),
{'A': '<IPV4HOST>', 'PREF': 'IPV4'}) {'A': '<IPV4HOST>', 'PREF': 'IPV4'})
self.assertEqual(CommandAction.substituteRecursiveTags({'A': '<<PREF>HOST>', 'PREF': 'IPV4', 'IPV4HOST': '1.2.3.4'}), self.assertEqual(substituteRecursiveTags({'A': '<<PREF>HOST>', 'PREF': 'IPV4', 'IPV4HOST': '1.2.3.4'}),
{'A': '1.2.3.4', 'PREF': 'IPV4', 'IPV4HOST': '1.2.3.4'}) {'A': '1.2.3.4', 'PREF': 'IPV4', 'IPV4HOST': '1.2.3.4'})
# more embedded within a string and two interpolations # more embedded within a string and two interpolations
self.assertEqual(CommandAction.substituteRecursiveTags({'A': 'A <IP<PREF>HOST> B IP<PREF> C', 'PREF': 'V4', 'IPV4HOST': '1.2.3.4'}), self.assertEqual(substituteRecursiveTags({'A': 'A <IP<PREF>HOST> B IP<PREF> C', 'PREF': 'V4', 'IPV4HOST': '1.2.3.4'}),
{'A': 'A 1.2.3.4 B IPV4 C', 'PREF': 'V4', 'IPV4HOST': '1.2.3.4'}) {'A': 'A 1.2.3.4 B IPV4 C', 'PREF': 'V4', 'IPV4HOST': '1.2.3.4'})
def testReplaceTag(self): def testReplaceTag(self):
@ -186,7 +194,7 @@ class CommandActionTest(LogCaptureTestCase):
# Callable # Callable
self.assertEqual( self.assertEqual(
self.__action.replaceTag("09 <matches> 11", self.__action.replaceTag("09 <matches> 11",
CallingMap(matches=lambda: str(10))), CallingMap(matches=lambda self: str(10))),
"09 10 11") "09 10 11")
def testReplaceNoTag(self): def testReplaceNoTag(self):
@ -194,7 +202,27 @@ class CommandActionTest(LogCaptureTestCase):
# Will raise ValueError if it is # Will raise ValueError if it is
self.assertEqual( self.assertEqual(
self.__action.replaceTag("abc", self.__action.replaceTag("abc",
CallingMap(matches=lambda: int("a"))), "abc") CallingMap(matches=lambda self: int("a"))), "abc")
def testReplaceTagSelfRecursion(self):
setattr(self.__action, 'a', "<a")
setattr(self.__action, 'b', "c>")
setattr(self.__action, 'b?family=inet6', "b>")
setattr(self.__action, 'ac', "<a><b>")
setattr(self.__action, 'ab', "<ac>")
setattr(self.__action, 'x?family=inet6', "")
# self-referencing properties should raise an exception:
self.assertRaisesRegexp(ValueError, r"properties contain self referencing definitions",
lambda: self.__action.replaceTag("<a><b>",
self.__action._properties, conditional="family=inet4")
)
# remove self-referencing in props:
delattr(self.__action, 'ac')
# self-referencing query should raise an exception:
self.assertRaisesRegexp(ValueError, r"possible self referencing definitions in query",
lambda: self.__action.replaceTag("<x<x<x<x<x<x<x<x<x<x<x<x<x<x<x<x<x<x<x<x<x>>>>>>>>>>>>>>>>>>>>>",
self.__action._properties, conditional="family=inet6")
)
def testReplaceTagConditionalCached(self): def testReplaceTagConditionalCached(self):
setattr(self.__action, 'abc', "123") setattr(self.__action, 'abc', "123")
@ -217,10 +245,10 @@ class CommandActionTest(LogCaptureTestCase):
self.__action.replaceTag("<banaction> '<abc>'", self.__action._properties, self.__action.replaceTag("<banaction> '<abc>'", self.__action._properties,
conditional="family=inet6", cache=cache), conditional="family=inet6", cache=cache),
"Text 890-567 text 567 '567'") "Text 890-567 text 567 '567'")
self.assertEqual(len(cache) if cache is not None else -1, 3) self.assertTrue(len(cache) >= 3)
# set one parameter - internal properties and cache should be reseted: # set one parameter - internal properties and cache should be reseted:
setattr(self.__action, 'xyz', "000-<abc>") setattr(self.__action, 'xyz', "000-<abc>")
self.assertEqual(len(cache) if cache is not None else -1, 0) self.assertEqual(len(cache), 0)
# test again, should have 000 instead of 890: # test again, should have 000 instead of 890:
for i in range(2): for i in range(2):
self.assertEqual( self.assertEqual(
@ -235,7 +263,7 @@ class CommandActionTest(LogCaptureTestCase):
self.__action.replaceTag("<banaction> '<abc>'", self.__action._properties, self.__action.replaceTag("<banaction> '<abc>'", self.__action._properties,
conditional="family=inet6", cache=cache), conditional="family=inet6", cache=cache),
"Text 000-567 text 567 '567'") "Text 000-567 text 567 '567'")
self.assertEqual(len(cache), 3) self.assertTrue(len(cache) >= 3)
def testExecuteActionBan(self): def testExecuteActionBan(self):
@ -301,13 +329,24 @@ class CommandActionTest(LogCaptureTestCase):
self.assertEqual(self.__action.ROST,"192.0.2.0") self.assertEqual(self.__action.ROST,"192.0.2.0")
def testExecuteActionUnbanAinfo(self): def testExecuteActionUnbanAinfo(self):
aInfo = { aInfo = CallingMap({
'ABC': "123", 'ABC': "123",
'ip': '192.0.2.1',
'F-*': lambda self: {
'fid': 111,
'fport': 222,
'user': "tester"
} }
self.__action.actionban = "touch /tmp/fail2ban.test.123" })
self.__action.actionunban = "rm /tmp/fail2ban.test.<ABC>" self.__action.actionban = "touch /tmp/fail2ban.test.123; echo 'failure <F-ID> of <F-USER> -<F-TEST>- from <ip>:<F-PORT>'"
self.__action.actionunban = "rm /tmp/fail2ban.test.<ABC>; echo 'user <F-USER> unbanned'"
self.__action.ban(aInfo) self.__action.ban(aInfo)
self.__action.unban(aInfo) self.__action.unban(aInfo)
self.assertLogged(
" -- stdout: 'failure 111 of tester -- from 192.0.2.1:222'",
" -- stdout: 'user tester unbanned'",
all=True
)
def testExecuteActionStartEmpty(self):
self.__action.actionstart = ""
@ -403,7 +442,7 @@ class CommandActionTest(LogCaptureTestCase):
"stderr: 'The rain in Spain stays mainly in the plain'\n")
def testCallingMap(self):
mymap = CallingMap(callme=lambda self: str(10), error=lambda self: int('a'),
dontcallme= "string", number=17)
# Should work fine
@ -412,3 +451,43 @@ class CommandActionTest(LogCaptureTestCase):
"10 okay string 17")
# Error will now trip, demonstrating delayed call
self.assertRaises(ValueError, lambda x: "%(error)i" % x, mymap)
def testCallingMapModify(self):
m = CallingMap({
'a': lambda self: 2 + 3,
'b': lambda self: self['a'] + 6,
'c': 'test',
})
# test reset (without modifications):
m.reset()
# do modifications:
m['a'] = 4
del m['c']
# test set and delete:
self.assertEqual(len(m), 2)
self.assertNotIn('c', m)
self.assertEqual((m['a'], m['b']), (4, 10))
# reset to original and test again:
m.reset()
s = repr(m)
self.assertEqual(len(m), 3)
self.assertIn('c', m)
self.assertEqual((m['a'], m['b'], m['c']), (5, 11, 'test'))
def testCallingMapRep(self):
m = CallingMap({
'a': lambda self: 2 + 3,
'b': lambda self: self['a'] + 6,
'c': ''
})
s = repr(m)
self.assertIn("'a': 5", s)
self.assertIn("'b': 11", s)
self.assertIn("'c': ''", s)
m['c'] = lambda self: self['xxx'] + 7; # unresolvable
s = repr(m)
self.assertIn("'a': 5", s)
self.assertIn("'b': 11", s)
self.assertIn("'c': ", s) # presents as callable
self.assertNotIn("'c': ''", s) # but not empty
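The two CallingMap tests above rely on values that are callables receiving the map itself and that are resolved lazily on access. Roughly the following, as a simplified sketch without the caching, reset() and repr() behaviour of the real fail2ban.helpers.CallingMap:

class LazyMap(dict):
    # callable values receive the mapping itself and are evaluated on access
    def __getitem__(self, key):
        value = dict.__getitem__(self, key)
        return value(self) if callable(value) else value

m = LazyMap(a=lambda self: 2 + 3, b=lambda self: self['a'] + 6, c='test')
print((m['a'], m['b'], m['c']))  # -> (5, 11, 'test')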
View File
@ -0,0 +1,76 @@
# Fail2Ban obsolete multi-line example/test filter (previously sshd.conf)
#
[INCLUDES]
# Read common prefixes. If any customizations available -- read them from
# common.local
before = ../../../../config/filter.d/common.conf
[DEFAULT]
_daemon = sshd
# optional prefix (logged from several ssh versions) like "error: ", "error: PAM: " or "fatal: "
__pref = (?:(?:error|fatal): (?:PAM: )?)?
# optional suffix (logged from several ssh versions) like " [preauth]"
__suff = (?: \[preauth\])?\s*
__on_port_opt = (?: port \d+)?(?: on \S+(?: port \d+)?)?
# single line prefix:
__prefix_line_sl = %(__prefix_line)s%(__pref)s
# multi line prefixes (for first and second lines):
__prefix_line_ml1 = (?P<__prefix>%(__prefix_line)s)%(__pref)s
__prefix_line_ml2 = %(__suff)s$<SKIPLINES>^(?P=__prefix)%(__pref)s
mode = %(normal)s
normal = ^%(__prefix_line_sl)s[aA]uthentication (?:failure|error|failed) for .* from <HOST>( via \S+)?\s*%(__suff)s$
^%(__prefix_line_sl)sUser not known to the underlying authentication module for .* from <HOST>\s*%(__suff)s$
^%(__prefix_line_sl)sFailed \S+ for (?P<cond_inv>invalid user )?(?P<user>(?P<cond_user>\S+)|(?(cond_inv)(?:(?! from ).)*?|[^:]+)) from <HOST>%(__on_port_opt)s(?: ssh\d*)?(?(cond_user): |(?:(?:(?! from ).)*)$)
^%(__prefix_line_sl)sROOT LOGIN REFUSED.* FROM <HOST>\s*%(__suff)s$
^%(__prefix_line_sl)s[iI](?:llegal|nvalid) user .*? from <HOST>%(__on_port_opt)s\s*$
^%(__prefix_line_sl)sUser .+ from <HOST> not allowed because not listed in AllowUsers\s*%(__suff)s$
^%(__prefix_line_sl)sUser .+ from <HOST> not allowed because listed in DenyUsers\s*%(__suff)s$
^%(__prefix_line_sl)sUser .+ from <HOST> not allowed because not in any group\s*%(__suff)s$
^%(__prefix_line_sl)srefused connect from \S+ \(<HOST>\)\s*%(__suff)s$
^%(__prefix_line_sl)sReceived disconnect from <HOST>%(__on_port_opt)s:\s*3: .*: Auth fail%(__suff)s$
^%(__prefix_line_sl)sUser .+ from <HOST> not allowed because a group is listed in DenyGroups\s*%(__suff)s$
^%(__prefix_line_sl)sUser .+ from <HOST> not allowed because none of user's groups are listed in AllowGroups\s*%(__suff)s$
^%(__prefix_line_sl)spam_unix\(sshd:auth\):\s+authentication failure;\s*logname=\S*\s*uid=\d*\s*euid=\d*\s*tty=\S*\s*ruser=\S*\s*rhost=<HOST>\s.*%(__suff)s$
^%(__prefix_line_sl)s(error: )?maximum authentication attempts exceeded for .* from <HOST>%(__on_port_opt)s(?: ssh\d*)? \[preauth\]$
^%(__prefix_line_ml1)sUser .+ not allowed because account is locked%(__prefix_line_ml2)sReceived disconnect from <HOST>: 11: .+%(__suff)s$
^%(__prefix_line_ml1)sDisconnecting: Too many authentication failures for .+?%(__prefix_line_ml2)sConnection closed by <HOST>%(__suff)s$
^%(__prefix_line_ml1)sConnection from <HOST>%(__on_port_opt)s%(__prefix_line_ml2)sDisconnecting: Too many authentication failures for .+%(__suff)s$
ddos = ^%(__prefix_line_sl)sDid not receive identification string from <HOST>%(__suff)s$
^%(__prefix_line_sl)sReceived disconnect from <HOST>%(__on_port_opt)s:\s*14: No supported authentication methods available%(__suff)s$
^%(__prefix_line_sl)sUnable to negotiate with <HOST>%(__on_port_opt)s: no matching (?:cipher|key exchange method) found.
^%(__prefix_line_ml1)sConnection from <HOST>%(__on_port_opt)s%(__prefix_line_ml2)sUnable to negotiate a (?:cipher|key exchange method)%(__suff)s$
^%(__prefix_line_ml1)sSSH: Server;Ltype: (?:Authname|Version|Kex);Remote: <HOST>-\d+;[A-Z]\w+:.*%(__prefix_line_ml2)sRead from socket failed: Connection reset by peer%(__suff)s$
aggressive = %(normal)s
%(ddos)s
[Definition]
failregex = %(mode)s
ignoreregex =
# "maxlines" is number of log lines to buffer for multi-line regex searches
maxlines = 10
journalmatch = _SYSTEMD_UNIT=sshd.service + _COMM=sshd
datepattern = {^LN-BEG}
# DEV Notes:
#
# "Failed \S+ for .*? from <HOST>..." failregex uses non-greedy catch-all because
# it is coming before use of <HOST> which is not hard-anchored at the end as well,
# and later catch-all's could contain user-provided input, which need to be greedily
# matched away first.
#
# Author: Cyril Jaquier, Yaroslav Halchenko, Petr Voralek, Daniel Black
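The `__prefix_line_ml1`/`__prefix_line_ml2` pair above works by capturing the log prefix of the first line into a named group and requiring later lines to repeat it via a back-reference, with `<SKIPLINES>` allowing unrelated lines in between. A rough single-regex illustration with plain Python `re`, simplified to the three sample lines of the zzz-sshd-obsolete-multiline.log added further down in this change (not fail2ban's actual line-buffering engine):

import re

buffered = (
    "Apr 27 13:02:02 host sshd[29116]: User root not allowed because account is locked\n"
    "Apr 27 13:02:03 host sshd[29116]: input_userauth_request: invalid user root [preauth]\n"
    "Apr 27 13:02:04 host sshd[29116]: Received disconnect from 192.0.2.4: 11: Normal Shutdown, Thank you for playing [preauth]\n"
)
rx = re.compile(
    r"(?P<__prefix>sshd\[\d+\]: )User \S+ not allowed because account is locked.*\n"
    r"(?:.*\n)*?"                     # roughly what <SKIPLINES> expands to
    r".*(?P=__prefix)Received disconnect from (?P<host>\S+):")
m = rx.search(buffered)
print(m.group("host"))                # -> 192.0.2.4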
View File
@ -769,9 +769,10 @@ class Fail2banServerTest(Fail2banClientServerBase):
"[Definition]",
"norestored = %(_exec_once)s",
"restore = ",
"info = ",
"actionstart = echo '[%(name)s] %(actname)s: ** start'", start,
"actionreload = echo '[%(name)s] %(actname)s: .. reload'", reload,
"actionban = echo '[%(name)s] %(actname)s: ++ ban <ip> %(restore)s%(info)s'", ban,
"actionunban = echo '[%(name)s] %(actname)s: -- unban <ip>'", unban,
"actionstop = echo '[%(name)s] %(actname)s: __ stop'", stop,
)
@ -785,28 +786,28 @@ class Fail2banServerTest(Fail2banClientServerBase):
"usedns = no",
"maxretry = 3",
"findtime = 10m",
"failregex = ^\s*failure <F-ERRCODE>401|403</F-ERRCODE> from <HOST>",
"datepattern = {^LN-BEG}EPOCH",
"",
"[test-jail1]", "backend = " + backend, "filter =",
"action = ",
" test-action1[name='%(__name__)s']" \
if 1 in actions else "",
" test-action2[name='%(__name__)s', restore='restored: <restored>', info=', err-code: <F-ERRCODE>']" \
if 2 in actions else "",
" test-action2[name='%(__name__)s', actname=test-action3, _exec_once=1, restore='restored: <restored>']" \
if 3 in actions else "",
"logpath = " + test1log,
" " + test2log if 2 in enabled else "",
" " + test3log if 2 in enabled else "",
"failregex = ^\s*failure <F-ERRCODE>401|403</F-ERRCODE> from <HOST>",
" ^\s*error <F-ERRCODE>401|403</F-ERRCODE> from <HOST>" \
if 2 in enabled else "",
"enabled = true" if 1 in enabled else "",
"",
"[test-jail2]", "backend = " + backend, "filter =",
"action = ",
" test-action2[name='%(__name__)s', restore='restored: <restored>', info=', err-code: <F-ERRCODE>']" \
if 2 in actions else "",
" test-action2[name='%(__name__)s', actname=test-action3, _exec_once=1, restore='restored: <restored>']" \
if 3 in actions else "",
@ -845,7 +846,7 @@ class Fail2banServerTest(Fail2banClientServerBase):
"stdout: '[test-jail1] test-action2: ** start'", all=True)
# test restored is 0 (both actions available):
self.assertLogged(
"stdout: '[test-jail1] test-action2: ++ ban 192.0.2.1 restored: 0, err-code: 401'",
"stdout: '[test-jail1] test-action3: ++ ban 192.0.2.1 restored: 0'",
all=True, wait=MID_WAITTIME)
@ -968,8 +969,8 @@ class Fail2banServerTest(Fail2banClientServerBase):
)
# test restored is 1 (only test-action2):
self.assertLogged(
"stdout: '[test-jail2] test-action2: ++ ban 192.0.2.4 restored: 1, err-code: 401'",
"stdout: '[test-jail2] test-action2: ++ ban 192.0.2.8 restored: 1, err-code: 401'",
all=True, wait=MID_WAITTIME)
# test test-action3 not executed at all (norestored check):
self.assertNotLogged(
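The `<F-ERRCODE>401|403</F-ERRCODE>` syntax above is how 0.10 exposes a filter capture to actions (gh-1698): the tag pair becomes a named group in the compiled failregex, and the captured value is later interpolated wherever the action references `<F-ERRCODE>`. A rough approximation of that translation using plain `re` and string replacement, not the actual fail2ban implementation:

import re

failregex = r"^\s*failure <F-ERRCODE>401|403</F-ERRCODE> from <HOST>"
# crude tag-to-group translation, just enough for this one expression:
pyregex = (failregex
    .replace("<F-ERRCODE>", "(?P<ERRCODE>").replace("</F-ERRCODE>", ")")
    .replace("<HOST>", r"(?P<host>\S+)"))
m = re.search(pyregex, "failure 401 from 192.0.2.1")
actionban = "echo '++ ban <ip>, err-code: <F-ERRCODE>'"
print(actionban.replace("<ip>", m.group("host"))
               .replace("<F-ERRCODE>", m.group("ERRCODE")))
# -> echo '++ ban 192.0.2.1, err-code: 401'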
View File
@ -27,17 +27,18 @@ import os
import sys
from ..client import fail2banregex
from ..client.fail2banregex import Fail2banRegex, get_opt_parser, exec_command_line, output, str2LogLevel
from .utils import setUpMyTime, tearDownMyTime, LogCaptureTestCase, logSys
from .utils import CONFIG_DIR
fail2banregex.logSys = logSys
def _test_output(*args):
logSys.notice(args[0])
fail2banregex.output = _test_output
TEST_CONFIG_DIR = os.path.join(os.path.dirname(__file__), "config")
TEST_FILES_DIR = os.path.join(os.path.dirname(__file__), "files")
DEV_NULL = None
@ -45,6 +46,9 @@ DEV_NULL = None
def _Fail2banRegex(*args):
parser = get_opt_parser()
(opts, args) = parser.parse_args(list(args))
# put down log-level if expected, because of too many debug-messages:
if opts.log_level in ("notice", "warning"):
logSys.setLevel(str2LogLevel(opts.log_level))
return (opts, args, Fail2banRegex(opts))
class ExitException(Exception):
@ -80,7 +84,13 @@ class Fail2banRegexTest(LogCaptureTestCase):
FILENAME_02 = os.path.join(TEST_FILES_DIR, "testcase02.log")
FILENAME_WRONGCHAR = os.path.join(TEST_FILES_DIR, "testcase-wrong-char.log")
FILENAME_SSHD = os.path.join(TEST_FILES_DIR, "logs", "sshd")
FILTER_SSHD = os.path.join(CONFIG_DIR, 'filter.d', 'sshd.conf')
FILENAME_ZZZ_SSHD = os.path.join(TEST_FILES_DIR, 'zzz-sshd-obsolete-multiline.log')
FILTER_ZZZ_SSHD = os.path.join(TEST_CONFIG_DIR, 'filter.d', 'zzz-sshd-obsolete-multiline.conf')
FILENAME_ZZZ_GEN = os.path.join(TEST_FILES_DIR, "logs", "zzz-generic-example")
FILTER_ZZZ_GEN = os.path.join(TEST_CONFIG_DIR, 'filter.d', 'zzz-generic-example.conf')
def setUp(self):
"""Call before every test case."""
@ -96,7 +106,7 @@ class Fail2banRegexTest(LogCaptureTestCase):
(opts, args, fail2banRegex) = _Fail2banRegex(
"test", r".** from <HOST>$"
)
self.assertFalse(fail2banRegex.start(args))
self.assertLogged("Unable to compile regular expression")
def testWrongIngnoreRE(self):
@ -104,7 +114,7 @@ class Fail2banRegexTest(LogCaptureTestCase):
"--datepattern", "{^LN-BEG}EPOCH",
"test", r".*? from <HOST>$", r".**"
)
self.assertFalse(fail2banRegex.start(args))
self.assertLogged("Unable to compile regular expression")
def testDirectFound(self):
@ -114,7 +124,7 @@ class Fail2banRegexTest(LogCaptureTestCase):
"Dec 31 11:59:59 [sshd] error: PAM: Authentication failure for kevin from 192.0.2.0",
r"Authentication failure for .*? from <HOST>$"
)
self.assertTrue(fail2banRegex.start(args))
self.assertLogged('Lines: 1 lines, 0 ignored, 1 matched, 0 missed')
def testDirectNotFound(self):
@ -123,7 +133,7 @@ class Fail2banRegexTest(LogCaptureTestCase):
"Dec 31 11:59:59 [sshd] error: PAM: Authentication failure for kevin from 192.0.2.0",
r"XYZ from <HOST>$"
)
self.assertTrue(fail2banRegex.start(args))
self.assertLogged('Lines: 1 lines, 0 ignored, 0 matched, 1 missed')
def testDirectIgnored(self):
@ -133,7 +143,7 @@ class Fail2banRegexTest(LogCaptureTestCase):
r"Authentication failure for .*? from <HOST>$",
r"kevin from 192.0.2.0$"
)
self.assertTrue(fail2banRegex.start(args))
self.assertLogged('Lines: 1 lines, 1 ignored, 0 matched, 0 missed')
def testDirectRE_1(self):
@ -143,7 +153,7 @@ class Fail2banRegexTest(LogCaptureTestCase):
Fail2banRegexTest.FILENAME_01,
Fail2banRegexTest.RE_00
)
self.assertTrue(fail2banRegex.start(args))
self.assertLogged('Lines: 19 lines, 0 ignored, 13 matched, 6 missed')
self.assertLogged('Error decoding line');
@ -159,7 +169,7 @@ class Fail2banRegexTest(LogCaptureTestCase):
Fail2banRegexTest.FILENAME_01,
Fail2banRegexTest.RE_00
)
self.assertTrue(fail2banRegex.start(args))
self.assertLogged('Lines: 19 lines, 0 ignored, 16 matched, 3 missed')
def testDirectRE_1raw_noDns(self):
@ -169,7 +179,7 @@ class Fail2banRegexTest(LogCaptureTestCase):
Fail2banRegexTest.FILENAME_01,
Fail2banRegexTest.RE_00
)
self.assertTrue(fail2banRegex.start(args))
self.assertLogged('Lines: 19 lines, 0 ignored, 13 matched, 6 missed')
def testDirectRE_2(self):
@ -179,7 +189,7 @@ class Fail2banRegexTest(LogCaptureTestCase):
Fail2banRegexTest.FILENAME_02,
Fail2banRegexTest.RE_00
)
self.assertTrue(fail2banRegex.start(args))
self.assertLogged('Lines: 13 lines, 0 ignored, 5 matched, 8 missed')
def testVerbose(self):
@ -189,18 +199,69 @@ class Fail2banRegexTest(LogCaptureTestCase):
Fail2banRegexTest.FILENAME_02,
Fail2banRegexTest.RE_00
)
self.assertTrue(fail2banRegex.start(args))
self.assertLogged('Lines: 13 lines, 0 ignored, 5 matched, 8 missed')
self.assertLogged('141.3.81.106 Sun Aug 14 11:53:59 2005')
self.assertLogged('141.3.81.106 Sun Aug 14 11:54:59 2005')
def testVerboseFullSshd(self):
(opts, args, fail2banRegex) = _Fail2banRegex(
"-l", "notice", # put down log-level, because of too many debug-messages
"-v", "--verbose-date", "--print-all-matched",
Fail2banRegexTest.FILENAME_SSHD, Fail2banRegexTest.FILTER_SSHD
)
self.assertTrue(fail2banRegex.start(args))
# test that failure line and not-failure lines are both present:
self.assertLogged("[29116]: User root not allowed because account is locked",
"[29116]: Received disconnect from 1.2.3.4", all=True)
def testFastSshd(self):
(opts, args, fail2banRegex) = _Fail2banRegex(
"-l", "notice", # put down log-level, because of too many debug-messages
"--print-all-matched",
Fail2banRegexTest.FILENAME_ZZZ_SSHD, Fail2banRegexTest.FILTER_SSHD
)
self.assertTrue(fail2banRegex.start(args))
# test that failure line and all not-failure lines are present:
self.assertLogged(
"[29116]: Connection from 192.0.2.4",
"[29116]: User root not allowed because account is locked",
"[29116]: Received disconnect from 192.0.2.4", all=True)
def testMultilineSshd(self):
# by the way test of missing lines by multiline in `for bufLine in orgLineBuffer[int(fullBuffer):]`
(opts, args, fail2banRegex) = _Fail2banRegex(
"-l", "notice", # put down log-level, because of too many debug-messages
"--print-all-matched", "--print-all-missed",
Fail2banRegexTest.FILENAME_ZZZ_SSHD, Fail2banRegexTest.FILTER_ZZZ_SSHD
)
self.assertTrue(fail2banRegex.start(args))
# test that the "failure" line is present (2nd part only, because multiline matching is less precise):
self.assertLogged(
"[29116]: Received disconnect from 192.0.2.4", all=True)
def testFullGeneric(self):
# by the way test of ignoreregex (specified in filter file)...
(opts, args, fail2banRegex) = _Fail2banRegex(
"-l", "notice", # put down log-level, because of too many debug-messages
Fail2banRegexTest.FILENAME_ZZZ_GEN, Fail2banRegexTest.FILTER_ZZZ_GEN
)
self.assertTrue(fail2banRegex.start(args))
def _reset(self):
# reset global warn-counter:
from ..server.filter import _decode_line_warn
_decode_line_warn.clear()
def testWronChar(self):
self._reset()
(opts, args, fail2banRegex) = _Fail2banRegex(
"-l", "notice", # put down log-level, because of too many debug-messages
"--datepattern", "^(?:%a )?%b %d %H:%M:%S(?:\.%f)?(?: %ExY)?",
Fail2banRegexTest.FILENAME_WRONGCHAR, Fail2banRegexTest.FILTER_SSHD
)
self.assertTrue(fail2banRegex.start(args))
self.assertLogged('Lines: 4 lines, 0 ignored, 2 matched, 2 missed')
self.assertLogged('Error decoding line')
@ -210,12 +271,15 @@ class Fail2banRegexTest(LogCaptureTestCase):
self.assertLogged('Nov 8 00:16:12 main sshd[32547]: pam_succeed_if(sshd:auth): error retrieving information about user llinco')
def testWronCharDebuggex(self):
self._reset()
(opts, args, fail2banRegex) = _Fail2banRegex(
"-l", "notice", # put down log-level, because of too many debug-messages
"--datepattern", "^(?:%a )?%b %d %H:%M:%S(?:\.%f)?(?: %ExY)?",
"--debuggex", "--print-all-matched",
Fail2banRegexTest.FILENAME_WRONGCHAR, Fail2banRegexTest.FILTER_SSHD
)
self.assertTrue(fail2banRegex.start(args))
self.assertLogged('Error decoding line')
self.assertLogged('Lines: 4 lines, 0 ignored, 2 matched, 2 missed')
self.assertLogged('https://')
View File
@ -1,17 +1,23 @@
# failJSON: { "time": "2005-02-07T15:10:42", "match": true , "host": "192.168.1.1", "user": "sample-user" }
Feb 7 15:10:42 example pure-ftpd: (pam_unix) authentication failure; logname= uid=0 euid=0 tty=pure-ftpd ruser=sample-user rhost=192.168.1.1
# failJSON: { "time": "2005-05-12T09:47:54", "match": true , "host": "71-13-115-12.static.mdsn.wi.charter.com", "user": "root" }
May 12 09:47:54 vaio sshd[16004]: (pam_unix) authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=71-13-115-12.static.mdsn.wi.charter.com user=root
# failJSON: { "time": "2005-05-12T09:48:03", "match": true , "host": "71-13-115-12.static.mdsn.wi.charter.com" }
May 12 09:48:03 vaio sshd[16021]: (pam_unix) authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=71-13-115-12.static.mdsn.wi.charter.com
# failJSON: { "time": "2005-05-15T18:02:12", "match": true , "host": "66.232.129.62", "user": "mark" }
May 15 18:02:12 localhost proftpd: (pam_unix) authentication failure; logname= uid=0 euid=0 tty= ruser= rhost=66.232.129.62 user=mark
# linux-pam messages before commit f0f9c4479303b5a9c37667cf07f58426dc081676 (release 0.99.2.0) - no longer supported
# failJSON: { "time": "2004-11-25T17:12:13", "match": false }
Nov 25 17:12:13 webmail pop(pam_unix)[4920]: authentication failure; logname= uid=0 euid=0 tty= ruser= rhost=192.168.10.3 user=mailuser
# failJSON: { "time": "2005-07-19T18:11:26", "match": true , "host": "www.google.com", "user": "an8767" }
Jul 19 18:11:26 srv2 vsftpd: pam_unix(vsftpd:auth): authentication failure; logname= uid=0 euid=0 tty=ftp ruser=an8767 rhost=www.google.com
# failJSON: { "time": "2005-07-19T18:11:26", "match": true , "host": "www.google.com" }
Jul 19 18:11:26 srv2 vsftpd: pam_unix: authentication failure; logname= uid=0 euid=0 tty=ftp ruser=an8767 rhost=www.google.com
# failJSON: { "time": "2005-07-19T18:11:50", "match": true , "host": "192.0.2.1", "user": "test rhost=192.0.2.151", "desc": "Injecting on username"}
Jul 19 18:11:50 srv2 daemon: pam_unix(auth): authentication failure; logname= uid=0 euid=0 tty=xxx ruser=test rhost=192.0.2.151 rhost=192.0.2.1
# failJSON: { "time": "2005-07-19T18:11:52", "match": true , "host": "192.0.2.2", "user": "test rhost=192.0.2.152", "desc": "Injecting on username after host"}
Jul 19 18:11:52 srv2 daemon: pam_unix(auth): authentication failure; logname= uid=0 euid=0 tty=xxx ruser= rhost=192.0.2.2 user=test rhost=192.0.2.152
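The failJSON comments above are what the sample-regex test factory reads: each annotation precedes one log line and lists the expected result (time, match, host, and with this change any further capture such as user or desc). A minimal sketch of how such an annotation can be consumed; the real parser in fail2ban/tests/samplestestcase.py does more than this:

import json

annotation = ('# failJSON: { "time": "2005-07-19T18:11:50", "match": true, '
              '"host": "192.0.2.1", "user": "test rhost=192.0.2.151" }')
faildata = json.loads(annotation[len('# failJSON:'):])
# every key except time/match/desc is compared against the filter's captures
print(faildata["host"], faildata["user"])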
View File
@ -0,0 +1,2 @@
# test sshd file:
# addFILE: "sshd"
View File
@ -0,0 +1,4 @@
Apr 27 13:02:01 host sshd[29116]: Connection from 192.0.2.4 port 55555
Apr 27 13:02:02 host sshd[29116]: User root not allowed because account is locked
Apr 27 13:02:03 host sshd[29116]: input_userauth_request: invalid user root [preauth]
Apr 27 13:02:04 host sshd[29116]: Received disconnect from 192.0.2.4: 11: Normal Shutdown, Thank you for playing [preauth]
View File
@ -337,6 +337,11 @@ class IgnoreIP(LogCaptureTestCase):
for ip in ipList:
self.filter.addIgnoreIP(ip)
self.assertFalse(self.filter.inIgnoreIPList(ip))
if not unittest.F2B.no_network: # pragma: no cover
self.assertLogged(
'Unable to find a corresponding IP address for 999.999.999.999',
'Unable to find a corresponding IP address for abcdef.abcdef',
'Unable to find a corresponding IP address for 192.168.0.', all=True)
def testIgnoreIPCIDR(self):
self.filter.addIgnoreIP('192.168.1.0/25')
@ -1426,6 +1431,7 @@ class GetFailures(LogCaptureTestCase):
('no', output_no),
('warn', output_yes)
):
self.pruneLog("[test-phase useDns=%s]" % useDns)
jail = DummyJail()
filter_ = FileFilter(jail, useDns=useDns)
filter_.active = True
View File
@ -102,7 +102,9 @@ def testSampleRegexsFactory(name, basedir):
else:
continue
for optval in optval:
if opt[2] == "prefregex":
self.filter.prefRegex = optval
elif opt[2] == "addfailregex":
self.filter.addFailRegex(optval)
elif opt[2] == "addignoreregex":
self.filter.addIgnoreRegex(optval)
@ -148,23 +150,34 @@ def testSampleRegexsFactory(name, basedir):
else:
faildata = {}
try:
ret = self.filter.processLine(line)
if not ret:
# Check line is flagged as none match
self.assertFalse(faildata.get('match', True),
"Line not matched when should have")
continue
failregex, fid, fail2banTime, fail = ret[0]
# Bypass no failure helpers-regexp:
if not faildata.get('match', False) and (fid is None or fail.get('nofail')):
regexsUsed.add(failregex)
continue
# Check line is flagged to match
self.assertTrue(faildata.get('match', False),
"Line matched when shouldn't have")
self.assertEqual(len(ret), 1,
"Multiple regexs matched %r" % (map(lambda x: x[0], ret)))
# Fallback for backwards compatibility (previously no fid, was host only):
if faildata.get("host", None) is not None and fail.get("host", None) is None:
fail["host"] = fid
# Verify match captures (at least fid/host) and timestamp as expected
for k, v in faildata.iteritems():
if k not in ("time", "match", "desc"):
fv = fail.get(k, None)
self.assertEqual(fv, v)
t = faildata.get("time", None)
try:
@ -177,12 +190,15 @@ def testSampleRegexsFactory(name, basedir):
jsonTime += jsonTimeLocal.microsecond / 1000000
self.assertEqual(fail2banTime, jsonTime,
"UTC Time mismatch %s (%s) != %s (%s) (diff %.3f seconds)" %
(fail2banTime, time.strftime("%Y-%m-%dT%H:%M:%S", time.gmtime(fail2banTime)),
jsonTime, time.strftime("%Y-%m-%dT%H:%M:%S", time.gmtime(jsonTime)),
fail2banTime - jsonTime) )
regexsUsed.add(failregex)
except AssertionError as e: # pragma: no cover
raise AssertionError("%s on: %s:%i, line:\n%s" % (
e, logFile.filename(), logFile.filelineno(), line))
for failRegexIndex, failRegex in enumerate(self.filter.getFailRegex()):
self.assertTrue(
View File
@ -1654,9 +1654,10 @@ class ServerConfigReaderTests(LogCaptureTestCase):
# replace pipe to mail with pipe to cat:
realCmd = re.sub(r'\)\s*\|\s*mail\b([^\n]*)',
r' echo mail \1 ) | cat', realCmd)
# replace abuse retrieving (possible no-network), just replace first occurrence of 'dig...':
realCmd = re.sub(r'\bADDRESSES=\$\(dig\s[^\n]+',
lambda m: 'ADDRESSES="abuse-1@abuse-test-server, abuse-2@abuse-test-server"',
realCmd, 1)
# execute action:
return _actions.CommandAction.executeCmd(realCmd, timeout=timeout)
@ -1686,18 +1687,29 @@ class ServerConfigReaderTests(LogCaptureTestCase):
('j-complain-abuse',
'complain['
'name=%(__name__)s, grepopts="-m 1", grepmax=2, mailcmd="mail -s",' +
# test reverse ip:
'debug=1,' +
# 2 logs to test grep from multiple logs:
'logpath="' + os.path.join(TEST_FILES_DIR, "testcase01.log") + '\n' +
' ' + os.path.join(TEST_FILES_DIR, "testcase01a.log") + '", '
']',
{
'ip4-ban': (
# test reverse ip:
'try to resolve 10.124.142.87.abuse-contacts.abusix.org',
'Lines containing failures of 87.142.124.10 (max 2)',
'testcase01.log:Dec 31 11:59:59 [sshd] error: PAM: Authentication failure for kevin from 87.142.124.10',
'testcase01a.log:Dec 31 11:55:01 [sshd] error: PAM: Authentication failure for test from 87.142.124.10',
# both abuse mails should be separated with space:
'mail -s Abuse from 87.142.124.10 abuse-1@abuse-test-server abuse-2@abuse-test-server',
),
'ip6-ban': (
# test reverse ip:
'try to resolve 1.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.8.b.d.0.1.0.0.2.abuse-contacts.abusix.org',
'Lines containing failures of 2001:db8::1 (max 2)',
# both abuse mails should be separated with space:
'mail -s Abuse from 2001:db8::1 abuse-1@abuse-test-server abuse-2@abuse-test-server',
),
}),
)
server = TestServer()
@ -1718,6 +1730,8 @@ class ServerConfigReaderTests(LogCaptureTestCase):
jails = server._Server__jails
ipv4 = IPAddr('87.142.124.10')
ipv6 = IPAddr('2001:db8::1');
for jail, act, tests in testJailsActions:
# print(jail, jails[jail])
for a in jails[jail].actions:
@ -1728,8 +1742,10 @@ class ServerConfigReaderTests(LogCaptureTestCase):
# wrap default command processor:
action.executeCmd = self._executeMailCmd
# test ban :
for (test, ip) in (('ip4-ban', ipv4), ('ip6-ban', ipv6)):
if not tests.get(test): continue
self.pruneLog('# === %s ===' % test)
ticket = _actions.CallingMap({
'ip': ip, 'ip-rev': lambda self: self['ip'].getPTR(''), 'failures': 100,})
action.ban(ticket)
self.assertLogged(*tests[test], all=True)
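The `ip-rev` entry of the ticket above (built via `getPTR('')`) is the PTR-style reversed form of the address that the complain action prepends to `abuse-contacts.abusix.org`. The same strings as in the expected log lines can be reproduced with Python 3's ipaddress module; this is an illustration only, and `abusix_query` is not a fail2ban helper:

import ipaddress

def abusix_query(ip, zone="abuse-contacts.abusix.org"):
    rev = ipaddress.ip_address(ip).reverse_pointer      # e.g. 10.124.142.87.in-addr.arpa
    rev = rev.replace(".in-addr.arpa", "").replace(".ip6.arpa", "")
    return "%s.%s" % (rev, zone)

print(abusix_query("87.142.124.10"))
# -> 10.124.142.87.abuse-contacts.abusix.org
print(abusix_query("2001:db8::1"))
# -> 1.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.8.b.d.0.1.0.0.2.abuse-contacts.abusix.org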