Merge branch '0.10' into 0.11-2

pull/2116/head
sebres 2018-04-03 14:14:44 +02:00
commit 7dfd61f462
25 changed files with 475 additions and 218 deletions


@ -64,15 +64,24 @@ ver. 0.10.3-dev-1 (20??/??/??) - development edition
### Fixes
* `filter.d/asterisk.conf`: fixed failregex prefix when logging via a remote syslog server (gh-2060);
* `filter.d/exim.conf`: failregex extended - SMTP call dropped: too many syntax or protocol errors (gh-2048);
* `filter.d/recidive.conf`: fixed matching when logging to systemd-journal (SYSLOG) with the daemon name in the prefix, gh-2069;
* `filter.d/sshd.conf`:
- failregex got an optional space in order to match new log-format (see gh-2061);
- fixed ddos-mode regex to match refactored message (some versions can contain port now, see gh-2062);
- fixed root login refused regex (optional port before preauth, gh-2080);
- avoid banning of legitimate users when pam_unix is used in combination with another password method, so
pam_unix failures are bypassed if an accepted method is available for this user, gh-2070;
- amend to gh-1263 with better handling of multiple attempts (failures for different user-names recognized immediately);
- mode `ddos` (and `aggressive`) extended to catch `Connection closed by ... [preauth]`, so in DDoS mode
a failure is counted when the connection is closed within the preauth stage (gh-2085);
* `action.d/badips.py`: implicitly convert IPAddr to str; solves the issue "expected string, IPAddr found" (gh-2059);
* `action.d/hostsdeny.conf`: fixed IPv6 syntax (enclosed in square brackets, gh-2066);
* (Free)BSD ipfw actionban fixed to allow the same rule to be added several times (gh-2054);
### New Features
### Enhancements
* `filter.d/apache-noscript.conf`: failregex extended to match "Primary script unknown", e.g. from php-fpm (gh-2073);
* date-detector extended with long epoch (`LEPOCH`) to parse POSIX dates with milliseconds/microseconds (gh-2029);
* possibility to specify an own regex-pattern to match an epoch date-time, e.g. `^\[{EPOCH}\]` or `^\[{LEPOCH}\]` (gh-2038);
the epoch-pattern, similar to the `{DATE}` patterns, does the capture and cuts out the match of the whole pattern from the log-line,
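The long-epoch (`LEPOCH`) entry above can be illustrated with a minimal sketch (not fail2ban's implementation; the helper name is hypothetical): a 10-digit POSIX timestamp followed by 3 or 6 extra digits is read as milliseconds or microseconds.

```python
import re

# Hypothetical sketch of "long epoch" parsing: a 10-digit POSIX timestamp
# optionally followed by 3 (milliseconds) or 6 (microseconds) more digits.
LONG_EPOCH_RE = re.compile(r"\b(\d{10})(\d{3}(?:\d{3})?)?\b")

def parse_long_epoch(text):
    """Return the matched timestamp in seconds, or None if no match."""
    m = LONG_EPOCH_RE.search(text)
    if not m:
        return None
    secs = float(m.group(1))
    frac = m.group(2) or ""
    if frac:
        # 3 digits -> milliseconds, 6 digits -> microseconds
        secs += int(frac) / float(10 ** len(frac))
    return secs
```

For example, `[1522760084123]` yields roughly `1522760084.123` seconds.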


@ -6,43 +6,44 @@
## Fail2Ban: ban hosts that cause multiple authentication errors
Fail2Ban scans log files like `/var/log/auth.log` and bans IP addresses having
Fail2Ban scans log files like `/var/log/auth.log` and bans IP addresses conducting
too many failed login attempts. It does this by updating system firewall rules
to reject new connections from those IP addresses, for a configurable amount
of time. Fail2Ban comes out-of-the-box ready to read many standard log files,
such as those for sshd and Apache, and is easy to configure to read any log
file you choose, for any error you choose.
such as those for sshd and Apache, and is easily configured to read any log
file of your choosing, for any error you wish.
Though Fail2Ban is able to reduce the rate of incorrect authentications
attempts, it cannot eliminate the risk that weak authentication presents.
Configure services to use only two factor or public/private authentication
Though Fail2Ban is able to reduce the rate of incorrect authentication
attempts, it cannot eliminate the risk presented by weak authentication.
Set up services to use only two-factor or public/private authentication
mechanisms if you really want to protect services.
<img src="http://www.worldipv6launch.org/wp-content/themes/ipv6/downloads/World_IPv6_launch_logo.svg" height="52pt"/> | Since v0.10 fail2ban supports the matching of IPv6 addresses.
------|------
This README is a quick introduction to Fail2ban. More documentation, FAQ, HOWTOs
are available in fail2ban(1) manpage, [Wiki](https://github.com/fail2ban/fail2ban/wiki)
and on the website http://www.fail2ban.org
This README is a quick introduction to Fail2Ban. More documentation, FAQ, and HOWTOs
can be found in the fail2ban(1) manpage, the [Wiki](https://github.com/fail2ban/fail2ban/wiki)
and on the website: https://www.fail2ban.org
Installation:
-------------
**It is possible that Fail2ban is already packaged for your distribution. In
this case, you should use it instead.**
**It is possible that Fail2Ban is already packaged for your distribution. In
this case, you should use that instead.**
Required:
- [Python2 >= 2.6 or Python >= 3.2](http://www.python.org) or [PyPy](http://pypy.org)
- [Python2 >= 2.6 or Python >= 3.2](https://www.python.org) or [PyPy](https://pypy.org)
Optional:
- [pyinotify >= 0.8.3](https://github.com/seb-m/pyinotify)
- Linux >= 2.6.13
- [pyinotify >= 0.8.3](https://github.com/seb-m/pyinotify), may require:
* Linux >= 2.6.13
- [gamin >= 0.0.21](http://www.gnome.org/~veillard/gamin)
- [systemd >= 204](http://www.freedesktop.org/wiki/Software/systemd) and python bindings:
- [python-systemd package](https://www.freedesktop.org/software/systemd/python-systemd/index.html)
* [python-systemd package](https://www.freedesktop.org/software/systemd/python-systemd/index.html)
- [dnspython](http://www.dnspython.org/)
To install, just do:
To install:
tar xvfj fail2ban-0.11.0.tar.bz2
cd fail2ban-0.11.0
@ -55,7 +56,7 @@ Alternatively, you can clone the source from GitHub to a directory of Your choic
sudo python setup.py install
This will install Fail2Ban into the python library directory. The executable
scripts are placed into `/usr/bin`, and configuration under `/etc/fail2ban`.
scripts are placed into `/usr/bin`, and configuration in `/etc/fail2ban`.
Fail2Ban should be correctly installed now. Just type:
@ -100,7 +101,7 @@ Contact:
See [CONTRIBUTING.md](https://github.com/fail2ban/fail2ban/blob/master/CONTRIBUTING.md)
### You just appreciate this program:
send kudos to the original author ([Cyril Jaquier](mailto:cyril.jaquier@fail2ban.org))
Send kudos to the original author ([Cyril Jaquier](mailto:cyril.jaquier@fail2ban.org))
or *better* to the [mailing list](https://lists.sourceforge.net/lists/listinfo/fail2ban-users)
since Fail2Ban has been "community-driven" for years now.


@ -31,7 +31,7 @@ actioncheck =
# Tags: See jail.conf(5) man page
# Values: CMD
#
actionban = IP=<ip> && printf %%b "<daemon_list>: $IP\n" >> <file>
actionban = printf %%b "<daemon_list>: <_ip_value>\n" >> <file>
# Option: actionunban
# Notes.: command executed when unbanning an IP. Take care that the
@ -39,7 +39,7 @@ actionban = IP=<ip> && printf %%b "<daemon_list>: $IP\n" >> <file>
# Tags: See jail.conf(5) man page
# Values: CMD
#
actionunban = IP=$(echo <ip> | sed 's/\./\\./g') && sed -i "/^<daemon_list>: $IP$/d" <file>
actionunban = IP=$(echo "<_ip_value>" | sed 's/[][\.]/\\\0/g') && sed -i "/^<daemon_list>: $IP$/d" <file>
[Init]
@ -54,3 +54,9 @@ file = /etc/hosts.deny
# for hosts.deny/hosts_access. Default is all services.
# Values: STR Default: ALL
daemon_list = ALL
# internal variable IP (to differentiate the IPv4 and IPv6 syntax, where it is enclosed in brackets):
_ip_value = <ip>
[Init?family=inet6]
_ip_value = [<ip>]
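The rewritten `actionunban` escapes `.`, `[` and `]` (via `s/[][\.]/\\\0/g`) so that the bracketed IPv6 form above is matched literally when the line is deleted. A rough Python equivalent of that escaping (helper names are illustrative only):

```python
import re

def escape_for_line_pattern(ip_value):
    # Same idea as sed's s/[][\.]/\\\0/g: backslash-escape '.', '[' and ']'
    # so an IPv4 or bracketed IPv6 value matches literally inside a regex.
    return re.sub(r"([][.])", r"\\\1", ip_value)

def unban_line_pattern(daemon_list, ip_value):
    # Rough analogue of the sed address /^<daemon_list>: $IP$/
    return "^%s: %s$" % (daemon_list, escape_for_line_pattern(ip_value))
```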


@ -15,10 +15,10 @@ prefregex = ^%(_apache_error_client)s (?:AH\d+: )?<F-CONTENT>.+</F-CONTENT>$
auth_type = ([A-Z]\w+: )?
failregex = ^client (?:denied by server configuration|used wrong authentication scheme)\b
^user <F-USER>(?:\S*|.*?)</F-USER> (?:auth(?:oriz|entic)ation failure|not found|denied by provider)\b
^user (?!`)<F-USER>(?:\S*|.*?)</F-USER> (?:auth(?:oriz|entic)ation failure|not found|denied by provider)\b
^Authorization of user <F-USER>(?:\S*|.*?)</F-USER> to access .*? failed\b
^%(auth_type)suser <F-USER>(?:\S*|.*?)</F-USER>: password mismatch\b
^%(auth_type)suser `<F-USER>(?:[^']*|.*?)</F-USER>' in realm `.+' (not found|denied by provider)\b
^%(auth_type)suser `<F-USER>(?:[^']*|.*?)</F-USER>' in realm `.+' (auth(?:oriz|entic)ation failure|not found|denied by provider)\b
^%(auth_type)sinvalid nonce .* received - length is not\b
^%(auth_type)srealm mismatch - got `(?:[^']*|.*?)' but expected\b
^%(auth_type)sunknown algorithm `(?:[^']*|.*?)' received\b


@ -17,8 +17,13 @@ before = apache-common.conf
[Definition]
failregex = ^%(_apache_error_client)s ((AH001(28|30): )?File does not exist|(AH01264: )?script not found or unable to stat): /\S*(php([45]|[.-]cgi)?|\.asp|\.exe|\.pl)(, referer: \S+)?\s*$
^%(_apache_error_client)s script '/\S*(php([45]|[.-]cgi)?|\.asp|\.exe|\.pl)\S*' not found or unable to stat(, referer: \S+)?\s*$
script = /\S*(?:php(?:[45]|[.-]cgi)?|\.asp|\.exe|\.pl)
prefregex = ^%(_apache_error_client)s (?:AH0(?:01(?:28|30)|1(?:264|071)): )?(?:(?:[Ff]ile|script|[Gg]ot) )<F-CONTENT>.+</F-CONTENT>$
failregex = ^(?:does not exist|not found or unable to stat): <script>\b
^'<script>\S*' not found or unable to stat
^error '[Pp]rimary script unknown\\n'
ignoreregex =


@ -16,15 +16,14 @@ _ttys_re=\S*
__pam_re=\(?%(__pam_auth)s(?:\(\S+\))?\)?:?
_daemon = \S+
prefregex = ^%(__prefix_line)s%(__pam_re)s\s+authentication failure; logname=\S* uid=\S* euid=\S* tty=%(_ttys_re)s <F-CONTENT>.+</F-CONTENT>$
prefregex = ^%(__prefix_line)s%(__pam_re)s\s+authentication failure;(?:\s+(?:(?:logname|e?uid)=\S*)){0,3} tty=%(_ttys_re)s <F-CONTENT>.+</F-CONTENT>$
failregex = ^ruser=<F-USER>\S*</F-USER> rhost=<HOST>\s*$
^ruser= rhost=<HOST>\s+user=<F-USER>\S*</F-USER>\s*$
^ruser= rhost=<HOST>\s+user=<F-USER>.*?</F-USER>\s*$
^ruser=<F-USER>.*?</F-USER> rhost=<HOST>\s*$
failregex = ^ruser=<F-ALT_USER>(?:\S*|.*?)</F-ALT_USER> rhost=<HOST>(?:\s+user=<F-USER>(?:\S*|.*?)</F-USER>)?\s*$
ignoreregex =
datepattern = {^LN-BEG}
# DEV Notes:
#
# for linux-pam before 0.99.2.0 (late 2005) (removed before 0.8.11 release)


@ -21,18 +21,18 @@ before = common.conf
[Definition]
_daemon = fail2ban\.actions\s*
_daemon = (?:fail2ban(?:-server|\.actions)\s*)
# The name of the jail that this filter is used for. In jail.conf, name the
# jail using this filter 'recidive', or change this line!
# The name of the jail that this filter is used for. In jail.conf, name the jail using
# this filter 'recidive', or supply another name with `filter = recidive[_jailname="jail"]`
_jailname = recidive
failregex = ^(%(__prefix_line)s| %(_daemon)s%(__pid_re)s?:\s+)NOTICE\s+\[(?!%(_jailname)s\])(?:.*)\]\s+Ban\s+<HOST>\s*$
failregex = ^%(__prefix_line)s(?:\s*fail2ban\.actions\s*%(__pid_re)s?:\s+)?NOTICE\s+\[(?!%(_jailname)s\])(?:.*)\]\s+Ban\s+<HOST>\s*$
datepattern = ^{DATE}
ignoreregex =
[Init]
journalmatch = _SYSTEMD_UNIT=fail2ban.service PRIORITY=5
# Author: Tom Hendrikx, modifications by Amir Caspi


@ -21,52 +21,62 @@ _daemon = sshd
# optional prefix (logged from several ssh versions) like "error: ", "error: PAM: " or "fatal: "
__pref = (?:(?:error|fatal): (?:PAM: )?)?
# optional suffix (logged from several ssh versions) like " [preauth]"
__suff = (?: \[preauth\])?\s*
__on_port_opt = (?: port \d+)?(?: on \S+(?: port \d+)?)?
#__suff = (?: port \d+)?(?: \[preauth\])?\s*
__suff = (?: (?:port \d+|on \S+|\[preauth\])){0,3}\s*
__on_port_opt = (?: (?:port \d+|on \S+)){0,2}
# for all possible (also future) forms of "no matching (cipher|mac|MAC|compression method|key exchange method|host key type) found",
# see ssherr.c for all possible SSH_ERR_..._ALG_MATCH errors.
__alg_match = (?:(?:\w+ (?!found\b)){0,2}\w+)
# PAM authentication mechanism, can be overridden, e. g. `filter = sshd[__pam_auth='pam_ldap']`:
__pam_auth = pam_[a-z]+
[Definition]
prefregex = ^<F-MLFID>%(__prefix_line)s</F-MLFID>%(__pref)s<F-CONTENT>.+</F-CONTENT>$
cmnfailre = ^[aA]uthentication (?:failure|error|failed) for <F-USER>.*</F-USER> from <HOST>( via \S+)?\s*%(__suff)s$
^User not known to the underlying authentication module for <F-USER>.*</F-USER> from <HOST>\s*%(__suff)s$
^Failed \S+ for invalid user <F-USER>(?P<cond_user>\S+)|(?:(?! from ).)*?</F-USER> from <HOST>%(__on_port_opt)s(?: ssh\d*)?(?(cond_user): |(?:(?:(?! from ).)*)$)
cmnfailre = ^[aA]uthentication (?:failure|error|failed) for <F-USER>.*</F-USER> from <HOST>( via \S+)?%(__suff)s$
^User not known to the underlying authentication module for <F-USER>.*</F-USER> from <HOST>%(__suff)s$
^Failed publickey for invalid user <F-USER>(?P<cond_user>\S+)|(?:(?! from ).)*?</F-USER> from <HOST>%(__on_port_opt)s(?: ssh\d*)?(?(cond_user): |(?:(?:(?! from ).)*)$)
^Failed \b(?!publickey)\S+ for (?P<cond_inv>invalid user )?<F-USER>(?P<cond_user>\S+)|(?(cond_inv)(?:(?! from ).)*?|[^:]+)</F-USER> from <HOST>%(__on_port_opt)s(?: ssh\d*)?(?(cond_user): |(?:(?:(?! from ).)*)$)
^<F-USER>ROOT</F-USER> LOGIN REFUSED.* FROM <HOST>\s*%(__suff)s$
^[iI](?:llegal|nvalid) user <F-USER>.*?</F-USER> from <HOST>%(__on_port_opt)s\s*$
^User <F-USER>.+</F-USER> from <HOST> not allowed because not listed in AllowUsers\s*%(__suff)s$
^User <F-USER>.+</F-USER> from <HOST> not allowed because listed in DenyUsers\s*%(__suff)s$
^User <F-USER>.+</F-USER> from <HOST> not allowed because not in any group\s*%(__suff)s$
^refused connect from \S+ \(<HOST>\)\s*%(__suff)s$
^<F-USER>ROOT</F-USER> LOGIN REFUSED FROM <HOST>
^[iI](?:llegal|nvalid) user <F-USER>.*?</F-USER> from <HOST>%(__suff)s$
^User <F-USER>.+</F-USER> from <HOST> not allowed because not listed in AllowUsers%(__suff)s$
^User <F-USER>.+</F-USER> from <HOST> not allowed because listed in DenyUsers%(__suff)s$
^User <F-USER>.+</F-USER> from <HOST> not allowed because not in any group%(__suff)s$
^refused connect from \S+ \(<HOST>\)
^Received <F-MLFFORGET>disconnect</F-MLFFORGET> from <HOST>%(__on_port_opt)s:\s*3: .*: Auth fail%(__suff)s$
^User <F-USER>.+</F-USER> from <HOST> not allowed because a group is listed in DenyGroups\s*%(__suff)s$
^User <F-USER>.+</F-USER> from <HOST> not allowed because none of user's groups are listed in AllowGroups\s*%(__suff)s$
^pam_unix\(sshd:auth\):\s+authentication failure;\s*logname=\S*\s*uid=\d*\s*euid=\d*\s*tty=\S*\s*ruser=<F-USER>\S*</F-USER>\s*rhost=<HOST>\s.*%(__suff)s$
^User <F-USER>.+</F-USER> from <HOST> not allowed because a group is listed in DenyGroups%(__suff)s$
^User <F-USER>.+</F-USER> from <HOST> not allowed because none of user's groups are listed in AllowGroups%(__suff)s$
^<F-NOFAIL>%(__pam_auth)s\(sshd:auth\):\s+authentication failure;</F-NOFAIL>(?:\s+(?:(?:logname|e?uid|tty)=\S*)){0,4}\s+ruser=<F-ALT_USER>\S*</F-ALT_USER>\s+rhost=<HOST>(?:\s+user=<F-USER>\S*</F-USER>)?%(__suff)s$
^(error: )?maximum authentication attempts exceeded for <F-USER>.*</F-USER> from <HOST>%(__on_port_opt)s(?: ssh\d*)?%(__suff)s$
^User <F-USER>.+</F-USER> not allowed because account is locked%(__suff)s
^<F-MLFFORGET>Disconnecting</F-MLFFORGET>: Too many authentication failures(?: for <F-USER>.+?</F-USER>)?%(__suff)s
^<F-MLFFORGET>Disconnecting</F-MLFFORGET>: Too many authentication failures(?: for <F-USER>.+?</F-USER>)?%(__suff)s$
^<F-NOFAIL>Received <F-MLFFORGET>disconnect</F-MLFFORGET></F-NOFAIL> from <HOST>%(__on_port_opt)s:\s*11:
^<F-NOFAIL>Connection <F-MLFFORGET>closed</F-MLFFORGET></F-NOFAIL> by <HOST>%(__suff)s$
^<F-MLFFORGET><F-NOFAIL>Accepted publickey</F-NOFAIL></F-MLFFORGET> for \S+ from <HOST>(?:\s|$)
^<F-NOFAIL>Connection <F-MLFFORGET>closed</F-MLFFORGET></F-NOFAIL> by <HOST><mdrp-<mode>-suff-onclosed>
^<F-MLFFORGET><F-NOFAIL>Accepted \w+</F-NOFAIL></F-MLFFORGET> for <F-USER>\S+</F-USER> from <HOST>(?:\s|$)
mdre-normal =
# used to differentiate "connection closed" with and without `[preauth]` (fail/nofail cases in ddos mode)
mdrp-normal-suff-onclosed =
mdre-ddos = ^Did not receive identification string from <HOST>%(__on_port_opt)s%(__suff)s
^Connection <F-MLFFORGET>reset</F-MLFFORGET> by <HOST>%(__on_port_opt)s%(__suff)s
mdre-ddos = ^Did not receive identification string from <HOST>
^Connection <F-MLFFORGET>reset</F-MLFFORGET> by <HOST>
^Connection <F-MLFFORGET>closed</F-MLFFORGET> by <HOST>%(__on_port_opt)s\s+\[preauth\]\s*$
^<F-NOFAIL>SSH: Server;Ltype:</F-NOFAIL> (?:Authname|Version|Kex);Remote: <HOST>-\d+;[A-Z]\w+:
^Read from socket failed: Connection <F-MLFFORGET>reset</F-MLFFORGET> by peer%(__suff)s
^Read from socket failed: Connection <F-MLFFORGET>reset</F-MLFFORGET> by peer
mdrp-ddos-suff-onclosed = %(__on_port_opt)s\s*$
mdre-extra = ^Received <F-MLFFORGET>disconnect</F-MLFFORGET> from <HOST>%(__on_port_opt)s:\s*14: No supported authentication methods available%(__suff)s$
mdre-extra = ^Received <F-MLFFORGET>disconnect</F-MLFFORGET> from <HOST>%(__on_port_opt)s:\s*14: No supported authentication methods available
^Unable to negotiate with <HOST>%(__on_port_opt)s: no matching <__alg_match> found.
^Unable to negotiate a <__alg_match>%(__suff)s$
^Unable to negotiate a <__alg_match>
^no matching <__alg_match> found:
mdrp-extra-suff-onclosed = %(mdrp-normal-suff-onclosed)s
mdre-aggressive = %(mdre-ddos)s
%(mdre-extra)s
mdrp-aggressive-suff-onclosed = %(mdrp-ddos-suff-onclosed)s
cfooterre = ^<F-NOFAIL>Connection from</F-NOFAIL> <HOST>
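The refactored `__suff` above accepts up to three of `port N`, `on host` and `[preauth]` in any order. A standalone check with Python's `re` (the sample failregex line is illustrative, not taken verbatim from the filter):

```python
import re

# The suffix pattern introduced in this diff, tested in isolation:
SUFF = r"(?: (?:port \d+|on \S+|\[preauth\])){0,3}\s*"
FAIL = re.compile(r"^Connection closed by (?P<host>\S+)" + SUFF + r"$")

samples = [
    "Connection closed by 192.0.2.7 [preauth]",
    "Connection closed by 192.0.2.7 port 2222 [preauth]",
    "Connection closed by 192.0.2.7 port 2222 on 198.51.100.1 [preauth]",
]
hosts = [FAIL.match(s).group("host") for s in samples]
```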


@ -99,7 +99,7 @@ class Fail2banClient(Fail2banCmdLine, Thread):
ret = client.send(c)
if ret[0] == 0:
logSys.log(5, "OK : %r", ret[1])
if showRet or c[0] == 'echo':
if showRet or c[0] in ('echo', 'server-status'):
output(beautifier.beautify(ret[1]))
else:
logSys.error("NOK: %r", ret[1].args)
@ -128,7 +128,7 @@ class Fail2banClient(Fail2banCmdLine, Thread):
except Exception as e: # pragma: no cover
if showRet or self._conf["verbose"] > 1:
logSys.debug(e)
if showRet or c[0] == 'echo':
if showRet or c[0] in ('echo', 'server-status'):
sys.stdout.flush()
return streamRet
@ -186,7 +186,7 @@ class Fail2banClient(Fail2banCmdLine, Thread):
logSys.error("Fail2ban seems to be in unexpected state (not running but the socket exists)")
return None
stream.append(['echo', 'Server ready'])
stream.append(['server-status'])
return stream
##


@ -109,9 +109,6 @@ class DateDetectorCache(object):
"""Cache Fail2Ban's default template.
"""
if isinstance(template, str):
# exact given template with word begin-end boundary:
template = _getPatternTemplate(template)
# if not already line-begin anchored, additional template, that prefers datetime
# at start of a line (safety+performance feature):
name = template.name
@ -126,60 +123,74 @@ class DateDetectorCache(object):
# add template:
self.__tmpcache[1].append(template)
def _addDefaultTemplate(self):
"""Add resp. cache Fail2Ban's default set of date templates.
"""
self.__tmpcache = [], []
DEFAULT_TEMPLATES = [
# ISO 8601, simple date, optional subsecond and timezone:
# 2005-01-23T21:59:59.981746, 2005-01-23 21:59:59, 2005-01-23 8:59:59
# simple date: 2005/01/23 21:59:59
# custom for syslog-ng 2006.12.21 06:43:20
self._cacheTemplate("%ExY(?P<_sep>[-/.])%m(?P=_sep)%d(?:T| ?)%H:%M:%S(?:[.,]%f)?(?:\s*%z)?")
"%ExY(?P<_sep>[-/.])%m(?P=_sep)%d(?:T| ?)%H:%M:%S(?:[.,]%f)?(?:\s*%z)?",
# asctime with optional day, subsecond and/or year:
# Sun Jan 23 21:59:59.011 2005
self._cacheTemplate("(?:%a )?%b %d %k:%M:%S(?:\.%f)?(?: %ExY)?")
"(?:%a )?%b %d %k:%M:%S(?:\.%f)?(?: %ExY)?",
# asctime with optional day, subsecond and/or year coming after day
# http://bugs.debian.org/798923
# Sun Jan 23 2005 21:59:59.011
self._cacheTemplate("(?:%a )?%b %d %ExY %k:%M:%S(?:\.%f)?")
"(?:%a )?%b %d %ExY %k:%M:%S(?:\.%f)?",
# simple date too (from x11vnc): 23/01/2005 21:59:59
# and with optional year given by 2 digits: 23/01/05 21:59:59
# (See http://bugs.debian.org/537610)
# 17-07-2008 17:23:25
self._cacheTemplate("%d(?P<_sep>[-/])%m(?P=_sep)(?:%ExY|%Exy) %k:%M:%S")
"%d(?P<_sep>[-/])%m(?P=_sep)(?:%ExY|%Exy) %k:%M:%S",
# Apache format optional time zone:
# [31/Oct/2006:09:22:55 -0000]
# 26-Jul-2007 15:20:52
# named 26-Jul-2007 15:20:52.252
# roundcube 26-Jul-2007 15:20:52 +0200
self._cacheTemplate("%d(?P<_sep>[-/])%b(?P=_sep)%ExY[ :]?%H:%M:%S(?:\.%f)?(?: %z)?")
"%d(?P<_sep>[-/])%b(?P=_sep)%ExY[ :]?%H:%M:%S(?:\.%f)?(?: %z)?",
# CPanel 05/20/2008:01:57:39
self._cacheTemplate("%m/%d/%ExY:%H:%M:%S")
"%m/%d/%ExY:%H:%M:%S",
# 01-27-2012 16:22:44.252
# subseconds explicit to avoid possible %m<->%d confusion
# with previous ("%d-%m-%ExY %k:%M:%S" by "%d(?P<_sep>[-/])%m(?P=_sep)(?:%ExY|%Exy) %k:%M:%S")
self._cacheTemplate("%m-%d-%ExY %k:%M:%S(?:\.%f)?")
"%m-%d-%ExY %k:%M:%S(?:\.%f)?",
# Epoch
self._cacheTemplate('EPOCH')
"EPOCH",
# Only time information in the log
self._cacheTemplate("{^LN-BEG}%H:%M:%S")
"{^LN-BEG}%H:%M:%S",
# <09/16/08@05:03:30>
self._cacheTemplate("^<%m/%d/%Exy@%H:%M:%S>")
"^<%m/%d/%Exy@%H:%M:%S>",
# MySQL: 130322 11:46:11
self._cacheTemplate("%Exy%Exm%Exd ?%H:%M:%S")
"%Exy%Exm%Exd ?%H:%M:%S",
# Apache Tomcat
self._cacheTemplate("%b %d, %ExY %I:%M:%S %p")
"%b %d, %ExY %I:%M:%S %p",
# ASSP: Apr-27-13 02:33:06
self._cacheTemplate("^%b-%d-%Exy %k:%M:%S")
"^%b-%d-%Exy %k:%M:%S",
# 20050123T215959, 20050123 215959, 20050123 85959
self._cacheTemplate("%ExY%Exm%Exd(?:T| ?)%ExH%ExM%ExS(?:[.,]%f)?(?:\s*%z)?")
"%ExY%Exm%Exd(?:T| ?)%ExH%ExM%ExS(?:[.,]%f)?(?:\s*%z)?",
# prefixed with optional named time zone (monit):
# PDT Apr 16 21:05:29
self._cacheTemplate("(?:%Z )?(?:%a )?%b %d %k:%M:%S(?:\.%f)?(?: %ExY)?")
"(?:%Z )?(?:%a )?%b %d %k:%M:%S(?:\.%f)?(?: %ExY)?",
# +00:00 Jan 23 21:59:59.011 2005
self._cacheTemplate("(?:%z )?(?:%a )?%b %d %k:%M:%S(?:\.%f)?(?: %ExY)?")
"(?:%z )?(?:%a )?%b %d %k:%M:%S(?:\.%f)?(?: %ExY)?",
# TAI64N
self._cacheTemplate("TAI64N")
"TAI64N",
]
@property
def defaultTemplates(self):
if isinstance(DateDetectorCache.DEFAULT_TEMPLATES[0], str):
for i, dt in enumerate(DateDetectorCache.DEFAULT_TEMPLATES):
dt = _getPatternTemplate(dt)
DateDetectorCache.DEFAULT_TEMPLATES[i] = dt
return DateDetectorCache.DEFAULT_TEMPLATES
def _addDefaultTemplate(self):
"""Add (i.e. cache) Fail2Ban's default set of date templates.
"""
self.__tmpcache = [], []
# cache default templates:
for dt in self.defaultTemplates:
self._cacheTemplate(dt)
#
self.__templates = self.__tmpcache[0] + self.__tmpcache[1]
del self.__tmpcache
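The refactor above keeps the default date templates as plain strings at class level and converts them to template objects in place only on first access. The pattern in isolation (the `Template` class is a hypothetical stand-in for fail2ban's template objects):

```python
class Template(object):
    """Hypothetical stand-in for fail2ban's date template objects."""
    def __init__(self, pattern):
        self.pattern = pattern

class TemplateCache(object):
    # kept as cheap strings until someone actually needs them:
    DEFAULT_TEMPLATES = ["%Y-%m-%d %H:%M:%S", "EPOCH"]

    @property
    def defaultTemplates(self):
        # convert the class-level list in place, once:
        if isinstance(TemplateCache.DEFAULT_TEMPLATES[0], str):
            for i, dt in enumerate(TemplateCache.DEFAULT_TEMPLATES):
                TemplateCache.DEFAULT_TEMPLATES[i] = Template(dt)
        return TemplateCache.DEFAULT_TEMPLATES
```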
@ -269,8 +280,7 @@ class DateDetector(object):
self.addDefaultTemplate(flt)
return
elif "{DATE}" in key:
self.addDefaultTemplate(
lambda template: not template.flags & DateTemplate.LINE_BEGIN, pattern)
self.addDefaultTemplate(preMatch=pattern, allDefaults=False)
return
else:
template = _getPatternTemplate(pattern, key)
@ -283,18 +293,20 @@ class DateDetector(object):
logSys.debug(" date pattern regex for %r: %s",
getattr(template, 'pattern', ''), template.regex)
def addDefaultTemplate(self, filterTemplate=None, preMatch=None):
def addDefaultTemplate(self, filterTemplate=None, preMatch=None, allDefaults=True):
"""Add Fail2Ban's default set of date templates.
"""
ignoreDup = len(self.__templates) > 0
for template in DateDetector._defCache.templates:
for template in (
DateDetector._defCache.templates if allDefaults else DateDetector._defCache.defaultTemplates
):
# filter if specified:
if filterTemplate is not None and not filterTemplate(template): continue
# if exact pattern available - create copy of template, contains replaced {DATE} with default regex:
if preMatch is not None:
# get cached or create a copy with modified name/pattern, using preMatch replacement for {DATE}:
template = _getAnchoredTemplate(template,
wrap=lambda s: RE_DATE_PREMATCH.sub(lambda m: s, preMatch))
wrap=lambda s: RE_DATE_PREMATCH.sub(lambda m: DateTemplate.unboundPattern(s), preMatch))
# append date detector template (ignore duplicate if some was added before default):
self._appendTemplate(template, ignoreDup=ignoreDup)


@ -37,8 +37,10 @@ RE_GROUPED = re.compile(r'(?<!(?:\(\?))(?<!\\)\((?!\?)')
RE_GROUP = ( re.compile(r'^((?:\(\?\w+\))?\^?(?:\(\?\w+\))?)(.*?)(\$?)$'), r"\1(\2)\3" )
RE_EXLINE_BOUND_BEG = re.compile(r'^\{\^LN-BEG\}')
RE_EXSANC_BOUND_BEG = re.compile(r'^\(\?:\^\|\\b\|\\W\)')
RE_EXEANC_BOUND_BEG = re.compile(r'\(\?=\\b\|\\W\|\$\)$')
RE_NO_WRD_BOUND_BEG = re.compile(r'^\(*(?:\(\?\w+\))?(?:\^|\(*\*\*|\(\?:\^)')
RE_NO_WRD_BOUND_END = re.compile(r'(?<!\\)(?:\$\)?|\*\*\)*)$')
RE_NO_WRD_BOUND_END = re.compile(r'(?<!\\)(?:\$\)?|\\b|\\s|\*\*\)*)$')
RE_DEL_WRD_BOUNDS = ( re.compile(r'^\(*(?:\(\?\w+\))?\(*\*\*|(?<!\\)\*\*\)*$'),
lambda m: m.group().replace('**', '') )
@ -131,7 +133,7 @@ class DateTemplate(object):
# remove possible special pattern "**" in front and end of regex:
regex = RE_DEL_WRD_BOUNDS[0].sub(RE_DEL_WRD_BOUNDS[1], regex)
self._regex = regex
logSys.debug(' constructed regex %s', regex)
logSys.log(7, ' constructed regex %s', regex)
self._cRegex = None
regex = property(getRegex, setRegex, doc=
@ -182,6 +184,14 @@ class DateTemplate(object):
"""
raise NotImplementedError("getDate() is abstract")
@staticmethod
def unboundPattern(pattern):
return RE_EXEANC_BOUND_BEG.sub('',
RE_EXSANC_BOUND_BEG.sub('',
RE_EXLINE_BOUND_BEG.sub('', pattern)
)
)
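The new `unboundPattern()` strips the line-begin marker and the word-boundary wrappers so a date pattern can be embedded inside another expression (used for `{DATE}`/`{EPOCH}` replacement). A simplified reconstruction, assuming the exact literal wrappers shown in the regexes above:

```python
import re

# Simplified forms of the boundary-stripping expressions from this diff:
RE_EXLINE_BOUND_BEG = re.compile(r'^\{\^LN-BEG\}')         # leading {^LN-BEG} marker
RE_EXSANC_BOUND_BEG = re.compile(r'^\(\?:\^\|\\b\|\\W\)')  # leading (?:^|\b|\W)
RE_EXEANC_BOUND_END = re.compile(r'\(\?=\\b\|\\W\|\$\)$')  # trailing (?=\b|\W|$)

def unbound_pattern(pattern):
    """Strip begin/end boundary wrappers so the pattern can be embedded."""
    return RE_EXEANC_BOUND_END.sub('',
        RE_EXSANC_BOUND_BEG.sub('',
            RE_EXLINE_BOUND_BEG.sub('', pattern)))
```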
class DateEpoch(DateTemplate):
"""A date template which searches for Unix timestamps.
@ -197,12 +207,12 @@ class DateEpoch(DateTemplate):
def __init__(self, lineBeginOnly=False, pattern=None, longFrm=False):
DateTemplate.__init__(self)
self.name = "Epoch"
self.name = "Epoch" if not pattern else pattern
self._longFrm = longFrm;
self._grpIdx = 1
epochRE = r"\d{10,11}\b(?:\.\d{3,6})?"
if longFrm:
self.name = "LongEpoch";
self.name = "LongEpoch" if not pattern else pattern
epochRE = r"\d{10,11}(?:\d{3}(?:\.\d{1,6}|\d{3})?)?"
if pattern:
# pattern should capture/cut out the whole match:


@ -89,6 +89,11 @@ def mapTag2Opt(tag):
except KeyError:
return tag.lower()
# alternate names to be merged, e. g. alt_user_1 -> user ...
ALTNAME_PRE = 'alt_'
ALTNAME_CRE = re.compile(r'^' + ALTNAME_PRE + r'(.*)(?:_\d+)?$')
##
# Regular expression class.
#
@ -114,6 +119,14 @@ class Regex:
try:
self._regexObj = re.compile(regex, re.MULTILINE if multiline else 0)
self._regex = regex
self._altValues = {}
for k in filter(
lambda k: len(k) > len(ALTNAME_PRE) and k.startswith(ALTNAME_PRE),
self._regexObj.groupindex
):
n = ALTNAME_CRE.match(k).group(1)
self._altValues[k] = n
self._altValues = list(self._altValues.items()) if len(self._altValues) else None
except sre_constants.error:
raise RegexException("Unable to compile regular expression '%s'" %
regex)
@ -185,6 +198,13 @@ class Regex:
def getRegex(self):
return self._regex
##
# Returns string buffer using join of the tupleLines.
#
@staticmethod
def _tupleLinesBuf(tupleLines):
return "\n".join(map(lambda v: "".join(v[::2]), tupleLines)) + "\n"
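The new `_tupleLinesBuf()` helper flattens the `(prematch, datematch, postdatematch)` tuples into the single newline-terminated buffer that `search()` now receives. In isolation:

```python
def tuple_lines_buf(tuple_lines):
    # v[::2] keeps (prematch, postdatematch) and drops the date portion;
    # the lines are then joined into one newline-terminated buffer.
    return "\n".join(map(lambda v: "".join(v[::2]), tuple_lines)) + "\n"
```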
##
# Searches the regular expression.
#
@ -194,8 +214,10 @@ class Regex:
# @param a list of tuples. The tuples are ( prematch, datematch, postdatematch )
def search(self, tupleLines, orgLines=None):
self._matchCache = self._regexObj.search(
"\n".join("".join(value[::2]) for value in tupleLines) + "\n")
buf = tupleLines
if not isinstance(tupleLines, basestring):
buf = Regex._tupleLinesBuf(tupleLines)
self._matchCache = self._regexObj.search(buf)
if self._matchCache:
if orgLines is None: orgLines = tupleLines
# if single-line:
@ -248,7 +270,16 @@ class Regex:
#
def getGroups(self):
return self._matchCache.groupdict()
if not self._altValues:
return self._matchCache.groupdict()
# merge alternate values (e. g. 'alt_user_1' -> 'user' or 'alt_host' -> 'host'):
fail = self._matchCache.groupdict()
#fail = fail.copy()
for k,n in self._altValues:
v = fail.get(k)
if v and not fail.get(n):
fail[n] = v
return fail
##
# Returns skipped lines.
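The `alt_`-prefixed group handling above lets a failregex capture the same logical value in two places (e.g. `<F-ALT_USER>` vs. `<F-USER>`); the alternate value is used only when the primary group stayed empty. A self-contained sketch of that merge (the sample expression is illustrative, not a real filter line):

```python
import re

ALTNAME_PRE = 'alt_'
# lazy (.*?) also strips a trailing _N index, e.g. alt_user_1 -> user:
ALTNAME_CRE = re.compile(r'^' + ALTNAME_PRE + r'(.*?)(?:_\d+)?$')

def merged_groups(match):
    fail = match.groupdict()
    for k, v in list(fail.items()):
        if k.startswith(ALTNAME_PRE) and v:
            n = ALTNAME_CRE.match(k).group(1)
            if not fail.get(n):
                fail[n] = v  # take alternate value, primary was empty
    return fail

# illustrative pam-style line with both a primary and an alternate user group:
rx = re.compile(r'ruser=(?P<alt_user>\S*) rhost=(?P<host>\S+)(?: user=(?P<user>\S*))?')
```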


@ -586,37 +586,93 @@ class Filter(JailThread):
# @return: a boolean
def ignoreLine(self, tupleLines):
buf = Regex._tupleLinesBuf(tupleLines)
for ignoreRegexIndex, ignoreRegex in enumerate(self.__ignoreRegex):
ignoreRegex.search(tupleLines)
ignoreRegex.search(buf, tupleLines)
if ignoreRegex.hasMatched():
return ignoreRegexIndex
return None
def _updateUsers(self, fail, user=()):
users = fail.get('users')
# only for regex contains user:
if user:
if not users:
fail['users'] = users = set()
users.add(user)
return users
return None
# # ATM incremental (non-empty only) merge deactivated ...
# @staticmethod
# def _updateFailure(self, mlfidGroups, fail):
# # reset old failure-ids when new types of id available in this failure:
# fids = set()
# for k in ('fid', 'ip4', 'ip6', 'dns'):
# if fail.get(k):
# fids.add(k)
# if fids:
# for k in ('fid', 'ip4', 'ip6', 'dns'):
# if k not in fids:
# try:
# del mlfidGroups[k]
# except:
# pass
# # update not empty values:
# mlfidGroups.update(((k,v) for k,v in fail.iteritems() if v))
def _mergeFailure(self, mlfid, fail, failRegex):
mlfidFail = self.mlfidCache.get(mlfid) if self.__mlfidCache else None
users = None
nfflgs = 0
if fail.get('nofail'): nfflgs |= 1
if fail.get('mlfforget'): nfflgs |= 2
# if multi-line failure id (connection id) known:
if mlfidFail:
mlfidGroups = mlfidFail[1]
# update - if not forget (disconnect/reset):
if not fail.get('mlfforget'):
mlfidGroups.update(fail)
else:
self.mlfidCache.unset(mlfid) # remove cached entry
# merge with previous info:
fail2 = mlfidGroups.copy()
fail2.update(fail)
if not fail.get('nofail'): # be sure we've correct current state
try:
del fail2['nofail']
except KeyError:
pass
fail2["matches"] = fail.get("matches", []) + failRegex.getMatchedTupleLines()
fail = fail2
elif not fail.get('mlfforget'):
# update users set (hold all users of connect):
users = self._updateUsers(mlfidGroups, fail.get('user'))
# be sure we've correct current state ('nofail' only from last failure)
try:
del mlfidGroups['nofail']
except KeyError:
pass
# # ATM incremental (non-empty only) merge deactivated (for future version only),
# # it can be simulated using alternate value tags, like <F-ALT_VAL>...</F-ALT_VAL>,
# # so previous value 'val' will be overwritten only if 'alt_val' is not empty...
# _updateFailure(mlfidGroups, fail)
#
# overwrite multi-line failure with all values, available in fail:
mlfidGroups.update(fail)
# new merged failure data:
fail = mlfidGroups
# if forget (disconnect/reset) - remove cached entry:
if nfflgs & 2:
self.mlfidCache.unset(mlfid)
elif not (nfflgs & 2): # not mlfforget
users = self._updateUsers(fail, fail.get('user'))
mlfidFail = [self.__lastDate, fail]
self.mlfidCache.set(mlfid, mlfidFail)
if fail.get('nofail'):
fail["matches"] = failRegex.getMatchedTupleLines()
# check users in order to avoid reset failure by multiple logon-attempts:
if users and len(users) > 1:
# we've new user, reset 'nofail' because of multiple users attempts:
try:
del fail['nofail']
except KeyError:
pass
# merge matches:
if not fail.get('nofail'): # current state (corresponding users)
try:
m = fail.pop("nofail-matches")
m += fail.get("matches", [])
except KeyError:
m = fail.get("matches", [])
if not (nfflgs & 2): # not mlfforget:
m += failRegex.getMatchedTupleLines()
fail["matches"] = m
elif not (nfflgs & 2) and (nfflgs & 1): # not mlfforget and nofail:
fail["nofail-matches"] = fail.get("nofail-matches", []) + failRegex.getMatchedTupleLines()
# return merged:
return fail
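The rewritten `_mergeFailure()` condenses its state into a small bit mask, `nfflgs`, before branching. The two bits in isolation (a sketch of the flag logic only, not the full merge):

```python
# bit 1: the match was tagged <F-NOFAIL>; bit 2: it was tagged <F-MLFFORGET>
NOFAIL, MLFFORGET = 1, 2

def classify(fail):
    """Build the nfflgs mask the merge logic branches on."""
    nfflgs = 0
    if fail.get('nofail'):    nfflgs |= NOFAIL
    if fail.get('mlfforget'): nfflgs |= MLFFORGET
    return nfflgs

def keep_cached_entry(nfflgs):
    # the cached multi-line failure is dropped only on forget (disconnect/reset)
    return not (nfflgs & MLFFORGET)
```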
@ -630,6 +686,7 @@ class Filter(JailThread):
def findFailure(self, tupleLine, date=None):
failList = list()
ll = logSys.getEffectiveLevel()
returnRawHost = self.returnRawHost
cidr = IPAddr.CIDR_UNSPEC
if self.__useDns == "raw":
@ -639,7 +696,7 @@ class Filter(JailThread):
# Checks if we must ignore this line.
if self.ignoreLine([tupleLine[::2]]) is not None:
# The ignoreregex matched. Return.
logSys.log(7, "Matched ignoreregex and was \"%s\" ignored",
if ll <= 7: logSys.log(7, "Matched ignoreregex and was \"%s\" ignored",
"".join(tupleLine[::2]))
return failList
@ -666,7 +723,7 @@ class Filter(JailThread):
date = self.__lastDate
if self.checkFindTime and date is not None and date < MyTime.time() - self.getFindTime():
logSys.log(5, "Ignore line since time %s < %s - %s",
if ll <= 5: logSys.log(5, "Ignore line since time %s < %s - %s",
date, MyTime.time(), self.getFindTime())
return failList
@ -675,71 +732,75 @@ class Filter(JailThread):
self.__lineBuffer + [tupleLine[:3]])[-self.__lineBufferSize:]
else:
orgBuffer = self.__lineBuffer = [tupleLine[:3]]
logSys.log(5, "Looking for match of %r", self.__lineBuffer)
if ll <= 5: logSys.log(5, "Looking for match of %r", self.__lineBuffer)
buf = Regex._tupleLinesBuf(self.__lineBuffer)
# Pre-filter fail regex (if available):
preGroups = {}
if self.__prefRegex:
if logSys.getEffectiveLevel() <= logging.HEAVYDEBUG: # pragma: no cover
logSys.log(5, " Looking for prefregex %r", self.__prefRegex.getRegex())
self.__prefRegex.search(self.__lineBuffer)
if ll <= 5: logSys.log(5, " Looking for prefregex %r", self.__prefRegex.getRegex())
self.__prefRegex.search(buf, self.__lineBuffer)
if not self.__prefRegex.hasMatched():
logSys.log(5, " Prefregex not matched")
if ll <= 5: logSys.log(5, " Prefregex not matched")
return failList
preGroups = self.__prefRegex.getGroups()
logSys.log(7, " Pre-filter matched %s", preGroups)
if ll <= 7: logSys.log(7, " Pre-filter matched %s", preGroups)
repl = preGroups.get('content')
# Content replacement:
if repl:
del preGroups['content']
self.__lineBuffer = [('', '', repl)]
self.__lineBuffer, buf = [('', '', repl)], None
# Iterates over all the regular expressions.
for failRegexIndex, failRegex in enumerate(self.__failRegex):
if logSys.getEffectiveLevel() <= logging.HEAVYDEBUG: # pragma: no cover
logSys.log(5, " Looking for failregex %r", failRegex.getRegex())
failRegex.search(self.__lineBuffer, orgBuffer)
if not failRegex.hasMatched():
continue
# The failregex matched.
logSys.log(7, " Matched %s", failRegex)
# Checks if we must ignore this match.
if self.ignoreLine(failRegex.getMatchedTupleLines()) \
is not None:
# The ignoreregex matched. Remove ignored match.
self.__lineBuffer = failRegex.getUnmatchedTupleLines()
logSys.log(7, " Matched ignoreregex and was ignored")
if not self.checkAllRegex:
break
else:
continue
if date is None:
logSys.warning(
"Found a match for %r but no valid date/time "
"found for %r. Please try setting a custom "
"date pattern (see man page jail.conf(5)). "
"If format is complex, please "
"file a detailed issue on"
" https://github.com/fail2ban/fail2ban/issues "
"in order to get support for this format.",
"\n".join(failRegex.getMatchedLines()), timeText)
continue
self.__lineBuffer = failRegex.getUnmatchedTupleLines()
# retrieve failure-id, host, etc from failure match:
try:
# buffer from tuples if changed:
if buf is None:
buf = Regex._tupleLinesBuf(self.__lineBuffer)
if ll <= 5: logSys.log(5, " Looking for failregex %d - %r", failRegexIndex, failRegex.getRegex())
failRegex.search(buf, orgBuffer)
if not failRegex.hasMatched():
continue
# current failure data (matched group dict):
fail = failRegex.getGroups()
# The failregex matched.
if ll <= 7: logSys.log(7, " Matched failregex %d: %s", failRegexIndex, fail)
# Checks if we must ignore this match.
if self.ignoreLine(failRegex.getMatchedTupleLines()) \
is not None:
# The ignoreregex matched. Remove ignored match.
self.__lineBuffer, buf = failRegex.getUnmatchedTupleLines(), None
if ll <= 7: logSys.log(7, " Matched ignoreregex and was ignored")
if not self.checkAllRegex:
break
else:
continue
if date is None:
logSys.warning(
"Found a match for %r but no valid date/time "
"found for %r. Please try setting a custom "
"date pattern (see man page jail.conf(5)). "
"If format is complex, please "
"file a detailed issue on"
" https://github.com/fail2ban/fail2ban/issues "
"in order to get support for this format.",
"\n".join(failRegex.getMatchedLines()), timeText)
continue
# we should check all regex (bypass on multi-line, otherwise too complex):
if not self.checkAllRegex or self.getMaxLines() > 1:
self.__lineBuffer, buf = failRegex.getUnmatchedTupleLines(), None
# merge data if multi-line failure:
raw = returnRawHost
if preGroups:
fail = preGroups.copy()
fail.update(failRegex.getGroups())
else:
fail = failRegex.getGroups()
currFail, fail = fail, preGroups.copy()
fail.update(currFail)
# first try to check we have mlfid case (caching of connection id by multi-line):
mlfid = fail.get('mlfid')
if mlfid is not None:
fail = self._mergeFailure(mlfid, fail, failRegex)
# bypass if no-failure case:
if fail.get('nofail'):
logSys.log(7, "Nofail by mlfid %r in regex %s: %s",
if ll <= 7: logSys.log(7, "Nofail by mlfid %r in regex %s: %s",
mlfid, failRegexIndex, fail.get('mlfforget', "waiting for failure"))
if not self.checkAllRegex: return failList
else:
@ -768,7 +829,7 @@ class Filter(JailThread):
cidr = IPAddr.CIDR_RAW
# if mlfid case (not failure):
if host is None:
logSys.log(7, "No failure-id by mlfid %r in regex %s: %s",
if ll <= 7: logSys.log(7, "No failure-id by mlfid %r in regex %s: %s",
mlfid, failRegexIndex, fail.get('mlfforget', "waiting for identifier"))
if not self.checkAllRegex: return failList
ips = [None]


@ -588,7 +588,7 @@ class Server:
self.__logTarget = target
return True
# set a format which is simpler for console use
fmt = "%(name)-24s[%(process)d]: %(levelname)-7s %(message)s"
fmt = "%(name)-23.23s [%(process)d]: %(levelname)-7s %(message)s"
if systarget == "SYSLOG":
facility = logOptions.get('facility', 'DAEMON').upper()
try:
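The changed console format relies on `%`-style precision to keep the name column fixed-width: `-23` pads short logger names and `.23` truncates long ones. A quick illustration:

```python
# Same format string as the server's console target; the record dict here is
# a stand-in for a real LogRecord.
fmt = "%(name)-23.23s [%(process)d]: %(levelname)-7s %(message)s"
short = fmt % {"name": "fail2ban", "process": 1,
               "levelname": "INFO", "message": "m"}
long_ = fmt % {"name": "fail2ban.filtersystemd.x", "process": 1,
               "levelname": "INFO", "message": "m"}
```

Both short and long names occupy exactly 23 columns, so the `[pid]` bracket always lines up.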


@ -115,6 +115,9 @@ class Transmitter:
return cnt
elif command[0] == "echo":
return command[1:]
elif command[0] == "server-status":
logSys.debug("Server ready")
return "Server ready"
elif command[0] == "sleep":
value = command[1]
time.sleep(float(value))
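A minimal sketch of how such a transmitter command could be dispatched (names are illustrative; the real `Transmitter` handles many more commands and returns over a socket):

```python
def handle_command(command):
    # command is a list: verb first, then its arguments
    if command[0] == "echo":
        return command[1:]
    elif command[0] == "server-status":
        # cheap liveness probe added above: client asks, server confirms
        return "Server ready"
    raise ValueError("unknown command: %r" % (command[0],))
```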


@ -58,6 +58,7 @@ if sys.version_info >= (2,7): # pragma: no cover - may be unavailable
self.jail.actions.add("badips", pythonModuleName, initOpts={
'category': "ssh",
'banaction': "test",
'age': "2w",
'score': 5,
'key': "fail2ban-test-suite",
#'bankey': "fail2ban-test-suite",


@ -14,8 +14,8 @@ _daemon = sshd
# optional prefix (logged from several ssh versions) like "error: ", "error: PAM: " or "fatal: "
__pref = (?:(?:error|fatal): (?:PAM: )?)?
# optional suffix (logged from several ssh versions) like " [preauth]"
__suff = (?: \[preauth\])?\s*
__on_port_opt = (?: port \d+)?(?: on \S+(?: port \d+)?)?
__suff = (?: (?:port \d+|on \S+|\[preauth\])){0,3}\s*
__on_port_opt = (?: (?:port \d+|on \S+)){0,2}
# single line prefix:
__prefix_line_sl = %(__prefix_line)s%(__pref)s
@ -27,22 +27,25 @@ __prefix_line_ml2 = %(__suff)s$<SKIPLINES>^(?P=__prefix)%(__pref)s
# see ssherr.c for all possible SSH_ERR_..._ALG_MATCH errors.
__alg_match = (?:(?:\w+ (?!found\b)){0,2}\w+)
# PAM authentication mechanism, can be overridden, e. g. `filter = sshd[__pam_auth='pam_ldap']`:
__pam_auth = pam_[a-z]+
[Definition]
cmnfailre = ^%(__prefix_line_sl)s[aA]uthentication (?:failure|error|failed) for .* from <HOST>( via \S+)?\s*%(__suff)s$
^%(__prefix_line_sl)sUser not known to the underlying authentication module for .* from <HOST>\s*%(__suff)s$
^%(__prefix_line_sl)sFailed \S+ for invalid user <F-USER>(?P<cond_user>\S+)|(?:(?! from ).)*?</F-USER> from <HOST>%(__on_port_opt)s(?: ssh\d*)?(?(cond_user): |(?:(?:(?! from ).)*)$)
^%(__prefix_line_sl)sFailed \b(?!publickey)\S+ for (?P<cond_inv>invalid user )?<F-USER>(?P<cond_user>\S+)|(?(cond_inv)(?:(?! from ).)*?|[^:]+)</F-USER> from <HOST>%(__on_port_opt)s(?: ssh\d*)?(?(cond_user): |(?:(?:(?! from ).)*)$)
^%(__prefix_line_sl)sROOT LOGIN REFUSED.* FROM <HOST>\s*%(__suff)s$
^%(__prefix_line_sl)s[iI](?:llegal|nvalid) user .*? from <HOST>%(__on_port_opt)s\s*$
^%(__prefix_line_sl)sROOT LOGIN REFUSED FROM <HOST>
^%(__prefix_line_sl)s[iI](?:llegal|nvalid) user .*? from <HOST>%(__suff)s$
^%(__prefix_line_sl)sUser .+ from <HOST> not allowed because not listed in AllowUsers\s*%(__suff)s$
^%(__prefix_line_sl)sUser .+ from <HOST> not allowed because listed in DenyUsers\s*%(__suff)s$
^%(__prefix_line_sl)sUser .+ from <HOST> not allowed because not in any group\s*%(__suff)s$
^%(__prefix_line_sl)srefused connect from \S+ \(<HOST>\)\s*%(__suff)s$
^%(__prefix_line_sl)srefused connect from \S+ \(<HOST>\)
^%(__prefix_line_sl)sReceived disconnect from <HOST>%(__on_port_opt)s:\s*3: .*: Auth fail%(__suff)s$
^%(__prefix_line_sl)sUser .+ from <HOST> not allowed because a group is listed in DenyGroups\s*%(__suff)s$
^%(__prefix_line_sl)sUser .+ from <HOST> not allowed because none of user's groups are listed in AllowGroups\s*%(__suff)s$
^%(__prefix_line_sl)spam_unix\(sshd:auth\):\s+authentication failure;\s*logname=\S*\s*uid=\d*\s*euid=\d*\s*tty=\S*\s*ruser=\S*\s*rhost=<HOST>\s.*%(__suff)s$
^%(__prefix_line_ml1)s%(__pam_auth)s\(sshd:auth\):\s+authentication failure;\s*logname=\S*\s*uid=\d*\s*euid=\d*\s*tty=\S*\s*ruser=\S*\s*rhost=<HOST>\s.*%(__suff)s$%(__prefix_line_ml2)sConnection closed
^%(__prefix_line_sl)s(error: )?maximum authentication attempts exceeded for .* from <HOST>%(__on_port_opt)s(?: ssh\d*)? \[preauth\]$
^%(__prefix_line_ml1)sUser .+ not allowed because account is locked%(__prefix_line_ml2)sReceived disconnect from <HOST>%(__on_port_opt)s:\s*11: .+%(__suff)s$
^%(__prefix_line_ml1)sDisconnecting: Too many authentication failures(?: for .+?)?%(__suff)s%(__prefix_line_ml2)sConnection closed by <HOST>%(__suff)s$
@ -50,13 +53,13 @@ cmnfailre = ^%(__prefix_line_sl)s[aA]uthentication (?:failure|error|failed) for
mdre-normal =
mdre-ddos = ^%(__prefix_line_sl)sDid not receive identification string from <HOST>%(__on_port_opt)s%(__suff)s
^%(__prefix_line_sl)sConnection reset by <HOST>%(__on_port_opt)s%(__suff)s
mdre-ddos = ^%(__prefix_line_sl)sDid not receive identification string from <HOST>
^%(__prefix_line_sl)sConnection reset by <HOST>
^%(__prefix_line_ml1)sSSH: Server;Ltype: (?:Authname|Version|Kex);Remote: <HOST>-\d+;[A-Z]\w+:.*%(__prefix_line_ml2)sRead from socket failed: Connection reset by peer%(__suff)s$
mdre-extra = ^%(__prefix_line_sl)sReceived disconnect from <HOST>%(__on_port_opt)s:\s*14: No supported authentication methods available%(__suff)s$
mdre-extra = ^%(__prefix_line_sl)sReceived disconnect from <HOST>%(__on_port_opt)s:\s*14: No supported authentication methods available
^%(__prefix_line_sl)sUnable to negotiate with <HOST>%(__on_port_opt)s: no matching <__alg_match> found.
^%(__prefix_line_ml1)sConnection from <HOST>%(__on_port_opt)s%(__prefix_line_ml2)sUnable to negotiate a <__alg_match>%(__suff)s$
^%(__prefix_line_ml1)sConnection from <HOST>%(__on_port_opt)s%(__prefix_line_ml2)sUnable to negotiate a <__alg_match>
^%(__prefix_line_ml1)sConnection from <HOST>%(__on_port_opt)s%(__prefix_line_ml2)sno matching <__alg_match> found:
mdre-aggressive = %(mdre-ddos)s
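The reworked `__suff` pattern now accepts up to three trailing tokens (`port N`, `on HOST`, `[preauth]`) in any order; this can be checked directly with Python's `re` (the `__prefix_line` part is omitted for brevity):

```python
import re

# Same suffix expression as the new __suff definition above.
SUFF = r"(?: (?:port \d+|on \S+|\[preauth\])){0,3}\s*"
rx = re.compile(r"^Connection closed by (?P<host>\S+)" + SUFF + r"$")
```

This is what lets one failregex cover both the old `... [preauth]` form and the newer `... port N [preauth]` log format.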


@ -55,8 +55,8 @@ CLIENT = "fail2ban-client"
SERVER = "fail2ban-server"
BIN = dirname(Fail2banServer.getServerPath())
MAX_WAITTIME = 30 if not unittest.F2B.fast else 5
MID_WAITTIME = MAX_WAITTIME
MAX_WAITTIME = unittest.F2B.maxWaitTime(unittest.F2B.MAX_WAITTIME)
MID_WAITTIME = unittest.F2B.maxWaitTime(unittest.F2B.MID_WAITTIME)
##
# Several wrappers and settings for proper testing:


@ -16,3 +16,5 @@
# apache 2.4
# failJSON: { "time": "2013-12-23T07:49:01", "match": true , "host": "204.232.202.107" }
[Mon Dec 23 07:49:01.981912 2013] [:error] [pid 3790] [client 204.232.202.107:46301] script '/var/www/timthumb.php' not found or unable to stat
# failJSON: { "time": "2018-03-11T08:56:20", "match": true , "host": "192.0.2.106", "desc": "php-fpm error" }
[Sun Mar 11 08:56:20.913548 2018] [proxy_fcgi:error] [pid 742:tid 140142593419008] [client 192.0.2.106:50900] AH01071: Got error 'Primary script unknown\n'


@ -12,3 +12,8 @@ Sep 16 00:44:55 spaceman fail2ban.actions: NOTICE [jail] Ban 10.0.0.7
# failJSON: { "time": "2006-02-13T15:52:30", "match": true , "host": "1.2.3.4", "desc": "Extended with [PID] and padding" }
2006-02-13 15:52:30,388 fail2ban.actions [123]: NOTICE [sendmail] Ban 1.2.3.4
# failJSON: { "time": "2005-01-16T17:11:25", "match": true , "host": "192.0.2.1", "desc": "SYSLOG / systemd-journal without daemon-name" }
Jan 16 17:11:25 testorg fail2ban.actions[6605]: NOTICE [postfix-auth] Ban 192.0.2.1
# failJSON: { "time": "2005-03-05T08:41:28", "match": true , "host": "192.0.2.2", "desc": "SYSLOG / systemd-journal with daemon-name" }
Mar 05 08:41:28 test.org fail2ban-server[11524]: fail2ban.actions [11524]: NOTICE [postfix-auth] Ban 192.0.2.2


@ -24,6 +24,8 @@ Feb 25 14:34:11 belka sshd[31603]: Failed password for invalid user ROOT from aa
# failJSON: { "time": "2005-01-05T01:31:41", "match": true , "host": "1.2.3.4" }
Jan 5 01:31:41 www sshd[1643]: ROOT LOGIN REFUSED FROM 1.2.3.4
# failJSON: { "time": "2005-01-05T01:31:41", "match": true , "host": "1.2.3.4" }
Jan 5 01:31:41 www sshd[1643]: ROOT LOGIN REFUSED FROM 1.2.3.4 port 12345 [preauth]
# failJSON: { "time": "2005-01-05T01:31:41", "match": true , "host": "1.2.3.4" }
Jan 5 01:31:41 www sshd[1643]: ROOT LOGIN REFUSED FROM ::ffff:1.2.3.4
#4
@ -118,7 +120,7 @@ Sep 29 16:28:02 spaceman sshd[16699]: Failed password for dan from 127.0.0.1 por
# failJSON: { "match": false, "desc": "no failure, just cache mlfid (conn-id)" }
Sep 29 16:28:05 localhost sshd[16700]: Connection from 192.0.2.5
# failJSON: { "match": false, "desc": "no failure, just covering mlfid (conn-id) forget" }
Sep 29 16:28:05 localhost sshd[16700]: Connection closed by 192.0.2.5 [preauth]
Sep 29 16:28:05 localhost sshd[16700]: Connection closed by 192.0.2.5
# failJSON: { "time": "2004-09-29T17:15:02", "match": true , "host": "127.0.0.1" }
Sep 29 17:15:02 spaceman sshd[12946]: Failed hostbased for dan from 127.0.0.1 port 45785 ssh2: RSA 8c:e3:aa:0f:64:51:02:f7:14:79:89:3f:65:84:7c:30, client user "dan", client host "localhost.localdomain"
@ -157,7 +159,7 @@ Nov 28 09:16:03 srv sshd[32307]: Postponed publickey for git from 192.0.2.1 port
# failJSON: { "match": false }
Nov 28 09:16:03 srv sshd[32307]: Accepted publickey for git from 192.0.2.1 port 57904 ssh2: DSA 36:48:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx
# failJSON: { "match": false, "desc": "Should be forgotten by success/accepted public key" }
Nov 28 09:16:03 srv sshd[32307]: Connection closed by 192.0.2.1 [preauth]
Nov 28 09:16:03 srv sshd[32307]: Connection closed by 192.0.2.1
# Failure on connect with valid user-name but wrong public keys (retarded to disconnect/too many errors, because of gh-1263):
# failJSON: { "match": false }
@ -177,7 +179,7 @@ Nov 23 21:50:37 sshd[8148]: Connection closed by 61.0.0.1 [preauth]
# failJSON: { "match": false }
Nov 23 21:50:19 sshd[9148]: Disconnecting: Too many authentication failures for root [preauth]
# failJSON: { "match": false , "desc": "Pids don't match" }
Nov 23 21:50:37 sshd[7148]: Connection closed by 61.0.0.1 [preauth]
Nov 23 21:50:37 sshd[7148]: Connection closed by 61.0.0.1
# failJSON: { "time": "2005-07-13T18:44:28", "match": true , "host": "89.24.13.192", "desc": "from gh-289" }
Jul 13 18:44:28 mdop sshd[4931]: Received disconnect from 89.24.13.192: 3: com.jcraft.jsch.JSchException: Auth fail
@ -210,9 +212,46 @@ Apr 27 13:02:04 host sshd[29116]: input_userauth_request: invalid user root [pre
# failJSON: { "time": "2005-04-27T13:02:04", "match": true , "host": "1.2.3.4", "desc": "No Bye-Bye" }
Apr 27 13:02:04 host sshd[29116]: Received disconnect from 1.2.3.4: 11: Normal Shutdown, Thank you for playing [preauth]
# Match sshd auth errors on OpenSUSE systems
# failJSON: { "time": "2015-04-16T20:02:50", "match": true , "host": "222.186.21.217", "desc": "Authentication for user failed" }
2015-04-16T18:02:50.321974+00:00 host sshd[2716]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=222.186.21.217 user=root
# Match sshd auth errors on OpenSUSE systems (gh-1024)
# failJSON: { "match": false, "desc": "No failure until closed or another fail (e. g. F-MLFFORGET by success/accepted password can avoid failure, see gh-2070)" }
2015-04-16T18:02:50.321974+00:00 host sshd[2716]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.0.2.112 user=root
# failJSON: { "time": "2015-04-16T20:02:50", "match": true , "host": "192.0.2.112", "desc": "Should catch failure - no success/no accepted password" }
2015-04-16T18:02:50.568798+00:00 host sshd[2716]: Connection closed by 192.0.2.112 [preauth]
# disable this test-cases block for obsolete multi-line filter (zzz-sshd-obsolete...):
# filterOptions: [{"test.condition":"name=='sshd'"}]
# 2 methods auth: pam_unix and pam_ldap are used in combination (gh-2070), succeeded after "failure" in first method:
# failJSON: { "match": false , "desc": "No failure" }
Mar 7 18:53:20 bar sshd[1556]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=192.0.2.113 user=rda
# failJSON: { "match": false , "desc": "No failure" }
Mar 7 18:53:20 bar sshd[1556]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser=rda rhost=192.0.2.113 [preauth]
# failJSON: { "match": false , "desc": "No failure" }
Mar 7 18:53:20 bar sshd[1556]: Accepted password for rda from 192.0.2.113 port 52100 ssh2
# failJSON: { "match": false , "desc": "No failure" }
Mar 7 18:53:20 bar sshd[1556]: pam_unix(sshd:session): session opened for user rda by (uid=0)
# failJSON: { "match": false , "desc": "No failure" }
Mar 7 18:53:20 bar sshd[1556]: Connection closed by 192.0.2.113
# several attempts, intruder tries to "forget" failed attempts by success login (all 3 attempts with different users):
# failJSON: { "match": false , "desc": "Still no failure (first try)" }
Mar 7 18:53:22 bar sshd[1558]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser=root rhost=192.0.2.114
# failJSON: { "time": "2005-03-07T18:53:23", "match": true , "attempts": 2, "users": ["root", "sudoer"], "host": "192.0.2.114", "desc": "Failure: attempt 2nd user" }
Mar 7 18:53:23 bar sshd[1558]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser=sudoer rhost=192.0.2.114
# failJSON: { "time": "2005-03-07T18:53:24", "match": true , "attempts": 2, "users": ["root", "sudoer", "known"], "host": "192.0.2.114", "desc": "Failure: attempt 3rd user" }
Mar 7 18:53:24 bar sshd[1558]: Accepted password for known from 192.0.2.114 port 52100 ssh2
# failJSON: { "match": false , "desc": "No failure" }
Mar 7 18:53:24 bar sshd[1558]: pam_unix(sshd:session): session opened for user known by (uid=0)
# several attempts, intruder tries to "forget" failed attempts by success login (accepted for other user as in first failed attempt):
# failJSON: { "match": false , "desc": "Still no failure (first try)" }
Mar 7 18:53:32 bar sshd[1559]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser=root rhost=192.0.2.116
# failJSON: { "match": false , "desc": "Still no failure (second try, same user)" }
Mar 7 18:53:32 bar sshd[1559]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser=root rhost=192.0.2.116
# failJSON: { "time": "2005-03-07T18:53:34", "match": true , "attempts": 2, "users": ["root", "known"], "host": "192.0.2.116", "desc": "Failure: attempt 2nd user" }
Mar 7 18:53:34 bar sshd[1559]: Accepted password for known from 192.0.2.116 port 52100 ssh2
# failJSON: { "match": false , "desc": "No failure" }
Mar 7 18:53:38 bar sshd[1559]: Connection closed by 192.0.2.116
# filterOptions: [{"mode": "ddos"}, {"mode": "aggressive"}]
@ -244,6 +283,14 @@ Nov 24 23:46:43 host sshd[32686]: fatal: Read from socket failed: Connection res
# failJSON: { "time": "2005-03-15T09:20:57", "match": true , "host": "192.0.2.39", "desc": "Singleline for connection reset by" }
Mar 15 09:20:57 host sshd[28972]: Connection reset by 192.0.2.39 port 14282 [preauth]
# filterOptions: [{"test.condition":"name=='sshd'", "mode": "ddos"}, {"test.condition":"name=='sshd'", "mode": "aggressive"}]
# failJSON: { "time": "2005-03-15T09:21:01", "match": true , "host": "192.0.2.212", "desc": "DDOS mode causes failure on close within preauth stage" }
Mar 15 09:21:01 host sshd[2717]: Connection closed by 192.0.2.212 [preauth]
# failJSON: { "time": "2005-03-15T09:21:02", "match": true , "host": "192.0.2.212", "desc": "DDOS mode causes failure on close within preauth stage" }
Mar 15 09:21:02 host sshd[2717]: Connection closed by 192.0.2.212 [preauth]
# filterOptions: [{"mode": "extra"}, {"mode": "aggressive"}]


@ -82,10 +82,7 @@ def _killfile(f, name):
_killfile(None, name + '.bak')
def _maxWaitTime(wtime):
if unittest.F2B.fast: # pragma: no cover
wtime /= 10.0
return wtime
_maxWaitTime = unittest.F2B.maxWaitTime
class _tmSerial():
@ -657,12 +654,12 @@ class LogFileMonitor(LogCaptureTestCase):
_killfile(self.file, self.name)
pass
def isModified(self, delay=2.):
def isModified(self, delay=2):
"""Wait up to `delay` sec to assure that it was modified or not
"""
return Utils.wait_for(lambda: self.filter.isModified(self.name), _maxWaitTime(delay))
def notModified(self, delay=2.):
def notModified(self, delay=2):
"""Wait up to `delay` sec as long as it was not modified
"""
return Utils.wait_for(lambda: not self.filter.isModified(self.name), _maxWaitTime(delay))
@ -817,7 +814,7 @@ class CommonMonitorTestCase(unittest.TestCase):
super(CommonMonitorTestCase, self).setUp()
self._failTotal = 0
def waitFailTotal(self, count, delay=1.):
def waitFailTotal(self, count, delay=1):
"""Wait up to `delay` sec to assure that expected failure `count` reached
"""
ret = Utils.wait_for(
@ -826,7 +823,7 @@ class CommonMonitorTestCase(unittest.TestCase):
self._failTotal += count
return ret
def isFilled(self, delay=1.):
def isFilled(self, delay=1):
"""Wait up to `delay` sec to assure that it was modified or not
"""
return Utils.wait_for(self.jail.isFilled, _maxWaitTime(delay))
@ -836,7 +833,7 @@ class CommonMonitorTestCase(unittest.TestCase):
"""
return Utils.wait_for(self.jail.isEmpty, _maxWaitTime(delay))
def waitForTicks(self, ticks, delay=2.):
def waitForTicks(self, ticks, delay=2):
"""Wait up to `delay` sec to assure that it was modified or not
"""
last_ticks = self.filter.ticks
@ -1148,6 +1145,7 @@ def get_monitor_failures_journal_testcase(Filter_): # pragma: systemd no cover
def setUp(self):
"""Call before every test case."""
super(MonitorJournalFailures, self).setUp()
self._runtimeJournal = None
self.test_file = os.path.join(TEST_FILES_DIR, "testcase-journal.log")
self.jail = DummyJail()
self.filter = None
@ -1159,6 +1157,7 @@ def get_monitor_failures_journal_testcase(Filter_): # pragma: systemd no cover
'TEST_FIELD': "1", 'TEST_UUID': self.test_uuid}
def _initFilter(self, **kwargs):
self._getRuntimeJournal() # check journal available
self.filter = Filter_(self.jail, **kwargs)
self.filter.addJournalMatch([
"SYSLOG_IDENTIFIER=fail2ban-testcases",
@ -1179,21 +1178,26 @@ def get_monitor_failures_journal_testcase(Filter_): # pragma: systemd no cover
def _getRuntimeJournal(self):
"""Retrieve current system journal path
If none found, None will be returned
If not found, SkipTest exception will be raised.
"""
# Depending on the system, it could be found under /run or /var/log (e.g. Debian)
# which are pointed to by different systemd-path variables. We will
# check one at a time until the first hit
for systemd_var in 'system-runtime-logs', 'system-state-logs':
tmp = Utils.executeCmd(
'find "$(systemd-path %s)" -name system.journal' % systemd_var,
timeout=10, shell=True, output=True
)
self.assertTrue(tmp)
out = str(tmp[1].decode('utf-8')).split('\n')[0]
if out:
return out
# we can cache it:
if self._runtimeJournal is None:
# Depending on the system, it could be found under /run or /var/log (e.g. Debian)
# which are pointed to by different systemd-path variables. We will
# check one at a time until the first hit
for systemd_var in 'system-runtime-logs', 'system-state-logs':
tmp = Utils.executeCmd(
'find "$(systemd-path %s)" -name system.journal' % systemd_var,
timeout=10, shell=True, output=True
)
self.assertTrue(tmp)
out = str(tmp[1].decode('utf-8')).split('\n')[0]
if out: break
self._runtimeJournal = out
if self._runtimeJournal:
return self._runtimeJournal
raise unittest.SkipTest('systemd journal seems to be not available (e. g. no rights to read)')
def testJournalFilesArg(self):
# retrieve current system journal path
jrnlfile = self._getRuntimeJournal()
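The caching added to `_getRuntimeJournal` follows a simple memoize-or-skip pattern; a self-contained sketch with a stubbed lookup (the real code shells out to `systemd-path` via `Utils.executeCmd`):

```python
import unittest

class JournalLookup(object):
    def __init__(self, finder):
        self._runtimeJournal = None
        self._finder = finder  # stand-in for the systemd-path shell lookup

    def get(self):
        # cache the expensive lookup once; later calls reuse the result
        if self._runtimeJournal is None:
            self._runtimeJournal = self._finder()
        if self._runtimeJournal:
            return self._runtimeJournal
        # unavailable journal skips the test instead of failing it
        raise unittest.SkipTest("systemd journal seems to be not available")
```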


@ -287,6 +287,20 @@ class TestsUtilsTest(LogCaptureTestCase):
self.assertNotLogged('test "xyz"')
self.assertNotLogged('test', 'xyz', all=False)
self.assertNotLogged('test', 'xyz', 'zyx', all=True)
## maxWaitTime:
orgfast, unittest.F2B.fast = unittest.F2B.fast, False
self.assertFalse(isinstance(unittest.F2B.maxWaitTime(True), bool))
self.assertEqual(unittest.F2B.maxWaitTime(lambda: 50)(), 50)
self.assertEqual(unittest.F2B.maxWaitTime(25), 25)
self.assertEqual(unittest.F2B.maxWaitTime(25.), 25.0)
unittest.F2B.fast = True
try:
self.assertEqual(unittest.F2B.maxWaitTime(lambda: 50)(), 50)
self.assertEqual(unittest.F2B.maxWaitTime(25), 2.5)
self.assertEqual(unittest.F2B.maxWaitTime(25.), 25.0)
finally:
unittest.F2B.fast = orgfast
self.assertFalse(unittest.F2B.maxWaitTime(False))
## assertLogged, assertNotLogged negative case:
self.pruneLog()
logSys.debug('test "xyz"')
@ -296,8 +310,12 @@ class TestsUtilsTest(LogCaptureTestCase):
self.assertNotLogged, 'test', 'xyz', all=True)
self._testAssertionErrorRE(r"was not found in the log",
self.assertLogged, 'test', 'zyx', all=True)
self._testAssertionErrorRE(r"was not found in the log, waited 1e-06",
self.assertLogged, 'test', 'zyx', all=True, wait=1e-6)
self._testAssertionErrorRE(r"None among .* was found in the log",
self.assertLogged, 'test_zyx', 'zyx', all=False)
self._testAssertionErrorRE(r"None among .* was found in the log, waited 1e-06",
self.assertLogged, 'test_zyx', 'zyx', all=False, wait=1e-6)
self._testAssertionErrorRE(r"All of the .* were found present in the log",
self.assertNotLogged, 'test', 'xyz', all=False)
## assertDictEqual:


@ -144,12 +144,14 @@ def testSampleRegexsFactory(name, basedir):
regexsUsedRe = set()
# process each test-file (note: array filenames can grow during processing):
faildata = {}
i = 0
while i < len(filenames):
filename = filenames[i]; i += 1;
logFile = fileinput.FileInput(os.path.join(TEST_FILES_DIR, "logs",
filename))
ignoreBlock = False
for line in logFile:
jsonREMatch = re.match("^#+ ?(failJSON|filterOptions|addFILE):(.+)$", line)
if jsonREMatch:
@ -159,9 +161,13 @@ def testSampleRegexsFactory(name, basedir):
if jsonREMatch.group(1) == 'filterOptions':
# following lines with another filter options:
self._filterTests = []
ignoreBlock = False
for opts in (faildata if isinstance(faildata, list) else [faildata]):
# unique filter name (using options combination):
self.assertTrue(isinstance(opts, dict))
if opts.get('test.condition'):
ignoreBlock = not eval(opts.get('test.condition'))
del opts['test.condition']
fltName = opts.get('filterName')
if not fltName: fltName = str(opts) if opts else ''
fltName = name + fltName
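The new `test.condition` option is evaluated with the filter name in scope, and the whole following sample block is skipped when it is false; a reduced sketch of that check (`eval` is acceptable here because the input is trusted test data):

```python
def block_ignored(opts, name):
    # pop so the option does not leak into the filter options proper
    cond = opts.pop('test.condition', None)
    if cond:
        return not eval(cond, {}, {'name': name})
    return False
```

This is what lets one sample log serve both `sshd` and the obsolete `zzz-sshd-obsolete...` multi-line filter, with blocks conditioned on `name=='sshd'`.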
@ -178,10 +184,11 @@ def testSampleRegexsFactory(name, basedir):
raise ValueError("%s: %s:%i" %
(e, logFile.filename(), logFile.filelineno()))
line = next(logFile)
elif line.startswith("#") or not line.strip():
elif ignoreBlock or line.startswith("#") or not line.strip():
continue
else: # pragma: no cover - normally unreachable
faildata = {}
if ignoreBlock: continue
# if filter options was not yet specified:
if not self._filterTests:
@ -195,6 +202,7 @@ def testSampleRegexsFactory(name, basedir):
regexList = flt.getFailRegex()
try:
fail = {}
ret = flt.processLine(line)
if not ret:
# Bypass if filter constraint specified:
@ -222,9 +230,17 @@ def testSampleRegexsFactory(name, basedir):
for k, v in faildata.iteritems():
if k not in ("time", "match", "desc", "filter"):
fv = fail.get(k, None)
# Fallback for backwards compatibility (previously no fid, was host only):
if k == "host" and fv is None:
fv = fid
if fv is None:
# Fallback for backwards compatibility (previously no fid, was host only):
if k == "host":
fv = fid
# special case for attempts counter:
if k == "attempts":
fv = len(fail.get('matches', {}))
# compare sorted (if set)
if isinstance(fv, (set, list, dict)):
self.assertSortedEqual(fv, v)
continue
self.assertEqual(fv, v)
t = faildata.get("time", None)
@ -246,8 +262,12 @@ def testSampleRegexsFactory(name, basedir):
regexsUsedIdx.add(failregex)
regexsUsedRe.add(regexList[failregex])
except AssertionError as e: # pragma: no cover
raise AssertionError("%s: %s on: %s:%i, line:\n%s" % (
fltName, e, logFile.filename(), logFile.filelineno(), line))
import pprint
raise AssertionError("%s: %s on: %s:%i, line:\n%s\n"
"faildata: %s\nfail: %s" % (
fltName, e, logFile.filename(), logFile.filelineno(), line,
'\n'.join(pprint.pformat(faildata).splitlines()),
'\n'.join(pprint.pformat(fail).splitlines())))
# check missing samples for regex using each filter-options combination:
for fltName, flt in self._filters.iteritems():


@ -180,6 +180,10 @@ def initProcess(opts):
class F2B(DefaultTestOptions):
MAX_WAITTIME = 60
MID_WAITTIME = 30
def __init__(self, opts):
self.__dict__ = opts.__dict__
if self.fast:
@ -215,8 +219,12 @@ class F2B(DefaultTestOptions):
return wrapper
return _deco_wrapper
def maxWaitTime(self,wtime):
if self.fast:
def maxWaitTime(self, wtime=True):
if isinstance(wtime, bool) and wtime:
wtime = self.MAX_WAITTIME
# shorten only integer intervals (floats and callables pass through unchanged,
# which also avoids double scaling if conditional waits or wrapping routines
# call this twice):
if self.fast and isinstance(wtime, int):
wtime = float(wtime) / 10
return wtime
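The new `maxWaitTime` semantics, as exercised by the tests earlier in this diff, can be sketched as a free function (an assumed simplification of the method: `True` selects the default maximum, plain ints are shortened tenfold in fast mode, floats and callables pass through):

```python
MAX_WAITTIME = 60  # default upper bound, mirrors F2B.MAX_WAITTIME

def max_wait_time(wtime=True, fast=False):
    # True means "use the default maximum"
    if isinstance(wtime, bool) and wtime:
        wtime = MAX_WAITTIME
    # only plain ints are scaled; floats/callables opt out of fast shortening
    if fast and isinstance(wtime, int):
        wtime = float(wtime) / 10
    return wtime
```

Writing a wait as `25.` (float) or `lambda: 25` is thus the documented way to keep it unscaled even in fast test mode.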
@ -676,7 +684,7 @@ class LogCaptureTestCase(unittest.TestCase):
return self._val
# try to lock, if not possible - return cached/empty (max 5 times):
lck = self._lock.acquire(False)
if not lck: # pargma: no cover (may be too sporadic on slow systems)
if not lck: # pragma: no cover (may be too sporadic on slow systems)
self._nolckCntr += 1
if self._nolckCntr <= 5:
return self._val if self._val is not None else ''
@ -765,21 +773,24 @@ class LogCaptureTestCase(unittest.TestCase):
"""
wait = kwargs.get('wait', None)
if wait:
wait = unittest.F2B.maxWaitTime(wait)
res = Utils.wait_for(lambda: self._is_logged(*s, **kwargs), wait)
else:
res = self._is_logged(*s, **kwargs)
if not kwargs.get('all', False):
# at least one entry should be found:
if not res: # pragma: no cover
if not res:
logged = self._log.getvalue()
self.fail("None among %r was found in the log: ===\n%s===" % (s, logged))
self.fail("None among %r was found in the log%s: ===\n%s===" % (s,
((', waited %s' % wait) if wait else ''), logged))
else:
# each entry should be found:
if not res: # pragma: no cover
if not res:
logged = self._log.getvalue()
for s_ in s:
if s_ not in logged:
self.fail("%r was not found in the log: ===\n%s===" % (s_, logged))
self.fail("%r was not found in the log%s: ===\n%s===" % (s_,
((', waited %s' % wait) if wait else ''), logged))
def assertNotLogged(self, *s, **kwargs):
"""Assert that strings were not logged
@ -796,11 +807,10 @@ class LogCaptureTestCase(unittest.TestCase):
for s_ in s:
if s_ not in logged:
return
if True: # pragma: no cover
self.fail("All of the %r were found present in the log: ===\n%s===" % (s, logged))
self.fail("All of the %r were found present in the log: ===\n%s===" % (s, logged))
else:
for s_ in s:
if s_ in logged: # pragma: no cover
if s_ in logged:
self.fail("%r was found in the log: ===\n%s===" % (s_, logged))
def pruneLog(self, logphase=None):
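The `wait=` handling in `assertLogged` above delegates to a conditional wait like `Utils.wait_for`; a hedged sketch of such a polling helper (the real helper also supports callable timeouts, which this keeps out for brevity):

```python
import time

def wait_for(cond, timeout, interval=0.01):
    """Poll cond() until it returns truthy or timeout (seconds) elapses."""
    deadline = time.time() + timeout
    while True:
        res = cond()
        if res:
            return res
        if time.time() >= deadline:
            return res  # falsy: timed out
        time.sleep(interval)
```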