Merge branch 'master' into journaldefault

Conflicts:
	ChangeLog
pull/1225/head
Orion Poplawski 2015-11-13 15:22:59 -07:00
commit c656cb0d36
16 changed files with 1046 additions and 607 deletions

.mailmap Normal file

@ -0,0 +1,5 @@
Lee Clemens <java@leeclemens.net>
Serg G. Brester <info@sebres.de>
Serg G. Brester <serg.brester@sebres.de>
Serg G. Brester <sergey.brester@W7-DEHBG0189.wincor-nixdorf.com>
Viktor Szépe <viktor@szepe.net>


@ -22,9 +22,25 @@ ver. 0.9.4 (2015/XX/XXX) - wanna-be-released
different log messages), which addresses different behavior on different
exit codes of dash and bash (gh-1155)
* Fix jail.conf.5 man's section (gh-1226)
* Fixed default banaction for allports jails like pam-generic, recidive, etc.
  with the new default variable `banaction_allports` (gh-1216)
* Fixed `fail2ban-regex` stopping on an invalid (wrongly encoded) character
  for Python versions < 3.x (gh-1248)
* Use postfix_log logpath for postfix-rbl jail
- New Features:
* New interpolation feature for definition config readers - `<known/parameter>`
  (meaning the last known init definition of a filter or action parameter named
  `parameter`). This interpolation makes it possible to extend parameters of a
  stock filter or action directly in a jail inside the jail.local file, without
  creating a separate filter.d/*.local file.
  It complements the interpolation `%(known/parameter)s`, which does not work
  for filter and action init parameters (see the sketch after this ChangeLog
  excerpt).
* New filters:
- openhab - home automation (domotic) software; authentication failures via
  the REST API and web interface (gh-1223)
- nginx-limit-req - ban hosts that exceeded the nginx request processing
  rate limit (ngx_http_limit_req_module)
- Enhancements:
* Do not rotate empty log files
@ -38,6 +54,7 @@ ver. 0.9.4 (2015/XX/XXX) - wanna-be-released
(Thanks Pablo Rodriguez Fernandez)
* Enhance filter against attackers' fake Googlebot PTR records
(gh-1226)
* Nginx log paths extended (prefixed with "*" wildcard) (gh-1237)
* Added filter for openhab home automation (domotic) software authentication
  failures via the REST API and web interface (gh-1223)
* Add *_backend options for services to allow distros to set the default
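
As a hedged illustration of the `<known/parameter>` tag described in the feature entry above: a minimal sketch, assuming a hypothetical filter named `test` with init parameters `test.method` and `baduseragents` (modelled on the jail.conf(5) example further down in this diff; it is not a stock filter).

# filter.d/test.conf (illustrative):
[Init]
test.method = GET
baduseragents = IE|wget

[Definition]
failregex = ^client <HOST> "<test.method>" useragent=(?:<baduseragents>)\s*$

# jail.local - override the method and extend the known bad agents directly
# in the jail, without creating a separate filter.d/test.local file:
[test]
enabled = true
filter  = test[test.method=POST, baduseragents="badagent|<known/baduseragents>"]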


@ -29,563 +29,6 @@ __author__ = "Fail2Ban Developers"
__copyright__ = "Copyright (c) 2004-2008 Cyril Jaquier, 2012-2014 Yaroslav Halchenko"
__license__ = "GPL"
import getopt
import locale
import logging
import os
import shlex
import sys
import time
import time
import urllib
from optparse import OptionParser, Option
from fail2ban.client.fail2banregex import exec_command_line
from ConfigParser import NoOptionError, NoSectionError, MissingSectionHeaderError
try:
from systemd import journal
from fail2ban.server.filtersystemd import FilterSystemd
except ImportError:
journal = None
from fail2ban.version import version
from fail2ban.client.filterreader import FilterReader
from fail2ban.server.filter import Filter
from fail2ban.server.failregex import RegexException
from fail2ban.helpers import FormatterWithTraceBack, getLogger
# Gets the instance of the logger.
logSys = getLogger("fail2ban")
def debuggexURL(sample, regex):
q = urllib.urlencode({ 're': regex.replace('<HOST>', '(?&.ipv4)'),
'str': sample,
'flavor': 'python' })
return 'http://www.debuggex.com/?' + q
def shortstr(s, l=53):
"""Return shortened string
"""
if len(s) > l:
return s[:l-3] + '...'
return s
def pprint_list(l, header=None):
if not len(l):
return
if header:
s = "|- %s\n" % header
else:
s = ''
print s + "| " + "\n| ".join(l) + '\n`-'
def file_lines_gen(hdlr):
for line in hdlr:
try:
line = line.decode(fail2banRegex.encoding, 'strict')
except UnicodeDecodeError:
if sys.version_info >= (3,): # Python 3 must be decoded
line = line.decode(fail2banRegex.encoding, 'ignore')
yield line
def journal_lines_gen(myjournal):
while True:
try:
entry = myjournal.get_next()
except OSError:
continue
if not entry:
break
yield FilterSystemd.formatJournalEntry(entry)
def get_opt_parser():
# use module docstring for help output
p = OptionParser(
usage="%s [OPTIONS] <LOG> <REGEX> [IGNOREREGEX]\n" % sys.argv[0] + __doc__
+ """
LOG:
string a string representing a log line
filename path to a log file (/var/log/auth.log)
"systemd-journal" search systemd journal (systemd-python required)
REGEX:
string a string representing a 'failregex'
filename path to a filter file (filter.d/sshd.conf)
IGNOREREGEX:
string a string representing an 'ignoreregex'
filename path to a filter file (filter.d/sshd.conf)
Copyright (c) 2004-2008 Cyril Jaquier, 2008- Fail2Ban Contributors
Copyright of modifications held by their respective authors.
Licensed under the GNU General Public License v2 (GPL).
Written by Cyril Jaquier <cyril.jaquier@fail2ban.org>.
Many contributions by Yaroslav O. Halchenko and Steven Hiscocks.
Report bugs to https://github.com/fail2ban/fail2ban/issues
""",
version="%prog " + version)
p.add_options([
Option("-d", "--datepattern",
help="set custom pattern used to match date/times"),
Option("-e", "--encoding",
help="File encoding. Default: system locale"),
Option("-L", "--maxlines", type=int, default=0,
help="maxlines for multi-line regex"),
Option("-m", "--journalmatch",
help="journalctl style matches overriding filter file. "
"\"systemd-journal\" only"),
Option('-l', "--log-level", type="choice",
dest="log_level",
choices=('heavydebug', 'debug', 'info', 'notice', 'warning', 'error', 'critical'),
default=None,
help="Log level for the Fail2Ban logger to use"),
Option("-v", "--verbose", action='store_true',
help="Be verbose in output"),
Option("-D", "--debuggex", action='store_true',
help="Produce debuggex.com urls for debugging there"),
Option("--print-no-missed", action='store_true',
help="Do not print any missed lines"),
Option("--print-no-ignored", action='store_true',
help="Do not print any ignored lines"),
Option("--print-all-matched", action='store_true',
help="Print all matched lines"),
Option("--print-all-missed", action='store_true',
help="Print all missed lines, no matter how many"),
Option("--print-all-ignored", action='store_true',
help="Print all ignored lines, no matter how many"),
Option("-t", "--log-traceback", action='store_true',
help="Enrich log-messages with compressed tracebacks"),
Option("--full-traceback", action='store_true',
help="Either to make the tracebacks full, not compressed (as by default)"),
])
return p
class RegexStat(object):
def __init__(self, failregex):
self._stats = 0
self._failregex = failregex
self._ipList = list()
def __str__(self):
return "%s(%r) %d failed: %s" \
% (self.__class__, self._failregex, self._stats, self._ipList)
def inc(self):
self._stats += 1
def getStats(self):
return self._stats
def getFailRegex(self):
return self._failregex
def appendIP(self, value):
self._ipList.append(value)
def getIPList(self):
return self._ipList
class LineStats(object):
"""Just a convenience container for stats
"""
def __init__(self):
self.tested = self.matched = 0
self.matched_lines = []
self.missed = 0
self.missed_lines = []
self.missed_lines_timeextracted = []
self.ignored = 0
self.ignored_lines = []
self.ignored_lines_timeextracted = []
def __str__(self):
return "%(tested)d lines, %(ignored)d ignored, %(matched)d matched, %(missed)d missed" % self
# just for convenient str
def __getitem__(self, key):
return getattr(self, key)
class Fail2banRegex(object):
def __init__(self, opts):
self._verbose = opts.verbose
self._debuggex = opts.debuggex
self._maxlines = 20
self._print_no_missed = opts.print_no_missed
self._print_no_ignored = opts.print_no_ignored
self._print_all_matched = opts.print_all_matched
self._print_all_missed = opts.print_all_missed
self._print_all_ignored = opts.print_all_ignored
self._maxlines_set = False # so we allow to override maxlines in cmdline
self._datepattern_set = False
self._journalmatch = None
self.share_config=dict()
self._filter = Filter(None)
self._ignoreregex = list()
self._failregex = list()
self._time_elapsed = None
self._line_stats = LineStats()
if opts.maxlines:
self.setMaxLines(opts.maxlines)
if opts.journalmatch is not None:
self.setJournalMatch(opts.journalmatch.split())
if opts.datepattern:
self.setDatePattern(opts.datepattern)
if opts.encoding:
self.encoding = opts.encoding
else:
self.encoding = locale.getpreferredencoding()
def setDatePattern(self, pattern):
if not self._datepattern_set:
self._filter.setDatePattern(pattern)
self._datepattern_set = True
if pattern is not None:
print "Use datepattern : %s" % (
self._filter.getDatePattern()[1], )
def setMaxLines(self, v):
if not self._maxlines_set:
self._filter.setMaxLines(int(v))
self._maxlines_set = True
print "Use maxlines : %d" % self._filter.getMaxLines()
def setJournalMatch(self, v):
if self._journalmatch is None:
self._journalmatch = v
def readRegex(self, value, regextype):
assert(regextype in ('fail', 'ignore'))
regex = regextype + 'regex'
if os.path.isfile(value) or os.path.isfile(value + '.conf'):
if os.path.basename(os.path.dirname(value)) == 'filter.d':
## within filter.d folder - use standard loading algorithm to load filter completely (with .local etc.):
basedir = os.path.dirname(os.path.dirname(value))
value = os.path.splitext(os.path.basename(value))[0]
print "Use %11s filter file : %s, basedir: %s" % (regex, value, basedir)
reader = FilterReader(value, 'fail2ban-regex-jail', {}, share_config=self.share_config, basedir=basedir)
if not reader.read():
print "ERROR: failed to load filter %s" % value
return False
else:
## foreign file - readexplicit this file and includes if possible:
print "Use %11s file : %s" % (regex, value)
reader = FilterReader(value, 'fail2ban-regex-jail', {}, share_config=self.share_config)
reader.setBaseDir(None)
if not reader.readexplicit():
print "ERROR: failed to read %s" % value
return False
reader.getOptions(None)
readercommands = reader.convert()
regex_values = [
RegexStat(m[3])
for m in filter(
lambda x: x[0] == 'set' and x[2] == "add%sregex" % regextype,
readercommands)]
# Read out and set possible value of maxlines
for command in readercommands:
if command[2] == "maxlines":
maxlines = int(command[3])
try:
self.setMaxLines(maxlines)
except ValueError:
print "ERROR: Invalid value for maxlines (%(maxlines)r) " \
"read from %(value)s" % locals()
return False
elif command[2] == 'addjournalmatch':
journalmatch = command[3:]
self.setJournalMatch(journalmatch)
elif command[2] == 'datepattern':
datepattern = command[3]
self.setDatePattern(datepattern)
else:
print "Use %11s line : %s" % (regex, shortstr(value))
regex_values = [RegexStat(value)]
setattr(self, "_" + regex, regex_values)
for regex in regex_values:
getattr(
self._filter,
'add%sRegex' % regextype.title())(regex.getFailRegex())
return True
def testIgnoreRegex(self, line):
found = False
try:
ret = self._filter.ignoreLine([(line, "", "")])
if ret is not None:
found = True
regex = self._ignoreregex[ret].inc()
except RegexException, e:
print e
return False
return found
def testRegex(self, line, date=None):
orgLineBuffer = self._filter._Filter__lineBuffer
fullBuffer = len(orgLineBuffer) >= self._filter.getMaxLines()
try:
line, ret = self._filter.processLine(line, date, checkAllRegex=True)
for match in ret:
# Append True/False flag depending if line was matched by
# more than one regex
match.append(len(ret)>1)
regex = self._failregex[match[0]]
regex.inc()
regex.appendIP(match)
except RegexException, e:
print e
return False
except IndexError:
print "Sorry, but no <HOST> found in regex"
return False
for bufLine in orgLineBuffer[int(fullBuffer):]:
if bufLine not in self._filter._Filter__lineBuffer:
try:
self._line_stats.missed_lines.pop(
self._line_stats.missed_lines.index("".join(bufLine)))
self._line_stats.missed_lines_timeextracted.pop(
self._line_stats.missed_lines_timeextracted.index(
"".join(bufLine[::2])))
except ValueError:
pass
else:
self._line_stats.matched += 1
self._line_stats.missed -= 1
return line, ret
def process(self, test_lines):
t0 = time.time()
for line_no, line in enumerate(test_lines):
if isinstance(line, tuple):
line_datetimestripped, ret = fail2banRegex.testRegex(
line[0], line[1])
line = "".join(line[0])
else:
line = line.rstrip('\r\n')
if line.startswith('#') or not line:
# skip comment and empty lines
continue
line_datetimestripped, ret = fail2banRegex.testRegex(line)
is_ignored = fail2banRegex.testIgnoreRegex(line_datetimestripped)
if is_ignored:
self._line_stats.ignored += 1
if not self._print_no_ignored and (self._print_all_ignored or self._line_stats.ignored <= self._maxlines + 1):
self._line_stats.ignored_lines.append(line)
self._line_stats.ignored_lines_timeextracted.append(line_datetimestripped)
if len(ret) > 0:
assert(not is_ignored)
self._line_stats.matched += 1
if self._print_all_matched:
self._line_stats.matched_lines.append(line)
else:
if not is_ignored:
self._line_stats.missed += 1
if not self._print_no_missed and (self._print_all_missed or self._line_stats.missed <= self._maxlines + 1):
self._line_stats.missed_lines.append(line)
self._line_stats.missed_lines_timeextracted.append(line_datetimestripped)
self._line_stats.tested += 1
if line_no % 10 == 0 and self._filter.dateDetector is not None:
self._filter.dateDetector.sortTemplate()
self._time_elapsed = time.time() - t0
def printLines(self, ltype):
lstats = self._line_stats
assert(self._line_stats.missed == lstats.tested - (lstats.matched + lstats.ignored))
lines = lstats[ltype]
l = lstats[ltype + '_lines']
if lines:
header = "%s line(s):" % (ltype.capitalize(),)
if self._debuggex:
if ltype == 'missed' or ltype == 'matched':
regexlist = self._failregex
else:
regexlist = self._ignoreregex
l = lstats[ltype + '_lines_timeextracted']
if lines < self._maxlines or getattr(self, '_print_all_' + ltype):
ans = [[]]
for arg in [l, regexlist]:
ans = [ x + [y] for x in ans for y in arg ]
b = map(lambda a: a[0] + ' | ' + a[1].getFailRegex() + ' | ' + debuggexURL(a[0], a[1].getFailRegex()), ans)
pprint_list([x.rstrip() for x in b], header)
else:
print "%s too many to print. Use --print-all-%s " \
"to print all %d lines" % (header, ltype, lines)
elif lines < self._maxlines or getattr(self, '_print_all_' + ltype):
pprint_list([x.rstrip() for x in l], header)
else:
print "%s too many to print. Use --print-all-%s " \
"to print all %d lines" % (header, ltype, lines)
def printStats(self):
print
print "Results"
print "======="
def print_failregexes(title, failregexes):
# Print title
total, out = 0, []
for cnt, failregex in enumerate(failregexes):
match = failregex.getStats()
total += match
if (match or self._verbose):
out.append("%2d) [%d] %s" % (cnt+1, match, failregex.getFailRegex()))
if self._verbose and len(failregex.getIPList()):
for ip in failregex.getIPList():
timeTuple = time.localtime(ip[2])
timeString = time.strftime("%a %b %d %H:%M:%S %Y", timeTuple)
out.append(
" %s %s%s" % (
ip[1],
timeString,
ip[-1] and " (multiple regex matched)" or ""))
print "\n%s: %d total" % (title, total)
pprint_list(out, " #) [# of hits] regular expression")
return total
# Print title
total = print_failregexes("Failregex", self._failregex)
_ = print_failregexes("Ignoreregex", self._ignoreregex)
if self._filter.dateDetector is not None:
print "\nDate template hits:"
out = []
for template in self._filter.dateDetector.templates:
if self._verbose or template.hits:
out.append("[%d] %s" % (
template.hits, template.name))
pprint_list(out, "[# of hits] date format")
print "\nLines: %s" % self._line_stats,
if self._time_elapsed is not None:
print "[processed in %.2f sec]" % self._time_elapsed,
print
if self._print_all_matched:
self.printLines('matched')
if not self._print_no_ignored:
self.printLines('ignored')
if not self._print_no_missed:
self.printLines('missed')
return True
if __name__ == "__main__":
parser = get_opt_parser()
(opts, args) = parser.parse_args()
if opts.print_no_missed and opts.print_all_missed:
sys.stderr.write("ERROR: --print-no-missed and --print-all-missed are mutually exclusive.\n\n")
parser.print_help()
sys.exit(-1)
if opts.print_no_ignored and opts.print_all_ignored:
sys.stderr.write("ERROR: --print-no-ignored and --print-all-ignored are mutually exclusive.\n\n")
parser.print_help()
sys.exit(-1)
print
print "Running tests"
print "============="
print
fail2banRegex = Fail2banRegex(opts)
# We need 2 or 3 parameters
if not len(args) in (2, 3):
sys.stderr.write("ERROR: provide both <LOG> and <REGEX>.\n\n")
parser.print_help()
sys.exit(-1)
# TODO: taken from -testcases -- move common functionality somewhere
if opts.log_level is not None: # pragma: no cover
# so we had explicit settings
logSys.setLevel(getattr(logging, opts.log_level.upper()))
else: # pragma: no cover
# suppress the logging but it would leave unittests' progress dots
# ticking, unless like with '-l critical' which would be silent
# unless error occurs
logSys.setLevel(getattr(logging, 'CRITICAL'))
# Add the default logging handler
stdout = logging.StreamHandler(sys.stdout)
fmt = 'D: %(message)s'
if opts.log_traceback:
Formatter = FormatterWithTraceBack
fmt = (opts.full_traceback and ' %(tb)s' or ' %(tbc)s') + fmt
else:
Formatter = logging.Formatter
# Custom log format for the verbose tests runs
if opts.verbose: # pragma: no cover
stdout.setFormatter(Formatter(' %(asctime)-15s %(thread)s' + fmt))
else: # pragma: no cover
# just prefix with the space
stdout.setFormatter(Formatter(fmt))
logSys.addHandler(stdout)
cmd_log, cmd_regex = args[:2]
fail2banRegex.readRegex(cmd_regex, 'fail') or sys.exit(-1)
if len(args) == 3:
fail2banRegex.readRegex(args[2], 'ignore') or sys.exit(-1)
if os.path.isfile(cmd_log):
try:
hdlr = open(cmd_log, 'rb')
print "Use log file : %s" % cmd_log
print "Use encoding : %s" % fail2banRegex.encoding
test_lines = file_lines_gen(hdlr)
except IOError, e:
print e
sys.exit(-1)
elif cmd_log == "systemd-journal":
if not journal:
print "Error: systemd library not found. Exiting..."
sys.exit(-1)
myjournal = journal.Reader(converters={'__CURSOR': lambda x: x})
journalmatch = fail2banRegex._journalmatch
fail2banRegex.setDatePattern(None)
if journalmatch:
try:
for element in journalmatch:
if element == "+":
myjournal.add_disjunction()
else:
myjournal.add_match(element)
except ValueError:
print "Error: Invalid journalmatch: %s" % shortstr(" ".join(journalmatch))
sys.exit(-1)
print "Use journal match : %s" % " ".join(journalmatch)
test_lines = journal_lines_gen(myjournal)
else:
print "Use single line : %s" % shortstr(cmd_log)
test_lines = [ cmd_log ]
print
fail2banRegex.process(test_lines)
fail2banRegex.printStats() or sys.exit(-1)
exec_command_line()


@ -0,0 +1,45 @@
# Fail2ban filter configuration for nginx :: limit_req
# used to ban hosts that exceeded the nginx request processing rate limit
#
# Author: Serg G. Brester (sebres)
#
# To use the 'nginx-limit-req' filter you should have `ngx_http_limit_req_module`
# enabled, and `limit_req` and `limit_req_zone` defined as described in the nginx
# documentation http://nginx.org/en/docs/http/ngx_http_limit_req_module.html
#
# Example:
#
# http {
# ...
# limit_req_zone $binary_remote_addr zone=lr_zone:10m rate=1r/s;
# ...
# # http, server, or location:
# location ... {
# limit_req zone=lr_zone burst=1 nodelay;
# ...
# }
# ...
# }
# ...
#
[Definition]
# Specify the following expression to define exact zones, if you want to ban
# only IPs that were rate-limited in the specified zones.
# Example:
#
# ngx_limit_req_zones = lr_zone|lr_zone2
#
ngx_limit_req_zones = [^"]+
# Use the following full expression if you need to limit matching to specific
# servers, requests, referrers, etc. only:
#
# failregex = ^\s*\[error\] \d+#\d+: \*\d+ limiting requests, excess: [\d\.]+ by zone "(?:%(ngx_limit_req_zones)s)", client: <HOST>, server: \S*, request: "\S+ \S+ HTTP/\d+\.\d+", host: "\S+"(, referrer: "\S+")?\s*$
# Shorter, much faster and more stable version of the regexp:
failregex = ^\s*\[error\] \d+#\d+: \*\d+ limiting requests, excess: [\d\.]+ by zone "(?:%(ngx_limit_req_zones)s)", client: <HOST>
ignoreregex =
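
A minimal .local sketch of the zone restriction described in the comments above; the zone names lr_zone and lr_zone2 are only the illustrative names used in the example and must match your own limit_req_zone definitions.

# filter.d/nginx-limit-req.local (sketch)
[Definition]
# ban only clients that were rate-limited in these zones:
ngx_limit_req_zones = lr_zone|lr_zone2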


@ -80,7 +80,7 @@ maxretry = 5
# auto: will try to use the following backends, in order:
# pyinotify, gamin, polling.
#
# Note: if systemd backend is choses as the default but you enable a jail
# Note: if systemd backend is chosen as the default but you enable a jail
# for which logs are present only in its own log files, specify some other
# backend for that jail (e.g. polling) and provide empty value for
# journalmatch. See https://github.com/fail2ban/fail2ban/issues/959#issuecomment-74901200
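
A minimal jail.local sketch of the workaround described in the note above, for a jail whose log exists only in its own file while systemd is the default backend; the jail name `myapp`, its filter and its logpath are hypothetical.

[myapp]
enabled      = true
# fall back to a polling backend and give journalmatch an empty value:
backend      = polling
journalmatch =
logpath      = /var/log/myapp.log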
@ -154,6 +154,7 @@ port = 0:65535
# action_* variables. Can be overridden globally or per
# section within jail.local file
banaction = iptables-multiport
banaction_allports = iptables-allports
# The simplest action to take: ban only
action_ = %(banaction)s[name=%(__name__)s, bantime="%(bantime)s", port="%(port)s", protocol="%(protocol)s", chain="%(chain)s"]
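
A minimal jail.local sketch overriding the new `banaction_allports` default for every allports jail (recidive, pam-generic, etc.); the `firewallcmd-allports` action is an assumption about what is available in the local action.d directory.

[DEFAULT]
# use firewalld instead of iptables for jails that ban all ports:
banaction_allports = firewallcmd-allports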
@ -320,6 +321,14 @@ logpath = /opt/openhab/logs/request.log
port = http,https
logpath = %(nginx_error_log)s
# To use the 'nginx-limit-req' jail you should have `ngx_http_limit_req_module`
# enabled, and `limit_req` and `limit_req_zone` defined as described in the nginx
# documentation http://nginx.org/en/docs/http/ngx_http_limit_req_module.html
# or see 'config/filter.d/nginx-limit-req.conf' for an example
[nginx-limit-req]
port = http,https
logpath = %(nginx_error_log)s
[nginx-botsearch]
port = http,https
@ -728,7 +737,7 @@ maxretry = 5
[recidive]
logpath = /var/log/fail2ban.log
banaction = iptables-allports
banaction = %(banaction_allports)s
bantime = 604800 ; 1 week
findtime = 86400 ; 1 day
maxretry = 5
@ -739,7 +748,7 @@ maxretry = 5
[pam-generic]
# pam-generic filter can be customized to monitor specific subset of 'tty's
banaction = iptables-allports
banaction = %(banaction_allports)s
logpath = %(syslog_authpriv)s
backend = %(syslog_backend)s
@ -788,7 +797,7 @@ maxretry = 1
enabled = false
logpath = /opt/sun/comms/messaging64/log/mail.log_current
maxretry = 6
banaction = iptables-allports
banaction = %(banaction_allports)s
[directadmin]
enabled = false


@ -30,9 +30,9 @@ auditd_log = /var/log/audit/audit.log
exim_main_log = /var/log/exim/mainlog
nginx_error_log = /var/log/nginx/error.log
nginx_error_log = /var/log/nginx/*error.log
nginx_access_log = /var/log/nginx/access.log
nginx_access_log = /var/log/nginx/*access.log
lighttpd_error_log = /var/log/lighttpd/error.log


@ -285,8 +285,10 @@ class DefinitionInitConfigReader(ConfigReader):
if self.has_section("Init"):
for opt in self.options("Init"):
v = self.get("Init", opt)
self._initOpts['known/'+opt] = v
if not opt in self._initOpts:
self._initOpts[opt] = self.get("Init", opt)
self._initOpts[opt] = v
def convert(self):
raise NotImplementedError

fail2ban/client/fail2banregex.py Executable file

@ -0,0 +1,599 @@
#!/usr/bin/python
# emacs: -*- mode: python; py-indent-offset: 4; indent-tabs-mode: t -*-
# vi: set ft=python sts=4 ts=4 sw=4 noet :
#
# This file is part of Fail2Ban.
#
# Fail2Ban is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# Fail2Ban is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with Fail2Ban; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
"""
Fail2Ban reads log files that contain password failure reports
and bans the corresponding IP addresses using firewall rules.
This tool can test regular expressions for "fail2ban".
"""
__author__ = "Fail2Ban Developers"
__copyright__ = "Copyright (c) 2004-2008 Cyril Jaquier, 2012-2014 Yaroslav Halchenko"
__license__ = "GPL"
import getopt
import locale
import logging
import os
import shlex
import sys
import time
import time
import urllib
from optparse import OptionParser, Option
from ConfigParser import NoOptionError, NoSectionError, MissingSectionHeaderError
try:
from systemd import journal
from ..server.filtersystemd import FilterSystemd
except ImportError:
journal = None
from ..version import version
from .filterreader import FilterReader
from ..server.filter import Filter, FileContainer
from ..server.failregex import RegexException
from ..helpers import FormatterWithTraceBack, getLogger
# Gets the instance of the logger.
logSys = getLogger("fail2ban")
def debuggexURL(sample, regex):
q = urllib.urlencode({ 're': regex.replace('<HOST>', '(?&.ipv4)'),
'str': sample,
'flavor': 'python' })
return 'http://www.debuggex.com/?' + q
def output(args):
print(args)
def shortstr(s, l=53):
"""Return shortened string
"""
if len(s) > l:
return s[:l-3] + '...'
return s
def pprint_list(l, header=None):
if not len(l):
return
if header:
s = "|- %s\n" % header
else:
s = ''
output( s + "| " + "\n| ".join(l) + '\n`-' )
def journal_lines_gen(myjournal):
while True:
try:
entry = myjournal.get_next()
except OSError:
continue
if not entry:
break
yield FilterSystemd.formatJournalEntry(entry)
def get_opt_parser():
# use module docstring for help output
p = OptionParser(
usage="%s [OPTIONS] <LOG> <REGEX> [IGNOREREGEX]\n" % sys.argv[0] + __doc__
+ """
LOG:
string a string representing a log line
filename path to a log file (/var/log/auth.log)
"systemd-journal" search systemd journal (systemd-python required)
REGEX:
string a string representing a 'failregex'
filename path to a filter file (filter.d/sshd.conf)
IGNOREREGEX:
string a string representing an 'ignoreregex'
filename path to a filter file (filter.d/sshd.conf)
Copyright (c) 2004-2008 Cyril Jaquier, 2008- Fail2Ban Contributors
Copyright of modifications held by their respective authors.
Licensed under the GNU General Public License v2 (GPL).
Written by Cyril Jaquier <cyril.jaquier@fail2ban.org>.
Many contributions by Yaroslav O. Halchenko and Steven Hiscocks.
Report bugs to https://github.com/fail2ban/fail2ban/issues
""",
version="%prog " + version)
p.add_options([
Option("-d", "--datepattern",
help="set custom pattern used to match date/times"),
Option("-e", "--encoding",
help="File encoding. Default: system locale"),
Option("-L", "--maxlines", type=int, default=0,
help="maxlines for multi-line regex"),
Option("-m", "--journalmatch",
help="journalctl style matches overriding filter file. "
"\"systemd-journal\" only"),
Option('-l', "--log-level", type="choice",
dest="log_level",
choices=('heavydebug', 'debug', 'info', 'notice', 'warning', 'error', 'critical'),
default=None,
help="Log level for the Fail2Ban logger to use"),
Option("-v", "--verbose", action='store_true',
help="Be verbose in output"),
Option("-D", "--debuggex", action='store_true',
help="Produce debuggex.com urls for debugging there"),
Option("--print-no-missed", action='store_true',
help="Do not print any missed lines"),
Option("--print-no-ignored", action='store_true',
help="Do not print any ignored lines"),
Option("--print-all-matched", action='store_true',
help="Print all matched lines"),
Option("--print-all-missed", action='store_true',
help="Print all missed lines, no matter how many"),
Option("--print-all-ignored", action='store_true',
help="Print all ignored lines, no matter how many"),
Option("-t", "--log-traceback", action='store_true',
help="Enrich log-messages with compressed tracebacks"),
Option("--full-traceback", action='store_true',
help="Either to make the tracebacks full, not compressed (as by default)"),
])
return p
class RegexStat(object):
def __init__(self, failregex):
self._stats = 0
self._failregex = failregex
self._ipList = list()
def __str__(self):
return "%s(%r) %d failed: %s" \
% (self.__class__, self._failregex, self._stats, self._ipList)
def inc(self):
self._stats += 1
def getStats(self):
return self._stats
def getFailRegex(self):
return self._failregex
def appendIP(self, value):
self._ipList.append(value)
def getIPList(self):
return self._ipList
class LineStats(object):
"""Just a convenience container for stats
"""
def __init__(self):
self.tested = self.matched = 0
self.matched_lines = []
self.missed = 0
self.missed_lines = []
self.missed_lines_timeextracted = []
self.ignored = 0
self.ignored_lines = []
self.ignored_lines_timeextracted = []
def __str__(self):
return "%(tested)d lines, %(ignored)d ignored, %(matched)d matched, %(missed)d missed" % self
# just for convenient str
def __getitem__(self, key):
return getattr(self, key) if hasattr(self, key) else ''
class Fail2banRegex(object):
def __init__(self, opts):
self._verbose = opts.verbose
self._debuggex = opts.debuggex
self._maxlines = 20
self._print_no_missed = opts.print_no_missed
self._print_no_ignored = opts.print_no_ignored
self._print_all_matched = opts.print_all_matched
self._print_all_missed = opts.print_all_missed
self._print_all_ignored = opts.print_all_ignored
self._maxlines_set = False # so we allow to override maxlines in cmdline
self._datepattern_set = False
self._journalmatch = None
self.share_config=dict()
self._filter = Filter(None)
self._ignoreregex = list()
self._failregex = list()
self._time_elapsed = None
self._line_stats = LineStats()
if opts.maxlines:
self.setMaxLines(opts.maxlines)
if opts.journalmatch is not None:
self.setJournalMatch(opts.journalmatch.split())
if opts.datepattern:
self.setDatePattern(opts.datepattern)
if opts.encoding:
self.encoding = opts.encoding
else:
self.encoding = locale.getpreferredencoding()
def decode_line(self, line):
return FileContainer.decode_line('<LOG>', self.encoding, line)
def encode_line(self, line):
return line.encode(self.encoding, 'ignore')
def setDatePattern(self, pattern):
if not self._datepattern_set:
self._filter.setDatePattern(pattern)
self._datepattern_set = True
if pattern is not None:
output( "Use datepattern : %s" % (
self._filter.getDatePattern()[1], ) )
def setMaxLines(self, v):
if not self._maxlines_set:
self._filter.setMaxLines(int(v))
self._maxlines_set = True
output( "Use maxlines : %d" % self._filter.getMaxLines() )
def setJournalMatch(self, v):
if self._journalmatch is None:
self._journalmatch = v
def readRegex(self, value, regextype):
assert(regextype in ('fail', 'ignore'))
regex = regextype + 'regex'
if os.path.isfile(value) or os.path.isfile(value + '.conf'):
if os.path.basename(os.path.dirname(value)) == 'filter.d':
## within filter.d folder - use standard loading algorithm to load filter completely (with .local etc.):
basedir = os.path.dirname(os.path.dirname(value))
value = os.path.splitext(os.path.basename(value))[0]
output( "Use %11s filter file : %s, basedir: %s" % (regex, value, basedir) )
reader = FilterReader(value, 'fail2ban-regex-jail', {}, share_config=self.share_config, basedir=basedir)
if not reader.read():
output( "ERROR: failed to load filter %s" % value )
return False
else:
## foreign file - readexplicit this file and includes if possible:
output( "Use %11s file : %s" % (regex, value) )
reader = FilterReader(value, 'fail2ban-regex-jail', {}, share_config=self.share_config)
reader.setBaseDir(None)
if not reader.readexplicit():
output( "ERROR: failed to read %s" % value )
return False
reader.getOptions(None)
readercommands = reader.convert()
regex_values = [
RegexStat(m[3])
for m in filter(
lambda x: x[0] == 'set' and x[2] == "add%sregex" % regextype,
readercommands)]
# Read out and set possible value of maxlines
for command in readercommands:
if command[2] == "maxlines":
maxlines = int(command[3])
try:
self.setMaxLines(maxlines)
except ValueError:
output( "ERROR: Invalid value for maxlines (%(maxlines)r) " \
"read from %(value)s" % locals() )
return False
elif command[2] == 'addjournalmatch':
journalmatch = command[3:]
self.setJournalMatch(journalmatch)
elif command[2] == 'datepattern':
datepattern = command[3]
self.setDatePattern(datepattern)
else:
output( "Use %11s line : %s" % (regex, shortstr(value)) )
regex_values = [RegexStat(value)]
setattr(self, "_" + regex, regex_values)
for regex in regex_values:
getattr(
self._filter,
'add%sRegex' % regextype.title())(regex.getFailRegex())
return True
def testIgnoreRegex(self, line):
found = False
try:
ret = self._filter.ignoreLine([(line, "", "")])
if ret is not None:
found = True
regex = self._ignoreregex[ret].inc()
except RegexException, e:
output( e )
return False
return found
def testRegex(self, line, date=None):
orgLineBuffer = self._filter._Filter__lineBuffer
fullBuffer = len(orgLineBuffer) >= self._filter.getMaxLines()
try:
line, ret = self._filter.processLine(line, date, checkAllRegex=True)
for match in ret:
# Append True/False flag depending if line was matched by
# more than one regex
match.append(len(ret)>1)
regex = self._failregex[match[0]]
regex.inc()
regex.appendIP(match)
except RegexException, e:
output( e )
return False
except IndexError:
output( "Sorry, but no <HOST> found in regex" )
return False
for bufLine in orgLineBuffer[int(fullBuffer):]:
if bufLine not in self._filter._Filter__lineBuffer:
try:
self._line_stats.missed_lines.pop(
self._line_stats.missed_lines.index("".join(bufLine)))
self._line_stats.missed_lines_timeextracted.pop(
self._line_stats.missed_lines_timeextracted.index(
"".join(bufLine[::2])))
except ValueError:
pass
else:
self._line_stats.matched += 1
self._line_stats.missed -= 1
return line, ret
def process(self, test_lines):
t0 = time.time()
for line_no, line in enumerate(test_lines):
if isinstance(line, tuple):
line_datetimestripped, ret = self.testRegex(
line[0], line[1])
line = "".join(line[0])
else:
line = line.rstrip('\r\n')
if line.startswith('#') or not line:
# skip comment and empty lines
continue
line_datetimestripped, ret = self.testRegex(line)
is_ignored = self.testIgnoreRegex(line_datetimestripped)
if is_ignored:
self._line_stats.ignored += 1
if not self._print_no_ignored and (self._print_all_ignored or self._line_stats.ignored <= self._maxlines + 1):
self._line_stats.ignored_lines.append(line)
self._line_stats.ignored_lines_timeextracted.append(line_datetimestripped)
if len(ret) > 0:
assert(not is_ignored)
self._line_stats.matched += 1
if self._print_all_matched:
self._line_stats.matched_lines.append(line)
else:
if not is_ignored:
self._line_stats.missed += 1
if not self._print_no_missed and (self._print_all_missed or self._line_stats.missed <= self._maxlines + 1):
self._line_stats.missed_lines.append(line)
self._line_stats.missed_lines_timeextracted.append(line_datetimestripped)
self._line_stats.tested += 1
if line_no % 10 == 0 and self._filter.dateDetector is not None:
self._filter.dateDetector.sortTemplate()
self._time_elapsed = time.time() - t0
def printLines(self, ltype):
lstats = self._line_stats
assert(self._line_stats.missed == lstats.tested - (lstats.matched + lstats.ignored))
lines = lstats[ltype]
l = lstats[ltype + '_lines']
if lines:
header = "%s line(s):" % (ltype.capitalize(),)
if self._debuggex:
if ltype == 'missed' or ltype == 'matched':
regexlist = self._failregex
else:
regexlist = self._ignoreregex
l = lstats[ltype + '_lines_timeextracted']
if lines < self._maxlines or getattr(self, '_print_all_' + ltype):
ans = [[]]
for arg in [l, regexlist]:
ans = [ x + [y] for x in ans for y in arg ]
b = map(lambda a: a[0] + ' | ' + a[1].getFailRegex() + ' | ' +
debuggexURL(self.encode_line(a[0]), a[1].getFailRegex()), ans)
pprint_list([x.rstrip() for x in b], header)
else:
output( "%s too many to print. Use --print-all-%s " \
"to print all %d lines" % (header, ltype, lines) )
elif lines < self._maxlines or getattr(self, '_print_all_' + ltype):
pprint_list([x.rstrip() for x in l], header)
else:
output( "%s too many to print. Use --print-all-%s " \
"to print all %d lines" % (header, ltype, lines) )
def printStats(self):
output( "" )
output( "Results" )
output( "=======" )
def print_failregexes(title, failregexes):
# Print title
total, out = 0, []
for cnt, failregex in enumerate(failregexes):
match = failregex.getStats()
total += match
if (match or self._verbose):
out.append("%2d) [%d] %s" % (cnt+1, match, failregex.getFailRegex()))
if self._verbose and len(failregex.getIPList()):
for ip in failregex.getIPList():
timeTuple = time.localtime(ip[2])
timeString = time.strftime("%a %b %d %H:%M:%S %Y", timeTuple)
out.append(
" %s %s%s" % (
ip[1],
timeString,
ip[-1] and " (multiple regex matched)" or ""))
output( "\n%s: %d total" % (title, total) )
pprint_list(out, " #) [# of hits] regular expression")
return total
# Print title
total = print_failregexes("Failregex", self._failregex)
_ = print_failregexes("Ignoreregex", self._ignoreregex)
if self._filter.dateDetector is not None:
output( "\nDate template hits:" )
out = []
for template in self._filter.dateDetector.templates:
if self._verbose or template.hits:
out.append("[%d] %s" % (
template.hits, template.name))
pprint_list(out, "[# of hits] date format")
output( "\nLines: %s" % self._line_stats, )
if self._time_elapsed is not None:
output( "[processed in %.2f sec]" % self._time_elapsed, )
output( "" )
if self._print_all_matched:
self.printLines('matched')
if not self._print_no_ignored:
self.printLines('ignored')
if not self._print_no_missed:
self.printLines('missed')
return True
def file_lines_gen(self, hdlr):
for line in hdlr:
yield self.decode_line(line)
def start(self, opts, args):
cmd_log, cmd_regex = args[:2]
if not self.readRegex(cmd_regex, 'fail'):
return False
if len(args) == 3 and not self.readRegex(args[2], 'ignore'):
return False
if os.path.isfile(cmd_log):
try:
hdlr = open(cmd_log, 'rb')
output( "Use log file : %s" % cmd_log )
output( "Use encoding : %s" % self.encoding )
test_lines = self.file_lines_gen(hdlr)
except IOError, e:
output( e )
return False
elif cmd_log == "systemd-journal": # pragma: no cover
if not journal:
output( "Error: systemd library not found. Exiting..." )
return False
myjournal = journal.Reader(converters={'__CURSOR': lambda x: x})
journalmatch = self._journalmatch
self.setDatePattern(None)
if journalmatch:
try:
for element in journalmatch:
if element == "+":
myjournal.add_disjunction()
else:
myjournal.add_match(element)
except ValueError:
output( "Error: Invalid journalmatch: %s" % shortstr(" ".join(journalmatch)) )
return False
output( "Use journal match : %s" % " ".join(journalmatch) )
test_lines = journal_lines_gen(myjournal)
else:
output( "Use single line : %s" % shortstr(cmd_log) )
test_lines = [ cmd_log ]
output( "" )
self.process(test_lines)
if not self.printStats():
return False
return True
def exec_command_line():
parser = get_opt_parser()
(opts, args) = parser.parse_args()
if opts.print_no_missed and opts.print_all_missed:
sys.stderr.write("ERROR: --print-no-missed and --print-all-missed are mutually exclusive.\n\n")
parser.print_help()
sys.exit(-1)
if opts.print_no_ignored and opts.print_all_ignored:
sys.stderr.write("ERROR: --print-no-ignored and --print-all-ignored are mutually exclusive.\n\n")
parser.print_help()
sys.exit(-1)
# We need 2 or 3 parameters
if not len(args) in (2, 3):
sys.stderr.write("ERROR: provide both <LOG> and <REGEX>.\n\n")
parser.print_help()
return False
output( "" )
output( "Running tests" )
output( "=============" )
output( "" )
# TODO: taken from -testcases -- move common functionality somewhere
if opts.log_level is not None:
# so we had explicit settings
logSys.setLevel(getattr(logging, opts.log_level.upper()))
else:
# suppress the logging but it would leave unittests' progress dots
# ticking, unless like with '-l critical' which would be silent
# unless error occurs
logSys.setLevel(getattr(logging, 'CRITICAL'))
# Add the default logging handler
stdout = logging.StreamHandler(sys.stdout)
fmt = 'D: %(message)s'
if opts.log_traceback:
Formatter = FormatterWithTraceBack
fmt = (opts.full_traceback and ' %(tb)s' or ' %(tbc)s') + fmt
else:
Formatter = logging.Formatter
# Custom log format for the verbose tests runs
if opts.verbose:
stdout.setFormatter(Formatter(' %(asctime)-15s %(thread)s' + fmt))
else:
# just prefix with the space
stdout.setFormatter(Formatter(fmt))
logSys.addHandler(stdout)
fail2banRegex = Fail2banRegex(opts)
if not fail2banRegex.start(opts, args):
sys.exit(-1)


@ -792,23 +792,27 @@ class FileContainer:
self.__handler.seek(self.__pos)
return True
def readline(self):
if self.__handler is None:
return ""
line = self.__handler.readline()
@staticmethod
def decode_line(filename, enc, line):
try:
line = line.decode(self.getEncoding(), 'strict')
line = line.decode(enc, 'strict')
except UnicodeDecodeError:
logSys.warning(
"Error decoding line from '%s' with '%s'."
" Consider setting logencoding=utf-8 (or another appropriate"
" encoding) for this jail. Continuing"
" to process line ignoring invalid characters: %r" %
(self.getFileName(), self.getEncoding(), line))
(filename, enc, line))
# decode with replacing error chars:
line = line.decode(self.getEncoding(), 'replace')
line = line.decode(enc, 'replace')
return line
def readline(self):
if self.__handler is None:
return ""
return FileContainer.decode_line(
self.getFileName(), self.getEncoding(), self.__handler.readline())
def close(self):
if not self.__handler is None:
# Saves the last position.


@ -165,11 +165,11 @@ class JailReaderTest(LogCaptureTestCase):
self.__share_cfg = {}
def testIncorrectJail(self):
jail = JailReader('XXXABSENTXXX', basedir=CONFIG_DIR, share_config = self.__share_cfg)
jail = JailReader('XXXABSENTXXX', basedir=CONFIG_DIR, share_config=self.__share_cfg)
self.assertRaises(ValueError, jail.read)
def testJailActionEmpty(self):
jail = JailReader('emptyaction', basedir=IMPERFECT_CONFIG, share_config = self.__share_cfg)
jail = JailReader('emptyaction', basedir=IMPERFECT_CONFIG, share_config=self.__share_cfg)
self.assertTrue(jail.read())
self.assertTrue(jail.getOptions())
self.assertTrue(jail.isEnabled())
@ -177,7 +177,7 @@ class JailReaderTest(LogCaptureTestCase):
self.assertLogged('No actions were defined for emptyaction')
def testJailActionFilterMissing(self):
jail = JailReader('missingbitsjail', basedir=IMPERFECT_CONFIG, share_config = self.__share_cfg)
jail = JailReader('missingbitsjail', basedir=IMPERFECT_CONFIG, share_config=self.__share_cfg)
self.assertTrue(jail.read())
self.assertFalse(jail.getOptions())
self.assertTrue(jail.isEnabled())
@ -200,7 +200,7 @@ class JailReaderTest(LogCaptureTestCase):
if STOCK:
def testStockSSHJail(self):
jail = JailReader('sshd', basedir=CONFIG_DIR, share_config = self.__share_cfg) # we are running tests from root project dir atm
jail = JailReader('sshd', basedir=CONFIG_DIR, share_config=self.__share_cfg) # we are running tests from root project dir atm
self.assertTrue(jail.read())
self.assertTrue(jail.getOptions())
self.assertFalse(jail.isEnabled())
@ -274,6 +274,10 @@ class JailReaderTest(LogCaptureTestCase):
class FilterReaderTest(unittest.TestCase):
def __init__(self, *args, **kwargs):
super(FilterReaderTest, self).__init__(*args, **kwargs)
self.__share_cfg = {}
def testConvert(self):
output = [['set', 'testcase01', 'addfailregex',
"^\\s*(?:\\S+ )?(?:kernel: \\[\\d+\\.\\d+\\] )?(?:@vserver_\\S+ )"
@ -311,9 +315,8 @@ class FilterReaderTest(unittest.TestCase):
# is unreliable
self.assertEqual(sorted(filterReader.convert()), sorted(output))
filterReader = FilterReader(
"testcase01", "testcase01", {'maxlines': "5"})
filterReader.setBaseDir(TEST_FILES_DIR)
filterReader = FilterReader("testcase01", "testcase01", {'maxlines': "5"},
share_config=self.__share_cfg, basedir=TEST_FILES_DIR)
filterReader.read()
#filterReader.getOptions(["failregex", "ignoreregex"])
filterReader.getOptions(None)
@ -322,8 +325,8 @@ class FilterReaderTest(unittest.TestCase):
def testFilterReaderSubstitionDefault(self):
output = [['set', 'jailname', 'addfailregex', 'to=sweet@example.com fromip=<IP>']]
filterReader = FilterReader('substition', "jailname", {})
filterReader.setBaseDir(TEST_FILES_DIR)
filterReader = FilterReader('substition', "jailname", {},
share_config=self.__share_cfg, basedir=TEST_FILES_DIR)
filterReader.read()
filterReader.getOptions(None)
c = filterReader.convert()
@ -331,16 +334,34 @@ class FilterReaderTest(unittest.TestCase):
def testFilterReaderSubstitionSet(self):
output = [['set', 'jailname', 'addfailregex', 'to=sour@example.com fromip=<IP>']]
filterReader = FilterReader('substition', "jailname", {'honeypot': 'sour@example.com'})
filterReader.setBaseDir(TEST_FILES_DIR)
filterReader = FilterReader('substition', "jailname", {'honeypot': 'sour@example.com'},
share_config=self.__share_cfg, basedir=TEST_FILES_DIR)
filterReader.read()
filterReader.getOptions(None)
c = filterReader.convert()
self.assertEqual(sorted(c), sorted(output))
def testFilterReaderSubstitionKnown(self):
output = [['set', 'jailname', 'addfailregex', 'to=test,sweet@example.com,test2,sweet@example.com fromip=<IP>']]
filterName, filterOpt = JailReader.extractOptions(
'substition[honeypot="<sweet>,<known/honeypot>", sweet="test,<known/honeypot>,test2"]')
filterReader = FilterReader('substition', "jailname", filterOpt,
share_config=self.__share_cfg, basedir=TEST_FILES_DIR)
filterReader.read()
filterReader.getOptions(None)
c = filterReader.convert()
self.assertEqual(sorted(c), sorted(output))
def testFilterReaderSubstitionFail(self):
filterReader = FilterReader('substition', "jailname", {'honeypot': '<sweet>', 'sweet': '<honeypot>'})
filterReader.setBaseDir(TEST_FILES_DIR)
# directly subst the same var :
filterReader = FilterReader('substition', "jailname", {'honeypot': '<honeypot>'},
share_config=self.__share_cfg, basedir=TEST_FILES_DIR)
filterReader.read()
filterReader.getOptions(None)
self.assertRaises(ValueError, FilterReader.convert, filterReader)
# cross subst the same var :
filterReader = FilterReader('substition', "jailname", {'honeypot': '<sweet>', 'sweet': '<honeypot>'},
share_config=self.__share_cfg, basedir=TEST_FILES_DIR)
filterReader.read()
filterReader.getOptions(None)
self.assertRaises(ValueError, FilterReader.convert, filterReader)
@ -508,12 +529,13 @@ class JailsReaderTest(LogCaptureTestCase):
if jail == 'INCLUDES':
continue
filterName = jails.get(jail, 'filter')
filterName, filterOpt = JailReader.extractOptions(filterName)
allFilters.add(filterName)
self.assertTrue(len(filterName))
# moreover we must have a file for it
# and it must be readable as a Filter
filterReader = FilterReader(filterName, jail, {})
filterReader.setBaseDir(CONFIG_DIR)
filterReader = FilterReader(filterName, jail, filterOpt,
share_config=self.__share_cfg, basedir=CONFIG_DIR)
self.assertTrue(filterReader.read(),"Failed to read filter:" + filterName) # opens fine
filterReader.getOptions({}) # reads fine
@ -551,7 +573,10 @@ class JailsReaderTest(LogCaptureTestCase):
filters = set(os.path.splitext(os.path.split(a)[1])[0]
for a in glob.glob(os.path.join('config', 'filter.d', '*.conf'))
if not a.endswith('common.conf'))
filters_jail = set(jail.options['filter'] for jail in jails.jails)
# get filters of all jails (filter names without options inside filter[...])
filters_jail = set(
JailReader.extractOptions(jail.options['filter'])[0] for jail in jails.jails
)
self.maxDiff = None
self.assertTrue(filters.issubset(filters_jail),
"More filters exists than are referenced in stock jail.conf %r" % filters.difference(filters_jail))


@ -0,0 +1,181 @@
# emacs: -*- mode: python; py-indent-offset: 4; indent-tabs-mode: t -*-
# vi: set ft=python sts=4 ts=4 sw=4 noet :
# This file is part of Fail2Ban.
#
# Fail2Ban is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# Fail2Ban is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with Fail2Ban; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
# Fail2Ban developers
__author__ = "Serg Brester"
__copyright__ = "Copyright (c) 2015 Serg G. Brester (sebres), 2008- Fail2Ban Contributors"
__license__ = "GPL"
from __builtin__ import open as fopen
import unittest
import getpass
import os
import sys
import time
import tempfile
import uuid
try:
from systemd import journal
except ImportError:
journal = None
from ..client import fail2banregex
from ..client.fail2banregex import Fail2banRegex, get_opt_parser, output
from .utils import LogCaptureTestCase, logSys
fail2banregex.logSys = logSys
def _test_output(*args):
logSys.info(args[0])
fail2banregex.output = _test_output
CONF_FILES_DIR = os.path.abspath(
os.path.join(os.path.dirname(__file__),"..", "..", "config"))
TEST_FILES_DIR = os.path.join(os.path.dirname(__file__), "files")
def _Fail2banRegex(*args):
parser = get_opt_parser()
(opts, args) = parser.parse_args(list(args))
return (opts, args, Fail2banRegex(opts))
class Fail2banRegexTest(LogCaptureTestCase):
RE_00 = r"(?:(?:Authentication failure|Failed [-/\w+]+) for(?: [iI](?:llegal|nvalid) user)?|[Ii](?:llegal|nvalid) user|ROOT LOGIN REFUSED) .*(?: from|FROM) <HOST>"
FILENAME_01 = os.path.join(TEST_FILES_DIR, "testcase01.log")
FILENAME_02 = os.path.join(TEST_FILES_DIR, "testcase02.log")
FILENAME_WRONGCHAR = os.path.join(TEST_FILES_DIR, "testcase-wrong-char.log")
FILTER_SSHD = os.path.join(CONF_FILES_DIR, 'filter.d', 'sshd.conf')
def setUp(self):
"""Call before every test case."""
LogCaptureTestCase.setUp(self)
def tearDown(self):
"""Call after every test case."""
LogCaptureTestCase.tearDown(self)
def testWrongRE(self):
(opts, args, fail2banRegex) = _Fail2banRegex(
"test", r".** from <HOST>$"
)
self.assertRaises(Exception, lambda: fail2banRegex.start(opts, args))
self.assertLogged("Unable to compile regular expression")
def testWrongIngnoreRE(self):
(opts, args, fail2banRegex) = _Fail2banRegex(
"test", r".*? from <HOST>$", r".**"
)
self.assertRaises(Exception, lambda: fail2banRegex.start(opts, args))
self.assertLogged("Unable to compile regular expression")
def testDirectFound(self):
(opts, args, fail2banRegex) = _Fail2banRegex(
"--print-all-matched", "--print-no-missed",
"Dec 31 11:59:59 [sshd] error: PAM: Authentication failure for kevin from 192.0.2.0",
r"Authentication failure for .*? from <HOST>$"
)
self.assertTrue(fail2banRegex.start(opts, args))
self.assertLogged('Lines: 1 lines, 0 ignored, 1 matched, 0 missed')
def testDirectNotFound(self):
(opts, args, fail2banRegex) = _Fail2banRegex(
"--print-all-missed",
"Dec 31 11:59:59 [sshd] error: PAM: Authentication failure for kevin from 192.0.2.0",
r"XYZ from <HOST>$"
)
self.assertTrue(fail2banRegex.start(opts, args))
self.assertLogged('Lines: 1 lines, 0 ignored, 0 matched, 1 missed')
def testDirectIgnored(self):
(opts, args, fail2banRegex) = _Fail2banRegex(
"--print-all-ignored",
"Dec 31 11:59:59 [sshd] error: PAM: Authentication failure for kevin from 192.0.2.0",
r"Authentication failure for .*? from <HOST>$",
r"kevin from 192.0.2.0$"
)
self.assertTrue(fail2banRegex.start(opts, args))
self.assertLogged('Lines: 1 lines, 1 ignored, 0 matched, 0 missed')
def testDirectRE_1(self):
(opts, args, fail2banRegex) = _Fail2banRegex(
"--print-all-matched",
Fail2banRegexTest.FILENAME_01,
Fail2banRegexTest.RE_00
)
self.assertTrue(fail2banRegex.start(opts, args))
self.assertLogged('Lines: 19 lines, 0 ignored, 13 matched, 6 missed')
self.assertLogged('Error decoding line');
self.assertLogged('Continuing to process line ignoring invalid characters')
self.assertLogged('Dez 31 11:59:59 [sshd] error: PAM: Authentication failure for kevin from 193.168.0.128')
self.assertLogged('Dec 31 11:59:59 [sshd] error: PAM: Authentication failure for kevin from 87.142.124.10')
def testDirectRE_2(self):
(opts, args, fail2banRegex) = _Fail2banRegex(
"--print-all-matched",
Fail2banRegexTest.FILENAME_02,
Fail2banRegexTest.RE_00
)
self.assertTrue(fail2banRegex.start(opts, args))
self.assertLogged('Lines: 13 lines, 0 ignored, 5 matched, 8 missed')
def testVerbose(self):
(opts, args, fail2banRegex) = _Fail2banRegex(
"--verbose", "--print-no-missed",
Fail2banRegexTest.FILENAME_02,
Fail2banRegexTest.RE_00
)
self.assertTrue(fail2banRegex.start(opts, args))
self.assertLogged('Lines: 13 lines, 0 ignored, 5 matched, 8 missed')
self.assertLogged('141.3.81.106 Fri Aug 14 11:53:59 2015')
self.assertLogged('141.3.81.106 Fri Aug 14 11:54:59 2015')
def testWronChar(self):
(opts, args, fail2banRegex) = _Fail2banRegex(
Fail2banRegexTest.FILENAME_WRONGCHAR, Fail2banRegexTest.FILTER_SSHD
)
self.assertTrue(fail2banRegex.start(opts, args))
self.assertLogged('Lines: 4 lines, 0 ignored, 2 matched, 2 missed')
self.assertLogged('Error decoding line');
self.assertLogged('Continuing to process line ignoring invalid characters:', '2015-01-14 20:00:58 user ');
self.assertLogged('Continuing to process line ignoring invalid characters:', '2015-01-14 20:00:59 user ');
self.assertLogged('Nov 8 00:16:12 main sshd[32548]: input_userauth_request: invalid user llinco')
self.assertLogged('Nov 8 00:16:12 main sshd[32547]: pam_succeed_if(sshd:auth): error retrieving information about user llinco')
def testWronCharDebuggex(self):
(opts, args, fail2banRegex) = _Fail2banRegex(
"--debuggex", "--print-all-matched",
Fail2banRegexTest.FILENAME_WRONGCHAR, Fail2banRegexTest.FILTER_SSHD
)
self.assertTrue(fail2banRegex.start(opts, args))
self.assertLogged('Lines: 4 lines, 0 ignored, 2 matched, 2 missed')
self.assertLogged('http://')


@ -0,0 +1,6 @@
# failJSON: { "time": "2015-10-29T20:01:02", "match": true , "host": "1.2.3.4" }
2015/10/29 20:01:02 [error] 256554#0: *99927 limiting requests, excess: 1.852 by zone "one", client: 1.2.3.4, server: example.com, request: "POST /index.htm HTTP/1.0", host: "exmaple.com"
# failJSON: { "time": "2015-10-29T19:24:05", "match": true , "host": "192.0.2.0" }
2015/10/29 19:24:05 [error] 12684#12684: *22174 limiting requests, excess: 1.495 by zone "one", client: 192.0.2.0, server: example.com, request: "GET /index.php HTTP/1.1", host: "example.com", referrer: "https://example.com"


@ -0,0 +1,4 @@
Nov 8 00:16:12 main sshd[32547]: Invalid user llinco\361ir from 192.0.2.0
Nov 8 00:16:12 main sshd[32548]: input_userauth_request: invalid user llinco\361ir
Nov 8 00:16:12 main sshd[32547]: pam_succeed_if(sshd:auth): error retrieving information about user llincoñir
Nov 8 00:16:14 main sshd[32547]: Failed password for invalid user llinco\361ir from 192.0.2.0 port 57025 ssh2


@ -90,7 +90,11 @@ def _assert_equal_entries(utest, found, output, count=None):
found_time, output_time = \
MyTime.localtime(found[2]),\
MyTime.localtime(output[2])
utest.assertEqual(found_time, output_time)
try:
utest.assertEqual(found_time, output_time)
except AssertionError as e:
# assert more structured:
utest.assertEqual((float(found[2]), found_time), (float(output[2]), output_time))
if len(output) > 3 and count is None: # match matches
# do not check if custom count (e.g. going through them twice)
if os.linesep != '\n' or sys.platform.startswith('cygwin'):
@ -216,6 +220,14 @@ class BasicFilter(unittest.TestCase):
("^%Y-%m-%d-%H%M%S.%f %z",
"^Year-Month-Day-24hourMinuteSecond.Microseconds Zone offset"))
def testAssertWrongTime(self):
self.assertRaises(AssertionError,
lambda: _assert_equal_entries(self,
('1.1.1.1', 1, 1421262060.0),
('1.1.1.1', 1, 1421262059.0),
1)
)
class IgnoreIP(LogCaptureTestCase):
@ -810,7 +822,7 @@ def get_monitor_failures_journal_testcase(Filter_): # pragma: systemd no cover
return MonitorJournalFailures
class GetFailures(unittest.TestCase):
class GetFailures(LogCaptureTestCase):
FILENAME_01 = os.path.join(TEST_FILES_DIR, "testcase01.log")
FILENAME_02 = os.path.join(TEST_FILES_DIR, "testcase02.log")
@ -825,6 +837,7 @@ class GetFailures(unittest.TestCase):
def setUp(self):
"""Call before every test case."""
LogCaptureTestCase.setUp(self)
setUpMyTime()
self.jail = DummyJail()
self.filter = FileFilter(self.jail)
@ -836,6 +849,7 @@ class GetFailures(unittest.TestCase):
def tearDown(self):
"""Call after every test case."""
tearDownMyTime()
LogCaptureTestCase.tearDown(self)
def testTail(self):
self.filter.addLogPath(GetFailures.FILENAME_01, tail=True)
@ -900,6 +914,41 @@ class GetFailures(unittest.TestCase):
except FailManagerEmpty:
pass
def testGetFailuresWrongChar(self):
# write wrong utf-8 char:
fname = tempfile.mktemp(prefix='tmp_fail2ban', suffix='crlf')
fout = fopen(fname, 'wb')
try:
# write:
for l in (
b'2015-01-14 20:00:58 user \"test\xf1ing\" from \"192.0.2.0\"\n', # wrong utf-8 char
b'2015-01-14 20:00:59 user \"\xd1\xe2\xe5\xf2\xe0\" from \"192.0.2.0\"\n', # wrong utf-8 chars
b'2015-01-14 20:01:00 user \"testing\" from \"192.0.2.0\"\n' # correct utf-8 chars
):
fout.write(l)
fout.close()
#
output = ('192.0.2.0', 3, 1421262060.0)
failregex = "^\s*user \"[^\"]*\" from \"<HOST>\"\s*$"
# test encoding auto or direct set of encoding:
for enc in (None, 'utf-8', 'ascii'):
if enc is not None:
self.tearDown();self.setUp();
self.filter.setLogEncoding(enc);
self.assertNotLogged('Error decoding line');
self.filter.addLogPath(fname)
self.filter.addFailRegex(failregex)
self.filter.getFailures(fname)
_assert_correct_last_attempt(self, self.filter, output)
self.assertLogged('Error decoding line');
self.assertLogged('Continuing to process line ignoring invalid characters:', '2015-01-14 20:00:58 user ');
self.assertLogged('Continuing to process line ignoring invalid characters:', '2015-01-14 20:00:59 user ');
finally:
_killfile(fout, fname)
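The behaviour exercised by this test can also be reproduced interactively with fail2ban-regex against a file containing such lines (the path below is hypothetical; the regex is the one used in the test):

  fail2ban-regex /tmp/broken-encoding.log '^\s*user "[^"]*" from "<HOST>"\s*$'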
def testGetFailuresUseDNS(self):
# We should still catch failures with usedns = no ;-)
output_yes = ('93.184.216.34', 2, 1124013539.0,

View File

@ -85,6 +85,7 @@ def gatherTests(regexps=None, no_network=False):
from . import misctestcase
from . import databasetestcase
from . import samplestestcase
from . import fail2banregextestcase
if not regexps: # pragma: no cover
tests = unittest.TestSuite()
@ -152,6 +153,9 @@ def gatherTests(regexps=None, no_network=False):
# Filter Regex tests with sample logs
tests.addTest(unittest.makeSuite(samplestestcase.FilterSamplesRegex))
# bin/fail2ban-regex
tests.addTest(unittest.makeSuite(fail2banregextestcase.Fail2banRegexTest))
#
# Python action testcases
#
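With this addition the new bin/fail2ban-regex test cases become part of the default suite. Assuming the usual test driver, and that it accepts a pattern selecting tests by name, they can be run roughly like this (a sketch):

  ./fail2ban-testcases                      # full suite, now including Fail2banRegexTest
  ./fail2ban-testcases Fail2banRegexTest    # only the new cases, if name filtering is supported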

View File

@ -1,4 +1,4 @@
.TH JAIL.CONF "5" "October 2013" "Fail2Ban" "Fail2Ban Configuration"
.TH JAIL.CONF "5" "November 2015" "Fail2Ban" "Fail2Ban Configuration"
.SH NAME
jail.conf \- configuration for the fail2ban server
.SH SYNOPSIS
@ -89,16 +89,36 @@ indicates that the specified file is to be parsed before the current file.
indicates that the specified file is to be parsed after the current file.
.RE
Using Python "string interpolation" mechanisms, other definitions are allowed and can later be used within other definitions as %(name)s. For example.
Using Python "string interpolation" mechanisms, other definitions are allowed and can later be used within other definitions as %(name)s.
Additionally fail2ban has an extended interpolation feature named \fB%(known/parameter)s\fR (meaning the last known option with the name \fBparameter\fR). This interpolation makes it possible to extend a stock filter or jail regexp in a .local file (as opposed to simply setting failregex/ignoreregex, which overwrites it), e.g.
.RS
.nf
baduseragents = IE|wget
.RE
.RS
failregex = useragent=%(baduseragents)s
failregex = %(known/failregex)s
useragent=%(baduseragents)s
.fi
.RE
Comments: use '#' for comment lines and '; ' (space is important) for inline comments. When using Python2.X '; ' can only be used on the first line due to an Python library bug.
In addition to the interpolation \fB%(known/parameter)s\fR, which does not work for filter/action init parameters, an interpolation tag \fB<known/parameter>\fR can be used (meaning the last known init definition of a filter or action with the name \fBparameter\fR). This interpolation makes it possible to extend the parameters of a stock filter or action directly in the jail inside the \fIjail.conf/jail.local\fR file, without creating a separate filter.d/*.local file, e.g.
.RS
# filter.d/test.conf:
.nf
[Init]
test.method = GET
baduseragents = IE|wget
[Definition]
failregex = ^%(__prefix_line)s\\s+"<test.method>"\\s+test\\s+regexp\\s+-\\s+useragent=(?:<baduseragents>)
# jail.local:
[test]
# use filter "test", overwrite method to "POST" and extend known bad agents with "badagent":
filter = test[test.method=POST, baduseragents="badagent|<known/baduseragents>"]
.fi
.RE
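For clarity, the jail definition above yields roughly the following expansions (illustrative only):

  <test.method>    -> POST
  <baduseragents>  -> badagent|IE|wget
  failregex        -> ... useragent=(?:badagent|IE|wget)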
Comments: use '#' for comment lines and '; ' (space is important) for inline comments. When using Python2.X, '; ' can only be used on the first line due to a Python library bug.
.SH "FAIL2BAN CONFIGURATION FILE(S) (\fIfail2ban.conf\fB)"
@ -110,34 +130,44 @@ The items that can be set are:
verbosity level of log output: CRITICAL, ERROR, WARNING, NOTICE, INFO, DEBUG. Default: ERROR
.TP
.B logtarget
log target: filename, SYSLOG, STDERR or STDOUT. Default: STDERR . Only a single log target can be specified.
log target: filename, SYSLOG, STDERR or STDOUT. Default: STDERR
.br
Only a single log target can be specified.
If you change logtarget from the default value and you are using logrotate -- also adjust or disable rotation in the
corresponding configuration file (e.g. /etc/logrotate.d/fail2ban on Debian systems).
.TP
.B socket
socket filename. Default: /var/run/fail2ban/fail2ban.sock .
socket filename. Default: /var/run/fail2ban/fail2ban.sock
.br
This is used for communication with the fail2ban server daemon. Do not remove this file when Fail2ban is running. It will not be possible to communicate with the server afterwards.
.TP
.B pidfile
PID filename. Default: /var/run/fail2ban/fail2ban.pid.
PID filename. Default: /var/run/fail2ban/fail2ban.pid
.br
This is used to store the process ID of the fail2ban server.
.TP
.B dbfile
Database filename. Default: /var/lib/fail2ban/fail2ban.sqlite3
.br
This defines where the persistent data for fail2ban is stored. This persistent data allows bans to be reinstated and log files to be read from the last known position when fail2ban is restarted. A value of \fINone\fR disables this feature.
.TP
.B dbpurgeage
Database purge age in seconds. Default: 86400 (24 hours)
.br
This sets the age at which bans should be purged from the database.
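Taken together, these server-wide options are usually adjusted in fail2ban.local rather than by editing fail2ban.conf directly; a minimal sketch (all values are illustrative):

  [Definition]
  loglevel = INFO
  logtarget = /var/log/fail2ban.log
  dbfile = /var/lib/fail2ban/fail2ban.sqlite3
  dbpurgeage = 86400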
.SH "JAIL CONFIGURATION FILE(S) (\fIjail.conf\fB)"
The following options are applicable to any jail. They appear in a section specifying the jail name or in the \fI[DEFAULT]\fR section which defines default values to be used if not specified in the individual section.
.TP
.B filter
name of the filter -- filename of the filter in /etc/fail2ban/filter.d/ without the .conf/.local extension. Only one filter can be specified.
name of the filter -- filename of the filter in /etc/fail2ban/filter.d/ without the .conf/.local extension.
.br
Only one filter can be specified.
.TP
.B logpath
filename(s) of the log files to be monitored, separated by new lines. Globs -- paths containing * and ? or [0-9] -- can be used however only the files that exist at start up matching this glob pattern will be considered.
filename(s) of the log files to be monitored, separated by new lines.
.br
Globs -- paths containing * and ? or [0-9] -- can be used; however, only the files that exist at startup and match the glob pattern will be considered.
An optional space-separated option 'tail' can be added after the path to cause the log file to be read from the end; the default option 'head' reads the file from the beginning.
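An illustrative jail fragment combining a glob pattern and the 'tail' option (the jail name and paths are examples only):

  [myservice]
  logpath = /var/log/myservice/*.log
            /var/log/myservice/current.log tail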
@ -146,8 +176,18 @@ Ensure syslog or the program that generates the log file isn't configured to com
.B logencoding
encoding of log files used for decoding. Default value of "auto" uses current system locale.
.TP
.B banaction
banning action (default iptables-multiport) typically specified in the \fI[DEFAULT]\fR section for all jails.
.br
This parameter is used by the standard substitution of \fIaction\fR and can be redefined centrally in the \fI[DEFAULT]\fR section inside \fIjail.local\fR (to apply it to all jails at once) or separately in each jail where this substitution is used.
.TP
.B banaction_allports
the same as \fIbanaction\fR but for some "allports" jails like "pam-generic" or "recidive" (default iptables-allports).
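Both substitutions can thus be switched centrally for all jails; a jail.local sketch (assuming the named actions are available in action.d):

  [DEFAULT]
  banaction = iptables-ipset-proto6
  banaction_allports = iptables-ipset-proto6-allports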
.TP
.B action
action(s) from \fI/etc/fail2ban/action.d/\fR without the \fI.conf\fR/\fI.local\fR extension. Arguments can be passed to actions to override the default values from the [Init] section in the action file. Arguments are specified by:
action(s) from \fI/etc/fail2ban/action.d/\fR without the \fI.conf\fR/\fI.local\fR extension.
.br
Arguments can be passed to actions to override the default values from the [Init] section in the action file. Arguments are specified by:
.RS
.RS
@ -161,7 +201,9 @@ Values can also be quoted (required when value includes a ","). More that one ac
list of IPs not to ban. They can include a CIDR mask too.
.TP
.B ignorecommand
command that is executed to determine if the current candidate IP for banning should not be banned. IP will not be banned if command returns successfully (exit code 0).
command that is executed to determine if the current candidate IP for banning should not be banned.
.br
IP will not be banned if command returns successfully (exit code 0).
Like ACTION FILES, tags like <ip> can be included in the ignorecommand value and will be substituted before execution. Currently only <ip> is supported; however, more will be added later.
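An illustrative setting (the script path is hypothetical; the script should exit 0 for addresses that must not be banned):

  ignorecommand = /usr/local/bin/ignore-ip.sh <ip>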
.TP
.B bantime
@ -174,7 +216,9 @@ time interval (in seconds) before the current time where failures will count tow
number of failures that have to occur in the last \fBfindtime\fR seconds to ban the IP.
.TP
.B backend
backend to be used to detect changes in the logpath. It defaults to "auto" which will try "pyinotify", "gamin", "systemd" before "polling". Any of these can be specified. "pyinotify" is only valid on Linux systems with the "pyinotify" Python libraries. "gamin" requires the "gamin" libraries.
backend to be used to detect changes in the logpath.
.br
It defaults to "auto" which will try "pyinotify", "gamin", "systemd" before "polling". Any of these can be specified. "pyinotify" is only valid on Linux systems with the "pyinotify" Python libraries. "gamin" requires the "gamin" libraries.
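For instance, to prefer the systemd journal for all jails, a jail.local sketch could look like this (requires the python systemd bindings):

  [DEFAULT]
  backend = systemd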
.TP
.B usedns
use DNS to resolve HOST names that appear in the logs. By default it is "warn", which will resolve hostnames to IPs but will also log a warning. If you are using DNS here you could be blocking the wrong IPs, due to the asymmetric nature of reverse DNS (which the application used to write the domain name to the log) compared to the forward DNS that fail2ban uses to resolve this back to an IP (but not necessarily the same one). Ideally you should configure your applications to log a real IP. This can be set to "yes" to prevent warnings in the log, or to "no" to disable DNS resolution altogether (thus ignoring entries where a hostname, not an IP, is logged).
@ -245,7 +289,7 @@ The maximum period of time in seconds that a command can executed, before being
.RE
Commands specified in the [Definition] section are executed through a system shell so shell redirection and process control is allowed. The commands should
return 0, otherwise error would be logged. Moreover if \fBactioncheck\fR exits with non-0 status, it is taken as indication that firewall status has changed and fail2ban needs to reinitialize itself (i.e. issue \fBactionstop\fR and \fBactionstart\fR commands).
return 0, otherwise error would be logged. Moreover if \fBactioncheck\fR exits with non-0 status, it is taken as indication that firewall status has changed and fail2ban needs to reinitialize itself (i.e. issue \fBactionstop\fR and \fBactionstart\fR commands).
Tags are enclosed in <>. All the elements of [Init] are tags that are replaced in all action commands. Tags can be added by the
\fBfail2ban-client\fR using the "set <JAIL> action <ACT>" command. \fB<br>\fR is a tag that is always a new line (\\n).
@ -306,7 +350,7 @@ is the regex to identify log entries that should be ignored by Fail2Ban, even if
.PP
Similar to actions, filters have an [Init] section which can be overridden in \fIjail.conf/jail.local\fR. The filter [Init] section is limited to the following options:
Similar to actions, filters have an [Init] section which can be overridden in \fIjail.conf/jail.local\fR. Besides the filter-specific settings, the filter [Init] section can be used to set the following standard options:
.TP
\fBmaxlines\fR
specifies the maximum number of lines to buffer in order to match multi-line regexes. For some log formats this will not need to be changed. Other logs may require this value to be increased if a particular log file is frequently written to.
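Where a filter's messages span several lines, this is typically raised in a filter.d/*.local override; a sketch (the filter name and value are examples):

  # filter.d/myfilter.local
  [Init]
  maxlines = 10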
@ -321,6 +365,8 @@ Also, special values of \fIEpoch\fR (UNIX Timestamp), \fITAI64N\fR and \fIISO860
\fBjournalmatch\fR
specifies the systemd journal match used to filter the journal entries. See \fBjournalctl(1)\fR and \fBsystemd.journal-fields(7)\fR for matches syntax and more details on special journal fields. This option is only valid for the \fIsystemd\fR backend.
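As an illustration of the syntax (taken here as an example rather than a recommendation), a filter could match its service unit like this:

  [Init]
  journalmatch = _SYSTEMD_UNIT=sshd.service + _COMM=sshd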
.PP
Similar to actions, the [Init] section enables filter-specific settings. All parameters specified in the [Init] section can be redefined or extended in \fIjail.conf/jail.local\fR.
Filters can also have a section called [INCLUDES]. This is used to read other configuration files.
.TP