Merge branch 'master' into master

pull/2881/head
Sergey G. Brester 2021-04-04 00:04:08 +02:00 committed by GitHub
commit dda70d60c0
72 changed files with 1012 additions and 1017 deletions


@ -1,49 +0,0 @@
_We would be very grateful if you describe your problem as completely as possible, enclosing
excerpts from logs (if possible in DEBUG mode; if no errors are evident, in INFO mode) and the
configuration, in particular the affected settings (e.g. with
` fail2ban-client -d | grep 'affected-jail-name' ` for troubleshooting a particular jail).
Thank you in advance for the details, because issues like "It does not work" alone
do not help to resolve anything!
Thanks! (remove this paragraph and other comments upon reading)_
### Environment:
_Fill out and check (`[x]`) the boxes which apply. If your Fail2Ban version is outdated
and you can't verify that the issue persists in the most recent release, please seek support
from the distribution you obtained Fail2Ban from._
- Fail2Ban version (including any possible distribution suffixes):
- OS, including release name/version:
- [ ] Fail2Ban installed via OS/distribution mechanisms
- [ ] You have not applied any additional foreign patches to the codebase
- [ ] Some customizations were done to the configuration (provide details below if so)
### The issue:
_Summary here_
#### Steps to reproduce
#### Expected behavior
#### Observed behavior
#### Any additional information
### Configuration, dump and another helpful excerpts
#### Any customizations done to /etc/fail2ban/ configuration
```
```
#### Relevant parts of /var/log/fail2ban.log file:
_preferably obtained while running fail2ban with `loglevel = 4`_
```
```
#### Relevant lines from monitored log files in question:
```
```

.github/ISSUE_TEMPLATE/bug_report.md

@ -0,0 +1,70 @@
---
name: Bug report
about: Report a bug within the fail2ban engine (not filters or jails)
title: '[BR]: '
labels: bug
assignees: ''
---
<!--
- Before reporting, please make sure to search the open and closed issues for any reports in the past.
- Use this issue template to report a bug in the fail2ban engine (not in a filter or jail).
- If you want to request a feature or a new filter, please use "Feature request" or "Filter request" instead.
- If you rather have a question, please open or join a discussion.
We would be very grateful if you describe your problem as completely as possible, enclosing
excerpts from logs (if possible in DEBUG mode; if no errors are evident, in INFO mode) and the
configuration, in particular the affected settings (e.g. with
` fail2ban-client -d | grep 'affected-jail-name' ` for troubleshooting a particular jail).
Thank you in advance for the details, because issues like "It does not work" alone
do not help to resolve anything!
Thanks!
(you can remove this paragraph and other comments upon reading)
-->
### Environment:
<!--
Fill out and check (`[x]`) the boxes which apply. If your Fail2Ban version is outdated
and you can't verify that the issue persists in the most recent release, please seek support
from the distribution you obtained Fail2Ban from.
-->
- Fail2Ban version <!-- including any possible distribution suffixes --> :
- OS, including release name/version :
- [ ] Fail2Ban installed via OS/distribution mechanisms
- [ ] You have not applied any additional foreign patches to the codebase
- [ ] Some customizations were done to the configuration (provide details below if so)
### The issue:
<!-- summary here -->
#### Steps to reproduce
#### Expected behavior
#### Observed behavior
#### Any additional information
### Configuration, dump and another helpful excerpts
#### Any customizations done to /etc/fail2ban/ configuration
<!-- put your configuration excerpts between next 2 lines -->
```
```
#### Relevant parts of /var/log/fail2ban.log file:
<!-- preferably obtained while running fail2ban with `loglevel = 4` -->
<!-- put your log excerpt between next 2 lines -->
```
```
#### Relevant lines from monitored log files:
<!-- put your log excerpt between next 2 lines -->
```
```


@ -0,0 +1,35 @@
---
name: Feature request
about: Suggest an idea or an enhancement for this project
title: '[RFE]: '
labels: enhancement
assignees: ''
---
<!--
- Before requesting, please make sure to search the open and closed issues for any requests in the past.
- Use this issue template to request a feature in the fail2ban engine (not a new filter or jail).
- If you want to request a new filter or failregex, please use "Filter request" instead.
- If you rather have a question, please open or join a discussion.
-->
#### Feature request type
<!--
Please provide a summary description of the feature request.
-->
#### Description
<!--
Please describe the feature in more detail.
-->
#### Considered alternatives
<!--
A clear and concise description of any alternative solutions or features you've considered.
-->
#### Any additional information
<!--
Add any other context or screenshots about the feature request here.
-->


@ -0,0 +1,59 @@
---
name: Filter request
about: Request support for a new jail or filter, or an extension of an existing filter with a new failregex
title: '[FR]: '
labels: filter-request
assignees: ''
---
<!--
- Before requesting, please make sure to search the open and closed issues for any requests in the past.
- Sometimes a failregex has already been requested but is not yet implemented for various reasons.
- If there are no hits for your concern, please proceed; otherwise add a comment to the related issue (even if it is closed).
- If you want to request a new feature, please use "Feature request" instead.
- If you rather have a question, please open or join a discussion.
-->
### Environment:
<!--
Fill out and check (`[x]`) the boxes which apply.
-->
- Fail2Ban version <!-- including any possible distribution suffixes --> :
- OS, including release name/version :
#### Service, project or product which log or journal should be monitored
- Name of filter or jail in Fail2Ban (if already exists) :
- Service, project or product name, including release name/version :
- Repository or URL (if known) :
- Service type :
- Ports and protocols the service is listening on :
#### Log or journal information
<!-- Delete unrelated group -->
<!-- Log file -->
- Log file name(s) :
<!-- Systemd journal -->
- Journal identifier or unit name :
#### Any additional information
### Relevant lines from monitored log files:
#### Failures in the sense of the fail2ban filter (fail2ban must match):
<!-- put your log excerpt between next 2 lines -->
```
```
#### Legitimate messages (fail2ban should not consider these as failures):
<!-- put your log excerpt between next 2 lines -->
```
```


@ -22,7 +22,7 @@ jobs:
runs-on: ubuntu-20.04
strategy:
matrix:
python-version: [2.7, 3.5, 3.6, 3.7, 3.8, 3.9, pypy2, pypy3]
python-version: [2.7, 3.5, 3.6, 3.7, 3.8, 3.9, '3.10.0-alpha.5', pypy2, pypy3]
fail-fast: false
# Steps represent a sequence of tasks that will be executed as part of the job
steps:


@ -10,17 +10,25 @@ ver. 1.0.1-dev-1 (20??/??/??) - development nightly edition
-----------
### Compatibility:
* potential incompatibility by parsing of options of `backend`, `filter` and `action` parameters (if they
  are partially incorrect), because fail2ban could throw an error now (doesn't silently bypass it anymore).
* to v.0.11:
  - due to change of `actioncheck` behavior (gh-488), some actions can be incompatible as regards
    the invariant check, if `actionban` or `actionunban` would not throw an error (exit code
    different from 0) in case of unsane environment.
### Fixes
* readline fixed to consider interim new-line character as part of code point in multi-byte logs
  (e. g. unicode encoding like utf-16be, utf-16le);
* `filter.d/drupal-auth.conf` more strict regex, extended to match "Login attempt failed from" (gh-2742)
### New Features and Enhancements
* `actioncheck` behavior is changed now (gh-488), so invariant check as well as restore or repair
  of sane environment (in case of recognized unsane state) would only occur on action errors (e. g.
  if ban or unban operations are exiting with other code as 0)
* better recognition of log rotation, better performance by reopen: avoid unnecessary seek to begin of file
  (and hash calculation)
* file filter reads only complete lines (ended with new-line) now, so waits for end of line (for its completion)
* `filter.d/nginx-http-auth.conf` - extended with parameter mode, so additionally to `auth` (or `normal`)
  mode `fallback` (or combined as `aggressive`) can find SSL errors while SSL handshaking, gh-2881
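The readline fix above concerns multi-byte encodings such as UTF-16, where the line-feed byte is only part of the end-of-line code point; splitting on the raw byte corrupts every following character. A standalone sketch of the failure mode (plain Python, not fail2ban code):

```python
# In UTF-16LE every ASCII character is a two-byte code point, so the b"\n" byte
# that a naive readline() stops at is only half of the end-of-line code point.
import io

data = "user root\nnext line\n".encode("utf-16le")
raw = io.BytesIO(data)

first = raw.readline()     # stops right after the b"\n" byte ...
print(first)               # b'u\x00s\x00e\x00r\x00 \x00r\x00o\x00o\x00t\x00\n'
# ... leaving the trailing b"\x00" of that code point behind, which shifts every
# character of the next line off by one byte:
print(raw.readline()[:8])  # b'\x00n\x00e\x00x\x00t'
```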


@ -278,6 +278,7 @@ to tune it. fail2ban-regex -D ... will present Debuggex URLs for the regexs
and sample log files that you pass into it.
In general use when using regex debuggers for generating fail2ban filters:
* use regex from the ./fail2ban-regex output (to ensure all substitutions are
  done)
* replace <HOST> with (?&.ipv4)


@ -5,8 +5,6 @@ bin/fail2ban-testcases
ChangeLog
config/action.d/abuseipdb.conf
config/action.d/apf.conf
config/action.d/badips.conf
config/action.d/badips.py
config/action.d/blocklist_de.conf
config/action.d/bsd-ipfw.conf
config/action.d/cloudflare.conf
@ -220,7 +218,6 @@ fail2ban/setup.py
fail2ban-testcases-all
fail2ban-testcases-all-python3
fail2ban/tests/action_d/__init__.py
fail2ban/tests/action_d/test_badips.py
fail2ban/tests/action_d/test_smtp.py
fail2ban/tests/actionstestcase.py
fail2ban/tests/actiontestcase.py


@ -1,19 +0,0 @@
# Fail2ban reporting to badips.com
#
# Note: This reports an IP only and does not actually ban traffic. Use
# another action in the same jail if you want bans to occur.
#
# Set the category to the appropriate value before use.
#
# To get see register and optional key to get personalised graphs see:
# http://www.badips.com/blog/personalized-statistics-track-the-attackers-of-all-your-servers-with-one-key
[Definition]
actionban = curl --fail --user-agent "<agent>" http://www.badips.com/add/<category>/<ip>
[Init]
# Option: category
# Notes.: Values are from the list here: http://www.badips.com/get/categories
category =


@ -1,391 +0,0 @@
# emacs: -*- mode: python; py-indent-offset: 4; indent-tabs-mode: t -*-
# vi: set ft=python sts=4 ts=4 sw=4 noet :
# This file is part of Fail2Ban.
#
# Fail2Ban is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# Fail2Ban is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with Fail2Ban; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
import sys
if sys.version_info < (2, 7): # pragma: no cover
raise ImportError("badips.py action requires Python >= 2.7")
import json
import threading
import logging
if sys.version_info >= (3, ): # pragma: 2.x no cover
from urllib.request import Request, urlopen
from urllib.parse import urlencode
from urllib.error import HTTPError
else: # pragma: 3.x no cover
from urllib2 import Request, urlopen, HTTPError
from urllib import urlencode
from fail2ban.server.actions import Actions, ActionBase, BanTicket
from fail2ban.helpers import splitwords, str2LogLevel
class BadIPsAction(ActionBase): # pragma: no cover - may be unavailable
"""Fail2Ban action which reports bans to badips.com, and also
blacklist bad IPs listed on badips.com by using another action's
ban method.
Parameters
----------
jail : Jail
The jail which the action belongs to.
name : str
Name assigned to the action.
category : str
Valid badips.com category for reporting failures.
score : int, optional
Minimum score for bad IPs. Default 3.
age : str, optional
Age of last report for bad IPs, per badips.com syntax.
Default "24h" (24 hours)
banaction : str, optional
Name of banaction to use for blacklisting bad IPs. If `None`,
no blacklist of IPs will take place.
Default `None`.
bancategory : str, optional
Name of category to use for blacklisting, which can differ
from category used for reporting. e.g. may want to report
"postfix", but want to use whole "mail" category for blacklist.
Default `category`.
bankey : str, optional
Key issued by badips.com to retrieve personal list
of blacklist IPs.
updateperiod : int, optional
Time in seconds between updating bad IPs blacklist.
Default 900 (15 minutes)
loglevel : int/str, optional
Log level of the message when an IP is (un)banned.
Default `DEBUG`.
Can be also supplied as two-value list (comma- or space separated) to
provide level of the summary message when a group of IPs is (un)banned.
Example `DEBUG,INFO`.
agent : str, optional
User agent transmitted to server.
Default `Fail2Ban/ver.`
Raises
------
ValueError
If invalid `category`, `score`, `banaction` or `updateperiod`.
"""
TIMEOUT = 10
_badips = "https://www.badips.com"
def _Request(self, url, **argv):
return Request(url, headers={'User-Agent': self.agent}, **argv)
def __init__(self, jail, name, category, score=3, age="24h",
banaction=None, bancategory=None, bankey=None, updateperiod=900,
loglevel='DEBUG', agent="Fail2Ban", timeout=TIMEOUT):
super(BadIPsAction, self).__init__(jail, name)
self.timeout = timeout
self.agent = agent
self.category = category
self.score = score
self.age = age
self.banaction = banaction
self.bancategory = bancategory or category
self.bankey = bankey
loglevel = splitwords(loglevel)
self.sumloglevel = str2LogLevel(loglevel[-1])
self.loglevel = str2LogLevel(loglevel[0])
self.updateperiod = updateperiod
self._bannedips = set()
# Used later for threading.Timer for updating badips
self._timer = None
@staticmethod
def isAvailable(timeout=1):
try:
response = urlopen(Request("/".join([BadIPsAction._badips]),
headers={'User-Agent': "Fail2Ban"}), timeout=timeout)
return True, ''
except Exception as e: # pragma: no cover
return False, e
def logError(self, response, what=''): # pragma: no cover - sporadical (502: Bad Gateway, etc)
messages = {}
try:
messages = json.loads(response.read().decode('utf-8'))
except:
pass
self._logSys.error(
"%s. badips.com response: '%s'", what,
messages.get('err', 'Unknown'))
def getCategories(self, incParents=False):
"""Get badips.com categories.
Returns
-------
set
Set of categories.
Raises
------
HTTPError
Any issues with badips.com request.
ValueError
If badips.com response didn't contain necessary information
"""
try:
response = urlopen(
self._Request("/".join([self._badips, "get", "categories"])), timeout=self.timeout)
except HTTPError as response: # pragma: no cover
self.logError(response, "Failed to fetch categories")
raise
else:
response_json = json.loads(response.read().decode('utf-8'))
if not 'categories' in response_json:
err = "badips.com response lacked categories specification. Response was: %s" \
% (response_json,)
self._logSys.error(err)
raise ValueError(err)
categories = response_json['categories']
categories_names = set(
value['Name'] for value in categories)
if incParents:
categories_names.update(set(
value['Parent'] for value in categories
if "Parent" in value))
return categories_names
def getList(self, category, score, age, key=None):
"""Get badips.com list of bad IPs.
Parameters
----------
category : str
Valid badips.com category.
score : int
Minimum score for bad IPs.
age : str
Age of last report for bad IPs, per badips.com syntax.
key : str, optional
Key issued by badips.com to fetch IPs reported with the
associated key.
Returns
-------
set
Set of bad IPs.
Raises
------
HTTPError
Any issues with badips.com request.
"""
try:
url = "?".join([
"/".join([self._badips, "get", "list", category, str(score)]),
urlencode({'age': age})])
if key:
url = "&".join([url, urlencode({'key': key})])
self._logSys.debug('badips.com: get list, url: %r', url)
response = urlopen(self._Request(url), timeout=self.timeout)
except HTTPError as response: # pragma: no cover
self.logError(response, "Failed to fetch bad IP list")
raise
else:
return set(response.read().decode('utf-8').split())
@property
def category(self):
"""badips.com category for reporting IPs.
"""
return self._category
@category.setter
def category(self, category):
if category not in self.getCategories():
self._logSys.error("Category name '%s' not valid. "
"see badips.com for list of valid categories",
category)
raise ValueError("Invalid category: %s" % category)
self._category = category
@property
def bancategory(self):
"""badips.com bancategory for fetching IPs.
"""
return self._bancategory
@bancategory.setter
def bancategory(self, bancategory):
if bancategory != "any" and bancategory not in self.getCategories(incParents=True):
self._logSys.error("Category name '%s' not valid. "
"see badips.com for list of valid categories",
bancategory)
raise ValueError("Invalid bancategory: %s" % bancategory)
self._bancategory = bancategory
@property
def score(self):
"""badips.com minimum score for fetching IPs.
"""
return self._score
@score.setter
def score(self, score):
score = int(score)
if 0 <= score <= 5:
self._score = score
else:
raise ValueError("Score must be 0-5")
@property
def banaction(self):
"""Jail action to use for banning/unbanning.
"""
return self._banaction
@banaction.setter
def banaction(self, banaction):
if banaction is not None and banaction not in self._jail.actions:
self._logSys.error("Action name '%s' not in jail '%s'",
banaction, self._jail.name)
raise ValueError("Invalid banaction")
self._banaction = banaction
@property
def updateperiod(self):
"""Period in seconds between banned bad IPs will be updated.
"""
return self._updateperiod
@updateperiod.setter
def updateperiod(self, updateperiod):
updateperiod = int(updateperiod)
if updateperiod > 0:
self._updateperiod = updateperiod
else:
raise ValueError("Update period must be integer greater than 0")
def _banIPs(self, ips):
for ip in ips:
try:
ai = Actions.ActionInfo(BanTicket(ip), self._jail)
self._jail.actions[self.banaction].ban(ai)
except Exception as e:
self._logSys.error(
"Error banning IP %s for jail '%s' with action '%s': %s",
ip, self._jail.name, self.banaction, e,
exc_info=self._logSys.getEffectiveLevel()<=logging.DEBUG)
else:
self._bannedips.add(ip)
self._logSys.log(self.loglevel,
"Banned IP %s for jail '%s' with action '%s'",
ip, self._jail.name, self.banaction)
def _unbanIPs(self, ips):
for ip in ips:
try:
ai = Actions.ActionInfo(BanTicket(ip), self._jail)
self._jail.actions[self.banaction].unban(ai)
except Exception as e:
self._logSys.error(
"Error unbanning IP %s for jail '%s' with action '%s': %s",
ip, self._jail.name, self.banaction, e,
exc_info=self._logSys.getEffectiveLevel()<=logging.DEBUG)
else:
self._logSys.log(self.loglevel,
"Unbanned IP %s for jail '%s' with action '%s'",
ip, self._jail.name, self.banaction)
finally:
self._bannedips.remove(ip)
def start(self):
"""If `banaction` set, blacklists bad IPs.
"""
if self.banaction is not None:
self.update()
def update(self):
"""If `banaction` set, updates blacklisted IPs.
Queries badips.com for list of bad IPs, removing IPs from the
blacklist if no longer present, and adds new bad IPs to the
blacklist.
"""
if self.banaction is not None:
if self._timer:
self._timer.cancel()
self._timer = None
try:
ips = self.getList(
self.bancategory, self.score, self.age, self.bankey)
# Remove old IPs no longer listed
s = self._bannedips - ips
m = len(s)
self._unbanIPs(s)
# Add new IPs which are now listed
s = ips - self._bannedips
p = len(s)
self._banIPs(s)
if m != 0 or p != 0:
self._logSys.log(self.sumloglevel,
"Updated IPs for jail '%s' (-%d/+%d)",
self._jail.name, m, p)
self._logSys.debug(
"Next update for jail '%' in %i seconds",
self._jail.name, self.updateperiod)
finally:
self._timer = threading.Timer(self.updateperiod, self.update)
self._timer.start()
def stop(self):
"""If `banaction` set, clears blacklisted IPs.
"""
if self.banaction is not None:
if self._timer:
self._timer.cancel()
self._timer = None
self._unbanIPs(self._bannedips.copy())
def ban(self, aInfo):
"""Reports banned IP to badips.com.
Parameters
----------
aInfo : dict
Dictionary which includes information in relation to
the ban.
Raises
------
HTTPError
Any issues with badips.com request.
"""
try:
url = "/".join([self._badips, "add", self.category, str(aInfo['ip'])])
self._logSys.debug('badips.com: ban, url: %r', url)
response = urlopen(self._Request(url), timeout=self.timeout)
except HTTPError as response: # pragma: no cover
self.logError(response, "Failed to ban")
raise
else:
messages = json.loads(response.read().decode('utf-8'))
self._logSys.debug(
"Response from badips.com report: '%s'",
messages['suc'])
Action = BadIPsAction


@ -44,7 +44,7 @@ actioncheck =
#actionban = curl -s -o /dev/null https://www.cloudflare.com/api_json.html -d 'a=ban' -d 'tkn=<cftoken>' -d 'email=<cfuser>' -d 'key=<ip>'
# API v4
actionban = curl -s -o /dev/null -X POST <_cf_api_prms> \
-d '{"mode":"block","configuration":{"target":"ip","value":"<ip>"},"notes":"Fail2Ban <name>"}' \
-d '{"mode":"block","configuration":{"target":"<cftarget>","value":"<ip>"},"notes":"Fail2Ban <name>"}' \
<_cf_api_url>
# Option: actionunban
@ -59,7 +59,7 @@ actionban = curl -s -o /dev/null -X POST <_cf_api_prms> \
#actionunban = curl -s -o /dev/null https://www.cloudflare.com/api_json.html -d 'a=nul' -d 'tkn=<cftoken>' -d 'email=<cfuser>' -d 'key=<ip>'
# API v4
actionunban = id=$(curl -s -X GET <_cf_api_prms> \
"<_cf_api_url>?mode=block&configuration_target=ip&configuration_value=<ip>&page=1&per_page=1&notes=Fail2Ban%%20<name>" \
"<_cf_api_url>?mode=block&configuration_target=<cftarget>&configuration_value=<ip>&page=1&per_page=1&notes=Fail2Ban%%20<name>" \
| { jq -r '.result[0].id' 2>/dev/null || tr -d '\n' | sed -nE 's/^.*"result"\s*:\s*\[\s*\{\s*"id"\s*:\s*"([^"]+)".*$/\1/p'; })
if [ -z "$id" ]; then echo "<name>: id for <ip> cannot be found"; exit 0; fi;
curl -s -o /dev/null -X DELETE <_cf_api_prms> "<_cf_api_url>/$id"
@ -81,3 +81,8 @@ _cf_api_prms = -H 'X-Auth-Email: <cfuser>' -H 'X-Auth-Key: <cftoken>' -H 'Conten
cftoken =
cfuser =
cftarget = ip
[Init?family=inet6]
cftarget = ip6


@ -84,8 +84,15 @@ srv_cfg_path = /etc/nginx/
#srv_cmd = nginx -c %(srv_cfg_path)s/nginx.conf
srv_cmd = nginx
# first test configuration is correct, hereafter send reload signal:
blck_lst_reload = %(srv_cmd)s -qt; if [ $? -eq 0 ]; then
# pid file (used to check nginx is running):
srv_pid = /run/nginx.pid
# command used to check whether nginx is running and configuration is valid:
srv_is_running = [ -f "%(srv_pid)s" ]
srv_check_cmd = %(srv_is_running)s && %(srv_cmd)s -qt
# first test nginx is running and configuration is correct, hereafter send reload signal:
blck_lst_reload = %(srv_check_cmd)s; if [ $? -eq 0 ]; then
  %(srv_cmd)s -s reload; if [ $? -ne 0 ]; then echo 'reload failed.'; fi;
  fi;


@ -55,6 +55,12 @@ socket = /var/run/fail2ban/fail2ban.sock
#
pidfile = /var/run/fail2ban/fail2ban.pid
# Option: allowipv6
# Notes.: Allows IPv6 interface:
# Default: auto
# Values: [ auto yes (on, true, 1) no (off, false, 0) ] Default: auto
#allowipv6 = auto
# Options: dbfile
# Notes.: Set the file for the fail2ban persistent data to be stored.
# A value of ":memory:" means database is only stored in memory


@ -8,7 +8,7 @@ before = apache-common.conf
[Definition]
failregex = ^%(_apache_error_client)s (?:(?:AH0013[456]: )?Invalid (method|URI) in request\b|(?:AH00565: )?request failed: URI too long \(longer than \d+\)|request failed: erroneous characters after protocol string:|(?:AH00566: )?request failed: invalid characters in URI\b)
failregex = ^%(_apache_error_client)s (?:(?:AH001[23][456]: )?Invalid (method|URI) in request\b|(?:AH00565: )?request failed: URI too long \(longer than \d+\)|request failed: erroneous characters after protocol string:|(?:AH00566: )?request failed: invalid characters in URI\b)
ignoreregex =


@ -21,7 +21,7 @@ log_prefix= (?:NOTICE|SECURITY|WARNING)%(__pid_re)s:?(?:\[C-[\da-f]*\])?:? [^:]+
prefregex = ^%(__prefix_line)s%(log_prefix)s <F-CONTENT>.+</F-CONTENT>$
failregex = ^Registration from '[^']*' failed for '<HOST>(:\d+)?' - (?:Wrong password|Username/auth name mismatch|No matching peer found|Not a local domain|Device does not match ACL|Peer is not supposed to register|ACL error \(permit/deny\)|Not a local domain)$
^Call from '[^']*' \(<HOST>:\d+\) to extension '[^']*' rejected because extension not found in context
^Call from '[^']*' \((?:(?:TCP|UDP):)?<HOST>:\d+\) to extension '[^']*' rejected because extension not found in context
^(?:Host )?<HOST> (?:failed (?:to authenticate\b|MD5 authentication\b)|tried to authenticate with nonexistent user\b)
^No registration for peer '[^']*' \(from <HOST>\)$
^hacking attempt detected '<HOST>'$


@ -14,7 +14,7 @@ before = common.conf
[Definition]
failregex = ^%(__prefix_line)s(https?:\/\/)([\da-z\.-]+)\.([a-z\.]{2,6})(\/[\w\.-]+)*\|\d{10}\|user\|<HOST>\|.+\|.+\|\d\|.*\|Login attempt failed for .+\.$
failregex = ^%(__prefix_line)s(?:https?:\/\/)[^|]+\|[^|]+\|[^|]+\|<ADDR>\|(?:[^|]*\|)*Login attempt failed (?:for|from) <F-USER>[^|]+</F-USER>\.$
ignoreregex =


@ -6,24 +6,35 @@
#
import sys
from fail2ban.server.ipdns import DNSUtils, IPAddr
from threading import Thread
def process_args(argv):
    if len(argv) != 2:
        raise ValueError("Please provide a single IP as an argument. Got: %s\n"
            % (argv[1:]))
    if len(argv) - 1 not in (1, 2):
        raise ValueError("Usage %s ip ?timeout?. Got: %s\n"
            % (argv[0], argv[1:]))
    ip = argv[1]
    if not IPAddr(ip).isValid:
        raise ValueError("Argument must be a single valid IP. Got: %s\n"
            % ip)
    return ip
    return argv[1:]
google_ips = None
def is_googlebot(ip):
def is_googlebot(ip, timeout=55):
    import re
    host = DNSUtils.ipToName(ip)
    timeout = float(timeout or 0)
    if timeout:
        def ipToNameTO(host, ip, timeout):
            host[0] = DNSUtils.ipToName(ip)
        host = [None]
        th = Thread(target=ipToNameTO, args=(host, ip, timeout)); th.daemon=True; th.start()
        th.join(timeout)
        host = host[0]
    else:
        host = DNSUtils.ipToName(ip)
    if not host or not re.match(r'.*\.google(bot)?\.com$', host):
        return False
    host_ips = DNSUtils.dnsToIp(host)
@ -31,7 +42,7 @@ def is_googlebot(ip):
if __name__ == '__main__': # pragma: no cover
    try:
        ret = is_googlebot(process_args(sys.argv))
        ret = is_googlebot(*process_args(sys.argv))
    except ValueError as e:
        sys.stderr.write(str(e))
        sys.exit(2)
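The change above bounds a potentially blocking reverse-DNS lookup by running it in a daemon thread and joining with a timeout. The same pattern as a standalone sketch, with `socket.gethostbyaddr` standing in for `DNSUtils.ipToName` (illustration only, not the fail2ban helper itself):

```python
# Run a possibly slow reverse lookup in a daemon thread and give up after
# `timeout` seconds; the thread may keep running, but it no longer blocks us.
import socket
from threading import Thread

def reverse_lookup(ip, timeout=55):
    result = [None]
    def worker():
        try:
            result[0] = socket.gethostbyaddr(ip)[0]
        except OSError:
            pass                   # lookup failed; leave result as None
    th = Thread(target=worker)
    th.daemon = True               # don't keep the process alive if DNS hangs
    th.start()
    th.join(timeout)               # returns after `timeout` even if still stuck
    return result[0]               # None on timeout or failure

print(reverse_lookup("8.8.8.8", timeout=5))
```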


@ -0,0 +1,15 @@
# Fail2Ban filter for failed MSSQL Server authentication attempts
[Definition]
failregex = ^\s*Logon\s+Login failed for user '<F-USER>(?:[^']*|.*)</F-USER>'\. [^'\[]+\[CLIENT: <ADDR>\]$
# DEV Notes:
# Tested with SQL Server 2019 on Ubuntu 18.04
#
# Example:
# 2020-02-24 14:48:55.12 Logon Login failed for user 'root'. Reason: Could not find a login matching the name provided. [CLIENT: 127.0.0.1]
#
# Author: Rüdiger Olschewsky
#


@ -32,7 +32,7 @@ __daemon_combs_re=(?:%(__pid_re)s?:\s+%(__daemon_re)s|%(__daemon_re)s%(__pid_re)
# hostname daemon_id spaces
# this can be optional (for instance if we match named native log files)
__line_prefix=(?:\s\S+ %(__daemon_combs_re)s\s+)?
__line_prefix=(?:\s*\S+ %(__daemon_combs_re)s\s+)?
prefregex = ^%(__line_prefix)s(?: error:)?\s*client(?: @\S*)? <HOST>#\S+(?: \([\S.]+\))?: <F-CONTENT>.+</F-CONTENT>\s(?:denied|\(NOTAUTH\))\s*$


@ -11,4 +11,6 @@ datepattern = {^LN-BEG}%%ExY(?P<_sep>[-/.])%%m(?P=_sep)%%d[T ]%%H:%%M:%%S(?:[.,]
^[^\[]*\[({DATE})
{^LN-BEG}
journalmatch = _SYSTEMD_UNIT=nginx.service + _COMM=nginx
# Author: Jan Przybylak


@ -17,7 +17,9 @@ datepattern = {^LN-BEG}%%ExY(?P<_sep>[-/.])%%m(?P=_sep)%%d[T ]%%H:%%M:%%S(?:[.,]
^[^\[]*\[({DATE})
{^LN-BEG}
journalmatch = _SYSTEMD_UNIT=nginx.service + _COMM=nginx
# DEV Notes:
# Based on apache-botsearch filter
#
# Author: Frantisek Sumsal


@ -3,9 +3,10 @@
[Definition]
mdre-auth = ^ \[error\] \d+#\d+: \*\d+ user "(?:[^"]+|.*?)":? (?:password mismatch|was not found in "[^\"]*"), client: <HOST>, server: \S*, request: "\S+ \S+ HTTP/\d+\.\d+", host: "\S+"(?:, referrer: "\S+")?\s*$
mode = normal
mdre-fallback = ^\s*\[crit\] \d+#\d+: \*\d+ SSL_do_handshake\(\) failed \(SSL: error:\S+(?: \S+){1,3} too (?:long|short)\)[^,]*, client: <HOST>
mdre-auth = ^\s*\[error\] \d+#\d+: \*\d+ user "(?:[^"]+|.*?)":? (?:password mismatch|was not found in "[^\"]*"), client: <HOST>, server: \S*, request: "\S+ \S+ HTTP/\d+\.\d+", host: "\S+"(?:, referrer: "\S+")?\s*$
mdre-fallback = ^\s*\[crit\] \d+#\d+: \*\d+ SSL_do_handshake\(\) failed \(SSL: error:\S+(?: \S+){1,3} too (?:long|short)\)[^,]*, client: <HOST>
mdre-normal = %(mdre-auth)s
mdre-aggressive = %(mdre-auth)s
@ -17,7 +18,7 @@ ignoreregex =
datepattern = {^LN-BEG}
mode = normal
journalmatch = _SYSTEMD_UNIT=nginx.service + _COMM=nginx
# DEV NOTES:
# mdre-auth:


@ -44,3 +44,6 @@ failregex = ^\s*\[[a-z]+\] \d+#\d+: \*\d+ limiting requests, excess: [\d\.]+ by
ignoreregex =
datepattern = {^LN-BEG}
journalmatch = _SYSTEMD_UNIT=nginx.service + _COMM=nginx


@ -22,10 +22,10 @@ _daemon = nsd
# (?:::f{4,6}:)?(?P<host>[\w\-.^_]+)
# Values: TEXT
failregex = ^%(__prefix_line)sinfo: ratelimit block .* query <HOST> TYPE255$
failregex = ^%(__prefix_line)sinfo: ratelimit block .* query <ADDR> TYPE255$
^%(__prefix_line)sinfo: .* <HOST> refused, no acl matches\.$
^%(__prefix_line)sinfo: .* from(?: client)? <ADDR> refused, no acl matches\.?$
ignoreregex =
datepattern = {^LN-BEG}Epoch
{^LN-BEG}


@ -37,7 +37,9 @@ mdre-rbl = ^RCPT from [^[]*\[<HOST>\]%(_port)s: [45]54 [45]\.7\.1 Service unava
mdpr-more = %(mdpr-normal)s
mdre-more = %(mdre-normal)s
mdpr-ddos = (?:lost connection after(?! DATA) [A-Z]+|disconnect(?= from \S+(?: \S+=\d+)* auth=0/(?:[1-9]|\d\d+)))
# Includes some of the log messages described in
# <http://www.postfix.org/POSTSCREEN_README.html>.
mdpr-ddos = (?:lost connection after(?! DATA) [A-Z]+|disconnect(?= from \S+(?: \S+=\d+)* auth=0/(?:[1-9]|\d\d+))|(?:PREGREET \d+|HANGUP) after \S+)
mdre-ddos = ^from [^[]*\[<HOST>\]%(_port)s:?
mdpr-extra = (?:%(mdpr-auth)s|%(mdpr-normal)s)


@ -0,0 +1,17 @@
# Fail2Ban filter for port scans detected by scanlogd
[INCLUDES]
# Read common prefixes. If any customizations available -- read them from
# common.local
before = common.conf
[Definition]
_daemon = scanlogd
failregex = ^%(__prefix_line)s<ADDR>(?::<F-PORT/>)? to \S+ ports\b
ignoreregex =
# Author: Mike Gabriel <mike.gabriel@das-netzwerkteam.de>


@ -67,7 +67,7 @@ before = paths-debian.conf
# more aggressive example of formula has the same values only for factor "2.0 / 2.885385" :
#bantime.formula = ban.Time * math.exp(float(ban.Count+1)*banFactor)/math.exp(1*banFactor)
# "bantime.multipliers" used to calculate next value of ban time instead of formula, coresponding
# "bantime.multipliers" used to calculate next value of ban time instead of formula, corresponding
# previously ban count and given "bantime.factor" (for multipliers default is 1);
# following example grows ban time by 1, 2, 4, 8, 16 ... and if last ban count greater as multipliers count,
# always used last multiplier (64 in example), for factor '1' and original ban time 600 - 10.6 hours
@ -77,7 +77,7 @@ before = paths-debian.conf
#bantime.multipliers = 1 5 30 60 300 720 1440 2880
# "bantime.overalljails" (if true) specifies the search of IP in the database will be executed
# cross over all jails, if false (dafault), only current jail of the ban IP will be searched
# cross over all jails, if false (default), only current jail of the ban IP will be searched
#bantime.overalljails = false
# --------------------
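To make the two increment strategies above concrete, here is a small worked example in plain Python; the exponential formula is the one quoted in the comment, while the multipliers rule is a plausible reading of the description (the actual evaluation happens inside fail2ban):

```python
# Illustration only: reproduces the math of bantime.formula and a reading of
# bantime.multipliers for the "1 2 4 8 16 ... 64" example mentioned above.
import math

bantime, factor = 600, 1
multipliers = [1, 2, 4, 8, 16, 32, 64]

# the exponential formula quoted in the comment above
def next_by_formula(ban_count):
    return bantime * math.exp(float(ban_count + 1) * factor) / math.exp(1 * factor)

# multipliers table: once the ban count exceeds the table, the last entry is reused
def next_by_multipliers(ban_count):
    return bantime * factor * multipliers[min(ban_count, len(multipliers) - 1)]

for count in range(8):
    print(count, round(next_by_formula(count)), next_by_multipliers(count))

# with factor 1 the table caps at 600 * 64 = 38400 s, i.e. roughly 10.6 hours,
# matching the comment above
```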
@ -242,20 +242,6 @@ action_cf_mwl = cloudflare[cfuser="%(cfemail)s", cftoken="%(cfapikey)s"]
#
action_blocklist_de = blocklist_de[email="%(sender)s", service="%(__name__)s", apikey="%(blocklist_de_apikey)s", agent="%(fail2ban_agent)s"]
# Report ban via badips.com, and use as blacklist
#
# See BadIPsAction docstring in config/action.d/badips.py for
# documentation for this action.
#
# NOTE: This action relies on banaction being present on start and therefore
# should be last action defined for a jail.
#
action_badips = badips.py[category="%(__name__)s", banaction="%(banaction)s", agent="%(fail2ban_agent)s"]
#
# Report ban via badips.com (uses action.d/badips.conf for reporting only)
#
action_badips_report = badips[category="%(__name__)s", agent="%(fail2ban_agent)s"]
# Report ban via abuseipdb.com.
#
# See action.d/abuseipdb.conf for usage example and details.
@ -802,6 +788,14 @@ logpath = %(mysql_log)s
backend = %(mysql_backend)s
[mssql-auth]
# Default configuration for Microsoft SQL Server for Linux
# See the 'mssql-conf' manpage how to change logpath or port
logpath = /var/opt/mssql/log/errorlog
port = 1433
filter = mssql-auth
# Log wrong MongoDB auth (for details see filter 'filter.d/mongodb-auth.conf')
[mongodb-auth]
# change port when running with "--shardsvr" or "--configsvr" runtime operation
@ -967,3 +961,7 @@ logpath = %(apache_error_log)s
# see `filter.d/traefik-auth.conf` for details and service example.
port = http,https
logpath = /var/log/traefik/access.log
[scanlogd]
logpath = %(syslog_local0)s
banaction = %(banaction_allports)s


@ -230,7 +230,7 @@ class Fail2banClient(Fail2banCmdLine, Thread):
logSys.log(5, ' client phase %s', phase)
if not stream:
    return False
# wait a litle bit for phase "start-ready" before enter active waiting:
# wait a little bit for phase "start-ready" before enter active waiting:
if phase is not None:
    Utils.wait_for(lambda: phase.get('start-ready', None) is not None, 0.5, 0.001)
    phase['configure'] = (True if stream else False)


@ -192,7 +192,7 @@ class Fail2banCmdLine():
cmdOpts = 'hc:s:p:xfbdtviqV'
cmdLongOpts = ['loglevel=', 'logtarget=', 'syslogsocket=', 'test', 'async',
    'conf=', 'pidfile=', 'pname=', 'socket=',
    'timeout=', 'str2sec=', 'help', 'version', 'dp', '--dump-pretty']
    'timeout=', 'str2sec=', 'help', 'version', 'dp', 'dump-pretty']
optList, self._args = getopt.getopt(self._argv[1:], cmdOpts, cmdLongOpts)
except getopt.GetoptError:
    self.dispUsage()
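The fix above drops the stray leading dashes from the long-option list: `getopt` expects long-option names without the `--` prefix. A minimal standalone illustration:

```python
# Names in getopt's long-option list are given without the leading dashes.
import getopt

argv = ["--dump-pretty", "-v"]

try:
    # wrong: an entry spelled '--dump-pretty' is never matched
    getopt.getopt(argv, "v", ["--dump-pretty"])
except getopt.GetoptError as e:
    print("rejected:", e)          # option --dump-pretty not recognized

opts, args = getopt.getopt(argv, "v", ["dump-pretty"])
print(opts)                        # [('--dump-pretty', ''), ('-v', '')]
```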


@ -53,6 +53,7 @@ class Fail2banReader(ConfigReader):
opts = [["string", "loglevel", "INFO" ],
    ["string", "logtarget", "STDERR"],
    ["string", "syslogsocket", "auto"],
    ["string", "allowipv6", "auto"],
    ["string", "dbfile", "/var/lib/fail2ban/fail2ban.sqlite3"],
    ["int", "dbmaxmatches", None],
    ["string", "dbpurgeage", "1d"]]
@ -74,6 +75,7 @@ class Fail2banReader(ConfigReader):
# Also dbfile should be set before all other database options.
# So adding order indices into items, to be stripped after sorting, upon return
order = {"thread":0, "syslogsocket":11, "loglevel":12, "logtarget":13,
    "allowipv6": 14,
    "dbfile":50, "dbmaxmatches":51, "dbpurgeage":51}
stream = list()
for opt in self.__opts:


@ -35,11 +35,11 @@ __license__ = "GPL"
import getopt
import logging
import re
import os
import shlex
import sys
import time
import time
import urllib
from optparse import OptionParser, Option
@ -52,7 +52,7 @@ except ImportError:
from ..version import version, normVersion
from .filterreader import FilterReader
from ..server.filter import Filter, FileContainer
from ..server.filter import Filter, FileContainer, MyTime
from ..server.failregex import Regex, RegexException
from ..helpers import str2LogLevel, getVerbosityFormat, FormatterWithTraceBack, getLogger, \
@ -269,15 +269,19 @@ class Fail2banRegex(object):
self.setJournalMatch(shlex.split(opts.journalmatch))
if opts.timezone:
    self._filter.setLogTimeZone(opts.timezone)
self._filter.checkFindTime = False
if True: # not opts.out:
    MyTime.setAlternateNow(0); # accept every date (years from 19xx up to end of current century, '%ExY' and 'Exy' patterns)
    from ..server.strptime import _updateTimeRE
    _updateTimeRE()
if opts.datepattern:
    self.setDatePattern(opts.datepattern)
if opts.usedns:
    self._filter.setUseDns(opts.usedns)
self._filter.returnRawHost = opts.raw
self._filter.checkFindTime = False
self._filter.checkAllRegex = opts.checkAllRegex and not opts.out
# ignore pending (without ID/IP), added to matches if it hits later (if ID/IP can be retreved)
self._filter.ignorePending = opts.out
self._filter.ignorePending = bool(opts.out)
# callback to increment ignored RE's by index (during process):
self._filter.onIgnoreRegex = self._onIgnoreRegex
self._backend = 'auto'
@ -285,9 +289,6 @@ class Fail2banRegex(object):
def output(self, line):
    if not self._opts.out: output(line)
def decode_line(self, line):
    return FileContainer.decode_line('<LOG>', self._encoding, line)
def encode_line(self, line):
    return line.encode(self._encoding, 'ignore')
@ -326,26 +327,33 @@ class Fail2banRegex(object):
regex = regextype + 'regex'
# try to check - we've case filter?[options...]?:
basedir = self._opts.config
fltName = value
fltFile = None
fltOpt = {}
if regextype == 'fail':
    fltName, fltOpt = extractOptions(value)
    if fltName is not None:
        if "." in fltName[~5:]:
            tryNames = (fltName,)
        else:
            tryNames = (fltName, fltName + '.conf', fltName + '.local')
        for fltFile in tryNames:
            if not "/" in fltFile:
                if os.path.basename(basedir) == 'filter.d':
                    fltFile = os.path.join(basedir, fltFile)
                else:
                    fltFile = os.path.join(basedir, 'filter.d', fltFile)
            else:
                basedir = os.path.dirname(fltFile)
            if os.path.isfile(fltFile):
                break
            fltFile = None
    if re.search(r'^/{0,3}[\w/_\-.]+(?:\[.*\])?$', value):
        try:
            fltName, fltOpt = extractOptions(value)
            if "." in fltName[~5:]:
                tryNames = (fltName,)
            else:
                tryNames = (fltName, fltName + '.conf', fltName + '.local')
            for fltFile in tryNames:
                if not "/" in fltFile:
                    if os.path.basename(basedir) == 'filter.d':
                        fltFile = os.path.join(basedir, fltFile)
                    else:
                        fltFile = os.path.join(basedir, 'filter.d', fltFile)
                else:
                    basedir = os.path.dirname(fltFile)
                if os.path.isfile(fltFile):
                    break
                fltFile = None
        except Exception as e:
            output("ERROR: Wrong filter name or options: %s" % (str(e),))
            output(" while parsing: %s" % (value,))
            if self._verbose: raise(e)
            return False
# if it is filter file:
if fltFile is not None:
    if (basedir == self._opts.config
@ -712,10 +720,6 @@ class Fail2banRegex(object):
return True
def file_lines_gen(self, hdlr):
    for line in hdlr:
        yield self.decode_line(line)
def start(self, args):
    cmd_log, cmd_regex = args[:2]
@ -734,10 +738,10 @@
if os.path.isfile(cmd_log):
    try:
        hdlr = open(cmd_log, 'rb')
        test_lines = FileContainer(cmd_log, self._encoding, doOpen=True)
        self.output( "Use log file : %s" % cmd_log )
        self.output( "Use encoding : %s" % self._encoding )
        test_lines = self.file_lines_gen(hdlr)
    except IOError as e: # pragma: no cover
        output( e )
        return False


@ -140,9 +140,10 @@ class JailReader(ConfigReader):
# Read filter
flt = self.__opts["filter"]
if flt:
    filterName, filterOpt = extractOptions(flt)
    if not filterName:
        raise JailDefError("Invalid filter definition %r" % flt)
    try:
        filterName, filterOpt = extractOptions(flt)
    except ValueError as e:
        raise JailDefError("Invalid filter definition %r: %s" % (flt, e))
    self.__filter = FilterReader(
        filterName, self.__name, filterOpt,
        share_config=self.share_config, basedir=self.getBaseDir())
@ -174,10 +175,10 @@
if not act: # skip empty actions
    continue
# join with previous line if needed (consider possible new-line):
actName, actOpt = extractOptions(act)
prevln = ''
if not actName:
    raise JailDefError("Invalid action definition %r" % act)
try:
    actName, actOpt = extractOptions(act)
except ValueError as e:
    raise JailDefError("Invalid action definition %r: %s" % (act, e))
if actName.endswith(".py"):
    self.__actions.append([
        "set",


@ -371,7 +371,7 @@ OPTION_CRE = re.compile(r"^([^\[]+)(?:\[(.*)\])?\s*$", re.DOTALL)
# since v0.10 separator extended with `]\s*[` for support of multiple option groups, syntax
# `action = act[p1=...][p2=...]`
OPTION_EXTRACT_CRE = re.compile(
    r'([\w\-_\.]+)=(?:"([^"]*)"|\'([^\']*)\'|([^,\]]*))(?:,|\]\s*\[|$)', re.DOTALL)
    r'\s*([\w\-_\.]+)=(?:"([^"]*)"|\'([^\']*)\'|([^,\]]*))(?:,|\]\s*\[|$|(?P<wrngA>.+))|,?\s*$|(?P<wrngB>.+)', re.DOTALL)
# split by new-line considering possible new-lines within options [...]:
OPTION_SPLIT_CRE = re.compile(
    r'(?:[^\[\s]+(?:\s*\[\s*(?:[\w\-_\.]+=(?:"[^"]*"|\'[^\']*\'|[^,\]]*)\s*(?:,|\]\s*\[)?\s*)*\])?\s*|\S+)(?=\n\s*|\s+|$)', re.DOTALL)
@ -379,13 +379,19 @@ OPTION_SPLIT_CRE = re.compile(
def extractOptions(option):
    match = OPTION_CRE.match(option)
    if not match:
        # TODO proper error handling
        raise ValueError("unexpected option syntax")
        return None, None
    option_name, optstr = match.groups()
    option_opts = dict()
    if optstr:
        for optmatch in OPTION_EXTRACT_CRE.finditer(optstr):
            if optmatch.group("wrngA"):
                raise ValueError("unexpected syntax at %d after option %r: %s" % (
                    optmatch.start("wrngA"), optmatch.group(1), optmatch.group("wrngA")[0:25]))
            if optmatch.group("wrngB"):
                raise ValueError("expected option, wrong syntax at %d: %s" % (
                    optmatch.start("wrngB"), optmatch.group("wrngB")[0:25]))
            opt = optmatch.group(1)
            if not opt: continue
            value = [
                val for val in optmatch.group(2,3,4) if val is not None][0]
            option_opts[opt.strip()] = value.strip()
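For illustration, the option syntax handled above can be exercised standalone; the two regexes are copied from the hunk, while the surrounding logic below only paraphrases the error handling and is not the actual `extractOptions` implementation:

```python
# Parse "name[key=value, ...]" with the patterns shown above; malformed input
# now raises ValueError instead of silently yielding (None, None).
import re

OPTION_CRE = re.compile(r"^([^\[]+)(?:\[(.*)\])?\s*$", re.DOTALL)
OPTION_EXTRACT_CRE = re.compile(
    r'\s*([\w\-_\.]+)=(?:"([^"]*)"|\'([^\']*)\'|([^,\]]*))(?:,|\]\s*\[|$|(?P<wrngA>.+))|,?\s*$|(?P<wrngB>.+)', re.DOTALL)

def parse(option):
    match = OPTION_CRE.match(option)
    if not match:
        raise ValueError("unexpected option syntax")
    name, optstr = match.groups()
    opts = {}
    if optstr:
        for m in OPTION_EXTRACT_CRE.finditer(optstr):
            if m.group("wrngA") or m.group("wrngB"):
                raise ValueError("wrong syntax in %r" % option)
            if m.group(1):
                opts[m.group(1).strip()] = [v for v in m.group(2, 3, 4) if v is not None][0].strip()
    return name, opts

print(parse("cloudflare[cfuser=me@example.com, cftoken='secret']"))
# ('cloudflare', {'cfuser': 'me@example.com', 'cftoken': 'secret'})
try:
    parse("cloudflare[cfuser=me@example.com")   # unbalanced bracket
except ValueError as e:
    print("rejected:", e)
```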


@ -30,7 +30,10 @@ import tempfile
import threading
import time
from abc import ABCMeta
from collections import MutableMapping
try:
    from collections.abc import MutableMapping
except ImportError:
    from collections import MutableMapping
from .failregex import mapTag2Opt
from .ipdns import DNSUtils
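The try/except import above keeps the code working across interpreters: the abstract base classes moved to `collections.abc` in Python 3.3 and the old `collections` aliases were removed in Python 3.10, while Python 2 only has `collections`. The same pattern standalone, with a throwaway mapping class just to show the ABC is usable either way:

```python
try:                                     # Python 3
    from collections.abc import MutableMapping
except ImportError:                      # Python 2
    from collections import MutableMapping

class TagMap(MutableMapping):
    """Tiny dict-backed mapping; only demonstrates the compatible import."""
    def __init__(self):
        self._data = {}
    def __getitem__(self, key): return self._data[key]
    def __setitem__(self, key, value): self._data[key] = value
    def __delitem__(self, key): del self._data[key]
    def __iter__(self): return iter(self._data)
    def __len__(self): return len(self._data)

tags = TagMap()
tags["ip"] = "192.0.2.1"
print(dict(tags))   # {'ip': '192.0.2.1'}
```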


@ -28,7 +28,10 @@ import logging
import os
import sys
import time
from collections import Mapping
try:
    from collections.abc import Mapping
except ImportError:
    from collections import Mapping
try:
    from collections import OrderedDict
except ImportError:
@ -81,7 +84,7 @@ class Actions(JailThread, Mapping):
self._jail = jail
self._actions = OrderedDict()
## The ban manager.
self.__banManager = BanManager()
self.banManager = BanManager()
self.banEpoch = 0
self.__lastConsistencyCheckTM = 0
## Precedence of ban (over unban), so max number of tickets banned (to call an unban check):
@ -200,7 +203,7 @@ class Actions(JailThread, Mapping):
def setBanTime(self, value):
value = MyTime.str2seconds(value)
self.__banManager.setBanTime(value)
self.banManager.setBanTime(value)
logSys.info(" banTime: %s" % value)
##
@ -209,10 +212,10 @@ class Actions(JailThread, Mapping):
# @return the time
def getBanTime(self):
return self.__banManager.getBanTime()
return self.banManager.getBanTime()
def getBanned(self, ids):
lst = self.__banManager.getBanList()
lst = self.banManager.getBanList()
if not ids:
return lst
if len(ids) == 1:
@ -227,7 +230,7 @@ class Actions(JailThread, Mapping):
list
The list of banned IP addresses.
"""
return self.__banManager.getBanList(ordered=True, withTime=withTime)
return self.banManager.getBanList(ordered=True, withTime=withTime)
def addBannedIP(self, ip):
"""Ban an IP or list of IPs."""
@ -279,7 +282,7 @@ class Actions(JailThread, Mapping):
if db and self._jail.database is not None:
self._jail.database.delBan(self._jail, ip)
# Find the ticket with the IP.
ticket = self.__banManager.getTicketByID(ip)
ticket = self.banManager.getTicketByID(ip)
if ticket is not None:
# Unban the IP.
self.__unBan(ticket)
@ -288,7 +291,7 @@ class Actions(JailThread, Mapping):
if not isinstance(ip, IPAddr):
ipa = IPAddr(ip)
if not ipa.isSingle: # subnet (mask/cidr) or raw (may be dns/hostname):
ips = filter(ipa.contains, self.__banManager.getBanList())
ips = filter(ipa.contains, self.banManager.getBanList())
if ips:
return self.removeBannedIP(ips, db, ifexists)
# not found:
@ -347,7 +350,7 @@ class Actions(JailThread, Mapping):
continue
# wait for ban (stop if gets inactive, pending ban or unban):
bancnt = 0
wt = min(self.sleeptime, self.__banManager._nextUnbanTime - MyTime.time())
wt = min(self.sleeptime, self.banManager._nextUnbanTime - MyTime.time())
logSys.log(5, "Actions: wait for pending tickets %s (default %s)", wt, self.sleeptime)
if Utils.wait_for(lambda: not self.active or self._jail.hasFailTickets, wt):
bancnt = self.__checkBan()
@ -394,7 +397,12 @@ class Actions(JailThread, Mapping):
"ipfailures": lambda self: self._mi4ip(True).getAttempt(), "ipfailures": lambda self: self._mi4ip(True).getAttempt(),
"ipjailfailures": lambda self: self._mi4ip().getAttempt(), "ipjailfailures": lambda self: self._mi4ip().getAttempt(),
# raw ticket info: # raw ticket info:
"raw-ticket": lambda self: repr(self.__ticket) "raw-ticket": lambda self: repr(self.__ticket),
# jail info:
"jail.banned": lambda self: self.__jail.actions.banManager.size(),
"jail.banned_total": lambda self: self.__jail.actions.banManager.getBanTotal(),
"jail.found": lambda self: self.__jail.filter.failManager.size(),
"jail.found_total": lambda self: self.__jail.filter.failManager.getFailTotal()
} }
__slots__ = CallingMap.__slots__ + ('__ticket', '__jail', '__mi4ip') __slots__ = CallingMap.__slots__ + ('__ticket', '__jail', '__mi4ip')
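The new `jail.*` entries above are added to a lazily evaluated tag map: the values are callables that only run when an action actually interpolates the tag. A generic sketch of that idea (illustration only; fail2ban's `CallingMap` additionally caches results and uses `__slots__`):

```python
# Values may be callables that are evaluated on access rather than up front.
class LazyMap(dict):
    def __getitem__(self, key):
        value = dict.__getitem__(self, key)
        return value(self) if callable(value) else value

info = LazyMap({
    "ip": "192.0.2.1",
    "jail.banned": lambda self: 3,        # would ask the jail's ban manager
    "jail.found": lambda self: 7,         # would ask the jail's fail manager
})
print(info["ip"], info["jail.banned"], info["jail.found"])   # 192.0.2.1 3 7
```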
@ -491,11 +499,11 @@ class Actions(JailThread, Mapping):
for ticket in tickets:
bTicket = BanTicket.wrap(ticket)
btime = ticket.getBanTime(self.__banManager.getBanTime())
btime = ticket.getBanTime(self.banManager.getBanTime())
ip = bTicket.getIP()
aInfo = self._getActionInfo(bTicket)
reason = {}
if self.__banManager.addBanTicket(bTicket, reason=reason):
if self.banManager.addBanTicket(bTicket, reason=reason):
cnt += 1
# report ticket to observer, to check time should be increased and hereafter observer writes ban to database (asynchronous)
if Observers.Main is not None and not bTicket.restored: if Observers.Main is not None and not bTicket.restored:
@ -554,7 +562,7 @@ class Actions(JailThread, Mapping):
# and increase ticket time if "bantime.increment" set) # and increase ticket time if "bantime.increment" set)
if cnt: if cnt:
logSys.debug("Banned %s / %s, %s ticket(s) in %r", cnt, logSys.debug("Banned %s / %s, %s ticket(s) in %r", cnt,
self.__banManager.getBanTotal(), self.__banManager.size(), self._jail.name) self.banManager.getBanTotal(), self.banManager.size(), self._jail.name)
return cnt return cnt
def __reBan(self, ticket, actions=None, log=True): def __reBan(self, ticket, actions=None, log=True):
@ -594,7 +602,7 @@ class Actions(JailThread, Mapping):
def _prolongBan(self, ticket): def _prolongBan(self, ticket):
# prevent to prolong ticket that was removed in-between, # prevent to prolong ticket that was removed in-between,
# if it in ban list - ban time already prolonged (and it stays there): # if it in ban list - ban time already prolonged (and it stays there):
if not self.__banManager._inBanList(ticket): return if not self.banManager._inBanList(ticket): return
# do actions : # do actions :
aInfo = None aInfo = None
for name, action in self._actions.iteritems(): for name, action in self._actions.iteritems():
@ -619,13 +627,13 @@ class Actions(JailThread, Mapping):
Unban IP addresses which are outdated. Unban IP addresses which are outdated.
""" """
lst = self.__banManager.unBanList(MyTime.time(), maxCount) lst = self.banManager.unBanList(MyTime.time(), maxCount)
for ticket in lst: for ticket in lst:
self.__unBan(ticket) self.__unBan(ticket)
cnt = len(lst) cnt = len(lst)
if cnt: if cnt:
logSys.debug("Unbanned %s, %s ticket(s) in %r", logSys.debug("Unbanned %s, %s ticket(s) in %r",
cnt, self.__banManager.size(), self._jail.name) cnt, self.banManager.size(), self._jail.name)
return cnt return cnt
def __flushBan(self, db=False, actions=None, stop=False): def __flushBan(self, db=False, actions=None, stop=False):
@ -639,10 +647,10 @@ class Actions(JailThread, Mapping):
log = True log = True
if actions is None: if actions is None:
logSys.debug(" Flush ban list") logSys.debug(" Flush ban list")
lst = self.__banManager.flushBanList() lst = self.banManager.flushBanList()
else: else:
log = False # don't log "[jail] Unban ..." if removing actions only. log = False # don't log "[jail] Unban ..." if removing actions only.
lst = iter(self.__banManager) lst = iter(self.banManager)
cnt = 0 cnt = 0
# first we'll execute flush for actions supporting this operation: # first we'll execute flush for actions supporting this operation:
unbactions = {} unbactions = {}
@ -679,7 +687,7 @@ class Actions(JailThread, Mapping):
self.__unBan(ticket, actions=actions, log=log) self.__unBan(ticket, actions=actions, log=log)
cnt += 1 cnt += 1
logSys.debug(" Unbanned %s, %s ticket(s) in %r", logSys.debug(" Unbanned %s, %s ticket(s) in %r",
cnt, self.__banManager.size(), self._jail.name) cnt, self.banManager.size(), self._jail.name)
return cnt return cnt
def __unBan(self, ticket, actions=None, log=True): def __unBan(self, ticket, actions=None, log=True):
@ -722,18 +730,18 @@ class Actions(JailThread, Mapping):
logSys.warning("Unsupported extended jail status flavor %r. Supported: %s" % (flavor, supported_flavors)) logSys.warning("Unsupported extended jail status flavor %r. Supported: %s" % (flavor, supported_flavors))
# Always print this information (basic) # Always print this information (basic)
if flavor != "short": if flavor != "short":
banned = self.__banManager.getBanList() banned = self.banManager.getBanList()
cnt = len(banned) cnt = len(banned)
else: else:
cnt = self.__banManager.size() cnt = self.banManager.size()
ret = [("Currently banned", cnt), ret = [("Currently banned", cnt),
("Total banned", self.__banManager.getBanTotal())] ("Total banned", self.banManager.getBanTotal())]
if flavor != "short": if flavor != "short":
ret += [("Banned IP list", banned)] ret += [("Banned IP list", banned)]
if flavor == "cymru": if flavor == "cymru":
cymru_info = self.__banManager.getBanListExtendedCymruInfo() cymru_info = self.banManager.getBanListExtendedCymruInfo()
ret += \ ret += \
[("Banned ASN list", self.__banManager.geBanListExtendedASN(cymru_info)), [("Banned ASN list", self.banManager.geBanListExtendedASN(cymru_info)),
("Banned Country list", self.__banManager.geBanListExtendedCountry(cymru_info)), ("Banned Country list", self.banManager.geBanListExtendedCountry(cymru_info)),
("Banned RIR list", self.__banManager.geBanListExtendedRIR(cymru_info))] ("Banned RIR list", self.banManager.geBanListExtendedRIR(cymru_info))]
return ret return ret

@ -502,7 +502,7 @@ class Fail2BanDb(object):
except TypeError: except TypeError:
firstLineMD5 = None firstLineMD5 = None
if not firstLineMD5 and (pos or md5): if firstLineMD5 is None and (pos or md5 is not None):
cur.execute( cur.execute(
"INSERT OR REPLACE INTO logs(jail, path, firstlinemd5, lastfilepos) " "INSERT OR REPLACE INTO logs(jail, path, firstlinemd5, lastfilepos) "
"VALUES(?, ?, ?, ?)", (jail.name, name, md5, pos)) "VALUES(?, ?, ?, ?)", (jail.name, name, md5, pos))

@ -35,7 +35,7 @@ from ..helpers import getLogger
# Gets the instance of the logger. # Gets the instance of the logger.
logSys = getLogger(__name__) logSys = getLogger(__name__)
logLevel = 6 logLevel = 5
RE_DATE_PREMATCH = re.compile(r"(?<!\\)\{DATE\}", re.IGNORECASE) RE_DATE_PREMATCH = re.compile(r"(?<!\\)\{DATE\}", re.IGNORECASE)
DD_patternCache = Utils.Cache(maxCount=1000, maxTime=60*60) DD_patternCache = Utils.Cache(maxCount=1000, maxTime=60*60)

@ -136,7 +136,7 @@ class DateTemplate(object):
# remove possible special pattern "**" in front and end of regex: # remove possible special pattern "**" in front and end of regex:
regex = RE_DEL_WRD_BOUNDS[0].sub(RE_DEL_WRD_BOUNDS[1], regex) regex = RE_DEL_WRD_BOUNDS[0].sub(RE_DEL_WRD_BOUNDS[1], regex)
self._regex = regex self._regex = regex
logSys.log(7, ' constructed regex %s', regex) logSys.log(4, ' constructed regex %s', regex)
self._cRegex = None self._cRegex = None
regex = property(getRegex, setRegex, doc= regex = property(getRegex, setRegex, doc=
@ -159,6 +159,7 @@ class DateTemplate(object):
""" """
if not self._cRegex: if not self._cRegex:
self._compileRegex() self._compileRegex()
logSys.log(4, " search %s", self.regex)
dateMatch = self._cRegex.search(line, *args); # pos, endpos dateMatch = self._cRegex.search(line, *args); # pos, endpos
if dateMatch: if dateMatch:
self.hits += 1 self.hits += 1

@ -127,9 +127,10 @@ class FailManager:
return len(self.__failList) return len(self.__failList)
def cleanup(self, time): def cleanup(self, time):
time -= self.__maxTime
with self.__lock: with self.__lock:
todelete = [fid for fid,item in self.__failList.iteritems() \ todelete = [fid for fid,item in self.__failList.iteritems() \
if item.getTime() + self.__maxTime <= time] if item.getTime() <= time]
if len(todelete) == len(self.__failList): if len(todelete) == len(self.__failList):
# remove all: # remove all:
self.__failList = dict() self.__failList = dict()
@ -143,7 +144,7 @@ class FailManager:
else: else:
# create new dictionary without items to be deleted: # create new dictionary without items to be deleted:
self.__failList = dict((fid,item) for fid,item in self.__failList.iteritems() \ self.__failList = dict((fid,item) for fid,item in self.__failList.iteritems() \
if item.getTime() + self.__maxTime > time) if item.getTime() > time)
self.__bgSvc.service() self.__bgSvc.service()
def delFailure(self, fid): def delFailure(self, fid):
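
The cleanup change above shifts the `maxTime` offset into the threshold once, instead of adding it to every item's timestamp inside the comparison. A small equivalent sketch (plain dict instead of fail2ban's fail list):

```python
def cleanup(fail_times, now, max_time):
    """Drop entries not seen within the last max_time seconds."""
    threshold = now - max_time                  # computed once ...
    return {fid: t for fid, t in fail_times.items()
            if t > threshold}                   # ... not "t + max_time > now" per item

print(cleanup({"a": 100.0, "b": 190.0}, now=200.0, max_time=50.0))  # {'b': 190.0}
```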

@ -94,6 +94,8 @@ class Filter(JailThread):
## Store last time stamp, applicable for multi-line ## Store last time stamp, applicable for multi-line
self.__lastTimeText = "" self.__lastTimeText = ""
self.__lastDate = None self.__lastDate = None
## Next service (cleanup) time
self.__nextSvcTime = -(1<<63)
## if set, treat log lines without explicit time zone to be in this time zone ## if set, treat log lines without explicit time zone to be in this time zone
self.__logtimezone = None self.__logtimezone = None
## Default or preferred encoding (to decode bytes from file or journal): ## Default or preferred encoding (to decode bytes from file or journal):
@ -115,10 +117,10 @@ class Filter(JailThread):
self.checkFindTime = True self.checkFindTime = True
## shows that filter is in operation mode (processing new messages): ## shows that filter is in operation mode (processing new messages):
self.inOperation = True self.inOperation = True
## if true prevents against retarded banning in case of RC by too many failures (disabled only for test purposes):
self.banASAP = True
## Ticks counter ## Ticks counter
self.ticks = 0 self.ticks = 0
## Processed lines counter
self.procLines = 0
## Thread name: ## Thread name:
self.name="f2b/f."+self.jailName self.name="f2b/f."+self.jailName
@ -442,12 +444,23 @@ class Filter(JailThread):
def performBan(self, ip=None): def performBan(self, ip=None):
"""Performs a ban for IPs (or given ip) that are reached maxretry of the jail.""" """Performs a ban for IPs (or given ip) that are reached maxretry of the jail."""
-		try: # pragma: no branch - exception is the only way out
-			while True:
-				ticket = self.failManager.toBan(ip)
-				self.jail.putFailTicket(ticket)
-		except FailManagerEmpty:
-			self.failManager.cleanup(MyTime.time())
+		while True:
+			try:
+				ticket = self.failManager.toBan(ip)
+			except FailManagerEmpty:
+				break
+			self.jail.putFailTicket(ticket)
+			if ip: break
+		self.performSvc()
def performSvc(self, force=False):
"""Performs a service tasks (clean failure list)."""
tm = MyTime.time()
# avoid too early clean up:
if force or tm >= self.__nextSvcTime:
self.__nextSvcTime = tm + 5
# clean up failure list:
self.failManager.cleanup(tm)
def addAttempt(self, ip, *matches): def addAttempt(self, ip, *matches):
"""Generate a failed attempt for ip""" """Generate a failed attempt for ip"""
@ -695,11 +708,15 @@ class Filter(JailThread):
attempts = self.failManager.addFailure(tick) attempts = self.failManager.addFailure(tick)
# avoid RC on busy filter (too many failures) - if attempts for IP/ID reached maxretry, # avoid RC on busy filter (too many failures) - if attempts for IP/ID reached maxretry,
# we can speedup ban, so do it as soon as possible: # we can speedup ban, so do it as soon as possible:
if self.banASAP and attempts >= self.failManager.getMaxRetry(): if attempts >= self.failManager.getMaxRetry():
self.performBan(ip) self.performBan(ip)
# report to observer - failure was found, for possibly increasing of it retry counter (asynchronous) # report to observer - failure was found, for possibly increasing of it retry counter (asynchronous)
if Observers.Main is not None: if Observers.Main is not None:
Observers.Main.add('failureFound', self.failManager, self.jail, tick) Observers.Main.add('failureFound', self.failManager, self.jail, tick)
self.procLines += 1
# every 100 lines check need to perform service tasks:
if self.procLines % 100 == 0:
self.performSvc()
# reset (halve) error counter (successfully processed line): # reset (halve) error counter (successfully processed line):
if self._errors: if self._errors:
self._errors //= 2 self._errors //= 2
@ -1068,6 +1085,7 @@ class FileFilter(Filter):
# is created and is added to the FailManager. # is created and is added to the FailManager.
def getFailures(self, filename, inOperation=None): def getFailures(self, filename, inOperation=None):
if self.idle: return False
log = self.getLog(filename) log = self.getLog(filename)
if log is None: if log is None:
logSys.error("Unable to get failures in %s", filename) logSys.error("Unable to get failures in %s", filename)
@ -1113,14 +1131,14 @@ class FileFilter(Filter):
while not self.idle: while not self.idle:
line = log.readline() line = log.readline()
if not self.active: break; # jail has been stopped if not self.active: break; # jail has been stopped
if not line: if line is None:
# The jail reached the bottom, simply set in operation for this log # The jail reached the bottom, simply set in operation for this log
# (since we are first time at end of file, growing is only possible after modifications): # (since we are first time at end of file, growing is only possible after modifications):
log.inOperation = True log.inOperation = True
break break
# acquire in operation from log and process: # acquire in operation from log and process:
self.inOperation = inOperation if inOperation is not None else log.inOperation self.inOperation = inOperation if inOperation is not None else log.inOperation
self.processLineAndAdd(line.rstrip('\r\n')) self.processLineAndAdd(line)
finally: finally:
log.close() log.close()
db = self.jail.database db = self.jail.database
@ -1137,6 +1155,8 @@ class FileFilter(Filter):
if logSys.getEffectiveLevel() <= logging.DEBUG: if logSys.getEffectiveLevel() <= logging.DEBUG:
logSys.debug("Seek to find time %s (%s), file size %s", date, logSys.debug("Seek to find time %s (%s), file size %s", date,
MyTime.time2str(date), fs) MyTime.time2str(date), fs)
if not fs:
return
minp = container.getPos() minp = container.getPos()
maxp = fs maxp = fs
tryPos = minp tryPos = minp
@ -1160,8 +1180,8 @@ class FileFilter(Filter):
dateTimeMatch = None dateTimeMatch = None
nextp = None nextp = None
while True: while True:
line = container.readline() line = container.readline(False)
if not line: if line is None:
break break
(timeMatch, template) = self.dateDetector.matchTime(line) (timeMatch, template) = self.dateDetector.matchTime(line)
if timeMatch: if timeMatch:
@ -1258,25 +1278,34 @@ except ImportError: # pragma: no cover
class FileContainer: class FileContainer:
def __init__(self, filename, encoding, tail=False): def __init__(self, filename, encoding, tail=False, doOpen=False):
self.__filename = filename self.__filename = filename
self.waitForLineEnd = True
self.setEncoding(encoding) self.setEncoding(encoding)
self.__tail = tail self.__tail = tail
self.__handler = None self.__handler = None
self.__pos = 0
self.__pos4hash = 0
self.__hash = ''
self.__hashNextTime = time.time() + 30
# Try to open the file. Raises an exception if an error occurred. # Try to open the file. Raises an exception if an error occurred.
handler = open(filename, 'rb') handler = open(filename, 'rb')
-		stats = os.fstat(handler.fileno())
-		self.__ino = stats.st_ino
+		if doOpen: # fail2ban-regex only (don't need to reopen it and check for rotation)
+			self.__handler = handler
+			return
 		try:
-			firstLine = handler.readline()
-			# Computes the MD5 of the first line.
-			self.__hash = md5sum(firstLine).hexdigest()
-			# Start at the beginning of file if tail mode is off.
-			if tail:
-				handler.seek(0, 2)
-				self.__pos = handler.tell()
-			else:
-				self.__pos = 0
+			stats = os.fstat(handler.fileno())
+			self.__ino = stats.st_ino
+			if stats.st_size:
+				firstLine = handler.readline()
+				# first line available and contains new-line:
+				if firstLine != firstLine.rstrip(b'\r\n'):
+					# Computes the MD5 of the first line.
+					self.__hash = md5sum(firstLine).hexdigest()
+				# if tail mode scroll to the end of file
+				if tail:
+					handler.seek(0, 2)
+					self.__pos = handler.tell()
 		finally:
 			handler.close()
## shows that log is in operation mode (expecting new messages only from here): ## shows that log is in operation mode (expecting new messages only from here):
@ -1286,6 +1315,10 @@ class FileContainer:
return self.__filename return self.__filename
def getFileSize(self): def getFileSize(self):
h = self.__handler
if h is not None:
stats = os.fstat(h.fileno())
return stats.st_size
return os.path.getsize(self.__filename); return os.path.getsize(self.__filename);
def setEncoding(self, encoding): def setEncoding(self, encoding):
@ -1304,38 +1337,54 @@ class FileContainer:
def setPos(self, value): def setPos(self, value):
self.__pos = value self.__pos = value
-	def open(self):
-		self.__handler = open(self.__filename, 'rb')
-		# Set the file descriptor to be FD_CLOEXEC
-		fd = self.__handler.fileno()
-		flags = fcntl.fcntl(fd, fcntl.F_GETFD)
-		fcntl.fcntl(fd, fcntl.F_SETFD, flags | fcntl.FD_CLOEXEC)
-		# Stat the file before even attempting to read it
-		stats = os.fstat(self.__handler.fileno())
-		if not stats.st_size:
-			# yoh: so it is still an empty file -- nothing should be
-			# read from it yet
-			# print "D: no content -- return"
-			return False
-		firstLine = self.__handler.readline()
-		# Computes the MD5 of the first line.
-		myHash = md5sum(firstLine).hexdigest()
-		## print "D: fn=%s hashes=%s/%s inos=%s/%s pos=%s rotate=%s" % (
-		## 	self.__filename, self.__hash, myHash, stats.st_ino, self.__ino, self.__pos,
-		## 	self.__hash != myHash or self.__ino != stats.st_ino)
-		## sys.stdout.flush()
-		# Compare hash and inode
-		if self.__hash != myHash or self.__ino != stats.st_ino:
-			logSys.log(logging.MSG, "Log rotation detected for %s", self.__filename)
-			self.__hash = myHash
-			self.__ino = stats.st_ino
-			self.__pos = 0
-		# Sets the file pointer to the last position.
-		self.__handler.seek(self.__pos)
+	def open(self, forcePos=None):
+		h = open(self.__filename, 'rb')
+		try:
+			# Set the file descriptor to be FD_CLOEXEC
+			fd = h.fileno()
+			flags = fcntl.fcntl(fd, fcntl.F_GETFD)
+			fcntl.fcntl(fd, fcntl.F_SETFD, flags | fcntl.FD_CLOEXEC)
+			myHash = self.__hash
+			# Stat the file before even attempting to read it
+			stats = os.fstat(h.fileno())
+			rotflg = stats.st_size < self.__pos or stats.st_ino != self.__ino
+			if rotflg or not len(myHash) or time.time() > self.__hashNextTime:
+				myHash = ''
+				firstLine = h.readline()
+				# Computes the MD5 of the first line (if it is complete)
+				if firstLine != firstLine.rstrip(b'\r\n'):
+					myHash = md5sum(firstLine).hexdigest()
+					self.__hashNextTime = time.time() + 30
+			elif stats.st_size == self.__pos:
+				myHash = self.__hash
+			# Compare size, hash and inode
+			if rotflg or myHash != self.__hash:
+				if self.__hash != '':
+					logSys.log(logging.MSG, "Log rotation detected for %s, reason: %r", self.__filename,
+						(stats.st_size, self.__pos, stats.st_ino, self.__ino, myHash, self.__hash))
+				self.__ino = stats.st_ino
+				self.__pos = 0
+			self.__hash = myHash
+			# if nothing to read from file yet (empty or no new data):
+			if forcePos is not None:
+				self.__pos = forcePos
+			elif stats.st_size <= self.__pos:
+				return False
+			# Sets the file pointer to the last position.
+			h.seek(self.__pos)
+			# leave file open (to read content):
+			self.__handler = h; h = None
+		finally:
+			# close (no content or error only)
+			if h:
+				h.close(); h = None
 		return True
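
The reworked `open()` treats a log as rotated when the inode changed, the file shrank below the remembered position, or the MD5 of a complete first line no longer matches. A hedged, standalone approximation of that check (not fail2ban's exact code; `detect_rotation` and its arguments are illustrative):

```python
import hashlib, os

def detect_rotation(path, saved_ino, saved_pos, saved_md5):
    with open(path, "rb") as h:
        st = os.fstat(h.fileno())
        # shrunk below the saved position or inode changed -> surely rotated:
        rotated = st.st_size < saved_pos or st.st_ino != saved_ino
        first = h.readline()
        # hash only a *complete* first line (it still carries its new-line):
        md5 = hashlib.md5(first).hexdigest() if first != first.rstrip(b"\r\n") else ""
        return bool(rotated or (md5 and md5 != saved_md5))
```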
def seek(self, offs, endLine=True): def seek(self, offs, endLine=True):
h = self.__handler h = self.__handler
if h is None:
self.open(offs)
h = self.__handler
# seek to given position # seek to given position
h.seek(offs, 0) h.seek(offs, 0)
# goto end of next line # goto end of next line
@ -1353,6 +1402,9 @@ class FileContainer:
try: try:
return line.decode(enc, 'strict') return line.decode(enc, 'strict')
except (UnicodeDecodeError, UnicodeEncodeError) as e: except (UnicodeDecodeError, UnicodeEncodeError) as e:
# avoid warning if got incomplete end of line (e. g. '\n' in "...[0A" followed by "00]..." for utf-16le:
if (e.end == len(line) and line[e.start] in b'\r\n'):
return line[0:e.start].decode(enc, 'replace')
global _decode_line_warn global _decode_line_warn
lev = 7 lev = 7
if not _decode_line_warn.get(filename, 0): if not _decode_line_warn.get(filename, 0):
@ -1361,29 +1413,85 @@ class FileContainer:
logSys.log(lev, logSys.log(lev,
"Error decoding line from '%s' with '%s'.", filename, enc) "Error decoding line from '%s' with '%s'.", filename, enc)
if logSys.getEffectiveLevel() <= lev: if logSys.getEffectiveLevel() <= lev:
logSys.log(lev, "Consider setting logencoding=utf-8 (or another appropriate" logSys.log(lev,
" encoding) for this jail. Continuing" "Consider setting logencoding to appropriate encoding for this jail. "
" to process line ignoring invalid characters: %r", "Continuing to process line ignoring invalid characters: %r",
line) line)
# decode with replacing error chars: # decode with replacing error chars:
line = line.decode(enc, 'replace') line = line.decode(enc, 'replace')
return line return line
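
The new early-return above avoids a warning when the decode error sits exactly on a trailing new-line byte that is really half of a multi-byte character (e.g. utf-16le). Sketch of that tolerant decode, with illustrative names:

```python
def decode_tail_tolerant(raw, enc):
    try:
        return raw.decode(enc, "strict")
    except (UnicodeDecodeError, UnicodeEncodeError) as e:
        # error exactly at the buffer end, on a CR/LF byte: decode the clean prefix
        if e.end == len(raw) and raw[e.start:e.start + 1] in (b"\r", b"\n"):
            return raw[:e.start].decode(enc, "replace")
        return raw.decode(enc, "replace")

# "line\n" in utf-16le, cut in the middle of the trailing new-line character:
chunk = "line\n".encode("utf-16-le")[:-1]
print(repr(decode_tail_tolerant(chunk, "utf-16-le")))   # 'line'
```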
def readline(self): def readline(self, complete=True):
"""Read line from file
In opposite to pythons readline it doesn't return new-line,
so returns either the line if line is complete (and complete=True) or None
if line is not complete (and complete=True) or there is no content to read.
If line is complete (and complete is True), it also shift current known
position to begin of next line.
Also it is safe against interim new-line bytes (e. g. part of multi-byte char)
in given encoding.
"""
if self.__handler is None: if self.__handler is None:
return "" return ""
-		return FileContainer.decode_line(
-			self.getFileName(), self.getEncoding(), self.__handler.readline())
+		# read raw bytes up to \n char:
+		b = self.__handler.readline()
+		if not b:
+			return None
bl = len(b)
# convert to log-encoding (new-line char could disappear if it is part of multi-byte sequence):
r = FileContainer.decode_line(
self.getFileName(), self.getEncoding(), b)
# trim new-line at end and check the line was written complete (contains a new-line):
l = r.rstrip('\r\n')
if complete:
if l == r:
# try to fill buffer in order to find line-end in log encoding:
fnd = 0
while 1:
r = self.__handler.readline()
if not r:
break
b += r
bl += len(r)
# convert to log-encoding:
r = FileContainer.decode_line(
self.getFileName(), self.getEncoding(), b)
# ensure new-line is not in the middle (buffered 2 strings, e. g. in utf-16le it is "...[0A"+"00]..."):
e = r.find('\n')
if e >= 0 and e != len(r)-1:
l, r = r[0:e], r[0:e+1]
# back to bytes and get offset to seek after NL:
r = r.encode(self.getEncoding(), 'replace')
self.__handler.seek(-bl+len(r), 1)
return l
# trim new-line at end and check the line was written complete (contains a new-line):
l = r.rstrip('\r\n')
if l != r:
return l
if self.waitForLineEnd:
# not fulfilled - seek back and return:
self.__handler.seek(-bl, 1)
return None
return l
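
In short, the new `readline()` only hands out lines whose terminating new-line has already been written; an incomplete tail is left in the file (seek back) and re-read on the next poll. A much-simplified sketch of that behaviour for a plain binary handle (illustrative, not the implementation above):

```python
def read_complete_line(handle):
    """Return the next complete line (str) or None if none is available yet."""
    start = handle.tell()
    raw = handle.readline()
    if not raw:
        return None                     # nothing new in the file
    if not raw.endswith(b"\n"):
        handle.seek(start)              # half-written line: retry on next poll
        return None
    return raw.rstrip(b"\r\n").decode("utf-8", "replace")
```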
def close(self): def close(self):
if not self.__handler is None: if self.__handler is not None:
# Saves the last position. # Saves the last real position.
self.__pos = self.__handler.tell() self.__pos = self.__handler.tell()
# Closes the file. # Closes the file.
self.__handler.close() self.__handler.close()
self.__handler = None self.__handler = None
## print "D: Closed %s with pos %d" % (handler, self.__pos)
## sys.stdout.flush() def __iter__(self):
return self
def next(self):
line = self.readline()
if line is None:
self.close()
raise StopIteration
return line
_decode_line_warn = Utils.Cache(maxCount=1000, maxTime=24*60*60); _decode_line_warn = Utils.Cache(maxCount=1000, maxTime=24*60*60);

@ -55,7 +55,6 @@ class FilterGamin(FileFilter):
def __init__(self, jail): def __init__(self, jail):
FileFilter.__init__(self, jail) FileFilter.__init__(self, jail)
self.__modified = False
# Gamin monitor # Gamin monitor
self.monitor = gamin.WatchMonitor() self.monitor = gamin.WatchMonitor()
fd = self.monitor.get_fd() fd = self.monitor.get_fd()
@ -67,21 +66,9 @@ class FilterGamin(FileFilter):
logSys.log(4, "Got event: " + repr(event) + " for " + path) logSys.log(4, "Got event: " + repr(event) + " for " + path)
if event in (gamin.GAMCreated, gamin.GAMChanged, gamin.GAMExists): if event in (gamin.GAMCreated, gamin.GAMChanged, gamin.GAMExists):
logSys.debug("File changed: " + path) logSys.debug("File changed: " + path)
self.__modified = True
self.ticks += 1 self.ticks += 1
self._process_file(path)
def _process_file(self, path):
"""Process a given file
TODO -- RF:
this is a common logic and must be shared/provided by FileFilter
"""
self.getFailures(path) self.getFailures(path)
if not self.banASAP: # pragma: no cover
self.performBan()
self.__modified = False
## ##
# Add a log file path # Add a log file path
@ -128,6 +115,9 @@ class FilterGamin(FileFilter):
Utils.wait_for(lambda: not self.active or self._handleEvents(), Utils.wait_for(lambda: not self.active or self._handleEvents(),
self.sleeptime) self.sleeptime)
self.ticks += 1 self.ticks += 1
if self.ticks % 10 == 0:
self.performSvc()
logSys.debug("[%s] filter terminated", self.jailName) logSys.debug("[%s] filter terminated", self.jailName)
return True return True

@ -27,9 +27,7 @@ __license__ = "GPL"
import os import os
import time import time
from .failmanager import FailManagerEmpty
from .filter import FileFilter from .filter import FileFilter
from .mytime import MyTime
from .utils import Utils from .utils import Utils
from ..helpers import getLogger, logging from ..helpers import getLogger, logging
@ -55,7 +53,6 @@ class FilterPoll(FileFilter):
def __init__(self, jail): def __init__(self, jail):
FileFilter.__init__(self, jail) FileFilter.__init__(self, jail)
self.__modified = False
## The time of the last modification of the file. ## The time of the last modification of the file.
self.__prevStats = dict() self.__prevStats = dict()
self.__file404Cnt = dict() self.__file404Cnt = dict()
@ -115,13 +112,10 @@ class FilterPoll(FileFilter):
break break
for filename in modlst: for filename in modlst:
self.getFailures(filename) self.getFailures(filename)
self.__modified = True
self.ticks += 1 self.ticks += 1
if self.__modified: if self.ticks % 10 == 0:
if not self.banASAP: # pragma: no cover self.performSvc()
self.performBan()
self.__modified = False
except Exception as e: # pragma: no cover except Exception as e: # pragma: no cover
if not self.active: # if not active - error by stop... if not self.active: # if not active - error by stop...
break break

@ -75,7 +75,6 @@ class FilterPyinotify(FileFilter):
def __init__(self, jail): def __init__(self, jail):
FileFilter.__init__(self, jail) FileFilter.__init__(self, jail)
self.__modified = False
# Pyinotify watch manager # Pyinotify watch manager
self.__monitor = pyinotify.WatchManager() self.__monitor = pyinotify.WatchManager()
self.__notifier = None self.__notifier = None
@ -140,9 +139,6 @@ class FilterPyinotify(FileFilter):
""" """
if not self.idle: if not self.idle:
self.getFailures(path) self.getFailures(path)
if not self.banASAP: # pragma: no cover
self.performBan()
self.__modified = False
def _addPending(self, path, reason, isDir=False): def _addPending(self, path, reason, isDir=False):
if path not in self.__pending: if path not in self.__pending:
@ -352,9 +348,14 @@ class FilterPyinotify(FileFilter):
if not self.active: break if not self.active: break
self.__notifier.read_events() self.__notifier.read_events()
self.ticks += 1
# check pending files/dirs (logrotate ready): # check pending files/dirs (logrotate ready):
if not self.idle: if self.idle:
self._checkPending() continue
self._checkPending()
if self.ticks % 10 == 0:
self.performSvc()
except Exception as e: # pragma: no cover except Exception as e: # pragma: no cover
if not self.active: # if not active - error by stop... if not self.active: # if not active - error by stop...
@ -364,8 +365,6 @@ class FilterPyinotify(FileFilter):
# incr common error counter: # incr common error counter:
self.commonError() self.commonError()
self.ticks += 1
logSys.debug("[%s] filter exited (pyinotifier)", self.jailName) logSys.debug("[%s] filter exited (pyinotifier)", self.jailName)
self.__notifier = None self.__notifier = None

@ -94,6 +94,11 @@ class FilterSystemd(JournalFilter): # pragma: systemd no cover
# be sure all journal types will be opened if files specified (don't set flags): # be sure all journal types will be opened if files specified (don't set flags):
if 'files' not in args or not len(args['files']): if 'files' not in args or not len(args['files']):
args['flags'] = 4 args['flags'] = 4
try:
args['namespace'] = kwargs.pop('namespace')
except KeyError:
pass
return args return args
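
The added `try`/`except KeyError` above forwards an optional `namespace` jail option to the systemd journal reader only when it was configured. The same optional-forwarding pattern in isolation (function name and defaults below are illustrative):

```python
def build_journal_args(files=(), **kwargs):
    args = {}
    if files:
        args["files"] = list(files)
    else:
        args["flags"] = 4                    # open all journal types if no files given
    try:
        args["namespace"] = kwargs.pop("namespace")
    except KeyError:
        pass                                 # option not set: key stays absent
    return args

print(build_journal_args())                      # {'flags': 4}
print(build_journal_args(namespace="mail"))      # {'flags': 4, 'namespace': 'mail'}
```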
@ -317,13 +322,12 @@ class FilterSystemd(JournalFilter): # pragma: systemd no cover
break break
else: else:
break break
-				if self.__modified:
-					if not self.banASAP: # pragma: no cover
-						self.performBan()
-					self.__modified = 0
-					# update position in log (time and iso string):
-					if self.jail.database is not None:
-						self.jail.database.updateJournal(self.jail, 'systemd-journal', line[1], line[0][1])
+				self.__modified = 0
+				if self.ticks % 10 == 0:
+					self.performSvc()
+				# update position in log (time and iso string):
+				if self.jail.database is not None:
+					self.jail.database.updateJournal(self.jail, 'systemd-journal', line[1], line[0][1])
except Exception as e: # pragma: no cover except Exception as e: # pragma: no cover
if not self.active: # if not active - error by stop... if not self.active: # if not active - error by stop...
break break

@ -169,27 +169,31 @@ class DNSUtils:
DNSUtils.CACHE_ipToName.set(key, name) DNSUtils.CACHE_ipToName.set(key, name)
return name return name
# key find cached own hostnames (this tuple-key cannot be used elsewhere):
_getSelfNames_key = ('self','dns')
@staticmethod @staticmethod
def getSelfNames(): def getSelfNames():
"""Get own host names of self""" """Get own host names of self"""
# try find cached own hostnames (this tuple-key cannot be used elsewhere): # try find cached own hostnames:
key = ('self','dns') names = DNSUtils.CACHE_ipToName.get(DNSUtils._getSelfNames_key)
names = DNSUtils.CACHE_ipToName.get(key)
# get it using different ways (a set with names of localhost, hostname, fully qualified): # get it using different ways (a set with names of localhost, hostname, fully qualified):
if names is None: if names is None:
names = set([ names = set([
'localhost', DNSUtils.getHostname(False), DNSUtils.getHostname(True) 'localhost', DNSUtils.getHostname(False), DNSUtils.getHostname(True)
]) - set(['']) # getHostname can return '' ]) - set(['']) # getHostname can return ''
# cache and return : # cache and return :
DNSUtils.CACHE_ipToName.set(key, names) DNSUtils.CACHE_ipToName.set(DNSUtils._getSelfNames_key, names)
return names return names
# key to find cached own IPs (this tuple-key cannot be used elsewhere):
_getSelfIPs_key = ('self','ips')
@staticmethod @staticmethod
def getSelfIPs(): def getSelfIPs():
"""Get own IP addresses of self""" """Get own IP addresses of self"""
# try find cached own IPs (this tuple-key cannot be used elsewhere): # to find cached own IPs:
key = ('self','ips') ips = DNSUtils.CACHE_nameToIp.get(DNSUtils._getSelfIPs_key)
ips = DNSUtils.CACHE_nameToIp.get(key)
# get it using different ways (a set with IPs of localhost, hostname, fully qualified): # get it using different ways (a set with IPs of localhost, hostname, fully qualified):
if ips is None: if ips is None:
ips = set() ips = set()
@ -199,13 +203,30 @@ class DNSUtils:
except Exception as e: # pragma: no cover except Exception as e: # pragma: no cover
logSys.warning("Retrieving own IPs of %s failed: %s", hostname, e) logSys.warning("Retrieving own IPs of %s failed: %s", hostname, e)
# cache and return : # cache and return :
DNSUtils.CACHE_nameToIp.set(key, ips) DNSUtils.CACHE_nameToIp.set(DNSUtils._getSelfIPs_key, ips)
return ips return ips
_IPv6IsAllowed = None
@staticmethod
def setIPv6IsAllowed(value):
DNSUtils._IPv6IsAllowed = value
logSys.debug("IPv6 is %s", ('on' if value else 'off') if value is not None else 'auto')
return value
# key to find cached value of IPv6 allowance (this tuple-key cannot be used elsewhere):
_IPv6IsAllowed_key = ('self','ipv6-allowed')
@staticmethod @staticmethod
def IPv6IsAllowed(): def IPv6IsAllowed():
-		# return os.path.exists("/proc/net/if_inet6") || any((':' in ip) for ip in DNSUtils.getSelfIPs())
-		return any((':' in ip.ntoa) for ip in DNSUtils.getSelfIPs())
+		if DNSUtils._IPv6IsAllowed is not None:
+			return DNSUtils._IPv6IsAllowed
v = DNSUtils.CACHE_nameToIp.get(DNSUtils._IPv6IsAllowed_key)
if v is not None:
return v
v = any((':' in ip.ntoa) for ip in DNSUtils.getSelfIPs())
DNSUtils.CACHE_nameToIp.set(DNSUtils._IPv6IsAllowed_key, v)
return v
## ##
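
`IPv6IsAllowed()` is now a tri-state: an explicit on/off setting wins, otherwise the answer is detected once and cached. A rough illustration of the pattern, with a deliberately simplified auto-detection (not fail2ban's real check against its own IPs):

```python
import socket

_override = None       # True / False, or None for "auto"
_cached = None

def ipv6_is_allowed():
    global _cached
    if _override is not None:
        return _override                # explicit allowipv6 = yes/no
    if _cached is None:
        _cached = socket.has_ipv6       # simplified stand-in for the real detection
    return _cached

print(ipv6_is_allowed())
```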

@ -22,7 +22,10 @@ __copyright__ = "Copyright (c) 2004 Cyril Jaquier, 2013- Yaroslav Halchenko"
__license__ = "GPL" __license__ = "GPL"
from threading import Lock from threading import Lock
from collections import Mapping try:
from collections.abc import Mapping
except ImportError:
from collections import Mapping
from ..exceptions import DuplicateJailException, UnknownJailException from ..exceptions import DuplicateJailException, UnknownJailException
from .jail import Jail from .jail import Jail
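
The try/except import above keeps the `Mapping` ABC working on Python 3.3+ (`collections.abc`) as well as on Python 2 (`collections`). Typical use of that ABC, purely for illustration:

```python
try:
    from collections.abc import Mapping      # Python 3.3+
except ImportError:                          # pragma: no cover - Python 2
    from collections import Mapping

class ReadOnlyJails(Mapping):
    """Illustrative read-only, dict-like view (not fail2ban's Jails class)."""
    def __init__(self, jails):
        self._jails = dict(jails)
    def __getitem__(self, name):
        return self._jails[name]
    def __iter__(self):
        return iter(self._jails)
    def __len__(self):
        return len(self._jails)

print(len(ReadOnlyJails({"sshd": object()})))    # 1
```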

@ -232,7 +232,7 @@ class ObserverThread(JailThread):
if self._paused: if self._paused:
continue continue
else: else:
## notify event deleted (shutdown) - just sleep a litle bit (waiting for shutdown events, prevent high cpu usage) ## notify event deleted (shutdown) - just sleep a little bit (waiting for shutdown events, prevent high cpu usage)
time.sleep(ObserverThread.DEFAULT_SLEEP_INTERVAL) time.sleep(ObserverThread.DEFAULT_SLEEP_INTERVAL)
## stop by shutdown and empty queue : ## stop by shutdown and empty queue :
if not self.is_full: if not self.is_full:

@ -34,7 +34,7 @@ import sys
from .observer import Observers, ObserverThread from .observer import Observers, ObserverThread
from .jails import Jails from .jails import Jails
from .filter import FileFilter, JournalFilter from .filter import DNSUtils, FileFilter, JournalFilter
from .transmitter import Transmitter from .transmitter import Transmitter
from .asyncserver import AsyncServer, AsyncServerException from .asyncserver import AsyncServer, AsyncServerException
from .. import version from .. import version
@ -293,6 +293,11 @@ class Server:
for name in self.__jails.keys(): for name in self.__jails.keys():
self.delJail(name, stop=False, join=True) self.delJail(name, stop=False, join=True)
def clearCaches(self):
# we need to clear caches, to be able to recognize new IPs/families etc:
DNSUtils.CACHE_nameToIp.clear()
DNSUtils.CACHE_ipToName.clear()
def reloadJails(self, name, opts, begin): def reloadJails(self, name, opts, begin):
if begin: if begin:
# begin reload: # begin reload:
@ -314,6 +319,8 @@ class Server:
if "--restart" in opts: if "--restart" in opts:
self.stopJail(name) self.stopJail(name)
else: else:
# invalidate caches by reload
self.clearCaches()
# first unban all ips (will be not restored after (re)start): # first unban all ips (will be not restored after (re)start):
if "--unban" in opts: if "--unban" in opts:
self.setUnbanIP() self.setUnbanIP()
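
`clearCaches()` above is called on a reload without `--restart`, so re-resolved DNS names and a changed IPv6 situation are picked up without restarting the jails. A toy illustration of the design choice (cache object and names are made up):

```python
class SimpleCache(dict):
    def set(self, key, value):
        self[key] = value

name_to_ip = SimpleCache()
name_to_ip.set("example.com", {"192.0.2.1"})

def reload_jails(restart=False):
    if not restart:
        name_to_ip.clear()       # like Server.clearCaches(): force fresh lookups

reload_jails()
print(name_to_ip)                # {}
```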
@ -803,6 +810,11 @@ class Server:
logSys.info("flush performed on %s" % self.__logTarget) logSys.info("flush performed on %s" % self.__logTarget)
return "flushed" return "flushed"
@staticmethod
def setIPv6IsAllowed(value):
value = _as_bool(value) if value != 'auto' else None
return DNSUtils.setIPv6IsAllowed(value)
def setThreadOptions(self, value): def setThreadOptions(self, value):
for o, v in value.iteritems(): for o, v in value.iteritems():
if o == 'stacksize': if o == 'stacksize':

@ -30,17 +30,6 @@ locale_time = LocaleTime()
TZ_ABBR_RE = r"[A-Z](?:[A-Z]{2,4})?" TZ_ABBR_RE = r"[A-Z](?:[A-Z]{2,4})?"
FIXED_OFFSET_TZ_RE = re.compile(r"(%s)?([+-][01]\d(?::?\d{2})?)?$" % (TZ_ABBR_RE,)) FIXED_OFFSET_TZ_RE = re.compile(r"(%s)?([+-][01]\d(?::?\d{2})?)?$" % (TZ_ABBR_RE,))
def _getYearCentRE(cent=(0,3), distance=3, now=(MyTime.now(), MyTime.alternateNow)):
""" Build century regex for last year and the next years (distance).
Thereby respect possible run in the test-cases (alternate date used there)
"""
cent = lambda year, f=cent[0], t=cent[1]: str(year)[f:t]
exprset = set( cent(now[0].year + i) for i in (-1, distance) )
if len(now) and now[1]:
exprset |= set( cent(now[1].year + i) for i in (-1, distance) )
return "(?:%s)" % "|".join(exprset) if len(exprset) > 1 else "".join(exprset)
timeRE = TimeRE() timeRE = TimeRE()
# %k - one- or two-digit number giving the hour of the day (0-23) on a 24-hour clock, # %k - one- or two-digit number giving the hour of the day (0-23) on a 24-hour clock,
@ -63,20 +52,68 @@ timeRE['z'] = r"(?P<z>Z|UTC|GMT|[+-][01]\d(?::?\d{2})?)"
timeRE['ExZ'] = r"(?P<Z>%s)" % (TZ_ABBR_RE,) timeRE['ExZ'] = r"(?P<Z>%s)" % (TZ_ABBR_RE,)
timeRE['Exz'] = r"(?P<z>(?:%s)?[+-][01]\d(?::?\d{2})?|%s)" % (TZ_ABBR_RE, TZ_ABBR_RE) timeRE['Exz'] = r"(?P<z>(?:%s)?[+-][01]\d(?::?\d{2})?|%s)" % (TZ_ABBR_RE, TZ_ABBR_RE)
# overwrite default patterns, since they can be non-optimal:
timeRE['d'] = r"(?P<d>[1-2]\d|[0 ]?[1-9]|3[0-1])"
timeRE['m'] = r"(?P<m>0?[1-9]|1[0-2])"
timeRE['Y'] = r"(?P<Y>\d{4})"
timeRE['H'] = r"(?P<H>[0-1]?\d|2[0-3])"
timeRE['M'] = r"(?P<M>[0-5]?\d)"
timeRE['S'] = r"(?P<S>[0-5]?\d|6[0-1])"
# Extend build-in TimeRE with some exact patterns # Extend build-in TimeRE with some exact patterns
# exact two-digit patterns: # exact two-digit patterns:
timeRE['Exd'] = r"(?P<d>3[0-1]|[1-2]\d|0[1-9])" timeRE['Exd'] = r"(?P<d>[1-2]\d|0[1-9]|3[0-1])"
timeRE['Exm'] = r"(?P<m>1[0-2]|0[1-9])" timeRE['Exm'] = r"(?P<m>0[1-9]|1[0-2])"
timeRE['ExH'] = r"(?P<H>2[0-3]|[0-1]\d)" timeRE['ExH'] = r"(?P<H>[0-1]\d|2[0-3])"
timeRE['Exk'] = r" ?(?P<H>2[0-3]|[0-1]\d|\d)" timeRE['Exk'] = r" ?(?P<H>[0-1]?\d|2[0-3])"
timeRE['Exl'] = r" ?(?P<I>1[0-2]|\d)" timeRE['Exl'] = r" ?(?P<I>1[0-2]|\d)"
timeRE['ExM'] = r"(?P<M>[0-5]\d)" timeRE['ExM'] = r"(?P<M>[0-5]\d)"
timeRE['ExS'] = r"(?P<S>6[0-1]|[0-5]\d)" timeRE['ExS'] = r"(?P<S>[0-5]\d|6[0-1])"
-# more precise year patterns, within same century of last year and
-# the next 3 years (for possible long uptime of fail2ban); thereby
-# respect possible run in the test-cases (alternate date used there):
-timeRE['ExY'] = r"(?P<Y>%s\d)" % _getYearCentRE(cent=(0,3), distance=3)
-timeRE['Exy'] = r"(?P<y>%s\d)" % _getYearCentRE(cent=(2,3), distance=3)
+def _updateTimeRE():
+	def _getYearCentRE(cent=(0,3), distance=3, now=(MyTime.now(), MyTime.alternateNow)):
+		""" Build century regex for last year and the next years (distance).
+
+		Thereby respect possible run in the test-cases (alternate date used there)
+		"""
cent = lambda year, f=cent[0], t=cent[1]: str(year)[f:t]
def grp(exprset):
c = None
if len(exprset) > 1:
for i in exprset:
if c is None or i[0:-1] == c:
c = i[0:-1]
else:
c = None
break
if not c:
for i in exprset:
if c is None or i[0] == c:
c = i[0]
else:
c = None
break
if c:
return "%s%s" % (c, grp([i[len(c):] for i in exprset]))
return ("(?:%s)" % "|".join(exprset) if len(exprset[0]) > 1 else "[%s]" % "".join(exprset)) \
if len(exprset) > 1 else "".join(exprset)
exprset = set( cent(now[0].year + i) for i in (-1, distance) )
if len(now) > 1 and now[1]:
exprset |= set( cent(now[1].year + i) for i in xrange(-1, now[0].year-now[1].year+1, distance) )
return grp(sorted(list(exprset)))
# more precise year patterns, within same century of last year and
# the next 3 years (for possible long uptime of fail2ban); thereby
# respect possible run in the test-cases (alternate date used there):
if MyTime.alternateNowTime != 0:
timeRE['ExY'] = r"(?P<Y>%s\d)" % _getYearCentRE(cent=(0,3), distance=3)
timeRE['Exy'] = r"(?P<y>%s\d)" % _getYearCentRE(cent=(2,3), distance=3)
else: # accept years: 19xx|2xxx up to current century
timeRE['ExY'] = r"(?P<Y>(?:19\d{2}|%s\d))" % _getYearCentRE(cent=(0,3), distance=3,
now=(MyTime.now(), datetime.datetime.fromtimestamp(978393600)))
timeRE['Exy'] = r"(?P<y>\d{2})"
_updateTimeRE()
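
`_getYearCentRE`/`grp` above compress the century prefixes of the relevant years (last year up to a few years ahead) into a compact alternation, factoring out a shared leading digit. A simplified take on the same idea (not the exact helper above):

```python
import re

def year_prefix_pattern(years, width=3):
    prefixes = sorted({str(y)[:width] for y in years})
    if len(prefixes) == 1:
        return prefixes[0]
    if len({p[0] for p in prefixes}) == 1:
        # factor out the shared first character: ['202', '203'] -> '2(?:02|03)'
        return prefixes[0][0] + "(?:%s)" % "|".join(p[1:] for p in prefixes)
    return "(?:%s)" % "|".join(prefixes)

pat = year_prefix_pattern([2020, 2031])                      # '2(?:02|03)'
print(pat, bool(re.match(r"(?P<Y>%s\d)$" % pat, "2023")))    # 2(?:02|03) True
```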
def getTimePatternRE(): def getTimePatternRE():
keys = timeRE.keys() keys = timeRE.keys()
@ -168,9 +205,9 @@ def reGroupDictStrptime(found_dict, msec=False, default_tz=None):
""" """
now = \ now = \
year = month = day = hour = minute = tzoffset = \ year = month = day = tzoffset = \
weekday = julian = week_of_year = None weekday = julian = week_of_year = None
second = fraction = 0 hour = minute = second = fraction = 0
for key, val in found_dict.iteritems(): for key, val in found_dict.iteritems():
if val is None: continue if val is None: continue
# Directives not explicitly handled below: # Directives not explicitly handled below:

@ -173,6 +173,11 @@ class Transmitter:
return self.__server.getSyslogSocket() return self.__server.getSyslogSocket()
else: else:
raise Exception("Failed to change syslog socket") raise Exception("Failed to change syslog socket")
elif name == "allowipv6":
value = command[1]
self.__server.setIPv6IsAllowed(value)
if self.__quiet: return
return value
#Thread #Thread
elif name == "thread": elif name == "thread":
value = command[1] value = command[1]

@ -332,11 +332,9 @@ class Utils():
timeout_expr = lambda: time.time() > time0 timeout_expr = lambda: time.time() > time0
else: else:
timeout_expr = timeout timeout_expr = timeout
if not interval:
interval = Utils.DEFAULT_SLEEP_INTERVAL
if timeout_expr(): if timeout_expr():
break break
stm = min(stm + interval, Utils.DEFAULT_SLEEP_TIME) stm = min(stm + (interval or Utils.DEFAULT_SLEEP_INTERVAL), Utils.DEFAULT_SLEEP_TIME)
time.sleep(stm) time.sleep(stm)
return ret return ret
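
The `wait_for` change above folds the default interval into the sleep computation: the poll sleeps a little longer each round, capped at a maximum, until the condition holds or the timeout expires. A reduced sketch of that loop (constants are examples, not fail2ban's defaults):

```python
import time

def wait_for(cond, timeout, interval=None, max_sleep=1.0):
    deadline = time.time() + timeout
    sleep = 0.0
    while True:
        ret = cond()
        if ret or time.time() > deadline:
            return ret
        # grow the sleep step by step, but never beyond max_sleep:
        sleep = min(sleep + (interval or 0.01), max_sleep)
        time.sleep(sleep)

print(wait_for(lambda: True, timeout=1))    # True (returns immediately)
```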

@ -1,157 +0,0 @@
# emacs: -*- mode: python; py-indent-offset: 4; indent-tabs-mode: t -*-
# vi: set ft=python sts=4 ts=4 sw=4 noet :
# This file is part of Fail2Ban.
#
# Fail2Ban is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# Fail2Ban is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with Fail2Ban; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
import os
import unittest
import sys
from functools import wraps
from socket import timeout
from ssl import SSLError
from ..actiontestcase import CallingMap
from ..dummyjail import DummyJail
from ..servertestcase import IPAddr
from ..utils import LogCaptureTestCase, CONFIG_DIR
if sys.version_info >= (3, ): # pragma: 2.x no cover
from urllib.error import HTTPError, URLError
else: # pragma: 3.x no cover
from urllib2 import HTTPError, URLError
def skip_if_not_available(f):
"""Helper to decorate tests to skip in case of timeout/http-errors like "502 bad gateway".
"""
@wraps(f)
def wrapper(self, *args):
try:
return f(self, *args)
except (SSLError, HTTPError, URLError, timeout) as e: # pragma: no cover - timeout/availability issues
if not isinstance(e, timeout) and 'timed out' not in str(e):
if not hasattr(e, 'code') or e.code > 200 and e.code <= 404:
raise
raise unittest.SkipTest('Skip test because of %s' % e)
return wrapper
if sys.version_info >= (2,7): # pragma: no cover - may be unavailable
class BadIPsActionTest(LogCaptureTestCase):
available = True, None
pythonModule = None
modAction = None
@skip_if_not_available
def setUp(self):
"""Call before every test case."""
super(BadIPsActionTest, self).setUp()
unittest.F2B.SkipIfNoNetwork()
self.jail = DummyJail()
self.jail.actions.add("test")
pythonModuleName = os.path.join(CONFIG_DIR, "action.d", "badips.py")
# check availability (once if not alive, used shorter timeout as in test cases):
if BadIPsActionTest.available[0]:
if not BadIPsActionTest.modAction:
if not BadIPsActionTest.pythonModule:
BadIPsActionTest.pythonModule = self.jail.actions._load_python_module(pythonModuleName)
BadIPsActionTest.modAction = BadIPsActionTest.pythonModule.Action
self.jail.actions._load_python_module(pythonModuleName)
BadIPsActionTest.available = BadIPsActionTest.modAction.isAvailable(timeout=2 if unittest.F2B.fast else 30)
if not BadIPsActionTest.available[0]:
raise unittest.SkipTest('Skip test because service is not available: %s' % BadIPsActionTest.available[1])
self.jail.actions.add("badips", pythonModuleName, initOpts={
'category': "ssh",
'banaction': "test",
'age': "2w",
'score': 5,
#'key': "fail2ban-test-suite",
#'bankey': "fail2ban-test-suite",
'timeout': (3 if unittest.F2B.fast else 60),
})
self.action = self.jail.actions["badips"]
def tearDown(self):
"""Call after every test case."""
# Must cancel timer!
if self.action._timer:
self.action._timer.cancel()
super(BadIPsActionTest, self).tearDown()
@skip_if_not_available
def testCategory(self):
categories = self.action.getCategories()
self.assertIn("ssh", categories)
self.assertTrue(len(categories) >= 10)
self.assertRaises(
ValueError, setattr, self.action, "category",
"invalid-category")
# Not valid for reporting category...
self.assertRaises(
ValueError, setattr, self.action, "category", "mail")
# but valid for blacklisting.
self.action.bancategory = "mail"
@skip_if_not_available
def testScore(self):
self.assertRaises(ValueError, setattr, self.action, "score", -5)
self.action.score = 3
self.action.score = "3"
@skip_if_not_available
def testBanaction(self):
self.assertRaises(
ValueError, setattr, self.action, "banaction",
"invalid-action")
self.action.banaction = "test"
@skip_if_not_available
def testUpdateperiod(self):
self.assertRaises(
ValueError, setattr, self.action, "updateperiod", -50)
self.assertRaises(
ValueError, setattr, self.action, "updateperiod", 0)
self.action.updateperiod = 900
self.action.updateperiod = "900"
@skip_if_not_available
def testStartStop(self):
self.action.start()
self.assertTrue(len(self.action._bannedips) > 10,
"%s is fewer as 10: %r" % (len(self.action._bannedips), self.action._bannedips))
self.action.stop()
self.assertTrue(len(self.action._bannedips) == 0)
@skip_if_not_available
def testBanIP(self):
aInfo = CallingMap({
'ip': IPAddr('192.0.2.1')
})
self.action.ban(aInfo)
self.assertLogged('badips.com: ban', wait=True)
self.pruneLog()
# produce an error using wrong category/IP:
self.action._category = 'f2b-this-category-dont-available-test-suite-only'
aInfo['ip'] = ''
self.assertRaises(BadIPsActionTest.pythonModule.HTTPError, self.action.ban, aInfo)
self.assertLogged('IP is invalid', 'invalid category', wait=True, all=False)

@ -29,6 +29,7 @@ import unittest
from .utils import setUpMyTime, tearDownMyTime from .utils import setUpMyTime, tearDownMyTime
from ..server.banmanager import BanManager from ..server.banmanager import BanManager
from ..server.ipdns import DNSUtils
from ..server.ticket import BanTicket from ..server.ticket import BanTicket
class AddFailure(unittest.TestCase): class AddFailure(unittest.TestCase):
@ -176,10 +177,10 @@ class StatusExtendedCymruInfo(unittest.TestCase):
super(StatusExtendedCymruInfo, self).setUp() super(StatusExtendedCymruInfo, self).setUp()
unittest.F2B.SkipIfNoNetwork() unittest.F2B.SkipIfNoNetwork()
setUpMyTime() setUpMyTime()
self.__ban_ip = "93.184.216.34" self.__ban_ip = iter(DNSUtils.dnsToIp("resolver1.opendns.com")).next()
self.__asn = "15133" self.__asn = "36692"
self.__country = "EU" self.__country = "US"
self.__rir = "ripencc" self.__rir = "arin"
ticket = BanTicket(self.__ban_ip, 1167605999.0) ticket = BanTicket(self.__ban_ip, 1167605999.0)
self.__banManager = BanManager() self.__banManager = BanManager()
self.assertTrue(self.__banManager.addBanTicket(ticket)) self.assertTrue(self.__banManager.addBanTicket(ticket))

@ -381,13 +381,16 @@ class JailReaderTest(LogCaptureTestCase):
self.assertEqual(('mail.who_is', {'a':'cat', 'b':'dog'}), extractOptions("mail.who_is[a=cat,b=dog]")) self.assertEqual(('mail.who_is', {'a':'cat', 'b':'dog'}), extractOptions("mail.who_is[a=cat,b=dog]"))
self.assertEqual(('mail--ho_is', {}), extractOptions("mail--ho_is")) self.assertEqual(('mail--ho_is', {}), extractOptions("mail--ho_is"))
self.assertEqual(('mail--ho_is', {}), extractOptions("mail--ho_is['s']"))
#print(self.getLog())
#self.assertLogged("Invalid argument ['s'] in ''s''")
self.assertEqual(('mail', {'a': ','}), extractOptions("mail[a=',']")) self.assertEqual(('mail', {'a': ','}), extractOptions("mail[a=',']"))
self.assertEqual(('mail', {'a': 'b'}), extractOptions("mail[a=b, ]"))
#self.assertRaises(ValueError, extractOptions ,'mail-how[') self.assertRaises(ValueError, extractOptions ,'mail-how[')
self.assertRaises(ValueError, extractOptions, """mail[a="test with interim (wrong) "" quotes"]""")
self.assertRaises(ValueError, extractOptions, """mail[a='test with interim (wrong) '' quotes']""")
self.assertRaises(ValueError, extractOptions, """mail[a='x, y, z', b=x, y, z]""")
self.assertRaises(ValueError, extractOptions, """mail['s']""")
# Empty option # Empty option
option = "abc[]" option = "abc[]"
@ -455,8 +458,6 @@ class JailReaderTest(LogCaptureTestCase):
('sender', 'f2b-test@example.com'), ('blocklist_de_apikey', 'test-key'), ('sender', 'f2b-test@example.com'), ('blocklist_de_apikey', 'test-key'),
('action', ('action',
'%(action_blocklist_de)s\n' '%(action_blocklist_de)s\n'
'%(action_badips_report)s\n'
'%(action_badips)s\n'
'mynetwatchman[port=1234,protocol=udp,agent="%(fail2ban_agent)s"]' 'mynetwatchman[port=1234,protocol=udp,agent="%(fail2ban_agent)s"]'
), ),
)) ))
@ -470,16 +471,14 @@ class JailReaderTest(LogCaptureTestCase):
if len(cmd) <= 4: if len(cmd) <= 4:
continue continue
# differentiate between set and multi-set (wrop it here to single set): # differentiate between set and multi-set (wrop it here to single set):
if cmd[0] == 'set' and (cmd[4] == 'agent' or cmd[4].endswith('badips.py')): if cmd[0] == 'set' and cmd[4] == 'agent':
act.append(cmd) act.append(cmd)
elif cmd[0] == 'multi-set': elif cmd[0] == 'multi-set':
act.extend([['set'] + cmd[1:4] + o for o in cmd[4] if o[0] == 'agent']) act.extend([['set'] + cmd[1:4] + o for o in cmd[4] if o[0] == 'agent'])
useragent = 'Fail2Ban/%s' % version useragent = 'Fail2Ban/%s' % version
self.assertEqual(len(act), 4) self.assertEqual(len(act), 2)
self.assertEqual(act[0], ['set', 'blocklisttest', 'action', 'blocklist_de', 'agent', useragent]) self.assertEqual(act[0], ['set', 'blocklisttest', 'action', 'blocklist_de', 'agent', useragent])
self.assertEqual(act[1], ['set', 'blocklisttest', 'action', 'badips', 'agent', useragent]) self.assertEqual(act[1], ['set', 'blocklisttest', 'action', 'mynetwatchman', 'agent', useragent])
self.assertEqual(eval(act[2][5]).get('agent', '<wrong>'), useragent)
self.assertEqual(act[3], ['set', 'blocklisttest', 'action', 'mynetwatchman', 'agent', useragent])
@with_tmpdir @with_tmpdir
def testGlob(self, d): def testGlob(self, d):
@ -752,9 +751,9 @@ class JailsReaderTest(LogCaptureTestCase):
['add', 'tz_correct', 'auto'], ['add', 'tz_correct', 'auto'],
['start', 'tz_correct'], ['start', 'tz_correct'],
['config-error', ['config-error',
"Jail 'brokenactiondef' skipped, because of wrong configuration: Invalid action definition 'joho[foo'"], "Jail 'brokenactiondef' skipped, because of wrong configuration: Invalid action definition 'joho[foo': unexpected option syntax"],
['config-error', ['config-error',
"Jail 'brokenfilterdef' skipped, because of wrong configuration: Invalid filter definition 'flt[test'"], "Jail 'brokenfilterdef' skipped, because of wrong configuration: Invalid filter definition 'flt[test': unexpected option syntax"],
['config-error', ['config-error',
"Jail 'missingaction' skipped, because of wrong configuration: Unable to read action 'noactionfileforthisaction'"], "Jail 'missingaction' skipped, because of wrong configuration: Unable to read action 'noactionfileforthisaction'"],
['config-error', ['config-error',
@ -975,6 +974,7 @@ class JailsReaderTest(LogCaptureTestCase):
['set', 'syslogsocket', 'auto'], ['set', 'syslogsocket', 'auto'],
['set', 'loglevel', "INFO"], ['set', 'loglevel', "INFO"],
['set', 'logtarget', '/var/log/fail2ban.log'], ['set', 'logtarget', '/var/log/fail2ban.log'],
['set', 'allowipv6', 'auto'],
['set', 'dbfile', '/var/lib/fail2ban/fail2ban.sqlite3'], ['set', 'dbfile', '/var/lib/fail2ban/fail2ban.sqlite3'],
['set', 'dbmaxmatches', 10], ['set', 'dbmaxmatches', 10],
['set', 'dbpurgeage', '1d'], ['set', 'dbpurgeage', '1d'],

@ -29,7 +29,7 @@ import tempfile
import sqlite3 import sqlite3
import shutil import shutil
from ..server.filter import FileContainer from ..server.filter import FileContainer, Filter
from ..server.mytime import MyTime from ..server.mytime import MyTime
from ..server.ticket import FailTicket from ..server.ticket import FailTicket
from ..server.actions import Actions, Utils from ..server.actions import Actions, Utils
@ -212,19 +212,20 @@ class DatabaseTest(LogCaptureTestCase):
self.jail.name in self.db.getJailNames(True), self.jail.name in self.db.getJailNames(True),
"Jail not added to database") "Jail not added to database")
def testAddLog(self): def _testAddLog(self):
self.testAddJail() # Jail required self.testAddJail() # Jail required
_, filename = tempfile.mkstemp(".log", "Fail2BanDb_") _, filename = tempfile.mkstemp(".log", "Fail2BanDb_")
self.fileContainer = FileContainer(filename, "utf-8") self.fileContainer = FileContainer(filename, "utf-8")
self.db.addLog(self.jail, self.fileContainer) pos = self.db.addLog(self.jail, self.fileContainer)
self.assertTrue(pos is None); # unknown previously
self.assertIn(filename, self.db.getLogPaths(self.jail)) self.assertIn(filename, self.db.getLogPaths(self.jail))
os.remove(filename) os.remove(filename)
def testUpdateLog(self): def testUpdateLog(self):
self.testAddLog() # Add log file self._testAddLog() # Add log file
# Write some text # Write some text
filename = self.fileContainer.getFileName() filename = self.fileContainer.getFileName()
@ -544,17 +545,21 @@ class DatabaseTest(LogCaptureTestCase):
self.testAddJail() # Jail required self.testAddJail() # Jail required
self.jail.database = self.db self.jail.database = self.db
self.db.addJail(self.jail) self.db.addJail(self.jail)
actions = Actions(self.jail) actions = self.jail.actions
actions.add( actions.add(
"action_checkainfo", "action_checkainfo",
os.path.join(TEST_FILES_DIR, "action.d/action_checkainfo.py"), os.path.join(TEST_FILES_DIR, "action.d/action_checkainfo.py"),
{}) {})
actions.banManager.setBanTotal(20)
self.jail._Jail__filter = flt = Filter(self.jail)
flt.failManager.setFailTotal(50)
ticket = FailTicket("1.2.3.4") ticket = FailTicket("1.2.3.4")
ticket.setAttempt(5) ticket.setAttempt(5)
ticket.setMatches(['test', 'test']) ticket.setMatches(['test', 'test'])
self.jail.putFailTicket(ticket) self.jail.putFailTicket(ticket)
actions._Actions__checkBan() actions._Actions__checkBan()
self.assertLogged("ban ainfo %s, %s, %s, %s" % (True, True, True, True)) self.assertLogged("ban ainfo %s, %s, %s, %s" % (True, True, True, True))
self.assertLogged("jail info %d, %d, %d, %d" % (1, 21, 0, 50))
def testDelAndAddJail(self): def testDelAndAddJail(self):
self.testAddJail() # Add jail self.testAddJail() # Add jail

@ -554,6 +554,9 @@ class CustomDateFormatsTest(unittest.TestCase):
(1123970401.0, "^%ExH:%ExM:%ExS**", '00:00:01'), (1123970401.0, "^%ExH:%ExM:%ExS**", '00:00:01'),
# cover date with current year, in test cases now == Aug 2005 -> back to last year (Sep 2004): # cover date with current year, in test cases now == Aug 2005 -> back to last year (Sep 2004):
(1094068799.0, "^%m/%d %ExH:%ExM:%ExS**", '09/01 21:59:59'), (1094068799.0, "^%m/%d %ExH:%ExM:%ExS**", '09/01 21:59:59'),
# no time (only date) in pattern, assume local 00:00:00 for H:M:S :
(1093989600.0, "^%Y-%m-%d**", '2004-09-01'),
(1093996800.0, "^%Y-%m-%d%z**", '2004-09-01Z'),
): ):
logSys.debug('== test: %r', (matched, dp, line)) logSys.debug('== test: %r', (matched, dp, line))
dd = DateDetector() dd = DateDetector()

@ -230,7 +230,7 @@ def _start_params(tmp, use_stock=False, use_stock_cfg=None,
os.symlink(os.path.abspath(pjoin(STOCK_CONF_DIR, n)), pjoin(cfg, n)) os.symlink(os.path.abspath(pjoin(STOCK_CONF_DIR, n)), pjoin(cfg, n))
if create_before_start: if create_before_start:
for n in create_before_start: for n in create_before_start:
_write_file(n % {'tmp': tmp}, 'w', '') _write_file(n % {'tmp': tmp}, 'w')
# parameters (sock/pid and config, increase verbosity, set log, etc.): # parameters (sock/pid and config, increase verbosity, set log, etc.):
vvv, llev = (), "INFO" vvv, llev = (), "INFO"
if unittest.F2B.log_level < logging.INFO: # pragma: no cover if unittest.F2B.log_level < logging.INFO: # pragma: no cover
@ -937,10 +937,8 @@ class Fail2banServerTest(Fail2banClientServerBase):
"Jail 'broken-jail' skipped, because of wrong configuration", all=True) "Jail 'broken-jail' skipped, because of wrong configuration", all=True)
# enable both jails, 3 logs for jail1, etc... # enable both jails, 3 logs for jail1, etc...
# truncate test-log - we should not find unban/ban again by reload:
self.pruneLog("[test-phase 1b]") self.pruneLog("[test-phase 1b]")
_write_jail_cfg(actions=[1,2]) _write_jail_cfg(actions=[1,2])
_write_file(test1log, "w+")
if unittest.F2B.log_level < logging.DEBUG: # pragma: no cover if unittest.F2B.log_level < logging.DEBUG: # pragma: no cover
_out_file(test1log) _out_file(test1log)
self.execCmd(SUCCESS, startparams, "reload") self.execCmd(SUCCESS, startparams, "reload")
@ -1003,7 +1001,7 @@ class Fail2banServerTest(Fail2banClientServerBase):
self.pruneLog("[test-phase 2b]") self.pruneLog("[test-phase 2b]")
# write new failures: # write new failures:
_write_file(test2log, "w+", *( _write_file(test2log, "a+", *(
(str(int(MyTime.time())) + " error 403 from 192.0.2.2: test 2",) * 3 + (str(int(MyTime.time())) + " error 403 from 192.0.2.2: test 2",) * 3 +
(str(int(MyTime.time())) + " error 403 from 192.0.2.3: test 2",) * 3 + (str(int(MyTime.time())) + " error 403 from 192.0.2.3: test 2",) * 3 +
(str(int(MyTime.time())) + " failure 401 from 192.0.2.4: test 2",) * 3 + (str(int(MyTime.time())) + " failure 401 from 192.0.2.4: test 2",) * 3 +
@ -1062,10 +1060,6 @@ class Fail2banServerTest(Fail2banClientServerBase):
self.assertEqual(self.execCmdDirect(startparams,
'get', 'test-jail1', 'banned', '192.0.2.3', '192.0.2.9')[1], [1, 0])
-# rotate logs:
-_write_file(test1log, "w+")
-_write_file(test2log, "w+")
# restart jail without unban all:
self.pruneLog("[test-phase 2c]")
self.execCmd(SUCCESS, startparams,
@ -1183,7 +1177,7 @@ class Fail2banServerTest(Fail2banClientServerBase):
-# now write failures again and check already banned (jail1 was alive the whole time) and new bans occurred (jail1 was alive the whole time):
+# now write failures again and check already banned and new bans occurred (jail1 was alive the whole time):
self.pruneLog("[test-phase 5]")
-_write_file(test1log, "w+", *(
+_write_file(test1log, "a+", *(
(str(int(MyTime.time())) + " failure 401 from 192.0.2.1: test 5",) * 3 +
(str(int(MyTime.time())) + " error 403 from 192.0.2.5: test 5",) * 3 +
(str(int(MyTime.time())) + " failure 401 from 192.0.2.6: test 5",) * 3
@ -1326,7 +1320,7 @@ class Fail2banServerTest(Fail2banClientServerBase):
'backend = polling',
'usedns = no',
'logpath = %(tmp)s/blck-failures.log',
-'action = nginx-block-map[blck_lst_reload="", blck_lst_file="%(tmp)s/blck-lst.map"]',
+'action = nginx-block-map[srv_cmd="echo nginx", srv_pid="%(tmp)s/f2b.pid", blck_lst_file="%(tmp)s/blck-lst.map"]',
' blocklist_de[actionban=\'curl() { echo "*** curl" "$*";}; <Definition/actionban>\', email="Fail2Ban <fail2ban@localhost>", '
'apikey="TEST-API-KEY", agent="fail2ban-test-agent", service=<name>]',
'filter =',
@ -1366,6 +1360,8 @@ class Fail2banServerTest(Fail2banClientServerBase):
self.assertIn('\\125-000-004 1;\n', mp)
self.assertIn('\\125-000-005 1;\n', mp)
+# check nginx reload is logged (pid of fail2ban is used to simulate success check nginx is running):
+self.assertLogged("stdout: 'nginx -qt'", "stdout: 'nginx -s reload'", all=True)
# check blocklist_de substitution (e. g. new-line after <matches>):
self.assertLogged(
"stdout: '*** curl --fail --data-urlencode server=Fail2Ban <fail2ban@localhost>"
@ -1408,8 +1404,9 @@ class Fail2banServerTest(Fail2banClientServerBase):
'jails': (
# default:
'''test_action = dummy[actionstart_on_demand=1, init="start: %(__name__)s", target="%(tmp)s/test.txt",
-actionban='<known/actionban>;
+actionban='<known/actionban>; echo "found: <jail.found> / <jail.found_total>, banned: <jail.banned> / <jail.banned_total>"
-echo "<matches>"; printf "=====\\n%%b\\n=====\\n\\n" "<matches>" >> <target>']'''
+echo "<matches>"; printf "=====\\n%%b\\n=====\\n\\n" "<matches>" >> <target>',
+actionstop='<known/actionstop>; echo "stats <name> - found: <jail.found_total>, banned: <jail.banned_total>"']'''
# jail sendmail-auth:
'[sendmail-auth]',
'backend = polling',
@ -1454,7 +1451,8 @@ class Fail2banServerTest(Fail2banClientServerBase):
_write_file(lgfn, "w+", *smaut_msg)
# wait and check it caused banned (and dump in the test-file):
self.assertLogged(
-"[sendmail-auth] Ban 192.0.2.1", "1 ticket(s) in 'sendmail-auth'", all=True, wait=MID_WAITTIME)
+"[sendmail-auth] Ban 192.0.2.1", "stdout: 'found: 0 / 3, banned: 1 / 1'",
+"1 ticket(s) in 'sendmail-auth'", all=True, wait=MID_WAITTIME)
_out_file(tofn)
td = _read_file(tofn)
# check matches (maxmatches = 2, so only 2 & 3 available):
@ -1465,10 +1463,11 @@ class Fail2banServerTest(Fail2banClientServerBase):
self.pruneLog("[test-phase sendmail-reject]")
# write log:
-_write_file(lgfn, "w+", *smrej_msg)
+_write_file(lgfn, "a+", *smrej_msg)
# wait and check it caused banned (and dump in the test-file):
self.assertLogged(
-"[sendmail-reject] Ban 192.0.2.2", "1 ticket(s) in 'sendmail-reject'", all=True, wait=MID_WAITTIME)
+"[sendmail-reject] Ban 192.0.2.2", "stdout: 'found: 0 / 3, banned: 1 / 1'",
+"1 ticket(s) in 'sendmail-reject'", all=True, wait=MID_WAITTIME)
_out_file(tofn)
td = _read_file(tofn)
# check matches (no maxmatches, so all matched messages are available):
@ -1482,6 +1481,8 @@ class Fail2banServerTest(Fail2banClientServerBase):
# wait a bit:
self.assertLogged(
"Reload finished.",
+"stdout: 'stats sendmail-auth - found: 3, banned: 1'",
+"stdout: 'stats sendmail-reject - found: 3, banned: 1'",
"[sendmail-auth] Restore Ban 192.0.2.1", "1 ticket(s) in 'sendmail-auth'", all=True, wait=MID_WAITTIME)
# check matches again - (dbmaxmatches = 1), so it should be only last match after restart:
td = _read_file(tofn)
@ -1590,7 +1591,7 @@ class Fail2banServerTest(Fail2banClientServerBase):
wakeObs = False
_observer_wait_before_incrban(lambda: wakeObs)
# write again (IP already bad):
-_write_file(test1log, "w+", *(
+_write_file(test1log, "a+", *(
(str(int(MyTime.time())) + " failure 401 from 192.0.2.11: I'm very bad \"hacker\" `` $(echo test)",) * 2
))
# wait for ban:

View File

@ -25,6 +25,7 @@ __license__ = "GPL"
import os
import sys
+import tempfile
import unittest
from ..client import fail2banregex
@ -80,6 +81,11 @@ def _test_exec_command_line(*args):
sys.stderr = _org['stderr']
return _exit_code
+def _reset():
+# reset global warn-counter:
+from ..server.filter import _decode_line_warn
+_decode_line_warn.clear()
STR_00 = "Dec 31 11:59:59 [sshd] error: PAM: Authentication failure for kevin from 192.0.2.0"
STR_00_NODT = "[sshd] error: PAM: Authentication failure for kevin from 192.0.2.0"
@ -122,6 +128,7 @@ class Fail2banRegexTest(LogCaptureTestCase):
"""Call before every test case."""
LogCaptureTestCase.setUp(self)
setUpMyTime()
+_reset()
def tearDown(self):
"""Call after every test case."""
@ -141,6 +148,12 @@ class Fail2banRegexTest(LogCaptureTestCase):
))
self.assertLogged("Unable to compile regular expression")
+def testWrongFilterOptions(self):
+self.assertFalse(_test_exec(
+"test", "flt[a='x,y,z',b=z,y,x]"
+))
+self.assertLogged("Wrong filter name or options", "wrong syntax at 14: y,x", all=True)
def testDirectFound(self):
self.assertTrue(_test_exec(
"--datepattern", r"^(?:%a )?%b %d %H:%M:%S(?:\.%f)?(?: %ExY)?",
@ -448,14 +461,8 @@ class Fail2banRegexTest(LogCaptureTestCase):
FILENAME_ZZZ_GEN, FILENAME_ZZZ_GEN
))
-def _reset(self):
-# reset global warn-counter:
-from ..server.filter import _decode_line_warn
-_decode_line_warn.clear()
def testWronChar(self):
unittest.F2B.SkipIfCfgMissing(stock=True)
-self._reset()
self.assertTrue(_test_exec(
"-l", "notice", # put down log-level, because of too many debug-messages
"--datepattern", r"^(?:%a )?%b %d %H:%M:%S(?:\.%f)?(?: %ExY)?",
@ -471,7 +478,6 @@ class Fail2banRegexTest(LogCaptureTestCase):
def testWronCharDebuggex(self):
unittest.F2B.SkipIfCfgMissing(stock=True)
-self._reset()
self.assertTrue(_test_exec(
"-l", "notice", # put down log-level, because of too many debug-messages
"--datepattern", r"^(?:%a )?%b %d %H:%M:%S(?:\.%f)?(?: %ExY)?",
@ -484,6 +490,36 @@ class Fail2banRegexTest(LogCaptureTestCase):
self.assertLogged('https://')
def testNLCharAsPartOfUniChar(self):
fname = tempfile.mktemp(prefix='tmp_fail2ban', suffix='uni')
# test two multi-byte encodings (both contains `\x0A` in either \x02\x0A or \x0A\x02):
for enc in ('utf-16be', 'utf-16le'):
self.pruneLog("[test-phase encoding=%s]" % enc)
try:
fout = open(fname, 'wb')
# test on unicode string containing \x0A as part of uni-char,
# it must produce exactly 2 lines (both are failures):
for l in (
u'1490349000 \u20AC Failed auth: invalid user Test\u020A from 192.0.2.1\n',
u'1490349000 \u20AC Failed auth: invalid user TestI from 192.0.2.2\n'
):
fout.write(l.encode(enc))
fout.close()
self.assertTrue(_test_exec(
"-l", "notice", # put down log-level, because of too many debug-messages
"--encoding", enc,
"--datepattern", r"^EPOCH",
fname, r"Failed .* from <HOST>",
))
self.assertLogged(" encoding : %s" % enc,
"Lines: 2 lines, 0 ignored, 2 matched, 0 missed", all=True)
self.assertNotLogged("Missed line(s)")
finally:
fout.close()
os.unlink(fname)
def testExecCmdLine_Usage(self):
self.assertNotEqual(_test_exec_command_line(), 0)
self.pruneLog()

View File

@ -8,6 +8,9 @@ class TestAction(ActionBase):
self._logSys.info("ban ainfo %s, %s, %s, %s",
aInfo["ipmatches"] != '', aInfo["ipjailmatches"] != '', aInfo["ipfailures"] > 0, aInfo["ipjailfailures"] > 0
)
+self._logSys.info("jail info %d, %d, %d, %d",
+aInfo["jail.banned"], aInfo["jail.banned_total"], aInfo["jail.found"], aInfo["jail.found_total"]
+)
def unban(self, aInfo):
pass

View File

@ -3,6 +3,8 @@
[Tue Mar 16 15:39:29 2010] [error] [client 58.179.109.179] Invalid URI in request \xf9h\xa9\xf3\x88\x8cXKj \xbf-l*4\x87n\xe4\xfe\xd4\x1d\x06\x8c\xf8m\\rS\xf6n\xeb\x8
# failJSON: { "time": "2010-03-15T15:44:47", "match": true , "host": "121.222.2.133" }
[Mon Mar 15 15:44:47 2010] [error] [client 121.222.2.133] Invalid URI in request n\xed*\xbe*\xab\xefd\x80\xb5\xae\xf6\x01\x10M?\xf2\xce\x13\x9c\xd7\xa0N\xa7\xdb%0\xde\xe0\xfc\xd2\xa0\xfe\xe9w\xee\xc4`v\x9b[{\x0c:\xcb\x93\xc6\xa0\x93\x9c`l\\\x8d\xc9
+# failJSON: { "time": "2010-03-15T16:04:06", "match": true , "host": "192.0.2.1", "desc": "AH00126 failure, gh-2908" }
+[Sat Mar 15 16:04:06.105212 2010] [core:error] [pid 17408] [client 192.0.2.1:55280] AH00126: Invalid URI in request GET /static/../../../a/../../../../etc/passwd HTTP/1.1
# http://forum.nconf.org/viewtopic.php?f=14&t=427&p=1488
# failJSON: { "time": "2010-07-30T11:23:54", "match": true , "host": "10.85.6.69" }

View File

@ -19,6 +19,8 @@
[2012-02-13 17:44:26] NOTICE[1638] chan_iax2.c: Host 1.2.3.4 failed MD5 authentication for 'Fail2ban' (e7df7cd2ca07f4f1ab415d457a6e1c13 != 53ac4bc41ee4ec77888ed4aa50677247)
# failJSON: { "time": "2013-02-05T23:44:42", "match": true , "host": "1.2.3.4" }
[2013-02-05 23:44:42] NOTICE[436][C-00000fa9] chan_sip.c: Call from '' (1.2.3.4:10836) to extension '0972598285108' rejected because extension not found in context 'default'.
+# failJSON: { "time": "2005-01-18T17:39:50", "match": true , "host": "1.2.3.4" }
+[Jan 18 17:39:50] NOTICE[12049]: res_pjsip_session.c:2337 new_invite: Call from 'anonymous' (TCP:[1.2.3.4]:61470) to extension '9011+442037690237' rejected because extension not found in context 'default'.
# failJSON: { "time": "2013-03-26T15:47:54", "match": true , "host": "1.2.3.4" }
[2013-03-26 15:47:54] NOTICE[1237] chan_sip.c: Registration from '"100"sip:100@1.2.3.4' failed for '1.2.3.4:23930' - No matching peer found
# failJSON: { "time": "2013-05-13T07:10:53", "match": true , "host": "1.2.3.4" }

View File

@ -3,5 +3,15 @@ Apr 26 13:15:25 webserver example.com: https://example.com|1430068525|user|1.2.3
# failJSON: { "time": "2005-04-26T13:15:25", "match": true , "host": "1.2.3.4" }
Apr 26 13:15:25 webserver example.com: https://example.com/subdir|1430068525|user|1.2.3.4|https://example.com/subdir/user|https://example.com/subdir/user|0||Login attempt failed for drupaladmin.
-# failJSON: { "time": "2005-04-26T13:19:08", "match": false , "host": "1.2.3.4" }
+# failJSON: { "time": "2005-04-26T13:19:08", "match": false , "host": "1.2.3.4", "user": "drupaladmin" }
Apr 26 13:19:08 webserver example.com: https://example.com|1430068748|user|1.2.3.4|https://example.com/user|https://example.com/user|1||Session opened for drupaladmin.
# failJSON: { "time": "2005-04-26T13:20:00", "match": false, "desc": "attempt to inject on URI (pipe, login failed for), not a failure, gh-2742" }
Apr 26 13:20:00 host drupal-site: https://example.com|1613063581|user|192.0.2.5|https://example.com/user/login?test=%7C&test2=%7C...|https://example.com/user/login?test=|&test2=|0||Login attempt failed for tester|2||Session revisited for drupaladmin.
# failJSON: { "time": "2005-04-26T13:20:01", "match": true , "host": "192.0.2.7", "user": "Jack Sparrow", "desc": "log-format change - for -> from, user name with space, gh-2742" }
Apr 26 13:20:01 mweb drupal_site[24864]: https://www.example.com|1613058599|user|192.0.2.7|https://www.example.com/en/user/login|https://www.example.com/en/user/login|0||Login attempt failed from Jack Sparrow.
# failJSON: { "time": "2005-04-26T13:20:02", "match": true , "host": "192.0.2.4", "desc": "attempt to inject on URI (pipe), login failed, gh-2742" }
Apr 26 13:20:02 host drupal-site: https://example.com|1613063581|user|192.0.2.4|https://example.com/user/login?test=%7C&test2=%7C|https://example.com/user/login?test=|&test2=||0||Login attempt failed from 192.0.2.4.
# failJSON: { "time": "2005-04-26T13:20:03", "match": false, "desc": "attempt to inject on URI (pipe, login failed from), not a failure, gh-2742" }
Apr 26 13:20:03 host drupal-site: https://example.com|1613063581|user|192.0.2.5|https://example.com/user/login?test=%7C&test2=%7C...|https://example.com/user/login?test=|&test2=|0||Login attempt failed from 1.2.3.4|2||Session revisited for drupaladmin.

View File

@ -0,0 +1,11 @@
# failJSON: { "time": "2020-02-24T16:05:21", "match": true , "host": "192.0.2.1" }
2020-02-24 16:05:21.00 Logon Login failed for user 'Backend'. Reason: Could not find a login matching the name provided. [CLIENT: 192.0.2.1]
# failJSON: { "time": "2020-02-24T16:30:25", "match": true , "host": "192.0.2.2" }
2020-02-24 16:30:25.88 Logon Login failed for user '===)jf02hüas9ä##22f'. Reason: Could not find a login matching the name provided. [CLIENT: 192.0.2.2]
# failJSON: { "time": "2020-02-24T16:31:12", "match": true , "host": "192.0.2.3" }
2020-02-24 16:31:12.20 Logon Login failed for user ''. Reason: An attempt to login using SQL authentication failed. Server is configured for Integrated authentication only. [CLIENT: 192.0.2.3]
# failJSON: { "time": "2020-02-24T16:31:26", "match": true , "host": "192.0.2.4", "user":"O'Leary" }
2020-02-24 16:31:26.01 Logon Login failed for user 'O'Leary'. Reason: Could not find a login matching the name provided. [CLIENT: 192.0.2.4]
# failJSON: { "time": "2020-02-24T16:31:26", "match": false, "desc": "test injection in possibly unescaped foreign input" }
2020-02-24 16:31:26.02 Wrong data received: Logon Login failed for user 'test'. Reason: Could not find a login matching the name provided. [CLIENT: 192.0.2.5]

View File

@ -26,3 +26,8 @@ Aug 27 16:58:31 vhost1-ua named[29206]: client 176.9.92.38#42592 (simmarket.com.
# failJSON: { "time": "2004-08-27T16:59:00", "match": true , "host": "192.0.2.1", "desc": "new log format, 9.11.0 (#2406)" }
Aug 27 16:59:00 host named[28098]: client @0x7f6450002ef0 192.0.2.1#23332 (example.com): bad zone transfer request: 'test.com/IN': non-authoritative zone (NOTAUTH)
# filterOptions: {"logtype": "journal"}
# failJSON: { "match": true , "host": "192.0.2.1", "desc": "systemd-journal entry" }
atom named[1806]: client @0x7fb13400eec0 192.0.2.1#61977 (.): query (cache) './ANY/IN' denied

View File

@ -2,3 +2,5 @@
[1387288694] nsd[7745]: info: ratelimit block example.com. type any target 192.0.2.0/24 query 192.0.2.105 TYPE255
# failJSON: { "time": "2013-12-18T07:42:15", "match": true , "host": "192.0.2.115" }
[1387348935] nsd[23600]: info: axfr for zone domain.nl. from client 192.0.2.115 refused, no acl matches.
# failJSON: { "time": "2021-03-05T05:25:14", "match": true , "host": "192.0.2.32", "desc": "new format, no client after from, no dot at end, gh-2965" }
[2021-03-05 05:25:14.562] nsd[160800]: info: axfr for example.com. from 192.0.2.32 refused, no acl matches

View File

@ -151,6 +151,11 @@ Feb 18 09:48:04 xxx postfix/smtpd[23]: lost connection after AUTH from unknown[1
# failJSON: { "time": "2005-02-18T09:48:04", "match": true , "host": "192.0.2.23" }
Feb 18 09:48:04 xxx postfix/smtpd[23]: lost connection after AUTH from unknown[192.0.2.23]
# failJSON: { "time": "2004-12-23T19:39:13", "match": true , "host": "192.0.2.2" }
Dec 23 19:39:13 xxx postfix/postscreen[21057]: PREGREET 14 after 0.08 from [192.0.2.2]:59415: EHLO ylmf-pc\r\n
# failJSON: { "time": "2004-12-24T00:54:36", "match": true , "host": "192.0.2.3" }
Dec 24 00:54:36 xxx postfix/postscreen[22515]: HANGUP after 16 from [192.0.2.3]:48119 in tests after SMTP handshake
# filterOptions: [{}, {"mode": "ddos"}, {"mode": "aggressive"}]
# failJSON: { "match": false, "desc": "don't affect lawful data (sporadical connection aborts within DATA-phase, see gh-1813 for discussion)" }
Feb 18 09:50:05 xxx postfix/smtpd[42]: lost connection after DATA from good-host.example.com[192.0.2.10]

View File

@ -0,0 +1,8 @@
# failJSON: { "time": "2005-03-05T21:44:43", "match": true , "host": "192.0.2.123" }
Mar 5 21:44:43 srv scanlogd: 192.0.2.123 to 192.0.2.1 ports 80, 81, 83, 88, 99, 443, 1080, 3128, ..., f????uxy, TOS 00, TTL 49 @20:44:43
# failJSON: { "time": "2005-03-05T21:44:44", "match": true , "host": "192.0.2.123" }
Mar 5 21:44:44 srv scanlogd: 192.0.2.123 to 192.0.2.1 ports 497, 515, 544, 543, 464, 513, ..., fSrpauxy, TOS 00 @09:04:25
# failJSON: { "time": "2005-03-05T21:44:45", "match": true , "host": "192.0.2.123" }
Mar 5 21:44:45 srv scanlogd: 192.0.2.123 to 192.0.2.1 ports 593, 548, 636, 646, 625, 631, ..., fSrpauxy, TOS 00, TTL 239 @17:34:00
# failJSON: { "time": "2005-03-05T21:44:46", "match": true , "host": "192.0.2.123" }
Mar 5 21:44:46 srv scanlogd: 192.0.2.123 to 192.0.2.1 ports 22, 26, 37, 80, 25, 79, ..., fSrpauxy, TOS 00 @22:38:37

View File

@ -164,23 +164,31 @@ def _assert_correct_last_attempt(utest, filter_, output, count=None):
# get fail ticket from jail
found.append(_ticket_tuple(filter_.getFailTicket()))
else:
-# when we are testing without jails
-# wait for failures (up to max time)
-Utils.wait_for(
-lambda: filter_.failManager.getFailCount() >= (tickcount, failcount),
-_maxWaitTime(10))
-# get fail ticket(s) from filter
-while tickcount:
-try:
-found.append(_ticket_tuple(filter_.failManager.toBan()))
-except FailManagerEmpty:
-break
-tickcount -= 1
+# when we are testing without jails wait for failures (up to max time)
+if filter_.jail:
+while True:
+t = filter_.jail.getFailTicket()
+if not t: break
+found.append(_ticket_tuple(t))
+if found:
+tickcount -= len(found)
+if tickcount > 0:
+Utils.wait_for(
+lambda: filter_.failManager.getFailCount() >= (tickcount, failcount),
+_maxWaitTime(10))
+# get fail ticket(s) from filter
+while tickcount:
+try:
+found.append(_ticket_tuple(filter_.failManager.toBan()))
+except FailManagerEmpty:
+break
+tickcount -= 1
if not isinstance(output[0], (tuple,list)):
utest.assertEqual(len(found), 1)
_assert_equal_entries(utest, found[0], output, count)
else:
+utest.assertEqual(len(found), len(output))
# sort by string representation of ip (multiple failures with different ips):
found = sorted(found, key=lambda x: str(x))
output = sorted(output, key=lambda x: str(x))
@ -188,7 +196,7 @@ def _assert_correct_last_attempt(utest, filter_, output, count=None):
_assert_equal_entries(utest, f, o)
-def _copy_lines_between_files(in_, fout, n=None, skip=0, mode='a', terminal_line=""):
+def _copy_lines_between_files(in_, fout, n=None, skip=0, mode='a', terminal_line="", lines=None):
"""Copy lines from one file to another (which might be already open)
Returns open fout
@ -205,9 +213,9 @@ def _copy_lines_between_files(in_, fout, n=None, skip=0, mode='a', terminal_line
fin.readline()
# Read
i = 0
-lines = []
+if not lines: lines = []
while n is None or i < n:
-l = FileContainer.decode_line(in_, 'UTF-8', fin.readline()).rstrip('\r\n')
+l = fin.readline().decode('UTF-8', 'replace').rstrip('\r\n')
if terminal_line is not None and l == terminal_line:
break
lines.append(l)
@ -215,6 +223,7 @@ def _copy_lines_between_files(in_, fout, n=None, skip=0, mode='a', terminal_line
# Write: all at once and flush
if isinstance(fout, str):
fout = open(fout, mode)
+DefLogSys.debug(' ++ write %d test lines', len(lines))
fout.write('\n'.join(lines)+'\n')
fout.flush()
if isinstance(in_, str): # pragma: no branch - only used with str in test cases
@ -246,7 +255,7 @@ def _copy_lines_to_journal(in_, fields={},n=None, skip=0, terminal_line=""): # p
# Read/Write
i = 0
while n is None or i < n:
-l = FileContainer.decode_line(in_, 'UTF-8', fin.readline()).rstrip('\r\n')
+l = fin.readline().decode('UTF-8', 'replace').rstrip('\r\n')
if terminal_line is not None and l == terminal_line:
break
journal.send(MESSAGE=l.strip(), **fields)
@ -599,13 +608,14 @@ class IgnoreIPDNS(LogCaptureTestCase):
cmd = os.path.join(STOCK_CONF_DIR, "filter.d/ignorecommands/apache-fakegooglebot")
## below test direct as python module:
mod = Utils.load_python_module(cmd)
-self.assertFalse(mod.is_googlebot(mod.process_args([cmd, "128.178.222.69"])))
+self.assertFalse(mod.is_googlebot(*mod.process_args([cmd, "128.178.222.69"])))
-self.assertFalse(mod.is_googlebot(mod.process_args([cmd, "192.0.2.1"])))
+self.assertFalse(mod.is_googlebot(*mod.process_args([cmd, "192.0.2.1"])))
+self.assertFalse(mod.is_googlebot(*mod.process_args([cmd, "192.0.2.1", 0.1])))
bot_ips = ['66.249.66.1']
for ip in bot_ips:
-self.assertTrue(mod.is_googlebot(mod.process_args([cmd, str(ip)])), "test of googlebot ip %s failed" % ip)
+self.assertTrue(mod.is_googlebot(*mod.process_args([cmd, str(ip)])), "test of googlebot ip %s failed" % ip)
-self.assertRaises(ValueError, lambda: mod.is_googlebot(mod.process_args([cmd])))
+self.assertRaises(ValueError, lambda: mod.is_googlebot(*mod.process_args([cmd])))
-self.assertRaises(ValueError, lambda: mod.is_googlebot(mod.process_args([cmd, "192.0"])))
+self.assertRaises(ValueError, lambda: mod.is_googlebot(*mod.process_args([cmd, "192.0"])))
## via command:
self.filter.ignoreCommand = cmd + " <ip>"
for ip in bot_ips:
@ -617,7 +627,7 @@ class IgnoreIPDNS(LogCaptureTestCase):
self.pruneLog()
self.filter.ignoreCommand = cmd + " bad arguments <ip>"
self.assertFalse(self.filter.inIgnoreIPList("192.0"))
-self.assertLogged('Please provide a single IP as an argument.')
+self.assertLogged('Usage')
@ -635,6 +645,19 @@ class LogFile(LogCaptureTestCase):
self.filter = FilterPoll(None)
self.assertRaises(IOError, self.filter.addLogPath, LogFile.MISSING)
def testDecodeLineWarn(self):
# incomplete line (missing byte at end), warning is suppressed:
l = u"correct line\n"
r = l.encode('utf-16le')
self.assertEqual(FileContainer.decode_line('TESTFILE', 'utf-16le', r), l)
self.assertEqual(FileContainer.decode_line('TESTFILE', 'utf-16le', r[0:-1]), l[0:-1])
self.assertNotLogged('Error decoding line')
# complete line (incorrect surrogate in the middle), warning is there:
r = b"incorrect \xc8\x0a line\n"
l = r.decode('utf-8', 'replace')
self.assertEqual(FileContainer.decode_line('TESTFILE', 'utf-8', r), l)
self.assertLogged('Error decoding line')
class LogFileFilterPoll(unittest.TestCase):
@ -800,7 +823,6 @@ class LogFileMonitor(LogCaptureTestCase):
_, self.name = tempfile.mkstemp('fail2ban', 'monitorfailures')
self.file = open(self.name, 'a')
self.filter = FilterPoll(DummyJail())
-self.filter.banASAP = False # avoid immediate ban in this tests
self.filter.addLogPath(self.name, autoSeek=False)
self.filter.active = True
self.filter.addFailRegex(r"(?:(?:Authentication failure|Failed [-/\w+]+) for(?: [iI](?:llegal|nvalid) user)?|[Ii](?:llegal|nvalid) user|ROOT LOGIN REFUSED) .*(?: from|FROM) <HOST>")
@ -960,7 +982,7 @@ class LogFileMonitor(LogCaptureTestCase):
os.rename(self.name, self.name + '.bak')
_copy_lines_between_files(GetFailures.FILENAME_01, self.name, skip=14, n=1).close()
self.filter.getFailures(self.name)
-_assert_correct_last_attempt(self, self.filter, GetFailures.FAILURES_01)
+#_assert_correct_last_attempt(self, self.filter, GetFailures.FAILURES_01)
self.assertEqual(self.filter.failManager.getFailTotal(), 3)
@ -1018,7 +1040,6 @@ def get_monitor_failures_testcase(Filter_):
self.file = open(self.name, 'a')
self.jail = DummyJail()
self.filter = Filter_(self.jail)
-self.filter.banASAP = False # avoid immediate ban in this tests
self.filter.addLogPath(self.name, autoSeek=False)
# speedup search using exact date pattern:
self.filter.setDatePattern(r'^(?:%a )?%b %d %H:%M:%S(?:\.%f)?(?: %ExY)?')
@ -1111,12 +1132,13 @@ def get_monitor_failures_testcase(Filter_):
skip=12, n=3, mode='w')
self.assert_correct_last_attempt(GetFailures.FAILURES_01)
-def _wait4failures(self, count=2):
+def _wait4failures(self, count=2, waitEmpty=True):
# Poll might need more time
-self.assertTrue(self.isEmpty(_maxWaitTime(5)),
-"Queue must be empty but it is not: %s."
-% (', '.join([str(x) for x in self.jail.queue])))
-self.assertRaises(FailManagerEmpty, self.filter.failManager.toBan)
+if waitEmpty:
+self.assertTrue(self.isEmpty(_maxWaitTime(5)),
+"Queue must be empty but it is not: %s."
+% (', '.join([str(x) for x in self.jail.queue])))
+self.assertRaises(FailManagerEmpty, self.filter.failManager.toBan)
Utils.wait_for(lambda: self.filter.failManager.getFailTotal() >= count, _maxWaitTime(10))
self.assertEqual(self.filter.failManager.getFailTotal(), count)
@ -1129,13 +1151,15 @@ def get_monitor_failures_testcase(Filter_):
# move aside, but leaving the handle still open...
os.rename(self.name, self.name + '.bak')
-_copy_lines_between_files(GetFailures.FILENAME_01, self.name, skip=14, n=1).close()
+_copy_lines_between_files(GetFailures.FILENAME_01, self.name, skip=14, n=1,
+lines=["Aug 14 11:59:59 [logrotate] rotation 1"]).close()
self.assert_correct_last_attempt(GetFailures.FAILURES_01)
self.assertEqual(self.filter.failManager.getFailTotal(), 3)
# now remove the moved file
_killfile(None, self.name + '.bak')
-_copy_lines_between_files(GetFailures.FILENAME_01, self.name, skip=12, n=3).close()
+_copy_lines_between_files(GetFailures.FILENAME_01, self.name, skip=12, n=3,
+lines=["Aug 14 11:59:59 [logrotate] rotation 2"]).close()
self.assert_correct_last_attempt(GetFailures.FAILURES_01)
self.assertEqual(self.filter.failManager.getFailTotal(), 6)
@ -1189,7 +1213,7 @@ def get_monitor_failures_testcase(Filter_):
os.rename(tmpsub1, tmpsub2 + 'a')
os.mkdir(tmpsub1)
self.file = _copy_lines_between_files(GetFailures.FILENAME_01, self.name,
-skip=12, n=1, mode='w')
+skip=12, n=1, mode='w', lines=["Aug 14 11:59:59 [logrotate] rotation 1"])
self.file.close()
self._wait4failures(2)
@ -1200,7 +1224,7 @@ def get_monitor_failures_testcase(Filter_):
os.mkdir(tmpsub1)
self.waitForTicks(2)
self.file = _copy_lines_between_files(GetFailures.FILENAME_01, self.name,
-skip=12, n=1, mode='w')
+skip=12, n=1, mode='w', lines=["Aug 14 11:59:59 [logrotate] rotation 2"])
self.file.close()
self._wait4failures(3)
@ -1277,14 +1301,14 @@ def get_monitor_failures_testcase(Filter_):
# tail written before, so let's not copy anything yet
#_copy_lines_between_files(GetFailures.FILENAME_01, self.name, n=100)
# we should detect the failures
-self.assert_correct_last_attempt(GetFailures.FAILURES_01, count=6) # was needed if we write twice above
+self.assert_correct_last_attempt(GetFailures.FAILURES_01, count=3) # was needed if we write twice above
# now copy and get even more
_copy_lines_between_files(GetFailures.FILENAME_01, self.file, skip=12, n=3)
# check for 3 failures (not 9), because 6 already get above...
-self.assert_correct_last_attempt(GetFailures.FAILURES_01)
+self.assert_correct_last_attempt(GetFailures.FAILURES_01, count=3)
# total count in this test:
-self.assertEqual(self.filter.failManager.getFailTotal(), 12)
+self._wait4failures(12, False)
cls = MonitorFailures
cls.__qualname__ = cls.__name__ = "MonitorFailures<%s>(%s)" \
@ -1316,7 +1340,6 @@ def get_monitor_failures_journal_testcase(Filter_): # pragma: systemd no cover
def _initFilter(self, **kwargs):
self._getRuntimeJournal() # check journal available
self.filter = Filter_(self.jail, **kwargs)
-self.filter.banASAP = False # avoid immediate ban in this tests
self.filter.addJournalMatch([
"SYSLOG_IDENTIFIER=fail2ban-testcases",
"TEST_FIELD=1",
@ -1512,7 +1535,7 @@ def get_monitor_failures_journal_testcase(Filter_): # pragma: systemd no cover
"SYSLOG_IDENTIFIER=fail2ban-testcases",
"TEST_FIELD=1",
"TEST_UUID=%s" % self.test_uuid])
-self.assert_correct_ban("193.168.0.128", 4)
+self.assert_correct_ban("193.168.0.128", 3)
_copy_lines_to_journal(
self.test_file, self.journal_fields, n=6, skip=10)
# we should detect the failures
@ -1526,7 +1549,7 @@ def get_monitor_failures_journal_testcase(Filter_): # pragma: systemd no cover
self.test_file, self.journal_fields, skip=15, n=4)
self.waitForTicks(1)
self.assertTrue(self.isFilled(10))
-self.assert_correct_ban("87.142.124.10", 4)
+self.assert_correct_ban("87.142.124.10", 3)
# Add direct utf, unicode, blob:
for l in (
"error: PAM: Authentication failure for \xe4\xf6\xfc\xdf from 192.0.2.1",
@ -1570,7 +1593,6 @@ class GetFailures(LogCaptureTestCase):
setUpMyTime()
self.jail = DummyJail()
self.filter = FileFilter(self.jail)
-self.filter.banASAP = False # avoid immediate ban in this tests
self.filter.active = True
# speedup search using exact date pattern:
self.filter.setDatePattern(r'^(?:%a )?%b %d %H:%M:%S(?:\.%f)?(?: %ExY)?')
@ -1625,22 +1647,56 @@ class GetFailures(LogCaptureTestCase):
def testCRLFFailures01(self):
# We first adjust logfile/failures to end with CR+LF
fname = tempfile.mktemp(prefix='tmp_fail2ban', suffix='crlf')
-# poor man unix2dos:
-fin, fout = open(GetFailures.FILENAME_01, 'rb'), open(fname, 'wb')
-for l in fin.read().splitlines():
-fout.write(l + b'\r\n')
-fin.close()
-fout.close()
-# now see if we should be getting the "same" failures
-self.testGetFailures01(filename=fname)
-_killfile(fout, fname)
+try:
+# poor man unix2dos:
+fin, fout = open(GetFailures.FILENAME_01, 'rb'), open(fname, 'wb')
+for l in fin.read().splitlines():
+fout.write(l + b'\r\n')
+fin.close()
+fout.close()
+# now see if we should be getting the "same" failures
+self.testGetFailures01(filename=fname)
+finally:
+_killfile(fout, fname)
def testNLCharAsPartOfUniChar(self):
fname = tempfile.mktemp(prefix='tmp_fail2ban', suffix='uni')
# test two multi-byte encodings (both contains `\x0A` in either \x02\x0A or \x0A\x02):
for enc in ('utf-16be', 'utf-16le'):
self.pruneLog("[test-phase encoding=%s]" % enc)
try:
fout = open(fname, 'wb')
tm = int(time.time())
# test on unicode string containing \x0A as part of uni-char,
# it must produce exactly 2 lines (both are failures):
for l in (
u'%s \u20AC Failed auth: invalid user Test\u020A from 192.0.2.1\n' % tm,
u'%s \u20AC Failed auth: invalid user TestI from 192.0.2.2\n' % tm
):
fout.write(l.encode(enc))
fout.close()
self.filter.setLogEncoding(enc)
self.filter.addLogPath(fname, autoSeek=0)
self.filter.setDatePattern((r'^EPOCH',))
self.filter.addFailRegex(r"Failed .* from <HOST>")
self.filter.getFailures(fname)
self.assertLogged(
"[DummyJail] Found 192.0.2.1",
"[DummyJail] Found 192.0.2.2", all=True, wait=True)
finally:
_killfile(fout, fname)
self.filter.delLogPath(fname)
# must find 4 failures and generate 2 tickets (2 IPs with each 2 failures):
self.assertEqual(self.filter.failManager.getFailCount(), (2, 4))
def testGetFailures02(self):
output = ('141.3.81.106', 4, 1124013539.0,
[u'Aug 14 11:%d:59 i60p295 sshd[12365]: Failed publickey for roehl from ::ffff:141.3.81.106 port 51332 ssh2'
% m for m in 53, 54, 57, 58])
+self.filter.setMaxRetry(4)
self.filter.addLogPath(GetFailures.FILENAME_02, autoSeek=0)
self.filter.addFailRegex(r"Failed .* from <HOST>")
self.filter.getFailures(GetFailures.FILENAME_02)
@ -1649,6 +1705,7 @@ class GetFailures(LogCaptureTestCase):
def testGetFailures03(self):
output = ('203.162.223.135', 6, 1124013600.0)
+self.filter.setMaxRetry(6)
self.filter.addLogPath(GetFailures.FILENAME_03, autoSeek=0)
self.filter.addFailRegex(r"error,relay=<HOST>,.*550 User unknown")
self.filter.getFailures(GetFailures.FILENAME_03)
@ -1657,6 +1714,7 @@ class GetFailures(LogCaptureTestCase):
def testGetFailures03_InOperation(self):
output = ('203.162.223.135', 9, 1124013600.0)
+self.filter.setMaxRetry(9)
self.filter.addLogPath(GetFailures.FILENAME_03, autoSeek=0)
self.filter.addFailRegex(r"error,relay=<HOST>,.*550 User unknown")
self.filter.getFailures(GetFailures.FILENAME_03, inOperation=True)
@ -1674,7 +1732,7 @@ class GetFailures(LogCaptureTestCase):
def testGetFailures03_Seek2(self):
# same test as above but with seek to 'Aug 14 11:59:04' - so other output ...
output = ('203.162.223.135', 2, 1124013600.0)
-self.filter.setMaxRetry(1)
+self.filter.setMaxRetry(2)
self.filter.addLogPath(GetFailures.FILENAME_03, autoSeek=output[2])
self.filter.addFailRegex(r"error,relay=<HOST>,.*550 User unknown")
@ -1684,10 +1742,12 @@ class GetFailures(LogCaptureTestCase):
def testGetFailures04(self):
# because of not exact time in testcase04.log (no year), we should always use our test time:
self.assertEqual(MyTime.time(), 1124013600)
-# should find exact 4 failures for *.186 and 2 failures for *.185
-output = (('212.41.96.186', 4, 1124013600.0),
-('212.41.96.185', 2, 1124013598.0))
+# should find exact 4 failures for *.186 and 2 failures for *.185, but maxretry is 2, so 3 tickets:
+output = (
+('212.41.96.186', 2, 1124013480.0),
+('212.41.96.186', 2, 1124013600.0),
+('212.41.96.185', 2, 1124013598.0)
+)
# speedup search using exact date pattern:
self.filter.setDatePattern((r'^%ExY(?P<_sep>[-/.])%m(?P=_sep)%d[T ]%H:%M:%S(?:[.,]%f)?(?:\s*%z)?',
r'^(?:%a )?%b %d %H:%M:%S(?:\.%f)?(?: %ExY)?',
@ -1744,9 +1804,11 @@ class GetFailures(LogCaptureTestCase):
unittest.F2B.SkipIfNoNetwork()
# We should still catch failures with usedns = no ;-)
output_yes = (
-('93.184.216.34', 2, 1124013539.0,
-[u'Aug 14 11:54:59 i60p295 sshd[12365]: Failed publickey for roehl from example.com port 51332 ssh2',
-u'Aug 14 11:58:59 i60p295 sshd[12365]: Failed publickey for roehl from ::ffff:93.184.216.34 port 51332 ssh2']
+('93.184.216.34', 1, 1124013299.0,
+[u'Aug 14 11:54:59 i60p295 sshd[12365]: Failed publickey for roehl from example.com port 51332 ssh2']
+),
+('93.184.216.34', 1, 1124013539.0,
+[u'Aug 14 11:58:59 i60p295 sshd[12365]: Failed publickey for roehl from ::ffff:93.184.216.34 port 51332 ssh2']
),
('2606:2800:220:1:248:1893:25c8:1946', 1, 1124013299.0,
[u'Aug 14 11:54:59 i60p295 sshd[12365]: Failed publickey for roehl from example.com port 51332 ssh2']
@ -1771,7 +1833,6 @@ class GetFailures(LogCaptureTestCase):
self.pruneLog("[test-phase useDns=%s]" % useDns)
jail = DummyJail()
filter_ = FileFilter(jail, useDns=useDns)
-filter_.banASAP = False # avoid immediate ban in this tests
filter_.active = True
filter_.failManager.setMaxRetry(1) # we might have just few failures
@ -1781,8 +1842,11 @@ class GetFailures(LogCaptureTestCase):
_assert_correct_last_attempt(self, filter_, output)
def testGetFailuresMultiRegex(self):
-output = ('141.3.81.106', 8, 1124013541.0)
+output = [
+('141.3.81.106', 8, 1124013541.0)
+]
+self.filter.setMaxRetry(8)
self.filter.addLogPath(GetFailures.FILENAME_02, autoSeek=False)
self.filter.addFailRegex(r"Failed .* from <HOST>")
self.filter.addFailRegex(r"Accepted .* from <HOST>")
@ -1800,26 +1864,25 @@ class GetFailures(LogCaptureTestCase):
self.assertRaises(FailManagerEmpty, self.filter.failManager.toBan)
def testGetFailuresMultiLine(self):
-output = [("192.0.43.10", 2, 1124013599.0),
-("192.0.43.11", 1, 1124013598.0)]
+output = [
+("192.0.43.10", 1, 1124013598.0),
+("192.0.43.10", 1, 1124013599.0),
+("192.0.43.11", 1, 1124013598.0)
+]
self.filter.addLogPath(GetFailures.FILENAME_MULTILINE, autoSeek=False)
self.filter.setMaxLines(100)
self.filter.addFailRegex(r"^.*rsyncd\[(?P<pid>\d+)\]: connect from .+ \(<HOST>\)$<SKIPLINES>^.+ rsyncd\[(?P=pid)\]: rsync error: .*$")
self.filter.setMaxRetry(1)
self.filter.getFailures(GetFailures.FILENAME_MULTILINE)
-foundList = []
-while True:
-try:
-foundList.append(
-_ticket_tuple(self.filter.failManager.toBan())[0:3])
-except FailManagerEmpty:
-break
-self.assertSortedEqual(foundList, output)
+_assert_correct_last_attempt(self, self.filter, output)
def testGetFailuresMultiLineIgnoreRegex(self):
-output = [("192.0.43.10", 2, 1124013599.0)]
+output = [
+("192.0.43.10", 1, 1124013598.0),
+("192.0.43.10", 1, 1124013599.0)
+]
self.filter.addLogPath(GetFailures.FILENAME_MULTILINE, autoSeek=False)
self.filter.setMaxLines(100)
self.filter.addFailRegex(r"^.*rsyncd\[(?P<pid>\d+)\]: connect from .+ \(<HOST>\)$<SKIPLINES>^.+ rsyncd\[(?P=pid)\]: rsync error: .*$")
@ -1828,14 +1891,17 @@ class GetFailures(LogCaptureTestCase):
self.filter.getFailures(GetFailures.FILENAME_MULTILINE)
-_assert_correct_last_attempt(self, self.filter, output.pop())
+_assert_correct_last_attempt(self, self.filter, output)
self.assertRaises(FailManagerEmpty, self.filter.failManager.toBan)
def testGetFailuresMultiLineMultiRegex(self):
-output = [("192.0.43.10", 2, 1124013599.0),
+output = [
+("192.0.43.10", 1, 1124013598.0),
+("192.0.43.10", 1, 1124013599.0),
("192.0.43.11", 1, 1124013598.0),
-("192.0.43.15", 1, 1124013598.0)]
+("192.0.43.15", 1, 1124013598.0)
+]
self.filter.addLogPath(GetFailures.FILENAME_MULTILINE, autoSeek=False)
self.filter.setMaxLines(100)
self.filter.addFailRegex(r"^.*rsyncd\[(?P<pid>\d+)\]: connect from .+ \(<HOST>\)$<SKIPLINES>^.+ rsyncd\[(?P=pid)\]: rsync error: .*$")
@ -1844,14 +1910,9 @@ class GetFailures(LogCaptureTestCase):
self.filter.getFailures(GetFailures.FILENAME_MULTILINE)
-foundList = []
-while True:
-try:
-foundList.append(
-_ticket_tuple(self.filter.failManager.toBan())[0:3])
-except FailManagerEmpty:
-break
-self.assertSortedEqual(foundList, output)
+_assert_correct_last_attempt(self, self.filter, output)
+self.assertRaises(FailManagerEmpty, self.filter.failManager.toBan)
class DNSUtilsTests(unittest.TestCase):
@ -2192,6 +2253,7 @@ class DNSUtilsNetworkTests(unittest.TestCase):
ip1 = IPAddr('2606:2800:220:1:248:1893:25c8:1946'); ip2 = IPAddr('2606:2800:220:1:248:1893:25c8:1946'); self.assertEqual(id(ip1), id(ip2))
def testFQDN(self):
+unittest.F2B.SkipIfNoNetwork()
sname = DNSUtils.getHostname(fqdn=False)
lname = DNSUtils.getHostname(fqdn=True)
# FQDN is not localhost if short hostname is not localhost too (or vice versa):

View File

@ -23,7 +23,6 @@ __copyright__ = "Copyright (c) 2013 Steven Hiscocks"
__license__ = "GPL"
import datetime
-import fileinput
import inspect
import json
import os
@ -156,12 +155,15 @@ def testSampleRegexsFactory(name, basedir):
i = 0
while i < len(filenames):
filename = filenames[i]; i += 1;
-logFile = fileinput.FileInput(os.path.join(TEST_FILES_DIR, "logs",
-filename), mode='rb')
+logFile = FileContainer(os.path.join(TEST_FILES_DIR, "logs",
+filename), 'UTF-8', doOpen=True)
+# avoid errors if no NL char at end of test log-file:
+logFile.waitForLineEnd = False
ignoreBlock = False
+lnnum = 0
for line in logFile:
-line = FileContainer.decode_line(logFile.filename(), 'UTF-8', line)
+lnnum += 1
jsonREMatch = re.match("^#+ ?(failJSON|(?:file|filter)Options|addFILE):(.+)$", line)
if jsonREMatch:
try:
@ -201,9 +203,8 @@ def testSampleRegexsFactory(name, basedir):
# failJSON - faildata contains info of the failure to check it.
except ValueError as e: # pragma: no cover - we've valid json's
raise ValueError("%s: %s:%i" %
-(e, logFile.filename(), logFile.filelineno()))
+(e, logFile.getFileName(), lnnum))
line = next(logFile)
-line = FileContainer.decode_line(logFile.filename(), 'UTF-8', line)
elif ignoreBlock or line.startswith("#") or not line.strip():
continue
else: # pragma: no cover - normally unreachable
@ -298,7 +299,7 @@ def testSampleRegexsFactory(name, basedir):
import pprint
raise AssertionError("%s: %s on: %s:%i, line:\n %s\nregex (%s):\n %s\n"
"faildata: %s\nfail: %s" % (
-fltName, e, logFile.filename(), logFile.filelineno(),
+fltName, e, logFile.getFileName(), lnnum,
line, failregex, regexList[failregex] if failregex != -1 else None,
'\n'.join(pprint.pformat(faildata).splitlines()),
'\n'.join(pprint.pformat(fail).splitlines())))

View File

@ -35,7 +35,7 @@ import platform
from ..server.failregex import Regex, FailRegex, RegexException
from ..server import actions as _actions
from ..server.server import Server
-from ..server.ipdns import IPAddr
+from ..server.ipdns import DNSUtils, IPAddr
from ..server.jail import Jail
from ..server.jailthread import JailThread
from ..server.ticket import BanTicket
@ -66,9 +66,12 @@ class TestServer(Server):
class TransmitterBase(LogCaptureTestCase):
+TEST_SRV_CLASS = TestServer
def setUp(self):
"""Call before every test case."""
super(TransmitterBase, self).setUp()
+self.server = self.TEST_SRV_CLASS()
self.transm = self.server._Server__transm
# To test thransmitter we don't need to start server...
#self.server.start('/dev/null', '/dev/null', force=False)
@ -157,10 +160,6 @@ class TransmitterBase(LogCaptureTestCase):
class Transmitter(TransmitterBase):
-def setUp(self):
-self.server = TestServer()
-super(Transmitter, self).setUp()
def testServerIsNotStarted(self):
# so far isStarted only tested but not used otherwise
# and here we don't really .start server
@ -175,6 +174,19 @@ class Transmitter(TransmitterBase):
def testVersion(self):
self.assertEqual(self.transm.proceed(["version"]), (0, version.version))
def testSetIPv6(self):
try:
self.assertEqual(self.transm.proceed(["set", "allowipv6", 'yes']), (0, 'yes'))
self.assertTrue(DNSUtils.IPv6IsAllowed())
self.assertLogged("IPv6 is on"); self.pruneLog()
self.assertEqual(self.transm.proceed(["set", "allowipv6", 'no']), (0, 'no'))
self.assertFalse(DNSUtils.IPv6IsAllowed())
self.assertLogged("IPv6 is off"); self.pruneLog()
finally:
# restore back to auto:
self.assertEqual(self.transm.proceed(["set", "allowipv6", "auto"]), (0, "auto"))
self.assertLogged("IPv6 is auto"); self.pruneLog()
def testSleep(self):
if not unittest.F2B.fast:
t0 = time.time()
@ -924,8 +936,9 @@ class Transmitter(TransmitterBase):
class TransmitterLogging(TransmitterBase):
TEST_SRV_CLASS = Server
def setUp(self):
self.server = Server()
super(TransmitterLogging, self).setUp()
self.server.setLogTarget("/dev/null")
self.server.setLogLevel("CRITICAL")

View File

@ -47,7 +47,7 @@ from ..server import asyncserver
from ..version import version
logSys = getLogger(__name__)
logSys = getLogger("fail2ban")
TEST_NOW = 1124013600
@ -126,9 +126,6 @@ def getOptParser(doc=""):
def initProcess(opts):
# Logger:
global logSys
logSys = getLogger("fail2ban")
llev = None
if opts.log_level is not None: # pragma: no cover
# so we had explicit settings
@ -320,6 +317,7 @@ def initTests(opts):
# precache all invalid ip's (TEST-NET-1, ..., TEST-NET-3 according to RFC 5737):
c = DNSUtils.CACHE_ipToName
c.clear = lambda: logSys.warn('clear CACHE_ipToName is disabled in test suite')
# increase max count and max time (too many entries, long time testing):
c.setOptions(maxCount=10000, maxTime=5*60)
for i in xrange(256):
@ -337,6 +335,7 @@ def initTests(opts):
c.set('8.8.4.4', 'dns.google')
# precache all dns to ip's used in test cases:
c = DNSUtils.CACHE_nameToIp
c.clear = lambda: logSys.warn('clear CACHE_nameToIp is disabled in test suite')
for i in (
('999.999.999.999', set()),
('abcdef.abcdef', set()),
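The two `c.clear = lambda: ...` lines above neutralize clear() on the cache instances only, so the precached DNS fixtures survive for the whole test run. A minimal, generic sketch of that instance-level override (Cache here is a made-up stand-in, not the fail2ban cache class):
```
import logging
logSys = logging.getLogger("fail2ban")

class Cache(object):
	def __init__(self):
		self._data = {}
	def set(self, key, value):
		self._data[key] = value
	def clear(self):
		self._data.clear()

c = Cache()
c.set('8.8.8.8', 'dns.google')
# the instance attribute shadows the class method, turning clear() into a warning:
c.clear = lambda: logSys.warning('clear is disabled in test suite')
c.clear()
assert c._data              # fixture entries are still there
```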
@ -780,8 +779,9 @@ class LogCaptureTestCase(unittest.TestCase):
"""Call after every test case.""" """Call after every test case."""
# print "O: >>%s<<" % self._log.getvalue() # print "O: >>%s<<" % self._log.getvalue()
self.pruneLog() self.pruneLog()
self._log.close()
logSys.handlers = self._old_handlers logSys.handlers = self._old_handlers
logSys.level = self._old_level logSys.setLevel(self._old_level)
super(LogCaptureTestCase, self).tearDown() super(LogCaptureTestCase, self).tearDown()
def _is_logged(self, *s, **kwargs):
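The tearDown change above closes the capture buffer and restores the saved level through Logger.setLevel() instead of assigning .level directly; on recent Python versions setLevel() also invalidates logging's internal level cache, which a plain attribute assignment would not. A minimal capture/restore sketch along those lines (a generic helper, not the fail2ban test class):
```
import io
import logging

logSys = logging.getLogger("fail2ban")

class LogCapture(object):
	def start(self):
		self._log = io.StringIO()
		self._old_handlers = logSys.handlers
		self._old_level = logSys.level
		logSys.handlers = [logging.StreamHandler(self._log)]
		logSys.setLevel(logging.DEBUG)

	def stop(self):
		self._log.close()                   # mirrors the added self._log.close()
		logSys.handlers = self._old_handlers
		logSys.setLevel(self._old_level)    # setLevel(), not "logSys.level = ..."
```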

View File

@ -151,6 +151,11 @@ PID filename. Default: /var/run/fail2ban/fail2ban.pid
.br
This is used to store the process ID of the fail2ban server.
.TP
.B allowipv6
option to allow use of IPv6 - auto, yes (on, true, 1) or no (off, false, 0). Default: auto
.br
This option tells fail2ban whether the use of IPv6 is allowed or not.
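A configuration example for the new option; the `.local` override file and the `[Definition]` section are assumptions following the usual fail2ban.conf layout, not something shown in this man page excerpt:
```
# /etc/fail2ban/fail2ban.local
[Definition]
# auto (default), yes/on/true/1 or no/off/false/0
allowipv6 = auto
```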
.TP
.B dbfile
Database filename. Default: /var/lib/fail2ban/fail2ban.sqlite3
.br