mirror of https://github.com/jumpserver/jumpserver
commit 8cf8a3701b
@@ -1,11 +1,35 @@
---
name: 需求建议
about: 提出针对本项目的想法和建议
title: "[Feature] "
title: "[Feature] 需求标题"
labels: 类型:需求
assignees:
  - ibuler
  - baijiangjie
---

**请描述您的需求或者改进建议.**
## 注意
_针对过于简单的需求描述不予考虑。请确保提供足够的细节和信息以支持功能的开发和实现。_

## 功能名称
[在这里输入功能的名称或标题]

## 功能描述
[在这里描述该功能的详细内容,包括其作用、目的和所需的功能]

## 用户故事(可选)
[如果适用,可以提供用户故事来更好地理解该功能的使用场景和用户期望]

## 功能要求
- [要求1:描述该功能的具体要求,如界面设计、交互逻辑等]
- [要求2:描述该功能的另一个具体要求]
- [以此类推,列出所有相关的功能要求]

## 示例或原型(可选)
[如果有的话,提供该功能的示例或原型图以更好地说明功能的实现方式]

## 优先级
[描述该功能的优先级,如高、中、低,或使用数字等其他标识]

## 备注(可选)
[在这里添加任何其他相关信息或备注]

@@ -1,22 +1,51 @@
---
name: Bug 提交
about: 提交产品缺陷帮助我们更好的改进
title: "[Bug] "
title: "[Bug] Bug 标题"
labels: 类型:Bug
assignees:
  - baijiangjie
---

**JumpServer 版本( v2.28 之前的版本不再支持 )**
## 注意
**JumpServer 版本( v2.28 之前的版本不再支持 )** <br>
_针对过于简单的 Bug 描述不予考虑。请确保提供足够的细节和信息以支持 Bug 的复现和修复。_

## 当前使用的 JumpServer 版本 (必填)
[在这里输入当前使用的 JumpServer 的版本号]

## 使用的版本类型 (必填)
- [ ] 社区版
- [ ] 企业版
- [ ] 企业试用版

**浏览器版本**
## 版本安装方式 (必填)
- [ ] 在线安装 (一键命令)
- [ ] 离线安装 (下载离线包)
- [ ] All-in-One
- [ ] 1Panel 安装
- [ ] Kubernetes 安装
- [ ] 源码安装

## Bug 描述 (详细)
[在这里描述 Bug 的详细情况,包括其影响和出现的具体情况]

**Bug 描述**
## 复现步骤
1. [描述如何复现 Bug 的第一步]
2. [描述如何复现 Bug 的第二步]
3. [以此类推,列出所有复现 Bug 所需的步骤]

## 期望行为
[描述 Bug 出现时期望的系统行为或结果]

**Bug 重现步骤(有截图更好)**
1.
2.
3.
## 实际行为
[描述实际上发生了什么,以及 Bug 出现的具体情况]

## 系统环境
- 操作系统:[例如:Windows 10, macOS Big Sur]
- 浏览器/应用版本:[如果适用,请提供相关版本信息]
- 其他相关环境信息:[如果有其他相关环境信息,请在此处提供]

## 附加信息(可选)
[在这里添加任何其他相关信息,如截图、错误信息等]

@@ -1,10 +1,50 @@
---
name: 问题咨询
about: 提出针对本项目安装部署、使用及其他方面的相关问题
title: "[Question] "
title: "[Question] 问题标题"
labels: 类型:提问
assignees:
  - baijiangjie
---
## 注意
**请描述您的问题.** <br>
**JumpServer 版本( v2.28 之前的版本不再支持 )** <br>
_针对过于简单的 Bug 描述不予考虑。请确保提供足够的细节和信息以支持 Bug 的复现和修复。_

## 当前使用的 JumpServer 版本 (必填)
[在这里输入当前使用的 JumpServer 的版本号]

## 使用的版本类型 (必填)
- [ ] 社区版
- [ ] 企业版
- [ ] 企业试用版

## 版本安装方式 (必填)
- [ ] 在线安装 (一键命令)
- [ ] 离线安装 (下载离线包)
- [ ] All-in-One
- [ ] 1Panel 安装
- [ ] Kubernetes 安装
- [ ] 源码安装

## 问题描述 (详细)
[在这里描述你遇到的问题]

## 背景信息
- 操作系统:[例如:Windows 10, macOS Big Sur]
- 浏览器/应用版本:[如果适用,请提供相关版本信息]
- 其他相关环境信息:[如果有其他相关环境信息,请在此处提供]

## 具体问题
[在这里详细描述你的问题,包括任何相关细节或错误信息]

## 尝试过的解决方法
[如果你已经尝试过解决问题,请在这里列出你已经尝试过的解决方法]

## 预期结果
[描述你期望的解决方案或结果]

## 我们的期望
[描述你希望我们提供的帮助或支持]

**请描述您的问题.**

@@ -43,3 +43,4 @@ releashe
data/*
test.py
.history/
.test/

@@ -111,8 +111,17 @@ RUN --mount=type=cache,target=/var/cache/apt,sharing=locked,id=core-apt \
    && sed -i "s@# export @export @g" ~/.bashrc \
    && sed -i "s@# alias @alias @g" ~/.bashrc

ARG RECEPTOR_VERSION=v1.4.5
RUN set -ex \
    && wget -O /opt/receptor.tar.gz https://github.com/ansible/receptor/releases/download/${RECEPTOR_VERSION}/receptor_${RECEPTOR_VERSION/v/}_linux_${TARGETARCH}.tar.gz \
    && tar -xf /opt/receptor.tar.gz -C /usr/local/bin/ \
    && chown root:root /usr/local/bin/receptor \
    && chmod 755 /usr/local/bin/receptor \
    && rm -f /opt/receptor.tar.gz

COPY --from=stage-2 /opt/py3 /opt/py3
COPY --from=stage-1 /opt/jumpserver/release/jumpserver /opt/jumpserver
COPY --from=stage-1 /opt/jumpserver/release/jumpserver/apps/libs/ansible/ansible.cfg /etc/ansible/

WORKDIR /opt/jumpserver

@@ -85,7 +85,7 @@ If you find a security problem, please contact us directly:
- 400-052-0755

### License & Copyright
Copyright (c) 2014-2022 FIT2CLOUD Tech, Inc., All rights reserved.
Copyright (c) 2014-2024 FIT2CLOUD Tech, Inc., All rights reserved.

Licensed under The GNU General Public License version 3 (GPLv3) (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at

@@ -18,9 +18,8 @@ __all__ = [

class AccountBackupPlanViewSet(OrgBulkModelViewSet):
    model = AccountBackupAutomation
    filter_fields = ('name',)
    search_fields = filter_fields
    ordering = ('name',)
    filterset_fields = ('name',)
    search_fields = filterset_fields
    serializer_class = serializers.AccountBackupSerializer

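This hunk, and several of the API hunks below, rename `filter_fields` to `filterset_fields`, which is the attribute django-filter's `DjangoFilterBackend` actually reads. A schematic sketch of the pattern (the `DemoViewSet`, queryset and serializer are placeholders, not code from this commit):

    from django_filters.rest_framework import DjangoFilterBackend
    from rest_framework import filters, viewsets

    class DemoViewSet(viewsets.ModelViewSet):
        # Placeholders: point these at a real model queryset and serializer.
        queryset = ...
        serializer_class = ...
        filter_backends = [DjangoFilterBackend, filters.SearchFilter]
        filterset_fields = ('name',)          # read by DjangoFilterBackend
        search_fields = filterset_fields      # reused for ?search= lookups
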
@@ -20,8 +20,8 @@ __all__ = [
class AutomationAssetsListApi(generics.ListAPIView):
    model = BaseAutomation
    serializer_class = serializers.AutomationAssetsSerializer
    filter_fields = ("name", "address")
    search_fields = filter_fields
    filterset_fields = ("name", "address")
    search_fields = filterset_fields

    def get_object(self):
        pk = self.kwargs.get('pk')

@@ -6,9 +6,12 @@ from rest_framework.response import Response

from accounts import serializers
from accounts.const import AutomationTypes
from accounts.filters import ChangeSecretRecordFilterSet
from accounts.models import ChangeSecretAutomation, ChangeSecretRecord
from accounts.tasks import execute_automation_record_task
from authentication.permissions import UserConfirmation, ConfirmType
from orgs.mixins.api import OrgBulkModelViewSet, OrgGenericViewSet
from rbac.permissions import RBACPermission
from .base import (
    AutomationAssetsListApi, AutomationRemoveAssetApi, AutomationAddAssetApi,
    AutomationNodeAddRemoveApi, AutomationExecutionViewSet

@@ -24,35 +27,54 @@ __all__ = [

class ChangeSecretAutomationViewSet(OrgBulkModelViewSet):
    model = ChangeSecretAutomation
    filter_fields = ('name', 'secret_type', 'secret_strategy')
    search_fields = filter_fields
    filterset_fields = ('name', 'secret_type', 'secret_strategy')
    search_fields = filterset_fields
    serializer_class = serializers.ChangeSecretAutomationSerializer


class ChangeSecretRecordViewSet(mixins.ListModelMixin, OrgGenericViewSet):
    serializer_class = serializers.ChangeSecretRecordSerializer
    filterset_fields = ('asset_id', 'execution_id')
    filterset_class = ChangeSecretRecordFilterSet
    search_fields = ('asset__address',)
    tp = AutomationTypes.change_secret
    serializer_classes = {
        'default': serializers.ChangeSecretRecordSerializer,
        'secret': serializers.ChangeSecretRecordViewSecretSerializer,
    }
    rbac_perms = {
        'execute': 'accounts.add_changesecretexecution',
        'secret': 'accounts.view_changesecretrecord',
    }

    def get_permissions(self):
        if self.action == 'secret':
            self.permission_classes = [
                RBACPermission,
                UserConfirmation.require(ConfirmType.MFA)
            ]
        return super().get_permissions()

    def get_queryset(self):
        return ChangeSecretRecord.objects.all()

    @action(methods=['post'], detail=False, url_path='execute')
    def execute(self, request, *args, **kwargs):
        record_id = request.data.get('record_id')
        record = self.get_queryset().filter(pk=record_id)
        if not record:
        record_ids = request.data.get('record_ids')
        records = self.get_queryset().filter(id__in=record_ids)
        execution_count = records.values_list('execution_id', flat=True).distinct().count()
        if execution_count != 1:
            return Response(
                {'detail': 'record not found'},
                status=status.HTTP_404_NOT_FOUND
                {'detail': 'Only one execution is allowed to execute'},
                status=status.HTTP_400_BAD_REQUEST
            )
        task = execute_automation_record_task.delay(record_id, self.tp)
        task = execute_automation_record_task.delay(record_ids, self.tp)
        return Response({'task': task.id}, status=status.HTTP_200_OK)

    @action(methods=['get'], detail=True, url_path='secret')
    def secret(self, request, *args, **kwargs):
        instance = self.get_object()
        serializer = self.get_serializer(instance)
        return Response(serializer.data)


class ChangSecretExecutionViewSet(AutomationExecutionViewSet):
    rbac_perms = (

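The `execute` action above now takes a list of `record_ids` rather than a single `record_id`, and answers 400 when the selected records belong to more than one execution. A hedged client-side sketch — the URL prefix and auth header are assumptions, not taken from this diff:

    import requests

    url = "https://jms.example.com/api/v1/accounts/change-secret-records/execute/"  # assumed path
    payload = {"record_ids": ["<record-uuid-1>", "<record-uuid-2>"]}  # must share one execution

    resp = requests.post(url, json=payload, headers={"Authorization": "Token <token>"})
    if resp.ok:
        print("celery task id:", resp.json()["task"])
    else:
        # 400 with 'Only one execution is allowed to execute' if records span executions
        print(resp.status_code, resp.text)
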
@@ -20,8 +20,8 @@ __all__ = [

class GatherAccountsAutomationViewSet(OrgBulkModelViewSet):
    model = GatherAccountsAutomation
    filter_fields = ('name',)
    search_fields = filter_fields
    filterset_fields = ('name',)
    search_fields = filterset_fields
    serializer_class = serializers.GatherAccountAutomationSerializer

@@ -20,8 +20,8 @@ __all__ = [

class PushAccountAutomationViewSet(OrgBulkModelViewSet):
    model = PushAccountAutomation
    filter_fields = ('name', 'secret_type', 'secret_strategy')
    search_fields = filter_fields
    filterset_fields = ('name', 'secret_type', 'secret_strategy')
    search_fields = filterset_fields
    serializer_class = serializers.PushAccountAutomationSerializer

@@ -6,7 +6,7 @@ from django.conf import settings
from rest_framework import serializers
from xlsxwriter import Workbook

from accounts.const.automation import AccountBackupType
from accounts.const import AccountBackupType
from accounts.models.automations.backup_account import AccountBackupAutomation
from accounts.notifications import AccountBackupExecutionTaskMsg, AccountBackupByObjStorageExecutionTaskMsg
from accounts.serializers import AccountSecretSerializer

@@ -18,6 +18,8 @@
    become_user: "{{ custom_become_user | default('') }}"
    become_password: "{{ custom_become_password | default('') }}"
    become_private_key_path: "{{ custom_become_private_key_path | default(None) }}"
    old_ssh_version: "{{ jms_asset.old_ssh_version | default(False) }}"
    gateway_args: "{{ jms_asset.ansible_ssh_common_args | default(None) }}"
  register: ping_info
  delegate_to: localhost

@@ -54,4 +56,6 @@
    become_user: "{{ account.become.ansible_user | default('') }}"
    become_password: "{{ account.become.ansible_password | default('') }}"
    become_private_key_path: "{{ account.become.ansible_ssh_private_key_file | default(None) }}"
    old_ssh_version: "{{ jms_asset.old_ssh_version | default(False) }}"
    gateway_args: "{{ jms_asset.ansible_ssh_common_args | default(None) }}"
  delegate_to: localhost

@@ -85,6 +85,7 @@
    become_user: "{{ account.become.ansible_user | default('') }}"
    become_password: "{{ account.become.ansible_password | default('') }}"
    become_private_key_path: "{{ account.become.ansible_ssh_private_key_file | default(None) }}"
    old_ssh_version: "{{ jms_asset.old_ssh_version | default(False) }}"
  when: account.secret_type == "password"
  delegate_to: localhost

@@ -95,5 +96,6 @@
    login_user: "{{ account.username }}"
    login_private_key_path: "{{ account.private_key_path }}"
    gateway_args: "{{ jms_asset.ansible_ssh_common_args | default('') }}"
    old_ssh_version: "{{ jms_asset.old_ssh_version | default(False) }}"
  when: account.secret_type == "ssh_key"
  delegate_to: localhost

@@ -85,6 +85,7 @@
    become_user: "{{ account.become.ansible_user | default('') }}"
    become_password: "{{ account.become.ansible_password | default('') }}"
    become_private_key_path: "{{ account.become.ansible_ssh_private_key_file | default(None) }}"
    old_ssh_version: "{{ jms_asset.old_ssh_version | default(False) }}"
  when: account.secret_type == "password"
  delegate_to: localhost

@@ -95,5 +96,6 @@
    login_user: "{{ account.username }}"
    login_private_key_path: "{{ account.private_key_path }}"
    gateway_args: "{{ jms_asset.ansible_ssh_common_args | default('') }}"
    old_ssh_version: "{{ jms_asset.old_ssh_version | default(False) }}"
  when: account.secret_type == "ssh_key"
  delegate_to: localhost

@@ -7,9 +7,9 @@ from django.utils import timezone
from django.utils.translation import gettext_lazy as _
from xlsxwriter import Workbook

from accounts.const import AutomationTypes, SecretType, SSHKeyStrategy, SecretStrategy
from accounts.const import AutomationTypes, SecretType, SSHKeyStrategy, SecretStrategy, ChangeSecretRecordStatusChoice
from accounts.models import ChangeSecretRecord
from accounts.notifications import ChangeSecretExecutionTaskMsg
from accounts.notifications import ChangeSecretExecutionTaskMsg, ChangeSecretFailedMsg
from accounts.serializers import ChangeSecretRecordBackUpSerializer
from assets.const import HostTypes
from common.utils import get_logger

@@ -27,7 +27,7 @@ class ChangeSecretManager(AccountBasePlaybookManager):

    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.record_id = self.execution.snapshot.get('record_id')
        self.record_map = self.execution.snapshot.get('record_map', {})
        self.secret_type = self.execution.snapshot.get('secret_type')
        self.secret_strategy = self.execution.snapshot.get(
            'secret_strategy', SecretStrategy.custom

@@ -123,14 +123,20 @@ class ChangeSecretManager(AccountBasePlaybookManager):
                print(f'new_secret is None, account: {account}')
                continue

            if self.record_id is None:
            asset_account_id = f'{asset.id}-{account.id}'
            if asset_account_id not in self.record_map:
                recorder = ChangeSecretRecord(
                    asset=asset, account=account, execution=self.execution,
                    old_secret=account.secret, new_secret=new_secret,
                )
                records.append(recorder)
            else:
                recorder = ChangeSecretRecord.objects.get(id=self.record_id)
                record_id = self.record_map[asset_account_id]
                try:
                    recorder = ChangeSecretRecord.objects.get(id=record_id)
                except ChangeSecretRecord.DoesNotExist:
                    print(f"Record {record_id} not found")
                    continue

            self.name_recorder_mapper[h['name']] = recorder

@@ -158,25 +164,43 @@ class ChangeSecretManager(AccountBasePlaybookManager):
        recorder = self.name_recorder_mapper.get(host)
        if not recorder:
            return
        recorder.status = 'success'
        recorder.status = ChangeSecretRecordStatusChoice.success.value
        recorder.date_finished = timezone.now()
        recorder.save()

        account = recorder.account
        if not account:
            print("Account not found, deleted ?")
            return
        account.secret = recorder.new_secret
        account.date_updated = timezone.now()
        account.save(update_fields=['secret', 'date_updated'])

        max_retries = 3
        retry_count = 0

        while retry_count < max_retries:
            try:
                recorder.save()
                account.save(update_fields=['secret', 'version', 'date_updated'])
                break
            except Exception as e:
                retry_count += 1
                if retry_count == max_retries:
                    self.on_host_error(host, str(e), result)
                else:
                    print(f'retry {retry_count} times for {host} recorder save error: {e}')
                    time.sleep(1)

    def on_host_error(self, host, error, result):
        recorder = self.name_recorder_mapper.get(host)
        if not recorder:
            return
        recorder.status = 'failed'
        recorder.status = ChangeSecretRecordStatusChoice.failed.value
        recorder.date_finished = timezone.now()
        recorder.error = error
        recorder.save()
        try:
            recorder.save()
        except Exception as e:
            print(f"\033[31m Save {host} recorder error: {e} \033[0m\n")

    def on_runner_failed(self, runner, e):
        logger.error("Account error: ", e)

@@ -192,7 +216,7 @@ class ChangeSecretManager(AccountBasePlaybookManager):
    def get_summary(recorders):
        total, succeed, failed = 0, 0, 0
        for recorder in recorders:
            if recorder.status == 'success':
            if recorder.status == ChangeSecretRecordStatusChoice.success.value:
                succeed += 1
            else:
                failed += 1

@@ -209,18 +233,35 @@ class ChangeSecretManager(AccountBasePlaybookManager):
        summary = self.get_summary(recorders)
        print(summary, end='')

        if self.record_id:
        if self.record_map:
            return

        self.send_recorder_mail(recorders, summary)
        failed_recorders = [
            r for r in recorders
            if r.status == ChangeSecretRecordStatusChoice.failed.value
        ]

    def send_recorder_mail(self, recorders, summary):
        recipients = self.execution.recipients
        if not recorders or not recipients:
        recipients = User.objects.filter(id__in=list(recipients.keys()))
        if not recipients:
            return

        recipients = User.objects.filter(id__in=list(recipients.keys()))
        if failed_recorders:
            name = self.execution.snapshot.get('name')
            execution_id = str(self.execution.id)
            _ids = [r.id for r in failed_recorders]
            asset_account_errors = ChangeSecretRecord.objects.filter(
                id__in=_ids).values_list('asset__name', 'account__username', 'error')

            for user in recipients:
                ChangeSecretFailedMsg(name, execution_id, user, asset_account_errors).publish()

        if not recorders:
            return

        self.send_recorder_mail(recipients, recorders, summary)

    def send_recorder_mail(self, recipients, recorders, summary):
        name = self.execution.snapshot['name']
        path = os.path.join(os.path.dirname(settings.BASE_DIR), 'tmp')
        filename = os.path.join(path, f'{name}-{local_now_filename()}-{time.time()}.xlsx')

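`on_host_success` above now retries the recorder/account saves a bounded number of times before reporting the host as failed. The same idea, extracted into a standalone helper purely for illustration (not code from this commit):

    import time

    def save_with_retry(save_fn, max_retries=3, delay=1.0, on_give_up=None):
        """Call save_fn(); retry on failure up to max_retries, then give up."""
        for attempt in range(1, max_retries + 1):
            try:
                save_fn()
                return True
            except Exception as exc:  # broad catch mirrors the manager above
                if attempt == max_retries:
                    if on_give_up is not None:
                        on_give_up(exc)
                    return False
                print(f"retry {attempt} failed: {exc}")
                time.sleep(delay)
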
@@ -58,7 +58,7 @@ class GatherAccountsManager(AccountBasePlaybookManager):
            result = self.filter_success_result(asset.type, info)
            self.collect_asset_account_info(asset, result)
        else:
            logger.error(f'Not found {host} info')
            print(f'\033[31m Not found {host} info \033[0m\n')

    def update_or_create_accounts(self):
        for asset, data in self.asset_account_info.items():

@@ -85,6 +85,7 @@
    become_user: "{{ account.become.ansible_user | default('') }}"
    become_password: "{{ account.become.ansible_password | default('') }}"
    become_private_key_path: "{{ account.become.ansible_ssh_private_key_file | default(None) }}"
    old_ssh_version: "{{ jms_asset.old_ssh_version | default(False) }}"
  when: account.secret_type == "password"
  delegate_to: localhost

@@ -95,6 +96,7 @@
    login_user: "{{ account.username }}"
    login_private_key_path: "{{ account.private_key_path }}"
    gateway_args: "{{ jms_asset.ansible_ssh_common_args | default('') }}"
    old_ssh_version: "{{ jms_asset.old_ssh_version | default(False) }}"
  when: account.secret_type == "ssh_key"
  delegate_to: localhost

@@ -85,6 +85,7 @@
    become_user: "{{ account.become.ansible_user | default('') }}"
    become_password: "{{ account.become.ansible_password | default('') }}"
    become_private_key_path: "{{ account.become.ansible_ssh_private_key_file | default(None) }}"
    old_ssh_version: "{{ jms_asset.old_ssh_version | default(False) }}"
  when: account.secret_type == "password"
  delegate_to: localhost

@@ -95,6 +96,7 @@
    login_user: "{{ account.username }}"
    login_private_key_path: "{{ account.private_key_path }}"
    gateway_args: "{{ jms_asset.ansible_ssh_common_args | default('') }}"
    old_ssh_version: "{{ jms_asset.old_ssh_version | default(False) }}"
  when: account.secret_type == "ssh_key"
  delegate_to: localhost

@@ -60,8 +60,11 @@ class RemoveAccountManager(AccountBasePlaybookManager):
        if not tuple_asset_gather_account:
            return
        asset, gather_account = tuple_asset_gather_account
        Account.objects.filter(
            asset_id=asset.id,
            username=gather_account.username
        ).delete()
        gather_account.delete()
        try:
            Account.objects.filter(
                asset_id=asset.id,
                username=gather_account.username
            ).delete()
            gather_account.delete()
        except Exception as e:
            print(f'\033[31m Delete account {gather_account.username} failed: {e} \033[0m\n')

@@ -3,6 +3,7 @@
  vars:
    ansible_shell_type: sh
    ansible_connection: local
    ansible_python_interpreter: /opt/py3/bin/python

  tasks:
    - name: Verify account (pyfreerdp)

@@ -19,3 +19,5 @@
    become_user: "{{ account.become.ansible_user | default('') }}"
    become_password: "{{ account.become.ansible_password | default('') }}"
    become_private_key_path: "{{ account.become.ansible_ssh_private_key_file | default(None) }}"
    old_ssh_version: "{{ jms_asset.old_ssh_version | default(False) }}"
    gateway_args: "{{ jms_asset.ansible_ssh_common_args | default(None) }}"

@@ -76,8 +76,14 @@ class VerifyAccountManager(AccountBasePlaybookManager):

    def on_host_success(self, host, result):
        account = self.host_account_mapper.get(host)
        account.set_connectivity(Connectivity.OK)
        try:
            account.set_connectivity(Connectivity.OK)
        except Exception as e:
            print(f'\033[31m Update account {account.name} connectivity failed: {e} \033[0m\n')

    def on_host_error(self, host, error, result):
        account = self.host_account_mapper.get(host)
        account.set_connectivity(Connectivity.ERR)
        try:
            account.set_connectivity(Connectivity.ERR)
        except Exception as e:
            print(f'\033[31m Update account {account.name} connectivity failed: {e} \033[0m\n')

@@ -15,6 +15,7 @@ class AliasAccount(TextChoices):
    INPUT = '@INPUT', _('Manual input')
    USER = '@USER', _('Dynamic user')
    ANON = '@ANON', _('Anonymous account')
    SPEC = '@SPEC', _('Specified account')

    @classmethod
    def virtual_choices(cls):

@@ -16,7 +16,7 @@ DEFAULT_PASSWORD_RULES = {
__all__ = [
    'AutomationTypes', 'SecretStrategy', 'SSHKeyStrategy', 'Connectivity',
    'DEFAULT_PASSWORD_LENGTH', 'DEFAULT_PASSWORD_RULES', 'TriggerChoice',
    'PushAccountActionChoice', 'AccountBackupType'
    'PushAccountActionChoice', 'AccountBackupType', 'ChangeSecretRecordStatusChoice',
]

@@ -103,3 +103,9 @@ class AccountBackupType(models.TextChoices):
    email = 'email', _('Email')
    # 目前只支持sftp方式
    object_storage = 'object_storage', _('SFTP')


class ChangeSecretRecordStatusChoice(models.TextChoices):
    failed = 'failed', _('Failed')
    success = 'success', _('Success')
    pending = 'pending', _('Pending')

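`ChangeSecretRecordStatusChoice` replaces the bare 'success'/'failed' string literals that the manager and serializers compared against. A small illustrative sketch of how Django's `TextChoices` behaves (run inside a Django environment; the names here are only for demonstration):

    from django.db import models

    class Status(models.TextChoices):        # same shape as ChangeSecretRecordStatusChoice
        FAILED = 'failed', 'Failed'
        SUCCESS = 'success', 'Success'
        PENDING = 'pending', 'Pending'

    # .value is the string stored in the database, so comparisons stay explicit:
    assert Status.SUCCESS.value == 'success'
    assert 'failed' in Status.values         # all stored values, handy for validation
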
@@ -5,7 +5,7 @@ from django_filters import rest_framework as drf_filters

from assets.models import Node
from common.drf.filters import BaseFilterSet
from .models import Account, GatheredAccount
from .models import Account, GatheredAccount, ChangeSecretRecord


class AccountFilterSet(BaseFilterSet):

@@ -61,3 +61,13 @@ class GatheredAccountFilterSet(BaseFilterSet):
    class Meta:
        model = GatheredAccount
        fields = ['id', 'username']


class ChangeSecretRecordFilterSet(BaseFilterSet):
    asset_name = drf_filters.CharFilter(field_name='asset__name', lookup_expr='icontains')
    account_username = drf_filters.CharFilter(field_name='account__username', lookup_expr='icontains')
    execution_id = drf_filters.CharFilter(field_name='execution_id', lookup_expr='exact')

    class Meta:
        model = ChangeSecretRecord
        fields = ['id', 'status', 'asset_id', 'execution']

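`ChangeSecretRecordFilterSet` adds case-insensitive filters across the asset and account relations. From a client, these are plain query parameters; a hedged sketch (endpoint path and auth are assumptions):

    import requests

    params = {"asset_name": "web", "account_username": "root", "status": "failed"}
    resp = requests.get(
        "https://jms.example.com/api/v1/accounts/change-secret-records/",  # assumed path
        params=params, headers={"Authorization": "Token <token>"},
    )
    print(resp.json())
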
@@ -1,8 +1,9 @@
# Generated by Django 4.1.10 on 2023-08-01 09:12

from django.db import migrations, models
import uuid

from django.db import migrations, models


class Migration(migrations.Migration):

@@ -20,7 +21,7 @@ class Migration(migrations.Migration):
                ('date_updated', models.DateTimeField(auto_now=True, verbose_name='Date updated')),
                ('id', models.UUIDField(default=uuid.uuid4, primary_key=True, serialize=False)),
                ('org_id', models.CharField(blank=True, db_index=True, default='', max_length=36, verbose_name='Organization')),
                ('alias', models.CharField(choices=[('@INPUT', 'Manual input'), ('@USER', 'Dynamic user'), ('@ANON', 'Anonymous account')], max_length=128, verbose_name='Alias')),
                ('alias', models.CharField(choices=[('@INPUT', 'Manual input'), ('@USER', 'Dynamic user'), ('@ANON', 'Anonymous account'), ('@SPEC', 'Specified account')], max_length=128, verbose_name='Alias')),
                ('secret_from_login', models.BooleanField(default=None, null=True, verbose_name='Secret from login')),
            ],
            options={

@@ -8,7 +8,7 @@ from django.db import models
from django.db.models import F
from django.utils.translation import gettext_lazy as _

from accounts.const.automation import AccountBackupType
from accounts.const import AccountBackupType
from common.const.choices import Trigger
from common.db import fields
from common.db.encoder import ModelJSONFieldEncoder

@@ -2,7 +2,7 @@ from django.db import models
from django.utils.translation import gettext_lazy as _

from accounts.const import (
    AutomationTypes
    AutomationTypes, ChangeSecretRecordStatusChoice
)
from common.db import fields
from common.db.models import JMSBaseModel

@@ -40,7 +40,10 @@ class ChangeSecretRecord(JMSBaseModel):
    new_secret = fields.EncryptTextField(blank=True, null=True, verbose_name=_('New secret'))
    date_started = models.DateTimeField(blank=True, null=True, verbose_name=_('Date started'))
    date_finished = models.DateTimeField(blank=True, null=True, verbose_name=_('Date finished'))
    status = models.CharField(max_length=16, default='pending', verbose_name=_('Status'))
    status = models.CharField(
        max_length=16, verbose_name=_('Status'),
        default=ChangeSecretRecordStatusChoice.pending.value
    )
    error = models.TextField(blank=True, null=True, verbose_name=_('Error'))

    class Meta:

@@ -137,16 +137,13 @@ class BaseAccount(VaultModelMixin, JMSOrgBaseModel):
        else:
            return None

    @property
    def private_key_path(self):
    def get_private_key_path(self, path):
        if self.secret_type != SecretType.SSH_KEY \
                or not self.secret \
                or not self.private_key:
            return None
        project_dir = settings.PROJECT_DIR
        tmp_dir = os.path.join(project_dir, 'tmp')
        key_name = '.' + md5(self.private_key.encode('utf-8')).hexdigest()
        key_path = os.path.join(tmp_dir, key_name)
        key_path = os.path.join(path, key_name)
        if not os.path.exists(key_path):
            # https://github.com/ansible/ansible-runner/issues/544
            # ssh requires OpenSSH format keys to have a full ending newline.

@@ -158,6 +155,12 @@ class BaseAccount(VaultModelMixin, JMSOrgBaseModel):
            os.chmod(key_path, 0o400)
        return key_path

    @property
    def private_key_path(self):
        project_dir = settings.PROJECT_DIR
        tmp_dir = os.path.join(project_dir, 'tmp')
        return self.get_private_key_path(tmp_dir)

    def get_private_key(self):
        if not self.private_key:
            return None

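The old `private_key_path` property is split in two: `get_private_key_path(path)` writes the key material under any caller-chosen directory, while the property keeps the previous PROJECT_DIR/tmp behaviour. A hedged usage sketch (the account object and per-job directory are assumptions):

    import tempfile

    def key_path_for_job(account, job_dir=None):
        # Write the key under a per-job runtime directory instead of the
        # shared PROJECT_DIR/tmp location; returns None for non-ssh_key accounts.
        job_dir = job_dir or tempfile.mkdtemp(prefix="jms-job-")
        return account.get_private_key_path(job_dir)
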
@@ -1,6 +1,7 @@
from django.template.loader import render_to_string
from django.utils.translation import gettext_lazy as _

from accounts.models import ChangeSecretRecord
from common.tasks import send_mail_attachment_async, upload_backup_to_obj_storage
from notifications.notifications import UserMessage
from terminal.models.component.storage import ReplayStorage

@@ -98,3 +99,35 @@ class GatherAccountChangeMsg(UserMessage):
    def gen_test_msg(cls):
        user = User.objects.first()
        return cls(user, {})


class ChangeSecretFailedMsg(UserMessage):
    subject = _('Change secret or push account failed information')

    def __init__(self, name, execution_id, user, asset_account_errors: list):
        self.name = name
        self.execution_id = execution_id
        self.asset_account_errors = asset_account_errors
        super().__init__(user)

    def get_html_msg(self) -> dict:
        context = {
            'name': self.name,
            'recipient': self.user,
            'execution_id': self.execution_id,
            'asset_account_errors': self.asset_account_errors
        }
        message = render_to_string('accounts/change_secret_failed_info.html', context)

        return {
            'subject': str(self.subject),
            'message': message
        }

    @classmethod
    def gen_test_msg(cls):
        name = 'test'
        user = User.objects.first()
        record = ChangeSecretRecord.objects.first()
        execution_id = str(record.execution_id)
        return cls(name, execution_id, user, [])

@@ -21,6 +21,7 @@ __all__ = [
class BaseAutomationSerializer(PeriodTaskSerializerMixin, BulkOrgResourceModelSerializer):
    assets = ObjectRelatedField(many=True, required=False, queryset=Asset.objects, label=_('Assets'))
    nodes = ObjectRelatedField(many=True, required=False, queryset=Node.objects, label=_('Nodes'))
    is_periodic = serializers.BooleanField(default=False, required=False, label=_("Periodic perform"))

    class Meta:
        read_only_fields = [

@@ -4,7 +4,8 @@ from django.utils.translation import gettext_lazy as _
from rest_framework import serializers

from accounts.const import (
    AutomationTypes, SecretType, SecretStrategy, SSHKeyStrategy
    AutomationTypes, SecretType, SecretStrategy,
    SSHKeyStrategy, ChangeSecretRecordStatusChoice
)
from accounts.models import (
    Account, ChangeSecretAutomation,

@@ -21,6 +22,7 @@ logger = get_logger(__file__)
__all__ = [
    'ChangeSecretAutomationSerializer',
    'ChangeSecretRecordSerializer',
    'ChangeSecretRecordViewSecretSerializer',
    'ChangeSecretRecordBackUpSerializer',
    'ChangeSecretUpdateAssetSerializer',
    'ChangeSecretUpdateNodeSerializer',

@@ -104,7 +106,10 @@ class ChangeSecretAutomationSerializer(AuthValidateMixin, BaseAutomationSerializ
class ChangeSecretRecordSerializer(serializers.ModelSerializer):
    is_success = serializers.SerializerMethodField(label=_('Is success'))
    asset = ObjectRelatedField(queryset=Asset.objects, label=_('Asset'))
    account = ObjectRelatedField(queryset=Account.objects, label=_('Account'))
    account = ObjectRelatedField(
        queryset=Account.objects, label=_('Account'),
        attrs=("id", "name", "username")
    )
    execution = ObjectRelatedField(
        queryset=AutomationExecution.objects, label=_('Automation task execution')
    )

@@ -119,7 +124,16 @@ class ChangeSecretRecordSerializer(serializers.ModelSerializer):

    @staticmethod
    def get_is_success(obj):
        return obj.status == 'success'
        return obj.status == ChangeSecretRecordStatusChoice.success.value


class ChangeSecretRecordViewSecretSerializer(serializers.ModelSerializer):
    class Meta:
        model = ChangeSecretRecord
        fields = [
            'id', 'old_secret', 'new_secret',
        ]
        read_only_fields = fields


class ChangeSecretRecordBackUpSerializer(serializers.ModelSerializer):

@@ -145,7 +159,7 @@ class ChangeSecretRecordBackUpSerializer(serializers.ModelSerializer):

    @staticmethod
    def get_is_success(obj):
        if obj.status == 'success':
        if obj.status == ChangeSecretRecordStatusChoice.success.value:
            return _("Success")
        return _("Failed")

@@ -36,14 +36,14 @@ def execute_account_automation_task(pid, trigger, tp):
    instance.execute(trigger)


def record_task_activity_callback(self, record_id, *args, **kwargs):
def record_task_activity_callback(self, record_ids, *args, **kwargs):
    from accounts.models import ChangeSecretRecord
    with tmp_to_root_org():
        record = get_object_or_none(ChangeSecretRecord, id=record_id)
        if not record:
        records = ChangeSecretRecord.objects.filter(id__in=record_ids)
        if not records:
            return
        resource_ids = [record.id]
        org_id = record.execution.org_id
        resource_ids = [str(i.id) for i in records]
        org_id = records[0].execution.org_id
    return resource_ids, org_id


@@ -51,22 +51,26 @@ def record_task_activity_callback(self, record_id, *args, **kwargs):
    queue='ansible', verbose_name=_('Execute automation record'),
    activity_callback=record_task_activity_callback
)
def execute_automation_record_task(record_id, tp):
def execute_automation_record_task(record_ids, tp):
    from accounts.models import ChangeSecretRecord
    task_name = gettext_noop('Execute automation record')

    with tmp_to_root_org():
        instance = get_object_or_none(ChangeSecretRecord, pk=record_id)
        if not instance:
            logger.error("No automation record found: {}".format(record_id))
        records = ChangeSecretRecord.objects.filter(id__in=record_ids)

        if not records:
            logger.error('No automation record found: {}'.format(record_ids))
            return

    task_name = gettext_noop('Execute automation record')
    record = records[0]
    record_map = {f'{record.asset_id}-{record.account_id}': str(record.id) for record in records}
    task_snapshot = {
        'secret': instance.new_secret,
        'secret_type': instance.execution.snapshot.get('secret_type'),
        'accounts': [str(instance.account_id)],
        'assets': [str(instance.asset_id)],
        'params': {},
        'record_id': record_id,
        'record_map': record_map,
        'secret': record.new_secret,
        'secret_type': record.execution.snapshot.get('secret_type'),
        'assets': [str(instance.asset_id) for instance in records],
        'accounts': [str(instance.account_id) for instance in records],
    }
    with tmp_to_org(instance.execution.org_id):
    with tmp_to_org(record.execution.org_id):
        quickstart_automation_by_snapshot(task_name, tp, task_snapshot)

@@ -55,7 +55,7 @@ def clean_historical_accounts():
    history_model = Account.history.model
    history_id_mapper = defaultdict(list)

    ids = history_model.objects.values('id').annotate(count=Count('id', distinct=True)) \
    ids = history_model.objects.values('id').annotate(count=Count('id')) \
        .filter(count__gte=limit).values_list('id', flat=True)

    if not ids:

@@ -29,7 +29,8 @@ def template_sync_related_accounts(template_id, user_id=None):
    name = template.name
    username = template.username
    secret_type = template.secret_type
    print(f'\033[32m>>> 开始同步模版名称、用户名、密钥类型到相关联的账号 ({datetime.now().strftime("%Y-%m-%d %H:%M:%S")})')
    print(
        f'\033[32m>>> 开始同步模板名称、用户名、密钥类型到相关联的账号 ({datetime.now().strftime("%Y-%m-%d %H:%M:%S")})')
    with tmp_to_org(org_id):
        for account in accounts:
            account.name = name

@@ -1,10 +1,10 @@
{% load i18n %}

<h3>{% trans 'Gather account change information' %}</h3>
<table style="width: 100%; border-collapse: collapse; max-width: 100%; text-align: left; margin-top: 20px;">
    <caption></caption>
    <tr style="background-color: #f2f2f2;">
        <th style="border: 1px solid #ddd; padding: 10px; font-weight: bold;">{% trans 'Asset' %}</th>
        <th style="border: 1px solid #ddd; padding: 10px;">{% trans 'Asset' %}</th>
        <th style="border: 1px solid #ddd; padding: 10px;">{% trans 'Added account' %}</th>
        <th style="border: 1px solid #ddd; padding: 10px;">{% trans 'Deleted account' %}</th>
    </tr>

@@ -0,0 +1,36 @@
{% load i18n %}

<h3>{% trans 'Task name' %}: {{ name }}</h3>
<h3>{% trans 'Task execution id' %}: {{ execution_id }}</h3>
<p>{% trans 'Respectful' %} {{ recipient }}</p>
<p>{% trans 'Hello! The following is the failure of changing the password of your assets or pushing the account. Please check and handle it in time.' %}</p>
<table style="width: 100%; border-collapse: collapse; max-width: 100%; text-align: left; margin-top: 20px;">
    <caption></caption>
    <thead>
    <tr style="background-color: #f2f2f2;">
        <th style="border: 1px solid #ddd; padding: 10px;">{% trans 'Asset' %}</th>
        <th style="border: 1px solid #ddd; padding: 10px;">{% trans 'Account' %}</th>
        <th style="border: 1px solid #ddd; padding: 10px;">{% trans 'Error' %}</th>
    </tr>
    </thead>
    <tbody>
    {% for asset_name, account_username, error in asset_account_errors %}
        <tr>
            <td style="border: 1px solid #ddd; padding: 10px;">{{ asset_name }}</td>
            <td style="border: 1px solid #ddd; padding: 10px;">{{ account_username }}</td>
            <td style="border: 1px solid #ddd; padding: 10px;">
                <div style="
                        max-width: 90%;
                        white-space: nowrap;
                        overflow: hidden;
                        text-overflow: ellipsis;
                        display: block;"
                     title="{{ error }}"
                >
                    {{ error }}
                </div>
            </td>
        </tr>
    {% endfor %}
    </tbody>
</table>

@@ -32,6 +32,7 @@ __all__ = [

class AssetFilterSet(BaseFilterSet):
    platform = django_filters.CharFilter(method='filter_platform')
    exclude_platform = django_filters.CharFilter(field_name="platform__name", lookup_expr='exact', exclude=True)
    domain = django_filters.CharFilter(method='filter_domain')
    type = django_filters.CharFilter(field_name="platform__type", lookup_expr="exact")
    category = django_filters.CharFilter(field_name="platform__category", lookup_expr="exact")

@@ -92,7 +93,6 @@ class AssetViewSet(SuggestionMixin, OrgBulkModelViewSet):
    model = Asset
    filterset_class = AssetFilterSet
    search_fields = ("name", "address", "comment")
    ordering = ('name',)
    ordering_fields = ('name', 'address', 'connectivity', 'platform', 'date_updated', 'date_created')
    serializer_classes = (
        ("default", serializers.AssetSerializer),

@@ -19,7 +19,6 @@ class DomainViewSet(OrgBulkModelViewSet):
    model = Domain
    filterset_fields = ("name",)
    search_fields = filterset_fields
    ordering = ('name',)
    serializer_classes = {
        'default': serializers.DomainSerializer,
        'list': serializers.DomainListSerializer,

@@ -30,6 +29,10 @@ class DomainViewSet(OrgBulkModelViewSet):
            return serializers.DomainWithGatewaySerializer
        return super().get_serializer_class()

    def partial_update(self, request, *args, **kwargs):
        kwargs['partial'] = True
        return self.update(request, *args, **kwargs)


class GatewayViewSet(HostViewSet):
    perm_model = Gateway

@@ -21,6 +21,7 @@ class AssetPlatformViewSet(JMSModelViewSet):
    }
    filterset_fields = ['name', 'category', 'type']
    search_fields = ['name']
    ordering = ['-internal', 'name']
    rbac_perms = {
        'categories': 'assets.view_platform',
        'type_constraints': 'assets.view_platform',

@@ -12,7 +12,7 @@ from sshtunnel import SSHTunnelForwarder

from assets.automations.methods import platform_automation_methods
from common.utils import get_logger, lazyproperty, is_openssh_format_key, ssh_pubkey_gen
from ops.ansible import JMSInventory, PlaybookRunner, DefaultCallback
from ops.ansible import JMSInventory, SuperPlaybookRunner, DefaultCallback

logger = get_logger(__name__)

@@ -54,7 +54,7 @@ class SSHTunnelManager:
                not_valid.append(k)
            else:
                local_bind_port = server.local_bind_port
                host['ansible_host'] = jms_asset['address'] = host['login_host'] = '127.0.0.1'
                host['ansible_host'] = jms_asset['address'] = host['login_host'] = 'jms_celery'
                host['ansible_port'] = jms_asset['port'] = host['login_port'] = local_bind_port
                servers.append(server)

@@ -269,7 +269,7 @@ class BasePlaybookManager:
            if not playbook_path:
                continue

            runer = PlaybookRunner(
            runer = SuperPlaybookRunner(
                inventory_path,
                playbook_path,
                self.runtime_dir,

@@ -314,7 +314,7 @@ class BasePlaybookManager:
    def delete_runtime_dir(self):
        if settings.DEBUG_DEV:
            return
        shutil.rmtree(self.runtime_dir)
        shutil.rmtree(self.runtime_dir, ignore_errors=True)

    def run(self, *args, **kwargs):
        print(">>> 任务准备阶段\n")

@@ -333,6 +333,7 @@ class BasePlaybookManager:
            ssh_tunnel = SSHTunnelManager()
            ssh_tunnel.local_gateway_prepare(runner)
            try:
                kwargs.update({"clean_workspace": False})
                cb = runner.run(**kwargs)
                self.on_runner_success(runner, cb)
            except Exception as e:

@@ -3,6 +3,7 @@
  vars:
    ansible_shell_type: sh
    ansible_connection: local
    ansible_python_interpreter: /opt/py3/bin/python

  tasks:
    - name: Test asset connection (pyfreerdp)

@@ -19,3 +19,6 @@
    become_user: "{{ custom_become_user | default('') }}"
    become_password: "{{ custom_become_password | default('') }}"
    become_private_key_path: "{{ custom_become_private_key_path | default(None) }}"
    old_ssh_version: "{{ jms_asset.old_ssh_version | default(False) }}"
    gateway_args: "{{ jms_asset.ansible_ssh_common_args | default(None) }}"

@@ -0,0 +1,11 @@
- hosts: custom
  gather_facts: no
  vars:
    ansible_connection: local
    ansible_shell_type: sh

  tasks:
    - name: Test asset connection (telnet)
      telnet_ping:
        login_host: "{{ jms_asset.address }}"
        login_port: "{{ jms_asset.port }}"

@@ -0,0 +1,16 @@
id: ping_by_telnet
name: "{{ 'Ping by telnet' | trans }}"
category:
  - device
  - host
type:
  - all
method: ping
protocol: telnet
priority: 50

i18n:
  Ping by telnet:
    zh: '使用 Python 模块 telnet 测试主机可连接性'
    en: 'Ping by telnet module'
    ja: 'Pythonモジュールtelnetを使用したホスト接続性のテスト'

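The new `ping_by_telnet` method wires a custom `telnet_ping` Ansible module into the connectivity-test flow; the module's own code is not part of this diff. A rough standalone approximation of such a check — a bare TCP connect, which is an assumption about what the module does, not its actual implementation:

    import socket

    def telnet_style_ping(host: str, port: int = 23, timeout: float = 5.0) -> bool:
        """Return True if a TCP connection to host:port can be established."""
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    if __name__ == "__main__":
        print(telnet_style_ping("127.0.0.1", 22))
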
@@ -25,14 +25,22 @@ class PingManager(BasePlaybookManager):

    def on_host_success(self, host, result):
        asset, account = self.host_asset_and_account_mapper.get(host)
        asset.set_connectivity(Connectivity.OK)
        if not account:
            return
        account.set_connectivity(Connectivity.OK)
        try:
            asset.set_connectivity(Connectivity.OK)
            if not account:
                return
            account.set_connectivity(Connectivity.OK)
        except Exception as e:
            print(f'\033[31m Update account {account.name} or '
                  f'update asset {asset.name} connectivity failed: {e} \033[0m\n')

    def on_host_error(self, host, error, result):
        asset, account = self.host_asset_and_account_mapper.get(host)
        asset.set_connectivity(Connectivity.ERR)
        if not account:
            return
        account.set_connectivity(Connectivity.ERR)
        try:
            asset.set_connectivity(Connectivity.ERR)
            if not account:
                return
            account.set_connectivity(Connectivity.ERR)
        except Exception as e:
            print(f'\033[31m Update account {account.name} or '
                  f'update asset {asset.name} connectivity failed: {e} \033[0m\n')

@@ -92,18 +92,26 @@ class PingGatewayManager:
    @staticmethod
    def on_host_success(gateway, account):
        print('\033[32m {} -> {}\033[0m\n'.format(gateway, account))
        gateway.set_connectivity(Connectivity.OK)
        if not account:
            return
        account.set_connectivity(Connectivity.OK)
        try:
            gateway.set_connectivity(Connectivity.OK)
            if not account:
                return
            account.set_connectivity(Connectivity.OK)
        except Exception as e:
            print(f'\033[31m Update account {account.name} or '
                  f'update asset {gateway.name} connectivity failed: {e} \033[0m\n')

    @staticmethod
    def on_host_error(gateway, account, error):
        print('\033[31m {} -> {} 原因: {} \033[0m\n'.format(gateway, account, error))
        gateway.set_connectivity(Connectivity.ERR)
        if not account:
            return
        account.set_connectivity(Connectivity.ERR)
        try:
            gateway.set_connectivity(Connectivity.ERR)
            if not account:
                return
            account.set_connectivity(Connectivity.ERR)
        except Exception as e:
            print(f'\033[31m Update account {account.name} or '
                  f'update asset {gateway.name} connectivity failed: {e} \033[0m\n')

    @staticmethod
    def before_runner_start():

@@ -38,6 +38,14 @@ class Protocol(ChoicesMixin, models.TextChoices):
            cls.ssh: {
                'port': 22,
                'secret_types': ['password', 'ssh_key'],
                'setting': {
                    'old_ssh_version': {
                        'type': 'bool',
                        'default': False,
                        'label': _('Old SSH version'),
                        'help_text': _('Old SSH version like openssh 5.x or 6.x')
                    }
                }
            },
            cls.sftp: {
                'port': 22,

@@ -187,6 +195,14 @@ class Protocol(ChoicesMixin, models.TextChoices):
                'port': 27017,
                'required': True,
                'secret_types': ['password'],
                'setting': {
                    'auth_source': {
                        'type': 'str',
                        'default': 'admin',
                        'label': _('Auth source'),
                        'help_text': _('The database to authenticate against')
                    }
                }
            },
            cls.redis: {
                'port': 6379,

@@ -73,3 +73,7 @@ class Gateway(Host):
    def private_key_path(self):
        account = self.select_account
        return account.private_key_path if account else None

    def get_private_key_path(self, path):
        account = self.select_account
        return account.get_private_key_path(path) if account else None

@@ -73,6 +73,10 @@ class FamilyMixin:
    @classmethod
    def get_nodes_all_children(cls, nodes, with_self=True):
        pattern = cls.get_nodes_children_key_pattern(nodes, with_self=with_self)
        if not pattern:
            # 如果 pattern = ''
            # key__iregex 报错 (1139, "Got error 'empty (sub)expression' from regexp")
            return cls.objects.none()
        return Node.objects.filter(key__iregex=pattern)

    @classmethod

@@ -1,12 +1,13 @@
# -*- coding: utf-8 -*-
#
from django.db.models import Count
from django.db.models import Count, Q
from django.utils.translation import gettext_lazy as _
from rest_framework import serializers

from common.serializers import ResourceLabelsMixin
from common.serializers.fields import ObjectRelatedField
from orgs.mixins.serializers import BulkOrgResourceModelSerializer
from assets.models.gateway import Gateway
from .gateway import GatewayWithAccountSecretSerializer
from ..models import Domain


@@ -15,7 +16,7 @@ __all__ = ['DomainSerializer', 'DomainWithGatewaySerializer', 'DomainListSeriali

class DomainSerializer(ResourceLabelsMixin, BulkOrgResourceModelSerializer):
    gateways = ObjectRelatedField(
        many=True, required=False, label=_('Gateway'), read_only=True,
        many=True, required=False, label=_('Gateway'), queryset=Gateway.objects
    )

    class Meta:

@@ -25,6 +26,9 @@ class DomainSerializer(ResourceLabelsMixin, BulkOrgResourceModelSerializer):
        fields_m2m = ['assets', 'gateways']
        read_only_fields = ['date_created']
        fields = fields_small + fields_m2m + read_only_fields
        extra_kwargs = {
            'assets': {'required': False},
        }

    def to_representation(self, instance):
        data = super().to_representation(instance)

@@ -35,12 +39,17 @@ class DomainSerializer(ResourceLabelsMixin, BulkOrgResourceModelSerializer):
        data['assets'] = [i for i in assets if str(i['id']) not in gateway_ids]
        return data

    def update(self, instance, validated_data):
    def create(self, validated_data):
        assets = validated_data.pop('assets', [])
        assets = assets + list(instance.gateways)
        validated_data['assets'] = assets
        instance = super().update(instance, validated_data)
        return instance
        gateways = validated_data.pop('gateways', [])
        validated_data['assets'] = assets + gateways
        return super().create(validated_data)

    def update(self, instance, validated_data):
        assets = validated_data.pop('assets', list(instance.assets.all()))
        gateways = validated_data.pop('gateways', list(instance.gateways.all()))
        validated_data['assets'] = assets + gateways
        return super().update(instance, validated_data)

    @classmethod
    def setup_eager_loading(cls, queryset):

@@ -58,7 +67,7 @@ class DomainListSerializer(DomainSerializer):
    @classmethod
    def setup_eager_loading(cls, queryset):
        queryset = queryset.annotate(
            assets_amount=Count('assets', distinct=True),
            assets_amount=Count('assets', filter=~Q(assets__platform__name='Gateway'), distinct=True),
        )
        return queryset

@@ -1,6 +1,5 @@
# -*- coding: utf-8 -*-
#

from importlib import import_module

from django.conf import settings

@@ -66,7 +65,7 @@ class FTPLogViewSet(OrgModelViewSet):
    date_range_filter_fields = [
        ('date_start', ('date_from', 'date_to'))
    ]
    filterset_fields = ['user', 'asset', 'account', 'filename']
    filterset_fields = ['user', 'asset', 'account', 'filename', 'session']
    search_fields = filterset_fields
    ordering = ['-date_start']
    http_method_names = ['post', 'get', 'head', 'options', 'patch']

@@ -269,7 +268,7 @@ class UserSessionViewSet(CommonApiMixin, viewsets.ModelViewSet):
        return user_ids

    def get_queryset(self):
        keys = UserSession.get_keys()
        keys = user_session_manager.get_keys()
        queryset = UserSession.objects.filter(key__in=keys)
        if current_org.is_root():
            return queryset

@@ -288,6 +287,6 @@ class UserSessionViewSet(CommonApiMixin, viewsets.ModelViewSet):

        keys = queryset.values_list('key', flat=True)
        for key in keys:
            user_session_manager.decrement_or_remove(key)
            user_session_manager.remove(key)
        queryset.delete()
        return Response(status=status.HTTP_200_OK)

@@ -12,7 +12,10 @@ from common.utils.timezone import as_current_tz
from jumpserver.utils import current_request
from orgs.models import Organization
from orgs.utils import get_current_org_id
from settings.models import Setting
from settings.serializers import SettingsSerializer
from users.models import Preference
from users.serializers import PreferenceSerializer
from .backends import get_operate_log_storage

logger = get_logger(__name__)

@@ -87,19 +90,15 @@ class OperatorLogHandler(metaclass=Singleton):
        return log_id, before, after

    @staticmethod
    def get_resource_display_from_setting(resource):
        resource_display = None
        setting_serializer = SettingsSerializer()
        label = setting_serializer.get_field_label(resource)
        if label is not None:
            resource_display = label
        return resource_display

    def get_resource_display(self, resource):
        resource_display = str(resource)
        return_value = self.get_resource_display_from_setting(resource_display)
        if return_value is not None:
            resource_display = return_value
    def get_resource_display(resource):
        if isinstance(resource, Setting):
            serializer = SettingsSerializer()
            resource_display = serializer.get_field_label(resource.name)
        elif isinstance(resource, Preference):
            serializer = PreferenceSerializer()
            resource_display = serializer.get_field_label(resource.name)
        else:
            resource_display = str(resource)
        return resource_display

    @staticmethod

@@ -288,16 +288,9 @@ class UserSession(models.Model):
        ttl = caches[settings.SESSION_CACHE_ALIAS].ttl(cache_key)
        return timezone.now() + timedelta(seconds=ttl)

    @staticmethod
    def get_keys():
        session_store_cls = import_module(settings.SESSION_ENGINE).SessionStore
        cache_key_prefix = session_store_cls.cache_key_prefix
        keys = caches[settings.SESSION_CACHE_ALIAS].iter_keys('*')
        return [k.replace(cache_key_prefix, '') for k in keys]

    @classmethod
    def clear_expired_sessions(cls):
        keys = cls.get_keys()
        keys = user_session_manager.get_keys()
        cls.objects.exclude(key__in=keys).delete()

    class Meta:

@@ -43,7 +43,7 @@ class FTPLogSerializer(serializers.ModelSerializer):
    fields_small = fields_mini + [
        "user", "remote_addr", "asset", "account",
        "org_id", "operate", "filename", "date_start",
        "is_success", "has_file",
        "is_success", "has_file", "session"
    ]
    fields = fields_small

@@ -36,6 +36,7 @@ class AuthBackendLabelMapping(LazyObject):
        backend_label_mapping[settings.AUTH_BACKEND_AUTH_TOKEN] = _("Auth Token")
        backend_label_mapping[settings.AUTH_BACKEND_WECOM] = _("WeCom")
        backend_label_mapping[settings.AUTH_BACKEND_FEISHU] = _("FeiShu")
        backend_label_mapping[settings.AUTH_BACKEND_LARK] = 'Lark'
        backend_label_mapping[settings.AUTH_BACKEND_SLACK] = _("Slack")
        backend_label_mapping[settings.AUTH_BACKEND_DINGTALK] = _("DingTalk")
        backend_label_mapping[settings.AUTH_BACKEND_TEMP_TOKEN] = _("Temporary token")

@@ -178,7 +178,7 @@ def on_django_start_set_operate_log_monitor_models(sender, **kwargs):
        'PermedAsset', 'PermedAccount', 'MenuPermission',
        'Permission', 'TicketSession', 'ApplyLoginTicket',
        'ApplyCommandTicket', 'ApplyLoginAssetTicket',
        'FavoriteAsset', 'Asset'
        'FavoriteAsset',
    }
    for i, app in enumerate(apps.get_models(), 1):
        app_name = app._meta.app_label

@@ -7,18 +7,17 @@ import subprocess
from celery import shared_task
from django.conf import settings
from django.core.files.storage import default_storage
from django.db import transaction
from django.utils import timezone
from django.utils.translation import gettext_lazy as _

from common.const.crontab import CRONTAB_AT_AM_TWO
from common.utils import get_log_keep_day, get_logger
from common.storage.ftp_file import FTPFileStorageHandler
from ops.celery.decorator import (
    register_as_period_task, after_app_shutdown_clean_periodic
)
from common.utils import get_log_keep_day, get_logger
from ops.celery.decorator import register_as_period_task
from ops.models import CeleryTaskExecution
from terminal.models import Session, Command
from terminal.backends import server_replay_storage
from terminal.models import Session, Command
from .models import UserLoginLog, OperateLog, FTPLog, ActivityLog, PasswordChangeLog

logger = get_logger(__name__)

@@ -57,9 +56,9 @@ def clean_ftp_log_period():
    now = timezone.now()
    days = get_log_keep_day('FTP_LOG_KEEP_DAYS')
    expired_day = now - datetime.timedelta(days=days)
    file_store_dir = os.path.join(default_storage.base_location, 'ftp_file')
    file_store_dir = os.path.join(default_storage.base_location, FTPLog.upload_to)
    FTPLog.objects.filter(date_start__lt=expired_day).delete()
    command = "find %s -mtime +%s -exec rm -f {} \\;" % (
    command = "find %s -mtime +%s -type f -exec rm -f {} \\;" % (
        file_store_dir, days
    )
    subprocess.call(command, shell=True)

@@ -84,6 +83,15 @@ def clean_celery_tasks_period():
    subprocess.call(command, shell=True)


def batch_delete(queryset, batch_size=3000):
    model = queryset.model
    count = queryset.count()
    with transaction.atomic():
        for i in range(0, count, batch_size):
            pks = queryset[i:i + batch_size].values_list('id', flat=True)
            model.objects.filter(id__in=list(pks)).delete()


def clean_expired_session_period():
    logger.info("Start clean expired session record, commands and replay")
    days = get_log_keep_day('TERMINAL_SESSION_KEEP_DURATION')

@@ -93,9 +101,9 @@ def clean_expired_session_period():
    expired_commands = Command.objects.filter(timestamp__lt=timestamp)
    replay_dir = os.path.join(default_storage.base_location, 'replay')

    expired_sessions.delete()
    batch_delete(expired_sessions)
    logger.info("Clean session item done")
    expired_commands.delete()
    batch_delete(expired_commands)
    logger.info("Clean session command done")
    command = "find %s -mtime +%s \\( -name '*.json' -o -name '*.tar' -o -name '*.gz' \\) -exec rm -f {} \\;" % (
        replay_dir, days

@@ -108,7 +116,6 @@ def clean_expired_session_period():

@shared_task(verbose_name=_('Clean audits session task log'))
@register_as_period_task(crontab=CRONTAB_AT_AM_TWO)
@after_app_shutdown_clean_periodic
def clean_audits_log_period():
    print("Start clean audit session task log")
    clean_login_log_period()

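
The batch_delete() helper added in this file deletes large querysets in fixed-size slices instead of one huge DELETE. A minimal pure-Python illustration of the same chunking idea (the id list below is made up; in the task the slice comes from a Django queryset and each batch is deleted inside a transaction):

def iter_batches(ids, batch_size=3000):
    # Yield consecutive slices of at most batch_size items.
    for i in range(0, len(ids), batch_size):
        yield ids[i:i + batch_size]

expired_ids = list(range(10_000))  # pretend these are primary keys of expired rows
for batch in iter_batches(expired_ids):
    # In the real task: model.objects.filter(id__in=batch).delete()
    print("would delete", len(batch), "rows")
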
@@ -2,13 +2,15 @@
#

from .access_key import *
from .common import *
from .confirm import *
from .connection_token import *
from .feishu import *
from .lark import *
from .login_confirm import *
from .mfa import *
from .password import *
from .session import *
from .sso import *
from .temp_token import *
from .token import *
from .common import *

@@ -12,7 +12,6 @@ from common.permissions import IsValidUser, OnlySuperUser
from common.utils import get_logger
from users.models import User


logger = get_logger(__file__)


@@ -24,6 +23,7 @@ class QRUnBindBase(APIView):
            'wecom': {'user_field': 'wecom_id', 'not_bind_err': errors.WeComNotBound},
            'dingtalk': {'user_field': 'dingtalk_id', 'not_bind_err': errors.DingTalkNotBound},
            'feishu': {'user_field': 'feishu_id', 'not_bind_err': errors.FeiShuNotBound},
            'lark': {'user_field': 'lark_id', 'not_bind_err': errors.LarkNotBound},
            'slack': {'user_field': 'slack_id', 'not_bind_err': errors.SlackNotBound},
        }
        user = self.user

@@ -223,12 +223,17 @@ class ExtraActionApiMixin(RDPFileClientProtocolURLMixin):
    validate_exchange_token: callable

    @action(methods=['POST', 'GET'], detail=True, url_path='rdp-file')
    def get_rdp_file(self, *args, **kwargs):
    def get_rdp_file(self, request, *args, **kwargs):
        token = self.get_object()
        token.is_valid()
        filename, content = self.get_rdp_file_info(token)
        filename = '{}.rdp'.format(filename)
        response = HttpResponse(content, content_type='application/octet-stream')

        if is_true(request.query_params.get('reusable')):
            token.set_reusable(True)
            filename = '{}-{}'.format(filename, token.date_expired.strftime('%Y%m%d_%H%M%S'))

        filename += '.rdp'
        response['Content-Disposition'] = 'attachment; filename*=UTF-8\'\'%s' % filename
        return response


@@ -379,6 +384,7 @@ class ConnectionTokenViewSet(ExtraActionApiMixin, RootOrgViewMixin, JMSModelView

        if account.username != AliasAccount.INPUT:
            data['input_username'] = ''

        ticket = self._validate_acl(user, asset, account)
        if ticket:
            data['from_ticket'] = ticket

@@ -413,7 +419,10 @@ class ConnectionTokenViewSet(ExtraActionApiMixin, RootOrgViewMixin, JMSModelView

    def _validate_acl(self, user, asset, account):
        from acls.models import LoginAssetACL
        acls = LoginAssetACL.filter_queryset(user=user, asset=asset, account=account)
        kwargs = {'user': user, 'asset': asset, 'account': account}
        if account.username == AliasAccount.INPUT:
            kwargs['account_username'] = self.input_username
        acls = LoginAssetACL.filter_queryset(**kwargs)
        ip = get_request_ip_or_data(self.request)
        acl = LoginAssetACL.get_match_rule_acls(user, ip, acls)
        if not acl:

@@ -503,20 +512,16 @@ class SuperConnectionTokenViewSet(ConnectionTokenViewSet):
        token.is_valid()
        serializer = self.get_serializer(instance=token)

        expire_now = request.data.get('expire_now', None)
        expire_now = request.data.get('expire_now', True)
        asset_type = token.asset.type
        # Set the default value
        if expire_now is None:
            # TODO: temporarily special-case k8s so its tokens do not expire
            if asset_type in ['k8s', 'kubernetes']:
                expire_now = False
            else:
                expire_now = not settings.CONNECTION_TOKEN_REUSABLE
        if asset_type in ['k8s', 'kubernetes']:
            expire_now = False

        if is_false(expire_now):
            logger.debug('Api specified, now expire now')
        elif token.is_reusable and settings.CONNECTION_TOKEN_REUSABLE:
        if token.is_reusable and settings.CONNECTION_TOKEN_REUSABLE:
            logger.debug('Token is reusable, not expire now')
        elif is_false(expire_now):
            logger.debug('Api specified, now expire now')
        else:
            token.expire()

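
A condensed reading of the reworked expiry decision in SuperConnectionTokenViewSet, rewritten as a small standalone function. This is an interpretation of the hunk, not code from it; the boolean parameters stand in for the request payload and the CONNECTION_TOKEN_REUSABLE setting:

def should_expire_token(asset_type, expire_now, token_is_reusable, reusable_enabled):
    # k8s assets are special-cased: their tokens are never expired immediately.
    if asset_type in ('k8s', 'kubernetes'):
        return False
    # Reusable tokens are honoured before the per-request flag.
    if token_is_reusable and reusable_enabled:
        return False
    # The API may explicitly ask to keep the token alive.
    if expire_now is False:
        return False
    return True

print(should_expire_token('linux', True, False, True))   # True  -> token.expire()
print(should_expire_token('k8s', True, False, True))     # False -> kept alive
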
@@ -0,0 +1,8 @@
from common.utils import get_logger
from .feishu import FeiShuEventSubscriptionCallback

logger = get_logger(__name__)


class LarkEventSubscriptionCallback(FeiShuEventSubscriptionCallback):
    pass

@@ -9,6 +9,7 @@ from common.utils import get_logger
from .. import errors, mixins

__all__ = ['TicketStatusApi']

logger = get_logger(__name__)

@@ -0,0 +1,68 @@
import time
from threading import Thread

from django.conf import settings
from django.contrib.auth import logout
from django.contrib.auth.models import AnonymousUser
from rest_framework import generics
from rest_framework import status
from rest_framework.response import Response

from common.sessions.cache import user_session_manager
from common.utils import get_logger

__all__ = ['UserSessionApi']

logger = get_logger(__name__)


class UserSessionManager:

    def __init__(self, request):
        self.request = request
        self.session = request.session

    def connect(self):
        user_session_manager.add_or_increment(self.session.session_key)

    def disconnect(self):
        user_session_manager.decrement(self.session.session_key)
        if self.should_delete_session():
            thread = Thread(target=self.delay_delete_session)
            thread.start()

    def should_delete_session(self):
        return (self.session.modified or settings.SESSION_SAVE_EVERY_REQUEST) and \
            not self.session.is_empty() and \
            self.session.get_expire_at_browser_close() and \
            not user_session_manager.check_active(self.session.session_key)

    def delay_delete_session(self):
        timeout = 6
        check_interval = 0.5

        start_time = time.time()
        while time.time() - start_time < timeout:
            time.sleep(check_interval)
            if user_session_manager.check_active(self.session.session_key):
                return

        logout(self.request)


class UserSessionApi(generics.RetrieveDestroyAPIView):
    permission_classes = ()

    def retrieve(self, request, *args, **kwargs):
        if isinstance(request.user, AnonymousUser):
            return Response(status=status.HTTP_200_OK)

        UserSessionManager(request).connect()
        return Response(status=status.HTTP_200_OK)

    def destroy(self, request, *args, **kwargs):
        if isinstance(request.user, AnonymousUser):
            return Response(status=status.HTTP_200_OK)

        UserSessionManager(request).disconnect()
        return Response(status=status.HTTP_204_NO_CONTENT)

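
The disconnect path in this new file does not log the user out immediately: it waits a short grace period and only logs out if no new connection re-activates the session. A toy, framework-free version of that idea (timings shortened; the dict stands in for the Redis-backed counter):

import time
from threading import Thread

active_counts = {}  # session_key -> open connection count

def connect(key):
    active_counts[key] = active_counts.get(key, 0) + 1

def disconnect(key, timeout=2, interval=0.5):
    active_counts[key] = active_counts.get(key, 0) - 1

    def watcher():
        start = time.time()
        while time.time() - start < timeout:
            time.sleep(interval)
            if active_counts.get(key, 0) > 0:
                return  # the user reconnected (e.g. page refresh), keep the session
        print(f"logout {key}")  # logout(request) would run here in the real code

    Thread(target=watcher).start()

connect("abc")
disconnect("abc")   # prints "logout abc" after ~2s because nobody reconnected
time.sleep(3)
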
@@ -5,11 +5,13 @@ from django.conf import settings
from django.contrib.auth import login
from django.http.response import HttpResponseRedirect
from rest_framework import serializers
from rest_framework import status
from rest_framework.decorators import action
from rest_framework.permissions import AllowAny
from rest_framework.request import Request
from rest_framework.response import Response

from authentication.errors import ACLError
from common.api import JMSGenericViewSet
from common.const.http import POST, GET
from common.permissions import OnlySuperUser

@@ -17,7 +19,10 @@ from common.serializers import EmptySerializer
from common.utils import reverse, safe_next_url
from common.utils.timezone import utc_now
from users.models import User
from ..errors import SSOAuthClosed
from users.utils import LoginBlockUtil, LoginIpBlockUtil
from ..errors import (
    SSOAuthClosed, AuthFailedError, LoginConfirmBaseError, SSOAuthKeyTTLError
)
from ..filters import AuthKeyQueryDeclaration
from ..mixins import AuthMixin
from ..models import SSOToken

@@ -63,31 +68,58 @@ class SSOViewSet(AuthMixin, JMSGenericViewSet):
        This endpoint violates the `Restful` convention:
        `GET` is supposed to be a safe method, but this endpoint is not.
        """
        status_code = status.HTTP_400_BAD_REQUEST
        request.META['HTTP_X_JMS_LOGIN_TYPE'] = 'W'
        authkey = request.query_params.get(AUTH_KEY)
        next_url = request.query_params.get(NEXT_URL)
        if not next_url or not next_url.startswith('/'):
            next_url = reverse('index')

        if not authkey:
            raise serializers.ValidationError("authkey is required")

        try:
            if not authkey:
                raise serializers.ValidationError("authkey is required")

            authkey = UUID(authkey)
            token = SSOToken.objects.get(authkey=authkey, expired=False)
            # Expire it first: it may only be used once
        except (ValueError, SSOToken.DoesNotExist, serializers.ValidationError) as e:
            error_msg = str(e)
            self.send_auth_signal(success=False, reason=error_msg)
            return Response({'error': error_msg}, status=status_code)

        error_msg = None
        user = token.user
        username = user.username
        ip = self.get_request_ip()

        try:
            if (utc_now().timestamp() - token.date_created.timestamp()) > settings.AUTH_SSO_AUTHKEY_TTL:
                raise SSOAuthKeyTTLError()

            self._check_is_block(username, True)
            self._check_only_allow_exists_user_auth(username)
            self._check_login_acl(user, ip)
            self.check_user_login_confirm_if_need(user)

            self.request.session['auth_backend'] = settings.AUTH_BACKEND_SSO
            login(self.request, user, settings.AUTH_BACKEND_SSO)
            self.send_auth_signal(success=True, user=user)
            self.mark_mfa_ok('otp', user)

            LoginIpBlockUtil(ip).clean_block_if_need()
            LoginBlockUtil(username, ip).clean_failed_count()
            self.clear_auth_mark()
        except (ACLError, LoginConfirmBaseError):  # no need to log these
            pass
        except (AuthFailedError, SSOAuthKeyTTLError) as e:
            error_msg = e.msg
        except Exception as e:
            error_msg = str(e)
        finally:
            token.expired = True
            token.save()
        except (ValueError, SSOToken.DoesNotExist):
            self.send_auth_signal(success=False, reason='authkey_invalid')
            return HttpResponseRedirect(next_url)

        # Check whether the key has expired
        if (utc_now().timestamp() - token.date_created.timestamp()) > settings.AUTH_SSO_AUTHKEY_TTL:
            self.send_auth_signal(success=False, reason='authkey_timeout')
        if error_msg:
            self.send_auth_signal(success=False, username=username, reason=error_msg)
            return Response({'error': error_msg}, status=status_code)
        else:
            return HttpResponseRedirect(next_url)

        user = token.user
        login(self.request, user, settings.AUTH_BACKEND_SSO)
        self.send_auth_signal(success=True, user=user)
        return HttpResponseRedirect(next_url)

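
One piece of the reworked flow is the explicit authkey TTL check that now raises SSOAuthKeyTTLError. A small standalone sketch of that check (the TTL value and helper below are assumptions for illustration; the real code reads settings.AUTH_SSO_AUTHKEY_TTL and utc_now()):

from datetime import datetime, timedelta, timezone

AUTH_SSO_AUTHKEY_TTL = 60  # seconds, assumed value

def authkey_expired(date_created, now=None):
    # The key is only valid for a short window after it was created.
    now = now or datetime.now(timezone.utc)
    return (now - date_created).total_seconds() > AUTH_SSO_AUTHKEY_TTL

created = datetime.now(timezone.utc) - timedelta(seconds=90)
print(authkey_expired(created))  # True -> the view would raise SSOAuthKeyTTLError
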
@@ -4,10 +4,13 @@ from django.contrib import auth
from django.http import HttpResponseRedirect
from django.urls import reverse
from django.utils.http import urlencode
from django.utils.translation import gettext_lazy as _

from authentication.utils import build_absolute_uri
from common.utils import get_logger
from authentication.views.mixins import FlashMessageMixin
from authentication.mixins import authenticate
from common.utils import get_logger


logger = get_logger(__file__)


@@ -39,7 +42,7 @@ class OAuth2AuthRequestView(View):
        return HttpResponseRedirect(redirect_url)


class OAuth2AuthCallbackView(View):
class OAuth2AuthCallbackView(View, FlashMessageMixin):
    http_method_names = ['get', ]

    def get(self, request):

@@ -51,6 +54,11 @@ class OAuth2AuthCallbackView(View):
        if 'code' in callback_params:
            logger.debug(log_prompt.format('Process authenticate'))
            user = authenticate(code=callback_params['code'], request=request)

            if err_msg := getattr(request, 'error_message', ''):
                login_url = reverse('authentication:login') + '?admin=1'
                return self.get_failed_response(login_url, title=_('Authentication failed'), msg=err_msg)

            if user and user.is_valid:
                logger.debug(log_prompt.format('Login: {}'.format(user)))
                auth.login(self.request, user)

@@ -55,6 +55,12 @@ class FeiShuAuthentication(JMSModelBackend):
    pass


class LarkAuthentication(FeiShuAuthentication):
    @staticmethod
    def is_enabled():
        return settings.AUTH_LARK


class SlackAuthentication(JMSModelBackend):
    """
    Does nothing at all 😺

@@ -72,5 +78,6 @@ class AuthorizationTokenAuthentication(JMSModelBackend):
    """
    Does nothing at all 😺
    """

    def authenticate(self, request, **kwargs):
        pass

@@ -52,6 +52,10 @@ class AuthFailedError(Exception):
        return str(self.msg)


class SSOAuthKeyTTLError(Exception):
    msg = 'sso_authkey_timeout'


class BlockGlobalIpLoginError(AuthFailedError):
    error = 'block_global_ip_login'


@@ -33,6 +33,11 @@ class FeiShuNotBound(JMSException):
    default_detail = _('FeiShu is not bound')


class LarkNotBound(JMSException):
    default_code = 'lark_not_bound'
    default_detail = _('Lark is not bound')


class SlackNotBound(JMSException):
    default_code = 'slack_not_bound'
    default_detail = _('Slack is not bound')

@@ -17,10 +17,6 @@ class EncryptedField(forms.CharField):


class UserLoginForm(forms.Form):
    days_auto_login = int(settings.SESSION_COOKIE_AGE / 3600 / 24)
    disable_days_auto_login = settings.SESSION_EXPIRE_AT_BROWSER_CLOSE \
        or days_auto_login < 1

    username = forms.CharField(
        label=_('Username'), max_length=100,
        widget=forms.TextInput(attrs={

@@ -34,15 +30,15 @@ class UserLoginForm(forms.Form):
    )
    auto_login = forms.BooleanField(
        required=False, initial=False,
        widget=forms.CheckboxInput(
            attrs={'disabled': disable_days_auto_login}
        )
        widget=forms.CheckboxInput()
    )

    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        auto_login_field = self.fields['auto_login']
        auto_login_field.label = _("{} days auto login").format(self.days_auto_login or 1)
        auto_login_field.label = _("Auto login")
        if settings.SESSION_EXPIRE_AT_BROWSER_CLOSE:
            auto_login_field.widget = forms.HiddenInput()

    def confirm_login_allowed(self, user):
        if not user.is_staff:

@@ -363,7 +363,6 @@ class AuthACLMixin:
        if acl.is_action(acl.ActionChoices.notice):
            self.request.session['auth_notice_required'] = '1'
            self.request.session['auth_acl_id'] = str(acl.id)
            return

    def _check_third_party_login_acl(self):
        request = self.request

@@ -82,12 +82,15 @@ class ConnectionToken(JMSOrgBaseModel):
        self.save(update_fields=['date_expired'])

    def set_reusable(self, is_reusable):
        if not settings.CONNECTION_TOKEN_REUSABLE:
            return
        self.is_reusable = is_reusable
        if self.is_reusable:
            seconds = settings.CONNECTION_TOKEN_REUSABLE_EXPIRATION
        else:
            seconds = settings.CONNECTION_TOKEN_ONETIME_EXPIRATION
        self.date_expired = timezone.now() + timedelta(seconds=seconds)

        self.date_expired = self.date_created + timedelta(seconds=seconds)
        self.save(update_fields=['is_reusable', 'date_expired'])

    def renewal(self):

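
The change anchors the token expiry to date_created instead of the current time, so repeatedly toggling the reusable flag cannot keep pushing the deadline forward. A sketch with assumed expiration values:

from datetime import datetime, timedelta, timezone

REUSABLE_EXPIRATION = 3600   # assumed stand-in for CONNECTION_TOKEN_REUSABLE_EXPIRATION
ONETIME_EXPIRATION = 300     # assumed stand-in for CONNECTION_TOKEN_ONETIME_EXPIRATION

def new_date_expired(date_created, is_reusable):
    # Expiry is computed from creation time, not from "now".
    seconds = REUSABLE_EXPIRATION if is_reusable else ONETIME_EXPIRATION
    return date_created + timedelta(seconds=seconds)

created = datetime(2024, 1, 1, tzinfo=timezone.utc)
print(new_date_expired(created, True))   # 2024-01-01 01:00:00+00:00, whenever it is called
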
@@ -1,5 +1,3 @@
from importlib import import_module

from django.conf import settings
from django.contrib.auth import user_logged_in
from django.core.cache import cache

@@ -8,6 +6,7 @@ from django_cas_ng.signals import cas_user_authenticated

from apps.jumpserver.settings.auth import AUTHENTICATION_BACKENDS_THIRD_PARTY
from audits.models import UserSession
from common.sessions.cache import user_session_manager
from .signals import post_auth_success, post_auth_failed, user_auth_failed, user_auth_success


@@ -32,8 +31,7 @@ def on_user_auth_login_success(sender, user, request, **kwargs):
    lock_key = 'single_machine_login_' + str(user.id)
    session_key = cache.get(lock_key)
    if session_key and session_key != request.session.session_key:
        session = import_module(settings.SESSION_ENGINE).SessionStore(session_key)
        session.delete()
        user_session_manager.remove(session_key)
        UserSession.objects.filter(key=session_key).delete()
    cache.set(lock_key, request.session.session_key, None)

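
A toy version of the single-machine-login handling above: a per-user cache entry remembers the one session that may stay alive, and a login from elsewhere evicts the previously recorded session. The dict and set below stand in for the Django cache and the session manager:

cache = {}                      # stand-in for django.core.cache.cache
live_sessions = {"old-key"}     # stand-in for user_session_manager / UserSession rows

def on_login_success(user_id, new_session_key):
    lock_key = "single_machine_login_" + str(user_id)
    old_key = cache.get(lock_key)
    if old_key and old_key != new_session_key:
        live_sessions.discard(old_key)   # user_session_manager.remove(old_key)
    cache[lock_key] = new_session_key    # cache.set(lock_key, ..., None)

cache["single_machine_login_1"] = "old-key"
on_login_success(1, "new-key")
print(live_sessions)  # set() -> the earlier session was evicted
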
@@ -95,6 +95,7 @@ function doRequestAuth() {
        }
        clearInterval(interval);
        clearInterval(checkInterval);
        cancelTicket();
        $(".copy-btn").attr('disabled', 'disabled');
        errorMsgRef.html(data.msg)
    }

@@ -22,6 +22,9 @@ urlpatterns = [
    path('feishu/event/subscription/callback/', api.FeiShuEventSubscriptionCallback.as_view(),
         name='feishu-event-subscription-callback'),

    path('lark/event/subscription/callback/', api.LarkEventSubscriptionCallback.as_view(),
         name='lark-event-subscription-callback'),

    path('auth/', api.TokenCreateApi.as_view(), name='user-auth'),
    path('confirm-oauth/', api.ConfirmBindORUNBindOAuth.as_view(), name='confirm-oauth'),
    path('tokens/', api.TokenCreateApi.as_view(), name='auth-token'),

@@ -32,6 +35,7 @@ urlpatterns = [
    path('password/reset-code/', api.UserResetPasswordSendCodeApi.as_view(), name='reset-password-code'),
    path('password/verify/', api.UserPasswordVerifyApi.as_view(), name='user-password-verify'),
    path('login-confirm-ticket/status/', api.TicketStatusApi.as_view(), name='login-confirm-ticket-status'),
    path('user-session/', api.UserSessionApi.as_view(), name='user-session'),
]

urlpatterns += router.urls + passkey_urlpatterns

@@ -49,6 +49,12 @@ urlpatterns = [
    path('feishu/qr/bind/callback/', views.FeiShuQRBindCallbackView.as_view(), name='feishu-qr-bind-callback'),
    path('feishu/qr/login/callback/', views.FeiShuQRLoginCallbackView.as_view(), name='feishu-qr-login-callback'),

    path('lark/bind/start/', views.LarkEnableStartView.as_view(), name='lark-bind-start'),
    path('lark/qr/bind/', views.LarkQRBindView.as_view(), name='lark-qr-bind'),
    path('lark/qr/login/', views.LarkQRLoginView.as_view(), name='lark-qr-login'),
    path('lark/qr/bind/callback/', views.LarkQRBindCallbackView.as_view(), name='lark-qr-bind-callback'),
    path('lark/qr/login/callback/', views.LarkQRLoginCallbackView.as_view(), name='lark-qr-login-callback'),

    path('slack/bind/start/', views.SlackEnableStartView.as_view(), name='slack-bind-start'),
    path('slack/qr/bind/', views.SlackQRBindView.as_view(), name='slack-qr-bind'),
    path('slack/qr/login/', views.SlackQRLoginView.as_view(), name='slack-qr-login'),

@@ -1,8 +1,9 @@
# -*- coding: utf-8 -*-
#
from .login import *
from .mfa import *
from .wecom import *
from .dingtalk import *
from .feishu import *
from .lark import *
from .login import *
from .mfa import *
from .slack import *
from .wecom import *

@@ -1,8 +1,8 @@
from functools import lru_cache

from django.conf import settings
from django.db.utils import IntegrityError
from django.contrib.auth import logout as auth_logout
from django.db.utils import IntegrityError
from django.utils.module_loading import import_string
from django.utils.translation import gettext_lazy as _
from django.views import View

@@ -12,8 +12,8 @@ from authentication import errors
from authentication.mixins import AuthMixin
from authentication.notifications import OAuthBindMessage
from common.utils import get_logger
from common.utils.django import reverse, get_object_or_none
from common.utils.common import get_request_ip
from common.utils.django import reverse, get_object_or_none
from users.models import User
from users.signal_handlers import check_only_allow_exist_user_auth
from .mixins import FlashMessageMixin

@@ -83,7 +83,15 @@ class BaseLoginCallbackView(AuthMixin, FlashMessageMixin, IMClientMixin, View):
        if not self.verify_state():
            return self.get_verify_state_failed_response(redirect_url)

        user_id, other_info = self.client.get_user_id_by_code(code)
        try:
            user_id, other_info = self.client.get_user_id_by_code(code)
        except Exception:
            response = self.get_failed_response(
                login_url, title=self.msg_client_err,
                msg=self.msg_not_found_user_from_client_err
            )
            return response

        if not user_id:
            # This error should not happen in the normal flow; it indicates tampering
            err = self.msg_not_found_user_from_client_err

@@ -21,24 +21,45 @@ from .mixins import FlashMessageMixin

logger = get_logger(__file__)

FEISHU_STATE_SESSION_KEY = '_feishu_state'

class FeiShuEnableStartView(UserVerifyPasswordView):
    category = 'feishu'

    def get_success_url(self):
        referer = self.request.META.get('HTTP_REFERER')
        redirect_url = self.request.GET.get("redirect_url")

        success_url = reverse(f'authentication:{self.category}-qr-bind')

        success_url += '?' + urlencode({
            'redirect_url': redirect_url or referer
        })

        return success_url


class FeiShuQRMixin(UserConfirmRequiredExceptionMixin, PermissionsMixin, FlashMessageMixin, View):
    category = 'feishu'
    error = _('FeiShu Error')
    error_msg = _('FeiShu is already bound')
    state_session_key = f'_{category}_state'

    @property
    def url_object(self):
        return URL()

    def dispatch(self, request, *args, **kwargs):
        try:
            return super().dispatch(request, *args, **kwargs)
        except APIException as e:
            msg = str(e.detail)
            return self.get_failed_response(
                '/',
                _('FeiShu Error'),
                msg
                '/', self.error, msg
            )

    def verify_state(self):
        state = self.request.GET.get('state')
        session_state = self.request.session.get(FEISHU_STATE_SESSION_KEY)
        session_state = self.request.session.get(self.state_session_key)
        if state != session_state:
            return False
        return True

@@ -49,19 +70,18 @@ class FeiShuQRMixin(UserConfirmRequiredExceptionMixin, PermissionsMixin, FlashMe

    def get_qr_url(self, redirect_uri):
        state = random_string(16)
        self.request.session[FEISHU_STATE_SESSION_KEY] = state
        self.request.session[self.state_session_key] = state

        params = {
            'app_id': settings.FEISHU_APP_ID,
            'app_id': getattr(settings, f'{self.category}_APP_ID'.upper()),
            'state': state,
            'redirect_uri': redirect_uri,
        }
        url = URL().authen + '?' + urlencode(params)
        url = self.url_object.authen + '?' + urlencode(params)
        return url

    def get_already_bound_response(self, redirect_url):
        msg = _('FeiShu is already bound')
        response = self.get_failed_response(redirect_url, msg, msg)
        response = self.get_failed_response(redirect_url, self.error_msg, self.error_msg)
        return response


@@ -71,7 +91,7 @@ class FeiShuQRBindView(FeiShuQRMixin, View):
    def get(self, request: HttpRequest):
        redirect_url = request.GET.get('redirect_url')

        redirect_uri = reverse('authentication:feishu-qr-bind-callback', external=True)
        redirect_uri = reverse(f'authentication:{self.category}-qr-bind-callback', external=True)
        redirect_uri += '?' + urlencode({'redirect_url': redirect_url})

        url = self.get_qr_url(redirect_uri)

@@ -81,25 +101,16 @@ class FeiShuQRBindView(FeiShuQRMixin, View):
class FeiShuQRBindCallbackView(FeiShuQRMixin, BaseBindCallbackView):
    permission_classes = (IsAuthenticated,)

    client_type_path = 'common.sdk.im.feishu.FeiShu'
    client_auth_params = {'app_id': 'FEISHU_APP_ID', 'app_secret': 'FEISHU_APP_SECRET'}
    auth_type = 'feishu'
    auth_type_label = _('FeiShu')
    client_type_path = f'common.sdk.im.{auth_type}.FeiShu'


class FeiShuEnableStartView(UserVerifyPasswordView):

    def get_success_url(self):
        referer = self.request.META.get('HTTP_REFERER')
        redirect_url = self.request.GET.get("redirect_url")

        success_url = reverse('authentication:feishu-qr-bind')

        success_url += '?' + urlencode({
            'redirect_url': redirect_url or referer
        })

        return success_url
    @property
    def client_auth_params(self):
        return {
            'app_id': f'{self.auth_type}_APP_ID'.upper(),
            'app_secret': f'{self.auth_type}_APP_SECRET'.upper()
        }


class FeiShuQRLoginView(FeiShuQRMixin, View):

@@ -107,7 +118,7 @@ class FeiShuQRLoginView(FeiShuQRMixin, View):

    def get(self, request: HttpRequest):
        redirect_url = request.GET.get('redirect_url') or reverse('index')
        redirect_uri = reverse('authentication:feishu-qr-login-callback', external=True)
        redirect_uri = reverse(f'authentication:{self.category}-qr-login-callback', external=True)
        redirect_uri += '?' + urlencode({
            'redirect_url': redirect_url,
        })

@@ -119,11 +130,19 @@ class FeiShuQRLoginView(FeiShuQRMixin, View):
class FeiShuQRLoginCallbackView(FeiShuQRMixin, BaseLoginCallbackView):
    permission_classes = (AllowAny,)

    client_type_path = 'common.sdk.im.feishu.FeiShu'
    client_auth_params = {'app_id': 'FEISHU_APP_ID', 'app_secret': 'FEISHU_APP_SECRET'}
    user_type = 'feishu'
    auth_backend = 'AUTH_BACKEND_FEISHU'
    auth_type = user_type
    client_type_path = f'common.sdk.im.{auth_type}.FeiShu'

    msg_client_err = _('FeiShu Error')
    msg_user_not_bound_err = _('FeiShu is not bound')
    msg_not_found_user_from_client_err = _('Failed to get user from FeiShu')

    auth_backend = f'AUTH_BACKEND_{auth_type}'.upper()

    @property
    def client_auth_params(self):
        return {
            'app_id': f'{self.auth_type}_APP_ID'.upper(),
            'app_secret': f'{self.auth_type}_APP_SECRET'.upper()
        }

@@ -0,0 +1,51 @@
from django.utils.translation import gettext_lazy as _

from common.sdk.im.lark import URL
from common.utils import get_logger
from .feishu import (
    FeiShuEnableStartView, FeiShuQRBindView, FeiShuQRBindCallbackView,
    FeiShuQRLoginView, FeiShuQRLoginCallbackView
)

logger = get_logger(__file__)


class LarkEnableStartView(FeiShuEnableStartView):
    category = 'lark'


class BaseLarkQRMixin:
    category = 'lark'
    error = _('Lark Error')
    error_msg = _('Lark is already bound')
    state_session_key = f'_{category}_state'

    @property
    def url_object(self):
        return URL()


class LarkQRBindView(BaseLarkQRMixin, FeiShuQRBindView):
    pass


class LarkQRBindCallbackView(BaseLarkQRMixin, FeiShuQRBindCallbackView):
    auth_type = 'lark'
    auth_type_label = auth_type.capitalize()
    client_type_path = f'common.sdk.im.{auth_type}.Lark'


class LarkQRLoginView(BaseLarkQRMixin, FeiShuQRLoginView):
    pass


class LarkQRLoginCallbackView(BaseLarkQRMixin, FeiShuQRLoginCallbackView):
    user_type = 'lark'
    auth_type = user_type
    client_type_path = f'common.sdk.im.{auth_type}.Lark'

    msg_client_err = _('Lark Error')
    msg_user_not_bound_err = _('Lark is not bound')
    msg_not_found_user_from_client_err = _('Failed to get user from Lark')

    auth_backend = f'AUTH_BACKEND_{auth_type}'.upper()

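
The Lark views reuse the entire FeiShu flow and only swap class attributes (category, error strings, URL object). A stripped-down sketch of that subclass-by-attributes pattern, with made-up attribute names and URLs:

from urllib.parse import urlencode

class BaseQRView:
    category = 'feishu'
    authen_url = 'https://open.feishu.cn/authen'        # assumed URL shape

    def qr_url(self, state, redirect_uri):
        # Everything provider-specific is read from class attributes.
        params = {'state': state, 'redirect_uri': redirect_uri}
        return f"{self.authen_url}?{urlencode(params)}"

class LarkQRView(BaseQRView):
    # Only the provider-specific bits change; the behaviour is inherited.
    category = 'lark'
    authen_url = 'https://open.larksuite.com/authen'     # assumed

print(LarkQRView().qr_url('abc123', 'https://jms.example.com/callback'))
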
@@ -91,6 +91,12 @@ class UserLoginContextMixin:
            'url': reverse('authentication:feishu-qr-login'),
            'logo': static('img/login_feishu_logo.png')
        },
        {
            'name': 'Lark',
            'enabled': settings.AUTH_LARK,
            'url': reverse('authentication:lark-qr-login'),
            'logo': static('img/login_lark_logo.png')
        },
        {
            'name': _('Slack'),
            'enabled': settings.AUTH_SLACK,

@@ -113,6 +119,10 @@ class UserLoginContextMixin:
            'title': '中文(简体)',
            'code': 'zh-hans'
        },
        {
            'title': '中文(繁體)',
            'code': 'zh-hant'
        },
        {
            'title': 'English',
            'code': 'en'

@@ -6,6 +6,7 @@ from typing import Callable

from django.db import models
from django.db.models.signals import m2m_changed
from rest_framework.request import Request
from rest_framework.response import Response
from rest_framework.settings import api_settings


@@ -19,7 +20,7 @@ from .serializer import SerializerMixin

__all__ = [
    'CommonApiMixin', 'PaginatedResponseMixin', 'RelationMixin',
    'ExtraFilterFieldsMixin',
    'ExtraFilterFieldsMixin'
]

logger = get_logger(__name__)

@@ -89,6 +90,7 @@ class RelationMixin:

class QuerySetMixin:
    action: str
    request: Request
    get_serializer_class: Callable
    get_queryset: Callable


@@ -98,8 +100,18 @@ class QuerySetMixin:
            return queryset
        if self.action == 'metadata':
            queryset = queryset.none()
        queryset = self.setup_eager_loading(queryset)
        return queryset

    # Todo: consider a custom pagination in the future
    def setup_eager_loading(self, queryset):
        if self.request.query_params.get('format') not in ['csv', 'xlsx']:
            return queryset
        serializer_class = self.get_serializer_class()
        if not serializer_class or not hasattr(serializer_class, 'setup_eager_loading'):
            return queryset
        return serializer_class.setup_eager_loading(queryset)

    def paginate_queryset(self, queryset):
        page = super().paginate_queryset(queryset)
        serializer_class = self.get_serializer_class()

@@ -186,10 +198,7 @@ class OrderingFielderFieldsMixin:
            model = self.queryset.model
        else:
            queryset = self.get_queryset()
            if isinstance(queryset, list):
                model = None
            else:
                model = queryset.model
            model = None if isinstance(queryset, list) else queryset.model

        if not model:
            return []

@@ -27,6 +27,8 @@ class SerializerMixin:
            return None
        serializer_classes = dict(serializer_classes)
        view_action = self.request.query_params.get('action') or self.action or 'list'
        if self.request.query_params.get('format'):
            view_action = 'retrieve'
        serializer_class = serializer_classes.get(view_action)

        if serializer_class is None:

@@ -469,7 +469,7 @@ class JSONManyToManyDescriptor:
            rule_match = rule.get('match', 'exact')

            custom_filter_q = None
            spec_attr_filter = getattr(to_model, "get_filter_{}_attr_q".format(rule['name']), None)
            spec_attr_filter = getattr(to_model, "get_{}_filter_attr_q".format(rule['name']), None)
            if spec_attr_filter:
                custom_filter_q = spec_attr_filter(rule_value, rule_match)
            elif custom_attr_filter:

@@ -478,59 +478,61 @@ class JSONManyToManyDescriptor:
                    custom_q &= custom_filter_q
                continue

            if rule_match == 'in':
                res &= value in rule_value or '*' in rule_value
            elif rule_match == 'exact':
                res &= value == rule_value or rule_value == '*'
            elif rule_match == 'contains':
                res &= (rule_value in value)
            elif rule_match == 'startswith':
                res &= str(value).startswith(str(rule_value))
            elif rule_match == 'endswith':
                res &= str(value).endswith(str(rule_value))
            elif rule_match == 'regex':
                try:
                    matched = bool(re.search(r'{}'.format(rule_value), value))
                except Exception as e:
                    logging.error('Error regex match: %s', e)
                    matched = False
                res &= matched
            elif rule_match == 'not':
                res &= value != rule_value
            elif rule['match'] == 'gte':
                res &= value >= rule_value
            elif rule['match'] == 'lte':
                res &= value <= rule_value
            elif rule['match'] == 'gt':
                res &= value > rule_value
            elif rule['match'] == 'lt':
                res &= value < rule_value
            elif rule['match'] == 'ip_in':
                if isinstance(rule_value, str):
                    rule_value = [rule_value]
                res &= '*' in rule_value or contains_ip(value, rule_value)
            elif rule['match'].startswith('m2m'):
                if isinstance(value, Manager):
                    value = value.values_list('id', flat=True)
                elif isinstance(value, QuerySet):
                    value = value.values_list('id', flat=True)
                elif isinstance(value, models.Model):
                    value = [value.id]
                if isinstance(rule_value, (str, int)):
                    rule_value = [rule_value]
                value = set(map(str, value))
                rule_value = set(map(str, rule_value))
            match rule_match:
                case 'in':
                    res &= value in rule_value or '*' in rule_value
                case 'exact':
                    res &= value == rule_value or rule_value == '*'
                case 'contains':
                    res &= rule_value in value
                case 'startswith':
                    res &= str(value).startswith(str(rule_value))
                case 'endswith':
                    res &= str(value).endswith(str(rule_value))
                case 'regex':
                    try:
                        matched = bool(re.search(r'{}'.format(rule_value), value))
                    except Exception as e:
                        logging.error('Error regex match: %s', e)
                        matched = False
                    res &= matched
                case 'not':
                    res &= value != rule_value
                case 'gte' | 'lte' | 'gt' | 'lt':
                    operations = {
                        'gte': lambda x, y: x >= y,
                        'lte': lambda x, y: x <= y,
                        'gt': lambda x, y: x > y,
                        'lt': lambda x, y: x < y
                    }
                    res &= operations[rule_match](value, rule_value)
                case 'ip_in':
                    if isinstance(rule_value, str):
                        rule_value = [rule_value]
                    res &= '*' in rule_value or contains_ip(value, rule_value)
                case rule_match if rule_match.startswith('m2m'):
                    if isinstance(value, Manager):
                        value = value.values_list('id', flat=True)
                    elif isinstance(value, QuerySet):
                        value = value.values_list('id', flat=True)
                    elif isinstance(value, models.Model):
                        value = [value.id]
                    if isinstance(rule_value, (str, int)):
                        rule_value = [rule_value]
                    value = set(map(str, value))
                    rule_value = set(map(str, rule_value))

                if rule['match'] == 'm2m_all':
                    res &= rule_value.issubset(value)
                else:
                    res &= bool(value & rule_value)
            else:
                logging.error("unknown match: {}".format(rule['match']))
                res &= False
                    if rule['match'] == 'm2m_all':
                        res &= rule_value.issubset(value)
                    else:
                        res &= bool(value & rule_value)
                case __:
                    logging.error("unknown match: {}".format(rule['match']))
                    res &= False

            if not res:
                return res

        if custom_q:
            res &= to_model.objects.filter(custom_q).filter(id=obj.id).exists()
        return res

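
The refactor replaces the long elif chain with a match statement and folds the four range comparisons into a small lookup table. A self-contained sketch of those two moves (Python 3.10+, values below are made up):

import operator

OPERATIONS = {
    'gte': operator.ge,
    'lte': operator.le,
    'gt': operator.gt,
    'lt': operator.lt,
}

def match_rule(value, rule_match, rule_value):
    match rule_match:
        case 'exact':
            return value == rule_value or rule_value == '*'
        case 'in':
            return value in rule_value or '*' in rule_value
        case 'gte' | 'lte' | 'gt' | 'lt':
            # All four comparisons share one branch via the lookup table.
            return OPERATIONS[rule_match](value, rule_value)
        case _:
            return False

print(match_rule(5, 'gte', 3))          # True
print(match_rule('dev', 'in', ['*']))   # True
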
@@ -3,6 +3,7 @@
import base64
import json
import logging
from collections import defaultdict

from django.core.cache import cache
from django.core.exceptions import ImproperlyConfigured

@@ -12,6 +13,7 @@ from rest_framework import filters
from rest_framework.compat import coreapi, coreschema
from rest_framework.fields import DateTimeField
from rest_framework.serializers import ValidationError
from rest_framework.filters import OrderingFilter

from common import const
from common.db.fields import RelatedManager

@@ -23,6 +25,7 @@ __all__ = [
    'IDInFilterBackend', "CustomFilterBackend",
    "BaseFilterSet", 'IDNotFilterBackend',
    'NotOrRelFilterBackend', 'LabelFilterBackend',
    'RewriteOrderingFilter'
]


@@ -180,7 +183,7 @@ class LabelFilterBackend(filters.BaseFilterBackend):
    ]

    @staticmethod
    def parse_label_ids(labels_id):
    def parse_labels(labels_id):
        from labels.models import Label
        label_ids = [i.strip() for i in labels_id.split(',')]
        cleaned = []

@@ -201,8 +204,8 @@ class LabelFilterBackend(filters.BaseFilterBackend):
            q = Q()
            for kwarg in args:
                q |= Q(**kwarg)
            ids = Label.objects.filter(q).values_list('id', flat=True)
            cleaned.extend(list(ids))
            labels = Label.objects.filter(q)
            cleaned.extend(list(labels))
        return cleaned

    def filter_queryset(self, request, queryset, view):

@@ -221,13 +224,23 @@ class LabelFilterBackend(filters.BaseFilterBackend):
        app_label = model._meta.app_label
        model_name = model._meta.model_name

        resources = labeled_resource_cls.objects.filter(
        full_resources = labeled_resource_cls.objects.filter(
            res_type__app_label=app_label, res_type__model=model_name,
        )
        label_ids = self.parse_label_ids(labels_id)
        resources = model.filter_resources_by_labels(resources, label_ids)
        res_ids = resources.values_list('res_id', flat=True)
        queryset = queryset.filter(id__in=set(res_ids))
        labels = self.parse_labels(labels_id)
        grouped = defaultdict(set)
        for label in labels:
            grouped[label.name].add(label.id)

        matched_ids = set()
        for name, label_ids in grouped.items():
            resources = model.filter_resources_by_labels(full_resources, label_ids, rel='any')
            res_ids = resources.values_list('res_id', flat=True)
            if not matched_ids:
                matched_ids = set(res_ids)
            else:
                matched_ids &= set(res_ids)
        queryset = queryset.filter(id__in=matched_ids)
        return queryset

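
The rewritten filter groups label ids by label name, OR-s the labels that share a name, and intersects the per-name result sets. A toy model of that AND-across-names / OR-within-a-name behaviour, with made-up data:

from collections import defaultdict

labels = [("env", 1), ("env", 2), ("team", 3)]                 # (name, label_id)
label_to_resources = {1: {"a", "b"}, 2: {"c"}, 3: {"b", "c"}}  # label_id -> resource ids

grouped = defaultdict(set)
for name, label_id in labels:
    grouped[name].add(label_id)

matched = None
for name, ids in grouped.items():
    res_ids = set().union(*(label_to_resources[i] for i in ids))  # OR within one name
    matched = res_ids if matched is None else matched & res_ids   # AND across names

print(matched)  # {'b', 'c'}
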
@@ -324,3 +337,17 @@ class NotOrRelFilterBackend(filters.BaseFilterBackend):
        queryset.query.where.connector = 'OR'
        queryset._result_cache = None
        return queryset


class RewriteOrderingFilter(OrderingFilter):
    default_ordering_if_has = ('name', )

    def get_default_ordering(self, view):
        ordering = super().get_default_ordering(view)
        # view.ordering = [] means "no ordering", which saves work (e.g. a user's authorized assets)
        if ordering is not None:
            return ordering
        ordering_fields = getattr(view, 'ordering_fields', self.ordering_fields)
        if ordering_fields:
            ordering = tuple([f for f in ordering_fields if f in self.default_ordering_if_has])
        return ordering

@@ -4,6 +4,7 @@ import re
from datetime import datetime

import pyzipper
from django.conf import settings
from django.utils.translation import gettext_lazy as _
from rest_framework import serializers
from rest_framework.renderers import BaseRenderer

@@ -16,7 +17,7 @@ logger = get_logger(__file__)


class BaseFileRenderer(BaseRenderer):
    # 渲染模版标识, 导入、导出、更新模版: ['import', 'update', 'export']
    # 渲染模板标识, 导入、导出、更新模板: ['import', 'update', 'export']
    template = 'export'
    serializer = None


@@ -77,7 +78,7 @@ class BaseFileRenderer(BaseRenderer):
            results = [results[0]] if results else results
        else:
            # Limit the amount of data
            results = results[:10000]
            results = results[:settings.MAX_LIMIT_PER_PAGE]
        # Some UUID fields are converted to strings here
        results = json.loads(json.dumps(results, cls=encoders.JSONEncoder))
        return results

@@ -23,7 +23,14 @@ class CSVFileRenderer(BaseFileRenderer):
        self.writer = csv_writer

    def write_row(self, row):
        self.writer.writerow(row)
        row_escape = []
        for d in row:
            if isinstance(d, str) and d.strip().startswith(('=', '@')):
                d = "'{}".format(d)
                row_escape.append(d)
            else:
                row_escape.append(d)
        self.writer.writerow(row_escape)

    def get_rendered_value(self):
        value = self.buffer.getvalue()

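
The new write_row() defends against CSV formula injection by quoting cells that a spreadsheet would evaluate. A standalone sketch of the same escaping rule using the csv module:

import csv
import io

def escape_row(row):
    out = []
    for cell in row:
        # Mirror the rule above: only string cells starting with '=' or '@' are prefixed.
        if isinstance(cell, str) and cell.strip().startswith(('=', '@')):
            cell = "'{}".format(cell)
        out.append(cell)
    return out

buf = io.StringIO()
csv.writer(buf).writerow(escape_row(['alice', '=HYPERLINK("http://evil")', 42]))
print(buf.getvalue().strip())  # alice,"'=HYPERLINK(""http://evil"")",42
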
@@ -25,7 +25,9 @@ class ExcelFileRenderer(BaseFileRenderer):
                # Handle illegal characters
                column_count += 1
                cell_value = ILLEGAL_CHARACTERS_RE.sub(r'', str(cell_value))
                self.ws.cell(row=self.row_count, column=column_count, value=str(cell_value))
                cell = self.ws.cell(row=self.row_count, column=column_count, value=str(cell_value))
                # Force the cell type to plain text so formulas are not executed
                cell.data_type = 's'

    def after_render(self):
        for col in self.ws.columns:

@@ -27,7 +27,7 @@ class Services(TextChoices):
            cls.flower: services.FlowerService,
            cls.celery_default: services.CeleryDefaultService,
            cls.celery_ansible: services.CeleryAnsibleService,
            cls.beat: services.BeatService
            cls.beat: services.BeatService,
        }
        return services_map.get(name)

@@ -12,8 +12,8 @@ class CeleryBaseService(BaseService):
    @property
    def cmd(self):
        print('\n- Start Celery as Distributed Task Queue: {}'.format(self.queue.capitalize()))
        ansible_config_path = os.path.join(settings.APPS_DIR, 'ops', 'ansible', 'ansible.cfg')
        ansible_modules_path = os.path.join(settings.APPS_DIR, 'ops', 'ansible', 'modules')
        ansible_config_path = os.path.join(settings.APPS_DIR, 'libs', 'ansible', 'ansible.cfg')
        ansible_modules_path = os.path.join(settings.APPS_DIR, 'libs', 'ansible', 'modules')
        os.environ.setdefault('LC_ALL', 'C.UTF-8')
        os.environ.setdefault('PYTHONOPTIMIZE', '1')
        os.environ.setdefault('ANSIBLE_FORCE_COLOR', 'True')

@@ -2,24 +2,18 @@ import json

from rest_framework.exceptions import APIException

from django.conf import settings
from users.utils import construct_user_email
from common.utils.common import get_logger
from common.sdk.im.utils import digest
from common.sdk.im.mixin import RequestMixin, BaseRequest
from common.sdk.im.utils import digest
from common.utils.common import get_logger
from users.utils import construct_user_email

logger = get_logger(__name__)


class URL:
    # https://open.feishu.cn/document/ukTMukTMukTM/uEDO4UjLxgDO14SM4gTN
    @property
    def host(self):
        if settings.FEISHU_VERSION == 'feishu':
            h = 'https://open.feishu.cn'
        else:
            h = 'https://open.larksuite.com'
        return h

    host = 'https://open.feishu.cn'

    @property
    def authen(self):

@@ -87,12 +81,13 @@ class FeiShu(RequestMixin):
    """
    Errors not caused by business data raise exceptions directly: that means a system configuration error and business code does not need to handle it
    """
    requests_cls = FeishuRequests

    def __init__(self, app_id, app_secret, timeout=None):
        self._app_id = app_id or ''
        self._app_secret = app_secret or ''

        self._requests = FeishuRequests(
        self._requests = self.requests_cls(
            app_id=app_id,
            app_secret=app_secret,
            timeout=timeout

@@ -130,7 +125,7 @@ class FeiShu(RequestMixin):
        body['receive_id'] = user_id

        try:
            logger.info(f'Feishu send text: user_ids={user_ids} msg={msg}')
            logger.info(f'{self.__class__.__name__} send text: user_ids={user_ids} msg={msg}')
            self._requests.post(URL().send_message, params=params, json=body)
        except APIException as e:
            # Only handle predictable errors

@@ -0,0 +1,16 @@
from common.utils.common import get_logger
from ..feishu import URL as FeiShuURL, FeishuRequests, FeiShu

logger = get_logger(__name__)


class URL(FeiShuURL):
    host = 'https://open.larksuite.com'


class LarkRequests(FeishuRequests):
    pass


class Lark(FeiShu):
    requests_cls = LarkRequests

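
The new requests_cls attribute lets the Lark SDK reuse the FeiShu client wholesale: the base constructor instantiates whatever request class the subclass declares, so Lark only changes the host. A minimal sketch of the hook with simplified stand-in classes:

class FeishuLikeRequests:
    host = 'https://open.feishu.cn'

    def __init__(self, app_id, app_secret):
        self.app_id, self.app_secret = app_id, app_secret

class LarkLikeRequests(FeishuLikeRequests):
    host = 'https://open.larksuite.com'

class Client:
    requests_cls = FeishuLikeRequests   # hook point, overridden by subclasses

    def __init__(self, app_id, app_secret):
        self._requests = self.requests_cls(app_id, app_secret)

class LarkClient(Client):
    requests_cls = LarkLikeRequests

print(LarkClient('id', 'secret')._requests.host)  # https://open.larksuite.com
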
@@ -43,6 +43,7 @@ class CustomSMS(BaseSMSClient):
                raise JMSException(detail=response.text, code=response.status_code)
        except Exception as exc:
            logger.error('Custom sms error: {}'.format(exc))
            raise JMSException(exc)


client = CustomSMS

@@ -1,16 +1,19 @@
import re
from importlib import import_module

from django.conf import settings
from django.contrib.sessions.backends.cache import (
    SessionStore as DjangoSessionStore
)
from django.core.cache import cache
from django.core.cache import cache, caches

from jumpserver.utils import get_current_request


class SessionStore(DjangoSessionStore):
    ignore_urls = [
        r'^/api/v1/users/profile/'
        r'^/api/v1/users/profile/',
        r'^/api/v1/authentication/user-session/'
    ]

    def __init__(self, *args, **kwargs):

@@ -32,10 +35,16 @@ class RedisUserSessionManager:
    def add_or_increment(self, session_key):
        self.client.hincrby(self.JMS_SESSION_KEY, session_key, 1)

    def decrement_or_remove(self, session_key):
        new_count = self.client.hincrby(self.JMS_SESSION_KEY, session_key, -1)
        if new_count <= 0:
    def decrement(self, session_key):
        self.client.hincrby(self.JMS_SESSION_KEY, session_key, -1)

    def remove(self, session_key):
        try:
            self.client.hdel(self.JMS_SESSION_KEY, session_key)
            session_store = import_module(settings.SESSION_ENGINE).SessionStore(session_key)
            session_store.delete()
        except Exception:
            pass

    def check_active(self, session_key):
        count = self.client.hget(self.JMS_SESSION_KEY, session_key)

@@ -52,5 +61,12 @@ class RedisUserSessionManager:
            session_keys.append(key)
        return session_keys

    @staticmethod
    def get_keys():
        session_store_cls = import_module(settings.SESSION_ENGINE).SessionStore
        cache_key_prefix = session_store_cls.cache_key_prefix
        keys = caches[settings.SESSION_CACHE_ALIAS].iter_keys('*')
        return [k.replace(cache_key_prefix, '') for k in keys]


user_session_manager = RedisUserSessionManager()

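
decrement_or_remove() is split into two operations: decrement() only lowers a per-session connection counter, while remove() is the explicit eviction that also drops the backing session. A toy model with a Counter standing in for the Redis hash:

from collections import Counter

counts = Counter()   # stand-in for the Redis hash behind hincrby/hdel

def add_or_increment(key):
    counts[key] += 1

def decrement(key):
    counts[key] -= 1                 # may reach 0, but nothing is deleted here

def remove(key):
    counts.pop(key, None)            # plus deleting the Django session in the real code

add_or_increment('s1'); add_or_increment('s1'); decrement('s1')
print(counts['s1'])                  # 1 -> still active in another tab
remove('s1')
print(counts['s1'])                  # 0 -> evicted
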
@@ -12,7 +12,7 @@ class ReplayStorageHandler(BaseStorageHandler):
        # Get the path name in the external storage
        session_path = self.obj.find_ok_relative_path_in_storage(storage)
        if not session_path:
            return None
            return None, None

        # Build the real local storage path from the external path's suffix
        return session_path, self.obj.get_local_path_by_relative_path(session_path)

@@ -30,7 +30,7 @@ class SendAndVerifyCodeUtil(object):
        self.other_args = kwargs

    def gen_and_send_async(self):
        return send_async.delay(self)
        return send_async.apply_async(kwargs={"sender": self}, priority=100)

    def gen_and_send(self):
        ttl = self.__ttl()

@@ -118,9 +118,14 @@ class DateTimeMixin:
        return self.get_logs_queryset_filter(qs, 'date_start')

    @lazyproperty
    def command_queryset(self):
        qs = Command.objects.all()
        return self.get_logs_queryset_filter(qs, 'timestamp', is_timestamp=True)
    def command_type_queryset_tuple(self):
        type_queryset_tuple = Command.get_all_type_queryset_tuple()
        return (
            (tp, self.get_logs_queryset_filter(
                qs, 'timestamp', is_timestamp=True
            ))
            for tp, qs in type_queryset_tuple
        )

    @lazyproperty
    def job_logs_queryset(self):

@@ -131,7 +136,7 @@ class DateTimeMixin:
class DatesLoginMetricMixin:
    dates_list: list
    date_start_end: tuple
    command_queryset: Command.objects
    command_type_queryset_tuple: tuple
    sessions_queryset: Session.objects
    ftp_logs_queryset: FTPLog.objects
    job_logs_queryset: JobLog.objects

@@ -229,13 +234,29 @@ class DatesLoginMetricMixin:
    def change_password_logs_amount(self):
        return self.password_change_logs_queryset.count()

    @lazyproperty
    def command_statistics(self):
        from terminal.const import CommandStorageType
        total_amount = 0
        danger_amount = 0
        for tp, qs in self.command_type_queryset_tuple:
            if tp == CommandStorageType.es:
                total_amount += qs.count(limit_to_max_result_window=False)
                danger_amount += qs.filter(risk_level=RiskLevelChoices.reject).count(limit_to_max_result_window=False)
            else:
                total_amount += qs.count()
                danger_amount += qs.filter(risk_level=RiskLevelChoices.reject).count()
        return total_amount, danger_amount

    @lazyproperty
    def commands_amount(self):
        return self.command_queryset.count()
        total_amount, __ = self.command_statistics
        return total_amount

    @lazyproperty
    def commands_danger_amount(self):
        return self.command_queryset.filter(risk_level=RiskLevelChoices.reject).count()
        __, danger_amount = self.command_statistics
        return danger_amount

    @lazyproperty
    def job_logs_running_amount(self):

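
command_statistics iterates over one queryset per command storage type and sums the totals, passing limit_to_max_result_window=False only to the Elasticsearch-backed queryset. A rough standalone model of that aggregation (the classes below are stand-ins, not the real backends):

class DBQuerySet:
    def __init__(self, rows):
        self.rows = rows

    def count(self):
        return len(self.rows)

class ESQuerySet(DBQuerySet):
    def count(self, limit_to_max_result_window=True):
        # The flag only matters for the real ES backend (it lifts the result-window cap).
        return len(self.rows)

type_queryset_tuple = [('db', DBQuerySet([1, 2, 3])), ('es', ESQuerySet([4, 5]))]

total = 0
for tp, qs in type_queryset_tuple:
    if tp == 'es':
        total += qs.count(limit_to_max_result_window=False)
    else:
        total += qs.count()

print(total)  # 5
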
@@ -277,6 +277,7 @@ class Config(dict):
        'AUTH_LDAP_START_TLS': False,
        'AUTH_LDAP_USER_ATTR_MAP': {"username": "cn", "name": "sn", "email": "mail"},
        'AUTH_LDAP_CONNECT_TIMEOUT': 10,
        'AUTH_LDAP_CACHE_TIMEOUT': 3600 * 24 * 30,
        'AUTH_LDAP_SEARCH_PAGED_SIZE': 1000,
        'AUTH_LDAP_SYNC_IS_PERIODIC': False,
        'AUTH_LDAP_SYNC_INTERVAL': None,

@@ -407,7 +408,11 @@ class Config(dict):
        'AUTH_FEISHU': False,
        'FEISHU_APP_ID': '',
        'FEISHU_APP_SECRET': '',
        'FEISHU_VERSION': 'feishu',

        # Lark
        'AUTH_LARK': False,
        'LARK_APP_ID': '',
        'LARK_APP_SECRET': '',

        # Slack
        'AUTH_SLACK': False,

@@ -609,7 +614,11 @@ class Config(dict):

        'FILE_UPLOAD_SIZE_LIMIT_MB': 200,

        'TICKET_APPLY_ASSET_SCOPE': 'all'
        'TICKET_APPLY_ASSET_SCOPE': 'all',

        # Ansible Receptor
        'ANSIBLE_RECEPTOR_ENABLE': True,
        'ANSIBLE_RECEPTOR_SOCK_PATH': '{}/data/share/control.sock'.format(PROJECT_DIR)
    }

    old_config_map = {

@@ -1,5 +1,7 @@
# -*- coding: utf-8 -*-
#
import datetime

from django.conf import settings
from django.templatetags.static import static
from django.utils.translation import gettext_lazy as _

@@ -12,17 +14,18 @@ default_interface = dict((
    ('login_title', _('JumpServer Open Source Bastion Host')),
    ('theme', 'classic_green'),
    ('theme_info', {}),
    ('beian_link', ''),
    ('beian_text', '')
    ('footer_content', ''),
))

current_year = datetime.datetime.now().year

default_context = {
    'DEFAULT_PK': '00000000-0000-0000-0000-000000000000',
    'LOGIN_CAS_logo_logout': static('img/login_cas_logo.png'),
    'LOGIN_WECOM_logo_logout': static('img/login_wecom_logo.png'),
    'LOGIN_DINGTALK_logo_logout': static('img/login_dingtalk_logo.png'),
    'LOGIN_FEISHU_logo_logout': static('img/login_feishu_logo.png'),
    'COPYRIGHT': 'FIT2CLOUD 飞致云' + ' © 2014-2023',
    'COPYRIGHT': f'FIT2CLOUD 飞致云 © 2014-{current_year}',
    'INTERFACE': default_interface,
}

Some files were not shown because too many files have changed in this diff.