Merge pull request #13452 from jumpserver/dev

v3.10.11-lts
Bryan 2024-06-19 16:01:26 +08:00 committed by GitHub
commit e4ac73896f
59 changed files with 1484 additions and 984 deletions


@@ -1,35 +0,0 @@
---
name: 需求建议
about: 提出针对本项目的想法和建议
title: "[Feature] 需求标题"
labels: 类型:需求
assignees:
- ibuler
- baijiangjie
---
## 注意
_针对过于简单的需求描述不予考虑。请确保提供足够的细节和信息以支持功能的开发和实现。_
## 功能名称
[在这里输入功能的名称或标题]
## 功能描述
[在这里描述该功能的详细内容,包括其作用、目的和所需的功能]
## 用户故事(可选)
[如果适用,可以提供用户故事来更好地理解该功能的使用场景和用户期望]
## 功能要求
- [要求1:描述该功能的具体要求,如界面设计、交互逻辑等]
- [要求2:描述该功能的另一个具体要求]
- [以此类推,列出所有相关的功能要求]
## 示例或原型(可选)
[如果有的话,提供该功能的示例或原型图以更好地说明功能的实现方式]
## 优先级
[描述该功能的优先级,如高、中、低,或使用数字等其他标识]
## 备注(可选)
[在这里添加任何其他相关信息或备注]

.github/ISSUE_TEMPLATE/1_bug_report.yml (new file)

@@ -0,0 +1,72 @@
name: '🐛 Bug Report'
description: 'Report an Bug'
title: '[Bug] '
labels: ['🐛 Bug']
assignees:
- baijiangjie
body:
- type: input
attributes:
label: 'Product Version'
description: The versions prior to v2.28 (inclusive) are no longer supported.
validations:
required: true
- type: checkboxes
attributes:
label: 'Product Edition'
options:
- label: 'Community Edition'
- label: 'Enterprise Edition'
- label: 'Enterprise Trial Edition'
validations:
required: true
- type: checkboxes
attributes:
label: 'Installation Method'
options:
- label: 'Online Installation (One-click command installation)'
- label: 'Offline Package Installation'
- label: 'All-in-One'
- label: '1Panel'
- label: 'Kubernetes'
- label: 'Source Code'
- type: textarea
attributes:
label: 'Environment Information'
description: Please provide a clear and concise description outlining your environment information.
validations:
required: true
- type: textarea
attributes:
label: '🐛 Bug Description'
description:
Please provide a clear and concise description of the defect. If the issue is complex, please provide detailed explanations. <br/>
Unclear descriptions will not be processed. Please ensure you provide enough detail and information to support replicating and fixing the defect.
validations:
required: true
- type: textarea
attributes:
label: 'Recurrence Steps'
description: Please provide a clear and concise description outlining how to reproduce the issue.
validations:
required: true
- type: textarea
attributes:
label: 'Expected Behavior'
description: Please provide a clear and concise description of what you expect to happen.
- type: textarea
attributes:
label: 'Additional Information'
description: Please add any additional background information about the issue here.
- type: textarea
attributes:
label: 'Attempted Solutions'
description: If you have already attempted to solve the issue, please list the solutions you have tried here.


@@ -0,0 +1,72 @@
name: '🐛 反馈缺陷'
description: '反馈一个缺陷'
title: '[Bug] '
labels: ['🐛 Bug']
assignees:
- baijiangjie
body:
- type: input
attributes:
label: '产品版本'
description: 不再支持 v2.28(含)之前的版本。
validations:
required: true
- type: checkboxes
attributes:
label: '版本类型'
options:
- label: '社区版'
- label: '企业版'
- label: '企业试用版'
validations:
required: true
- type: checkboxes
attributes:
label: '安装方式'
options:
- label: '在线安装 (一键命令安装)'
- label: '离线包安装'
- label: 'All-in-One'
- label: '1Panel'
- label: 'Kubernetes'
- label: '源码安装'
- type: textarea
attributes:
label: '环境信息'
description: 请提供一个清晰且简洁的描述,说明你的环境信息。
validations:
required: true
- type: textarea
attributes:
label: '🐛 缺陷描述'
description: |
请提供一个清晰且简洁的缺陷描述,如果问题比较复杂,也请详细说明。<br/>
针对不清晰的描述信息将不予处理。请确保提供足够的细节和信息,以支持对缺陷进行复现和修复。
validations:
required: true
- type: textarea
attributes:
label: '复现步骤'
description: 请提供一个清晰且简洁的描述,说明如何复现问题。
validations:
required: true
- type: textarea
attributes:
label: '期望结果'
description: 请提供一个清晰且简洁的描述,说明你期望发生什么。
- type: textarea
attributes:
label: '补充信息'
description: 在这里添加关于问题的任何其他背景信息。
- type: textarea
attributes:
label: '尝试过的解决方案'
description: 如果你已经尝试解决问题,请在此列出你尝试过的解决方案。


@@ -0,0 +1,56 @@
name: '⭐️ Feature Request'
description: 'Suggest an idea'
title: '[Feature] '
labels: ['⭐️ Feature Request']
assignees:
- baijiangjie
- ibuler
body:
- type: input
attributes:
label: 'Product Version'
description: The versions prior to v2.28 (inclusive) are no longer supported.
validations:
required: true
- type: checkboxes
attributes:
label: 'Product Edition'
options:
- label: 'Community Edition'
- label: 'Enterprise Edition'
- label: 'Enterprise Trial Edition'
validations:
required: true
- type: checkboxes
attributes:
label: 'Installation Method'
options:
- label: 'Online Installation (One-click command installation)'
- label: 'Offline Package Installation'
- label: 'All-in-One'
- label: '1Panel'
- label: 'Kubernetes'
- label: 'Source Code'
- type: textarea
attributes:
label: '⭐️ Feature Description'
description: |
Please add a clear and concise description of the problem you aim to solve with this feature request.<br/>
Unclear descriptions will not be processed.
validations:
required: true
- type: textarea
attributes:
label: 'Proposed Solution'
description: Please provide a clear and concise description of the solution you desire.
validations:
required: true
- type: textarea
attributes:
label: 'Additional Information'
description: Please add any additional background information about the issue here.


@@ -0,0 +1,56 @@
name: '⭐️ 功能需求'
description: '提出需求或建议'
title: '[Feature] '
labels: ['⭐️ Feature Request']
assignees:
- baijiangjie
- ibuler
body:
- type: input
attributes:
label: '产品版本'
description: 不再支持 v2.28(含)之前的版本。
validations:
required: true
- type: checkboxes
attributes:
label: '版本类型'
options:
- label: '社区版'
- label: '企业版'
- label: '企业试用版'
validations:
required: true
- type: checkboxes
attributes:
label: '安装方式'
options:
- label: '在线安装 (一键命令安装)'
- label: '离线包安装'
- label: 'All-in-One'
- label: '1Panel'
- label: 'Kubernetes'
- label: '源码安装'
- type: textarea
attributes:
label: '⭐️ 需求描述'
description: |
请添加一个清晰且简洁的问题描述,阐述你希望通过这个功能需求解决的问题。<br/>
针对不清晰的描述信息将不予处理。
validations:
required: true
- type: textarea
attributes:
label: '解决方案'
description: 请清晰且简洁地描述你想要的解决方案。
validations:
required: true
- type: textarea
attributes:
label: '补充信息'
description: 在这里添加关于问题的任何其他背景信息。

.github/ISSUE_TEMPLATE/3_question.yml (new file)

@@ -0,0 +1,60 @@
name: '🤔 Question'
description: 'Pose a question'
title: '[Question] '
labels: ['🤔 Question']
assignees:
- baijiangjie
body:
- type: input
attributes:
label: 'Product Version'
description: The versions prior to v2.28 (inclusive) are no longer supported.
validations:
required: true
- type: checkboxes
attributes:
label: 'Product Edition'
options:
- label: 'Community Edition'
- label: 'Enterprise Edition'
- label: 'Enterprise Trial Edition'
validations:
required: true
- type: checkboxes
attributes:
label: 'Installation Method'
options:
- label: 'Online Installation (One-click command installation)'
- label: 'Offline Package Installation'
- label: 'All-in-One'
- label: '1Panel'
- label: 'Kubernetes'
- label: 'Source Code'
- type: textarea
attributes:
label: 'Environment Information'
description: Please provide a clear and concise description outlining your environment information.
validations:
required: true
- type: textarea
attributes:
label: '🤔 Question Description'
description: |
Please provide a clear and concise description of the defect. If the issue is complex, please provide detailed explanations. <br/>
Unclear descriptions will not be processed.
validations:
required: true
- type: textarea
attributes:
label: 'Expected Behavior'
description: Please provide a clear and concise description of what you expect to happen.
- type: textarea
attributes:
label: 'Additional Information'
description: Please add any additional background information about the issue here.


@@ -0,0 +1,61 @@
name: '🤔 问题咨询'
description: '提出一个问题'
title: '[Question] '
labels: ['🤔 Question']
assignees:
- baijiangjie
body:
- type: input
attributes:
label: '产品版本'
description: 不再支持 v2.28(含)之前的版本。
validations:
required: true
- type: checkboxes
attributes:
label: '版本类型'
options:
- label: '社区版'
- label: '企业版'
- label: '企业试用版'
validations:
required: true
- type: checkboxes
attributes:
label: '安装方式'
options:
- label: '在线安装 (一键命令安装)'
- label: '离线包安装'
- label: 'All-in-One'
- label: '1Panel'
- label: 'Kubernetes'
- label: '源码安装'
- type: textarea
attributes:
label: '环境信息'
description: 请在此详细描述你的环境信息,如操作系统、浏览器和部署架构等。
validations:
required: true
- type: textarea
attributes:
label: '🤔 问题描述'
description: |
请提供一个清晰且简洁的问题描述,如果问题比较复杂,也请详细说明。<br/>
针对不清晰的描述信息将不予处理。
validations:
required: true
- type: textarea
attributes:
label: '期望结果'
description: 请提供一个清晰且简洁的描述,说明你期望发生什么。
- type: textarea
attributes:
label: '补充信息'
description: 在这里添加关于问题的任何其他背景信息。


@@ -1,51 +0,0 @@
---
name: Bug 提交
about: 提交产品缺陷帮助我们更好的改进
title: "[Bug] Bug 标题"
labels: 类型:Bug
assignees:
- baijiangjie
---
## 注意
**JumpServer 版本( v2.28 之前的版本不再支持 )** <br>
_针对过于简单的 Bug 描述不予考虑。请确保提供足够的细节和信息以支持 Bug 的复现和修复。_
## 当前使用的 JumpServer 版本 (必填)
[在这里输入当前使用的 JumpServer 的版本号]
## 使用的版本类型 (必填)
- [ ] 社区版
- [ ] 企业版
- [ ] 企业试用版
## 版本安装方式 (必填)
- [ ] 在线安装 (一键命令)
- [ ] 离线安装 (下载离线包)
- [ ] All-in-One
- [ ] 1Panel 安装
- [ ] Kubernetes 安装
- [ ] 源码安装
## Bug 描述 (详细)
[在这里描述 Bug 的详细情况,包括其影响和出现的具体情况]
## 复现步骤
1. [描述如何复现 Bug 的第一步]
2. [描述如何复现 Bug 的第二步]
3. [以此类推,列出所有复现 Bug 所需的步骤]
## 期望行为
[描述 Bug 出现时期望的系统行为或结果]
## 实际行为
[描述实际上发生了什么,以及 Bug 出现的具体情况]
## 系统环境
- 操作系统:[例如Windows 10, macOS Big Sur]
- 浏览器/应用版本:[如果适用,请提供相关版本信息]
- 其他相关环境信息:[如果有其他相关环境信息,请在此处提供]
## 附加信息(可选)
[在这里添加任何其他相关信息,如截图、错误信息等]


@@ -1,50 +0,0 @@
---
name: 问题咨询
about: 提出针对本项目安装部署、使用及其他方面的相关问题
title: "[Question] 问题标题"
labels: 类型:提问
assignees:
- baijiangjie
---
## 注意
**请描述您的问题.** <br>
**JumpServer 版本( v2.28 之前的版本不再支持 )** <br>
_针对过于简单的 Bug 描述不予考虑。请确保提供足够的细节和信息以支持 Bug 的复现和修复。_
## 当前使用的 JumpServer 版本 (必填)
[在这里输入当前使用的 JumpServer 的版本号]
## 使用的版本类型 (必填)
- [ ] 社区版
- [ ] 企业版
- [ ] 企业试用版
## 版本安装方式 (必填)
- [ ] 在线安装 (一键命令)
- [ ] 离线安装 (下载离线包)
- [ ] All-in-One
- [ ] 1Panel 安装
- [ ] Kubernetes 安装
- [ ] 源码安装
## 问题描述 (详细)
[在这里描述你遇到的问题]
## 背景信息
- 操作系统:[例如Windows 10, macOS Big Sur]
- 浏览器/应用版本:[如果适用,请提供相关版本信息]
- 其他相关环境信息:[如果有其他相关环境信息,请在此处提供]
## 具体问题
[在这里详细描述你的问题,包括任何相关细节或错误信息]
## 尝试过的解决方法
[如果你已经尝试过解决问题,请在这里列出你已经尝试过的解决方法]
## 预期结果
[描述你期望的解决方案或结果]
## 我们的期望
[描述你希望我们提供的帮助或支持]

@@ -12,7 +12,9 @@ jobs:
 uses: actions-cool/issues-helper@v2
 with:
 actions: 'close-issues'
-labels: '状态:待反馈'
+labels: '⏳ Pending feedback'
 inactive-day: 30
 body: |
+You haven't provided feedback for over 30 days.
+We will close this issue. If you have any further needs, you can reopen it or submit a new issue.
 您超过 30 天未反馈信息,我们将关闭该 issue,如有需求,您可以重新打开或者提交新的 issue。

@@ -13,4 +13,4 @@ jobs:
 if: ${{ !github.event.issue.pull_request }}
 with:
 actions: 'remove-labels'
-labels: '状态:待处理,状态:待反馈'
+labels: '🔔 Pending processing,⏳ Pending feedback'

@@ -13,13 +13,13 @@ jobs:
 uses: actions-cool/issues-helper@v2
 with:
 actions: 'add-labels'
-labels: '状态:待处理'
+labels: '🔔 Pending processing'
 - name: Remove require reply label
 uses: actions-cool/issues-helper@v2
 with:
 actions: 'remove-labels'
-labels: '状态:待反馈'
+labels: '⏳ Pending feedback'
 add-label-if-is-member:
 runs-on: ubuntu-latest
@@ -55,11 +55,11 @@ jobs:
 uses: actions-cool/issues-helper@v2
 with:
 actions: 'add-labels'
-labels: '状态:待反馈'
+labels: '⏳ Pending feedback'
 - name: Remove require handle label
 if: contains(steps.member_names.outputs.data, github.event.comment.user.login)
 uses: actions-cool/issues-helper@v2
 with:
 actions: 'remove-labels'
-labels: '状态:待处理'
+labels: '🔔 Pending processing'

@@ -13,4 +13,4 @@ jobs:
 if: ${{ !github.event.issue.pull_request }}
 with:
 actions: 'add-labels'
-labels: '状态:待处理'
+labels: '🔔 Pending processing'

@@ -10,3 +10,4 @@ jobs:
 - uses: jumpserver/action-generic-handler@master
 env:
 GITHUB_TOKEN: ${{ secrets.PRIVATE_TOKEN }}
+I18N_TOKEN: ${{ secrets.I18N_TOKEN }}

@@ -5,5 +5,3 @@
 ansible.windows.win_user:
 name: "{{ account.username }}"
 state: absent
-purge: yes
-force: yes

@@ -225,7 +225,7 @@ class AccountSerializer(AccountCreateUpdateSerializerMixin, BaseAccountSerialize
 fields = BaseAccountSerializer.Meta.fields + [
 'su_from', 'asset', 'version',
 'source', 'source_id', 'connectivity',
-] + AccountCreateUpdateSerializerMixin.Meta.fields
+] + list(set(AccountCreateUpdateSerializerMixin.Meta.fields) - {'params'})
 read_only_fields = BaseAccountSerializer.Meta.read_only_fields + [
 'connectivity'
 ]

@@ -126,7 +126,7 @@ class NodeChildrenAsTreeApi(SerializeToTreeNodeMixin, NodeChildrenApi):
 include_assets = self.request.query_params.get('assets', '0') == '1'
 if not self.instance or not include_assets:
 return Asset.objects.none()
-if self.instance.is_org_root():
+if not self.request.GET.get('search') and self.instance.is_org_root():
 return Asset.objects.none()
 if query_all:
 assets = self.instance.get_all_assets()

@@ -8,6 +8,7 @@ class DatabaseTypes(BaseType):
 ORACLE = 'oracle', 'Oracle'
 SQLSERVER = 'sqlserver', 'SQLServer'
 DB2 = 'db2', 'DB2'
+DAMENG = 'dameng', 'Dameng'
 CLICKHOUSE = 'clickhouse', 'ClickHouse'
 MONGODB = 'mongodb', 'MongoDB'
 REDIS = 'redis', 'Redis'
@@ -55,6 +56,15 @@ class DatabaseTypes(BaseType):
 'change_secret_enabled': False,
 'push_account_enabled': False,
 },
+cls.DAMENG: {
+'ansible_enabled': False,
+'ping_enabled': False,
+'gather_facts_enabled': False,
+'gather_accounts_enabled': False,
+'verify_account_enabled': False,
+'change_secret_enabled': False,
+'push_account_enabled': False,
+},
 cls.CLICKHOUSE: {
 'ansible_enabled': False,
 'ping_enabled': False,
@@ -84,6 +94,7 @@ class DatabaseTypes(BaseType):
 cls.ORACLE: [{'name': 'Oracle'}],
 cls.SQLSERVER: [{'name': 'SQLServer'}],
 cls.DB2: [{'name': 'DB2'}],
+cls.DAMENG: [{'name': 'Dameng'}],
 cls.CLICKHOUSE: [{'name': 'ClickHouse'}],
 cls.MONGODB: [{'name': 'MongoDB'}],
 cls.REDIS: [

@@ -23,6 +23,7 @@ class Protocol(ChoicesMixin, models.TextChoices):
 postgresql = 'postgresql', 'PostgreSQL'
 sqlserver = 'sqlserver', 'SQLServer'
 db2 = 'db2', 'DB2'
+dameng = 'dameng', 'Dameng'
 clickhouse = 'clickhouse', 'ClickHouse'
 redis = 'redis', 'Redis'
 mongodb = 'mongodb', 'MongoDB'
@@ -185,6 +186,12 @@ class Protocol(ChoicesMixin, models.TextChoices):
 'secret_types': ['password'],
 'xpack': True,
 },
+cls.dameng: {
+'port': 5236,
+'required': True,
+'secret_types': ['password'],
+'xpack': True,
+},
 cls.clickhouse: {
 'port': 9000,
 'required': True,


@@ -0,0 +1,31 @@
# Generated by Django 4.1.10 on 2023-10-07 06:37
from django.db import migrations
def add_dameng_platform(apps, schema_editor):
platform_cls = apps.get_model('assets', 'Platform')
automation_cls = apps.get_model('assets', 'PlatformAutomation')
platform, _ = platform_cls.objects.update_or_create(
name='Dameng', defaults={
'name': 'Dameng', 'category': 'database',
'internal': True, 'type': 'dameng',
'domain_enabled': True, 'su_enabled': False,
'su_method': None, 'comment': 'Dameng', 'created_by': 'System',
'updated_by': 'System', 'custom_fields': []
}
)
platform.protocols.update_or_create(name='dameng', defaults={
'name': 'dameng', 'port': 5236, 'primary': True, 'setting': {}
})
automation_cls.objects.update_or_create(platform=platform, defaults={'ansible_enabled': False})
class Migration(migrations.Migration):
dependencies = [
('assets', '0127_automation_remove_account'),
]
operations = [
migrations.RunPython(add_dameng_platform)
]

@@ -123,7 +123,7 @@ class AutomationExecution(OrgModelMixin):
 )
 class Meta:
-ordering = ('-date_start',)
+ordering = ('org_id', '-date_start',)
 verbose_name = _('Automation task execution')
 @property

@@ -381,6 +381,7 @@ class AssetSerializer(BulkOrgResourceModelSerializer, ResourceLabelsMixin, Writa
 class DetailMixin(serializers.Serializer):
+accounts = AssetAccountSerializer(many=True, required=False, label=_('Accounts'))
 spec_info = MethodSerializer(label=_('Spec info'), read_only=True)
 gathered_info = MethodSerializer(label=_('Gathered info'), read_only=True)
 auto_config = serializers.DictField(read_only=True, label=_('Auto info'))
@@ -395,7 +396,7 @@ class DetailMixin(serializers.Serializer):
 def get_field_names(self, declared_fields, info):
 names = super().get_field_names(declared_fields, info)
 names.extend([
-'gathered_info', 'spec_info', 'auto_config', 'accounts',
+'gathered_info', 'spec_info', 'auto_config',
 ])
 return names

@@ -52,7 +52,7 @@ class OperateLogStore(object):
 resource_map = {
 'Asset permission': lambda k, v: ActionChoices.display(int(v)) if k == 'Actions' else v
 }
-return resource_map.get(resource_type, lambda k, v: v)
+return resource_map.get(resource_type, lambda k, v: _(v))
 @classmethod
 def convert_diff_friendly(cls, op_log):

@@ -37,6 +37,9 @@ class ActionChoices(TextChoices):
 approve = 'approve', _('Approve')
 close = 'close', _('Close')
+# Custom action
+finished = 'finished', _('Finished')
 class LoginTypeChoices(TextChoices):
 web = "W", _("Web")

@@ -58,7 +58,7 @@ class OperatorLogHandler(metaclass=Singleton):
 return
 key = '%s_%s' % (self.CACHE_KEY, instance_id)
-cache.set(key, instance_dict, 3 * 60)
+cache.set(key, instance_dict, 3)
 def get_instance_dict_from_cache(self, instance_id):
 if instance_id is None:

@@ -257,6 +257,8 @@ class UserLoginLog(models.Model):
 class UserSession(models.Model):
+_OPERATE_LOG_ACTION = {'delete': ActionChoices.finished}
 id = models.UUIDField(default=uuid.uuid4, primary_key=True)
 ip = models.GenericIPAddressField(verbose_name=_("Login IP"))
 key = models.CharField(max_length=128, verbose_name=_("Session key"))

@@ -3,7 +3,9 @@
 import uuid
 from django.apps import apps
-from django.db.models.signals import post_save, pre_save, m2m_changed, pre_delete
+from django.db.models.signals import (
+pre_delete, pre_save, m2m_changed, post_delete, post_save
+)
 from django.dispatch import receiver
 from django.utils import translation
@@ -94,7 +96,7 @@ def signal_of_operate_log_whether_continue(
 return condition
-@receiver(pre_save)
+@receiver([pre_save, pre_delete])
 def on_object_pre_create_or_update(
 sender, instance=None, raw=False, using=None, update_fields=None, **kwargs
 ):
@@ -103,6 +105,7 @@ def on_object_pre_create_or_update(
 )
 if not ok:
 return
 with translation.override('en'):
 # users.PrivateToken Model 没有 id 有 pk字段
 instance_id = getattr(instance, 'id', getattr(instance, 'pk', None))
@@ -145,7 +148,7 @@ def on_object_created_or_update(
 )
-@receiver(pre_delete)
+@receiver(post_delete)
 def on_object_delete(sender, instance=None, **kwargs):
 ok = signal_of_operate_log_whether_continue(sender, instance, False)
 if not ok:
@@ -153,9 +156,15 @@ def on_object_delete(sender, instance=None, **kwargs):
 with translation.override('en'):
 resource_type = sender._meta.verbose_name
+action = getattr(sender, '_OPERATE_LOG_ACTION', {})
+action = action.get('delete', ActionChoices.delete)
+instance_id = getattr(instance, 'id', getattr(instance, 'pk', None))
+log_id, before = get_instance_dict_from_cache(instance_id)
+if not log_id:
+log_id, before = None, model_to_dict(instance)
 create_or_update_operate_log(
-ActionChoices.delete, resource_type,
-resource=instance, before=model_to_dict(instance)
+action, resource_type, log_id=log_id,
+resource=instance, before=before,
 )
@@ -166,7 +175,7 @@ def on_django_start_set_operate_log_monitor_models(sender, **kwargs):
 'django_celery_beat', 'contenttypes', 'sessions', 'auth',
 }
 exclude_models = {
-'UserPasswordHistory', 'ContentType',
+'UserPasswordHistory', 'ContentType', 'Asset',
 'MessageContent', 'SiteMessage',
 'PlatformAutomation', 'PlatformProtocol', 'Protocol',
 'HistoricalAccount', 'GatheredUser', 'ApprovalRule',
@@ -178,13 +187,15 @@ def on_django_start_set_operate_log_monitor_models(sender, **kwargs):
 'PermedAsset', 'PermedAccount', 'MenuPermission',
 'Permission', 'TicketSession', 'ApplyLoginTicket',
 'ApplyCommandTicket', 'ApplyLoginAssetTicket',
-'FavoriteAsset',
+'FavoriteAsset', 'ChangeSecretRecord'
 }
+include_models = {'UserSession'}
 for i, app in enumerate(apps.get_models(), 1):
 app_name = app._meta.app_label
 model_name = app._meta.object_name
 if app_name in exclude_apps or \
 model_name in exclude_models or \
 model_name.endswith('Execution'):
-continue
+if model_name not in include_models:
+continue
 MODELS_NEED_RECORD.add(model_name)

@@ -49,9 +49,15 @@ def _get_instance_field_value(
 continue
 value = getattr(instance, f.name, None) or getattr(instance, f.attname, None)
-if not isinstance(value, bool) and not value:
+if not isinstance(value, (bool, int)) and not value:
 continue
+choices = getattr(f, 'choices', []) or []
+for c_value, c_label in choices:
+if c_value == value:
+value = c_label
+break
 if getattr(f, 'primary_key', False):
 f.verbose_name = 'id'
 elif isinstance(value, list):

@@ -4,7 +4,6 @@ from django.contrib import auth
 from django.http import HttpResponseRedirect
 from django.urls import reverse
 from django.utils.http import urlencode
-from django.utils.translation import gettext_lazy as _
 from authentication.utils import build_absolute_uri
 from authentication.views.mixins import FlashMessageMixin
@@ -55,11 +54,7 @@ class OAuth2AuthCallbackView(View, FlashMessageMixin):
 logger.debug(log_prompt.format('Process authenticate'))
 user = authenticate(code=callback_params['code'], request=request)
-if err_msg := getattr(request, 'error_message', ''):
-login_url = reverse('authentication:login') + '?admin=1'
-return self.get_failed_response(login_url, title=_('Authentication failed'), msg=err_msg)
-if user and user.is_valid:
+if user:
 logger.debug(log_prompt.format('Login: {}'.format(user)))
 auth.login(self.request, user)
 logger.debug(log_prompt.format('Redirect'))
@@ -68,8 +63,7 @@
 )
 logger.debug(log_prompt.format('Redirect'))
-# OAuth2 服务端认证成功, 但是用户被禁用了, 这时候需要调用服务端的logout
-redirect_url = settings.AUTH_OAUTH2_PROVIDER_END_SESSION_ENDPOINT
+redirect_url = settings.AUTH_OAUTH2_PROVIDER_END_SESSION_ENDPOINT or '/'
 return HttpResponseRedirect(redirect_url)

@@ -1,5 +1,5 @@
 from django.conf import settings
-from django.contrib.auth import user_logged_in
+from django.contrib.auth import user_logged_in, BACKEND_SESSION_KEY
 from django.core.cache import cache
 from django.dispatch import receiver
 from django_cas_ng.signals import cas_user_authenticated
@@ -20,8 +20,9 @@ def on_user_auth_login_success(sender, user, request, **kwargs):
 and user.mfa_enabled \
 and not request.session.get('auth_mfa'):
 request.session['auth_mfa_required'] = 1
+auth_backend = request.session.get('auth_backend', request.session.get(BACKEND_SESSION_KEY))
 if not request.session.get("auth_third_party_done") and \
-request.session.get('auth_backend') in AUTHENTICATION_BACKENDS_THIRD_PARTY:
+auth_backend in AUTHENTICATION_BACKENDS_THIRD_PARTY:
 request.session['auth_third_party_required'] = 1
 user_session_id = request.session.get('user_session_id')

@@ -249,6 +249,8 @@ class UserLoginView(mixins.AuthMixin, UserLoginContextMixin, FormView):
 def form_valid(self, form):
 if not self.request.session.test_cookie_worked():
 form.add_error(None, _("Login timeout, please try again."))
+# 当 session 过期后,刷新浏览器重新提交依旧会报错,所以需要重新设置 test_cookie
+self.request.session.set_test_cookie()
 return self.form_invalid(form)
 # https://docs.djangoproject.com/en/3.1/topics/http/sessions/#setting-test-cookies

@@ -14,9 +14,13 @@ from uuid import UUID
 from django.utils.translation import gettext_lazy as _
 from django.db.models import QuerySet as DJQuerySet
-from elasticsearch import Elasticsearch
-from elasticsearch.helpers import bulk
-from elasticsearch.exceptions import RequestError, NotFoundError
+from elasticsearch7 import Elasticsearch
+from elasticsearch7.helpers import bulk
+from elasticsearch7.exceptions import RequestError, SSLError
+from elasticsearch7.exceptions import NotFoundError as NotFoundError7
+from elasticsearch8.exceptions import NotFoundError as NotFoundError8
+from elasticsearch8.exceptions import BadRequestError
 from common.utils.common import lazyproperty
 from common.utils import get_logger
@@ -36,9 +40,82 @@ class NotSupportElasticsearch8(JMSException):
 default_detail = _('Not Support Elasticsearch8')
-class ES(object):
-def __init__(self, config, properties, keyword_fields, exact_fields=None, match_fields=None):
+class InvalidElasticsearchSSL(JMSException):
+default_code = 'invalid_elasticsearch_SSL'
+default_detail = _(
+'Connection failed: Self-signed certificate used. Please check server certificate configuration')
+class ESClient(object):
+def __new__(cls, *args, **kwargs):
+version = get_es_client_version(**kwargs)
+if version == 6:
+return ESClientV6(*args, **kwargs)
+if version == 7:
+return ESClientV7(*args, **kwargs)
+elif version == 8:
+return ESClientV8(*args, **kwargs)
+raise ValueError('Unsupported ES_VERSION %r' % version)
+class ESClientBase(object):
+@classmethod
+def get_properties(cls, data, index):
+return data[index]['mappings']['properties']
+@classmethod
+def get_mapping(cls, properties):
+return {'mappings': {'properties': properties}}
+class ESClientV7(ESClientBase):
+def __init__(self, *args, **kwargs):
+from elasticsearch7 import Elasticsearch
+self.es = Elasticsearch(*args, **kwargs)
+@classmethod
+def get_sort(cls, field, direction):
+return f'{field}:{direction}'
+class ESClientV6(ESClientV7):
+@classmethod
+def get_properties(cls, data, index):
+return data[index]['mappings']['data']['properties']
+@classmethod
+def get_mapping(cls, properties):
+return {'mappings': {'data': {'properties': properties}}}
+class ESClientV8(ESClientBase):
+def __init__(self, *args, **kwargs):
+from elasticsearch8 import Elasticsearch
+self.es = Elasticsearch(*args, **kwargs)
+@classmethod
+def get_sort(cls, field, direction):
+return {field: {'order': direction}}
+def get_es_client_version(**kwargs):
+try:
+es = Elasticsearch(**kwargs)
+info = es.info()
+version = int(info['version']['number'].split('.')[0])
+return version
+except SSLError:
+raise InvalidElasticsearchSSL
+except Exception:
+raise InvalidElasticsearch
+class ES(object):
+def __init__(self, config, properties, keyword_fields, exact_fields=None, match_fields=None):
+self.version = 7
 self.config = config
 hosts = self.config.get('HOSTS')
 kwargs = self.config.get('OTHER', {})
@@ -46,7 +123,8 @@ class ES(object):
 ignore_verify_certs = kwargs.pop('IGNORE_VERIFY_CERTS', False)
 if ignore_verify_certs:
 kwargs['verify_certs'] = None
-self.es = Elasticsearch(hosts=hosts, max_retries=0, **kwargs)
+self.client = ESClient(hosts=hosts, max_retries=0, **kwargs)
+self.es = self.client.es
 self.index_prefix = self.config.get('INDEX') or 'jumpserver'
 self.is_index_by_date = bool(self.config.get('INDEX_BY_DATE', False))
@@ -83,26 +161,14 @@ class ES(object):
 if not self.ping(timeout=2):
 return False
-info = self.es.info()
-version = info['version']['number'].split('.')[0]
-if version == '8':
-raise NotSupportElasticsearch8
 try:
 # 获取索引信息,如果没有定义,直接返回
 data = self.es.indices.get_mapping(index=self.index)
-except NotFoundError:
+except (NotFoundError8, NotFoundError7):
 return False
 try:
-if version == '6':
-# 检测索引是不是新的类型 es6
-properties = data[self.index]['mappings']['data']['properties']
-else:
-# 检测索引是不是新的类型 es7 default index type: _doc
-properties = data[self.index]['mappings']['properties']
+properties = self.client.get_properties(data=data, index=self.index)
 for keyword in self.keyword_fields:
 if not properties[keyword]['type'] == 'keyword':
 break
@@ -118,12 +184,7 @@ class ES(object):
 def _ensure_index_exists(self):
 try:
-info = self.es.info()
-version = info['version']['number'].split('.')[0]
-if version == '6':
-mappings = {'mappings': {'data': {'properties': self.properties}}}
-else:
-mappings = {'mappings': {'properties': self.properties}}
+mappings = self.client.get_mapping(self.properties)
 if self.is_index_by_date:
 mappings['aliases'] = {
@@ -132,7 +193,7 @@
 try:
 self.es.indices.create(index=self.index, body=mappings)
-except RequestError as e:
+except (RequestError, BadRequestError) as e:
 if e.error == 'resource_already_exists_exception':
 logger.warning(e)
 else:
@@ -175,11 +236,15 @@ class ES(object):
 def _filter(self, query: dict, from_=None, size=None, sort=None):
 body = self.get_query_body(**query)
-data = self.es.search(
-index=self.query_index, body=body,
-from_=from_, size=size, sort=sort
-)
+search_params = {
+'index': self.query_index,
+'body': body,
+'from_': from_,
+'size': size
+}
+if sort is not None:
+search_params['sort'] = sort
+data = self.es.search(**search_params)
 source_data = []
 for item in data['hits']['hits']:
@@ -367,7 +432,7 @@ class QuerySet(DJQuerySet):
 else:
 direction = 'asc'
 field = field.lstrip('-+')
-sort = f'{field}:{direction}'
+sort = self._storage.client.get_sort(field, direction)
 return sort
 def __execute(self):

@@ -74,9 +74,13 @@ class DateTimeMixin:
 query = {f'{query_field}__gte': t}
 return qs.filter(**query)
+@lazyproperty
+def users(self):
+return self.org.get_members()
 def get_logs_queryset(self, queryset, query_params):
 query = {}
-users = self.org.get_members()
+users = self.users
 if not self.org.is_root():
 if query_params == 'username':
 query = {
@@ -100,6 +104,13 @@ class DateTimeMixin:
 queryset = self.get_logs_queryset(qs, 'username')
 return queryset
+@lazyproperty
+def user_login_logs_on_the_system_queryset(self):
+qs = UserLoginLog.objects.all()
+qs = self.get_logs_queryset_filter(qs, 'datetime')
+queryset = qs.filter(username__in=construct_userlogin_usernames(self.users))
+return queryset
 @lazyproperty
 def password_change_logs_queryset(self):
 qs = PasswordChangeLog.objects.all()
@@ -141,6 +152,7 @@ class DatesLoginMetricMixin:
 ftp_logs_queryset: FTPLog.objects
 job_logs_queryset: JobLog.objects
 login_logs_queryset: UserLoginLog.objects
+user_login_logs_on_the_system_queryset: UserLoginLog.objects
 operate_logs_queryset: OperateLog.objects
 password_change_logs_queryset: PasswordChangeLog.objects
@@ -159,31 +171,48 @@
 query = {f'{field_name}__range': self.date_start_end}
 return queryset.filter(**query)
-def get_date_metrics(self, queryset, field_name, count_field):
+def get_date_metrics(self, queryset, field_name, count_fields):
 queryset = self.filter_date_start_end(queryset, field_name)
-queryset = queryset.values_list(field_name, count_field)
-date_group_map = defaultdict(set)
-for datetime, count_field in queryset:
+if not isinstance(count_fields, (list, tuple)):
+count_fields = [count_fields]
+values_list = [field_name] + list(count_fields)
+queryset = queryset.values_list(*values_list)
+date_group_map = defaultdict(lambda: defaultdict(set))
+for row in queryset:
+datetime = row[0]
 date_str = str(datetime.date())
-date_group_map[date_str].add(count_field)
-return [
-len(date_group_map.get(str(d), set()))
-for d in self.dates_list
-]
+for idx, count_field in enumerate(count_fields):
+date_group_map[date_str][count_field].add(row[idx + 1])
+date_metrics_dict = defaultdict(list)
+for field in count_fields:
+for date_str in self.dates_list:
+count = len(date_group_map.get(str(date_str), {}).get(field, set()))
+date_metrics_dict[field].append(count)
+return date_metrics_dict
+def get_dates_metrics_total_count_active_users_and_assets(self):
+date_metrics_dict = self.get_date_metrics(
+Session.objects, 'date_start', ('user_id', 'asset_id')
+)
+return date_metrics_dict.get('user_id', []), date_metrics_dict.get('asset_id', [])
 def get_dates_metrics_total_count_login(self):
-return self.get_date_metrics(UserLoginLog.objects, 'datetime', 'id')
-def get_dates_metrics_total_count_active_users(self):
-return self.get_date_metrics(Session.objects, 'date_start', 'user_id')
-def get_dates_metrics_total_count_active_assets(self):
-return self.get_date_metrics(Session.objects, 'date_start', 'asset_id')
+date_metrics_dict = self.get_date_metrics(
+UserLoginLog.objects, 'datetime', 'id'
+)
+return date_metrics_dict.get('id', [])
 def get_dates_metrics_total_count_sessions(self):
-return self.get_date_metrics(Session.objects, 'date_start', 'id')
+date_metrics_dict = self.get_date_metrics(
+Session.objects, 'date_start', 'id'
+)
+return date_metrics_dict.get('id', [])
 def get_dates_login_times_assets(self):
 assets = self.sessions_queryset.values("asset") \
@@ -224,7 +253,7 @@
 @lazyproperty
 def user_login_amount(self):
-return self.login_logs_queryset.values('username').distinct().count()
+return self.user_login_logs_on_the_system_queryset.values('username').distinct().count()
 @lazyproperty
 def operate_logs_amount(self):
@@ -412,11 +441,13 @@ class IndexApi(DateTimeMixin, DatesLoginMetricMixin, APIView):
 })
 if _all or query_params.get('dates_metrics'):
+user_data, asset_data = self.get_dates_metrics_total_count_active_users_and_assets()
+login_data = self.get_dates_metrics_total_count_login()
 data.update({
 'dates_metrics_date': self.get_dates_metrics_date(),
-'dates_metrics_total_count_login': self.get_dates_metrics_total_count_login(),
-'dates_metrics_total_count_active_users': self.get_dates_metrics_total_count_active_users(),
-'dates_metrics_total_count_active_assets': self.get_dates_metrics_total_count_active_assets(),
+'dates_metrics_total_count_login': login_data,
+'dates_metrics_total_count_active_users': user_data,
+'dates_metrics_total_count_active_assets': asset_data,
 })
 if _all or query_params.get('dates_login_times_top10_assets'):

@@ -489,7 +489,7 @@ class Config(dict):
 # 安全配置
 'SECURITY_MFA_AUTH': 0, # 0 不开启 1 全局开启 2 管理员开启
 'SECURITY_MFA_AUTH_ENABLED_FOR_THIRD_PARTY': True,
-'SECURITY_COMMAND_EXECUTION': True,
+'SECURITY_COMMAND_EXECUTION': False,
 'SECURITY_COMMAND_BLACKLIST': [
 'reboot', 'shutdown', 'poweroff', 'halt', 'dd', 'half', 'top'
 ],
@@ -619,7 +619,9 @@ class Config(dict):
 # Ansible Receptor
 'RECEPTOR_ENABLED': False,
 'ANSIBLE_RECEPTOR_GATEWAY_PROXY_HOST': 'jms_celery',
-'ANSIBLE_RECEPTOR_TCP_LISTEN_ADDRESS': 'receptor:7521'
+'ANSIBLE_RECEPTOR_TCP_LISTEN_ADDRESS': 'receptor:7521',
+'FILE_UPLOAD_TEMP_DIR': None
 }

@@ -138,7 +138,6 @@ INSTALLED_APPS = [
 'rbac.apps.RBACConfig',
 'labels.apps.LabelsConfig',
 'rest_framework',
-'rest_framework_swagger',
 'drf_yasg',
 'django_cas_ng',
 'channels',
@@ -320,6 +319,8 @@ PRIVATE_STORAGE_AUTH_FUNCTION = 'jumpserver.rewriting.storage.permissions.allow_
 PRIVATE_STORAGE_INTERNAL_URL = '/private-media/'
 PRIVATE_STORAGE_SERVER = 'jumpserver.rewriting.storage.servers.StaticFileServer'
+FILE_UPLOAD_TEMP_DIR = CONFIG.FILE_UPLOAD_TEMP_DIR
 # Use django-bootstrap-form to format template, input max width arg
 # BOOTSTRAP_COLUMN_COUNT = 11

@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:e4baadc170ded5134bed55533c4c04e694be6ea7b8e151d80c1092728e26a75b
-size 177500
+oid sha256:52990de6b508e55b8b5f4a70f86c567410c5cf217ca312847f65178393d81b19
+size 177824

File diff suppressed because it is too large.

@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:08667579241592ecacd1baf330147a720a9e41171444b925f90778613a7e1d9a
-size 145230
+oid sha256:144d439f8f3c96d00b1744de34b8a2a22b891f88ccb4b3c9669ad7273ecd08be
+size 145525

File diff suppressed because it is too large.

@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:93fed3f6a027f645520852f40c5f7722a5be579a3a8bb7b7029e3d0bc9794056
-size 145341
+oid sha256:17da592df8b280d501a3b579c6a249b080bf07fbee34520a2012d8936d96ca14
+size 145636

File diff suppressed because it is too large.

@@ -57,21 +57,22 @@ def register_as_period_task(
 task = '{func.__module__}.{func.__name__}'.format(func=func)
 _name = name if name else task
 add_register_period_task({
 _name: {
 'task': task,
 'interval': interval,
 'crontab': crontab,
 'args': args,
 'kwargs': kwargs if kwargs else {},
-'enabled': True,
 'description': description
 }
 })
 @wraps(func)
 def wrapper(*args, **kwargs):
 return func(*args, **kwargs)
 return wrapper
 return decorate
@@ -85,6 +86,7 @@ def after_app_ready_start(func):
 @wraps(func)
 def decorate(*args, **kwargs):
 return func(*args, **kwargs)
 return decorate
@@ -98,4 +100,5 @@ def after_app_shutdown_clean_periodic(func):
 @wraps(func)
 def decorate(*args, **kwargs):
 return func(*args, **kwargs)
 return decorate

@@ -80,6 +80,9 @@ def create_or_update_celery_periodic_tasks(tasks):
 description=detail.get('description') or '',
 last_run_at=last_run_at,
 )
+enabled = detail.get('enabled')
+if enabled is not None:
+defaults["enabled"] = enabled
 task = PeriodicTask.objects.update_or_create(
 defaults=defaults, name=name,
 )

@@ -215,7 +215,8 @@ class Job(JMSOrgBaseModel, PeriodTaskModelMixin):
 return "{}:{}:{}".format(self.org.name, self.creator.name, self.playbook.name)
 def create_execution(self):
-return self.executions.create(job_version=self.version, material=self.material, job_type=Types[self.type].value)
+return self.executions.create(job_version=self.version, material=self.material, job_type=Types[self.type].value,
+creator=self.creator)
 class Meta:
 verbose_name = _("Job")

@@ -140,6 +140,9 @@ def task_sent_handler(headers=None, body=None, **kwargs):
 args = []
 kwargs = {}
+# 不要保存__current_lang和__current_org_id参数,防止系统任务中点击再次执行报错
+kwargs.pop('__current_lang', None)
+kwargs.pop('__current_org_id', None)
 data = {
 'id': i,
 'name': task,

@@ -7,6 +7,7 @@ from channels.generic.websocket import AsyncJsonWebsocketConsumer
 from common.db.utils import close_old_connections
 from common.utils import get_logger
+from rbac.builtin import BuiltinRole
 from .ansible.utils import get_ansible_task_log_path
 from .celery.utils import get_celery_task_log_path
 from .const import CELERY_LOG_MAGIC_MARK
@@ -48,13 +49,30 @@ class TaskLogWebsocket(AsyncJsonWebsocketConsumer):
 else:
 return None
+@sync_to_async
+def get_current_user_role_ids(self, user):
+roles = user.system_roles.all() | user.org_roles.all()
+user_role_ids = set(map(str, roles.values_list('id', flat=True)))
+return user_role_ids
 async def receive_json(self, content, **kwargs):
 task_id = content.get('task')
 task = await self.get_task(task_id)
 if not task:
 await self.send_json({'message': 'Task not found', 'task': task_id})
 return
-if task.name in self.user_tasks and task.creator != self.scope['user']:
+admin_auditor_role_ids = {
+BuiltinRole.system_admin.id,
+BuiltinRole.system_auditor.id,
+BuiltinRole.org_admin.id,
+BuiltinRole.org_auditor.id
+}
+user = self.scope['user']
+user_role_ids = await self.get_current_user_role_ids(user)
+has_admin_auditor_role = bool(admin_auditor_role_ids & user_role_ids)
+if not has_admin_auditor_role and task.name in self.user_tasks and task.creator != user:
 await self.send_json({'message': 'No permission', 'task': task_id})
 return

@@ -1,7 +1,7 @@
 # -*- coding: utf-8 -*-
 #
-from django.core.exceptions import ValidationError
+from rest_framework.serializers import ValidationError
 from django.db import models
 from django.utils.translation import gettext_lazy as _
@@ -45,7 +45,7 @@ class OrgManager(models.Manager):
 for obj in objs:
 if org.is_root():
 if not obj.org_id:
-raise ValidationError('Please save in a org')
+raise ValidationError(_('Please save in a org'))
 else:
 obj.org_id = org.id
 return super().bulk_create(objs, batch_size, ignore_conflicts)
@@ -70,7 +70,7 @@ class OrgModelMixin(models.Model):
 # raise ...
 if org.is_root():
 if not self.org_id:
-raise ValidationError('Please save in a org')
+raise ValidationError(_('Please save in a org'))
 else:
 self.org_id = org.id
 return super().save(*args, **kwargs)
@@ -119,4 +119,3 @@
 class JMSOrgBaseModel(JMSBaseModel, OrgModelMixin):
 class Meta:
 abstract = True

@@ -9,7 +9,7 @@ def migrate_system_user_to_accounts(apps, schema_editor):
 bulk_size = 10000
 while True:
 asset_permissions = asset_permission_model.objects \
-.prefetch_related('system_users')[count:bulk_size]
+.prefetch_related('system_users')[count:count+bulk_size]
 if not asset_permissions:
 break

@@ -11,13 +11,13 @@ logger = get_logger(__file__)
 class LDAPImportMessage(UserMessage):
 def __init__(self, user, extra_kwargs):
 super().__init__(user)
-self.orgs = extra_kwargs.pop('orgs', [])
-self.end_time = extra_kwargs.pop('end_time', '')
-self.start_time = extra_kwargs.pop('start_time', '')
-self.time_start_display = extra_kwargs.pop('time_start_display', '')
-self.new_users = extra_kwargs.pop('new_users', [])
-self.errors = extra_kwargs.pop('errors', [])
-self.cost_time = extra_kwargs.pop('cost_time', '')
+self.orgs = extra_kwargs.get('orgs', [])
+self.end_time = extra_kwargs.get('end_time', '')
+self.start_time = extra_kwargs.get('start_time', '')
+self.time_start_display = extra_kwargs.get('time_start_display', '')
+self.new_users = extra_kwargs.get('new_users', [])
+self.errors = extra_kwargs.get('errors', [])
+self.cost_time = extra_kwargs.get('cost_time', '')
 def get_html_msg(self) -> dict:
 subject = _('Notification of Synchronized LDAP User Task Results')

@@ -93,7 +93,10 @@ class LDAPSettingSerializer(serializers.Serializer):
 AUTH_LDAP = serializers.BooleanField(required=False, label=_('Enable LDAP auth'))
-@staticmethod
-def post_save():
+def post_save(self):
+keys = ['AUTH_LDAP_SYNC_IS_PERIODIC', 'AUTH_LDAP_SYNC_INTERVAL', 'AUTH_LDAP_SYNC_CRONTAB']
+kwargs = {k: self.validated_data[k] for k in keys if k in self.validated_data}
+if not kwargs:
+return
 from settings.tasks import import_ldap_user_periodic
-import_ldap_user_periodic()
+import_ldap_user_periodic(**kwargs)

@@ -9,7 +9,7 @@ from common.utils import get_logger
 from common.utils.timezone import local_now_display
 from ops.celery.decorator import after_app_ready_start
 from ops.celery.utils import (
-create_or_update_celery_periodic_tasks, delete_celery_periodic_task
+create_or_update_celery_periodic_tasks, disable_celery_periodic_task
 )
 from orgs.models import Organization
 from settings.notifications import LDAPImportMessage
@@ -65,20 +65,15 @@ def import_ldap_user():
 @shared_task(verbose_name=_('Registration periodic import ldap user task'))
 @after_app_ready_start
-def import_ldap_user_periodic():
-if not settings.AUTH_LDAP:
-return
+def import_ldap_user_periodic(**kwargs):
 task_name = 'import_ldap_user_periodic'
-delete_celery_periodic_task(task_name)
-if not settings.AUTH_LDAP_SYNC_IS_PERIODIC:
-return
-interval = settings.AUTH_LDAP_SYNC_INTERVAL
+interval = kwargs.get('AUTH_LDAP_SYNC_INTERVAL', settings.AUTH_LDAP_SYNC_INTERVAL)
+enabled = kwargs.get('AUTH_LDAP_SYNC_IS_PERIODIC', settings.AUTH_LDAP_SYNC_IS_PERIODIC)
+crontab = kwargs.get('AUTH_LDAP_SYNC_CRONTAB', settings.AUTH_LDAP_SYNC_CRONTAB)
 if isinstance(interval, int):
 interval = interval * 3600
 else:
 interval = None
-crontab = settings.AUTH_LDAP_SYNC_CRONTAB
 if crontab:
 # 优先使用 crontab
 interval = None
@@ -86,7 +81,8 @@ def import_ldap_user_periodic():
 task_name: {
 'task': import_ldap_user.name,
 'interval': interval,
-'crontab': crontab
+'crontab': crontab,
+'enabled': enabled
 }
 }
 create_or_update_celery_periodic_tasks(tasks)

@@ -60,7 +60,7 @@ class AppletHostDeploymentViewSet(viewsets.ModelViewSet):
 queryset = AppletHostDeployment.objects.all()
 filterset_fields = ['host', ]
 rbac_perms = (
-('applets', 'terminal.view_AppletHostDeployment'),
+('applets', 'terminal.view_applethostdeployment'),
 ('uninstall', 'terminal.change_applethost'),
 )

@@ -101,7 +101,7 @@ class AppletMethod:
 from .models import Applet, AppletHost
 methods = defaultdict(list)
-has_applet_hosts = AppletHost.objects.all().exists()
+has_applet_hosts = AppletHost.objects.filter(is_active=True).exists()
 applets = Applet.objects.filter(is_active=True)
 for applet in applets:
 for protocol in applet.protocols:
@@ -166,7 +166,8 @@ class ConnectMethodUtil:
 'support': [
 Protocol.mysql, Protocol.postgresql,
 Protocol.oracle, Protocol.sqlserver,
-Protocol.mariadb, Protocol.db2
+Protocol.mariadb, Protocol.db2,
+Protocol.dameng
 ],
 'match': 'm2m'
 },
@@ -253,8 +254,9 @@
 def _filter_disable_protocols_connect_methods(cls, methods):
 # 过滤一些特殊的协议方式
 if not getattr(settings, 'TERMINAL_KOKO_SSH_ENABLED'):
-protocol = Protocol.ssh
-methods[protocol] = [m for m in methods[protocol] if m['type'] != 'native']
+disable_ssh_client_protocols = [Protocol.ssh, Protocol.sftp, Protocol.telnet]
+for protocol in disable_ssh_client_protocols:
+methods[protocol] = [m for m in methods[protocol] if m['type'] != 'native']
 return methods
 @classmethod

View File

@@ -176,7 +176,7 @@ class Applet(JMSBaseModel):
             label_value = spec_label.label.value
             matched = [host for host in hosts if host.name == label_value]
             if matched:
-                return matched[0]
+                return random.choice(matched)

         hosts = [h for h in hosts if h.auto_create_accounts]
         prefer_key = self.host_prefer_key_tpl.format(user.id)
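Choosing a random host from the label-matched set spreads applet sessions across equivalent hosts instead of always pinning the first match. A tiny illustration (host names are made up):

    import random

    matched = ['applet-host-01', 'applet-host-02', 'applet-host-03']
    chosen = random.choice(matched)  # any of the three, not always matched[0]
    print(chosen)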
View File
@@ -75,6 +75,7 @@ class AuthMixin:
         if self.can_update_ssh_key():
             self.public_key = public_key
             self.save()
+            post_user_change_password.send(self.__class__, user=self)

     def can_update_password(self):
         return self.is_local
View File
@@ -17,6 +17,7 @@ from orgs.utils import current_org
 from rbac.builtin import BuiltinRole
 from rbac.models import OrgRoleBinding, SystemRoleBinding, Role
 from rbac.permissions import RBACPermission
+from users.signals import post_user_change_password
 from ..const import PasswordStrategy
 from ..models import User
@@ -268,6 +269,8 @@ class UserSerializer(RolesSerializerMixin, CommonBulkSerializerMixin, ResourceLa
         instance = self.save_and_set_custom_m2m_fields(
             validated_data, save_handler, created=False
         )
+        if validated_data.get('public_key'):
+            post_user_change_password.send(instance.__class__, user=instance)
         return instance

     def create(self, validated_data):
@@ -275,6 +278,8 @@ class UserSerializer(RolesSerializerMixin, CommonBulkSerializerMixin, ResourceLa
         instance = self.save_and_set_custom_m2m_fields(
             validated_data, save_handler, created=True
         )
+        if validated_data.get('public_key'):
+            post_user_change_password.send(instance.__class__, user=instance)
         return instance

     @classmethod
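Together with the AuthMixin change above, the serializer now fires post_user_change_password whenever a user's SSH public key is set or replaced, so the same listeners that react to password changes are reused. A hedged sketch of a receiver hooking into that signal (the receiver itself is illustrative, not part of this commit):

    from django.dispatch import receiver
    from users.signals import post_user_change_password

    @receiver(post_user_change_password)
    def on_credential_changed(sender, user=None, **kwargs):
        # Placeholder body: e.g. queue a security notification for the user.
        print('credential changed for', user)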
View File
@@ -33,6 +33,9 @@
 {% block custom_foot_js %}
 <script type="text/javascript" src="{% static 'js/pwstrength-bootstrap.js' %}"></script>
+<script type="text/javascript" src="{% static 'js/plugins/jsencrypt/jsencrypt.min.js' %}"></script>
+<script type="text/javascript" src="{% static 'js/plugins/cryptojs/crypto-js.min.js' %}"></script>
+<script type="text/javascript" src="{% static 'js/plugins/buffer/buffer.min.js' %}"></script>
 <script>
 $(document).ready(function () {
     // Password strength check
@@ -76,7 +79,8 @@ $(document).ready(function () {
         checkPasswordRules(password, minLength);
     })

-    $("form").submit(function(){
+    $("form").submit(function(event){
+        event.preventDefault()
         // Let's find the input to check
         var ids = ['id_new_password', 'id_confirm_password']
         for (id of ids) {
@@ -87,6 +91,7 @@ $(document).ready(function () {
                 passwordRef.val(value)
             }
         }
+        this.submit();
     });
 })
 </script>
130 poetry.lock generated
View File
@@ -1,4 +1,4 @@
-# This file is automatically @generated by Poetry 1.5.1 and should not be changed by hand.
+# This file is automatically @generated by Poetry 1.8.3 and should not be changed by hand.
 [[package]]
 name = "adal"
@@ -2138,28 +2138,6 @@ type = "legacy"
 url = "https://pypi.tuna.tsinghua.edu.cn/simple"
 reference = "tsinghua"
-[[package]]
-name = "django-rest-swagger"
-version = "2.2.0"
-description = "Swagger UI for Django REST Framework 3.5+"
-optional = false
-python-versions = "*"
-files = [
-    {file = "django-rest-swagger-2.2.0.tar.gz", hash = "sha256:48f6aded9937e90ae7cbe9e6c932b9744b8af80cc4e010088b3278c700e0685b"},
-    {file = "django_rest_swagger-2.2.0-py2.py3-none-any.whl", hash = "sha256:b039b0288bab4665cd45dc5d16f94b13911bc4ad0ed55f74ad3b90aa31c87c17"},
-]
-[package.dependencies]
-coreapi = ">=2.3.0"
-djangorestframework = ">=3.5.4"
-openapi-codec = ">=1.3.1"
-simplejson = "*"
-[package.source]
-type = "legacy"
-url = "https://pypi.tuna.tsinghua.edu.cn/simple"
-reference = "tsinghua"
 [[package]]
 name = "django-simple-captcha"
 version = "0.5.18"
@@ -2389,22 +2367,45 @@ url = "https://pypi.tuna.tsinghua.edu.cn/simple"
 reference = "tsinghua"
 [[package]]
-name = "elasticsearch"
-version = "7.8.0"
-description = "Python client for Elasticsearch"
+name = "elastic-transport"
+version = "8.13.1"
+description = "Transport classes and utilities shared among Python Elastic client libraries"
 optional = false
-python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, <4"
+python-versions = ">=3.7"
 files = [
-    {file = "elasticsearch-7.8.0-py2.py3-none-any.whl", hash = "sha256:6fb566dd23b91b5871ce12212888674b4cf33374e92b71b1080916c931e44dcb"},
-    {file = "elasticsearch-7.8.0.tar.gz", hash = "sha256:e637d8cf4e27e279b5ff8ca8edc0c086f4b5df4bf2b48e2f950b7833aca3a792"},
+    {file = "elastic_transport-8.13.1-py3-none-any.whl", hash = "sha256:5d4bb6b8e9d74a9c16de274e91a5caf65a3a8d12876f1e99152975e15b2746fe"},
+    {file = "elastic_transport-8.13.1.tar.gz", hash = "sha256:16339d392b4bbe86ad00b4bdeecff10edf516d32bc6c16053846625f2c6ea250"},
 ]
 [package.dependencies]
 certifi = "*"
-urllib3 = ">=1.21.1"
+urllib3 = ">=1.26.2,<3"
 [package.extras]
-async = ["aiohttp (>=3,<4)", "yarl"]
+develop = ["aiohttp", "furo", "httpx", "mock", "opentelemetry-api", "opentelemetry-sdk", "orjson", "pytest", "pytest-asyncio", "pytest-cov", "pytest-httpserver", "pytest-mock", "requests", "respx", "sphinx (>2)", "sphinx-autodoc-typehints", "trustme"]
+[package.source]
+type = "legacy"
+url = "https://pypi.tuna.tsinghua.edu.cn/simple"
+reference = "tsinghua"
+[[package]]
+name = "elasticsearch7"
+version = "7.17.9"
+description = "Python client for Elasticsearch"
+optional = false
+python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, <4"
+files = [
+    {file = "elasticsearch7-7.17.9-py2.py3-none-any.whl", hash = "sha256:24cfa00438dd1c0328f4c61e064bcfd4bbf5ff7684e2ec49cc46efbb7598b055"},
+    {file = "elasticsearch7-7.17.9.tar.gz", hash = "sha256:4868965d7d6af948c6f31510523f610e9b81299acd2fd35325e46090d786584d"},
+]
+[package.dependencies]
+certifi = "*"
+urllib3 = ">=1.21.1,<2"
+[package.extras]
+async = ["aiohttp (>=3,<4)"]
 develop = ["black", "coverage", "jinja2", "mock", "pytest", "pytest-cov", "pyyaml", "requests (>=2.0.0,<3.0.0)", "sphinx (<1.7)", "sphinx-rtd-theme"]
 docs = ["sphinx (<1.7)", "sphinx-rtd-theme"]
 requests = ["requests (>=2.4.0,<3.0.0)"]
@@ -2414,6 +2415,31 @@ type = "legacy"
 url = "https://pypi.tuna.tsinghua.edu.cn/simple"
 reference = "tsinghua"
+[[package]]
+name = "elasticsearch8"
+version = "8.13.2"
+description = "Python client for Elasticsearch"
+optional = false
+python-versions = ">=3.7"
+files = [
+    {file = "elasticsearch8-8.13.2-py3-none-any.whl", hash = "sha256:1691496d9a39b5504f768e2a3000574f0e9c684842b449f692ffc364a2171758"},
+    {file = "elasticsearch8-8.13.2.tar.gz", hash = "sha256:ae5f08a15f24b7af0025290f303c7777d781ebedb23c1805a46114ea0f918d0e"},
+]
+[package.dependencies]
+elastic-transport = ">=8.13,<9"
+[package.extras]
+async = ["aiohttp (>=3,<4)"]
+orjson = ["orjson (>=3)"]
+requests = ["requests (>=2.4.0,!=2.32.2,<3.0.0)"]
+vectorstore-mmr = ["numpy (>=1)", "simsimd (>=3)"]
+[package.source]
+type = "legacy"
+url = "https://pypi.tuna.tsinghua.edu.cn/simple"
+reference = "tsinghua"
 [[package]]
 name = "enum-compat"
 version = "0.0.3"
@@ -2836,14 +2862,8 @@ files = [
 [package.dependencies]
 google-auth = ">=2.14.1,<3.0.dev0"
 googleapis-common-protos = ">=1.56.2,<2.0.dev0"
-grpcio = [
-    {version = ">=1.33.2,<2.0dev", optional = true, markers = "extra == \"grpc\""},
-    {version = ">=1.49.1,<2.0dev", optional = true, markers = "python_version >= \"3.11\" and extra == \"grpc\""},
-]
-grpcio-status = [
-    {version = ">=1.33.2,<2.0.dev0", optional = true, markers = "extra == \"grpc\""},
-    {version = ">=1.49.1,<2.0.dev0", optional = true, markers = "python_version >= \"3.11\" and extra == \"grpc\""},
-]
+grpcio = {version = ">=1.49.1,<2.0dev", optional = true, markers = "python_version >= \"3.11\" and extra == \"grpc\""}
+grpcio-status = {version = ">=1.49.1,<2.0.dev0", optional = true, markers = "python_version >= \"3.11\" and extra == \"grpc\""}
 protobuf = ">=3.19.5,<3.20.0 || >3.20.0,<3.20.1 || >3.20.1,<4.21.0 || >4.21.0,<4.21.1 || >4.21.1,<4.21.2 || >4.21.2,<4.21.3 || >4.21.3,<4.21.4 || >4.21.4,<4.21.5 || >4.21.5,<5.0.0.dev0"
 requests = ">=2.18.0,<3.0.0.dev0"
@@ -3575,12 +3595,12 @@ reference = "tsinghua"
 [[package]]
 name = "jms-storage"
-version = "0.0.58"
+version = "0.0.59"
 description = "Jumpserver storage python sdk tools"
 optional = false
 python-versions = "*"
 files = [
-    {file = "jms-storage-0.0.58.tar.gz", hash = "sha256:9508864c5b65b2628291c39e3eeccedffd77448545a0359a4cb12c6435592e62"},
+    {file = "jms-storage-0.0.59.tar.gz", hash = "sha256:62171d5182f4ab774dbe4204aed8dabc37926869067e2ed4dd52afab1df8d1d3"},
 ]
 [package.dependencies]
@@ -3592,7 +3612,6 @@ certifi = "2023.7.22"
 chardet = "5.1.0"
 crcmod = "1.7"
 docutils = "0.20.1"
-elasticsearch = "7.8.0"
 esdk-obs-python = "3.21.4"
 idna = "3.4"
 oss2 = "2.18.1"
@@ -3915,16 +3934,6 @@ files = [
     {file = "MarkupSafe-2.1.3-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:5bbe06f8eeafd38e5d0a4894ffec89378b6c6a625ff57e3028921f8ff59318ac"},
     {file = "MarkupSafe-2.1.3-cp311-cp311-win32.whl", hash = "sha256:dd15ff04ffd7e05ffcb7fe79f1b98041b8ea30ae9234aed2a9168b5797c3effb"},
     {file = "MarkupSafe-2.1.3-cp311-cp311-win_amd64.whl", hash = "sha256:134da1eca9ec0ae528110ccc9e48041e0828d79f24121a1a146161103c76e686"},
-    {file = "MarkupSafe-2.1.3-cp312-cp312-macosx_10_9_universal2.whl", hash = "sha256:f698de3fd0c4e6972b92290a45bd9b1536bffe8c6759c62471efaa8acb4c37bc"},
-    {file = "MarkupSafe-2.1.3-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:aa57bd9cf8ae831a362185ee444e15a93ecb2e344c8e52e4d721ea3ab6ef1823"},
-    {file = "MarkupSafe-2.1.3-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ffcc3f7c66b5f5b7931a5aa68fc9cecc51e685ef90282f4a82f0f5e9b704ad11"},
-    {file = "MarkupSafe-2.1.3-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:47d4f1c5f80fc62fdd7777d0d40a2e9dda0a05883ab11374334f6c4de38adffd"},
-    {file = "MarkupSafe-2.1.3-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1f67c7038d560d92149c060157d623c542173016c4babc0c1913cca0564b9939"},
-    {file = "MarkupSafe-2.1.3-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:9aad3c1755095ce347e26488214ef77e0485a3c34a50c5a5e2471dff60b9dd9c"},
-    {file = "MarkupSafe-2.1.3-cp312-cp312-musllinux_1_1_i686.whl", hash = "sha256:14ff806850827afd6b07a5f32bd917fb7f45b046ba40c57abdb636674a8b559c"},
-    {file = "MarkupSafe-2.1.3-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:8f9293864fe09b8149f0cc42ce56e3f0e54de883a9de90cd427f191c346eb2e1"},
-    {file = "MarkupSafe-2.1.3-cp312-cp312-win32.whl", hash = "sha256:715d3562f79d540f251b99ebd6d8baa547118974341db04f5ad06d5ea3eb8007"},
-    {file = "MarkupSafe-2.1.3-cp312-cp312-win_amd64.whl", hash = "sha256:1b8dd8c3fd14349433c79fa8abeb573a55fc0fdd769133baac1f5e07abf54aeb"},
     {file = "MarkupSafe-2.1.3-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:8e254ae696c88d98da6555f5ace2279cf7cd5b3f52be2b5cf97feafe883b58d2"},
     {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:cb0932dc158471523c9637e807d9bfb93e06a95cbf010f1a38b98623b929ef2b"},
     {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9402b03f1a1b4dc4c19845e5c749e3ab82d5078d16a2a4c2cd2df62d57bb0707"},
@@ -4172,6 +4181,7 @@ files = [
     {file = "msgpack-1.0.8-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:5fbb160554e319f7b22ecf530a80a3ff496d38e8e07ae763b9e82fadfe96f273"},
     {file = "msgpack-1.0.8-cp39-cp39-win32.whl", hash = "sha256:f9af38a89b6a5c04b7d18c492c8ccf2aee7048aff1ce8437c4683bb5a1df893d"},
     {file = "msgpack-1.0.8-cp39-cp39-win_amd64.whl", hash = "sha256:ed59dd52075f8fc91da6053b12e8c89e37aa043f8986efd89e61fae69dc1b011"},
+    {file = "msgpack-1.0.8-py3-none-any.whl", hash = "sha256:24f727df1e20b9876fa6e95f840a2a2651e34c0ad147676356f4bf5fbb0206ca"},
     {file = "msgpack-1.0.8.tar.gz", hash = "sha256:95c02b0e27e706e48d0e5426d1710ca78e0f0628d6e89d5b5a5b91a5f12274f3"},
 ]
@@ -5791,11 +5801,9 @@ files = [
     {file = "pymssql-2.2.8-cp310-cp310-manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:049f2e3de919e8e02504780a21ebbf235e21ca8ed5c7538c5b6e705aa6c43d8c"},
     {file = "pymssql-2.2.8-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0dd86d8e3e346e34f3f03d12e333747b53a1daa74374a727f4714d5b82ee0dd5"},
     {file = "pymssql-2.2.8-cp310-cp310-manylinux_2_28_x86_64.whl", hash = "sha256:508226a0df7cb6faeda9f8e84e85743690ca427d7b27af9a73d75fcf0c1eef6e"},
-    {file = "pymssql-2.2.8-cp310-cp310-win_amd64.whl", hash = "sha256:47859887adeaf184766b5e0bc845dd23611f3808f9521552063bb36eabc10092"},
     {file = "pymssql-2.2.8-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:d873e553374d5b1c57fe1c43bb75e3bcc2920678db1ef26f6bfed396c7d21b30"},
     {file = "pymssql-2.2.8-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bf31b8b76634c826a91f9999e15b7bfb0c051a0f53b319fd56481a67e5b903bb"},
     {file = "pymssql-2.2.8-cp311-cp311-manylinux_2_28_x86_64.whl", hash = "sha256:821945c2214fe666fd456c61e09a29a00e7719c9e136c801bffb3a254e9c579b"},
-    {file = "pymssql-2.2.8-cp311-cp311-win_amd64.whl", hash = "sha256:cc85b609b4e60eac25fa38bbac1ff854fd2c2a276e0ca4a3614c6f97efb644bb"},
     {file = "pymssql-2.2.8-cp36-cp36m-macosx_10_14_x86_64.whl", hash = "sha256:ebe7f64d5278d807f14bea08951e02512bfbc6219fd4d4f15bb45ded885cf3d4"},
     {file = "pymssql-2.2.8-cp36-cp36m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:253af3d39fc0235627966817262d5c4c94ad09dcbea59664748063470048c29c"},
     {file = "pymssql-2.2.8-cp36-cp36m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2c9d109df536dc5f7dd851a88d285a4c9cb12a9314b621625f4f5ab1197eb312"},
@@ -5811,13 +5819,11 @@ files = [
     {file = "pymssql-2.2.8-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3906993300650844ec140aa58772c0f5f3e9e9d5709c061334fd1551acdcf066"},
     {file = "pymssql-2.2.8-cp38-cp38-manylinux_2_28_x86_64.whl", hash = "sha256:7309c7352e4a87c9995c3183ebfe0ff4135e955bb759109637673c61c9f0ca8d"},
     {file = "pymssql-2.2.8-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.whl", hash = "sha256:9b8d603cc1ec7ae585c5a409a1d45e8da067970c79dd550d45c238ae0aa0f79f"},
-    {file = "pymssql-2.2.8-cp38-cp38-win_amd64.whl", hash = "sha256:293cb4d0339e221d877d6b19a1905082b658f0100a1e2ccc9dda10de58938901"},
     {file = "pymssql-2.2.8-cp39-cp39-macosx_11_0_x86_64.whl", hash = "sha256:895041edd002a2e91d8a4faf0906b6fbfef29d9164bc6beb398421f5927fa40e"},
     {file = "pymssql-2.2.8-cp39-cp39-manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:6b2d9c6d38a416c6f2db36ff1cd8e69f9a5387a46f9f4f612623192e0c9404b1"},
     {file = "pymssql-2.2.8-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d63d6f25cf40fe6a03c49be2d4d337858362b8ab944d6684c268e4990807cf0c"},
     {file = "pymssql-2.2.8-cp39-cp39-manylinux_2_28_x86_64.whl", hash = "sha256:c83ad3ad20951f3a94894b354fa5fa9666dcd5ebb4a635dad507c7d1dd545833"},
     {file = "pymssql-2.2.8-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.whl", hash = "sha256:3933f7f082be74698eea835df51798dab9bc727d94d3d280bffc75ab9265f890"},
-    {file = "pymssql-2.2.8-cp39-cp39-win_amd64.whl", hash = "sha256:de313375b90b0f554058992f35c4a4beb3f6ec2f5912d8cd6afb649f95b03a9f"},
     {file = "pymssql-2.2.8.tar.gz", hash = "sha256:9baefbfbd07d0142756e2dfcaa804154361ac5806ab9381350aad4e780c3033e"},
 ]
@@ -6306,7 +6312,6 @@ files = [
     {file = "PyYAML-6.0.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:69b023b2b4daa7548bcfbd4aa3da05b3a74b772db9e23b982788168117739938"},
     {file = "PyYAML-6.0.1-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:81e0b275a9ecc9c0c0c07b4b90ba548307583c125f54d5b6946cfee6360c733d"},
     {file = "PyYAML-6.0.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ba336e390cd8e4d1739f42dfe9bb83a3cc2e80f567d8805e11b46f4a943f5515"},
-    {file = "PyYAML-6.0.1-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:326c013efe8048858a6d312ddd31d56e468118ad4cdeda36c719bf5bb6192290"},
     {file = "PyYAML-6.0.1-cp310-cp310-win32.whl", hash = "sha256:bd4af7373a854424dabd882decdc5579653d7868b8fb26dc7d0e99f823aa5924"},
     {file = "PyYAML-6.0.1-cp310-cp310-win_amd64.whl", hash = "sha256:fd1592b3fdf65fff2ad0004b5e363300ef59ced41c2e6b3a99d4089fa8c5435d"},
     {file = "PyYAML-6.0.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:6965a7bc3cf88e5a1c3bd2e0b5c22f8d677dc88a455344035f03399034eb3007"},
@@ -6314,15 +6319,8 @@ files = [
     {file = "PyYAML-6.0.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:42f8152b8dbc4fe7d96729ec2b99c7097d656dc1213a3229ca5383f973a5ed6d"},
     {file = "PyYAML-6.0.1-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:062582fca9fabdd2c8b54a3ef1c978d786e0f6b3a1510e0ac93ef59e0ddae2bc"},
     {file = "PyYAML-6.0.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d2b04aac4d386b172d5b9692e2d2da8de7bfb6c387fa4f801fbf6fb2e6ba4673"},
-    {file = "PyYAML-6.0.1-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:e7d73685e87afe9f3b36c799222440d6cf362062f78be1013661b00c5c6f678b"},
     {file = "PyYAML-6.0.1-cp311-cp311-win32.whl", hash = "sha256:1635fd110e8d85d55237ab316b5b011de701ea0f29d07611174a1b42f1444741"},
     {file = "PyYAML-6.0.1-cp311-cp311-win_amd64.whl", hash = "sha256:bf07ee2fef7014951eeb99f56f39c9bb4af143d8aa3c21b1677805985307da34"},
-    {file = "PyYAML-6.0.1-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:855fb52b0dc35af121542a76b9a84f8d1cd886ea97c84703eaa6d88e37a2ad28"},
-    {file = "PyYAML-6.0.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:40df9b996c2b73138957fe23a16a4f0ba614f4c0efce1e9406a184b6d07fa3a9"},
-    {file = "PyYAML-6.0.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6c22bec3fbe2524cde73d7ada88f6566758a8f7227bfbf93a408a9d86bcc12a0"},
-    {file = "PyYAML-6.0.1-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:8d4e9c88387b0f5c7d5f281e55304de64cf7f9c0021a3525bd3b1c542da3b0e4"},
-    {file = "PyYAML-6.0.1-cp312-cp312-win32.whl", hash = "sha256:d483d2cdf104e7c9fa60c544d92981f12ad66a457afae824d146093b8c294c54"},
-    {file = "PyYAML-6.0.1-cp312-cp312-win_amd64.whl", hash = "sha256:0d3304d8c0adc42be59c5f8a4d9e3d7379e6955ad754aa9d6ab7a398b59dd1df"},
     {file = "PyYAML-6.0.1-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:50550eb667afee136e9a77d6dc71ae76a44df8b3e51e41b77f6de2932bfe0f47"},
     {file = "PyYAML-6.0.1-cp36-cp36m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1fe35611261b29bd1de0070f0b2f47cb6ff71fa6595c077e42bd0c419fa27b98"},
     {file = "PyYAML-6.0.1-cp36-cp36m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:704219a11b772aea0d8ecd7058d0082713c3562b4e271b849ad7dc4a5c90c13c"},
@@ -6339,7 +6337,6 @@ files = [
     {file = "PyYAML-6.0.1-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a0cd17c15d3bb3fa06978b4e8958dcdc6e0174ccea823003a106c7d4d7899ac5"},
     {file = "PyYAML-6.0.1-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:28c119d996beec18c05208a8bd78cbe4007878c6dd15091efb73a30e90539696"},
     {file = "PyYAML-6.0.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7e07cbde391ba96ab58e532ff4803f79c4129397514e1413a7dc761ccd755735"},
-    {file = "PyYAML-6.0.1-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:49a183be227561de579b4a36efbb21b3eab9651dd81b1858589f796549873dd6"},
     {file = "PyYAML-6.0.1-cp38-cp38-win32.whl", hash = "sha256:184c5108a2aca3c5b3d3bf9395d50893a7ab82a38004c8f61c258d4428e80206"},
     {file = "PyYAML-6.0.1-cp38-cp38-win_amd64.whl", hash = "sha256:1e2722cc9fbb45d9b87631ac70924c11d3a401b2d7f410cc0e3bbf249f2dca62"},
     {file = "PyYAML-6.0.1-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:9eb6caa9a297fc2c2fb8862bc5370d0303ddba53ba97e71f08023b6cd73d16a8"},
@@ -6347,7 +6344,6 @@ files = [
     {file = "PyYAML-6.0.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5773183b6446b2c99bb77e77595dd486303b4faab2b086e7b17bc6bef28865f6"},
     {file = "PyYAML-6.0.1-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:b786eecbdf8499b9ca1d697215862083bd6d2a99965554781d0d8d1ad31e13a0"},
     {file = "PyYAML-6.0.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bc1bf2925a1ecd43da378f4db9e4f799775d6367bdb94671027b73b393a7c42c"},
-    {file = "PyYAML-6.0.1-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:04ac92ad1925b2cff1db0cfebffb6ffc43457495c9b3c39d3fcae417d7125dc5"},
     {file = "PyYAML-6.0.1-cp39-cp39-win32.whl", hash = "sha256:faca3bdcf85b2fc05d06ff3fbc1f83e1391b3e724afa3feba7d13eeab355484c"},
     {file = "PyYAML-6.0.1-cp39-cp39-win_amd64.whl", hash = "sha256:510c9deebc5c0225e8c96813043e62b680ba2f9c50a08d3724c7f28a747d1486"},
     {file = "PyYAML-6.0.1.tar.gz", hash = "sha256:bfdf460b1736c775f2ba9f6a92bca30bc2095067b8a9d77876d1fad6cc3b4a43"},
@@ -7905,4 +7901,4 @@ reference = "tsinghua"
 [metadata]
 lock-version = "2.0"
 python-versions = "^3.11"
-content-hash = "63da026ae0d9d5e6a62b25d265a50c95666e05f60cdae798c2e0443c93544e3a"
+content-hash = "54280c08e758b0767dbf7dd249a0d68f4a16c6de69a86723f00b2793c6bfd186"
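In the lock file, the single elasticsearch 7.8.0 pin is replaced by elastic-transport plus side-by-side elasticsearch7 and elasticsearch8 clients, and jms-storage moves to 0.0.59, which allows talking to both Elasticsearch major server versions. A hedged sketch of picking a client by server version (the selection helper is illustrative, not taken from this commit):

    def get_es_client(hosts, server_major_version):
        # Both packages expose an Elasticsearch class; import the one matching the server.
        if server_major_version >= 8:
            from elasticsearch8 import Elasticsearch
        else:
            from elasticsearch7 import Elasticsearch
        return Elasticsearch(hosts)

    # es = get_es_client(['http://localhost:9200'], server_major_version=8)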
View File
@@ -47,7 +47,7 @@ pynacl = "1.5.0"
 python-dateutil = "2.8.2"
 pyyaml = "6.0.1"
 requests = "2.31.0"
-jms-storage = "^0.0.58"
+jms-storage = "^0.0.59"
 simplejson = "3.19.1"
 six = "1.16.0"
 sshtunnel = "0.4.0"
@@ -83,7 +83,6 @@ django-bootstrap3 = "23.4"
 django-filter = "23.2"
 django-formtools = "2.4.1"
 django-ranged-response = "0.2.0"
-django-rest-swagger = "2.2.0"
 django-simple-captcha = "0.5.18"
 django-timezone-field = "5.1"
 djangorestframework = "3.14.0"
@@ -156,6 +155,9 @@ annotated-types = "^0.6.0"
 httpx = "^0.27.0"
 distro = "1.9.0"
 tqdm = "4.66.4"
+elasticsearch7 = "7.17.9"
+elasticsearch8 = "8.13.2"

 [tool.poetry.group.xpack.dependencies]