Common app

The common app provides shared views, functions, and other utilities used across apps.

The common app shouldn't depend on other apps, because that may lead to circular imports.

If you want to implement a function or class, consider whether other apps will use it. If so, put it in common.

If the ability is tightly coupled to your app, then your app should provide it rather than common; write it in your app's own utils.
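For example, a small helper that several apps might need could live in common's utils. A minimal sketch (the helper below is illustrative, not necessarily an existing function in this app):

common/utils.py

# Illustrative shared helper (assumed example)
def get_object_or_none(model, **kwargs):
    # Return the matching object, or None if it does not exist
    try:
        return model.objects.get(**kwargs)
    except model.DoesNotExist:
        return None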

Celery usage

Jumpserver uses Celery to run tasks asynchronously. Redis is used as the broker, so you need a running Redis instance.

Run redis

$ yum -y install redis 

or

$ docker run --name jumpserver-redis -d -p 6379:6379 redis redis-server
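With Redis running, point Celery at it as the broker. A minimal sketch of the relevant settings, assuming Redis on localhost with the default port (the exact setting names depend on how the Celery app is configured):

# settings.py (assumed: local Redis on the default port)
CELERY_BROKER_URL = 'redis://localhost:6379/0'
CELERY_RESULT_BACKEND = 'redis://localhost:6379/0'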

Write tasks in app_name/tasks.py

ops/tasks.py

from __future__ import absolute_import

import time

from celery import shared_task

from common import celery_app


@shared_task
def longtime_add(x, y):
    print('long time task begins')
    # sleep 5 seconds to simulate a long-running job
    time.sleep(5)
    print('long time task finished')
    return x + y


@celery_app.task(name='hello-world')
def hello():
    print('hello world!')
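Since hello is registered with the explicit name 'hello-world', it can also be queued by that name through Celery's standard send_task API, without importing the function:

from common import celery_app

# queue the task by its registered name; returns an AsyncResult
result = celery_app.send_task('hello-world')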
  

Run celery in development

$ cd apps
$ celery -A common worker -l info 
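By default the worker starts one process per CPU core; the concurrency can be set explicitly with the standard -c option if needed:

$ celery -A common worker -l info -c 4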

Test the task

$ ./manage.py shell
>>> from ops.tasks import longtime_add
>>> res = longtime_add.delay(1, 2)
>>> res.get()
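delay() returns a Celery AsyncResult; besides blocking with get(), you can poll it or pass a timeout:

>>> res.ready()          # True once the worker has finished
>>> res.get(timeout=10)  # 3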