Common app

The common app provides shared views, functions, and other utilities.

The common app shouldn't depend on other apps, because that may lead to circular imports.

If you want to implement a function or class, consider whether other apps will use it. If so, it belongs in common.

If the functionality is tightly coupled to your own app, that app should provide it, not common; put it in your app's utils.
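As a sketch of this rule, a small helper that several apps need could live in common. The file path and function name below are illustrative only, not the actual JumpServer layout:

```python
# common/utils.py (hypothetical example of a helper shared by many apps)

def truncate_string(value, length=32, suffix='...'):
    """Shorten a string to at most `length` characters, appending a suffix."""
    if len(value) <= length:
        return value
    return value[:length - len(suffix)] + suffix

# Any app (assets, ops, ...) could then import it without creating a cycle:
# from common.utils import truncate_string
print(truncate_string('a' * 40))
```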

Celery usage

JumpServer uses Celery to run tasks asynchronously, with Redis as the broker, so you need a running Redis instance.
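A minimal sketch of what the broker configuration might look like in the Django settings (illustrative values, not the actual JumpServer settings file):

```python
# settings.py (sketch): point Celery at the local Redis instance.
# The URL format is redis://host:port/db-number.
CELERY_BROKER_URL = 'redis://localhost:6379/0'
CELERY_RESULT_BACKEND = 'redis://localhost:6379/0'
```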

Run redis

$ yum -y install redis
$ systemctl start redis

or

$ docker run --name jumpserver-redis -d -p 6379:6379 redis redis-server

Write tasks in app_name/tasks.py

ops/tasks.py

import time

from celery import shared_task
from common import celery_app


@shared_task
def longtime_add(x, y):
    print('long time task begins')
    # sleep 5 seconds to simulate a long-running task
    time.sleep(5)
    print('long time task finished')
    return x + y


@celery_app.task(name='hello-world')
def hello():
    print('hello world!')


Run celery in development

$ cd apps
$ celery -A common worker -l info 

Test the task

$ ./manage.py shell
>>> from ops.tasks import longtime_add
>>> res = longtime_add.delay(1, 2)
>>> res.get()
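For quick local experiments without a running worker, Celery's eager mode executes tasks synchronously in-process. This is a development-only setting, shown here as an illustrative snippet rather than part of the actual JumpServer configuration:

```python
# Development settings (sketch): with eager mode enabled,
# .delay() / .apply_async() run the task inline instead of
# sending it to a worker, so results are available immediately.
CELERY_TASK_ALWAYS_EAGER = True
```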