Scheduling/enqueuing a job from inside another job with Redis Queue (RQ)

Problem description

The Tasker class sets up the initial job when it is instantiated. What I basically want is: put a job on 'main_queue'; from inside that 'main_queue' job, check whether the same job is already queued or running on 'process_queue' and, if so, return from the current 'main_queue' job. Otherwise, enqueue the job on 'process_queue'. When that process_queue job finishes, it puts a job back on 'main_queue'.

However, during this time 'process_queue' still has a job with the same id, even though, judging by the output, it should already have finished. As a result, a new job is never put on process_queue. Is there some kind of deadlock happening that I can't see?
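
A quick way to see what fetch_job is actually returning at that point is to inspect the job's status from a separate Python shell. This is only a diagnostic sketch, assuming a local Redis and the queue/job names used below:

from redis import Redis
from rq import Queue

q = Queue('process_queue', connection=Redis())
job = q.fetch_job('test_job')
if job is not None:
    # A job that has already run can still be fetched by its id; it reports
    # 'finished' here rather than 'queued' or 'started'.
    print(job.id, job.get_status())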

main_queue worker

$ rq worker main_queue --with-scheduler
22:44:19 Worker rq:worker:7fe23a24ae404135a10e301f7509eb7e: started, version 1.9.0
22:44:19 Subscribing to channel rq:pubsub:7fe23a24ae404135a10e301f7509eb7e
22:44:19 *** Listening on main_queue...
22:44:19 Trying to acquire locks for main_queue
22:44:19 Scheduler for main_queue started with PID 3747
22:44:19 Cleaning registries for queue: main_queue
22:44:33 main_queue: tasks.redis_test_job() (e90e0dff-bbcc-48ab-afed-6d1ba8b020a8)
None
Job is enqueued to process_queue!
22:44:33 main_queue: Job OK (e90e0dff-bbcc-48ab-afed-6d1ba8b020a8)
22:44:33 Result is kept for 500 seconds
22:44:47 main_queue: tasks.redis_test_job() (1a7f91d0-73f4-466e-92f4-9f918a9dd1e9)
<Job test_job: tasks.print_job()>
!!Scheduler added job to main but same job is already queued in process_queue!!
22:44:47 main_queue: Job OK (1a7f91d0-73f4-466e-92f4-9f918a9dd1e9)
22:44:47 Result is kept for 500 seconds

process_queue worker

$ rq worker process_queue
22:44:24 Worker rq:worker:d70daf20ff324c18bc17f0ea9576df52: started, version 1.9.0
22:44:24 Subscribing to channel rq:pubsub:d70daf20ff324c18bc17f0ea9576df52
22:44:24 *** Listening on process_queue...
22:44:24 Cleaning registries for queue: process_queue
22:44:33 process_queue: tasks.print_job() (test_job)
The process job executed.
22:44:42 process_queue: Job OK (test_job)
22:44:42 Result is kept for 500 seconds

tasker.py

from datetime import timedelta
from redis import Redis
from rq import Queue
import tasks

class Tasker():
    def __init__(self):
        self.tasker_conn = RedisClient().conn  # custom connection helper
        self.process_queue = Queue(name='process_queue', connection=Redis(), default_timeout=-1)
        self.main_queue = Queue(name='main_queue', connection=Redis(), default_timeout=-1)
        self.__setup_tasks()

    def __setup_tasks(self):
        # Kick off the cycle: schedule the first main_queue job 3 seconds from now.
        self.main_queue.enqueue_in(timedelta(seconds=3), tasks.redis_test_job)
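
For context, a job created with enqueue_in() is parked in the queue's scheduled-job registry until the worker started with --with-scheduler moves it onto main_queue when it is due. A minimal sketch (assuming the same local Redis) for peeking at that registry:

from redis import Redis
from rq import Queue

main_queue = Queue('main_queue', connection=Redis())
# Ids of jobs scheduled with enqueue_in()/enqueue_at() that are not yet due.
print(main_queue.scheduled_job_registry.get_job_ids())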

tasks.py

from datetime import timedelta
from time import sleep
from redis import Redis
from rq import Queue
import tasks  # self-import so enqueued callables resolve as 'tasks.<function>'

def redis_test_job():
    q = Queue('process_queue', connection=Redis(), default_timeout=-1)
    # fetch_job returns the job if anything still exists under this id.
    queued = q.fetch_job('test_job')
    print(queued)
    if queued:
        print("!!Scheduler added job to main but same job is already queued in process_queue!!")
        return False
    else:
        q.enqueue(tasks.print_job, job_id='test_job')
        print("Job is enqueued to process_queue!")
    return True

def print_job():
    sleep(8)
    print("The process job executed.")
    # Hand control back: schedule the next main_queue check in 5 seconds.
    q = Queue('main_queue', connection=Redis(), default_timeout=-1)
    q.enqueue_in(timedelta(seconds=5), tasks.redis_test_job)
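
For completeness, this is roughly how the cycle would be kicked off once both workers are running; the entry-point script below is an assumption, not part of the original post:

# run.py (hypothetical): instantiating Tasker schedules the first
# redis_test_job on main_queue; redis_test_job and print_job then keep
# re-scheduling each other across the two queues as described above.
from tasker import Tasker

if __name__ == '__main__':
    Tasker()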

Solution

From the docs: an enqueued job has a result_ttl which, if you don't set it yourself, defaults to 500 seconds.
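
That default is also what the worker log means by "Result is kept for 500 seconds": after finishing, the job moves to the queue's finished-job registry and stays fetchable by its id for result_ttl seconds, which is why fetch_job('test_job') in redis_test_job() is still truthy. A minimal sketch (same local-Redis assumption) that makes this visible:

from redis import Redis
from rq import Queue

q = Queue('process_queue', connection=Redis())
# 'test_job' keeps appearing here for result_ttl seconds after it completed.
print(q.finished_job_registry.get_job_ids())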

If you want to change that, for example so that the job and its result are kept for only 1 second, enqueue your job like this:

q.enqueue(tasks.print_job, job_id='test_job', result_ttl=1)
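
If you still need the return value for its normal lifetime, an alternative pattern (not from the original answer, just RQ's public job-status API) is to keep the default result_ttl and have redis_test_job() bail out only when the job is actually waiting or running:

queued = q.fetch_job('test_job')
# Only treat the job as a duplicate if it is waiting or running; a 'finished'
# job left behind by result_ttl no longer blocks re-enqueueing.
if queued and queued.get_status() in ('queued', 'started'):
    return False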