angelliang / celery-sqlalchemy-scheduler

A SQLAlchemy-based scheduler for Celery.

License: MIT License

Python 100.00%
celery celery-sqlalchemy-scheduler crontab-schedule python python3 sqlalchemy

celery-sqlalchemy-scheduler's People

Contributors

angelliang, gsingh42, manuwelakanade, minsis, subhamd, xxxss


celery-sqlalchemy-scheduler's Issues

Interval task does not send after being re-enabled

An Interval-type task that has been paused for a while and then re-enabled no longer sends, even though the scheduler still reads it from the schedule normally.
Resetting the last_run_at field makes it run again:
task.enabled = True
task.last_run_at = None

Crontab tasks do not have this problem.
Is it just me?
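The reset described above can be sketched as a small helper. A stand-in class is used here instead of the real celery_sqlalchemy_scheduler PeriodicTask model, so this only illustrates the workaround, not the package's API:

```python
from datetime import datetime, timezone

class FakePeriodicTask:
    """Stand-in for the PeriodicTask model (illustration only --
    the real model is a SQLAlchemy mapped class)."""
    def __init__(self):
        self.enabled = False
        self.last_run_at = datetime(2020, 1, 1, tzinfo=timezone.utc)

def reenable(task):
    # Re-enable the task and clear the stale last_run_at so the
    # scheduler recomputes the next run from "now" instead of the
    # timestamp recorded before the pause.
    task.enabled = True
    task.last_run_at = None
    return task

task = reenable(FakePeriodicTask())
```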

Recursion error on repr for IntervalSchedule object

In the IntervalSchedule object's __repr__, self is passed into format() instead of the object's attributes. This causes infinite recursion whenever the object is printed.

  File "/home/dwhitney/miniconda3/envs/scorecardv2_frontend/lib/python3.8/site-packages/celery_sqlalchemy_scheduler/models.py", line 55, in __repr__
    if self.every == 1:
  File "/home/dwhitney/miniconda3/envs/scorecardv2_frontend/lib/python3.8/site-packages/sqlalchemy/orm/attributes.py", line 283, in __get__
    dict_ = instance_dict(instance)
RecursionError: maximum recursion depth exceeded while calling a Python object
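The recursion can be reproduced with a minimal class: formatting self inside __repr__ re-enters __repr__ (str() falls back to __repr__ when no __str__ is defined), while formatting the attributes does not. This is a sketch, not the package's actual code:

```python
class BuggyInterval:
    def __repr__(self):
        # '{0}'.format(self) calls str(self), which falls back to
        # __repr__ -> infinite recursion.
        return '<interval {0}>'.format(self)

class FixedInterval:
    def __init__(self, every=1, period='seconds'):
        self.every = every
        self.period = period

    def __repr__(self):
        # Format the attributes, not the object itself.
        return '<interval every {} {}>'.format(self.every, self.period)
```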

Updating celery_periodic_task_changed triggers ALL existing tasks

Whenever I need to update the scheduled tasks, I set celery_periodic_task_changed to the current time so that beat becomes aware of the changes.

The problem is that doing so triggers all the scheduled tasks to run immediately, when I only need them to run at their scheduled times.

Is there any workaround for this?

Thanks
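One workaround direction (an assumption, not a documented fix for this package) is to stamp each task's last_run_at with the current time in the same transaction that bumps celery_periodic_task_changed, so beat reloads the schedule but treats every task as having just run. Sketched here with plain dicts standing in for the model rows:

```python
from datetime import datetime, timezone

def mark_changed_without_backfire(tasks, now=None):
    """Stamp last_run_at on every task row before signalling the change,
    so reloaded tasks are not all considered overdue at once."""
    now = now or datetime.now(timezone.utc)
    for task in tasks:
        task['last_run_at'] = now
    # ...then update celery_periodic_task_changed to `now` and commit.
    return now

rows = [{'name': 'daily-report', 'last_run_at': None},
        {'name': 'cache-warmup', 'last_run_at': None}]
stamp = mark_changed_without_backfire(rows)
```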

crontab task times are wrong

Interval task times are fine, but if I add a task with minute */10 in the database, the next run time is computed in UTC. My timezone is Asia/Shanghai, so the task gets scheduled 8 hours later than expected.
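The 8-hour gap matches the UTC offset of Asia/Shanghai, which can be checked with the standard library (this only verifies the arithmetic, not the package's internals; a fixed +08:00 offset is used so the example needs no tz database):

```python
from datetime import datetime, timezone, timedelta

SHANGHAI = timezone(timedelta(hours=8))  # Asia/Shanghai is UTC+08:00

# A run meant for 10:00 Asia/Shanghai is 02:00 UTC.  If the scheduler
# instead treats the wall-clock value as UTC, the run lands 8 hours late.
intended = datetime(2020, 9, 8, 10, 0, tzinfo=SHANGHAI)
misread_as_utc = datetime(2020, 9, 8, 10, 0, tzinfo=timezone.utc)
drift_hours = (misread_as_utc - intended).total_seconds() / 3600
```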

Periodic tasks are being deleted.

I'm running celery beat and a celery worker inside a Docker container. When the container restarts, all the periodic tasks seem to be deleted. I can't tell whether this is an issue with this package or beat doing something odd.

The database entry doesn't have an expire datetime set.

Celery version: 4.4.7

Any ideas on where I could start troubleshooting? The issue is obscure enough that I don't even know how to begin looking at it.

crontab tasks are not sent

Tasks of the form <crontab: 2,3,7,9 * * * * (m/h/d/dM/MY), Asia/Shanghai> are never sent.

Environment

Mac OS 10.15.6
Python 3.6.5
celery 4.4.7
MySQL 5.7.18-14
celery-sqlalchemy-scheduler 0.2.7

Log:

celery -A celery_pro.capp  beat -l debug --scheduler celery_sqlalchemy_scheduler.schedulers:DatabaseScheduler
using config celery_pro.celery_test_config
celery beat v4.4.7 (cliffs) is starting.
__    -    ... __   -        _
LocalTime -> 2020-09-08 16:41:09
Configuration ->
    . broker -> amqp://guest:**@127.0.0.1:5672//
    . loader -> celery.loaders.app.AppLoader
    . scheduler -> celery_sqlalchemy_scheduler.schedulers.DatabaseScheduler
    . db -> mysql+pymysql://XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
    . logfile -> [stderr]@%DEBUG
    . maxinterval -> 10.00 seconds (10s)
[2020-09-08 16:41:09,661: DEBUG/MainProcess] Setting default socket timeout to 30
[2020-09-08 16:41:09,662: INFO/MainProcess] beat: Starting...
[2020-09-08 16:41:09,662: INFO/MainProcess] setup_schedule
[2020-09-08 16:41:09,662: DEBUG/MainProcess] DatabaseScheduler: initial read
[2020-09-08 16:41:09,662: INFO/MainProcess] Writing entries...
[2020-09-08 16:41:09,663: DEBUG/MainProcess] DatabaseScheduler: Fetching database schedule
[2020-09-08 16:41:09,784: DEBUG/MainProcess] schedule: <crontab: 0 4 * * * (m/h/d/dM/MY), Asia/Shanghai>
[2020-09-08 16:41:09,803: DEBUG/MainProcess] schedule: <crontab: 0 * * * * (m/h/d/dM/MY), Asia/Shanghai>
[2020-09-08 16:41:09,809: DEBUG/MainProcess] schedule: <crontab: 2,3,7,9 * * * * (m/h/d/dM/MY), Asia/Shanghai>
[2020-09-08 16:41:09,814: DEBUG/MainProcess] schedule: <crontab: 1,2,3,7,9 * * * * (m/h/d/dM/MY), Asia/Shanghai>
[2020-09-08 16:41:09,818: DEBUG/MainProcess] schedule: <crontab: * * * * * (m/h/d/dM/MY), Asia/Shanghai>
[2020-09-08 16:41:09,822: DEBUG/MainProcess] schedule: <freq: 10.00 seconds>
[2020-09-08 16:41:09,828: DEBUG/MainProcess] schedule: <crontab: 30 * * * * (m/h/d/dM/MY), Asia/Shanghai>
[2020-09-08 16:41:09,831: DEBUG/MainProcess] schedule: <crontab: 28 * * * * (m/h/d/dM/MY), Asia/Shanghai>
[2020-09-08 16:41:09,870: DEBUG/MainProcess] schedule: <crontab: 40 * * * * (m/h/d/dM/MY), Asia/Shanghai>
[2020-09-08 16:41:10,182: DEBUG/MainProcess] schedule: <crontab: */2 * * * * (m/h/d/dM/MY), Asia/Shanghai>
[2020-09-08 16:41:10,209: DEBUG/MainProcess] schedule: <crontab: 45 * * * * (m/h/d/dM/MY), Asia/Shanghai>
[2020-09-08 16:41:10,216: DEBUG/MainProcess] Current schedule:
<ModelEntry: celery.backend_cleanup celery.backend_cleanup(*[], **{}) <crontab: 0 4 * * * (m/h/d/dM/MY), Asia/Shanghai>>
<ModelEntry: echo-every-hours celery_pro.tasks.echo(*['echo-every-hours'], **{}) <crontab: 0 * * * * (m/h/d/dM/MY), Asia/Shanghai>>
<ModelEntry: echo-every-2,3,7,9-min celery_pro.tasks.echo(*['echo-every-2,3,7,9-min'], **{}) <crontab: 2,3,7,9 * * * * (m/h/d/dM/MY), Asia/Shanghai>>
<ModelEntry: echo-every-1,2,3,7,9-min celery_pro.tasks.echo(*['echo-every-1,2,3,7,9-min'], **{}) <crontab: 1,2,3,7,9 * * * * (m/h/d/dM/MY), Asia/Shanghai>>
<ModelEntry: echo-every--min celery_pro.tasks.echo(*['echo-every--min'], **{}) <crontab: * * * * * (m/h/d/dM/MY), Asia/Shanghai>>
<ModelEntry: echo-every-10s celery_pro.tasks.echo(*['echo-every-10s'], **{}) <freq: 10.00 seconds>>
<ModelEntry: echo-every-30-hour celery_pro.tasks.echo(*['echo-every-30-hour'], **{}) <crontab: 30 * * * * (m/h/d/dM/MY), Asia/Shanghai>>
<ModelEntry: echo-every-28-hour celery_pro.tasks.echo(*['echo-every-28-hour'], **{}) <crontab: 28 * * * * (m/h/d/dM/MY), Asia/Shanghai>>
<ModelEntry: echo-every-40-hour celery_pro.tasks.echo(*['echo-every-40-hour'], **{}) <crontab: 40 * * * * (m/h/d/dM/MY), Asia/Shanghai>>
<ModelEntry: echo-every-*/2 celery_pro.tasks.echo(*['echo-every-*/2'], **{}) <crontab: */2 * * * * (m/h/d/dM/MY), Asia/Shanghai>>
<ModelEntry: echo-every-45-hour celery_pro.tasks.echo(*['echo-every-45-hour'], **{}) <crontab: 45 * * * * (m/h/d/dM/MY), Asia/Shanghai>>
[2020-09-08 16:41:10,333: DEBUG/MainProcess] schedule: <crontab: 0 4 * * * (m/h/d/dM/MY), Asia/Shanghai>
[2020-09-08 16:41:10,389: DEBUG/MainProcess] beat: Ticking with max interval->10.00 seconds
[2020-09-08 16:41:10,725: DEBUG/MainProcess] beat: Waking up in 5.26 seconds.
[2020-09-08 16:41:15,993: DEBUG/MainProcess] beat: Synchronizing schedule...
[2020-09-08 16:41:15,993: INFO/MainProcess] Writing entries...
[2020-09-08 16:41:16,036: DEBUG/MainProcess] schedule: <freq: 10.00 seconds>
[2020-09-08 16:41:16,047: DEBUG/MainProcess] Start from server, version: 0.9, properties: {'capabilities': {'publisher_confirms': True, 'exchange_exchange_bindings': True, 'basic.nack': True, 'consumer_cancel_notify': True, 'connection.blocked': True, 'consumer_priorities': True, 'authentication_failure_close': True, 'per_consumer_qos': True, 'direct_reply_to': True}, 'cluster_name': 'rabbit@chennan', 'copyright': 'Copyright (c) 2007-2020 VMware, Inc. or its affiliates.', 'information': 'Licensed under the MPL 2.0. Website: https://rabbitmq.com', 'platform': 'Erlang/OTP 23.0.3', 'product': 'RabbitMQ', 'version': '3.8.7'}, mechanisms: [b'PLAIN', b'AMQPLAIN'], locales: ['en_US']
[2020-09-08 16:41:16,048: INFO/MainProcess] Scheduler: Sending due task echo-every-10s (celery_pro.tasks.echo)
[2020-09-08 16:41:16,055: DEBUG/MainProcess] using channel_id: 1
[2020-09-08 16:41:16,056: DEBUG/MainProcess] Channel open
[2020-09-08 16:41:16,058: DEBUG/MainProcess] celery_pro.tasks.echo sent. id->f4fc3de9-416e-49fa-9d7b-03633fad5e28
[2020-09-08 16:41:16,084: DEBUG/MainProcess] beat: Waking up in 9.94 seconds.
[2020-09-08 16:41:26,055: DEBUG/MainProcess] schedule: <freq: 10.00 seconds>
[2020-09-08 16:41:26,055: INFO/MainProcess] Scheduler: Sending due task echo-every-10s (celery_pro.tasks.echo)
[2020-09-08 16:41:26,056: DEBUG/MainProcess] celery_pro.tasks.echo sent. id->4679400c-f6f3-4046-90cd-6f698dad7403
[2020-09-08 16:41:26,257: DEBUG/MainProcess] beat: Waking up in 9.79 seconds.
[2020-09-08 16:41:36,076: DEBUG/MainProcess] schedule: <freq: 10.00 seconds>
[2020-09-08 16:41:36,076: INFO/MainProcess] Scheduler: Sending due task echo-every-10s (celery_pro.tasks.echo)
[2020-09-08 16:41:36,077: DEBUG/MainProcess] celery_pro.tasks.echo sent. id->64d8dae4-1f33-4ff5-a86a-62e22c30570f
[2020-09-08 16:41:36,227: DEBUG/MainProcess] beat: Waking up in 9.84 seconds.
[2020-09-08 16:41:46,314: DEBUG/MainProcess] schedule: <freq: 10.00 seconds>
[2020-09-08 16:41:46,314: INFO/MainProcess] Scheduler: Sending due task echo-every-10s (celery_pro.tasks.echo)
[2020-09-08 16:41:46,314: DEBUG/MainProcess] celery_pro.tasks.echo sent. id->0b2b2cae-9128-4c6b-a7b7-529cc2f89b3d
[2020-09-08 16:41:46,344: DEBUG/MainProcess] beat: Waking up in 9.96 seconds.
[2020-09-08 16:41:56,356: DEBUG/MainProcess] schedule: <freq: 10.00 seconds>
[2020-09-08 16:41:56,356: INFO/MainProcess] Scheduler: Sending due task echo-every-10s (celery_pro.tasks.echo)
[2020-09-08 16:41:56,357: DEBUG/MainProcess] celery_pro.tasks.echo sent. id->4f537797-614b-4a18-9527-bfd9d7fb9652
[2020-09-08 16:41:56,432: DEBUG/MainProcess] beat: Waking up in 3.56 seconds.
[2020-09-08 16:42:00,021: DEBUG/MainProcess] schedule: <crontab: */2 * * * * (m/h/d/dM/MY), Asia/Shanghai>
[2020-09-08 16:42:00,021: INFO/MainProcess] Scheduler: Sending due task echo-every-*/2 (celery_pro.tasks.echo)
[2020-09-08 16:42:00,021: DEBUG/MainProcess] celery_pro.tasks.echo sent. id->f5d821b5-6070-481a-af79-0418ede6ea30
[2020-09-08 16:42:00,047: DEBUG/MainProcess] schedule: <crontab: * * * * * (m/h/d/dM/MY), Asia/Shanghai>
[2020-09-08 16:42:00,047: INFO/MainProcess] Scheduler: Sending due task echo-every--min (celery_pro.tasks.echo)
[2020-09-08 16:42:00,048: DEBUG/MainProcess] celery_pro.tasks.echo sent. id->29ea6744-c491-471d-8044-09df1e7d621e
[2020-09-08 16:42:00,093: DEBUG/MainProcess] beat: Waking up in 6.25 seconds.
[2020-09-08 16:42:06,372: DEBUG/MainProcess] schedule: <freq: 10.00 seconds>
[2020-09-08 16:42:06,372: INFO/MainProcess] Scheduler: Sending due task echo-every-10s (celery_pro.tasks.echo)
[2020-09-08 16:42:06,373: DEBUG/MainProcess] celery_pro.tasks.echo sent. id->7fcca158-9831-4f9c-b543-14c7a5f93006
[2020-09-08 16:42:06,397: DEBUG/MainProcess] beat: Waking up in 9.96 seconds.
[2020-09-08 16:42:16,390: DEBUG/MainProcess] schedule: <freq: 10.00 seconds>
[2020-09-08 16:42:16,390: INFO/MainProcess] Scheduler: Sending due task echo-every-10s (celery_pro.tasks.echo)
[2020-09-08 16:42:16,391: DEBUG/MainProcess] celery_pro.tasks.echo sent. id->76ad4417-3de9-4a54-a4cc-18f33e2672a3
[2020-09-08 16:42:16,415: DEBUG/MainProcess] beat: Waking up in 9.96 seconds.
[2020-09-08 16:42:26,434: DEBUG/MainProcess] schedule: <freq: 10.00 seconds>
[2020-09-08 16:42:26,434: INFO/MainProcess] Scheduler: Sending due task echo-every-10s (celery_pro.tasks.echo)
[2020-09-08 16:42:26,435: DEBUG/MainProcess] celery_pro.tasks.echo sent. id->09048f9e-1520-4a5b-aef5-acbbc0d2d974
[2020-09-08 16:42:26,556: DEBUG/MainProcess] beat: Waking up in 9.87 seconds.
[2020-09-08 16:42:36,687: DEBUG/MainProcess] schedule: <freq: 10.00 seconds>
[2020-09-08 16:42:36,687: INFO/MainProcess] Scheduler: Sending due task echo-every-10s (celery_pro.tasks.echo)
[2020-09-08 16:42:36,688: DEBUG/MainProcess] celery_pro.tasks.echo sent. id->f7ecaf26-dcf8-4f26-b9b2-8cdab84d0e65
[2020-09-08 16:42:36,728: DEBUG/MainProcess] beat: Waking up in 9.95 seconds.
[2020-09-08 16:42:46,705: DEBUG/MainProcess] schedule: <freq: 10.00 seconds>
[2020-09-08 16:42:46,705: INFO/MainProcess] Scheduler: Sending due task echo-every-10s (celery_pro.tasks.echo)
[2020-09-08 16:42:46,706: DEBUG/MainProcess] celery_pro.tasks.echo sent. id->4ccd9014-8a46-41ba-9998-fd2e0eae2400
[2020-09-08 16:42:46,738: DEBUG/MainProcess] beat: Waking up in 9.96 seconds.
[2020-09-08 16:42:56,721: DEBUG/MainProcess] schedule: <freq: 10.00 seconds>
[2020-09-08 16:42:56,721: INFO/MainProcess] Scheduler: Sending due task echo-every-10s (celery_pro.tasks.echo)
[2020-09-08 16:42:56,722: DEBUG/MainProcess] celery_pro.tasks.echo sent. id->7b53cd39-9c5c-4227-b21e-8d88e3b5ed57
[2020-09-08 16:42:56,768: DEBUG/MainProcess] beat: Waking up in 3.22 seconds.
[2020-09-08 16:43:00,140: DEBUG/MainProcess] schedule: <crontab: * * * * * (m/h/d/dM/MY), Asia/Shanghai>
[2020-09-08 16:43:00,140: INFO/MainProcess] Scheduler: Sending due task echo-every--min (celery_pro.tasks.echo)
[2020-09-08 16:43:00,140: DEBUG/MainProcess] celery_pro.tasks.echo sent. id->65f7dab5-8784-4ce8-ad1b-559b5f828101
[2020-09-08 16:43:00,171: DEBUG/MainProcess] beat: Waking up in 6.54 seconds.
[2020-09-08 16:43:06,831: DEBUG/MainProcess] schedule: <freq: 10.00 seconds>
[2020-09-08 16:43:06,831: INFO/MainProcess] Scheduler: Sending due task echo-every-10s (celery_pro.tasks.echo)
[2020-09-08 16:43:06,831: DEBUG/MainProcess] celery_pro.tasks.echo sent. id->1f6e569d-d075-4c5e-9037-b134a212a10f
[2020-09-08 16:43:07,134: DEBUG/MainProcess] beat: Waking up in 9.69 seconds.
[2020-09-08 16:43:17,273: DEBUG/MainProcess] schedule: <freq: 10.00 seconds>
[2020-09-08 16:43:17,274: INFO/MainProcess] Scheduler: Sending due task echo-every-10s (celery_pro.tasks.echo)
[2020-09-08 16:43:17,274: DEBUG/MainProcess] celery_pro.tasks.echo sent. id->60ea0531-1a18-464e-a837-7b02ced2ea8f
[2020-09-08 16:43:18,159: DEBUG/MainProcess] beat: Waking up in 9.10 seconds.
[2020-09-08 16:43:27,297: DEBUG/MainProcess] schedule: <freq: 10.00 seconds>
[2020-09-08 16:43:27,297: INFO/MainProcess] Scheduler: Sending due task echo-every-10s (celery_pro.tasks.echo)
[2020-09-08 16:43:27,297: DEBUG/MainProcess] celery_pro.tasks.echo sent. id->beefc52d-520e-4a41-8cb9-bb19a10b7e68
[2020-09-08 16:43:27,611: DEBUG/MainProcess] beat: Waking up in 9.68 seconds.
[2020-09-08 16:43:37,514: DEBUG/MainProcess] schedule: <freq: 10.00 seconds>
[2020-09-08 16:43:37,514: INFO/MainProcess] Scheduler: Sending due task echo-every-10s (celery_pro.tasks.echo)
[2020-09-08 16:43:37,515: DEBUG/MainProcess] celery_pro.tasks.echo sent. id->992e48b7-0b09-4a76-950f-18da96c91f74
[2020-09-08 16:43:37,569: DEBUG/MainProcess] beat: Waking up in 9.94 seconds.
[2020-09-08 16:43:47,575: DEBUG/MainProcess] schedule: <freq: 10.00 seconds>
[2020-09-08 16:43:47,575: INFO/MainProcess] Scheduler: Sending due task echo-every-10s (celery_pro.tasks.echo)
[2020-09-08 16:43:47,576: DEBUG/MainProcess] celery_pro.tasks.echo sent. id->b2ba6b2d-a245-474e-a731-fe6817174b58
[2020-09-08 16:43:47,616: DEBUG/MainProcess] beat: Waking up in 9.95 seconds.
[2020-09-08 16:43:57,599: DEBUG/MainProcess] schedule: <freq: 10.00 seconds>
[2020-09-08 16:43:57,600: INFO/MainProcess] Scheduler: Sending due task echo-every-10s (celery_pro.tasks.echo)
[2020-09-08 16:43:57,600: DEBUG/MainProcess] celery_pro.tasks.echo sent. id->0d630b69-525e-49a7-992f-ff4cd13d96ce
[2020-09-08 16:43:57,625: DEBUG/MainProcess] beat: Waking up in 2.36 seconds.
[2020-09-08 16:44:00,128: DEBUG/MainProcess] schedule: <crontab: * * * * * (m/h/d/dM/MY), Asia/Shanghai>
[2020-09-08 16:44:00,128: INFO/MainProcess] Scheduler: Sending due task echo-every--min (celery_pro.tasks.echo)
[2020-09-08 16:44:00,129: DEBUG/MainProcess] celery_pro.tasks.echo sent. id->8fb09751-55ad-4d0f-af7c-0504d16bf87d
[2020-09-08 16:44:00,152: DEBUG/MainProcess] schedule: <crontab: */2 * * * * (m/h/d/dM/MY), Asia/Shanghai>
[2020-09-08 16:44:00,152: INFO/MainProcess] Scheduler: Sending due task echo-every-*/2 (celery_pro.tasks.echo)
[2020-09-08 16:44:00,153: DEBUG/MainProcess] celery_pro.tasks.echo sent. id->8c8b9c8a-c2dc-4949-807a-a4a68ef3d006
[2020-09-08 16:44:00,186: DEBUG/MainProcess] beat: Waking up in 7.40 seconds.
[2020-09-08 16:44:07,636: DEBUG/MainProcess] schedule: <freq: 10.00 seconds>
[2020-09-08 16:44:07,636: INFO/MainProcess] Scheduler: Sending due task echo-every-10s (celery_pro.tasks.echo)
[2020-09-08 16:44:07,637: DEBUG/MainProcess] celery_pro.tasks.echo sent. id->cf244ede-ae0e-412c-9c6e-bfdea658dbb0
[2020-09-08 16:44:07,665: DEBUG/MainProcess] beat: Waking up in 9.96 seconds.
[2020-09-08 16:44:17,629: DEBUG/MainProcess] beat: Synchronizing schedule...
[2020-09-08 16:44:17,629: INFO/MainProcess] Writing entries...
[2020-09-08 16:44:17,929: DEBUG/MainProcess] echo-every--min save to database
[2020-09-08 16:44:17,985: DEBUG/MainProcess] echo-every-10s save to database
[2020-09-08 16:44:18,064: DEBUG/MainProcess] echo-every-*/2 save to database
[2020-09-08 16:44:18,092: DEBUG/MainProcess] schedule: <freq: 10.00 seconds>
[2020-09-08 16:44:18,092: INFO/MainProcess] Scheduler: Sending due task echo-every-10s (celery_pro.tasks.echo)
[2020-09-08 16:44:18,093: DEBUG/MainProcess] celery_pro.tasks.echo sent. id->43b4700c-dae4-4144-a783-6c0412a76da0
[2020-09-08 16:44:18,123: DEBUG/MainProcess] beat: Waking up in 9.96 seconds.
[2020-09-08 16:44:28,108: DEBUG/MainProcess] schedule: <freq: 10.00 seconds>
[2020-09-08 16:44:28,109: INFO/MainProcess] Scheduler: Sending due task echo-every-10s (celery_pro.tasks.echo)
[2020-09-08 16:44:28,109: DEBUG/MainProcess] celery_pro.tasks.echo sent. id->9181a691-4212-4204-b94d-4928825715d0
[2020-09-08 16:44:28,251: DEBUG/MainProcess] beat: Waking up in 9.85 seconds.
[2020-09-08 16:44:38,379: DEBUG/MainProcess] schedule: <freq: 10.00 seconds>
[2020-09-08 16:44:38,379: INFO/MainProcess] Scheduler: Sending due task echo-every-10s (celery_pro.tasks.echo)
[2020-09-08 16:44:38,379: DEBUG/MainProcess] celery_pro.tasks.echo sent. id->d8b49c18-df57-46ca-b4f5-a55874c31307
[2020-09-08 16:44:38,826: DEBUG/MainProcess] beat: Waking up in 9.54 seconds.
[2020-09-08 16:44:48,585: DEBUG/MainProcess] schedule: <freq: 10.00 seconds>
[2020-09-08 16:44:48,585: INFO/MainProcess] Scheduler: Sending due task echo-every-10s (celery_pro.tasks.echo)
[2020-09-08 16:44:48,585: DEBUG/MainProcess] celery_pro.tasks.echo sent. id->accc44f3-7150-47dd-88b3-4ff055a01b58
[2020-09-08 16:44:48,611: DEBUG/MainProcess] beat: Waking up in 9.96 seconds.
[2020-09-08 16:44:58,621: DEBUG/MainProcess] schedule: <freq: 10.00 seconds>
[2020-09-08 16:44:58,621: INFO/MainProcess] Scheduler: Sending due task echo-every-10s (celery_pro.tasks.echo)
[2020-09-08 16:44:58,622: DEBUG/MainProcess] celery_pro.tasks.echo sent. id->60cf17cf-bdc8-4e7f-82ee-3c0f123fea71
[2020-09-08 16:44:58,660: DEBUG/MainProcess] beat: Waking up in 1.33 seconds.
[2020-09-08 16:45:00,046: DEBUG/MainProcess] schedule: <crontab: 45 * * * * (m/h/d/dM/MY), Asia/Shanghai>
[2020-09-08 16:45:00,047: INFO/MainProcess] Scheduler: Sending due task echo-every-45-hour (celery_pro.tasks.echo)
[2020-09-08 16:45:00,047: DEBUG/MainProcess] celery_pro.tasks.echo sent. id->8e261392-080a-4e43-b2c1-e3be8c85b057
[2020-09-08 16:45:00,077: DEBUG/MainProcess] schedule: <crontab: * * * * * (m/h/d/dM/MY), Asia/Shanghai>
[2020-09-08 16:45:00,077: INFO/MainProcess] Scheduler: Sending due task echo-every--min (celery_pro.tasks.echo)
[2020-09-08 16:45:00,078: DEBUG/MainProcess] celery_pro.tasks.echo sent. id->1a759898-5663-42b5-a471-1517b97b4fd4
[2020-09-08 16:45:00,110: DEBUG/MainProcess] beat: Waking up in 8.50 seconds.
[2020-09-08 16:45:08,646: DEBUG/MainProcess] schedule: <freq: 10.00 seconds>
[2020-09-08 16:45:08,646: INFO/MainProcess] Scheduler: Sending due task echo-every-10s (celery_pro.tasks.echo)
[2020-09-08 16:45:08,647: DEBUG/MainProcess] celery_pro.tasks.echo sent. id->24546389-3729-4674-9864-6480d525feaf
[2020-09-08 16:45:08,688: DEBUG/MainProcess] beat: Waking up in 9.95 seconds.

"Connection is busy" error with SQL Server connection

I've been getting the errors below every time I try to connect to my SQL Server instance to set up the scheduler. The tables are created successfully, so it isn't a problem with the connection string. And from what I can tell while debugging, the initial_read completes successfully, but the connections aren't being released after that. Help!

[2021-11-23 14:19:37,130: ERROR/MainProcess] Cannot add entry 'celery.backend_cleanup' to database schedule: DBAPIError("(pyodbc.Error) ('HY000', '[HY000] [Microsoft][ODBC Driver 17 for SQL Server]Connection is busy with results for another command (0) (SQLExecDirectW)')"). Contents: {'task': 'celery.backend_cleanup', 'schedule': <crontab: 0 4 * * * (m/h/d/dM/MY)>, 'options': {'expires': 43200}}
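One direction worth trying (an assumption based on the pyodbc error text, not a fix verified against this package) is enabling Multiple Active Result Sets, so the SQL Server driver allows overlapping commands on one connection, via the MARS_Connection keyword in the ODBC portion of the URL:

```
mssql+pyodbc://user:password@host/dbname?driver=ODBC+Driver+17+for+SQL+Server&MARS_Connection=Yes
```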

Why does celery beat only print usage:?

Running celery -A celery_worker.ce -S celery_sqlalchemy_scheduler.schedulers:DatabaseScheduler -l info
only shows the usage: screen and does not execute.
`
(asset-ar0OxIPP) [root@VM asset]# celery -A celery_worker.ce -S celery_sqlalchemy_scheduler.schedulers:DatabaseScheduler -l info
[2019-08-26 13:42:54] [INFO] - Server initialized for eventlet. (server.py:140)
usage: celery [options]

Show help screen and exit.

positional arguments:
args

optional arguments:
-h, --help show this help message and exit
--version show program's version number and exit

Global Options:
-A APP, --app APP
-b BROKER, --broker BROKER
--result-backend RESULT_BACKEND
--loader LOADER
--config CONFIG
--workdir WORKDIR
--no-color, -C
--quiet, -q

---- -- - - ---- Commands- -------------- --- ------------

  • Main:
    | celery worker
    | celery events
    | celery beat
    | celery shell
    | celery multi
    | celery amqp

  • Remote Control:
    | celery status

| celery inspect --help
| celery inspect active
| celery inspect active_queues
| celery inspect clock
| celery inspect conf [include_defaults=False]
| celery inspect memdump [n_samples=10]
| celery inspect memsample
| celery inspect objgraph [object_type=Request] [num=200 [max_depth=10]]
| celery inspect ping
| celery inspect query_task [id1 [id2 [... [idN]]]]
| celery inspect registered [attr1 [attr2 [... [attrN]]]]
| celery inspect report
| celery inspect reserved
| celery inspect revoked
| celery inspect scheduled
| celery inspect stats

| celery control --help
| celery control add_consumer [exchange [type [routing_key]]]
| celery control autoscale [max [min]]
| celery control cancel_consumer
| celery control disable_events
| celery control election
| celery control enable_events
| celery control heartbeat
| celery control pool_grow [N=1]
| celery control pool_restart
| celery control pool_shrink [N=1]
| celery control rate_limit <task_name> <rate_limit (e.g., 5/s | 5/m | 5/h)>
| celery control revoke [id1 [id2 [... [idN]]]]
| celery control shutdown
| celery control terminate [id1 [id2 [... [idN]]]]
| celery control time_limit <task_name> <soft_secs> [hard_secs]

  • Utils:
    | celery purge
    | celery list
    | celery call
    | celery result
    | celery migrate
    | celery graph
    | celery upgrade

  • Debugging:
    | celery report
    | celery logtool

  • Extensions:
    | celery flower


Type 'celery --help' for help using a specific command.

(asset-ar0OxIPP) [root@VM asset]#
`
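A guess at the cause (not confirmed in the thread): the invocation above has no beat subcommand, so the celery CLI falls back to its usage screen. For comparison, another issue on this page starts the same scheduler with the beat subcommand and the --scheduler option:

```
# invocation that only prints usage: (no subcommand given)
celery -A celery_worker.ce -S celery_sqlalchemy_scheduler.schedulers:DatabaseScheduler -l info

# invocation with the beat subcommand, as used elsewhere on this page
celery -A celery_worker.ce beat -l info --scheduler celery_sqlalchemy_scheduler.schedulers:DatabaseScheduler
```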

Adding a task or changing its status makes beat raise an error and terminate

OS: CentOS 7.8 or Windows 10
Python: 3.6.8
Celery: 4.2.2

This is used together with Flask; adding a new task triggers an error and terminates the beat process.
Testing shows that adding or modifying an Interval-type task whose period unit is anything other than seconds raises the error, and even forcing the change directly in the database does not get the task scheduled.
Tasks using the seconds unit work fine, oddly enough.

The error:

PS D:\GIT仓库\flask-demo> .\venv\Scripts\python.exe -m celery -A celery_worker.celery_app beat -S celery_sqlalchemy_scheduler.schedulers:DatabaseScheduler --loglevel=DEBUG
celery beat v4.2.2 (windowlicker) is starting.
D:\GIT仓库\flask-demo\venv\lib\site-packages\pymysql\cursors.py:170: Warning: (1366, "Incorrect string value: '\xD6\xD0\xB9\xFA\xB1\xEA...' for column 'VARIABLE_VALUE' at row 491")
result = self._query(query)
__ - ... __ - _
LocalTime -> 2021-07-12 18:32:41
Configuration ->
. broker -> redis://127.0.0.1:6379/2
. loader -> celery.loaders.app.AppLoader
. scheduler -> celery_sqlalchemy_scheduler.schedulers.DatabaseScheduler
. db -> mysql+pymysql://root:123456@localhost/dev-devops
. logfile -> [stderr]@%DEBUG
. maxinterval -> 5.00 seconds (5s)
[2021-07-12 18:34:32,678: WARNING/MainProcess] Traceback (most recent call last):
[2021-07-12 18:34:32,678: WARNING/MainProcess] File "C:\Users\xxxx\AppData\Local\Programs\Python\Python36\Lib\runpy.py", line 193, in _run_module_as_main
[2021-07-12 18:34:32,680: WARNING/MainProcess] "__main__", mod_spec)
[2021-07-12 18:34:32,683: WARNING/MainProcess] File "C:\Users\xxxx\AppData\Local\Programs\Python\Python36\Lib\runpy.py", line 85, in _run_code
[2021-07-12 18:34:32,684: WARNING/MainProcess] exec(code, run_globals)
[2021-07-12 18:34:32,684: WARNING/MainProcess] File "D:\GIT仓库\flask-demo\venv\lib\site-packages\celery\__main__.py", line 20, in <module>
[2021-07-12 18:34:32,685: WARNING/MainProcess] main()
[2021-07-12 18:34:32,685: WARNING/MainProcess] File "D:\GIT仓库\flask-demo\venv\lib\site-packages\celery\__main__.py", line 16, in main
[2021-07-12 18:34:32,686: WARNING/MainProcess] _main()
[2021-07-12 18:34:32,686: WARNING/MainProcess] File "D:\GIT仓库\flask-demo\venv\lib\site-packages\celery\bin\celery.py", line 322, in main
[2021-07-12 18:34:32,688: WARNING/MainProcess] cmd.execute_from_commandline(argv)
[2021-07-12 18:34:32,690: WARNING/MainProcess] File "D:\GIT仓库\flask-demo\venv\lib\site-packages\celery\bin\celery.py", line 496, in execute_from_commandline
[2021-07-12 18:34:32,698: WARNING/MainProcess] super(CeleryCommand, self).execute_from_commandline(argv)))
[2021-07-12 18:34:32,698: WARNING/MainProcess] File "D:\GIT仓库\flask-demo\venv\lib\site-packages\celery\bin\base.py", line 275, in execute_from_commandline
[2021-07-12 18:34:32,699: WARNING/MainProcess] return self.handle_argv(self.prog_name, argv[1:])
[2021-07-12 18:34:32,699: WARNING/MainProcess] File "D:\GIT仓库\flask-demo\venv\lib\site-packages\celery\bin\celery.py", line 488, in handle_argv
[2021-07-12 18:34:32,700: WARNING/MainProcess] return self.execute(command, argv)
[2021-07-12 18:34:32,700: WARNING/MainProcess] File "D:\GIT仓库\flask-demo\venv\lib\site-packages\celery\bin\celery.py", line 420, in execute
[2021-07-12 18:34:32,702: WARNING/MainProcess] ).run_from_argv(self.prog_name, argv[1:], command=argv[0])
[2021-07-12 18:34:32,702: WARNING/MainProcess] File "D:\GIT仓库\flask-demo\venv\lib\site-packages\celery\bin\base.py", line 279, in run_from_argv
[2021-07-12 18:34:32,704: WARNING/MainProcess] sys.argv if argv is None else argv, command)
[2021-07-12 18:34:32,711: WARNING/MainProcess] File "D:\GIT仓库\flask-demo\venv\lib\site-packages\celery\bin\base.py", line 363, in handle_argv
[2021-07-12 18:34:32,713: WARNING/MainProcess] return self(*args, **options)
[2021-07-12 18:34:32,714: WARNING/MainProcess] File "D:\GIT仓库\flask-demo\venv\lib\site-packages\celery\bin\base.py", line 238, in __call__
[2021-07-12 18:34:32,715: WARNING/MainProcess] ret = self.run(*args, **kwargs)
[2021-07-12 18:34:32,716: WARNING/MainProcess] File "D:\GIT仓库\flask-demo\venv\lib\site-packages\celery\bin\beat.py", line 109, in run
[2021-07-12 18:34:32,717: WARNING/MainProcess] return beat().run()
[2021-07-12 18:34:32,720: WARNING/MainProcess] File "D:\GIT仓库\flask-demo\venv\lib\site-packages\celery\apps\beat.py", line 81, in run
[2021-07-12 18:34:32,721: WARNING/MainProcess] self.start_scheduler()
[2021-07-12 18:34:32,725: WARNING/MainProcess] File "D:\GIT仓库\flask-demo\venv\lib\site-packages\celery\apps\beat.py", line 109, in start_scheduler
[2021-07-12 18:34:32,726: WARNING/MainProcess] service.start()
[2021-07-12 18:34:32,727: WARNING/MainProcess] File "D:\GIT仓库\flask-demo\venv\lib\site-packages\celery\beat.py", line 588, in start
[2021-07-12 18:34:32,727: WARNING/MainProcess] interval = self.scheduler.tick()
[2021-07-12 18:34:32,727: WARNING/MainProcess] File "D:\GIT仓库\flask-demo\venv\lib\site-packages\celery\beat.py", line 293, in tick
[2021-07-12 18:34:32,728: WARNING/MainProcess] self.populate_heap()
[2021-07-12 18:34:32,729: WARNING/MainProcess] File "D:\GIT仓库\flask-demo\venv\lib\site-packages\celery\beat.py", line 267, in populate_heap
[2021-07-12 18:34:32,729: WARNING/MainProcess] is_due, next_call_delay = entry.is_due()
[2021-07-12 18:34:32,731: WARNING/MainProcess] File "D:\GIT仓库\flask-demo\venv\lib\site-packages\celery_sqlalchemy_scheduler\schedulers.py", line 126, in is_due
[2021-07-12 18:34:32,731: WARNING/MainProcess] if not self.model.enabled:
[2021-07-12 18:34:32,731: WARNING/MainProcess] File "D:\GIT仓库\flask-demo\venv\lib\site-packages\sqlalchemy\orm\attributes.py", line 287, in __get__
[2021-07-12 18:34:32,732: WARNING/MainProcess] return self.impl.get(instance_state(instance), dict_)
[2021-07-12 18:34:32,733: WARNING/MainProcess] File "D:\GIT仓库\flask-demo\venv\lib\site-packages\sqlalchemy\orm\attributes.py", line 718, in get
[2021-07-12 18:34:32,734: WARNING/MainProcess] value = state._load_expired(state, passive)
[2021-07-12 18:34:32,736: WARNING/MainProcess] File "D:\GIT仓库\flask-demo\venv\lib\site-packages\sqlalchemy\orm\state.py", line 652, in _load_expired
[2021-07-12 18:34:32,737: WARNING/MainProcess] self.manager.deferred_scalar_loader(self, toload)
[2021-07-12 18:34:32,740: WARNING/MainProcess] File "D:\GIT仓库\flask-demo\venv\lib\site-packages\sqlalchemy\orm\loading.py", line 946, in load_scalar_attributes
[2021-07-12 18:34:32,741: WARNING/MainProcess] "attribute refresh operation cannot proceed" % (state_str(state))
[2021-07-12 18:34:32,742: WARNING/MainProcess] sqlalchemy.orm.exc.DetachedInstanceError: Instance <PeriodicTask at 0x23ba47c8ac8> is not bound to a Session; attribute refresh operation cannot proceed (Background on this error at: http://sqlalche.me/e/bhk3)

Deleted schedule isn't removed from DatabaseScheduler.schedule in memory

When a schedule is deleted from the database, it is not removed from the in-memory schedule, so when DatabaseScheduler checks for tasks to execute it errors out, yet it still continues to send the task.

Adding a new schedule:

[2020-10-16 12:57:35,634: INFO/MainProcess] DatabaseScheduler: Schedule changed.
[2020-10-16 12:57:35,634: INFO/MainProcess] Writing entries...

Execution of schedule:
[2020-10-16 13:02:36,313: INFO/MainProcess] Scheduler: Sending due task splunk task (engine.modules.degradation_splunk.tasks.submit_splunk_job)
[2020-10-16 13:02:52,139: INFO/MainProcess] Writing entries...

After the entry is deleted from the database, no trigger such as DatabaseScheduler: Schedule changed. is logged.

After deletion it continues to run the task, with this traceback:

[2020-10-16 13:07:36,411: INFO/MainProcess] Scheduler: Sending due task splunk task (engine.modules.degradation_splunk.tasks.submit_splunk_job)
[2020-10-16 13:09:00,097: INFO/MainProcess] Writing entries...
[2020-10-16 13:09:00,551: ERROR/MainProcess] 'NoneType' object has no attribute 'last_run_at'
Traceback (most recent call last):
  File "/home/dwhitney/miniconda3/envs/scorecardv2_engine/lib/python3.8/site-packages/celery_sqlalchemy_scheduler/schedulers.py", line 370, in sync
    self.schedule[name].save()  # save to database
  File "/home/dwhitney/miniconda3/envs/scorecardv2_engine/lib/python3.8/site-packages/celery_sqlalchemy_scheduler/schedulers.py", line 183, in save
    setattr(obj, field, getattr(self.model, field))
AttributeError: 'NoneType' object has no attribute 'last_run_at'
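The failing save() assumes self.model still exists. A defensive sketch of the missing guard (a hypothetical helper, using a dict-based schedule instead of the real ModelEntry objects so it is self-contained) would skip and drop entries whose database row was deleted:

```python
def sync_entry(schedule, name):
    """Skip (and drop) schedule entries whose backing DB row is gone,
    instead of letting save() dereference a None model."""
    entry = schedule.get(name)
    if entry is None or entry.get('model') is None:
        schedule.pop(name, None)  # row was deleted out from under us
        return False
    # ...otherwise copy fields such as last_run_at back to the model...
    return True

schedule = {'splunk task': {'model': None}}
saved = sync_entry(schedule, 'splunk task')
```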

Beat was unable to find crontab task in the database

The crontab schedule is written to the database, but beat does not appear to find the periodic task in it.

All environments are built from Docker images.

Built on the open-source FastAPI project Full-Stack-FastAPI-PostgreSQL.
Project URL:
https://github.com/tiangolo/full-stack-fastapi-postgresql

Environment

Celery: 4.4.7 (Docker image based on Python 3.8)
celery-sqlalchemy-scheduler: 0.3.0
PostgreSQL: 12

Problem:

The crontab schedule is written to the database, but beat does not appear to find the periodic task in it:

My configuration:

from raven import Client
from app.core.celery_app import celery_app
from app.core.config import settings
from app.core import batch_update
from app.core import rds_batch_update

client_sentry = Client(settings.SENTRY_DSN)

from celery_sqlalchemy_scheduler.models import PeriodicTask, CrontabSchedule
from celery_sqlalchemy_scheduler.session import SessionManager
from celery_sqlalchemy_scheduler.schedulers import DatabaseScheduler 


BEAT_DBURI = "postgresql+psycopg2://postgresql:password@db:5432/devops"

session_manager = SessionManager()
session = session_manager.session_factory(dburi=BEAT_DBURI)

schedule_aws_ec2 = CrontabSchedule(
    minute='50',
    hour='10',
    day_of_week='1',
    day_of_month='1',
    month_of_year='1',
    timezone='Asia/Shanghai'
)
schedule_aws_rds = CrontabSchedule(
    minute='55',
    hour='10',
    day_of_week='1',
    day_of_month='1',
    month_of_year='1',
    timezone='Asia/Shanghai'
)

periodic_task_aws_ec2 = PeriodicTask(
    crontab=schedule_aws_ec2,
    name='batch_update',
    task='app.worker.ec2date_update',
)
periodic_task_aws_rds = PeriodicTask(
    crontab=schedule_aws_rds,
    name='awsrds_batch_update',
    task='app.worker.rdsdate_update',
)
beat_max_loop_interval = 10
worker_max_tasks_per_child = 10
timezone = 'Asia/Shanghai'

config = {
    # 'beat_scheduler': beat_scheduler,  # 命令行传参配置了,所以这里并不需要写死在代码里
    'beat_max_loop_interval': beat_max_loop_interval,
    'timezone': timezone,
    'worker_max_tasks_per_child': worker_max_tasks_per_child
}
celery_app.conf.update(config)

query_aws_ec2_schedule = session.query(PeriodicTask).filter_by(name=periodic_task_aws_ec2.name).first()
query_aws_rds_schedule = session.query(PeriodicTask).filter_by(name=periodic_task_aws_rds.name).first()

if not query_aws_ec2_schedule:
    session.add(periodic_task_aws_ec2)
    session.commit()
    session.close()

if not query_aws_rds_schedule:
    session.add(periodic_task_aws_rds)
    session.commit()
    session.close()

@celery_app.task(acks_late=True)
def test_celery(word: str) -> str:
    return f"test task return {word}"

@celery_app.task(acks_late=True)
def ec2date_update():
    batch_update.all_ec2_update()
    return

@celery_app.task(acks_late=True)
def rdsdate_update():
    rds_batch_update.all_rds_update()
    return

if __name__ == "__main__":
    celery_app.start()

My beat start command:

celery worker -A app.worker -Q main-queue -c 2 -B --scheduler celery_sqlalchemy_scheduler.schedulers:DatabaseScheduler -l debug 

Celery Docker startup log (debug level):

Attaching to full-stack-fastapi-postgresql_celeryworker_1
celeryworker_1  | INFO:__main__:Initializing service
celeryworker_1  | INFO:__main__:Starting call to '__main__.init', this is the 1st time calling it.
celeryworker_1  | INFO:__main__:Service finished initializing
celeryworker_1  | /usr/local/lib/python3.8/site-packages/celery/platforms.py:800: RuntimeWarning: You're running the worker with superuser privileges: this is
celeryworker_1  | absolutely not recommended!
celeryworker_1  |
celeryworker_1  | Please specify a different user using the --uid option.
celeryworker_1  |
celeryworker_1  | User information: uid=0 euid=0 gid=0 egid=0
celeryworker_1  |
celeryworker_1  |   warnings.warn(RuntimeWarning(ROOT_DISCOURAGED.format(
celeryworker_1  | [2021-05-10 06:46:38,505: DEBUG/MainProcess] | Worker: Preparing bootsteps.
celeryworker_1  | [2021-05-10 06:46:38,508: DEBUG/MainProcess] | Worker: Building graph...
celeryworker_1  | [2021-05-10 06:46:38,510: DEBUG/MainProcess] | Worker: New boot order: {Beat, Timer, Hub, Pool, Autoscaler, StateDB, Consumer}
celeryworker_1  | [2021-05-10 06:46:38,521: DEBUG/MainProcess] | Consumer: Preparing bootsteps.
celeryworker_1  | [2021-05-10 06:46:38,522: DEBUG/MainProcess] | Consumer: Building graph...
celeryworker_1  | [2021-05-10 06:46:38,576: DEBUG/MainProcess] | Consumer: New boot order: {Connection, Events, Heart, Mingle, Gossip, Agent, Tasks, Control, event loop}
celeryworker_1  | [2021-05-10 06:46:38,585: DEBUG/MainProcess] | Worker: Starting Beat
celeryworker_1  | [2021-05-10 06:46:38,588: DEBUG/MainProcess] ^-- substep ok
celeryworker_1  | [2021-05-10 06:46:38,589: DEBUG/MainProcess] | Worker: Starting Hub
celeryworker_1  | [2021-05-10 06:46:38,590: DEBUG/MainProcess] ^-- substep ok
celeryworker_1  | [2021-05-10 06:46:38,590: DEBUG/MainProcess] | Worker: Starting Pool
celeryworker_1  | [2021-05-10 06:46:38,729: DEBUG/MainProcess] ^-- substep ok
celeryworker_1  | [2021-05-10 06:46:38,730: DEBUG/MainProcess] | Worker: Starting Consumer
celeryworker_1  | [2021-05-10 06:46:38,731: DEBUG/MainProcess] | Consumer: Starting Connection
celeryworker_1  | [2021-05-10 06:46:38,752: DEBUG/MainProcess] Start from server, version: 0.9, properties: {'capabilities': {'publisher_confirms': True, 'exchange_exchange_bindings': True, 'basic.nack': True, 'consumer_cancel_notify': True, 'connection.blocked': True, 'consumer_priorities': True, 'authentication_failure_close': True, 'per_consumer_qos': True, 'direct_reply_to': True}, 'cluster_name': 'rabbit@1aebc17b4370', 'copyright': 'Copyright (c) 2007-2021 VMware, Inc. or its affiliates.', 'information': 'Licensed under the MPL 2.0. Website: https://rabbitmq.com', 'platform': 'Erlang/OTP 23.3.1', 'product': 'RabbitMQ', 'version': '3.8.14'}, mechanisms: [b'AMQPLAIN', b'PLAIN'], locales: ['en_US']
celeryworker_1  | [2021-05-10 06:46:38,759: INFO/MainProcess] Connected to amqp://guest:**@queue:5672//
celeryworker_1  | [2021-05-10 06:46:38,760: DEBUG/MainProcess] ^-- substep ok
celeryworker_1  | [2021-05-10 06:46:38,760: DEBUG/MainProcess] | Consumer: Starting Events
celeryworker_1  | [2021-05-10 06:46:38,780: DEBUG/MainProcess] Start from server, version: 0.9, properties: {'capabilities': {'publisher_confirms': True, 'exchange_exchange_bindings': True, 'basic.nack': True, 'consumer_cancel_notify': True, 'connection.blocked': True, 'consumer_priorities': True, 'authentication_failure_close': True, 'per_consumer_qos': True, 'direct_reply_to': True}, 'cluster_name': 'rabbit@1aebc17b4370', 'copyright': 'Copyright (c) 2007-2021 VMware, Inc. or its affiliates.', 'information': 'Licensed under the MPL 2.0. Website: https://rabbitmq.com', 'platform': 'Erlang/OTP 23.3.1', 'product': 'RabbitMQ', 'version': '3.8.14'}, mechanisms: [b'AMQPLAIN', b'PLAIN'], locales: ['en_US']
celeryworker_1  | [2021-05-10 06:46:38,783: DEBUG/MainProcess] ^-- substep ok
celeryworker_1  | [2021-05-10 06:46:38,784: DEBUG/MainProcess] | Consumer: Starting Heart
celeryworker_1  | [2021-05-10 06:46:38,785: DEBUG/MainProcess] using channel_id: 1
celeryworker_1  | [2021-05-10 06:46:38,803: DEBUG/MainProcess] Channel open
celeryworker_1  | [2021-05-10 06:46:38,812: DEBUG/MainProcess] ^-- substep ok
celeryworker_1  | [2021-05-10 06:46:38,812: DEBUG/MainProcess] | Consumer: Starting Mingle
celeryworker_1  | [2021-05-10 06:46:38,813: INFO/MainProcess] mingle: searching for neighbors
celeryworker_1  | [2021-05-10 06:46:38,814: DEBUG/MainProcess] using channel_id: 1
celeryworker_1  | [2021-05-10 06:46:38,830: DEBUG/MainProcess] Channel open
celeryworker_1  | [2021-05-10 06:46:38,890: DEBUG/MainProcess] Start from server, version: 0.9, properties: {'capabilities': {'publisher_confirms': True, 'exchange_exchange_bindings': True, 'basic.nack': True, 'consumer_cancel_notify': True, 'connection.blocked': True, 'consumer_priorities': True, 'authentication_failure_close': True, 'per_consumer_qos': True, 'direct_reply_to': True}, 'cluster_name': 'rabbit@1aebc17b4370', 'copyright': 'Copyright (c) 2007-2021 VMware, Inc. or its affiliates.', 'information': 'Licensed under the MPL 2.0. Website: https://rabbitmq.com', 'platform': 'Erlang/OTP 23.3.1', 'product': 'RabbitMQ', 'version': '3.8.14'}, mechanisms: [b'AMQPLAIN', b'PLAIN'], locales: ['en_US']
celeryworker_1  | [2021-05-10 06:46:38,899: DEBUG/MainProcess] using channel_id: 1
celeryworker_1  | [2021-05-10 06:46:38,908: DEBUG/MainProcess] Channel open
celeryworker_1  | [2021-05-10 06:46:39,971: INFO/MainProcess] mingle: all alone
celeryworker_1  | [2021-05-10 06:46:39,972: DEBUG/MainProcess] ^-- substep ok
celeryworker_1  | [2021-05-10 06:46:39,972: DEBUG/MainProcess] | Consumer: Starting Gossip
celeryworker_1  | [2021-05-10 06:46:39,973: DEBUG/MainProcess] using channel_id: 2
celeryworker_1  | [2021-05-10 06:46:39,975: DEBUG/MainProcess] Channel open
celeryworker_1  | [2021-05-10 06:46:39,989: DEBUG/MainProcess] ^-- substep ok
celeryworker_1  | [2021-05-10 06:46:39,989: DEBUG/MainProcess] | Consumer: Starting Tasks
celeryworker_1  | [2021-05-10 06:46:40,001: DEBUG/MainProcess] ^-- substep ok
celeryworker_1  | [2021-05-10 06:46:40,002: DEBUG/MainProcess] | Consumer: Starting Control
celeryworker_1  | [2021-05-10 06:46:40,002: DEBUG/MainProcess] using channel_id: 3
celeryworker_1  | [2021-05-10 06:46:40,004: DEBUG/MainProcess] Channel open
celeryworker_1  | [2021-05-10 06:46:40,014: DEBUG/MainProcess] ^-- substep ok
celeryworker_1  | [2021-05-10 06:46:40,014: DEBUG/MainProcess] | Consumer: Starting event loop
celeryworker_1  | [2021-05-10 06:46:40,015: DEBUG/MainProcess] | Worker: Hub.register Pool...
celeryworker_1  | [2021-05-10 06:46:40,018: INFO/MainProcess] celery@7a67a0149293 ready.
celeryworker_1  | [2021-05-10 06:46:40,019: DEBUG/MainProcess] basic.qos: prefetch_count->8
celeryworker_1  | [2021-05-10 06:46:40,899: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
celeryworker_1  | [2021-05-10 06:46:40,899: INFO/MainProcess] Events of group {task} enabled by remote.

--------------------------------------------------------------  relevant log starts here
celeryworker_1  | [2021-05-10 06:46:41,251: INFO/Beat] beat: Starting...
celeryworker_1  | [2021-05-10 06:46:41,298: INFO/Beat] setup_schedule
celeryworker_1  | [2021-05-10 06:46:41,299: DEBUG/Beat] DatabaseScheduler: initial read
celeryworker_1  | [2021-05-10 06:46:41,299: INFO/Beat] Writing entries...
celeryworker_1  | [2021-05-10 06:46:41,300: DEBUG/Beat] DatabaseScheduler: Fetching database schedule
celeryworker_1  | [2021-05-10 06:46:41,350: DEBUG/Beat] schedule: <crontab: 0 4 * * * (m/h/d/dM/MY), Asia/Shanghai>
celeryworker_1  | [2021-05-10 06:46:41,352: DEBUG/Beat] Current schedule:
--------------------------------------------------------------  relevant log ends here; my periodic tasks were not read, even though they were written to the database successfully

celeryworker_1  | <ModelEntry: celery.backend_cleanup celery.backend_cleanup(*[], **{}) <crontab: 0 4 * * * (m/h/d/dM/MY), Asia/Shanghai>>
celeryworker_1  | [2021-05-10 06:46:41,417: DEBUG/Beat] schedule: <crontab: 0 4 * * * (m/h/d/dM/MY), Asia/Shanghai>
celeryworker_1  | [2021-05-10 06:46:41,435: DEBUG/Beat] beat: Ticking with max interval->10.00 seconds
celeryworker_1  | [2021-05-10 06:46:41,448: DEBUG/Beat] beat: Waking up in 10.00 seconds.
celeryworker_1  | [2021-05-10 06:46:46,048: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
celeryworker_1  | [2021-05-10 06:46:50,913: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
celeryworker_1  | [2021-05-10 06:46:51,463: DEBUG/Beat] beat: Synchronizing schedule...
celeryworker_1  | [2021-05-10 06:46:51,479: INFO/Beat] Writing entries...
celeryworker_1  | [2021-05-10 06:46:51,533: DEBUG/Beat] beat: Waking up in 10.00 seconds.
celeryworker_1  | [2021-05-10 06:46:55,904: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
celeryworker_1  | [2021-05-10 06:47:00,003: DEBUG/MainProcess] heartbeat_tick : for connection a9c0e41d02234670a4534c3b60bc8090
celeryworker_1  | [2021-05-10 06:47:00,003: DEBUG/MainProcess] heartbeat_tick : Prev sent/recv: None/None, now - 28/70, monotonic - 15271.733849645, last_heartbeat_sent - 15271.733847792, heartbeat int. - 60 for connection a9c0e41d02234670a4534c3b60bc8090
celeryworker_1  | [2021-05-10 06:47:00,890: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
celeryworker_1  | [2021-05-10 06:47:01,547: DEBUG/Beat] beat: Waking up in 10.00 seconds.
celeryworker_1  | [2021-05-10 06:47:05,902: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
celeryworker_1  | [2021-05-10 06:47:10,904: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
celeryworker_1  | [2021-05-10 06:47:11,560: DEBUG/Beat] beat: Waking up in 10.00 seconds.
celeryworker_1  | [2021-05-10 06:47:15,902: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
celeryworker_1  | [2021-05-10 06:47:20,007: DEBUG/MainProcess] heartbeat_tick : for connection a9c0e41d02234670a4534c3b60bc8090
celeryworker_1  | [2021-05-10 06:47:20,008: DEBUG/MainProcess] heartbeat_tick : Prev sent/recv: 28/70, now - 28/113, monotonic - 15291.738308944, last_heartbeat_sent - 15271.733847792, heartbeat int. - 60 for connection a9c0e41d02234670a4534c3b60bc8090

What I tried:

With a SQLite database, the tasks are picked up.
With PostgreSQL 9.6, the tasks are likewise not picked up.

Goal:

Have celery beat read the tasks under PostgreSQL 12. Thank you.
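As a sanity check on the write side, the lookup beat conceptually performs can be reproduced with stdlib sqlite3. Table and column names here are assumptions modeled on the scheduler's models; against PostgreSQL 12 the same SELECT via psycopg2 would show whether enabled rows with a crontab_id actually exist:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Minimal stand-ins for the scheduler's tables (column subset only).
conn.execute("CREATE TABLE celery_crontab_schedule (id INTEGER PRIMARY KEY, minute TEXT, hour TEXT)")
conn.execute("CREATE TABLE celery_periodic_task (name TEXT, task TEXT, crontab_id INTEGER, enabled INTEGER)")
conn.execute("INSERT INTO celery_crontab_schedule (minute, hour) VALUES ('50', '10')")
conn.execute("INSERT INTO celery_periodic_task VALUES ('batch_update', 'app.worker.ec2date_update', 1, 1)")

# What beat should see: enabled tasks joined to their crontab schedule.
rows = conn.execute(
    "SELECT t.name, c.minute, c.hour FROM celery_periodic_task t "
    "JOIN celery_crontab_schedule c ON c.id = t.crontab_id WHERE t.enabled = 1"
).fetchall()
print(rows)  # -> [('batch_update', '50', '10')]
```

An empty result here would point at the write side; a populated one points at the scheduler's read path against PostgreSQL.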

Dynamic update of the celery_crontab_schedule table?

Hello,
I've installed this package. The echo task examples are all running as expected. I added my own toy task that writes a test file to a directory--All good. Great project. Thank you.

Reading your code and examples, I wish for one feature not mentioned in the documentation. I am searching for a solution to support my Flask project: I need to dynamically change (create, update, delete) rows in the SQL tables used by celery-sqlalchemy-scheduler without restarting Flask, the Celery workers, or Celery Beat.

I don't need to add new functions to tasks.py. I just need to add new task instances and change schedules.
For example,

  • Add a new task row (create a new tasks.add with args [1,5]) in the celery_periodic_task table.
  • Update a row's args from [1,2] to [10,2] in the celery_periodic_task table.
  • Update a row in the celery_crontab_schedule table, changing (0 4 * * *) to (* */4 * * *).

Can I update tasks in the SQL tables to achieve 'dynamic' updates with this project?

There is another, older package also named celery_sqlalchemy_scheduler which promised to 'dynamically add tasks at runtime', but that package is based on Celery v3. I was happy to find your project using Celery v4, because I understand v4 added ways to update Celery Beat at runtime, if I understand correctly; I'm still a little new to the guts and gears of Celery. It's all very complex.

Thanks again. Great project.
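In principle the rows are the source of truth: the logs quoted elsewhere on this page show DatabaseScheduler reporting "Schedule changed." after edits, so updating the tables is the intended dynamic path. A minimal sketch of such an edit using stdlib sqlite3 with an assumed column subset; in a real setup, going through the package's SQLAlchemy models and session is safer, since the package appears to track changes for beat to notice:

```python
import json
import sqlite3

conn = sqlite3.connect(":memory:")
# Column subset of the scheduler's celery_periodic_task table (an assumption
# based on the INSERT statements visible in other issues on this page).
conn.execute("CREATE TABLE celery_periodic_task (id INTEGER PRIMARY KEY, name TEXT, task TEXT, args TEXT)")
conn.execute(
    "INSERT INTO celery_periodic_task (name, task, args) VALUES (?, ?, ?)",
    ("add-numbers", "tasks.add", json.dumps([1, 2])),
)

# 'Dynamic' update: change the args from [1, 2] to [10, 2] while everything runs.
conn.execute(
    "UPDATE celery_periodic_task SET args = ? WHERE name = ?",
    (json.dumps([10, 2]), "add-numbers"),
)
new_args = conn.execute(
    "SELECT args FROM celery_periodic_task WHERE name = ?", ("add-numbers",)
).fetchone()[0]
print(new_args)  # -> [10, 2]
```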

Erratum in the English documentation

session_manager = SessionManager()
engine, Session = SessionManager.create_session(beat_dburi)
session = Session()

On the second line, SessionManager should be changed to session_manager.
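The distinction matters because create_session is an instance method; calling it on the class swallows the URI as self and leaves the real argument missing. A toy SessionManager (not the library's implementation) showing why:

```python
class SessionManager:
    """Toy stand-in for the library's SessionManager, for illustration only."""
    def create_session(self, dburi):
        return ("engine:" + dburi, "Session:" + dburi)

session_manager = SessionManager()

# As printed in the docs: the class-level call binds the URI to `self`,
# leaving `dburi` missing, so Python raises TypeError.
try:
    SessionManager.create_session("sqlite:///beat.db")
    class_call_ok = True
except TypeError:
    class_call_ok = False

# Corrected call on the instance:
engine, Session = session_manager.create_session("sqlite:///beat.db")
print(class_call_ok, Session)  # -> False Session:sqlite:///beat.db
```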

The expires field does not seem to take effect

Regarding the expires parameter below:

periodic_task = PeriodicTask(
    ...     interval=schedule,                  # we created this above.
    ...     name='Importing contacts',          # simply describes this periodic task.
    ...     task='proj.tasks.import_contacts',  # name of task.
    ...     args=json.dumps(['arg1', 'arg2']),
    ...     kwargs=json.dumps({
    ...        'be_careful': True,
    ...     }),
    ...     expires=datetime.utcnow() + timedelta(seconds=30)
    ... )

I changed the expires of ip_port_service_scan to datetime.now() + timedelta(seconds=30),
but after the expiry time passed, the task was still triggered.
Database record:

id  name                    task                          interval_id  crontab_id  solar_id  queue       exchange  routing_key  priority  expires              one_off  start_time  enabled  last_run_at  total_run_count  date_changed         
12  ip_port_service_scan    scan.ip_port_service_scan     1                                  scan_tasks                                   2019-08-27 15:07:12  0                    1                     0                2019-08-27 15:06:42  

celery beat log:

[2019-08-27 15:06:20] [INFO]    - beat: Starting... (beat.py:586)
[2019-08-27 15:06:20] [INFO]    - Writing entries... (schedulers.py:337)
[2019-08-27 15:06:21] [INFO]    - DatabaseScheduler: Schedule changed. (schedulers.py:404)
[2019-08-27 15:06:21] [INFO]    - Writing entries... (schedulers.py:337)
[2019-08-27 15:06:31] [INFO]    - Writing entries... (schedulers.py:337)
[2019-08-27 15:06:52] [INFO]    - DatabaseScheduler: Schedule changed. (schedulers.py:404)
[2019-08-27 15:06:52] [INFO]    - Writing entries... (schedulers.py:337)
[2019-08-27 15:07:00] [INFO]    - Scheduler: Sending due task test (default.test) (beat.py:240)
[2019-08-27 15:07:02] [INFO]    - Scheduler: Sending due task ip_port_service_scan (scan.ip_port_service_scan) (beat.py:240)
[2019-08-27 15:07:12] [INFO]    - Scheduler: Sending due task ip_port_service_scan (scan.ip_port_service_scan) (beat.py:240)
[2019-08-27 15:07:22] [INFO]    - Scheduler: Sending due task ip_port_service_scan (scan.ip_port_service_scan) (beat.py:240)
[2019-08-27 15:07:32] [INFO]    - Scheduler: Sending due task ip_port_service_scan (scan.ip_port_service_scan) (beat.py:240)
[2019-08-27 15:07:42] [INFO]    - Scheduler: Sending due task ip_port_service_scan (scan.ip_port_service_scan) (beat.py:240)
[2019-08-27 15:07:52] [INFO]    - Scheduler: Sending due task ip_port_service_scan (scan.ip_port_service_scan) (beat.py:240)
[2019-08-27 15:08:00] [INFO]    - Scheduler: Sending due task test (default.test) (beat.py:240)
[2019-08-27 15:08:02] [INFO]    - Scheduler: Sending due task ip_port_service_scan (scan.ip_port_service_scan) (beat.py:240)
[2019-08-27 15:08:12] [INFO]    - Scheduler: Sending due task ip_port_service_scan (scan.ip_port_service_scan) (beat.py:240)
[2019-08-27 15:08:22] [INFO]    - Scheduler: Sending due task ip_port_service_scan (scan.ip_port_service_scan) (beat.py:240)
[2019-08-27 15:08:32] [INFO]    - Scheduler: Sending due task ip_port_service_scan (scan.ip_port_service_scan) (beat.py:240)
[2019-08-27 15:08:42] [INFO]    - Scheduler: Sending due task ip_port_service_scan (scan.ip_port_service_scan) (beat.py:240)
[2019-08-27 15:08:52] [INFO]    - Scheduler: Sending due task ip_port_service_scan (scan.ip_port_service_scan) (beat.py:240)
[2019-08-27 15:09:00] [INFO]    - Scheduler: Sending due task test (default.test) (beat.py:240)
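One plausible contributing factor (an assumption, not confirmed against the scheduler source) is mixing the naive datetime.now() used above with timezone-aware values stored elsewhere; such comparisons raise in Python, so an expiry check can be skipped upstream:

```python
from datetime import datetime, timedelta, timezone

naive_expiry = datetime.now() + timedelta(seconds=30)  # tzinfo is None
aware_now = datetime.now(timezone.utc)                  # tzinfo is UTC

print(naive_expiry.tzinfo)  # -> None
try:
    expired = aware_now > naive_expiry
    comparison_ok = True
except TypeError:
    # "can't compare offset-naive and offset-aware datetimes"
    comparison_ok = False
print(comparison_ok)  # -> False
```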

Changing start_time pauses the scheduling of all tasks

After changing the start_time of one task in the celery_periodic_task table, scheduling of all tasks is paused until that start_time is reached. Normally start_time should only affect the corresponding task. Can this be avoided?

Crontab periodic tasks cannot be used

Hello,
I created a task row in the celery_periodic_task table, set the corresponding crontab_id, and enabled the task. But the beat log shows the error "Disabling schedule XXX that was removed from database", after which the task reverts to disabled. That error supposedly appears when no schedule marker is configured, yet here crontab_id is configured and fails, while tasks configured with interval_id run normally. Is there a solution? Thanks.

celery beat never sends the task (my own misunderstanding)

(asset-ar0OxIPP) [root@VM asset]# celery beat -A celery_worker.ce -S celery_sqlalchemy_scheduler.schedulers:DatabaseScheduler -l info
[2019-08-26 15:29:50] [INFO] - Server initialized for eventlet. (server.py:140)
celery beat v4.3.0 (rhubarb) is starting.
__ - ... __ - _
LocalTime -> 2019-08-26 15:29:50
Configuration ->
. broker -> amqp://xiaopo:**@xxxxx:5672/xxx
. loader -> celery.loaders.app.AppLoader
. scheduler -> celery_sqlalchemy_scheduler.schedulers.DatabaseScheduler
. db -> mysql://root:xxxxxx@xxxxxxx:3306/celery-schedule
. logfile -> [stderr]@%INFO
. maxinterval -> 5.00 seconds (5s)
[2019-08-26 15:29:50] [INFO] - beat: Starting... (beat.py:586)
[2019-08-26 15:29:50] [INFO] - Writing entries... (schedulers.py:337)
[2019-08-26 15:29:56] [INFO] - Writing entries... (schedulers.py:337)
[2019-08-26 15:30:16] [INFO] - DatabaseScheduler: Schedule changed. (schedulers.py:404)
[2019-08-26 15:30:16] [INFO] - Writing entries... (schedulers.py:337)
[2019-08-26 15:32:28] [INFO] - DatabaseScheduler: Schedule changed. (schedulers.py:404)
[2019-08-26 15:32:28] [INFO] - Writing entries... (schedulers.py:337)
[2019-08-26 15:32:58] [INFO] - Writing entries... (schedulers.py:337)

Following the example code, I successfully wrote two test tasks that run once every minute, and celery beat detected the change, but the celery worker side never executed anything. Under normal logic, shouldn't the celery beat side log a "Sending due task" line?

Beat

I want to use celery beat to put tasks in different queues. What should I do?
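Each PeriodicTask row carries queue, exchange, and routing_key columns (visible in the INSERT statement quoted in another issue on this page), so setting queue per task should route it to a dedicated queue. A sketch with stdlib sqlite3 and an assumed column subset:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Column subset of celery_periodic_task (assumed from the INSERT statements
# shown in other issues on this page).
conn.execute("CREATE TABLE celery_periodic_task (name TEXT, task TEXT, queue TEXT)")
conn.execute(
    "INSERT INTO celery_periodic_task VALUES (?, ?, NULL)",
    ("nightly-report", "proj.tasks.report"),
)
# Route this one task to a dedicated queue; start a worker that listens on it
# with: celery -A proj worker -Q reports
conn.execute(
    "UPDATE celery_periodic_task SET queue = ? WHERE name = ?",
    ("reports", "nightly-report"),
)
queue = conn.execute(
    "SELECT queue FROM celery_periodic_task WHERE name = ?", ("nightly-report",)
).fetchone()[0]
print(queue)  # -> reports
```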

celery-sqlalchemy-scheduler not working with Celery 5.0.1

Receiving the following error after upgrading to Celery 5.0:

File "/venv/lib/python3.7/site-packages/celery_sqlalchemy_scheduler/schedulers.py", line 11, in <module>
    from celery.five import values, items
ModuleNotFoundError: No module named 'celery.five'

Any help would be appreciated.
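celery.five was Celery's Python 2/3 compatibility layer and was removed in Celery 5, so the import fails until the package stops depending on it. Under Python 3 those helpers reduce to the built-in dict methods; a minimal stand-in (an assumption about what the scheduler actually needs from them):

```python
# Minimal Python 3 stand-ins for the removed celery.five helpers.
def items(d):
    return d.items()

def values(d):
    return d.values()

schedule = {"task-a": 1, "task-b": 2}
print(list(items(schedule)))   # -> [('task-a', 1), ('task-b', 2)]
print(list(values(schedule)))  # -> [1, 2]
```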

`celery_sqlalchemy_scheduler` does not run at all.

My config:

celery_app = Celery(__name__,backend='redis://redis:6379',
                broker='redis://redis:6379')
celery_app.conf.update(
    CELERY_REDIS_SCHEDULER_URL = 'redis://redis:6379')
celery_app.conf.timezone = 'UTC'
celery_app.conf.broker_url = 'redis://redis:6379'  # os.environ.get("CELERY_BROKER_URL")
celery_app.conf.result_backend = 'redis://redis:6379'  # os.environ.get("CELERY_RESULT_BACKEND")
celery_app.conf.update(
    {
        "task_routes": {
            "worker.alert_celery": {"queue": "piport-celery"},
            "worker.schedule_task": {"queue": "beat-queue"},
        }
    }
)
celery_app.conf.update(
    {'beat_dburi': beat_dburi}
)
celery_app.conf.autodiscover_tasks = True

@celery_app.task(name="My_new_task")
def My_new_task(a, b, c):
    print('hello beat')
    time.sleep(a)
    print(b+c)

I ran this and it worked just fine


celery_app.conf.beat_schedule = {
    'add-every-30-seconds': {
        'task': 'My_new_task',
        'schedule': 3,
        'args': (1, 2, 2)
    },
}

But in my target file, when I run it via SQLAlchemy, it does not run.

data = {
            "every": 1,
            'period': IntervalSchedule.SECONDS
        }
schedule = IntervalSchedule(**data)
session.add(schedule)
schedule = db.query(IntervalSchedule).filter().first()
periodic_task = db.query(PeriodicTask).filter().first()
periodic_task = PeriodicTask(
    start_time=now() + datetime.timedelta(seconds=3),
    expires=now() + datetime.timedelta(days=10),
    name='My task',  # simply describes this periodic task.
    task='My_new_task',  # name of task.
    args=json.dumps([1,1,1]))
periodic_task.interval = schedule
session.add(periodic_task)
session.commit()
# more info 
periodic_task = session.query(PeriodicTask).filter(PeriodicTask.id == 10).first()
print(periodic_task.__dict__) # returns this
{'_sa_instance_state': <sqlalchemy.orm.state.InstanceState object at 0x7fd49a390b50>,
 'args': '[]',
 'crontab_id': 1,
 'date_changed': datetime.datetime(2021, 10, 20, 10, 21, 31, 190148, tzinfo=datetime.timezone.utc),
 'description': '',
 'enabled': True,
 'exchange': None,
 'expires': datetime.datetime(2021, 10, 30, 10, 21, 31, 192112, tzinfo=datetime.timezone.utc),
 'id': 10,
 'interval_id': None,
 'kwargs': '{}',
 'last_run_at': None,
 'name': 'celery.backend_cleanup',
 'one_off': False,
 'priority': None,
 'queue': None,
 'routing_key': None,
 'solar_id': None,
 'start_time': datetime.datetime(2021, 10, 20, 10, 21, 34, 192047, tzinfo=datetime.timezone.utc),
 'task': 'My_new_task',
 'total_run_count': 3}

Full project: https://github.com/aliscie/fastapi-full-graphql-template

Crontab Error

I have defined the below, but it throws an error. Not sure what I am doing wrong here. Please advise.

    schedule = CrontabSchedule(
          minute='*',
          hour='*',
          day_of_week='*',
          day_of_month='*',
          month_of_year='*',
          timezone=pytz.timezone('Canada/Pacific')              
     )
     session.add(schedule)
     session.commit()
     periodic_task = PeriodicTask(
        crontab=schedule,
        name=racfid+'test-scheduler',
        task='tasks.task.sayHello',
        args=json.dumps([dispatch])
     )
     session.add(periodic_task)
     session.commit()

sqlalchemy.exc.InterfaceError: (sqlite3.InterfaceError) Error binding parameter 5 - probably unsupported type.
[SQL: INSERT INTO celery_crontab_schedule (minute, hour, day_of_week, day_of_month, month_of_year, timezone) VALUES (?, ?, ?, ?, ?, ?)]
[parameters: ('', '', '', '', '', <DstTzInfo 'Canada/Pacific' LMT-1 day, 15:48:00 STD>)]
(Background on this error at: http://sqlalche.me/e/rvf5)
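Parameter 5 in the failing INSERT is the pytz object itself, and sqlite3 cannot bind arbitrary Python objects; the CrontabSchedule examples elsewhere on this page pass the zone name as a plain string (timezone='Asia/Shanghai'), which may be the intended usage. A stdlib reproduction with a hypothetical FakeTzInfo stand-in for the pytz DstTzInfo in the traceback:

```python
import sqlite3

class FakeTzInfo:
    """Hypothetical stand-in for the pytz DstTzInfo object in the traceback."""

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE celery_crontab_schedule (timezone TEXT)")
try:
    conn.execute("INSERT INTO celery_crontab_schedule VALUES (?)", (FakeTzInfo(),))
    bound = True
except (sqlite3.InterfaceError, sqlite3.ProgrammingError):
    bound = False  # arbitrary Python objects cannot be bound as parameters
print(bound)  # -> False

# The zone *name* as a plain string binds fine:
conn.execute("INSERT INTO celery_crontab_schedule VALUES (?)", ("Canada/Pacific",))
tz = conn.execute("SELECT timezone FROM celery_crontab_schedule").fetchone()[0]
print(tz)  # -> Canada/Pacific
```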

CrontabSchedule addition in PeriodicTask gives error

sqlalchemy.exc.InterfaceError: (sqlite3.InterfaceError) Error binding parameter 5 - probably unsupported type.
[SQL: INSERT INTO celery_periodic_task (name, task, interval_id, crontab_id, solar_id, args, kwargs, queue, exchange, routing_key, priority, expires, one_off, start_time, enabled, last_run_at, total_run_count, date_changed, description) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, CURRENT_TIMESTAMP, ?)]
[parameters: (None, 'tasks.add', None, 3, None, (3, 4), '{}', None, None, None, None, None, 0, None, 1, None, 0, '')]
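Here parameter 5 is args, passed as the raw tuple (3, 4), while the examples elsewhere on this page serialize args with json.dumps before inserting. A stdlib reproduction of both the failure and the working form:

```python
import json
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE celery_periodic_task (task TEXT, args TEXT)")
try:
    # A tuple is not a bindable SQLite type, mirroring the error above.
    conn.execute("INSERT INTO celery_periodic_task VALUES (?, ?)", ("tasks.add", (3, 4)))
    bound = True
except (sqlite3.InterfaceError, sqlite3.ProgrammingError):
    bound = False
print(bound)  # -> False

# Serialize args to a JSON string first:
conn.execute(
    "INSERT INTO celery_periodic_task VALUES (?, ?)", ("tasks.add", json.dumps([3, 4]))
)
args = conn.execute("SELECT args FROM celery_periodic_task").fetchone()[0]
print(args)  # -> [3, 4]
```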
