@@ -239,30 +239,6 @@ waiting tasks you have to stop all the workers, and then discard the tasks
239239using ``discard_all``.
240240
241241
242- Windows: The ``-B`` / ``--beat`` option to celeryd doesn't work?
243- ----------------------------------------------------------------
244- **Answer**: That's right. Run ``celerybeat`` and ``celeryd`` as separate
245- services instead.
246-
247- Tasks
248- =====
249-
250- How can I reuse the same connection when applying tasks?
251- --------------------------------------------------------
252-
253- **Answer**: See :doc:`userguide/executing`.
254-
255- Can I execute a task by name?
256- -----------------------------
257-
258- **Answer**: Yes. Use :func:`celery.execute.send_task`.
259- You can also execute a task by name from any language
260- that has an AMQP client.
261-
262- >>> from celery.execute import send_task
263- >>> send_task("tasks.add", args=[2, 2], kwargs={})
264- <AsyncResult: 373550e8-b9a0-4666-bc61-ace01fa4f91d>
265-
266242Results
267243=======
268244
@@ -389,8 +365,56 @@ using the STOMP backend:
389365
390366 * mandatory
391367
392- Features
393- ========
368+ Tasks
369+ =====
370+
371+ How can I reuse the same connection when applying tasks?
372+ --------------------------------------------------------
373+
374+ **Answer**: See :doc:`userguide/executing`.
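
For example (a sketch only, assuming the ``add`` task from the tutorial)::

    from tasks import add

    # establish_connection() and the connection argument to apply_async()
    # are described in the executing guide linked above.
    results = []
    connection = add.establish_connection()
    try:
        for n in range(10):
            results.append(add.apply_async(args=(n, n), connection=connection))
    finally:
        connection.close()

Reusing a single connection avoids reconnecting to the broker for every task.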
375+
376+ Can I execute a task by name?
377+ -----------------------------
378+
379+ **Answer**: Yes. Use :func:`celery.execute.send_task`.
380+ You can also execute a task by name from any language
381+ that has an AMQP client.
382+
383+ >>> from celery.execute import send_task
384+ >>> send_task("tasks.add", args=[2, 2], kwargs={})
385+ <AsyncResult: 373550e8-b9a0-4666-bc61-ace01fa4f91d>
386+
387+
388+ How can I get the task id of the current task?
389+ ----------------------------------------------
390+
391+ **Answer**: Celery does set some default keyword arguments if the task
392+ accepts them (you can accept them either by using ``**kwargs`` or by listing
393+ them specifically)::
394+
    from celery.decorators import task    # decorator module in this release series
    from django.core.cache import cache   # assuming Django's cache here; any store works

    @task
    def mytask(task_id=None):
        cache.set(task_id, "Running")
398+
399+ The default keyword arguments are documented here:
400+ http://celeryq.org/docs/userguide/tasks.html#default-keyword-arguments
401+
402+ Can I specify a custom task_id?
403+ -------------------------------
404+
405+ **Answer**: Yes. Use the ``task_id`` argument to
406+ :meth:`~celery.execute.apply_async`::
407+
408+ >>> task.apply_async(args, kwargs, task_id="...")
409+
410+ Can I use natural task ids?
411+ ---------------------------
412+
413+ **Answer**: Yes, but make sure the id is unique, as the behavior
414+ when two tasks exist with the same id is undefined.
415+
416+ The world will probably not explode, but at worst
417+ they can overwrite each other's results.
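
For example, an id derived from the object being processed (a sketch only;
``refresh_feed`` is a hypothetical task)::

    from tasks import refresh_feed

    feed_url = "http://example.com/rss"
    result = refresh_feed.apply_async(args=[feed_url],
                                      task_id="refresh-feed-%s" % feed_url)

This is safe as long as only one refresh per feed is pending at any time.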
394418
395419How can I run a task once another task has finished?
396420----------------------------------------------------
@@ -563,3 +587,32 @@ and they will not be re-run unless you have the ``acks_late`` option set.
563587How do I run celeryd in the background on [platform]?
564588-----------------------------------------------------
565589**Answer**: Please see :doc:`cookbook/daemonizing`.
590+
591+ Windows
592+ =======
593+
594+ celeryd keeps spawning processes at startup
595+ -------------------------------------------
596+
597+ **Answer**: This is a known issue on Windows.
598+ You have to start celeryd with the command::
599+
600+ $ python -m celery.bin.celeryd
601+
602+ Any additional arguments can be appended to this command.
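
For example, to run with a higher log level::

    $ python -m celery.bin.celeryd --loglevel=INFO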
603+
604+ See http://bit.ly/bo9RSw
605+
606+ The ``-B`` / ``--beat`` option to celeryd doesn't work?
607+ -------------------------------------------------------
608+ **Answer**: That's right. Run ``celerybeat`` and ``celeryd`` as separate
609+ services instead.
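
For example, from two separate command prompts (using the invocation from the
previous answer for celeryd)::

    $ python -m celery.bin.celeryd
    $ celerybeat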
610+
611+ ``django-celery`` can't find settings?
612+ --------------------------------------
613+
614+ **Answer**: You need to specify the ``--settings`` argument to ``manage.py``::
615+
616+ $ python manage.py celeryd start --settings=settings
617+
618+ See http://bit.ly/bo9RSw