sartography / spiffworkflow

A powerful workflow engine implemented in pure Python

License: GNU Lesser General Public License v3.0

Python 99.84% Makefile 0.14% Dockerfile 0.02%
workflow spiff-workflow python workflow-specification bpmn bpmn-engine workflowpatterns

spiffworkflow's Introduction

SpiffWorkflow


Spiff Workflow is a workflow engine implemented in pure Python. It is based on the excellent work of the Workflow Patterns initiative. In 2020 and 2021, extensive support was added for BPMN / DMN processing.

Motivation

We created SpiffWorkflow to support the development of low-code business applications in Python. Using BPMN allows non-developers to describe complex workflow processes in a visual diagram, coupled with a powerful Python script engine that works seamlessly within the diagrams. SpiffWorkflow can parse these diagrams and execute them. The ability for businesses to create clear, coherent diagrams that drive an application has far-reaching potential. While multiple tools exist for doing this in Java, we believe that the wide adoption of the Python language, and its ease of use, make this a winning strategy for building low-code applications.


Code style

PEP8

Dependencies

We've worked to minimize external dependencies. We rely on lxml for parsing XML documents, and that's it!

Features

  • BPMN - support for parsing BPMN diagrams, including the more complex components, like pools and lanes, multi-instance tasks, sub-workflows, timer events, signals, messages, boundary events and looping.
  • DMN - We have a baseline implementation of DMN that is well integrated with our Python Execution Engine.
  • Python Workflows - We've retained support for building workflows directly in code, or running workflows based on an internal JSON data structure (see the sketch below).
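
A minimal sketch of that code-first style, assuming the classic WorkflowSpec/Simple API from the tutorial (module paths may differ between releases):

from SpiffWorkflow import Workflow
from SpiffWorkflow.specs import WorkflowSpec, Simple

# Build a tiny two-step spec entirely in code: Start -> task_a -> task_b.
spec = WorkflowSpec(name='hello')
task_a = Simple(spec, 'task_a')
task_b = Simple(spec, 'task_b')
spec.start.connect(task_a)
task_a.connect(task_b)

# Run it to completion; Simple tasks need no manual input.
workflow = Workflow(spec)
workflow.complete_all()
print(workflow.is_completed())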

A complete list of the latest features is available with our release notes for version 1.0.

Code Examples and Documentation

Detailed documentation is available on ReadTheDocs. Also, check out our example application, which we reference extensively from the documentation.

Installation

pip install spiffworkflow

Tests

pip install spiffworkflow[dev]
cd tests/SpiffWorkflow
coverage run --source=SpiffWorkflow -m unittest discover -v . "*Test.py"

Support

You can find us on Discord at https://discord.gg/BYHcc7PpUC

Commercial support for SpiffWorkflow is available from Sartography

Contribute

Pull Requests are and always will be welcome!

Please check your formatting, ensure that all tests are passing, and include any additional tests that demonstrate that your new code works as expected. If applicable, please reference the issue number in your pull request.

Credits and Thanks

Samuel Abels (@knipknap) for creating SpiffWorkflow and maintaining it for over a decade.

Matthew Hampton (@matthewhampton) for his initial contributions around BPMN parsing and execution.

The University of Virginia for allowing us to take on the mammoth task of building a general-purpose workflow system for BPMN, and allowing us to contribute that back to the open source community. In particular, we would like to thank Ron Hutchins, for his trust and support. Without him our efforts would not be possible.

Bruce Silver, the author of BPMN Quick and Easy Using Method and Style, whose work we referenced extensively as we made implementation decisions and educated ourselves on the BPMN and DMN standards.

The BPMN.js library, without which we would not have the tools to effectively build out our models, embed an editor in our application, and pull this mad mess together.

Kelly McDonald (@w4kpm) who dove deeper into the core of SpiffWorkflow than anyone else, and was instrumental in helping us get some of these major enhancements working correctly.

Thanks also to the many contributions from our community, large and small, from Ziad (@ziadsawalha) in the early days to Elizabeth (@essweine) more recently. It is good to be a part of this long-lived and strong community.

License

GNU LESSER GENERAL PUBLIC LICENSE


spiffworkflow's Issues

doc/tutorial/deserialize.py is failing

After doc/tutorial/deserialize.py was executed, I ran deserialize.py and got:

Traceback (most recent call last):
File "deserialize.py", line 8, in
spec = WorkflowSpec.deserialize(serializer, workflow_json, 'nuclear.json')
TypeError: deserialize() takes 3 positional arguments but 4 were given
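
For reference, a hedged sketch of a call that avoids the extra positional argument, assuming deserialize() accepts only the serializer and the serialized state (the JSONSerializer import path has moved between releases, e.g. SpiffWorkflow.storage in older versions):

from SpiffWorkflow.specs import WorkflowSpec
from SpiffWorkflow.serializer.json import JSONSerializer  # older releases: SpiffWorkflow.storage

serializer = JSONSerializer()
with open('nuclear.json') as f:
    workflow_json = f.read()

# Passing 'nuclear.json' as an extra positional argument triggers the TypeError above.
spec = WorkflowSpec.deserialize(serializer, workflow_json)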

Why not follow the "Style Guide for Python Code" (PEP 8)?

Thank you for your work.
When reading your code, I was confused by the code style, for example:

        nodetype        = start_node.nodeName.lower()
        name            = start_node.getAttribute('name').lower()
        context         = start_node.getAttribute('context').lower()
        mutex           = start_node.getAttribute('mutex').lower()
        cancel          = start_node.getAttribute('cancel').lower()
        success         = start_node.getAttribute('success').lower()
        times           = start_node.getAttribute('times').lower()
        times_field     = start_node.getAttribute('times-field').lower()
        threshold       = start_node.getAttribute('threshold').lower()
        threshold_field = start_node.getAttribute('threshold-field').lower()
        file            = start_node.getAttribute('file').lower()
        file_field      = start_node.getAttribute('file-field').lower()

see http://www.python.org/dev/peps/pep-0008/#id16

and

type(context) != type([])

just use

isinstance(context, list)

and

_spec_map.has_key(child_node.nodeName.lower()):

just use

child_node.nodeName.lower() in _spec_map

Thank you.

TaskSpec refactoring

Some refactoring of the TaskSpec classes seems to be in order:

  • Is TaskSpec._predict_hook required? Does a simpler alternative exist?
  • Stop interpreting the return value of hook functions, it makes things hard to understand.
  • Predicted states should ONLY be set within the predict() method, nowhere else.
  • The TRIGGERED state should be an extra flag instead of part of the state value.
  • my_task._inherit_attributes() in TaskSpec should be called in Task if it is not TaskSpec-specific.
  • Are the other hook methods all needed?
  • Can Task._update_children be split into _update_children and _update_child_state?

Documentation issue

Here, at http://spiffworkflow.readthedocs.io/en/latest/tutorial/index.html, in the JSON, the president task is defined as:

"president": {
            "class": "SpiffWorkflow.specs.ExclusiveChoice.ExclusiveChoice",
            "name": "president",
            "manual": true,
            "inputs": [
                "general"
            ],
            "outputs": [
                "workflow_aborted",
                "nuclear_strike"
            ],
            "choice": null,
            "default_task_spec": "workflow_aborted",
            "cond_task_specs": [
                [
                    [
                        "SpiffWorkflow.operators.Equal",
                        [
                            [
                                "Attrib",
                                "confirmation"
                            ],
                            [
                                "value",
                                "yes"
                            ]
                        ]
                    ],
                    "president"
                ]
            ]
        }

Here, the output task given when the condition is true is president. I think it should be nuclear_strike.

enforce 80 character lines (pep8?)

PEP 8 calls for 80-character lines, but I noticed some files have much longer lines, in particular some of the BPMN code. Would it be okay to enforce 80-character lines throughout the codebase?

Documentation states BPMN supports Inclusive Gateways but it doesn't?

When trying to use an inclusive gateway with two outgoing sequence flows I get this error.

SpiffWorkflow.bpmn.parser.ValidationException.ValidationException: Multiple outgoing flows are not supported for tasks of type
Source Details: bpmn:inclusiveGateway

Are we expected to subclass 'handles_multiple_outgoing' to get this working?

Question on Duplicate Task IDs

I've come across a few situations where I have ended up with duplicate task IDs. I noticed in the code that there are places where you account for that (like in workflow.get_task()).

What are the legitimate cases where you have tasks with the same IDs?

Can I save a running workflow instance?

In the event that a task is waiting for a response from a human, can the running workflow be saved and resumed only after the task has been executed? I don't see any docs on that, or I must have missed them.
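
The storage.*SerializerTest cases in the test run below exercise exactly this. A minimal, self-contained sketch assuming the JSON serializer and the Workflow.serialize/Workflow.deserialize pair (the serializer module path differs between releases):

from SpiffWorkflow import Workflow
from SpiffWorkflow.specs import WorkflowSpec, Simple
from SpiffWorkflow.serializer.json import JSONSerializer  # older releases: SpiffWorkflow.storage

# A tiny spec so the example is self-contained.
spec = WorkflowSpec(name='demo')
spec.start.connect(Simple(spec, 'wait_for_human'))

workflow = Workflow(spec)
serializer = JSONSerializer()

# Persist the running instance, e.g. while a human task is pending.
with open('pending_workflow.json', 'w') as f:
    f.write(workflow.serialize(serializer))

# Later: restore the instance and carry on where it left off.
with open('pending_workflow.json') as f:
    restored = Workflow.deserialize(serializer, f.read())
restored.complete_all()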

Make a release on PyPI

The latest release on PyPI was 0.3.0 in 2009. The code has changed a fair amount since then. It's time for a new version number - 1.0 maybe? - and a new release.

Some errors while running test suite on Windows

tests/SpiffWorkflow/run_suite.py, line 72 also needed fixing:

 name   = name.lstrip('.').lstrip(os.sep).replace(os.sep, '.')

Traceback:

C:\ACME\Dev\projects\SpiffWorkflow-master\ve\Scripts\python.exe C:/ACME/Dev/projects/SpiffWorkflow-master_GITHUB2/tests/SpiffWorkflow/run_suite.py
WARNING: Celery not found, not all tests are running!
testPattern (PatternTest.PatternTest) ... C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\acyclic_synchronizing_merge.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\arbitrary_cycles.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\blocking_discriminator.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\blocking_partial_join.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\cancelling_discriminator.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\cancelling_partial_join.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\cancelling_partial_join_for_multi_instance.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\cancel_case.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\cancel_multi_instance_task.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\cancel_region.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\cancel_task.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\complete_multiple_instance_activity.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\critical_section.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\deferred_choice.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\dynamic_partial_join_for_multi_instance.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\exclusive_choice.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\explicit_termination.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\generalized_and_join.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\general_synchronizing_merge.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\implicit_termination.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\interleaved_parallel_routing.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\interleaved_routing.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\milestone.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\multi_choice.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\multi_instance_without_a_priori.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\multi_instance_without_synch.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\multi_instance_with_a_priori_design_time_knowledge.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\multi_instance_with_a_priori_run_time_knowledge.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\multi_merge.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\parallel_split.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\persistent_trigger.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\recursion.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\sequence.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\simple_merge.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\static_partial_join_for_multi_instance.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\structured_discriminator.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\structured_partial_join.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\structured_synchronizing_merge.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\subworkflow_to_join.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\subworkflow_to_join_inner.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\synchronization.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\thread_merge.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\thread_split.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\transient_trigger.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/data\block_data.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/data\block_data_inner.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/data\block_to_subworkflow.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/data\block_to_subworkflow_inner.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/data\subworkflow_to_block.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/data\subworkflow_to_block_inner.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/data\task_data.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/data\task_to_task.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff\workflow1.py
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff\workflow1.xml
ok
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff\__init__.py
testDeserialization (PersistSmallWorkflowTest.PersistSmallWorkflowTest) ... ok
testDictionarySerializer (PersistSmallWorkflowTest.PersistSmallWorkflowTest) ... ok
testTree (TaskTest.TaskTest) ... ok
testBeginWorkflowStepByStep (WorkflowTest.WorkflowTest) ... ok
testConstructor (WorkflowTest.WorkflowTest) ... ok
testRunThroughCancel (bpmn.ActionManagementTest.ActionManagementTest) ... ok
testRunThroughCancelAfterApproved (bpmn.ActionManagementTest.ActionManagementTest) ... ok
testRunThroughCancelAfterWorkStarted (bpmn.ActionManagementTest.ActionManagementTest) ... ok
testRunThroughHappy (bpmn.ActionManagementTest.ActionManagementTest) ... ok
testRunThroughOverdue (bpmn.ActionManagementTest.ActionManagementTest) ... ok
testReadonlyWaiting (bpmn.ApprovalsTest.ApprovalsTest) ... ok
testRunThroughHappy (bpmn.ApprovalsTest.ApprovalsTest) ... ok
testRunThroughHappyOtherOrders (bpmn.ApprovalsTest.ApprovalsTest) ... ok
testSaveRestore (bpmn.ApprovalsTest.ApprovalsTest) ... ok
testSaveRestoreWaiting (bpmn.ApprovalsTest.ApprovalsTest) ... ok
testDisconnectedBoundaryEvent (bpmn.InvalidWorkflowsTest.InvalidWorkflowsTest) ... ok
testMultipleStartEvents (bpmn.InvalidWorkflowsTest.InvalidWorkflowsTest) ... ok
testNoStartEvent (bpmn.InvalidWorkflowsTest.InvalidWorkflowsTest) ... ok
testRecursiveSubprocesses (bpmn.InvalidWorkflowsTest.InvalidWorkflowsTest) ... ERROR:SpiffWorkflow.bpmn.parser.TaskParser:NotImplementedError('Recursive call Activities are not supported.',)
Traceback (most recent call last):
  File "C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\..\..\SpiffWorkflow\bpmn\parser\TaskParser.py", line 59, in parse_node
    self.task = self.create_task()
  File "C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\..\..\SpiffWorkflow\bpmn\parser\task_parsers.py", line 110, in create_task
    wf_spec = self.get_subprocess_parser().get_spec()
  File "C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\..\..\SpiffWorkflow\bpmn\parser\ProcessParser.py", line 113, in get_spec
    raise NotImplementedError('Recursive call Activities are not supported.')
NotImplementedError: Recursive call Activities are not supported.

ok
testSubprocessNotFound (bpmn.InvalidWorkflowsTest.InvalidWorkflowsTest) ... ok
testUnsupportedTask (bpmn.InvalidWorkflowsTest.InvalidWorkflowsTest) ... ok
testRunThroughHappySaveAndRestore (bpmn.MessageInterruptsSpTest.MessageInterruptsSpTest) ... ok
testRunThroughInterruptSaveAndRestore (bpmn.MessageInterruptsSpTest.MessageInterruptsSpTest) ... ok
testRunThroughHappy (bpmn.MessageInterruptsTest.MessageInterruptsTest) ... ok
testRunThroughHappySaveAndRestore (bpmn.MessageInterruptsTest.MessageInterruptsTest) ... ok
testRunThroughMessageInterrupt (bpmn.MessageInterruptsTest.MessageInterruptsTest) ... ok
testRunThroughMessageInterruptSaveAndRestore (bpmn.MessageInterruptsTest.MessageInterruptsTest) ... ok
testRunThroughHappySaveAndRestore (bpmn.MessageNonInterruptsSpTest.MessageNonInterruptsSpTest) ... ok
testRunThroughMessageOrder2SaveAndRestore (bpmn.MessageNonInterruptsSpTest.MessageNonInterruptsSpTest) ... ok
testRunThroughMessageOrder3SaveAndRestore (bpmn.MessageNonInterruptsSpTest.MessageNonInterruptsSpTest) ... ok
testRunThroughMessageSaveAndRestore (bpmn.MessageNonInterruptsSpTest.MessageNonInterruptsSpTest) ... ok
testRunThroughHappy (bpmn.MessageNonInterruptTest.MessageNonInterruptTest) ... ok
testRunThroughHappySaveAndRestore (bpmn.MessageNonInterruptTest.MessageNonInterruptTest) ... ok
testRunThroughMessageInterrupt (bpmn.MessageNonInterruptTest.MessageNonInterruptTest) ... ok
testRunThroughMessageInterruptOtherOrder (bpmn.MessageNonInterruptTest.MessageNonInterruptTest) ... ok
testRunThroughMessageInterruptOtherOrderSaveAndRestore (bpmn.MessageNonInterruptTest.MessageNonInterruptTest) ... ok
testRunThroughMessageInterruptSaveAndRestore (bpmn.MessageNonInterruptTest.MessageNonInterruptTest) ... ok
testRunThroughHappy (bpmn.MessagesTest.MessagesTest) ... ok
testRunThroughSaveAndRestore (bpmn.MessagesTest.MessagesTest) ... ok
testNoFirstThenThread1 (bpmn.ParallelTest.ParallelJoinLongInclusiveTest) ... ok
testRunThroughAlternating (bpmn.ParallelTest.ParallelJoinLongInclusiveTest) ... ok
testRunThroughThread1First (bpmn.ParallelTest.ParallelJoinLongInclusiveTest) ... ok
testRunThroughThread1FirstThenNo (bpmn.ParallelTest.ParallelJoinLongInclusiveTest) ... ok
testRunThroughAlternating (bpmn.ParallelTest.ParallelJoinLongTest) ... ok
testRunThroughThread1First (bpmn.ParallelTest.ParallelJoinLongTest) ... ok
test1 (bpmn.ParallelTest.ParallelLoopingAfterJoinTest) ... ok
test2 (bpmn.ParallelTest.ParallelLoopingAfterJoinTest) ... ok
test1 (bpmn.ParallelTest.ParallelManyThreadsAtSamePointTest) ... ok
test2 (bpmn.ParallelTest.ParallelManyThreadsAtSamePointTest) ... ok
test_breadth_first (bpmn.ParallelTest.ParallelManyThreadsAtSamePointTestNested) ... ok
test_depth_first (bpmn.ParallelTest.ParallelManyThreadsAtSamePointTestNested) ... ok
test1 (bpmn.ParallelTest.ParallelMultipleSplitsAndJoinsTest) ... ok
test2 (bpmn.ParallelTest.ParallelMultipleSplitsAndJoinsTest) ... ok
test3 (bpmn.ParallelTest.ParallelMultipleSplitsAndJoinsTest) ... ok
test4 (bpmn.ParallelTest.ParallelMultipleSplitsAndJoinsTest) ... ok
testRunThroughAlternating (bpmn.ParallelTest.ParallelMultipleSplitsTest) ... ok
testRunThroughChoiceFirst (bpmn.ParallelTest.ParallelOnePathEndsTest) ... ok
testRunThroughParallelTaskFirst (bpmn.ParallelTest.ParallelOnePathEndsTest) ... ok
testRunThroughParallelTaskFirstYes (bpmn.ParallelTest.ParallelOnePathEndsTest) ... ok
testRunThroughChoiceFirst (bpmn.ParallelTest.ParallelThenExlusiveNoInclusiveTest) ... ok
testRunThroughChoiceThreadCompleteFirst (bpmn.ParallelTest.ParallelThenExlusiveNoInclusiveTest) ... ok
testRunThroughParallelTaskFirst (bpmn.ParallelTest.ParallelThenExlusiveNoInclusiveTest) ... ok
testRunThroughChoiceFirst (bpmn.ParallelTest.ParallelThenExlusiveTest) ... ok
testRunThroughChoiceThreadCompleteFirst (bpmn.ParallelTest.ParallelThenExlusiveTest) ... ok
testRunThroughParallelTaskFirst (bpmn.ParallelTest.ParallelThenExlusiveTest) ... ok
testNoRouteNoFirstThenRepeating (bpmn.ParallelTest.ParallelThroughSameTaskTest) ... ok
testNoRouteNoTaskFirst (bpmn.ParallelTest.ParallelThroughSameTaskTest) ... ok
testNoRouteRepeatTaskFirst (bpmn.ParallelTest.ParallelThroughSameTaskTest) ... ok
testRepeatTasksReadyTogether (bpmn.ParallelTest.ParallelThroughSameTaskTest) ... ok
testRepeatTasksReadyTogetherSaveRestore (bpmn.ParallelTest.ParallelThroughSameTaskTest) ... ok
testRunThroughFirstRepeatTaskFirst (bpmn.ParallelTest.ParallelThroughSameTaskTest) ... ok
testRunThroughHappy (bpmn.TimerIntermediateTest.TimerIntermediateTest) ... ok
testAncestors (specs.ExecuteTest.ExecuteTest) ... ok
testConnect (specs.ExecuteTest.ExecuteTest) ... ok
testConstructor (specs.ExecuteTest.ExecuteTest) ... ok
testFollow (specs.ExecuteTest.ExecuteTest) ... ok
testGetData (specs.ExecuteTest.ExecuteTest) ... ok
testPattern (specs.ExecuteTest.ExecuteTest) ... ok
testSerialize (specs.ExecuteTest.ExecuteTest) ... ok
testSetData (specs.ExecuteTest.ExecuteTest) ... ok
testTest (specs.ExecuteTest.ExecuteTest) ... ok
test_ancestors_cyclic (specs.ExecuteTest.ExecuteTest) ... ok
testAncestors (specs.JoinTest.JoinTest) ... ok
testConnect (specs.JoinTest.JoinTest) ... ok
testConstructor (specs.JoinTest.JoinTest) ... ok
testFollow (specs.JoinTest.JoinTest) ... ok
testGetData (specs.JoinTest.JoinTest) ... ok
testSerialize (specs.JoinTest.JoinTest) ... ok
testSetData (specs.JoinTest.JoinTest) ... ok
testTest (specs.JoinTest.JoinTest) ... ok
test_ancestors_cyclic (specs.JoinTest.JoinTest) ... ok
testAncestors (specs.MergeTest.MergeTest) ... ok
testConnect (specs.MergeTest.MergeTest) ... ok
testConstructor (specs.MergeTest.MergeTest) ... ok
testFollow (specs.MergeTest.MergeTest) ... ok
testGetData (specs.MergeTest.MergeTest) ... ok
testSerialize (specs.MergeTest.MergeTest) ... ok
testSetData (specs.MergeTest.MergeTest) ... ok
testTest (specs.MergeTest.MergeTest) ... ok
test_Merge_data_merging (specs.MergeTest.MergeTest)
Test that Merge task actually merges data ... FAIL
test_ancestors_cyclic (specs.MergeTest.MergeTest) ... ok
testConstructor (specs.SubWorkflowTest.TaskSpecTest) ... ok
testSerialize (specs.SubWorkflowTest.TaskSpecTest) ... ok
testTest (specs.SubWorkflowTest.TaskSpecTest) ... ok
test_block_to_subworkflow (specs.SubWorkflowTest.TaskSpecTest) ... ok
test_subworkflow_to_block (specs.SubWorkflowTest.TaskSpecTest) ... ok
test_subworkflow_to_join (specs.SubWorkflowTest.TaskSpecTest) ... ok
test_subworkflow_to_join_refresh_waiting (specs.SubWorkflowTest.TaskSpecTest) ... ok
testAncestors (specs.TaskSpecTest.TaskSpecTest) ... ok
testConnect (specs.TaskSpecTest.TaskSpecTest) ... ok
testConstructor (specs.TaskSpecTest.TaskSpecTest) ... ok
testFollow (specs.TaskSpecTest.TaskSpecTest) ... ok
testGetData (specs.TaskSpecTest.TaskSpecTest) ... ok
testSerialize (specs.TaskSpecTest.TaskSpecTest) ... ok
testSetData (specs.TaskSpecTest.TaskSpecTest) ... ok
testTest (specs.TaskSpecTest.TaskSpecTest) ... ok
test_ancestors_cyclic (specs.TaskSpecTest.TaskSpecTest) ... ok
testAncestors (specs.TransformTest.TransformTest) ... ok
testConnect (specs.TransformTest.TransformTest) ... ok
testConstructor (specs.TransformTest.TransformTest) ... ok
testFollow (specs.TransformTest.TransformTest) ... ok
testGetData (specs.TransformTest.TransformTest) ... ok
testPattern (specs.TransformTest.TransformTest) ... ok
testSerialize (specs.TransformTest.TransformTest) ... ok
testSetData (specs.TransformTest.TransformTest) ... ok
testTest (specs.TransformTest.TransformTest) ... ok
test_ancestors_cyclic (specs.TransformTest.TransformTest) ... ok
testConstructor (specs.WorkflowSpecTest.WorkflowSpecTest) ... ok
testDump (specs.WorkflowSpecTest.WorkflowSpecTest) ... ok
testGetDump (specs.WorkflowSpecTest.WorkflowSpecTest) ... ok
testGetTaskSpecFromName (specs.WorkflowSpecTest.WorkflowSpecTest) ... ok
testSerialize (specs.WorkflowSpecTest.WorkflowSpecTest) ... ok
testValidate (specs.WorkflowSpecTest.WorkflowSpecTest) ... ok
testConstructor (storage.DictionarySerializerTest.DictionarySerializerTest) ... ok
testDeserializeWorkflow (storage.DictionarySerializerTest.DictionarySerializerTest) ... ok
testDeserializeWorkflowSpec (storage.DictionarySerializerTest.DictionarySerializerTest) ... ok
testSerializeWorkflow (storage.DictionarySerializerTest.DictionarySerializerTest) ... ERROR
testSerializeWorkflowSpec (storage.DictionarySerializerTest.DictionarySerializerTest) ... ok
testPattern (storage.DictionarySerializerTest.DictionarySerializeEveryPatternTest) ... C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\acyclic_synchronizing_merge.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\arbitrary_cycles.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\blocking_discriminator.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\blocking_partial_join.xml
ERROR
testConstructor (storage.JSONSerializerTest.JSONSerializerTest) ... ok
testDeserializeWorkflow (storage.JSONSerializerTest.JSONSerializerTest) ... ok
testDeserializeWorkflowSpec (storage.JSONSerializerTest.JSONSerializerTest) ... ok
testSerializeWorkflow (storage.JSONSerializerTest.JSONSerializerTest) ... ok
testSerializeWorkflowSpec (storage.JSONSerializerTest.JSONSerializerTest) ... ok
testPattern (storage.JSONSerializerTest.JSONSerializeEveryPatternTest) ... C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\acyclic_synchronizing_merge.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\arbitrary_cycles.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\blocking_discriminator.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\blocking_partial_join.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\cancelling_discriminator.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\cancelling_partial_join.xml
ERROR
testConstructor (storage.OpenWfeXmlSerializerTest.OpenWfeXmlSerializerTest) ... ok
testDeserializeWorkflow (storage.OpenWfeXmlSerializerTest.OpenWfeXmlSerializerTest) ... ok
testDeserializeWorkflowSpec (storage.OpenWfeXmlSerializerTest.OpenWfeXmlSerializerTest) ... FAIL
testSerializeWorkflow (storage.OpenWfeXmlSerializerTest.OpenWfeXmlSerializerTest) ... ok
testSerializeWorkflowSpec (storage.OpenWfeXmlSerializerTest.OpenWfeXmlSerializerTest) ... ok
testConstructor (storage.SerializerTest.SerializerTest) ... ok
testDeserializeWorkflow (storage.SerializerTest.SerializerTest) ... ok
testDeserializeWorkflowSpec (storage.SerializerTest.SerializerTest) ... ok
testSerializeWorkflow (storage.SerializerTest.SerializerTest) ... ok
testSerializeWorkflowSpec (storage.SerializerTest.SerializerTest) ... ok
testConstructor (storage.XmlSerializerTest.XmlSerializerTest) ... ok
testDeserializeWorkflow (storage.XmlSerializerTest.XmlSerializerTest) ... ok
testDeserializeWorkflowSpec (storage.XmlSerializerTest.XmlSerializerTest) ... ok
testSerializeWorkflow (storage.XmlSerializerTest.XmlSerializerTest) ... ok
testSerializeWorkflowSpec (storage.XmlSerializerTest.XmlSerializerTest) ... ok

======================================================================
ERROR: testSerializeWorkflow (storage.DictionarySerializerTest.DictionarySerializerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\storage\SerializerTest.py", line 123, in testSerializeWorkflow
    exclude_dynamic=True)
  File "C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\storage\DictionarySerializerTest.py", line 43, in compareSerialization
    raise Exception(key + '/' + str(e))
Exception: data/synch_1_reached/Unequal: 'gAJLAS4=' vs 'gAJLAi4='

======================================================================
ERROR: testPattern (storage.DictionarySerializerTest.DictionarySerializeEveryPatternTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\PatternTest.py", line 62, in testPattern
    self.run_pattern(filename)
  File "C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\storage\SerializerTest.py", line 158, in run_pattern
    data=expected_data)
  File "C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\storage\SerializerTest.py", line 123, in testSerializeWorkflow
    exclude_dynamic=True)
  File "C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\storage\DictionarySerializerTest.py", line 43, in compareSerialization
    raise Exception(key + '/' + str(e))
Exception: data/struct_synch_merge_1_reached/Unequal: 'gAJLAi4=' vs 'gAJLBC4='

======================================================================
ERROR: testPattern (storage.JSONSerializerTest.JSONSerializeEveryPatternTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\PatternTest.py", line 62, in testPattern
    self.run_pattern(filename)
  File "C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\storage\SerializerTest.py", line 158, in run_pattern
    data=expected_data)
  File "C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\storage\SerializerTest.py", line 123, in testSerializeWorkflow
    exclude_dynamic=True)
  File "C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\storage\JSONSerializerTest.py", line 36, in compareSerialization
    exclude_items=exclude_items)
  File "C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\storage\DictionarySerializerTest.py", line 43, in compareSerialization
    raise Exception(key + '/' + str(e))
Exception: data/struct_synch_merge_1_reached/Unequal: u'gAJLAi4=' vs u'gAJLBC4='

======================================================================
FAIL: test_Merge_data_merging (specs.MergeTest.MergeTest)
Test that Merge task actually merges data
----------------------------------------------------------------------
Traceback (most recent call last):
  File "C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\specs\MergeTest.py", line 68, in test_Merge_data_merging
    self.assert_('second' in task.data)
AssertionError: False is not true

======================================================================
FAIL: testDeserializeWorkflowSpec (storage.OpenWfeXmlSerializerTest.OpenWfeXmlSerializerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\storage\OpenWfeXmlSerializerTest.py", line 38, in testDeserializeWorkflowSpec
    run_workflow(self, wf_spec, path, None)
  File "C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\util.py", line 110, in run_workflow
    test.assert_(taken_path == expected_path, error)
AssertionError: Expected:
Start
  concurrence_1
    task_a1
      task_a2
        if_condition_1
          task_a3
            if_condition_1_end
              if_condition_2
                task_a5
                  if_condition_2_end
    task_b1
      task_b2
        concurrence_1_end
          task_c1
            task_c2
              End

but got:
Start
  concurrence_1
    task_a1
      task_a2
        if_condition_1
          task_a3
            if_condition_1_end
              if_condition_2
                task_a5
                  if_condition_2_end
    task_b1
      task_b2
                    concurrence_1_end
                      task_c1
                        task_c2
                          End



----------------------------------------------------------------------
Ran 161 tests in 12.789s

FAILED (failures=2, errors=3)

Process finished with exit code 1

BPMN modeler

Hello! Which GUI BPMN editor can I use to create diagrams to load into SpiffWorkflow? Bizagi Modeler 2.6 saves XML that is not compatible with SpiffWorkflow.

Any real world usage example I can use to inspire myself?

Hi,

I've been taking a look at Spiffworkflow for a project I'm working on and it seems it could perfectly fit the bill, since I need to develop a workflow system where a process may have several steps, several people are involved in this process and have different permissions to execute certain steps, etc.

However, I'm having trouble finding real usages of this library, and how to bind each task to Python code or functions (e.g. notify a user by email once a step has been completed), how to decide which person is allowed to execute a certain step in a workflow, etc. Is there any complex implementation I could see as a source of inspiration?

Thank you and kind regards.

bug in ThreadSplit _find_my_task?

In ThreadSplit.py, in _find_my_task there is a reference to my_task that doesn't seem to exist. It's used by _on_trigger, which would then seem to have to fail with a NameError. What's going on here?

Use base64 for serialization instead of UU

I came across a bug in the serialization code (max 45 bytes for https://github.com/knipknap/SpiffWorkflow/blob/master/SpiffWorkflow/storage/DictionarySerializer.py#L15), and as I was researching it I came across this article: http://effbot.org/librarybook/uu.htm. It suggests that uu is deprecated in favor of base64 encoding.

While I implemented a temporary fix in #10, I did not want to replace the serialization code until I got further input and also got a sense of how many people might be using SpiffWorkflow and how much backwards compatibility we need to build in.
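
For context, the 45-byte ceiling comes from uu encoding itself: binascii.b2a_uu refuses chunks longer than 45 bytes, while base64 has no such limit. A quick standard-library illustration:

import base64
import binascii
import pickle

payload = pickle.dumps({'key': 'x' * 100})   # comfortably over 45 bytes

encoded = base64.b64encode(payload)          # works at any length
assert base64.b64decode(encoded) == payload

try:
    binascii.b2a_uu(payload)                 # the limit the serializer ran into
except binascii.Error as exc:
    print(exc)                               # "At most 45 bytes at once"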

What exactly does 'general' or 'president' do in tutorial/nuclear.json?

Hello!

I got confused after reading the tutorial. The problem is that I can't find the function declaration of general or president in nuclear.json, so I don't know what exactly general or president does. When the workflow reached the president task, it just completed; I couldn't find where president is executed or how the confirmation attribute came about.

Actually, I want to implement several custom tasks in a workflow so that my tasks run in a programmed sequence, like:

{
    "task_specs": {
        "Start": {
            "class": "SpiffWorkflow.specs.StartTask.StartTask",
            "manual": false,
            "outputs": [
                "createEC2"
            ]
        },
        "custom-task1": {
            "class": "tasks.custom.CustomTask1",
            "name": "task1",
            "inputs": [
                "Start"
            ],
            "outputs": [
                "task2"
            ]
        },
       "custom-task2": {
            "class": "tasks.custom.CustomTask2",
            "name": "task2",
            "inputs": [
                "task1"
            ]
        },
        "workflow_aborted": {
            "class": "SpiffWorkflow.specs.Cancel.Cancel",
            "name": "workflow_aborted",
            "inputs": [
                "task1",
                "task2"
            ]
        }
    },
    "description": "",
    "file": null,
    "name": ""
}

So I have to understand how to write a custom task like NuclearStrike.
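
A hedged sketch of what such a custom task can look like, assuming the older TaskSpec API in which subclasses override _on_complete_hook (newer releases rename these hooks); the dotted path in the JSON 'class' field, e.g. tasks.custom.CustomTask1 above, just has to resolve to this class:

from SpiffWorkflow.specs import Simple

class CustomTask1(Simple):
    """Referenced from the JSON above as "tasks.custom.CustomTask1"."""

    def _on_complete_hook(self, my_task):
        # Do the real work here (create the EC2 instance, call an API, ...);
        # my_task.data holds the task-local data dictionary.
        print('task1 running with data:', my_task.data)
        return super()._on_complete_hook(my_task)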

Add Sphinx-based documentation

The current documentation is incomplete. To complete it, a better documentation mechanism like Sphinx would be helpful.

Workflow global data

Hi,

I'm asking if it's possible to set a value at the workflow level, so I can set it once (when it's available) and then use it in multiple tasks. Is that possible?

I read that there is a get_data in the Workflow class, but I'm not able to find out how to set this data.

Thanks
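
A hedged sketch of one way to do this, assuming the Workflow object keeps its shared values in a plain data dictionary alongside get_data (there is no set_data counterpart on Workflow in the releases these issues refer to):

from SpiffWorkflow import Workflow
from SpiffWorkflow.specs import WorkflowSpec, Simple

spec = WorkflowSpec(name='global-data-demo')
spec.start.connect(Simple(spec, 'step'))
workflow = Workflow(spec)

# Set the value once, at workflow level ...
workflow.data['customer_id'] = 42

# ... and read it from anywhere that has the workflow (or one of its tasks) in hand.
print(workflow.get_data('customer_id'))
for task in workflow.get_tasks():
    print(task.workflow.get_data('customer_id'))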

Linking my class to a <task> xml node

Hi,

Can someone please provide an example of how to link a Simple task derivative to the <task> node? All <task> nodes are by default treated as the "Simple" type. What if I want my own derivative of Simple as the task spec instead? The NuclearStrike example is very clear for JSON; is there a similar one for XML? I searched all over the internet and can't find a clear example for XML.

For the JSON example, I can see where this binding happens using the 'class' attribute - in dict.py
task_spec_cls = get_class(task_spec_state['class'])
task_spec = task_spec_cls.deserialize(self, spec, task_spec_state)

Is there a similar thing for XML? I tried providing a class attribute on the task node, but it obviously didn't work because there is no code in the XML serializer consuming and acting on that.

Is SpiffWorkflow suitable for an equipment repair-request approval workflow?

We have the following application scenario:
1. A user has equipment that needs repair, fills in the defect information, and submits it to the equipment management department.
2. After receiving the defect information, the equipment management department evaluates the defect, fills in an estimated cost, and submits it to the department leader for approval.
3. After the leader approves, technicians organize the repair of the equipment and report back the repair results.
4. The user receives the repair result feedback, accepts the equipment, and gives an evaluation.

Error handling

When an error occurs in a script task, the workflow is aborted. Is there a more subtle way to handle exceptions within Spiff?
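
One hedged workaround is to trap the engine-level exception around the run call yourself; a minimal sketch, assuming WorkflowException from SpiffWorkflow.exceptions is the base class raised when a task fails (check the exact class in your release):

from SpiffWorkflow.exceptions import WorkflowException

def run_safely(workflow):
    """Run the engine but keep control when a script task blows up."""
    try:
        workflow.complete_all()
    except WorkflowException as exc:
        # Log, compensate, or surface the error instead of losing the run.
        print('workflow stopped:', exc)
        return False
    return True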

Comparing pickles in tests depends on implementation-specific details and is unreliable

For fun I ran the test suite on PyPy. It all works, up till the point where it compares two pickles:

Exception: task_specs/go_to_repetition/pre_assign/[0]/Unequal: 'gAJjU3BpZmZXb3JrZmxvdy5vcGVyYXRvcnMKQXNzaWduCnEAKYFxAX1xAihVBXJpZ2h0cQNYAQAAADBxBFUPcmlnaHRfYXR0cmlidXRlcQVOVQ5sZWZ0X2F0dHJpYnV0ZXEGWAYAAAByZXBlYXRxB3ViLg==' vs 'gAJjU3BpZmZXb3JrZmxvdy5vcGVyYXRvcnMKQXNzaWduCnEAKYFxAX1xAihVDmxlZnRfYXR0cmlidXRlcQNYBgAAAHJlcGVhdHEEVQVyaWdodHEFWAEAAAAwcQZVD3JpZ2h0X2F0dHJpYnV0ZXEHTnViLg=='

If you base64 decode the string, you see that the pickles are the same, just that they have keys in a different order.

I initially thought this was a PyPy bug and reported it: https://bugs.pypy.org/issue1693
However, it turns out that even in regular python it's not safe to rely on the ordering.

I'm not sure what the best way to fix this is.

  • At DigiACTive we're trying to move away from pickles entirely - see the PureDictionarySerializer branch (it's not quite ready for a pull request yet but I'll get there eventually) - one option would be to go with that and just do away with comparing pickles entirely.
  • Another option is to rewrite the tests to unpickle the objects and compare them - that would require writing some __eq__ functions for Assign, etc.

I don't understand the rationale for pickling things, so I'd lean towards the first solution, but I'm fully open to being convinced about the necessity of pickles.
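
For the second option, a minimal sketch of an order-insensitive comparison that unpickles both sides and compares attribute dictionaries rather than raw bytes (illustrative only; the real fix would give Assign and friends a proper __eq__):

import base64
import pickle

def pickles_equal(a_b64, b_b64):
    """Compare two base64-encoded pickles by value, not by byte layout."""
    a = pickle.loads(base64.b64decode(a_b64))
    b = pickle.loads(base64.b64decode(b_b64))
    return type(a) is type(b) and a.__dict__ == b.__dict__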

Run all non manual tasks

From the tutorial:

# Execute until all tasks are done or require manual intervention.
workflow.complete_all()

This will run all tasks using default choices, not respecting task spec definitions flagged as manual. How does one "require manual intervention"?
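
A hedged sketch of driving the READY list yourself and skipping specs flagged manual, using the same get_tasks/Task.READY calls that appear in the test snippets further down this page (in older releases Task is importable from the top-level package; newer ones moved the state flags):

from SpiffWorkflow import Task  # newer releases: from SpiffWorkflow.task import Task

def run_until_user_input_required(workflow):
    """Complete READY tasks until only manual ones (or none) remain."""
    while True:
        ready = [t for t in workflow.get_tasks(Task.READY)
                 if not t.task_spec.manual]
        if not ready:
            break
        for task in ready:
            task.complete()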

ImportError: No module named SpiffSignal

Hi all,

I would like to try this lib, but obviously it's not working. When I simply do, in Python:

from SpiffWorkflow import Workflow

I get the error:

Traceback (most recent call last):
File "", line 1, in
File "/usr/local/lib/python2.7/dist-packages/SpiffWorkflow/init.py", line 1, in
from Job import Job
File "/usr/local/lib/python2.7/dist-packages/SpiffWorkflow/Job.py", line 16, in
import Tasks
File "/usr/local/lib/python2.7/dist-packages/SpiffWorkflow/Tasks/init.py", line 1, in
from AcquireMutex import AcquireMutex
File "/usr/local/lib/python2.7/dist-packages/SpiffWorkflow/Tasks/AcquireMutex.py", line 18, in
from TaskSpec import TaskSpec
File "/usr/local/lib/python2.7/dist-packages/SpiffWorkflow/Tasks/TaskSpec.py", line 16, in
from SpiffSignal import Trackable
ImportError: No module named SpiffSignal

Obviously a module is missing. I was not able to find it in the install directory.
Any clue why it disappeared?

Cheers

Crash while compiling with cxFreeze, Py2Exe and PyInstaller

When compiling an application that uses SpiffWorkflow into an exe, the resulting exe crashes with "ImportError: No module named UserList".

The result is the same for all 3 tools to compile the application.

I have traced this back to the statements in the Packager under BPMN:

from future import standard_library
standard_library.install_aliases()

As far as I understand, this is only needed to allow configparser to be found in Python 2 and 3. However, in the latest version of 2.7 this has been taken care of. So I wonder if these lines can be safely removed?

How to customize a task following BPMN 2.0?

I am going to implement a workflow following BPMN 2.0, but I don't know how to customize a task so that it executes the job I want. Which task should I override: scriptTask, callActivity, or something else? I am not sure what the correct way to implement this is. Would you please help me? Thank you so much.

doubt on tasks

Hello,
How would one connect a Python function to a task, for example in workflow1.py?
Or how would one get the task to do some processing?
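
A hedged sketch of one classic answer: connect a Python callback to the task spec's completed_event signal, the same mechanism the NuclearStrike tutorial uses (the callback signature follows the emit(workflow, task) call in the older TaskSpec code):

from SpiffWorkflow import Workflow
from SpiffWorkflow.specs import WorkflowSpec, Simple

def notify_done(workflow, task):
    # Runs when 'do_work' completes; the task data is available here.
    print('finished:', task.get_name())

spec = WorkflowSpec(name='callback-demo')
do_work = Simple(spec, 'do_work')
spec.start.connect(do_work)

# Attach the Python function to the spec's completion signal.
do_work.completed_event.connect(notify_done)

Workflow(spec).complete_all()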

bare Excepts can catch Exception instead?

Is it really necessary to have bare except statements in the codebase, or could we replace them with catching Exception? They wouldn't catch KeyboardInterrupt that way, but that's a rather unusual case anyway.

BPMN engine documentation

I recently added a ton of documentation for most parts of SpiffWorkflow (and ported the module to Py3). The only missing part is documentation for the BPMN layer, which I am now trying to add.

@matthewhampton can you help out with the following questions?:

  • What is the purpose of the BpmnWorkflow.accept_message() method?
  • Do you have a minimal example of how to run a BPMN workflow?
  • Why did you add BpmnWorkflow.do_engine_steps() in addition to the already inherited BpmnWorkflow.complete_all()? Is this actually needed?

SpiffSignal

The SpiffSignal module is missing from the repository.

It is required for running the tests but is missing:

eugen@lps-dv2700:~/Work/Helios-Python/temp/knipknap-SpiffWorkflow-813cc2a/tests/SpiffWorkflow$ python XmlReaderTest.py
Traceback (most recent call last):
File "XmlReaderTest.py", line 4, in
from WorkflowTest import WorkflowTest
File "/home/eugen/Work/Helios-Python/temp/knipknap-SpiffWorkflow-813cc2a/tests/SpiffWorkflow/WorkflowTest.py", line 4, in
from SpiffWorkflow import Workflow, Job
File "/home/eugen/Work/Helios-Python/temp/knipknap-SpiffWorkflow-813cc2a/tests/SpiffWorkflow/../../src/SpiffWorkflow/init.py", line 1, in
from Job import Job
File "/home/eugen/Work/Helios-Python/temp/knipknap-SpiffWorkflow-813cc2a/tests/SpiffWorkflow/../../src/SpiffWorkflow/Job.py", line 16, in
import Tasks
File "/home/eugen/Work/Helios-Python/temp/knipknap-SpiffWorkflow-813cc2a/tests/SpiffWorkflow/../../src/SpiffWorkflow/Tasks/init.py", line 1, in
from AcquireMutex import AcquireMutex
File "/home/eugen/Work/Helios-Python/temp/knipknap-SpiffWorkflow-813cc2a/tests/SpiffWorkflow/../../src/SpiffWorkflow/Tasks/AcquireMutex.py", line 18, in
from TaskSpec import TaskSpec
File "/home/eugen/Work/Helios-Python/temp/knipknap-SpiffWorkflow-813cc2a/tests/SpiffWorkflow/../../src/SpiffWorkflow/Tasks/TaskSpec.py", line 16, in
from SpiffWorkflow.external.SpiffSignal import Trackable
File "/home/eugen/Work/Helios-Python/temp/knipknap-SpiffWorkflow-813cc2a/tests/SpiffWorkflow/../../src/SpiffWorkflow/external/init.py", line 5, in
import SpiffSignal

wireit

In the wireit folder, you forgot to include the build folder.

Documentation on XML for bpmn

Hi Guys,
We are trying to implement SpiffWorkflow in an exciting new project. However, the documentation of the XML format lacks examples, and the examples I do find do not work.

According to the documentation (and examples I found) the following should work:

with open(flow_def, 'r') as xmlflow: 
    xml_data = xmlflow.read() 
    spec = WorkflowSpec.deserialize(XmlSerializer(), xml_data, filename=flow_def)

However using this code results in an

    E       AttributeError: 'str' object has no attribute 'findtext'

Looking at the statement that is executed, this can never work because it expects an lxml object.
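
For comparison, the internal (non-BPMN) XML serializer is driven directly in the test snippet further down this page; a hedged sketch of that route, assuming the same XmlSerializer class (module path and accepted input type vary by release):

from SpiffWorkflow import Workflow
from SpiffWorkflow.storage import XmlSerializer  # newer releases moved the serializers

flow_def = 'my_workflow.xml'  # placeholder path to the XML definition

with open(flow_def) as xmlflow:
    xml_data = xmlflow.read()

serializer = XmlSerializer()
spec = serializer.deserialize_workflow_spec(xml_data)
workflow = Workflow(spec)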

exclusive-choice conditional-successor successor is join run empty task

After running the test code, I expect to get the current task collect_change, but the task list is now empty!

test code:

def test_swprocess(self):
    serializer = XmlSerializer()
    xml_file = 'tests/docs/swprocess.xml'
    xml_data = open(xml_file).read()
    spec = serializer.deserialize_workflow_spec(xml_data)
    wf = Workflow(spec)
    tasks = wf.get_tasks(Task.READY)
    task = tasks[0]
    wf.complete_task_from_id(task.id)
    # start
    tasks = wf.get_tasks(Task.READY)
    print('tasks ...')
    print(tasks)
    task = tasks[0]
    task.set_data(**{'process_type': 'sw_optimization'})
    wf.complete_task_from_id(task.id)
    # start
    tasks = wf.get_tasks(Task.READY)
    print('tasks 1...')
    print(tasks)
    task = tasks[0]
    self.assertEqual('internal_sw_optimization', task.get_name())
    # internal_sw_optimization
    wf.complete_task_from_id(task.id)
    tasks = wf.get_tasks(Task.READY)
    task = tasks[0]
    self.assertEqual('code_implementation', task.get_name())
    # code_implementation
    wf.complete_task_from_id(task.id)
    tasks = wf.get_tasks(Task.READY)
    task = tasks[0]
    self.assertEqual('project_build', task.get_name())
    # project_build
    wf.complete_task_from_id(task.id)
    tasks = wf.get_tasks(Task.READY)
    task = tasks[0]
    self.assertEqual('pre_validation', task.get_name())
    # pre_validation
    wf.complete_task_from_id(task.id)
    tasks = wf.get_tasks(Task.READY)
    task = tasks[0]
    self.assertEqual('collect_change', task.get_name())
    # collect_change
    wf.complete_task_from_id(task.id)
    tasks = wf.get_tasks(Task.READY)
    task = tasks[0]
    self.assertEqual('code_review_meeting', task.get_name())
    # code_review_meeting
    wf.complete_task_from_id(task.id)
    tasks = wf.get_tasks(Task.READY)
    task = tasks[0]
    self.assertEqual('sw_change_class', task.get_name())
    # sw_change_class
    wf.complete_task_from_id(task.id)
    tasks = wf.get_tasks(Task.READY)
    task = tasks[0]
    self.assertEqual('review_code_issues', task.get_name())
    # review_code_issues
    task.set_data(**{'review_result': 'N'})
    wf.complete_task_from_id(task.id)
    tasks = wf.get_tasks(Task.READY)
    print(tasks)
    task = tasks[0]
    self.assertEqual('collect_change', task.get_name())

swprocess.xml:

<?xml version="1.0" encoding="UTF-8"?> 
<process-definition name="flow" revision="1.6">
    <description>CEP Software Release Process</description>

    <!-- Start with an implicit simple split. -->
    <start-task>
        <successor>process_type</successor>
    </start-task>



    <multi-choice name="process_type">
        <description>Select Process Type</description>
        <!--                        -->
        <!-- Implementation Process -->
        <!--                        -->
        <!-- New product / New test process -->
        <conditional-successor>
            <equals left-field="process_type" right-value="new_product" />
            <successor>test_plan_creation</successor>
        </conditional-successor>
        <conditional-successor>
            <equals left-field="process_type" right-value="new_product" />
            <successor>validation_plan_definition</successor>
        </conditional-successor>
        <conditional-successor>
            <equals left-field="process_type" right-value="new_product" />
            <successor>repository_creation</successor>
        </conditional-successor>

        <!-- New software project -->
        <conditional-successor>
            <equals left-field="process_type" right-value="new_sw_project" />
            <successor>repository_creation</successor>
        </conditional-successor>

        <!-- Equipment change / Process change / New model or model change -->
        <conditional-successor>
            <equals left-field="process_type" right-value="ecn_pcn" />
            <successor>process_env_change</successor>
        </conditional-successor>

        <!-- Idea for software optimization -->
        <conditional-successor>
            <equals left-field="process_type" right-value="sw_optimization" />
            <successor>internal_sw_optimization</successor>
        </conditional-successor>
    </multi-choice>

    <multi-choice name="process_env_change">
        <description>Process environment change</description>
        <!-- New product / New test process -->
        <conditional-successor>
            <equals left-field="need_test_plan" right-value="True" />
            <successor>test_plan_creation</successor>
        </conditional-successor>
        <conditional-successor>
            <equals left-field="need_validation_plan" right-value="True" />
            <successor>validation_plan_definition</successor>
        </conditional-successor>
        <conditional-successor>
            <equals left-field="need_code_change" right-value="True" />
            <successor>code_implementation</successor>
        </conditional-successor>
    </multi-choice>

    <task name="repository_creation">
        <description>Repository creation</description>
        <successor>code_implementation</successor>
    </task>

    <task name="test_plan_creation">
        <description>Test plan creation</description>
        <successor>collect_change</successor>
    </task>

    <task name="validation_plan_definition">
        <description>Validation plan definition</description>
        <successor>collect_change</successor>
    </task>

    <task name="internal_sw_optimization">
        <description>Internal software optimization</description>
        <successor>code_implementation</successor>
    </task>

    <task name="code_implementation">
        <description>Code implementation</description>
        <successor>project_build</successor>
    </task>

    <task name="project_build">
        <description>Project build</description>
        <successor>pre_validation</successor>
    </task>

    <task name="pre_validation">
        <description>Pre-Validation</description>
        <successor>collect_change</successor>
    </task>

    <!--                        -->
    <!-- Validation Process     -->
    <!--                        -->
    <join name="collect_change" context="process_type" threshold="1">
        <description>Collect Changes</description>
        <successor>code_review_meeting</successor>
    </join>


    <task name="code_review_meeting">
        <description>Code Review Meeting</description>
        <successor>sw_change_class</successor>
    </task>
    <task name="sw_change_class">
        <description>Software Change Classification</description>
        <successor>review_code_issues</successor>
    </task>

    <exclusive-choice name="review_code_issues">
        <description>Review code issues</description>
        <default-successor>selection_tests</default-successor>
        <conditional-successor>
            <equals left-field="review_result" right-value="Y" />
            <successor>selection_tests</successor>
        </conditional-successor>
        <conditional-successor>
            <equals left-field="review_result" right-value="N" />
            <successor>collect_change</successor>
        </conditional-successor>
    </exclusive-choice>

    

    <task name="selection_tests">
        <description>Selection of Tests</description>
        <successor>run_measurements</successor>
    </task>

    <task name="run_measurements">
        <description>Run Measurements</description>
        <successor>evaluate_measurements</successor>
    </task>

    <task name="evaluate_measurements">
        <description>Evaluate Measurements</description>
        <successor>document_collection</successor>
    </task>

    <task name="document_collection">
        <description>Document Collection</description>
        <successor>sw_compliance_approval</successor>
    </task>

    <exclusive-choice name="sw_compliance_approval">
        <description>Software Compliance Approval</description>
        <default-successor>update_sw_release_table</default-successor>
        <conditional-successor>
            <equals left-field="approval_result" right-value="Y" />
            <successor>update_sw_release_table</successor>
        </conditional-successor>
        <conditional-successor>
            <equals left-field="approval_result" right-value="M" />
            <successor>document_collection</successor>
        </conditional-successor>
        <conditional-successor>
            <equals left-field="approval_result" right-value="N" />
            <successor>collect_change</successor>
        </conditional-successor>
    </exclusive-choice>

    <task name="update_sw_release_table">
        <description>Update Software Release Table</description>
        <successor>end</successor>
    </task>

</process-definition>

CompactWorkflowSerializer incorrectly deserializes state of Parallel Gateway

I have a simple BPMN 2.0 workflow with a parallel split, two tasks running in parallel and a join - see attached.

I progress my workflow to a state where both tasks are complete, and my Join Parallel Gateway is set to READY state.

If I serialize the workflow at this stage and then deserialize it back, CompactWorkflowSerializer sets the state of the Join Parallel Gateway to WAITING.

I am attaching a BPMN 2.0 diagram and a code snippet that can be used to reproduce the issue.

incorrect-deserialization.zip

Join can't be written by hand to JSON

Hello, the deserializer expects threshold to be serialized with pickle and then base64-encoded. This is something that is nearly impossible to write by hand; the only possibility is to generate this node via serialization.
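
For anyone who still needs to hand-craft such a node, the value the issue describes can be produced with the standard library; a minimal sketch (for comparison, 'gAJLAS4=' in the test output earlier on this page is exactly base64.b64encode(pickle.dumps(1, protocol=2))):

import base64
import pickle

def encode_field(value):
    """Produce the base64-wrapped pickle the dictionary deserializer expects."""
    return base64.b64encode(pickle.dumps(value, protocol=2)).decode('ascii')

print(encode_field(1))  # gAJLAS4= - compare the values in the serializer test failures above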

Are Input/Output parameters supported?

Hi,

I am wondering if input/output parameters are supported (I am using Camunda Modeler).

I successfully used a Script Task which does "task.set_data(found='yes')", and then I have an exclusive gateway that has two outgoing flows with the expressions "found=='yes'" and "found!='yes'".

But couldn't this be done using input/output parameters? What is the use case for them?
PizzaSimpleWithScriptTaskAndCondition.bpmn.txt

Timer events or Tasks

I am struggling to find documentation on how to use timer events and/or tasks. I need them for escalations and notifications when users do not act on tasks within a certain time period.

Where can I find directions on how this works, or pointers on how to implement such behaviour?
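
For BPMN timer events specifically, a hedged sketch of the usual polling loop, assuming BpmnWorkflow exposes do_engine_steps() (mentioned in the BPMN documentation issue above) and a refresh_waiting_tasks() call that re-evaluates WAITING timers; method names may differ in your release:

import time

def run_with_timers(workflow, poll_seconds=1.0):
    """Advance a BpmnWorkflow, letting WAITING timer events fire as time passes."""
    while not workflow.is_completed():
        workflow.refresh_waiting_tasks()  # assumption: re-checks timer/waiting tasks
        workflow.do_engine_steps()
        time.sleep(poll_seconds)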

PatternTest.testPattern is failing

It is failing with the following callstack:

Traceback (most recent call last):
  File "/home/fcorreia/src/spiffworkflow/tests/SpiffWorkflow/PatternTest.py", line 80, in testPattern
    for filename in os.listdir(dirname):
OSError: [Errno 2] No such file or directory: '/home/fcorreia/src/spiffworkflow/tests/SpiffWorkflow/xml/spiff/resource/'

It happens because the test is looking for .xml files in a "resource" directory that does not exist. I've fixed it on my end by ignoring that directory, but it might just have been forgotten in the previous commits.
