sartography / spiffworkflow
A powerful workflow engine implemented in pure Python
License: GNU Lesser General Public License v3.0
Hello,
How would one connect a Python function to a task, for example in workflow1.py?
Or how would one get the task to do some processing?
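SpiffWorkflow task specs let you subscribe callbacks to task events; the snippet below is a self-contained sketch of that observer pattern with stand-in Event and Task classes (they are not SpiffWorkflow's real API — check the library's TaskSpec documentation for the actual event names and callback signatures):

```python
class Event:
    """A tiny signal object: callbacks subscribe, then get called on emit()."""
    def __init__(self):
        self._callbacks = []

    def connect(self, callback):
        self._callbacks.append(callback)

    def emit(self, *args):
        for callback in self._callbacks:
            callback(*args)

class Task:
    """Stand-in for a task spec that fires an event when it completes."""
    def __init__(self, name):
        self.name = name
        self.completed_event = Event()

    def complete(self):
        # ... the task's real work would happen here ...
        self.completed_event.emit(self)

log = []
task = Task('task_a1')
task.completed_event.connect(lambda t: log.append(t.name))
task.complete()
assert log == ['task_a1']
```

The same idea applies to the real library: connect your Python function to the task spec's completion event before running the workflow, and it will be invoked with the task as an argument.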
PEP 8 calls for 80-character lines, but I noticed some files have much longer lines, in particular some of the BPMN code. Would it be okay to enforce 80-character lines throughout the codebase?
I am using the Linux operating system to run the project. Could you explain how to run it from the command line?
Thank you for your work.
While reading your code, I was confused by the code style. For example:
nodetype = start_node.nodeName.lower()
name = start_node.getAttribute('name').lower()
context = start_node.getAttribute('context').lower()
mutex = start_node.getAttribute('mutex').lower()
cancel = start_node.getAttribute('cancel').lower()
success = start_node.getAttribute('success').lower()
times = start_node.getAttribute('times').lower()
times_field = start_node.getAttribute('times-field').lower()
threshold = start_node.getAttribute('threshold').lower()
threshold_field = start_node.getAttribute('threshold-field').lower()
file = start_node.getAttribute('file').lower()
file_field = start_node.getAttribute('file-field').lower()
see http://www.python.org/dev/peps/pep-0008/#id16
Also, instead of
type(context) != type([])
just use
isinstance(context, list)
and instead of
_spec_map.has_key(child_node.nodeName.lower())
just use
child_node.nodeName.lower() in _spec_map
Thank you.
I came across a bug in the serialization code (max 45 bytes, https://github.com/knipknap/SpiffWorkflow/blob/master/SpiffWorkflow/storage/DictionarySerializer.py#L15), and while researching it I came across this article: http://effbot.org/librarybook/uu.htm. It suggests that uu is deprecated in favor of base64 encoding.
While I implemented a temporary fix in #10, I did not want to replace the serialization code until I got further input, and also got a sense of how many people might be using SpiffWorkflow and how much backward compatibility we need to build in.
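For comparison, base64 from the standard library round-trips arbitrary-length binary data without uu's line-size limit; a minimal sketch:

```python
import base64
import pickle

# Pickle an arbitrary value, then base64-encode the bytes; unlike uu,
# base64 has no 45-byte-per-line input limit and round-trips cleanly.
value = {'synch_1_reached': 2}
encoded = base64.b64encode(pickle.dumps(value))   # ASCII-safe bytes
decoded = pickle.loads(base64.b64decode(encoded))
assert decoded == value
```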
I recently added a ton of documentation for most parts of SpiffWorkflow (and ported the module to Py3). The only missing part is documentation for the BPMN layer, which I am now trying to add.
@matthewhampton, can you help out with the following questions?
Hi,
I'm asking whether it's possible to set a value at the workflow level, so I can set the value once (when it's available) and then use it in multiple tasks. Is that possible?
I read there is a get_data method in the Workflow class, but I'm not able to find how to set this data.
thanks
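One common pattern (a sketch with a stand-in Workflow class; in SpiffWorkflow itself, Workflow.get_data() reads from the workflow's data dict, so writing into that dict is the likely counterpart — verify against the library's source) is to set the value once in the workflow-level dict and read it from any task:

```python
class Workflow:
    """Stand-in illustrating workflow-level data shared across tasks."""
    def __init__(self):
        self.data = {}

    def get_data(self, name, default=None):
        return self.data.get(name, default)

wf = Workflow()
wf.data['customer_id'] = 42                 # set once, at workflow level
assert wf.get_data('customer_id') == 42     # readable later from any task
assert wf.get_data('missing', 'n/a') == 'n/a'
```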
Hi,
this library does not have a DMN engine (http://demo.bpmn.io/dmn or https://camunda.org/dmn/simulator/), right?
I searched a lot and there is no DMN decoder/engine for Python... Am I missing something?
I just found this: https://github.com/jan-klos/dmn_python but that library does not seem to be up to date, and it is not compatible with Camunda's XML files.
Thanks
Denny
When compiling an application that uses SpiffWorkflow into an exe, the resulting exe crashes with "ImportError: No module named UserList".
The result is the same for all three tools I used to compile the application.
I have traced this back to these statements in the BPMN Packager:
from future import standard_library
standard_library.install_aliases()
As far as I understand, this is only needed so that configparser can be found on both Python 2 and 3. However, the last versions of 2.7 have taken care of this. So I wonder if these lines can be safely removed?
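If the only goal is a configparser name that resolves on both Python 2 and 3, a plain try/except import would avoid the future dependency entirely (a sketch; whether the Packager relies on other aliases from install_aliases() should be verified before removing those lines):

```python
try:
    import configparser                       # Python 3 module name
except ImportError:
    import ConfigParser as configparser       # Python 2 module name

parser = configparser.ConfigParser()
parser.read_string('[section]\nkey = value\n')   # read_string() is Python 3 only
assert parser.get('section', 'key') == 'value'
```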
There is a small typo in SpiffWorkflow/specs/Celery.py.
It should read stored rather than storred.
thanks!
In the event that a task is waiting for a response from a human, can the running workflow be saved and then resumed once the task has been executed? I don't see any documentation on that, or I must have missed it.
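Yes, the usual pattern is serialize, store, resume: when the workflow blocks on a human task, serialize its state, persist it, and deserialize later to continue. SpiffWorkflow ships serializer classes for its workflows; the sketch below only illustrates the idea with a plain JSON state dict, not the library's own format:

```python
import json

# Toy workflow state: which tasks are done and which one awaits a human.
state = {'completed': ['start', 'prepare'], 'waiting': 'human_approval'}

saved = json.dumps(state)          # persist this string in a DB or file

# ... later, once the human has responded ...
resumed = json.loads(saved)
assert resumed['waiting'] == 'human_approval'
resumed['completed'].append(resumed.pop('waiting'))  # mark the task done
assert 'human_approval' in resumed['completed']
```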
I have a simple BPMN 2.0 workflow with a parallel split, two tasks running in parallel and a join - see attached.
I progress my workflow to a state where both tasks are complete, and my Join Parallel Gateway is set to READY state.
If I serialize the workflow at this stage and then deserialize it back, CompactWorkflowSerializer sets the state of the Join Parallel Gateway to WAITING.
I am attaching a BPMN 2.0 diagram and a code snippet that can be used to reproduce the issue.
tests/SpiffWorkflow/run_suite.py, line 72 also needed fixing:
name = name.lstrip('.').lstrip(os.sep).replace(os.sep, '.')
Traceback:
C:\ACME\Dev\projects\SpiffWorkflow-master\ve\Scripts\python.exe C:/ACME/Dev/projects/SpiffWorkflow-master_GITHUB2/tests/SpiffWorkflow/run_suite.py
WARNING: Celery not found, not all tests are running!
testPattern (PatternTest.PatternTest) ... C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\acyclic_synchronizing_merge.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\arbitrary_cycles.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\blocking_discriminator.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\blocking_partial_join.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\cancelling_discriminator.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\cancelling_partial_join.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\cancelling_partial_join_for_multi_instance.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\cancel_case.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\cancel_multi_instance_task.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\cancel_region.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\cancel_task.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\complete_multiple_instance_activity.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\critical_section.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\deferred_choice.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\dynamic_partial_join_for_multi_instance.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\exclusive_choice.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\explicit_termination.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\generalized_and_join.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\general_synchronizing_merge.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\implicit_termination.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\interleaved_parallel_routing.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\interleaved_routing.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\milestone.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\multi_choice.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\multi_instance_without_a_priori.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\multi_instance_without_synch.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\multi_instance_with_a_priori_design_time_knowledge.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\multi_instance_with_a_priori_run_time_knowledge.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\multi_merge.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\parallel_split.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\persistent_trigger.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\recursion.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\sequence.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\simple_merge.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\static_partial_join_for_multi_instance.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\structured_discriminator.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\structured_partial_join.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\structured_synchronizing_merge.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\subworkflow_to_join.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\subworkflow_to_join_inner.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\synchronization.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\thread_merge.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\thread_split.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\transient_trigger.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/data\block_data.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/data\block_data_inner.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/data\block_to_subworkflow.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/data\block_to_subworkflow_inner.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/data\subworkflow_to_block.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/data\subworkflow_to_block_inner.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/data\task_data.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/data\task_to_task.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff\workflow1.py
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff\workflow1.xml
ok
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff\__init__.py
testDeserialization (PersistSmallWorkflowTest.PersistSmallWorkflowTest) ... ok
testDictionarySerializer (PersistSmallWorkflowTest.PersistSmallWorkflowTest) ... ok
testTree (TaskTest.TaskTest) ... ok
testBeginWorkflowStepByStep (WorkflowTest.WorkflowTest) ... ok
testConstructor (WorkflowTest.WorkflowTest) ... ok
testRunThroughCancel (bpmn.ActionManagementTest.ActionManagementTest) ... ok
testRunThroughCancelAfterApproved (bpmn.ActionManagementTest.ActionManagementTest) ... ok
testRunThroughCancelAfterWorkStarted (bpmn.ActionManagementTest.ActionManagementTest) ... ok
testRunThroughHappy (bpmn.ActionManagementTest.ActionManagementTest) ... ok
testRunThroughOverdue (bpmn.ActionManagementTest.ActionManagementTest) ... ok
testReadonlyWaiting (bpmn.ApprovalsTest.ApprovalsTest) ... ok
testRunThroughHappy (bpmn.ApprovalsTest.ApprovalsTest) ... ok
testRunThroughHappyOtherOrders (bpmn.ApprovalsTest.ApprovalsTest) ... ok
testSaveRestore (bpmn.ApprovalsTest.ApprovalsTest) ... ok
testSaveRestoreWaiting (bpmn.ApprovalsTest.ApprovalsTest) ... ok
testDisconnectedBoundaryEvent (bpmn.InvalidWorkflowsTest.InvalidWorkflowsTest) ... ok
testMultipleStartEvents (bpmn.InvalidWorkflowsTest.InvalidWorkflowsTest) ... ok
testNoStartEvent (bpmn.InvalidWorkflowsTest.InvalidWorkflowsTest) ... ok
testRecursiveSubprocesses (bpmn.InvalidWorkflowsTest.InvalidWorkflowsTest) ... ERROR:SpiffWorkflow.bpmn.parser.TaskParser:NotImplementedError('Recursive call Activities are not supported.',)
Traceback (most recent call last):
File "C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\..\..\SpiffWorkflow\bpmn\parser\TaskParser.py", line 59, in parse_node
self.task = self.create_task()
File "C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\..\..\SpiffWorkflow\bpmn\parser\task_parsers.py", line 110, in create_task
wf_spec = self.get_subprocess_parser().get_spec()
File "C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\..\..\SpiffWorkflow\bpmn\parser\ProcessParser.py", line 113, in get_spec
raise NotImplementedError('Recursive call Activities are not supported.')
NotImplementedError: Recursive call Activities are not supported.
ok
testSubprocessNotFound (bpmn.InvalidWorkflowsTest.InvalidWorkflowsTest) ... ok
testUnsupportedTask (bpmn.InvalidWorkflowsTest.InvalidWorkflowsTest) ... ok
testRunThroughHappySaveAndRestore (bpmn.MessageInterruptsSpTest.MessageInterruptsSpTest) ... ok
testRunThroughInterruptSaveAndRestore (bpmn.MessageInterruptsSpTest.MessageInterruptsSpTest) ... ok
testRunThroughHappy (bpmn.MessageInterruptsTest.MessageInterruptsTest) ... ok
testRunThroughHappySaveAndRestore (bpmn.MessageInterruptsTest.MessageInterruptsTest) ... ok
testRunThroughMessageInterrupt (bpmn.MessageInterruptsTest.MessageInterruptsTest) ... ok
testRunThroughMessageInterruptSaveAndRestore (bpmn.MessageInterruptsTest.MessageInterruptsTest) ... ok
testRunThroughHappySaveAndRestore (bpmn.MessageNonInterruptsSpTest.MessageNonInterruptsSpTest) ... ok
testRunThroughMessageOrder2SaveAndRestore (bpmn.MessageNonInterruptsSpTest.MessageNonInterruptsSpTest) ... ok
testRunThroughMessageOrder3SaveAndRestore (bpmn.MessageNonInterruptsSpTest.MessageNonInterruptsSpTest) ... ok
testRunThroughMessageSaveAndRestore (bpmn.MessageNonInterruptsSpTest.MessageNonInterruptsSpTest) ... ok
testRunThroughHappy (bpmn.MessageNonInterruptTest.MessageNonInterruptTest) ... ok
testRunThroughHappySaveAndRestore (bpmn.MessageNonInterruptTest.MessageNonInterruptTest) ... ok
testRunThroughMessageInterrupt (bpmn.MessageNonInterruptTest.MessageNonInterruptTest) ... ok
testRunThroughMessageInterruptOtherOrder (bpmn.MessageNonInterruptTest.MessageNonInterruptTest) ... ok
testRunThroughMessageInterruptOtherOrderSaveAndRestore (bpmn.MessageNonInterruptTest.MessageNonInterruptTest) ... ok
testRunThroughMessageInterruptSaveAndRestore (bpmn.MessageNonInterruptTest.MessageNonInterruptTest) ... ok
testRunThroughHappy (bpmn.MessagesTest.MessagesTest) ... ok
testRunThroughSaveAndRestore (bpmn.MessagesTest.MessagesTest) ... ok
testNoFirstThenThread1 (bpmn.ParallelTest.ParallelJoinLongInclusiveTest) ... ok
testRunThroughAlternating (bpmn.ParallelTest.ParallelJoinLongInclusiveTest) ... ok
testRunThroughThread1First (bpmn.ParallelTest.ParallelJoinLongInclusiveTest) ... ok
testRunThroughThread1FirstThenNo (bpmn.ParallelTest.ParallelJoinLongInclusiveTest) ... ok
testRunThroughAlternating (bpmn.ParallelTest.ParallelJoinLongTest) ... ok
testRunThroughThread1First (bpmn.ParallelTest.ParallelJoinLongTest) ... ok
test1 (bpmn.ParallelTest.ParallelLoopingAfterJoinTest) ... ok
test2 (bpmn.ParallelTest.ParallelLoopingAfterJoinTest) ... ok
test1 (bpmn.ParallelTest.ParallelManyThreadsAtSamePointTest) ... ok
test2 (bpmn.ParallelTest.ParallelManyThreadsAtSamePointTest) ... ok
test_breadth_first (bpmn.ParallelTest.ParallelManyThreadsAtSamePointTestNested) ... ok
test_depth_first (bpmn.ParallelTest.ParallelManyThreadsAtSamePointTestNested) ... ok
test1 (bpmn.ParallelTest.ParallelMultipleSplitsAndJoinsTest) ... ok
test2 (bpmn.ParallelTest.ParallelMultipleSplitsAndJoinsTest) ... ok
test3 (bpmn.ParallelTest.ParallelMultipleSplitsAndJoinsTest) ... ok
test4 (bpmn.ParallelTest.ParallelMultipleSplitsAndJoinsTest) ... ok
testRunThroughAlternating (bpmn.ParallelTest.ParallelMultipleSplitsTest) ... ok
testRunThroughChoiceFirst (bpmn.ParallelTest.ParallelOnePathEndsTest) ... ok
testRunThroughParallelTaskFirst (bpmn.ParallelTest.ParallelOnePathEndsTest) ... ok
testRunThroughParallelTaskFirstYes (bpmn.ParallelTest.ParallelOnePathEndsTest) ... ok
testRunThroughChoiceFirst (bpmn.ParallelTest.ParallelThenExlusiveNoInclusiveTest) ... ok
testRunThroughChoiceThreadCompleteFirst (bpmn.ParallelTest.ParallelThenExlusiveNoInclusiveTest) ... ok
testRunThroughParallelTaskFirst (bpmn.ParallelTest.ParallelThenExlusiveNoInclusiveTest) ... ok
testRunThroughChoiceFirst (bpmn.ParallelTest.ParallelThenExlusiveTest) ... ok
testRunThroughChoiceThreadCompleteFirst (bpmn.ParallelTest.ParallelThenExlusiveTest) ... ok
testRunThroughParallelTaskFirst (bpmn.ParallelTest.ParallelThenExlusiveTest) ... ok
testNoRouteNoFirstThenRepeating (bpmn.ParallelTest.ParallelThroughSameTaskTest) ... ok
testNoRouteNoTaskFirst (bpmn.ParallelTest.ParallelThroughSameTaskTest) ... ok
testNoRouteRepeatTaskFirst (bpmn.ParallelTest.ParallelThroughSameTaskTest) ... ok
testRepeatTasksReadyTogether (bpmn.ParallelTest.ParallelThroughSameTaskTest) ... ok
testRepeatTasksReadyTogetherSaveRestore (bpmn.ParallelTest.ParallelThroughSameTaskTest) ... ok
testRunThroughFirstRepeatTaskFirst (bpmn.ParallelTest.ParallelThroughSameTaskTest) ... ok
testRunThroughHappy (bpmn.TimerIntermediateTest.TimerIntermediateTest) ... ok
testAncestors (specs.ExecuteTest.ExecuteTest) ... ok
testConnect (specs.ExecuteTest.ExecuteTest) ... ok
testConstructor (specs.ExecuteTest.ExecuteTest) ... ok
testFollow (specs.ExecuteTest.ExecuteTest) ... ok
testGetData (specs.ExecuteTest.ExecuteTest) ... ok
testPattern (specs.ExecuteTest.ExecuteTest) ... ok
testSerialize (specs.ExecuteTest.ExecuteTest) ... ok
testSetData (specs.ExecuteTest.ExecuteTest) ... ok
testTest (specs.ExecuteTest.ExecuteTest) ... ok
test_ancestors_cyclic (specs.ExecuteTest.ExecuteTest) ... ok
testAncestors (specs.JoinTest.JoinTest) ... ok
testConnect (specs.JoinTest.JoinTest) ... ok
testConstructor (specs.JoinTest.JoinTest) ... ok
testFollow (specs.JoinTest.JoinTest) ... ok
testGetData (specs.JoinTest.JoinTest) ... ok
testSerialize (specs.JoinTest.JoinTest) ... ok
testSetData (specs.JoinTest.JoinTest) ... ok
testTest (specs.JoinTest.JoinTest) ... ok
test_ancestors_cyclic (specs.JoinTest.JoinTest) ... ok
testAncestors (specs.MergeTest.MergeTest) ... ok
testConnect (specs.MergeTest.MergeTest) ... ok
testConstructor (specs.MergeTest.MergeTest) ... ok
testFollow (specs.MergeTest.MergeTest) ... ok
testGetData (specs.MergeTest.MergeTest) ... ok
testSerialize (specs.MergeTest.MergeTest) ... ok
testSetData (specs.MergeTest.MergeTest) ... ok
testTest (specs.MergeTest.MergeTest) ... ok
test_Merge_data_merging (specs.MergeTest.MergeTest)
Test that Merge task actually merges data ... FAIL
test_ancestors_cyclic (specs.MergeTest.MergeTest) ... ok
testConstructor (specs.SubWorkflowTest.TaskSpecTest) ... ok
testSerialize (specs.SubWorkflowTest.TaskSpecTest) ... ok
testTest (specs.SubWorkflowTest.TaskSpecTest) ... ok
test_block_to_subworkflow (specs.SubWorkflowTest.TaskSpecTest) ... ok
test_subworkflow_to_block (specs.SubWorkflowTest.TaskSpecTest) ... ok
test_subworkflow_to_join (specs.SubWorkflowTest.TaskSpecTest) ... ok
test_subworkflow_to_join_refresh_waiting (specs.SubWorkflowTest.TaskSpecTest) ... ok
testAncestors (specs.TaskSpecTest.TaskSpecTest) ... ok
testConnect (specs.TaskSpecTest.TaskSpecTest) ... ok
testConstructor (specs.TaskSpecTest.TaskSpecTest) ... ok
testFollow (specs.TaskSpecTest.TaskSpecTest) ... ok
testGetData (specs.TaskSpecTest.TaskSpecTest) ... ok
testSerialize (specs.TaskSpecTest.TaskSpecTest) ... ok
testSetData (specs.TaskSpecTest.TaskSpecTest) ... ok
testTest (specs.TaskSpecTest.TaskSpecTest) ... ok
test_ancestors_cyclic (specs.TaskSpecTest.TaskSpecTest) ... ok
testAncestors (specs.TransformTest.TransformTest) ... ok
testConnect (specs.TransformTest.TransformTest) ... ok
testConstructor (specs.TransformTest.TransformTest) ... ok
testFollow (specs.TransformTest.TransformTest) ... ok
testGetData (specs.TransformTest.TransformTest) ... ok
testPattern (specs.TransformTest.TransformTest) ... ok
testSerialize (specs.TransformTest.TransformTest) ... ok
testSetData (specs.TransformTest.TransformTest) ... ok
testTest (specs.TransformTest.TransformTest) ... ok
test_ancestors_cyclic (specs.TransformTest.TransformTest) ... ok
testConstructor (specs.WorkflowSpecTest.WorkflowSpecTest) ... ok
testDump (specs.WorkflowSpecTest.WorkflowSpecTest) ... ok
testGetDump (specs.WorkflowSpecTest.WorkflowSpecTest) ... ok
testGetTaskSpecFromName (specs.WorkflowSpecTest.WorkflowSpecTest) ... ok
testSerialize (specs.WorkflowSpecTest.WorkflowSpecTest) ... ok
testValidate (specs.WorkflowSpecTest.WorkflowSpecTest) ... ok
testConstructor (storage.DictionarySerializerTest.DictionarySerializerTest) ... ok
testDeserializeWorkflow (storage.DictionarySerializerTest.DictionarySerializerTest) ... ok
testDeserializeWorkflowSpec (storage.DictionarySerializerTest.DictionarySerializerTest) ... ok
testSerializeWorkflow (storage.DictionarySerializerTest.DictionarySerializerTest) ... ERROR
testSerializeWorkflowSpec (storage.DictionarySerializerTest.DictionarySerializerTest) ... ok
testPattern (storage.DictionarySerializerTest.DictionarySerializeEveryPatternTest) ... C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\acyclic_synchronizing_merge.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\arbitrary_cycles.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\blocking_discriminator.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\blocking_partial_join.xml
ERROR
testConstructor (storage.JSONSerializerTest.JSONSerializerTest) ... ok
testDeserializeWorkflow (storage.JSONSerializerTest.JSONSerializerTest) ... ok
testDeserializeWorkflowSpec (storage.JSONSerializerTest.JSONSerializerTest) ... ok
testSerializeWorkflow (storage.JSONSerializerTest.JSONSerializerTest) ... ok
testSerializeWorkflowSpec (storage.JSONSerializerTest.JSONSerializerTest) ... ok
testPattern (storage.JSONSerializerTest.JSONSerializeEveryPatternTest) ... C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\acyclic_synchronizing_merge.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\arbitrary_cycles.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\blocking_discriminator.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\blocking_partial_join.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\cancelling_discriminator.xml
C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\data/spiff/control-flow\cancelling_partial_join.xml
ERROR
testConstructor (storage.OpenWfeXmlSerializerTest.OpenWfeXmlSerializerTest) ... ok
testDeserializeWorkflow (storage.OpenWfeXmlSerializerTest.OpenWfeXmlSerializerTest) ... ok
testDeserializeWorkflowSpec (storage.OpenWfeXmlSerializerTest.OpenWfeXmlSerializerTest) ... FAIL
testSerializeWorkflow (storage.OpenWfeXmlSerializerTest.OpenWfeXmlSerializerTest) ... ok
testSerializeWorkflowSpec (storage.OpenWfeXmlSerializerTest.OpenWfeXmlSerializerTest) ... ok
testConstructor (storage.SerializerTest.SerializerTest) ... ok
testDeserializeWorkflow (storage.SerializerTest.SerializerTest) ... ok
testDeserializeWorkflowSpec (storage.SerializerTest.SerializerTest) ... ok
testSerializeWorkflow (storage.SerializerTest.SerializerTest) ... ok
testSerializeWorkflowSpec (storage.SerializerTest.SerializerTest) ... ok
testConstructor (storage.XmlSerializerTest.XmlSerializerTest) ... ok
testDeserializeWorkflow (storage.XmlSerializerTest.XmlSerializerTest) ... ok
testDeserializeWorkflowSpec (storage.XmlSerializerTest.XmlSerializerTest) ... ok
testSerializeWorkflow (storage.XmlSerializerTest.XmlSerializerTest) ... ok
testSerializeWorkflowSpec (storage.XmlSerializerTest.XmlSerializerTest) ... ok
======================================================================
ERROR: testSerializeWorkflow (storage.DictionarySerializerTest.DictionarySerializerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
File "C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\storage\SerializerTest.py", line 123, in testSerializeWorkflow
exclude_dynamic=True)
File "C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\storage\DictionarySerializerTest.py", line 43, in compareSerialization
raise Exception(key + '/' + str(e))
Exception: data/synch_1_reached/Unequal: 'gAJLAS4=' vs 'gAJLAi4='
======================================================================
ERROR: testPattern (storage.DictionarySerializerTest.DictionarySerializeEveryPatternTest)
----------------------------------------------------------------------
Traceback (most recent call last):
File "C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\PatternTest.py", line 62, in testPattern
self.run_pattern(filename)
File "C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\storage\SerializerTest.py", line 158, in run_pattern
data=expected_data)
File "C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\storage\SerializerTest.py", line 123, in testSerializeWorkflow
exclude_dynamic=True)
File "C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\storage\DictionarySerializerTest.py", line 43, in compareSerialization
raise Exception(key + '/' + str(e))
Exception: data/struct_synch_merge_1_reached/Unequal: 'gAJLAi4=' vs 'gAJLBC4='
======================================================================
ERROR: testPattern (storage.JSONSerializerTest.JSONSerializeEveryPatternTest)
----------------------------------------------------------------------
Traceback (most recent call last):
File "C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\PatternTest.py", line 62, in testPattern
self.run_pattern(filename)
File "C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\storage\SerializerTest.py", line 158, in run_pattern
data=expected_data)
File "C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\storage\SerializerTest.py", line 123, in testSerializeWorkflow
exclude_dynamic=True)
File "C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\storage\JSONSerializerTest.py", line 36, in compareSerialization
exclude_items=exclude_items)
File "C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\storage\DictionarySerializerTest.py", line 43, in compareSerialization
raise Exception(key + '/' + str(e))
Exception: data/struct_synch_merge_1_reached/Unequal: u'gAJLAi4=' vs u'gAJLBC4='
======================================================================
FAIL: test_Merge_data_merging (specs.MergeTest.MergeTest)
Test that Merge task actually merges data
----------------------------------------------------------------------
Traceback (most recent call last):
File "C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\specs\MergeTest.py", line 68, in test_Merge_data_merging
self.assert_('second' in task.data)
AssertionError: False is not true
======================================================================
FAIL: testDeserializeWorkflowSpec (storage.OpenWfeXmlSerializerTest.OpenWfeXmlSerializerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
File "C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\storage\OpenWfeXmlSerializerTest.py", line 38, in testDeserializeWorkflowSpec
run_workflow(self, wf_spec, path, None)
File "C:\ACME\Dev\projects\SpiffWorkflow-master_GITHUB2\tests\SpiffWorkflow\util.py", line 110, in run_workflow
test.assert_(taken_path == expected_path, error)
AssertionError: Expected:
Start
concurrence_1
task_a1
task_a2
if_condition_1
task_a3
if_condition_1_end
if_condition_2
task_a5
if_condition_2_end
task_b1
task_b2
concurrence_1_end
task_c1
task_c2
End
but got:
Start
concurrence_1
task_a1
task_a2
if_condition_1
task_a3
if_condition_1_end
if_condition_2
task_a5
if_condition_2_end
task_b1
task_b2
concurrence_1_end
task_c1
task_c2
End
----------------------------------------------------------------------
Ran 161 tests in 12.789s
FAILED (failures=2, errors=3)
Process finished with exit code 1
I am quite confused by the nuclear strike example: how do I modify the confirmation attribute
while the task is processing?
Hello! Which GUI BPMN editor can I use to create diagrams to load into SpiffWorkflow? Bizagi Modeler 2.6 saves XML that is not compatible with SpiffWorkflow.
In Task.py line 176: self.id = uuid4()
uuid4() returns a UUID('871648c3-4a1b-4939-8136-3323cde73744') which is not JSON serializable.
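One workaround is to stringify the UUID at serialization time, e.g. via the `default` hook of json.dumps (a sketch, independent of how SpiffWorkflow's own serializers handle it):

```python
import json
import uuid

task_state = {'id': uuid.uuid4(), 'state': 'READY'}

# UUID objects are not JSON serializable; `default=str` converts them on the way out.
encoded = json.dumps(task_state, default=str)
decoded = json.loads(encoded)

# The round trip preserves the value; rebuild the UUID from its string form.
assert uuid.UUID(decoded['id']) == task_state['id']
assert decoded['state'] == 'READY'
```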
I am going to implement a workflow following BPMN 2.0,
but I don't know how to customize a task so that it executes the job I want.
Which task should I override: scriptTask, callActivity, or something else?
I am not sure what the correct way to implement this is.
Would you please help me?
Thank you so much.
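The usual extension point in workflow engines is to subclass a task spec and override its completion hook; below is a self-contained sketch of the pattern (the class and method names are stand-ins, not SpiffWorkflow's actual API, so check the library's TaskSpec source for the real hook names):

```python
class TaskSpec:
    """Minimal base: subclasses override _on_complete_hook to do real work."""
    def __init__(self, name):
        self.name = name

    def complete(self, data):
        self._on_complete_hook(data)

    def _on_complete_hook(self, data):
        pass  # default: do nothing

class SendEmailTask(TaskSpec):
    def _on_complete_hook(self, data):
        # A real implementation would send mail here; we just record the call.
        data.setdefault('emails_sent', []).append(self.name)

data = {}
SendEmailTask('notify_user').complete(data)
assert data['emails_sent'] == ['notify_user']
```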
The latest release on PyPI was 0.3.0 in 2009. The code has changed a fair amount since then. It's time for a new version number (1.0 maybe?) and a new release.
Hi,
I've been taking a look at SpiffWorkflow for a project I'm working on, and it seems it could fit the bill perfectly: I need to develop a workflow system where a process may have several steps, several people are involved in the process, they have different permissions to execute certain steps, and so on.
However, I'm having trouble finding real usages of this library: how to bind each task to Python code or functions (e.g. notify a user by email once a step has been completed), how to decide which person is allowed to execute a certain step in a workflow, etc. Is there any complex implementation I could look at as a source of inspiration?
Thank you and kind regards.
After running the test code, I want to get the current task, collect_change, but it is empty!
test code:
def test_swprocess(self):
    serializer = XmlSerializer()
    xml_file = 'tests/docs/swprocess.xml'
    xml_data = open(xml_file).read()
    spec = serializer.deserialize_workflow_spec(xml_data)
    wf = Workflow(spec)
    tasks = wf.get_tasks(Task.READY)
    task = tasks[0]
    wf.complete_task_from_id(task.id)
    # start
    tasks = wf.get_tasks(Task.READY)
    print('tasks ...')
    print(tasks)
    task = tasks[0]
    task.set_data(**{'process_type': 'sw_optimization'})
    wf.complete_task_from_id(task.id)
    # start
    tasks = wf.get_tasks(Task.READY)
    print('tasks 1...')
    print(tasks)
    task = tasks[0]
    self.assertEqual('internal_sw_optimization', task.get_name())
    # internal_sw_optimization
    wf.complete_task_from_id(task.id)
    tasks = wf.get_tasks(Task.READY)
    task = tasks[0]
    self.assertEqual('code_implementation', task.get_name())
    # code_implementation
    wf.complete_task_from_id(task.id)
    tasks = wf.get_tasks(Task.READY)
    task = tasks[0]
    self.assertEqual('project_build', task.get_name())
    # project_build
    wf.complete_task_from_id(task.id)
    tasks = wf.get_tasks(Task.READY)
    task = tasks[0]
    self.assertEqual('pre_validation', task.get_name())
    # pre_validation
    wf.complete_task_from_id(task.id)
    tasks = wf.get_tasks(Task.READY)
    task = tasks[0]
    self.assertEqual('collect_change', task.get_name())
    # collect_change
    wf.complete_task_from_id(task.id)
    tasks = wf.get_tasks(Task.READY)
    task = tasks[0]
    self.assertEqual('code_review_meeting', task.get_name())
    # code_review_meeting
    wf.complete_task_from_id(task.id)
    tasks = wf.get_tasks(Task.READY)
    task = tasks[0]
    self.assertEqual('sw_change_class', task.get_name())
    # sw_change_class
    wf.complete_task_from_id(task.id)
    tasks = wf.get_tasks(Task.READY)
    task = tasks[0]
    self.assertEqual('review_code_issues', task.get_name())
    # review_code_issues
    task.set_data(**{'review_result': 'N'})
    wf.complete_task_from_id(task.id)
    tasks = wf.get_tasks(Task.READY)
    print(tasks)
    task = tasks[0]
    self.assertEqual('collect_change', task.get_name())
swprocess.xml:
<?xml version="1.0" encoding="UTF-8"?>
<process-definition name="flow" revision="1.6">
<description>CEP Software Release Process</description>
<!-- Start with an implicit simple split. -->
<start-task>
<successor>process_type</successor>
</start-task>
<multi-choice name="process_type">
<description>Select Process Type</description>
<!-- -->
<!-- Implementation Process -->
<!-- -->
<!-- New product / New test process -->
<conditional-successor>
<equals left-field="process_type" right-value="new_product" />
<successor>test_plan_creation</successor>
</conditional-successor>
<conditional-successor>
<equals left-field="process_type" right-value="new_product" />
<successor>validation_plan_definition</successor>
</conditional-successor>
<conditional-successor>
<equals left-field="process_type" right-value="new_product" />
<successor>repository_creation</successor>
</conditional-successor>
<!-- New software project -->
<conditional-successor>
<equals left-field="process_type" right-value="new_sw_project" />
<successor>repository_creation</successor>
</conditional-successor>
<!-- Equipment change / Process change / New model or model change -->
<conditional-successor>
<equals left-field="process_type" right-value="ecn_pcn" />
<successor>process_env_change</successor>
</conditional-successor>
<!-- Idea for software optimization -->
<conditional-successor>
<equals left-field="process_type" right-value="sw_optimization" />
<successor>internal_sw_optimization</successor>
</conditional-successor>
</multi-choice>
<multi-choice name="process_env_change">
<description>Process environment change</description>
<!-- New product / New test process -->
<conditional-successor>
<equals left-field="need_test_plan" right-value="True" />
<successor>test_plan_creation</successor>
</conditional-successor>
<conditional-successor>
<equals left-field="need_validation_plan" right-value="True" />
<successor>validation_plan_definition</successor>
</conditional-successor>
<conditional-successor>
<equals left-field="need_code_change" right-value="True" />
<successor>code_implementation</successor>
</conditional-successor>
</multi-choice>
<task name="repository_creation">
<description>Repository creation</description>
<successor>code_implementation</successor>
</task>
<task name="test_plan_creation">
<description>Test plan creation</description>
<successor>collect_change</successor>
</task>
<task name="validation_plan_definition">
<description>Validation plan definition</description>
<successor>collect_change</successor>
</task>
<task name="internal_sw_optimization">
<description>Internal software optimization</description>
<successor>code_implementation</successor>
</task>
<task name="code_implementation">
<description>Code implementation</description>
<successor>project_build</successor>
</task>
<task name="project_build">
<description>Project build</description>
<successor>pre_validation</successor>
</task>
<task name="pre_validation">
<description>Pre-Validation</description>
<successor>collect_change</successor>
</task>
<!-- -->
<!-- Validation Process -->
<!-- -->
<join name="collect_change" context="process_type" threshold="1">
<description>Collect Changes</description>
<successor>code_review_meeting</successor>
</join>
<task name="code_review_meeting">
<description>Code Review Meeting</description>
<successor>sw_change_class</successor>
</task>
<task name="sw_change_class">
<description>Software Change Classification</description>
<successor>review_code_issues</successor>
</task>
<exclusive-choice name="review_code_issues">
<description>Review code issues</description>
<default-successor>selection_tests</default-successor>
<conditional-successor>
<equals left-field="review_result" right-value="Y" />
<successor>selection_tests</successor>
</conditional-successor>
<conditional-successor>
<equals left-field="review_result" right-value="N" />
<successor>collect_change</successor>
</conditional-successor>
</exclusive-choice>
<task name="selection_tests">
<description>Selection of Tests</description>
<successor>run_measurements</successor>
</task>
<task name="run_measurements">
<description>Run Measurements</description>
<successor>evaluate_measurements</successor>
</task>
<task name="evaluate_measurements">
<description>Evaluate Measurements</description>
<successor>document_collection</successor>
</task>
<task name="document_collection">
<description>Document Collection</description>
<successor>sw_compliance_approval</successor>
</task>
<exclusive-choice name="sw_compliance_approval">
<description>Software Compliance Approval</description>
<default-successor>update_sw_release_table</default-successor>
<conditional-successor>
<equals left-field="approval_result" right-value="Y" />
<successor>update_sw_release_table</successor>
</conditional-successor>
<conditional-successor>
<equals left-field="approval_result" right-value="M" />
<successor>document_collection</successor>
</conditional-successor>
<conditional-successor>
<equals left-field="approval_result" right-value="N" />
<successor>collect_change</successor>
</conditional-successor>
</exclusive-choice>
<task name="update_sw_release_table">
<description>Update Software Release Table</description>
<successor>end</successor>
</task>
</process-definition>
Hello!
I got confused after reading the tutorial. The problem is that I can't find the function declarations of general or president in nuclear.json, so I don't know what exactly general or president does. When the workflow reached the president task, it just completed; I couldn't find where president was executed, or where the confirmation attribute came from.
Actually, I want to implement several custom tasks in a workflow so that it can make sure my tasks run well in a programmed sequence, like:
{
"task_specs": {
"Start": {
"class": "SpiffWorkflow.specs.StartTask.StartTask",
"manual": false,
"outputs": [
"createEC2"
]
},
"custom-task1": {
"class": "tasks.custom.CustomTask1",
"name": "task1",
"inputs": [
"Start"
],
"outputs": [
"task2"
]
},
"custom-task2": {
"class": "tasks.custom.CustomTask2",
"name": "task2",
"inputs": [
"task1"
]
},
"workflow_aborted": {
"class": "SpiffWorkflow.specs.Cancel.Cancel",
"name": "workflow_aborted",
"inputs": [
"task1",
"task2"
]
}
},
"description": "",
"file": null,
"name": ""
}
So I need to understand how to write a custom task like NuclearStrike.
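For what it's worth, the pattern the tutorial's NuclearStrike follows is to subclass an existing spec and override its completion hook. Below is a self-contained sketch of that shape; the `_on_complete_hook` name follows SpiffWorkflow's convention, but `Simple` here is a stand-in class, not the library's real one.

```python
# Stand-in base class sketching the Simple/TaskSpec shape; only the
# hook-override pattern is the point, not these exact classes.
class Simple:
    def __init__(self, name):
        self.name = name

    def _on_complete_hook(self, my_task):
        pass  # base class does nothing extra on completion

    def complete(self, my_task):
        self._on_complete_hook(my_task)
        my_task['state'] = 'COMPLETED'


class CustomTask1(Simple):
    """Runs custom code when the workflow reaches this step."""

    def _on_complete_hook(self, my_task):
        # Put the real work here (API call, file write, ...).
        my_task['result'] = f'{self.name} ran'


task_state = {}
CustomTask1('task1').complete(task_state)
```

With this shape, the `"class": "tasks.custom.CustomTask1"` entries in the JSON above would resolve to such a subclass at deserialization time.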
When an error occurs in a scripttask the workflow is aborted. Is there a more subtle way to handle exceptions within Spiff?
Is it really necessary to have bare except statements in the codebase, or could we replace them with catching Exception? They wouldn't catch KeyboardInterrupt that way, but that's a rather unusual case anyway.
The current documentation is incomplete. To complete it, a better documentation mechanism like Sphinx would be helpful.
It is failing with the following callstack:
Traceback (most recent call last):
File "/home/fcorreia/src/spiffworkflow/tests/SpiffWorkflow/PatternTest.py", line 80, in testPattern
for filename in os.listdir(dirname):
OSError: [Errno 2] No such file or directory: '/home/fcorreia/src/spiffworkflow/tests/SpiffWorkflow/xml/spiff/resource/'
It happens because the test is looking for .xml files in a "resource" directory, that does not exist. I've fixed it on my end by ignoring that directory, but it might just have been forgotten on the previous commits.
In the wireit folder, you forgot to include the build folder.
Hi,
I am wondering if input/output parameters are supported (I am using Camunda Modeler).
I successfully used a Script Task which does "task.set_data(found='yes')", and then I have an exclusive gateway with two outgoing flows with the expressions "found=='yes'" and "found!='yes'".
But couldn't this be done using input/output parameters? What is the use case for them?
PizzaSimpleWithScriptTaskAndCondition.bpmn.txt
Consider the following application scenario:
1. A user has equipment that needs repair, fills in the equipment defect information, and submits it to the equipment management department.
2. After receiving the defect information, the equipment management department evaluates the defect, fills in an estimated cost, and submits it to the department leader for approval.
3. After the leader approves, technicians organize the repair of the equipment and report the repair result.
4. The user receives the repair result, accepts the equipment, and gives an evaluation.
We should rename SpiffWorkflow properties and SpiffWorkflow attributes, and the related methods, to "task_data" and "task_spec_data", to avoid confusion with their Python equivalents.
In ThreadSplit.py, in _find_my_task, there is a reference to my_task that doesn't seem to exist. It's used by _on_trigger, which would then seem to have to fail with a NameError. What's going on here?
Hi,
Can someone please provide an example of how to link a Simple task derivative to the node? All tasks are by default treated as the "Simple" type. What if I want my own derivative of Simple as the task spec instead? The NuclearStrike example is very clear for JSON; is there a similar one for XML? I searched all over the internet and can't find a clear example for XML.
For the JSON example, I can see where this binding happens using the 'class' attribute - in dict.py
task_spec_cls = get_class(task_spec_state['class'])
task_spec = task_spec_cls.deserialize(self, spec, task_spec_state)
Is there a similar mechanism for XML? I tried providing a class attribute on the task node, but it obviously didn't work, because there is no code in the XML serializer consuming and acting on that.
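One workaround (an assumption on my part, not a documented feature) would be to extend the XML serializer's node-name-to-class map so a custom element name resolves to your own spec class. A sketch, with stand-in classes:

```python
# Sketch of a nodeName -> spec-class map, mirroring the _spec_map idea
# in SpiffWorkflow's XML serializer; Simple/MyTask are stand-ins.
class Simple:
    pass


class MyTask(Simple):
    pass


_spec_map = {
    'task': Simple,      # default mapping for <task> elements
    'my-task': MyTask,   # custom element name -> custom spec class
}


def spec_class_for(node_name):
    """Resolve an XML element name to a task spec class."""
    node_name = node_name.lower()
    if node_name not in _spec_map:
        raise ValueError(f'unknown task type: {node_name}')
    return _spec_map[node_name]
```

You would then use `<my-task name="...">` in the XML instead of a class attribute on `<task>`; the map does the binding that the JSON serializer's `'class'` attribute does.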
From the tutorial:
# Execute until all tasks are done or require manual intervention.
workflow.complete_all()
This will run all tasks using default choices, not respecting task specs flagged as manual. How does one "require manual intervention"?
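My reading of the engine (not official documentation) is that `complete_all()` ignores the `manual` flag by design; to respect it, you run your own loop that completes only non-manual READY tasks and surfaces the manual ones to a user. A self-contained simulation of that loop, with a stand-in task class:

```python
# FakeTask stands in for what workflow.get_tasks(Task.READY) returns;
# the loop shape is the point, not the class itself.
class FakeTask:
    def __init__(self, name, manual):
        self.name = name
        self.manual = manual
        self.done = False

    def complete(self):
        self.done = True


def run_until_manual(ready_tasks):
    """Complete non-manual tasks; return the ones needing a human."""
    pending = []
    for task in ready_tasks:
        if task.manual:
            pending.append(task)  # surface to a user instead of completing
        else:
            task.complete()
    return pending


tasks = [FakeTask('build', False), FakeTask('approve', True)]
needs_human = run_until_manual(tasks)
```

In a real application you would repeat this loop after each user action until no READY tasks remain.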
I am struggling to find documentation on how to use timer events and/ or tasks. I need them for escalations and notifications when users do not act within a certain time period to tasks.
Where can I find directions on how this works, or pointers on how to implement such behaviour.
When trying to use an inclusive gateway with two outgoing sequence flows I get this error.
SpiffWorkflow.bpmn.parser.ValidationException.ValidationException: Multiple outgoing flows are not supported for tasks of type
Source Details: bpmn:inclusiveGateway
Are we expected to subclass and override 'handles_multiple_outgoing' to get this working?
Hello, the deserializer expects the threshold to be serialized with pickle and then base64-encoded. This is nearly impossible to write by hand; the only option is to generate this node via serialization.
Some refactoring of the TaskSpec classes seems to be in order:
Hi all,
I would like to try this lib, but obviously it's not working. When I simply do, in Python:
from SpiffWorkflow import Workflow
I get the error:
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/local/lib/python2.7/dist-packages/SpiffWorkflow/__init__.py", line 1, in <module>
    from Job import Job
  File "/usr/local/lib/python2.7/dist-packages/SpiffWorkflow/Job.py", line 16, in <module>
    import Tasks
  File "/usr/local/lib/python2.7/dist-packages/SpiffWorkflow/Tasks/__init__.py", line 1, in <module>
    from AcquireMutex import AcquireMutex
  File "/usr/local/lib/python2.7/dist-packages/SpiffWorkflow/Tasks/AcquireMutex.py", line 18, in <module>
    from TaskSpec import TaskSpec
  File "/usr/local/lib/python2.7/dist-packages/SpiffWorkflow/Tasks/TaskSpec.py", line 16, in <module>
    from SpiffSignal import Trackable
ImportError: No module named SpiffSignal
Obviously a module is missing. I was not able to find it in the install directory.
Any clue why it disappeared?
Cheers
I've come across a few situations where I have ended up with duplicate task IDs. I noticed in the code that there are places where you account for that (like in workflow.get_task()).
What are the legitimate cases where you have tasks with the same IDs?
Should it be in operators.py or in the root? It's not really a spec. I came across it trying to figure out how pre/post_assign work...
Here http://spiffworkflow.readthedocs.io/en/latest/tutorial/index.html, in the json, the president task
"president": {
"class": "SpiffWorkflow.specs.ExclusiveChoice.ExclusiveChoice",
"name": "president",
"manual": true,
"inputs": [
"general"
],
"outputs": [
"workflow_aborted",
"nuclear_strike"
],
"choice": null,
"default_task_spec": "workflow_aborted",
"cond_task_specs": [
[
[
"SpiffWorkflow.operators.Equal",
[
[
"Attrib",
"confirmation"
],
[
"value",
"yes"
]
]
],
"president"
]
]
}
In cond_task_specs, if the condition is true, president is given as the successor. I think it should be nuclear_strike.
cond = Equal(Attrib('test_attribute1'), Attrib('test_attribute2'))
cond._matches(my_task) always returns True, but it should be False in this case.
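To illustrate the expected semantics being reported here: `Equal(Attrib('a'), Attrib('b'))._matches(...)` should resolve both attributes against the task's data and return False when the values differ. A stand-in sketch of those semantics (not the library's actual code, which lives in SpiffWorkflow.operators):

```python
# Stand-in Attrib/Equal sketching the semantics under discussion.
class Attrib:
    def __init__(self, name):
        self.name = name

    def resolve(self, task_data):
        """Look up this attribute's value in the task's data."""
        return task_data.get(self.name)


class Equal:
    def __init__(self, left, right):
        self.left = left
        self.right = right

    def _matches(self, task_data):
        # Compare the RESOLVED values, not the Attrib objects themselves;
        # comparing the objects is one way the reported bug could arise.
        return self.left.resolve(task_data) == self.right.resolve(task_data)


cond = Equal(Attrib('test_attribute1'), Attrib('test_attribute2'))
```

With `test_attribute1=1` and `test_attribute2=2`, this correctly returns False.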
Hi Guys,
We are trying to implement SpiffWorkflow in an exciting new project. However, the documentation of the XML format lacks examples, and the examples I do find do not work.
According to the documentation (and examples I found), the following should work:
with open(flow_def, 'r') as xmlflow:
    xml_data = xmlflow.read()
spec = WorkflowSpec.deserialize(XmlSerializer(), xml_data, filename=flow_def)
However, using this code results in:
E AttributeError: 'str' object has no attribute 'findtext'
Looking at the statement that is executed, this can never work, because it expects an lxml object.
When I run doc/tutorial/deserialize.py, I get:
Traceback (most recent call last):
File "deserialize.py", line 8, in
spec = WorkflowSpec.deserialize(serializer, workflow_json, 'nuclear.json')
TypeError: deserialize() takes 3 positional arguments but 4 were given
The SpiffSignal module is missing from the repository. It is required for running the tests.
eugen@lps-dv2700:~/Work/Helios-Python/temp/knipknap-SpiffWorkflow-813cc2a/tests/SpiffWorkflow$ python XmlReaderTest.py
Traceback (most recent call last):
  File "XmlReaderTest.py", line 4, in <module>
    from WorkflowTest import WorkflowTest
  File "/home/eugen/Work/Helios-Python/temp/knipknap-SpiffWorkflow-813cc2a/tests/SpiffWorkflow/WorkflowTest.py", line 4, in <module>
    from SpiffWorkflow import Workflow, Job
  File "/home/eugen/Work/Helios-Python/temp/knipknap-SpiffWorkflow-813cc2a/tests/SpiffWorkflow/../../src/SpiffWorkflow/__init__.py", line 1, in <module>
    from Job import Job
  File "/home/eugen/Work/Helios-Python/temp/knipknap-SpiffWorkflow-813cc2a/tests/SpiffWorkflow/../../src/SpiffWorkflow/Job.py", line 16, in <module>
    import Tasks
  File "/home/eugen/Work/Helios-Python/temp/knipknap-SpiffWorkflow-813cc2a/tests/SpiffWorkflow/../../src/SpiffWorkflow/Tasks/__init__.py", line 1, in <module>
    from AcquireMutex import AcquireMutex
  File "/home/eugen/Work/Helios-Python/temp/knipknap-SpiffWorkflow-813cc2a/tests/SpiffWorkflow/../../src/SpiffWorkflow/Tasks/AcquireMutex.py", line 18, in <module>
    from TaskSpec import TaskSpec
  File "/home/eugen/Work/Helios-Python/temp/knipknap-SpiffWorkflow-813cc2a/tests/SpiffWorkflow/../../src/SpiffWorkflow/Tasks/TaskSpec.py", line 16, in <module>
    from SpiffWorkflow.external.SpiffSignal import Trackable
  File "/home/eugen/Work/Helios-Python/temp/knipknap-SpiffWorkflow-813cc2a/tests/SpiffWorkflow/../../src/SpiffWorkflow/external/__init__.py", line 5, in <module>
    import SpiffSignal
For fun I ran the test suite on PyPy. It all works, up till the point where it compares two pickles:
Exception: task_specs/go_to_repetition/pre_assign/[0]/Unequal: 'gAJjU3BpZmZXb3JrZmxvdy5vcGVyYXRvcnMKQXNzaWduCnEAKYFxAX1xAihVBXJpZ2h0cQNYAQAAADBxBFUPcmlnaHRfYXR0cmlidXRlcQVOVQ5sZWZ0X2F0dHJpYnV0ZXEGWAYAAAByZXBlYXRxB3ViLg==' vs 'gAJjU3BpZmZXb3JrZmxvdy5vcGVyYXRvcnMKQXNzaWduCnEAKYFxAX1xAihVDmxlZnRfYXR0cmlidXRlcQNYBgAAAHJlcGVhdHEEVQVyaWdodHEFWAEAAAAwcQZVD3JpZ2h0X2F0dHJpYnV0ZXEHTnViLg=='
If you base64 decode the string, you see that the pickles are the same, just that they have keys in a different order.
I initially thought this was a PyPy bug and reported it: https://bugs.pypy.org/issue1693
However, it turns out that even in regular python it's not safe to rely on the ordering.
I'm not sure what the best way to fix this is.
There seem to be two possible fixes: stop comparing pickles and instead add __eq__ functions for Assign, etc., or keep relying on the pickled form. I don't understand the rationale for pickling things, so I'd lean towards the first solution, but I'm fully open to being convinced about the necessity of pickles.
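A quick demonstration of the ordering problem, and why comparing via `__eq__` is safer than comparing pickles: two logically equal objects whose attributes were set in a different order produce different pickle bytes, because the instance `__dict__` preserves insertion order. (The `Assign` class here is a toy stand-in for SpiffWorkflow's operator.)

```python
import pickle


class Assign:
    """Toy stand-in for SpiffWorkflow's Assign operator."""

    def __eq__(self, other):
        # Dict equality ignores key order, so this is order-insensitive.
        return isinstance(other, Assign) and self.__dict__ == other.__dict__


a = Assign()
a.left_attribute, a.right = 'repeat', '0'

b = Assign()
b.right, b.left_attribute = '0', 'repeat'  # same values, other order

print(a == b)                                    # True
print(pickle.dumps(a, 2) == pickle.dumps(b, 2))  # False: bytes differ
```

Protocol 2 is used here because the base64 strings above start with 'gAJ', which is the protocol-2 header.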
As per the title:
Is this project still supported, or commercially supported?
I also notice that https://procedure8.com/, the commercial supporter's site, returns a 502 (30/04/2019).
Hello !
I noticed there was an implementation of ServiceTask in the forked version for ZEngine : https://github.com/zetaops/SpiffWorkflow/blob/master/SpiffWorkflow/bpmn/specs/ServiceTask.py
As I'm using the original repository for its LGPL 3 license, I was wondering if there is any plan to backport this class. I'm indeed a little bit confused by the LGPL 2.1 license mentioned in the class file header...