I have to start a technical workflow that exports data to the CRM after several smaller workflows finish their activities.
The smaller workflows update data internally, so the export to the CRM must start only after all of them have completed their updates.
The best way I have found to solve this problem, while keeping it scalable for the future, is to exploit a log.
The log structure is like "Action_count , Flag_export".
When each small workflow finishes its activities, it increments Action_count by one and then checks whether Action_count equals the number of workflows inside a specific folder. If it does, it sets Flag_export to 1.
On the other side, the main workflow waits until Flag_export becomes 1.
This way I only have to remember to put each new "small" workflow in that folder, and the main workflow will automatically wait until all x+1 smaller workflows have finished.
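The platform isn't named, so purely as a sketch of the pattern, here is what the end-of-workflow step and the main workflow's wait could look like in Python. All names here (LOG_PATH, WORKFLOWS_DIR, the function names) are hypothetical, not anything from the original setup:

```python
import json
import os
import time

# Hypothetical locations; the real log would live wherever the
# platform keeps shared state visible to every workflow.
LOG_PATH = "sync_log.json"
WORKFLOWS_DIR = "small_workflows"

def read_log():
    """Return the log, creating a fresh one if it doesn't exist yet."""
    if not os.path.exists(LOG_PATH):
        return {"Action_count": 0, "Flag_export": 0}
    with open(LOG_PATH) as f:
        return json.load(f)

def write_log(log):
    with open(LOG_PATH, "w") as f:
        json.dump(log, f)

def on_small_workflow_end():
    """Run as the very last step of every small workflow."""
    log = read_log()
    log["Action_count"] += 1
    # The threshold is the number of workflows in the folder, so
    # dropping a new workflow there automatically raises it.
    total = len(os.listdir(WORKFLOWS_DIR))
    if log["Action_count"] == total:
        log["Flag_export"] = 1
    write_log(log)

def main_workflow_wait(poll_seconds=5):
    """Main workflow blocks here until every small workflow is done."""
    while read_log()["Flag_export"] != 1:
        time.sleep(poll_seconds)
    # ... proceed with the CRM export ...
```

One caveat worth noting: if the small workflows can finish at the same moment, the read-increment-write on Action_count must be atomic (e.g. guarded by a file lock or a platform-level mutex), otherwise two workflows can read the same count and one increment gets lost.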
I would avoid combining all the small workflows (not tiny, but smaller than the main one) into one big workflow, because if it fails during execution I would have to restart each part manually and keep monitoring it until the end.
For this case, that solution seems too tricky to maintain.
Thank you very much for your advice, and for your time.
I really like your solution, but that way, every time I want to develop a new "small" workflow, I have to add the signal and the JS activities to the "main" one.
Like the other solution I found (each "small" workflow calls the next one when it finishes, and the last "small" workflow calls the main one), it is not very scalable for future improvements, keeping in mind that future development will be done by business accounts who are not very confident with platform customization.
Finally, in my humble opinion, the best solution (purely in terms of simplicity and scalability) remains my first one.