Hi,
I am running parallel tasks (Cognite Functions) in Cognite Workflows. One task creates a table in a RAW database, while the other deletes it. The name of whichever table is created is passed as input to the function that deletes it.
- In the first attempt I ran 150 parallel tasks across 10 workflows (15 tasks each). One table was not deleted from the RAW database.
- In the second attempt I ran 200 parallel tasks across 10 workflows (20 tasks each). In this case 8 tables were not deleted from RAW.
I was not able to debug why those tables were not deleted, since the delete function's status showed 'Completed' in the Cognite UI.
Can you please help me understand why these tables are not being deleted? If there is a limit on parallel task execution, how can I raise or work around it? And how do I debug a scenario like this, where there is no function execution failure?
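In case it is useful context: as a workaround I am considering having the delete function verify that the table is actually gone and retry if not, since the delete may be eventually consistent under load. Below is a minimal sketch of that idea; the `delete_fn` / `exists_fn` callables are placeholders that would be wired to the Cognite SDK (e.g. something like `client.raw.tables.delete(db, name)` and a lookup via `client.raw.tables.list` — those call sites are my assumption, not tested code).

```python
import time


def delete_with_verification(delete_fn, exists_fn, table_name,
                             retries=3, wait_s=2.0):
    """Delete a table and confirm it is actually gone, retrying if needed.

    delete_fn(table_name) issues the delete; exists_fn(table_name) returns
    True while the table is still listed. The sleep gives the backend time
    to converge before we check, in case deletes are eventually consistent.
    """
    for _ in range(retries):
        delete_fn(table_name)      # issue (or re-issue) the delete
        time.sleep(wait_s)         # allow the backend to converge
        if not exists_fn(table_name):
            return True            # verified gone
    return not exists_fn(table_name)  # final check after exhausting retries
```

The delete task would then report 'Completed' only when the table is verifiably absent, instead of when the delete request was merely accepted.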
Attaching the test workflow for reference. Run the workflow-deploy-job.py file to deploy the workflow to a Cognite project. Edit the functions.json and workflow-deploy-job.py files to replace the Cognite-related variables with your respective values.
Below is the input provided to the function 'wk_test_run_workflow' once the workflow is deployed on the Cognite instance:
{
"workflow_split": 10,
"task_split": 20,
"queries_db": "test:db",
"workflow_name": "cdf_test_workflow",
"workflow_version": 1
}
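For reproducing this, the first debugging step I take after a run is to compare the set of table names the create tasks produced against what is still listed in RAW; the intersection is the set of lost deletes. The comparison itself is trivial (sketch below); the `listed_names` input would come from the SDK, e.g. the names returned by something like `client.raw.tables.list(db_name=..., limit=None)` — that call is my assumption about the API, please correct me if the listing is done differently.

```python
def find_leftover_tables(created_names, listed_names):
    """Tables the workflow created that are still present in RAW.

    created_names: names passed as input to the delete tasks
    listed_names:  names currently listed in the RAW database
    """
    return sorted(set(created_names) & set(listed_names))
```

Cross-referencing the leftover names against the workflow run logs then tells me exactly which delete tasks reported 'Completed' without actually removing their table.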