Hi,

I am running parallel tasks (Cognite Functions) in Cognite Workflows. One task creates a table in a RAW database, while the other deletes it. The name of each created table is passed as input to the function that deletes it.
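
For illustration, the pair of handlers looks roughly like the sketch below. This is a minimal sketch, not the attached code; in practice each function is deployed separately with its own 'handle' entry point, and the names here are placeholders.

from cognite.client import CogniteClient

def create_table_handle(client: CogniteClient, data: dict) -> dict:
    """Create the RAW table and return its name for the delete task."""
    client.raw.tables.create(data["db_name"], data["table_name"])
    return {"db_name": data["db_name"], "table_name": data["table_name"]}

def delete_table_handle(client: CogniteClient, data: dict) -> dict:
    """Delete the table created upstream; re-raise so failures surface."""
    try:
        client.raw.tables.delete(data["db_name"], data["table_name"])
    except Exception:
        raise  # a re-raised exception should mark the call as 'Failed'
    return {"deleted": data["table_name"]}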

  1. In the first attempt I ran 150 parallel tasks across 10 workflows (15 tasks each). One table was not deleted from the RAW database.
  2. In the second attempt I ran 200 parallel tasks across 10 workflows (20 tasks each). This time 8 tables were not deleted from RAW.

I was not able to work out why those tables were not deleted, since the delete function's status showed 'Completed' in the Cognite UI.

Can you please help me understand why these tables are not being deleted? If there is a limit on parallel task execution, how can I work around or raise it? And how do I debug a scenario like this, where there is no function execution failure?
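
For what it's worth, this is roughly how I check which tables are left after a run (a minimal sketch; it assumes credentials are configured in the environment and uses the 'queries_db' value shown below):

from cognite.client import CogniteClient

client = CogniteClient()  # assumes credentials are configured in the environment

# List every table still present in the RAW database after a run;
# any table the workflow should have deleted will show up here.
remaining = client.raw.tables.list(db_name="test:db", limit=None)
print(f"{len(remaining)} table(s) remaining:")
for table in remaining:
    print(" -", table.name)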

I am attaching the test workflow for reference. Run the workflow-deploy-job.py file to deploy the workflow to a Cognite project, after editing functions.json and workflow-deploy-job.py to replace the Cognite-related variables with your own values.

Below is the input provided to the function 'wk_test_run_workflow' when the workflow is deployed on the Cognite instance:

{
    "workflow_split": 10,
    "task_split": 20,
    "queries_db": "test:db",
    "workflow_name": "cdf_test_workflow",
    "workflow_version": 1

Hello @Akshay Hande!


Thanks for providing the code! I see that you re-raise any exceptions in your exception handler, which should in principle give you a "Failed" function call. Do you get any logs from the functions whatsoever? They might help you pin down the root cause.
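
For reference, the call statuses and logs can be pulled with the Python SDK roughly like this (a sketch; 'wk_test_delete_table' is a placeholder external ID):

from cognite.client import CogniteClient

client = CogniteClient()  # assumes credentials are configured in the environment

# Fetch the delete function and print the logs of its recent calls.
fn = client.functions.retrieve(external_id="wk_test_delete_table")
for call in client.functions.calls.list(function_id=fn.id, limit=100):
    print(call.id, call.status)
    for entry in call.get_logs():
        print("   ", entry.message)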

 

Kind regards,

Ivar Stangeby


Hi @Akshay Hande,

Could you please share the function log details so that we can further investigate the issue? 


I have created a ticket on Zendesk to follow up; let's continue the discussion there: https://cognite.zendesk.com/hc/en-us/requests/12669

