
Dynamic Tasks processing in sequence

  • August 2, 2024
  • 39 replies
  • 567 views


  • Practitioner
  • March 19, 2025

Our testing shows that this issue is now resolved, and the ticket we have with Microsoft is now closed. Please reach out to us again if you still experience any problems.

Thanks 

Dag Brattli


Aditya Kotiyal
MVP

Thanks @Dag Brattli. The team is also planning to test it; their priorities have changed since last time because the issue took a long time to resolve. I will keep you and the team posted.


@Dag Brattli Thanks for the update. Do we still need to prewarm the functions to test this, or can we expect good performance without that as well?


@Dag Brattli I checked the performance with the same example that I posted in this ticket, but I still don’t see the executions finishing at nearly the same time. Is it possible for you to check that example on your side to see if anything is missing?

 

Here is a snapshot of the function executions to show the duration. This function is run in parallel through dynamic tasks. The CDF project name is slb-uds-qa.
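For reference, here is a minimal sketch (assuming the cognite-sdk Python client; `my-dynamic-task-function` is a hypothetical external ID) of how the call durations in such a snapshot could be pulled programmatically instead of read off the UI:

```python
from cognite.client import CogniteClient

# Assumes a CogniteClient already configured for the slb-uds-qa project;
# the function external ID below is a hypothetical placeholder.
client = CogniteClient()
fn = client.functions.retrieve(external_id="my-dynamic-task-function")

# List recent calls and print their timing (timestamps are epoch milliseconds).
for call in client.functions.calls.list(function_id=fn.id, limit=25):
    duration_ms = (call.end_time - call.start_time) if call.end_time else None
    print(call.id, call.status, call.start_time, call.end_time, duration_ms)
```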

 

 


Aditya Kotiyal
MVP

@Dag Brattli Niranjan tested it, but the same issue persists.


  • Practitioner
  • April 24, 2025

@Aditya Kotiyal There should be no need to prewarm the function anymore. But scaling is not instant, and it will take some time for the function to scale up. We are currently working with Microsoft to enable the Flex Consumption plan for Azure Functions, which will enable faster and better scaling. It is hard to know whether you are experiencing an error without any logging of when the function actually started executing. Would it be possible for you to log, for each call of the function:

  • When the function is called
  • When the function actually starts to process
  • Duration of the function execution

That way we can see how quickly the function scales and how many instances you get.

Thanks
Dag Brattli
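Below is a minimal sketch of the kind of per-call logging being asked for, assuming a standard Python Cognite Function handler; the `submitted_at` field is a hypothetical value the caller would put into `data` at submission time so the queueing delay becomes visible:

```python
import time


def handle(client, data):
    # When the function actually starts processing (epoch milliseconds).
    started_at = int(time.time() * 1000)

    # When the function was called: a hypothetical timestamp the caller adds
    # to `data` when submitting the call, so scale-up/queue delay can be measured.
    submitted_at = data.get("submitted_at")

    # ... the actual work of the function goes here ...

    finished_at = int(time.time() * 1000)
    return {
        "submitted_at": submitted_at,
        "started_at": started_at,
        "duration_ms": finished_at - started_at,
        "queue_delay_ms": (started_at - submitted_at) if submitted_at else None,
    }
```

Comparing `started_at` across the parallel calls then shows how quickly new instances pick up work.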


Aditya Kotiyal
MVP

Hi @Dag Brattli


  • Practitioner
  • May 16, 2025

@Aditya Kotiyal We are now ready to onboard you to the Flex Consumption plan, which has better performance and scalability, so if you are interested we could enable it for you. But it will require redeployment of your functions to get them onto the Flex Consumption plan. Let me know what you think.


Aditya Kotiyal
MVP

@Dag Brattli Let me ask the team, and then maybe we can plan the switch.


@Dag Brattli Is there any impact on already running functions after we switch to the Flex Consumption plan? We may test a sample function first before we touch the redeployment of the core functions.


  • Practitioner
  • May 16, 2025

@Niranjan Madhukar Karvekar No, functions already running on the Consumption plan will continue to work as before. But any new function deployment will (if we toggle) then be deployed to the new Flex Consumption plan.

PS: Note that the Azure Consumption plan will be deprecated (by Microsoft) in about three years’ time, so you will eventually have to move everything anyway. But it’s good to start with a few functions first.


Aditya Kotiyal
MVP

Thanks @Dag Brattli, I also had a chat with Everton.

 

I am checking with the team on when they can plan to redeploy the function. Once I get a date, we can switch the plan and ask the team to redeploy.

Thank you for your support.

 


  • Practitioner
  • July 10, 2025

From our side, we consider this resolved. This should be fixed by Azure Flex Consumption, which has recently been rolled out. If you redeploy, the new function will appear on Flex. Please note that you cannot expect all calls to be processed perfectly in parallel. However, the sequential execution you have experienced before should not happen.
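A minimal sketch of how this could be sanity-checked after redeploying, assuming the cognite-sdk Python client and the handler sketch above; `my-parallel-function` is a hypothetical external ID:

```python
import time
from concurrent.futures import ThreadPoolExecutor

from cognite.client import CogniteClient

client = CogniteClient()  # assumes credentials are configured via env vars/config


def call_once(i: int) -> dict:
    # Submit one call, stamping the submission time so queue delay is visible.
    call = client.functions.call(
        external_id="my-parallel-function",  # hypothetical external ID
        data={"index": i, "submitted_at": int(time.time() * 1000)},
        wait=True,  # block until the call has finished
    )
    return call.get_response()


# Fire 20 calls at once; on Flex Consumption the started_at values should
# overlap instead of forming a strictly sequential chain.
with ThreadPoolExecutor(max_workers=20) as pool:
    responses = list(pool.map(call_once, range(20)))

for r in sorted(responses, key=lambda r: r["started_at"]):
    print(r["started_at"], r["duration_ms"], r["queue_delay_ms"])
```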


Aditya Kotiyal
MVP

Hi @Dag Brattli,

 

@Niranjan Madhukar Karvekar has observed some improvements.

He is trying it in a project named “slb-hackathon” in the West Europe cluster.

But we want to replicate it with more concurrent executions.

 

Can you help us confirm whether the concurrent function limit is set to 50 or 100 in this project?

If it is 50, can it be raised to 100 so that the team can replicate it in this environment?