Gathering Interest

Increase max number of Cognite Functions per project

Related products:Functions
  • June 16, 2025
  • 4 replies
  • 98 views

Markus Pettersen
MVP

Hi,

The maximum number of Cognite Functions allowed per project is a crippling limitation for us, rendering Cognite Functions nearly unusable at scale.

Our current limit is 150, and there is a hard cap of 250 that we can't exceed. This is because an underlying cloud-provider limitation surfaces through CDF.

The ideal situation would be to use Cognite Functions as the processing step behind user-facing applications such as Streamlit apps, but the cap on the number of functions means we have to be very selective about which applications/integrations can use this CDF feature. If we were to recommend this as the way to go (which is what we want to do), we would hit the limit in a matter of days.
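
For context, a sketch of the kind of processing step we mean. The entry-point name `handle` and the injected `client`/`data` arguments follow the Cognite Functions conventions; the payload keys and the scaling logic are made-up placeholders, not our actual code:

```python
def handle(client, data):
    # `client` is an authenticated CogniteClient injected by the runtime;
    # `data` is the JSON payload sent by the caller, e.g. a Streamlit app.
    # Illustrative work step: scale a list of readings from the UI payload.
    factor = data.get("factor", 1.0)
    readings = data.get("readings", [])
    return {"scaled": [r * factor for r in readings]}
```

The user-facing app would then invoke this function through the Functions API or the Python SDK and render the returned JSON, which is exactly why each application/integration ends up consuming at least one function from the project quota.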

Even if we used only one function per integration and split the work across schedules, we would still be limited to 150-250 integrations, which is simply not enough. Again, we want to use this feature and we like this part of CDF; we just don't like the limit.

 

Regards,
Markus Pettersen
Aker BP - Technical Domain Architect - CDF

4 replies

Markus Pettersen
MVP

More info, use cases, complaints:

 

Hi, I've been contacted by the Yggdrasil team about what we use for processing, especially when both the source and target systems are CDF. I know Cognite did not intend Cognite Functions as the main tool for larger processing jobs in CDF, but why not? If it scales, it's great, and it works with Cognite Workflows. We want to use Workflows more, as it makes it much easier to drive processing by events rather than schedules.

 

The issue we have with Workflows is that we can't use it to directly trigger and listen to jobs outside of CDF, as those resources sit on our internal VNET and CDF does not meet the requirements for direct access to it. From our perspective, Cognite Functions would resolve this, with Workflows as the orchestration layer. Having both orchestration and processing within CDF is something we see as a great benefit and simplification. For this to be a viable option, Functions would need to scale further in terms of RAM, CPU, runtime duration, and maximum count. Another issue is that it is hard to debug failing functions when multiple schedules trigger the same function.

  • More RAM is needed, as some jobs need a lot of context, e.g. all tags for a given facility in memory at once (and no, there is no way to filter this down further). This takes upwards of 4 GB.
  • More cores are needed to support event-based parallelism. With multiple simultaneous triggers, one core starts to struggle, and NRT requirements make waiting for the previous job to complete not an option.
  • Increased runtime duration is needed to support the more demanding tasks, which can take up to an hour.
  • And as with our other Cognite Functions inquiries, we would need a higher maximum function count for this to be viable.
  • Better debugging of Functions, especially when triggered by multiple schedules/configs
    • We need a way to easily see which config caused the error in order to properly debug
    • This part is general and much needed regardless of the rest of these points and of the scaling of Functions
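
Until better tooling exists, the workaround we can apply today is to echo the triggering schedule payload at the top of the handler, so every call's logs identify their own config. A sketch (the `facility` config key is an assumption for illustration):

```python
def handle(client, data):
    # Each schedule passes its own `data` payload; printing it first means the
    # call logs always start with the config that triggered this run.
    print(f"[config] {data}")
    try:
        # Hypothetical work step: 'facility' stands in for a required config field.
        facility = data["facility"]
    except KeyError:
        # Repeat the config next to the failure so the offending schedule is traceable.
        print(f"[error] missing 'facility' in config: {data}")
        raise
    return {"facility": facility, "status": "ok"}
```

This still requires opening logs per call, though, which is why first-class support for correlating failures with schedules would be so valuable.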

The other option is to run the orchestration outside of CDF, but that makes event-based processing more difficult, as getting those event triggers out of CDF is a bit cumbersome. We see Workflows as a much better and more intuitive option here.

 

And again, I know this is not what Cognite initially intended Functions for, but it is the most intuitive use of them that we see, and it would make things much easier for our developers.

 

 

The issues regarding Cognite Functions reported by the Yggdrasil team:

The limitations we currently know about with CDF functions are:

  • Maximum 250 instances of CDF functions
  • Limitations related to the tier in the CDF infrastructure setup for Azure Functions and Azure Storage Accounts:
    • Maximum 1.5 GB RAM
    • Not enough memory to load all the data needed for large processing jobs
    • We need to optimize quite a bit to be able to run larger jobs, which increases complexity and end-to-end execution time
  • Runtime
    • Maximum 10 minutes
    • Heavier jobs take a lot of time and risk timeouts if data volumes increase significantly day to day
    • We need to optimize and "split up" jobs into smaller pieces, but this can't be done too much as we quickly hit the max limit of 250 functions
    • Limited CPU power means there's a risk of timeouts on heavier runs. More vCores would be preferable
  • Transparency/lack of detail in errors/logs
    • Difficult to troubleshoot
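
The "split up jobs" workaround mentioned above amounts to partitioning the work so that each function call stays under the 10-minute budget. A generic sketch of the idea (the millisecond time-range representation and window size are assumptions, not the Yggdrasil team's actual code):

```python
def split_into_windows(start_ms, end_ms, window_ms):
    """Partition the half-open range [start_ms, end_ms) into fixed-size
    windows. Each window would become one function call (or one workflow
    task), keeping every invocation under the runtime limit."""
    windows = []
    cursor = start_ms
    while cursor < end_ms:
        upper = min(cursor + window_ms, end_ms)
        windows.append((cursor, upper))
        cursor = upper
    return windows
```

The catch is exactly the one listed above: every extra window is an extra scheduled invocation or an extra function, so aggressive splitting pushes straight into the 250-function cap.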

Trygve Utstumo
Practitioner

Dear Markus,

Thank you for the detailed insights from your team regarding Cognite Functions. We appreciate you outlining how you envision using Cognite Functions as a powerful processing tool, particularly in conjunction with Cognite Workflows for event-driven orchestration, and for bridging connectivity to your internal VNET. Your perspective on the benefits of having both orchestration and processing within CDF for simplification and developer experience is very clear.

We've discussed your specific points internally to provide a clearer picture of current capabilities and immediate next steps. Our Product Manager is out-of-office this week, so I’ll do my best by sharing my reflections from an engineering point of view:

  1. Runtime Duration: We understand the need for longer runtimes for demanding tasks (up to an hour) and the challenge posed by the current limits.

    • For Cognite Functions running on Azure (which your project uses), as we move to the Flex Consumption plan as the underlying infrastructure, it may be possible to extend the timeout to 30 minutes. This would be a per-project request and would involve further development work on our side to enable this for Cognite Functions. We have logged this as a new feature request based on your specific need.
  2. RAM & CPU:

    • We are actively rolling out the Flex Consumption plan for our underlying Azure infrastructure. With this change, the available RAM for Cognite Functions will be increased to 2GB. While this is an improvement from the current 1.5GB, we recognize it may still fall short of your stated need for 4GB+ for certain complex jobs requiring extensive in-memory context.
    • Regarding CPU, we understand the demand for more vCores to support event-based parallelism and NRT requirements, particularly when dealing with multiple simultaneous triggers. This is an area we continue to evaluate for future enhancements.
  3. Maximum Number of Functions:

    • You've correctly identified that the current limit of 250 Cognite Functions per project is primarily tied to the underlying limit on Azure Storage Accounts (250 per subscription by default).
    • We can investigate the possibility of increasing this particular limit for your specific Azure subscription through a manual request to Azure, potentially allowing up to 500 functions. However, it's important to note that even if this specific limit is increased, other underlying cloud provider limits may come into play, and we would need to assess these during the investigation. We will initiate this investigation for your subscription once we complete the Flex roll-out.
  4. Debugging and Observability: Your feedback on the difficulty of debugging failing functions, especially with multiple schedules/configurations, and the need for greater transparency in errors and logs, is a critical area for improvement that we are prioritising. We understand this is a general necessity for all users, regardless of scale. Personally, I believe we can provide a lot of assistance here through a well-crafted Jupyter notebook for troubleshooting, with some Atlas AI calls embedded to guide the process.
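
To illustrate the kind of triage such a troubleshooting notebook could start with: group failed calls by the schedule that triggered them, so the offending config stands out. In this sketch the call records are plain dicts standing in for metadata fetched from the Functions API, and the field names are assumptions:

```python
from collections import defaultdict

def failed_calls_by_schedule(calls):
    """Group failed function-call ids by schedule id. `calls` is a list of
    dicts standing in for call metadata from the API (field names assumed)."""
    grouped = defaultdict(list)
    for call in calls:
        if call.get("status") == "Failed":
            grouped[call.get("schedule_id")].append(call["id"])
    return dict(grouped)
```

From that grouping, the notebook could pull the logs for one failing call per schedule and feed them to Atlas AI for a guided diagnosis.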

We recognize that some of these current limitations will continue to impact your ability to use Cognite Functions as widely as you'd prefer, particularly for very long-running or memory-intensive jobs. Your insights are vital in shaping our product roadmap. We are committed to evolving Cognite Functions to meet these advanced enterprise requirements.

I know our Product Manager would very much appreciate the opportunity to schedule a direct call with your team to discuss these constraints further and collaboratively explore potential architectural approaches within the current and upcoming capabilities. I'll make sure he gets in touch with you.

Best Regards,

Trygve Utstumo,
Principal Software Engineer in Data Integrations, PhD

 


Markus Pettersen
MVP

Hi Trygve,

Thanks for the comprehensive update; good to see that the automatic answer is no longer “no”. We are looking forward to seeing these changes implemented, and to discussing the way ahead and how we could best utilize CDF’s processing capabilities.


Markus Pettersen
MVP

In the previous comment it was stated that the max number of Functions was 250; we have since been informed that it is only 150. This is causing problems right now, as 150 functions is very limiting, and while we want to use Cognite Functions, if the limits are not increased we can no longer let our developers use them and will have to resort to something else.

Is there an update on the potential for increasing this number to 250, 500, 1,000, or preferably unlimited? (I realize that last one may just be wishful thinking on my part.)

Again, we really like Cognite Functions and want to use them, but the limitations are a bit too much at the moment.

 

Markus