
Hi,

 

When developing with the Cognite Python SDK, a common pain point is the restrictions the API imposes on queries. Querying time series, for example, is limited to 100 asset IDs per request.

 

I believe the Python SDK should recognize these constraint violations and batch/concurrently dispatch requests in chunks that do not violate the constraints, optionally with a performance warning to the developer.

Is this sane, or do you think that the developer/customer should maintain a wrapper around the SDK for batching each endpoint (as we’ve currently done at Statnett)?
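For illustration, the kind of wrapper I mean could be sketched roughly like this (hypothetical helper, not SDK code: `fetch` stands in for whatever SDK retrieval call is limited to 100 asset IDs per request):

```python
from concurrent.futures import ThreadPoolExecutor
from itertools import chain

MAX_ASSET_IDS = 100  # per-request limit mentioned above


def fetch_in_chunks(asset_ids, fetch, chunk_size=MAX_ASSET_IDS, max_workers=4):
    """Split asset_ids into API-sized chunks and dispatch them concurrently.

    `fetch` is any callable that takes a list of asset IDs and returns a
    list of results, e.g. a thin wrapper around an SDK retrieve call.
    """
    chunks = [asset_ids[i:i + chunk_size]
              for i in range(0, len(asset_ids), chunk_size)]
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        # pool.map returns results in chunk order even though the
        # requests themselves run concurrently
        return list(chain.from_iterable(pool.map(fetch, chunks)))
```

This is what each endpoint wrapper at our end boils down to; having the SDK do it internally would remove that duplication.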

Hi.

Thanks for getting in touch. Having this in the SDK absolutely makes sense; I have added the request to our backlog. Depending on what you are trying to do, it may be more suitable to pass a root asset ID as a filter instead, which restricts the results to time series belonging to the entire subtree rooted at the specified asset. This subtree can contain up to 100,000 assets.
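Conceptually, a root-asset filter matches every time series attached to any asset in the subtree. A minimal illustration of that semantics with plain dictionaries (hypothetical data, not SDK calls):

```python
from collections import deque


def subtree_ids(children, root):
    """Collect all asset IDs in the subtree rooted at `root` (BFS).

    `children` maps an asset ID to a list of its direct child asset IDs.
    """
    seen, queue = {root}, deque([root])
    while queue:
        node = queue.popleft()
        for child in children.get(node, []):
            if child not in seen:
                seen.add(child)
                queue.append(child)
    return seen


# Hypothetical hierarchy: one root (1) with two substations (10, 11),
# each holding some equipment assets.
children = {1: [10, 11], 10: [100], 11: [110, 111]}
# Time series tagged with the asset they belong to; ts_c sits outside the tree.
ts_asset = {"ts_a": 100, "ts_b": 110, "ts_c": 999}

in_subtree = subtree_ids(children, 1)
matched = [name for name, aid in ts_asset.items() if aid in in_subtree]
```

The API performs this matching server-side, so a single filtered request replaces many per-asset-ID requests, as long as the subtree stays under the 100,000-asset limit.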


Thank you for the input Robert!

This seems like a reasonable thing for the SDK to be capable of, so I’ve added your insight to Productboard and will link it to a feature idea for the Python SDK to handle API limits more dynamically (batching/chunking and back-off, with appropriate warnings where necessary).

 

We will likely need more information as we investigate how to implement the functionality, so we hope it’s OK to reach out as needed.


Hi, please do not hesitate to reach out. I’m available on pretty short notice for these kinds of meetings.

 

@erlend.vollset Our hierarchy is pretty wide in a contextually rich region. E.g. Substations (the primary organizing container of assets) share a common parent with ~1e2 other Substations, such that we break the 100k-asset subtree limit for parents of Substations. Much of our structure is stored in relationships, so we seldom use the hierarchy for context (we raised a separate issue on the 100k-asset subtree limitation).

 

