
Dear Team,

I am reaching out to provide feedback and suggest enhancements for the new Cognite subscription API, particularly concerning its use in continuous data streaming applications.

Challenge with Time Series Subscriptions API:

We are currently using the Time Series Subscriptions API in a project that requires continuous data streaming. However, the API's synchronous nature has presented us with significant challenges. The main issue is the inherent latency in data processing, which is exacerbated by the need to implement a sleep mechanism with a fixed delay. Determining the optimal sleep duration is difficult: too short a delay leads to frequent polling, while a longer delay results in data lag.
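For illustration, here is a minimal sketch of the polling pattern we currently use (the function and attribute names are placeholders, not actual Cognite SDK calls); the fixed sleep is where the trade-off between polling frequency and data lag shows up:

```python
import time

POLL_DELAY_SECONDS = 5.0  # hard to tune: too short means frequent polling, too long means data lag

def stream_subscription(client, subscription_id, handle, cursor=None):
    """Continuously poll a time series subscription with a fixed delay."""
    while True:
        # Hypothetical synchronous call returning new datapoints since the last cursor.
        batch = client.fetch_subscription_updates(subscription_id, cursor=cursor)
        if batch.updates:
            handle(batch.updates)          # process the new datapoints
            cursor = batch.next_cursor     # advance the cursor for the next poll
        time.sleep(POLL_DELAY_SECONDS)     # fixed delay regardless of how fast data arrives
```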

Benefits of Asynchronous Streaming:

Implementing asynchronous functionality in the API would greatly alleviate these issues. Asynchronous methods would enable more efficient and responsive data handling, reducing latency and providing a smoother data stream. This is particularly crucial in applications where timely data processing and analysis are vital.

Suggestions for Improvement:

We suggest the addition of asynchronous streaming capabilities to the API, such as non-blocking data fetches or real-time push mechanisms. This improvement would align with industry standards and significantly enhance the API's utility for various applications, especially those requiring real-time data processing.
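To make the suggestion concrete, this is roughly what a non-blocking, asynchronous consumption pattern could look like on our side, assuming a hypothetical awaitable fetch that resolves as soon as new data is available (none of these names are existing Cognite SDK methods):

```python
import asyncio

async def stream_subscription_async(client, subscription_id, handle, cursor=None):
    """Consume subscription updates without blocking on a fixed sleep."""
    while True:
        # Hypothetical awaitable call that resolves when new data is available,
        # letting other tasks run in the meantime instead of sleeping on a fixed delay.
        batch = await client.fetch_subscription_updates_async(subscription_id, cursor=cursor)
        if batch.updates:
            handle(batch.updates)
            cursor = batch.next_cursor
```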

Best regards,

Oussama

Hi Oussama, 

Thank you so much for using our API, and for your great feedback. I’ve shared your comments with the team and we’ll come back with a reply soon.

I can say that during the development phase we decided not to implement real-time push, as that would require a sizeable effort. We would consider adding it later depending on customer feedback (such as yours!), so this is really good information.

We might look at how we can implement non-blocking / async methods and see if that’s low-hanging fruit we can include in the production release.

Kind Regards,

Glen


Hi Oussama!

We have considered a feature where the request will actively wait in the backend for new data.

You send in the request, and if there is data available, we will return it immediately. Otherwise, we will keep the request waiting while we continuously check whether new data has come in. Once we have new data, we return it immediately, keeping the latency low (< 1 second).

After a while (2-30 seconds? Not decided yet, could be configurable), we return an empty response, and you can query again.
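From the client side, the loop would look roughly like this (the call name, parameters, and response fields are placeholders, nothing is decided yet):

```python
def long_poll_subscription(client, subscription_id, handle, cursor=None):
    """Illustrative client loop for the long-polling behaviour described above."""
    while True:
        # The request waits in the backend: it returns as soon as new data arrives
        # (keeping latency under ~1 second), or comes back empty once the
        # server-side wait window (somewhere in the 2-30 second range) expires.
        response = client.poll_subscription(subscription_id, cursor=cursor)
        if response.updates:
            handle(response.updates)
            cursor = response.next_cursor
        # On an empty response we simply query again immediately.
```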

If we make such a feature, would you be interested in being an early tester?
Best Regards,

Matias