@Sunil Krishnamoorthy @Carin Meems Any update on this? On Oct 7, our workaround of selecting a wider date range and throwing away the unneeded data stopped working in some cases for a production dashboard (Bug #73988 on our side). This is the error we are seeing even though end is greater than start: "Request with id 9d6f0c9c-18a8-9786-8d34-bfa535564e5b to https://az-eastus-1.cognitedata.com/api/v1/projects/pinnacle/timeseries/data/list failed with status 400: Invalid time range, end must be later than start."
The Cognite Replicator has worked great for us for refreshing our Dev and QA environments from Prod. The only gap we found was that the annotations linking assets to 3D models or images were missing, but I have not verified whether this is still an issue. For any bugs we find or suggestions, we are able to submit them as an issue on the GitHub repo, like this: Add Support For Mapping of Annotations · Issue #174 · cognitedata/cognite-replicator (github.com)
Yes, the Python SDK has a really nice DatapointsFetcher implementation that splits the API requests into multiple parallel requests, which both gets around the API’s 10,000-time-series limit and improves performance. The SDKs for .NET and Java seem to lag behind quite a bit in feature set, leaving it up to the SDK user to implement these advanced features.
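For anyone implementing this themselves in another SDK, here is a minimal sketch of the same idea, not the SDK's actual implementation: split a large list of time series into API-sized batches and fetch the batches in parallel. `fetch_batch` is a hypothetical stand-in for the real `/timeseries/data/list` call.

```python
from concurrent.futures import ThreadPoolExecutor

API_BATCH_LIMIT = 10_000  # the per-request time series limit mentioned above


def fetch_batch(external_ids):
    # Hypothetical stand-in for one POST to /timeseries/data/list.
    return [{"externalId": xid, "datapoints": []} for xid in external_ids]


def fetch_all(external_ids, max_workers=8):
    # Split the full id list into API-sized batches...
    batches = [
        external_ids[i : i + API_BATCH_LIMIT]
        for i in range(0, len(external_ids), API_BATCH_LIMIT)
    ]
    # ...then fetch the batches concurrently, preserving input order.
    results = []
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        for batch_result in pool.map(fetch_batch, batches):
            results.extend(batch_result)
    return results
```

In a real implementation you would also want retries and rate limiting, which the Python SDK handles for you.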
Thank you @Sunil Krishnamoorthy. It is taking 30 seconds to 2 minutes for each of these queries on our dashboard to return 10 rows of data, so I am hoping that changing the WHERE clause to a point lookup rather than a range will speed this up for us.
Unable to set a user variable in transformation query
If CDF were my product, I’d categorize this as an unimplemented feature, but since I’m just a user I’ll call it a Bug and report it.
Found a somewhat silly workaround: my guess is the API is returning the 4 rows and Spark is doing the work to filter it back down to the 2 that I want. This is what I’d prefer to run, but it returns no results:
The 2 date strings seem to translate to different timestamps.
You’re welcome! The trouble is there are so many options around orchestration and workflow, and it’s difficult to know which will actually stand the test of time.
workflow-engine · GitHub Topics
orchestration-framework · GitHub Topics
Once the magic is implemented in the API then the implementation in DirectQuery gets simplified, right?
Yes. Maybe a simple set of views is all that is needed to standardize across PostgreSQL Gateway, Direct Query, and the SQL Transformation features.
Of course, Temporal is also a great starting point for workflow, as it is already more mature. I just don’t have as much experience with it since it does not have native .NET support. Python, Go, Java, PHP, and JS/TypeScript are supported, though.
Temporal Application development guide | Temporal Documentation
https://temporal.io/
@Sunil Krishnamoorthy We are heavy Azure Durable Functions users for our orchestration, but we are very much looking forward to these developments in Dapr, which allow building in a more cloud-agnostic and language-agnostic way. You may be interested in them as well:
[Proposal] Workflow building block and engine · Issue #4576 · dapr/dapr (github.com)
Dapr Community Call 67 - YouTube
This is what a working datapoint_aggregates SQL looks like:

select timeseries_external_id, "timestamp", "average", "count"
from public.datapoint_aggregates
WHERE timeseries_external_id IN ('some external id', 'another external id')
  AND "timestamp" > '2020-08-23T00:00:00'
  AND "timestamp" < '2030-08-23T00:00:00';
Figured it out. The PowerQuery StartsWith gets converted to LIKE '{search string}%', so this is what you’ve implemented to translate to the API call. This is not supported for datapoint_aggregates, but it is supported for timeseries, as documented here: Capabilities and limitations | Cognite Documentation

This query returns the result I expect:

select * from public.timeseries WHERE external_id LIKE 'ecc86a5d7eee491942ee08da87bc39bbHsePoFMi%';
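For what it’s worth, here’s a rough sketch of how that StartsWith-to-LIKE translation might look. The escaping of literal `%` and `_` is my assumption about what a careful translator would do, not confirmed behavior of the PowerQuery connector.

```python
def starts_with_to_like(search_string: str) -> str:
    """Build a LIKE pattern equivalent to StartsWith(search_string).

    Escapes backslash, % and _ so they match themselves, then appends
    the trailing wildcard. (Assumed escaping rules, for illustration.)
    """
    escaped = (
        search_string.replace("\\", "\\\\")
        .replace("%", "\\%")
        .replace("_", "\\_")
    )
    return escaped + "%"
```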
Thanks for the reply. I don't need pgAdmin to work right now; I was mostly testing it out of curiosity to discover DirectQuery’s capabilities outside of Power BI usage. I am using the psql command line now, which does not require cursor functionality. I am not getting any rows back from my query, but I can’t figure out why. Am I using `starts_with` correctly here?

pinnacle-dev=> select "timestamp", "average" from public.datapoint_aggregates WHERE starts_with(timeseries_external_id, '7f9109d2-cae7-4aba-3f7a-08da85138eb7') AND "timestamp" > '2020-08-23T00:00:00' AND "timestamp" < '2030-08-23T00:00:00';
 timestamp | average
-----------+---------
(0 rows)

I see the time series in CDF:
OK. In the new UX, when you click “Duplicate” it auto-names the new Transformation as {old name} Copy and generates a random external Id. Having to navigate to another place to change these feels a little awkward.
pgAdmin 4 v6.12. Here are the settings I use in the connection:
Host name/address: bi.az-eastus-1.cogniteapp.com
Port: 5432
Maintenance database: {my CDF project name. Using “postgres” here gives an error.}
Username: {my email used for DirectQuery}
SSL Mode: Prefer

It logs in and shows me a dashboard with server metrics and activity, but the “Databases” node in the browser never shows a DB and I am not allowed to open any type of SQL Query Tool. Could just be me never having used pgAdmin before.
@Sunil Krishnamoorthy fyi
Could Cognite Functions be used now instead of Azure Functions to avoid the inherent timeout issues with Azure Functions? Could Cognite Functions be configured with a long enough timeout for something like a 12-hour transformation run?
Nice! I had no idea transformation preview was in the SDKs. Thanks for the help. Is there any way to parameterize the transformation SQL? As a 3rd-party app dev, we sometimes need to segment data and handle it separately, but would like to re-use the same transformation SQL. A specific case of this is data for different customers that we store in the same CDF project but segment by Data Set ID. We would like to define a transformation, pass in a DataSetId parameter to run it for a specific data set, and then at a different time run the same transformation with a different DataSetId. The workaround right now is to internally manage a set of transformation template SQL and create copies with the parameters substituted in for each context (DataSetId in the example) that we’d like to handle. But this adds a bit of management overhead as new data sets are added or removed.
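For reference, the template workaround described above can be as simple as plain string templating before creating each transformation copy. This is a minimal sketch; the table and column names are illustrative, not a real schema.

```python
from string import Template

# Hypothetical transformation SQL with a placeholder for the data set id.
TEMPLATE = Template(
    "select * from mydb.assets where dataSetId = $data_set_id"
)


def render_transformation_sql(data_set_id: int) -> str:
    # Produce the concrete SQL for one context (one data set).
    return TEMPLATE.substitute(data_set_id=data_set_id)
```

Each rendered string would then be created as its own transformation, which is exactly the management overhead mentioned above: one copy per data set to keep in sync.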
Hi @ipolomanyi, Azure Data Studio does not have an option in its Postgres plugin to disable cursors, but it is far from the most popular graphical interface for querying Postgres. I wasn’t able to query with pgAdmin either, but I will try some other tools and see if there’s one that can do simple querying without cursors. Thanks!
Thank you for the input @Sunil Krishnamoorthy. Is there a server-side .NET SDK available to run ad-hoc SQL for the “explore data” and “transformations preview” that you have suggested?
@RobersoSN Since this idea is “Parked”, one workaround I recently discovered is getting these types of aggregates from the PostgreSQL Gateway, which doesn’t have the Synthetic Time Series limitations. The gateway is intended for data ingestion, but it seems to be working now for querying datapoints, including their aggregates. Personally, I’m really liking being able to query via SQL that I’ve used for 20 years instead of having to call an API that I learned a few months ago. :-)
Setup and administration | Cognite Documentation
https://learn.cognite.com/postgresql-gateway
@Einar Omang Can you give more detail on this? “Since aggregate granularity means that the start and end points are rounded, the third one might be a little tricky to get right, especially if you are interested in sums between arbitrary time stamps.” If my data is always certain to be daily resolution and I am querying the API or the PostgreSQL gateway for daily resolution, will I hit the issue you are describing?
Wow! It works! You’re my hero, @Einar Omang! All we need to do is include an upper and lower bound for time_stamp in the query. Thanks!