Oh, and I forgot to mention: FWIW, we have also rolled out some additional (advanced) metadata search/filtering capabilities for Assets, Events, Files, and Documents. See, e.g., https://api-docs.cognite.com/20230101-beta/tag/Assets/operation/listAssets
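As a rough sketch of what an advanced filter request body for the beta list endpoint above might look like. The property paths and the `equals`/`and` operator shapes are my assumptions based on the docs, so verify against the API reference before relying on them:

```python
# Hypothetical sketch of an advanced filter body for the beta
# POST /assets/list endpoint. The property paths and operator names
# are assumptions -- check the API reference linked above.

def build_asset_filter(source: str, meta_key: str, meta_value: str) -> dict:
    """Build an 'and' filter matching a source and a metadata value."""
    return {
        "advancedFilter": {
            "and": [
                {"equals": {"property": ["source"], "value": source}},
                {"equals": {"property": ["metadata", meta_key], "value": meta_value}},
            ]
        },
        "limit": 100,
    }

body = build_asset_filter("SAP", "site", "Oslo")
```

The same body should work for Events, Files, and Documents by swapping the endpoint and adjusting the property paths to that resource type's schema.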
@Andreas Kimsås Yes, I was referring to (F)DM. Thanks for the heads-up on your timeline for re-engaging with testing our Data Modeling capabilities.
Hi @Liv Hagen, and thanks for submitting this product idea! We have received the information, and @Kristoffer Knudsen has been asked to consider it for their roadmap. As the request is evaluated and a decision is made, the outcome will be communicated here, along with any requests for additional information and status updates as the team processes what you provided.
Hi @Liv Hagen and thank you for submitting this idea/feedback on behalf of BBraun. You should expect @Arun Arunachalam to be the PM who follows up with you (or directly with the customer) on the status of this idea as well as any needed clarifications.
@RobersoSN / @Andreas Kimsås, I'm guessing you have been able to play with Data Modeling for a while now. Does it satisfy the requirements you intended to convey in this Product Idea?
Hi @Taylor Zwick, Thank you for this insight. We are tracking and processing the product idea separately, but you should expect this hub post to be updated occasionally as we learn more and decide whether to include this in our future product roadmap. The current responsible PM for this area - @Elka Sierra - will keep you updated as the idea makes its way through our process.
Hi @Abram Ziegelaar (and @JohanStabekk), Thank you for this insight. We are tracking and processing this product idea separately, but you should expect this hub post to be updated occasionally as we learn more and decide whether to include this in our future product roadmap. The responsible PM for this area - @Kristoffer Knudsen - will keep you updated as the idea makes its way through our process.
Hi @robert.rill, Thank you for this product idea. We are tracking and processing it separately, but you should expect this hub post to be updated occasionally as we learn more and decide whether to include this in our future product roadmap. The responsible PM for this area - @Elka Sierra - will keep you updated as the idea makes its way through our process.
Hi @Lihui Meng, Thank you for this insight. We are tracking and processing this product idea separately, but you should expect this hub post to be updated occasionally as we learn more and decide whether to include this in our future product roadmap. The responsible PM for this area - (me) - will keep you updated as the idea makes its way through our process.
Hi @rmaidla, Thank you for this product idea! The responsible PM - @Kristoffer Knudsen - is reviewing the information you supplied. We track ideas and insights internally, but you should expect us to update this Hub post as well.
Hi @Eric Lin, and thanks for the question/idea. I'm not currently aware of any plans for a Go SDK for CDF, but I will forward the suggestion to our Product Manager for SDKs!
Or are we talking about filtering on the properties of the time series itself?
Hi @Sangavi M, and thanks for this insight! So you’re looking for a way to limit the returned data point payload (using a filter) in the GraphQL interface? Any particular filter(ing) you’re looking for? Data point value range? Time range?
At the moment, the only approach we can offer is for you to add a metadata property to the time series in question, with the value being a textual representation of the geolocation. However, we are working on exposing Time Series as nodes in Data Modeling, and when that is in place, you will be able to add your own custom data containers to those nodes, with information such as geolocation. We have also added Data Modeling geoLocation/Geography support to our roadmap. The current hypothesis is to make that capability available during the fall of 2023.
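The interim workaround above can be sketched as a small helper that adds a textual geolocation to a time series' free-form metadata. Note that the `geoLocation` key name and the `"lat,lon"` encoding are my own illustrative choices, not a CDF convention:

```python
# Sketch of the interim workaround: store a textual geolocation in the
# time series' free-form metadata. The "geoLocation" key and the
# "lat,lon" encoding are illustrative, not a CDF convention.

def with_geolocation(ts_metadata: dict, lat: float, lon: float) -> dict:
    """Return a copy of the metadata with a textual geolocation added."""
    updated = dict(ts_metadata)
    updated["geoLocation"] = f"{lat},{lon}"
    return updated

meta = with_geolocation({"unit": "degC"}, 59.9139, 10.7522)
```

The resulting dict can then be written back via a time series update in whatever SDK or API client you use; because metadata is plain string key/value pairs, any consumer will need to parse the textual value itself.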
@Lucas Carvalho de Sousa Yes, see https://docs.cognite.com/api/v1/#tag/Data-Modeling
Unfortunately, persistent synthetic time series hasn’t made it to the list of things we have the capacity to work on yet, nor has it made it onto the official roadmap (i.e., not in the cards for the next ~12 months).
Hi @Arnfinn, Not sure if you’re trying to use the “Data Modeling” UI in Fusion or if you’ve already tried the /api/v1/projects/{project}/models/spaces/delete API endpoint? If you’ve tried in Fusion, I’m sorry to have to let you know that we do not yet have support for space deletion there, so you’ll need to use a POST operation to the /api/v1/projects/{project}/models/spaces/delete endpoint, specifying an array of space IDs to delete in the request body. If your attempt was using the endpoint, it would be helpful to get a request ID so we can try to figure out what’s going on. Thanks!
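For reference, a minimal sketch of how that delete request could be assembled. The endpoint path comes from the message above, but whether `items` holds bare space IDs or wrapper objects is an assumption on my part, so verify against the API docs before sending it at a real project:

```python
# Sketch: build the POST request for the space-deletion endpoint
# mentioned above. Whether "items" takes bare space IDs (as here) or
# wrapper objects is an assumption -- verify against the API reference.

def build_space_delete_request(project: str, space_ids: list[str]) -> tuple[str, dict]:
    """Return the endpoint path and JSON body for deleting spaces."""
    path = f"/api/v1/projects/{project}/models/spaces/delete"
    body = {"items": list(space_ids)}
    return path, body

path, body = build_space_delete_request("my-project", ["scratch-space"])
```

You would then POST `body` to `path` on your cluster's base URL with your usual auth headers; a space typically has to be empty of nodes, edges, and other resources before deletion succeeds.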
Hi, and thank you for this excellent suggestion. I’ve taken the opportunity to start a discussion internally (at Cognite) about the “source” parameter in general - making it available in Time Series and Sequences, where it’s currently not - and proposed an update to the API documentation for the Assets, Events and Files instances of the property. I’ve left the sourceType property in the Relationships API alone because I believe the description there is adequate and appropriate if/when the “source” property descriptions for Assets, Events and Files are updated to more clearly indicate the purpose of the property itself. Hope this helps!
Hi, and thanks for this suggestion! I’m adding this to our tracking system to ensure it will be evaluated for possible inclusion in a future release. We’ll update this idea as we have more info to share.
Correct, Data Modeling (DM) is not intended to store the equivalent of 100s of millions of data points.You could perhaps use the TimeSeries type in DM and store the data as data points in a Time Series.
I have shared your question with our development team, and I’ll let you know more as they are able to start working on this next week.
Not quite following you, @ibrahim.alsyed? Enhanced Data Modeling (aka FDM) has upsert support today and is expected to GA with upsert support included. Beyond that: the work to get FDM to GA in the April release consumes our available storage resources, and more than 95% of the team needed to deliver upsert for other resources like Assets, Events or Relationships is heads-down on the FDM deliverable. They will likely stay engaged with that until sometime after the GA release (potentially as long as 4-8 weeks post-GA). Possibly too much “making the sausage” info above, but I wanted to share in the interest of transparency.
FWIW, the reason we link by the internal ID of the asset has to do with the fact that we allow externalIds to be mutable in the current asset-centric model. That might not (and shouldn’t) prevent a transformation from looking up the asset by its externalId and using its assetId (the internal ID) to create a link.
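The lookup-then-link pattern above can be sketched like this. The `asset_index` dict stands in for a real lookup (e.g. an SDK retrieve call or a join in a transformation), and all names here are illustrative:

```python
# Sketch of the lookup-then-link pattern: resolve an asset's (mutable)
# externalId to its immutable internal id, then link by the internal id.
# 'asset_index' stands in for a real lookup (e.g. an SDK retrieve call);
# all names here are illustrative.

def link_by_external_id(asset_index: dict[str, int], asset_external_id: str) -> dict:
    """Return a link payload keyed by the asset's internal id."""
    internal_id = asset_index[asset_external_id]  # KeyError if asset is unknown
    # Link by the internal id, which survives later externalId changes.
    return {"assetId": internal_id}

asset_index = {"pump-42": 1234567}
link = link_by_external_id(asset_index, "pump-42")
```

The key property is that the link stays valid even if someone later renames the asset's externalId, because the internal ID never changes.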
Hi @Xiaofeng Wang, and thanks for letting us know. Unfortunately, I’m unable to reproduce what you’re experiencing. When I click the bit.ly link from step 1, “Download the demo data and [...]”, in the “Prerequisites” section of the “Upload demo data to CDF (3 mins)” page you pointed to, my (Chrome) browser downloads a .zip archive named ‘FDM_Movie_data.zip’.
Hi again @EViswanathan, We do not currently have what we call “write-back” support in the asset-hierarchy-based model - i.e. the Assets, Relationships, Labels, Time Series, etc. APIs - but you could potentially write an application using our SDKs or APIs to pull data from CDF and push it to any system(s) and schema. It’s not clear to me which limitations you’re referring to? Also, with the “soon to arrive” enhanced data model functionality, you could build your models to support whatever attributes or views you need to simplify this operation.