Hello! We have added specific metadata fields to our Cognite Functions (project name, git branch, git commit hash, git actor, etc.) so we can target very specific deployments during debugging and development. However, I am not able to list Cognite Functions with the Python SDK by filtering on metadata. For example, client.functions.list(metadata={"my_key": "my_val"}, limit=None) just returns an empty list. I know the functions are there, since I can target some of them using the external ID prefix argument. We have the same metadata fields on our time series and sequences, and I am able to list those successfully using their metadata argument. Is this a bug in the API?
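In the meantime, and purely as a rough sketch rather than a confirmed fix, one workaround is to list all functions and filter on metadata client-side, since the returned Function objects carry their metadata:

from cognite.client import CogniteClient

client = CogniteClient()  # assumes client configuration is already set up

# Fetch all functions, then keep only those whose metadata contains the
# key/value pairs we care about (filtering done client-side as a workaround).
wanted = {"my_key": "my_val"}
all_functions = client.functions.list(limit=None)
matching = [
    f for f in all_functions
    if f.metadata and all(f.metadata.get(k) == v for k, v in wanted.items())
]
print([f.external_id for f in matching])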
It would be very helpful to have the ability to copy and search CDF groups by Azure Source ID. Currently, a user can search for a group by name, ID (internal), and capability, but the link between an Azure group and a CDF group is the Source ID. Also, if a user doesn't have access to edit a CDF access group, the only option is to copy the internal ID, not the Source ID, which is of little use to the user.
On behalf of Celanese super users: I would like the ability to save layouts/templates for common filters that I use when searching for data in CDF. Currently, every time I use CDF, I need to apply similar filters to focus my data search, such as data set and/or additional metadata filters. Being able to use user-defined layouts would make the user experience more streamlined and approachable.
In multiple cases, validation of a checklist is not needed, and those checklists will remain pending depending on the Team Captain's availability. Approving all the pending checklists also takes up the Team Captain's time. The suggestion is to add an option on the template creation screen so the user can define whether checklists generated from that template require approval, along with the ability to define the group of users that will be able to approve them. As a complement, new checklist statuses could also be created: one to indicate that a checklist was finished but needs approval, and one to indicate it has been approved.
Need the ability for admins to configure the default on/off state for all 3D resource types (CAD, Point Cloud, 360 pano). This setting should be persistent for all users. For most of our intended use-cases and personas, all three should be ON by default.
When a user navigates to a Canvas, selects Add data, and then searches for an asset, using the "Filter by name" option does not automatically bring the matching file to the top of the search results. Instead, it highlights the matching file within the existing list. This is because the search uses /documents/list instead of /documents/search when a user navigates to an asset and searches files using "Filter by name". Can we have the search use /documents/search instead of /documents/list when using "Filter by name"? This would return results ranked by search relevance. This feature request was created on behalf of @Mansoor Ahmed.
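For illustration only (the request bodies and the property path below are my assumptions about the Documents API, not verified), the difference between the two endpoints looks roughly like this:

from cognite.client import CogniteClient

client = CogniteClient()  # assumes client configuration is already in place
base = f"/api/v1/projects/{client.config.project}/documents"

# Current behaviour: /documents/list only filters, it does not rank by relevance.
# (The property path ["sourceFile", "name"] and the value are illustrative.)
listed = client.post(
    base + "/list",
    json={"filter": {"prefix": {"property": ["sourceFile", "name"], "value": "21-PT"}}, "limit": 25},
)

# Requested behaviour: /documents/search ranks results by relevance to the typed text.
found = client.post(
    base + "/search",
    json={"search": {"query": "21-PT-1019"}, "limit": 25},
)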
Hello, we currently need to support radioactivity time series that are measured in the gAPI unit. What is gAPI?
• API (American Petroleum Institute) units are commonly used in the oil industry to indicate natural radioactivity in geological formations, particularly in gamma ray logs.
• gAPI nominally stands for giga API units (1 gAPI = 10⁹ API units), but this is a common source of confusion — in most cases, gAPI simply means "gamma API units", not giga.
• In well logging, 1 API unit is a relative measure, not an absolute unit of radioactivity like Bq.
This makes gAPI very difficult to integrate into the CDF unit catalog (https://github.com/cognitedata/units-catalog), since we cannot define an absolute conversion to an SI unit. The conversion depends on a calibration curve or lab measurement linking the gamma ray count in API units to disintegrations per second (Bq). My question is: is there a way to support this in the unit catalog, i.e. support for units without a conversion to an SI unit? It is very important to us that every time series is linked to a unit in the CDF catalog and has a unit_external_id, even though we will not use conversion for this particular unit. Happy to discuss. Thank you.
I would like a third-party web application to be able to open a new CDF Charts chart via a link, with pre-configured time series, start/end times, and other chart metadata. The typical way to do this would be for the third party to call an HTTP POST or GET method passing the right parameters. The third-party application in question is based on DataMosaix, so you can assume the user performing the operation is already authenticated.
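To make the request concrete, here is a purely hypothetical sketch of what such a deep link could look like; Charts does not currently document a scheme like this, and every parameter name below is invented for illustration:

from urllib.parse import urlencode

# Hypothetical parameters only, to show the kind of GET link the third party would build.
params = {
    "timeseriesExternalIds": "TS_PRESSURE_001,TS_TEMP_001",
    "start": "2024-01-01T00:00:00Z",
    "end": "2024-01-07T00:00:00Z",
    "chartName": "Pump 101 overview",
}
url = "https://<cluster>.fusion.cognite.com/<project>/charts/new?" + urlencode(params)
print(url)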
We would like to have a "contains" filter option on columns. This would be useful for searching string-type columns to check whether they contain a given substring, so the data can be filtered properly.
Currently, when a user attempts to access a project via direct URL (e.g., https://ra-neworgprefixpreprod-40-nonprod.fusion.cognite.com/ra-neworgprefixpreprod-40) without the necessary access permissions, the application displays a generic or unclear error screen. This lack of a clear, descriptive error message can confuse end users, who may not understand whether the issue is a system error or a lack of access rights.
Expected Behavior: when a user tries to access a project without sufficient permissions, they should see a clear and informative message such as: "You do not have permission to access this project. Please contact your administrator if you believe this is an error." This will improve the user experience and reduce confusion by clearly indicating that the issue is access-related.
Suggested Improvement: display a permission-related error message when a user lacks access, and ensure the message includes the fact that the user lacks access plus, optionally, guidance on how to request access or who to contact.
User Name - Vilas Dharma Suryawanshi | vilas.dharmasuryawanshi@rockwellautomation.com
Organization - Rockwell
Hi team, there should be an API provision to migrate pre-existing Charts and Canvases developed in a Dev project to Prod. Currently, all of them need to be recreated from scratch.
Posting on behalf of Koch Ag and Energy Solutions.

Summary
We are requesting functionality to enable support for defining custom GraphQL schemas and resolvers on top of CDF. This capability would allow our team to build tailored, efficient APIs for operational applications that surface actionable insights from contextualized manufacturing data.

Context
We are using CDF to centralize and contextualize key plant data, including assets, sensor readings, events, shift logs, inspection documents, and relationships to business systems. Our application team is developing internal tools for specific use-case enablement. These tools require fast and intuitive access to data, but CDF's default GraphQL schema is too low-level for these user-facing applications.

Current Challenges
- Our frontend developers must manually chain multiple queries to get asset status, performance trends, and associated work orders.
- Common business logic (like calculating asset availability, filtering alarms, or joining time series to asset metadata) must be repeated in each app.
- We cannot easily present business-specific views like "ProductionLineHealth" or "ShiftSummary" without building and maintaining our own middleware.

Requested Capabilities
We are asking for the ability to:
- Define custom GraphQL object types that aggregate and shape CDF data into operational concepts (e.g., ProductionLine, MachineStatus, DowntimeSummary)
- Implement custom query resolvers for calculated fields (e.g., OEE, MTBF, energy efficiency)
- Integrate external system data (e.g., SAP PM, shift logs, scheduling systems) into a unified query layer
- Apply business rules, KPIs, and filtering logic server-side for reuse across applications
- Secure the schema and field-level access based on user roles (operator vs planner, plant vs corporate)

Example Use Case
Custom type: ProductionLineOverview

type ProductionLineOverview {
  lineId: ID!
  name: String
  currentThroughput: Float
  availability: Float
  topFiveAlarms: [Alarm]
  openWorkOrders: [WorkOrder]
  energyUseToday: Float
}

This schema would simplify dozens of low-level CDF queries into a single, reusable API call for our production dashboard.

Expected Value (Benefit: Impact)
- Streamlined application development: APIs are aligned with plant workflows, not just data structure
- Lower integration effort: data from multiple sources can be presented in one logical schema
- Faster time to insight: field users access actionable metrics in fewer steps
- Consistent KPI logic: calculations like OEE or uptime are standardized
- Broader adoption of CDF: more teams (maintenance, operations, digital) can build on it directly

Potential Implementation Paths
- Allow deployment of custom GraphQL services inside CDF projects (e.g., based on Apollo Server)
- Provide schema extension hooks that pull and reshape data from CDF's GraphQL or REST APIs
- Offer first-party support for calculated fields or composite objects in GraphQL
- Enable hybrid access to external data systems within the same query context

Closing
This capability would significantly enhance how our teams leverage CDF in the field, drive app development efficiency, and improve visibility across operations. We would be happy to participate in design validation or pilot this capability if/when it becomes available.
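To make the "manual chaining" pain point concrete, here is a rough sketch (the external IDs, event types, and naming convention are all hypothetical) of what a frontend or middleware has to do today, and what a custom ProductionLineOverview resolver could encapsulate server-side:

from cognite.client import CogniteClient

client = CogniteClient()  # assumes client configuration is already in place

def production_line_overview(line_external_id: str) -> dict:
    # Chain the low-level calls a frontend currently makes itself;
    # a custom resolver would perform this server-side and return one object.
    asset = client.assets.retrieve(external_id=line_external_id)
    alarms = client.events.list(
        asset_external_ids=[line_external_id], type="alarm", limit=5
    )
    work_orders = client.events.list(
        asset_external_ids=[line_external_id], type="workorder", limit=25
    )
    throughput = client.time_series.data.retrieve_latest(
        external_id=f"{line_external_id}:throughput"  # hypothetical naming convention
    )
    return {
        "lineId": line_external_id,
        "name": asset.name if asset else None,
        "currentThroughput": throughput.value[0] if throughput and throughput.value else None,
        "topFiveAlarms": [e.dump() for e in alarms],
        "openWorkOrders": [e.dump() for e in work_orders],
    }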
In Canvas, when you add an object that is related to another object through an annotation, the relationship is shown with purple lines. We'd like the canvas to also show when objects are related in other ways (similar to my red lines below), along with the ability to turn each type of line off independently.
Hi, when using rules in Canvas, I would like it to be easy to see which rules are assigned to which objects. I can't do this from the rule list; I have to select each object (shape) to see which rules are assigned to it. I'd be happy to have this looked into.
I am using the Cognite SDK and need to count the instances present in a given view. I can list all instances, but that is a heavy response and likely a heavy execution on the backend side as well. I only need the total count of instances, which, if computed on the Cognite side and returned as a number, would be lighter and quicker. It would also be good if I could apply some filters and get the corresponding count.
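For what it's worth, the data modeling instances aggregate endpoint may already cover part of this; below is a rough sketch with the Python SDK, where the exact signature, the filter shape, and the space/view/property names are my assumptions and worth verifying against the SDK docs:

from cognite.client import CogniteClient
from cognite.client.data_classes import filters
from cognite.client.data_classes.aggregations import Count
from cognite.client.data_classes.data_modeling import ViewId

client = CogniteClient()  # assumes client configuration is already in place
view_id = ViewId(space="my_space", external_id="MyView", version="v1")  # hypothetical view

# Count all instances of the view without paging through them.
total = client.data_modeling.instances.aggregate(view_id, aggregates=Count("externalId"))

# Count with a filter applied (the property and value are illustrative).
is_active = filters.Equals(view_id.as_property_ref("status"), "active")
active = client.data_modeling.instances.aggregate(
    view_id, aggregates=Count("externalId"), filter=is_active
)
print(total.value, active.value)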
As we learn more about how our users use Canvas, two main themes are emerging from several use cases.
STATIC: use cases focused on events are more static in nature, and users DO NOT want the data on their canvas to change or update (RCAs, investigations, work packages, etc.).
LIVE: use cases such as collaborative spaces that teams use for ongoing monitoring require the data (time series, files, 3D, etc.) to update to the newest version as it flows into Cognite.
We are seeking a way to allow users (or systematic processes, e.g. a system-built work package canvas) to determine the canvas behavior that best suits their needs. This could be a toggle in the settings set by the user, or a setting that can be set systematically. Today this affects files the most for us, and we have to choose one or the other (we have chosen to keep files live for now). We are happy to demonstrate examples to help define this further.
Users are asking for full text-editing capabilities within the note card or a similar interface, comparable to a PDF editor, which would allow them to do the following (and more; reference a PDF editor as needed). Another requested feature is shapes that text can be added to (think of a balloon with an arrow on it), where the balloon holds text in the middle and the arrow points to a line. This could be a shape with a connector if needed, but it would help to have more defaults (custom-created by the customer if needed) or prebuilt shapes matching standard industry symbols and uses. The balloon is commonly used for tie-points on projects; another example is a triangle with a number in it, commonly used for revisions. Details requested for the shape of the box and the text functionality within it (the users specifically said that everything we can do in this product request text entry would be great to have in such a pane):
1. Change the format, including shape (rectangle, circle, triangle, etc.), size (dimensions), and color (border and fill).
2. Change the font color at a character level (not the whole card), toggle bold/italic/underline, and change the justification of the card.
Currently, if a checklist was not done due to a UPE or any other plant outage, there really is not a good way to capture the situation. The only options are:
Option 1 - Set all tasks to "NA" or "Not OK", set the checklist to "Done", and add a note explaining why the checklist was not done.
Option 2 - Delete the checklist.
Neither is a good option from a metric-tracking point of view; they don't tell the whole story of why the checklists were not completed. Adding a status such as "Not Done due to Outage" would easily give us insight into situations like these.
Provide an additional attribute level to filter 3D resource layers, e.g. "Unit" or "Project". Example:
CAD models → Project → Model Name
Point clouds → Unit → Point cloud Name
We recently encountered an issue where we had to delay a task in a workflow to wait for transformation metrics to become accurate. To solve this we put a sleep in a Cognite Function. We could also use external function calls to create a delay, but I was thinking this might be a feature that would be nice to have built into Data Workflows.

Considering that there are times when you may want to delay a task for n minutes, I was wondering if there are any plans to add a built-in delay task type, or a delay field for the existing task types, to Data Workflows? e.g.

# maybe put it in the dependsOn field
- externalId: someTask
  type: ...
  dependsOn:
    - externalId: transformationTaskId1
      delayMinutes: 10

# or as a dedicated field for each task
- externalId: someTask
  type: ...
  delayMinutes: 10
  dependsOn: ...

# or as a unique task type
- externalId: '10MinutesAfterTransforms'
  type: delay
  minutes: 10
  dependsOn:
    - transformationTaskId1
    - transformationTaskId2
- externalId: someTask
  type: ...
  dependsOn:
    - externalId: '10MinutesAfterTransforms'

And if not: is this something the Cognite team would consider adding?

Currently I think you can build a dedicated delay function by having a Cognite Function forward a delay instruction to an external endpoint using the isAsyncComplete flag, and have some external process tell the workflow to complete the task when the delay is over. But I imagine this is the sort of thing that many teams who use Data Workflows would eventually need to build themselves.

You can also sleep inside the Cognite Function, but it's wasteful since you'll be paying for the time slept, it increases the risk of the function timing out, and the function needs to "know" more about the timing quirks of the other processes in the workflow.
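For context, the sleep workaround mentioned above looks roughly like this in a Cognite Function (a minimal sketch; passing the delay via the data payload and the field name delayMinutes are just examples):

import time

def handle(client, data):
    # Wasteful workaround: the function stays alive (and billed) while it sleeps,
    # and longer delays risk hitting the function timeout.
    delay_minutes = data.get("delayMinutes", 10)
    time.sleep(delay_minutes * 60)
    return {"sleptMinutes": delay_minutes}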
Hi, currently transformations must target a view in order to populate a container. But if we update the view (e.g. remove an attribute that we no longer want to expose), the transformation will no longer populate that field, even though the container still supports it. As a result, existing users relying on older views that still expose this attribute will see empty data, creating unintentional breaking changes.
Suggestion: enable transformations to write directly to containers. This would:
- ensure stable ingestion regardless of view changes,
- prevent breaking downstream users,
- simplify the data pipeline.
To make it easier to use the data registered in CDF, we propose developing a data-linkage function with Excel. For example, when creating a plant operation report from data registered in CDF (mainly time series data), I currently have to download the time series data as a CSV and then copy and paste it into the report, which is very time-consuming. If data could be pulled directly into Excel cells through an Excel linkage function, use of the data would be promoted. There is already an OData option, but it would be best if this were available in a more user-friendly form, like an Excel function.