
Product Ideas Pipeline

1170 Ideas

Andrew Montgomery
Practitioner

Support for Custom GraphQL Schemas & Query Resolvers in CDF to Enable Application Development
Gathering Interest

Posting on behalf of Koch Ag and Energy Solutions

Summary

We are requesting functionality to enable support for defining custom GraphQL schemas and resolvers on top of CDF. This capability would allow our team to build tailored, efficient APIs for operational applications that surface actionable insights from contextualized manufacturing data.

Context

We are using CDF to centralize and contextualize key plant data, including assets, sensor readings, events, shift logs, inspection documents, and relationships to business systems. Our application team is developing internal tools for specific use case enablement. These tools require fast and intuitive access to data, but CDF's default GraphQL schema is too low-level for these user-facing applications.

Current Challenges

- Our frontend developers must manually chain multiple queries to get asset status, performance trends, and associated work orders.
- Common business logic (like calculating asset availability, filtering alarms, or joining time series to asset metadata) must be repeated in each app.
- We cannot easily present business-specific views like "ProductionLineHealth" or "ShiftSummary" without building and maintaining our own middleware.

Requested Capabilities

We are asking for the ability to:

- Define custom GraphQL object types that aggregate and shape CDF data into operational concepts (e.g., ProductionLine, MachineStatus, DowntimeSummary)
- Implement custom query resolvers for calculated fields (e.g., OEE, MTBF, energy efficiency)
- Integrate external system data (e.g., SAP PM, shift logs, scheduling systems) into a unified query layer
- Apply business rules, KPIs, and filtering logic server-side for reuse across applications
- Secure the schema and field-level access based on user roles (operator vs. planner, plant vs. corporate)

Example Use Case

Custom type: ProductionLineOverview

```graphql
type ProductionLineOverview {
  lineId: ID!
  name: String
  currentThroughput: Float
  availability: Float
  topFiveAlarms: [Alarm]
  openWorkOrders: [WorkOrder]
  energyUseToday: Float
}
```

This schema would simplify dozens of low-level CDF queries into a single, reusable API call for our production dashboard.

Expected Value

| Benefit | Impact |
| --- | --- |
| Streamlined application development | APIs are aligned with plant workflows, not just data structure |
| Lower integration effort | Data from multiple sources can be presented in one logical schema |
| Faster time to insight | Field users access actionable metrics in fewer steps |
| Consistent KPI logic | Calculations like OEE or uptime are standardized |
| Broader adoption of CDF | More teams (maintenance, operations, digital) can build on it directly |

Potential Implementation Paths

- Allow deployment of custom GraphQL services inside CDF projects (e.g., based on Apollo Server)
- Provide schema extension hooks that pull and reshape data from CDF's GraphQL or REST APIs
- Offer first-party support for calculated fields or composite objects in GraphQL
- Enable hybrid access to external data systems within the same query context

Closing

This capability would significantly enhance how our teams leverage CDF in the field, drive app development efficiency, and improve visibility across operations. We would be happy to participate in design validation or pilot this capability if/when it becomes available.
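To make the request concrete, here is a minimal Python sketch of the kind of server-side composition a custom resolver would replace. The `fetch_*` helpers and their return values are hypothetical stand-ins for the separate low-level CDF queries a frontend currently has to chain; this is an illustration of the pattern, not actual CDF or Cognite SDK code.

```python
from dataclasses import dataclass

# Hypothetical stand-ins (with mock data) for the separate low-level CDF
# queries a frontend currently chains: throughput, runtime, alarms, work orders.
def fetch_throughput(line_id: str) -> float:
    return 118.5  # units/hour, mock value

def fetch_runtime_hours(line_id: str) -> tuple[float, float]:
    return 22.0, 24.0  # (uptime, planned production time), mock values

def fetch_alarms(line_id: str) -> list[str]:
    return ["HIGH_TEMP", "LOW_FLOW", "VIBRATION", "DOOR_OPEN", "HIGH_TEMP"]

def fetch_open_work_orders(line_id: str) -> list[str]:
    return ["WO-1001", "WO-1007"]

@dataclass
class ProductionLineOverview:
    line_id: str
    current_throughput: float
    availability: float          # uptime / planned production time
    top_five_alarms: list[str]
    open_work_orders: list[str]

def resolve_production_line_overview(line_id: str) -> ProductionLineOverview:
    """Server-side resolver: one call replaces several frontend round trips,
    and the availability calculation lives in one reusable place."""
    uptime, planned = fetch_runtime_hours(line_id)
    return ProductionLineOverview(
        line_id=line_id,
        current_throughput=fetch_throughput(line_id),
        availability=uptime / planned,
        top_five_alarms=fetch_alarms(line_id)[:5],
        open_work_orders=fetch_open_work_orders(line_id),
    )
```

With a custom schema, this composition and the KPI logic (availability here; OEE or MTBF in the same style) would run once server-side instead of being re-implemented in each app.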

Data workflow delay task
Gathering Interest

We recently encountered an issue where we had to delay a task in a workflow to wait for transformation metrics to become accurate. To solve this we put a sleep in a Cognite Function. We can also use external function calls to create a delay, but I was thinking that this might be a feature that would be nice to have built into Data Workflows.

Considering that there are times when you may want to delay a task for n minutes: are there any plans to add a built-in delay task type, or a delay field for the existing task types, to Data Workflows? For example:

```yaml
# Option 1: maybe put it in the dependsOn field
- externalId: someTask
  type: ...
  ...
  dependsOn:
    - externalId: transformationTaskId1
      delayMinutes: 10

# Option 2: or as a dedicated field for each task
- externalId: someTask
  type: ...
  delayMinutes: 10
  dependsOn: ...

# Option 3: or as a unique task type
- externalId: '10MinutesAfterTransforms'
  type: delay
  minutes: 10
  dependsOn:
    - transformationTaskId1
    - transformationTaskId2
- externalId: someTask
  type: ...
  dependsOn:
    - externalId: '10MinutesAfterTransforms'
```

And if not: is this something the Cognite team would consider adding?

Currently I think you can build a dedicated delay function by having a Cognite Function forward a delay instruction to an external endpoint using the isAsyncComplete flag, and have some external process tell the workflow to complete the task when the delay is over. But I imagine this is the sort of thing that many teams who use Data Workflows would eventually need to build themselves.

You can also sleep inside the Cognite Function, but that is wasteful: you pay for the time slept, it increases the risk of the function timing out, and the function needs to "know" more about the timing quirks of the other processes in the workflow.
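To pin down the intended semantics of Option 1 above, here is a small Python sketch of how a scheduler could interpret a per-dependency `delayMinutes` field: a task becomes eligible only once every dependency has finished plus that dependency's delay. The task dictionary shape mirrors the YAML above but is illustrative only, not an actual Data Workflows schema.

```python
# Sketch of per-dependency delay semantics (assumed, not an official spec):
# eligible_start(task) = max over dependencies of
#     finish_time(dep) + dep.delayMinutes (default 0).
def earliest_start(
    tasks: dict[str, dict], finish_times: dict[str, float]
) -> dict[str, float]:
    """tasks maps task id -> spec with an optional 'dependsOn' list of
    {'externalId': ..., 'delayMinutes': ...} entries; finish_times maps
    finished task ids to minutes since workflow start. Returns the earliest
    eligible start time for each task."""
    starts = {}
    for task_id, spec in tasks.items():
        deps = spec.get("dependsOn", [])
        starts[task_id] = max(
            (finish_times[d["externalId"]] + d.get("delayMinutes", 0)
             for d in deps),
            default=0.0,  # no dependencies: eligible immediately
        )
    return starts
```

For instance, if transformationTaskId1 finishes at t=5 minutes and someTask depends on it with `delayMinutes: 10`, someTask becomes eligible at t=15, with no function running (or billing) during the wait.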