Functions under a dynamic task produce output that needs to be passed to the next function in the workflow. The output of wk-create-2 (type dynamic) is {"count": 1}. How can we send this output to the next called function, or pass on the collective output of all dynamic tasks? I tried the following, but it gave a serialization error. What is the correct way to implement this, if possible?

```json
[
  {
    "externalId": "wk-get-ext-1",
    "type": "function",
    "name": "Dynamic Task Generator",
    "description": "Returns a list of workflow tasks",
    "parameters": {
      "function": {
        "externalId": "wk-get-ext-1",
        "data": "${workflow.input}"
      }
    },
    "retries": 1,
    "timeout": 3600,
    "onFailure": "abortWorkflow",
    "dependsOn": []
  },
  {
    "externalId": "wk-create-2",
    "type": "dynamic",
    "name": "Dynamic Task",
    "description": "Executes a list of workflow tasks",
    "parameters": {
      "dynamic": {
        "tasks": "${wk-get-ext-1.output.response.tasks}"
      }
    },
    "retries": 1,
    "timeout": 3600,
    "onFailure": "abortW
```
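One possible shape for the missing piece, sketched below under assumptions: a downstream function task could reference the dynamic task's result with the same `${...}` reference syntax already used in this definition. The consumer IDs and the exact output path (`wk-create-2.output`) are assumptions, not a confirmed API, and the referenced value must be JSON-serializable, which may be the source of the serialization error.

```python
# Hypothetical sketch: build a function task that depends on the dynamic
# task "wk-create-2" and receives its output as input data. The ".output"
# path is an assumption; check the workflow reference syntax in the docs.

def make_consumer_task(dynamic_task_id: str, consumer_id: str) -> dict:
    """Build a function task that consumes a dynamic task's output."""
    return {
        "externalId": consumer_id,
        "type": "function",
        "name": "Consume dynamic output",
        "parameters": {
            "function": {
                "externalId": consumer_id,
                # The referenced value must be JSON-serializable.
                "data": {"dynamicResult": f"${{{dynamic_task_id}.output}}"},
            }
        },
        "retries": 1,
        "timeout": 3600,
        "onFailure": "abortWorkflow",
        "dependsOn": [{"externalId": dynamic_task_id}],
    }

task = make_consumer_task("wk-create-2", "wk-consume-3")
print(task["parameters"]["function"]["data"]["dynamicResult"])  # ${wk-create-2.output}
```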
Suppose I want to perform the activities below in the workflow: Function 1 → Dynamic Task 1 (set of tasks) → Function 2. Function 2 will only start after all dynamic tasks are completed. Is there any way for Function 2 to run in parallel, once per output coming from each dynamic task? Even if this is achievable without dynamic tasks in the workflow, please let me know those ways. The outputs of the dynamic tasks are independent of each other, so it should be fine to run Function 2 independently for each output. Also, please provide an example workflow definition that uses the output of a dynamic task in Function 2; for example, using the output of the dynamic task "step-2-dynamic-task-get-records" in "step-3-print-records". This is the workflow definition I am using:

```json
{
  "items": [
    {
      "workflowExternalId": "nk-cdf-workflow2",
      "version": "5",
      "workflowDefinition": {
        "description": "",
        "tasks": [
          {
```
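One workaround to consider, sketched below with hypothetical IDs and an assumed `${...}` output reference: since the dynamic task expands whatever task list the generator function returns, the generator could emit a pair of tasks per record (a "get" task and a dependent "print" task). Each pair then runs in parallel with the others, and each print task depends only on its own partner rather than on the whole set.

```python
# Illustrative sketch (hypothetical function IDs): the generator emits
# one get/print task pair per record, so the pairs fan out in parallel.

def generate_task_pairs(record_ids: list) -> list:
    tasks = []
    for rid in record_ids:
        get_id = f"step-2-get-records-{rid}"
        print_id = f"step-3-print-records-{rid}"
        tasks.append({
            "externalId": get_id,
            "type": "function",
            "parameters": {"function": {"externalId": "fn-get-records",
                                        "data": {"recordId": rid}}},
            "dependsOn": [],
        })
        tasks.append({
            "externalId": print_id,
            "type": "function",
            "parameters": {"function": {
                "externalId": "fn-print-records",
                # Each print task reads only its partner's output (the
                # ".output" reference path is an assumption).
                "data": {"records": f"${{{get_id}.output}}"},
            }},
            "dependsOn": [{"externalId": get_id}],
        })
    return tasks

tasks = generate_task_pairs(["a", "b"])
print(len(tasks))  # 4 (two tasks per record)
```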
I want to complete Data Engineer Basics - Integrate. How can I purchase 2 credits for Data Engineer Basics - Integrate?
Hello everyone. I was testing the offline mode in Infield and got odd behavior using an iPad with Safari: when I go offline, I am not able to select any field to insert values, nor click any buttons. It is only possible when I am back online. Is this how it is supposed to work? I have a video but could not attach it here. Thank you in advance.
Hello everyone, I'm new to CDF. I would like to know how we can make the best of CDF to implement Machine Learning or Artificial Intelligence solutions in my company.
Greetings, I'm currently involved in a research project where we are considering the integration of B2MML (Business To Manufacturing Markup Language) data with Cognite Data Fusion (CDF). Our goal is to enhance our data analytics capabilities in a manufacturing context. I am reaching out to this knowledgeable community to gather insights and experiences regarding the following:

- B2MML data extraction: have any of you worked on extracting data from B2MML-formatted sources? What challenges and best practices have you encountered in this process?
- B2MML data ingestion into CDF: has anyone successfully configured B2MML data ingestion into Cognite Data Fusion? Could you share any specific strategies or tools you used for this integration? What were the major hurdles, and how did you overcome them?

Any detailed experiences, technical insights, or pointers towards relevant resources or case studies would be immensely helpful. Thank you in advance for your valuable input! Best regards, Art
Hi all, I am trying to set up FDM on Grafana. Can somebody point me to documentation for this? Thank you.
I created a process monitoring job several weeks ago and it seemed to work as expected, showing me the folder I created, and alerts were delivered to my inbox. After two recent attempts to create monitoring jobs, the UI stated the job was created, but I'm not seeing the new folder, nor am I receiving alert emails or seeing any indications under the Alerts tab in Charts.
Hello team, I was trying to trigger a workflow by passing the workflow input tasks and setting onFailure to skipTask, as I did not want my workflow to be aborted if some task fails. Despite triggering the workflow via the SDK and passing the workflow input, I can see that the workflow has been triggered, but with onFailure set to abortWorkflow. Could you please help me solve this? Code for triggering the workflow:

```json
workflow_input = {
  "tasks": [
    {
      "externalId": "gb-test-func-1",
      "type": "function",
      "name": "1710479974441",
      "parameters": {
        "function": {
          "externalId": "gb-test-func-1",
          "data": {}
        }
      },
      "onFailure": "skipTask",
      "dependsOn": []
    },
    {
      "externalId": "gb-test-func-2",
      "type": "function",
      "name": "1710482066717",
      "parameters": {
        "function": {
          "externalId": "gb-test-func-2",
          "data": {}
        }
      },
      "onFailure": "skipTask",
      "dependsOn": [
        { "externalId": "gb-multiplication-by-2" }
```
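A small sanity check that may help narrow this down (illustrative, not a confirmed fix): before triggering, verify that every task in the input explicitly carries the intended onFailure value, since a task where the key is missing or misspelled could silently fall back to the default abortWorkflow behavior.

```python
# Illustrative pre-trigger validation: list the tasks whose onFailure
# differs from the expected value, so a missing key is caught before
# the execution is created.

def check_on_failure(workflow_input: dict, expected: str = "skipTask") -> list:
    """Return the externalIds of tasks whose onFailure differs from expected."""
    return [
        t["externalId"]
        for t in workflow_input.get("tasks", [])
        if t.get("onFailure") != expected
    ]

workflow_input = {
    "tasks": [
        {"externalId": "gb-test-func-1", "onFailure": "skipTask"},
        {"externalId": "gb-test-func-2"},  # key missing -> would use the default
    ]
}
print(check_on_failure(workflow_input))  # ['gb-test-func-2']
```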
Hi team, is the data flow in Cognite uni-directional or bi-directional? Thank you, Navyasri Indupalli
The number of assets shown as linked to a file is not correct; the system wrongly shows the count of assets linked to the file. The Linked Assets count on an asset is also wrong, and the same issue applies to linked files, linked time series, and linked events. These "linked" counts are different from directly linked items:

(A) Linked (files): this count covers all descendant children of a given asset.
(B) Directly linked (files): this count is the number of files connected directly to the asset.
(C) The file count on top: this count may vary, since there might be duplicated IDs among the different types; the total is a distinct count.

The linked assets are not relevant to the asset the user is referring to, and these linked assets will not be linked to that asset; directly linked assets are linked to it. The user does not want to see the Linked Assets tab in CDF because it does not contain meaningful linked assets, whereas the Directly Linked Assets tab is meaningful, and same
Display events in Cognite Charts that do not have an end time. Presently, events without an end time are not displayed in Cognite Charts when the user customizes a time series for a specific duration. Failures linked to the asset are not shown in the Charts view in CDF, which is inconsistent with the linked failures shown in Data Explorer.

Steps to reproduce:
1. Log in to the CDF UI.
2. Navigate to the Data Explorer screen.
3. Click on the Charts tab in the menu bar.
4. Click on the New Chart button.
5. Search for a time series name, e.g. FQI02005_M.TOTALIZERA.PV.
6. Click on the Time series button below the search field.
7. Select the checkbox of the time series FQI02005_M.TOTALIZERA.PV.
8. Click on the Events button on the right side of the panel.
9. Click on the Add Filter button.
10. Select the asset '02-V-0203' from the Asset filter.
11. Check Events under the Types filter.

Expected results: the user should be able to see and select events like 'Failure', 'Workorder', 'MTBE', etc. for asset 02-V-0203.
Actual results: Alarm and Failure events are not listed under the Types filter.
Description: access to the 3D model should be enabled by searching for the tag in the main Data Explorer screen and the All Resources tab.

Steps to reproduce:
1. Click on the Data Explorer tab.
2. Search for 02-V-0203.
3. See all resources.

Expected results: searching for tag 02-V-0203 should indicate the existence of a 3D model in the main CDF Data Explorer tab, and the 3D model should be available in the All Resources tab when the asset is selected.
Actual results: searching for tag 02-V-0203 does not indicate the existence of a 3D model in the main CDF Data Explorer tab (count = 0), and no 3D model is available in the All Resources tab when the asset is selected. It is not feasible to contextualize asset information in an easy manner without this 3D view being easily accessible.
Hi, I am looking into using the is_new functionality in my transformations, but was wondering whether it will work in transformations where I use joins. If so, how would I write the query? In addition, I heard that you are working on releasing functionality to use is_new when reading from data models. When will this feature be ready? Thank you! Sebastian
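A sketch of the pattern as I understand it from the transformations documentation (please verify the exact is_new signature and marker semantics against the current docs): give each joined source its own change marker and combine them with OR, so a row is processed whenever either side of the join has changed. The table names and marker names below are hypothetical; the query is built as a Python string for illustration.

```python
# Hypothetical incremental-join query builder. Each source gets its own
# is_new marker; ORing them means a joined row is picked up when either
# side's lastUpdatedTime has advanced past its marker.

def incremental_join_query(left: str, right: str, key: str) -> str:
    return f"""
SELECT l.*, r.*
FROM {left} AS l
JOIN {right} AS r ON l.{key} = r.{key}
WHERE is_new('{left}_marker', l.lastUpdatedTime)
   OR is_new('{right}_marker', r.lastUpdatedTime)
""".strip()

sql = incremental_join_query("assets_raw", "metadata_raw", "externalId")
print(sql)
```

Note the trade-off in this sketch: a changed row on one side is re-joined against unchanged rows on the other, so the transformation must be idempotent (e.g. an upsert) for this to be safe.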
Here are some issues we are facing with TimeSeriesUploadQueue after migrating to the latest version:

- Unable to upload rows, though it says they have been uploaded in the latest version.
- It says the uploader uploaded x datapoints even though it didn't (checked in the UI and through the API), and uploaded rows is None even though there are datapoints to upload. The same code works fine in version 6. This is our post-upload function:
- The wrong date range is uploaded in version 7 (happens on a random basis). The target time series did not exist before we ran the extractor. The date range we were trying to upload in the first batch was October 25, 2022 11:38:38.553 PM to December 27, 2022 11:03:47.055 AM, and in the second batch December 27, 2022 7:03:47.055 PM to January 1, 2023 11:03:47.055 AM (all GMT times). Both batches' uploads failed according to the SDK (error during callback, rows is NoneType), but somehow 50 datapoints in a completely different date range were written, as observed through the Cognite UI and through the SDK. We also fa
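While the root cause is investigated, one defensive pattern that may sidestep the "rows is NoneType" error during the callback is to guard against an empty or None batch before iterating. The callback signature and the batch shape below are assumptions based on the description above, not the confirmed extractor-utils API.

```python
# Illustrative defensive post-upload callback: tolerate the callback
# being invoked with None (or an empty batch) instead of crashing while
# counting datapoints. Batch shape is an assumption for this sketch.

def post_upload(uploaded_rows) -> int:
    """Count uploaded datapoints, tolerating a None or empty batch."""
    if not uploaded_rows:  # handles both None and []
        return 0
    total = 0
    for entry in uploaded_rows:
        datapoints = entry.get("datapoints") or []
        total += len(datapoints)
    return total

print(post_upload(None))  # 0
print(post_upload([{"datapoints": [(1666740000000, 1.0), (1666740060000, 2.0)]}]))  # 2
```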
I am running a workflow with two functions. The functions work fine independently. On triggering the workflow, the first step runs, but the second step fails with an internal server error. Also, multiple calls to the second function are observed. No other workflow is in place for the second function 'ingest-test' that could trigger it. I have tried passing the same input to 'ingest-test' that is expected from the previous function; it works fine and produces output in 18 s. Sharing the third snippet of it.
I have updated my workflow to version v2, but it still runs version v1.
I have created a workflow using functions; both functions run fine individually, each in under 4 seconds. If I run the workflow as a user from the UI, only the first function runs, and the error stating "something went wrong" does not help me understand the issue. I created a schedule to run this workflow, and there I see no problem. How can we test our workflows if they fail to run as a user from the UI? And could there be a more specific error stating what exactly went wrong?
Hello, I have to import FDM data into Power BI. There's a field I need from a table that's on the edge of a relationship with another table, as you can see in the image below. In order to import the data, I just expanded the field, but when I apply the changes I get the following error. Is there another way of importing this data? Thanks, Pablo
Hello everyone! I have several binary time series that are displayed correctly when I ask for a short period of time. However, if I ask for a longer timespan, like a year, Charts gives me the aggregates. That would be fine in some cases, but for this particular binary data it does not make any sense. The final user cannot use this information to perform any analysis; any calculations, thresholds, or just "taking a look to see the trend" will not give an accurate result. Is it possible to disable the use of aggregates for cases like this? Thank you in advance.
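When retrieving the data outside of Charts (e.g. via the SDK), one workaround is to avoid averaged aggregates for binary series: either fetch raw datapoints when the expected count is manageable, or request step interpolation, which holds the last 0/1 value instead of averaging it away. The sketch below only illustrates that decision; the point budget and granularity are arbitrary assumptions, not Charts behavior.

```python
# Illustrative retrieval-parameter chooser for a binary time series:
# raw datapoints when the window is small enough, otherwise the
# step-interpolation aggregate so the curve still only takes 0 and 1.

RAW_POINT_BUDGET = 100_000  # arbitrary client-side budget for this sketch

def choose_retrieval(expected_points: int) -> dict:
    """Pick datapoint retrieval parameters for a binary time series."""
    if expected_points <= RAW_POINT_BUDGET:
        # Small enough to plot raw 0/1 values directly.
        return {"aggregates": None}
    # Coarse step interpolation preserves the binary character better
    # than an average would.
    return {"aggregates": ["stepInterpolation"], "granularity": "1h"}

print(choose_retrieval(5_000))       # {'aggregates': None}
print(choose_retrieval(2_000_000))   # step interpolation at 1h granularity
```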
Hello, I would like to know the best way to plot two variables in an X-Y chart instead of stacking them. For example: I have "variable A" and I want it on the Y axis and "variable B" on the X axis instead of the timestamp. Is it possible to do this using Charts or Canvas, or should I do it in Power BI?
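If the plot ends up being built outside of Charts, the core data-prep step is aligning the two variables on shared timestamps and pairing their values; the pairs can then be scattered with matplotlib or in Power BI. A minimal sketch with made-up sample data:

```python
# Minimal X-Y data prep: keep only timestamps present in both series and
# emit (A, B) value pairs for a scatter plot. Sample data is hypothetical.

def align_xy(series_a: dict, series_b: dict) -> list:
    """Return (a_value, b_value) pairs for timestamps present in both series."""
    shared = sorted(set(series_a) & set(series_b))
    return [(series_a[t], series_b[t]) for t in shared]

variable_a = {1: 10.0, 2: 11.0, 3: 12.5}   # timestamp -> value
variable_b = {2: 0.4, 3: 0.5, 4: 0.6}
pairs = align_xy(variable_a, variable_b)
print(pairs)  # [(11.0, 0.4), (12.5, 0.5)]
```

In practice the two series rarely share exact timestamps, so a resampling or nearest-neighbor join on time usually comes first; the alignment step above assumes that has already been done.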
Is there a tool for planning and scheduling scaffolds?
Hello, I'm trying to consume data from FDM in Power BI, but I'm getting error (408) Request Timeout. I'm already filtering in Power BI for only a few days of data, which brings an amount of data comparable to amounts I have already consumed in Power BI before. How can I solve this?
Hello everyone. I have a use case that I have been trying to figure out, and I'm hoping you could help me. We have a piece of equipment, let's say a pump, and we have a time series for its flow. We consider that the pump is operating as long as the flow is above a threshold. In PI, if I want to know how long the pump was operating, I can simply use the PITimeFilterVal function in Excel to retrieve the amount of time that the time series was above the threshold. My team and I have been trying to do this with Cognite, with no success. We tried transforming the time series to 0 or 1 using the "Threshold" function and then integrating it, but we have faced some limitations due to the approximations that are intrinsic to the integration function. Unfortunately, I cannot show you the actual data, but we checked a few days where the pump starts the day operating, at 00:00, and at 2 am our integration gives us a value of 1.84 h, when it should be as close to 2 h as possible. This difference, even though slight, i
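One way to reproduce the PITimeFilterVal-style result without the integration error is to do the calculation client-side on raw datapoints, treating the series as a step function: each value holds until the next timestamp, so the time above threshold is an exact sum of interval lengths rather than an approximation of an interpolated 0/1 curve. A sketch under that step-hold assumption:

```python
# Exact time-above-threshold for a step-held series: each value holds
# until the next timestamp, and the last value holds until end_ms.
# This avoids the approximation intrinsic to integrating a 0/1 signal.

def time_above(points: list, threshold: float, end_ms: int) -> float:
    """Milliseconds during which a step-held series exceeds threshold.

    points: list of (timestamp_ms, value) tuples.
    """
    total = 0.0
    pts = sorted(points)
    for (t0, v0), (t1, _) in zip(pts, pts[1:]):
        if v0 > threshold:
            total += t1 - t0
    if pts and pts[-1][1] > threshold:
        total += end_ms - pts[-1][0]
    return total

# Hypothetical pump data: flow of 8.0 from 00:00, dropping to 0.0 at 02:00.
HOUR = 3_600_000
points = [(0, 8.0), (2 * HOUR, 0.0)]
print(time_above(points, 5.0, 24 * HOUR) / HOUR)  # 2.0
```

Whether step-hold is the right model depends on how the flow sensor reports; if values are sampled sparsely, linear interpolation between points (and solving for the exact threshold crossing) would be the more faithful variant.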