Hello,

When testing the new Diagram Parser tool on vector PDFs, I found that it successfully finds some Assets ingested in the DMS and Symbols from a selected library. However, the Assets only show up as annotations even when verifying them. How can I get the suggested Assets to link to the Diagram in the DMS (i.e. open the diagram and see Assets directly linked to the diagram)?

Thanks,
Gabriel
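For context, this is the kind of link I am hoping to end up with. In the classic asset-centric world I would have done something like the sketch below with the Python SDK (all IDs are placeholders); I am looking for the equivalent outcome for the DMS/annotation flow.

```python
from cognite.client import CogniteClient
from cognite.client.data_classes import FileMetadataUpdate

client = CogniteClient()  # assumes credentials are configured via environment

# Placeholder IDs: the diagram file and the assets suggested by the parser.
diagram_external_id = "my-pnid-diagram.pdf"
asset_ids = [1234567890, 2345678901]

# Link the verified assets directly to the diagram file, so they show up
# on the file itself when opening it in Data Explorer.
client.files.update(
    FileMetadataUpdate(external_id=diagram_external_id).asset_ids.add(asset_ids)
)
```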
Hi everyone,

While querying the CogniteAsset view, I encountered an error when including the activities field in the selection. The API returns the following message: "Cannot traverse lists of direct relations inwards."

Upon investigating, I noticed that the activities field in the CogniteAsset view is defined as a reverse direct relation:

  activities: [CogniteActivity] @reverseDirectRelation(throughProperty: "assets")

In turn, the assets field in the CogniteActivity view is defined as a list of direct relations:

  assets: [CogniteAsset] @directRelation

This makes sense as the root cause, since reverse traversal over a list of direct relations is not allowed, which explains the error.

Given that this field leads to an invalid query pattern and always results in an error, should the activities field be removed from the CogniteAsset view to avoid confusion and runtime errors?

Here's a minimal query that reproduces the issue:

  query MyQuery {
    listCogniteAsset {
      items {
        aliases
        activities {
          ...
        }
      }
    }
  }
Hi,

We have uploaded a 3D .nwd model using the Fusion UI and noticed that the underlying processed 3D files (e.g. .glb, .reveal, .zip) are stored as file resources and are visible in the Data Explorer under the Files resource type.

I couldn't find any documentation or guideline on how to assign security categories to hide or restrict access to these files for end users. I also tried using dataset-based access control, but I'm looking for a better or recommended way to handle this.

Appreciate your guidance or suggestions.
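For reference, this is the direction I was experimenting with via the Python SDK; a rough sketch with placeholder names, in case it helps explain what I am after:

```python
from cognite.client import CogniteClient
from cognite.client.data_classes import FileMetadataUpdate, SecurityCategory

client = CogniteClient()  # assumes credentials are configured via environment

# Create a security category to gate the processed 3D artifacts (placeholder name).
category = client.iam.security_categories.create(
    SecurityCategory(name="restricted-3d-artifacts")
)

# Attach the category to one of the processed files (placeholder external ID),
# so only groups granted this security category can read it.
client.files.update(
    FileMetadataUpdate(external_id="my-model.reveal").security_categories.add(
        [category.id]
    )
)
```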
Does it make the data available for optimal maintenance scheduling, or does it automatically generate maintenance plans based on that data? Does it predict failures?
Hello,

We are using Toolkit version 0.5.20 and have a file schedules.Function.yaml:

  - name: Run every 10 minutes to do some work
    functionExternalId: fn_name
    description: Run every 10 minutes to do some work
    cronExpression: "*/10 * * * *"
    data:
      extractionPipeline: "test"

When deploying we get this error trace:

  Traceback (most recent call last):
    File "venv/bin/cdf-tk", line 8, in <module>
      sys.exit(app())
    File "venv/lib/python3.10/site-packages/cognite_toolkit/_cdf.py", line 116, in app
      _app()
    File "venv/lib/python3.10/site-packages/typer/main.py", line 340, in __call__
      raise e
    File "venv/lib/python3.10/site-packages/typer/main.py", line 323, in __call__
      return get_command(self)(*args, **kwargs)
    File "venv/lib/python3.10/site-packages/click/core.py", line 1161, in __call__
      return self.main(*args, **kwargs)
    File "venv/lib/python3.10/site-packages/typer/core.py", line 740, in main
      return _main(
    File "venv/lib/python3.10/site-packages/typer/core
I am learning CDF and, as part of the bootcamp, I tried Canvas with my P&ID sample data. I included a set of time series, including my previously created chart. I added a comment to this chart and to one of the time series, and ran into some weird behavior.

Of the 2 comments in different parts of the canvas, clicking on one properly shows the comment, whereas clicking on the other shows nothing. Under the global comments list section, I can only see one comment instead of the expected 2.

Not sure if I am doing anything wrong or if it's a bug in the Canvas tool?
I'm studying the PostgreSQL gateway in the data engineer journey. In order to get hands-on experience, we need to create a group in CDF. I've signed in to CDF, but I can't create groups there. How can I solve this issue so I can continue with the hands-on activities?
Hello,

In our project, we need to use the TIMESTAMP_LTZ data type for our timestamps. This data type is only available in Spark starting from version 3.4. Is it possible to upgrade your Spark version (which is 3.3, I believe) to at least 3.4, please?

Thank you
Hello,

We plan to implement versioning based on major and minor versions of our data models, so a data model version will be represented with two digits: <Major>.<Minor>

<Major>: incremented when we make a breaking change that is not backward compatible with the previous version and that we know will break our consumers' queries. Example: deleting an attribute from a view in the data model.
<Minor>: incremented when we make a change that is backward compatible and consumer queries will not break. Example: adding a new field.

Is there a way to get a GraphQL URL that always maps to the latest minor version of a given data model? This would save our data model consumers from redeploying their applications every time we increment the minor version.

Thank you!
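To illustrate the workaround we use today: resolve the latest minor version of a given major at startup and build the GraphQL URL from it. A rough sketch with the Python SDK (space/model names are placeholders, and the GraphQL URL layout is my assumption):

```python
from cognite.client import CogniteClient

client = CogniteClient()  # assumes credentials are configured via environment

SPACE = "my_space"            # placeholder
DATA_MODEL = "my_data_model"  # placeholder
MAJOR = 2                     # the major version the consumer pins to

# List every version of the data model and keep those matching the major.
models = client.data_modeling.data_models.list(space=SPACE, all_versions=True, limit=-1)
candidates = [
    m for m in models
    if m.external_id == DATA_MODEL and int(str(m.version).split(".")[0]) == MAJOR
]
# Pick the highest minor within that major (versions follow our <Major>.<Minor> scheme).
latest = max(candidates, key=lambda m: int(str(m.version).split(".")[1]))

# Assumed URL layout for the GraphQL endpoint of a specific data model version.
graphql_url = (
    f"{client.config.base_url}/api/v1/projects/{client.config.project}"
    f"/userapis/spaces/{SPACE}/datamodels/{DATA_MODEL}"
    f"/versions/{latest.version}/graphql"
)
print(graphql_url)
```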
Hi all,

I have a question regarding the use of nested attributes from linked objects when defining indexes.

In my model, I have an object Cement that has a direct link to another object called ScenarioContent. I would like to use the attribute ScenarioContent.Type as part of the index on the Cement object. This is particularly useful because I use views that filter Cement instances based on ScenarioContent.Type. For example, to expose only planned scenarios, I use the following rawFilter in a view:

  @view(
    rawFilter: {
      nested: {
        scope: ["Cement", "ScenarioContent"]
        filter: {
          equals: {
            property: ["ScenarioContent", "Type"]
            value: "PLANNED"
          }
        }
      }
    }
  )

Question: Can I use this nested attribute (ScenarioContent.Type) in the index definition of Cement? If this is not currently supported, is there a recommended alternative to support performant queries on this kind of filter?
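If indexing through a direct relation is not supported, the alternative I am considering is to denormalize the value onto Cement itself and index that local copy. A rough sketch of what that container change could look like with the Python SDK (space, property, and index names are placeholders; in practice the full Cement container definition with its other properties would go here):

```python
from cognite.client import CogniteClient
from cognite.client.data_classes.data_modeling import (
    BTreeIndex,
    ContainerApply,
    ContainerProperty,
    Text,
)

client = CogniteClient()  # assumes credentials are configured via environment

# Denormalized copy of ScenarioContent.Type stored directly on Cement,
# so it can be indexed and filtered on without a nested traversal.
cement_container = ContainerApply(
    space="my_space",      # placeholder
    external_id="Cement",
    properties={
        "scenarioContentType": ContainerProperty(type=Text(), nullable=True),
    },
    indexes={
        "scenarioContentType_idx": BTreeIndex(properties=["scenarioContentType"]),
    },
)
client.data_modeling.containers.apply(cement_container)
```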
Hi! I'm the scrum coordinator for NOVA Chemical's SMM Team. I have some customization inquiries on behalf of the team:

- Customization to unlock Canvas by creator
- Advanced markups: cloud-shaped comment/markup
- Canvas size selector (print all in one size)
- Cognite Stamp function enhancement: the stamp and connector from a P&ID to an icon on a photo doesn't lock and freeze to the icons, stamp icon, and equipment on the P&ID
- Maintain Layouts saving column sequences

Since the ask is that we submit them here and provide a prioritized list, I will follow up with Ahmed/Moe to provide the list.

Thanks!
Sophie Garcia
Hi,

I ran the asset transformation and it worked successfully the first time, but today I found a failed run. Looking at the logs, it shows the error below:

Request with id 0b54c004-c667-9eab-a8c3-0a0047128fe1 to https://westeurope-1.cognitedata.com/api/v1/projects/cdf-ABCD/models/instances failed with status 400: Duplicate node externalIds for space 'icapi_dm_space' present in request: OSPRFIDRTO109:off_spec.

Any idea how to fix this, and what does this error actually mean?
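In case it helps narrow it down, this is roughly how I am checking the source data for duplicate keys before the transformation writes instances (Python sketch; the RAW database, table, and key column names are placeholders):

```python
from collections import Counter

from cognite.client import CogniteClient

client = CogniteClient()  # assumes credentials are configured via environment

# Placeholders for the RAW source feeding the transformation.
rows = client.raw.rows.list(db_name="src_db", table_name="assets", limit=None)

# The column the transformation maps to the node externalId (placeholder name).
keys = [row.columns.get("externalId") for row in rows]

# Any key appearing more than once will produce "Duplicate node externalIds".
duplicates = {key: count for key, count in Counter(keys).items() if count > 1}
print(f"{len(duplicates)} duplicate externalIds, e.g.: {list(duplicates)[:5]}")
```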
A Streamlit app that happily runs locally fails when deployed to CDF. The line `st.dataframe([{'key': 'value', ...}])` produces this error. Not every time, but once the error happens, it does not recover and keeps showing this error even though the app reruns to show updated data.

Error: External format error: File out of specification: Repetition level must be defined for a primitive type

Do you have a workaround? It is a simple data set, really just string key-value pairs. When it works, it looks like this: Contact me and I will share the URL of the deployed Streamlit app.
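For what it is worth, one thing I plan to try is converting the rows to a pandas DataFrame with explicit string dtypes before handing them to Streamlit, instead of passing the list of dicts directly. A minimal sketch (the rows variable is a placeholder for the real data):

```python
import pandas as pd
import streamlit as st

# Placeholder for the real key/value pairs fetched by the app.
rows = [{"key": "pressure.unit", "value": "bar"}, {"key": "site", "value": "Oslo"}]

# Build the frame explicitly and force plain string dtypes, so the
# serialization to the in-browser table does not have to infer types.
df = pd.DataFrame(rows).astype(str)
st.dataframe(df)
```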
Hi,

I'm building a Streamlit app that fetches instances from the data model. However, the view these instances are retrieved from contains over 1.3 lakh (130,000+) entries, causing performance issues due to the large volume. Additionally, since Streamlit as hosted in CDF does not support multithreading, I am unable to parallelize the calls.

Is there a way to retrieve all 1.3+ lakh instances in under 10 seconds?

Thanks,
Tausif Sayyad
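For reference, this is roughly how I fetch the instances today (Python SDK sketch; space, view, and version are placeholders), in case there is a faster pattern:

```python
from cognite.client import CogniteClient
from cognite.client.data_classes.data_modeling import ViewId

client = CogniteClient()  # assumes credentials are configured via environment

# Placeholder identifiers for the view the instances come from.
view_id = ViewId(space="my_space", external_id="MyView", version="v1")

# Pull every node in the view; with ~130,000 instances this is what gets slow.
nodes = client.data_modeling.instances.list(
    instance_type="node",
    sources=[view_id],
    limit=None,
)
print(len(nodes))
```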
Hi,

I am trying a dry-run deploy but it fails with the errors below. Please help me fix this, and help me understand what exactly it means.

Don't have correct access rights to clean all-scoped groups. Missing:
GroupsAcl(actions=[GroupsAcl Action.List], scope=AllScope())
GroupsAcl(actions=[GroupsAcl Action.Read], scope=AllScope())

and

ERROR (AuthorizationError): Don't have correct access rights to clean data sets. Missing:
DataSetsAcl(actions=[<DataSetsAcl Action.Read: 'READ'>], scope=AllScope())
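In case it is relevant, this is how I am checking which capabilities my principal actually has, using the token inspect endpoint from the Python SDK (a small sketch; it just lists what the credentials used by the toolkit can do):

```python
from cognite.client import CogniteClient

client = CogniteClient()  # assumes the same credentials the toolkit uses

# Print the capabilities on the current token, to confirm whether
# groupsAcl (READ/LIST) and datasetsAcl (READ) with all-scope are present.
inspection = client.iam.token.inspect()
for capability in inspection.capabilities:
    print(capability)
```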
On my first attempt at this I was getting no time series after ingestion. I tried to clear my data and start over, and now I'm getting consistent errors trying to run the transformation of assets into my Avevanet project:

Parents 'FirstnameBirthyear:23-1ST STAGE COMPRESSOR-PH', 'FirstnameBirthyear:23-FE-92537', 'FirstnameBirthyear:23-PDT-92530', 'FirstnameBirthyear:23-PDT-92534', 'FirstnameBirthyear:23-PT-92531', 'FirstnameBirthyear:23-PT-92532', 'FirstnameBirthyear:23-PT-92540', 'FirstnameBirthyear:Aker BP', 'FirstnameBirthyear:PH-S-1305', 'FirstnameBirthyear:VAL' referenced from 'Cory1984:23-1ST STAGE COMPRESSOR-PH', 'Cory1984:23-FE-92537', 'Cory1984:23-FT-92537-01', 'Cory1984:23-PDI-92530', 'Cory1984:23-PDI-92534', 'Cory1984:23-PDT-92534', 'Cory1984:23-PI-92531', 'Cory1984:23-PI-92532', 'Cory1984:23-PI-92540', 'Cory1984:VAL' do not exist.

Preview worked fine, please advise.

cwa
Hi everyone,

I'm currently using a nested filter in my view, and everything seems to work correctly, except in the tabular view. Here's what I observe: in the tabular interface, the table shows 0 objects; however, when I click into the table, I can actually see that the instances are present.

Below is an example of the nested filter I'm using:

  @view(
    rawFilter: {
      and: [
        {
          nested: {
            scope: ["Cement", "ScenarioContent"]
            filter: {
              equals: {
                property: ["ScenarioContent", "Type"]
                value: "PLANNED"
              }
            }
          }
        },
        {
          nested: {
            scope: ["Cement", "ScenarioContent"]
            filter: {
              equals: {
                property: ["ScenarioContent", "IsLast"]
                value: true
              }
            }
          }
        }
      ]
    }
  )

When checking the request, I see the following error:

  { "errors": [ { "message": "Client r
Hi,

I am new to Cognite and getting ready for a bootcamp I will attend soon. I am just starting to use my VM and am getting a blinking, blank window when trying to open a Python file I just created (see attached picture for reference). After a while it gives me a message in cmd (2nd picture attached). Has anyone experienced this issue? Is it related to my VM or my laptop?

Thanks for helping!
Maria
When deploying a function with the SDK you can specify the source code "folder" as a parameter. I have used this to deploy functions together with my own Python module, which is just a directory with some .py files. I want to migrate to using the Toolkit, but I am not sure how to deploy with the source code folder I want, as I do not see a way to specify it in the YAML file. Can anyone help with this? I have multiple Cognite Functions to deploy that all access the same source code I have written.

Thank you!
Sebastian
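For context, this is roughly the SDK call I use today, where the folder parameter points at the directory that also contains my shared module (all names and paths are placeholders):

```python
from cognite.client import CogniteClient

client = CogniteClient()  # assumes credentials are configured via environment

# Deploy a function from a local source folder that also contains my shared
# Python module; placeholder names throughout.
client.functions.create(
    name="my-function",
    external_id="fn_my_function",
    folder="functions/my_function_source",  # directory with handler.py + my module
    function_path="handler.py",
)
```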
How can Cognite Data Fusion help us improve energy efficiency in an industrial process?
Hi,

I want to set up monitoring of a time series with respect to reservoir levels. Yesterday I also set up a dummy monitor, simply to test the functionality. I set it to alert when the time series goes over 1462.88 meters, which was crossed yesterday evening, but no alarms were triggered.

Please see the screenshot of my setup below; does this look right?

BR,
Jørgen Aarstad
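In case it helps with debugging, this is how I double-checked that the time series really did cross the threshold yesterday evening (Python sketch; the external ID is a placeholder):

```python
from cognite.client import CogniteClient

client = CogniteClient()  # assumes credentials are configured via environment

# Placeholder external ID for the reservoir level time series.
dps = client.time_series.data.retrieve(
    external_id="reservoir_level",
    start="2d-ago",
    end="now",
    aggregates=["max"],
    granularity="1h",
)

# If this prints a value above 1462.88, the threshold was crossed and
# the monitor should have fired.
print(max(dps.max))
```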
I see there is a fork for ToolJet in the CogniteData git repo. I’m curious to know if Cognite has done anything with this branch and if so, how it’s being used within the product. Thanks!
I noticed a few misses / potential improvements in the alerts & monitoring section of Cognite Charts, and was wondering if anybody else has had similar experiences?

- Email alerts always come in UTC. Since we are based out of India, we may miss important alerts because of this.
- Say I have an asset for which I have generated a chart (in our case, a particular well from a particular lift type showing critical parameters). A filter-based option to simply switch which well is shown, using the same parameters as tagged before, would be helpful.
Hi,

I would like to ask if Cognite Maintain or CDF can show the bill of materials available on equipment in the source system (e.g. SAP). For example, can a technician view the BoM on a P&ID, together with the most frequently consumed materials on a piece of equipment, e.g. a compressor?