I am following the CDF learning exercises. All was good until I got to the transformation of time series data, where I get this error. I don't know how to fix it, and I have gone back and repeated the steps a few times.
I am looking for information about 3D models. I would like to know whether Reveal/Cognite supports programmatic control of animations on 3D models uploaded to CDF. I have a 3D model of a pump and would like to change the animation speed from JavaScript code, based on the speed I get from my API (basically a real-time speed value). Is this possible? If so, could you point me to example code showing how to do it? Also, the 3D model uploaded to CDF does not display the way the original model did: after uploading, it loses its colors and animations. Could you tell me why that is? Thanks
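To my knowledge, CDF's 3D processing pipeline may not preserve materials and animations, and I don't believe Reveal's public API exposes animation playback directly, so one approach is to render the pump with plain three.js and drive an `AnimationMixer`. A minimal sketch; `speedToTimeScale` and `nominalRpm` are hypothetical names, and the three.js wiring is shown only in comments:

```javascript
// Map a real-world pump speed (RPM) to an animation playback rate.
// `nominalRpm` is the speed the animation was authored for (an assumption:
// you know this value for your model).
function speedToTimeScale(rpm, nominalRpm, maxScale = 4) {
  if (nominalRpm <= 0) throw new Error("nominalRpm must be positive");
  const scale = rpm / nominalRpm;
  // Clamp so extreme API readings don't make the animation unwatchable.
  return Math.min(Math.max(scale, 0), maxScale);
}

// With three.js you would then apply it roughly like this (not runnable here):
//   const mixer = new THREE.AnimationMixer(pumpModel);
//   mixer.clipAction(pumpClip).play();
//   setInterval(async () => {
//     const rpm = await fetchPumpSpeedFromApi();   // your API call
//     mixer.timeScale = speedToTimeScale(rpm, 1500);
//   }, 1000);
```

Polling once a second and clamping the scale keeps the animation stable even if the API occasionally returns an outlier.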
I am trying to create a synthetic time series query that looks at two other time series and, if those values meet a certain condition, outputs a 1 or 0 as the synthetic time series, but I keep getting a syntax error that I can't solve. Below is a code sample.

```python
temp_id_list = [3308928194658464]
flow_id_list = [1256034258207391, 3564715377369546]

for temperature in temp_id_list:
    temp_expression = f"TS{{id:{temperature}}} < 200"
    flow_expression = " + ".join(f"(TS{{id:{flowrate}}})" for flowrate in flow_id_list)
    # Parenthesizing both operands of `and` avoids any ambiguity in how the
    # expression parser binds `and` vs. `+` and `>`:
    final_expression = f"if(({temp_expression}) and (({flow_expression}) > 0), 1, 0)"
    cv_query = client.time_series.data.synthetic.query(
        expressions=final_expression, start="1d-ago", end="now"
    )
```
We are delving into the specifics of using Cognite for certain use cases and have identified the need to automatically extract asset data from engineering data sources such as P&ID documents. Are there any features available to facilitate this? For instance, our clients possess numerous P&ID documents and require automatic generation and structuring of asset hierarchies.
Hello, in the DB Extractor I see a parameter that can be configured to specify how to treat the timezone. Is there something similar for the OPC UA Extractor as well? If not, how can data in different timezones, or at least local-timezone data, be handled?
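If the extractor itself cannot be configured for this, one workaround is to normalize timestamps to UTC yourself before ingestion. A minimal sketch using only the Python standard library, assuming your source reports naive local timestamps and you know the site's timezone (the function name is hypothetical):

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # Python 3.9+


def local_to_epoch_ms(naive_ts: datetime, tz_name: str) -> int:
    """Interpret a naive timestamp as local time in `tz_name` and
    return milliseconds since the Unix epoch (UTC), which is what
    CDF datapoints use."""
    aware = naive_ts.replace(tzinfo=ZoneInfo(tz_name))
    return int(aware.timestamp() * 1000)


# Example: 2024-01-15 12:00 in Oslo (UTC+1 in winter) is 11:00 UTC.
ms = local_to_epoch_ms(datetime(2024, 1, 15, 12, 0, 0), "Europe/Oslo")
```

Using a named IANA timezone rather than a fixed offset also handles daylight-saving transitions correctly.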
As per the picture below, we need to click "Create dataset" after navigating to "Use the Data Catalog". But I am unable to find this option for creating a dataset. Can someone help?
In the picture below, what organization name do I need to enter?
How can I use machine learning models inside Cognite Data Fusion?
Trying to download the OpenAPI spec at https://api-docs.cognite.com/20230101/ via the URL in the "Download" link fails with a 404 when using the URL with wget/curl or with `ADD` in a Dockerfile. It seems the link is generated behind the scenes at page load by JavaScript. Is there a way to download the OpenAPI spec in scripts, e.g. for use in CI pipelines and similar?
Hello, is it possible to test `CogniteAuthError` using the monkeypatch context manager? Regards, Hakim Arezki
In the video on contextualizing engineering diagrams (e.g. P&IDs), can you explain in more detail what data is fetched from them and how it is contextualized? Is it just the tag names, or is much more done than that? Do you use vision algorithms to understand which equipment is connected to which, or do you just extract the tag names from the image file with a parser?
I am unable to generate client code for the CDF OpenAPI spec at https://api-docs.cognite.com/20230101/ using deepmap/oapi-codegen for Go, seemingly because the spec contains some errors. Does anyone know of a way to generate an API client in Go for the CDF API spec? Both commands below rely on having the OpenAPI spec downloaded to the current directory from https://api-docs.cognite.com/20230101/

Testing the spec for errors with redocly/cli:

```shell
docker run --rm -it -v $PWD/swagger.json:/swagger.json redocly/cli lint /swagger.json
```

For me this results in "Validation failed with 25 errors and 382 warnings".

Generating the client library using deepmap/oapi-codegen:

```shell
go run github.com/deepmap/oapi-codegen/v2/cmd/oapi-codegen@v2.0.0 --config oapi-codegen-config.yaml swagger.json
```

with this config file:

```yaml
package: "cdf_api"
output: "cdf_api.gen.go"
generate:
  models: true
  client: true
compatibility:
  circular-reference-limit: 100
```

For me this results in the message "error generating code: error creating operat
Hello, I am going through Cognite Data Fusion Fundamentals and am having an issue that I can't figure out. After multiple attempts I am loading one row of data to try to isolate the problem. I make it to the "Preview the Transformation" step. The next step is "Run the Transformation", and this is where I run into the issue: I get the error message below, and there is no suggestion on what to do to correct my apparent error. Any idea what to do or how to progress? I just want to finish this course. -George
Hi, I have an application to read data from, and I have two options: use ODBC or OPC UA directly. What are the benchmarks or limitations for each? I have 30,000 data points to read, with sample times of 1 s, 5 s, and 10 s.
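A quick back-of-the-envelope calculation helps frame the comparison. Assuming (this split is not stated in the post) the 30,000 points divide evenly across the three sample times:

```python
# sample time (s) -> number of points at that rate (assumed even split)
points_per_rate = {1: 10_000, 5: 10_000, 10: 10_000}

# Sustained ingestion rate the extractor must keep up with.
datapoints_per_second = sum(count / period for period, count in points_per_rate.items())
print(datapoints_per_second)  # 10000/1 + 10000/5 + 10000/10 = 13000.0
```

At roughly 13,000 datapoints/s sustained, subscription-based reads (which OPC UA supports) generally scale better than polling each point through ODBC, but the real limit depends on your server, driver, and network, so it is worth benchmarking both against the actual source.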
When we select different date ranges, the granularity changes with the number of days selected: for 2 days it is 3 min; for 3 days, 5 min. Can we get access to any document describing how this pattern is designed?
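I don't know the exact rule used, but the two data points you quote are consistent with targeting a roughly fixed number of points per chart (about 1,000) and rounding up to the next "nice" granularity. A sketch of that guess (the function and the target of 1,000 points are assumptions, not documented behavior):

```python
# Candidate "nice" granularities, in minutes.
NICE_MINUTES = [1, 2, 3, 5, 10, 15, 30, 60, 120, 360, 720, 1440]


def guess_granularity_minutes(days: int, max_points: int = 1000) -> int:
    """Smallest nice granularity keeping the point count at or below the target."""
    total_minutes = days * 24 * 60
    for g in NICE_MINUTES:
        if total_minutes / g <= max_points:
            return g
    return NICE_MINUTES[-1]


print(guess_granularity_minutes(2))  # 2880 min / 3 min = 960 points -> 3
print(guess_granularity_minutes(3))  # 4320 min / 5 min = 864 points -> 5
```

Both of your observed values (3 min for 2 days, 5 min for 3 days) fall out of this rule, which suggests the UI is simply capping the number of rendered points.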
I have a question about querying a data model using the CDF SDK. In the Cognite SDK docs there is an example for retrieving actors in a movie (https://cognite-sdk-python.readthedocs-hosted.com/en/latest/data_modeling.html#query-instances) from the following data model. But if we change this to finding which pumps belong to a given facility (data model in the picture below), using a bidirectional relationship between facility and pump, what changes must be made to the example from the SDK docs for this to work?
To move data from a source to CDF, it is mentioned we might require custom extractors. Can you explain in very simple terms what extractors are and how to create them? Are they just a piece of code connecting two systems? Events are one of the resource types. Do events (like a 2-hour shutdown) need to be created manually in CDF, or are they detected automatically based on the values of time series data?
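On the second question: events are not detected automatically; something you write (for example a scheduled function) has to create them. A minimal sketch of threshold-based shutdown detection over sorted `(timestamp, value)` datapoints; the function name and threshold are hypothetical:

```python
def detect_shutdowns(datapoints, threshold=0.0):
    """Return (start_ms, end_ms) intervals where the value is at or below
    `threshold` (e.g. a pump speed of 0 meaning "shut down").

    `datapoints` is a list of (timestamp_ms, value) pairs, sorted by time.
    """
    events, start = [], None
    for ts, value in datapoints:
        if value <= threshold and start is None:
            start = ts                      # shutdown begins
        elif value > threshold and start is not None:
            events.append((start, ts))      # shutdown ends
            start = None
    if start is not None:                   # still shut down at the end
        events.append((start, datapoints[-1][0]))
    return events


# Each interval could then be written to CDF as an Event via the SDK
# (e.g. client.events.create(...) with start_time/end_time).
dps = [(0, 12.0), (1000, 0.0), (2000, 0.0), (3000, 15.0)]
print(detect_shutdowns(dps))  # [(1000, 3000)]
```

An extractor is conceptually the same idea: a piece of code that reads from one system, reshapes the data, and writes it to CDF through the API or SDK.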
I have provided the required credentials in the config.yml file in the OpcUaExtractor config folder, but when I run it I get this error. How can I fix it?
Invalid config: Wrong type for field "queries.destination" - got "{'database': 'xxxx:xx:xxxx:xx', 'table': 'xxxx'}" of type dict instead of RawDestinationConfig, EventsDestinationConfig, AssetsDestinationConfig, TimeseriesDestinationConfig, SequenceDestinationConfig, FilesDestinationConfig
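The error says the config loader received a plain mapping where it expected one of the typed destination configs. I am not certain of the schema for your extractor version, but such unions are usually resolved via a `type` discriminator (or exact field names), so a shape like the following may be what it wants. The key names below are an assumption; check the extractor's configuration reference:

```yaml
queries:
  - name: my-query              # hypothetical query name
    destination:
      type: raw                 # lets the loader pick RawDestinationConfig
      database: my_database     # the post's real values are redacted (xxxx)
      table: my_table
```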
Hi team, please help me find the mistakes in my code. Let me show a dummy version.

Main code:

```python
def main() -> None:
    """Main entrypoint"""
    BASE_DIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
    config_values_vault.set_vara()
    with Extractor(
        name="SAP_Extractor",
        description="An extractor to extract asset hierarchy from SAP based on root node",
        config_class=SapConfig,
        # version=__version__,  # debugger
        version="1.0.0",
        run_handle=run,
        metrics=metrics,
        config_file_path=os.path.join(BASE_DIR, "config.yaml"),
    ) as extractor:
        extractor.run()
```

I have built a unit test for the code above:

```python
def test_main():
    with patch('os.path') as path_mock:
        with patch.object(path_mock, 'abspath') as mock_abspath:
            with patch.object(path_mock, 'dirname') as mock_dirname:
                with patch.object(path_mock, 'join') as mock_join:
                    mock_abspath.return_value = '/p
```
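The test is cut off, but one likely simplification: after `patch('os.path')`, the resulting mock already auto-creates `abspath`, `dirname`, and `join` as child mocks, so the nested `patch.object(path_mock, ...)` calls are unnecessary; you can configure return values on the mock directly. A self-contained sketch with a stand-in `main` (not your real one, which I can't see in full):

```python
import os
from unittest.mock import patch

THIS_FILE = "/proj/src/main.py"  # stands in for __file__ in the real module


def main() -> str:
    """Stand-in entrypoint: builds the config path the same way yours does."""
    base_dir = os.path.dirname(os.path.dirname(os.path.abspath(THIS_FILE)))
    return os.path.join(base_dir, "config.yaml")


def test_main():
    with patch("os.path") as path_mock:
        # The mock auto-creates abspath/dirname/join; no nested patching needed.
        path_mock.abspath.return_value = "/proj/src/main.py"
        path_mock.dirname.side_effect = ["/proj/src", "/proj"]  # called twice
        path_mock.join.return_value = "/proj/config.yaml"
        assert main() == "/proj/config.yaml"
        path_mock.join.assert_called_once_with("/proj", "config.yaml")


test_main()
```

Note that `patch("os.path")` swaps out `os.path` for every caller during the `with` block, which is fine inside a short test but worth keeping tightly scoped.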
Hi team, I'm working on the OPC UA Extractor. I have configured metadata-targets and metadata-mapping, but I am not able to see the metadata along with the time series. Please find the configuration in the attachment and a screenshot of how the time series looks in CDF. Please let me know if any additional configuration is needed to see the metadata along with the time series.
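For comparison, here is roughly the shape of the relevant section as I understand it from recent OPC UA Extractor versions. Treat the exact keys as an assumption and check the configuration reference for your version; the usual gotcha is that metadata is only written where a target is explicitly enabled:

```yaml
extraction:
  metadata-targets:
    clean:
      timeseries: true   # write mapped metadata onto the time series objects
      assets: true       # and onto generated assets
```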
Hi, I was wondering where I can find information about how to set up the permissions for a file extractor reading files from only a few SharePoint sites, i.e., without giving the extractor access to all sites. I have solved it using the Graph API, but this is quite cumbersome, so I was wondering if we have documentation on how this can be set up. I'll post my solution below for clarity on how I solved the issue:

You will need to create two applications: one that is used in the file extractor, and one to give access to the file extractor.

Register the File Extractor app:
1. Go to the Azure Active Directory portal.
2. Click on "Azure Active Directory", then click on "App registrations".
3. Click on "New registration" to register a new app.

Grant API permissions:
1. Once your app is registered, click on the app's name to go to its dashboard.
2. Click on "API permissions" in the left panel.
3. Click "Add a permission".
4. Choose "Mic
Looking at the API documentation (https://pr-411.docs.preview.cogniteapp.com/api/v1/), it seems one cannot build an automated pipeline to create and update 3D models using only the Python SDK? I aim to build a pipeline using a Cognite Function and the Python SDK for a nightly job that reads files from SharePoint and uploads or updates 3D models in CDF. If this is possible, where can I find a guiding document?
Hi, I'm trying to search for signals where I know some keywords in the name string. I'm not interested in fuzzy matches; I'd like to see only results that contain a certain phrase. This seems not to be possible unless I activate fuzzy search (which then gives too many results). Is there a way to activate "partial match" without having fuzzy search activated?
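If the search API itself doesn't offer it, one workaround is to list candidates and filter client-side for an exact substring. A stdlib-only sketch; `names` stands in for the name strings you would get back from a list call, and `partial_match` is a hypothetical helper:

```python
def partial_match(names, phrase, case_sensitive=False):
    """Return only names containing `phrase` as an exact substring:
    no fuzzy matching, so "pt1019" never matches "TT1019"."""
    if not case_sensitive:
        phrase = phrase.lower()
        return [n for n in names if phrase in n.lower()]
    return [n for n in names if phrase in n]


names = ["21PT1019.PV", "21TT1019.PV", "21PT2001.PV"]
print(partial_match(names, "pt1019"))  # ['21PT1019.PV']
```

This trades an extra (paged) list call for exact control over the match semantics, which is often acceptable when the candidate set is bounded by other filters such as a dataset or asset subtree.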