I am trying to connect to an MQTT broker with Cognite's MQTT Extractor. I know the broker is fully functional; I have verified it with several tools. However, when completing the "Add MQTT Topic" step, the error code indicates that the API does not accept "Datapoints" as "Target type", even though "Target type" is a mandatory field in the interface window and "Datapoints" is the only available selection. Any quick feedback to help me progress my testing is appreciated!
When I run the query below, I get an error. If I increase the end time by one second I get a result, but it covers two days and I only want one. All my data is consistently daily at midnight, UTC. Any ideas how to get past this?

```sql
SELECT dp.id, dp.externalId AS `key`, dp.timestamp, dp.value
FROM `_cdf`.`datapoints` dp
WHERE dp.externalId IN (
    'f18ae31cbfe544cb7bca08da91e5245a-SuFMiCo0f7c8eca4ce444208853aedd00ded4fb',
    '730619216b764ba77bc708da91e5245a-SuFMiCo6bb90df0efbc461d863faedd00df009b'
  )
  AND dp.timestamp >= TO_TIMESTAMP('2022-09-01T00:00:00Z')
  AND dp.timestamp < TO_TIMESTAMP('2022-09-02T00:00:00Z')
```

This gives me a result, but not what I want: cc @Torgrim Aas @Sunil Krishnamoorthy
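The behaviour described (an exclusive end bound failing, and a one-second-larger bound returning two days) is consistent with the end bound being treated as inclusive somewhere in the pipeline. As a minimal plain-Python illustration (not the CDF Spark source itself, just the interval semantics) of why daily midnight data is sensitive to this:

```python
from datetime import datetime, timezone

# Daily datapoints at midnight UTC, as described in the question.
points = [datetime(2022, 9, d, tzinfo=timezone.utc) for d in (1, 2, 3)]

start = datetime(2022, 9, 1, tzinfo=timezone.utc)
end = datetime(2022, 9, 2, tzinfo=timezone.utc)

# Half-open interval [start, end): only the Sep 1 midnight point matches.
half_open = [t for t in points if start <= t < end]

# Closed interval [start, end]: the Sep 2 midnight point also matches,
# producing the "two days" result seen when the end bound is inclusive.
closed = [t for t in points if start <= t <= end]

print(len(half_open))  # 1
print(len(closed))     # 2
```

If the underlying source treats the end bound as inclusive, subtracting one millisecond from the end timestamp is a common workaround for midnight-aligned data.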
We are currently migrating to OIDC, where we need to grant access through access groups linked to Azure AD. On our Statnett cluster it seems that a user must explicitly be a member of a group "transformations" in order to delete (or edit) a transformation. The admin group has the capabilities (on "test"):

{'transformationsAcl': {'actions': ['READ', 'WRITE'], 'scope': {'all': {}}}}

But we need to log in via the legacy (non-OIDC) login, with a service account explicitly linked to the group "transformations", in order to delete a transformation, even though the group "transformations" has no capabilities set. I have tried both Fusion and the API/Python SDK (read is possible):

CogniteAPIError: Transformation not found. This may also be due to insufficient access rights. | code: 403 | X-Request-ID: b7c0beb6-d3e0-9ec4-ba50-895533ac1996
Training: Learning to use the Python SDK, Data Engineer Basics. Notebook: 1_Authentication.ipynb. Error when executing "Authenticate with Client Secret". I passed in the client secret created in the Fundamentals training. Error message:

CogniteAuthError: Error generating access token: invalid_client, 401, AADSTS7000215: Invalid client secret provided. Ensure the secret being sent in the request is the client secret value, not the client secret ID, for a secret added to app 'fab52bb5-9de2-4f9e-aefa-712da4b5fe00'.
Trace ID: 72340cc2-c318-41b4-9679-e66b72ea8c01
Correlation ID: 6dfcf480-f2d6-4454-9ff8-b9973e373ee8
Timestamp: 2022-11-28 01:59:50Z
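AADSTS7000215 almost always means the client secret's *ID* (a GUID visible in the Azure portal) was pasted where the secret *value* belongs; the value is only shown once, at creation time. As a stdlib-only sketch (tenant and secret values below are hypothetical placeholders) of roughly the client-credentials token request made under the hood:

```python
from urllib.parse import urlencode

# Hypothetical values for illustration only.
tenant_id = "11111111-2222-3333-4444-555555555555"
client_id = "fab52bb5-9de2-4f9e-aefa-712da4b5fe00"
client_secret_value = "the-secret-VALUE-shown-once-at-creation"  # NOT the secret's GUID ID

token_url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"

# Form body for the OAuth 2.0 client-credentials grant (built, not sent).
body = urlencode({
    "grant_type": "client_credentials",
    "client_id": client_id,
    "client_secret": client_secret_value,
    "scope": "https://api.cognitedata.com/.default",
})
print(token_url)
print(body)
```

If `client_secret` in this body is a GUID rather than the generated value, Azure AD returns exactly the 401 shown above.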
Hi! Almost three years ago we made a feature request regarding search. Specifically, we want to:

- Have support for paging.
- Do grouping on metadata fields.
- Define which fields yield relevance to the search results.
- Use inclusive search terms (AND instead of OR).

I understand that satisfying the search requirements of all your users is a daunting task, but perhaps it would be easier to let us do the job instead? I believe it would be very beneficial if we were able to access Elasticsearch directly in some manner. And please keep in mind that the data itself defines relevancy. No two customers are alike, and therefore a good generic solution likely doesn't exist. @Knut Vidvei @Andreea Pastinaru
I just finished the Cognite Academy examples on contextualization. On the P&ID contextualization example, I noticed quite a few errors in what seems to be the character-recognition pipeline that identifies tags in the PDF. The tutorial stated that "these were all good" and that "we can accept all". I suspect such a process would create missing or strange links in the contextualized dataset. Are these known issues? I lack some understanding of how important these mispredictions are in context, but I thought I would report them anyway just in case. Happy to support you in improving these if they are something that needs improvement.
How can I learn the Cognite API using the JavaScript SDK?
We have a multilevel asset hierarchy in CDF, and only the ground-level assets have time series linked. We are trying to create relationships between every parent asset and its child assets' time series, using the Python SDK code snippets attached here for your reference. After running the relationship code, the SDK reports that the relationships were created. But in the UI they do not show up under Assets, although we can see them under Time series. We tried interchanging the source and target between assets and time series, with the same result. One unique observation: we can see the relationships for just one asset, which has 3 time series linked; the rest do not show (they have more than 20 relations each, even 500 for the root asset). We would like to know whether we are following the correct approach, or whether there are restrictions in CDF. Please help us in this regard.

```python
from cognite.client import CogniteClient
from cognite.client.data_classes import Relationship
```
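To illustrate the linking pattern being attempted, here is a plain-Python sketch (hypothetical names, no CDF calls) that walks a small hierarchy and produces one (asset, time series) pair per ancestor of each leaf, which is the set of relationships to create:

```python
# Hypothetical asset hierarchy: child external ID -> parent external ID.
parent_of = {
    "pump_01": "area_a",
    "area_a": "plant_root",
}

# Time series linked to the ground-level asset.
ts_of_asset = {"pump_01": ["pump_01_pressure", "pump_01_temp"]}

def ancestors(asset: str) -> list[str]:
    """Return the asset itself plus all of its ancestors, bottom-up."""
    chain = [asset]
    while asset in parent_of:
        asset = parent_of[asset]
        chain.append(asset)
    return chain

# One relationship per (ancestor asset, time series) pair.
pairs = [
    (anc, ts)
    for asset, series in ts_of_asset.items()
    for anc in ancestors(asset)
    for ts in series
]

print(pairs)  # 3 ancestors x 2 time series = 6 relationship pairs
```

Each pair would then become one `Relationship` with the asset as one resource and the time series as the other; the sketch only shows how the pair set grows multiplicatively with hierarchy depth, which may explain the large relationship counts on upper-level assets.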
I have imported an asset hierarchy table and a time series table, but I want to link assets to time series (or vice versa). The asset hierarchy table looks like:

level 1, level 2.e1, entity 1
level 1, level 2.e1, entity 2
level 1, level 2.e2, entity 3

The time series table looks like:

t1, entity 1, x1, y1
t2, entity 1, x2, y2

When I click asset "entity 1" in the Explore view, it shows a time series data count, but the name and description of the time series are empty; when I click a time series, the count of linked assets is 0. I do not know how to fix this issue.
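Linking generally requires each time series to carry the internal ID of its matching asset. As a hedged, plain-Python sketch (hypothetical column names and IDs, no CDF calls) of joining the two tables on the entity name to produce those links:

```python
# Hypothetical rows: assets with internal IDs, time series referencing entity names.
assets = [
    {"id": 101, "name": "entity 1"},
    {"id": 102, "name": "entity 2"},
    {"id": 103, "name": "entity 3"},
]
timeseries = [
    {"externalId": "t1", "entity": "entity 1"},
    {"externalId": "t2", "entity": "entity 1"},
]

# Join on the entity name to find the assetId each time series should carry.
asset_id_by_name = {a["name"]: a["id"] for a in assets}
updates = [
    {"externalId": ts["externalId"], "assetId": asset_id_by_name[ts["entity"]]}
    for ts in timeseries
    if ts["entity"] in asset_id_by_name
]

print(updates)
```

In CDF this join is typically done in a transformation that writes to the time series resource, setting `assetId` per row; the sketch only shows the matching logic.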
I have a helper function that returns a CogniteClient via interactive OAuth authentication. This worked literally yesterday, but gives me an error today. This is my function definition:

```python
import os
from typing import Union

from cognite.client import CogniteClient, ClientConfig
from cognite.client.credentials import OAuthInteractive


def authenticate_to_cognite_oauth(
    project: str = "akerbp",
    client_name: str = "Employee",
    client_id: Union[str, type(None)] = None,
    tenant_id: Union[str, type(None)] = None,
    base_url: str = "https://api.cognitedata.com",
) -> CogniteClient:
    """Authenticate to Cognite via OAuth interactive login.

    Return cognite.client.CogniteClient.
    """
    if client_id is None:
        client_id = os.getenv("CDF_CLIENT_ID")
    if tenant_id is None:
        tenant_id = os.getenv("CDF_TENANT_ID")
    authority_url = f"https://login.microsoftonline.com/{tenant_id}"
    scopes = [f"{base_url}/.default"]
    creds = OAuthInteractive(
        authority_url=authority_url,
        client_id=client_id,
        scopes=scopes,
    )
    config = ClientConfig(
        client_name=client_name,
        project=project,
        credentials=creds,
        base_url=base_url,
    )
    return CogniteClient(config)
```
I have been having issues with Cognite Charts since it was merged into CDF. Whenever I try to go to Charts, I have to clear my browser cache for it to load; otherwise it doesn't. This is a small nuisance, but it can prevent people from adopting Charts as their daily tool.
I am doing the Working with CDF: Integrate module, but I get the following SQL query error:

```
An error occurred when running this query
Mismatched input ''rocio1975'' expecting {'(', 'ADD', 'AFTER', 'ALL', 'ALTER', 'ANALYZE', 'AND', 'ANTI', 'ANY', 'ARCHIVE', 'ARRAY', 'AS', 'ASC', 'AT', 'AUTHORIZATION', 'BETWEEN', 'BOTH', 'BUCKET', 'BUCKETS', 'BY', 'CACHE', 'CASCADE', 'CASE', 'CAST', 'CHANGE', 'CHECK', 'CLEAR', 'CLUSTER', 'CLUSTERED', 'CODEGEN', 'COLLATE', 'COLLECTION', 'COLUMN', 'COLUMNS', 'COMMENT', 'COMMIT', 'COMPACT', 'COMPACTIONS', 'COMPUTE', 'CONCATENATE', 'CONSTRAINT', 'COST', 'CREATE', 'CROSS', 'CUBE', 'CURRENT', ...
```
While setting up credentials for a transformation, I am getting this error:

Error: Session Create Error: Request failed: Status code 401

I have tried creating it multiple times with no luck. Any help, please?
Cognite Function throwing a Bad Gateway error. Function ID: 4525036003390684. Here are the details:

error: Bad Gateway | code: 502 | X-Request-ID: d84d076d-6e59-9362-b58e-931d5a63644f
I have a set of work orders with the following schema, along with sample data for two assets (10010234, 10010235):

| Equipment | Work Order # | Order Type | Start Date | Start Time | End Date | End Time | Cost |
|---|---|---|---|---|---|---|---|
| 10010234 | 110025063 | Unplanned | 7/6/2018 | 11:31:15 | 7/6/2018 | 23:00:00 | 35,600 |
| 10010234 | 110026082 | Planned | 8/25/2018 | 12:27:15 | 8/27/2018 | 23:44:00 | 37,180 |
| 10010234 | 110027101 | Unplanned | 12/8/2018 | 13:30:15 | 12/8/2018 | 17:00:00 | 26,580 |
| 10010234 | 110028120 | Unplanned | 1/27/2019 | 14:40:15 | 1/29/2019 | 1:45:00 | 38,900 |
| 10010235 | 110023050 | Planned | 3/1/2018 | 15:57:15 | 3/2/2018 | 3:02:00 | 25,000 |
| 10010235 | 110024617 | Planned | 6/14/2018 | 17:21:15 | 6/16/2018 | 4:30:00 | 30,000 |
| 10010235 | 110026184 | Planned | 9/27/2018 | 0:25:00 | 9/27/2018 | 6:09:00 | 24,600 |
| 10010235 | 110027751 | Unplanned | 1/10/2019 | 20:30:15 | 1/12/2019 | 7:59:00 | 35,600 |

Equipment is stored as assets and the work orders are stored a
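To show how rows like these can be rolled up into per-equipment metrics, here is a small stdlib-only sketch (the column handling is an assumption based on the sample above, using a subset of the rows) that parses the start/end timestamps and sums cost and duration per asset:

```python
from datetime import datetime

# (equipment, work order, order type, start, end, cost) from the sample table.
rows = [
    ("10010234", "110025063", "Unplanned", "7/6/2018 11:31:15", "7/6/2018 23:00:00", 35600),
    ("10010234", "110026082", "Planned", "8/25/2018 12:27:15", "8/27/2018 23:44:00", 37180),
    ("10010235", "110023050", "Planned", "3/1/2018 15:57:15", "3/2/2018 3:02:00", 25000),
]

FMT = "%m/%d/%Y %H:%M:%S"

totals: dict[str, dict[str, float]] = {}
for equipment, _wo, _otype, start, end, cost in rows:
    # Work-order duration in hours from the parsed timestamps.
    hours = (datetime.strptime(end, FMT) - datetime.strptime(start, FMT)).total_seconds() / 3600
    agg = totals.setdefault(equipment, {"hours": 0.0, "cost": 0})
    agg["hours"] += hours
    agg["cost"] += cost

print(totals)
```

In CDF these aggregates would typically be computed over events linked to the equipment assets; the sketch only demonstrates the timestamp parsing and per-asset grouping.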
How can I fetch documents from multiple folder locations using a single config.yml file with the Cognite Documentum Extractor? How do I write multiple DQL queries in the same config.yml, and which configuration parameters should I use? I am working with the Cognite Documentum Extractor in DFC Java SDK mode for connectivity.
I was trying to complete the learning exercise on contextualization. I am not able to find the dataset 'saurabh1985-IFSDB' that I created in the initial exercise. Please check the image below. Can you please help me with this issue? Thanks, Saurabh Lale
Hi team, we are working with the Cognite PI Extractor to fetch data from an OSIsoft PI Data Archive. From the documentation we understood that the OSIsoft PI Web API or the PI SDK can be used to retrieve the data we need from the OSIsoft PI Server, but the Cognite PI Extractor's documentation does not specifically mention which method it uses internally. Is it using the PI SDK, the PI AF SDK, or the PI Web API?
URGENT: Hi, we are unable to deploy a Cognite Function. Below is the snippet for the same. Function ID: 269293733668660. @Philippe Bettler, could you please prioritize this issue?
I have found a problem reading time series from Cognite with Power BI. The first bit of Power Query has a filter to limit the time series retrieved to just the ones I want. The second one is the same, but without the filter. What I have found is that adding the filter causes duplicate rows in the table. Once the time series rows are duplicated, the time series aggregation values are also duplicated. I know the first query is not the most performant option, but it should still return the correct answer.

```
let
    Source = Cognite.Contents(#"cogniteContentsParameters", #"cdfEnvironment"),
    Timeseries_table = Source{[Name="Timeseries", Signature="table"]}[Data],
    #"Filtered Rows" = Table.SelectRows(Timeseries_table, each ([IsString] = false)),
    #"Removed Columns" = Table.RemoveColumns(#"Filtered Rows", {"MetaData", "SecurityCategories", "IsStep", "DataSetId", "CreatedTime", "LastUpdatedTime", "Asset", "Latest", "DataPoints", "StringPoints", "Aggregate"}),
    #"Filtered Rows1" = Tabl
```
Hey, while performing an incremental filter operation, I am not getting the exact ranges I applied for RangeStart and RangeEnd. Can someone help me out? Thanks, Sharath
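One common cause is the bound convention: Power BI's incremental refresh expects the filter to include `RangeStart` and exclude `RangeEnd` (one inclusive, one exclusive bound, so adjacent partitions never overlap or drop rows). As a plain-Python analog of that filter (illustrative dates only, not Power Query itself):

```python
from datetime import datetime

# Power BI incremental refresh convention: >= RangeStart, < RangeEnd.
range_start = datetime(2023, 1, 1)
range_end = datetime(2023, 2, 1)

rows = [
    datetime(2022, 12, 31),  # before the window: excluded
    datetime(2023, 1, 1),    # equals RangeStart: included
    datetime(2023, 1, 15),   # inside the window: included
    datetime(2023, 2, 1),    # equals RangeEnd: excluded
]

kept = [t for t in rows if range_start <= t < range_end]
print(len(kept))  # 2
```

If both bounds are made inclusive (or both exclusive), rows on partition boundaries are duplicated or lost, which would look like "not getting the exact ranges applied".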
How do I fix this warning? I cannot answer the quiz question because I am not getting any data for the "Temp In: Gaps detection" calculation.
Hi, thanks for sharing great course material for data scientists; it helped me understand the DS workflow in Cognite. In learning module part 5, Deploying a Cognite Function with GitHub Actions, I am interested in exploring the Grafana dashboard. Kindly provide access to the link below:

https://grafana-tech-sales.cogniteapp.com/d/EgxLOhE7x/heatex-demo?orgId=1

With regards, N. Shailaja
Hi team, while running the entity_matching_notebook I am facing the following error. I need your suggestion, please. Regards, Navin
Hi, as part of the exercise "Match time series to assets", I am not able to see the dataset I created (Pavan1978-IFSDB) when I use Quick Match.