Solved

dotNet REST API Cognite SDK example [Community Contributed]


Hello,

I am quite new to using REST APIs. My company has a license to use CDF. I also understood that Cognite has developed a .NET REST API SDK.

I searched a bit but could not find an example of how to use this SDK for data retrieval, especially for time series data. Could someone share a code snippet showing how to use it?

Thanks.

 


Best answer by Dilini Fernando 24 May 2023, 09:45


24 replies

Userlevel 4

Hi @samirhaj! To use the CDF .NET SDK you will first need to authenticate against your Identity Provider (IdP) to acquire a valid token. If you are using Azure Active Directory (AAD), you can use the MSAL .NET library for this purpose.

Here’s a minimal C# example that reads the AAD and associated CDF variables from environment variables, initiates an interactive flow to acquire a token, and uses the token to instantiate the CDF client. If you refer to the MSAL documentation, it’s straightforward to extend this example to other authentication flows.

// Required NuGet packages: Microsoft.Identity.Client (MSAL) and CogniteSdk
using System;
using System.Collections.Generic;
using System.Net.Http;
using CogniteSdk;
using Microsoft.Identity.Client;

string clientId = Environment.GetEnvironmentVariable("CLIENT_ID");
string tenantId = Environment.GetEnvironmentVariable("TENANT_ID");
string cluster = Environment.GetEnvironmentVariable("CDF_CLUSTER");
string project = Environment.GetEnvironmentVariable("CDF_PROJECT");

var scopes = new List<string> { $"https://{cluster}.cognitedata.com/.default" };

var app = PublicClientApplicationBuilder
    .Create(clientId)
    .WithAuthority(AzureCloudInstance.AzurePublic, tenantId)
    .WithRedirectUri("http://localhost")
    .Build();

AuthenticationResult result = await app.AcquireTokenInteractive(scopes).ExecuteAsync();
string accessToken = result.AccessToken;

var httpClient = new HttpClient();
var client = Client.Builder.Create(httpClient)
    .SetAppId("testNotebook")
    .AddHeader("Authorization", $"Bearer {accessToken}")
    .SetProject(project)
    .SetBaseUrl(new Uri($"https://{cluster}.cognitedata.com"))
    .Build();

Once you have instantiated a client, you can fetch some time series data points. Here’s a minimal example of how to fetch the last 7 days of data from a time series with externalId = "pi:160704" at 1h granularity:

var res = await client.DataPoints.ListAsync(new DataPointsQuery
{
    Start = "7d-ago",
    End = "now",
    Items = new List<DataPointsQueryItem>
    {
        new DataPointsQueryItem
        {
            ExternalId = "pi:160704",
            Aggregates = new List<string> { "average" },
            Granularity = "1h",
            Limit = 10_000
        }
    }
});
var ts = res.Items[0];
var dps = ts.AggregateDatapoints.Datapoints;

Thanks for your prompt reply @Everton Colling

One more question: in the code above, you declared the CogniteSdk client with “var”. Why wasn’t it initialized with an explicit type first (pseudo-code example: Client client = new Client())? I am asking because I want to use the client in all my methods, but if I define it as a local var it won’t be accessible in the later part of the code where the client fetches time series data (fetching needs a method that can be called repeatedly, while the instantiation only needs to happen once). Am I missing something here?

Userlevel 4

In this case, declaring the client with var or CogniteSdk.Client makes no difference. The final Build() call creates an instance of the client that you can use to fetch time series data points, as in the example above.
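To expand on the scoping part of the question: var is only local type inference and can’t be used for a field. If you want the client available in every method, store it in a field with its explicit type and initialize it once. Here’s a minimal language-level sketch of that pattern (using Uri as a stand-in for CogniteSdk.Client, so it runs without the SDK package):

```csharp
using System;

class Service
{
    // A field needs an explicit type; "var" only works for locals.
    private readonly Uri _baseUrl;

    public Service()
    {
        // "var url" and "Uri url" compile to exactly the same thing.
        var url = new Uri("https://example.com");
        _baseUrl = url; // initialize once, reuse in every method
    }

    public string Host() => _baseUrl.Host;
}

class Program
{
    static void Main()
    {
        var svc = new Service();
        Console.WriteLine(svc.Host()); // prints "example.com"
    }
}
```

The same shape applies to the CDF client: build it once in a constructor (or an initialization method) and assign it to a field of type CogniteSdk.Client.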

Userlevel 6
Badge

Hi @samirhaj checking to see if you need more information on the above? 

Hi @Anita Hæhre : There were some specifics regarding our own company interface with Cognite, so I continued the discussion with @Everton Colling in the private session. 

Userlevel 6
Badge

Thanks @samirhaj, I’ll then close this thread. Do let the community know if we can help again with this or other issues, we’re here to help:) 

Hi @Everton Colling 

Thanks for your reply.

My ultimate goal is to convert the data I pull from the Cognite interface into a CSV file (with DateTime in the first column and each data type, identified by its ExternalId, in its own column), so I would rather convert the output, which currently looks like this:

{ DateTime = 3/23/2023 21:00:00, Average = 13.713887271148339 }

{ DateTime = 3/23/2023 22:00:00, Average = 13.662878633528655 }

into a format where I won’t see the words “DateTime = “ and “Average = ” on each row of data. I mean something like the following, if possible:

DateTime                     Average

3/23/2023 21:00:00      13.713887271148339

3/23/2023 22:00:00       13.662878633528655

...

One other question: in the Console.WriteLine() call in the code there is no string “DateTime = “ or “Average = “, so I wonder how these words were printed to the console?

 

Best regards.

Userlevel 4

Hi @samirhaj!

In C#, when you create a new object using the syntax new { propertyName = propertyValue }, you are creating an anonymous type object. When you omit the property name, as in new { propertyValue }, the compiler automatically generates a property name based on the name of the variable or member you passed as the value.

When you call Console.WriteLine with an anonymous type object as its argument, it automatically calls the object’s ToString() method to obtain a string representation, which includes the names and values of its properties.
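To make this concrete, here’s a small standalone sketch (the values are illustrative) showing where those labels come from:

```csharp
using System;

class AnonymousTypeDemo
{
    static void Main()
    {
        // Explicit property names, as in the Select(...) projection earlier:
        var dp = new { DateTime = new DateTime(2023, 3, 23, 21, 0, 0), Average = 13.713887271148339 };

        // Console.WriteLine calls dp.ToString(), which renders the property
        // names and values — this is where "DateTime = " and "Average = "
        // come from in the console output.
        Console.WriteLine(dp);

        // With the name omitted, the compiler infers it from the variable name:
        int count = 3;
        var inferred = new { count };
        Console.WriteLine(inferred); // prints "{ count = 3 }"
    }
}
```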

As for creating a CSV with the data, it’s quite simple; here’s a minimal example of how to do it:

using (var writer = new StreamWriter("output.csv"))
{
    writer.WriteLine("DateTime,Average");
    foreach (var dp in dps)
        writer.WriteLine($"{dp.DateTime},{dp.Average}");
}

In this example we use StreamWriter, which is part of the System.IO namespace. We first write the CSV header, then iterate over the dps items, writing only their values. The data will be saved into a file called output.csv.

You can adapt the example above and change the header to include the ExternalId instead of the aggregate name.

Ok, Thanks for your reply.

One more question here: to set how often we get data points, should we configure the Granularity parameter, or something else? For example, what if I want data every 1 second, 10 seconds, or 1 minute? I tried setting Granularity to “0.5h” but it crashed.

Could you please advise?

 

Best regards.

 

Userlevel 4

For 0.5h you should use a granularity of 30m (the granularity count must be an integer). You can refer to the Cognite API documentation for more information about valid options for granularity and aggregates: https://docs.cognite.com/dev/concepts/aggregation/#granularity.

Hmmm … Interesting … The code throws an exception with a Granularity of 2h, 30m, and 1m. When I changed back to Granularity = 1h, which used to work before, it throws the same exception as well!

It returns null for dps, where dps is:

var dps = ts.AggregateDatapoints.Datapoints.Select(
    dp => new
    {
        DateTimeOffset.FromUnixTimeMilliseconds(dp.Timestamp).DateTime,
        dp.Average
    });

And says:

System.NullReferenceException: 'Object reference not set to an instance of an object.'

Com.Cognite.V1.Timeseries.Proto.DataPointListItem.AggregateDatapoints.get returned null.
 

Is this because Cognite does not receive this data from the source when I am running the code? Because this worked for 1h granularity before with no problem.

 

Best regards.

Userlevel 4

So, what’s probably happening is that there are no data points inside the time window you are requesting for the selected time series. In these situations the AggregateDatapoints object is null (empty list of Datapoints) and the code above will fail.

You can catch such a scenario by checking for null references before calling the Select method in the LINQ query. One way to do this is to use the null-conditional operator (?.) to check that ts.AggregateDatapoints is not null before accessing its Datapoints property.

Here's a minimal example on how to deal with these cases:

var dps = ts.AggregateDatapoints?.Datapoints?.Select(
    dp => new
    {
        DateTimeOffset.FromUnixTimeMilliseconds(dp.Timestamp).DateTime,
        dp.Average
    }
);
if (dps == null)
{
    throw new Exception("No datapoints were available in the selected time window.");
}

In this example, the ?. operator checks if ts.AggregateDatapoints is not null before accessing its Datapoints property. If either ts.AggregateDatapoints or Datapoints is null, the Select method is not called and dps will be null.

After the LINQ query, you can check if dps is null and throw an exception if it is. You can customize the error message in the Exception constructor to provide more specific information about the error. If you prefer, you can also just print to the console instead of throwing an error (it depends on your use case).

You can also check the timestamp of the latest available data point for a given time series by calling the DataPoints.LatestAsync method.

Here's a minimal example on how to check the timestamp of the latest datapoint available in Cognite Data Fusion for a given time series (specified as tsExternalId):

var resLatestDps = await client.DataPoints.LatestAsync(new DataPointsLatestQuery
{
    Items = new List<IdentityWithBefore> { new IdentityWithBefore(tsExternalId, "now") }
});
var latestTimeStamp = resLatestDps.First().DataPoints?.First().Timestamp;
if (latestTimeStamp != null)
{
    var latestTimeStampString = DateTimeOffset.FromUnixTimeMilliseconds(
        latestTimeStamp.Value
    ).DateTime;
    Console.WriteLine(
        $"Latest data point found at {latestTimeStampString} for time series {tsExternalId}\n"
    );
}

 

Hi,

Thanks. That was right; there were no data points in the selected time interval.

But now, after selecting a new time interval, the issue is that when I select a Granularity of “5m” and then “10s”, the timestamps of the data I get are always 10 minutes apart. Why can’t I get data with the granularity I want? I have selected a specific ExternalId for this.

 

Best regards.

Userlevel 4

The aggregates are calculated on top of the original data that was ingested from the source system into Cognite Data Fusion (CDF). As described in the documentation, CDF doesn't return aggregates or interpolations for time ranges that have no data points.

For example, if the original data contains data points with 1s granularity (a new data point is measured and stored every second), there should be no issue requesting aggregates with 10s, 1m, 10m, or 1h granularities. However, if the original data contains data points with 1m granularity, asking for a granularity of 10s will still yield a new value only once per minute.

If you need to get a value for each 10s, you will need to upsample your data manually by implementing some sort of interpolation.
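As one way of doing that, here’s a minimal sketch of manual upsampling by linear interpolation. This helper is not part of the Cognite SDK; the name, the tuple shape, and the 10-second grid are all illustrative:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

class UpsampleDemo
{
    // Linearly interpolate (timestamp-in-ms, value) samples onto a fixed grid.
    // Assumes at least two points with distinct timestamps.
    public static List<(long Ts, double Value)> Upsample(
        List<(long Ts, double Value)> points, long stepMs)
    {
        var result = new List<(long Ts, double Value)>();
        if (points.Count < 2) return result;
        var ordered = points.OrderBy(p => p.Ts).ToList();
        int i = 0;
        for (long t = ordered[0].Ts; t <= ordered[ordered.Count - 1].Ts; t += stepMs)
        {
            // Advance to the segment [i, i + 1] that contains t.
            while (ordered[i + 1].Ts < t) i++;
            var (t0, v0) = ordered[i];
            var (t1, v1) = ordered[i + 1];
            double frac = (double)(t - t0) / (t1 - t0);
            result.Add((t, v0 + frac * (v1 - v0)));
        }
        return result;
    }

    static void Main()
    {
        // Two raw points one minute apart, upsampled to a 10 s grid.
        var raw = new List<(long Ts, double Value)> { (0, 10.0), (60_000, 16.0) };
        foreach (var (ts, v) in Upsample(raw, 10_000))
            Console.WriteLine($"{ts}: {v}");
    }
}
```

You could feed the (Timestamp, Value) pairs returned from CDF into a helper like this to get a value on every 10 s tick.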

Aggregates are convenient calculations that help deal with large amounts of data, but if you want, you can also look at the original data that was ingested into CDF by passing null to both Aggregates and Granularity. Here’s a minimal example to request the original data points that were ingested into CDF:

var resDps = await client.DataPoints.ListAsync(new DataPointsQuery
{
    Start = "2d-ago",
    End = "now",
    Items = new List<DataPointsQueryItem>
    {
        new DataPointsQueryItem
        {
            ExternalId = tsExternalId,
            Aggregates = null,
            Granularity = null,
            Limit = 10_000
        }
    }
});
var ts = resDps.Items[0];

var dps = ts.NumericDatapoints?.Datapoints?.Select(
    dp => new
    {
        DateTimeOffset.FromUnixTimeMilliseconds(dp.Timestamp).DateTime,
        dp.Value
    });
if (dps == null)
    throw new Exception("No data points were located in the selected time window.");

Console.WriteLine($"Data fetched from time series {tsExternalId}:");
foreach (var dp in dps)
    Console.WriteLine(dp);

Note that with this request we are dealing with NumericDatapoints rather than AggregateDatapoints, and the data points no longer contain an aggregate; instead, they expose the Value property.

Userlevel 6
Badge

Hi @samirhaj, checking in to see if you need more information from us on the above, or if you’re all set? 

@Everton Colling For the follow up of the previous questions and as a new user I have two questions now;

  • Could we use the Cognite C# SDK as a general REST API client (similar to what RestSharp does, for example), or is the Cognite SDK designed specifically to access and retrieve data from Cognite/CDF?
  • What is the difference between the Cognite C# SDK and the Cognite Python SDK? Do both have more or less the same functionality built in? When using the Python SDK, once the token is authenticated to access CDF, the message says you are connecting to “Cognite-Postman”, but there is no such message in the C# SDK. Is this because with Python you could use Postman for data analytics? Moreover, most of the data processing that Postman does also exists in CDF, like aggregation, averaging, etc. Are there specific functionalities that Postman has and CDF doesn’t?

Best regards.

Userlevel 4

Could we use the Cognite C# SDK as a general REST API client (similar to what RestSharp does, for example), or is the Cognite SDK designed specifically to access and retrieve data from Cognite/CDF?

The Cognite .NET SDK was built to add convenience to developers while creating solutions that need to call endpoints exposed by the CDF API.

What is the difference between the Cognite C# SDK and the Cognite Python SDK? Do both have more or less the same functionality built in? When using the Python SDK, once the token is authenticated to access CDF, the message says you are connecting to “Cognite-Postman”, but there is no such message in the C# SDK. Is this because with Python you could use Postman for data analytics? Moreover, most of the data processing that Postman does also exists in CDF, like aggregation, averaging, etc. Are there specific functionalities that Postman has and CDF doesn’t?

The Cognite .NET SDK is a Community SDK (you can check the other available SDKs here), and as such is not officially supported as a Product. It does not contain implementation for all endpoints available in CDF and updates are made by the community. On the other hand, the Python SDK is an official SDK that is actively maintained by Cognite and covers a wider range of Cognite API endpoints. You can also connect to CDF API using Postman, check the link above for more details.

Badge +1

@Everton Colling Can you please help with how we can pass the data set id in the above requests?

Userlevel 4

DataSets can be queried like any other CDF resources. Here is an example of a query to list DataSets:

var res = await client.DataSets.ListAsync(
    new DataSetQuery{Limit=10}
);

DataSets can also be used to filter queries to CDF. Each DataSet has Id and ExternalId properties that can be used as filtering arguments. Here's an example of how to filter a TimeSeries query with a DataSetId:

var res = await client.TimeSeries.ListAsync(
    new TimeSeriesQuery{
        Filter = new TimeSeriesFilter{
            Unit="barg",
            DataSetIds=new List<Identity>{ 
                new Identity(2452112635370053) 
            }
        },
        Limit=2
    }
);

Note that the DataSetIds argument of the Filter accepts a list of objects of the type CogniteSdk.Identity. The Identity class is used to set either Id or ExternalId. In the example above I used the Id to reference the desired DataSetId.
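If you’d rather reference the data set by its ExternalId, the same filter can be written with the string overload of Identity, which selects ExternalId instead of Id. The external id below is a placeholder, and this fragment assumes the client built earlier in the thread:

```csharp
// "my-dataset-external-id" is a placeholder — use your DataSet's ExternalId.
var res = await client.TimeSeries.ListAsync(
    new TimeSeriesQuery{
        Filter = new TimeSeriesFilter{
            DataSetIds = new List<Identity>{
                new Identity("my-dataset-external-id")
            }
        },
        Limit = 2
    }
);
```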

Badge +1

@Everton Colling I will try it out. Appreciate your help.

Userlevel 4
Badge +2

Hi @Adarsh Dhiman,

Did you manage to try out Everton’s suggestions? 

Best regards,
Dilini

Userlevel 4
Badge +2

Hi @Adarsh Dhiman,

I hope you were able to solve your issue. As of now, I will close this thread. If you have any questions, please feel free to post here.

Best regards,
Dilini 

Badge +1

Hi @Dilini Fernando

I started my work on time series just a few days ago. I have yet to test it.

Userlevel 4
Badge +2

Hi @samirhaj,

We appreciate your contribution to our community hub! We have chosen to move your article to our hub's How-To section as it will greatly benefit other members of our community. Thank you for your understanding, and we look forward to seeing more great contributions from you in the future! 

Best regards,
Dilini 

 
