Hi Simen, The warning you receive when using the Constant value function (Function → Signal generator → Constant value), rather than the Constant node, appears because it's treated like any other function and calculation output, meaning it's subject to the 100k data point limit. Therefore, when the time period in view is large enough and/or the granularity you select is low enough, the calculation result is automatically downsampled. However, since the value of a constant is always the same, in most cases you can safely ignore that warning. It's actually best to choose a lower granularity and rely on the automatic data alignment built into the product to resample the data to the granularity needed by the other calculations you'll be performing. As long as you have at least two data points, one at the start and one at the end of the range in view on the plot, everything in between will be resampled to your constant value of choice. Note: Due to the nature of how our au…
Update: I investigated further with @simenri, and we discovered that there is in fact a bug: the Constant value function triggers the downsampling warning even when it produces well below 100k points. As a workaround until this bug is fixed, use either the Line function (with slope = 0 and intercept equal to your constant of choice) or the Constant node in your calculations.
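For intuition on why the Line workaround is equivalent, here's a minimal pandas sketch (my own illustration outside Charts; the function name and index are invented, not part of the product):

```python
import pandas as pd

def constant_as_line(index: pd.DatetimeIndex, value: float) -> pd.Series:
    """Emulate the Line workaround: slope = 0, intercept = value.
    With a zero slope, every point evaluates to the intercept,
    which is exactly a constant series."""
    slope = 0.0
    intercept = value
    # Elapsed hours since the first timestamp; irrelevant when slope is 0.
    t = (index - index[0]).total_seconds() / 3600.0
    return pd.Series(slope * t + intercept, index=index)

idx = pd.date_range("2022-01-01", periods=24, freq="h")
const = constant_as_line(idx, 5.0)  # every value is 5.0
```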
Hi @rsiddha and @stanleychiu, I believe I have a calculation workflow that will work for this use case. In the example above, I'm using a uniformly sampled time series (always 1 data point per hour) that produces binary results (1 = on or "good"; 0 = off or "bad"). By setting the range in view to exactly 1 year, I can be confident that there are exactly 8760 data points. If this isn't the case for your time series data, you will need to add some additional steps to your calculation (e.g. a Resample to granularity function). The final result of this calculation tells me that, for the past year, 98.9% of the values were "good" (i.e. = 1) for this time series from Feb. 14, 2021 to Feb. 14, 2022. Important note: if you're looking at a time series or time range that requires > 100k data points, this will not work, since the application will automatically downsample the time series input and fetch aggregates rather than individual data points (for the time being). If this is the case, one workaro…
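The arithmetic behind the workflow can be sketched in plain pandas (a hypothetical stand-in for the Charts steps; the series and the "bad" window below are made up):

```python
import pandas as pd

# Hypothetical hourly binary status series covering exactly one year:
# 1 = on/"good", 0 = off/"bad". 365 days * 24 points/day = 8760 points.
idx = pd.date_range("2021-02-14", periods=8760, freq="h")
status = pd.Series(1, index=idx)
status.iloc[:96] = 0  # pretend the first 4 days were "bad"

# With uniform sampling, the fraction of "good" values is just the mean.
pct_good = status.mean() * 100  # ~98.9 for this made-up example

# If sampling were irregular, resample to a fixed granularity first,
# analogous to the "Resample to granularity" step mentioned above.
status_uniform = status.resample("h").mean()
```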
@stanleychiu the same approach should still work, except that instead of inputting the same value (1 in my example above) in the Lower limit and Upper limit parameters of the Threshold function, you simply input the range that represents the "good" values. Whenever the sensor is within the bounds of the range you specify (i.e. is at a "good" value), the calculation outputs 1, and 0 otherwise. From there, the rest of the calculation works as before. Again, though, this assumes your pressure time series is uniformly sampled (no gaps, no variation in sampling frequency). When it comes to scaling this to more than 50 time series, this isn't something that can be done entirely via the UI today; it will require the assistance of a data scientist. It can be done by leveraging our Python SDK, Cognite Functions, and InDSL to recreate the calculation, and the flexibility of working directly in Python also lets you make the calculation more robust (e.g. Differ…
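As a rough illustration of the range-based version, here's a plain-pandas sketch (not the actual Charts Threshold implementation; the pressure values and limits are invented):

```python
import pandas as pd

def threshold(series: pd.Series, lower: float, upper: float) -> pd.Series:
    """Output 1 where the value lies within [lower, upper]
    (a "good" reading), 0 otherwise."""
    return ((series >= lower) & (series <= upper)).astype(int)

# Hypothetical pressure readings and "good" range.
pressure = pd.Series([9.5, 10.2, 14.8, 16.1, 12.0])
good = threshold(pressure, lower=10.0, upper=15.0)  # 0, 1, 1, 0, 1
pct_good = good.mean() * 100  # 60.0 here
```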
Hi @stanleychiu, Since we currently do not fetch data beyond the range in view in the chart plot, this has some implications for how the Sliding window integration function, as well as others (e.g. Shift time series), performs and can be used. Namely, there will be a gap at the start of the range equal to the window length. So when you've set 365 days as the window length and are only viewing 1 year of data, there are not enough data points for the calculation to use, since we are not fetching (in the background) the time series data points for earlier points in time (i.e. to the left of the data currently in the plot). If you'd like to calculate over a larger range, you can simply zoom out or change the date/time window in view. Note, though, that the 100k data point limit mentioned above still applies, and it matters here, since you're performing an integration. Adding the functionality to run calculations on a user-defined range of time, includi…
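The gap at the start is easy to reproduce with a toy pandas sketch (my own rectangle-rule approximation of a sliding-window integral, not the product's method; the signal and window are invented):

```python
import pandas as pd

# Hypothetical hourly signal; integrate over a sliding 24 h window.
idx = pd.date_range("2022-01-01", periods=72, freq="h")
signal = pd.Series(1.0, index=idx)  # flat signal: integral = window length

window = 24      # samples (24 h at 1 point per hour)
dt_hours = 1.0   # spacing between samples

# Rectangle-rule approximation of a sliding-window integral.
integral = signal.rolling(window).sum() * dt_hours

# The first (window - 1) results are NaN: with no data fetched to the
# left of the range, the start of the plot shows a gap equal to the
# window length, just as described above.
```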