Hi Team,

Is there any sample code available for retrieving a list of instance objects from FDM? I've reviewed the documentation's sample code, but I need additional clarification on where to provide the Space, Object Type, Version, and Data Model. The sample code available for listing instances is:

instance_list = c.data_modeling.instances.list(limit=5)

I have also followed the steps outlined in the documentation for advancedListInstance, queryContent, and searchInstances.

Thanks,
Karguvel K
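A minimal sketch of how those pieces might fit together, assuming a v7-style SDK: the Space, Object Type, and Version are usually expressed as a `ViewId(space, external_id, version)` passed in `sources`, while the data model itself is not needed just to list instances. The helper name and the example identifiers below are hypothetical.

```python
def list_instances(client, view_id, instance_space=None, limit=5):
    # `view_id` identifies the object type, e.g.
    # ViewId("my_model_space", "MyObjectType", "v1") from
    # cognite.client.data_classes.data_modeling (assumed v7 SDK signature).
    return client.data_modeling.instances.list(
        instance_type="node",
        sources=view_id,        # which view (object type + version) to read through
        space=instance_space,   # which space the instances themselves live in
        limit=limit,
    )
```

Note that the space inside the `ViewId` is the model's space, which may differ from the space the instances are stored in.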
Hi,

We have a web application that uses the Cognite SDK to present time series and datapoints. We have two use cases:

1. We have 500 time series and need to retrieve the latest 1000 datapoints from the combined 500 time series.
2. We have 500 time series and need to retrieve all datapoints between two dates.

For both cases we also need to know which time series each datapoint belongs to, and we need raw values; we cannot use aggregates. What is the most effective approach to achieve this?
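One possible approach, sketched under the assumption that `retrieve_dataframe` accepts a list of external ids and returns one column per series (so every datapoint stays attached to its series). For case 1, the recent-window size is a guess and would need widening if fewer than 1000 points come back:

```python
import pandas as pd

def newest_n(df: pd.DataFrame, n: int) -> pd.DataFrame:
    """Pure reshaping: wide frame (one column per series) -> long frame,
    keeping only the n newest datapoints across all series."""
    long = (
        df.stack()                                  # one row per (timestamp, series)
        .rename_axis(["timestamp", "external_id"])
        .reset_index(name="value")
    )
    return long.sort_values("timestamp").tail(n)

def latest_1000_across(client, external_ids, window="7d-ago"):
    # Case 1 (sketch): fetch a recent window of raw points for all 500 series
    # in one batched call, then keep the 1000 newest overall.
    df = client.time_series.data.retrieve_dataframe(
        external_id=external_ids, start=window, end="now", limit=None
    )
    return newest_n(df, 1000)

def all_points_between(client, external_ids, start, end):
    # Case 2: one batched call; column names identify the series.
    return client.time_series.data.retrieve_dataframe(
        external_id=external_ids, start=start, end=end, limit=None
    )
```

The long format from `newest_n` keeps the series id on every row, which answers the "which time series does this datapoint belong to" requirement without aggregates.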
How do I perform an upsert using this dataframe? I also need to associate the time series with a specified asset. How do I use this dataframe and associate it with an asset id when calling:

client.time_series.data.insert_dataframe(df)

I have a dataframe that looks like this:

DATE      LP_Crude API_Meter LP_FRN_KERO_SW_CUTPT LP_KERO_DSL_SW_CUTPT LP_DSL_AGO_SW_CUTPT LP_MVG_HVGO_SW_CUTPT
1/1/2023  28.75472705  271.8662  440.9032  680.3416  928.875
1/2/2023  28.21111702  269.3863  466.2317  686.5167  924.1292
1/3/2023  27.78340123  268.8638  484.6189  684.5542  919.9917
1/4/2023  27.5781529   269.3117  506.7792  686.6708  921.9375
1/5/2023  28.03229217  268.1567  500.2208  688.2208  917.575

Please advise.
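A sketch of one way this could work, assuming the column names double as time series external ids: `insert_dataframe` only writes datapoints against a datetime index and never creates time series or links assets, so any missing series would need to be created first with `asset_id` set. The function names here are hypothetical.

```python
import pandas as pd

def with_datetime_index(df: pd.DataFrame) -> pd.DataFrame:
    """Pure step: move the DATE column into the datetime index
    that insert_dataframe expects."""
    out = df.copy()
    out.index = pd.to_datetime(out.pop("DATE"))
    return out

def upsert_to_asset(client, df, asset_id):
    # Deferred import so the sketch reads without cognite-sdk installed.
    from cognite.client.data_classes import TimeSeries

    df = with_datetime_index(df)
    # Create any columns that do not yet exist as time series, linked to the asset.
    existing = client.time_series.retrieve_multiple(
        external_ids=list(df.columns), ignore_unknown_ids=True
    )
    have = {ts.external_id for ts in existing}
    missing = [
        TimeSeries(external_id=col, name=col, asset_id=asset_id)
        for col in df.columns if col not in have
    ]
    if missing:
        client.time_series.create(missing)
    # Datapoint writes are upserts; columns are treated as external ids by default.
    client.time_series.data.insert_dataframe(df)
```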
Since Cognite provides APIs to access RAW tables from a database, is there any way to get the actual datatype (text/bool/timestamp, etc.) of a table's columns?
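As far as I know, RAW has no declared schema (cell values are JSON), so the API cannot report a per-column type directly. A workaround, sketched here, is to sample rows and let pandas infer dtypes, which is a heuristic rather than ground truth:

```python
import pandas as pd

def infer_dtypes(rows: list) -> dict:
    """Pure step: guess one dtype per column from sampled row payloads."""
    return pd.DataFrame(rows).dtypes.astype(str).to_dict()

def raw_column_types(client, db_name, table_name, sample=1000):
    # Sample rows from the table; each Row's .columns attribute is its JSON payload.
    rows = client.raw.rows.list(db_name=db_name, table_name=table_name, limit=sample)
    return infer_dtypes([r.columns for r in rows])
```

Note that timestamps stored as strings or epoch integers will be inferred as `object` or `int64`; converting them is up to the caller.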
When I run the following code in a Jupyter notebook in CDF, I don't get any problem and it runs fine:

import pandas as pd

root_asset = 'LPM_YT_MODEL'

def handle(client, data=None, secrets=None, function_call_info=None):
    start = pd.Timestamp(data["start_date"])
    end = pd.Timestamp(data["end_date"])
    ts_names_list = client.time_series.list(limit=None, asset_subtree_ids=[client.assets.list(name=root_asset)[0].id])
    time_series_data_extids = ts_names_list.as_external_ids()
    data_points = {
        ts: client.time_series.data.retrieve_dataframe(external_id=ts, start=start, end=end)
        for ts in time_series_data_extids
        if not client.time_series.data.retrieve_dataframe(external_id=ts, start=start, end=end).empty
    }

But when this is executed as a Cognite Function, it throws the following error:

Traceback (most recent call last):
  File "/home/site/wwwroot/function/_cognite_function_entry_point.py", line 455, in run_handle
    result = handle(*function_argument_values)
  File "/home/site/wwwroot/function/handler.py", line
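The traceback is truncated, so the root cause is not visible from the post. One thing worth fixing regardless: the dict comprehension calls `retrieve_dataframe` twice per series (once in the `if` clause, once for the value), doubling runtime, API load, and memory pressure inside the function's constrained runtime. A sketch that fetches each series once:

```python
import pandas as pd

def fetch_nonempty(client, external_ids, start, end):
    # Fetch each series exactly once and keep only the non-empty frames
    # (the original comprehension retrieved every series twice).
    data_points = {}
    for ext_id in external_ids:
        df = client.time_series.data.retrieve_dataframe(
            external_id=ext_id, start=start, end=end
        )
        if not df.empty:
            data_points[ext_id] = df
    return data_points
```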
I have been working on this for some time now. It was all running fine until now, but suddenly it is failing and giving an error for this function. Has something changed in the platform/library? The DB name and table name are accurate and were working until today.

lp_df_input = client.raw.rows.retrieve_dataframe(db_name="Eashwar_MOTDB-db", table_name="lp_input", limit=None, columns=None)

Error:

---------------------------------------------------------------------------
MissingSchema                             Traceback (most recent call last)
Cell In[30], line 1
----> 1 lp_df_input = client.raw.rows.retrieve_dataframe(db_name="Eashwar_MOTDB-db", table_name="lp_input", limit=None, columns=None)
File /lib/python3.11/site-packages/cognite/client/_api/raw.py:613, in RawRowsAPI.retrieve_dataframe(self, db_name, table_name, min_last_updated_time, max_last_updated_time, columns, limit)
    589 """`Retrieve rows in a table as a pandas dataframe. <https://developer.cog
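`MissingSchema` is raised by the underlying `requests` library when a URL lacks a scheme ("Invalid URL: No scheme supplied"), which usually points at client configuration rather than the RAW call itself, e.g. a `base_url` built from an environment variable that lost its `https://` prefix. A hypothetical guard illustrating the check:

```python
def ensure_scheme(base_url: str) -> str:
    """Hypothetical guard: requests raises MissingSchema for URLs
    without an http(s):// scheme, so normalize before building the client."""
    if base_url.startswith(("http://", "https://")):
        return base_url
    return "https://" + base_url
```

It may be worth printing the client's configured base URL (e.g. via the client's config object) and any environment variables feeding it to see whether the scheme disappeared.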
I am looking for a solution for reading a CSV file that has 100 columns and 200 rows. It is stored in CDF as 'Files'. How do I read the file and then use pandas to convert it into a dataframe?

diet_f = (client.files.list(name='diet_daily.csv'))[0]
diet_daily_transformed = pd.read_csv(diet_f.name)
diet_daily_transformed

This isn't working and throws an error. Please advise the best way to perform this task.
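The likely issue is that `files.list` returns only metadata, so `pd.read_csv(diet_f.name)` tries to open a local file that does not exist. A sketch that downloads the content first (assuming the SDK's `download_bytes` call):

```python
import io
import pandas as pd

def parse_csv_bytes(raw: bytes) -> pd.DataFrame:
    """Pure step: CSV bytes -> dataframe."""
    return pd.read_csv(io.BytesIO(raw))

def read_cdf_csv(client, name):
    # files.list returns metadata only; fetch the actual bytes, then parse.
    meta = client.files.list(name=name)[0]
    return parse_csv_bytes(client.files.download_bytes(id=meta.id))
```

Usage would then be something like `df = read_cdf_csv(client, 'diet_daily.csv')`.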
I have many thousands of time series where I need to find the date of the first datapoint in each series. For each time series, I have no idea whether the first datapoint is from this year or from 20 years ago, so fetching data for several decades is not efficient. Any ideas how to get the first datapoint? Getting the last datapoint would also be great. Thanks!
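A sketch of one approach: `retrieve_latest` handles the last datapoint for many series in one call, and for the first datapoint you can ask for exactly one raw point per series starting from the beginning of time, so no decade-by-decade scanning is needed. This assumes `limit` applies per series when a list of ids is passed.

```python
def first_and_last(client, external_ids):
    # Last point of each series in one batched call.
    lasts = client.time_series.data.retrieve_latest(external_id=external_ids)
    # First point: one datapoint per series, starting at the epoch; the API
    # returns datapoints in time order, so the single point is the earliest.
    firsts = client.time_series.data.retrieve(
        external_id=external_ids, start=0, end="now", limit=1
    )
    return firsts, lasts
```

If some series may start before 1970, the `start` value would need to be pushed further back than epoch 0.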
---------------------------------------------------------------------------
NameError                                 Traceback (most recent call last)
Cell In[97], line 1
----> 1 c.time_series.data.retrieve_dataframe(id=239507005016003,
      2     end='1000w-ago',
      3     aggregates=["average","sum"],
      4     granularity="1h")
File c:\Users\Sudil\OneDrive - Creative Technology Solutions (Pvt) Ltd\Desktop\DataEngBasics\using-cognite-python-sdk\.venv\lib\site-packages\cognite\client\_api\datapoints.py:975, in DatapointsAPI.retrieve_dataframe(self, id, external_id, start, end, aggregates, granularity, limit, include_outside_points, ignore_unknown_ids, uniform_index, include_aggregate_name, include_granularity_name, column_names)
    973 fetcher = select_dps_fetch_strategy(self, user_query=query)
    974 if not uniform_index:
--> 975     return fetcher.fetch_all_datapoints_numpy().to_pandas(
    976         column_names, include_aggregat
Hello! Excited to be part of the Cognite community! I just started the ‘Learn to Use Cognite Python SDK’ course, and am running into an issue as I try to clone the repository using Git (invalid syntax error) per the Overview of Installation and GitHub ‘Getting Started’ instructions. I installed Git separately, verified that I have the git version 2.40.1.windows.1, but still no luck. Please let me know if I am doing something wrong. Thank you!
Performing the ‘try it yourself’ exercises in the “Learn to use the Cognite Python SDK” course. The course materials and examples use the pattern ‘client.datapoints.retrieve()’, while the Python SDK documentation examples all use a different pattern, ‘client.time_series.data.retrieve()’:

https://cognite-sdk-python.readthedocs-hosted.com/en/latest/cognite.html#retrieve-datapoints
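The two patterns most likely reflect different SDK major versions: older releases (which the course appears to follow) exposed datapoints under `client.datapoints`, while current releases moved them to `client.time_series.data`. A sketch that tolerates both, assuming the call signatures are otherwise equivalent:

```python
def retrieve_datapoints(client, external_id, start, end):
    # Old SDK versions (pre-v5, assumed) used client.datapoints;
    # current versions use client.time_series.data. Same arguments either way.
    if hasattr(client, "datapoints"):
        return client.datapoints.retrieve(external_id=external_id, start=start, end=end)
    return client.time_series.data.retrieve(external_id=external_id, start=start, end=end)
```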
I am trying to compare time series data from CDF and from Pi (accessed with the Seeq Python API). Some questions arise: How do I ensure that the returned DatetimeIndexes are identical? Why does CDF return a DatetimeIndex with dates before my start value?

Examples:

Pulling a raw time series from Pi via Seeq looks like this:

start = datetime(2021, 1, 1)
end = datetime(2021, 1, 2)
spy.pull(items, start=start, end=end, grid=None, quiet=True)

And the CDF analog looks like this:

start = datetime(2021, 1, 1)
end = datetime(2021, 1, 2)
cognite.datapoints.retrieve_dataframe(
    external_id=external_id,
    start=start,
    end=end,
    granularity=None,
)

The resulting DatetimeIndexes are not identical. From Pi via Seeq:

DatetimeIndex(['2021-01-01 00:00:00+01:00', '2021-01-01 00:00:05+01:00',
               '2021-01-01 00:00:10+01:00', '2021-01-01 00:00:15+01:00',
               '2021-01-01 00:00:20+01:00', '2021-01-01 00:00:25+01:00',
               '2021-01-01 00:00:30+01:00', '2021
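One likely explanation: the Seeq index is tz-aware local time (+01:00), while CDF stores timestamps in UTC (and, in the SDK version used here, appears to return a naive index), so a naive `datetime(2021, 1, 1)` start is interpreted as local time by Seeq but as UTC by CDF. The "dates before my start value" would then be the same instants rendered one hour earlier in UTC. A sketch that makes the two indexes directly comparable, assuming the CDF index is naive UTC:

```python
import pandas as pd

def align_to_local(df: pd.DataFrame, tz: str = "Europe/Oslo") -> pd.DataFrame:
    # Interpret the naive CDF index as UTC, then render it in the same
    # timezone Seeq uses, so equal instants compare equal.
    out = df.copy()
    out.index = out.index.tz_localize("UTC").tz_convert(tz)
    return out
```

The complementary fix on the request side is to pass tz-aware datetimes (or pre-convert local start/end to UTC) so both systems query the same interval.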
I get the following warning every time I create a CogniteClient:

ResourceWarning: unclosed <ssl.SSLSocket fd=5, family=AddressFamily.AF_INET, type=SocketKind.SOCK_STREAM, proto=6, laddr=('10.xx.xx.xx', 49572), raddr=('40.xx.xx.xx', 443)>

Is this caused by the connection handling in the Cognite SDK? I’m using the following Python modules:

cognite-sdk-4.11.0
cognite-sdk-core-2.56.1
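This warning typically means a pooled HTTPS connection was garbage-collected without an explicit close, which is noisy but generally harmless (an assumption; upgrading the SDK may make it go away). If it clutters logs in the meantime, it can be filtered:

```python
import warnings

# Silence the unclosed-socket ResourceWarning from pooled HTTPS connections.
# This hides the message only; it does not change connection handling.
warnings.filterwarnings("ignore", category=ResourceWarning)
```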
Aksel's time series is the original data; its average at 04:00 = 0.125 can only be explained by interpolated values. The example above shows that 1h aggregates of type "average" computed at time-point 03:00 take into consideration the interpolated values in the time span 03:00 to 04:00. Is it always like this as long as I choose granularity 1h? I could not find the answer to my question here: Aggregation | Cognite documentation