
SOLVED:
The Cognite client was set up wrong, which caused the Cognite API error when uploading data to CDF. See the docs posted by Peter for setting up config.yaml for the extractor. Specifically, these were the changes:
```
COGNITE_BASE_URL=https://westeurope-1.cognitedata.com (not the Fusion URL)
COGNITE_PROJECT={some project name}
COGNITE_TOKEN_URL=https://login.windows.net/{some_tenant}/oauth2/v2.0/token (different from the OpenID Connect token URL)
```

_______________________________________________

Hello curious and helpful members of Cognite Hub,

I’m building my first REST extractor to GET a response (sensor data) from a camera_sensor API, and I then aim to upload the data to corresponding timeseries. I have multiple cameras, each with its own URL path, so it seems `@extractor.get_multiple(paths)` gives me the desired behaviour of essentially looping `@extractor.get` through the URLs.

However, this error arises when my extractor tries to upload datapoints to the timeseries in CDF:

```
^CTraceback (most recent call last):
  File "/Users/patrick.nitschke/opt/anaconda3/envs/cdf_new/lib/python3.11/site-packages/cognite/extractorutils/util.py", line 344, in _retry_internal
    return f()
           ^^^
  File "/Users/patrick.nitschke/opt/anaconda3/envs/cdf_new/lib/python3.11/site-packages/cognite/extractorutils/uploader/time_series.py", line 242, in _upload_batch
    self.cdf_client.time_series.data.insert_multiple(upload_this)
  File "/Users/patrick.nitschke/opt/anaconda3/envs/cdf_new/lib/python3.11/site-packages/cognite/client/_api/datapoints.py", line 1340, in insert_multiple
    dps_poster.insert(datapoints)
  File "/Users/patrick.nitschke/opt/anaconda3/envs/cdf_new/lib/python3.11/site-packages/cognite/client/_api/datapoints.py", line 1479, in insert
    self._insert_datapoints_concurrently(binned_dps_object_lists)
  File "/Users/patrick.nitschke/opt/anaconda3/envs/cdf_new/lib/python3.11/site-packages/cognite/client/_api/datapoints.py", line 1550, in _insert_datapoints_concurrently
    summary.raise_compound_exception_if_failed_tasks(
  File "/Users/patrick.nitschke/opt/anaconda3/envs/cdf_new/lib/python3.11/site-packages/cognite/client/utils/_concurrency.py", line 64, in raise_compound_exception_if_failed_tasks
    collect_exc_info_and_raise(
  File "/Users/patrick.nitschke/opt/anaconda3/envs/cdf_new/lib/python3.11/site-packages/cognite/client/utils/_concurrency.py", line 101, in collect_exc_info_and_raise
    raise CogniteAPIError(
cognite.client.exceptions.CogniteAPIError: <html>
<head><title>405 Not Allowed</title></head>
<body>
<center><h1>405 Not Allowed</h1></center>
<hr><center>nginx</center>
</body>
</html>
| code: 405 | X-Request-ID: None
The API Failed to process some items.
Successful (2xx): []
Unknown (5xx): []
Failed (4xx): [{'externalId': 'Pixelite_CCU_C10-C1.Temperature'}, {'externalId': 'Pixelite_CCU_C10-C1.Depth'}, ...(more failed InsertDatapoints)]
```

I traced this to a failure at line 222 (`res = f.result()`) in the `execute_tasks()` function of `cognite/client/utils/_concurrency.py`.

 

I also checked the variables “being executed”:

`tasks`:

```
[([{'externalId': 'Pixelite_CCU_C10-C1.Temperature', 'datapoints': [(1698690892, 7.7)]}, ...(more InsertDatapoints...), {'externalId': 'Pixelite_CCU_C4-C3.topCameraHeading', 'datapoints': [(1698690892, 150.0)]}, {'externalId': 'Pixelite_CCU_C4-C3.bottomCameraHeading', 'datapoints': [(1698690892, 205.0)]}],)]
```

and `func`:

```
<bound method DatapointsPoster._insert_datapoints of <cognite.client._api.datapoints.DatapointsPoster object at 0x10f54a750>>
```

So how can I fix this error?

Looking at this conversation on Cognite Hub, everything seems ok.


Code:

This is what I have so far:

### dto.py ###

```python
from dataclasses import dataclass
from typing import List


@dataclass
class SensorData:
    Temperature: str
    Depth: str
    OxygenLevel: float
    OxygenLevelPercent: float
    internalHeading: int
    topCameraHeading: int
    bottomCameraHeading: int


@dataclass
class ImencoCameraData:
    id: str
    OxygenSensorModel: str
    CameraSensors: List[SensorData]
    connected: bool
```

### extractor.py ###

```python
# `extractor`, `paths`, `InsertDatapoints`, `ImencoCameraData` and the lookup
# maps come from the surrounding extractor setup (not shown here).
from dataclasses import fields
from typing import Generator

import arrow


@extractor.get_multiple(paths=paths, response_type=ImencoCameraData, interval=60)
def imenco_response_to_datapoints_handler(
    imenco_camera_data: ImencoCameraData,
) -> Generator[InsertDatapoints, None, None]:
    """
    Receive responses of type 'ImencoCameraData' from GET requests to the decorated 'paths'.
    Transform each response into a generator of 'InsertDatapoints', which is then uploaded to CDF.
    A response has 7 sensor values, which means we yield 7 InsertDatapoints - each with a unique
    externalId and a single Datapoint.

    Parameters:
        imenco_camera_data (ImencoCameraData): Response from GET request to Imenco IP.

    Returns:
        A generator of 'InsertDatapoints' that will be uploaded to CDF.
        Constructor: InsertDatapoints(
            external_id,
            List[Datapoint], where Datapoint = Tuple[Timestamp, float]
        )
    """
    camera_name = imenco_path_to_camera_name_map[camera_id_to_path_map[imenco_camera_data.id]]
    ox_sensor_model = imenco_camera_data.OxygenSensorModel
    connected_status = imenco_camera_data.connected

    # list with 1 'SensorData' value
    for camera_sensors in imenco_camera_data.CameraSensors:
        for data_field in fields(camera_sensors):
            timestamp = arrow.now().timestamp()
            ts_ext_id = f"{camera_name}.{data_field.name}"  # pseudo ext_id for this example
            value = float(getattr(camera_sensors, data_field.name))
            print(timestamp, ts_ext_id, type(value), value)

            yield InsertDatapoints(
                # id=imenco_ext_id_to_id_map[ts_ext_id],
                external_id=ts_ext_id,
                datapoints=[(timestamp, value)],
            )
```
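As an aside, the per-field transform described in the docstring can be exercised without the extractor runtime at all. A dependency-free sketch of the same idea (the two-field `SensorData` stand-in and the `to_datapoints` helper are hypothetical, for illustration only):

```python
from dataclasses import dataclass, fields
from typing import List, Tuple


# Hypothetical stand-in for the real SensorData dataclass from dto.py,
# trimmed to two fields to keep the example short.
@dataclass
class SensorData:
    Temperature: str
    Depth: str


def to_datapoints(camera_name: str, sensors: SensorData,
                  timestamp: float) -> List[Tuple[str, list]]:
    """Return one (external_id, datapoints) pair per sensor field,
    mirroring what the handler does before wrapping each pair in
    InsertDatapoints."""
    out = []
    for data_field in fields(sensors):
        ext_id = f"{camera_name}.{data_field.name}"
        value = float(getattr(sensors, data_field.name))
        out.append((ext_id, [(timestamp, value)]))
    return out


pairs = to_datapoints("Pixelite_CCU_C1-C1", SensorData("8.4", "2.2"), 1698688899.0)
print(pairs[0])  # ('Pixelite_CCU_C1-C1.Temperature', [(1698688899.0, 8.4)])
```

This makes the externalId construction and string-to-float conversion testable in isolation, independent of CDF connectivity.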

where the raw data from `print(timestamp, ts_ext_id, type(value), value)` looks like:

```
1698688899.664962 Pixelite_CCU_C1-C2.bottomCameraHeading <class 'float'> 205.0
1698688899.665888 Pixelite_CCU_C7-C3.bottomCameraHeading <class 'float'> 205.0
1698688899.667632 Pixelite_CCU_C4-C1.Temperature <class 'float'> 8.4
1698688899.667749 Pixelite_CCU_C4-C1.Depth <class 'float'> 2.2
1698688899.667816 Pixelite_CCU_C4-C1.OxygenLevel <class 'float'> 10.164129140950386
1698688899.66788 Pixelite_CCU_C4-C1.OxygenLevelPercent <class 'float'> 110.06
```

 

I’m sure you’re supposed to start with extractors after doing more courses and getting familiar with all CDF endpoints, so my bad :)

An addition perhaps: these docs on config file configuration, like you showed me, can be a huge time saver:

- cognitedata/python-extractor-utils: Framework for developing extractors in Python (github.com)
- Defining a config schema — cognite-extractor-utils 5.4.0 documentation (readthedocs-hosted.com) (or the Extractor-utils Library for the Cognite Python SDK)

I completely missed the developer docs :D

 

Happy halloween to you too!

Best Patrick


Yep you’ve solved it.

 

Glad you made it into a successful evening :)

Happy Halloween!

(=PA=)


Tried to find the documentation about the `baseUrl`

Looks like you can find it in

https://cognite-sdk-python.readthedocs-hosted.com/en/latest/quickstart.html#instantiate-a-new-client

 

```python
# This value will depend on the cluster your CDF project runs on
cluster = "api"
base_url = f"https://{cluster}.cognitedata.com"
tenant_id = "my-tenant-id"
client_id = "my-client-id"
```

Or here

https://developer.cognite.com/dev/use_the_API

Or here

https://developer.cognite.com/dev/quickstart/#step-2-set-up-environment-variables

baseURL: You can find your baseURL from the CDF project. Navigate to your CDF project. Under Manage & Configure > Manage access, select Open ID connect tab. The URL in the audience field is the baseURL.

 

I have to admit it is not 100% crystal clear(?)

 

Let me try to follow up internally, and check how we can improve documentation especially for onboarding new developers!

(=PA=)


Yep you’ve solved it. Saw the discrepancy with how my cognite client was set up in the python SDK notebook example compared to the example used in the REST example_config.yaml.

In other words, this works:
```

COGNITE_BASE_URL=https://westeurope-1.cognitedata.com

COGNITE_PROJECT=sao-staging

COGNITE_TOKEN_URL=https://login.windows.net/de10159d-2c09-4762-966c-e841d3391feb/oauth2/v2.0/token
```
Thank you Peter. Saving the project timeline as you do :)


```
2023-10-31 16:40:00.300 UTC [DEBUG   ] ThreadPoolExecutor-1_0 - Starting new HTTPS connection (1): westeurope-1.cognitedata.com:443
2023-10-31 16:40:00.564 UTC [DEBUG   ] ThreadPoolExecutor-1_0 - https://westeurope-1.cognitedata.com:443 "POST /api/v1/projects/sao-staging/timeseries/data HTTP/1.1" 200 22
2023-10-31 16:40:00.574 UTC [DEBUG   ] ThreadPoolExecutor-1_0 - HTTP/1.1 POST https://westeurope-1.cognitedata.com/api/v1/projects/sao-staging/timeseries/data 200
2023-10-31 16:40:00.575 UTC [INFO    ] MainThread - Uploaded 84 datapoints
```


Your `COGNITE_BASE_URL` is wrong: it doesn’t point to our CDF API endpoint, but to the Fusion UI endpoint.

If you don’t know which cluster your CDF project is running on, you can see it in the Fusion UI URL, in the `?cluster=` HTTP parameter!

Just add `https://` in front and set it as your `COGNITE_BASE_URL`.
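The two steps above (read the `?cluster=` parameter, prefix `https://`) can be sketched in a few lines of stdlib Python. The helper name is mine, not part of any SDK, and the sample URL is modelled on the one in this thread:

```python
from urllib.parse import parse_qs, urlparse


def base_url_from_fusion_url(fusion_url: str) -> str:
    """Derive a COGNITE_BASE_URL from the `?cluster=` query parameter
    of a Fusion UI URL (illustrative helper, not part of any SDK)."""
    params = parse_qs(urlparse(fusion_url).query)
    cluster = params["cluster"][0]  # e.g. "westeurope-1.cognitedata.com"
    # The parameter may or may not already include the domain
    if not cluster.endswith("cognitedata.com"):
        cluster += ".cognitedata.com"
    return f"https://{cluster}"


url = "https://sao.fusion.cognite.com/sao-staging?env=westeurope-1&cluster=westeurope-1.cognitedata.com"
print(base_url_from_fusion_url(url))  # https://westeurope-1.cognitedata.com
```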

 

Most likely correct, but I’ve seen `COGNITE_TOKEN_URL` with endings `/oauth2/token` or `/oauth2/v2.0/token`.

 

(=PA=)

 


Ahh, good to know. I’ll try to dive into the library to see which variables to define. Currently, the config and .env files look like:

### config.yaml

```yaml
logger:
  console:
    level: DEBUG

cognite:
  # Read these from environment variables
  host: ${COGNITE_BASE_URL}
  project: ${COGNITE_PROJECT}
  # extraction_pipeline:
  #   external_id: rest:imenco:ocean_farm_1:ts

  idp-authentication:
    token-url: ${COGNITE_TOKEN_URL}
    client-id: ${COGNITE_CLIENT_ID}
    secret: ${COGNITE_CLIENT_SECRET}
    scopes:
      - ${COGNITE_BASE_URL}/.default

# source:
#   base_url: ${SOURCE_BASE_URL}
#   auth:
#     basic:
#       username: my-user
#       password: ${SOURCE_PASSWORD}
```

### .env file

```
COGNITE_BASE_URL=https://sao.fusion.cognite.com
COGNITE_PROJECT=sao-staging

COGNITE_TOKEN_URL=https://login.windows.net/(removed tenant)/oauth2/token
COGNITE_CLIENT_ID=(removed)
COGNITE_CLIENT_SECRET=(removed)

SOURCE_BASE_URL=http://127.0.0.1:3000  # not used
```

Still the API URL looks wrong

> https://sao.fusion.cognite.com/api/v1/projects/sao-staging/timeseries/data
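As a side note on how the `${VAR}` placeholders in config.yaml get their values: the extractor resolves them from environment variables. A rough stdlib approximation of that substitution (the real logic lives inside cognite-extractor-utils and may differ; the values below are just the ones from this thread):

```python
import os
from string import Template

# Two placeholder lines modelled on the config.yaml above
raw = "host: ${COGNITE_BASE_URL}\nproject: ${COGNITE_PROJECT}\n"

os.environ["COGNITE_BASE_URL"] = "https://westeurope-1.cognitedata.com"
os.environ["COGNITE_PROJECT"] = "sao-staging"

# string.Template uses the same ${NAME} syntax, so it substitutes directly
resolved = Template(raw).substitute(os.environ)
print(resolved)
# host: https://westeurope-1.cognitedata.com
# project: sao-staging
```

This also explains why a wrong value in the .env file (like the Fusion URL) ends up verbatim inside the request URL.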

 

Now I see it starts with https://sao.fusion.., but it is supposed to start with something like

`https://{cluster}.cognitedata.com/...`

where cluster is `westeurope-1` or `api` or else.

You can see it in the Fusion URL, like `?env=westeurope-1&cluster=westeurope-1.cognitedata.com`

 

Can you pls check again?

Or share your config (w/o secrets :))

(=PA=)

 


Ah, I had just set up my .env file wrong while debugging, but I’ve fixed it now. The address seems plausible, but the error persists:
```
2023-10-31 16:06:32.233 UTC [DEBUG   ] ThreadPoolExecutor-1_0 - https://sao.fusion.cognite.com:443 "POST /api/v1/projects/sao-staging/timeseries/data HTTP/1.1" 405 150
2023-10-31 16:06:32.234 UTC [DEBUG   ] ThreadPoolExecutor-1_0 - HTTP Error 405 POST https://sao.fusion.cognite.com/api/v1/projects/sao-staging/timeseries/data: <html>
<head><title>405 Not Allowed</title></head>
<body>
<center><h1>405 Not Allowed</h1></center>
<hr><center>nginx</center>
</body>
</html>
```

We can rule out that it is a credentials issue, though, as the same Cognite client can be used to upload the datapoints using the Python SDK instead of the REST extractor.


Hi again Peter,

Thanks for taking the time. Good idea. With the DEBUG logger, it seems the error does not complain about the POST payload, just the Cognite POST path (https://sao.fusion.cognite.com/api/v1/projects/https://sao.fusion.cognite.com/timeseries/data?):
```
2023-10-31 15:46:12.414 UTC [DEBUG   ] ThreadPoolExecutor-1_0 - Starting new HTTPS connection (1): sao.fusion.cognite.com:443
2023-10-31 15:46:12.585 UTC [DEBUG   ] ThreadPoolExecutor-1_0 - https://sao.fusion.cognite.com:443 "POST /api/v1/projects/https://sao.fusion.cognite.com/timeseries/data HTTP/1.1" 405 150
2023-10-31 15:46:12.586 UTC [DEBUG   ] ThreadPoolExecutor-1_0 - HTTP Error 405 POST https://sao.fusion.cognite.com/api/v1/projects/https://sao.fusion.cognite.com/timeseries/data: <html>
<head><title>405 Not Allowed</title></head>
<body>
<center><h1>405 Not Allowed</h1></center>
<hr><center>nginx</center>
</body>
</html>

```

my `paths` variable in the `get_multiple` decorator is just the paths to request data from, hardcoded to some ports on my mock server for now:

```python
paths = [
    'http://127.0.0.1:3000/sensordata', 'http://127.0.0.1:3001/sensordata',
    'http://127.0.0.1:3002/sensordata', 'http://127.0.0.1:3003/sensordata',
    'http://127.0.0.1:3004/sensordata', 'http://127.0.0.1:3005/sensordata',
    'http://127.0.0.1:3006/sensordata', 'http://127.0.0.1:3007/sensordata',
    'http://127.0.0.1:3008/sensordata', 'http://127.0.0.1:3009/sensordata',
    'http://127.0.0.1:3010/sensordata', 'http://127.0.0.1:3011/sensordata',
]
```
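For anyone reproducing this setup, one such `/sensordata` endpoint can be mocked with the stdlib alone. The JSON field names below mirror the ImencoCameraData dataclass from this thread; the real Imenco API may well differ:

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

# Example payload shaped like ImencoCameraData (values are made up)
PAYLOAD = {
    "id": "C1",
    "OxygenSensorModel": "model-x",
    "connected": True,
    "CameraSensors": [{
        "Temperature": "8.4", "Depth": "2.2",
        "OxygenLevel": 10.16, "OxygenLevelPercent": 110.06,
        "internalHeading": 90, "topCameraHeading": 150,
        "bottomCameraHeading": 205,
    }],
}


class SensorDataHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path != "/sensordata":
            self.send_error(404)
            return
        body = json.dumps(PAYLOAD).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # silence per-request logging
        pass


server = HTTPServer(("127.0.0.1", 0), SensorDataHandler)  # port 0: any free port
threading.Thread(target=server.serve_forever, daemon=True).start()

with urlopen(f"http://127.0.0.1:{server.server_port}/sensordata") as resp:
    data = json.loads(resp.read())
server.shutdown()
print(data["id"])  # C1
```

Running one instance per port (3000-3011) would give the extractor twelve endpoints to poll, matching the `paths` list above.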

I’ll look into where the upload path is set, but for now I just follow the example_config.yaml and define this within the config.yaml:

```yaml
cognite:
  # Read these from environment variables
  host: ${COGNITE_BASE_URL}
  project: ${COGNITE_PROJECT}
```


and on a 2nd look, you posted this:

```
2023-10-31 13:57:39.081 UTC [DEBUG   ] ThreadPoolExecutor-1_0 - HTTP Error 405 POST https://sao.fusion.cognite.com/api/v1/projects/https://sao.fusion.cognite.com/timeseries/data: <html>
```

The API URL looks wrong: at the place where you expect the “cdf-project” value (after `/projects`) there is the base URL again?

 

Can you check your Cognite client configuration and envvars used?

(=PA=)


Hi Patrick,

two things could help debugging:

  1. Understanding the exact CDF API call sent from your code. Given that you are using the latest Cognite Python SDK, you can activate debug logging with `client.config.debug = True`.
  2. Not all the parameters of your decorator are visible. What is the value going into `paths` here: `@extractor.get_multiple(paths=paths, ...)`?

An Error 405 is very uncommon using the Cognite Python SDK, so hopefully we understand it better with more details.

best regards

Peter

(=PA=)


To provide an update, everything works “manually” just using the SDK: 
```python
ts_test_ext_id = ".Pixelite_CCU_C1-C1.Temperature"  # pseudo ext_id matching test timeseries ext_id in CDF
datapoints = [(arrow.now().timestamp(), 10), (datetime(2018, 1, 2), 2)]
client.time_series.data.insert(datapoints, external_id=ts_test_ext_id)
```

I can do a clean environment install of both cogex (python-extractor-utils) and the cogex REST extension, but is there a stable version that works out-of-the-box?


Thanks for the reply @matiasholte. Under the hood, the REST extractor does POST the datapoints to the timeseries:
```
2023-10-31 13:57:39.081 UTC [DEBUG   ] ThreadPoolExecutor-1_0 - HTTP Error 405 POST https://sao.fusion.cognite.com/api/v1/projects/https://sao.fusion.cognite.com/timeseries/data: <html>
<head><title>405 Not Allowed</title></head>
<body>
<center><h1>405 Not Allowed</h1></center>
<hr><center>nginx</center>
</body>
</html>
```
I’m setting up a notebook to “manually” upload the datapoints to the timeseries (without using the extractor library), and will see if that’s possible with the current setup. If that works, it must be something in the library? Has nobody seen this error type before?


405 is usually the code you get when you use the wrong HTTP method.
Almost all our time series endpoints use the POST method; only listing all time series uses GET.

I’m not too familiar with the code you posted, though, so I’m not sure this will help you with your problem.
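To illustrate the point about methods: a 405 means the server matched the path but not the HTTP method. A tiny stdlib demo (hypothetical server, nothing Cognite-specific) where GET is explicitly rejected with 405 while POST succeeds:

```python
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.error import HTTPError
from urllib.request import Request, urlopen


class PostOnlyHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Consume the request body, then answer 200
        length = int(self.headers.get("Content-Length", "0"))
        self.rfile.read(length)
        self.send_response(200)
        self.send_header("Content-Length", "2")
        self.end_headers()
        self.wfile.write(b"ok")

    def do_GET(self):
        # Path exists, but GET is not allowed on it -> 405
        self.send_response(405)
        self.send_header("Allow", "POST")
        self.send_header("Content-Length", "0")
        self.end_headers()

    def log_message(self, *args):  # silence per-request logging
        pass


server = HTTPServer(("127.0.0.1", 0), PostOnlyHandler)  # port 0: any free port
threading.Thread(target=server.serve_forever, daemon=True).start()
url = f"http://127.0.0.1:{server.server_port}/timeseries/data"

ok = urlopen(Request(url, data=b"{}", method="POST"))
print(ok.status)  # 200

try:
    urlopen(Request(url, method="GET"))
    code = None
except HTTPError as err:
    code = err.code
print(code)  # 405
server.shutdown()
```

In this thread, though, the 405 came from nginx in front of the Fusion UI, which simply has no handler at all for the mangled API path.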

