Solved

Unable to use Transformations to query for a single day

  • 9 September 2022
  • 9 replies
  • 104 views


When I run the query below, I get an error. If I increase the end time by 1 second I can get a result, but it gives me 2 days and I only want 1. All my data is consistently daily at midnight UTC. Any ideas how to get past this?

SELECT
  dp.id,
  dp.externalId AS `key`,
  dp.timestamp,
  dp.value
FROM
  `_cdf`.`datapoints` dp
WHERE
  dp.externalId IN (
    'f18ae31cbfe544cb7bca08da91e5245a-SuFMiCo0f7c8eca4ce444208853aedd00ded4fb',
    '730619216b764ba77bc708da91e5245a-SuFMiCo6bb90df0efbc461d863faedd00df009b'
  )
  AND dp.timestamp >= TO_TIMESTAMP('2022-09-01T00:00:00Z')
  AND dp.timestamp < TO_TIMESTAMP('2022-09-02T00:00:00Z')

With the end time increased by 1 second, it gives me a result, but not what I want:

cc @Torgrim Aas  @Sunil Krishnamoorthy 

Best answer by Carin Meems 26 September 2022, 08:57

9 replies

The 2 date strings seem to translate to different timestamps.
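One way to sanity-check that in a Transformations preview (a diagnostic sketch; the column aliases are only illustrative):

```sql
-- Sketch: parse both boundary strings side by side. If end_after_start
-- comes back false (or null), the half-open range can never match.
SELECT
  TO_TIMESTAMP('2022-09-01T00:00:00Z') AS range_start,
  TO_TIMESTAMP('2022-09-02T00:00:00Z') AS range_end,
  TO_TIMESTAMP('2022-09-02T00:00:00Z') > TO_TIMESTAMP('2022-09-01T00:00:00Z') AS end_after_start
```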


Found a somewhat silly workaround: widen the queried range and filter the extra rows back out. My guess is the API is returning the 4 rows and Spark is doing the work to filter it back down to the 2 that I want.
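In SQL, the workaround looks roughly like this (a sketch reconstructed from the description, so details may differ from what I actually ran): push a slightly wider range to the API, then trim it back down in an outer filter.

```sql
-- Workaround sketch: widen the end bound by one second so the API
-- returns rows, then filter back down to the single day in Spark.
SELECT *
FROM (
  SELECT
    dp.id,
    dp.externalId AS `key`,
    dp.timestamp,
    dp.value
  FROM
    `_cdf`.`datapoints` dp
  WHERE
    dp.externalId IN (
      'f18ae31cbfe544cb7bca08da91e5245a-SuFMiCo0f7c8eca4ce444208853aedd00ded4fb',
      '730619216b764ba77bc708da91e5245a-SuFMiCo6bb90df0efbc461d863faedd00df009b'
    )
    AND dp.timestamp >= TO_TIMESTAMP('2022-09-01T00:00:00Z')
    AND dp.timestamp <= TO_TIMESTAMP('2022-09-02T00:00:01Z')
) widened
WHERE widened.timestamp < TO_TIMESTAMP('2022-09-02T00:00:00Z')
```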

 

This is what I’d prefer to run, but it returns no results:
 

 


@Ben Brandt This is weird. I will check this tomorrow and get back. Apologies for the delay :) 


Thank you @Sunil Krishnamoorthy. It is taking 30 s to 2 minutes for each of these queries on our dashboard to return 10 rows of data, so I am hoping that changing the WHERE clause to a point lookup rather than a range will speed this up for us.
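Since the datapoints all sit at midnight UTC, the point lookup I have in mind is an equality predicate instead of a range (a sketch; whether Transformations actually pushes this down to the API as a single-timestamp lookup is an assumption on my part):

```sql
-- Point-lookup sketch: every datapoint lands exactly at midnight UTC,
-- so one day's data can be matched with a single equality predicate.
SELECT
  dp.id,
  dp.externalId AS `key`,
  dp.timestamp,
  dp.value
FROM
  `_cdf`.`datapoints` dp
WHERE
  dp.externalId IN (
    'f18ae31cbfe544cb7bca08da91e5245a-SuFMiCo0f7c8eca4ce444208853aedd00ded4fb',
    '730619216b764ba77bc708da91e5245a-SuFMiCo6bb90df0efbc461d863faedd00df009b'
  )
  AND dp.timestamp = TO_TIMESTAMP('2022-09-01T00:00:00Z')
```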


Hi Ben, 

It looks like we need to do a bit more digging here. I’ll create a Support ticket for this, and we’ll be following up from there. 

Enjoy your day!

Carin


@Sunil Krishnamoorthy @Carin Meems 

Any update on this? On Oct 7, our workaround of selecting a wider date range and throwing away unneeded data stopped working in some cases for a production dashboard (Bug #73988 on our side). 

 

This is the error we are seeing even though end is greater than start:

"Request with id 9d6f0c9c-18a8-9786-8d34-bfa535564e5b to https://az-eastus-1.cognitedata.com/api/v1/projects/pinnacle/timeseries/data/list failed with status 400: Invalid time range, end must be later than start."


Hi @Ben Brandt, we are on it this sprint.

It’s a bug on the Transformations side. When we queried directly using the time series APIs, we got the results as expected. We will fix it shortly and will inform you once it’s fixed.

Apologies for the delay. :))


Hi @Ben Brandt 

 

Debugging our Spark data source locally is taking longer than expected, compounded by some unfortunate sick leave due to the weather. Please be assured this is still at the top of our sprint board, along with orchestration capability within CDF. I will get back to you once it is fixed. Sorry for the inconvenience. 


Hi @Ben Brandt 

The fix is now deployed.  Please take a look.

This took a bit longer than expected; thank you for your patience. :))
