Solved

Unable to use Transformations to query for a single day

  • 9 September 2022
  • 5 replies
  • 72 views


When I run the query below, I get an error. If I increase the end time by 1 second I do get a result, but it spans 2 days and I only want 1. All my data is consistently daily at midnight, UTC. Any ideas how to get past this?

SELECT
  dp.id,
  dp.externalId AS `key`,
  dp.timestamp,
  dp.value
FROM
  `_cdf`.`datapoints` dp
WHERE
  dp.externalId IN (
    'f18ae31cbfe544cb7bca08da91e5245a-SuFMiCo0f7c8eca4ce444208853aedd00ded4fb',
    '730619216b764ba77bc708da91e5245a-SuFMiCo6bb90df0efbc461d863faedd00df009b'
  )
  AND dp.timestamp >= TO_TIMESTAMP('2022-09-01T00:00:00Z')
  AND dp.timestamp < TO_TIMESTAMP('2022-09-02T00:00:00Z')
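For reference, the variant that does return rows (but covers two days, since the Sep 2 midnight datapoint now falls inside the bound) only differs in the end timestamp:

```sql
  -- Same query as above, with the end bound pushed 1 second later:
  AND dp.timestamp >= TO_TIMESTAMP('2022-09-01T00:00:00Z')
  AND dp.timestamp <  TO_TIMESTAMP('2022-09-02T00:00:01Z')
```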
 


Gives me a result, but not what I want:
cc @Torgrim Aas  @Sunil Krishnamoorthy 


Best answer by Carin Meems 26 September 2022, 08:57


5 replies



The 2 date strings seem to translate to different timestamps.
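One way to check that guess directly in a Transformation preview (a sketch; the column aliases are mine):

```sql
-- Inspect how Spark parses the two boundary strings.
-- If both parse in UTC as expected, diff_seconds should be 86400.
SELECT
  TO_TIMESTAMP('2022-09-01T00:00:00Z') AS range_start,
  TO_TIMESTAMP('2022-09-02T00:00:00Z') AS range_end,
  UNIX_TIMESTAMP(TO_TIMESTAMP('2022-09-02T00:00:00Z'))
    - UNIX_TIMESTAMP(TO_TIMESTAMP('2022-09-01T00:00:00Z')) AS diff_seconds
```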


Found a somewhat silly workaround:

My guess is the API is returning the 4 rows and Spark is doing the work to filter them back down to the 2 that I want.

 

This is what I’d prefer to run, but it returns no results:

@Ben Brandt This is weird. I will check this tomorrow and get back. Apologies for the delay :) 


Thank you @Sunil Krishnamoorthy. It is taking 30 seconds to 2 minutes for each of these queries on our dashboard to return 10 rows of data, so I am hoping that changing the WHERE clause from a range to a point lookup can speed this up for us.
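Since all the datapoints land exactly at midnight UTC, the point-lookup version could look like this (a sketch; the externalId list is abbreviated, and whether the equality filter is pushed down to the API is an assumption to verify):

```sql
SELECT
  dp.id,
  dp.externalId AS `key`,
  dp.timestamp,
  dp.value
FROM
  `_cdf`.`datapoints` dp
WHERE
  dp.externalId IN ('...')  -- same externalIds as in the original query
  AND dp.timestamp = TO_TIMESTAMP('2022-09-01T00:00:00Z')
```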


Hi Ben, 

It looks like we need to do a bit more digging here. I’ll create a Support ticket for this, and we’ll be following up from there. 

Enjoy your day!

Carin
