Solved

Numeric Data Points transformation not accept NULL values

  • 20 March 2023
  • 3 replies
  • 51 views


I’m trying to create a new Numeric Datapoints transformation. When creating it, the UI asks how the transformation should handle NULL values (keep existing value or delete).

I’ve set up my query with a CAST function to convert values to double. I understand that CAST returns NULL when the conversion is not possible, so I would expect the transformation to follow the NULL-handling setting defined for it. Instead, the transformation fails with the following message:
Column 'value' was expected to have type Double, but NULL was found

Can you please clarify the expectation for the Numeric Datapoints transformation: can it accept NULL values as input, and what does the NULL-values setting control?

Thanks
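One way to avoid the error described above is to filter out the rows where the CAST fails, so no NULL ever reaches the value column. A minimal sketch, assuming a source table my_raw_values with columns external_id, event_time, and raw_value (these names are placeholders, not from the thread; adjust them to your schema):

```sql
-- Sketch only: table and column names are assumptions.
-- Rows whose raw_value cannot be cast to double are dropped,
-- so the output never contains a NULL in the value column.
select
  external_id                 as externalId,
  event_time                  as timestamp,
  cast(raw_value as double)   as value
from my_raw_values
where cast(raw_value as double) is not null
```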


Best answer by Dilini Fernando 7 April 2023, 09:47


3 replies


Hi,

You can’t ingest a numeric datapoint with a NULL value, as that is not supported by the API.

The NULL-handling configuration is translated to the option ignoreNullFields under the hood in our Spark Data Source. It only applies to nullable fields: when you use upsert mode on a nullable field and the new input value for a row is NULL, then:

If ignoreNullFields is true (i.e. "Keep existing value"), the existing value of the row is preserved.

If ignoreNullFields is false (i.e. "Clear existing value"), the existing value of the row is overwritten with NULL.
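To make the two settings concrete, here is a small illustrative sketch (plain Python, not Cognite code) of the upsert behaviour described above. The upsert function and the row data are hypothetical, introduced only to demonstrate the semantics:

```python
# Illustrative sketch of the described ignoreNullFields semantics.
# This is NOT the actual Spark Data Source implementation.

def upsert(existing: dict, incoming: dict, ignore_null_fields: bool) -> dict:
    """Merge an incoming row into an existing row, field by field."""
    result = dict(existing)
    for field, new_value in incoming.items():
        if new_value is None and ignore_null_fields:
            # "Keep existing value": NULL inputs are skipped.
            continue
        # "Clear existing value": NULL overwrites, as does any non-null value.
        result[field] = new_value
    return result

row = {"name": "pump-01", "description": "old text"}
update = {"name": "pump-01", "description": None}

print(upsert(row, update, ignore_null_fields=True))
# {'name': 'pump-01', 'description': 'old text'}
print(upsert(row, update, ignore_null_fields=False))
# {'name': 'pump-01', 'description': None}
```

Note that this only matters for nullable fields in upsert mode; numeric datapoint values themselves can never be NULL, which is why the transformation in the question fails.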


Hi @Leandro Santos, did the above help?


Hi @Leandro Santos,

I hope Vu Hai's explanation was helpful. I'm closing this topic now, but please feel free to leave a comment if you require any further assistance. 
