Hi, I think predicting the columns is quite hard, since with RAW you can have a different number of columns per row in the same table. So, depending on your final goal, here is an example of using a CTE as a subquery: get the names from your table with get_names and use them in the next query:

with colNames as (
  select get_names(*) as names from `myDb`.`myTable`
)
select names from colNames

I also think it's easier to achieve this with a Spark DataFrame rather than Spark SQL: https://stackoverflow.com/questions/60896406/how-to-unpivot-spark-dataframe-without-hardcoding-column-names-in-scala
Hi, You can use the profiler to see the predicted type of your columns, but AFAIK this UDF only returns the names of columns, not their types, in a Transformation query: https://docs.cognite.com/cdf/integration/guides/transformation/write_sql_queries/#get_names
Hi, You can't ingest a numeric datapoint with a null value, as it is not supported by the API. The configuration for NULL handling is translated to the option ignoreNullFields under the hood in our Spark Data Source, but it only matters for nullable fields. When you use upsert mode on a nullable field and the new input value for a row is null:
- if ignoreNullFields is true (i.e. "Keep existing value"), the existing value of the row is preserved;
- if ignoreNullFields is false (i.e. "Clear existing value"), the existing value of the row is overwritten with null.
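To make the two behaviours concrete, here is a minimal pure-Python sketch of the upsert semantics described above. The function name and signature are hypothetical for illustration only, not the actual Spark Data Source implementation:

```python
def upsert_value(existing, incoming, ignore_null_fields):
    """Simulate how a nullable field is resolved on upsert.

    existing: the value currently stored for the row
    incoming: the new input value (may be None)
    ignore_null_fields: mirrors the ignoreNullFields option
    """
    if incoming is None and ignore_null_fields:
        # "Keep existing value": a null input leaves the stored value untouched
        return existing
    # "Clear existing value" (or a non-null input): the incoming value wins
    return incoming

# A null input preserves the stored value when ignoreNullFields is true...
print(upsert_value(42.0, None, ignore_null_fields=True))   # → 42.0
# ...and clears it when ignoreNullFields is false.
print(upsert_value(42.0, None, ignore_null_fields=False))  # → None
```

A non-null incoming value always overwrites the stored one regardless of the flag; the option only decides what a null input means.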
Hi, There is a bug that we need to fix when validating data for a direct relation property. Can you try running the transformation both ways to see? node_reference should work.
Hi, Normally you can use the UDF node_reference for this. In your case:

node_reference('sj-test-data-model', cast(TO_ITEM_ID as STRING)) as fromentity

If that doesn't work, you can try:

named_struct("spaceExternalId", "sj-test-data-model", "externalId", cast(TO_ITEM_ID as STRING)) as fromentity
Hi, The issue is fixed now. Can you try to re-ingest the data with the transformation and query it again?
Can you give me the UUID string of the request ID? It should be the one shown as xxxx in your error log. I can check with the auth team why that request is rejected.
Can you try replacing all environment variables with raw strings containing the real values in the authentication section of the .yaml file?

authentication:
  clientId: "yourClientId"
  clientSecret: "yourClientSecret"
  tokenUrl: "https://login.microsoftonline.com/yourTenantId/oauth2/v2.0/token"
  scopes:
    - "yourScope" # should be something like https://....
  cdfProjectName: "yourProjectName"

Also, can you give me the request ID from the error above? I can ask the auth team to check the 401:

401 | X-Request-ID: xxxxx

But I still think something is wrong with your credentials. If it were a value-parsing issue, we should have seen it with other users of the CLI as well.
Hi, I don't think it's a permission issue (you would get a 403 in that case). Here it's a 401, so you are not authenticated; it looks like your credentials are incorrect. Can you double-check them? Also, is it possible to get the full error message? Maybe I can get a hint about which endpoint returns that 401.
Hi, I think it read the view version as a number rather than a string, according to your log:

'view_version': 2

Can you force it to a string in the manifest instead?

view_version: "0_2"
Hi, Ah right, you need to use the snake_case syntax instead of camelCase, I think :) https://github.com/cognitedata/transformations-cli/blob/main/tests/test_deploy.py#L185-L190

destination:
  type: instances
  view_external_id: test_view
  view_version: test_view_version
  view_space_external_id: test_space
  instance_space_external_id: test_space
Hi, it looks like you are not using the latest version of the CLI: we renamed the class to AlphaDMIDestinationConfig (not DMIDestinationConfig), and the new type for FDM is InstancesDestinationConfig.
Hi, Unfortunately we don't support the SET command yet. I can suggest a workaround for this using a CTE, as below:

with myTime(startDate, endDate) as (select "2023-01-01", "2023-12-31")
select explode(sequence(to_timestamp(startDate), to_timestamp(endDate), interval 12 hours)) as timestamp
from myTime
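For reference, here is a plain-Python sketch of what that sequence(...) call with a 12-hour interval produces, using the same sample dates as the query above (the helper function is hypothetical, just to mirror the Spark SQL semantics):

```python
from datetime import datetime, timedelta

def half_day_range(start, end, step_hours=12):
    """Yield timestamps from start to end (inclusive) every step_hours hours,
    mirroring sequence(to_timestamp(startDate), to_timestamp(endDate),
    interval 12 hours) in Spark SQL."""
    t = start
    while t <= end:
        yield t
        t += timedelta(hours=step_hours)

stamps = list(half_day_range(datetime(2023, 1, 1), datetime(2023, 12, 31)))
# First and last timestamps, and the total count of 12-hour steps in 2023
print(stamps[0], stamps[-1], len(stamps))
```

explode then turns that array into one row per timestamp, which is what the CTE-based workaround feeds into the rest of the query.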
Hi, Unfortunately distinct can't work on a map. I don't see your complete query, but here is a workaround you can take inspiration from:

select map(...) as metadata, ... from (select distinct * from my.table)
Hi, Yes, the naming is quite confusing. The space is the Data Model and the modelExternalId is the Data Model Type. So, for example, if you want to read from the model that you write to in the 2nd screenshot (the first screenshot doesn't have a Data Model Type, so I can't give you an example for it):

select * from _cdf_datamodels.`Data Model:Data Model Type`

which translates to:

select * from _cdf_datamodels.`udmobjectmodel:Subject_1`
Hi @Trond Saure, You can try to query FDM in a transformation using the syntax below:

select * from _cdf_datamodels.`$space:$modelExternalId`

Let me know if this helps!
Hi @Neerajkumar Bhatewara, You can ingest data for a JSON property type using the function to_json or even a raw JSON string; both syntaxes below should work:

to_json(named_struct("string_val", "toto", "int_val", 1)) as prop_json

'{ "int_val": 2, "string_val": "tata", "struct_val": { "name": "jetfire", "age": 25.0 }}' as prop_json

Please ignore the warning about type string: we can't infer whether the result is JSON or a raw JSON string. If you run the transformation, it should ingest the data successfully. Let me know if you encounter any issue :)
Hi, Your query syntax is not correct. If you want the and condition on both oilRate and gasRate, it should be:

filter: { and: [{oilRate: {eq: 1.5}}, {gasRate: {eq: 1.5}}] }
Hi, We made the fix to support the JSON type; can you try it again? I don't think we have good support for linking time series data with FDM yet, but there is work on that as FDM grows as well :)
Hi, You are not missing anything :) Currently, Transformations do not support writing to the JSON type yet. We have this feature in the backlog and will work on it.
Hi Ben, There is a bug where the SET command is not allowed when validating the query. In the meantime, before we fix it, can you try something like the below as an alternative solution?

WITH AllMyVars as (
  select 123 as myVar
)
select myVar as key from AllMyVars