
It is our intent that Power BI can be used as a real-time visualization tool for many use cases, including:


  1. Analytics on Compressor Efficiency
  2. Real time monitoring of Natural Gas Consumption in our methanol unit
  3. Work Order Wrench Time Optimization
  4. Shift Schedule and OT optimization


The problem with the Power BI connector is that it currently treats CDF as a data source and imports data on a scheduled basis. There is no federated connection available in which data is not imported or duplicated. Microsoft has confirmed this issue lies with Cognite and not on their side. The current Power BI connector forces an import and duplicates data in the Power BI workspace. This method goes by two names, Import Data and Scheduled Refresh, and both describe its behavior: data from the source is loaded into Power BI, which means consuming memory and disk space. As long as you are developing on your machine with Power BI Desktop, it is the memory and disk space of your machine. Once you publish the report to the service, it is the memory and disk space of the Power BI cloud machines on Azure.

If you have 1 million rows in a source table and load them into Power BI with no filtering, you end up with the same number of rows in Power BI. If you have a database with 1,000 tables but only load 10 of them into Power BI, you consume memory for only those 10 tables. The bottom line is that you spend memory and disk space in proportion to how much data you load into Power BI. This approach also has limits on how much data can be imported and at what frequency, so it not only restricts us at scale but also fails to provide data in real time.

Power BI Live Connection is very similar to DirectQuery in the way it works with the data source: it does not store data in Power BI and instead queries the data source every time. Power BI acts purely as a visualization layer, storing only the metadata of tables (table names, column names, relationships, and so on) but not the data itself. The Power BI file is much smaller, and you will most likely never hit the size limitation because no data is stored in the model.

Because these data sources are modeling engines themselves, Power BI only connects to them and fetches the model metadata (measure names, attribute names, relationships, and so on). With this method, you need to handle all your modeling requirements in the data source, which in our case is CDF, and Power BI just surfaces that data through visualization.

I hope this makes sense.

I would like this addressed; there are multiple Hub posts on this.


Hi, and thanks again for this product idea.

We’re already tracking this request in the “Cognite CDF: PowerBi Live and Real Time Connector (CRITICAL)” product idea, so I will ask @Anita Hæhre or @Carin Meems to please merge the information in this product idea with it.


Updated idea status: New → Parked
Idea merged into:

All the votes from this idea have been transferred.

Cognite Team @Anita Hæhre, out of curiosity, since the final solution is not yet available, could this be addressed by using a Push dataset, Streaming dataset, or PubNub streaming dataset?

Reference here: https://learn.microsoft.com/en-us/power-bi/connect-data/service-real-time-streaming


@andrelcalves - If the question is whether it would be possible to use a Push/Streaming/PubNub dataset right now, I believe the answer to be “no” as we do not expose the equivalent of a “streaming data flow” from CDF (and something would need to provide OData with the changing data information). 

If you’re asking whether a Push/Streaming/PubNub dataset could be a possible feature enhancement for our OData interface(s), that’s an interesting idea and something we may choose to do at some point. But it is unfortunately not something we can get started on until at the earliest the summer of 2024.


You are correct @andrelcalves, it is possible to create a scheduled job and push data from CDF to a push or streaming dataset in Power BI using the Power BI REST API.

It's not as simple as creating the report using the CDF Power BI connector, but it is a viable solution. In this case, we are not using OData to transfer data, but instead using a wrapper to fetch data from the CDF REST API and push it to the Power BI REST API.

I tested it a while ago just to confirm (followed this tutorial) and it worked fine. I created a simple Cognite Function and scheduled it to push data to Power BI every five minutes, and it has been working fine since; a rough sketch of what such a function can look like is below. I plan to write a tutorial here on the Hub sharing how to use this method to stream data to Power BI datasets.
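To make the idea concrete, here is a minimal sketch of such a Cognite Function. The push URL and the time series external ID are placeholders (you get the real push URL from the streaming dataset's API info page in Power BI), and the exact SDK call names may differ slightly between cognite-sdk versions:

```python
import requests
from datetime import datetime, timezone

# Placeholder push URL, copied from the streaming dataset's API info page in Power BI.
PUSH_URL = "https://api.powerbi.com/beta/<workspace-id>/datasets/<dataset-id>/rows?key=<key>"
# Hypothetical external ID of the CDF time series to stream.
EXTERNAL_ID = "methanol_unit:natural_gas_consumption"

def handle(client, data):
    # Cognite Functions injects an authenticated CogniteClient as `client`.
    latest = client.time_series.data.retrieve_latest(external_id=EXTERNAL_ID)
    rows = [
        {
            # CDF datapoint timestamps are milliseconds since epoch.
            "timestamp": datetime.fromtimestamp(ts / 1000, tz=timezone.utc).isoformat(),
            "value": value,
        }
        for ts, value in zip(latest.timestamp, latest.value)
    ]
    # The push URL of a streaming dataset accepts a plain JSON array of rows.
    response = requests.post(PUSH_URL, json=rows)
    response.raise_for_status()
    return {"rows_pushed": len(rows)}
```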



Thanks @Thomas Sjølshagen 


Great, @Everton Colling! That sounds like a viable solution, albeit a tough one. I'll take a look at the tutorial you mentioned. It would be fantastic to have your tutorial with a Cognite example. If I can contribute from my side, I will gladly share it with you. Thank you very much.


Here is the tutorial I promised.

As I mentioned before, this can be a valid solution for continuously pushing data from CDF to Power BI at a higher frequency; the only extra moving part is the schedule itself, sketched below. @Andre Alves, thanks again for the suggestion of using the Power BI REST API endpoints.
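For anyone wiring this up, scheduling the deployed function can be done from the Python SDK. A minimal sketch, assuming the function above was deployed with the hypothetical external ID "push-to-powerbi" and that you have client credentials for the schedule to run under (check your SDK version for the exact signature):

```python
from cognite.client import CogniteClient

# Assumes client configuration (project, cluster, credentials) is already set up,
# e.g. via environment variables or a global config.
client = CogniteClient()

# Look up the deployed function by its (hypothetical) external ID.
fn = client.functions.retrieve(external_id="push-to-powerbi")

# Run the function every five minutes.
schedule = client.functions.schedules.create(
    name="push-latest-to-powerbi",
    cron_expression="*/5 * * * *",
    function_id=fn.id,
    client_credentials={"client_id": "<client-id>", "client_secret": "<client-secret>"},
)
print(f"Created schedule {schedule.id}")
```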