Can we use source control, or version control, with Power BI? We get this question ALL THE TIME! Adam walks you through what your options are and how this works today with Power BI.
ALM Toolkit: http://alm-toolkit.com/
Tabular Editor: https://tabulareditor.com/
Latest updates for Deployment Pipelines:
Source Control Idea
📢 Become a member: https://guyinacu.be/membership
Want to take your Power BI skills to the next level? We have training courses available to help you with your journey.
🎓 Guy in a Cube courses: https://guyinacu.be/courses
🛠 Check out my Tools page – https://guyinacube.com/tools/
#PowerBI #SourceControl #GuyInACube
Hey, is this still the current state of CI/CD in Power BI? Are there any significant releases around this since you made this video?
We would love the opportunity to "diff" the files, but we can leave out the "div"-the-files feature you proposed :)) Thanks for the content!
Is there a way to actually track changes to the report design of a .pbix?
Can Tabular Editor be used for extracting a BIM file from a Power BI template (PBIT) file?
All these COTS click-and-drag tools have the worst integration (especially granular integration). And it's sheerly moronic to develop the obvious model, view, and controller parts as a single binary. MiroSUCKS stupidity! I always separate my systems into loose controllers, loose models, and loose view components.
Also, for the sake of ultimate flexibility and cooperation with other devs, I'd never use Power BI after seeing this. It's just not enterprise grade in my opinion.
haha, you mean "diffs" not "divs" right?
For source control, we create a SharePoint Document Library with versioning enabled to host the .pbix files. Each .pbix file is connected [Get data / Files / SharePoint – Team Sites ] to a DEV [we use deployment pipelines] PBI Workspace.
To "publish" a new version, we just upload a new copy — using the exact same file name — to SharePoint. PBI automagically detects the change in the connected file and updates the report and dataset.
One unpleasant part is that in order to record a check-in comment when we update the .pbix file in SharePoint, we must use "classic" mode since the new 'modern' SharePoint mysteriously deleted this capability.
The only other awkward piece is that when connecting to the workspace, there is no drop-down for the SharePoint site URL; it must be memorized or copied/pasted.
The end result is that we have a versioned set of .pbix files with comments in SharePoint. Whatever changes in SharePoint gets reflected in the PBI workspace where we can deploy as appropriate to TEST & PROD.
We keep the model definition in SSAS, which lets us at least separate the model from the reports. I design the .bim in VS2019. Once ready, I check the project into DevOps, which kicks off a pipeline that creates a deployment package. That is then automatically pushed to our development environment, which installs the .bim in SSAS, transforms the data source connections for the deployment environment, etc. It even creates a job in SQL Server to refresh the cube every 15/30 mins. We also manually add our PBIX files to source control. Happy to share the DevOps steps if anyone is interested.
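A pipeline step like the one described above can hand the .bim to Tabular Editor 2's command-line interface for deployment. This is a minimal sketch: the paths, server name, and database name are hypothetical placeholders, and the CLI switches (`-D` deploy, `-O` overwrite, `-C` connections, `-R`/`-M` roles and members) should be checked against the docs for your Tabular Editor version.

```python
import subprocess

def build_deploy_command(te_path, bim_path, server, database):
    """Assemble a Tabular Editor 2 CLI call that deploys a .bim file.

    -D deploys to the given server/database; -O allows overwriting an
    existing database; -C deploys connections; -R and -M deploy roles
    and role members.
    """
    return [te_path, bim_path, "-D", server, database, "-O", "-C", "-R", "-M"]

cmd = build_deploy_command(
    r"C:\Tools\TabularEditor\TabularEditor.exe",  # hypothetical install path
    r".\Model\Model.bim",
    "localhost",                                  # dev SSAS instance
    "SalesModel",                                 # hypothetical database name
)
print(cmd)
# In an actual DevOps pipeline step you would run:
# subprocess.run(cmd, check=True)
```

Building the argument list in one place keeps the pipeline step itself trivial: swap the server/database per environment and reuse the same function for DEV, TEST, and PROD.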
In the meantime, Tabular Editor is licensed software that you have to buy.
I like how the icon for "diff" is the html "div" icon < / >
ALM toolkit is a nice workaround to track changes between versions
Hi Adam, have you had the chance to try one of the Azure DevOps extensions to get Power BI CI/CD working?
You said diffs, and I was also thinking fetch/merge, as in a git action. But the video showed "Divs".
Did your editor fuck up by confusing div with diff?
Would be great if you could directly publish a PBIT file, as storing the data contained in a PBIX file isn't a great idea for a source control system. We ended up creating a process to use a "deflated" PBIX file and then automating the publish through a DevOps CI/CD process using the REST APIs. It isn't perfect, but it has made deployments to the UAT and Prod workspaces a little less manual.
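The automated publish mentioned above typically goes through the Power BI REST API's Imports endpoint. A minimal sketch of building that request URL, assuming a workspace (group) GUID and a display name; the GUID here is a placeholder, the .pbix bytes go in a multipart/form-data body, and an AAD bearer token goes in the Authorization header (both omitted here):

```python
import urllib.parse

API_ROOT = "https://api.powerbi.com/v1.0/myorg"  # Power BI REST API root

def build_import_url(group_id, dataset_display_name,
                     conflict="CreateOrOverwrite"):
    """URL for POSTing a .pbix to a workspace's Imports endpoint.

    datasetDisplayName names the resulting dataset/report;
    nameConflict controls what happens if that name already exists.
    """
    query = urllib.parse.urlencode({
        "datasetDisplayName": dataset_display_name,
        "nameConflict": conflict,
    })
    return f"{API_ROOT}/groups/{group_id}/imports?{query}"

url = build_import_url("00000000-0000-0000-0000-000000000000",
                       "SalesReport.pbix")
print(url)
```

A pipeline would then POST the file to that URL (for example with `requests.post(url, headers={"Authorization": f"Bearer {token}"}, files=...)`) and poll the returned import ID until the upload finishes.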
Do you think there is a way to take two different versions of a pbix file, unzip them, extract their config, merge that together, and then package the file back up? The goal would be to merge changes on the front end.
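The first half of that idea, pulling the report config out of two pbix versions to see what differs, can be sketched with nothing but the standard library, since a pbix is a zip archive. This assumes the layout lives at `Report/Layout` as UTF-16-LE-encoded JSON (true of the pbix files I've inspected, but worth verifying on yours); actually merging the layouts back together is a much harder problem than spotting the divergence.

```python
import json
import zipfile

def read_layout(pbix_path):
    """Pull the report-layout JSON out of a .pbix (a zip archive)."""
    with zipfile.ZipFile(pbix_path) as z:
        raw = z.read("Report/Layout")
    return json.loads(raw.decode("utf-16-le"))

def changed_keys(old_path, new_path):
    """Top-level layout keys whose values differ between two versions."""
    a, b = read_layout(old_path), read_layout(new_path)
    return sorted(k for k in set(a) | set(b) if a.get(k) != b.get(k))
```

Knowing which top-level sections changed (pages, filters, config, etc.) at least tells you whether two versions touched the same part of the report before you attempt any merge.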
0:52 – <div />'s or diffs??
This will be a great update, but I think the PBI pipelines should support more than 3 stages, i.e. Dev/QA/UAT/…/Stage/Prod.
Has anyone found a solution to get the conversation ID in Outlook? Can someone help me?
As some others have stated, *.pbix files aren't technically binary files; they are just re-badged zip files. If you change the extension and open it, basically everything inside (aside from actual binaries like images) is stored as plain text and is easily diff-able.
That said, there are ways to effectively use source control on pbix files with some automation using pre-commit and post-commit hooks. The pre-commit hook would unpack the pbix into a folder of the same name and THEN push those component pieces to source control. The post-commit hook would then re-bundle what it gets from source control into the appropriate pbix (zip) archive. I have been looking to implement this at my workplace for some time and haven't done so yet, but there are examples out there of people who have done it.
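The unpack/repack pair those hooks would call is small enough to sketch with the standard library alone. This is a minimal version that treats the pbix purely as a zip; note that real pbix files contain a `SecurityBindings` entry, and Power BI Desktop may reject a repacked file whose signature no longer matches, so a production hook usually has to handle that entry specially (a detail omitted here).

```python
import os
import zipfile

def unpack_pbix(pbix_path, out_dir):
    """Pre-commit: explode the .pbix (a zip) into diff-able components."""
    with zipfile.ZipFile(pbix_path) as z:
        z.extractall(out_dir)

def repack_pbix(src_dir, pbix_path):
    """Post-checkout: bundle the component files back into a .pbix."""
    with zipfile.ZipFile(pbix_path, "w", zipfile.ZIP_DEFLATED) as z:
        for root, _dirs, files in os.walk(src_dir):
            for name in files:
                full = os.path.join(root, name)
                # arcname relative to src_dir so paths match the original
                z.write(full, os.path.relpath(full, src_dir))
```

With these two functions, the git hooks themselves reduce to a few lines each: unpack before commit, repack after checkout/merge.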
Mostly this whole situation is infuriating to me as it means Microsoft could provide solutions to this source control problem very easily within their framework. Especially given they OWN GitHub and could VERY EASILY just add a feature (optional) to auto-unpack and re-pack PBIX files for source control.
When the complexity level of reports gets real, having appropriate version control is crucial. I don't know why they haven't implemented a way of handling version control yet, even more so when the pbix file is a zip and there are traces that pbix files work with configuration files and the like…
I'm using an Azure SQL DB with Azure AD. I can connect to it using SSMS and SSDT, but when I try to connect with Power BI Desktop it won't authenticate the Azure Active Directory account (EXTERNAL_USER role). If I use a SQL login (SQL_USER role) with Power BI it works fine. Any thoughts on what I need to do to get this up and running?
I'm using an ODBC connection for now, but this is not optimal and makes it impossible to refresh data via the Power BI Portal.
Thanks for any help