Source Control with Power BI – Can it be done?




    Can we use source control, or version control, with Power BI? We get this question ALL THE TIME! Adam walks you through what your options are and how this works today with Power BI.

    ALM Toolkit: http://alm-toolkit.com/

    Tabular Editor: https://tabulareditor.com/

    Latest updates for Deployment Pipelines:
    https://powerbi.microsoft.com/blog/announcing-new-deployment-pipelines-capabilities/

    Source Control Idea
    https://ideas.powerbi.com/ideas/idea/?ideaid=c73d887d-042e-4ca0-8216-bcb09bd282ab

    📢 Become a member: https://guyinacu.be/membership

    *******************

    Want to take your Power BI skills to the next level? We have training courses available to help you with your journey.

    🎓 Guy in a Cube courses: https://guyinacu.be/courses

    *******************
    LET’S CONNECT!
    *******************

    — http://twitter.com/guyinacube
    — http://twitter.com/awsaxton
    — http://twitter.com/patrickdba
    — http://www.facebook.com/guyinacube
    — https://www.instagram.com/guyinacube/
    — https://guyinacube.com

    ***Gear***
    🛠 Check out my Tools page – https://guyinacube.com/tools/

    #PowerBI #SourceControl #GuyInACube



    22 COMMENTS

    1. All these COTS click-and-drag tools have the worst integration (especially granular integration) and are terrible. And it’s sheerly moronic to develop the obvious model, view, and controller parts as a single binary. MicroSUCKS stupidity! I always separate my systems into loose controllers, loose models, and loose view components.
      I also do it for the sake of ultimate flexibility and cooperation with other devs. I’d never use Power BI after seeing this. It’s just not enterprise grade in my opinion.

    2. For source control, we create a SharePoint Document Library with versioning enabled to host the .pbix files. Each .pbix file is connected [Get data / Files / SharePoint – Team Sites ] to a DEV [we use deployment pipelines] PBI Workspace.

      To "publish" a new version, we just upload a new copy — using the exact same file name — to SharePoint. PBI automagically detects the change in the connected file and updates the report and dataset (a sketch of this upload step is at the end of this comment).

      One unpleasant part is that in order to record a check-in comment when we update the .pbix file in SharePoint, we must use "classic" mode since the new 'modern' SharePoint mysteriously deleted this capability.

      The only other awkward piece is that when connecting to the workspace, there is no drop-down for the SharePoint site URL; it must be memorized or copied/pasted.

      The end result is that we have a versioned set of .pbix files with comments in SharePoint. Whatever changes in SharePoint gets reflected in the PBI workspace where we can deploy as appropriate to TEST & PROD.
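
      A minimal sketch of that "publish" step in Python, using the Microsoft Graph file-upload endpoint. The token, site ID, and library path are placeholders; it assumes the library is the site's default document library, and files over roughly 4 MB would need a Graph upload session instead of a single PUT:

      import requests

      GRAPH = "https://graph.microsoft.com/v1.0"
      ACCESS_TOKEN = "<azure-ad-token>"          # placeholder: token acquired elsewhere
      SITE_ID = "<sharepoint-site-id>"           # placeholder: target SharePoint site
      LIBRARY_PATH = "Shared Documents/Reports"  # placeholder: folder in the document library

      def upload_pbix(local_path: str, file_name: str) -> None:
          """Overwrite (or create) file_name in the library; versioning keeps the history."""
          url = f"{GRAPH}/sites/{SITE_ID}/drive/root:/{LIBRARY_PATH}/{file_name}:/content"
          with open(local_path, "rb") as f:
              resp = requests.put(
                  url,
                  headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
                  data=f,
              )
          resp.raise_for_status()

      # Upload the new copy under the exact same name so the connected workspace picks it up.
      upload_pbix("Sales.pbix", "Sales.pbix")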

    3. We keep the model definition in SSAS, which at least lets us manage the model definition separately. I design the .bim in VS2019. Once it is ready, I check the project into DevOps, which kicks off a pipeline that creates a deployment package. That is then automatically pushed to our development environment, which installs the .bim in SSAS, transforms the data source connections for the target environment, etc. It even creates a job in SQL Server to refresh the cube every 15/30 mins. We also manually add our PBIX files to source control. Happy to share the DevOps steps etc. if anyone is interested.
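
      A minimal sketch of just the "transforms the data source connections" step, assuming a compatibility-level-1200-style Model.bim with legacy provider data sources that expose a plain connectionString (server names and file path below are illustrative):

      import json

      TARGET_SERVERS = {  # illustrative environment -> server mapping
          "dev": "sql-dev.contoso.local",
          "test": "sql-test.contoso.local",
          "prod": "sql-prod.contoso.local",
      }

      def retarget_bim(bim_path: str, environment: str) -> None:
          """Rewrite the Data Source= segment of every data source for the target environment."""
          with open(bim_path, encoding="utf-8") as f:
              bim = json.load(f)

          server = TARGET_SERVERS[environment]
          for ds in bim.get("model", {}).get("dataSources", []):
              if "connectionString" in ds:
                  parts = [p for p in ds["connectionString"].split(";")
                           if p and not p.lower().startswith("data source=")]
                  parts.insert(0, f"Data Source={server}")
                  ds["connectionString"] = ";".join(parts)

          with open(bim_path, "w", encoding="utf-8") as f:
              json.dump(bim, f, indent=2)

      retarget_bim("Model/Model.bim", "dev")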

    4. It would be great if you could directly publish a PBIT file, as storing the data contained in a PBIX file isn’t a great idea for a source control system. We ended up creating a process that uses a “deflated” PBIX file and then uses the REST APIs to automate the publish through a DevOps CI/CD process (see the sketch below). It isn’t perfect, but it has made deploys to the UAT and Prod workspaces a little less manual.
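
      A minimal sketch of that publish step via the Power BI REST API ("Post Import In Group"), assuming the deflated PBIX has already been produced and an Azure AD token with the right scopes is available; the workspace ID and token are placeholders:

      import requests

      ACCESS_TOKEN = "<azure-ad-token>"      # placeholder: service principal / user token
      WORKSPACE_ID = "<uat-workspace-guid>"  # placeholder: target workspace

      def publish_pbix(pbix_path: str, dataset_name: str) -> str:
          """Upload the .pbix to the workspace, overwriting any dataset with the same name."""
          url = (
              f"https://api.powerbi.com/v1.0/myorg/groups/{WORKSPACE_ID}/imports"
              f"?datasetDisplayName={dataset_name}&nameConflict=CreateOrOverwrite"
          )
          with open(pbix_path, "rb") as f:
              resp = requests.post(
                  url,
                  headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
                  files={"file": f},
              )
          resp.raise_for_status()
          return resp.json()["id"]  # import ID, which the pipeline can poll for completion

      import_id = publish_pbix("Sales.pbix", "Sales")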

    5. As some others have stated, *.pbix files aren't technically binary files; they are just re-badged ZIP files. If you change the extension and open one, basically everything inside (aside from actual binaries like images) is stored as plain text and is easily diff-able.

      That said, there are ways to effectively use source control on pbix files with some automation using pre-commit and post-commit hooks. The pre-commit hook would unpack the pbix into a folder of the same name and THEN push those component pieces to source control. The post-commit hook would then re-bundle what it gets from source control into the appropriate pbix (ZIP) archive. I have been looking to implement this at my workplace for some time and haven't done so yet, but there are examples of people out there who have done it (see the sketch at the end of this comment).

      Mostly this whole situation is infuriating to me, as it means Microsoft could solve this source control problem very easily within their own framework, especially given they OWN GitHub and could VERY EASILY add an (optional) feature to auto-unpack and re-pack PBIX files for source control.
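
      A minimal sketch of the unpack/re-pack idea, using only the Python standard library. A re-zipped archive is not always reopened cleanly by Power BI Desktop (some parts are signed or specially compressed), so treat this as a starting point for the hooks rather than a finished solution:

      import zipfile
      from pathlib import Path

      def unpack_pbix(pbix_path: str) -> Path:
          """Pre-commit side: extract e.g. Sales.pbix into a sibling Sales/ folder for diff-friendly commits."""
          out_dir = Path(pbix_path).with_suffix("")
          with zipfile.ZipFile(pbix_path) as zf:
              zf.extractall(out_dir)
          return out_dir

      def repack_pbix(folder: Path, pbix_path: str) -> None:
          """Post-commit side: re-bundle the extracted folder back into a .pbix (ZIP) archive."""
          with zipfile.ZipFile(pbix_path, "w", zipfile.ZIP_DEFLATED) as zf:
              for file in folder.rglob("*"):
                  if file.is_file():
                      zf.write(file, file.relative_to(folder))

      folder = unpack_pbix("Sales.pbix")
      repack_pbix(folder, "Sales.pbix")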

    6. When reports reach real complexity, having appropriate version control is crucial. I do not know why they haven't implemented a way of handling version control yet, even more so when the pbix file is a ZIP and there are traces that pbix files work with configuration files and the like…

    7. Hi guys,

      I'm using an Azure SQL DB with Azure AD. I can connect to it using SSMS and SSDT, but when I try to connect with Power BI Desktop it won't authenticate the Azure Active Directory account (EXTERNAL_USER role). If I use an SQL login (SQL_USER role) with Power BI it works fine. Any thoughts on what I need to do to get this up and running?

      I now use an ODBC connection, but this is not optimal and it is impossible to refresh the data via the Power BI Portal.

      Thanks for any help