Archive for the ‘Blogs’ Category

Power BI and Source Control Integration

Yes! Source-controlling Power BI report files (.pbix) within a team is a real problem, and the Power BI team hasn't solved it yet. I have faced it myself.

Questions arise:

  1. How do we integrate Power BI files with a source control tool like TFS, SVN, Git, or Azure DevOps, when there is no project structure available?
  2. Will we get proper versioning of .pbix files, given that they are not XML/JSON based the way SQL Server Reporting Services reports are?

Different teams have adopted different ways to handle it.

One team used Visual Studio to create a Solution without any actual project in it. They created Solution Folders to organize the report files, added all the .pbix files to their respective folders, and finally associated the Solution with TFS. Whenever they need to work on a particular report, they open the Visual Studio Solution and double-click the .pbix file. It opens the report in Power BI Desktop; they work on it, save it, come back to Visual Studio, and check in. That's it! The real issue, though, is that every Power BI developer needs Visual Studio installed on their local system to make this work. Management questions why a Power BI team needs Visual Studio at all, because, of course, they need to buy VS licenses for them.

Does it really maintain different versions of a .pbix file? Technically, a .pbix file is not an XML/JSON file the way SSRS reports are. So how does it maintain different versions? Practically, I have seen it working fine with TFS. God knows how it works, but it works, at least so far. I have never had anyone report an issue with it.
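For teams using Git instead of TFS, a minimal `.gitattributes` sketch (my own suggestion, not an official Power BI recommendation) tells Git to treat .pbix files as opaque binaries so it never attempts a text merge on them:

```gitattributes
# Treat Power BI report files as binary: no text diffs, no merge attempts
*.pbix binary

# Or, if the reports are large, track them with Git LFS instead
# (requires git-lfs to be installed and initialized in the repo):
# *.pbix filter=lfs diff=lfs merge=lfs -text
```

With this in place, Git still versions every check-in of the file; you just lose line-level diffs, which .pbix files can't give you anyway.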

I wish the Power BI team would integrate the Desktop tool with Azure DevOps the way the Azure Data Factory team has done for ADF V2. Azure Data Factory has implemented CI/CD in the Azure Portal itself, so there is no need for Visual Studio and the other tooling that was required with ADF V1. Yes! I faced problems working with ADF V1 when it came to sharing files with other team members. I hope the Power BI team does the same soon.

Let me know your experience on this issue.


How to find out Power BI Desktop Version, User ID and Session ID

It’s always good to have the latest version of Power BI Desktop, as the Power BI team keeps updating the tool. So, how do you find out which version of Power BI Desktop you’ve been running?

Open up your Power BI Desktop, go to About, and click it. It will open a dialog displaying all the required details about your Power BI Desktop. Take a look at the image below:

I’ve blurred the User ID and Session ID out of the image, as that is, I guess, confidential information about my Power BI Desktop.



Update on .NET Core 3.0 and .NET Framework 4.8

Courtesy: Scott Hunter, Director of Program Management for .NET

In May 2018, Microsoft announced .NET Core 3.0, the next major version of .NET Core, which adds support for building desktop applications using WinForms, WPF, and Entity Framework 6. Microsoft also announced some exciting updates to .NET Framework which enable you to use the new modern controls from UWP in existing WinForms and WPF applications.

Today, Microsoft has shared a bit more detail on what it is building and on the future of .NET Core and .NET Framework.

.NET Core 3.0 addresses three scenarios our .NET Framework developer community has asked for, including:

  • Side-by-side versions of .NET that support WinForms and WPF: Today there can only be one version of .NET Framework on a machine. This means that when we update .NET Framework on Patch Tuesday or via updates to Windows, there is a risk that a security fix, bug fix, or new API can break applications on the machine. With .NET Core, we solve this problem by allowing multiple versions of .NET Core on the same machine. Applications can be locked to one of the versions and can be moved to use a different version when ready and tested.

  • Embed .NET directly into an application: Today, since there can only be one version of .NET Framework on a machine, if you want to take advantage of the latest framework or language feature you need to install or have IT install a newer version on the machine. With .NET Core, you can ship the framework as part of your application. This enables you to take advantage of the latest version, features, and APIs without having to wait for the framework to be installed.

  • Take advantage of .NET Core features: .NET Core is the fast-moving, open source version of .NET. Its side-by-side nature enables us to quickly introduce new innovative APIs and BCL (Base Class Library) improvements without the risk of breaking compatibility. Now WinForms and WPF applications on Windows can take advantage of the latest .NET Core features, which also include more fundamental fixes for even better high-DPI support.
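A quick sketch of what the side-by-side and self-contained scenarios above look like in practice with the standard .NET Core CLI (the SDK version number and runtime identifier below are just examples, not taken from the announcement):

```shell
# Pin a repository to one installed SDK via global.json, so other
# side-by-side SDK versions on the machine are ignored for this app
cat > global.json <<'EOF'
{ "sdk": { "version": "3.0.100" } }
EOF

# Ship the framework inside the application (self-contained deployment),
# so nothing needs to be installed on the target machine
dotnet publish -c Release -r win-x64 --self-contained true
```

The first part gives you the "locked to one version" behaviour; the second gives you "embed .NET directly into an application."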

.NET Framework 4.8 addresses three scenarios our .NET Framework developer community has asked for, including:

  • Modern browser and modern media controls: Today, .NET desktop applications use Internet Explorer and Windows Media Player for showing HTML and playing media files. Since these legacy controls don’t show the latest HTML or play the latest media files, we are adding new controls that take advantage of Microsoft Edge and newer media players to support the latest standards.

  • Access to touch and UWP Controls: UWP (Universal Windows Platform) contains new controls that take advantage of the latest Windows features and touch displays. You won’t have to rewrite your applications to use these new features and controls. We are going to make them available to WinForms and WPF so that you can take advantage of these new features in your existing code.

  • High DPI improvements: The resolution of displays is steadily increasing to 4K and now even 8K resolutions. We want to make sure your existing WinForms and WPF applications can look great on these displays.

Read more…

Migrating On-Premises SQL Server Database to Azure SQL Database

Today I was trying to migrate an on-premises SQL Server 2016 database to Azure SQL Database using the .bacpac file option, i.e. Export Data-Tier Application, but no luck! I got an error every time I tried, and to my surprise, all the credentials and configurations were correct.

The error I got was: “Deployment fails with ‘Unable to connect to master or target server xyz. You must have a user with the same password in master or target server xyzdb’”. I really couldn’t understand why it was happening, as everything was correct. Whether I tried it from the Azure Portal or from SSMS, the result remained the same: an error.

Somehow I figured out that I had been using SQL Server Management Studio ver. 13.0, and that I needed the latest SSMS. I was using SQL Server 2016 with its matching SSMS version. So I downloaded SQL Server Management Studio ver. 17.4, which ships with SQL Server 2017. And guess what? It worked without any error on the very first try. 🙂

So, the error message above was completely misleading. Just keep the latest SSMS with you; it doesn’t matter whether you are using SQL Server 2014 or 2016, and you will be able to migrate your SQL Server DB to Azure SQL DB.
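For reference, the same .bacpac export/import can also be scripted with the SqlPackage utility that ships with SSMS/DacFx; the server, database, and user names below are placeholders:

```shell
REM Export the on-premises database to a .bacpac file
SqlPackage.exe /Action:Export /SourceServerName:MyOnPremServer ^
    /SourceDatabaseName:MyDb /TargetFile:C:\temp\MyDb.bacpac

REM Import the .bacpac into Azure SQL Database
SqlPackage.exe /Action:Import /SourceFile:C:\temp\MyDb.bacpac ^
    /TargetServerName:myserver.database.windows.net ^
    /TargetDatabaseName:MyDb /TargetUser:admin /TargetPassword:********
```

Since SqlPackage is the engine behind the SSMS wizard, keeping it (via the latest SSMS) up to date is what fixed the misleading error above.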

Hope this helps you out! 🙂


OLE DB versus ADO NET Components in SSIS

ETL package developers often overlook these questions: What is the difference between the OLE DB and ADO NET components? Should I use the OLE DB Source/Destination or the ADO NET Source/Destination? Which one is faster?

Let me explain the difference. It comes down to the drivers/providers each uses behind the scenes, which are built on different technologies: ADO NET providers use ADO.NET and managed code, while OLE DB providers use native, COM-based code. Recently, Microsoft said that they would not be enhancing the OLE DB providers further, so don’t expect more from OLE DB; you will need to use the ADO NET providers more and more. Don’t get scared! In fact, if you wish to connect to Azure SQL Database, you need to use the ADO NET providers, as OLE DB doesn’t offer the flexibility to connect to Azure SQL Database or Azure SQL Data Warehouse. However, OLE DB still has more providers for connecting to different types of data stores compared to ADO NET, at least as of now.

As far as performance is concerned, I personally tested both sources/destinations with large data sets, i.e. 1–5 GB: database to database, database to file, and file to database, all types of packages. The statistics say that OLE DB has an edge over ADO NET, giving roughly 60% better performance. Yes! The ADO NET sources/destinations process data more slowly than OLE DB. I should mention that I tweaked all the relevant options, like DefaultBufferMaxRows, DefaultBufferSize, and AutoAdjustBufferSize (for SSIS 2016), before reaching this conclusion, and my package-development experience corroborates it.

Hence, I recommend using the OLE DB Source/Destination for the time being, until Microsoft enhances the ADO NET providers. Yes! You may still need ADO NET for Azure, which is becoming very common nowadays as every client is moving databases to Azure.

There are many other factors that affect package performance, which I am going to discuss in other blogs. So, stay tuned!

Hope this helps you out.


Why Not to Use Sort and Aggregate Transformations in SSIS

I often come across queries about SSIS package optimization techniques. There are many, and it all depends on the scenario. However, there are a few fundamental principles for architecting SSIS packages that help us design packages that run faster. I am going to talk about two transformations that I always avoid: the Sort transformation and the Aggregate transformation.

You can categorize all the transformations into two broad categories: Synchronous and Asynchronous. Or you may divide them into three types: Non-Blocking, Semi-Blocking, and Blocking. This isn’t the place to discuss all the categories/types; I will discuss only Blocking transformations here. Just for the sake of your knowledge, Non-Blocking transformations belong to the Synchronous category, while Semi-Blocking and Blocking transformations belong to the Asynchronous category.

Blocking transformations first accept their entire input, i.e. all the rows from the source, then perform the assigned operation, and only then produce output. During this process, you have to wait for the transformation to collect all the required rows before getting any output. This is where they degrade Data Flow performance and, in turn, slow down the package.
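The blocking behaviour is easy to demonstrate outside SSIS. Here is a small Python sketch of my own (purely an illustration, not SSIS internals): a row-by-row transformation can emit output as soon as the first row arrives, while a sort cannot emit anything until it has consumed every input row:

```python
def rows():
    """Simulate a source emitting rows one by one."""
    for i in [3, 1, 2]:
        yield i

# Non-blocking (synchronous) style: each row flows through as soon as
# it arrives, like a Derived Column computation.
doubled = (r * 2 for r in rows())
first = next(doubled)      # available immediately, source not yet exhausted

# Blocking style: sorted() must consume *every* input row before it can
# emit the first output row, exactly like the SSIS Sort transformation.
ordered = sorted(rows())   # nothing comes out until all rows are read

print(first, ordered)
```

Scale that buffering up to millions of rows and you see why a Blocking transformation stalls the whole pipeline.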

Both the Sort and Aggregate transformations are Blocking transformations. This is the reason I always recommend not using them in your pipeline at all. Then the question arises: how do you sort the rows and, if required, get aggregates?

There are ways! With an OLE DB Source and SQL Server as the source, use a SQL query instead of picking the table name from the drop-down list, and perform these operations in the query. They will then run on the database engine’s machine and give you the output instantly. In the case of a flat file/raw file, ask your team to calculate these values and put them either in another file or in the first one or two lines of the source file. I am sure you know how to skip initial lines while reading flat files; there is an option for it. Check it out! This is how you can avoid Blocking transformations like Sort and Aggregate and, in turn, achieve better package performance.
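On the flat-file side, here is a small Python sketch (the file layout is a made-up example) of the same idea as the Flat File connection's skip-initial-rows option: read a pre-computed aggregate from the first line, skip the header, and process the data rows without any Aggregate transformation:

```python
import io

# Pretend flat file: the first line carries an aggregate pre-computed
# by the upstream team; remaining lines are the actual data rows.
flat_file = io.StringIO("TOTAL=60\nid,amount\n1,10\n2,20\n3,30\n")

header = flat_file.readline().strip()   # read the aggregate line
total = int(header.split("=")[1])       # pre-computed value, no Aggregate needed

flat_file.readline()                    # skip the column-header line
rows = [line.strip().split(",") for line in flat_file]
amounts = [int(amount) for _id, amount in rows]

assert total == sum(amounts)            # the pre-computed value checks out
print(total, amounts)
```

The package just reads the value from the leading line instead of buffering the whole data set to compute it.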

I will be blogging about other techniques to optimize the SSIS package. So, stay tuned!


Microsoft Launches Preview of Quantum Development Kit and Q# Programming Language

Wow! Amazing! I’m excited!

Microsoft has just released a preview of the Quantum Development Kit for programming quantum computers. We’re also getting a new programming language called Q# for quantum programming.

Here is the link where you can get more information about the Quantum Development Kit. Go ahead and download it…

Click here to download Quantum Development Kit

Best of luck!