Update on .NET Core 3.0 and .NET Framework 4.8

Courtesy: Scott Hunter, Director of Program Management for .NET

In May 2018, Microsoft announced .NET Core 3.0, the next major version of .NET Core, which adds support for building desktop applications using WinForms, WPF, and Entity Framework 6. Microsoft also announced some exciting updates to the .NET Framework that enable you to use the new modern UWP controls in existing WinForms and WPF applications.

Today, Microsoft shared a bit more detail on what it is building and on the future of .NET Core and .NET Framework.

.NET Core 3.0 addresses three scenarios that our .NET Framework developer community has asked for:

  • Side-by-side versions of .NET that support WinForms and WPF: Today there can be only one version of .NET Framework on a machine. This means that when we update .NET Framework on Patch Tuesday or via updates to Windows, there is a risk that a security fix, bug fix, or new API can break applications on the machine. With .NET Core, we solve this problem by allowing multiple versions of .NET Core on the same machine. Applications can be locked to one of the versions and moved to a different version when ready and tested.

  • Embed .NET directly into an application: Today, since there can only be one version of .NET Framework on a machine, if you want to take advantage of the latest framework or language feature you need to install or have IT install a newer version on the machine. With .NET Core, you can ship the framework as part of your application. This enables you to take advantage of the latest version, features, and APIs without having to wait for the framework to be installed.

  • Take advantage of .NET Core features: .NET Core is the fast-moving, open-source version of .NET. Its side-by-side nature enables us to quickly introduce new innovative APIs and BCL (Base Class Library) improvements without the risk of breaking compatibility. Now WinForms and WPF applications on Windows can take advantage of the latest .NET Core features, which also include more fundamental fixes for better high-DPI support (a minimal sketch follows this list).
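To make the WinForms point concrete, here is a minimal sketch of a plain WinForms application; per the announcement, this kind of existing code is expected to compile and run on .NET Core 3.0 with only project-level changes. The namespace, form name, and window title below are made up for illustration, and the .NET Core 3.0 tooling is still in preview at the time of writing.

```csharp
using System;
using System.Windows.Forms;

namespace LegacyWinFormsApp
{
    // A plain WinForms form and entry point. On .NET Core 3.0 this same code is
    // expected to compile and run unchanged; only the project targets change.
    public class MainForm : Form
    {
        public MainForm()
        {
            Text = "WinForms on .NET Core 3.0 (preview)"; // placeholder title
        }
    }

    static class Program
    {
        [STAThread]
        static void Main()
        {
            Application.EnableVisualStyles();
            Application.SetCompatibleTextRenderingDefault(false);
            Application.Run(new MainForm());
        }
    }
}
```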

.NET Framework 4.8 addresses three scenarios that our .NET Framework developer community has asked for:

  • Modern browser and modern media controls: Today, .NET desktop applications use Internet Explorer and Windows Media Player for showing HTML and playing media files. Since these legacy controls don’t show the latest HTML or play the latest media files, we are adding new controls that take advantage of Microsoft Edge and newer media players to support the latest standards.

  • Access to touch and UWP Controls: UWP (Universal Windows Platform) contains new controls that take advantage of the latest Windows features and touch displays. You won’t have to rewrite your applications to use these new features and controls. We are going to make them available to WinForms and WPF so that you can take advantage of these new features in your existing code.

  • High DPI improvements: Display resolutions are steadily increasing, with 4K common and even 8K appearing now. We want to make sure your existing WinForms and WPF applications can look great on these displays.



Advanced C# Training at Siemens, Bangalore 25-29 June, 2018

Just finished a 5-day Advanced C# training at Siemens, Bangalore… Had a nice time explaining the concepts!

Visual C# .NET Training at Siemens, Bengaluru, 11 – 13 Jun, 2018

Just finished a 3-day C# training at Siemens, Bengaluru… Visited Siemens’ new office at Gold Hills, EC, Phase-2. Very good facility, especially the training room. Enjoyed it!

 

6th Consecutive Year being Microsoft Certified Trainer

Just received it! Excited! This is my 6th consecutive year as a Microsoft Certified Trainer… Thanks, Microsoft!

Migrating On-Premises SQL Server Database to Azure SQL Database

Today I was trying to migrate an on-premises SQL Server 2016 database to Azure SQL Database using the .bacpac file option, i.e. Export Data-tier Application, but bad luck! I got the same error every time I tried, and to my surprise, all credentials and configurations were correct.

The error that I got was: “Deployment fails with ‘Unable to connect to master or target server ‘xyz’. You must have a user with the same password in master or target server ‘xyzdb’”. I really couldn’t understand why it was happening, as everything was correct. Whether I tried it through the Azure Portal or SSMS, the result remained the same: the error.

Somehow I figured out that I had been using SQL Server Management Studio ver. 13.0, the version matching SQL Server 2016, and that I needed the latest SSMS. Hence, I downloaded SQL Server Management Studio ver. 17.4, which corresponds to SQL Server 2017. And guess what? It worked without any error on the very first try. 🙂

So, the above error message was completely misleading. Just have the latest SSMS with you; it doesn’t matter whether you are using SQL Server 2014 or 2016, and you will be able to migrate your SQL Server database to Azure SQL Database.
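If you prefer to script this instead of clicking through SSMS, the same Export/Import Data-tier Application steps can be driven from C# via DacFx (the Microsoft.SqlServer.DacFx NuGet package, which provides the Microsoft.SqlServer.Dac namespace). The sketch below is only an illustration of that approach, not the exact steps I followed; the connection strings, paths, and database names are placeholders.

```csharp
using Microsoft.SqlServer.Dac; // from the Microsoft.SqlServer.DacFx NuGet package

class BacpacMigration
{
    static void Main()
    {
        // Placeholder connection strings and names -- replace with your own.
        const string sourceConnection =
            "Data Source=localhost;Initial Catalog=MyDb;Integrated Security=True";
        const string targetConnection =
            "Server=tcp:myserver.database.windows.net,1433;User ID=myuser;Password=...;Encrypt=True;";
        const string bacpacPath = @"C:\temp\MyDb.bacpac";

        // 1. Export the on-premises database to a .bacpac file
        //    (equivalent to SSMS "Export Data-tier Application").
        var source = new DacServices(sourceConnection);
        source.ExportBacpac(bacpacPath, "MyDb");

        // 2. Import the .bacpac into Azure SQL Database
        //    (equivalent to "Import Data-tier Application" / the portal import).
        var target = new DacServices(targetConnection);
        using (var package = BacPackage.Load(bacpacPath))
        {
            target.ImportBacpac(package, "MyDb");
        }
    }
}
```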

Hope this helps you out! 🙂

 

OLE DB versus ADO NET Components in SSIS

ETL package developers often overlook these questions: What is the difference between the OLE DB and ADO NET components? Should I use the OLE DB Source/Destination or the ADO NET Source/Destination? Which one is faster?

Let me explain what the difference is. Both of them use drivers/providers behind the scenes, and those providers are built on different technologies: the ADO NET components use managed ADO.NET providers, the same ones you use from C# code, while the OLE DB components use native OLE DB providers. Recently, Microsoft said that it would not be enhancing the OLE DB providers further, so don’t expect much more from OLE DB; you will need to use the ADO NET providers more and more. Don’t get scared! In fact, if you wish to connect to Azure SQL Database, you need to use the ADO NET providers, as OLE DB doesn’t offer the flexibility to connect to Azure SQL Database or Azure SQL Data Warehouse. However, OLE DB still has more providers for connecting to different types of data stores than ADO NET, at least as of now.
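For context, the ADO NET components sit on top of the same managed ADO.NET providers you use from C# code. As a rough illustration (the server, database, and credentials below are placeholders), connecting to Azure SQL Database through the ADO.NET SqlClient provider looks like this:

```csharp
using System;
using System.Data.SqlClient; // the managed ADO.NET provider for SQL Server

class AzureSqlConnectionDemo
{
    static void Main()
    {
        // Placeholder server, database, and credentials -- replace with your own.
        var connectionString =
            "Server=tcp:myserver.database.windows.net,1433;" +
            "Initial Catalog=MyDb;User ID=myuser;Password=...;Encrypt=True;";

        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand("SELECT @@VERSION;", connection))
        {
            connection.Open();
            Console.WriteLine(command.ExecuteScalar());
        }
    }
}
```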

As far as performance is concerned, I personally did R&D with both Sources/Destinations on large data sets, i.e. 1-5 GB, covering database-to-database, database-to-file, and file-to-database packages. The statistics from those tests say that OLE DB has an edge over ADO NET, giving roughly 60% better performance. Yes! The ADO NET Sources/Destinations process data more slowly than OLE DB. I would like to mention that I tweaked all the relevant options, like DefaultBufferMaxRows, DefaultBufferSize, and AutoAdjustBufferSize (for SSIS 2016), before reaching this conclusion, and my package-development experience corroborates it.

Hence, I recommend using the OLE DB Source/Destination for the time being, until Microsoft enhances the ADO NET providers. That said, you may need to use ADO NET in the case of Azure, which is becoming very common nowadays as more and more clients move their databases to Azure.

There are many other factors that affect package performance, which I am going to discuss in other blog posts. So, stay tuned!

Hope this helps you out.

 

Why Not to Use Sort and Aggregate Transformations in SSIS

I often come across questions about SSIS package optimization techniques. There are many, and they all depend on the scenario. However, there are a few fundamental principles for architecting SSIS packages that help us design packages that run faster. Here, I am going to talk about two transformations that I always avoid: the Sort transformation and the Aggregate transformation.

You can categorize all transformations into two broad categories, Synchronous and Asynchronous, or divide them into three types: Non-Blocking, Semi-Blocking, and Blocking. This isn’t the right place to discuss every category and type; I will be discussing only Blocking transformations here. Just for your information, Non-Blocking transformations belong to the Synchronous category, while Semi-Blocking and Blocking transformations belong to the Asynchronous category.

Blocking transformations first accept the entire input, i.e. all the rows from the source, then perform the assigned operation, and only then produce any output. During this process, you have to wait for the transformation to collect all the required rows before you get the output/desired result. This is what degrades Data Flow performance and, in turn, slows down the package.

Both the Sort and Aggregate transformations are Blocking transformations. This is why I always recommend not using them at all in your pipeline. Then the question arises: how do you sort the rows and, if required, compute aggregates?

There are ways! In the case of an OLE DB Source with SQL Server as the source, use a SQL query (not the table name from the drop-down list) and perform these operations in the query; they will then run on the database engine and return results very quickly (see the sketch below). In the case of a flat file or raw file, ask your team to calculate these values and put them either in another file or in the first one or two lines of the source file. I am sure you know how to skip initial lines while reading flat files; there is an option for it (Header rows to skip in the Flat File connection manager), so check it out! This is how you can avoid using Blocking transformations like Sort and Aggregate and, in turn, achieve better package performance.
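To make the idea concrete, here is a hedged sketch of the kind of source query I mean. The table and column names (dbo.Sales, CustomerId, Amount) and the connection string are made up for illustration; the query text itself is what you would paste into the OLE DB Source’s “SQL command” option instead of choosing a table name, and the C# wrapper simply shows it being executed through a provider.

```csharp
using System;
using System.Data.SqlClient;

class PushDownSortAndAggregate
{
    static void Main()
    {
        // Hypothetical table and columns for illustration. The GROUP BY and
        // ORDER BY run on the database engine, so the SSIS data flow receives
        // pre-aggregated, pre-sorted rows and needs no Sort or Aggregate
        // transformation.
        const string sourceQuery = @"
            SELECT   CustomerId, SUM(Amount) AS TotalAmount
            FROM     dbo.Sales
            GROUP BY CustomerId
            ORDER BY CustomerId;";

        var connectionString =
            "Data Source=localhost;Initial Catalog=MyDb;Integrated Security=True";

        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(sourceQuery, connection))
        {
            connection.Open();
            using (var reader = command.ExecuteReader())
            {
                while (reader.Read())
                {
                    Console.WriteLine($"{reader["CustomerId"]}: {reader["TotalAmount"]}");
                }
            }
        }
    }
}
```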

I will be blogging about other techniques to optimize SSIS packages. So, stay tuned!