Quick Track: Beginner’s Guide to Azure SQL - Authentication and Access Management

Picture by George Becker

Authentication and authorization are the two concepts we need to get familiar with when learning about access management.

Authentication is proving that we are who we say we are. Below are the authentication methods available in Azure.

SQL Authentication: Both Azure SQL Database and Azure SQL Managed Instance (MI) support SQL authentication. This works just like any regular SQL authentication account, with a username and password. When you create the server, you create a SQL authentication account that becomes the server admin and has database owner permissions on all the databases. This account can also create other SQL authentication accounts on the server.
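To make this concrete, here is a minimal T-SQL sketch of creating a SQL authentication login and a mapped database user (the names and password below are hypothetical):

```sql
-- Run in the virtual master database:
CREATE LOGIN app_login WITH PASSWORD = 'StrongP@ssw0rd!';

-- Then run in the user database to map a user to that login:
CREATE USER app_user FOR LOGIN app_login;
```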

Azure Active Directory: Azure Active Directory authentication stores identities and database user permissions in a centralized location. Because the password is stored in only one place, it becomes easier for administrators to manage identities and maintain passwords. This authentication is supported for Azure SQL Database, Managed Instance (MI), and Azure Synapse Analytics. The Azure AD admin becomes the server admin, and only the Azure AD admin can create other Azure AD accounts with permissions to the Azure SQL database. Multi-factor authentication (MFA) is available for this option; it can confirm your identity through a phone call, text message, or an authenticator app. This authentication type uses contained database users to connect to the database and access database objects.

Azure AD supports several flavors: password authentication, with a login ID and password; integrated (single sign-on) authentication, used when a trust is established between the on-premises Active Directory and Azure AD through Windows authentication; universal/interactive with multi-factor authentication, where a one-time password is generated dynamically; and token authentication, where a token is generated from Azure AD and passed to SQL. Source: here.

One important point to remember: the Azure AD admin has to create the Azure AD logins, and Azure AD logins cannot create SQL authentication logins. The types of identities Azure AD supports are individual user accounts, group accounts, and service principals.

To create an Azure AD login, run the command in the virtual master database. Please see sample code to create an Azure AD login here.
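As a quick illustration (not the linked sample itself), here is a minimal sketch; the account name is hypothetical:

```sql
-- Run in the virtual master database while connected as the Azure AD admin:
CREATE LOGIN [dba@contoso.com] FROM EXTERNAL PROVIDER;
```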

Let’s learn a bit about the differences between logins and users in Managed Instance and Azure SQL Database.

Managed Instance: You can create an Azure AD server admin along with the SQL server admin. You can create SQL logins even when Azure AD-only authentication is enabled, but you cannot connect with them until you enable SQL authentication. You can create SQL or Azure AD logins, database users, and contained database users. You can also create a server trust group for distributed transaction scenarios between two instances.

Azure SQL Database: You can create an Azure AD server admin along with SQL logins. Additionally, you can use roles like loginmanager and dbmanager for limited server administration. You can create users associated with SQL logins, and create contained users, including Azure AD contained users. To create Azure AD contained users, you must be logged in as the Azure AD server admin.
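For example, here is a minimal sketch of creating an Azure AD contained user (the name is hypothetical, and you must be connected as the Azure AD server admin):

```sql
-- Run in the user database, not in master:
CREATE USER [analyst@contoso.com] FROM EXTERNAL PROVIDER;

-- Grant least-privilege access through a fixed database role:
ALTER ROLE db_datareader ADD MEMBER [analyst@contoso.com];
```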

There are two configuration options for authentication:

Cloud-only, where you create Azure AD accounts without connecting them to on-premises servers; or a hybrid setup, where you can federate services with single sign-on, have Azure AD store the password as a hash, or use pass-through authentication with single sign-on, where the password stays on-premises.

Azure SQL supports SQL authentication, Azure AD authentication, or Azure AD-only authentication (which disables SQL authentication automatically).

Azure Role-Based Access Control (RBAC) defines what a user can and cannot do. Azure provides built-in roles that we can use to assign permissions to users and groups, all the way from the subscription down to individual resource groups and resources. These permissions can be applied at each level of the resource hierarchy.

There are three built-in RBAC roles to remember:

Owner: Can do everything, including assigning access to other users.

Contributor: Can modify resources but cannot grant access to other users.

Reader: Can only read the assigned resources.

Custom roles can also be created. Once these roles are created, you can assign users to them.

For SQL permissions, you can use the built-in roles or custom roles, making sure you grant the least privileges needed. In Azure SQL Database, the Azure AD user is mapped to a database user, which is assigned to database roles that carry the permissions on the database. With Azure SQL Managed Instance, the server-level login is associated with a database user, which is assigned to database roles.
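A minimal sketch of that least-privilege chain, using a hypothetical custom role and the contained user from earlier:

```sql
-- Create a custom database role that carries only the permissions needed:
CREATE ROLE reporting_reader;
GRANT SELECT ON SCHEMA::dbo TO reporting_reader;

-- Map the database user to the role:
ALTER ROLE reporting_reader ADD MEMBER [analyst@contoso.com];
```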

The other important thing to remember: when the owner of a table and of a view over it is the same, we can grant the user permission on the view instead of on the table. For security reasons we grant access only to the view, yet the user can still read the table through the view without having direct access to the table. This is called ownership chaining.
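A small sketch of ownership chaining (table, view, and role names are hypothetical); because dbo owns both objects, SELECT on the view alone is enough:

```sql
CREATE TABLE dbo.Salaries (EmployeeId int, Salary money);
GO
CREATE VIEW dbo.SalarySummary
AS
SELECT EmployeeId FROM dbo.Salaries;   -- hides the sensitive column
GO
-- Grant access to the view only; no grant exists on dbo.Salaries,
-- yet the user can read through the view thanks to ownership chaining.
GRANT SELECT ON dbo.SalarySummary TO reporting_reader;
```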

Row-level security: Securing the data by providing access only to the rows that a user is allowed to see. This falls under least-privilege access as well.

To implement row-level security, we create a SQL object known as a security policy, which is applied to the table; it filters the rows and returns only the information the user is allowed to see.
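A minimal sketch of a security policy, assuming a hypothetical dbo.Sales table with a SalesRep column that matches the database user name:

```sql
CREATE SCHEMA Security;
GO
-- Inline table-valued function used as the filter predicate:
CREATE FUNCTION Security.fn_filter_by_rep (@SalesRep sysname)
RETURNS TABLE
WITH SCHEMABINDING
AS
RETURN SELECT 1 AS allowed WHERE @SalesRep = USER_NAME();
GO
-- Bind the predicate to the table; non-matching rows become invisible:
CREATE SECURITY POLICY Security.SalesFilter
ADD FILTER PREDICATE Security.fn_filter_by_rep(SalesRep)
ON dbo.Sales
WITH (STATE = ON);
```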

Aside from the minimal sketches above, I did not go deeper into code examples in this post because we already have several resources showing how to create logins and users. Please check here and here.

In the coming blog post, I will be writing about threat protection.

Thanks for reading!

Resources:

Azure SQL Security: Understanding Access and Authentication (Ep. 2) | Data Exposed

Quick Track: Beginner’s Guide to Azure SQL - Network Security Concepts

Let’s learn a bit about Security in this post.

Securing something you built is as important as creating it. Any resource you create needs in-depth layers of protection to keep it safe from prying eyes.

Let’s take the analogy of a bank where you keep your money and other valuables. From the parking lot of the bank to the door of the vault where the money is stored, there are multiple layers of security: cameras, digital locks, padlocks, and more locks before you reach the money. This is defense in depth. Breaking through one layer only leads to another layer that is even more difficult to crack.

The same goes for data. Whether you are on-premises or in the cloud, data needs to be protected, no exceptions.

Securing data in Azure is an important task, and there are different security layers available. The diagram below shows the different layers of security in Azure that stand between the outside world and the customer data.

In this post, let’s focus on the Network security.

Source: https://docs.microsoft.com/en-us/azure/azure-sql/database/security-overview?view=azuresql

Network Security

The table below gives an idea of the different network security options available and the differences among them.

| Allow access to Azure services | Firewall rules | Virtual network rules | Private link |
| --- | --- | --- | --- |
| Least secure | Secure with firewall | Clients use the private IP address of the Azure VM | Most secure |
| Anything in Azure can connect | Services with added firewall rules can connect | Any resource in that VNet can access Azure SQL | Blocks all public access |
| Outside of Azure cannot connect | Allows on-prem connections | Apply virtual network technologies to connect on-prem | Private IP address within a specific virtual network |
| Public IP address | Public IP address | Private IP address within the VNet | Private IP address |
| DNS hierarchy is public | DNS hierarchy is public (region, cr) | DNS hierarchy is still public (region, cr) | DNS hierarchy shows the private endpoint |

In any region hosting Azure SQL databases, clients connect through gateways that have public IP addresses. Gateways are the intermediate layer between the client and the Azure SQL database. A client can connect through these gateways in two ways.

Proxy: We can understand this with a simple analogy. You want to visit your friend’s apartment frequently, but at the security gate the watchman asks you several questions, and you have to prove your identity every single time you visit. That is proxy mode. Here, DNS resolves to one of the multiple gateways, and traffic flows through port 1433. Every time the client connects, the gateway acts as a proxy between the client and the database. Proxy mode is the default if you are connecting from outside Azure.

Redirection: You want to visit your friend’s apartment frequently, but this time, once you prove your identity at the entrance, you receive a key to the apartment. With the key, you can go directly to the apartment without the intermediate security check at the gate. In redirection mode, a redirect token is issued when the client first connects to the gateway, and subsequent connections to the database no longer pass through the gateway. The redirect token contains the IP address of the database node and a port in the range 11000 to 11999. Redirect mode is the default if you are connecting to the database from within Azure.

When logically grouped together, these gateways are known as data slices, which are used for load balancing.

Let’s learn about each of the network security layers:

1. Allow access to Azure services: If you enable this option, anything in Azure can connect to your database: any Azure resource, from any subscription and any region. On-premises servers cannot connect using this option. Under the hood, this is a firewall rule whose start and end IP addresses are all zeros.

2. Firewall rules: These work like the firewall rules you create on on-premises servers. By adding client IP addresses to the firewall rules, you allow those clients to connect to the Azure SQL database. On-premises servers can connect to Azure using this option. (See the T-SQL sketch after this list.)

3. Virtual network rules: Clients from specific virtual network subnets can connect to Azure SQL using their private IP addresses.

4. Private Link: Private Link creates a private IP address (a private endpoint) inside your virtual network that connects to your database privately; public internet access can be blocked entirely. To create a private endpoint, you need to know which resource type the endpoint is for (an Azure SQL database, a storage account, and so on) and in which virtual network and subnet to create it.
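To illustrate option 2 from the list above, here is a minimal T-SQL sketch of server-level firewall rules (the IP ranges are hypothetical), run in the virtual master database:

```sql
-- Allow a range of on-prem client addresses:
EXECUTE sp_set_firewall_rule
    @name = N'OnPremClients',
    @start_ip_address = '203.0.113.10',
    @end_ip_address   = '203.0.113.20';

-- "Allow access to Azure services" (option 1) is effectively the rule
-- whose start and end addresses are all zeros:
EXECUTE sp_set_firewall_rule
    @name = N'AllowAllWindowsAzureIps',
    @start_ip_address = '0.0.0.0',
    @end_ip_address   = '0.0.0.0';
```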

Here is the YouTube link you can follow to create a private link for your database.

In the next article, we will learn about the next security layer, access management.

Thanks for reading!

T-SQL Tuesday #158, That One Time Implementation!

Happy new year to everyone. This year’s January T-SQL Tuesday is hosted by Raul Gonzalez, who asks us to write about scenarios where we had to implement practices that are usually supposed to be avoided, rather than best practices. Thanks to Raul for bringing up this topic, as I have an interesting scenario related to replication, where I had to replicate data from SQL Server 2012 to SQL Server 2000. I know, SQL Server 2000 is an old, unsupported server.

According to Microsoft’s best practices, transactional replication is supported between versions that are at most two releases apart: from a SQL Server 2012 publisher, that means the same version or up to two versions higher (SQL Server 2012/2014/2016) or two versions lower (SQL Server 2008 R2/2008).

The scenario I faced was a little challenging. We had a SQL Server 2012 production server replicating data to a SQL Server 2000 server used for reporting. The reporting team was not ready to upgrade the SQL Server 2000 subscriber, because they would have needed to rewrite their entire application, which was written in VB6. They needed a strategy where the data could still be replicated without upgrading the server.

As I researched, I found that the versions were not officially compatible, but I planned to test the replication to see if it would somehow work. I set up transactional replication between SQL Server 2012 as the publisher and SQL Server 2000 as the subscriber, but found that during the initial snapshot, the ANSI_PADDING setting in the generated .sch files caused the distribution job to fail. The setting was OFF in the .sch files and needed to be turned back ON. Below is one of the errors I faced; I took the error reference from here.

“SELECT failed because the following SET options have incorrect settings: ‘ANSI_PADDING’. Verify that SET options are correct for use with indexed views and/or indexes on computed columns and/or query notifications and/or xml data type methods. (Source: MSSQLServer, Error number: 1934)
Get help: http://help/1934″

Fixing a .sch file manually is easy if only one file needs the ANSI_PADDING setting changed from OFF to ON, but there were many .sch files with the setting OFF. The most tedious part is that the error surfaces only one .sch file at a time: I had to dig into the error details to fix one file, then run the distribution job again to see the next error. I found a very handy PowerShell script posted by Asaf Mohammad in a Microsoft forum, which you can place as the last step of the snapshot job. Once the snapshot of the database is created and the .sch files are generated, the script searches all .sch files for the ANSI_PADDING setting and edits the files from OFF to ON. This happens automatically whenever a new snapshot is generated or the subscription is reinitialized, since reinitialization forces a new snapshot.

Once I fixed this error, I was able to replicate multiple databases from SQL Server 2012 to SQL Server 2000 by adding this job step to each database’s snapshot job. The PowerShell script solved the problem, and since then I have not seen any .sch file errors of this kind.

This was one of my real-world experiences where I had to go against recommended best practices and succeeded in setting up the replication, resolving the issues that came up along the way. I am looking forward to reading about other SQL family members’ experiences with such scenarios.

Thanks for reading!

Reflecting Back on Year 2022: My Highs and Lows

Reflecting back on 2022: it was a great year, with challenges, but I learned many things. The year 2022 brought some significant changes to my life. This blog post is a quick overview of my year.

Speaking:

I started my speaking journey back in 2020 with my debut session for New Stars of Data. Since then, I have continued speaking at different events, both local and international. I continued speaking in 2022 as well, though most of my presentations were virtual. I was honored to be selected for PASS Summit 2022 and decided to go in person to give my presentation for the first time ever.

To be very sincere, I was scared and worried about presenting in person for the first time at such a big conference, but I was blessed with wonderful attendees who listened patiently during my talk. My first ever in-person presentation was on the topic “Women in Technology: How to Become an Ally,” where I decided to be vulnerable and share my story, starting from my primary school days. Special thanks to Rie Merritt for reviewing my session and giving valuable feedback. Never in my dreams did I think my story would connect with the audience so well that it would move them to tears throughout the session. The first few minutes were nerve-wracking, and it could easily be heard in my voice, but as time passed, I felt more and more comfortable sharing my story. In addition, some of my friends, mentors, and well-wishers attended the session to support me. Thanks to Chris Wood, Jeff Iannucci, Alpa Buddhabhatti, Kevin Chant, and Martha Osorio for attending my session and supporting me with your presence; it had a positive impact on delivering my presentation successfully. Thank you so much.

Me presenting at “Women in Technology: Becoming an Ally” Session at PASS Summit 2022

I received very good feedback from the audience for this session. Later, I presented a technical session with my dear friend Surbhi Pokharna on the topic “Performance Monitoring, Tuning and Scaling Azure SQL Workloads,” which was successful as well; we received great feedback on that session too. I also participated in a panel discussion on Diversity, Equity and Inclusion along with my friend Jean Joseph, and gave a pre-recorded lightning-talk presentation, “10 Minutes into SQL Notebooks.”

For 2023, I am honored to be selected as one of the speakers for SQLBits 2023. I will present one technical session, “A to Z Azure SQL Security Concepts,” and a non-technical session, “Can an Introvert Become an International Speaker? If So, How?” with my dear friend Alpa Buddhabhatti. I will present both sessions virtually.

I will continue to present technical and non-technical sessions in the coming year, 2023.

Microsoft MVP Award

I was awarded the Microsoft Most Valuable Professional (MVP) Award on October 1, 2022. It is a great honor to be recognized as an MVP; it is a prestigious award from Microsoft. If you are not familiar with the details of the award, please look here.

Redgate 100

I am really honored to be recognized in this year’s Redgate 100 list of top online influencers, in the “ones to watch” category, for creating impact in the database community, and to be listed among well-known names from the SQL family.

PASS Summit 2022 Committee member

I was honored to be on the committee for PASS Summit 2022, working to improve diversity, equity, and inclusion at the Summit. It was a great experience working with the other committee members and with Redgate, who continuously take action and make an effort to improve DEI at the Summit.

Mentoring

Over the past year, it has been a huge honor to mentor other women from underrepresented countries. Coming from the same background, I can help them feel comfortable being vulnerable and sharing their stories. Moving ahead in environments where you can succeed, while also supporting and helping other women thrive in those same environments, is the most beautiful thing I can imagine. Mentoring is always a two-way street: I learn from my mentees, and it has been a great experience. I would like to continue mentoring in the coming years as well.

Blogging

I have been blogging about Query Store and Azure fundamentals over the past year. You can find the in-depth series of posts on Query Store here and the blogs related to Azure here. I have also been participating in the T-SQL Tuesdays. It has been a while since I continued my posts on Azure; I will have to be more consistent in posting them, and in 2023 I hope to write more Azure-related posts.

Women in Technology Volunteer and Co-Organizer of the Data Platform DEI User Group

I feel blessed to volunteer with the WIT user group, maintaining the women speakers page “Webinars by Women.” I gather the details of women speakers and the events they are speaking at in the near future, and upload the information to the portal. The page helps audiences find sessions to attend and helps event organizers who are looking for women speakers; it is a central location for finding most of the women speakers from the data platform world and the details of where and when they are speaking. I consider this a great opportunity to serve the WIT community, which is close to my heart. I highly suggest you follow the WIT page and the DEI user group page on Twitter to receive updates about upcoming sessions; following these handles keeps you notified and is also a kind gesture of support for women and diverse groups in the data platform world.

I am also serving as a co-organizer of the Data Platform DEI virtual user group with the amazing Tracy Boggiano and Jess Pomfret. I am so happy to work with both of them to organize the events and meetings for the DEI user group.

Co-Organizer for Microsoft Data and AI South Florida User Group

I am happy to serve as a co-organizer of the Microsoft Data and AI South Florida user group along with my dear friend Adriano.

Talking about my lows

Alright, all of the above were highs. Everyone goes through both highs and lows at certain points in their lives, and I would like to share my lows here. As you can see in one of my previous blog posts, I have gone through a lot of challenges, and regaining self-confidence is not a simple few-step process. It is a continuous journey, and if you fall into the trap of staying low for a long period of time, it will consume you and your time completely. There were several stretches where I felt low for a good amount of time. During those times, I did not feel like learning anything, writing blogs, or even speaking. I am sure everyone has periods in their lives when they get depressed or let past struggles drive their present.

To regain my power, I listened to and watched a lot of motivational videos on YouTube. I read “Can’t Hurt Me” by David Goggins, which helped me come out of the tough times in my life. I recently started listening to the audio version of the book again because I felt I was falling back. I also purchased David Goggins’s second book, “Never Finished,” and I am going to purchase the audiobook as well. I appreciate his straightforward approach and sincere suggestions (he uses cuss words, but I don’t mind them at all; they help me see where I am in my life in a raw way, without soft words babying me and telling me it’s okay to be down).

David Goggins is the sole reason for where I am today. If not for his book, I don’t know if I would even have realized what I was going through. I had been waiting for someone to save me and felt like a victim of abuse for many years. His book gave me the strength to face my own fears, and I finally stood up for myself after a long time.

So, I am good after reading the book, right? I should never look back at the past, and now that I have regained my self-confidence, I should be fine, right? Well, that is not the case. I need to constantly remind myself that I am capable whenever self-doubt creeps in, reminds me where I was in the past, and yells at me never to forget it. It is a tape that plays in my mind, pulling the past back and disturbing my present. We all face these situations at some point in our lives. When we do, we need to remember that we cannot change the past, but we can still choose to make things better in the future. Several things in life are out of our control, and no matter how many times we think about them, we can change nothing. What matters is figuring out how those exact negative circumstances can fuel a better future for ourselves. I am still in the process of knowing myself and fighting my own fears. It is a lifelong process.

If you are going through rough times in your life and are looking for someone outside yourself to save you, I am going to tell you the hard truth: no one is coming to save you. You need to save yourself by doing the work. I learned this from David Goggins’s books, and I highly suggest you read them to regain your power. If you think I am publicizing these books: yes, I am 🙂 and I am very proud to do so, because they are truly self-help books that changed my life.

As we come to the end of the year, I wish you all the best and good luck for 2023! Thanks for reading!

T-SQL Tuesday #157 – Monthly DBA Checklist

Thank you, Garry Bargsley, for hosting the December edition of T-SQL Tuesday!

Garry wanted us to write about a year-end checklist for database administrators. I would like to write about the checklist I prefer to follow every month of the year.

Some of the important items on my monthly checklist are as follows:

  1. Backup validation: Make sure your database backups are valid. You can do this by adding CHECKSUM when taking backups, verifying backups regularly with the VERIFYONLY option, and testing restores by actually restoring a backup in a different environment. Also run DBCC CHECKDB to be extra safe in confirming your databases are in a consistent state. (A T-SQL sketch follows this list.)
  2. Database monitoring: Make sure the monitoring tool you use is not missing any instances, that alerts are triggered properly, and that email filters are in place so you actually receive those alerts.
  3. SQL Server security updates: Make sure your servers are patched properly and that all services work after patching. I mention this because we recently found an issue where Master Data Services (MDS) could not connect to its database after a SQL Server security patch, applied on either the SQL Server hosting the MDS database or the MDS application server. We had to manually upgrade the database from the MDS Configuration Manager, and the user who runs this upgrade needs super user permissions in MDS. If you are planning a vacation, make sure the MDS-related servers are patched and the databases properly updated; if you are away and the other DBAs lack the permissions or the documentation for the procedure, it can be a disaster.
  4. Documentation, please: No matter how small the task and resolution steps might be, proper documentation of troubleshooting steps gives the DBA team peace of mind while resolving an issue. Make sure documentation is in place and its location is shared with all team members.
  5. Job failure notifications, and please, no success notifications: Make sure you receive only failure notifications, and disable any success notifications set up on the servers you built. Also make sure the job owners are valid.
  6. Check for password expirations: If AD accounts own any OS-level Task Scheduler jobs and those accounts’ passwords expire, you will receive continuous failure notifications for those jobs. Changing those passwords before going on vacation helps. I mention this because at a past company I saw inboxes bombarded with failure alerts due to password expiration.
  7. On-call: If you plan to take a vacation and your DBA team is small, make sure there is a backup for the on-call person, just in case. Having up-to-date contact information in Outlook and Teams, where everyone can access it, is crucial.
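Here is the backup validation sketch promised in item 1; the database name, logical file names, and paths are all hypothetical:

```sql
-- Back up with page checksums:
BACKUP DATABASE SalesDb
TO DISK = N'\\backupshare\SalesDb.bak'
WITH CHECKSUM, INIT;

-- Verify the backup media without restoring it:
RESTORE VERIFYONLY
FROM DISK = N'\\backupshare\SalesDb.bak'
WITH CHECKSUM;

-- On a test server, actually restore it and check consistency:
RESTORE DATABASE SalesDb_Test
FROM DISK = N'\\backupshare\SalesDb.bak'
WITH MOVE 'SalesDb'     TO 'D:\Data\SalesDb_Test.mdf',
     MOVE 'SalesDb_log' TO 'D:\Data\SalesDb_Test_log.ldf';

DBCC CHECKDB ('SalesDb_Test') WITH NO_INFOMSGS;
```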

These are some of my monthly checks. I am looking forward to reading posts from other SQL family members on the same topic; I am sure there will be plenty of activities to take note of.

Thanks for reading!

Don’t think you are ready yet? Do it anyway! Do NOT Self Reject!

This blog post is about speaking at New Stars of Data. As the title says, this post is for professionals who wish to become speakers but whose self-doubt is holding them back from taking that first step.

I totally get that feeling. I was in the same situation two years ago. Self-doubt was not just creeping in from time to time; it was part of who I was back then. I was going through depression due to the harassment I had faced at my previous company for years in a row. I had no self-esteem and had lost my self-confidence bit by bit as each year of harassment passed. While I was in the darkest dungeon of my life, I finally decided to stand up for myself. I knew no one was coming to rescue me; it was I who needed to walk toward the light out of that darkness. I knew it would take time, but it was not impossible. But how? Where would I get the strength and energy to move a step forward when I had every reason to feel like a victim of abuse and harassment and stay right where I was? I was tired of blaming my fate for my situation, and I remembered that I always have a choice in any situation life throws at me.

I then decided there should be a challenge I could take on to rebuild my self-confidence.

In the process of searching for a challenge, I opened a Twitter account only to follow Brent Ozar and news of his new blog posts. I saw a tweet that said something like, “If you are looking to speak at an event for the first time, New Stars of Data is the conference to look into.” When I first read that tweet, I never thought I would submit to this conference. For the next couple of days, I couldn’t sleep properly as the tweet kept popping up in my head, the thoughts screaming at me: “What if this is the challenge you are looking for? Speaking can get your self-confidence back. Think about it! If you miss this chance, you may not get another one.” These thoughts persisted for a couple of days. I knew in my heart that I was not ready, and that I never would be if I just relaxed and waited for the perfect time to start doing the things that were right for me to do.

I decided to take the first step in faith and submitted my abstract for New Stars of Data 1. I then prayed to God from the bottom of my heart that my session would not be selected; I was that scared, and my self-doubt was that big. God went against my will and my session was selected 🙂 It was a blessing in disguise. I knowingly put myself in an uncomfortable situation to test whether I could face the challenge; if I could, I knew it would only be for my own growth.

At the time of my abstract submission, I didn’t know that the New Stars of Data (NSOD) organizing team would pair me with a mentor. Ben Weissman and William Durkin are the organizers of the NSOD conference. I was blessed to have Deborah Melkin as my mentor. Her smile almost always calmed my tense nerves. Deborah and I had weekly meetings to prepare my presentation, and she was very patient with me throughout the process. She helped me build my presentation, and Andy Yun also helped me give it a proper flow. Both Deborah and Andy helped me build and review the presentation, and we had a dry run before the NSOD presentation; that rehearsal really helped in fine-tuning it.

On my big day at NSOD, Deborah moderated my session. Thanks to Ben and William for thinking above and beyond about the comfort of new speakers without us even asking for it. Because Deborah was a familiar person to me and my mentor, I was super comfortable delivering my presentation for the first time. My presentation went well, or at least not terribly, I believe, lol. After I completed my presentation, attendees were asked to provide feedback; please look at the feedback provided by the attendees below. It was such a great honor to present at NSOD and to receive such encouraging feedback.

This feedback helped me take the next step and present this session and other sessions at different events and conferences within the United States and at international conferences across the globe.

Since then, I have spoken at different user groups, SQL Saturdays, and local and international conferences. I have presented more and more to get my self-confidence back. It is a continuous process, and I am still in it: healing from trauma takes time. I have presented at more than 200 events to date. I presented as if I were obsessed with speaking, because the more I presented, the more inner peace I gained, and the more peace I had in my heart, the more confident I felt. I have mentioned this at many events, but I would like to mention it here again.

“At the darkest moments in my life, I have chosen Speaking as a way out. Speaking has become a therapy to me. It will be a therapy to me for the rest of my life!” -Deepthi Goguri

To all the professionals reading this who are interested in becoming a speaker for the first time, I highly suggest you submit your sessions to New Stars of Data 5. NSOD changed my life, and I am sure it will change yours for good as well.

Even though you think you are not ready yet, believe me, you will never be ready until you take the step and put things into action. Remember, experience will never come from wishing and dreaming of a better future; experience only comes with action. Go ahead and take the first step in faith. Do NOT self-reject.

After reading this blog post, I am confident you will be motivated to submit your sessions. With the same confidence, I wish you all the best and good luck with your submissions!

Thanks so much for reading!

T-SQL Tuesday #156 – Checklist for Better Production Code

Thanks to Tom Zika for hosting this month’s T-SQL Tuesday! Tom asked us to share tips for perfect production code.

Below are some of the tips I would like to share for better, production-ready code:

1. Production-ready code is always readable, with detailed comments about what changes are being made to the existing code. Detailed documentation is important, because this code is going to stay around for years. Why, when, and by whom the code was changed needs to be documented; I have seen these obvious details missed in production code. The code is here to stay; the developer who wrote it may or may not stay at the same company.

2. For best practices on commenting, I highly suggest you read Brent Ozar’s excellent post here.

3. Check execution times against DEV and TEST data. I have seen scenarios where the data sizes in the DEV and TEST environments are not the same as in production, which causes execution times to differ between the lower environments and production. It is always best to test your code in lower environments with the same volume of data as production.

4. Unit testing is important to find resource contention or concurrency issues ahead of time, before the code reaches production. Please find free unit testing tools here.

5. Handle errors in your code. You need procedures in place to handle and capture errors when they occur, and rollback procedures should be in place. Please read the details here; it is an excellent article on transaction handling in SQL Server. (A minimal TRY...CATCH sketch follows this list.)

6. Tune your code before it goes to production. Check for indexes you can create to improve performance, and check whether indexes with the same columns already exist before creating new ones; otherwise you may be creating duplicate indexes.

7. For better readability and style of your code, check these formatting tools here.

8. Give sensible alias names to columns in your code.

9. Test all the use cases. Do you remember a situation where code was deployed to production and failed, and everyone blamed the testers and developers for not checking every use case scenario? Thorough testing of all use cases is important for successful production deployments.

10. Do not run code you find online directly in production. I know, I know, you will say never! But even for code from a legitimate website you always use, please review the entire script and test it in a test environment before even thinking about using the same sample code in production.
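Here is the minimal TRY...CATCH sketch promised in item 5 (the table names are hypothetical; real code would add logging and retry logic appropriate to the application):

```sql
BEGIN TRY
    BEGIN TRANSACTION;

    UPDATE dbo.Accounts SET Balance = Balance - 100 WHERE AccountId = 1;
    UPDATE dbo.Accounts SET Balance = Balance + 100 WHERE AccountId = 2;

    COMMIT TRANSACTION;
END TRY
BEGIN CATCH
    -- Roll back whatever the TRY block left open, then re-raise the error:
    IF @@TRANCOUNT > 0
        ROLLBACK TRANSACTION;
    THROW;
END CATCH;
```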

These are some of the things you can check to confirm your code is ready for production deployment. I am looking forward to reading the posts from other SQL family members for this month’s T-SQL Tuesday!

Thanks for reading!

T-SQL Tuesday 154 – New Performance Features in SQL Server 2022

Thanks to Glenn Berry for hosting this month’s T-SQL Tuesday. Glenn wanted us to write about the new features we are excited about in SQL Server 2022. Please find the invitation here.

I am very excited about the performance capabilities in SQL Server 2022. When we talk about performance, you know how much the Query Store feature has helped in finding and resolving slow-performing queries since it was introduced in SQL Server 2016.

If you have already used the Query Store feature for your databases, you know it needs to be enabled at the database level. The primary use of this feature is to capture execution plans and execution statistics for your queries over time, so you can identify regressions between execution plans and use that information to fix the most troublesome queries easily.

Starting with SQL Server 2022, the Query Store feature is enabled by default when you create new databases. If you migrate databases from older SQL Server versions to SQL Server 2022, you still need to enable the feature manually.
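For a migrated database, enabling it is quick (a minimal sketch; the database name is hypothetical):

```sql
ALTER DATABASE SalesDb SET QUERY_STORE = ON;
ALTER DATABASE SalesDb SET QUERY_STORE (OPERATION_MODE = READ_WRITE);
```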

In SQL Server 2019 and earlier versions, there is no Query Store support for read-only replicas in availability groups, meaning query data is collected into Query Store only on the primary replica and not on the readable secondary replicas. In SQL Server 2022, query data is also collected into Query Store on the secondary read-only replicas of availability groups. Query Store hints can also be used in SQL Server 2022.
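As a small illustration of Query Store hints (the query_id value below is hypothetical; you would look it up in sys.query_store_query first):

```sql
-- Pin a hint to a query without changing its code:
EXEC sys.sp_query_store_set_hints
    @query_id = 42,
    @query_hints = N'OPTION (RECOMPILE)';
```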

The other feature is query optimization for parameter-sensitive queries. In SQL Server 2019 and earlier versions, the first time a query executes, its plan is saved in the plan cache, compiled for the parameter value used in that first execution. The next time the same query runs with a different parameter value, the initially cached plan is reused. If the parameter value from the first execution is not representative, the cached plan can contain resource-consuming operators that hurt subsequent executions with different parameters, because the same stored plan from the plan cache is reused.

In SQL Server 2022, multiple plans can be generated and cached for a query based on its parameter-sensitive values. Depending on the parameter value, the optimizer sniffs the value and uses the execution plan appropriate for it. This is a very useful feature, as it can solve most parameter-sensitive query issues by letting the optimizer choose the right plan for your query based on the sensitivity of the parameter value.
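Parameter Sensitive Plan optimization is tied to database compatibility level 160, where it is on by default; a minimal sketch of enabling it (the database name is hypothetical):

```sql
ALTER DATABASE SalesDb SET COMPATIBILITY_LEVEL = 160;

-- Run in the target database; ON is the default at level 160:
ALTER DATABASE SCOPED CONFIGURATION
    SET PARAMETER_SENSITIVE_PLAN_OPTIMIZATION = ON;
```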

I have not yet had a chance to test the performance capabilities of SQL Server 2022, but I am going to create a new demo for one of my upcoming sessions on the performance capabilities and Query Store enhancements in SQL Server 2022.

Thanks to Glenn Berry one more time for boosting our interest in trying out the new features in SQL Server 2022 and sharing the knowledge with the world.

I am looking forward to reading this month’s T-SQL Tuesday posts from other SQL family members and learning about their excitement and learnings from the new features in SQL Server 2022.

Thanks for reading!

T-SQL Tuesday #153: My Life changing event!

Thanks to Kevin Kline for hosting this month’s T-SQL Tuesday! Kevin wanted us to share about events that have changed our lives for the better.

Becoming a speaker has changed my life. I decided to become a speaker during the dark times in my life. Speaking has become a therapy to me; it has helped me get through the rough times.

I am always thankful to the conference New Stars of Data for giving me the opportunity and the platform to speak for the first time as a speaker. My journey as a speaker began there.

I have presented at many events since. Dataweekender was the second conference that gave me the opportunity to speak, and it is one of my favorite conferences as well.

As I spoke more, I gained part of my confidence back. It is a continuous process. I took the step and decided to speak at the lowest point in my life, and I can sincerely say that New Stars of Data was a life-changing event for me.

To anyone who wants to become a speaker but is held back by a lack of confidence: please rethink. Confidence only comes with practice.

I can say that very sincerely because I spoke at several events afterward purely to build my confidence. The more I presented, the more comfortable I became. Though I have only presented virtually to date, I will carry the same confidence I gained when I present in person.

Thinking about in-person events, I need to write a bit about the PASS Data Community Summit 2022. I am honored to be selected as a speaker at this year’s Summit, and I chose to present in person.

I am very nervous about presenting in person for the first time ever, and at the biggest data platform summit in North America at that. I believe nervousness is a good thing when you are presenting, because it forces you to prepare more and keeps you laser-focused.

As New Stars of Data gave me a life as a speaker, PASS Data Community Summit 2022 will definitely take me to the next level as a speaker, as I will be presenting in person for the first time.

Attending and presenting at conferences like SQLBits has been one of my best experiences. Attending events like EightKB is a great opportunity to learn about SQL Server internals.

I have attended other conferences as well and learned from the speakers there.

During my journey as a speaker, I have met new people along the way, and a few of them have become good friends for life!

I cherish all the moments…

I am very thankful to each and every organizer of the conferences and user groups, and to each person who helped me along the way, giving me opportunities and supporting me at every step of my journey as a speaker.

I would love to read other SQL family members’ posts on this topic and learn how attending or speaking at different conferences changed their lives.

Thank you for reading!

T-SQL Tuesday #152 – Rants

Thanks to Deborah for hosting the July edition of T-SQL Tuesday!

I would like to share a rant about a situation I faced while working on database server migrations a couple of years ago.

In the past few years, I have done several database and server migrations. As you know, there are many things that need to be taken care of during the testing phase before we do the actual migrations.

It is not simply the database that we migrate; all the related objects need to be migrated as well: the agent jobs, SSIS packages, and linked servers, and, most importantly, the data source connection information needs updating, not only in the database-related objects but also in the related applications.

If all the applications connecting to the database being migrated use the server name and database name in their connection strings, this process gets tough and tedious to update once the database is migrated to a different server. It is always advisable to use a database DNS alias in the connection strings instead of the IP address or the server name.

This also applies to report server data sources.

I know it is hard to make the change but if the change is good, it is worth the effort.

During several database server migrations, I have seen developers who felt more comfortable changing their application code to the new server name instead of taking the advice to use a database DNS alias. I know the amount of work that goes into finding every object referencing the server name; missing even one location can absolutely break the code and make a real mess.

I have seen many cases where SSIS packages or linked server connections failed after the databases were migrated because they were not updated. This problem can be avoided simply by using a database DNS alias instead of the server name: once you migrate the database, you only update the DNS entry to point to the new server environment. (A small linked server sketch follows.)
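Here is the small sketch mentioned above; the alias and host names are hypothetical. The linked server points at a DNS alias, so after a migration only the DNS record needs to change:

```sql
EXEC master.dbo.sp_addlinkedserver
    @server     = N'REPORTS',
    @srvproduct = N'',
    @provider   = N'SQLNCLI',
    @datasrc    = N'reports-sql.contoso.com';  -- DNS alias, not the machine name
```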

I know many companies already follow the standard of using database DNS aliases, but there are still others that don’t want to make the change and instead fix the problem when applications break.

I believe it is important not only to be able to fix a broken application somehow, but also to think about how effectively and simply you can avoid breaking the applications in the first place.

I am very interested in reading other SQL family members’ posts on this topic!

Thanks for reading!