T-SQL Tuesday #156 – Checklist for Better Production Code

Thanks to Tom Zika for hosting this month's T-SQL Tuesday! Tom asked us to share our tips for writing perfect production code.

Below are some of the tips I would like to share for writing better, production-ready code:

1. Production-ready code is always readable, with detailed comments describing what changes were made to the existing code. Detailed documentation is important because this code is going to stay around for years. Why, when, and by whom the code was changed all need to be documented. I have seen situations where these obvious details were missing from production code. The code is here to stay; the developer who wrote it may or may not stay at the same company.

2. For best practices on commenting, I highly suggest you read Brent Ozar's excellent post here.

3. Check execution times against the DEV and TEST data. I have seen scenarios where the data size in the DEV and TEST environments is nowhere near the size of the production data. That difference can make execution times in lower environments misleading. Whenever possible, test your code in a lower environment with the same amount of data as production.

4. Unit testing is important for finding any resource contention or concurrency issues ahead of time, before the code reaches production. You can find free unit testing tools here.

5. Handle errors in your code. You need procedures in place to capture and handle errors when they occur, and rollback logic should be in place as well. Please read the details here; it is an excellent article on transaction handling in SQL Server.
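As a minimal sketch of what that error and rollback handling can look like (the table and column names here are hypothetical, purely for illustration):

```sql
-- Minimal TRY/CATCH sketch with rollback (dbo.Orders is a hypothetical table)
BEGIN TRY
    BEGIN TRANSACTION;

    UPDATE dbo.Orders
    SET    Status = 'Shipped'
    WHERE  OrderID = 42;

    COMMIT TRANSACTION;
END TRY
BEGIN CATCH
    -- Roll back anything left open, then surface the error details
    IF @@TRANCOUNT > 0
        ROLLBACK TRANSACTION;

    SELECT ERROR_NUMBER()  AS ErrorNumber,
           ERROR_MESSAGE() AS ErrorMessage,
           ERROR_LINE()    AS ErrorLine;
END CATCH;
```

In real production code you would typically log or re-throw the error rather than just select it, but the shape is the same: every transaction opened in the TRY block has a matching rollback path in the CATCH block.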

6. Tune your code before it goes to production. Check for indexes you can create to improve performance, but first check whether existing indexes already cover those columns; otherwise you may end up creating duplicate indexes.

7. For better readability and a consistent style in your code, check out these formatting tools here.

8. Give sensible alias names to the columns in your code.

9. Test all the use cases. Do you remember a situation where code was deployed to production, failed, and everyone blamed the testers and developers for not checking every use case scenario? Thorough testing of all use cases is essential for a successful production deployment.

10. Do not run code you find online directly in production. I know, I know, you will say "never!" But even for code from a legitimate website you always use, read the entire script and test it in a test environment before even thinking about running that sample code in production.

These are some of the things you can check to decide whether your code is ready for production deployment. I am looking forward to reading the posts from other SQL family members for this month's T-SQL Tuesday!

Thanks for reading!

T-SQL Tuesday #154 – New Performance Features in SQL Server 2022

Thanks to Glenn Berry for hosting this month's T-SQL Tuesday. Glenn wanted us to write about the new features we are excited about in SQL Server 2022. Please find the invitation here.

I am very excited about the performance capabilities in SQL Server 2022. When we talk about performance, you know how much the Query Store feature has helped in finding and resolving slow-performing queries since it was introduced in SQL Server 2016.

If you have already used Query Store for your databases, you know it needs to be enabled at the database level. Its primary use is to capture execution plans and execution statistics for your queries over time, so you can identify regressions between execution plans and use that information to fix the most troublesome queries easily.

Starting with SQL Server 2022, Query Store is enabled by default on newly created databases. If you migrate databases from older SQL Server versions to SQL Server 2022, you still need to enable the feature manually.
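For those migrated databases, enabling Query Store manually looks roughly like this (the database name is hypothetical, and the options shown are a common starting point rather than a prescription):

```sql
-- Enable Query Store on a migrated database (name is hypothetical)
ALTER DATABASE [MyMigratedDb] SET QUERY_STORE = ON;

-- Optionally adjust how queries are captured
ALTER DATABASE [MyMigratedDb]
SET QUERY_STORE (OPERATION_MODE = READ_WRITE, QUERY_CAPTURE_MODE = AUTO);
```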

In SQL Server 2019 and earlier versions, there is no Query Store support for the read-only replicas in availability groups, meaning query data is collected into Query Store only on the primary replica and not on the readable secondary replicas. In SQL Server 2022, query data is also collected into Query Store on the read-only secondary replicas of availability groups. Query Store hints can be used in SQL Server 2022 as well.

The other feature is query optimization for parameter-sensitive queries. In SQL Server 2019 and earlier versions, the first time a query executes, its plan is saved in the plan cache based on the parameter value used for that first execution. The next time the same query runs with a different parameter value, the plan stored in the cache is reused. If the parameter value used for the first execution is skewed, the cached plan can contain resource-consuming operators that hurt subsequent executions with different parameters, because the same stored plan is reused.

In SQL Server 2022, multiple plans can be generated for a parameter-sensitive query and saved in the plan cache. At execution time, the optimizer sniffs the parameter value and uses the execution plan appropriate for it. This is a very useful feature, as it can solve most parameter-sensitivity issues by letting the optimizer choose the optimal plan for your query based on the parameter value.
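As I understand it, this Parameter Sensitive Plan optimization is tied to the SQL Server 2022 compatibility level (160) and can be toggled per database with a database-scoped configuration. A sketch, with a hypothetical database name:

```sql
-- PSP optimization requires the SQL Server 2022 compatibility level
ALTER DATABASE [MyDb] SET COMPATIBILITY_LEVEL = 160;

-- It is on by default at that level; run this in the database
-- itself to toggle it explicitly
ALTER DATABASE SCOPED CONFIGURATION
SET PARAMETER_SENSITIVE_PLAN_OPTIMIZATION = ON;
```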

I have not yet had a chance to test the performance capabilities in SQL Server 2022, but I am going to build a new demo for one of my upcoming sessions on the performance capabilities and Query Store enhancements in SQL Server 2022.

Thanks to Glenn Berry one more time for boosting our interest in trying out the new features in SQL Server 2022 and for sharing that knowledge with the world.

I am looking forward to reading this month's T-SQL Tuesday posts from other SQL family members and learning about their excitement over, and learnings from, the new features in SQL Server 2022.

Thanks for reading!

T-SQL Tuesday #153: My Life changing event!

Thanks to Kevin Kline for hosting this month's T-SQL Tuesday! Kevin wanted us to share about the events that have changed our lives for the better.

Becoming a speaker has changed my life. I decided to become a speaker during a dark time in my life. Speaking has become a therapy to me; it has helped me get through the rough times.

I am always thankful to the conference "New Stars of Data" for giving me the opportunity and the platform to speak for the first time. My journey as a speaker began there.

I have presented at many events since. Dataweekender was the second conference that gave me the opportunity to speak, and it is one of my favorite conferences as well.

As I spoke more, I gained part of my confidence back. It is a continuous process. I took the step and decided to speak at the lowest point in my life. I can sincerely say that New Stars of Data was a life-changing event for me.

To anyone who wants to become a speaker but is held back by a lack of confidence: please reconsider. Confidence only comes with practice.

I can say that very sincerely because I spoke at several events afterward purely to build my confidence. The more I presented, the more comfortable I became. Though I have only presented virtually to date, I will carry the same confidence into presenting in person.

When I think about in-person events, I need to write a bit about the PASS Data Community Summit 2022. I am honored to have been selected as a speaker at this year's summit, and I chose to present in person.

I am very nervous about presenting in person for the first time ever, and for the biggest data platform summit in North America at that. I believe nervousness is a good thing when you are presenting, because it forces you to prepare more and keeps you laser focused.

As New Stars of Data gave me my life as a speaker, PASS Data Community Summit 2022 will definitely take me to the next level, as I will be presenting in person for the first time.

Attending and presenting at conferences like SQLBits was one of my best experiences. Attending events like EightKB is a great opportunity to learn about SQL Server internals.

I have attended other conferences as well and learned from the speakers there.

During my journey as a speaker, I have met new people along the way, and a few of them have become good friends for life!

I cherish all the moments…

I am very thankful to every organizer of the conferences and user groups, and to each person who helped me along the way, giving me opportunities and supporting me at every step of my journey as a speaker.

I would love to read other SQL family members' posts on this topic and learn how attending or speaking at different conferences changed their lives.

Thank you for reading!

T-SQL Tuesday #152 – Rants

Thanks to Deborah for hosting the July edition of T-SQL Tuesday!

I would like to share a rant about a situation I faced working on database server migrations a couple of years ago.

In the past few years, I have done several database and server migrations. As you know, there are many things that need to be taken care of during the testing phase before we do the actual migration.

It is not simply the database that we migrate; all the related objects need to be migrated as well: the agent jobs, SSIS packages, and linked servers. Most importantly, the data source connection information needs to be updated, not only in the database-related objects but also in the related applications.

If you use the server name and database name in the connection strings of all the applications connecting to the database being migrated, updating them after the database moves to a different server gets tough and tedious. It is always advisable to use a DNS alias for the database in the connection strings instead of the IP address or server name.

This applies to the report server data sources as well.

I know it is hard to make the change but if the change is good, it is worth the effort.

During several database server migrations, I have seen developers feel more comfortable changing their application code to the new server name instead of taking the advice to use a DNS alias. I know the amount of work that goes into finding every object referencing the server name. Missing any one of those locations can break the code and create a real mess.

I have seen it happen many times: SSIS packages developers failed to update, or linked server connections that failed once the databases were migrated. This problem can be solved simply by using a DNS alias instead of the server name. Once you migrate the database, you only update the DNS entry to point to the new server.

I know many companies already follow the standard of using a DNS alias, but there are still others that don't want to make the change and instead fix the problem when applications break.

I believe what matters is not only fixing a broken application somehow, but how effectively and simply you can avoid breaking the applications in the first place.

I am very interested in reading other SQL family members' posts on this topic!

Thanks for reading!

T-SQL Tuesday 151: T-SQL Coding Standards I wish I knew Earlier!

Thanks to Mala for hosting this month's T-SQL Tuesday! Mala wanted us to write about some of our T-SQL coding standards.

Some of the coding standards I mention here are obvious, but I know how often I skip even the simple ones. I believe every bit of the simple things we do every day matters!

Here are a few quick but effective coding standards I wish I knew earlier:

1. Get your T-SQL code formatting faster than before

We all love writing code that we and others can read easily later, but I know formatting is time consuming, especially when you are focused on writing effective code that runs fast. My mindset while typing code: throw the formatting out the window, let me just type this and get the results I need. I know many of you have the same mindset. If you are just typing a few lines of code, that's totally fine, but building a huge stored procedure with complex logic needs formatting, not only to make it easier for others to read later but to help yourself along the way. Formatting doesn't have to be complex, thanks to free online tools like poorsql. This tool was introduced to me by one of my favorite SQL community champions, Randolph West. Thank you, my friend! You saved me many hours. There are other similar free tools as well, like sqlformat.

2. Error and Transaction Handling: Saved many Jobs out there

Don’t wait until you break something to figure out there was a way to avoid it in the first place. That was me at the beginning of my career; I learned it the hard way. These were my facial expressions at my computer as I wrote something that looked like code.

Source: Pexels

From capturing and handling errors to handling transactions the right way, knowing how to deal with errors and transactions is crucial when troubleshooting and modifying data. Learn about error handling here and transaction handling here.

3. Are you inserting a large number of rows as a scheduled SQL Agent job? This is for you

This is simple but effective. Add an update-statistics step as the next step in the agent job for the tables you just imported data into. With up-to-date statistics, the optimizer can choose the right operators for your execution plans.
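The extra job step can be as small as this (the table name is hypothetical; `FULLSCAN` is one option, a sampled scan may be enough for very large tables):

```sql
-- Refresh statistics on the table the previous job step loaded
UPDATE STATISTICS dbo.ImportedSales WITH FULLSCAN;
```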

4. Start using SQLCMD mode, you will thank me later

Have you ever faced a situation where you accidentally deleted rows in production instead of in the development environment? If yes, keep reading. Start using SQLCMD mode. After you enable this mode, begin your query with:

:CONNECT ServerName
USE DatabaseName
-- your query goes here

Make it a coding standard; it will help you remember which server you are running the code against, no matter what your query editor is connected to. If you need to run the same code on a different server, you just change the server name on the first line. At the beginning, I felt adding this line was kind of a pain, but once I got used to it, it became one of my best standards. As I execute any code these days, my eyes automatically go to the first line, looking for the server name in the :CONNECT statement. This becomes even more helpful when you want to document the code you run on different environments (DEV/TEST/QA/PROD). Learn about SQLCMD here.

5. Consolidate the number of Indexes: You could have duplicates

If you regularly implement the indexes recommended in execution plans, first check whether you already have an index containing the recommended columns, perhaps missing just one. If necessary, try to fit that additional column into the existing index, but do not just run the recommended index script as-is; you may already have a suitable index. Regularly check for duplicate indexes, which cost you a lot of resources to maintain.
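One quick way to spot near-duplicates is to list each index's key columns side by side. A sketch (needs SQL Server 2017+ for `STRING_AGG`; adjust filters to taste):

```sql
-- List index key columns per table so near-duplicate indexes stand out
SELECT  t.name AS TableName,
        i.name AS IndexName,
        STRING_AGG(c.name, ', ')
            WITHIN GROUP (ORDER BY ic.key_ordinal) AS KeyColumns
FROM    sys.indexes i
JOIN    sys.tables t         ON t.object_id = i.object_id
JOIN    sys.index_columns ic ON ic.object_id = i.object_id
                            AND ic.index_id  = i.index_id
JOIN    sys.columns c        ON c.object_id  = ic.object_id
                            AND c.column_id  = ic.column_id
WHERE   i.type > 0                 -- skip heaps
  AND   ic.is_included_column = 0  -- key columns only
GROUP BY t.name, i.name
ORDER BY t.name, KeyColumns;
```

Two indexes on the same table whose key column lists share the same leading columns are candidates for consolidation.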

6. Check the datatypes of the variables used in your queries

The variables you use in your WHERE clauses should have the same datatypes as the columns they are compared against. If they don't, the optimizer has to go through an implicit conversion as an additional step. This can hurt performance and prevent the query from using the indexes it should, so it takes more resources to execute, and it may be harder to find the reason later when the query runs slowly.
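A small illustration of the mismatch (table and column names are hypothetical; assume `LastName` is `VARCHAR(50)`). Comparing it to an `NVARCHAR` variable can force the column side to be converted, which tends to hurt index usage:

```sql
-- Hypothetical table dbo.Customers, where LastName is VARCHAR(50)

-- Mismatched: NVARCHAR variable vs. VARCHAR column; the column may be
-- implicitly converted, which can prevent an efficient index seek
DECLARE @LastName NVARCHAR(50) = N'Smith';
SELECT CustomerID
FROM   dbo.Customers
WHERE  LastName = @LastName;

-- Matched: same datatype as the column, no conversion needed
DECLARE @LastName2 VARCHAR(50) = 'Smith';
SELECT CustomerID
FROM   dbo.Customers
WHERE  LastName = @LastName2;
```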

These are some of the T-SQL coding standards I use on a regular basis. I am looking forward to reading other SQL family members' T-SQL Tuesday posts and learning their T-SQL standards!

Thanks for reading!

TSQL Tuesday #150: My First “Why am I even here?” Job

Thank you, Kenneth Fisher, for hosting the May edition of T-SQL Tuesday. Kenneth wanted us to write about our first technical job.

As I mentioned in my previous T-SQL Tuesday post here, I was a pharmacy graduate who loved technology. When I decided to change my career to IT, I applied for a master's degree in computer technology and finally made it to graduation.

After I completed my master's degree, I felt like I could easily do any job related to computers. I was wrong! I didn't know at the time that degrees don't matter.

I had my initial struggles finding a DBA job, as everyone was looking for experienced candidates who fit their requirements. I took many interviews and failed. Then one day, I got an offer for a DBA position at a company (which I am not naming here) after interviewing for a junior DBA role. I was on cloud nine.

Everything was cool, right? No, not as I thought it would be.

On my first day at the company, they gave me an ID card with my happy face on it, and under it, my job title: ".NET Developer".

Wait, what?? A developer?

Well, I was in the same shock as you and didn't really know what to say. I tried contacting management to find out why I had been placed in a completely different position when they interviewed me for a DBA role. I did not receive any useful information from them. Now you know why I am not naming the company 🙂

I was angry in the beginning but thought I could learn and survive. You may ask, how can you survive when you know nothing about that technology? Remember, it was my first tech job ever. I was scared and worried about whether I would find another job soon if I resigned. I tried to understand and learn the code, but it was very hard to catch up. I knew that job was not for me; I had always wanted a job as a SQL DBA.

I am not an accidental DBA but a planned one who ended up in an accidental developer position 😛

I kept looking for a DBA job while still working in that developer position, and fortunately, it did not take long before I was placed as a database administrator at another company.

Since then, I have been a SQL DBA who loves working with SQL Server and databases.

As a woman in tech, I have faced challenges in my career, but I never thought of moving my career out of IT. That's what I always loved, and that was the main reason I changed my career from pharmacist to SQL DBA.

From the core of my heart, I have always believed "everything happens for a reason." That mindset helped me get through the initial challenges I faced moving my career into IT.

All of my past experiences have taught me and helped me get to where I am today as a woman in tech.

If I turn back and see, I know I made the right decision 🙂

Image source: https://in.pinterest.com/pin/514817801127582057/

Thanks so much for reading!

T-SQL Tuesday #149: Best SQL advice I would give to my younger self!

Thank you, camilahenrique, for hosting the April edition of T-SQL Tuesday. You can find the invitation here.

I give a big thank you to my younger self for changing my career from pharmacist to SQL DBA. I have loved computers since my childhood. You may ask, why did you choose to do pharmacy then? It's a long story! I was just an innocent kid fulfilling my father's dreams. By the time I realized pharmacy was not for me, I had already graduated as a pharmacist. I came to the United States to finally pursue my dreams and change my career. I joined a master's program in computer technology at one of the great universities in the United States. That was a major life-changing moment for me, since I did not have much computer knowledge before. One of my core subjects was SQL. That's when I was introduced to SQL and SQL Server, and I can tell you with my whole heart that at that moment I knew the decision I had made was right. Though it was very tough in the beginning to catch up with the speed of the other students, I learned the T-SQL basics, and from there my computer journey started. I was stressed out in many situations and started slow. I thought I would never make it to graduation. No matter how much I learned, I made mistakes and was never perfect. I spent late hours learning the basics so I would at least pass the exams. My master's degree was a roller coaster ride, but I finally made it to graduation 🙂

If I could give one piece of advice to my younger self, I would say that learning is a continuous process and we are all perfectly imperfect. Failure happens only when you stop being willing to learn. There will never be a perfect moment to start learning anything. No matter what you learned in the past or what degree you pursued, that doesn't have to define your future goals. Anyone can learn anything if we have the heart and the will to work hard at it.

Learning the basics of T-SQL is really important before moving on to complex T-SQL coding. I would like to share my favorite YouTube channel, which helped me when I started my T-SQL journey. If you are starting to learn T-SQL, follow this series from Kudvenkat.

I am looking forward to reading all the posts for this T-SQL Tuesday and learning from the SQL family!

Thanks for reading!

T-SQL Tuesday #147 – Upgrade Strategies

Thanks to Steve Jones for hosting the February edition of T-SQL Tuesday! This month's T-SQL Tuesday is all about database upgrades.

When I started my first job as a DBA seven years ago, my project was to migrate several SQL Servers, and all of them were running SQL Server 2000. In my first SQL class at school, I had started learning on SQL Server 2012, so it was a shock to work on SQL Server 2000 databases (I was not familiar with SQL Server 2000 yet), especially in my first job as a DBA.

My first project was to migrate approximately two hundred and fifty SQL Server 2000 instances to SQL Server 2012/2016. It took us a couple of years to migrate all of these servers successfully.

I would like to answer the below questions from Steve Jones for this T-SQL Tuesday:

Why do we wait to upgrade?

Fear! Fear of breaking things or hurting performance, I believe, especially when you are making a big jump between SQL Server versions. Old legacy applications may or may not support newer SQL Server environments, and rebuilding an application can take years, involve more developers, and cost more. I believe many companies are still using older versions of SQL Server even after the end of support. Some companies are fine staying on legacy systems as long as their applications work.

Strategies for testing an upgrade

When I migrated SQL Server 2000 databases, there was a lot of work to be done even before planning the testing. There was deprecated T-SQL syntax, such as the old-style joins, that was no longer supported in SQL Server 2008. Legacy DTS packages needed to be converted or rebuilt to be compatible with the newer version of SQL Server. Using the Upgrade Advisor and the Migration Assistant tool makes it easy to capture the incompatible objects, which then need to be modified ahead of time to make them compatible. If developers modify any of those objects in the meantime, capturing those changes is important. Testing can be done using backup and restore; you can skip SQL Server 2005 and upgrade a database directly from SQL Server 2000 to SQL Server 2008. Running the compatibility-fix script against a SQL 2000 test database is recommended, so the Upgrade Advisor can be run over it again to confirm no compatibility issues remain. Once the database is on SQL Server 2008, there are far fewer deprecated T-SQL changes, so further upgrades are easy.

Testing the SSIS packages migrated from legacy DTS packages is important. Once the database is upgraded in the test environment, rigorous testing for any code breakage, along with application testing, is important.

Smoke tests or other ways to verify the upgrade worked

I believe this depends upon the database and the type of application. When you have databases with the same schema on several similar servers, testing one database upgrade can give you confidence that the other upgrades will work, though there are always caveats in these cases as well.

Moving to the cloud to avoid upgrades

Many companies are moving to cloud infrastructure because there is less maintenance compared with on-prem environments, and to lower their costs. I believe moving to the cloud is one option, and the reason may not only be to avoid upgrades but also to improve performance and reduce costs.

Using compatibility levels to upgrade an instance but not a database

You can leave a database in an older compatibility mode even after upgrading it to a newer version of SQL Server such as SQL Server 2019, but then the features enabled only at the latest compatibility level cannot be used. Some features, like Query Store, can still be used in older compatibility modes (down to the SQL Server 2008 level) even after upgrading. Though this option is available, it is not recommended to leave a database in an older compatibility mode for a long time.
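Checking and changing the level is a one-liner each way (the database name is hypothetical; 110 is the SQL Server 2012 level):

```sql
-- Check the current compatibility level
SELECT name, compatibility_level
FROM   sys.databases
WHERE  name = 'MyDb';

-- Keep the database at the SQL Server 2012 level after the upgrade,
-- then move it to the newer level once testing is complete
ALTER DATABASE [MyDb] SET COMPATIBILITY_LEVEL = 110;
```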

Checklists of things to use in planning

  1. Using tools like the Migration Assistant will help identify the list of incompatible objects that need to be modified to make the code compatible with the latest versions of SQL Server.
  2. Capacity planning
  3. Gathering the deprecated and unsupported feature list for the version of SQL Server you are moving to is important.
  4. Making sure the SSIS packages are tested properly.
  5. Gathering the logins and associated permissions (I used sp_hexadecimal and sp_help_revlogin)
  6. Making sure to take a backup of the database before migration and to keep a copy of the database on the source server for a quick rollback if needed.
  7. Testing is needed if additional features like replication are being used. Make sure the publisher and subscriber are on versions of SQL Server that support replicating to each other.
  8. Making sure to script out the linked servers.
  9. Capturing a performance baseline is an option to choose if you would like to use Query Store. After upgrading the database to the new version of SQL Server, place it in the same compatibility level it had before the upgrade. This helps capture plan regressions caused by the upgrade; you can easily compare the regressed plans and fix performance issues quickly. To learn more about capturing baselines using Query Store, please see my blog post here.
  10. The method you plan to use to migrate a database is important. It depends on how much downtime is acceptable for your database and application. Some of the methods are backup and restore, detach and attach, or using full, differential, and transaction log backups for less downtime.

These are a few things to consider during planning. This list does not contain all the planning information, but it gives an overview.

The time it takes to upgrade your environment

I believe the time it takes to upgrade a database depends on how large it is, how many dependent features need to be moved (replication, for example), whether the application needs any changes, and which method is used to restore the database on the destination (copying files to the destination server can take time if the database is huge).

What do you evaluate in making a decision to upgrade or not?

I believe checking which new features your database and application will use after the upgrade is important. Testing is a crucial part of evaluating the SQL engine, database features, and performance benefits after the upgrade. The newer cardinality estimator provides better estimates, which can help the SQL engine create an optimal plan. Many other features have been added to SQL Server over the years to improve the performance of queries and workloads.

These are some of my thoughts about database upgrades. I would like to read about and learn from the experiences of other SQL family members.

Thanks for reading!

T-SQL Tuesday #146 Preconceived Notions

Thanks to Andy Yun for hosting this month's T-SQL Tuesday. Andy's challenge for this month is to share learnings that changed our preconceived opinions.

There have been many learnings throughout my career as a DBA, but in this post I would like to share something interesting I learned in recent months from Pam Lahoud about tempdb.

I always assumed that changing the schema of a temp table or adding indexes after the temp table is created doesn't hurt anything, but it is actually not a good idea. Yes, you heard that right. If you alter a temp table or add indexes to it after creation within a stored procedure, the table will be marked for deletion and its cached metadata cannot be reused. It is advisable to create the index as you create the temp table, using the inline index syntax.
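Side by side, the two patterns look like this (table and index names are made up; the inline syntax requires SQL Server 2014 or later):

```sql
-- Pattern to avoid inside a stored procedure: index added after creation,
-- which prevents the temp table's cached metadata from being reused
CREATE TABLE #Orders (OrderID INT, CustomerID INT);
CREATE INDEX IX_Orders_CustomerID ON #Orders (CustomerID);

-- Preferred: declare the index inline at creation time
CREATE TABLE #Orders2
(
    OrderID    INT,
    CustomerID INT,
    INDEX IX_Orders2_CustomerID NONCLUSTERED (CustomerID)
);
```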

Please read the full article here, written by Pam Lahoud, to learn in detail how this best practice can help reduce tempdb metadata contention. Pam also covered other best practices in a detailed, in-depth presentation on tempdb internals at the EightKB conference. Here is the link to it.

Also, do not forget to read this wonderful article by Erik Darling on indexing temp tables. Erik explains in detail when you can actually create indexes on temp tables using the inline index syntax, when it is useful, and the cases where it is not an option.

I am looking forward to reading and learning from other SQL community members about what their preconceived notions are.

Thanks for reading!

T-SQL Tuesday #145: Pandemic, Presentations and Perspectives!

Thanks to Xavier Morera for hosting the December edition of T-SQL Tuesday!


I have never experienced what an in-person conference feels like. I started my speaking career during the pandemic, through the virtual event New Stars of Data, back in 2020.

I see different perspectives on virtual versus in-person events. To be very sincere, I love virtual events for many reasons. They have given me and other new speakers a chance to speak at conferences worldwide without having to travel.

As I have never experienced an in-person event, I am really looking forward to attending one in the near future.

For this T-SQL Tuesday fun, Xavier asked a few questions for us to answer.

  1. Which is your favorite conference and why?
  2. Which is the best venue that you have visited for a tech conference?
  3. Who is the best presenter that you have ever listened to?
  4. Which location would you like for your next event to take place and why Costa Rica?

Which is your favorite conference and why?

My favorite conference will always be the one that gave me the stage and a chance as a new speaker: New Stars of Data.

I cannot stop at one conference 🙂 My next favorite is DataWeekender, which is also an online data platform event. PASS Data Community Summit 2021 and Data Platform Virtual Summit 2021 are equally my favorites, along with New Stars of Data and DataWeekender. If I listed all of my favorite conferences, the list would definitely be big. All of the above conferences encourage diversity and give new speakers a chance to share the same stage as experienced speakers. New Stars of Data, organized by the DataGrillen team, is especially dedicated to new speakers; their support for new speakers is just incredible. The DataGrillen team also organized Dativerse, specially dedicated to speakers from underrepresented communities. They also started Dataminutes, a ten-minute lightning talk conference, and now the New Stars of Data Speaker Improvement Library. The DataGrillen team continuously comes up with new ideas to support the community in every way possible.

The DataWeekender team has a style that I love. I just love the way they organize and advertise the event in the most entertaining way possible, with an amazing logo. They encourage new speakers from diverse backgrounds and support them.

I am thankful to PASS Summit 2021 and DPS 2021 for encouraging new speakers from diverse backgrounds to submit to their conferences and for giving them incredible support and a chance to share the stage with the very well-known speakers out there. For many new speakers, it is a dream to speak at big events. Because these conferences were virtual, many professionals used this wonderful opportunity to challenge themselves and take their first step in their speaking journey. If you count the number of new speakers to come out of the SQL community in the past two years, you will be truly amazed. I believe we need to keep online conferences alive even as we return to in-person conferences. This will help people who cannot afford to travel to other places or countries for various reasons (family situations, young kids, visa waiting processes, etc.) but are still interested in presenting and learning from other speakers.

Which is the best venue that you have visited for a tech conference?

I have never attended an in-person event before 😛

Who is the best presenter that you have ever listened to?

That’s a really tough question. Every speaker is special in their own way. But one speaker actually inspired me to start thinking about speaking: Brent Ozar. His valuable suggestions in one of his podcast interviews helped me think about becoming a speaker. I cannot thank Brent enough for his continuous support of the community. Kudos to you, Brent.

Which location would you like for your next event to take place and why Costa Rica?

If my first in-person event were in Costa Rica, how amazing would that be? I mean, look at the picture below from Tortuguero ("Land of Turtles"). Who would not love to spend their time watching the sunrise and sunset here?

The picturesque beaches of Tortuguero, Costa Rica during sunset
Source: https://www.nomadicmatt.com/travel-blogs/my-favorite-destinations-in-costa-rica/

Thanks again Xavier for hosting this month of T-SQL Tuesday!