In my DBA Best Practices training, I also cover the storage internals, which play a vital role in the I/O performance of every application and every ad-hoc user query. A database has exactly one MDF (primary data file) and at least one LDF (transaction log file), and can have one or more NDFs (secondary data files). Spreading these files across separate drives is one way to improve application performance. All the data stored in tables ultimately lives in these files, and SQL Server gives you the flexibility to manage that data through both physical and logical components.
As you can see in the slide below, data in the MDF and NDF files is organized through the filegroup concept, where each database file is assigned to a filegroup. The slide also covers extents and their two types (mixed and uniform).
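To make the filegroup concept concrete, here is a minimal T-SQL sketch that creates a database with a secondary filegroup whose NDF sits on a separate drive. All database, filegroup, and path names here are hypothetical placeholders, not from the slide:

```sql
-- Hypothetical example: one MDF on the primary filegroup, one NDF on a
-- secondary filegroup placed on a different drive, and the LDF on a third.
CREATE DATABASE SalesDB
ON PRIMARY
    (NAME = SalesDB_Data,    FILENAME = 'D:\Data\SalesDB.mdf'),
FILEGROUP FG_Archive
    (NAME = SalesDB_Archive, FILENAME = 'E:\Data\SalesDB_Archive.ndf')
LOG ON
    (NAME = SalesDB_Log,     FILENAME = 'F:\Log\SalesDB.ldf');

-- A table can then be placed on a specific filegroup:
CREATE TABLE dbo.SalesArchive (OrderID INT, OrderDate DATETIME) ON FG_Archive;
```

Placing the heavily written table on its own filegroup/drive is what lets the storage layout contribute to performance, as described above.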
When a performance bottleneck occurs, every DBA should start with an initial analysis of why and where the performance issue was triggered. I cover these topics in my DBA Best Practices training. The process should consist of monitoring tools, a monitoring plan, and then the performance analysis itself, as the slide below shows.
In the monitoring tools section, I cover the items shown in the slide below: SQL Trace, Profiler, STATISTICS IO, trace flags, system stored procedures, DMVs and DMFs, DBCC, the Database Engine Tuning Advisor (DTA), and the Performance and Activity Monitors.
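As a small taste of the DMV/DMF category from that list, the following sketch pulls the currently executing requests together with their SQL text. The column selection and the session-id filter are just illustrative choices, not part of the slide:

```sql
-- Currently running requests joined to the text of the statement they run
SELECT r.session_id, r.status, r.wait_type, r.cpu_time, t.text
FROM sys.dm_exec_requests AS r
CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS t
WHERE r.session_id > 50;  -- skip most system sessions
```

A query like this is often the fastest first look at "what is the server doing right now" before reaching for Profiler or a trace.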
In the monitoring plan, I focus mainly on establishing a performance baseline, for which a DBA should have a clear understanding of the environment, client applications, database applications, network architecture, available system resources, and so on.
We all work with data every day, but does everyone need to access every single piece of data daily? Mostly it is a big NO. We rarely need to see what happened three years ago, except perhaps for the occasional comparison.
When your company asks about the current week, last week, this month, last month, or this quarter, it is asking about the latest data. This frequently accessed, recent data is what we refer to as HOT data.
In the picture below, I have given a simple example covering January to December. Let's say today is December 15: the data for October, November, and December is much more recent than the data for January, February, and March. The picture shows an ideal scenario for segregating hot, warm, and cold data.
HOT data is accessed very heavily, whereas cold data is accessed far less often. The data from recent months also changes quite frequently compared to the data from earlier months.
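One common way to act on this hot/warm/cold split is table partitioning by month, so that cold partitions can live on a slower filegroup while hot ones stay on fast storage. The following is a minimal sketch only; the boundary dates match the Oct/Nov/Dec example above, and the filegroup names (FG_Cold, FG_Hot) are hypothetical:

```sql
-- Monthly partition function: rows before Oct 2015 go to the first partition
CREATE PARTITION FUNCTION pf_Monthly (DATETIME)
AS RANGE RIGHT FOR VALUES ('2015-10-01', '2015-11-01', '2015-12-01');

-- Map the cold partition to slower storage, the recent ones to fast storage.
-- A RANGE RIGHT function with 3 boundaries needs 4 filegroups.
CREATE PARTITION SCHEME ps_Monthly
AS PARTITION pf_Monthly TO (FG_Cold, FG_Hot, FG_Hot, FG_Hot);
```

Tables created on `ps_Monthly` would then keep their heavily accessed recent rows physically separate from the rarely touched older rows.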
What does maintenance of every installed SQL Server instance cover? The slide below lists the main tasks: monitoring disk space usage, adding and removing users, moving objects for better performance, reviewing and verifying the system and error logs, checking database integrity, server monitoring, adding and removing servers, applying patches, and so on.
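Two of those maintenance tasks, disk space monitoring and integrity checking, can be sketched in T-SQL as follows (the database name in the DBCC call is a placeholder):

```sql
-- Free space on each volume that holds a database file
SELECT DISTINCT vs.volume_mount_point,
       vs.available_bytes / 1048576 AS free_mb
FROM sys.master_files AS mf
CROSS APPLY sys.dm_os_volume_stats(mf.database_id, mf.file_id) AS vs;

-- Verify database integrity (replace YourDatabase with a real database name)
DBCC CHECKDB ('YourDatabase') WITH NO_INFOMSGS;
```

Both queries are good candidates for the scheduled Agent jobs discussed next.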
You can create a brand new job by duplicating an existing job within the SQL Server instance. Open SSMS, expand SQL Server Agent, then expand Jobs to see all the available jobs. In the screen below, I have chosen a maintenance plan job, right-clicked it, hovered over "Script Job as", then over "CREATE To", and chosen the relevant option. I suggest choosing "New Query Editor Window", which scripts out all the settings available for that job. You can then rename things wherever required and execute the script to create the new job.
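The script SSMS generates is built from the SQL Server Agent stored procedures in msdb. As a heavily abridged sketch of the shape of that output (the job name, step name, and command below are hypothetical, not what SSMS would emit for your job):

```sql
-- Skeleton of a scripted Agent job: create the job, add one step,
-- then attach the job to the local server so it can run.
EXEC msdb.dbo.sp_add_job
     @job_name = N'MaintenancePlan_Copy';

EXEC msdb.dbo.sp_add_jobstep
     @job_name  = N'MaintenancePlan_Copy',
     @step_name = N'Run backup',
     @subsystem = N'TSQL',
     @command   = N'BACKUP DATABASE [master] TO DISK = N''C:\Backup\master.bak''';

EXEC msdb.dbo.sp_add_jobserver
     @job_name = N'MaintenancePlan_Copy';
```

Renaming the job in a scripted copy like this is exactly the "duplicate and adjust" workflow described above.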
The script below takes a backup of every database except model and tempdb, using a cursor. I strongly advise against running it as-is against very large databases or in heavily used OLTP environments.
DECLARE @dbName VARCHAR(50)
DECLARE @backupPath VARCHAR(256)
DECLARE @backupFileName VARCHAR(256)

-- Placeholder: the original article left this path blank; set it to your own backup folder
SET @backupPath = 'C:\Backup\'

DECLARE All_Backup_Cursor CURSOR FOR
SELECT name FROM sys.databases WHERE name NOT IN ('Model', 'TempDB');

OPEN All_Backup_Cursor
FETCH NEXT FROM All_Backup_Cursor INTO @dbName

WHILE @@FETCH_STATUS = 0
BEGIN
    SET @backupFileName = @backupPath + @dbName + '.bak'
    BACKUP DATABASE @dbName TO DISK = @backupFileName
    FETCH NEXT FROM All_Backup_Cursor INTO @dbName
END

CLOSE All_Backup_Cursor
DEALLOCATE All_Backup_Cursor
After you have created your new job by following this article, you can view all the newly created jobs and monitor their health: whether they succeeded or failed, how long they took to run, the reason for any failure, the start and completion times, when the next run is scheduled, and so on.
Once you have opened SSMS, go to the Jobs section under SQL Server Agent and you will find Job Activity Monitor. Double-click it and you will see the screen below, which gives the overall status.
If you want to take further action on a job, right-click any of the listed jobs to see more options, as in the screen below. From here you can start the job from any step, stop it, enable or disable it, refresh to see its current running status, delete it, view its full run history, or open its properties to edit the job itself.
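The same job history that Job Activity Monitor shows can also be queried directly from msdb. This is a hedged sketch; the column selection is illustrative:

```sql
-- Recent job-level outcomes from msdb (run_status: 1 = succeeded, 0 = failed)
SELECT j.name,
       h.run_date, h.run_time, h.run_duration,
       CASE h.run_status WHEN 1 THEN 'Succeeded' ELSE 'Failed/Other' END AS outcome
FROM msdb.dbo.sysjobs AS j
JOIN msdb.dbo.sysjobhistory AS h ON h.job_id = j.job_id
WHERE h.step_id = 0   -- step 0 holds the overall job outcome
ORDER BY h.run_date DESC, h.run_time DESC;
```

Querying msdb this way is handy when you want to monitor job health across many servers without opening Job Activity Monitor on each one.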