Ok, I guess I have made you wait long enough since my last post in this series, so here is how I create the SQL job that does everything.

First, let us look at the job setup.

On the General Tab, we have the following:

[Screenshot: the job step's General tab]

As you can see, it is just a simple but descriptive Step name, the Type is T-SQL, the Database is set to our Stage 1 Database (DYN), and the Command is just an execute statement for SQL Server to kick off the job.
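
For illustration, here is a minimal sketch of what that command might look like, assuming the main procedure is the sjp_ImportData procedure I talk about later in this post (the dbo schema is my assumption):

-- Kick off the main import procedure (procedure schema assumed)
EXECUTE dbo.sjp_ImportData;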

The good part is on the Advanced Tab.

[Screenshot: the job step's Advanced tab]

One thing you cannot see in the screen shot is the full file name of the output file.

It is \\linksrv\common\SQL\Logs\DYN - Master Job - Import Data.txt

Notice the following:

  • I have an Output file listed.
  • The output file is on a network share!
  • Append output to existing file is checked.
  • Include step output in history is checked! That way every "PRINT" statement will show up in the Job History as well as the output file. (A scripted version of these settings is sketched just after this list.)
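
If you prefer to script the step instead of clicking through the GUI, here is a minimal sketch using msdb.dbo.sp_add_jobstep. The job and step names are my assumption based on the output file name, and @flags = 6 combines 2 (append output to existing file) with 4 (include step output in history):

EXEC msdb.dbo.sp_add_jobstep
    @job_name         = N'DYN - Master Job',  -- assumed job name
    @step_name        = N'Import Data',       -- assumed step name
    @subsystem        = N'TSQL',
    @database_name    = N'DYN',
    @command          = N'EXECUTE dbo.sjp_ImportData;',  -- assumed command
    @output_file_name = N'\\linksrv\common\SQL\Logs\DYN - Master Job - Import Data.txt',
    @flags            = 6;  -- 2 (append to output file) + 4 (include step output in history)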

You should also know that I do have a naming convention for the output file names. You can use whatever naming conventions you desire, but here are mine for your examination.

\\Server Name\Network Share\Sub Folder 1\Sub Folder 2\Database Name - Job Name - Step Name.txt

So what is the logic of my conventions?

That specific network share has many sub folders, so I use Sub Folder 1 (named SQL above) as the folder for my SQL files. Hey, I have said it many times: I am not good at coming up with inventive names. In my SQL folder, I have several sub folders: one for SQL Alerts, one for log files from jobs, and so on. Since this is a log file from a job, the sub folder is named Logs.

I always include the Database name and SQL Job name as part of the file name. If the SQL Job has multiple steps, I include the step name in the filename as well. Also, if I have several servers with the same job name (example: Maintenance), I place the Server Name at the beginning of the filename.

So, why did I go to all this trouble? The short answer is "I am lazy." Every SQL Job on every SQL Server writes to this same network share! So, instead of opening SQL Server Management Studio each morning, connecting to one of multiple servers, checking each job history to see if everything is ok or not, then connecting to another SQL Server and repeating, I go to a single network share, open a text file in Notepad with a double click, quickly scroll through it, and repeat. That is, until I got smart and just opened each file and searched for the words "FAIL" or "ERROR". Then I got even smarter and discovered that I could use Windows Search to look for the words "FAIL" or "ERROR" in the contents of the files in just that one folder, without having to open the files at all.

Another thing I like about doing this is that there are times when the Job History in SQL Server does not contain all of the history messages; the output file does. Keep in mind that all those "PRINT" statements will print to both the Job History and to the output log file.

If you have not already, you will discover that I love using the PRINT statement. One reason is that, when I am debugging something, I can tell exactly where in my T-SQL code the procedure was when it failed. However, the biggest reason for me on a continual basis is to monitor how long a specific procedure or sub procedure takes. I schedule just about all of my SQL Jobs to run at least every hour. Yet the main procedure "sjp_ImportData" may or may not execute a specific sub procedure such as "id_upr00100" depending on the time, date, or some other criteria. In addition, as tables get bigger, what started out as something that took less than a second can quickly become something that takes hours. I want to know how long a specific sub procedure is taking so that I can adjust how often the main procedure executes it, long before any of my users ever know something is slowing down.
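
Here is a minimal sketch of the kind of timing PRINTs I mean, using the sub procedure name from above purely as an illustration (this is not my actual code):

-- Record the start time, run the sub procedure, then PRINT how long it took.
DECLARE @Start DATETIME;
SET @Start = GETDATE();

PRINT 'id_upr00100 started at ' + CONVERT(VARCHAR(23), @Start, 121);

EXECUTE dbo.id_upr00100;  -- dbo schema assumed

PRINT 'id_upr00100 finished at ' + CONVERT(VARCHAR(23), GETDATE(), 121)
    + ' - duration: '
    + CAST(DATEDIFF(SECOND, @Start, GETDATE()) AS VARCHAR(10)) + ' seconds';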

Two final things to note on this post. First, while I am using Windows Search each morning to validate that everything is ok with the world, at least once a week I go through each of the files to check the durations of each item. Second, you need some sort of maintenance on the files themselves; they will get big over time. I move all the log files in this folder to another folder on an infrequent basis. You could just delete them, but if you did, and then found out later that day that you needed them, well, you deleted them.


Till later,
John