My First Week at SQL Services

Monday 06.03.2017

I was a strong mixture of excited and nervous to begin this first week at SQL Services. I knew I was going to learn a lot, but am always apprehensive when it comes to new environments. Well, there was no need for all of those nerves (there very rarely is). On Monday, Nadine, Neil, Nimisha and I spent the first two hours or so going through the first section of the induction process with Adam. Adam covered a lot of different topics: organisation structure, the DBA handbook, tools used etc. After this first induction meeting I had gained a real sense, in a very basic and general way, of how the company started, how they got to where they are today, where they are heading and what they do.

One thing Adam gave to all four of us was a SQL Services induction spreadsheet, which is completed by all new DBAs during their six-week training period. Given the limited time I have to complete my project, it is not viable for me to complete the entire process, nor is it within the scope of my project. I will focus on completing relevant tasks in the week 1-2 section, which will help with my understanding of how things work at SQL Services.

Below are the tasks which all new DBAs complete during their first two weeks.

[Image: SQL Services Induction Spreadsheet]

The following are the tasks that are not relevant to my time at SQL Services, and I will therefore not be completing:

  • Practice use of phone, transfer of calls etc.
  • Connectwise University training and degree
  • How WE use Connectwise: Review Connectwise internal documentation
  • Explore Connectwise as SQL Services
  • Training in Connectwise: Mike Spence using training database
  • Any of the SMART assessments

After this initial meeting, I created a timetable showing the days and hours that I plan on spending at SQL Services, at work and in class, as seen below. This timetable was then included in my project proposal, which I proceeded to complete and submit that afternoon (fingers crossed it is accepted).

The aforementioned tasks allowed me to prepare for my afternoon meeting with Adam. During this meeting we discussed the hours I will be spending at SQL Services each week. Because I will still be working part time and attending project seminars, Adam said he was happy for me to come and go as I please. I think this will work really well for me as long as I manage to complete at least the required 300 hours at SQL Services.

Following this meeting, I decided to take a look at VirtualBox. Virtual machines aren’t something I’m very familiar with, so this was something new and interesting to ‘fiddle with’.

Tuesday 07.03.2017

This morning I began by importing a virtual lab appliance, SSLLAB v2 Clean, by following the tutorial we were given as part one of our induction. I had no problems with this, which was great! But it didn’t last long…

Following on from the above task, I began the process of completing the SQL Services self-paced DBA training exercise. This is where I had a few issues. After watching the first few videos, it was time to install the .NET Framework on my virtual machine. Easier said than done. Multiple times I ran into the same problem: an error would come up halfway through installation stating:

“Feature installation
Installation of one or more roles, role services, or features failed.
The source files could not be found. Try installing the roles, role services, or features again in a new Add Roles and Features Wizard session, and on the Confirmation page of the wizard, click ‘Specify an alternate source path’ to specify a valid location of the source files that are required for the installation. The location must be accessible by the computer account of the destination server.”

This was something completely new to me, so Google to the rescue! I found an incredibly helpful forum post which had all of the answers. It was as simple as installing new Windows updates and voilà, the .NET install was successful. Should I mention the 20 minutes I sat unsuccessfully trying to install the .NET Framework over and over again, even though I kept getting the same result? Probably not.

From here I continued watching the videos which are part of the self-paced training exercise, with my next endeavour being the installation of SQL Server.

The first issue I encountered here was not having access to the L drive, which is where the latest edition of SQL Server is. However, after talking to Adam, he directed me to where I could access this file. Easy fix!

I had trouble installing the latest version of the service pack; however, after some quick research I found an easy solution online. I just had to change the security settings through the Control Panel, as seen below.

[Image: Security settings changed]

By the end of the morning I had completed the first of ten stages of the exercise. Below is a list of the tasks I have completed so far:

Stage 1 – Basic Installation of SQL Server

  1. Install latest version of SQL Server, developer edition, on your laptop. (install located on L drive)
  2. Install latest service pack for SQL Server installed above.
  3. Ensure both SQL Server and Agent are operational and running under your own account.
  4. Review SQL Server Error Log and SQL Agent Error Log for errors.
  5. Expected result is clean install. Report findings to DBASM.
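
To tick off tasks 3 and 4 without clicking around, a couple of queries can do the checking. This is just a minimal sketch; sp_readerrorlog is an undocumented but commonly used procedure:

-- Task 3: check both the SQL Server and Agent services are running,
-- and which account each runs under
SELECT servicename, status_desc, service_account
FROM sys.dm_server_services;

-- Task 4: scan the current SQL Server error log for errors
-- (first argument 0 = current log, second 1 = SQL Server log rather than Agent log)
EXEC sp_readerrorlog 0, 1, N'error';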

After lunch, I moved on to the next stage of the self-paced training exercise, stage 2. A page I found online helped me to change the owner of the database from the administrator who created the database to sa, which is what was asked of us. This is the script I used:

ALTER AUTHORIZATION ON DATABASE::BasicSimpleDB TO sa;

[Image: Owner changed to sa]

However, upon further investigation I found several sources suggesting that sa should not be set as the owner of the database – the opposite of what the worksheet states. To better understand why the principle of least privilege is so important, I took a look at a helpful resource. I assume the reason we were asked to set sa as the owner is that this is simply a test environment, so security isn’t a real concern. This is something I should ask Adam about, as I would love to know his opinion and learn more about the topic. I did have a quick chat with John about this, and he briefly explained his opinion after having been at SQL Services for a few months now (it’s nice to have so many familiar faces around).
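
For my own peace of mind, a quick way to confirm the ownership change took effect:

-- Show the current owner of BasicSimpleDB
SELECT name AS database_name, SUSER_SNAME(owner_sid) AS owner_name
FROM sys.databases
WHERE name = N'BasicSimpleDB';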

[Image: Recovery model set to Simple]

I then moved on to changing the recovery model from full to simple, as seen above. This was a simple task, completed through the use of the DB properties window.
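
The same change can also be scripted rather than done through the properties window; a minimal equivalent would be:

-- Switch BasicSimpleDB to the Simple recovery model
ALTER DATABASE BasicSimpleDB SET RECOVERY SIMPLE;

-- Verify the change
SELECT name, recovery_model_desc
FROM sys.databases
WHERE name = N'BasicSimpleDB';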

The next task was to complete a one-off backup of the database. This was another first for me, but was thankfully rather simple. In doing this task, I did learn about backup options, the first of these being the ‘WITH’ clause. In this instance I used the option ‘INIT’, short for initialise, which overwrites the existing backup sets in the file each time I complete a backup, rather than appending multiple backups to the same file. So each time I run a backup with INIT, it recreates and overwrites the given backup file. I also learnt that using ‘FORMAT’ to reformat the backup media each time the backup is performed can be valuable; however, I do need to look into this further to fully understand its value. Using STATS = 10 simply shows me the progress of the backup, generating a message every 10%. Below is an image of the backup:

[Image: Backup of BasicSimpleDB]
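
For reference, the command itself boils down to something like this (the backup path is just an example from my lab):

-- One-off full backup of BasicSimpleDB
BACKUP DATABASE BasicSimpleDB
TO DISK = N'C:\Backups\BasicSimpleDB.bak'
WITH INIT,       -- overwrite existing backup sets in the file
     STATS = 10; -- report progress every 10%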

Here is a list of the tasks I had completed by the end of the afternoon:

Stage 2 – Simple Backup

  1. Create database on above SQL Server called BasicSimpleDB (10MB Data, 5MB Log)
  2. Change database owner to sa (Best Practice; as system administrator account is always there)
  3. Ensure recovery model is set to Simple (Which does not allow point in time recovery)
  4. Create table in BasicSimpleDB called tTest. Add one column as integer (Primary Key), one as varchar(10), one as datetime. Add a few rows of sample data to table.
  5. Perform a one-off backup of the database using SQL Server Management Studio
  6. Expected result is a backup of database with table containing data. Report findings to DBASM.
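
Looking back over the list, tasks 1 and 4 scripted out look roughly like this (the file paths and sample rows are my own placeholders):

-- Task 1: create BasicSimpleDB with a 10MB data file and a 5MB log file
CREATE DATABASE BasicSimpleDB
ON PRIMARY (NAME = BasicSimpleDB_data,
            FILENAME = N'C:\SQLData\BasicSimpleDB.mdf',
            SIZE = 10MB)
LOG ON     (NAME = BasicSimpleDB_log,
            FILENAME = N'C:\SQLLogs\BasicSimpleDB_log.ldf',
            SIZE = 5MB);
GO

-- Task 4: create tTest and add a few rows of sample data
USE BasicSimpleDB;
GO
CREATE TABLE tTest (
    Id        int         NOT NULL PRIMARY KEY,
    Name      varchar(10) NULL,
    CreatedAt datetime    NULL
);
INSERT INTO tTest (Id, Name, CreatedAt)
VALUES (1, 'alpha',   GETDATE()),
       (2, 'bravo',   GETDATE()),
       (3, 'charlie', GETDATE());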

Wednesday 08.03.2017

This morning started with a slight panic as SQL Server Management Studio initially couldn’t connect to the server – here we go, how am I going to resolve this one! I did my same old trick of just trying the same thing again a few times, and it turns out the saying is true – third time’s lucky. Crisis averted.

My learning for the morning began with figuring out what differential backups (diffs) are, as they are something unfamiliar to me – or maybe just something I had forgotten about since my last DAT class. I found this succinct explanation online, which really helped clarify and cement their purpose:

A differential backup is a cumulative backup of all changes made since the last full backup, i.e., the differences since the last full backup. The advantage to this is the quicker recovery time, requiring only a full backup and the last differential backup to restore the entire data repository.
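
In T-SQL terms (the path again is just an example), a diff is simply the full backup command with the DIFFERENTIAL option added:

-- Differential backup: everything changed since the last full backup
BACKUP DATABASE BasicFullDB
TO DISK = N'C:\Backups\BasicFullDB_diff.bak'
WITH DIFFERENTIAL, STATS = 10;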


After watching a few more of the tutorial videos, I got started on Stage 3 of the self-paced learning exercise. This consists of:

Stage 3 – Full Backup

  1. Create database on above SQL Server called BasicFullDB (10MB Data, 5MB Log)
  2. Change database owner to sa (Best Practice; as system administrator account is always there)
  3. Ensure recovery model is set to Full (Which allows point in time recovery)
  4. Create a maintenance plan to backup the database daily, with hourly log backups, retaining full backups for 2 days and log backups for 48 hours.
  5. Run the full backup job created from the maintenance plan.
  6. Create table in BasicFullDB called tTest. Add one column as integer (Primary Key), one as varchar(10), one as datetime. Add a few rows of sample data to table.
  7. Run the log backup job created from the maintenance plan.
  8. Run a manual DBCC corruption check on the database.
  9. Expected result is a maintenance plan which has 2 associated jobs. Also a full backup file and a log backup file. Finally a clean corruption check. Report findings to DBASM.
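
Underneath the maintenance plan wrapper, the hourly log backup job from task 4 comes down to something like this (path assumed):

-- Transaction log backup of BasicFullDB
-- (only possible because the database uses the Full recovery model)
BACKUP LOG BasicFullDB
TO DISK = N'C:\Backups\BasicFullDB_log.trn'
WITH STATS = 10;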

The first three tasks were relatively straightforward; however, task 4 called for some research into how to implement a maintenance plan, as I could not access the file provided on the worksheet. I found a tutorial which, although somewhat out of date and written for a different version of SQL Server, proved helpful in completing task 4. During the process of completing this task, I did some research into precedence constraints and found some really interesting information.

I learnt that the precedence constraint can use the following execution results alone or in combination with an expression.

  • Completion requires only that the precedence executable has completed, without regard to outcome, in order for the constrained executable to run.
  • Success requires that the precedence executable must complete successfully for the constrained executable to run.
  • Failure requires that the precedence executable fail for the constrained executable to run.

This helped me understand why the tutorial I had been following had called for a change of the constraint from on success to on completion.

Step 5 was nerve-racking – would the maintenance plan be successful? I was very relieved to see this message box appear!

[Image: Maintenance plan success]

When completing task 8 I found a resource really useful, as it helped define the three different checks which make up the command: DBCC CHECKALLOC, DBCC CHECKCATALOG and DBCC CHECKTABLE. Another resource is also well worth a read, as it answers most FAQs surrounding CHECKDB.
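
The manual check itself is a one-liner:

-- Manual corruption check of BasicFullDB; NO_INFOMSGS suppresses the
-- informational messages so only genuine errors are reported
DBCC CHECKDB (BasicFullDB) WITH NO_INFOMSGS;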

After making a quick cup of green tea, I moved on to the next stages. By the time of our 3 o’clock meeting with Adam, I had completed the following tasks with little or no problems:

Stage 4 – Simple Restore

  1. Using SQL Server Management Studio, restore the database created in Stage 2 as a separate database on the same server, called BasicSimpleDBRestored. Restore with recovery, renaming files appropriately. Generate a script of the restore for future use.
  2. Add sample data to BasicSimpleDBRestored..tTest
  3. Execute restore script created earlier
  4. Check rows in BasicSimpleDBRestored..tTest
  5. Expected result is a newly restored database with different files to the original database. Also a scripted restore that has been successfully executed. Report findings to DBASM.
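
The restore script I generated for Stage 4 ended up looking roughly like this (the logical file names and paths are from my lab, so treat them as placeholders):

-- Restore BasicSimpleDB as a new database on the same server,
-- moving the files so they don't collide with the original's
RESTORE DATABASE BasicSimpleDBRestored
FROM DISK = N'C:\Backups\BasicSimpleDB.bak'
WITH MOVE 'BasicSimpleDB_data' TO N'C:\SQLData\BasicSimpleDBRestored.mdf',
     MOVE 'BasicSimpleDB_log'  TO N'C:\SQLLogs\BasicSimpleDBRestored_log.ldf',
     RECOVERY, STATS = 10;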

Stage 5 – Full Restore

  1. Adapt the script created in Stage 4 to restore the full database backup of BasicFullDB created in Stage 3 to a database called BasicFullDBRestored. Restore with recovery, renaming files appropriately.
  2. Execute your adapted script.
  3. Check BasicFullDBRestored for the table created in Stage 3. Table should not exist.
  4. Add RECOVERY to end of script and execute again. Should error. Ensure you understand why.
  5. Add REPLACE to end of script (with RECOVERY still there) and execute again. Should be successful. Ensure you understand why.
  6. Replace RECOVERY with NORECOVERY. Execute again. Refresh database list and check. Should show “(Restoring…)” at end of database name. Database is unusable.
  7. Create a new script to restore the log backup created in Stage 3, again with no recovery. Database should still be unusable.
  8. Restore database with recovery.
  9. Check BasicFullDBRestored for the table created in Stage 3. Table should now exist.
  10. Report findings to DBASM.
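
The NORECOVERY sequence in steps 6–8 looks something like this (paths and logical file names again assumed):

-- Restore the full backup but leave the database in the restoring state
RESTORE DATABASE BasicFullDBRestored
FROM DISK = N'C:\Backups\BasicFullDB.bak'
WITH MOVE 'BasicFullDB_data' TO N'C:\SQLData\BasicFullDBRestored.mdf',
     MOVE 'BasicFullDB_log'  TO N'C:\SQLLogs\BasicFullDBRestored_log.ldf',
     REPLACE,     -- overwrite the copy restored earlier in this stage
     NORECOVERY;  -- database is left unusable, showing "(Restoring...)"

-- Apply the log backup from Stage 3, still without recovery
RESTORE LOG BasicFullDBRestored
FROM DISK = N'C:\Backups\BasicFullDB_log.trn'
WITH NORECOVERY;

-- Bring the database online; no further backups can be applied afterwards
RESTORE DATABASE BasicFullDBRestored WITH RECOVERY;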

I also had a quick read over the material provided for the afternoon meeting, just to gain a vague understanding of what the meeting would involve. During the meeting, Adam walked us through what the template does, what we can configure it to do, and all of the other important features of the template. This meeting reminded me why I love databases – they just make so much sense! Everything is so logical and serves a clear purpose. I’m looking forward to playing around with the template tomorrow morning in our next induction meeting.

Thursday 09.03.2017

Today was spent with Adam learning to install the template. Below is the list of tasks completed in order to set up the template.

  • Create the SQLData & SQLLogs folders
  • Review model data file growth settings
  • Configure SQL Server properties
  • Configure SQL Agent properties
  • Disable any existing database maintenance jobs
  • Configure mail requirements
  • Create and populate the SQLReports directory
  • Copy pkzip.exe and elogdmp.exe
  • Create the SSLDBA database
  • Prepare the script
  • Execute install script

After we had been through the initial setup with Adam, he sent us off to try our luck installing the template on our own, following a specific scenario. Below is the list of tasks for that scenario.

  • Silver SLA
  • 24×7 enabled
  • New operator called “SQLServicesDBA”, using your own work email address
  • 1 database configured to write a full backup to an alternate location (i.e. a file location other than SQLDataDumps)
  • 1 database configured to copy a full backup to a different location
  • 1 database configured to retain 3 days of backups in both the standard location (i.e. SQLDataDumps) and an alternate location
  • 1 database to be configured for a full backup once a week and differential backups every other day of the week
  • 1 database to have transaction log backups taken only between 8am and 7pm
  • 1 disk drive configured to not send a warning or an alert
  • Optimisation jobs configured to run for 1 hour only on Mondays, Tuesdays & Fridays at 2:05am
  • 1 database configured to only carry out database reorganises (a.k.a. defragments) if the fragmentation is higher than 7% (i.e. no index rebuilds are required)
  • 1 database configured to only optimise objects that have a page count larger than 750 pages
  • 1 database configured to not be included in any backup, optimisation or database consistency check processes

I was only able to complete the first 2 tasks before having to leave for work that afternoon. However, I did find everything we had accomplished and learned today to be really interesting, as it was our first small taste of what the DBAs do every day for clients.

So, by the end of the week I had been able to tick the tasks “Introduction to key staff members” and “Commence DBA induction training” off my task sheet. I was also able to tick the following tasks off my induction sheet:

  • Ensure working environment is ready to go
  • Configure VirtualBox lab environment
  • First overview with DBA services manager
  • Second part of overview with DBA services manager
  • SSL self-paced training exercise (stage 5)

All in all, this week has made me excited to carry on with my work when I come back next week!
