This morning I had a discussion with Nadine to find out what she and Neil had achieved in the two days I wasn’t in the office – Friday and Monday. They had managed to continue working on the second Template install scenario from last Thursday, and had attended the third induction meeting, where they learnt about report writing. So it’s fair to say I have some catching up to do.
First thing, I made the decision to read over the document provided for report writing and see what sense I could make of it. The below diagram is an overview of the report writing process:
After reading the document and talking briefly to Nadine about the process, I decided it was time to move on, as reports weren’t very relevant to my project, and the document together with Nadine’s overview was enough to give me a basic understanding. However, I still found it interesting to learn how reports are generated and what they include.
I then decided to read over the parts of the DBA Handbook which contained information I thought would be useful for me to know. I began with the section “But what do DBAs actually do?”, as I thought this would help me understand how DBAs are expected to spend their day. Below is an outline of the day-to-day role of the SQL Services DBA:
- First thing in the morning, review the Service Boards and Site Status (see section Service Boards & Site Status for detail)
- Prioritise the work from above and carry out accordingly
- Monitor the Dashboard throughout the day (see section Dashboard for detail)
- Monitor the Service Boards throughout the day (at least half hourly)
- Be available for support requests (phone or email) throughout the day
- Complete DBA Reporting to schedule (see section DBA Reports for detail)
- Action or co-ordinate/arrange jobs in DBA Task list, to configure servers to best practice
- Ensure Client Documentation is up to date and accurate (see section Client Documentation for detail)
- Assist your team mates with their workload
- Complete weekly SMART assessment (see section Internal training programs for detail)
- Assist with Consulting (PS) work
- Complete assigned Internal Project
- Study for MCITP/OCP certification (see section Microsoft/Oracle qualifications for detail)
I would love to talk to one of the DBAs and find out if this is actually how they spend their time on an average day at SQL Services.
From here I discovered all of the training the DBAs are expected to carry out while they work at SQL Services – it seems like a lot of work! But it’s good to know that your study and learning definitely doesn’t end once you leave NMIT.
After I had had enough of reading over the DBA handbook, I decided to do some research into workflows, shadowing, surveying etc. This is going to be a key component of my work placement, and essential to completing my project.
This source helped me to establish the steps needed to gain an understanding of each workflow, diagram each workflow and then review the output. Below are the steps I will need to carry out:
- Identify the process that you want to document.
- Define the boundaries that mark the beginning and end of the process.
- Decide on the goals of your project. Do you want to gain a better understanding of a process? Or implement a new version of the process? Or both?
Shadowing and information gathering
- Identify those people who are most familiar with the process you want to document – this group will probably be a subset of your stakeholders. Ideally, you want to talk to those who actually carry out the day to day work of the process and can tell you how it’s really done (not how it’s supposed to be done).
- Set up times to meet individually with each person you’ve identified.
- Determine if you will be shadowing each person, or just talking through the workflow.
- If shadowing, ask them to perform the work as they would if you weren’t there. Emphasize that you are not judging their work, just observing.
- During each meeting, either watch the workflow carefully or have the person you’re interviewing give you a detailed verbal description of the workflow.
- Make detailed notes.
- Ask a lot of questions.
Creating the workflow diagram
- Compile your notes from each interview conducted.
- Use what you’ve learned to walk through the workflow yourself and make sure you have a solid understanding of the whole process. You can use whatever means works best for you – drawing the process on a white board, writing out a text description, creating an outline, etc.
- If you have questions, go back to your original sources and get clarification. Questions might include discrepancies between how people do their work, gaps in your notetaking, things that don’t make sense, etc.
- Begin working on a visual diagram of the workflow using Visio or another workflow diagramming program.
- Use workflow mapping symbols, which include symbols to represent processes, decisions, flow, and documents.
- Choose a level of granularity for your diagram. You will need to be more or less detailed depending on what type of decisions you need to make based on your diagram.
- This is the most difficult part of the process. You may need to make several starts to determine the best way to represent something visually. It may also take some time to make things fit on the page the way you want them to.
- Continue to think through the process, ask questions, and revise your diagram until you feel it makes sense and looks good.
Vetting the workflow diagram
- Arrange a meeting with supervisors to review the workflow diagram.
- At the meeting, walk through the workflow diagram step by step.
- Invite supervisors to ask questions or clarify pieces of the workflow as you go.
- The goal is to find any errors in the workflow diagram and fix them.
- You may also get feedback that allows you to improve the formatting and readability of the chart.
I shadowed John while he responded to a Specific Alert. This was my first experience seeing how a DBA actually closes a ticket, or works towards closing one, so I learnt a lot and found it really interesting! By shadowing John, I learnt how he calculates the time spent on a ticket – from the time he closed the last ticket he was working on until the time he closes the current ticket.
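John’s timing convention is simple enough to sketch as a tiny helper – this is just my own illustration, with made-up timestamps, not anything from the SQL Services tooling:

```python
from datetime import datetime

def time_on_ticket(prev_close, current_close):
    """Time attributed to a ticket under John's convention: the gap
    between closing the previous ticket and closing the current one."""
    return current_close - prev_close

# Hypothetical timestamps for illustration
prev = datetime(2014, 7, 8, 9, 0)   # previous ticket closed 9:00am
curr = datetime(2014, 7, 8, 9, 45)  # current ticket closed 9:45am
print(time_on_ticket(prev, curr))   # 0:45:00
```

One side effect of this convention worth noting: any idle time between tickets gets counted against the next ticket closed.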
John also let me know he had no idea the information on how to handle alerts was even available, and said he wished he had known earlier, as it would have saved him a lot of time and saved him from ‘pestering’ his team mates to find out how to respond to certain alerts. This is something I might mention to Adam so that in future he can ensure all new DBAs are aware of the resources that are there to help them.
After watching John work on closing the alert, it became apparent to me (being aware of the alert resolution documents) that he followed the same process anyway – one he had learnt from other members of his team. This leads me to think that maybe a lot of the documentation is subconsciously followed by the DBAs and then passed down to the newbies, and therefore does accurately represent the processes the majority of the DBAs follow. But of course, this will be either proved or disproved as time goes on and I talk with more of the DBAs and explore different alerts.
Adam then called me in for a quick meeting where he let me know that from tomorrow onwards I will be able to start working more with the team (but not John – who is super busy – sorry John!). I will be meeting with Guy, who is working on a project in which he has identified the noisiest after-hours tickets – tickets which appear frequently and tickets which take a large amount of time to close. So Guy sounds like the perfect person for me to meet with, as he will help identify the top 10 tickets for me to initially begin working on.
Since I will begin working on more project-oriented work tomorrow, the focus of my afternoon was on completing any induction-related tasks. This included:
- Watching any SSL self-paced learning videos I felt would benefit me
- Reading over the SMART assessment material (as suggested by Adam)
New concepts encountered while watching videos:
- backup compression default
- blocked process threshold
- clr enabled
- database mail xps
- Fill factor %
- max degree of parallelism
- drop clean buffers
- shrink database (data and logs)
- shrink file
- Free proc cache
- Check DB Check Table
- SQL Perf – Log file management
- Input Buffer
- Unintended privilege escalation
- Cross database ownership chaining
- Covering Index
I also did some brief research into clustered servers, as they were something John mentioned which I hadn’t heard of. I suppose my aversion to all things networking is probably the reason for this.
As a side note, you might remember from last week’s blog post that I was curious as to why SQL Services believes sa is best practice when other sources contradict this. Well, I found out through Adam – it avoids the creation of orphan databases, as the user sa will always be a user. Orphan databases are something I plan on learning more about outside of work placement time.
Adam wasn’t in the office this morning – I’m unsure why, as yesterday it sounded like Adam, Guy and myself would be meeting to go over noisy tickets and begin some more project-specific work. However, this clearly wasn’t achievable with Adam not present. So, I began the morning by carrying on where I left off on Monday afternoon, watching the self-paced learning tutorial videos. I’ve definitely been learning a lot from them! All sorts of things, from transaction logs to causes of blocking.
The only issue with watching so many videos is that it becomes hard to concentrate. In order to break up my video watching, I decided to look over the SMART assessments, as suggested by Adam yesterday. It’s not expected of me to complete the sheets, but it will be good for me to know what’s on them and to attend the marking meeting next week – where I’m bound to learn more than I can absorb!
All in all, my morning consisted of alternating between watching videos for the self-paced learning exercise and answering some of the questions in the SMART assessments.
My afternoon consisted of much of the same, watching videos and working on my SSLLAB V2 Clean machine.
Adam came into the office in the afternoon and we had a quick chat about plans for tomorrow/next week. He mentioned he had spoken with Guy, who will be introduced to me tomorrow and who will go over in more detail the essential resources I will be using to identify on-call tickets, the time spent on those tickets, and who worked on them. This will tell me which tickets are noisiest and who I should speak to in order to find out what they did during that time to resolve the ticket. So far, it is sounding really interesting and really manageable! But I don’t want to jinx it. It also sounds like it will provide the team with a lot of valuable information, as not only will I be creating a new training resource, but along the way I will be able to edit and update existing materials.
When I arrived at the office, I had an email from Adam outlining a few tasks for me to complete, as well as a spreadsheet which I will be using as my base document when talking to Guy and collecting information on alerts and their priorities. Below is the list of tasks I will be working towards completing today:
- Meet with Guy and ensure that any new alerts on his lists are added to this spreadsheet
- Update the Pageable column for all alerts that tbl_EmailDistribution in the template currently treats as pageable
- For the pageable alerts only, review the associated SharePoint page and highlight any that are not set to show as being pageable.
- Assign a priority (1 to x) for each of the pageable alert types. The priority should be based on the amount of time spent for each ticket over the last year, rather than the amount of alerts received.
- Create a list of questions that you’d like to ask a DBA that will help you be able to create the step by step list of instructions/processes.
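The prioritisation task (point 4 above) boils down to ranking alert types by total time spent rather than by count. A rough sketch of that ranking, using entirely hypothetical alert names and time figures:

```python
# Rank pageable alert types by total time spent over the past year,
# not by how many alerts were received. All data here is made up.
from collections import defaultdict

tickets = [
    # (alert_type, minutes_spent) -- hypothetical ticket history
    ("Log file full", 50),
    ("Backup failed", 20),
    ("Log file full", 40),
    ("Blocking detected", 95),
]

totals = defaultdict(int)
for alert_type, minutes in tickets:
    totals[alert_type] += minutes

# Priority 1 goes to the alert type with the most total time spent
ranked = sorted(totals.items(), key=lambda kv: kv[1], reverse=True)
priorities = {alert: i + 1 for i, (alert, _) in enumerate(ranked)}
print(priorities)
# {'Blocking detected': 1, 'Log file full': 2, 'Backup failed': 3}
```

Note how an alert that fires often but closes quickly can still rank below a rarer alert that eats a lot of time – which is exactly why Adam specified time spent rather than alert count.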
I spent the morning going through the SharePoint alerts pages and entering into the spreadsheet whether those pages recognize the alert as pageable or not. I then used a simple select statement in my Jade template lab to pull out which alerts it recognizes as pageable. As far as I could tell the template matched the SharePoint pages, which leads me to believe those pages are up to date. I’m sure this will be either confirmed or denied when I meet with Guy this afternoon.
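The check I ran was along these lines – sketched here with Python’s sqlite3 as a stand-in for the Jade template lab, since I can’t reproduce the actual template query. Only the table name tbl_EmailDistribution and the Pageable column come from the real tasks; the AlertType column and the rows are my own assumptions:

```python
import sqlite3

# Stand-in for the template's tbl_EmailDistribution; the real table
# lives in the Jade template lab and its columns may differ.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE tbl_EmailDistribution (AlertType TEXT, Pageable INTEGER)"
)
conn.executemany(
    "INSERT INTO tbl_EmailDistribution VALUES (?, ?)",
    [("Log file full", 1), ("Backup failed", 1), ("Low disk space", 0)],
)

# The kind of simple select used to list which alerts are pageable,
# ready to cross-check against the SharePoint pages
pageable = [row[0] for row in conn.execute(
    "SELECT AlertType FROM tbl_EmailDistribution WHERE Pageable = 1"
)]
print(pageable)  # ['Log file full', 'Backup failed']
```

Comparing a list like this against the spreadsheet column filled in from SharePoint is what let me say the two sources appeared to match.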
There was one specific column in the email table of the template which I wanted to find out more about. This prompted me to spend the rest of the morning reading through the template operation instructions and other associated documentation to fully understand how the emails and ticketing are controlled by the template. This did raise a few questions, which I asked Slade about. He managed to clarify things for me, using databases he had available to demonstrate the differences between alerts for different SLAs. Again this was a very interesting topic, and I learnt a lot from both reading the documentation and ‘playing’ with my Jade lab to see the theory put into practice.
I spent the rest of the afternoon with Guy, discussing the first four tasks from the list above. My head is now just full of alerts, alerts, alerts. Then it was time to head home, marking the end of the second week of my work placement. Task 5 from the list above is one I had actually already begun working on earlier in the week, and will pick up again on Tuesday.
By the end of this week I have now completed 59.5 hours of my work placement at SQL Services.