Blog

Mar 09
3-12-15: Backup and Restore

This month's topic will be about the most mundane of a DBA's duties: Backing up and restoring a database. 

At least, it should be mundane.  You don't want backups and restores to be exciting, for example at two in the morning when you find out that the backups from the last six months are no good.
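One cheap safeguard against that two-in-the-morning surprise is to verify every backup as you take it.  A minimal sketch (the database name and path are placeholders):

    -- back up with checksums, then confirm the backup file is readable
    BACKUP DATABASE AdventureWorks
      TO DISK = N'D:\Backups\AdventureWorks.bak'
      WITH CHECKSUM, INIT;

    RESTORE VERIFYONLY
      FROM DISK = N'D:\Backups\AdventureWorks.bak'
      WITH CHECKSUM;

(VERIFYONLY validates the backup media, not every data page; the only true test of a backup is an actual restore, which is part of what this session covers.)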

In this session, Greg Burns will talk about:

  • The Basics of Backup and Restore
  • Meeting an SLA
  • 3rd Party Backup solutions
  • Maintenance Plans
  • Scripting a Backup / Restore
  • Centralizing your SQL Agent jobs
  • Ola Hallengren's Backup script

Target audience: IT Admins, DBAs, and Developers.  This will be a broad session, with content ranging from the 100 level to the 300 level. 

Speaker: Greg Burns

Greg Burns has been working as a SQL Server DBA (and sometime developer) since 1997.  He has worked with SQL Server in some way nearly every day, from server builds and migrations to database design and reporting.  He is a SQL PASS chapter leader for the Alaska SQL User Group (and also runs the Alaska SharePoint User Group).  He currently works as a senior consultant for Network Business Systems in Anchorage, AK.   

Download the SQL Script used in this presentation

After watching the video, I have realized there are a couple of mistakes.  I normally put a lot of time and effort into a presentation and make sure all the information is accurate, but this time I winged it a little and got a couple of facts wrong.  Although I think it's mostly good, I'd rather not leave the video out there and possibly lead people astray.

Here are some links I mentioned during the meeting: 

Beginner:

Intermediate / Advanced: 



Feb 23
NetApp and SQL Best Practices - Free Technical Seminar & Breakfast March 24th

Hi All,

Just passing this along.  My employer, Network Business Systems, is bringing up an expert to speak about best practices for SQL Server and NetApp storage.  It's a free event with a continental breakfast and prizes.

 

We're bringing a NetApp SQL expert and evangelist to Alaska to discuss everything you always wanted to know about how to manage SQL on NetApp storage.  The invitation isn't just for storage and server administrators; please include DBAs and developers as well.  Pat is a fascinating speaker and can address each of their areas of interest. 

 

Best practices for backup, recovery, tuning, and performance acceleration (especially using cache/SSDs) can be approached a number of ways depending on the SQL, Windows, or NetApp operating system versions and the protocols (iSCSI, FC, NFS, CIFS, RDM, etc.) in use.  Pat is an expert who can address those options and has done extensive testing in VMware and Hyper-V environments with SQL 2012 and 2014.  Plus he can shine some light on the new features in SQL 2014 and SMB3.

  • When:  Anchorage, Tuesday March 24th, 8 am to 10:30 am
  • Where:  3000 C Street, in the downstairs conference room. 

Food and prizes plus an update on NetApp hardware and software.

Click to Register!

More Info 

Feb 10
2-12-15: Lightning Talks - SSRS / Kerberos

February 12, 2015 Meeting

 

We'll be meeting at the Network Business Systems office in their conference room at their new Midtown location:

3000 C Street, Suite 210

Anchorage, Alaska  99503 

Click here for directions.

 

Lightning Talks 

This month we have two speakers:

  • Nicholas Smith, who will be talking about SQL Server Reporting Services Tips and Tricks
  • Greg Burns, who will be talking about Setting up Kerberos for SQL Server 

Speaker: Nicholas Smith

Nicholas Smith is an Analyst Programmer IV working for the State of Alaska Department of Public Safety.  He has years of experience working with SQL reporting and administration. 

Speaker: Greg Burns

Greg Burns has been working as a SQL Server DBA (and sometime developer) since 1997.  He has worked with SQL Server in some way nearly every day since then, from server builds and migrations to database design and reporting.  He is the chapter leader for the Alaska SQL User Group (and also the Alaska SharePoint User Group).  He currently works as a senior consultant for Network Business Systems in Anchorage, AK.   

Watch now on YouTube:
 
Feb 10
Key Features in the New Power BI Service Announced

Some interesting changes are coming to the Power BI tools, aimed at making them more powerful and independent of Excel and SharePoint.  Blogger Christopher Woodill at MicrosoftTrends.com has an excellent writeup here:

http://www.microsofttrends.com/2015/01/28/key-features-in-the-new-power-bi-service-announced/

But I'll give you the high points:

  • Power BI is going Freemium for Consumers, Discounted for Businesses
  • Power BI is being Disconnected from Excel
  • Power BI is being Disconnected from Office 365 and SharePoint
  • Power BI will have New Visualizations
  • Power BI is still Cached Data and Refreshes are Limited
  • It's all about the Connectors:
    • In addition to the current collection of standard databases, NoSQL data sources, Azure Services and HDInsight...  Microsoft has announced that new Connectors for GitHub, Marketo, Dynamics CRM, SalesForce, SendGrid and Zendesk are coming
  • The Weakest Link Still Seems to be the Data Management Gateway

 

If you want to try the preview of the new Power BI Designer, it's available here: http://powerbi.com/dashboards/downloads/designer/

Jan 13
Freebie: SSWUG Virtual Conference - 24 hours of SQL training

Just passing this along.  SQL Server Worldwide User Group (SSWUG) always puts on a good event, and you can't beat the price: 

The Best Online Conference Experience Available and it's FREE! The SSWUG.ORG Spring Virtual Conference provides a unique experience to make sure you get the information you need from some of the industry's leading experts. Thanks to our sponsors, we're excited to offer this event to you FREE OF CHARGE! We've gathered the industry's top speakers to take you through outstanding content, daily keynotes, and multi-part sessions. You'll have 24+ sessions to participate in over the course of 2 days, with 45 days of on-demand viewing. Your access to an unmatched learning experience begins here.

Dec 30
How to Migrate a MySQL Database to SQL Server

My company manages a number of customers using a couple of different cloud-based management services, one of which is GFIMax (now MaxFocus.com). The idea is, you go onsite and install an agent on each customer workstation, and the agent reports to the cloud service, documenting each machine and its properties. Then the agent monitors for errors and performance issues, alerting us if there's a problem. Managed Services is not a new idea, of course. One thing I like about it is that you get an always-current inventory of all the machines.

But GFI's canned reports for this aren't very impressive, and they don't allow you to customize the fields or layout. So I figured I could export the data and massage it into my own report.

For GFI, your export choices are XML (which is rather cryptic and unfriendly) or "SQL", which downloads a SQL dump of information about the customer, their sites, devices, hardware, and software.  Unfortunately it's in MySQL format, which is meh.

Disclaimer:

I guess MySQL is OK; it's not a terrible database engine, but it exists in a parallel universe to mine and I don't touch it often. That's not to say it's inaccessible: with the right ODBC provider, you can create a DSN and connect just about any application to MySQL. But for the purposes of this blog post, let's assume your data needs to be moved to SQL Server.

So I had to figure out how to export the client inventories into a usable format.  The process does require that you install both SQL Server and the free MySQL server on your laptop, and make sure the ODBC libraries are the proper version.  Many of the steps below only have to be done once.

 

Here's what I did:

  1. Exported the customer inventory from GFI to a MySQL dump file (e.g. Contoso2.SQL).

  2. Now what do I do with this? 

    If you open this Contoso2.SQL file in SQL Server Management Studio you'll find that it's a script to create a new set of tables and then insert a bunch of data.  But it won't run as-is: even though much of it looks like generic ANSI SQL, MySQL has its own particular DDL syntax when it comes to creating objects and specifying data types.  You have to convert it to Transact-SQL (T-SQL), which is what Microsoft SQL Server understands.  You can do this manually (and Microsoft provides instructions on how to do this), but that's meh, too.  Who wants to do all that by hand?
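    To see why, compare a typical table definition from a MySQL dump with its T-SQL translation.  A simplified, hypothetical pair (the real GFI tables are wider, but the shape is the same):

        -- MySQL dump syntax:
        CREATE TABLE `as_devices` (
          `deviceid` int(11) NOT NULL AUTO_INCREMENT,
          `devicename` varchar(255) DEFAULT NULL,
          PRIMARY KEY (`deviceid`)
        ) ENGINE=InnoDB DEFAULT CHARSET=utf8;

        -- The same table translated to T-SQL:
        CREATE TABLE dbo.as_devices (
          deviceid   int IDENTITY(1,1) NOT NULL PRIMARY KEY,
          devicename varchar(255) NULL
        );

    The backticks, AUTO_INCREMENT, and the ENGINE/CHARSET clause all have to be translated, and that's before you hit data types that don't map one-to-one.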


    As far as I know, there is no utility to convert a MySQL script directly to a T-SQL script.  But there is a free tool to export data from a MySQL database and import it into SQL Server. It's called the SQL Server Migration Assistant (SSMA), and there is a special version just for MySQL.  This MSDN blog post explains how to install and use the tool:

    http://blogs.msdn.com/b/ssma/archive/2011/02/07/mysql-to-sql-server-migration-how-to-use-ssma.aspx

     So that's the method I used.  The following steps got me where I needed to go.

  3. Download SSMA for MySQL here: http://blogs.msdn.com/b/ssma/
    • (the blog says you might be prompted to register with Microsoft, but I wasn't).
  4. Download the free version of MySQL here: http://dev.mysql.com/downloads/windows/installer/5.6.html
  5. Download the required MySQL ODBC providers here: http://dev.mysql.com/downloads/connector/odbc/5.1.html (registration with Oracle required)
    • Note: you should grab both the x86 and x64 versions just to be sure
  6. Download SQL Server 2014.  You can use the free Express Edition (here), but you may want all the features; e.g. SQL Express doesn't include the SQL Agent service, which is needed for running jobs.  If you want a full SQL Server install, grab the free SQL 2014 Evaluation Edition (here).

You only have to do this once:

  1. Install SQL Server.
  2. Install MySQL.
  3. Install the MySQL ODBC providers (x86 and x64).
  4. Install SSMA.
    • I am pretty sure that SSMA requires the 32-bit version of the MySQL ODBC provider – otherwise you'll get an error during the install, and subsequently SSMA won't be able to connect to your MySQL service.

Now you have a place to run your Contoso2.SQL script:

  1. Open MySQL Workbench and connect to your local MySQL instance.
  2. Click File → Run SQL Script…
    • Browse to the .SQL file.
    • Under schema name, type in a new name for your database.  I chose "Contoso2".
    • Click Run, which will build all the tables and then populate them.  Depending on the number of computers in the inventory, it could take a long time.  (If you'd rather do this from a query window, see the sketch below.)
    • You'll end up with 6 tables:
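For reference, the Workbench step is equivalent to running the dump through the mysql client.  A sketch, assuming a local server and that the dump file sits in your working directory:

    CREATE DATABASE Contoso2;
    USE Contoso2;
    -- SOURCE is a mysql client command; it executes the dump file
    -- against the current database (no trailing semicolon needed)
    SOURCE Contoso2.SQL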

 

Now migrate the MySQL data to SQL Server, using SQL Server Migration Assistant:

  1. Open SQL Server Migration Assistant for MySQL.
  2. In SSMA, click New Project. Give it a name and specify the version of SQL Server you're migrating to.
  3. Connect to MySQL.
    • Note that the provider I used is MySQL ODBC v5.1.13. I had some trouble here getting this dialog to connect.  I did play with some of the other versions (e.g. 5.2.x) but found that the "Connect" button stayed grayed out. I thought it might be due to differences between the 32-bit and 64-bit versions of the ODBC provider, but maybe not. Anyway, it's extremely picky.  The yellow warning icon's tooltip says I should consider using version 5.1.6 or higher. I'm not sure if SSMA really thinks 13 isn't greater than 6.
    • Once you're connected you'll see the MySQL Contoso2 source database in the top left pane.
  4. Connect to SQL Server and specify a destination database.  You can type in a new name and it will be created for you.
  5. Now you need to convert the schema. This will read through the schema metadata and build a map of all the changes that need to be made in order for it to run in SQL Server.
    • In the top pane, select the CONTOSO2 source database and click the "Convert Schema" button in the toolbar.  In the output pane you should see something like this:
    • Conversion finished with 0 errors, 0 warnings, and 16 informational messages.
  6. Now you need to synchronize the destination database with the converted schema.
    • In the bottom pane, select the CONTOSO2 destination database, right-click, and select "Synchronize with database".
    • You'll see a popup confirming the objects to sync.  Click OK.
    • Output should say: Synchronization operation is complete.
  7. Now you need to migrate the source data into the destination database.
    • In the top pane, make sure the CONTOSO2 source database is selected and click "Migrate Data".
    • You may be prompted to re-authenticate to both MySQL and MSSQL.
    • Output: you should get a popup with this information.
  8. Now open SQL Server Management Studio and verify the data.
    • Expand Databases → Contoso2 → Tables.
    • Right-click the dbo.as_devices table and click "Select Top 1000 Rows".
    • Output should show you all the computers in the inventory (or run the quick query below yourself).
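The same check as a query (the table name comes from the GFI export; your row counts will vary):

    USE Contoso2;
    SELECT TOP (1000) *
    FROM dbo.as_devices;  -- one row per inventoried device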


Points for Style

Now that the data is in SQL Server, you can use all kinds of tools to view and manipulate the data. 

 

Open the SQL data in Excel:

    1. In Excel, click the Data tab → From Other Data Sources → SQL Server.
    2. Connect to your SQL instance. 
    3. Select Contoso2 and the as_device table.
      • Import it as a Table.
        • Note: this is a read-only view of the data.  If you want to be able to edit it, use MS Access as described below.

 

Link to the data from Access:

    1. In Access, create a new desktop database and click the External Data tab.  Click ODBC Database.
    2. Choose to Link to the data.
    3. …and connect to SQL Server and the CONTOSO2 database.
      • You will probably be prompted to create a "Data Source", which is a shortcut for MS Access to use to connect to SQL Server.  Click "New" → User Data Source → SQL Server provider.
    4. Follow the prompts to connect to your SQL instance and CONTOSO2 database.  Most of the defaults should be OK.  
    5. Select the 6 tables we care about…
      • … and now you will have linked tables you can browse, edit, query, and report on.

    If all of this seems like a lot of work, you're right.  Of course, you can thank GFI for only providing a MySQL format for export.  Have these guys ever heard of CSV? 

Dec 22
1-8-15: The DBA Interview

January 8, 2015 Meeting

 

We'll be meeting at the Network Business Systems office in their conference room at their new Midtown location:

3000 C Street, Suite 210

Anchorage, Alaska  99503 

Click here for directions.

 

The DBA Interview 

If you're looking to hire a DBA, chances are you already have a vast repertoire of expert-level gotcha questions for the prospective hire.  I know, there's a grim sort of satisfaction in watching someone squirm in their chair trying to remember how a UNION join works, while your coworkers snicker in the next room.  I once helped create an elaborate "DBA Questionnaire" to be used to grill interviewees, one so difficult we actually had a couple of people walk out. 

But it's not cruel, really.  A tech interview has to be hard, because you need to find out if the fantastic resume they gave you is worth the paper it's printed on.  One time I interviewed a guy who had all kinds of certs (an MCITP for SQL 2008 R2!) but he couldn't answer the simplest of questions. 

Whether you're looking to hire a DBA or you're a DBA looking for a job (or you've never been a DBA and want to break into the market), this session is for you.  I'll go over some basic interview techniques and then share some interview questions from the "DBA Questionnaire".  Feel free to share some of your own!
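(For the record, the "UNION join" question is itself a bit of a trick: UNION isn't a join at all.  It stacks two result sets vertically, removing duplicates unless you say UNION ALL.  A toy example with made-up tables:

    SELECT City FROM dbo.Customers
    UNION            -- deduplicates across both sets
    SELECT City FROM dbo.Suppliers;

    SELECT City FROM dbo.Customers
    UNION ALL        -- keeps duplicates and skips the distinct sort
    SELECT City FROM dbo.Suppliers;

If a candidate can explain that distinction, you're off to a good start.)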

Speaker: Greg Burns

Greg Burns has been working as a SQL Server DBA (and sometime developer) since 1997.  He has worked with SQL Server in some way nearly every day since then, from server builds and migrations to database design and reporting.  He is the chapter leader for the Alaska SQL User Group (and also the Alaska SharePoint User Group).  He currently works as a senior consultant for Network Business Systems in Anchorage, AK.   

Watch now on YouTube:

 
Dec 12
Scaling TempDB across Disks in a VM

The SQLCAT team gives some good metrics on what kind of performance improvement you can expect when you split your TempDB into different secondary data files (.NDF), and shows the differences when you increase the number of disks you spread the NDF files over.  Very cool! 

Note that this applies as much to your on-prem virtual machines as it does to Azure.

http://blogs.msdn.com/b/sqlcat/archive/2014/01/16/scaling-out-sql-server-disks-and-data-files-on-windows-azure-virtual-machines-a-real-world-example.aspx
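If you want to try it, the mechanics are just one ALTER DATABASE per extra file.  A sketch with hypothetical drive letters and sizes (keep all the files the same size so SQL Server's proportional-fill algorithm spreads the load evenly):

    -- add two secondary data files on separate disks
    ALTER DATABASE tempdb
      ADD FILE (NAME = tempdev2, FILENAME = 'E:\TempDB\tempdb2.ndf',
                SIZE = 1024MB, FILEGROWTH = 256MB);

    ALTER DATABASE tempdb
      ADD FILE (NAME = tempdev3, FILENAME = 'F:\TempDB\tempdb3.ndf',
                SIZE = 1024MB, FILEGROWTH = 256MB);

The new files are used immediately, and TempDB is rebuilt from scratch at every service restart anyway.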

Dec 10
Featured Whitepaper: Troubleshooting SQL Server Wait Stats

Passing this along from the PASS newsletter:

SQL Server MVP Kevin Kline discusses the most common problems facing IT teams who focus their performance troubleshooting activities solely on wait statistics. This can serve as a useful and effective template for performance troubleshooting which you will be able to use over and over again whenever SQL Server performance is called into question. Read the latest whitepaper by our Global Alliance Partner, SQL Sentry, along with other whitepapers by our other Global Alliance Partners, here.

Note that this is kind of an advertisement (for SQL Sentry's Performance Advisor), but there's a ton of useful info in there.  I don't mind plugging a product if it will make a DBA's life easier.   
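If you want to poke at your own wait stats before reading the whitepaper, the raw numbers live in a DMV.  A minimal sketch (the counters are cumulative since the last service restart, and plenty of the top waits are benign; the paper covers which ones to ignore):

    SELECT TOP (10)
           wait_type,
           wait_time_ms,
           waiting_tasks_count
    FROM sys.dm_os_wait_stats
    ORDER BY wait_time_ms DESC;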

Nov 10
"SharePoint? Uggghhh!"

Last week I attended the PASS Summit 2014 conference in Seattle.  I must say it was an amazing experience, way better than any previous conference I had attended.  I'll write more about it later, but I wanted to talk a little about something I noticed every time I mentioned I worked with SharePoint.  Here are the most common responses from every DBA I met:

  • "I'm sorry."
  • "Oh god."  (eye roll)
  • "They didn't actually talk to any DBAs when they designed it, did they?"
  • "SharePoint's not an application, it's a beast."
  • "Uggghhh."

And so on.  I mean, it's one thing to be hatin' on Microsoft for writing complex software, but these DBAs clearly thought of SharePoint as the enemy, something that needed to be destroyed.  Not one considered it useful.  If it was in their environment, it was a burden they were saddled with.  Even one of the speakers, who had written a book on PowerPivot that included several chapters on SharePoint, couldn't really bring himself to endorse it.

A Solution in Search of a Problem 

First, there's the common complaint about SharePoint, that it's just a glorified file share, something no one really knows how to use and just complicates the process of document management.  One guy talked about how his company keeps reinstalling SharePoint every year, trying to get it right, and that consultants are making a fortune every time they're brought in.  These are pretty common complaints actually, which are the result of people deploying SharePoint without any user buy-in or a real objective.  Or, as one customer put it, "SharePoint's a solution in search of a problem."

SharePoint Wants Its Own SQL Instance

The SharePoint workload has some unique requirements in terms of security and configuration.  For example, SharePoint 2013 requires that MAXDOP be set to 1 at the instance level.  MAXDOP (Maximum Degree of Parallelism) controls how many threads a single query can use, and SharePoint is designed to expect every query to be single-threaded (MS Dynamics has the same requirement).  This makes a DBA's head explode, because parallelism (the default MAXDOP of 0 lets a query use all available cores) is usually a good way to scale performance. 
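For reference, the setting SharePoint insists on is the ordinary instance-level knob, set like so:

    -- 'max degree of parallelism' is an advanced option
    EXEC sp_configure 'show advanced options', 1;
    RECONFIGURE;
    EXEC sp_configure 'max degree of parallelism', 1;  -- SharePoint 2013 requires 1
    RECONFIGURE;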

You could certainly set up SharePoint to share a SQL instance with other databases, and I've found it to be a good neighbor, but you might find that its workload dominates the other databases and tends to pound on TempDB. 

You Can't Touch the Databases

Next, DBAs complain about the myriad databases used by SharePoint.  They are closed off: Microsoft says you can't modify these or even directly query them (I'll get into why in a minute).  You can't create indexes or even do defrag/rebuilds. How is a DBA supposed to be able to maintain a database if they can't even optimize performance?  So if SharePoint is slow (and it is) then they can only get blamed for it, with no way to solve the problem. 

Why is SharePoint such a closed system?  It comes down to the fact that most of the configuration is contained in content and config databases, coordinated via timer jobs.  If you make a direct change to a table or row without using the SharePoint API (or PowerShell), you have a chance of corrupting services or data.  Compare this to editing the Registry.  For most people, the "Keep Out" sign is for their own protection, and for good reason.

Also, a SharePoint content database has a somewhat, shall we say, cryptic design.  All documents are stored as BLOBs.  In a single site collection, all document libraries and lists use just one table to store all content.  This means if you run a SELECT statement on the wrong table, you can end up locking an entire site collection.  It leaves one scratching their head as to why the SharePoint team designed it this way, because it becomes a limiting factor for performance and scale.  Although querying content directly can be useful, it's not supported. 

Lastly, DBAs complain that you can't even tune indexes in SharePoint databases.  That's not strictly true (you're not blocked from doing it), but SharePoint has its own timer jobs that do the same thing, and they might undo or repeat some of your work.  You're also told not to enable auto-create statistics for the same reason: SharePoint maintains its own statistics and prefers to keep its execution plans predictable. 
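If you're curious what that looks like on your own instance, the per-database flags are visible in sys.databases.  A quick sketch (the LIKE patterns are just typical SharePoint database naming, so adjust to taste):

    SELECT name,
           is_auto_create_stats_on,
           is_auto_update_stats_on
    FROM sys.databases
    WHERE name LIKE 'WSS[_]%' OR name LIKE 'SharePoint%';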

So if it's the worst thing ever, why use it?

 I got into SQL Server way before SharePoint was a thing.  I developed my own applications (in VB6) and I was responsible for everything, from the controls on the form to the triggers / sprocs behind every event and action.  Every time I started a new app, I had to reinvent the wheel again, even if I developed reusable classes.  I had to design everything, which turned a simple assignment into a complex task.  (Note: I was never a very good developer, but then again I was self-taught and worked alone). 

When SharePoint came along it wasn't exactly revolutionary but I saw a lot of potential, even in the 2001 version.  You could create code-free solutions that ran on top of the SharePoint platform.  You didn't have to reinvent the wheel.  You could focus on solutions and not framework.

SharePoint is a platform that allows you to build your own solutions, often without involving IT, programmers, and yes, even DBAs.  It's designed to empower users.  I will be the first to admit that the layers below the surface are pretty ugly, but I think the end result is a valuable product.  Microsoft clearly treats SharePoint as a cornerstone of its Office suite, and it's only going to keep growing.  As a consultant I can't ignore that. 
