Tuesday, July 24, 2012

SharePoint / SQL Gotcha: Massive Log Files

My backup scripts on my test SharePoint 2010 box have been running but not producing backups for the last few days. I tried a manual backup via Central Admin and found that the backup estimate was more than 10GB, which is odd for a test server.

A bit of hunting with SequoiaView showed that SharePoint_Config.ldf was more than 10GB (the database data file itself was only 100MB).

It seems that the database was set to the Full recovery model, which I’ve been bitten by before. In this mode SQL Server does not truncate the transaction log after a database backup; you have to back up the transaction log as well. If you do a lot of updates to the same data you can end up with large log files (compared to the actual data file).
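As a sketch of what the Full model implies (the backup path here is just an example, and the database name matches my box):

```sql
-- Show which recovery model each database is using
SELECT name, recovery_model_desc
FROM sys.databases;

-- Under the Full model, only a log backup lets SQL Server reuse log space;
-- a full database backup alone will not truncate the log.
BACKUP LOG [SharePoint_Config]
TO DISK = N'D:\Backups\SharePoint_Config_log.trn';
```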

Thinking about why you might want this makes my head spin a bit. Presumably if you have the previous backup plus the log file you can recover anyway – you just need to get the log off the server in case it goes bang. I guess if you have all the transactions you don’t even need the data file backup – you could rebuild to any point in time from the accumulated transaction logs (this is what SQL Server calls point-in-time recovery). I’m sure there are lots of useful things you can do in this mode, but I don’t know of any, so I don’t “need” it at the moment.

In SQL Server 2008 R2’s Management Studio you can view the properties of a database and see how large its files are. You can also use Tasks > Shrink > Database and Tasks > Shrink > Files to see how much free space is available in a file (SQL Server keeps some free space and can auto-grow a file when it runs low, so you can strike a balance between “wasted” free space and disk fragmentation).

The Shrink dialog told me that the log had 0% free space available. I tried a transaction log backup (Tasks > Back Up > Backup Type > Transaction Log) but there wasn’t enough free disk space for it – it failed with error 112 (the Windows ERROR_DISK_FULL code) but couldn’t give me a descriptive message. Oh well, I don’t need it: it’s a test box and I can stand to lose some data if I have to do a restore. My very simple PowerShell scripts get SharePoint to do a full weekly backup and daily differential backups, so I should be good if something goes wrong – I think I’m only exposed to up to a day’s loss (assuming I notice if the backup itself fails – there’s no error reporting in the script at the moment).

I’m mainly just playing with the SharePoint installation – I know there are third-party tools that will automate this for me and I know that Data Protection Manager (or whatever it’s called now) can do automated SQL Backups, for example. I’m just interested in what you can easily do out-of-the-box.

Anyway, I set the SharePoint_Config database to the Simple Recovery model, and then using Tasks > Shrink > Files, I could see that now 99% of the log file was available. Shrinking it recovered that space, freeing up 10GB.
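The same change can be scripted rather than clicked through. This is roughly what those dialogs do (assuming the default logical log file name – check yours first, as noted in the comment):

```sql
-- Switch to the Simple recovery model, then shrink the log file.
ALTER DATABASE [SharePoint_Config] SET RECOVERY SIMPLE;

USE [SharePoint_Config];
-- The logical log file name is usually SharePoint_Config_log; confirm with:
--   SELECT name FROM sys.database_files WHERE type_desc = 'LOG';
DBCC SHRINKFILE (N'SharePoint_Config_log', 100);  -- target size in MB
```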

While I was there I set the model database to Simple recovery too, so any future databases created on the box will “inherit” that. I also set all the databases that were in Full recovery to Simple. It turned out to be a 50-50 split. I’m not sure why some were already in Simple mode – possibly SharePoint sets some non-critical databases to Simple by default and leaves the rest to “inherit” from the model database?
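Scripted, that tidy-up looks something like this (the SELECT just generates the ALTER statements for review, rather than running them blindly):

```sql
-- New databases copy their recovery model from the model database
ALTER DATABASE [model] SET RECOVERY SIMPLE;

-- Generate an ALTER statement for each remaining Full-recovery database
SELECT 'ALTER DATABASE [' + name + '] SET RECOVERY SIMPLE;'
FROM sys.databases
WHERE recovery_model_desc = 'FULL';
```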

So a sort-of lesson learned is to watch out for Full Recovery model databases in future. If you need this, you also need to figure out what to do with all those transaction logs.

In future SQL Server and SharePoint installations I will need to watch out for default options that I might have chosen badly. It seems to me that new databases should use Simple recovery by default – Full recovery is more of an advanced option. Since new databases inherit their recovery model from the model database, maybe this is down to the edition of SQL Server that was installed?

Tuesday, July 17, 2012

Exporting from Active Directory

I recently needed to get a dump of users from Active Directory to analyse the department field values because we changed some recently and now some users have lost access to the call logging system we use (Vivantio).

This was surprisingly fiddly to achieve with the tools available to me, so I thought I’d write it up for future reference.

You can get at the users via the built-in Active Directory Users & Computers tool (dsa.msc). This will already be on Windows servers running Active Directory (domain controllers, for example), but it can also be installed onto a workstation via the Remote Server Administration Tools (RSAT).

Using Active Directory Users & Computers, select a container in the AD tree. If your users are in multiple containers (multiple organisational units (OUs) for example) then you’ll need to repeat this for each as I didn’t find a way to list several at once.

To get the department field showing (there are lots of others available too, but not everything) use View > Add/Remove Columns.

To remove groups and disabled users (for example) use View > Filter Options. The tool will help you to build simple queries, but to screen out the disabled users we need to use an LDAP query, so choose Create Custom Filter > Customized > Advanced and enter an LDAP filter.
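A commonly used filter for this matches user objects and uses the bitwise-AND matching rule OID to exclude accounts with the “disabled” flag (bit 2) set in userAccountControl:

```
(&(objectCategory=person)(objectClass=user)(!(userAccountControl:1.2.840.113556.1.4.803:=2)))
```

The objectCategory=person clause screens out groups and computers at the same time.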


You can then export the users to file by right-clicking on the selected container and choosing Export List. I changed the file format to Unicode Text (Tab Delimited). I tried Comma Delimited first, but some of our departments contain commas which messed up the import into Excel.

I opened the file with Excel (via File > Open) which walked me through the import process.

I inserted a Pivot Table and dragged Department into the Row Labels box and Name into the Values box (this was automatically converted to “Count of Name” by Excel). Excel lists all the departments and the count of users in that department. Double-clicking a department name will ask what to display next (or just drag another row name into the Row Labels box). I chose Display Name, so expanding a department lists the names of the people in it.

From this it was easy to spot users that had incorrect or unusual department values.

Thursday, July 12, 2012

MSDN & Campus Agreement

TL;DR – you can get the top-end MSDN license, which permits use of pretty much any Microsoft product for software development, for ~£350 per year under the Campus Agreement.

I’ve been in my current role for about a year. An on-going issue is that we don’t have much of a test environment, which means every change is in some sense a “live” change and therefore a risk. Even well understood changes carry some risk, but software development (and perhaps learning in general) is a destructive thing – I need to try things to see what happens and some of what happens might be “breakages”.

The way to handle this is to have another environment that isn’t live and which is OK to smash up and rebuild as required. This seems to be generally understood and agreed, but no such environment has been forthcoming.

So, plan B then: build my own. My boss has given me the go-ahead to get a good workstation (more on that later). Actually, I’ve already got a good laptop, but Visual Studio 2010 (with ReSharper) just grinds along slowly on it, and VMware Workstation 7 isn’t exactly fast either. The CPU & memory seem OK, but the laptop has picked up anti-virus, SCCM and ZENworks clients and I suspect these all drain precious resources.

On top of the hardware I will need software. I have previously had an MSDN subscription and am generally aware that this is the right thing to have for software development on the Microsoft platform. It comes in a variety of flavours, but the top-end one (Visual Studio Ultimate 2010 with MSDN) retails for ~£7,500 for the first year and ~£2,500 a year thereafter. For that you get access to all the development tools and all the server products, so you can write and test pretty much anything you want. But that’s still a fair-sized chunk of money.

I had previously learned that we get a good discount on Microsoft licenses via our Campus Agreement, so I sent an email off to our supplier – Pugh Computers.

I was told that it would cost a bit less than £350. That’s quite a saving.

That’s per developer, per year. Because the Campus Agreement is a yearly subscription we don’t get the perpetual usage rights that the retail MSDN gives you, but that’s not really much of a loss unless you want your development environment to go stale.

This was duly ordered and the subscription delivered via our Campus Agreement logon. It was then assigned to me by email and I now have access to the MSDN downloads & key generation pages.

UPDATE 1: When we assigned the MSDN subscription from the Campus Agreement website we ticked an optional box marked “media” or something. Today, a small pouch was delivered containing DVDs with all the latest Microsoft products, including Windows Server 2008 R2 SP1, Windows 7 Ultimate, Office 2010, Visual Studio 2010 and SharePoint 2010. Sweet – saves me downloading them and burning them to disc myself.

Thursday, July 5, 2012

Home Routers

I’ve had home wifi routers from both Linksys (later bought out by Cisco) and Netgear and been underwhelmed by both: poor wifi range, dropped connections, etc.

Recently Cisco attempted an evil change of terms & conditions on people who already own their routers, so they might be a firm to avoid if at all possible.

I’ve not owned a D-Link router. They may be OK I suppose.

However, Jeff Atwood recommends using open firmware, such as DD-WRT, OpenWRT & Tomato, because it’s better quality and can be fixed and upgraded. The off-the-shelf firmware can also be fixed and upgraded, in theory, but why would Cisco et al spend time making the thing you already bought better when you could be out buying a more up-to-date model instead?

I’m especially interested in QoS, so I can prioritise Xbox gaming traffic, for example. I’d also like to explore the option of a private VPN. Some decent traffic logging would also be useful.

So, I guess I’d better see whether my current routers will work with one of these.