Friday, 28 November 2014

ConfigMgr 2012 - MDT deployment info to hardware inventory

If you are integrating MDT into ConfigMgr 2012, one of the steps it runs is Tattoo, which runs ZTITatoo.wsf.

This adds some basic information about the deployment to the registry under HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Deployment 4, including the OSD Package ID, the date the computer was deployed, and the type of deployment.

I want this information captured in the hardware inventory so I can easily find computers that haven't been refreshed in a while.  Since MDT includes a MOF file, capturing this in hardware inventory is fairly easy.

  • In the ConfigMgr console, go to Administration -> Client Settings
  • Edit the appropriate client settings and go to the Hardware Inventory tab
  • Click Set Classes.  On the window that pops up, click Add
  • Connect to a computer that has already been deployed with an MDT task sequence.
  • Check Microsoft_BDD_Info
After the next hardware scan cycle, you'll have a new Microsoft_BDD_Info section that contains the deployment timestamp and other useful information to report on.
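The tattoo stores the deployment date in WMI's CIM_DATETIME format (e.g. 20141128093000.000000+000).  As a rough sketch of the "hasn't been refreshed in a while" check - property names vary and a real report would query the ConfigMgr database, so treat this as an illustration of the date math only:

```python
from datetime import datetime, timedelta

def parse_wmi_timestamp(ts):
    """Parse a CIM_DATETIME string like '20141128093000.000000+000'
    into a datetime (the UTC offset suffix is ignored here)."""
    return datetime.strptime(ts.split('.')[0], '%Y%m%d%H%M%S')

def stale(ts, now, max_age_days=365):
    """True if the tattooed deployment timestamp is older than max_age_days."""
    return (now - parse_wmi_timestamp(ts)) > timedelta(days=max_age_days)

now = datetime(2014, 11, 28)
print(stale('20120501120000.000000+000', now))  # deployed in 2012 -> True
print(stale('20140601120000.000000+000', now))  # deployed mid-2014 -> False
```

The same comparison works in a SQL report once Microsoft_BDD_Info shows up in the inventory views.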

Thursday, 27 November 2014

Home File Server Part 1 - Hardware

What do I need?

I have a few requirements for a home file server:

  • Needs to be fairly reliable and easy to fix - all the TVs in the house use it for media storage, and my 3 year old daughter demands high uptime for her cartoons!
  • Support lots of hard drives.  I don't have money to go out and buy all new 4 or 6TB drives, so I need room for a larger number of smaller drives.  It also needs to be easy to upgrade those drives later on.
  • Needs to be a full computer - not an appliance.  I want to be able to run whatever software I need on it with ease.
  • Not cost a fortune.
Notably absent from that list are "low power" and "quiet".  I'm ok with it using a bit more power if I get a lot more functionality and reliability out of it.  And noise isn't much of an issue - it sits in the basement with all the network gear.

My New Server

I managed to snag an older IBM x3500 (the original 7977 model, not the more modern M3 or M4) very cheaply - as well as some spare parts for it, like extra fans and power supplies.

This server has eight 3.5" drive bays, backed by two built-in backplane boards which I have previously written about.

I had considered making a custom case that could use the backplanes, with a small motherboard and a SAS RAID card...but that's a lot of work, and would cost more as well.  The biggest downside is that I wouldn't have been able to use the server's power supplies.  The server's power supplies provide 12.1V (at 69 amps!), but no 5V or 3.3V - those voltage regulators are all built into the motherboard.  So it ends up being far simpler to just use the server as is.

SAS controller

The biggest problem is the server's ServeRAID 8k controller.  With the latest firmware updates, it supports drives up to 2TB - but nothing past that.  Any drive larger than 2TB either gets detected as 2TB, or not properly detected at all.
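That 2TB ceiling is most likely the classic 32-bit LBA limit - I'm assuming here that the firmware addresses 512-byte sectors with 32-bit block numbers, which is the usual cause on controllers of this era:

```python
# With 512-byte sectors and 32-bit block addresses, the largest
# disk the controller can address works out to exactly 2 TiB.
sector_bytes = 512        # bytes per sector
max_lba = 2 ** 32         # number of addressable 32-bit block numbers
max_bytes = max_lba * sector_bytes
print(max_bytes / 2 ** 40)  # -> 2.0 (TiB)
```

Anything past that limit wraps or fails to enumerate, which matches the behaviour described above.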

The backplanes use standard SFF-8087 ports, so it's easy to find HBAs that can use them.  I had it narrowed down to:
  • IBM M1015
  • Dell H200
  • Dell H310
  • LSI MegaRAID 9240-8i
All of those cards are pretty similar - all LSI chipsets, and practically identical for my purposes - they have two SFF-8087 ports, and can be used as JBOD or pass-through without any RAID.  Proper JBOD support was very important - otherwise the disks wouldn't be readable with a different controller card later on.  The IBM and Dell branded ones would have had to be flashed with LSI IT firmware to do that properly, but that's not hard to do.

I ended up getting the LSI branded one on ebay for around $110.

I also picked up 2x 1M SFF-8087 SAS cables on ebay for $23.70, since the ones that come with the server aren't going to be long enough to reach the new card.

Once the parts arrived, installing the card was pretty straightforward.  The only tricky part was actually running the cables.  The entire section of the case between the motherboard and drive bays is full of fans on a big mount that slides out.

Luckily, if you remove the mount there is a small gap between two of the fan connectors where the cables can run.  You just need to be very careful reinstalling the fan mount so the cables don't move and get pinched.

The server detected the new card without any issues, and the card detected all of the hard drives properly.  This card can do RAID, but its default is to act as a regular HBA and just use the disks normally - that is exactly what I want for this setup.  The OS can manage the disks directly, and unlike controllers that do tricks like single drive RAID 0 arrays for each disk, the drives can be pulled out and used in another system or the card replaced with a different model without any trouble.

Why is there a jet engine in our basement?

So, quiet wasn't on the list of requirements.  But quiet is a relative thing.  I don't need it to be silent, but by default this server is noisy!  It has 6 fans running.  If you just pull some of the fans out, the remaining ones spin up to full speed, so it gets even louder.

Luckily, I don't actually need all those fans.  You only need all 6 fans if you have either a second CPU or a second power supply installed.  Without a second CPU or PSU, fans 4-6 can be removed and blanking panels installed over slots 5 and 6.  That quiets it down quite a bit.

I didn't have a second CPU to begin with, and while I do have a spare power supply I would have liked to install, having the server down until I swap it in is a small price to pay for a large noise reduction.

Next up is to install and configure the OS.

Tuesday, 19 August 2014

IBM x3500 SAS Backplane Pinout

You can get these backplane boards fairly cheaply on ebay these days.  Even if you don't have the server it belongs to, they could still be useful for DIY storage projects.  The IBM FRU is 44E8783.  There are likely other similar IBM parts that use the same pinout, but no guarantees.

Unfortunately, as far as I can tell, nobody bothered to document the cabling for them!  As luck would have it, I still have a working one, so I was able to map out the power cable with a multimeter.

This is just a regular backplane, not a SAS expander - one SFF-8087 connector for 4 drives.  The data connector is standard, it's just the power cable that is strange.  It looks similar to an ATX 24 pin power connector, but smaller.  If you are buying one to use in a different device, make sure it comes with the power cable - you may not be able to easily find the right connector.

I measured this by removing the cable from the backplane and measuring the voltage on each pin, with the black probe on the metal of the case (ground).  The picture shows how I numbered the pins.


Monday, 23 June 2014

KB2919355 and LSI Raid Controllers Part 3

Microsoft released a hotfix a few weeks ago, and has finally started pushing it out via windows update.  KB2966870 should be automatically installed before KB2919355 now, so servers shouldn't break anymore.  Yay!

I'm now in the process of updating all my affected machines.  Other people have reported that the update does solve the problem, so hopefully this goes smoothly.

Wednesday, 14 May 2014

KB2919355 and LSI Raid Controllers Part 2

Microsoft has finally released a knowledge base article about this issue.

The problem affects a lot of LSI based cards, including the Dell H200 my servers have, as well as some from HP, IBM and Supermicro.

Unfortunately, while there is a temporary workaround, there is no fix yet.  They did however figure out what is causing it:

This problem occurs if the storage controller receives a memory allocation that starts on a 4 gigabyte (GB) boundary. In this situation, the storage driver does not load. Therefore, the system does not detect the boot disk and returns the Stop error message that is mentioned in the "Symptoms" section.

Note This problem may not always occur in this situation. The problem is affected by the computer’s startup process, the driver load sequence, and the memory allocation of the storage controller driver at startup.
 Microsoft's workaround is to limit the system to 4GB of RAM long enough to remove the update.

This is actually an interesting bug.  4GB is the limit for 32 bit memory addresses.  But this is a 64 bit OS.  Those limits should be well behind us.  The driver itself didn't change with KB2919355, so something else in the update triggers this bug.
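The failure condition is easy to picture: an allocation that starts exactly on a 4 GB boundary has all-zero low 32 bits, so any code path that truncates the address to 32 bits ends up pointing at address zero.  A minimal sketch of the boundary check (my illustration, not Microsoft's code):

```python
FOUR_GB = 1 << 32  # the 32-bit addressing limit: 4 GiB

def on_4gb_boundary(addr):
    """True if an address falls exactly on a 4 GiB boundary,
    i.e. its low 32 bits are all zero."""
    return addr & (FOUR_GB - 1) == 0

print(on_4gb_boundary(0x100000000))  # exactly 4 GiB -> True
print(on_4gb_boundary(0x100000010))  # 16 bytes past it -> False
```

That also explains why limiting the system to 4GB of RAM works around it: no allocation can ever land on (or past) the boundary.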

I'm quite curious to see what they changed that caused this issue, and why the Server 2008 drivers aren't affected by it.  Hopefully Microsoft will be releasing a fix soon.

On the upside, they seem to have extended the cutoff for future updates to June 10th.  The post announcing the extension doesn't say if it applies to Server 2012 R2, or just Windows 8.1.  I'm hoping it applies to both.  People who use WSUS to update their systems have until August before KB2919355 becomes required to receive further updates.  Unfortunately, while I am working on rolling out WSUS, it isn't quite ready yet.

For now all I can do is sit back and hope Microsoft, Dell and/or LSI come out with a fix soon.

Documenting patch panels the easy way

As I discussed previously, network documentation is important. One part of that is documenting all the physical cabling of a building.

I had started drawing out Visio diagrams for each of our patch panels, indicating where each cable goes, as well as floorplans marking all the ethernet ports.  The floor plans aren't too bad, especially if you can get CAD drawings of the building.  But the patch panels turned out to be a pain - positioning labels over each port on a patch panel is very time consuming.

So I decided to write a plugin for DokuWiki that will do the hard work for me.  I give it a simple description like this:

<patchpanel groups="6" name="Top_of_Rack" ports="24" rows="2">
# Port Label (#COLOR) Comment
1 Uplink #ff0000 Connects to firewall
2 2 Office 101
3 3 Office 102
4 4 Office 103
5 5 Office 104
24 24 Reception desk
</patchpanel>
And I get a drawing like this:

The plugin allows you to specify the name, number of ports, number of rows, and port grouping of the patch panel, so it looks pretty close to the real thing.  For each port, you can specify a label, optional color, and a comment.  The comments are shown as tooltips when you hover over a port.

This saves me a ton of time over updating Visio diagrams, and means I don't have to edit the diagram, export it as an image, and upload it to our wiki - just edit a few lines of text when a port changes.  I'm also planning on writing a similar plugin to handle switches.

The image is actually an embedded SVG.  The amazing thing about scalable vector graphics is that an image is just a text-based description of what to draw - so with a little bit of effort, you can dynamically create one in any programming language, without any special image manipulation libraries.
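The actual plugin is PHP, but the idea is language-agnostic.  Here's a stripped-down Python sketch of the same technique - emitting one rectangle per port and numbering it (layout constants and port ordering are my own simplifications, not the plugin's):

```python
def patch_panel_svg(name, ports, rows, port_w=30, port_h=30, gap=4):
    """Build a minimal SVG drawing of a patch panel: one small square
    per port with its number, laid out in the given number of rows."""
    cols = ports // rows
    parts = ['<svg xmlns="http://www.w3.org/2000/svg" '
             f'width="{cols * (port_w + gap)}" height="{rows * (port_h + gap) + 20}">',
             f'<text x="0" y="14">{name}</text>']
    for p in range(ports):
        row, col = p % rows, p // rows
        x = col * (port_w + gap)
        y = 20 + row * (port_h + gap)
        parts.append(f'<rect x="{x}" y="{y}" width="{port_w}" height="{port_h}" '
                     'fill="#ddd" stroke="#000"/>')
        parts.append(f'<text x="{x + 4}" y="{y + port_h - 8}" font-size="10">{p + 1}</text>')
    parts.append('</svg>')
    return '\n'.join(parts)

svg = patch_panel_svg('Top_of_Rack', ports=24, rows=2)
print(svg.count('<rect'))  # -> 24
```

Drop the resulting string into a page and the browser renders it - no image files involved.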

If you are interested in using this plugin, you'll find it at

Bug reports and feature requests should be added to the github repo.

Sunday, 27 April 2014


Documentation is one of the most overlooked things in IT.  Maybe it's because writing documentation is boring.  Maybe because you don't have time to write it all down before moving on to put out the next fire.

Why do I need any documentation?  I know how it all works...

Documentation is important!  Without it, nobody knows what other people on their team have changed.  Nobody knows how things are set up, and more importantly why they are set up that way.  Bringing a new person onto the IT staff is much easier if they have a document they can reference, rather than having to ask about every little detail.

In a crisis, having good documentation of your backup and restore procedures is essential.  You need to restore things quickly, people will be yelling at you, and it gets very stressful.  Write yourself a nice step-by-step guide while you test the backups, so when you need to do it for real you can just follow the instructions.

From the IT staff's perspective, having good documentation has direct benefits.  You can't take a real vacation if you are the only person who knows how parts of the network work.  Those parts will break, and you will get a phone call.  And if you have a terrible memory like me, you'll come across things that are set up strangely - that you set up yourself, several years and many projects ago.  You'll remember there's a reason you did it that way, but not what the reason was.  That information is just lost.

From a business standpoint, having no documentation is terrible.  IT staff can quit, get sick, or even die.  When someone comes in to take over for them, it will take them weeks to figure out where everything is and how everything runs.  I have personally run into situations where the only thing we could do was rebuild a service from scratch, because nobody knew the passwords and they couldn't be reset.

Ok, I'm sold.  I'll write things down.  Now where to put them?

I have gone through a lot of different systems for keeping documentation.  At one point it was a physical binder in my office.  At another a bunch of text files on my laptop.  Then a Sharepoint site for the IT staff.  All of them had limitations.

The binder never got updated properly.  The text files were hard to navigate, and couldn't easily be shared with the team.  Sharepoint worked ok for a while.  But what about the notes I need to fix Sharepoint when it breaks?  They had to be stored separately, or I wouldn't be able to reach them when I needed them.

Criteria for a good documentation system

Your needs may be different, but this is what I was looking for to store our documentation:
  • Easy to add to and update.  If it's hard or time consuming, nobody will do it.  Documentation is useless if nobody ever updates it.
  • Able to link to other parts of the documentation.  Specific information should only be added once, but referenced from anywhere it might be relevant.  You don't want to repeat the same information over and over, since you'll have to find every instance of it when it changes.
  • Basic formatting and pictures.
  • Hosted ourselves.  There are cloud based platforms that do this kind of thing.  But I don't want my data inaccessible if they fail.
  • Accessible to and editable by the whole IT staff, without any extra effort to share our changes.
  • Accessible from anywhere and from any device - sometimes I need to fix things via remote desktop from my phone.
  • Accessible even in the event of a catastrophic network failure.
Those last few points are tricky.  Making it accessible to all the staff from any device is easy - put it on a web server.  But what if the web server dies?  Or I don't have a working internet connection when I really need it?  My solution is to use DokuWiki with Dropbox.

Dokuwiki with Dropbox

DokuWiki is a simple open source PHP based wiki.  Its syntax is relatively easy to read and use.  It can store pictures and other things like saved config files.  It has decent access control to make sure only authorized staff can access it.  It takes care of everything except that last bullet point - accessible in a catastrophic network failure.

There are lots of similar wiki packages, but DokuWiki has one very important distinction for my purposes - it does not use a database.  Everything is stored in plain text files in its data folder.

Dropbox is a great service.  You run a program on each of your computers, and it syncs the files in your Dropbox folder between them.  Your files are also accessible from their website.  It also allows you to share folders with other people, and all the files in that folder will show up in their Dropbox folder on their computer too.  I've been using it for years, and it's dead simple to use.

The best thing about Dropbox is that it keeps a local copy of all your files.  Unlike other cloud storage services, if Dropbox goes down, I still have access to all of my files.  This is important, because it means using Dropbox isn't adding a single point of failure.  If Dropbox stops working, changes stop getting synced, but all the existing data is still stored on every computer.

Since everything in DokuWiki is a regular text file, no databases, Dropbox can sync it to everyone's computer.  Add in a tiny web server like MicroApache, and you can load it directly from your Dropbox.  DokuWiki On a Stick is a prepackaged version that comes with the web server portion ready to go.  That covers loading it from our work and even personal computers.  Any file someone edits automatically gets synced to everyone else within a few seconds.

What it doesn't cover is accessing it from mobile devices or computers that don't have Dropbox installed.  That's covered by loading Dropbox onto a webserver running the full Apache, and pointing the DocumentRoot at our shared Dropbox folder.  The biggest trick here is that all the files must be writable by both Dropbox and Apache - either they have to run as the same user, or each has to be in the other's group, with the umask set to ensure the group has write permissions.
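The umask requirement is easy to verify.  A quick sketch (plain Python, nothing Dropbox- or Apache-specific): files created under umask 002 keep group write, which is what lets both accounts touch the same files.

```python
import os
import stat
import tempfile

def group_writable(path):
    """True if the group permission bits on path include write."""
    return bool(os.stat(path).st_mode & stat.S_IWGRP)

# With umask 002, newly created files get mode 0664 (group-writable);
# the common default umask of 022 would strip that down to 0644.
old = os.umask(0o002)
try:
    path = os.path.join(tempfile.mkdtemp(), 'wiki_page.txt')
    with open(path, 'w') as f:
        f.write('test page')
finally:
    os.umask(old)  # restore whatever umask was set before

print(group_writable(path))  # -> True
```

Run a check like this as the Dropbox user and as the Apache user; if either reports False on the wiki's data files, one side will fail to save edits.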

Now, in normal operation, I can access the wiki from its web server.  If the web server is unreachable, I can fire it up on my laptop.  If I don't even have my laptop, at the very worst I can read the text files directly with the Dropbox client on my phone - I don't get the pretty formatting, but all the content is there.

I think it works brilliantly.

What about passwords?

I'm a very security conscious person.  Storing passwords in plain text in a third party service like Dropbox isn't a great idea.  So we do not record passwords in the wiki.

That's what KeePass is for.  It stores passwords for all our services, nicely organized and encrypted.  There is a portable version that runs directly from Dropbox, so all the staff have access to it and it is automatically kept up to date.