Hyper-V, undersizing your boot partition and swap file fun

I have a bad habit of making the boot drives for my virtual machines quite small, usually around 16GB. My Exchange 2010 server is virtualised and has this sort of configuration: a 16-gigabyte C: drive holding just the OS, and a D: drive with Exchange and the databases on it. At some point, Windows decided to grow the swap file to the point where I was uncomfortable with the amount of free space left on the C: drive. No problem, I thought, I'll move it to the D: drive. Windows refused to take it, recreating the swap file on C: as a temporary swap file and leading to bad performance.

After a fair amount of hair pulling and bad performance, I found the issue. As detailed here, you can't boot a Hyper-V machine from a SCSI virtual disk (that is, a virtual hard disk created and attached to the VM's SCSI controller), nor can you create a swap file on one. This is apparently down to the nature of the SCSI disk and controller, and the fix is to use an IDE virtual hard disk instead. So the Exchange server now has a third drive, a 32-gigabyte IDE virtual disk mounted as S:, which happily stores the swap file. Performance is much improved as a result.
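
At the time I did this through Hyper-V Manager, but on hosts with the Hyper-V PowerShell module (Windows Server 2012 and later) the same fix looks roughly like the sketch below. The VM name, path and size are placeholders rather than the actual setup described above.

# Create a 32GB dynamic VHD to hold the swap file (path and size are examples)
New-VHD -Path "D:\VHDs\exch-swap.vhd" -SizeBytes 32GB -Dynamic

# Attach it to the VM's IDE controller - the VM has to be shut down first,
# since IDE disks can't be hot-added
Add-VMHardDiskDrive -VMName "EXCH-2010" -ControllerType IDE -ControllerNumber 0 -ControllerLocation 1 -Path "D:\VHDs\exch-swap.vhd"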

Exchange 2010

I had a play around with the Exchange 2010 beta. It looked pretty good although the “killer” feature I wanted to check out, archiving, wasn’t fully functional. I’ve migrated my email from MailEnable to Exchange 2010. So far, I’m liking:

  • The database availability group (DAG) feature looks very cool. I haven't had the chance to test it fully yet (a rough sketch of the setup cmdlets follows this list).
  • The Unified Messaging feature set and management UI are a lot more developed. In 2007, it felt kind of half done (especially in 2007 RTM).
  • “Self service” of mailing lists sounds pretty cool but could be a problem in practice. Similarly, the Exchange Control Panel (ECP) gives sys admins the ability to manage Exchange from any web-enabled machine, but whether you would want to is another issue.
  • The Powershell capabilities have matured, which is a good thing.
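
For the record, creating a DAG from the Exchange Management Shell looks roughly like this. It's only a sketch; the DAG name, witness server and the mailbox server/database names are made up.

# Create the DAG with a file share witness (all names are examples)
New-DatabaseAvailabilityGroup -Name "DAG1" -WitnessServer "FS01" -WitnessDirectory "C:\DAG1"

# Add two mailbox servers to the DAG
Add-DatabaseAvailabilityGroupServer -Identity "DAG1" -MailboxServer "MBX01"
Add-DatabaseAvailabilityGroupServer -Identity "DAG1" -MailboxServer "MBX02"

# Add a copy of a database to the second server (Exchange seeds it automatically)
Add-MailboxDatabaseCopy -Identity "Database1" -MailboxServer "MBX02"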

Not liking:

  • The archive feature in its current form is pretty much useless. Most 3rd party archive implementations have two servers and/or storage systems: one with fast, smaller drives for the recent live email where disk IO performance is important, and one with slower, larger drives for the archive data where capacity matters more than disk performance. 2010 forces you to store the archive mailbox in the same database as the user's main mailbox, meaning your small, fast SAS drives get used up by archives (see the sketch after this list).
  • Microsoft being a tad dishonest about using SATA drives for Exchange storage and pushing direct attached storage (DAS) over external storage solutions such as SANs and iSCSI. Yes, you can use SATA drives, but according to their own storage calculator you'll need two to three times as many drives compared to SAS. For example, a storage design might call for 8 SAS drives for the Exchange databases; with SATA, you would need 16-20 drives. Getting a DAS server chassis that can take those 8 SAS drives isn't too difficult. Finding one that takes 20 drives is harder and may compromise other aspects of the server design (RAM, CPU, etc).
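
To illustrate the archive point: enabling a personal archive is a one-liner in the Exchange Management Shell, but in this release there's no option to point it at a different, cheaper database; it simply gets created alongside the primary mailbox. The mailbox identity below is made up.

# Enable a personal archive for a user - it lands in the same mailbox database
# as their primary mailbox, so it can't be put on slower, larger storage
Enable-Mailbox -Identity "jsmith@domain.local" -Archive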

Currently the server is running under Hyper-V and performing reasonably well considering the specs of the VM.

Powershell Task of the Day – Move Mailbox for all users in an Organisational Unit (part 2, Bulk move)

A follow-on from the previous script: this time I needed to move a larger number of users spread across a list of OUs. To make the task easier, I decided to automate it a bit:

$miglist = import-csv c:\scripts\migration.csv

foreach ($item in $miglist)
{
$strTargetmbx = "EXCH02\" + $item.tiernumber + "-east\" + $item.tiernumber + "-east"
$strReportFile = "c:\scripts\" + $item.name + ".xml"
Get-Mailbox -server "EXCH01" -OrganizationalUnit $item.ouname | `
Move-Mailbox -Confirm:$False -TargetDatabase $strTargetmbx -SourceMailboxCleanupOptions DeleteSourceMailbox -ReportFile $strReportFile
}

In the first line, I'm importing the contents of a CSV file that lists the OUs, a friendly name for each OU and the “tier” database those users are to go into. Iterating through $miglist, I construct the target mailbox database name ($strTargetmbx) and the report filename ($strReportFile). The Move-Mailbox command uses the -Confirm:$False switch to suppress confirmation; if this switch isn't used, you will be prompted at each step. The rest of the move command is pretty simple, specifying where to move to and a report filename.
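
For reference, the CSV only needs the three columns the script refers to (name, ouname and tiernumber). Something along these lines, with made-up values:

name,ouname,tiernumber
Sales,"domain.local/Company/State/Office/Department/Sales",1
Accounts,"domain.local/Company/State/Office/Department/Accounts",2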

Powershell Task of the Day – Move Mailbox for all users in an Organisational Unit

I needed to migrate users from one email server to a new one with more storage. These users mostly fell into neat OU groupings, so to do one OU at a time, I used the following command:

Get-Mailbox -server "EXCH01" -OrganizationalUnit "domain.local/Company/State/Office/Department/Team" | `
Move-Mailbox -TargetDatabase "EXCH02\StorageGroup1\Database1" -SourceMailboxCleanupOptions DeleteSourceMailbox -ReportFile c:\migration_report.xml

The first line gets all the mailboxes in the specified OU. Line 2 performs the mailbox move, specifying the target database and a custom report file name for the results.
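
Before committing to the move, it can be worth a quick sanity check that the OU filter picks up the mailboxes you expect. A simple count, using the same server and OU as above, does the job:

# Count the mailboxes that the move command above would pick up
Get-Mailbox -server "EXCH01" -OrganizationalUnit "domain.local/Company/State/Office/Department/Team" | `
Measure-Object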

Powershell Task of the Day – Get Mailbox Size For Users in an OU

During an email migration, I wanted to see how many users would already be under the new mailbox limits that were going to be imposed on them. In most cases, the new limits lined up with the users' location in Active Directory, so I was able to export the needed details per OU and review them:

# Process email stats for users in the specified OU and save to CSV file

Add-PSSnapin Quest.ActiveRoles.ADManagement
Add-PSSnapin Microsoft.Exchange.Management.PowerShell.Admin
Add-PSSnapin Microsoft.Exchange.Management.Powershell.Support

if (!$args) # No arguments supplied
{
    write-host "You must specify the correct arguments: the OU to check, then the output file name. ie. email_stats.ps1 domain.com/OUName/OUName2 oulist.csv"
    exit
}

write-host "Processing based on arguments..."
write-host "OU: " $args[0]
write-host "CSV: " $args[1]

$OUName = $args[0]
$FileName = $args[1]
$i=0
$strTest = "Name,Alias,ServerName,OU,TotalSize(KB)`n"

# Wrap in @() so .count still works when only one mailbox is returned
$arrMailboxList = @(get-mailbox -OrganizationalUnit $OUName | select name,alias,servername,organizationalunit)
$arrMailboxListCount = $arrMailboxList.count
foreach ($item in $arrMailboxList)
{
    $i = $i+1
    write-progress -id 1 -activity "Getting Mailbox List" -status "Progress:" -percentcomplete ($i/$arrMailboxListCount*100)
    # Get the size statistics for this user's mailbox
    $arrMailboxStats = Get-MailboxStatistics -identity $item.alias | select displayname,totalitemsize
    foreach ($stat in $arrMailboxStats)
    {
        # Append a CSV row for this mailbox
        $strTest = $strTest + $item.name + "," + $item.alias + "," + $item.servername + "," + $item.organizationalunit + "," + $stat.totalitemsize.value.toKB() + "`n"
    }
}

Out-File -filePath $FileName -inputObject $strTest -encoding ASCII

This is my first Powershell script using arguments. In this script, I wanted to submit two arguments: the Organisational Unit to check and the CSV file to save the final results to. The basic workflow is to get all the users in the specified OU and then loop through the results, running Get-MailboxStatistics against each one. A CSV-formatted string is built up in a variable and finally written out to a CSV file.
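
As a usage example, here's how a run might look, followed by a quick check of who would already fit under a proposed limit. The OU, output file and the 500MB (512,000KB) figure are all made up.

.\email_stats.ps1 "domain.local/Company/State/Office" c:\scripts\office_stats.csv

# Filter the exported results for users already under the proposed 500MB limit
Import-Csv c:\scripts\office_stats.csv | Where-Object { [int64]$_."TotalSize(KB)" -lt 512000 }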