Author: markh

Hyper-V Disk Speed Musings

While testing out the Intel 905P and gathering the benchmark numbers, I went ahead and installed Hyper-V, set it up correctly, created a Windows Server 2016 VM, and then ran the same CrystalDiskMark benchmarks again, both on the Hyper-V host and inside the VM.

Only the one VM was running, and the host was doing nothing other than hosting whilst I ran the benchmark.

At a queue depth of 32, sequential reads and writes are where they should be.

On to queue depth 32 random reads and writes, and we can start to see a penalty.  Just wait.

At a queue depth of 1, sequential reads and writes are looking not too shabby.  The VM itself sees a reduction on reads, but nothing too bad.

And here we get to the interesting part.  Look at the random reads and writes from inside the VM.  I ran this test multiple times and got nearly the same numbers on every go.  From a VM, raw random disk performance takes a tremendous, nay, massive hit.

I went so far as to reboot the host, rebuild the mirror, and run the numbers again.  Each time was basically the same.  I made sure the virtual disk on the VM was not throttled, and it was a Gen 2 VM.

Quite odd.  I’m hoping to test soon whether the auto-expanding (dynamic) VHDX setup is the culprit.  Stay tuned.
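In the meantime, if you want to check the same things on your own host, here is a minimal sketch (the VM name and VHDX path are placeholders for your own):

#Check for storage QoS throttling on the VM's disks
Get-VMHardDiskDrive -VMName "TestVM" | Select-Object Path, MinimumIOPS, MaximumIOPS

#Check whether the VHDX is dynamic (auto-expanding) or fixed
Get-VHD -Path "D:\VMs\TestVM\disk0.vhdx" | Select-Object VhdType, FileSize, Size

#Converting to fixed takes auto-expansion out of the equation
#(VM must be off; this writes a full-size copy, so check free space first)
Convert-VHD -Path "D:\VMs\TestVM\disk0.vhdx" -DestinationPath "D:\VMs\TestVM\disk0-fixed.vhdx" -VHDType Fixed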

Intel 905P SSD Benchmarks

Today I’ve got some benchmarks on the Intel 905P SSD 960GB card.

For reference, this is a PCIe add-in card (x4 lanes) that uses Intel/Micron 3D XPoint (or Optane) memory instead of the usual NAND flash.  The promise is really high random performance at low queue depths.

For these benchmarks the server config is:

  • Dell R740XD containing two Xeon Gold 6144 CPUs (3.5GHz)
  • Windows Server 2016 Standard, fully updated, full GUI
  • For tests with two cards, each card was placed in a PCIe slot on a different CPU
  • CrystalDiskMark version 5.2.0
  • I ran CrystalDiskMark three times for each test, but since the results were so very, very close each time, I’m only going to show the results of the first run
  • I’m showing the data that includes the IOPS count, as I find that more informative

I’m also including the following drives for comparison’s sake:

  • Intel 4800X 750GB
  • Intel P4510 4TB
  • Micron 9200 MAX 1.6TB
  • Samsung 850 Pro 2TB SATA drive (please note, this was in a Dell R720 server hooked to a PERC H710P RAID controller set to no read ahead, write through, use cache; this gives SSDs on that controller the fastest performance they can get)

For more details of just how awesome 3D XPoint is compared to normal NAND, I highly recommend a stop by AnandTech, as they go into much greater detail than I do.

To the charts!

If you look at sequential reads and writes at a queue depth of 32, you can see just how much of an advantage PCIe-based SSDs have over their SATA ancestors.  On the PCIe side of things, the P4510 is the king.

The same is pretty much the case for random reads and writes at this high queue depth.

At the lower queue depth of 1, we start seeing Optane come into its own on the random read side.  On writes, it sits above the Micron and yet is beaten by the P4510.

Now we come to random reads at a queue depth of 1, where the 905P trounces everything.  Nothing even comes close.

Writes are a different story, but the 905P is still no slouch.

This being a server, however, where uptime is king and redundancy is quite necessary, what happens if we mirror the drives?  Sadly the Dell server does not support Intel VROC yet, so I went with a software mirror inside Windows.  Additionally, I added benchmarks of the Samsung 850 Pro 2TB in RAID 1 (2 drives) and RAID 10 (4 drives).  Settings on the RAID controller are the same as above.
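I won't rehash exactly how I built the mirror, but as a rough sketch, a two-way mirror across the two cards via Storage Spaces looks something like the following (pool and volume names are made up; a dynamic-disk mirror through Disk Management gets you to the same place):

#Pool the two Optane cards and carve a two-way mirror out of them
$disks = Get-PhysicalDisk -CanPool $true
New-StoragePool -FriendlyName "OptanePool" -StorageSubSystemFriendlyName (Get-StorageSubSystem).FriendlyName -PhysicalDisks $disks
New-Volume -StoragePoolFriendlyName "OptanePool" -FriendlyName "OptaneMirror" -ResiliencySettingName Mirror -FileSystem NTFS -Size 800GB -DriveLetter M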

First, we can see that at a queue depth of 32, random reads take a noticeable hit, but random writes take a substantial, if not massive, one.  On the 905P, a single drive’s random writes go from 222,147 IOPS down to 107,525 in the mirror.  The P4510 goes from 239,401 down to 139,499.  That is a large penalty for redundancy.

The long and short of it is that redundancy costs in raw performance.  That being said, your data is redundant, ’nuff said.

All told, if you have a workload that can benefit from low queue depth reads, the 905P is one killer SSD.

 

***DISCLAIMER*** I am not responsible for this breaking or damaging any of your stuff.  Copyrights belong to their original owners***

 

Adventures in Powershell – Math and dates

One month I found myself in a situation where I had to take over an ERP vendor price file update that happens every two weeks.  The actual import into the ERP system was fairly straightforward, as the prices have a built-in “price not effective until such and such date,” but due to reasons beyond my control there were a few requirements that had to be met prior to the import.

Those requirements are, in no particular order:

  • The files have to be no more than two weeks old, otherwise they are not the current files
  • The size of each file has to be evenly divisible by 272; each record is 272 bytes, so dividing the size by 272 tells you how many items are in the file
  • The files have to be renamed to include the effective date of the prices
  • Those files then have to be archived off to a zip file and copied over to the ERP system’s import folder

Before I took this over, it was all done manually.

I was able to get the requirements above into a Powershell script (skip to the end for the full code).  I can’t take original credit for any of this, as I found most of what I needed on Stack Exchange and Microsoft’s Scripting Guy blog.

Anyway, I just put it all together into what I needed.

Let’s go through things step by step.

Step 1:  Set some variables for each of the price files (FF being full file, CO being changes only)


#Declare Price Update Files

#vendor1
$sourcevendor1ff = "\\servershare\download\vendor1FF"
$sourcevendor1co = "\\servershare\download\vendor1CO"
#vendor2
$sourcevendor2ff = "\\servershare\download\vendor2FF"
$sourcevendor2co = "\\servershare\download\vendor2CO"
#vendor3
$sourcevendor3ff = "\\servershare\download\vendor3FF"
$sourcevendor3co = "\\servershare\download\vendor3CO"

Step 2: Get the modified date of each file


#Get date of price files
$FileDatevendor1ff = (Get-ChildItem $sourcevendor1ff).LastWriteTime
$FileDatevendor1co = (Get-ChildItem $sourcevendor1co).LastWriteTime
$FileDatevendor2ff = (Get-ChildItem $sourcevendor2ff).LastWriteTime
$FileDatevendor2co = (Get-ChildItem $sourcevendor2co).LastWriteTime
$FileDatevendor3ff = (Get-ChildItem $sourcevendor3ff).LastWriteTime
$FileDatevendor3co = (Get-ChildItem $sourcevendor3co).LastWriteTime

Step 3: Have the person running the script enter the effective date of the files


#Enter the effective date of the Price files
$effective = read-host 'Enter the price effective date in format yyyymmdd'
$effectiveDate = [datetime]::ParseExact($effective,"yyyyMMdd",$null)
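One aside: ParseExact throws an ugly red error if someone fat-fingers the date.  A slightly more defensive sketch uses TryParseExact to re-prompt instead:

#Re-prompt until the date actually parses
$effectiveDate = [datetime]::MinValue
while (-not [datetime]::TryParseExact($effective, "yyyyMMdd", [System.Globalization.CultureInfo]::InvariantCulture, [System.Globalization.DateTimeStyles]::None, [ref]$effectiveDate)) {
    $effective = read-host 'That did not parse. Enter the price effective date in format yyyymmdd'
}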

Step 4:  Echo back to the person what day of the year the modified date of each file falls on (i.e. Jan 31 is the 31st day of the year, February 20 is day 51, and so on).  This is useful when you need to determine which file is causing the script to stop in step 6.

#Determine days of the year for all variables
    write-host "You entered an effective date of $effectiveDate"
    #Determine Day of Year for effective date
    write-host "Day of the Year for what you entered for an effective date:" $effectiveDate.DayOfYear
    #Determine day of year for vendor1ff
    write-host "vendor1FF was last modified at $FileDatevendor1ff.  Day of the year for vendor1ff file:" $FileDatevendor1ff.DayOfYear
    #Determine day of year for vendor1co
    write-host "vendor1CO was last modified at $FileDatevendor1co.  Day of the year for vendor1co file:" $FileDatevendor1co.DayOfYear
    #Determine day of year for vendor2ff
    write-host "vendor2FF was last modified at $FileDatevendor2ff.  Day of the year for vendor2ff file:" $FileDatevendor2ff.DayOfYear
    #Determine day of year for vendor2co
    write-host "vendor2CO was last modified at $FileDatevendor2co.  Day of the year for vendor2co file:" $FileDatevendor2co.DayOfYear
    #Determine day of year for vendor3ff
    write-host "vendor3FF was last modified at $FileDatevendor3ff.  Day of the year for vendor3ff file:" $FileDatevendor3ff.DayOfYear
    #Determine day of year for vendor3co
    write-host "vendor3CO was last modified at $FileDatevendor3co.  Day of the year for vendor3co file:" $FileDatevendor3co.DayOfYear

Step 5: Do some math to make sure all files are less than 14 days old, and store the result for each file in a variable

#Calculate if price file dates are good

#vendor1co
$pricefilevendor1comath = $effectiveDate.DayOfYear - $FileDatevendor1co.DayOfYear
write-host "There are $pricefilevendor1comath days between what you typed and the modified date of file $sourcevendor1co"
#vendor1FF
$pricefilevendor1ffmath = $effectiveDate.DayOfYear - $FileDatevendor1ff.DayOfYear
write-host "There are $pricefilevendor1ffmath days between what you typed and the modified date of file $sourcevendor1ff"
#vendor2FF
$pricefilevendor2ffmath = $effectiveDate.DayOfYear - $FileDatevendor2ff.DayOfYear
write-host "There are $pricefilevendor2ffmath days between what you typed and the modified date of file $sourcevendor2ff"
#vendor2CO
$pricefilevendor2comath = $effectiveDate.DayOfYear - $FileDatevendor2co.DayOfYear
write-host "There are $pricefilevendor2comath days between what you typed and the modified date of file $sourcevendor2co"
#vendor3FF
$pricefilevendor3ffmath = $effectiveDate.DayOfYear - $FileDatevendor3ff.DayOfYear
write-host "There are $pricefilevendor3ffmath days between what you typed and the modified date of file $sourcevendor3ff"
#vendor3CO
$pricefilevendor3comath = $effectiveDate.DayOfYear - $FileDatevendor3co.DayOfYear
write-host "There are $pricefilevendor3comath days between what you typed and the modified date of file $sourcevendor3co"

Step 6: Do some if/else logic where if any file is older than 14 days, the script stops and tells you


#Do date math to be sure each price file is valid.
#All six files have to be 13 days old or fewer relative to the
#new effective price date
#vendor1CO Test
if ($pricefilevendor1comath -ge 14) {
write-host "Your price file for vendor1co is too old, get an updated one. Stopping script."
Break
}
#vendor1FF Test
if ($pricefilevendor1ffmath -ge 14) {
write-host "Your price file for vendor1ff is too old, get an updated one. Stopping script."
Break
}
#vendor2FF Test
if ($pricefilevendor2ffmath -ge 14) {
write-host "Your price file for vendor2ff is too old, get an updated one. Stopping script."
Break
}
#vendor2CO Test
if ($pricefilevendor2comath -ge 14) {
write-host "Your price file for vendor2co is too old, get an updated one. Stopping script."
Break
}
#vendor3FF Test
if ($pricefilevendor3ffmath -ge 14) {
write-host "Your price file for vendor3ff is too old, get an updated one. Stopping script."
Break
}
#vendor3CO Test
if ($pricefilevendor3comath -ge 14) {
write-host "Your price file for vendor3co is too old, get an updated one. Stopping script."
Break
}
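As an aside, the six copies of the same check could collapse into one loop over a hashtable.  A sketch (note that inside a loop, Break would only exit the loop, hence exit here):

#One loop instead of six if blocks
$fileages = @{
    "vendor1co" = $pricefilevendor1comath
    "vendor1ff" = $pricefilevendor1ffmath
    "vendor2co" = $pricefilevendor2comath
    "vendor2ff" = $pricefilevendor2ffmath
    "vendor3co" = $pricefilevendor3comath
    "vendor3ff" = $pricefilevendor3ffmath
}
foreach ($file in $fileages.Keys) {
    if ($fileages[$file] -ge 14) {
        write-host "Your price file for $file is too old, get an updated one. Stopping script."
        exit
    }
}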

Step 7: If the preceding steps all show files less than 14 days old, rename the files and move them to a temp folder


#If all six price files are 13 or fewer days old, begin processing
else {
#vendor1 Process
write-host "- Starting vendor1 Files"
$sourcearchivevendor1 = "\\servershare\it\price data\vendor1\Temp"

$destvendor1ff = "$sourcearchivevendor1\vendor1ff$effective"
$destvendor1co ="$sourcearchivevendor1\vendor1co$effective"
write-host "- Copying vendor1FF from download to temp folder"
Copy-Item $sourcevendor1ff $destvendor1ff
write-host "- Copying vendor1CO from download to temp folder"
Copy-Item $sourcevendor1co $destvendor1co 

Step 8:  Take the items from the temp folder, archive them into a zip file, and copy them to the ERP import folder


write-host "- Archiving vendor1 to zip file"
$destinationvendor1archive = "\\servershare\it\price data\vendor1\vendor1_$effective.zip"
add-type -AssemblyName "system.io.compression.filesystem"
[io.compression.zipfile]::CreateFromDirectory($sourcearchivevendor1, $destinationvendor1archive)
write-host "- Copying vendor1CO to Share"
copy-item $destvendor1co "\\ERPPriceFiles\IN"
write-host "- vendor1 files processed" 

Step 9:  Calculate the number of parts that are in the file and store that number in a variable


#Find the number of records in each file
$Sizevendor1CO = get-item "$destvendor1co"
$Numvendor1co = $Sizevendor1CO.length/272
$lengthvendor1CO = $Sizevendor1CO.length

$Sizevendor1FF = get-item "$destvendor1ff"
$numvendor1FF = $Sizevendor1FF.length/272
$lengthvendor1FF = $Sizevendor1FF.length

$emailvendor1CO = "vendor1CO is $lengthvendor1CO bytes long and has $Numvendor1co records."
$emailvendor1FF = "vendor1FF is $lengthvendor1FF bytes long and has $numvendor1FF records." 
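Note the requirement was that the size be evenly divisible by 272, and the division above never actually verifies that; a non-integer record count would just get emailed out.  A minimal sanity check (sketch):

#Warn if the file size is not a clean multiple of the 272-byte record size
if ($Sizevendor1CO.length % 272 -ne 0) {
    write-host "Warning: vendor1CO is not evenly divisible by 272; the file may be truncated or corrupt."
}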

Step 10:  Set up your email server and email info


#Declare SMTP Server
$smtpServer = "smtp.email.local"
$smtpFrom = "pricefiles@domain.local"
$smtpTo = "notifications@yournamehere.com"
$messageSubject = "vendor1vendor3vendor2 Price File Record Count for $effective"

#Declare email message body
$message = New-Object System.Net.Mail.MailMessage $smtpfrom, $smtpto
$message.Subject = $messageSubject
$message.IsBodyHTML = $true
$style = "<style>BODY{font-family: Arial; font-size: 10pt;}"
$style = $style + "TABLE{border: 1px solid black; border-collapse: collapse;}"
$style = $style + "TH{border: 1px solid black; background: #dddddd; padding: 5px; }"
$style = $style + "TD{border: 1px solid black; padding: 5px; }"
$style = $style + "</style>" 

Step 11:  Send an email with the file size of each file, the number of items in each file, and a hyperlink to an Excel log


write-host "- Sending email with number of records"
#actual powershell script
$message.Body = "$emailvendor1CO <br> $emailvendor1FF <br>Update file ""\\servershare\it\price data\vendor1\_datalog_vendor1vendor2vendor3.xlsx"" with those counts<br><br> $emailvendor2CO <br> $emailvendor2FF <br>Update file ""\\servershare\it\price data\vendor2\_datalog_vendor1vendor2vendor3.xlsx"" with those counts<br><br>$emailvendor3CO<br>$emailvendor3FF<br>Update file ""\\servershare\it\price data\vendor3\_datalog_vendor1vendor2vendor3.xlsx"" with those counts<br><br>"

$smtp = New-Object Net.Mail.SmtpClient($smtpServer)
$smtp.Send($message)
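Two asides here.  First, $style from step 10 is built but never prepended to the body as written, so the table styling never actually applies; concatenate it in if you want it.  Second, Send-MailMessage can do the same job in one cmdlet (a sketch, assuming the same variables and an anonymous relay):

#One-cmdlet equivalent, with the style block actually included
Send-MailMessage -SmtpServer $smtpServer -From $smtpFrom -To $smtpTo -Subject $messageSubject -BodyAsHtml -Body ($style + $message.Body)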

Step 12:  Clean up your temp folder


write-host "Cleaning up Files"
#File Cleanup
Remove-Item $destvendor1ff
Remove-Item $destvendor1co
Remove-Item $destvendor2ff
Remove-Item $destvendor2co
Remove-Item $destvendor3ff
Remove-Item $destvendor3co

#Done 

And that’s it.  This script has made this chore quite easy for me over the last couple of years.

Below, here it is in its entirety:


#Declare Price Update Files

#vendor1
$sourcevendor1ff = "\\servershare\download\vendor1FF"
$sourcevendor1co = "\\servershare\download\vendor1CO"
#vendor2
$sourcevendor2ff = "\\servershare\download\vendor2FF"
$sourcevendor2co = "\\servershare\download\vendor2CO"
#vendor3
$sourcevendor3ff = "\\servershare\download\vendor3FF"
$sourcevendor3co = "\\servershare\download\vendor3CO"

#Get date of price files
$FileDatevendor1ff = (Get-ChildItem $sourcevendor1ff).LastWriteTime
$FileDatevendor1co = (Get-ChildItem $sourcevendor1co).LastWriteTime
$FileDatevendor2ff = (Get-ChildItem $sourcevendor2ff).LastWriteTime
$FileDatevendor2co = (Get-ChildItem $sourcevendor2co).LastWriteTime
$FileDatevendor3ff = (Get-ChildItem $sourcevendor3ff).LastWriteTime
$FileDatevendor3co = (Get-ChildItem $sourcevendor3co).LastWriteTime

#Enter the effective date of the Price files
$effective = read-host 'Enter the price effective date in format yyyymmdd'
$effectiveDate = [datetime]::ParseExact($effective,"yyyyMMdd",$null)

#Determine days of the year for all variables
write-host "You entered an effective date of $effectiveDate"
#Determine Day of Year for effective date
write-host "Day of the Year for what you entered for an effective date:" $effectiveDate.DayOfYear
#Determine day of year for vendor1ff
write-host "vendor1FF was last modified at $FileDatevendor1ff. Day of the year for vendor1ff file:" $FileDatevendor1ff.DayOfYear
#Determine day of year for vendor1co
write-host "vendor1CO was last modified at $FileDatevendor1co. Day of the year for vendor1co file:" $FileDatevendor1co.DayOfYear
#Determine day of year for vendor2ff
write-host "vendor2FF was last modified at $FileDatevendor2ff. Day of the year for vendor2ff file:" $FileDatevendor2ff.DayOfYear
#Determine day of year for vendor2co
write-host "vendor2CO was last modified at $FileDatevendor2co. Day of the year for vendor2co file:" $FileDatevendor2co.DayOfYear
#Determine day of year for vendor3ff
write-host "vendor3FF was last modified at $FileDatevendor3ff. Day of the year for vendor3ff file:" $FileDatevendor3ff.DayOfYear
#Determine day of year for vendor3co
write-host "vendor3CO was last modified at $FileDatevendor3co. Day of the year for vendor3co file:" $FileDatevendor3co.DayOfYear

#Calculate if price file dates are good

#vendor1co
$pricefilevendor1comath = $effectiveDate.DayOfYear - $FileDatevendor1co.DayOfYear
write-host "There are $pricefilevendor1comath days between what you typed and the modified date of file $sourcevendor1co"
#vendor1FF
$pricefilevendor1ffmath = $effectiveDate.DayOfYear - $FileDatevendor1ff.DayOfYear
write-host "There are $pricefilevendor1ffmath days between what you typed and the modified date of file $sourcevendor1ff"
#vendor2FF
$pricefilevendor2ffmath = $effectiveDate.DayOfYear - $FileDatevendor2ff.DayOfYear
write-host "There are $pricefilevendor2ffmath days between what you typed and the modified date of file $sourcevendor2ff"
#vendor2CO
$pricefilevendor2comath = $effectiveDate.DayOfYear - $FileDatevendor2co.DayOfYear
write-host "There are $pricefilevendor2comath days between what you typed and the modified date of file $sourcevendor2co"
#vendor3FF
$pricefilevendor3ffmath = $effectiveDate.DayOfYear - $FileDatevendor3ff.DayOfYear
write-host "There are $pricefilevendor3ffmath days between what you typed and the modified date of file $sourcevendor3ff"
#vendor3CO
$pricefilevendor3comath = $effectiveDate.DayOfYear - $FileDatevendor3co.DayOfYear
write-host "There are $pricefilevendor3comath days between what you typed and the modified date of file $sourcevendor3co"

#Do date math to be sure each price file is valid.
#All six files have to be 13 days old or fewer relative to the
#new effective price date
#vendor1CO Test
if ($pricefilevendor1comath -ge 14) {
write-host "Your price file for vendor1co is too old, get an updated one. Stopping script."
Break
}
#vendor1FF Test
if ($pricefilevendor1ffmath -ge 14) {
write-host "Your price file for vendor1ff is too old, get an updated one. Stopping script."
Break
}
#vendor2FF Test
if ($pricefilevendor2ffmath -ge 14) {
write-host "Your price file for vendor2ff is too old, get an updated one. Stopping script."
Break
}
#vendor2CO Test
if ($pricefilevendor2comath -ge 14) {
write-host "Your price file for vendor2co is too old, get an updated one. Stopping script."
Break
}
#vendor3FF Test
if ($pricefilevendor3ffmath -ge 14) {
write-host "Your price file for vendor3ff is too old, get an updated one. Stopping script."
Break
}
#vendor3CO Test
if ($pricefilevendor3comath -ge 14) {
write-host "Your price file for vendor3co is too old, get an updated one. Stopping script."
Break
}

#If all six price files are 13 or fewer days old, begin processing
else {
#vendor1 Process
write-host "- Starting vendor1 Files"
$sourcearchivevendor1 = "\\servershare\it\price data\vendor1\Temp"

$destvendor1ff = "$sourcearchivevendor1\vendor1ff$effective"
$destvendor1co ="$sourcearchivevendor1\vendor1co$effective"
write-host "- Copying vendor1FF from download to temp folder"
Copy-Item $sourcevendor1ff $destvendor1ff
write-host "- Copying vendor1CO from download to temp folder"
Copy-Item $sourcevendor1co $destvendor1co
write-host "- Archiving vendor1 to zip file"
$destinationvendor1archive = "\\servershare\it\price data\vendor1\vendor1_$effective.zip"
add-type -AssemblyName "system.io.compression.filesystem"
[io.compression.zipfile]::CreateFromDirectory($sourcearchivevendor1, $destinationvendor1archive)
write-host "- Copying vendor1CO to Share"
copy-item $destvendor1co "\\ERPPriceFiles\IN"
write-host "- vendor1 files processed"

#vendor2 Process
write-host "- Starting vendor2 Files"
$sourcearchivevendor2 = "\\servershare\it\price data\vendor2\Temp"

$destvendor2ff = "$sourcearchivevendor2\vendor2ff$effective"
$destvendor2co ="$sourcearchivevendor2\vendor2co$effective"
write-host "- Copying vendor2FF from download to temp folder"
Copy-Item $sourcevendor2ff $destvendor2ff
write-host "- Copying vendor2CO from download to temp folder"
Copy-Item $sourcevendor2co $destvendor2co
write-host "- Archiving vendor2 to zip file"
$destinationvendor2archive = "\\servershare\it\price data\vendor2\vendor2_$effective.zip"
add-type -AssemblyName "system.io.compression.filesystem"
[io.compression.zipfile]::CreateFromDirectory($sourcearchivevendor2, $destinationvendor2archive)
write-host "- Copying vendor2CO to Share"
copy-item $destvendor2co "\\ERPPriceFiles\IN"
write-host "- vendor2 files processed"

#vendor3 Process
write-host "- Starting vendor3 files"
$sourcearchivevendor3 = "\\servershare\it\price data\vendor3\Temp"
$sourcevendor3ff = "\\servershare\download\vendor3FF"
$sourcevendor3co = "\\servershare\download\vendor3CO"
$destvendor3ff = "$sourcearchivevendor3\vendor3ff$effective"
$destvendor3co ="$sourcearchivevendor3\vendor3co$effective"
write-host "- Copying vendor3FF from download to temp folder. Larger file, takes longer"
Copy-Item $sourcevendor3ff $destvendor3ff
write-host "- Copying vendor3CO from download to temp folder"
Copy-Item $sourcevendor3co $destvendor3co
$destinationvendor3archive = "\\servershare\it\price data\vendor3\vendor3_$effective.zip"
write-host "- Archiving vendor3 to zip file"
add-type -AssemblyName "system.io.compression.filesystem"
[io.compression.zipfile]::CreateFromDirectory($sourcearchivevendor3, $destinationvendor3archive)
write-host "- Copying vendor3CO to Share"
copy-item $destvendor3co "\\ERPPriceFiles\IN"
write-host "- vendor3 files processed"

write-host "- Calcuating number of records"

#Find the number of records in each file
$Sizevendor1CO = get-item "$destvendor1co"
$Numvendor1co = $Sizevendor1CO.length/272
$lengthvendor1CO = $Sizevendor1CO.length

$Sizevendor1FF = get-item "$destvendor1ff"
$numvendor1FF = $Sizevendor1FF.length/272
$lengthvendor1FF = $Sizevendor1FF.length

$emailvendor1CO = "vendor1CO is $lengthvendor1CO bytes long and has $Numvendor1co records."
$emailvendor1FF = "vendor1FF is $lengthvendor1FF bytes long and has $numvendor1FF records."

$Sizevendor2CO = get-item "$destvendor2co"
$Numvendor2co = $Sizevendor2CO.length/272
$lengthvendor2CO = $Sizevendor2CO.length

$Sizevendor2FF = get-item "$destvendor2ff"
$numvendor2FF = $Sizevendor2FF.length/272
$lengthvendor2FF = $Sizevendor2FF.length

$emailvendor2CO = "vendor2CO is $lengthvendor2CO bytes long and has $Numvendor2co records."
$emailvendor2FF = "vendor2FF is $lengthvendor2FF bytes long and has $numvendor2FF records."

$Sizevendor3CO = get-item "$destvendor3co"
$Numvendor3co = $Sizevendor3CO.length/272
$lengthvendor3CO = $Sizevendor3CO.length

$Sizevendor3FF = get-item "$destvendor3ff"
$numvendor3FF = $Sizevendor3FF.length/272
$lengthvendor3FF = $Sizevendor3FF.length

$emailvendor3CO = "vendor3CO is $lengthvendor3CO bytes long and has $Numvendor3co records."
$emailvendor3FF = "vendor3FF is $lengthvendor3FF bytes long and has $numvendor3FF records."

#Declare SMTP Server
$smtpServer = "smtp.email.local"
$smtpFrom = "pricefiles@domain.local"
$smtpTo = "notifications@yournamehere.com"
$messageSubject = "vendor1vendor3vendor2 Price File Record Count for $effective"

#Declare email message body
$message = New-Object System.Net.Mail.MailMessage $smtpfrom, $smtpto
$message.Subject = $messageSubject
$message.IsBodyHTML = $true
$style = "<style>BODY{font-family: Arial; font-size: 10pt;}"
$style = $style + "TABLE{border: 1px solid black; border-collapse: collapse;}"
$style = $style + "TH{border: 1px solid black; background: #dddddd; padding: 5px; }"
$style = $style + "TD{border: 1px solid black; padding: 5px; }"
$style = $style + "</style>"

write-host "- Sending email with number of records"
#actual powershell script
$message.Body = "$emailvendor1CO <br> $emailvendor1FF <br>Update file ""\\servershare\it\price data\vendor1\_datalog_vendor1vendor2vendor3.xlsx"" with those counts<br><br> $emailvendor2CO <br> $emailvendor2FF <br>Update file ""\\servershare\it\price data\vendor2\_datalog_vendor1vendor2vendor3.xlsx"" with those counts<br><br>$emailvendor3CO<br>$emailvendor3FF<br>Update file ""\\servershare\it\price data\vendor3\_datalog_vendor1vendor2vendor3.xlsx"" with those counts<br><br>"

$smtp = New-Object Net.Mail.SmtpClient($smtpServer)
$smtp.Send($message)

write-host "Cleaning up Files"
#File Cleanup
Remove-Item $destvendor1ff
Remove-Item $destvendor1co
Remove-Item $destvendor2ff
Remove-Item $destvendor2co
Remove-Item $destvendor3ff
Remove-Item $destvendor3co

#Done

}

 

***DISCLAIMER*** I am not responsible for this breaking or damaging any of your stuff.  Copyrights belong to their original owners***

 

Hyper-V CPU Musings

Recently the opportunity arose to test a few CPU core configurations on an unused host.

My goal here is to see whether a CPU virtualization penalty exists and, secondarily, what effect hyperthreading has on CPU performance in a single-VM setting.

Specs of the host:  Dual Xeon Gold 6144, 512GB RAM, SSD

Hyper-V version:  Windows Server 2016 (Long-Term Servicing Branch)

To start, here are the Cinebench R15 scores before Hyper-V was installed:

Hyperthreading enabled got a score of 3427, whereas hyperthreading disabled got a score of 2680.

Next, I installed Hyper-V and built a VM running the full GUI of 2012 R2, fully updated.

First test, hyperthreading disabled, VM has 16 cores assigned:

Nice!  Only 2 points off the physical score.

Now, enable hyperthreading at the host level.  The VM still has 16 cores assigned:

Ouch, 1021 points lower (-38%).  Keep in mind all we did was enable hyperthreading on the host.  A 38% penalty just from that setting.

Next test, assign 32 cores to the same VM:

Above 3000 again.  191 points (-5%) off the physical install benchmark above.

And just because, 24 cores assigned to the VM:

Here we have 511 points off the physical host (-15%).
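For reference, swapping the vCPU count between runs is just a matter of the following (the VM name is a placeholder; the VM has to be off to change it):

#Change assigned cores between benchmark runs
Stop-VM -Name "CineVM"
Set-VMProcessor -VMName "CineVM" -Count 24
Start-VM -Name "CineVM"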

Conclusion:

What did I learn?  With hyperthreading disabled, there is a virtualization penalty, but it barely registers.

It’s when hyperthreading is enabled that one has to be careful.  That large of a hit (-38%) is interesting to say the least.

That being said, with hyperthreading enabled, assigning all available logical cores to the VM brought it back to within about 5% of bare metal, which isn’t too shabby.

***DISCLAIMER*** I am not responsible for this breaking or damaging any of your stuff.  Copyrights belong to their original owners***

Intel Hyperthreading… worth it?

In a recent post, I listed a benchmark using Cinebench R15 on a pair of Xeon Gold 6144 CPUs.  In that post, I mentioned that hyperthreading was enabled.  How much of a difference did that make?

To recap, here’s the full 16 physical/32 logical benchmark:

And now, hyperthreading disabled:

And there we go.  Hyperthreading adds another 747 points, for an increase of nearly 28%.

Oddly enough, that is roughly in line with something I read before, where hyperthreading can add up to a 25% increase in performance (on physical hardware).  Sadly, I can’t remember where or when I read that.

***DISCLAIMER*** I am not responsible for this breaking or damaging any of your stuff.  Copyrights belong to their original owners***

Upgrading to a current gen Xeon… worth it?

Recently I had a chance to introduce a new server into our environment, spec’d out with two CPUs from Intel’s Xeon Gold family with the same number of cores as the server it was replacing.  Could there be an increase in CPU performance?

Let’s find out, using a quick and easy way to measure:  Cinebench R15

The old server (Dell R720XD) running two Intel Xeon E5-2667 v2 processors (Ivy Bridge, 8 cores each):

The new server (Dell R740XD) running two Intel Xeon Gold 6144 processors (Skylake-SP, 8 cores each)

BIOS and power management settings on both were set to maximum performance (no power savings).

So first, 16 cores of Xeon E5-2667v2 (Hyperthreading enabled)

A score of 2464.  No slouch.

Next, 16 cores of Xeon Gold 6144 (Hyperthreading enabled)

Wow, 3427! Almost 1000 points higher, which if my math is right is nearly 40% faster.

Keep in mind this is one benchmark, YMMV, but it’s an easy way of showing the difference three generations of CPUs can make.

***DISCLAIMER*** I am not responsible for this breaking or damaging any of your stuff.  Copyrights belong to their original owners***

Clearing AX Client Cache for All Users on a RDS Host

It’s AX code update time, and you need to clear out everyone’s AX client-side cache on your RDS host servers.

That’s a lot of user folders to go through.

Here’s what I do:

Let’s start with some assumptions.

  1. You have the AX client deployed using Remote Desktop Services (RemoteAPP)
  2. You are not using User Profile Disks

If that describes your environment, read on.  If not, or if it’s close, keep reading.  You may be able to modify my solution to fit your needs.

The goal:  Remove all local AX cache files for all users in the following locations:

  1. All files/folders starting with “vsa” located here: C:\Users\%username%\AppData\Local\Microsoft\Dynamics AX\
  2. All files ending in “*.kti” or “*.auc” located here: C:\Users\%username%\AppData\Local

Requirements:

  1. The user running the Powershell script must have full rights to each user folder on each RDS host server.  Membership in Domain Admins would be a good place to start for the quick and dirty way.
  2. You must run this on each RDS host server.

What it does:

  1. Uses “Get-ChildItem” under “C:\Users” to generate a list of all user folders
  2. Using “foreach”, it loops through every folder
  3. In each folder, it runs “Get-Item” against the specific AX client cache paths to get all matching files/folders, then pipes them to “ForEach-Object” to delete them

The code:

#Specify directory containing user folders
$userfolderlist = get-childitem -path c:\users -directory

#Remove AX client cache files for each folder
foreach ($item in $userfolderlist)
{
    write-host "Removing AX cache for username $item"
    get-item "C:\Users\$item\AppData\Local\Microsoft\Dynamics Ax\vsa*" | ForEach-Object {remove-item $_.fullname -recurse}
    get-item "C:\Users\$item\AppData\Local\*.kti" | ForEach-Object {remove-item $_.fullname -recurse}
    get-item "C:\Users\$item\AppData\Local\*.auc" | ForEach-Object {remove-item $_.fullname -recurse}
}

***DISCLAIMER*** I am not responsible for this breaking or damaging any of your stuff.  Copyrights belong to their original owners***

Clearing AX Client Cache for One User in a RDS Environment Using Powershell

“You should clear that person’s AX cache”.

How many times have you heard that?  If you are running Dynamics 2012 (R2 in my case), this post may help.

Let’s start with some assumptions.

    1. You have the AX client deployed using Remote Desktop Services (RemoteAPP)
    2. You are not using User Profile Disks
    3. You have multiple RDS Hosts people connect to

If that describes your environment, read on.  If not, or if it’s close, keep reading.  You may be able to modify my solution to fit your needs.

The goal:  Remove all local AX cache files for one user in the following locations:

  1. All files/folders starting with “vsa” located here: \\%hostname%\c$\users\%username%\AppData\Local\Microsoft\Dynamics AX\
  2. All files ending in “*.kti” or “*.auc” located here: \\%hostname%\c$\users\%username%\AppData\Local
  3. Additionally, be able to do this for the Production, Test, Dev, and Training AX RDS servers

Requirements:  The user running the Powershell script must have full rights to each user folder on each RDS host server.  Membership in Domain Admins would be a good place to start for the quick and dirty way.

What it does:

  1. Prompts for the AD username of the user whose cache you want cleared
  2. Reminds you to make sure that user’s RDS session has been logged off, otherwise the AX client may still be running and the files you want to delete will be locked
  3. Asks if you are sure.  Enter Y and the script continues; anything else skips to the end.
  4. Asks which AX environment you want to do this in
  5. Connects to the RDS server(s) for the environment specified in step 4

From there, hard-coded if/else blocks pick the specific servers for whichever AX environment you chose.

The code:

#This script will connect to AX RDS servers and delete the AX client cache
#Make sure the user has been logged out of the RDS box first otherwise the files will be locked

#Provide the Active Directory Username
$axuser = read-host 'Enter the AD username'
write-host "Make sure that $axuser has been logged out of their RDS session before proceeding"

$Yousure = read-host 'Are you sure you want to continue? Y/N'
If ($yousure -match "y")
 {

#Provide the AX Environment
write-host "Possible AX Environments:  Test, Training, Dev, Prod"
$axenvironment = read-host 'Enter the AX Environment'
    
#Connect to AX environment chosen and delete local cache    
    If ($axenvironment -match "Test")
        {
            write-host "Removing Test AX cache for username $axuser"
            get-item "\\axtestrds\c$\Users\$axuser\AppData\Local\Microsoft\Dynamics Ax\vsa*" | foreach ($_) {remove-item $_.fullname -recurse} 
            get-item "\\axtestrds\c$\Users\$axuser\AppData\Local\*.kti" | foreach ($_) {remove-item $_.fullname -recurse} 
            get-item "\\axtestrds\c$\Users\$axuser\AppData\Local\*.auc" | foreach ($_) {remove-item $_.fullname -recurse}
        }
        else { If ($axenvironment -match "Training")
            {
                write-host "Removing Training AX cache for username $axuser"
                get-item "\\TrainingRDS\c$\Users\$axuser\AppData\Local\Microsoft\Dynamics Ax\vsa*" | foreach ($_) {remove-item $_.fullname -recurse} 
                get-item "\\TrainingRDS\c$\Users\$axuser\AppData\Local\*.kti" | foreach ($_) {remove-item $_.fullname -recurse } 
                get-item "\\TrainingRDS\c$\Users\$axuser\AppData\Local\*.auc" | foreach ($_) {remove-item $_.fullname -recurse } 
            }
                else { If ($axenvironment -match "Dev")
                    {
                        write-host "Removing Dev AX cache for username $axuser"
                        get-item "\\DevRDS\c$\Users\$axuser\AppData\Local\Microsoft\Dynamics Ax\vsa*" | foreach ($_) {remove-item $_.fullname -recurse } 
                        get-item "\\DevRDS\c$\Users\$axuser\AppData\Local\*.kti" | foreach ($_) {remove-item $_.fullname -recurse } 
                        get-item "\\DevRDS\c$\Users\$axuser\AppData\Local\*.auc" | foreach ($_) {remove-item $_.fullname -recurse } 
                    }
                    else { If ($axenvironment -match "Prod")
                        {
                            write-host "Removing Prod AX cache for username $axuser"
                            get-item "\\ProdRDS02\c$\Users\$axuser\AppData\Local\Microsoft\Dynamics Ax\vsa*" | foreach ($_) {remove-item $_.fullname -recurse} 
                            get-item "\\ProdRDS02\c$\Users\$axuser\AppData\Local\*.kti" | foreach ($_) {remove-item $_.fullname -recurse} 
                            get-item "\\ProdRDS02\c$\Users\$axuser\AppData\Local\*.auc" | foreach ($_) {remove-item $_.fullname -recurse} 
                            get-item "\\ProdRDS03\c$\Users\$axuser\AppData\Local\Microsoft\Dynamics Ax\vsa*" | foreach ($_) {remove-item $_.fullname -recurse} 
                            get-item "\\ProdRDS03\c$\Users\$axuser\AppData\Local\*.kti" | foreach ($_) {remove-item $_.fullname -recurse} 
                            get-item "\\ProdRDS03\c$\Users\$axuser\AppData\Local\*.auc" | foreach ($_) {remove-item $_.fullname -recurse} 
                            get-item "\\ProdRDS04\c$\Users\$axuser\AppData\Local\Microsoft\Dynamics Ax\vsa*" | foreach ($_) {remove-item $_.fullname -recurse} 
                            get-item "\\ProdRDS04\c$\Users\$axuser\AppData\Local\*.kti" | foreach ($_) {remove-item $_.fullname -recurse } 
                            get-item "\\ProdRDS04\c$\Users\$axuser\AppData\Local\*.auc" | foreach ($_) {remove-item $_.fullname -recurse } 
                            get-item "\\ProdRDS05\c$\Users\$axuser\AppData\Local\Microsoft\Dynamics Ax\vsa*" | foreach ($_) {remove-item $_.fullname -recurse} 
                            get-item "\\ProdRDS05\c$\Users\$axuser\AppData\Local\*.kti" | foreach ($_) {remove-item $_.fullname -recurse } 
                            get-item "\\ProdRDS05\c$\Users\$axuser\AppData\Local\*.auc" | foreach ($_) {remove-item $_.fullname -recurse } 
                            get-item "\\1075rds09\c$\Users\$axuser\AppData\Local\*.auc" | foreach ($_) {remove-item $_.fullname -recurse} 
                            }
                                else { write-host "You did not enter the right command"
                                    }
                        }
                    }
                }

#Reminder to clear usage data in AX Client
write-host "Now that cache has been cleared, connect to AX client on $axenvironment and clear the usage data for $axuser"
}
else 
                    {write-host "You chose not to continue.  Well played"
                    }
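As an aside, the nested if/else blocks could be flattened into a lookup table of environment to RDS hosts, with one loop doing the deletes.  A sketch using the same server names as above (note the lookup is an exact key match rather than the original’s substring -match):

#Environment-to-server lookup instead of nested if/else
$rdshosts = @{
    "Test"     = @("axtestrds")
    "Training" = @("TrainingRDS")
    "Dev"      = @("DevRDS")
    "Prod"     = @("ProdRDS02","ProdRDS03","ProdRDS04","ProdRDS05")
}
if ($rdshosts.ContainsKey($axenvironment)) {
    foreach ($server in $rdshosts[$axenvironment]) {
        write-host "Removing $axenvironment AX cache for $axuser on $server"
        get-item "\\$server\c$\Users\$axuser\AppData\Local\Microsoft\Dynamics Ax\vsa*" | ForEach-Object {remove-item $_.fullname -recurse}
        get-item "\\$server\c$\Users\$axuser\AppData\Local\*.kti", "\\$server\c$\Users\$axuser\AppData\Local\*.auc" | ForEach-Object {remove-item $_.fullname -recurse}
    }
}
else { write-host "You did not enter the right command" }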

***DISCLAIMER*** I am not responsible for this breaking or damaging any of your stuff.  Copyrights belong to their original owners***

Copy AX Security Roles to Another User Using Powershell

As anyone who is an admin for Dynamics AX 2012 (R2 in my case) can attest, assigning user security roles can be a monotonous chore, especially if you have a high number of users that need specific roles.

Initially when my company implemented AX, user security roles were added via the AX client.  User to copy on one side, user to add on the other.  Click assign roles for each one you need to copy.

Ugh.

About a year later I was looking into other options and found that while there isn’t much Powershell for AX 2012, one of the things you can do is add or remove security roles.  Tell me more!

Let’s start, shall we?

Below in this first image, we have one Leia Organa, new hire to our company.

Oddly, according to HR she needs to have the same roles as her brother:

At this point, you need to be on a computer configured with an AX connection to the AOS service and with the AX management utilities installed, and the Windows user you run this as needs security or system admin permissions in the AX environment.  In my case, I run this directly on the AOS server.  YMMV.

Anyway, the code:

#Load AX Powershell functions
. "C:\Program Files\Microsoft Dynamics AX\60\ManagementUtilities\Microsoft.Dynamics.ManagementUtilities.ps1"

$axuserold = read-host 'Enter the AD username of the person to copy from'
$axusernew = read-host 'Enter the AD username of the person to copy to'

get-axsecurityrole -axuserid $axuserold | ForEach-Object {add-axsecurityrolemember -axuserid $axusernew -aotname $_.aotname}

What it does:

Step 1:  Enter the Active Directory user name of the person to copy from (pauses for input):

Step 2:  Enter the Active Directory user name of the person to copy to (pauses for input):

Hit enter, and once completed the window goes away.

How does it work?

It runs the Powershell command “Get-AXSecurityRole” against the user you name, then pipes each role to “Add-AXSecurityRoleMember” to add that role to the user you specify.  It loops through every role the originating user has and adds each one.

Finally, Leia is good to go:

This has saved me countless hours of tediously adding roles for new people in the organization.  Hopefully it can help you out as well.
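One extra trick: the same pattern works in reverse if you want the copy to be exact rather than additive.  Assuming Remove-AXSecurityRoleMember is available alongside the Add/Get cmdlets in the same management utilities (an assumption on my part; test it somewhere non-production first), you could strip the target user’s existing roles before copying:

#Assumes Remove-AXSecurityRoleMember ships with the Add/Get cmdlets
get-axsecurityrole -axuserid $axusernew | ForEach-Object {remove-axsecurityrolemember -axuserid $axusernew -aotname $_.aotname}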

***DISCLAIMER*** I am not responsible for this breaking or damaging any of your stuff.  Copyrights belong to their original owners***