Monday 30 September 2013

Create and Delete Websites from PowerShell

I thought I would share the whole solution rather than just the SSL Binding code.

Import-Module WebAdministration

function Add-Site([string]$folder, [string]$sitename, [string]$protocol="http", [int]$port, [int]$sslport, [string] $hostheader, [string]$thumbprint, [string]$appPoolName, [hashtable] $appDetails, [string]$version="v4.0")
{
 
 if ( -not ( Get-Website | ? {$_.Name -eq $sitename}))
 {
  if ($hostheader)
  {
   New-Item iis:\Sites\$sitename -bindings @{protocol="$protocol";bindingInformation="*:$($port):$($hostheader)"} -physicalPath $folder
  }
  else
  {
   New-Item iis:\Sites\$sitename -bindings @{protocol="$protocol";bindingInformation="*:$($port):"} -physicalPath $folder
  }
  
  if (-not($thumbprint) -or -not ($sslport))
  {
   Write-Error "Ensure that a Certificate Thumbprint and SSLport are set for HTTPS Bindings"
   Write-Host "Let's clean up a little bit here..."
   Remove-Site $sitename
   exit
  }
  else
  {
   AddSSLBinding $thumbprint $sitename $sslport $hostheader
  }

  if ($appDetails -and $appPoolName)
  {
   CreateAppPool $appPoolName
   SetAppPoolVersion $appPoolName $version
   foreach ($app in $appDetails.GetEnumerator())
   {
    MakeApplication $sitename $app.Name $app.Value 
    SetAppPool $sitename $app.Name $appPoolName
   }
  }
  else
  {
   Write-Warning "The website $sitename has been created with no applications or applicationPools. Nothing wrong with this, just saying"
  }
 }
}

function Remove-Site([string]$sitename, [string]$appPoolName)
{  
  Get-ChildItem IIS:\SslBindings | ? {$_.Sites -eq $sitename} | %{ Remove-Item iis:\sslbindings\$($_.pschildname) -Force -Recurse}
  Get-ChildItem IIS:\Sites\ | ?{$_.Name -eq $sitename} |% { Remove-Item  IIS:\Sites\$sitename  -Force -Recurse}
  Get-ChildItem IIS:\AppPools\ |? {$_.Name -eq $appPoolName} | %{ Remove-Item IIS:\AppPools\$appPoolName -Force -Recurse }  
}

function AddSSLBinding([string]$thumbprint, [String]$sitename, [int]$port, [String]$hostheader)
{
 
 $cert = Get-ChildItem cert:\LocalMachine\My | ?{$_.Thumbprint -eq $thumbprint}
 
 if( -not($(Get-ChildItem iis:\sslbindings| ? {$_.Port -eq $port})))
 {
  New-Item IIS:\SslBindings\0.0.0.0!$port -Value $cert | out-null
  
  if ($hostheader)
  {
   New-ItemProperty $(join-path iis:\Sites $sitename) -name bindings -value @{protocol="https";bindingInformation="*:$($port):$($hostheader)";certificateStoreName="My";certificateHash=$thumbprint} | out-null
  }
  else
  {
   New-ItemProperty $(join-path iis:\Sites $sitename) -name bindings -value @{protocol="https";bindingInformation="*:$($port):";certificateStoreName="My";certificateHash=$thumbprint} | out-null
  }
 }
 else
 {
  Write-Warning "SSL binding already exists on port $port"
 }
}

function MakeApplication([string]$sitename, [string]$applicationName, [string]$folder)
{
  New-Item "IIS:\Sites\$sitename\$applicationName" -physicalPath $folder -type Application | out-null
}

function CreateAppPool([string]$applicationPoolName)
{
  New-Item IIS:\AppPools\$applicationPoolName | out-null
}

function SetAppPool([string]$sitename, [string]$application, [string]$applicationPool)
{
  Set-ItemProperty IIS:\sites\$sitename\$application -name applicationPool -value $applicationPool | out-null
}

function SetAppPoolVersion([string]$applicationPool, [string]$version)
{
  Set-ItemProperty IIS:\AppPools\$applicationPool managedRuntimeVersion $version | out-null
}

Export-ModuleMember -Function Add-Site,Remove-Site
An example of how to use the above module to add a new site, assuming that it has been saved as Module.psm1. The thumbprint needs to be that of a certificate already installed on your server in the LocalMachine\My store. This will create a website with an HTTP binding on port 80 and an HTTPS binding on port 443 using the certificate passed in (thumbprint). An application and virtual directory called TestService will be created. All that remains is to copy the website files.
Import-Module Module.psm1
$path="F:\TestWebSite"
$testAppDetails=@{"TestService" = "$path\TestService"}
Add-Site -folder $path -sitename "testsite" -protocol "http" -port 80 -sslport 443 -thumbprint "FE1D6F1A5F217A7724034BA42D8C57BEC36DD168" -appPoolName "testapppool" -appDetails $testAppDetails
and an example of how to remove the same site:
Remove-Site -sitename "testsite" -appPoolName "testapppool"

Sunday 29 September 2013

On Quality

A few months back I hit upon an idea for a rather laborious scheme that would make me not that much money: Selling hard drive magnets on Ebay.

There were approximately 50 or so old hard drives at work, and when I say old, I do mean old. None of them could accommodate more than 18.6 GB of storage with an Ultra2 Wide SCSI interface.

Nobody wanted to take the time to dispose of them properly, which included the unenviable tasks of filling in all the paperwork or indeed running the disks through the data erasure program. I figured that if the disks were not working, nobody had to worry about this.

I set about taking apart the first disk and hit my first roadblock: Torx-headed screws. I had never had the need for a Torx screwdriver, so I decided to go out and buy a set.

I did a bit of research online and I wasn't surprised to find such a large disparity in prices: from as low as a few pounds to around one hundred pounds. I was sure that I didn't need to spend £100 on a set of Torx screwdrivers, but how low should I go?

As is my wont, I procrastinated and resolved to do more research on the topic. However, that weekend I stumbled upon a set of Torx screwdrivers at a discount store for £2.99, so I thought that I might as well buy them there and then.

I was fully aware that at this price the quality of the set would leave a lot to be desired but still I managed to suppress this knowledge long enough for me to dismantle 1.5 hard drives, which is when I hit the quality limit of my £2.99 set of Torx screwdrivers.

As I said above, I was not expecting them to last a lifetime, and they were a spur-of-the-moment, almost impulse buy, so it wasn't unexpected; however, this left me with a bit of a problem.

I know that if one buys the cheapest, in this day and age, with its proliferation of peddlers of cheap junk, one gets poor quality, but is the converse true? In other words, does one get quality by buying expensive stuff? Perhaps more importantly, how much should I have spent on the set given the use I was planning to give it?

It is very easy to determine the quality of items at the bottom of the price scale: they are rubbish, but at least one knows what one is paying for. However, once we leave the safety of the cheaper items, it becomes a lot harder to ascertain how much better something is, or, put another way: do you know what you are paying for when you buy an expensive item?

Take the Apple iPod shuffle, which can be obtained for £35 from Amazon. Storage capacity is 2 GB, it has no screen, it's tiny and it has some sort of clip mechanism. For a similar price, £36, it is possible to buy a Sansa Clip+ with 8 GB of storage, an expansion slot, a screen, an FM radio and also a clip mechanism. Yes, it's slightly bigger, but hardly noticeably so, and voice commands can be added with the RockBox firmware, so are you sacrificing 6 GB of storage for voice commands?

The reality is that to a great extent you are paying for the Apple brand, with its design and quasi-religious following, which means that if you don't really care about design and don't think much of Apple as a brand, then you would be wasting money by going down the iPod shuffle route.

Is there a similar quasi-religious following for, say, Stanley tools? I would rather imagine that this is unlikely to be the case. In fact, from talking to some of my relatives who work or have worked in construction, they seem to buy tools from different brands mostly through experience. In other words, they tend to favour a brand because it has worked for them in the past, and negative experiences have a much more lasting effect than positive ones:
I spent loads of money on an expensive diamond-tipped drill bit set from Black & Decker and it was rubbish. Since then I've always gone with Bosch drill bit sets and power tools.
In truth it might have been the other way round, but the point still stands: a negative experience is a lot more likely to be remembered than a positive one, as the positive one, in this case, simply means having a reliable tool every day for a long time.

Whenever I find myself thinking about quality, I always imagine going back to that Arcadia of the consumer in the days prior to consumerism, whenever they happen to have occurred. In reality, I have to admit that there have always been different quality levels in the products available to the consumer, and while the bewildering price ranges that can be found for most products these days make buying the right item really tricky (by the right item I mean an item whose quality is commensurate with the price paid), it is simply naive to think that it was ever easier.

It was only easier because there was no choice; in other words, if you were a worker you could only afford the cheapest stuff, and that is what you bought. It's only a modern dilemma that we have this paradox of choice, which makes discerning how much of your money goes on quality and how much goes to pay for the brand premium almost impossible.

To a certain extent this is ameliorated by the various product reviews, but product reviews are no panacea, as it is just as likely that the service is being reviewed, which can be helpful but is hardly relevant to the product's quality or lack thereof. Furthermore, a large number of reviews describe personal preference and are normally added very early on, i.e. when no issues have yet been found or the product arrived defective, so they tend to be very Manichean.

There are dedicated people who seem to take reviewing very seriously, a sort of amateur Which? (Consumer Reports if you are in the US), but sadly they are very much the minority, and if you're not contemplating buying something that they have already bought, then you are out of luck.

So what to do?

Wednesday 25 September 2013

Issues with large solutions in MS Dynamics CRM 2011

We have an MS Dynamics CRM 2011 one-box test environment, i.e. CRM and SQL Server on the same box (don't ask why), and our main solution is a good 10+ MB zipped up, which means that it sometimes takes a few attempts to get the solution imported.

As of late, my lack of commitment to scientific inquiry and the betterment of the world continues to show, as I really haven't tested this thoroughly enough, but here it goes anyway.

The problem seemed to be that the W3WP process was using almost all the available memory on the server, which resulted in timeouts when running various SQL queries; at least, that's what the trace log said. It's hard to trust a log that seems surprised that there are no errors, but I digress.

The solution was to set upper and lower limits on SQL Server's memory usage. To be fair, I think the problem was the lower limit, but it makes sense to limit memory usage at the high end as well, lest SQL Server think all the memory is for itself.

EXEC sys.sp_configure N'show advanced options', N'1'  RECONFIGURE WITH OVERRIDE
GO
EXEC sys.sp_configure N'min server memory (MB)', N'512'
GO
EXEC sys.sp_configure N'max server memory (MB)', N'2048'
GO
RECONFIGURE WITH OVERRIDE
GO
EXEC sys.sp_configure N'show advanced options', N'0'  RECONFIGURE WITH OVERRIDE
GO
For the record, the server had 4 GB of RAM, which could well be the source of the issue in the first place; i.e. this might not happen on a server with 8 GB of RAM.

We've not had any of these issues in our OAT environment, which features separate CRM and SQL boxes, each with 8 GB of RAM, so hopefully setting limits on the memory used by SQL Server was the solution to the problem.
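If you want to confirm that the limits have taken effect, the configured values can be read back from sys.configurations. This is only a sketch, assuming the SQLPS (or the newer SqlServer) PowerShell module is available; the "." instance name is a placeholder for your own server:

```powershell
# Read back the configured memory limits; comparing value with value_in_use
# shows whether RECONFIGURE has been run yet
Invoke-Sqlcmd -ServerInstance "." -Query @"
SELECT name, value, value_in_use
FROM sys.configurations
WHERE name IN (N'min server memory (MB)', N'max server memory (MB)')
"@
```
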

Friday 20 September 2013

Updates are currently disallowed on GET requests. To allow updates on a GET, set the 'AllowUnsafeUpdates' property on SPWeb.

So today I hit a limit in SharePoint when listing items from a library:
The attempted operation is prohibited because it exceeds the list view threshold enforced by the administrator
The solution is simple, just increase the number of items allowed. From the Central Administration site:
  1. Application Management -> Manage Web Application and select your web application
  2. In the Ribbon, click on General Settings drop-down and choose “Resource Throttling”.
  3. In the “List View Threshold”, increase the value
The problem I was having was that when I tried to do this I would get the following error:
Updates are currently disallowed on GET requests.  To allow updates on a GET, set the 'AllowUnsafeUpdates' property on SPWeb.
The solution, from the SharePoint PowerShell console:
$sp = get-spwebapplication https://myapp
$sp.HttpThrottleSettings
$sp.Update()
The problem seems to be related to the web application not having a value for HttpThrottleSettings, which will be set by running the above commands.
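For completeness, the List View Threshold itself can also be changed from PowerShell rather than through Central Administration. This is only a sketch, assuming the SharePoint snap-in is loaded; the URL and the value 10000 are placeholders:

```powershell
$webapp = Get-SPWebApplication https://myapp
# Touch HttpThrottleSettings first so the throttle settings object exists
$webapp.HttpThrottleSettings | Out-Null
# MaxItemsPerThrottledOperation is the List View Threshold for ordinary users
$webapp.MaxItemsPerThrottledOperation = 10000
$webapp.Update()
```
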

Sunday 15 September 2013

Issues with Word Automation Services in SharePoint 2010

This week we had an issue with Word Automation Services, in one of our test servers, where our custom code (really boiler plate code, see below) would fail on the second line:

var context = SPServiceContext.GetContext(SPsite);
var wsaProxy = (WordServiceApplicationProxy)context.GetDefaultProxy(typeof(WordServiceApplicationProxy));

Since the same code was working fine in our development environment, it was clear that it was not the code that was at fault but our SharePoint configuration.

The issue was that the Word Automation Services Application had not been configured to be added to the default proxy list, see screenshot below, and thus the code was failing to get the proxy.


Note that adding the Word Automation Services application is done from the Central Administration website:
Central Administration -> Manage Service Applications -> New Word Automation Service

Tuesday 10 September 2013

Add HTTPS/SSL Binding to website in IIS from powershell

Edit:

I've created a PowerShell module to create and remove websites that includes this function; see this post.

Since Wix seems to be frowned upon at work, I have been looking at PowerShell as a replacement to try to automate deployment of builds.

This little script will set the HTTPS binding. The certificate thumbprint is needed, but the rest of the parameters are optional, defaulting to the most common options.

param ([String]$thumbprint, [String]$sitename="Default Web Site", [int]$port=443, [String]$hostheader)

if (-not($thumbprint))
{
  Write-Error "Certificate Thumbprint is needed"
  exit
}

Import-Module WebAdministration

If (-not ([Security.Principal.WindowsPrincipal] [Security.Principal.WindowsIdentity]::GetCurrent()).IsInRole([Security.Principal.WindowsBuiltInRole] "Administrator"))
{
 Write-Warning "Run this script with elevated permissions"
 exit
}

function AddHTTPSBinding([String]$thumbprint, [String]$sitename, [int]$port, [String]$hostheader)
{
 $cert = Get-ChildItem cert:\LocalMachine\My | ?{$_.Thumbprint -eq $thumbprint}
 
 if( -not($(gci iis:\sslbindings| ? {$_.Port -eq $port})))
 {
  New-Item IIS:\SslBindings\0.0.0.0!$port -Value $cert | out-null
  
  if ($hostheader)
  {
   New-ItemProperty $(join-path iis:\Sites $sitename) -name bindings -value @{protocol="https";bindingInformation="*:$($port):$($hostheader)";certificateStoreName="My";certificateHash=$thumbprint}
  }
  else
  {
   New-ItemProperty $(join-path iis:\Sites $sitename) -name bindings -value @{protocol="https";bindingInformation="*:$($port):";certificateStoreName="My";certificateHash=$thumbprint}
  }
 }
 else
 {
  Write-Warning "SSL binding already exists on port $port"
 }
}

AddHTTPSBinding $thumbprint $sitename $port $hostheader

There is a New-WebBinding cmdlet in the WebAdministration module, but I think it needs to be used in conjunction with Set-WebBinding to set the certificate and certificate store.
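For reference, a sketch of what the New-WebBinding route might look like; this is an assumption on my part, not something I've tested. The site name and thumbprint are placeholders, and AddSslCertificate is used to attach the certificate to the binding:

```powershell
Import-Module WebAdministration

# Create the HTTPS binding on port 443
New-WebBinding -Name "Default Web Site" -Protocol https -Port 443

# Attach the certificate (thumbprint + store name) to the new binding
$binding = Get-WebBinding -Name "Default Web Site" -Protocol https
$binding.AddSslCertificate("FE1D6F1A5F217A7724034BA42D8C57BEC36DD168", "My")
```
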

Thursday 5 September 2013

Issues with solutions in MS Dynamics CRM 2011 - Maintaining vs Overwriting Customizations

The ongoing saga of the CRM solutions continues. MS Dynamics CRM 2013 can't come soon enough; hopefully there will be improvements in this area.

Having undertaken no rigorous testing whatsoever, we have determined that overwriting customizations is slower than maintaining customizations when importing a solution, so we normally try to avoid overwriting customizations. The problem is what CRM considers a customization.

I haven't really done a serious investigation, but a colleague did a semi-serious one, and he found that for workflows the simple action of deactivating and then reactivating one was enough for CRM to think it had been changed, and thus it would not be updated if the solution was imported maintaining customizations.

I guess the modifiedon attribute is changed when a workflow is deactivated, and that must explain why CRM thinks it has been changed and thus will not modify it.

Like I said, this is all preliminary, so take it with a pinch of salt. Good thing that MS Dynamics CRM 2013 is about to be released; I'm sure this post will save people loads of headaches.