Tuesday, 26 May 2015

Create Relying Party Trust for Microsoft Dynamics CRM from Powershell

I've configured claims-based authentication and IFD for MS Dynamics CRM more times than I care to remember, and every time I've done it manually, on the basis that it just doesn't take that long. That's true, but it's also very tedious, so I spent some time creating a script to create the Relying Party Trust needed for MS Dynamics CRM claims-based authentication and IFD to work. Obligatory XKCD.

I've only tried this script with ADFS 3.0 and MS Dynamics CRM 2015, but it should work for MS Dynamics CRM 2013 as well.

It's also possible to pass files containing the claims rules, using the IssuanceTransformRulesFile and IssuanceAuthorizationRulesFile flags of the Add-AdfsRelyingPartyTrust command instead.
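For reference, a minimal sketch of the file-based variant; the file paths here are just placeholders:

 Add-AdfsRelyingPartyTrust -Name "crm2015 - CBA" -Identifier "https://crm2015.dev.local" `
     -IssuanceTransformRulesFile "C:\Scripts\CrmTransformRules.txt" `
     -IssuanceAuthorizationRulesFile "C:\Scripts\CrmAuthRules.txt"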

The script should be run on the ADFS server, after MS Dynamics CRM has been configured for claims-based authentication.

The script can also be used to create the Relying Party Trust for an Internet Facing Deployment; again, it needs to be run after IFD has been configured in MS Dynamics CRM.

param ([string]$Name, [string]$Identifier)

if (-not ([Security.Principal.WindowsPrincipal] [Security.Principal.WindowsIdentity]::GetCurrent()).IsInRole([Security.Principal.WindowsBuiltInRole] "Administrator"))
{
    Write-Warning "You do not have Administrator rights to run this script!`nPlease re-run this script as an Administrator!"
    break
}

if (-not($Name))
{
    Write-Host "Name is a mandatory parameter. This should be the name of the Relying Party Trust"
    break
}

if (-not($Identifier))
{
    Write-Host "Identifier is a mandatory parameter. This will normally be of the form: https://<fqdn crm>/"
    break
}

$Identifier = $Identifier.Trim("/")

#These are the Transform Rules needed for CRM to work.
$transformRules='@RuleTemplate = "PassThroughClaims"
@RuleName = "Pass through UPN"
c:[Type == "http://schemas.xmlsoap.org/ws/2005/05/identity/claims/upn"]
 => issue(claim = c);

@RuleTemplate = "PassThroughClaims"
@RuleName = "Pass through primary SID"
c:[Type == "http://schemas.microsoft.com/ws/2008/06/identity/claims/primarysid"]
 => issue(claim = c);

@RuleTemplate = "MapClaims"
@RuleName = "Transform Windows Account name to Name"
c:[Type == "http://schemas.microsoft.com/ws/2008/06/identity/claims/windowsaccountname"]
 => issue(Type = "http://schemas.xmlsoap.org/ws/2005/05/identity/claims/name", Issuer = c.Issuer, OriginalIssuer = c.OriginalIssuer, Value = c.Value, ValueType = c.ValueType);'

#A single Authorization Rule, i.e. let everybody through. Could tie down further if needed.
$authRules='@RuleTemplate = "AllowAllAuthzRule"
 => issue(Type = "http://schemas.microsoft.com/authorization/claims/permit",
Value = "true");'

#Copied and pasted this from a CRM 2011/ADFS 2.1 RPT
$imperRules ='c:[Type == "http://schemas.microsoft.com/ws/2008/06/identity/claims/primarysid", Issuer =~ "^(AD AUTHORITY|SELF AUTHORITY|LOCAL AUTHORITY)$" ] => issue(store="_ProxyCredentialStore",types=("http://schemas.microsoft.com/authorization/claims/permit"),query="isProxySid({0})", param=c.Value );
c:[Type == "http://schemas.microsoft.com/ws/2008/06/identity/claims/groupsid", Issuer =~ "^(AD AUTHORITY|SELF AUTHORITY|LOCAL AUTHORITY)$" ] => issue(store="_ProxyCredentialStore",types=("http://schemas.microsoft.com/authorization/claims/permit"),query="isProxySid({0})", param=c.Value );
c:[Type == "http://schemas.microsoft.com/ws/2008/06/identity/claims/proxytrustid", Issuer =~ "^SELF AUTHORITY$" ] => issue(store="_ProxyCredentialStore",types=("http://schemas.microsoft.com/authorization/claims/permit"),query="isProxyTrustProvisioned({0})", param=c.Value );'

#Create the Relying Party Trust with the transform, authorization and impersonation rules defined above
Add-AdfsRelyingPartyTrust -Name $Name -Identifier $Identifier -IssuanceTransformRules $transformRules -IssuanceAuthorizationRules $authRules -ImpersonationAuthorizationRules $imperRules

#Point the trust at the CRM federation metadata and enable monitoring and automatic updates
Set-AdfsRelyingPartyTrust -TargetName $Name -MetadataUrl $($Identifier + "/FederationMetadata/2007-06/FederationMetadata.xml") -MonitoringEnabled $true -AutoUpdateEnabled $true

#Force an initial refresh from the federation metadata
Update-ADFSRelyingPartyTrust -TargetName $Name

This is what I ran to create the relying party trust for Claims based authentication:

 .\CRMRPT.ps1 -Name "crm2015 - CBA" -Identifier "https://crm2015.dev.local/" 

and this to create the relying party trust for IFD:

 .\CRMRPT.ps1 -Name "crm2015 - IFD" -Identifier "https://auth.dev.local/"
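To check that the trusts were created and look sensible, something along these lines should do:

 Get-AdfsRelyingPartyTrust -Name "crm2015 - CBA" | Select-Object Name, Identifier, Enabled, MonitoringEnabled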

Sunday, 17 May 2015

Cashless society N=1

A couple of weeks ago I read an article about a new law in Denmark that will effectively make cash no longer legal tender. In other words, businesses will not be obliged to accept cash as payment.

To me this seems fairly sensible. I really hate paying with cash; in fact, I occasionally find myself struggling to remember the PIN for my debit card because I use it so rarely. So I thought I would try to analyse my cash use, and the simplest way of doing this is by looking at cash withdrawals from my bank account.

I decided to have a look at what data I could get from my bank's online banking site and was a little disappointed to find that they only provide the last 12 months. I could request older data, but it would be printed, so I decided to stick to 12 months.

This is the raw data.

As I suspected, there seems to be a decrease in the frequency of cash withdrawals, but the linear fit is pretty poor due to the various outliers.

I did a little bit of thinking and realized that the March and September outliers are due to leaving dos for people at work, and the June one was due to buying some stuff for my girlfriend at Vintage Fair.

I decided to plot the data again but this time without the outliers.

The trend line, again just a linear fit, shows a much better fit than for the raw data, as is to be expected.

I can easily see this trend holding, i.e. I will continue to visit the ATM less and less often, due to the increasing acceptance of contactless payments and the contactless limit being raised to £30 in September.

Friday, 15 May 2015

I Don’t Always Test My Code. But When I Do I Do It In Production

I honestly never thought I would need to post this:

Unable to Navigate to external domain (auth endpoint) in MS Dynamics CRM 2013/2015 IFD

The standard practice for my company when deploying web servers is to use a host header, which probably made sense at some point but it surely made for an interesting Friday.

I'm tired, so I will just present the facts: If you are configuring IFD and can't get to the external domain endpoint, the problem might be that you have a host header for your https binding.

The external domain endpoint is normally https://auth.adomain.com/FederationMetadata/2007-06/FederationMetadata.xml and it is the last step in the Configure IFD wizard.
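A quick way to check whether that endpoint is reachable, without involving a browser, is something like this (using the example URL above):

 Invoke-WebRequest "https://auth.adomain.com/FederationMetadata/2007-06/FederationMetadata.xml" -UseBasicParsing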


Ensure that IIS is configured without a host header for the https binding:
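If you prefer PowerShell to clicking through IIS Manager, the WebAdministration module can show and fix the binding. A rough sketch, assuming the CRM website is called "Microsoft Dynamics CRM" (adjust the site name and host header to match your environment); you may need to re-assign the SSL certificate after recreating the binding:

 Import-Module WebAdministration
 # Show the https bindings; bindingInformation is IP:port:hostheader
 Get-WebBinding -Name "Microsoft Dynamics CRM" -Protocol https
 # Remove the binding that has a host header and recreate it without one
 Remove-WebBinding -Name "Microsoft Dynamics CRM" -Protocol https -Port 443 -HostHeader "auth.adomain.com"
 New-WebBinding -Name "Microsoft Dynamics CRM" -Protocol https -Port 443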


Friday, 8 May 2015

ExpectedException Failures in MSTest?

This morning I got an automated email alerting me of a failed build.

Excerpt from Jenkins

Results               Top Level Tests
-------               ---------------
Failed                SecurePasswordTest.EncryptDecryptFailure
Failed                SecurePasswordTest.EncryptDecryptFailureSecureString
Passed               SecurePasswordTest.EncryptDecryptSuccess
Passed               SecurePasswordTest.EncryptDecryptSuccessSecureString
2/4 test(s) Passed, 2 Failed

This is one of the failing tests:

[TestMethod]
[ExpectedException(typeof(System.Security.Cryptography.CryptographicException))]
public void EncryptDecryptFailureSecureString()
{
    var password = DateTime.Now.Ticks.ToString();
    var encryptedPasswordRandom = Convert.ToBase64String(Encoding.ASCII.GetBytes(password));
    var encrypted = SecurePassword.Encrypt(SecurePassword.ToSecureString(password));
    var decrypted = SecurePassword.ToInsecureString(SecurePassword.DecryptString(encryptedPasswordRandom));
}

I ran the tests from my machine and they worked fine. I tried from the server and they worked ....

At this point I remembered that a second build server had been brought online, as some test runs were taking 2-3 hours. In any case, I checked the other server and I could reproduce the issue; in other words, the test failed there. But why?

Build Server #2 did not have the Visual Studio 2013 Shell installed; after I installed it, everything started working correctly.

It is rather strange that this happened in the first place, and I'm not sure whether it is actually related to the ExpectedException attribute, the type of exception, or something else entirely.

At any rate, hopefully it helps somebody out some day.

Saturday, 2 May 2015

Squid, Azure and beating YouTube's regional filter

Every so often I try to watch videos on YouTube that are not available for my region/country (UK). Normally I can easily find an alternative that is available in the UK, but last week I thought: why not just use a proxy?

I have an MSDN subscription, which, among other things, gives me £100 of Azure credit a month, so I thought I'd use some of that credit for this.

I provisioned an A0 OpenLogic 7.0 box and got started. If you're wondering, the cost for a Linux A0 instance is £0.011/hr (~£8 a month) and bandwidth is £0.08 per GB (the first 5 GB are free).
  1. Install Squid

     sudo yum -y install squid

  2. Add allowed IP addresses

     sudo vi /etc/squid/squid.conf

     # Example rule allowing access from your local networks.
     # Adapt to list your (internal) IP networks from where browsing
     # should be allowed
     acl localnet src 10.0.0.0/8     # RFC1918 possible internal network
     acl localnet src 172.16.0.0/12  # RFC1918 possible internal network
     acl localnet src 192.168.0.0/16 # RFC1918 possible internal network
     acl localnet src fc00::/7       # RFC 4193 local private network range
     acl localnet src fe80::/10      # RFC 4291 link-local (directly plugged) machines

     acl work src  111.111.111.0/24
     acl home src  111.111.111.111

     # INSERT YOUR OWN RULE(S) HERE TO ALLOW ACCESS FROM YOUR CLIENTS

     http_access allow work
     http_access allow home

  3. Enable IP forwarding by editing the sysctl.conf file

     sudo vi /etc/sysctl.conf

     Add the following to the file: net.ipv4.ip_forward = 1

     Then reload the settings: sudo sysctl -p /etc/sysctl.conf

  4. Enable and start Squid

     systemctl enable squid
     systemctl start squid
Now we need to open the firewall in Azure for the box (Squid listens on port 3128 by default) and configure the browser to use this proxy ... Bye, bye, dreaded "The uploader has not made this video available in your country."
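Opening the port can also be scripted with the classic (Service Management) Azure PowerShell module of the time; a rough sketch, where the cloud service and VM names are placeholders:

 Get-AzureVM -ServiceName "squidproxy" -Name "squidproxy" |
     Add-AzureEndpoint -Name "Squid" -Protocol tcp -LocalPort 3128 -PublicPort 3128 |
     Update-AzureVM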

It's worth pointing out that this doesn't work for all websites; see my post about setting up a VPN server on AWS if you want a VPN server instead. Azure doesn't support GRE (protocol 47), so it's SSL VPNs only on Azure.

Tuesday, 31 March 2015

Integrated Windows Authentication Log Out

It turns out that it's sort of possible to log out from Integrated Windows Authentication:

document.execCommand("ClearAuthenticationCache")

A few problems though:
  1. It only works with IE
  2. It will log you out of all websites in IE
The latter tends to annoy users no end, but I don't really know why.

Saturday, 28 February 2015

Ordered Parallel Processing in C#

In one of the projects I've been working on, we need to process a bunch of files containing invoice data. Processing these can be time consuming, as the files can be quite large, and although the way this data is used suggests that the processing could be done overnight, the business has insisted on processing the files during the online day, at 17:00.

The problem is that the files tend to contain an invoice's journey through its various states, and for audit purposes we need to process them all.

So, for instance, if the first record in a file is an on-hold invoice, we want to process it, but we also want to process the same invoice showing as paid further down the file. We can't just process the paid event. Furthermore, we also want the invoice record to end up with a status of paid, which is fairly reasonable.

The problem is that if we process the invoices in parallel, we have no guarantee that they will be processed in the right order, so a paid invoice record might end up with a state of issued, which is not great. So we initially went for the quick and easy solution and processed the files serially.

I gave the matter a little bit more thought and came up with this:

private void UpdateInvoices(IEnumerable<IInvoice> invoices)
{
    // Options for the parallel loops; tune MaxDegreeOfParallelism to suit the environment.
    var po = new ParallelOptions { MaxDegreeOfParallelism = Environment.ProcessorCount };

    // Group the invoices by status and process the status groups in order.
    var groupedInvoices = invoices.GroupBy(x => x.Status)
        .OrderBy(x => x.Key)
        .Select(y => y.Select(x => x));

    foreach (var invoiceGroup in groupedInvoices)
    {
        // Within a single status group, ordering no longer matters, so process in parallel.
        Parallel.ForEach(invoiceGroup, po, (invoice) =>
        {
            UpdateInvoice(invoice);
        });
    }
}

What we do is group all the invoices by status and order the groups by status. We then process all of the invoices in a status group in parallel, so that all invoices with status issued get processed first and paid last, with the other statuses processed in between.

It is, of course, possible to have a separate Parallel.ForEach loop for each status, but I feel that this solution is more elegant and easier to maintain.

PLINQ does have an AsOrdered method, but the UpdateInvoice method doesn't return anything; if it fails to update the database, it simply logs it, and it's for the server boys and girls to worry about.

Furthermore, AsOrdered simply doesn't work the way I expected it to.

The code from this sample has been modified to better simulate what we're trying to achieve:

var source = Enumerable.Range(9, 30);

var parallelQuery = source.AsParallel().AsOrdered()
    .Where(x => x % 3 == 0)
    .Select(x => { System.Diagnostics.Debug.WriteLine("{0} ", x); return x; });

// Use foreach to preserve order at execution time. 
foreach (var v in parallelQuery)
{
    System.Diagnostics.Debug.WriteLine("Project");
    break;
}

// Some operators expect an ordered source sequence. 
var lowValues = parallelQuery.Take(10);

int counter = 0;
foreach (var v in lowValues)
{
    System.Diagnostics.Debug.WriteLine("{0}-{1}", counter, v);
    counter++;
}

The call to Debug.WriteLine plays the same role as UpdateInvoice in the code above, in the sense that both are void calls that cause side effects.

This is what the above prints:
9 15 18 12 30 33 36 21 24 27
Project
9 15 18 12 30 21 36 27 24 33 
0-9 1-12 2-15 3-18 4-21 5-24 6-27 7-30 8-33 9-36 


As you can see, the end result is ordered but the getting there isn't, and the getting there is what we're interested in, which is why we could not use PLINQ.


Saturday, 14 February 2015

Gas and Electricity Consumption in a 1920s mid-terrace house in the North of England.

Last week I was going through some old pen drives to see if there was actually anything worth keeping and I found a lot of old energy consumption measurements I took back at our old house, so I thought I would share them here.

The house was a small mid-terrace house with central heating and a gas cooker, built after the First World War. I started taking the measurements after I decided that leaving my gaming PC on 24/7 wasn't a great idea; I should've taken a few measurements with it on, but there you go. We only heated the house to a relatively low temperature, i.e. ~18 °C.

Unfortunately, I don't have measurements of outside temperature so I cannot correlate energy use to outside temperature, but the data was gathered to try to get a better understanding of how much gas and electricity we were using at the time.

Without further ado here are the charts:

It's hard to see electricity consumption in the above chart, so here it is:

Estimated costs are below. I will not rant about the rather ludicrous way gas and electricity are priced in this country.



Electricity on its own again:





Wednesday, 11 February 2015

Brain Dump 7 - Remove User from group in SharePoint

The clue is in the title.

In essence, below is a method that will remove a user from a group in SharePoint.

The user can be of the form domain\user or user@domain.


public bool RemoveUserFromSharePointGroup(string userName, string groupName)
{
    // context is assumed to be a Microsoft.SharePoint.Client.ClientContext field on the class.
    // Resolve the user (accepts domain\user or user@domain).
    var principal = Microsoft.SharePoint.Client.Utilities.Utility.ResolvePrincipal(context, context.Web, userName,
        Microsoft.SharePoint.Client.Utilities.PrincipalType.User, Microsoft.SharePoint.Client.Utilities.PrincipalSource.All,
        context.Web.SiteUsers, false);

    context.ExecuteQuery();

    if (principal.Value == null)
    {
        return false;
    }

    string login = principal.Value.LoginName;
    GroupCollection siteGroups = context.Web.SiteGroups;
    Group group = siteGroups.GetByName(groupName);

    // Load only the matching user from the group.
    var query = context.LoadQuery(group.Users.Where(usr => usr.LoginName == login).Include(u => u.LoginName));

    context.ExecuteQuery();

    User user = query.SingleOrDefault();

    if (user == null)
    {
        return false;
    }

    group.Users.RemoveByLoginName(user.LoginName);
    context.ExecuteQuery();

    return true;
}