Archive for the ‘Azure’ Category

Could Skype be the one communications client to rule them all?

June 23, 2014

Google has Google Voice (soon to be folded into Hangouts, expanding the footprint of the ailing – or morphing – Google+ by forcing users to switch), Facebook has its Messenger client, Viber and Line compete in the voice space, and WhatsApp and Snapchat deliver text and image messaging. The iPhone has Visual Voicemail and iMessage. Where does Microsoft fit into all of this? (more…)


Keeping a-head in the clouds

November 19, 2013

One of the great things about developing on today’s cloud platforms is elastic computing. You never know what the peaks are going to look like, but you don’t want to pay for hardware you’ll only use once in a blue moon. So you opt for a dynamically adaptive scalable solution.

If you’ve read any of my posts about jsErrLog (or “jsErrLog posts” if it’s still down) you’ll know that’s what I did for that service. As I’m offering it for free to anyone with a reasonable load I needed something as cost effective as possible (i.e. free!). When I built it I looked at Windows Azure, Amazon’s EC2, a few smaller options, Virtual Private Servers and finally settled on Google AppEngine – in common with the others it offered a number of options for programming languages and data storage, but the big bonus was a no-nonsense free tier.

Sometimes, however, things don’t go quite as planned…

(more…)

Azure Dynamic Compression

April 9, 2011

On a normal Windows IIS installation it’s pretty easy to turn on dynamic compression for WCF and other served content to reduce the amount of bandwidth you consume (important when you are charged by the byte) – you just change the server properties to enable dynamic as well as the more common static compression.

With Windows Azure, though, it’s a little more interesting: because roles are dynamically assigned and started from a standard instance, you don’t have much control… unless you’re used to doing everything from the command line…

Luckily one of the nice things that you can do with an Azure role is script actions to take place as part of the initialization. The process is as simple as adding the commands you need to execute to a batch script that gets deployed as part of your project and calling it at the relevant time.

The first thing your script needs to do is turn dynamic compression on for the server in that role:

    "%SystemDrive%\Windows\System32\inetsrv\appcmd.exe" set config -section:urlCompression /doDynamicCompression:true /commit:apphost

You then want to set the minimum size (in bytes) for files to be compressed:

    "%SystemDrive%\Windows\System32\inetsrv\appcmd.exe" set config -section:system.webServer/httpCompression -minFileSizeForComp:50 /commit:apphost

Finally, your script should specify the MIME types you want to enable compression for:

    "%SystemDrive%\Windows\System32\inetsrv\appcmd.exe" set config /section:httpCompression /+dynamicTypes.[mimeType='application/xml',enabled='true'] /commit:apphost

    "%SystemDrive%\Windows\System32\inetsrv\appcmd.exe" set config /section:httpCompression /+dynamicTypes.[mimeType='application/atom+xml',enabled='true'] /commit:apphost

    "%SystemDrive%\Windows\System32\inetsrv\appcmd.exe" set config /section:httpCompression /+dynamicTypes.[mimeType='application/json',enabled='true'] /commit:apphost

If you have a problem with MIME types like atom+xml not registering properly, you may need to escape the plus sign and replace the string with 'atom%u002bxml' – I’ve had success with both methods.
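For example, the escaped form of the atom feed entry would look something like this (the same command as above, just with the plus sign encoded):

    "%SystemDrive%\Windows\System32\inetsrv\appcmd.exe" set config /section:httpCompression /+dynamicTypes.[mimeType='application/atom%u002bxml',enabled='true'] /commit:apphost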

You can add as many MIME types as you need to the list, and remember that sometimes you also need to specify the character set you are using:

    "%SystemDrive%\Windows\System32\inetsrv\appcmd.exe" set config /section:httpCompression /+dynamicTypes.[mimeType='application/xml;charset=utf-8',enabled='true'] /commit:apphost

And then, when you’re done, exit the script to tidy up gracefully:

    exit /b 0

Once you have combined those steps into a script and saved it as (e.g.) EnableDynamicCompression.cmd, you should add the script to your Visual Studio project and make sure you select “Copy Always” in the properties for the file to ensure it gets deployed correctly.
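Put together (just the commands from above collected into one file – trim or extend the MIME type list to match what your service actually returns), the script looks something like this:

    REM EnableDynamicCompression.cmd - runs as an elevated startup task on the role instance
    "%SystemDrive%\Windows\System32\inetsrv\appcmd.exe" set config -section:urlCompression /doDynamicCompression:true /commit:apphost
    "%SystemDrive%\Windows\System32\inetsrv\appcmd.exe" set config -section:system.webServer/httpCompression -minFileSizeForComp:50 /commit:apphost
    "%SystemDrive%\Windows\System32\inetsrv\appcmd.exe" set config /section:httpCompression /+dynamicTypes.[mimeType='application/xml',enabled='true'] /commit:apphost
    "%SystemDrive%\Windows\System32\inetsrv\appcmd.exe" set config /section:httpCompression /+dynamicTypes.[mimeType='application/json',enabled='true'] /commit:apphost
    exit /b 0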

Finally you need to add a reference to that startup script in your project’s ServiceDefinition.csdef file and then deploy your project as normal.

    <Startup>
        <Task commandLine="EnableDynamicCompression.cmd" executionContext="elevated" taskType="simple"></Task>
    </Startup>

So how do you know whether it’s working? The thing that trips people up a lot of the time, and makes them think it’s broken, is that a corporate proxy server will often decompress the data for them on the way past. You can check for yourself using a tool like Fiddler to examine the response and make sure it has been gzipped, or you can visit http://www.whatsmyip.org/http_compression/ and test that way (the latter is good if you are behind a proxy which interferes with the compression).
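Alternatively, if you have curl (or a similar HTTP client) to hand, you can make a request that advertises gzip support and look for a Content-Encoding: gzip header in the response – something along these lines, where the URL is just a placeholder for your own hosted service:

    REM The endpoint below is a placeholder - substitute your own service URL
    REM A "Content-Encoding: gzip" header in the output means dynamic compression is working
    curl -s -D - -o NUL -H "Accept-Encoding: gzip,deflate" "http://yourservice.cloudapp.net/service.svc/data"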