1&1 domain transfer status: ‘Domain Update Done’

In the process of moving a .co.uk domain over to 1&1 for someone this week, I encountered the undocumented status code of ‘Domain Update Done’.

The domain had previously shown as ‘Ready’ in the control panel and I was able to set the web server and mail server records for them using 1&1’s nameservers.

The gotcha is that while the domain sits in the ‘Domain Update Done’ status you can’t swap the nameserver settings. You can still change the A and MX records within 1&1’s nameservers, but if you want to point the domain at an external nameserver you’ll have to wait until this status clears.

The official word I got from 1&1 support was: “Domain Update Done status means that a domain is already on its last phase of propagation and normally, this will take 24-48 hours.”

So just be warned that if you want to change your initial settings when transferring a domain, you might get locked out while the domain is being finalised on your account.

Fixing MSDTC between two machines on different domains

I’ve been chasing problems with MSDTC today. We were trying to get a machine on one domain to use MSDTC through COM+ to talk to a remote SQL Server on a different domain.

Select / Read operations seemed to work fine, but when it attempted to run an UPDATE method in a transaction, it failed with an exception saying:

COM+ was unable to talk to the Microsoft Distributed
Transaction Coordinator (Exception from HRESULT: 0x8004E00F)
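By way of illustration, here’s roughly the shape of code that provokes this (a reconstruction with hypothetical server, database and table names, not the project’s actual code; our real calls went via COM+, but any transaction that gets promoted to a distributed one exercises MSDTC in the same way):

using System;
using System.Data.SqlClient;
using System.Transactions;

class MsdtcSmokeTest
{
  static void Main()
  {
    // Hypothetical connection string pointing at the remote SQL Server.
    const string connStr =
      "Data Source=REMOTESERVER;Initial Catalog=TestDb;Integrated Security=SSPI";

    using (var scope = new TransactionScope())
    using (var conn = new SqlConnection(connStr))
    {
      conn.Open();

      // Reads worked; it was the write inside the transaction that blew up.
      // Once the transaction is promoted to a distributed one, MSDTC on both
      // machines has to cooperate; 0x8004E00F appears when they can't.
      using (var cmd = new SqlCommand(
        "UPDATE Widgets SET Name = 'test' WHERE Id = 1", conn))
      {
        cmd.ExecuteNonQuery();
      }

      scope.Complete();
    }
  }
}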

The following information describes my eventual journey to success.

Continue reading Fixing MSDTC between two machines on different domains

Google Contacts throws a gauntlet at Facebook?

Given the size Facebook has grown to, there’s an obvious ‘elephant in the corner’ for those willing to step back from their timeline for a minute. Users who are only interested in friends, photos and now location-based check-ins too might miss the fact that Facebook is data-mining on a grand scale.

The debate about the public visibility of your data has grabbed the media’s attention. What’s had less coverage is that there’s no easy way for you to use and share the data you’ve chosen to store with Facebook elsewhere on the internet.

Google have recently changed the terms of service on their Contacts API in a seemingly simple way, but hopefully with bigger consequences. GigaOm summarised it in their article about this by saying “Third-party apps and services can’t pull data from Google without allowing Google to do the same with their data”.

We shall have to see whether linking with Google Contacts in Gmail etc. to find new friends to connect to on Facebook is enough to force Facebook to apply a can-opener to their own APIs.

Thanks to Adam Bird for sending me a link to the GigaOm story.

Shutting down log4net repositories

I’ve been learning and evaluating the Gibraltar ‘Runtime intelligence’ & logging application recently. If you’re already using log4net, there’s a very low-impact route to adopting its many benefits: their simple Gibraltar Appender.

In knocking up a quick sample application to test, I had set up and configured my log4net logger:

using System.Reflection;
using log4net;
using log4net.Config;

public class Program
{
  private static readonly ILog Log =
    LogManager.GetLogger(MethodBase.GetCurrentMethod().DeclaringType);

  static int Main(string[] args)
  {
    if (!LogManager.GetRepository().Configured)
    {
      XmlConfigurator.Configure();
    }

    Log.Info("Some simple log lines to test");
    return 0;
  }
}

The application went on to output some simple log lines to test using the Gibraltar appender. All was fine, except that this simple console application appeared to hang on exit. I quit the running application and Gibraltar dutifully announced that my session had Crashed.

A bit of head-scratching later, I realised that I needed to be a better log4net citizen in my sample application. I had omitted the line:

Log.Logger.Repository.Shutdown();

Now my code properly stopped its logging activities before the program exited and Gibraltar was able to report successfully. The log4net RollingFileAppender or ConsoleAppender don’t complain like this on program exit if the repository isn’t shut down first, but it does make sense to tidy up after yourself rather than relying on the garbage collector.
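For completeness, the tail end of Main in the fixed sample ended up looking something like this (a sketch; the try/finally placement is just my own tidy-up habit, not something Gibraltar requires):

    try
    {
      Log.Info("Doing the real work...");
      return 0;
    }
    finally
    {
      // Flush and close every configured appender (including Gibraltar's)
      // before the process exits.
      Log.Logger.Repository.Shutdown();
    }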

I’ll write more about my positive experiences of Gibraltar in another post, but just wanted to share this in case any early adopters faced the same ‘facepalm’.

TDD, Visual Studio 2010 & BadImageFormatException

I’ve just had a very strange run-in with Visual Studio 2010 throwing a BadImageFormatException. To set the scene, I’ve got Visual Studio 2010 running on a Windows XP 64-bit machine.

My solution has a Console Application and a few Class Libraries; one of which is for Integration Tests using StoryQ and another for my unit test classes.

Running my Console application on its own worked fine, but as soon as I either ran the Unit Tests or got ReSharper to run my Integration Tests, I was hitting a BadImageFormatException: it could not load my console application (or one of its dependencies).

A bit of blog searching turned up a forum thread which explains that a new C# console application targets x86 by default in Visual Studio 2010. The combination of my console application being loaded as x86 and my test libraries running in the 64-bit environment was causing a battle of wills.

The solution is to go into your console application’s Properties and switch to the Build tab. Then make sure that, for both the Debug and Release configurations, your Platform Target is set to “Any CPU” rather than “x86”.
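If you prefer editing the project file directly, that Build tab setting maps to the PlatformTarget MSBuild property in the console application’s .csproj. In the PropertyGroup for each configuration, change:

  <PlatformTarget>x86</PlatformTarget>

to:

  <PlatformTarget>AnyCPU</PlatformTarget>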

This then allows your TDD to continue unchallenged by assumed defaults and the use of a 64-bit OS.

SUDS – Short URL Discovery Service

URL Shortening Services have been springing up all over the place since Twitter’s continued rise in popularity. The ability to use fewer characters to link to any webpage isn’t limited to short messaging platforms like SMS and Twitter; I’ve seen plenty of technical presentations include them so that people taking notes don’t have to write out long links.

Recently brands have been creating their own short URLs rather than letting others obfuscate their identities. http://bbc.in and http://youtu.be have started to appear, and third-party Twitter clients like @TweetDeck have started shortening URLs in these custom formats.

It got me thinking that there should be a Short URL Discovery Service (SUDS) defined, so that domains could be queried for a shorter version of a URL.
A simple query for a resource on that domain could reveal a shorter form; if there’s no response, the third party could fall back to the user’s preferred generic URL-shortening service.
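As a sketch of the idea (everything here is hypothetical; the path, parameter and response format are made up purely for illustration):

GET http://www.bbc.co.uk/suds?url=http%3A%2F%2Fwww.bbc.co.uk%2Fnews%2F10112345

  200 OK → body contains the brand’s short form, e.g. http://bbc.in/aBc12
  404 or no response → fall back to the user’s preferred generic shortener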

I quite like these custom, brand-driven short URLs as it’s clearer where the link will take me.

I’ve not checked to see if there’s something like this already as I’m currently on the move. But if you think it’s an idea worth pursuing, get in touch and we’ll brainstorm.

Importing AVCHD movies to iMovie

At first, it appears that the AVCHD movie files that certain cameras use are difficult to import into iMovie without some sort of conversion software.

Ignore all that, you can do it all with Disk Utility and your Mac without spending a dime/penny/whatever your local unit of currency is.

  1. Start Applications->Utilities->Disk Utility up.
  2. Click ‘New Image’
  3. Give it a file name in the ‘Save As’ box and select where you want it saved.
  4. Select the ‘Size’. I used the 4.6GB (DVD-R / DVD-RAM) size as I knew how big the files I’d copied from the SD card were.
  5. Click ‘Create’
  6. Go into Finder; if the image (shown as ‘Disk Image’) hasn’t already been opened in your Devices section, find the image file and open it.
  7. Drag the files from your camcorder’s memory card to the image file, folders included. I copied the AVCHD folder from within the PRIVATE folder on the SD card that came out of the Canon camcorder. It contained two subdirectories, BDMV and CANON, with all the bits and bobs in them.
  8. When it’s finished copying, eject the disk image.
  9. Start iMovie
  10. Open the image file again and iMovie should detect it as a camera and offer to import it all for you!

Simple, hey? 🙂
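Incidentally, if you’d rather script steps 1–5, I believe hdiutil in Terminal can create the same image (the size and paths here are just examples):

hdiutil create -size 4700m -fs HFS+ -volname "AVCHD" ~/Desktop/avchd.dmg

Opening the resulting .dmg mounts it just the same, and you can carry on from step 6.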

Invoking remote executables with Powershell

The PowerShell Invoke-Command cmdlet is very powerful for automating the deployment of software built in a Continuous Integration process. However, I just ran into a bit of new syntax when trying to invoke an executable with arguments on the remote box.
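To give a flavour of the kind of call involved (the computer name, path and arguments below are hypothetical):

Invoke-Command -ComputerName buildserver01 -ScriptBlock {
  param($exePath, $switches)
  & $exePath $switches
} -ArgumentList 'C:\Deploy\MyService.exe', '/install'

The call operator (&) and feeding values in via -ArgumentList are the pieces of syntax at play.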

Continue reading Invoking remote executables with Powershell

How to be one step ahead of UK Roadworks

There is an assumption that UK government websites aren’t up to much. Elgin is a hidden gem of a website which lists roadworks from the authorities that have joined its service. It’s either written or managed by Jacobs.

What’s fantastic about this website is that it gives you the ability to hand-draw an area on a map and be notified via email or RSS feed of any roadworks currently underway and, more importantly, any planned for the near future!

The Traffic Management Act 2004 meant that local councils have to publish their street-works registers. The happy result is that you can map out an area which represents your community or commute and get notified before stuff happens.

When you’re on the Elgin Search page, you can expand the ‘Area’ section and select ‘Filter by drawing an area’. Then you can draw a polygon to represent the area you’re interested in.

When you click ‘Search’ you’ll see the results for that area. At the top of the results there’s the option to ‘Set up an e-mail alert or RSS feed for this search’. [Doesn’t appear to work in Google Chrome]

Brilliantly simple and allows you to stay one step ahead of your commute!

XAuth

Meebo, the web-based IM service, have proposed a new standard called XAuth. It uses a new feature of HTML5, similar to cookies: an authentication token is stored in ‘LocalStorage’ within the browser, so it will only work with modern browsers (IE8+, Safari 4+, Chrome 3+, FF3+).
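Conceptually, the storage side boils down to something like this (a big simplification; XAuth’s actual API wraps this up in a script served from a central domain, and the key name and token here are made up):

// On a page (or hidden iframe) served from the central XAuth domain:
localStorage.setItem('xauth.meebo.com', 'opaque-session-token');

// Later, another site embedding that same central page can read it back:
var token = localStorage.getItem('xauth.meebo.com');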

You can see a demo of Meebo’s proposal at http://www.meebo.com/xauth and there’s also a YouTube video of Seth Sternberg explaining XAuth.

Google, Microsoft, MySpace and Yahoo have signed up and implemented a first pass at this. As the @Mashable article about XAuth says, Twitter and Facebook don’t appear to be supporting it at all: “That means, rather than uniting these sharing services, the runner-up services are all banding together. That doesn’t simplify things for users or publishers” – Pete Cashmore.

And Pete goes on to suggest that Facebook and Twitter are big enough players not to dance to anyone else’s tune. So will this use of the new ‘cookie’-style storage in HTML5 be tempting enough to create harmony across the big players? I doubt it, but XAuth looks like an interesting idea.