Creating Shortened URLs from Windows Phone 7 in C#

You’re no doubt familiar with URL shorteners on sites such as Twitter, where they are practically mandatory. For example, the URL of my previous post, How to Create Screenshots for Windows Phone 7 Marketplace without a Phone, weighs in at a hefty 109 characters. And that’s without any tracking codes that may be mandatory for marketing or affiliate programs. But with the use of a URL shortener (bitly in this case), we’re down to a mere 20 characters.

URL shorteners have other advantages as well. If you’re working with affiliate programs, typically the only way you get paid is by placing your affiliate code in the URL itself. Affiliate directories will also typically add some tracking variables to the URL, and you’re left with a long, ugly URL that’s downright unseemly to include in an email, share on a Facebook wall, or post anywhere else you cannot control the link text being presented. There is also the chance people will use the URL but omit your affiliate code, for whatever reason. URL shorteners can help prevent this.

In addition, most provide excellent tracking and analytics – often in real time and for free. You can view the number of clicks, referrers, country of origin, and even get a QR code if you wish.

Any decent URL shortener will come with an API, and many of these services have C# libraries for interfacing with them directly. I chose bitly for this example, but most services have a very similar API.

Signing up for an account is straightforward, and once you do so you’ll automatically have an API key on your “Settings” page. There is already a CodePlex project for a bitly library, but unfortunately it does not work on Windows Phone 7. Most other examples found online use synchronous calls, which are not permitted in WP7 programming (for good reason), so we’ll start from scratch.

To make any API call, you’ll need to supply your username and your API key. Bitly offers a standard REST-based API, and the full documentation can be found by following the “API” link at the bottom of their page. The method we’ll use to shorten a URL is documented as follows:


For a long URL, /v3/shorten encodes a URL and returns a short one.


  • format (optional) indicates the requested response format. Supported formats: json (default), xml, txt.
  • longUrl is the long URL to be shortened.
  • domain (optional) refers to a preferred short domain, for users who do NOT have a custom short domain set up with bitly. This affects the output value of url. The default for this parameter is the short domain selected by each user in his/her bitly account settings. Passing a specific domain via this parameter will override the default settings for users who do NOT have a custom short domain set up with bitly. For users who have implemented a custom short domain, bitly will always return short links according to the user’s account-level preference.
Two important points. First, the URL must be URL encoded – no spaces, question marks, or any other odd characters. Second, the format parameter can specify text, XML, or JSON. Text is the simplest to work with: only the shortened URL is returned. If you’re only shortening one link at a time, this is the obvious choice. However, if you’ll be sending multiple requests to bitly at one time, or you can’t guarantee the return order of your requests, you’ll want to use XML or JSON. Both of these return the shortened URL along with the original, so you can match them up if necessary.
For this example we’ll just use text, since my app never submits multiple requests per page. To shorten a URL with bitly, all you need to do is open a web request to the URL specified by the API:
string url = string.Format(
    @"http://api.bitly.com/v3/shorten?login={0}&apiKey={1}&longUrl={2}&format=txt",
    BITLY_LOGIN, BITLY_API_KEY, HttpUtility.UrlEncode(longUrl));

WebClient wc = new WebClient();
wc.OpenReadCompleted += new OpenReadCompletedEventHandler(wc_OpenReadCompleted);
wc.OpenReadAsync(new Uri(url));

Since we specified the text format, the result will contain the URL only:

void wc_OpenReadCompleted(object sender, OpenReadCompletedEventArgs e)
{
    var stream = e.Result;
    var reader = new StreamReader(stream);
    var shortenedUrl = reader.ReadToEnd();
}
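If you opt for the json format instead, bitly wraps the short URL in a small envelope along with the original long_url, so you can match responses to requests. A minimal sketch of deserializing it with DataContractJsonSerializer (add a reference to System.Runtime.Serialization; the class and property names below are my own, only the JSON keys come from the API):

```csharp
using System;
using System.Runtime.Serialization;
using System.Runtime.Serialization.Json;

[DataContract]
public class BitlyData
{
    [DataMember(Name = "url")]
    public string Url { get; set; }        // the shortened link

    [DataMember(Name = "long_url")]
    public string LongUrl { get; set; }    // the original URL, for matching
}

[DataContract]
public class BitlyResponse
{
    [DataMember(Name = "status_code")]
    public int StatusCode { get; set; }

    [DataMember(Name = "data")]
    public BitlyData Data { get; set; }
}

// In wc_OpenReadCompleted, instead of ReadToEnd():
// var ser = new DataContractJsonSerializer(typeof(BitlyResponse));
// var response = (BitlyResponse)ser.ReadObject(e.Result);
// response.Data.Url is the short link; response.Data.LongUrl ties it
// back to the request you sent.
```
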

I’ve wrapped the above methods in a class that you can find on GitHub. To use it, simply call the Shorten method with a callback, like so:

new WP7NetHelpers.BitlyShorten().Shorten(
    HttpUtility.UrlDecode(ProductFeed.Instance.URL),
    this.ShortenCallback);

private void ShortenCallback(string url)
{
    // do something with url
}


How to Create Screenshots for Windows Phone 7 Marketplace without a Phone

One of the best and worst features of developing for Windows Phone 7, as opposed to the iPhone, is that it’s possible to develop and publish a Windows Phone 7 app without ever laying hands on an actual device that can run it. I’m sure this will lead to worse apps in the marketplace, because you should *always* test your apps on an actual device before publishing. But it’s nice that you don’t have to.

While the Windows Phone 7 emulator is great in many ways, I was annoyed by one thing: the ugly performance status numbers that show up in debugging mode:


Yes, I know these are terribly useful when trying to work out performance bugs and screen refresh issues (still a problem…), but they do make getting screenshots for the Windows Phone 7 marketplace more difficult. Photoshop them out? Take a screenshot on the phone itself and email it to yourself?

The solution is so simple…

  1. When you’re debugging your app, just click Stop Debugging (Shift+F5 in VS Express for Phone). The emulator is still running, with your app installed.
  2. Click the Start button in the emulator.
  3. Click the right arrow for the list of apps.
  4. Find your app and run it – now without the annoying performance counters.
  5. Use the built-in Snipping Tool (Start > All Programs > Accessories > Snipping Tool) or the graphics program of your choice to grab the screenshot.


That’s it! A little cropping or touch-up and you’re ready for the marketplace.





Optimize AWS SimpleDB Deletes with BatchDeleteAttributes

I’ve found that the pricing model for SimpleDB can be somewhat complex. EC2 is easy: the longer you leave your machine up, the more it costs. With SimpleDB, however, there is no single machine for your databases. Each request to SimpleDB takes up a certain number of CPU cycles, and at the end of the month those cycles are added up, translated into a number of hours used, and then translated into a bill.

Amazon SimpleDB measures the machine utilization of each request and charges based on the amount of machine capacity used to complete the particular request (SELECT, GET, PUT, etc.), normalized to the hourly capacity of a circa 2007 1.7 GHz Xeon processor.

It’s easy to check up on the machine utilization for your SimpleDB account. Log in to AWS, go to Account, then Account Activity, and you can download an XML or CSV file of the current month’s usage. This report lists all requests, along with the usage for each one:

	<StartTime>07/14/11 18:00:00</StartTime>
	<EndTime>07/14/11 19:00:00</EndTime>

On a recent project we saw a spike in SimpleDB costs after about a month of usage. The app was using SimpleDB to store some logging and transaction information, and after a month it was deemed safe to delete this data. However, each of these records was being deleted with a single request – which adds up if you’re deleting hundreds at a time. BatchDeleteAttributes lets you delete up to 25 items per request – not perfect, but at least better than one at a time. The AWS C# library supports this request:

var client = AWSClientFactory.CreateAmazonSimpleDBClient(ID, KEY);
var deleteRequest = new BatchDeleteAttributesRequest() { DomainName = DOMAIN };
deleteRequest.Item = new List<DeleteableItem>();
foreach (var r in recordIDs)
    deleteRequest.Item.Add(new DeleteableItem() { ItemName = r });
client.BatchDeleteAttributes(deleteRequest);
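Since each BatchDeleteAttributes call tops out at 25 items, a longer list needs to be chunked. A quick sketch, assuming the same client, DOMAIN, and recordIDs as above (the batching loop itself is mine, not from the AWS docs; requires System.Linq):

```csharp
// SimpleDB rejects batch requests with more than 25 items,
// so walk the list in 25-item slices.
const int BatchSize = 25;

for (int i = 0; i < recordIDs.Count; i += BatchSize)
{
    var request = new BatchDeleteAttributesRequest() { DomainName = DOMAIN };
    request.Item = new List<DeleteableItem>();
    foreach (var r in recordIDs.Skip(i).Take(BatchSize))
        request.Item.Add(new DeleteableItem() { ItemName = r });
    client.BatchDeleteAttributes(request);
}
```
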

SimpleDB also has a BatchPutAttributes request, letting you group your INSERTs the same way.
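A hedged sketch of the equivalent put, assuming the same v1 SDK classes (ReplaceableItem and ReplaceableAttribute are the put-side counterparts of DeleteableItem; the records list and attribute names here are hypothetical):

```csharp
// Group INSERTs/updates the same way as the batch delete above.
var putRequest = new BatchPutAttributesRequest() { DomainName = DOMAIN };
putRequest.Item = new List<ReplaceableItem>();
foreach (var record in records)   // 'records' is a hypothetical source list
{
    var item = new ReplaceableItem() { ItemName = record.ID };
    item.Attribute.Add(new ReplaceableAttribute()
    {
        Name = "Message",
        Value = record.Message,
        Replace = true   // overwrite any existing value for this attribute
    });
    putRequest.Item.Add(item);
}
client.BatchPutAttributes(putRequest);
```

The same 25-item-per-request limit applies here as well.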

Free Programs for New Businesses to Get into Cloud Computing

The fierce competition in the cloud marketplace today has resulted in some great deals for small business. Both Amazon and Microsoft currently have programs that offer a free tier of all their major cloud offerings for new accounts or new businesses. Amazon’s is very simple: a free tier of service is offered for the first 12 months after you sign up. There is no application process and it’s open to individuals. The restrictions are as follows:

AWS Free Usage Tier (Per Month):

  • 750 hours of Amazon EC2 Linux Micro Instance usage (613 MB of memory and 32-bit and 64-bit platform support) – enough hours to run continuously each month
  • 750 hours of an Elastic Load Balancer plus 15 GB data processing
  • 10 GB of Amazon Elastic Block Storage, plus 1 million I/Os, 1 GB of snapshot storage, 10,000 snapshot Get Requests and 1,000 snapshot Put Requests
  • 5 GB of Amazon S3 storage, 20,000 Get Requests, and 2,000 Put Requests
  • 30 GB of internet data transfer (15 GB of data transfer “in” and 15 GB of data transfer “out” across all services except Amazon CloudFront)
  • 25 Amazon SimpleDB Machine Hours and 1 GB of Storage
  • 100,000 Requests of Amazon Simple Queue Service
  • 100,000 Requests, 100,000 HTTP notifications and 1,000 email notifications for Amazon Simple Notification Service
This should be more than enough for a basic website with typical database needs.
Google’s App Engine continues to have a free usage tier. As of this writing it is 500 MB of storage and up to 5 million page views a month, but Google is making changes with the introduction of “Apps for Business,” so it’s best to check directly for updates.
Microsoft has several programs for trying out its service, but if you’re a small starting business you MUST try to join BizSpark. In addition to the networking and visibility benefits, you get a full MSDN subscription and the following impressive package of Windows Azure services:
  • Windows Azure Small compute instance 750 hours / month
  • Windows Azure Storage 10 GB
  • Windows Azure Transactions 1,000,000 / month
  • AppFabric Service Bus Connections 5 / month
  • AppFabric Access Control Transactions 1,000,000 / month
  • SQL Azure Web Edition databases (1GB) 3
  • SQL Azure Data Transfers 7 GB in / month, 14 GB out / month

A Quick Tour of Amazon’s Mobile App Developer Program

OK, my mobile app isn’t quite ready yet, but this post from the people at AWS caught my attention. One of the main difficulties in developing Android applications is that there’s not one app store (not even one draconian one), but several different app stores available. Amazon hopes to fill that void by developing its own app store for any Android device, and while only time will tell if it is successful, given Amazon’s track record of quality and market reach any mobile developer needs a foothold here. If you sign up now, it’s free for the first year:

If you are using the SDK to build an Android application, I would like to encourage you to join our new Appstore Developer Program and to submit your application for review. Once your application has been approved and listed, you’ll be able to sell it on before too long (according to the Appstore FAQ, we expect to launch later this year). If you join the program now we’ll waive the $99 annual fee for your first year in the program.

You can list both free and paid applications, and you’ll be paid 70% of the sale price or 20% of the list price, whichever is greater. You will be paid each month as long as you have a balance due of at least $10 for US developers and $100 for international developers. The Amazon Developer Portal will provide you with a number of sales and earnings reports.

The store will provide rich merchandising capabilities. Each product page will be able to display multiple images and videos along with a detailed product description.

Joining the program is simple. If you already have an Amazon customer or affiliate account (and who doesn’t?), you can simply use that account:

After this, it’s about 4-5 confirmations until you’re signed up. Is this your name? Agree to the terms of service? Agree to pay the $99 after your first year? If you charge for apps, what’s your bank account info?

By the way… a $10 minimum payout is very cool…

After that you’re in!

Of course, the rest of the site is incomplete. They do have samples of the app submission page, reports, and account pages that are interesting. It looks like you’ll have considerable control over your application’s launch cycle, including pre-orders and limited release windows. The reports look basic but adequate for most developers. I do hope they open up an API that lets you get more information on the who/what/where of downloads… but it’s a welcome and much needed addition to the Android marketplace.

ActiveReports and POCO data sources

I was recently working on a WinForms project that pulled data from an RSS feed. Since all of the data was loaded from online and it was a relatively simple addition, there was no need for a local database. The only problem came when I was tasked with using ActiveReports to create a report of this data: ActiveReports expects things like DataTables and DataSets, and you can’t have a POCO (Plain Old C# Object) act as a data source directly. Most online tutorials suggest building a fake DataTable from your objects and looping through your collection manually to add fields and rows, which is tedious if you’re working with a large collection of possible fields. Fortunately ActiveReports supports XML data sources, so if your data is serializable you can just plop it in as a data source, like so:

var ds = new DataDynamics.ActiveReports.DataSources.XMLDataSource();
ds.FileURL = null;
ds.RecordsetPattern = "//Entry"; // or whatever your data is serialized as

var sw = new StringWriter();
var ser = new XmlSerializer(obj.GetType()); // obj is your POCO (or collection)
ser.Serialize(sw, obj);
ds.LoadXML(sw.ToString()); // hand the serialized XML to the data source

this.DataSource = ds;

Once that’s done you can go about binding your report fields to your POCO data members.
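With the XML source in place, each report field binds by element name relative to the RecordsetPattern. A tiny sketch of that binding step (the control and element names here are my own, for illustration):

```csharp
// In the report designer or in ReportStart: point each control at the
// matching child element of <Entry> via its DataField property.
this.txtTitle.DataField = "Title";         // binds to <Title> inside each <Entry>
this.txtPublished.DataField = "Published"; // binds to <Published>
```
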

Disabling the Evil of iTunes Services on your Windows PC

I love my iPhone, but I hate iTunes. Without a doubt it is the most wretched piece of software on my machine. The interface is counterintuitive and difficult to navigate, and seems to get worse with each release. Bad UI I can live with; what really annoys me is its virus-like behavior: installing unnecessary software and services, constant reminders to update or add more services, and not one, not two, but three services running on your machine at all times, one of which is constantly sniffing around on your network… Clearly something must be done.

The first thing to fix after any iTunes installation is to remove two startup items. Without your knowledge, iTunes has set two programs to run every time Windows boots, supposedly to check for updates. Disable them by opening the Run dialog, typing “msconfig”, going to the Startup tab, and disabling both the QuickTime and iTunes entries.

Next, let’s disable those *three* services that iTunes has installed. The worst is the Bonjour Service, which sniffs your network for shared music or other Apple devices. If you have an AppleTV or other Apple devices in your house it might be necessary, but for most people it is completely worthless. Go to Control Panel, Administrative Tools, Services. Stop the Bonjour service and set it to Disabled. The next time you start iTunes it will complain that Bonjour isn’t running; ignore it.

While you’re in Services, set both “iPod Service” and “Apple Mobile Device” to Manual. As best I can tell, these are required to sync with iTunes, but there’s no reason for them to run when iTunes is not running, so we’ll correct that in a moment. At the very least, disabling these services will help improve boot times, and I believe it improves stability as well – particularly when other USB devices are connected.

Finally, we want these services to run while iTunes is open for syncing, but at no other time. That’s easy to fix with a simple batch file containing the following five lines:

net start "Apple Mobile Device"
net start "iPod Service"

"C:\Program Files\iTunes\itunes.exe"

net stop "iPod Service"
net stop "Apple Mobile Device"

Use a shortcut to this batch file instead of a shortcut to iTunes, and it will start the dependent services, open iTunes, and then stop the services when iTunes closes (like iTunes *should* do).

Free Automated Backups For Your Windows PC

Windows Backup does an adequate job, but unfortunately in Windows 7 you need the Professional or Ultimate edition to back up to a network share. If you want to save your backups to a network device, or even just sync files between two computers on your home network, Windows 7 Home won’t do – and I’m not about to pay for the upgrade to Pro just to sync files between computers. Fortunately there is a free solution: SyncToy.

SyncToy is a free “PowerToy” offered by Microsoft that lets you set up folder pairs to be synchronized. Out of the box it does not include any built-in scheduling, but it does offer a command line interface making it easy to roll your own scheduling.

First, download SyncToy and set up the folder pairs you want to synchronize. Each folder pair has a name; remember it for later. You can synchronize folders between machines across the network, or to an external USB drive. Once your folder pairs are set up, create a simple batch file (yes, remember batch files!) and call the SyncToy command line interface, like so:

"C:\Program Files\SyncToy 2.1\SyncToyCmd.exe" -R FOLDER_PAIR_NAME

If your folder pair name has spaces in it you may find the command line interface temperamental – just rename it to something without spaces.

Once your batch file is created simply set up a schedule in the task scheduler.

If you are synchronizing to a network share, you may want to check that the share exists before running SyncToy. This isn’t strictly necessary, but it prevents SyncToy from searching for the share and wasting resources when it shouldn’t run:

"C:\Program Files\SyncToy 2.1\SyncToyCmd.exe" -R FOLDER_PAIR_NAME

Adding Dynamic Code to C# with IronRuby

Getting started with IronRuby is simple – just two quick steps:

1. Download and install IronRuby. The latest version (as of this writing, 1.1.1) is compatible with .NET 4.0, but if you’re stuck on 2.0 or 3.5 then IronRuby version 1.0 will work fine in earlier versions of Visual Studio.

2. Once installed, create a new console application in Visual Studio, and add references to IronRuby, IronRuby.Libraries, Microsoft.Scripting.Core, and Microsoft.Scripting. The Microsoft.Scripting libraries are included with .NET 4.0, I believe, but if you install the IronRuby 1.0 release they can be found in the C:\Program Files\IronRuby 1.0\bin folder. In your console app, add the following lines:

var engine = IronRuby.Ruby.CreateEngine();
var result = engine.Execute("puts 'Hello, world!'", engine.CreateScope());

That’s it! You now have much of the power and flexibility of the Ruby language available in any .NET application. But, you ask, if you’re not a Ruby aficionado, why would you want to do such a thing?

Assume you are developing a typical line-of-business application, one function of which is to determine a discount for some customers at checkout. Business rules dictate that customers who join the “Valued Customer” program get a 10% discount at checkout. The prior coder on the project simply hard-coded the rate into the program like so:

if (customer.ValuedCustomer)
    price -= (price * 0.1f);

Any change to the discount required a rebuild and redeploy of the web site… not a good solution. You propose storing the discount rate in a configuration value or database field that is easier to change, and initially everyone is very impressed with this solution – until marketing decides they want to vary the discount based on the customer’s state. That requires another rebuild and redeploy…
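That intermediate, configuration-driven version might look something like this (the appSettings key name here is my own invention):

```csharp
// Read the discount rate from app/web.config instead of hard-coding it.
// Requires a reference to System.Configuration.
using System.Configuration;
using System.Globalization;

float rate = float.Parse(
    ConfigurationManager.AppSettings["ValuedCustomerDiscount"],
    CultureInfo.InvariantCulture);

if (customer.ValuedCustomer)
    price -= price * rate;
```

This fixes the simple case, but as soon as the *logic* changes (not just the rate), you’re back to redeploying.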

What if instead of storing simply the discount rate in a configuration setting, you could store the entire discount function? With dynamic scripting and IronRuby this is easy. Use the scripting library’s SetVariable method to pass the customer record to a custom Ruby function that calculates the discount. Whenever marketing decides to change the business logic for the discount, you change the Ruby function (stored in the database or config), and the app’s logic is updated without a recompile and redeploy. For example:

var customerRecord = new Customer () { ID = 10, FirstName = "John", LastName = "Smith", State = "CA", ValuedCustomer = true };
engine.Runtime.Globals.SetVariable("CUSTOMER", customerRecord);

var code =
@"def get_discount(c)
    if c.ValuedCustomer && c.State == 'CA'
      0.1
    else
      0
    end
  end
  get_discount(CUSTOMER)";
var discount = engine.Execute(code);

Quick Review of .NET Performance & Memory Tools

Evaluating the performance of any ASP.NET application is a complex beast indeed. All the standard pitfalls of web application performance apply: server configuration, caching, network issues, client-side script performance, image management… This is not meant to be a comprehensive review of all these factors. Today we’re simply looking at tools to identify one area of performance problems and answer one question – what areas of the ASP.NET code are adversely impacting performance. Specifically, I want to run a tool on my development box and identify problem areas of code before deployment. So, what are the options?

Redgate ANTS Performance Profiler
The profiler easily attaches to an existing ASP.NET application, and no changes to your source code or application are necessary. The profiler logs all method calls, and will summarize based upon the % of time spent in each method, number of times each method was called, and the total time spent for each method call.

Clicking on any method name in the performance window shows you the source code for that method, and you can open up a call tree showing all of the methods that called this method (ancestors), and all methods called by this method (descendants). This ancestor/descendant feature is extremely helpful, since you can quickly identify which methods are being called unnecessarily.

Redgate also comes with a memory profiler that shows all objects in memory, along with the memory size, gain or loss in number of objects, and a number of filters to help identify common problems.

The Redgate toolset also offers a SQL server profiler that shows all database calls and time spent in each call. However, this is only available for Windows Vista and higher, so I was not able to evaluate this firsthand.

Price: the standard version is $395, and adding the memory profiler brings the price to $795. This is not cheap, but you certainly get what you pay for: it’s a very polished and comprehensive tool. A 14-day trial is available.

JetBrains Profiler
JetBrains offers a performance and memory tracking toolset with many of the same features as RedGate’s. The profiler tracks all methods called in an ASP.NET application, sorting by total time spent, number of calls, or time spent per method.

The profiler does show methods called by each method (descendants), but does not show methods calling each method (ancestors). To find ancestors you would need to do a manual search in Visual Studio.

The suite does include a memory profiler as well showing the top memory objects in use, but it does not have the filters to identify common leak problems.

The JetBrains product is $399 for a personal license, and a trial is available. It also integrates with ReSharper and the other JetBrains tools, so if your shop already uses ReSharper that’s definitely a plus.

EQATEC Profiler
The EQATEC profiler is a very cost-effective option for basic performance tracking. It shows the time spent in each method, along with both the methods calling it (ancestors) and the methods it calls (descendants).

The free version allows you to track up to 10 DLLs at once, and the $99 version allows 25. However, it does require you to rebuild your app so that instrumented DLLs are used to track performance. No changes to source code are necessary, but it is an extra step before you can generate reports – and since it’s harder to do, it would probably be used less.

PAL (Performance Analysis of Logs)

PAL is an open source tool that generates very detailed reports from Performance Monitor (PerfMon) logs. To generate reports you set up a custom PerfMon tracker on a server, and let PerfMon generate log files as your web application runs. PAL takes these log files and generates detailed reports highlighting areas of concern (CPU usage, SQL calls per minute, etc).

No changes to your application are necessary, and it can be run on a production server. However, it only presents an overall view of the health of the server and does not identify problem areas in the code. A great tool for evaluating the health of your web servers, but not for helping clean up the code before launch.

SlimTune
SlimTune is an open source tool that offers excellent performance tracking for desktop applications, but the beta version currently available fails when trying to attach to the ASP.NET process. The developer has not announced when the next update will be available.

NProf
NProf is another open source profiling tool, but there has not been any serious development on the project for a couple of years. NProf can display the percentage of time spent in methods, but it is exceptionally difficult to get working with ASP.NET applications. The data reported by NProf is very basic, listing only namespaces and function names with no source code integration or related-call information.

Microsoft CLR Profiler
The free CLR Profiler from Microsoft can be used to track memory usage, but like NProf it requires changing the ASP.NET configuration, and the reports are arcane and difficult to interpret. A nice free option for hunting memory leaks if you have the time but not the budget for a professional tool.

SciTech Memory Profiler
This offers many of the same features as the RedGate studio, with the addition of showing the stack trace for each object. The standard edition is $159, and a 14 day trial is available.