
Adding Dynamic Code to C# with IronRuby

Getting started with IronRuby is simple – just two quick steps:

1. Download and install IronRuby from http://ironruby.codeplex.com/. The latest version (1.1.1 as of this writing) is compatible with .NET 4.0, but if you’re stuck on 2.0 or 3.5, IronRuby version 1.0 will work fine with earlier versions of Visual Studio.

2. Once installed, create a new console application in Visual Studio, and add references to IronRuby, IronRuby.Libraries, Microsoft.Scripting.Core, and Microsoft.Scripting. I believe the Microsoft.Scripting libraries are included with .NET 4.0, but if you install the IronRuby 1.0 release you’ll find them in the C:\Program Files\IronRuby 1.0\bin folder. In your console app, add the following three lines:

var engine = IronRuby.Ruby.CreateEngine();
var result = engine.Execute("puts 'Hello, world!'", engine.CreateScope());
Console.WriteLine(result);

That’s it! You now have much of the power and flexibility of the Ruby language available to you in any .NET application. But, you ask, if you’re not a Ruby aficionado why would you want to do such a thing?

Assume you are developing a typical line-of-business application, one of the functions of which is to determine a discount for some customers at checkout. Business rules dictate that customers who join the “Valued Customer” program get offered a 10% discount at checkout. The previous developer on the project simply hard-coded the rate into the program like so:

if (customer.ValuedCustomer)
    price -= (price * 0.1f);

Any change to the discount required a rebuild and redeploy of the web site… not a good solution. You propose storing the discount rate in a configuration value or database field that is easier to change, and initially everyone is very impressed with this solution. Until marketing decides they want to vary the discount based on the customer’s state. That requires a rebuild and redeploy…

What if instead of storing simply the discount rate in a configuration setting, you could store the entire discount function? With dynamic scripting and IronRuby this is easy. Use the scripting runtime’s SetVariable method to pass the customer record to a custom Ruby function that calculates the discount. Whenever marketing decides to change the business logic for the discount, you can change the Ruby function (stored in the database or config), and the app’s logic is updated without a recompile and redeploy. For example:

var customerRecord = new Customer () { ID = 10, FirstName = "John", LastName = "Smith", State = "CA", ValuedCustomer = true };
engine.Runtime.Globals.SetVariable("CUSTOMER", customerRecord);

var code =
@"def getDiscount(c)
    if c.ValuedCustomer && c.State == 'CA'
      return 0.1
    else
      return 0
    end
  end
  getDiscount(CUSTOMER)";
var discount = engine.Execute(code);
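Execute returns the result of the last Ruby expression evaluated, so the discount can be fed straight back into the checkout logic. A minimal sketch, assuming the same hypothetical price variable from the hard-coded version above (Execute&lt;T&gt; is the typed overload from the DLR hosting API, which saves casting the boxed Ruby float by hand):

```csharp
// "price" is the hypothetical checkout variable from the earlier snippet.
float price = 100.0f;

// Typed Execute avoids a manual cast from object to double.
var rate = engine.Execute<double>(code);
price -= price * (float)rate;
```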

Quick Review of .NET Performance & Memory Tools

Evaluating the performance of any ASP.NET application is a complex beast indeed. All the standard pitfalls of web application performance apply: server configuration, caching, network issues, client-side script performance, image management… This is not meant to be a comprehensive review of all these factors. Today we’re simply looking at tools to identify one area of performance problems and answer one question: what areas of the ASP.NET code are adversely impacting performance? Specifically, I want to run a tool on my development box and identify problem areas of code before deployment. So, what are the options?

Redgate ANTS Performance Profiler
http://www.red-gate.com/products/ants_performance_profiler/
The profiler easily attaches to an existing ASP.NET application, and no changes to your source code or application are necessary. The profiler logs all method calls, and will summarize based upon the % of time spent in each method, number of times each method was called, and the total time spent for each method call.

Clicking on any method name in the performance window shows you the source code for that method, and you can open up a call tree showing all of the methods that called this method (ancestors), and all methods called by this method (descendants). This ancestor/descendant feature is extremely helpful, since you can quickly identify which methods are being called unnecessarily.

The Redgate suite also includes a memory profiler that shows all objects in memory, along with the memory size, the gain or loss in number of objects, and a number of filters to help identify common problems.

The Redgate toolset also offers a SQL server profiler that shows all database calls and time spent in each call. However, this is only available for Windows Vista and higher, so I was not able to evaluate this firsthand.

Price: the standard version is $395, but adding the memory profiler brings the price to $795. This is not cheap, but you certainly get what you pay for; it’s a very polished and comprehensive tool. A 14-day trial is available.

JetBrains Profiler
http://www.jetbrains.com/profiler/
JetBrains offers a performance and memory profiling toolset with many of the same features as Redgate. The profiler tracks all methods called in an ASP.NET application, sorting by total time spent, number of calls, or time spent per method.

The profiler does show methods called by each method (descendants), but does not show methods calling each method (ancestors). To find ancestors you would need to do a manual search in Visual Studio.

The suite does include a memory profiler as well showing the top memory objects in use, but it does not have the filters to identify common leak problems.

The JetBrains product is $399 for a personal license, and a trial is available. It also integrates with ReSharper and the other tools from JetBrains, so if your shop already uses ReSharper that’s definitely a plus.

EQATEC
http://www.eqatec.com/Profiler/
The EQATEC profiler is a very cost-effective option for basic performance tracking. It shows the time spent in each method, along with both the methods calling it (ancestors) and the methods it calls (descendants).

The free version allows you to track up to 10 DLLs at once, and the $99 version allows 25. However, it does require you to rebuild your app so that instrumented DLLs are used to track performance. No changes to source code are necessary, but it is an extra step before you can generate reports, and an extra step makes the tool that much less likely to be used.

PAL (Performance Analysis of Logs)
http://pal.codeplex.com/

PAL is an open source tool that generates very detailed reports from Performance Monitor (PerfMon) logs. To generate reports you set up a custom PerfMon tracker on a server, and let PerfMon generate log files as your web application runs. PAL takes these log files and generates detailed reports highlighting areas of concern (CPU usage, SQL calls per minute, etc).

No changes to your application are necessary, and it can be run on a production server. However, it only presents an overall view of the health of the server and does not identify problem areas of code. A great tool for evaluating the health of your web servers, but not for helping clean up the code before launch.

Slimtune
http://code.google.com/p/slimtune/
Slimtune is an open source tool that offers excellent performance tracking for desktop applications, but the beta version currently available fails when trying to attach to the ASP.NET process. The developer has not announced when the next update will be available.

Nprof
http://code.google.com/p/nprof/
Nprof is another open source profiling tool, but there has not been any serious development on the project for a couple of years. NProf can display the percent of time spent in methods, but it is exceptionally difficult to get it working with ASP.NET applications. The data reported by NProf is very basic, listing only namespaces and function names with no source code integration or related-function views.

Microsoft CLR Profiler
http://msdn.microsoft.com/en-us/library/ff650691.aspx
The free CLR profiler from Microsoft can be used to track memory usage, but like Nprof it requires changing the ASP.NET configuration, and the reports are arcane and difficult to interpret. A nice free option for hunting memory leaks if you have the time but not the budget for a professional tool.

SciTech Memory Profiler
http://memprofiler.com/
This offers many of the same features as the Redgate suite, with the addition of showing the allocation stack trace for each object. The standard edition is $159, and a 14-day trial is available.

Value cannot be null. Parameter name: serviceType

WCF itself isn’t inherently complicated, but learning WCF presents many stumbling blocks. The terminology can be somewhat convoluted, and WCF development and debugging often boils down to making sure many different parts of disparate XML files are edited correctly.

However, the true pain lies in the exceptionally unhelpful error messages. When developing a WCF service that is hosted in a Windows Service (a net/tcp WCF server, for example), you may be presented with this error:

Value cannot be null. Parameter name: serviceType

Of course, your code doesn’t have any parameters named “serviceType”. You can’t debug this error because your service can’t even start. “Value cannot be null. Parameter name: serviceType” is all you’ve got.

After many hours of frustration, the solution is rather simple. The error happens when the windows service tries to create the WCF server, via this call:

protected override ServiceHost CreateServiceHost(
    Type serviceType,
    Uri[] baseAddresses)

CreateServiceHost cannot load the Type information for your WCF service, either because the namespace is wrong or the file itself cannot be loaded. For example, assume your WCF service is defined by the interface IMyService and implemented by the class MyNamespace.MyServiceImplementation in the file MyServiceImplementation.dll. For one reason or another, MyNamespace.MyServiceImplementation cannot be loaded. Most often this is because of a typo in the namespace declaration: check both the .config file and the .cs file where the service is implemented. Another common problem is that the .dll was not copied to the correct directory. Verify that MyServiceImplementation.dll is in the same folder as the Windows service that is trying to load it, and check your project references or automated build scripts if this was the cause of the error.
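To make the check concrete, here is a minimal sketch of the relevant config section using the hypothetical names from the example above. The address and binding are illustrative; the key point is that the service name attribute must match the namespace-qualified class name in the .cs file exactly:

```xml
<system.serviceModel>
  <services>
    <!-- "name" must be the full namespace-qualified implementation class,
         exactly as declared in code; a typo here triggers the serviceType
         error, because the type cannot be resolved -->
    <service name="MyNamespace.MyServiceImplementation">
      <endpoint address="net.tcp://localhost:8000/MyService"
                binding="netTcpBinding"
                contract="MyNamespace.IMyService" />
    </service>
  </services>
</system.serviceModel>
```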

Using the Chrome Developer Tools for JavaScript Debugging

One of the thorniest problems with web development has always been cross-browser compatibility. As our web sites become more interactive we focus more on JavaScript and AJAX to deliver rich user experiences, which introduces new challenges for debugging and tracking down cross-browser differences.

Firebug is an excellent option when working with Firefox, and Visual Studio is perfect for Internet Explorer. Until recently there was no good option for Google Chrome, but earlier this year Chrome updated the developer tools, and they rock.

Assuming you’re on a Windows machine, hit Control+Shift+J to open the developer tools window. You can also get there from the page menu, or by right clicking on any element in the page to “inspect element.”

The toolbar at the top lets you inspect the DOM elements, debug scripts, and even inspect the cookies currently saved in the browser. It isn’t obvious from the UI, but there are keyboard shortcuts for each of the debugger commands:
F8 - Run
F10 - Step over
F11 - Step into

Note that if the browser window is focused and not the debugger window, F11 will have the window go into full screen mode.

You can set breakpoints by clicking on the line number in the script window, or by adding a “debugger;” statement to your JavaScript. Simply mouse over a variable to see its current value.

While I’ve found that Chrome is a phenomenal browser and I’m in love with its performance, it can be somewhat less forgiving of errors than other browsers. The Chrome debugger is an excellent tool for tracking down those errors.

Source Control: How to automatically “get latest” with subversion

First things first – if you aren’t currently using source control or are still using something like SourceSafe – stop! Read this article from Jeff Atwood, and come back.

Team System is a great option for larger companies with the budget for it, but for most shops subversion is the perfect solution. Set up your subversion server, and on the clients use TortoiseSVN to integrate subversion with the shell.

While merging can be relatively painless with Tortoise, people too often forget to get updates from the repository, which can lead to subtle bugs during development. Fortunately Tortoise comes with a command-line interface that makes it easy to automate this process. Simply create an old-fashioned batch file with Notepad in your StartUp folder, and add this line:

TortoiseProc.exe /command:update /path:"c:\path\to\your\code\" /notempfile /closeonend:1

Yes, the /notempfile switch is required. The /closeonend:1 will close the dialog automatically if there are no errors. For other options, check the documentation here.

A brief introduction to video in HTML5

With the release of the iPad there has been a renewed interest in HTML5, since it enables a web site to display video without using Flash. On the one hand, displaying video in HTML5 couldn’t be simpler. It is one tag that links to one file, like so:

<video src="waterfall.mp4" controls></video>

Too easy! But as always, the devil is in the details. The issue with video over the internet has always been the lack of a common standard for video formats, and HTML5 does not solve this problem: as of today, HTML5 does not specify one format that a browser must support. When doing video in HTML5, you essentially have two choices: Ogg Theora or H.264. Ogg Theora is open and free of patents (as far as we know), but is less widely supported, while H.264 offers better performance. H.264 is covered by patents, but royalties are not being charged for free internet video until 2015. Today, H.264 is supported by Safari (including the iPhone/iPad), whereas Firefox supports Ogg Theora. Google Chrome supports both. Internet Explorer 8 still does not support the <video> tag. For an updated browser support comparison, see Wikipedia.

With this in mind it is easy to alter our <video> tag to link to alternate formats, so our web page will work in Safari, Chrome, and Firefox:

<video poster="poster.jpg" controls>
  <source src="waterfall.mp4" type="video/mp4">
  <source src="waterfall.ogg" type="video/ogg">
  Your browser isn’t HTML5 compliant; download the video <a href="waterfall.mp4">here</a>
</video>

The browser will automatically select the video to display based on what format it supports. In the example above, Safari and Chrome viewers will see the “waterfall.mp4” video, and Firefox viewers will see the “waterfall.ogg” video. Internet explorer viewers will see the “Your browser isn’t HTML5 compliant” message. You could easily embed a flash video here as well instead of a download link.
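For example, the fallback line could hold a Flash embed instead of a download link. The player file name and flashvars parameter below are placeholders for whatever Flash player you already use:

```html
<video poster="poster.jpg" controls>
  <source src="waterfall.mp4" type="video/mp4">
  <source src="waterfall.ogg" type="video/ogg">
  <!-- Browsers without <video> support ignore the tag and render this
       fallback; "player.swf" and the flashvars format are hypothetical -->
  <object type="application/x-shockwave-flash" data="player.swf"
          width="640" height="360">
    <param name="movie" value="player.swf">
    <param name="flashvars" value="file=waterfall.mp4">
  </object>
</video>
```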

All that remains is converting your source video into both an H.264 MP4 and an Ogg video. QuickTime Pro is relatively inexpensive and can easily convert to H.264 and many other formats. The free VLC player supports converting to Ogg. For Windows machines, SUPER can convert to FLV. There are several other free or open source tools to convert videos from one codec to another; as always, Google is your friend.

Getting Started with SQL Azure

After getting an account, you can log in at http://sql.azure.com.

They provide a very basic web interface that lets you set up the firewall or create new databases, but that’s about it. To do anything interesting you have to connect via code or Management Studio.

The most recent version of SQL Server Management Studio (2008 R2) supports connecting to Azure (it’s possible to get earlier versions to connect as well). 2008 R2 also ships with an Import/Export wizard that is supposed to support migrating data to Azure, but I have had little success with it. The open source Azure Migration Wizard has been far more reliable at moving data and informing you of any issues you’ll have migrating to the cloud.

When you connect via Management Studio the standard Object Explorer does not work, but you can connect via a new query window. Specify the connection parameters, and under “Options” select the database you want to connect to.

The first time you attempt to connect, chances are you’ll get an access denied error. SQL Azure’s firewall defaults to blocking all incoming traffic, so before you connect you have to open access to your current IP address, or the range of IPs for your location. This is easy enough to do from your account at http://sql.azure.com. If you still cannot connect, check your local firewall and ensure that TCP port 1433 (the port used by Azure) is not blocked for outgoing connections.

Once connected you have a standard query window in Management Studio, and you can perform virtually any T-SQL function. With a few restrictions, Azure is a standard SQL server database, and very simple to work with.
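Connecting from code is plain ADO.NET once the firewall is open. A minimal sketch; the server, database, user, and password below are placeholders for your own values:

```csharp
using System;
using System.Data.SqlClient;

class AzureConnectSketch
{
    static void Main()
    {
        // SQL Azure logins take the form user@servername, and Azure
        // requires encrypted connections over TCP port 1433.
        var connectionString =
            "Server=tcp:yourserver.database.windows.net;" +
            "Database=YourDatabase;" +
            "User ID=youruser@yourserver;" +
            "Password=yourpassword;" +
            "Encrypt=True;";

        using (var connection = new SqlConnection(connectionString))
        {
            connection.Open();
            // Any ordinary T-SQL works from here
            using (var command = new SqlCommand("SELECT @@VERSION", connection))
            {
                Console.WriteLine(command.ExecuteScalar());
            }
        }
    }
}
```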

PHP Frameworks, Fonts

The weather’s too good here for a more in-depth blog post, so today I just have a few of the better links I’ve run across recently:

If you do any serious work in PHP, you need to familiarize yourself with a framework if you haven’t developed one yourself. This article on Discussing PHP Frameworks is a fair introduction to the major frameworks – no in-depth commentary as to which is best, but a good starting point if you’re not familiar with them already.

Never underestimate the power of fonts in your web design, particularly when choosing a logo. And speaking of fonts, if you’re sick of the nine “web safe” fonts we’ve been stuck with for ages, you may want to experiment with the TypeSelect project from MIT.

If you’re hiring a designer, you are using a contract, right? If not, no more excuses.

A free online drag and drop flash editor?! Try out http://www.wix.com/

Programming the Ingenico 6770 in C#

One of the more troublesome projects I’ve been working on is adding signature capture and MSR support to a desktop application. One of the hardware pieces we added support for is the Ingenico 6770. The good news is that with the economic downturn you can get them for a steal on eBay. But your customers may be loath to spend what these cost new…

Regardless, here are my initial observations:

  • While you may be able to get this working in straight .NET code with the Microsoft PointOfService libraries – it isn’t worth it. Your customers will need to install the driver software from Ingenico to get the device working at all, and that software installs some ActiveX objects that expose far more reliable and powerful API methods.
  • Yes, you heard me correctly – ActiveX controls. Ingenico seriously needs to work on improving their API.
  • Ingenico also does not supply any of their SDKs directly on their web site. You will need to contact the company via phone or email and ask for a link to the SDKs. Once you do, be sure to get both the SDK for the product you are working on and the “Signature SDK”; they do not come in one package or location.
  • Once you get the SDKs and add them to your project, most parts are fairly simple. The only major problem I had was with extracting the signature from the library into a string I could save to the database, so it could be read back and displayed to the user later. It turns out that while the regular SDK documentation never mentions it, the signature SDK does: you must set BinaryConversion to OPOS_BC_NIBBLE in order to get this working.
  • The SDK did not say what the value for OPOS_BC_NIBBLE is… I’ll tell you: it’s 1.
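Putting those last two points together, the setup looks roughly like this. sigCap stands in for whatever name your project gives the Ingenico ActiveX signature-capture object, and the data property name is from my project, so treat this as a sketch rather than the exact API:

```csharp
// Standard OPOS BinaryConversion values are 0 = none, 1 = nibble,
// 2 = decimal; the SDK docs never state the number, but nibble is 1.
const int OPOS_BC_NIBBLE = 1;

// Without this setting the signature cannot be round-tripped as a string.
sigCap.BinaryConversion = OPOS_BC_NIBBLE;
string signature = sigCap.SigCapData;  // hypothetical property name
```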

Pricing Windows Azure, Amazon S3 and the Google App Engine

Microsoft has finally announced the pricing model for Windows Azure (Confirming Commercial Availability and Announcing Business Model). Azure has generated a lot of buzz among Windows and .NET developers, but one of the major unknown factors was pricing. While it’s easy to develop a test case against the Azure “beta” now, you’d be mad to base any major business decisions on it without knowing the price. While there are several other unknowns to grapple with (SLA, availability, reliability…), at least this question has been answered. In short:

Upon commercial availability we will offer Windows Azure through a consumption-based pricing model, allowing partners and customers to pay only for the services that they consume.

Windows Azure:

  • Compute @ $0.12 / hour
  • Storage @ $0.15 / GB stored
  • Storage Transactions @ $0.01 / 10K
SQL Azure:

  • Web Edition – Up to 1 GB relational database @ $9.99
  • Business Edition – Up to 10 GB relational database @ $99.99
.NET Services:

  • Messages @ $0.15 / 100K message operations, including Service Bus messages and Access Control tokens

Bandwidth across all three services will be charged at $0.10 in / $0.15 out / GB
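To put those numbers in perspective, here’s a quick back-of-the-envelope calculation (my arithmetic, not Microsoft’s) for a single compute instance running around the clock:

```csharp
using System;

class AzureCostSketch
{
    static void Main()
    {
        // One instance, 24 hours a day, for a 30-day month
        double compute = 24 * 30 * 0.12;                  // 720 hours @ $0.12 = $86.40
        // Plus 10 GB of storage and one million storage transactions
        double storage = 10 * 0.15;                       // $1.50
        double transactions = (1000000 / 10000.0) * 0.01; // $1.00

        // Roughly $88.90/month before bandwidth charges
        Console.WriteLine(compute + storage + transactions);
    }
}
```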

Microsoft’s major competitors in the “Cloud” marketplace are Amazon S3 and the Google App Engine.

Microsoft’s major advantage over the Google App Engine is the relative ease with which you can integrate a traditional desktop application with an app or database hosted online. The Google App Engine is an online-only service, and you are limited to programming in Java and Python. The pricing for the Google App Engine includes a free quota, but beyond that you must pay:

  • Outgoing Bandwidth @ $0.12 / GB
  • Incoming Bandwidth @ $0.10 / GB
  • CPU Time @ $0.10 / CPU hour
  • Stored Data @ $0.15 / GB / month
  • Recipients Emailed @ $0.0001 / recipient

Amazon S3 is the major established player in this market. Amazon started offering S3 about three years ago, and many very successful businesses (Twitter, for one) use S3 to host files and images or to improve performance. For simple file storage, they cannot be beat. However, it is not an application or relational database service; it is a very cheap and efficient means to store files. Their pricing model is as follows:

Storage
* $0.150 per GB – first 50 TB / month of storage used
* $0.140 per GB – next 50 TB / month of storage used
* $0.130 per GB – next 400 TB /month of storage used
* $0.120 per GB – storage used / month over 500 TB

Data Transfer
* $0.100 per GB – all data transfer in

* $0.170 per GB – first 10 TB / month data transfer out
* $0.130 per GB – next 40 TB / month data transfer out
* $0.110 per GB – next 100 TB / month data transfer out
* $0.100 per GB – data transfer out / month over 150 TB

Requests
* $0.01 per 1,000 PUT, COPY, POST, or LIST requests
* $0.01 per 10,000 GET and all other requests*

The choice of service largely depends on the business needs of your application (pricing is very similar), but it’s good to see the competition from three major players in this space.