Windows Azure Cloud Costs Analysis

If you’ve ever tried to figure out what Cloud Hosting would cost, you may have thought, “wow, that’s complex.”

I’m here to confirm that thought and maybe help out with the knowledge I gained from four days of back-and-forth emails with Microsoft’s Billing Support.  They’re very helpful, but I think their pricing model is probably a barrier for some architects.  Regardless of the complexity, it turned out to be a very good deal.

Azure Hosting if you have an MSDN license:
If you currently have a Visual Studio Premium with MSDN subscription, it may include a $100 monetary credit each month. As long as your usage stays within that $100, you will not be billed; if your usage exceeds $100, the overage is charged at standard rates.
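Put as a rule, the credit works like this (a trivial sketch; the function name is mine):

```python
def msdn_billed_amount(monthly_usage_usd, credit_usd=100.0):
    """Amount billed after the MSDN monetary credit is applied."""
    # usage at or under the credit is free; only the excess is billed at standard rates
    return max(0.0, monthly_usage_usd - credit_usd)

print(msdn_billed_amount(80.0))   # fully covered by the credit
print(msdn_billed_amount(130.0))  # $30 billed at standard rates
```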

Standard Website
You will be metered for compute hours because you are using reserved units for your website.
How many compute hours are there in a month?  930 compute hours (30 billed hours per day across a 31-day month).
Pricing = 930 hours x $0.06/hour = $55.80 USD.

WordPress Website’s MySQL Database

Since I made a WordPress site, I discovered I get one free 20 MB MySQL database license.  Additional MySQL databases will be charged.

How much would I be charged for a second MySQL database?
Standard Rates:

  • 0 to 100 MB         $4.99
  • 100 MB to 1 GB      $9.99
  • 1 GB to 10 GB       $9.99 for the first GB, $3.99 for each additional GB
  • 10 GB to 50 GB      $45.96 for the first 10 GB, $1.99 for each additional GB
  • 50 GB to 150 GB     $125.88 for the first 50 GB, $0.99 for each additional GB
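For a quick sanity check, here’s a sketch of those tiers as a price function (function name and examples are mine; pro-rating is ignored):

```python
def mysql_monthly_price(size_gb):
    """Monthly price (USD) for a MySQL database of the given size, per the tiers above."""
    if size_gb <= 0.1:    # 0 to 100 MB
        return 4.99
    if size_gb <= 1:      # 100 MB to 1 GB
        return 9.99
    if size_gb <= 10:     # $9.99 for the first GB, $3.99 for each additional GB
        return 9.99 + 3.99 * (size_gb - 1)
    if size_gb <= 50:     # $45.96 for the first 10 GB, $1.99 for each additional GB
        return 45.96 + 1.99 * (size_gb - 10)
    if size_gb <= 150:    # $125.88 for the first 50 GB, $0.99 for each additional GB
        return 125.88 + 0.99 * (size_gb - 50)
    raise ValueError("150 GB is the largest tier listed")

# a second 5 GB database: $9.99 + 4 extra GB at $3.99 each
print(round(mysql_monthly_price(5), 2))  # 25.95
```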

How much would a Microsoft SQL Server Database Cost (what are standard rates)?
When you set up a SQL Database in Azure, you can choose from the following tiers:

  • 0 to 100 MB         0.5 DU
  • 100 MB to 1 GB      1 DU
  • 1 GB to 10 GB       1 DU for the first GB, 0.4 DU for each additional GB
  • 10 GB to 50 GB      4.6 DU for the first 10 GB, 0.2 DU for each additional GB
  • 50 GB to 150 GB     12.6 DU for the first 50 GB, 0.1 DU for each additional GB

These rates are prorated.
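The DU tiers line up neatly: each tier’s base equals the previous tier filled to its cap (1 + 0.4 x 9 = 4.6, and 4.6 + 0.2 x 40 = 12.6). Here’s a sketch of the tiers as a function (names are mine; the post doesn’t give the dollar rate per DU, so this computes units only):

```python
def sql_db_units(size_gb):
    """Database units (DU) for a SQL Database of the given size, per the tiers above."""
    if size_gb <= 0.1:    # 0 to 100 MB
        return 0.5
    if size_gb <= 1:      # 100 MB to 1 GB
        return 1.0
    if size_gb <= 10:     # 1 DU for the first GB, 0.4 DU for each additional GB
        return 1.0 + 0.4 * (size_gb - 1)
    if size_gb <= 50:     # 4.6 DU for the first 10 GB, 0.2 DU for each additional GB
        return 4.6 + 0.2 * (size_gb - 10)
    if size_gb <= 150:    # 12.6 DU for the first 50 GB, 0.1 DU for each additional GB
        return 12.6 + 0.1 * (size_gb - 50)
    raise ValueError("150 GB is the largest tier listed")

# a 25 GB database: 4.6 DU for the first 10 GB + 15 extra GB at 0.2 DU each
print(round(sql_db_units(25), 1))  # 7.6
```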

Why does a “Windows Azure Website” show up under “Cloud Services”?
For billing purposes only, Azure Web Sites (standard) uses the Windows Azure Compute Hours meter (also used by Cloud Services). This is why the website shows up as “Cloud Services” on the invoice/detailed usage report.

If I have 3 WordPress websites, how can I minimize the costs?
By default, when you create a Standard website, a reserved unit is created for that sub-region (ex: East US).
Any additional websites that you create in the same sub-region are automatically linked to this reserved unit.
You are billed only for the reserved unit and not for the number of websites.
Therefore, to save cost, you can create all three websites in the same sub-region.
You would be billed for only one reserved unit, i.e., 30 hours per day.
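Putting numbers on that, using the 930-hour month and $0.06/hour rate from the Standard Website example above (variable names are mine):

```python
RATE_PER_HOUR = 0.06   # USD per compute hour, from the standard-rate example above
HOURS_PER_MONTH = 930  # one reserved unit: 30 billed hours/day x 31 days

unit_cost = HOURS_PER_MONTH * RATE_PER_HOUR

one_region = 1 * unit_cost     # all three sites share a single reserved unit
three_regions = 3 * unit_cost  # one reserved unit per sub-region

print(round(one_region, 2), round(three_regions, 2))  # 55.8 167.4
```

Consolidating into one sub-region cuts the compute bill to a third.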

Well, hope that helps anyone out.

Intro to SQL# for Twitterizer

In this post we’re talking a little bit about an add-on tool for SQL called SQL#.  The idea is to give programmers useful functions in T-SQL and Enterprise Manager.  It is a library of CLR (Common Language Runtime) functions, so it works on SQL Server 2005 and later.  Getting started is pretty easy; the instructions are on the home page of the SQL# website:

The simplest way to start is to create an empty database called “Sharp”, then run the script they provide and open a new query window.

You can review a lot of the features here, but what really piqued my interest was the RegEx, String, and Twitter features.  I have wasted a lot of coding time in the past re-validating data passed from the middle tier.  Why doesn’t Microsoft put the .NET Framework into Enterprise Manager?  They gave us SMO (SQL Server Management Objects) so we could mess with SQL in C# (probably to the dismay of all DBAs that pay attention), so why not the other way around?  Solomon Rutzky got tired of that question and built SQL#.

Let’s start with a few examples.  I’ll be using the AdventureWorks database on SQL Server 2008 R2 (or, as we like to call it, SQL 2010…).  If you don’t have that database, you can get it here.  These examples are taken/modified from another great blog.

/* SQL# Calculating business days */
SELECT  wo.DueDate,
        [WorkingDays] = Sharp.SQL#.Date_BusinessDays(wo.DueDate, wo.EndDate, 3)
FROM    Production.WorkOrder AS wo
WHERE   wo.EndDate > wo.DueDate ;

/* Calculating the distance between two points (SQL Server's built-in spatial types) */
SELECT  [Meters] = a1.SpatialLocation.STDistance(a2.SpatialLocation)
FROM    Person.Address AS a1
        JOIN Person.Address AS a2
        ON a2.AddressID = 2
           AND a1.AddressID = 1;

/* SQL# Delete files older than n days via T-SQL */
DECLARE @StartingDirectory NVARCHAR(500) = N'C:\Logs',  -- example values
        @Recursive BIT = 1,
        @DirectoryNamePattern NVARCHAR(100) = N'*',
        @FileNamePattern NVARCHAR(100) = N'*.log'

SELECT SQL#.File_Delete(files.Location + N'\' + files.Name)
FROM SQL#.File_GetDirectoryListing(@StartingDirectory, @Recursive, @DirectoryNamePattern, @FileNamePattern) files
WHERE files.LastWriteTime < (GETDATE() - 3)

Let’s move onto selecting your posts from Twitter.  First make sure you’ve followed these instructions:

/* SQL# Return a table of tweets  */
DECLARE		@ConsumerKey		NVARCHAR(100),
		@ConsumerSecret		NVARCHAR(100),
		@AccessToken		NVARCHAR(100),
		@AccessTokenSecret	NVARCHAR(100)
SELECT		@ConsumerKey = 'x',
		@ConsumerSecret = 'y',
		@AccessToken = '7-z',
		@AccessTokenSecret = 'z'

SELECT		StatusText,Created,ScreenName,UserName
FROM		SQL#.Twitter_GetFriendsTimeline(@ConsumerKey, @ConsumerSecret,
                                        @AccessToken, @AccessTokenSecret, NULL)

This is the same functionality I demo’d in C# in my first post.

In conclusion, this gives me an alternative to LINQ for business logic and lets me keep that logic in stored procedures, which is especially nice when I have to write reports that never pass through a C# layer.


Convert a Blog Website into a SQL Data Dump

The challenge I faced: I wanted to catalog a blog site, format the content, and dump it into a SQL database to help out a buddy.

Solution Misstep 1:

I began to go down the road of using .Net’s WebRequest and WebResponse objects.

You can find a lot about this on Google, and the examples run like this:

using System;
using System.IO;
using System.Net;
using System.Text;

// used to build the entire input
StringBuilder sb = new StringBuilder();
// buffer used on each read operation
byte[] buf = new byte[8192];

// prepare the web page we will be asking for (placeholder URL)
HttpWebRequest request = (HttpWebRequest)WebRequest.Create("http://example.com/");
// execute the request
HttpWebResponse response = (HttpWebResponse)request.GetResponse();
// we will read data via the response stream
Stream resStream = response.GetResponseStream();

string tempString = null;
string beginstr = "<body>"; // placeholder marker for where the WOD starts
int count = 0;
int begin = 0;

FileStream fs = new FileStream("test.txt", FileMode.Create);
StreamWriter writer = new StreamWriter(fs);

do
{
	// fill the buffer with data
	count = resStream.Read(buf, 0, buf.Length);
	// make sure we read some data
	if (count != 0)
	{
		// translate from bytes to ASCII text
		tempString = Encoding.ASCII.GetString(buf, 0, count);
		// locate and isolate the WOD
		begin = tempString.IndexOf(beginstr);
		if (begin > -1)
			sb.Append(tempString.Substring(begin));
		writer.Write(tempString);
	}
} while (count > 0); // any more data to read?

writer.Close();
Console.WriteLine("Press enter and all that.");
Console.ReadLine();

Wow, doesn’t that feel like a whole lot of code for something pretty simple?



A buddy stumbled across this in a post.  It does everything the code above does in one line.  Just for kicks, it also strips the HTML out for you.  Nice.

   public string RemoveHtml(string sURL)
   {
       try
       {
           using (System.Net.WebClient wc = new System.Net.WebClient())
               // download the page and strip anything that looks like an HTML tag
               return System.Text.RegularExpressions.Regex.Replace(
                   new System.IO.StreamReader(wc.OpenRead(sURL)).ReadToEnd(), "<[^>]*>", "");
       }
       catch (Exception)
       {
           return null;
       }
   }

I’ll take it.  If you take a look at the two sets of code, you’ll notice the second snippet creates a WebClient object and just grabs the HTML dump with OpenRead.  The first snippet creates a WebRequest, fires it off, and gets the WebResponse; then it fills a byte-array buffer and runs a do-while loop to read the stream chunk by chunk.  Wow!  I’ll let you decide which code you’d want to use.  Needless to say, the smaller code footprint was faster.



Alas, none of this was getting me any closer to being able to crawl through a specific set of URLs.  Eventually I realized I was looking for a SiteMap generator, and where better to look for advice than Google.  They’ve compiled a very nice page of SiteMap generators, and although I was looking forward to building my own in C#, the first rule of programming is laziness, so I quickly found a free tool that got me closer to what I needed for this task:

Slick tool.

That got me an HTML dump of all the blog pages.  A little Notepad++ and Excel sorting got me what I wanted (date/day/content).  Here are a couple of links I found along the way:

Then I dumped the Excel sheet into a SQL database table.  Done.

Conclusion? This was a case where I was reinventing the wheel.  Standing on the shoulders of giants saved me a lot of thrashing.



© Copyright Duke Hall - Designed by Pexeto