Dev: Flask Intro 1

Recently, I attended a Python Meetup. The first presenter was Carter Rabasa from Twilio. He gave two short demos of Flask. From the site:

Flask is a microframework for Python based on Werkzeug, Jinja 2 and good intentions.

More specifically, it’s a micro web framework.
The best-known alternative is Django, a full server-side MVC web framework.

This motivated me to run out and buy the book, “Flask Web Development” by Miguel Grinberg. This blog series will follow my experiments between the book, external sources like Carter’s open source code, and anything else I might fancy. My bigger motivation here is to see if Flask can help shrink the code footprint of a standard REST interface. I have done these in C#, and while I appreciate the larger frameworks, I often find they’re a bit overkill for my needs, particularly with the current trend in business shifting us towards microservices. This slide from the presentation sort of explains what I mean:

First things first: running Flask on your dev machine.

Setting up your Virtual Environment on OS X:

sudo easy_install virtualenv
mkdir flask_test
cd flask_test
virtualenv venv
source venv/bin/activate
(venv) pip install flask

That installs VirtualEnv. From the site:

virtualenv is a tool to create isolated Python environments.

Best to keep it clean. As an aside: if you need more than this, you might look to Vagrant.
Then you make a directory, create a virtual environment in it, activate it, and finally install Flask.
You’re now ready to write the obligatory hello world:

Open a text editor (like SublimeText) and let’s put this code in:

from flask import Flask
app = Flask(__name__)

@app.route('/')
def index():
    return '<h1>Heyas world</h1>'

@app.route('/user/<name>')
def user(name):
    return '<h1>Heyas, %s.</h1>' % name

if __name__ == '__main__':
    app.run(debug=True)

Save that as hello.py (any name will do).

So what’s happening here:
We import the Flask library and create an app. If you’re familiar with Express, it’s similar.
Then we define two routes: the first handles requests to the base URL; the second shows a dynamic route.
Then we run the app.

Back in your terminal:

(venv) python hello.py

That should run your site:



With debugging enabled, you should be able to watch HTTP status codes being logged as you try out your site:

Okay, there’s step 1.

Dev: Continuous Integration with Jenkins in a mixed Linux and Microsoft Environment

Jenkins is an open source continuous integration server. It boasts 929 plugins to handle just about any sort of bizarre requirement you can throw at a project. In most cases, setting up CI or build servers is a tedious but incredibly important part of any environment.

Why Setup Continuous Integration
If your career has remained in small shops, you may not realize how easy something like this can make your life. If you put a little investment into setting up CI, you can gain quite a lot of time and peace of mind back on your system’s environments.
For me there are two big wins here:

  • First: a continuous integration setup means you can reasonably disconnect your developers from your deployments. In small shops this is interpreted as a loss of control; however, I sell this to developers as a risk management technique. No developer wants to be held accountable for breaking production at 2am. Automate this process, hand the keys over to the people who want to wake up at 2am, and back away slowly from the smoking gun.
  • Second: continuous integration is a positive feedback loop for good project maintenance. Once you have nightly builds configured, your developers will quickly learn not to check in broken code, and will use peer pressure to ridicule any developer who “breaks” a QA environment because they were too lazy to make sure their code built. It’s also nice to configure test cases to run before building, so the “breaks” should be caught. If they’re not caught and something breaks, you know you need more tests.

Jenkins on Linux
In most cases, you’re going to find examples of Jenkins being installed on a Linux server and configured through its administrative website. If you’re deploying to all Linux servers, life is easy. However, if you have even one server that requires any .NET compilation… well, life is not easy. You need the MSBuild plugin, which needs the MSBuild binary. Surprisingly, Microsoft does not actually make a Linux distribution of this tool (haha). If you rolled Jenkins on Debian or CentOS, you’re in a sticky place where you have to rely on Wine or Mono to hopefully execute a Windows DLL. While this is a cute technical challenge, it’s also a waste of time in most cases that adds nothing to your project but hours and maybe a few Stack Exchange points.

Jenkins on Windows
If you run Jenkins on Windows, there really are no technical challenges. Deploying to Linux and Windows systems is doable with standard plugins.
With three plugins you can integrate a Git repository and deploy to Linux and Windows servers:

  1. Git Plugin
  2. Publish Over SSH Plugin
  3. MSBuild Plugin

Installation on Windows
Jenkins is written in Java, so you will need the Java Runtime Environment (JRE) installed on your server.

You’ll also need to install Git on the server so Jenkins can use it.

Get the Jenkins Windows Installer:
Outside of the plugins, there are very few configurations you have to make. Go to Configure Security:
Click “Enable security”
Select “Jenkins’ own user database” as the security realm
Select “Matrix-based security”
And create an account.
Then add in whatever plugins you need.

There are a few “gotchas” to avoid frustration:

Regarding MSBuild
The Jenkins MSBuild plugin requires .NET Framework 4.5 to be installed if the Visual Studio project depends on it.

Regarding Git
The initial path is wrong. In “Manage Jenkins–>Configure System”, change it to “C:\Program Files (x86)\Git\cmd”.
Be careful when you set up your Git credentials on your job. Jenkins will automatically retry your credentials without asking, over and over. If you typo’d them, your account will get locked out.

Regarding Publish Over SSH
Server Setup is here: Jenkins->Manage Jenkins->Configure System
This is where you configure and add servers. These will populate the server drop down when you are creating a new job.

Just to restate: the point is not that a Jenkins install on Linux cannot handle running MSBuild through Wine or Mono.
The point is that going through this exercise is not always mission critical, and that Jenkins can easily run on a Windows machine and handle deployments to all machine types without the extra time spent setting up the above.

Dev: Enterprise Backbone.js Project Setup

Today I want to discuss the how’s and why’s of a Backbone.js implementation.

Ultimately, when doing client-side MVC, you come face to face with Google’s popular Angular framework. Many people will question you if you use anything but Angular. Another architect I met summed it up well, though. He said that teaching a team Angular takes some time, because you have to learn the “Angular” way of doing things. Using Backbone, it’s a straight Model-View-Controller setup. There’s really nothing new in there to learn, and I’m a big fan of eliminating learning curves for teams when possible.

So, that’s as good a reason as any. Now, once you’ve decided to go down the Backbone road, there are a lot of opinionated decisions you have to make. Ironic, since Backbone is supposed to be “opinionated” according to some definitions. Personally, I think there were still too many choices left up to the developer… ergo, pitfalls.

Folder Structure
The first thing you have to decide is how you want to setup your project.
If you’re smart, you want to do something that will make sense to programmers that come later.
Here is what we ended up using:

     |-- [scss]
     |-- [img]
     |-- [vendor] (where you stuff 3rd party plugins)
     |-- [js]
         |-- [lib] (external scripts: handlebars, foundations, i18next, etc)
         |-- [collections] (your backbone extended collections)
         |-- [models] (your backbone extended models)
         |-- [templates] (handlebars templates for programmatic views)
         |-- [views] (your backbone extended views)
         |-- app.js (configure your backbone app)
         |-- main.js
         |-- require-config.js (setup require.js dependencies)
         |-- router.js (setup your routes)
     |-- [locales]
     |-- [tests]
     |-- Gruntfile.js (builds the project)
     |-- package.json (contains all the packages...)
     |-- server.js (contains the Node.js website configuration)
     |-- build.js (optimizes javascript files)
     |-- 404.html
     |-- favicon.ico
     |-- index.html

This came from Mathew LeBlanc’s excellent Backbone.js course. Six months ago, I tried to follow Addy Osmani’s Backbone book, only to find its source code was out of date and didn’t work. There are some good ideas in it, but the code no longer runs.
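As a concrete example of how those files tie together, a minimal require-config.js might look like the sketch below. The module names and paths here are assumptions based on the [lib] folder above, not the actual project; match them to your own files and versions.

```javascript
// require-config.js - wires up the libraries under js/lib for Require.js
require.config({
  baseUrl: 'js',
  paths: {
    jquery:     'lib/jquery',
    underscore: 'lib/underscore',
    backbone:   'lib/backbone',
    handlebars: 'lib/handlebars'
  },
  shim: {
    // older Handlebars builds don't register as AMD modules,
    // so expose the global to Require.js
    handlebars: { exports: 'Handlebars' }
  }
});

// kick off the app once the config is in place
require(['app'], function (app) {
  app.initialize();
});
```

index.html then only needs a single script tag pointing Require.js at this file via its data-main attribute.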

Here’s a short description of the 3rd party scripts we started with:

Style Sheets
Foundation handles the mobile-first design.
For our own CSS organization, we follow the SMACSS (Scalable and Modular Architecture for CSS) approach. On top of that, we’re using SASS with Compass, which compiles the .scss files.


SASS and LESS are popular CSS preprocessors (Foundation is built on SASS, so it aligns better than LESS for us). That means they manage and compile the CSS code for your project. To get the build/compile functionality, we’re using Compass, which is written in Ruby, so you need Ruby installed for it to work.

The folder structure ends up looking like this:

     |-- [css]
         |-- main.css (generated)
     |-- [scss]
         |-- [base]
         |-- [layout]
         |-- [module]
         |-- [state]
         |-- [theme]
         |-- _mixins.scss
         |-- _variables.scss
         |-- main.scss

Now let’s explain that a bit.
The base structure comes from SMACSS.
main.scss: imports all of the other .scss files. This drives the compilation process, producing the only CSS file that will be minified.
_mixins.scss: Custom mixins.
_variables.scss: Contains scss variables (colors/fonts).

You define styles in base, layout, module, state, and theme.
So if you create a Backbone view called “medical-survey.js”, then the Handlebars template would be “medical-survey.html” and the style sheet would be “medical-survey.scss”.

On small projects you might think that’s annoying, but on a large-scale project it’s very handy. It allows us to chunk out work while still compiling it all into a single, minified style sheet at the end.

The important part of SASS comes with this setup line of code:

sass --watch scss:css --style compressed

This tells SASS (running via Ruby/Compass) to watch the scss folder and compile it into the css folder, with --style compressed stripping out the whitespace. main.scss imports all the other .scss files, so everything rolls up into the final main.css. Even compressed, this file *will* come out very large. It will often be huge and can slow down your site unacceptably…

Minification and Compression
So, suppose you’ve implemented Require.js in the project to handle the efficient loading of JavaScript files. What do you do about the massive CSS file you just generated using SASS above? If you are using Node.js, there’s a solution in the middleware: compression.

The website discussing it is here:

This will use gzip to decrease the file size when loading the website.

There’s a lot of thought, effort, and detail completely missing from this blog. But if you have any questions, please let me know. My goal is always to help teams use simple and tested methods, but when we use newer techniques, we’re faced with not being able to get the best information about using them. This is definitely a major caveat to using technology developed in the last 3-5 years. We often do it anyway, because new techniques are usually developed to resolve an annoyance we’ve been dealing with. Node.js resolves the annoyance of web servers like IIS and Apache being a bit slower and more complex than we would like. It also adapts well in scalable scenarios.

Likewise, I don’t see any reasonable way to avoid the modern problem of having ridiculous amounts of JavaScript and CSS. It’s just the modern problem. We don’t want to take the time hit to a project (and should not) to reinvent the wheel by rewriting things other developers have done via jQuery and other handy tools.
I have been coding CSS since 1998, and what began as a tool to *simplify* applying global (and thus branded) styles has grown into a monster; albeit a highly useful one that most developers simply won’t take the time to really understand. SASS and LESS are simply ways for these developers to deal with the massive amount of CSS by compiling it down into at least one file. And compression makes it manageable. But it doesn’t solve the root problem:
The CSS cat has gotten out of the bag, and no one is quite sure how to get it back in.

The other root problem is unsolvable: the companies building the browsers will never follow a standard unless it’s the “standard” they made. Just accept it and you’ll be happier.

Setup: MariaDB 10.0 on Azure

Back again with some more notes.

Opinion and Motivation

MS SQL is a popular tool, but it has a big hurdle in cloud adoption: pricing.
The obfuscation of pricing in Azure hurts. When a client asks how much it will cost to run an MS SQL cluster, no one can answer this very well. It’s sort of like asking Azure billing support how much running SharePoint in Azure would cost (this is an inside joke; most of the Azure billing support reps don’t know about the Office 365 cloud).
When you finally get an answer, well, it’s quite expensive. You’re paying for the Azure DB or the VM running SQL 2012; you’re paying for licenses, cores, compute time, huh? Enterprise shops are not immune to budget cuts. If you’re simultaneously being asked to migrate to the cloud and cut expenses… remember this post.

Solution: Use another popular relational database: MySQL
MySQL is the famed open source database.

Problem 1: Oracle bought it. If you don’t get why that’s a problem; you’re probably an executive. Congrats.
Problem 2: The original developer left and built a better version.

New Solution: Use the less popular open source relational database: MariaDB
Never heard of it? That’s okay. This post explains it pretty well:

Okay, so, that’s why, now let’s focus on the how-to.
If you followed my last blog, you’ve got a Virtual Machine in the Azure Cloud running Ubuntu 13.10. So what we’re going to do now is:

  • Install MariaDB 10.0
  • Connect to it via MySQL Workbench from my local desktop
  • Create a test database
  • Go back to the Azure VM and prove it worked.

Install MariaDB

Here is the official website:

Open Putty and connect to your Azure VM running Ubuntu.
Issue these commands:

sudo apt-get install software-properties-common
sudo apt-key adv --recv-keys --keyserver hkp:// 0xcbcb082a1bb943db
sudo add-apt-repository 'deb saucy main'

That sets up the key and repository. Now you can install it.

sudo apt-get update
sudo apt-get install mariadb-server

This takes you to the following setup screen for MariaDB:

Verify MariaDB Works & Nice To Know’s

My source for this is the documentation:

To connect on your Azure VM to your MariaDB installation:

mysql -u root -p -h localhost

Here’s what you should see:

Great, now let’s just take it a little further so we have something to work with.
Enter the following commands into MariaDB to create a test database and look around (the database name here is arbitrary):

CREATE DATABASE test_db;
SHOW DATABASES;
USE test_db;
SHOW TABLES;

Okay, so, it’s installed, alive, and working properly.

If you want to connect remotely to your MariaDB/MySQL database, be sure to:
Create an ENDPOINT on the Azure Virtual Machine
Go to your Azure Portal, click on Virtual Machines in the left navigation, and select your test VM.
Go to the EndPoints tab in the main portion of the portal.
At the bottom center, select Add (which implies you want to add an endpoint).
Select MySQL from the Endpoint drop down and it should look like this:

In a near-future post, I will describe how to connect to your MariaDB via MySQL Workbench.

Setup: Azure, Ubuntu 13.10, Node.js, Express.js

In my last post, I covered how to set up a VirtualBox virtual machine on a Windows host for an Ubuntu 13.10 guest, running as a web server with Node.js.

This post will take it to the next level: the cloud. Microsoft’s Azure cloud specifically. We will spin up a virtual machine in Azure with an Ubuntu image. From what I’ve seen, the Azure team has put a lot of effort into their Ubuntu images in the gallery. They’re pretty nice.

The next step will be to set up an SSH connection using Putty. Putty tends to be the most popular way to connect to remote servers that only have a command line. It’s not very exciting from a user interface perspective, but these are web servers, not gaming machines.

After the Putty install, certificate key generation, and connection to the server, the rest will be the usual install with some additional information I learned to make the process a bit better. Specifically, I’ll focus on setting up Express.js on top of Node.js. Express.js is a server side MVC framework that runs on top of Node.js and makes handling requests just a little bit easier.

Okay, on with the walk through.

The Azure Portal part

Log into your Azure Portal using your Windows Live Login ID.
On the left navigation of the Azure portal, click the Virtual Machines.

Then in the bottom left, there will be a “+New” button. Click that to begin creating your new virtual machine.
Pick through the choices like in this image:

Choose the Ubuntu 13.10:

This will take you to the first server setup page. Before you can fill this page out, though, you need a very important thing: an Azure-compatible key.
So, open a new browser tab and download a program that generates SSL keys.

If you want to know how to get an SSL cert for your Windows Azure account, please follow the instructions here:

It can be a pain, but eventually you’ll end up with a .pem file that you enter on the Azure VM page.

Fill out the rest of the information: server name, size (small), username, password along with your .pem cert file.
This takes you to the second screen of the VM setup, where you can choose your DNS name, storage account, and regional affinity.
Choose a region physically closest to you or your client.

The final screen lets you configure your End Points. You want a web server and you want to control it through Putty, so you need three: HTTP, HTTPS, and SSH. Just select them in the drop down and Azure does the work for you:

Create the server and watch it spin for a few minutes. Now that part is done. Next we will setup Putty. If you are a Windows Developer, you may never have used a Telnet or SSH client before. You may feel like you’ve gone back in history 20,000 years to witness the awesome power of a server without a user interface. And you may be surprised to know that this is exactly why Unix admins made jokes behind your back. Well, now it’s time to put that behind you.

The Putty part

Okay, so the first thing you need is not Putty, it’s the puttygen program. It can be downloaded here:
You might be thinking: why is that URL not some more legit-sounding domain? Suffice it to say, the developer doesn’t care.

On that page, look for the binaries section and download puttygen and putty.

The Ubuntu part

Okay, so you’ve got your VM, you’ve installed and configured Putty, and you’re connected to the VM. Here’s the command line steps to get a minimal install setup:

sudo apt-get update
sudo apt-get install -y python-software-properties python g++ make
sudo add-apt-repository ppa:chris-lea/node.js
sudo apt-get update
sudo apt-get install g++ curl libssl-dev apache2-utils
sudo apt-get install git-core
sudo apt-get install nodejs
sudo npm install express
sudo nano package.json

Copy these lines into your window, then hit CTRL-X, Y, Enter. This file sets up the dependencies for the project. In this case, we’re just including Express. Later, when we do sudo npm install, that command will run through package.json and install any libraries under the dependencies section. If you have them mistyped… well, it won’t work.

{
  "name": "a-test-app-no-spaces",
  "description": "spaces are okay here",
  "version": "0.0.1",
  "private": true,
  "dependencies": {
    "express": "3.x"
  }
}

sudo nano server.js

var path    = require("path");
var fs      = require("fs");
var http    = require("http");
var express = require("express");
var app     = express();

//capture logs in the console
app.use(express.logger('dev'));

//serve static files - __dirname alone means server.js sits in the same folder as your HTML files
app.use(express.static(__dirname));

//404 error
app.use(function(req, res, next) {
  res.send(404, "file not found");
});

//start server
http.createServer(app).listen(80);
console.log("listening on port 80");

sudo npm install
sudo node server.js

So, after all this, you can open a browser to the domain name of your server, and you should see the 404 error message, since you have no HTML files:

And if you are watching your Putty session, you should see the activity logged by Express.js like this:

And there you have it.
So now, we’ve covered how to set up this web server in both VirtualBox and Windows Azure.
Thanks for checking in.

© Copyright Duke Hall - Designed by Pexeto