meta comments edit

This post is pretty meta, actually, but I found it a little too interesting not to publish.

When I decided to restart my blog, I wanted to use GitHub Pages / Jekyll. The problem with that is that I also have my own domain name, and I strongly believe in ‘HTTPS ALL TEH THINGS’ - which I could not do with GitHub Pages and a custom domain.

I knew there had to be a way to accomplish my very simple requirements:

  • Jekyll
  • No GitHub

Eventually, I came to the realization that all I really needed was a Linux VM, and some coffee.

My first stop was Azure, ’cause I’m a Microsoft-stack kind of guy, but the price of a Linux VM that also allowed custom domains and HTTPS was over $50 USD/month. I ain’t spending that on a blog. Then, I remembered my old friend, Digital Ocean. I knew I could spin up a low-end VM, still plenty for a blog, for about $10 USD/month, which I was much more comfortable with.

So, I signed back up, and fired up a basic Linux VM.

With the VM ready, I just had to figure out the configuration for everything -

  • Ruby
  • Jekyll
  • Git (plus some way of building the site when I pushed changes, à la GitHub Pages)
  • Apache
  • Let’s Encrypt

Since I’m not a Linux guy, this took me about 4 hours to figure out. Every article I referenced did things a little bit differently than the last, and nothing worked for me. I ended up blowing away the VM at least 3-4 times.

Finally, after multiple attempts, I landed on the right set of apt-get installations and configuration changes to get all of this going.

Before I dive into the configuration settings and walk-through, here is an overview of my blogging process, in graphical form:

blogging process overview

It all starts with me, the blue guy in the upper left corner. I make a post, and push it to my private Git repository hosted on my Linux VM from Digital Ocean. The Git repository has a post-receive hook that clones the repository into a temporary location, does all the Jekyll magic, pushes the files into the Apache folder for the site, and then cleans up after itself. The site in Apache is visible to the internet, and bound to a Let’s Encrypt certificate that auto-renews daily via a cron job.
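To give you an idea before the full walk-through, the post-receive hook boils down to something like this (the paths here are placeholders, not my actual config):

```shell
#!/bin/bash
# post-receive hook: rebuild and publish the Jekyll site on every push.
# All paths are placeholders -- adjust for your own setup.
REPO=/home/calvin/blog.git     # the bare repo being pushed to
BUILD_DIR=$(mktemp -d)         # temporary clone location
SITE_DIR=/var/www/blog         # the folder Apache serves

git clone "$REPO" "$BUILD_DIR"
jekyll build --source "$BUILD_DIR" --destination "$SITE_DIR"
rm -rf "$BUILD_DIR"            # clean up after ourselves
```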

You can also see an alternate path, where I can push the code to GitHub. This is to allow anyone to fork/clone the repository (so they can start a similar site, or maybe push a change to mine). If they decide to push a change to mine, I can integrate the change locally, and then push it to the private site.

I really like this setup (now that it’s all set up and configured the way I want/need it to be).

In the next post, I’ll begin outlining the steps I took on the Linux VM to make the magic happen.

meta comments edit

I’m back!

Not that you care, but still.

Content forthcoming!

Thanks for reading!

itunes, letsencrypt, podcast comments edit

For all of you podcasters out there, Apple has FINALLY enabled support for Let’s Encrypt certificates to be used on your podcast feeds. I only recently ran into this issue standing up a new feed for The Talking Devs and getting denied by the iTunes submission process (but accepted right away by Google’s).

Apparently, this change also happened somewhat silently, and currently they (Apple) haven’t even updated their FAQ on the changes.

I learned about this via a friend, who pointed me to this article:

extensions, visual studio comments edit

Every single time I install Visual Studio, the very next step I take is to install my favorite extensions. I only have three, and here they are, in order:

  • ReSharper - I can’t live without this anymore. I’m finally to the point where I have a lot of shortcuts memorized. VS is catching up, but R# still takes the win.
  • NCrunch - Continuous testing, right in VS. Love my green dots.
  • WallabyJs - Continuous JAVASCRIPT testing, right in VS. Love my green squares :D

Pretty simple list, eh?

What about you? What are your favorite extensions? Is there something groundbreaking that I’m missing out on? Something you developed yourself? Leave it in the comments!

*Also, props to WallabyJs for being the only one in my list WITH HTTPS, BY DEFAULT.

extensions, visual studio, notepad++, oss comments edit

This past weekend, I took a few moments to relish the “fun” that is programming something that is not work-related. That creation came to be a Visual Studio extension known as “Open in Notepad++”.

You might be wondering, what does this extension even do?

Well, it adds a new item to the context menu when right-clicking files/folders in the Solution Explorer:

Context Menu

The extension will attempt to find your installation of Notepad++, but if for some reason it cannot, you can always manually update the path in Tools | Open in Notepad++:

Settings Pane

You can install this extension through the Extensions and Updates dialog inside of Visual Studio:

Install Dialog

Or, grab the VSIX directly from the Visual Studio Gallery (also, go here to leave awesome reviews for me):

Also, if you find any issues, or want to contribute, head over to GitHub:


python, drm, google comments edit

I purchase all of my music through Google Play, and I primarily use their web application to listen to it. As a precaution, I also use the Music Manager application to keep a local backup of all of this music on my NAS.

Recently, I wanted to switch to a new Google Account because I was changing primary email addresses. Before doing this, I made sure the music manager had downloaded any purchased music to my NAS because I will be deleting the old Google Account permanently.

Once I had verified the music had been downloaded, I logged out in the Music Manager application, and logged in under my new account. At this point, I had to wait for ALL of my music to be re-uploaded to Google’s servers.

After the music had uploaded, there were 17 songs that could not be uploaded because I had purchased them under a different account. (Apparently not all of the music contains DRM?)

You can see the error in the troubleshooter –

“Song was purchased with another Google Play account”

A little digging led me to understand that Google hides its DRM in a hidden ID3 tag that is not easily removable by conventional means. This digging, however, also led me to a Python utility that could do it – and so, into the world of Python.

First, I had to install Python, which was dead easy – *Note: I am on a Windows machine, and the utility I will show you apparently requires version 2.7 of Python, so get that one :)

During the installation, I let the installer add python.exe to my PATH variable.

Once the installation was complete, I launched a command prompt and installed ‘eyeD3’ using pip:

pip install eyeD3

Now, navigate to your music directory (mine is on my NAS, mapped to my computer as “M:”)

cd /d M:

Execute one simple command, and let it run (depending on how much music you have, it could take a while)

python -m eyed3.main --remove-frame PRIV ./

You should now see a lot of text flowing through your console output. That’s eyeD3 doing its thing. Let it run. Go grab a snack.

Once completed, the Google Music Manager picked up on the fact that the files had changed on disk, and started to re-upload them – without error I might add!

career, meta comments edit

I realized earlier that today marks my one year anniversary at Heuristic Solutions (thanks, Facebook ‘On this day’ :-) ).

My time at Heuristics has definitely been a game changer for me. Being able to work alongside Seth Petry-Johnson and Matthew Groves has proved beneficial beyond my imagination. To say, “I’ve learned a lot”, would be an oversimplification of the amount of knowledge I’ve gained over the past year, but that’s the easiest way to put it – I truly have learned a lot.

What does the next year hold?

Hard to say for sure. I would imagine a ton of new lines of code for LearningBuilder and more learnin’.

Outside of that, surprise me.

blogging, career, review comments edit

Over the last few weeks, I’ve been indulging in the content delivered to my inbox by a Mr. John Sonmez.

This content, or what is better known as his 5 day email course on blogging, has been a delight to consume. If you’ve never blogged before, but have always wanted to, I suggest you subscribe to John’s email course. He provides a clear and concise path for potential bloggers to get started easily and quickly that I haven’t found in other blogging materials.

For me, personally, since I’m already somewhat of a blogger, it’s provided a ton of ideas that I plan to work through and implement here.


conference, review, codemash comments edit

I’ve known about CodeMash for about 3 years, and of those three years, I had to skip out on TWO of them. The first year had already sold out before I found out, so there wasn’t much I could do in that situation. The second year, our daughter was born one week before (which I obviously knew ahead of time), so I couldn’t go then either.

Finally, with no children being expected, plenty of forewarning, and a ticket paid for by my [awesome] company, Heuristic Solutions (hey, we’re hiring), I was able to attend!

I didn’t end up attending any of the pre-compilers, only the main two days of sessions.

Let’s get to it!

Day One

Ten Practical Tips for Automated Testing of Web Applications - Jim Holmes

Jim is an experienced tester, and a great speaker – so this is a fantastic session for anyone interested in testing (obviously).

I walked away with some good tips, and a testing framework that I hadn’t heard of before.

The Future of C# is Now! - Dustin Campbell

I went into this session expecting an overview of new C# features, and I got that.

However, Dustin also showed off a code analyzer that he built using Roslyn, which included fixes for migrating pre-C# 6.0 code to C# 6.0.

My love for Roslyn begins.

Decoupling the Frontend through Modular CSS - Julie Cameron

I am about the furthest thing from a front-end designer that you can get, but I do have to think about this stuff on a daily basis.

I went to this session more or less by recommendation, and I’m actually glad I did.

Julie gave some great tips for structuring your HTML, CSS, & JavaScript that I hadn’t paid much attention to before. She also went over some conventions that I had never even known about, so all in all, this was an excellent session.

Xamarin Forms and The Future of Mobile Application Development - Jesse Liberty

This was one of the vendor sessions, but since it was being presented by Jesse Liberty, I had to go. I tend to enjoy his style of teaching (I think I’ve watched every single one of his Pluralsight courses :))

Not only that, but Xamarin Forms is on my radar of things to deep dive, so I figured this would be a great session.

I was let down a little bit, to be honest. It was very introductory to what Xamarin Forms is and the basics on how to use it. I expected somewhat of a deeper look into it.

Still a good session, if that’s what you’re looking for.

Building Code Aware Frameworks using the .NET Compiler Platform (“Roslyn”) - Kevin Pilch-Bisson

As you previously read, I have fallen in love with Roslyn, and couldn’t pass up the opportunity to sit in on a session dedicated to it.

Ironically, I had received the latest MSDN magazine in the mail prior to leaving for CodeMash, and had brought it along to read (which I did). This session demonstrated the concepts and sample code from one of the articles in that issue.

It was still a great session, and just the conversation and questions that went on were worth it.

Day Two

Getting More Out of Git - Jordan Kasper

I was a little late to this session – hey, it was early, alright! – and it was packed, so I ended up sitting on the floor. Comfy.

Anyway, there were a lot of good tips packed into this session, and Jordan did a great job of presenting. For me, though, I wish it had been later in the day, because I might have zoned out a time or two. Maybe. Probably. Yeah, I did.

Clean Code: Writing for Humans - Cory House

I’ve been following Cory for a while now, and was actually able to meet him and see his session while at CodeMash. Cory is an awesome guy, super cool, super nice, and gives a great session on clean code to boot.

The funny thing about this session is that you listen to it nodding your head the whole time, agreeing with everything he says. Something about having the material reiterated to you in a fantastic presentation really makes the session worth it.

From Backbone to Ember and Back(bone) Again - Jonathan Knapp

I wasn’t fond of this session, mainly because it outlined a contracting gig where they did exactly what the title says. It didn’t hit home with me because I’ve never used either of those frameworks (and likely won’t for some time). Jonathan gave some metrics on the LOC/speed/etc. differences between the two, which was nice to see – but then they ended up picking the slower, bigger one, and I don’t quite understand why. The whole time he kept reiterating that the original developers didn’t know JavaScript, and I suppose that’s fair reasoning, but I expected more about why I should choose one of these frameworks over the other – and, unfortunately, I still don’t have that answer.

Do You Even Kanban? - David Neal

This was another vendor session, and quite honestly, I was a little let down with it.

What I thought would be a session with a lot of information on LeanKit turned out to be an introduction to Kanban.

I already Kanban.


Open Spaces

Guys, this was awesome (as are most open sessions).

This open session was packed, with people such as:

  • Kevin Pilch-Bisson
  • Dustin Campbell
  • Jon Skeet
  • Tim Rayburn
  • Kathleen Dollard

If you don’t know who those people are, go find out. NOW.

I didn’t contribute much to the overall conversation, but listening to these industry leaders talk about Roslyn made my day. I followed along with most of it (some definitely went over my head).

You can have your movie star idols – some of mine are listed right here. It was very difficult to choose these sessions; there were a ton of options for every time slot, and you want to see them all!


To summarize:

  • CodeMash is awesome.
  • The people that attend CodeMash are awesome.
  • The sessions are awesome.
  • The food is awesome.
  • Meeting new people is awesome.

Get your ticket and attend next time – find me and say hello, cause you better believe I’m going back next year!

jenkins, iis comments edit

You want to run Jenkins on port 80 on your Windows machine, huh? It’s easier said than done.

Well, it used to be, before I wrote this blog post. If you’re like me, you’ve already spent some time trying to get this to work the way the Jenkins documentation alludes it should:

  • Open Jenkins.xml
  • Set --httpPort=80 in the command line section

Seems fairly easy, doesn’t it? Well, as you’ve probably already figured out, it doesn’t work.

Jenkins, apparently, cannot bind to a ‘system’ reserved port (anything below 1024).

Again, if you’re like me, you fiddled with this for a while, setting the Windows service to run as Local Account/Network Account/Local Admin/Domain Admin/etc. Nothing works.

Then, you start searching the internet and find out this is a real problem that people have been struggling with for a while, and apparently, you can solve it by installing Apache.

Now, don’t get me wrong, I have nothing against Apache; though I haven’t used it in a while, I used to use it almost exclusively. For this installation, however, I didn’t feel like attempting to get that to work, and wondered if it would be just as possible with IIS – which was already installed.

I finally figured out the secret recipe for getting it to work, and it goes a little something like this:

  • Configure Jenkins to run on whatever port you like - I left it at the default 8080
  • Install IIS
  • Install the IIS URL Rewrite module
  • Install the IIS Application Request Routing module
  • Create a ‘dummy’ website in IIS that is bound to port 80
  • Add a ‘Reverse Proxy Inbound Rule’ to the dummy website that rewrites requests from port 80 to port 8080 (on the same machine)

That last step is a little more involved than the others, so, screenshots!

Pick the rule type to create (Reverse Proxy Inbound Rule) Rule Creation

Configure the destination IP/port on the dialog, and hit OK Rule Configuration

Right-click the new rule, and click ‘Edit’ - the rule should look similar to this Current Rule Configuration

Then, since we’re only using HTTP, I remove the condition and hard-code HTTP in the rewrite URL Modified Rule Creation
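For reference, the finished rule lands in the dummy site’s web.config, and should look roughly like this (assuming Jenkins is on the same machine at port 8080; the rule name is whatever you entered in the wizard):

```xml
<system.webServer>
  <rewrite>
    <rules>
      <rule name="ReverseProxyInboundRule1" stopProcessing="true">
        <match url="(.*)" />
        <action type="Rewrite" url="http://localhost:8080/{R:1}" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>
```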

If all is successful, you should be able to hit the ‘website’ running on port 80 of your server and be silently directed to Jenkins on port 8080 without even noticing it.

If you run into any issues, or have any feedback, please leave a comment below!

career, imposter-syndrome comments edit

As most of you know, I recently started a new job with Heuristic Solutions working on their LearningBuilder product. I’ve been with them now for about a month, and up until about last week, things were going great. Then, out of nowhere, The Imposter Monster made its unwelcome return.

Now, I’ve always been rather hard on myself when it comes to my work. Doubts plague me daily about where I am and where I should be. They never quite overlap in my mind – probably never will – but I’m working on it.

Having said all of that, I do believe that part of this is just from the stress of starting a new job. Everybody has been there, I imagine. You leave a job where you provided value, felt good about the decisions you made, knew everyone, etc. – then, all of a sudden, you’re back on the bottom of the totem pole without any of those happy feelings.

Am I supposed to be here? Do I deserve this job? Do I fit in? Do I actually have what it takes to bring value to this company?

I can’t answer those questions.

All I really wanted to do was get this post out here, in hopes that if someone is feeling the same way, perhaps they’ll come across this post and realize – they’re NOT alone.

channel9, dotnetconf, powershell comments edit

Based on my previous script of a nearly identical title, this script will snag the 2014 dotNetConf videos (high quality MP4s) from Channel 9.

Change the $baseLocation to a folder of your choosing, and let it go.

$baseLocation = "V:\Coding\Channel 9\dotNetConf 2014\"

$rssFeed = New-Object -TypeName XML
$rss = (New-Object System.Net.WebClient).DownloadString("")
$rssFeed.LoadXml($rss)

$itemCount = $rssFeed.rss.channel.item.Count

for($i = 0; $i -lt $itemCount; $i++) {
     $fileCount = $i + 1
     Write-Progress -Activity "Downloading Recordings..." -Status "Processing file $fileCount of $itemCount" -PercentComplete (($i/$itemCount)*100)

     $item = $rssFeed.rss.channel.item[$i]

     # Everything after the last '.' in the enclosure URL is the file extension
     $fileExtension = $item.enclosure.url.Substring($item.enclosure.url.LastIndexOf('.'))

     # Strip characters that aren't valid in file names from the title
     $cleanFileName = [RegEx]::Replace($item.title, "[{0}]" -f ([RegEx]::Escape([String][System.IO.Path]::GetInvalidFileNameChars())), '')

     $downloadTo = $baseLocation + $cleanFileName + $fileExtension

     # Skip files that have already been downloaded
     If(!(Test-Path $downloadTo)) {
          (New-Object System.Net.WebClient).DownloadFile($item.enclosure.url, $downloadTo)
     }
}

career comments edit

The last few weeks have been a little stressful here at the Allen household due to the job situation, but things are turning around!

On Tuesday, May 27th, I will be starting a new job that I’m extremely excited about.

What is it you ask?

Well, I have accepted an offer with Heuristic Solutions to work on their LearningBuilder product!

Not only do I get to work with a fantastic team of people on a great product, but I get to do it from home!

virtualbox comments edit

I had a really difficult time coming up with a title for this post, but essentially what I’m trying to convey is:

  • You created a VirtualBox machine
  • You deleted/removed/transferred the VM files on your HDD
  • VirtualBox will no longer let you manage that machine - indicates “inaccessible” (can’t even remove it).

I found myself in this predicament earlier this evening. All I wanted to do was remove the VM from the list of machines in VirtualBox, but it wouldn’t let me because it couldn’t find any of the files in the given location. After a few hacks, I found a solution. Hopefully, this might help someone.

Given a VirtualBox machine - let’s call him “Test Machine”, that resides on your host machine’s file system Setup - Image 1 Setup - Image 2

You, accidentally or purposefully, delete the folder containing the VM Folder is gone

Suddenly, you can no longer access the VM from within the VirtualBox Manager Machine Invalid

And, any attempts to remove it are futile Cannot Remove

If you can deal with the ‘clutter’ of an orphaned VM, yay. If not, like me, we gotta get rid of that thing.

Here’s how I did it.

  • Shut down ALL VMs, and close VirtualBox Manager
  • Navigate to your personal VirtualBox settings file (mine is in my Users folder - C:\Users\Calvin\.VirtualBox\VirtualBox.xml)
  • Edit the file in Notepad/Notepad2/Notepad++/WhateverEtcPad++2
  • Find the MachineRegistry section, and remove the MachineEntry for the offending machine VirtualBox Configuration
  • Save the file, close, and reopen VirtualBox Manager. If successful, you should no longer see the machine. Invalid Machine Gone
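For reference, the entry you’re removing looks something like this (your uuid and src path will differ, of course):

```xml
<MachineRegistry>
  <MachineEntry uuid="{xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx}"
                src="C:\Users\Calvin\VirtualBox VMs\Test Machine\Test Machine.vbox"/>
</MachineRegistry>
```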

That’s it! Hope it helps someone!

channel9, build, powershell comments edit

Crazy long title, but you get the idea.

Here is a PowerShell script that will download all of the mp4 high quality videos from Channel 9 for Build 2014.

Change the $baseLocation to a folder of your choosing, and let it go.

$baseLocation = "V:/Coding/Channel 9/Build 2014/"

$rssFeed = New-Object -TypeName XML
$rss = (New-Object System.Net.WebClient).DownloadString("")
$rssFeed.LoadXml($rss)

$itemCount = $rssFeed.rss.channel.item.Count

for($i = 0; $i -lt $itemCount; $i++) {
    $fileCount = $i + 1
    Write-Progress -Activity "Downloading Recordings..." -Status "Processing file $fileCount of $itemCount" -PercentComplete (($i/$itemCount)*100)

    $item = $rssFeed.rss.channel.item[$i]

    # Everything after the last '.' in the enclosure URL is the file extension
    $fileExtension = $item.enclosure.url.Substring($item.enclosure.url.LastIndexOf('.'))

    # Strip characters that aren't valid in file names from the title
    $cleanFileName = [RegEx]::Replace($item.title, "[{0}]" -f ([RegEx]::Escape([String][System.IO.Path]::GetInvalidFileNameChars())), '')

    $downloadTo = $baseLocation + $cleanFileName + $fileExtension

    # Skip files that have already been downloaded
    If(!(Test-Path $downloadTo)) {
        (New-Object System.Net.WebClient).DownloadFile($item.enclosure.url, $downloadTo)
    }
}

stirtrek, conference, review comments edit

Oh man, StirTrek was awesome this year! There were a ton of people there – I think around 1200, total. It did, at times, make me a little claustrophobic, which I have never had a problem with before, so that was new :)

Let’s get straight to the details….

Registration was a snap – quick and easy, no problems.

I loved the badges they gave us – awesome Star Trek image on it, a place for your name, and, my favorite – the schedule for the day on the back. You could simply flip it over to see where you were going next. It was a great idea!

Breakfast consisted of donuts, bagels, coffee and water. I’m not much of a breakfast eater, so it worked out perfectly for me.

After some mingling, it was off to the first session of the day!

Session 01 – Javascript Spaghetti – Jared Faris

I had a strong desire this year to hit all Javascript-based talks, so this has nothing to do with the fact that Jared is my boss :) Anyway, this was a great talk. Jared did a wonderful job of incorporating some humorous aspects into his talk, while still making it relevant and interesting. I learned a few things, and had a great time, what more could you ask?

Session 02 – Understanding Prototypal Inheritance – Guy Royse

Guy is another one of those fantastic speakers that you should definitely try and catch any chance you get. The topic of this talk is a difficult one to give, and while I came away still a little confused, Guy made it fun and enjoyable.


Lunch

Jimmy John’s. Served in whatever theatre you were already in. Awesome. And yummy. ‘Nuff said.

Session 03 – Custom Graphics for your Web Application: The HTML5 Canvas and Kinetic.js – Jason Follas

I went to this talk to learn more about Kinetic.js than the Canvas itself, and I was glad to see a nice portion of the talk dedicated to it. I had not heard of Kinetic.js prior to this talk, and man, it’s amazing! The features seem to be pretty robust, and Jason has even contributed back a few features himself to extend it even further. Jason did a great job presenting this – he was extremely clear, concise, and just an all around great speaker. This is the first time I had seen him speak, and I’ll definitely be seeking out his sessions in the future.

Session 04 – JavaScript: Pretty cool guy and doesn’t afraid of anything – Evan Booth

Well, I guess there has to be a bad apple in every bag. This talk was poor, at best. Evan tried to scatter some humor throughout his talk, and while it was entertaining at first, it quickly got old. He seemed to read his slides as he went, talked quickly, rarely made eye contact with the audience, and went completely off-topic into CSS as part of his presentation. I didn’t go to hear about CSS – I went to hear about Javascript. After it was over, he began showing videos of himself making weapons from items bought beyond the TSA security checkpoints at airports. He seemed to believe he was doing a good deed by doing these things, and commented that he does give the info to the TSA. My thoughts? Wrong location to be showing that stuff, bub. It honestly seemed to make some people a little irritated that he would share that kind of information with the general public. Not a good idea. Fail.

Session 05 – I Didn’t Know JavaScript Could Do That! – David Hoerster

I had kind of forgotten what this session was about until after David actually began, but I’m glad I went! David seemed to be a little nervous at first, not sure if this was his first time presenting or not – and he even mentioned being nervous at one point. But, David, you did an awesome job. I enjoyed your talk very much. You got the audience participating with questions (most of which I got wrong, by the way), had humor woven into the talk, used Prezi (bonus points for that :) ), and you taught me stuff! I walked away from your talk with a better understanding of prototypal inheritance, which I had been trying to understand for some time. Nice job, David.

Overall, Stir Trek was bloody awesome, and I can’t wait to go next year. I would change a few things, but they are mainly minor. For instance, having a difficulty level on the talks so we can better gauge which to attend. Evan’s talk would have been Beginner, while Guy’s would have been Advanced. I would also like to see televisions scattered around monitoring the Stir Trek hashtag on Twitter. They had this at CodePaLOUsa, and I loved reading the comments while moving around. Oh, perhaps even do it on the big screens between sessions :) Honestly, that’s it. That’s all I would change. See? Told you it was minor stuff.

See you at Stir Trek 2014!

codepalousa, conference, review comments edit

I had the opportunity this year to attend one of the Midwest’s premiere community-run developer conferences, CodePaLOUsa, which was held in Louisville, KY on April 25th-27th.

The first day of the conference was all pre-compiler workshops, which I did not attend (and therefore cannot review).

April 26th


Keynote

The first day of the conference opened with a keynote by none other than Richard Campbell, from .NET Rocks!, one of the best podcasts available.

Richard did a fantastic job at the keynote and had the audience rolling with all the jokes. He told a great story, but what really stuck with me was his endeavor to create software for humanitarian relief efforts. I already see myself getting involved with this at some point.

Session #1 – The Class That Knew Too Much – Matthew Groves

This session was on refactoring techniques and included a (brief) introduction/overview of aspect-oriented programming (AOP). Matthew is local to me, and he’s given this talk many times all around me, but this was the first chance I’d had to attend one. Although I enjoyed the session, there were some technical difficulties that kept arising with the projector and connection. I don’t believe this was an issue with the presenter’s hardware, however, as I witnessed the same issue later on in the same room. Not only that, but the room was quite small and quickly overflowed. As a matter of fact, Matt had to give a second session the next day to accommodate the rest of the individuals. Overall, this is a fantastic session/talk, and Matt does a great job all around.

Session: 8/10 Location: 5/10

Session #2 – Deeper Dive into the Windows Phone 8 SDK – Michael Crump

This session was all about the new features in the WP8 SDK. Not having tried any WP development before, I was surprised to see some of the items among these new features – surprised because I would have expected some of them to already have been there. Michael presented quite well but did run into some demo issues that couldn’t be resolved during the session. He did, however, make the demos available via GitHub after the fact. The problem seemed to revolve around flaky internet connectivity, though that can’t be proven at this point, I suppose. This room was a lot larger than the previous one, but attendance was relatively low and did not require a large room.

Session: 7/10 Location: 5/10

Session #3 – Secure Mobile Application Development – Jamie Ridgway

I hate to say it, but I did not enjoy this session. Jamie did a great job of gathering the information, and presenting it, but that’s all it was. His slides and talk were all based around the top 10 vulnerabilities for mobile applications by OWASP. I could have read that information myself. I would have liked to have seen a few demos scattered in that demonstrated some of the issues. The room held everyone well, and was a nice choice – though it did get rather cold.

Session: 5/10 Location: 8/10

Session #4 – Rails for the .NET Developer – Jamie Wright

Let me be clear: I am not a Ruby developer. I am a .NET developer. Why did I choose this session? I love to learn. I enjoyed the beginning of this session but quickly got lost in the demos. Jamie did a side-by-side comparison of the same application being developed in both .NET and Ruby (Rails). He did things a little differently by recording his demos ahead of time and discussing them while he played them back for us. This worked out okay, but there were a few issues. The first was that the video speed was increased, and for people new to Ruby/Rails, this made it difficult to follow at times, even with Jamie giving an overview. The second was that about halfway through, he stopped showing the .NET videos and only showed the Ruby videos. I suppose by that point we had an idea of what the application was supposed to do, but I would have liked to keep the comparison. This was a smaller room and was relatively full, but it worked nicely. Jamie did have a technical issue or two with the projector, but luckily things got resolved.

Session: 6/10 Location: 7/10

April 27th


Keynote

This keynote presentation was given by Carl Franklin, the other half of .NET Rocks!. While I would love to review this keynote speech, Calvin slept in this morning.

Session #1 – All The Buzzwords: Backbone.js Over Rails In the Cloud – Jared Faris

I have to be careful what I say here, as Jared is one of my managers :)

Jared discussed a lot of his architecture choices from the 1.5 years he ran development at a local start-up. The application was written in Ruby, and utilized quite a few frameworks and packages during development. Not being a Ruby developer, as previously mentioned, I enjoyed hearing about their trials, tribulations, and the many decisions that came up along the way. Jared put a lot of extra time into his slides, utilizing 8-bit style imagery throughout, which I loved. Jared’s talk was located in the same room where lunch and keynotes were held. Attendance, while pretty good, did not warrant that amount of space, and participation/questions from the attendees were minimal at best.

Session: 8/10 Location: 5/10

Session #1.5 – Everyone Sucks at Feedback – Chris Michel

I was actually not expecting this session, since it was in the middle of lunch. The presenter did a great job speaking, and used a lot of humor in his slides – which was a nice change. Honestly, I didn’t pay enough attention during this presentation (ummm…food!?) to warrant a full review, but I would definitely see him present again from what I did see.

Session #2 – Open Space

A few people, including myself, decided to skip this session and have an open-space discussion on confidence. There were, at one time, about 8-10 people present for this (sorry, I don’t remember everyone!). I found this discussion rather enlightening, as I definitely have a confidence problem in myself. It was good to hear that I’m not the only one, and it can definitely be overcome.

Session #3 – Build a Single-Page App with Ember.js and Sinatra – Chris Meadows

Chris did a great job on this presentation, showing one of the more ‘elusive’ JavaScript frameworks. While Sinatra was used, it was secondary to the main topic and served only as the back-end. I haven’t had an opportunity (or need) to utilize Ember.js before, but after seeing Chris’s talk, I’m on the hunt for a project. He described the relationships between the views, controllers, models, and the router. The room was full but worked out well.

Session: 8/10 Location: 8/10

Session #4 – An Introduction to Genetic Algorithms for Artificial Intelligence Using Rubywarrior – James McLellan

Woah. I didn’t realize what I was getting myself into by going to this one. This talk focused heavily on genes and genetic makeup – something I know nothing about. The only saving grace was that it was brought into focus by utilizing a Ruby application called RubyWarrior. This ‘game’ allows you to write your own ‘genes’ (classes that act as AI – i.e., walk, turn, etc.) and then bundle those ‘genes’ to try to solve a level in the game. There was a lot of Ruby code involved, which I did expect given the title of the session. Overall, though, James’ presentation style was a little dry. The room was pretty full and seemed to be a good match for the session.

Session: 5/10 Location: 8/10

Closing Session

We almost didn’t stay for the closing session, but I’m kind of glad we did. Carl Franklin took over again, asking attendees various development-related questions – he inevitably gave away the answers – with prizes for correct responses. Trust me, there were tons of prizes (we didn’t win any), and it was just a fun time.


Overall, I would say that CodePaLOUsa is a great conference. It’s run by intelligent people – by the community, for the community. My biggest complaints are actually minimal in the grand scheme of things. Some of the projectors and equipment seemed finicky – they might be the hotel’s property; I’m not sure. Some of the rooms were a little cramped due to the hotel’s partitioning walls. I would have liked more signage as a first-time attendee as well. Is it worth the $250? Yeah, I think so. Enough so that I’ll be attending next year!

See you at CodePaLOUsa 2014!

webapi, mvc comments edit

Alright, you have your MVC 4 website up and running, and you realize you need to add some WebAPI support – not necessarily for the website itself, but for potential external consumers. Your site is configured using Areas for everything, so you decide it would be best to put the WebAPI layer in an Area as well. Makes sense, right? Right. You quickly find out that it isn’t as simple as right-clicking, add new area, name it API, pat self on back, etc. That’s where this trick comes in.

Now, by default in an MVC 4 project, your Global.asax file calls out to another class to configure WebAPI. It will look something like this:
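In the stock MVC 4 template, Application_Start resembles the following (a sketch from memory – your generated file may differ slightly):

```csharp
protected void Application_Start()
{
    AreaRegistration.RegisterAllAreas();

    // This is the line to comment out -- it registers the default
    // WebAPI routes defined in App_Start/WebApiConfig.cs.
    WebApiConfig.Register(GlobalConfiguration.Configuration);

    FilterConfig.RegisterGlobalFilters(GlobalFilters.Filters);
    RouteConfig.RegisterRoutes(RouteTable.Routes);
}
```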


Guess what? Comment that line out. The file it calls into lives in the App_Start directory, aptly named WebApiConfig.cs. You can leave it or delete it. Your call.

Now, head over to your area – we need to make some routing changes.

Look for APIAreaRegistration.cs and open it up.

Bring in another namespace:

using System.Web.Http;

Now, you see that route down below? It needs two minor tweaks to work with WebAPI. First, change the defaults from:

        new { action = "Index", id = UrlParameter.Optional }

to:

        new { id = UrlParameter.Optional }

and remove the {action} segment from the URL pattern. Second, change the method call from MapRoute to MapHttpRoute. In a nutshell, we changed the route to register an HttpRoute and got rid of the {action} part of the route, since WebAPI dispatches on the HTTP verb rather than an action name.
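Put together, the area registration ends up looking something like this (a sketch – the route name and URL pattern here are my assumptions):

```csharp
using System.Web.Http;
using System.Web.Mvc;

public class APIAreaRegistration : AreaRegistration
{
    public override string AreaName
    {
        get { return "API"; }
    }

    public override void RegisterArea(AreaRegistrationContext context)
    {
        // MapHttpRoute (from System.Web.Http) registers a WebAPI route.
        // Note: no {action} segment -- WebAPI dispatches on the HTTP verb.
        context.Routes.MapHttpRoute(
            "API_default",
            "API/{controller}/{id}",
            new { id = UrlParameter.Optional }
        );
    }
}
```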

Boom. You’re done.

Keep in mind that this is now THE WebAPI layer for your application – with these changes, you can’t have any other WebAPI controllers outside of your Area. If you need WebAPI controllers both in the Area and elsewhere, there are a couple of methods that others have posted to make it work. I didn’t need anything like that, so this worked well for me.

nuget, powershell comments edit

Recently a need arose to have a few project-level items added to a project via a NuGet package. While this was no big deal, we ran into an issue of having the items marked as Copy if Newer for the Copy to Output Directory action, and couldn’t manage to find a way to change these properties.

After a bit of research, we determined that an install.ps1 PowerShell script (as part of the NuGet package installation) could access the project items and set the properties of them.

A script was written to handle the three files added to the project:

    param($installPath, $toolsPath, $package, $project)

    $file1 = $project.ProjectItems.Item("FolderItem.exe")
    $file2 = $project.ProjectItems.Item("FolderItem.exe.config")
    $file1.Properties.Item("CopyToOutputDirectory").Value = [int]2
    $file2.Properties.Item("CopyToOutputDirectory").Value = [int]2

Unfortunately, the script didn’t work. Why, you ask? Well, after some more digging, it turns out you can only access top-level items using the above syntax, so you have to chain the commands together to properly access the items:

    param($installPath, $toolsPath, $package, $project)

    $file1 = $project.ProjectItems.Item("Folder").ProjectItems.Item("Item.exe")
    $file2 = $project.ProjectItems.Item("Folder").ProjectItems.Item("Item.exe.config")
    $file1.Properties.Item("CopyToOutputDirectory").Value = [int]2
    $file2.Properties.Item("CopyToOutputDirectory").Value = [int]2

webapi, mvc, entity-framework comments edit

In this post, I’ll show you some of the basics on how to utilize Entity Framework 5.0’s “Code-First” features to develop a data access layer against an existing database. I’ll be demonstrating these concepts with a new MVC 4 Web API application and SQL Server 2012, but I won’t be covering either of those in this tutorial.

While Code-First is a great paradigm for starting a new application from scratch, you can also use it to map back to an existing database with ease.

Let’s pretend we’re working with a very simplistic Twitter model, as shown below. Database Schema

Not a lot of meat here, a simple structure for Users and their Tweets. Of course, the real Twitter model is more complex, but this will suffice for the purpose of this tutorial.

To demonstrate how to accomplish this, we’re going to create a new MVC 4 Web API application in Visual Studio 2012, using C# as our language. Our database will be running in SQL Server 2012.

After launching Visual Studio, navigate to FILE | New | Project dialog, and select Web from the installed templates navigation section, select ASP.NET MVC 4 Web Application, give your project a name, and click OK.

I’m going to call mine “Tweeters”. New Project Dialog

Once you’ve hit OK, another dialog will pop-up (below), asking what kind of MVC 4 web application you would like to create. Go ahead and choose “Web API” from the list, and press OK. New MVC4 Dialog

Once Visual Studio finishes creating the project, you should have a structure resembling the figure below: New Project Finished

By default, a new Web API project will have quite a number of files put in place for you. For the most part, we’re going to leave them alone.

Visual Studio 2012 automatically pre-installs Entity Framework 5.0 for us, so we’re ready to start coding! If you ever need to install it separately, however, you can find the package for download via NuGet.
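Should you need to pull it in yourself, the Package Manager Console command is:

```powershell
Install-Package EntityFramework -Version 5.0.0
```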

The first thing I’m going to do is create two basic classes that represent our database tables. These ‘models’ will be placed in the Models folder of your application: New Models

Let’s run through the code, so you can understand what’s going on.

If you notice the highlighted portions in the previous figure – these are attributes. We are using them to define some of the constraints we need to put on our models. Let’s go through each one in detail.

  • [Key] – Tells EF that the property it decorates is the primary key.
  • [Required] – A value for this property must be supplied.
  • [MaxLength(x)] – Sets the maximum string length EF will accept when saving.

One thing to note is the property at the bottom of the Users class. You’ll notice we have an ICollection<Tweets> – this lets EF know how to navigate through the objects to, in this case, the children “Tweets” of “Users”. Essentially we’re setting up the database relationship between these two objects. So for the one-to-many relationship of Users & Tweets, we use a collection.

If you refer back to our database model in the first figure, you’ll see that the Required and MaxLength attributes just replicate what our database will accept for these tables.
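Since the original screenshot isn’t reproduced here, a sketch of what those model classes might look like follows – the property names and lengths are my own assumptions, not the exact ones from the figure:

```csharp
using System.Collections.Generic;
using System.ComponentModel.DataAnnotations;

public class Users
{
    [Key]
    public int UserId { get; set; }     // primary key (assumed name)

    [Required]
    [MaxLength(50)]                     // mirrors the column length in the database
    public string UserName { get; set; }

    // Navigation property: one User has many Tweets
    public virtual ICollection<Tweets> Tweets { get; set; }
}

public class Tweets
{
    [Key]
    public int TweetId { get; set; }

    public int UserId { get; set; }     // foreign key back to Users

    [Required]
    [MaxLength(140)]
    public string Message { get; set; }
}
```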

The next thing we have to do is create a database context. A context tells EF what database to connect to, and what models it expects.

Simply create a new class file (I’m calling mine EntityContext), and make it inherit from DbContext. This is a class provided by EF, so you’ll need to import the namespace to get access to it.

Once you have that, you’ll need a constructor. Notice, in our constructor, that we call into the base constructor and pass it a string. This is the connection string from the web.config that we want EF to use during database connections.

You’ll see that, in our constructor, we are setting the Initializer for our context to null with SetInitializer<>. This is very important. Since we’re connecting to an existing database, we don’t want EF to touch that database schema AT ALL. That’s what this is doing; telling EF NOT to try any database initialization logic – just connect, and leave it alone.

The next thing you’ll notice are the public DbSet<> properties. You want one of these for each model you want to interact with. In our case, we have Users and Tweets. Context
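As a sketch (the class and connection string names here are my choices), the context might look like:

```csharp
using System.Data.Entity;

public class EntityContext : DbContext
{
    public EntityContext()
        : base("EntityContext") // name of a connection string in web.config
    {
        // Existing database: tell EF not to run ANY initialization logic.
        Database.SetInitializer<EntityContext>(null);
    }

    public DbSet<Users> Users { get; set; }
    public DbSet<Tweets> Tweets { get; set; }
}
```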

That’s all for the context. Pretty simple, huh? Let’s go define our connection string in the web.config. Web Config

Your connection string may differ from mine slightly, but I’m just connecting to a running database on my local machine.
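For reference, a minimal entry might look like this – the server, database, and connection string names are assumptions for illustration:

```xml
<connectionStrings>
  <add name="EntityContext"
       connectionString="Data Source=.;Initial Catalog=Tweeters;Integrated Security=True"
       providerName="System.Data.SqlClient" />
</connectionStrings>
```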

Now we can go work on the controller that will be responsible for serving up some data from our database.

The first thing I’ve done is rename the ValuesController to UsersController, just to make a little more sense. You don’t have to do this if you don’t want to – it doesn’t really matter in the long run; just remember what you call it when we go to make a request to the method. Users Controller

We start out by creating an EntityContext reference object and instantiating it in the constructor. In the Get method, we simply call the context and grab the Users collection. (This is the DbSet<Users> we talked about in the context section)
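A sketch of what that controller might look like (the original screenshot isn’t reproduced here, so details are assumed):

```csharp
using System.Collections.Generic;
using System.Linq;
using System.Web.Http;

public class UsersController : ApiController
{
    private readonly EntityContext _context;

    public UsersController()
    {
        _context = new EntityContext();
    }

    // GET api/users -- returns all Users (their Tweets come along
    // via the navigation property once serialized)
    public IEnumerable<Users> Get()
    {
        return _context.Users.ToList();
    }
}
```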

This is all we need to do to start querying our database! Press F5 to start debugging the project (hopefully you have no errors). You should get a default webpage (below) discussing how to get started with a Web API project. We’re NOT going to pay any attention to this since we just want to query our database through the API call. Website Landing

I’m going to fire up Fiddler to make a GET request to our API method. Fiddler

By default, the API is running under an “api” route, and then the controller name. So to issue a GET of our Users, we just need to construct the URL in Fiddler’s Composer tab, with a GET method. Once you have that setup, hit “Execute”.

In the figure below, we’re now looking at the result of our http request. You should hopefully get a result of 200. If so, double click on that line to switch over to see the results of your query. Fiddler Results Query Results

If all has gone well, you should see the result of your query, presented in JSON format! Notice that we got back ALL of our Users, along with a collection of their Tweets. You might be wondering why we got back the Tweets when we only requested the Users – this is due to the navigation property we added to the Users class, letting EF know that each User has a collection of Tweets beneath it. Entity Framework is smart enough to work the magic beneath the covers and populate the collections for us! Awesome!


To recap, I showed you how to create a new ASP.NET MVC 4 Web API project that is backed by an existing database for querying through Entity Framework 5.0 Code-First.

Hopefully this tutorial has given you some basic insight into the capabilities of Entity Framework 5.0 Code First. I encourage you to keep digging into it, as this was only the tip of the iceberg!