
Why [Programming Language X] Is Unambiguously Better than [Programming Language Y]

Recently I have seen a lot of people wondering about the difference between [X] and [Y]. After all, they point out, both are [paradigm] languages that target [platform] and encourage the [style] style of programming while leaving you enough flexibility to [write shitty code].

Having written [simple program that’s often asked about in phone screens] in both languages, I think I’m pretty qualified to weigh in. I like to think about it in the following way: imagine [toy problem that you might give to a 5th grader who is just learning to program]. A [Y] implementation of it might look like this:

[really poorly engineered Y code]

Whereas in [X] you could accomplish the same thing with just

[slickly written X code that shows off syntactic sugar]

It’s pretty clear that the second is easier to understand and less error-prone.

Now consider type systems. [Religious assertion about the relative merits and demerits of static and dynamic typing.] Sure, [Y] gives you [the benefit of Y’s type system or lack thereof] but is this worth [the detriment of Y’s type system or lack thereof]? Obviously not!

Additionally, consider build tools. While [Y] uses [tool that I have never bothered to understand], [X] uses the far superior [tool that I marginally understand]. That’s reason enough to switch!

Finally, think about the development process. [X] has the amazing [X-specific IDE that’s still in pre-alpha], and it also integrates well with [text-editor that’s like 50 years old and whose key-bindings are based on Klingon] and [IDE that everyone uses but that everyone hates]. Sure, you can use [Y] with some of these, but it’s a much more laborious and painful process.

In conclusion, while there is room for polyglotism on the [platform] platform, we would all be well served if you [Y] developers would either crawl into a hole somewhere or else switch to [X] and compete with us for the handful of [X] jobs. Wait, never mind, [Y] is awesome!

(Hacker News link)

Constructive Mathematics in F# (and Clojure)

(Tell me what a terrible person I am on Hacker News.)

For as long as I can remember1 I’ve dreamed of reimplementing the entirety of mathematics from scratch. And now that I’ve finished the “Wheel of Time” series I have a little bit of extra time on my hands each day, which has allowed me to take baby steps toward my dream.

What this is

An implementation of mathematics in F# (and also in Clojure)

What this is not

An efficient implementation of mathematics in F# (or in Clojure)

You would never want to use this library to do mathematics, as it is chock-full of all sorts of non-tail-recursive function calls that will blow your stack like there’s no tomorrow. (If you don’t know what that means, just take my word that you would never want to use this library to do mathematics.) Instead, this library is an interesting way to learn about

  • how to construct a mathematics from scratch
  • how to implement a mathematics in F# (or Clojure)
  • my bizarre obsessions

As always when I work on stuff like this, the code is on my GitHub.

This was originally just going to be in F#, and then I read a couple of blog posts about ClojureScript, which reminded me I’d been meaning to do something in Clojure, so why not implement the same stuff a second time? (This is why “in Clojure” is in parentheses everywhere, and why the F# code has all the comments.) I tried to make the F# code F#-y and the Clojure code Clojure-y, but I’m not sure how well I succeeded.

I won’t go into excruciating detail about either mathematical theory or F# (or Clojure), but hopefully you can understand both from the detail I do go into. I’ll also only call out a few high points of each codebase; if you want the gory details, check out GitHub.

Both sets of code have handfuls of tests written, which should give you a good sense of how both libraries operate.


Comparisons

In F#, I’ll define a discriminated union

type Comparison = LessThan | Equal | GreaterThan

In Clojure you don’t typically use “types”, but we can just use keywords :less-than and :equal and :greater-than.

Natural Numbers

We’ll define these recursively. A natural number is either

  • “One” (which is just some thing, forget that you’re already familiar with a “one”), or
  • the “Successor” of a different natural number

Anything you can make using these rules is a natural number. Anything that you can’t isn’t.

We’ll call the successor of One “Two”, and the successor of Two “Three”, keeping in mind that at this point those are just names attached to things without any meaning other than “Two is the successor of One” and “Three is the successor of Two”.

In F# we can do this with a discriminated union:

type Natural = One | SuccessorOf of Natural
let Two = SuccessorOf One
// and so on

After trying a lot of things in Clojure, I finally decided the most Clojure-ish way was

(defn successor-of [n] {:predecessor n})
(def one (successor-of nil))
(def two (successor-of one))
; and so on

Although the Clojure way at first looks backward, if you think about it both ways define the “successor of One” to be the number whose “predecessor” is One.

Next we’ll want to use this recursive structure to create an arithmetic. For instance, we can easily add two natural numbers:

let rec Add (n1 : Natural) (n2 : Natural) =
    match n1 with
        // adding One to a number is the same as taking its Successor
    | One -> SuccessorOf n2
        // otherwise n1 has a predecessor, add it to the successor of n2
        // idea: n1 + n2 = (n1 - 1) + (n2 + 1)
    | SuccessorOf n1' -> Add n1' (SuccessorOf n2)

Clojure doesn't have built-in pattern-matching, so instead I did something similar using a one? function:

(defn add [n1 n2]
  (if (one? n1)
    (successor-of n2)
    (add (predecessor-of n1) (successor-of n2))))

Both make it easy to create lazy infinite sequences of all natural numbers.

let AllNaturals = Seq.unfold (fun c -> Some (c, SuccessorOf c)) One


(def all-naturals (iterate successor-of one))

And (blame it on the natural numbers) both run into trouble when you try to define subtraction. In F# the natural thing to do is return an Option type:

// now, we'd like to define some notion of subtraction as the inverse of addition
// so if n1 + n2 = n3, then you'd like "n3 - n2" = n1
// but this isn't always defined, for instance 
//  n = One - One
// would mean One = One + n = SuccessorOf n, which plainly can never happen
// in this case we'll return None
let rec TrySubtract (n1 : Natural) (n2 : Natural) =
    match n1, n2 with
        // Since n1' + One = SuccessorOf n1', then SuccessorOf n1' - One = n1'
    | SuccessorOf n1', One -> Some n1'
        // if n = (n1 + 1) - (n2 + 1), then
        //    n + n2 + 1 = n1 + 1
        // so n + n2 = n1,
        // or n = n1 - n2
    | SuccessorOf n1', SuccessorOf n2' -> TrySubtract n1' n2'
    | One, _ -> None // "Impossible subtraction"

In Clojure there is no option type, so I just returned nil for a bad subtraction:

(defn try-subtract [n1 n2]
  (cond
    (one? n1) nil
    (one? n2) (predecessor-of n1)
    :else (try-subtract (predecessor-of n1) (predecessor-of n2))))


The failure of "subtraction" leads us to introduce the Integers, which you can (if you are so inclined) think of as equivalence classes of pairs of natural numbers, where (for instance),

(Three,Two) = (Two,One) = "the result of subtracting one from two" = 
 "the integer corresponding to one"

In F# we can again define a custom type:

type Integer =
| Positive of Natural.Natural
| Zero
| Negative of Natural.Natural

and map to equivalence classes using

let MakeInteger (plus,minus) =
    match Natural.Compare plus minus with
    | Comparison.Equal -> Zero
    | Comparison.GreaterThan -> Positive (Natural.Subtract plus minus)
    | Comparison.LessThan -> Negative (Natural.Subtract minus plus)

whereas in Clojure we just use maps:

(def zero {:sign :zero})
(defn positive [n] {:sign :positive, :n n})
(defn negative [n] {:sign :negative, :n n})

and the very similar

(defn make-integer [plus minus]
  (let [compare (natural-numbers/compare plus minus)]
    (case compare
      :equal zero
      :greater-than (positive (natural-numbers/subtract plus minus))
      :less-than (negative (natural-numbers/subtract minus plus)))))

We can easily define addition and subtraction and even multiplication, but when we get to division we run into problems again. You'd like 1 / 3 to be the number that when multiplied by three yields one. But there is no such Integer.

let rec TryDivide (i1 : Integer) (i2 : Integer) =
    match i1, i2 with
    | _, Zero -> failwithf "Division by Zero is not allowed"
    | _, Negative _ -> TryDivide (Negate i1) (Negate i2)
    | Zero, Positive _ -> Some Zero
    | Negative _, Positive _ ->
        match TryDivide (Negate i1) i2 with
        | Some i -> Some (Negate i)
        | None -> None
    | Positive _, Positive _ ->
        if LessThan i1 i2
        then None // cannot divide a smaller integer by a larger one
        else
            match TryDivide (Subtract i1 i2) i2 with
            | Some i -> Some (SuccessorOf i)
            | None -> None

and similarly

(defn try-divide [i1 i2]
  (cond
    (zero? i2) (throw (Exception. "division by zero is not allowed"))
    (negative? i2) (try-divide (negate i1) (negate i2))
    (zero? i1) zero
    (negative? i1) (let [td (try-divide (negate i1) i2)]
                     (if td (negate td)))
    :else ; both positive
      (if (less-than i1 i2)
        nil ; cannot divide a smaller integer by a larger one
        (let [td (try-divide (subtract i1 i2) i2)]
          (if td (successor-of td))))))

And if we're clever we can get a lazy sequence of all prime numbers:

let rec IsPrime (i : Integer) =
    match i with
    | Zero -> false
    | Negative _ -> IsPrime (Negate i)
    | Positive Natural.One -> false
    | Positive _ ->
        let isComposite =
            Range Two (AlmostSquareRoot i)
            |> Seq.exists (fun i' -> IsDivisibleBy i i')
        not isComposite 

let AllPrimes =
    Natural.AllNaturals
    |> Seq.map Positive
    |> Seq.filter IsPrime

and in Clojure

(defn prime? [i]
  (cond
    (zero? i) false
    (negative? i) (prime? (negate i))
    (equal-to i one) false
    :else (not-any? #(is-divisible-by i %) (range two (almost-square-root i)))))

(def all-primes
  (->> natural-numbers/all-naturals
    (map positive)
    (filter prime?)))

Rational Numbers

Now, to solve the "division problem", we can similarly look at equivalence classes of pairs of integers, just as long as the second one isn't zero.

// motivated by the "division problem" -- given integers i1 and i2, where i2 not zero,
// would like to define some number q = Divide i1 i2, such that EqualTo i1 (Multiply q i2) 

// proceeding as above, why not define a new type of number as a *pair* (i1,i2) representing
// the "quotient" of i1 and i2.  Again such a representation is not unique, as you'd want
// (Two,One) = (Four,Two) = [the number corresponding to Two]

// when do we want (i1,i2) = (i1',i2') ?  
// when there is some i3 with i1 = i2 * i3, i1' = i2' * i3, or
// precisely when we have i1 * i2' = i1' * i2

// in particular, if x divides both i1 and i2, so that i1 = i1' * x, i2 = i2' * x, then
// i1 * i2' = i1' * x * i2' = i1' * i2, so that (i1, i2) = (i1', i2')

type Rational(numerator : Integer.Integer, denominator : Integer.Integer) =
    let gcd = 
        if Integer.EqualTo Integer.Zero denominator then failwithf "Cannot have a Zero denominator"
        else Integer.GCD numerator denominator
    // want denominator to be positive always
    let reSign =
        match denominator with
        | Integer.Negative _ -> Integer.Negate
        | _ -> id

    // divide by GCD to get to relatively prime
    let _numerator = (Integer.Divide (reSign numerator) gcd)
    let _denominator = (Integer.Divide (reSign denominator) gcd)

    member this.numerator with get () = _numerator
    member this.denominator with get () = _denominator


(defn rational [numerator denominator]
  (let [gcd (if (integers/equal-to integers/zero denominator)
              (throw (Exception. "cannot have a zero denominator!"))
              (integers/gcd numerator denominator))
        re-sign (if (integers/less-than denominator integers/zero)
                  integers/negate
                  (fn [i] i))]
    {:numerator (integers/divide (re-sign numerator) gcd),
     :denominator (integers/divide (re-sign denominator) gcd)}))

There is lots of extra code around the rationals, although it's hard to run into limitations as we did before. The most common limitation is that there's no rational whose square is two, but it's hard to run into that limitation without reasoning outside the system.

Real Numbers

Two common ways of constructing the real numbers from the rationals are Dedekind Cuts and equivalence classes of Cauchy Sequences. Neither is easy to implement in code.

Instead, I found a way to specify real numbers as Cauchy sequences along with explicit Cauchy bounds:

// following
// we'll define a Real number as a pair of functions:
// f : Integer -> Rational
// g : Integer -> Integer
// such that for any n, and for any i and j >= g(n) we have
//  AbsoluteValue (Subtract (f i) (f j)) <= Invert n

type IntegerToRational = Integer.Integer -> Rational.Rational
type IntegerToInteger = Integer.Integer -> Integer.Integer
type Real = IntegerToRational * IntegerToInteger

let Constantly (q : Rational.Rational) (_ : Integer.Integer) = q
let AlwaysOne (_ : Integer.Integer) = Integer.One
let FromRational (q : Rational.Rational) : Real = (Constantly q), AlwaysOne


(defn real [f g] {:f f, :g g})
(defn f-g [r] [(:f r) (:g r)])

(defn constantly [q] (fn [_] q))
(defn always-integer-one [_] integers/one)

(defn from-rational [q] (real (constantly q) always-integer-one))

One interesting twist here is that it is impossible to say whether two numbers are equal without reasoning outside the system. For instance, the real number FromRational Rational.Zero is equal to the real number

(Rational.FromInteger >> Rational.Invert, id)

(which represents the sequence 1, 1/2, 1/3, 1/4, ...), but again you can only reason about that outside of code. Instead you can define CompareWithTolerance which -- given a tolerance -- can tell you that one number is definitively greater than another, or that they're "approximately equal".
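To make the idea concrete, here's a rough Python sketch of the same scheme, with `Fraction` standing in for the constructive rationals and plain ints for the integers. The function names and the 2/n tolerance convention are my own guesses at the idea, not the library's actual API:

```python
from fractions import Fraction

# A "real" is a pair (f, g): f(i) is the i-th rational approximation, and
# for any n, all approximations from index g(n) onward stay within 1/n
# of one another.
def from_rational(q):
    return (lambda i: q, lambda n: 1)

# The sequence 1, 1/2, 1/3, ... -- a perfectly good representation of zero.
harmonic = (lambda i: Fraction(1, i), lambda n: n)

def compare_with_tolerance(r1, r2, n):
    """At tolerance 1/n: 'less-than', 'greater-than', or
    'approximately-equal' when the two can't be separated."""
    (f1, g1), (f2, g2) = r1, r2
    i = max(g1(n), g2(n))       # index where both sequences have settled
    diff = f1(i) - f2(i)
    bound = Fraction(2, n)      # each term can still drift by up to 1/n
    if diff > bound:
        return "greater-than"
    if diff < -bound:
        return "less-than"
    return "approximately-equal"

print(compare_with_tolerance(from_rational(Fraction(0)), harmonic, 100))
# approximately-equal -- exact equality stays undecidable
print(compare_with_tolerance(from_rational(Fraction(1)), harmonic, 100))
# greater-than
```

Note that no tolerance ever certifies equality; tightening n just shrinks the "approximately equal" band.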

The ultimate test here would be to show that the real number

let SquareRootOfTwo : Real =
    let rationalTwo = Rational.FromInteger Integer.Two
    let sq x = Rational.Subtract (Rational.Multiply x x) rationalTwo
    let sq' x = Rational.Multiply x rationalTwo
    // Newton's method
    let iterate _ guess = Rational.Subtract guess (Rational.Divide (sq guess) (sq' guess))
    let f = memoize iterate Rational.One
    let g (n : Integer.Integer) = n
    f, g

gives you the real number FromRational Rational.Two when you square it. It looks like it should. Unfortunately, trying to do so will blow up the call stack, so it's not advised. Maybe someday I'll go back and try to make everything tail-recursive.
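The rewrite would be mechanical, since the recursion in something like Add is really just a loop. Here's a hedged Python sketch of the idea, using the Clojure-style :predecessor encoding; none of these names come from the library, and the int conversions are just test helpers:

```python
# Peano naturals as nested dicts, mirroring the Clojure encoding:
# a number is {"predecessor": <smaller number>}, and one = successor of nil.
def successor_of(n):
    return {"predecessor": n}

def from_int(k):                     # test helper only
    n = successor_of(None)           # one
    for _ in range(k - 1):
        n = successor_of(n)
    return n

def to_int(n):                       # test helper only
    count = 0
    while n is not None:
        count += 1
        n = n["predecessor"]
    return count

def add_recursive(n1, n2):
    # one stack frame per unit of n1 -- blows up for large numbers
    if n1["predecessor"] is None:
        return successor_of(n2)
    return add_recursive(n1["predecessor"], successor_of(n2))

def add_iterative(n1, n2):
    # the same n1 + n2 = (n1 - 1) + (n2 + 1) idea as a loop: O(1) stack
    while n1["predecessor"] is not None:
        n1, n2 = n1["predecessor"], successor_of(n2)
    return successor_of(n2)

# big enough to overflow Python's default recursion limit if done recursively
print(to_int(add_iterative(from_int(5000), from_int(3))))  # 5003
```

In F# the equivalent move is an accumulator argument so the recursive call is in tail position, which the compiler turns into a loop.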

Gaussian Integers

Another drawback of the Integers is that none of them have negative squares. One way to solve this is by adding a number "i" whose square is negative one. I got kind of bored with these, so I never took them too far and never wrote any tests.
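For the record, the arithmetic that would go there is small. A minimal Python sketch, with plain (real, imaginary) int pairs standing in for the library's constructive Integers (the names are mine):

```python
# A Gaussian integer a + bi as a (real, imaginary) pair.
def g_add(z, w):
    (a, b), (c, d) = z, w
    return (a + c, b + d)

def g_multiply(z, w):
    # (a + bi)(c + di) = (ac - bd) + (ad + bc)i, using i^2 = -1
    (a, b), (c, d) = z, w
    return (a * c - b * d, a * d + b * c)

i = (0, 1)
print(g_multiply(i, i))  # (-1, 0): at last, a number with a negative square
```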

Complex Numbers

The obvious next step would be to add the square root of negative one "i" to the real numbers. But since they're not working so great I never did this.


I spent way too much time on this project, and I really need to get back to other things, like the group-couponing site I'm planning to build, so I'm ready to call this quits. Here are some things I learned:

1. Math is hard.
2. Writing the Clojure versions was more "fun". However,
3. Getting the F# versions to work was much easier, because most of my Clojure bugs would have been caught by a type checker (or were caused by using maps as types and then having them unintentionally decompose).
4. If I put this much work into useful ideas, imagine what I could accomplish!
5. Probably I shouldn't read "Wheel of Time" again.

1. Which is approximately 1 week.

Secrets of Fire Truck Society

Hi, I gave a talk at Ignite Strata on “Secrets of Fire Truck Society” and at the end I promised that for more information you could visit this blog. Unfortunately, I haven’t had time to write a blog post. Here are some links to tide you over until I do:

On “On Leaving Academia”

Several people in my influencesphere have linked to this essay by a CS prof who’s leaving academia to join Google in order to “make a positive difference in the world.” I am, of course, wholly supportive of such a program, if not of his precise rationale, which is a mish-mash of ranting about wicked Republicans and wild-eyed idealism about the Academy.

What interests me most about his essay is the section entitled “Mass Production Of Education”, which is misguided in all the ways you’d expect from someone steeped in the culture of “bespoke” education. It lists three “worries”:

First, I worry that mass-production here will have the same effect that it has had on manufacturing for over two centuries: administrators and regents, eager to save money, will push for ever larger remote classes and fewer faculty to teach them.

Said differently, technologies that allow fewer faculty to teach the same number of students will allow universities to operate with fewer faculty. Let’s call this worry “Luddism”. I love a good loom-smashing as much as the next guy, but it’s sort of hard to take seriously a preference for the 19th-century manufacturing regime.

It seems likely that in a hundred years our grandchildren and those of us who’ve successfully been cryonically revived will share a laugh about how “education” used to involve crowding people into a room and making them sit still while someone stood up front and lectured at them. And then someone will brain-cast a ludicrous hyper-essay about how 4-D printing is democratizing the singularity, pining for the good old days of 3-D printing. And so on.

Second, I suspect that the “winners win” cycle will distort academia the same way that it has industry and society. When freed of constraints of distance and tuition, why wouldn’t every student choose a Stanford or MIT education over, say, UNM?

Said differently (and with apologies to UNM, which I’m sure is a fine school), if every student has access to cheap, high-quality education, few of them will choose to pursue a low-quality education. It is easy to see how purveyors of low-quality education might worry about this, but it’s hard to imagine why anyone else should.

Are we approaching a day in which there is only one professor of computer science for the whole US?

Seems pretty unlikely, but if we were that would be awesome because it would free up all the other computer science Ph.D.s, many of whom are brilliant, to do other stuff (like building Groupon and Pinterest clones)! This would be sad for the ones who really, really, really want to be teachers, but on balance it would be a huge win for the world.

Third, and finally, this trend threatens to kill some of what is most valuable about the academic experience, to both students and teachers. At the most fundamental level, education happens between individuals — a personal connection, however long or short, between mentor and student.

I have no idea how to say this differently, so I won’t try. Having been a teacher, I agree that the most rewarding moments happened between individuals. (Particularly when one of the individuals was the cute goth freshman girl who aced all the quizzes but still came to office hours.) Were those the most valuable parts of the teaching experience? Less clear. What’s more clear is that what was/is most valuable about my experience as a student was/is learning stuff. And these days most of what I know that’s useful I’ve learned from books or doing or even Coursera, not from the academy. I’ve broadened my horizons by pleasure reading, by arguing on LiveJournal, by discussions with peers on geek hikes far more than I ever did through school. With very few exceptions, my most profound intellectual connections have been with people I met outside of the school system.

It resonates at levels far deeper than the mere conveyance of information — it teaches us how to be social together and sets role models of what it is to perform in a field, to think rigorously, to be professional, and to be intellectually mature.

I suspect you have to have spent your whole life in academia to seriously assert that “the human connection in education” is the only path to these things, or even the easiest path to these things. College taught me how to play the same juvenile bulshytt status games we played in high school but at a slightly higher level. College professors were (sometimes) great role models for how to behave if you ever became a college professor, but not for much else. The levels of professionalism and intellectual maturity I experienced in the academy were certainly no greater than I’ve experienced in the real world. I will freely admit to learning rigor (some would say too much rigor) while studying mathematics, which primed me to recognize the lack of rigor in so many other fields.

I am terribly afraid that our efforts to democratize the process will kill this human connection and sterilize one of the most joyful facets of this thousand-year-old institution.

Said differently, “we fear change”. Hopefully at Google he’ll learn to stop saying “democratize”, and maybe he’ll even meet a Republican or two. There must be one or two Republicans at Google, right?

Hyphen Class Post-Mortem

Last fall I signed up for two of the hyphen classes: the Machine Learning ml-class (Ng) and the Artificial Intelligence ai-class (Thrun and Norvig). Both were presented by Stanford professors but one of the conditions of taking the courses was that whenever I discuss them I am required to present the disclaimer that THEY WERE NOT ACTUALLY STANFORD COURSES and that I WAS NEVER ACTUALLY A STANFORD STUDENT and that furthermore I AM NOT FIT TO LICK THE BOOTS OF A STANFORD STUDENT and so on. (Caltech is better than Stanford anyway, even if whenever you tell people you’re in the economics department they always say, “we have one of those?!”)

My background is in math and economics, but I’ve taught myself quite a bit of computer science over the years, and I consider myself a decent programmer now, to the point where I could probably pass a “code on the chalkboard” job interview if that’s what I needed to do in order to support my family and/or drug habit.

I’d worked on some machine learning projects at previous jobs, so I’d picked up some of the basics, but I’d never taken any sort of course in machine learning. At my current job I’m the de facto subject matter expert, so I thought the courses might be a good idea.

The classes ended up being vastly different from one another. Here’s kind of a summary of each:


ml-class

* Every week 5-10 recorded lectures, total 1-2 hours of lecture time. (There was an option to watch the lectures at 1.2x or even 1.5x speed, which I always used, so it might have been more like 3 hours in real-time. This means that if I ever meet Ng in real-life, he will appear to me to be speaking very, very slowly.)

* Most lectures had one or two (ungraded) integrated multiple choice quizzes with the sole purpose of “did you understand the material I just presented?”

* Each week had a set of “review questions” that were graded and were designed to make sure you understood the lectures as a whole. You could retake the review if you missed any (or if you didn’t) and they were programmed to slightly vary each time (so that a “which of the following are true” might be replaced with a “which of the following are false” with slightly different choices but covering the same material).

* Each week also had a programming assignment in Octave, for which they provided the bulk of the code, and you just had to code in some functions or algorithms. I probably spent 2-3 hours a week on these, a fair amount of that chasing down syntax-error bugs in my code and/or yelling at Octave for crashing all the time.

* Machine learning is a pretty broad topic, and this course mostly focused on what I’d call “machine learning using gradient descent.” There was some amount of calculus involved (although you could probably get by without it) and a *lot* of linear algebra. If you weren’t comfortable with linear algebra, the class would have been very hard, and the programming assignments probably would have taken a lot longer than they took me.

* The material was a nice mix of theoretical and practical. I’ve already used some of what I learned in my work, and if there was a continuation of the class I would definitely take it. As it stands I’m right now signed up for the nlp-class and the pgm-class, which should be starting soon, both of which are relevant to what I do.

* The workload, and the corresponding amount I learned, were substantially less than they would have been in an actual 10-week on-campus university course. This was great for me, since I also have a day job and a baby. If I were a full-time student being offered ml-class instead of a real machine learning class, I might feel a little cheated. (I saw a blog post by some Stanford student whining about this, but he was mostly upset that the hyphen classes were devaluing his degree. Someone should have reminded him about the disclaimer.)

* The class was very solidly prepared. The lectures were smooth and well thought out. The review questions did a good job of making sure you’d learned the right things from the lectures. The programming assignments were good in their focus on the algorithms, although that did insulate you from the real-world messiness of getting programs set up correctly.

* It certainly seemed like Ng really enjoyed teaching, and at the end of the last lecture he thanked everyone in a very heartfelt way for taking the class.
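For anyone wondering what "machine learning using gradient descent" cashes out to, here's a toy Python sketch of fitting a line by gradient descent on mean squared error. (The course itself used Octave; this is my own illustration, not course code.)

```python
# Fit y = w*x + b by gradient descent on mean squared error.
def gradient_descent(xs, ys, learning_rate=0.01, steps=5000):
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(steps):
        # gradients of (1/n) * sum((w*x + b - y)^2) with respect to w and b
        dw = (2 / n) * sum((w * x + b - y) * x for x, y in zip(xs, ys))
        db = (2 / n) * sum((w * x + b - y) for x, y in zip(xs, ys))
        w -= learning_rate * dw
        b -= learning_rate * db
    return w, b

xs = [1, 2, 3, 4, 5]
ys = [3, 5, 7, 9, 11]            # exactly y = 2x + 1
w, b = gradient_descent(xs, ys)
print(round(w, 2), round(b, 2))  # 2.0 1.0
```

The class's versions were vectorized with linear algebra rather than written as explicit sums, which is where all that Octave matrix work came in.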


ai-class

* Every week dozens of lectures, each a couple of minutes long, interspersed with little multiple choice quizzes. This was my first point of frustration, in that the quizzes were frequently about parts of the lecture that hadn’t happened yet. Furthermore, they often asked ambiguous questions, or questions that were unanswerable based on the material presented so far.

* Each week had a final quiz that you submitted answers for one time only. Then you waited until the deadline passed to find out if your answers were correct (and then you waited another day, because the site always went down on quiz submission day, and so they always extended the deadline by 24 hours). These quizzes were also ambiguous, which meant that if you wanted to get them correct you had to pester for clarifications (and sometimes for clarifications of the clarifications).

* This resulted in the feeling that the grading in the class was stochastic, and that your final score was more reflective of “can I guess what the quiz-writer really meant” than “did I really understand the material”. Although I didn’t particularly care about my grade in the class, I was still frustrated and disheartened by the feeling that the quizzes were more interested in *tricking* me than in helping me learn.

* What’s more, the quizzes often seemed to focus on what seemed to me tangential or inconsequential parts of the lesson, like making sure that I really, deeply understood step 3 of a 5-step process, but not whether I understood the other four steps or the process itself.

* The material also seemed very grab-bag, almost like an “artificial intelligence for non-majors” survey course.

* Anyway, partly on account of my finding the class frustrating, partly on account of time pressures, and partly because I didn’t feel like I was learning a whole lot, I dropped the ai-class after about four weeks.

* There were no programming assignments, but there was a midterm and a final exam, both after I quit the course. From what I could tell, they were longer versions of the quizzes, with the same problems of clarity and ambiguity. (I never unfollowed the @aiclass twitter, and during exam time it was a steady stream of clarifications and allowed assumptions.)

* Compared to the tightly-planned ml-class, the ai-class felt very haphazard. In addition, I found the ml-class platform more pleasant to use than the ai-class platform.

* I quit long before the last lecture, so I have no idea how heartfelt it was.

One thing about both classes: I *hate* lectures. I learn much better reading than I do being lectured at, and I found the lecture aspect of *both* classes frustrating. I have complained about this in many venues, but my prejudice is that if you’re using the internet to make me watch *lectures*, you’re not really reinventing education, because I still have to watch lectures, and I hate lectures. Did I mention that I hate lectures?

By way of comparison, I have also been doing CodeYear. It is currently below my level (I am plenty familiar with variables and if-then statements and for loops), but I don’t know much Javascript, and the current pace makes me hopeful that it will get interesting for me after another month or two.

If you don’t know that platform, it gives you a task (“create a variable called myName, assign your name to it, and print it to the console”) and a little code window to do it in. Then you click “run” and it runs and tells you if you got it right or not. There is a pre-canned hint for each problem.

What I really like about Codecademy is that I can do it at my own pace. The lessons are wildly variable in quality, but I’m glad not to have to sit through hours of lectures every week. They also do “badges”, which I find more satisfying than I wish I did. That said, I suspect someone with no experience debugging code would find the experience impenetrable and waste hours tracking down simple syntax errors, and indeed I saw on Hacker News a post to this effect a few weeks ago.

In the end, despite all this, the way I learn best is through a combination of reading books and writing actual code. I’ve had to learn F# over the last month, which I’ve done by reading a couple of (quite nice) books and writing a lot of actual code. It’s hard for me to imagine the course that would have done me any better (or any faster).

Similarly, if I wanted to learn Rails (which some days I think I do and other days I think I don’t), I have trouble imagining a course that would do better for me than just working through the Rails Tutorial (which I have skimmed, which has convinced me that I could learn well from it).

Similarly similarly, I suspect that the right Machine Learning book (and some quality time with e.g. Kaggle) would have been much more effective for me than the ml-class was. But if such a book exists, I haven’t found it yet.

But With Guns And a No-Knock Warrant

The government has a CIO, it turns out, and when he’s not hassling us to change our passwords again or to stop BitTorrenting on company time, he’s got a plan to re-invent government itself:

On Tuesday, VanRoekel said that he wants to overhaul the federal bureaucracy to become more agile in an age of services delivered via mobile apps, and where information is atomized so that it can be mashed up by anyone to provide deeper insights. He also wants to break down massive multi-year information technology projects into smaller, more modular projects in the hopes of saving the government from getting mired in multi-million dollar failures.


“Going forward, we need to embrace modular development, build on open standards, and run our projects in lean startup mode,” he said.

No one can argue that he doesn’t grasp the lingo. However, a career Microsoftie is maybe not the best choice to run anything in “lean startup mode”. As someone with a fair amount of startup experience, I offer him the following pieces of advice:

1. Never say “lean startup mode” (or “agile” or “mashed up”)

Each of these buzzwords sends a clear signal either that you’ve been in a coma since 2006 or that your “startup experience” consists entirely of eavesdropping at a coffee shop where programmers hang out.

2. Also, “mobile apps” are very 2009

I’m not saying you couldn’t hit the jackpot and sell several million copies of “Angry Birds D.C.” or “Laws with Friends” or even “Doodle Congress”. But the odds are against you.

3. Startups have to convince investors to give them money

This is part of what makes startups startups. It’s tough to stay “lean” and “agile” (let alone “mashed up”) if you can simply close a funding round at gunpoint each April 15.[1] If VanRoekel can somehow make it so that government has to make PowerPoint slides and beg us for money each time it needs some, that would be a huge win.

4. Startups have to at least pretend to have a revenue model

It doesn’t have to be completely realistic. It can in fact be pretty ludicrous, like “we’ll sell ‘$50 of junk for only $25’ coupons and then only give the merchants half of the $25.”

But it does have to involve revenue. For instance, “we’ll use the funds to subsidize our friends’ failing businesses and also to bail out our other friends’ failed businesses and then to send troops to Africa and then finally to imprison some recreational drug users” is not a revenue model. Could you maybe add some sort of group shopping component?

5. Startups need an “elevator pitch”

At some point you’ll be in an elevator with someone, and he’ll ask you what your startup does, and you’ll have to explain it to him in terms of something he already knows (and recognizes as a success for venture capital).

For instance, a startup might be “Flickr but for dogs” or “Facebook but for cats” or “ but for group shopping deals.”

Obviously, none of these describes the federal government. Coming up with these analogies is more of an art than a science, but you might consider “Enron but bigger” or “Swoopo but mandatory” or “ but with guns and a no-knock warrant”.

6. Startups fire people

Part of being “lean” and “agile” and “mashed up” is that you can’t afford to keep the wrong people *cough* Tim Geithner *cough* Janet Napolitano *cough* Eric Holder *cough* Steven Chu *cough* in their jobs when they suck at them. If the CIO is empowered to make this change, then good for him!

7. Startups have a “fun” culture

fun              not fun
ping-pong table  metal detectors
free popcorn     the toothpick rule
catered meals    Supreme Court cafeteria
geek shootout    Waco shootout

8. Startups usually fail and go out of business

I’d be lying if I said this wasn’t the most exciting part of the “government as startup” plan.

[1] Can we dispense with the fiction that taxes are due on April 15? Multiple times I paid my taxes by April 15 and yet was still “penalized” because I didn’t “estimate” and “prepay” them sooner.

Machine Learning Beverage

Although my formal training is in subjects like math and economics and animal husbandry, most of the money-work I do is in subjects like data science and fareology and writing over-the-top religious polemics. This is one of the reasons why I’m so sour on the value of college, as my multi-million-dollar investment in tuition and pitchers of Ice Dog beer and Tower Party t-shirts didn’t even provide me the opportunity to learn any of these.

I did get to take an “Artificial Intelligence” class. The only listed prerequisite was the “Intro to CS” class, but a brand new professor was teaching and she decided to make it a much more advanced class, and then I was going to partner with my friend who was a CS major so that he could handle all the more advanced programming aspects, but he dropped the class after a couple of weeks so he could spend his senior year focused on “not taking classes”, which meant that I got to spend my senior year focused on “learning enough about computer programming to not fail the class”, after which I picked up a bit of “how to sometimes beat the computer at tic-tac-toe” and “how to sometimes beat the computer at Reversi” and “how to narrowly avoid coming in last place in the classwide ‘Pac War’ tournament.”

Despite that initial setback, over the course of my career I’ve managed to learn bits and pieces of what’s variously called “machine learning”, “artificial intelligence”, or “guessing stuff”. I suspect I would be more popular at data mining parties if I had a smidge more training in these subjects, and so I was very excited at the prospect of Stanford’s free online Artificial Intelligence Class and Machine Learning Course, both of which are offered this fall. (There’s also a Database Class, but I know too much about databases already.)

You don’t get actual Stanford credit if you take the classes online, but I don’t particularly want Stanford credit, which means that’s not a deal-breaker. You get some sort of certificate signed by the professors listing your rank in the class, which will probably be somewhere in the millions thanks to all the Chinese students who will be cheating on their assignments, but I don’t particularly want a certificate either. I wouldn’t mind some sort of bumper sticker (“MY COMPUTER ALGORITHM IS SMARTER THAN YOUR HONOR STUDENT AND FURTHERMORE WON’T EVER BE UNEMPLOYED AND LIVING IN MY BASEMENT UNDER A CRIPPLING MOUNTAIN OF STUDENT-LOAN DEBT”), but that doesn’t seem to be part of the plan.

Most likely I won’t have enough time to devote to the classes anyway, what with work and training the baby to take over the world someday and trying to finish the novel about the boy who likes to play baseball but is no good at it. And this isn’t helped by the fact that both classes are going to have hours of online lectures that I’m going to have to sit through. Lectures!

I twittered the other day that if I have to sit through lectures then you’re not really transforming education. A lot of people (reasonably) interpreted this as a dig at the Khan Academy, but I was more angry at the Stanford CS department, which is tech-savvy enough to offer courses over the Internet to millions of cheating Chinese people and yet not tech-savvy enough to think of a better method of knowledge transmission than lectures with slides, which were invented by Moses or possibly even God, making them thousands of years old. I’m happy to take their quizzes and solve their problem sets and write their examinations, but the prospect of having to spend time listening to lectures is really glooming me down.

It’s not that I don’t appreciate what they’re doing, but if the Stanford Computer Science department really wants to revolutionize the educational process, they should figure out a way to upload information directly into my brain, or to embed it subliminally in Spider-Man cartoons, or to make it somehow drinkable. “Machine Learning Class” is the past; the future belongs to whoever first figures out “Machine Learning Beverage”!

Google, Plus

If you have been living in a cave without Internet access, you might not be aware of Google Plus, which you might think of as Google’s answer to Facebook (if Facebook were a question). After playing around with it a bit, it seems to have several advantages:

  • not operated by Facebook
  • your relatives aren’t on it yet
  • 90% of posts are about hot topics like Google Plus and how to use Google Plus and how cool Google Plus is, not boring topics like “pictures of my kids”
  • you can “follow” people who aren’t actually your friends, which means you can get topics in your feed other than the Paleo diet, cryonics, and the Reichart and Garrett show
  • and most importantly, circles

Whereas Facebook makes you lump all your friends together in one feed, Google lets you segregate them into circles for browsing and sharing. If you curate correctly, it’s easy to share links only with the “Asian females” circle and to browse only the “people on my kickball team that I like” circle.

Unfortunately, at this early stage of the game you cannot nest circles, which means it’s important to partition your friends correctly. After a lot of trial and error, I’ve found the following scheme of circles works pretty well for me:

  • Asian females
  • People who hate libertarians but put up with me for some unspecified reason
  • People on my kickball team that I like
  • People on my kickball team towards whom I’m ambivalent
  • Former bosses
  • People who post about things currently happening at the college we attended, even though we all graduated 15 years ago
  • Jackie Passey
  • Tall people
  • People that I don’t know who they are, but we have a lot of friends in common, so I’ll pretend like I do know who they are, because probably I’m supposed to
  • Fictional characters
  • Kirez
  • People I met at the Rudy Ray Moore concert
  • Everyone else

If there’s a downside to Google Plus, it’s that it’s a lot of work to check it all the time, and to casually brag about how many people are adding me to their circles, and to ask everyone their heights so that I know whether to put them in the “Tall people” circle or the “Jackie Passey” circle. Nonetheless, it’s pretty clear at this point that circles are the wave of the future, which means that my decades-long investment in analytic geometry is about to pay off!

Will Someone Please Invent the Virtual Locker Room

Bill Gates, always a man with big ideas, suspects that the internet is going to shake up our educational system:

“Five years from now on the web for free you’ll be able to find the best lectures in the world,” Gates said at the Techonomy conference in Lake Tahoe, CA today. “It will be better than any single university,” he continued.

In fact, this is already true today. When I used to bus-commute across the bridge, every bus ride that I didn’t spend reading pirated young-adult Star Wars novellas or playing “Angry Birds” I spent watching “iTunes U” lectures from Stanford and MIT and iPorn about “Machine Learning” and “Computer Science” and “The Naked Female Body.” If only I could somehow put these on my resume, I’d be able to talk my way into all sorts of jobs I’m not really qualified to do. BillG has got a plan for that too:

He believes that no matter how you came about your knowledge, you should get credit for it. Whether it’s an MIT degree or if you got everything you know from lectures on the web, there needs to be a way to highlight that.

Now, there is a cynical school of thought that says that the value of an MIT degree is not that it signals that you learned dozens of MIT-lecture-worths of things; rather, it’s that it signals that you were admitted to and jumped through all the hoops necessary to survive four years at MIT, in which case the hypothetical third-party credential “watched a bunch of MIT lectures on the bus” probably isn’t that useful to employers.

Furthermore, being lectured at is frequently not the best way to learn something. Nonetheless, I join BillG in applauding this trend. If it puts competitive pressure on colleges, it will be a good thing.

It seems to me that it’s even more promising for K-12 education. Rather than having centrally-assigned, underqualified teachers trying to lecture 30 students who learn at varying paces (and several of whom are disruptive), each student could find the lecturer and lecture style that works best for him. In many cases these might be no lectures at all. Think of the innovations that would ensue! I bet BillG is most excited about this:

He made sure to say that educational institutions are still vital for children, K-12. He spoke glowingly about charter schools, where kids can spend up to 80% of their time deeply engaged with learning.

But college needs to be less “place-based,” according to Gates. Well, except for the parties, he joked.

Wait, what? K-12 education needs to be “place-based”? I mean, I understand that the internet can’t yet teach kids valuable life skills like “staying in your seat” and “raising your hand before you speak” and “not going to the bathroom without getting permission first” and “getting duct-taped to a bench in the locker room for being too slow at running laps.” But surely virtual locker rooms and virtual duct tape are only a few years away!

(Also, for those of you who don’t know, I am delighted to report that the post-college years contain a huge number of parties, including Oktoberfests, Nights of Decadence, 80’s Parties, Bacchinaliae, Shut-up-and-Drinks, and Lovett Casino Parties.)

It’s tough to assert with a straight face that competition (from the internet or otherwise) will provide vast benefits for students in grades 13-16, but has no role to play in grades K-12. If Bill ever decides to spend his vast fortunes improving education, hopefully he’ll revisit his opinion on this first, before he wastes billions of dollars.

Why Software Testers Should Run for Congress

In my previous post “Why Software Developers Shouldn’t Run for Congress” I poked fun at the idea, proposed by a pie-in-the-sky, government-would-work-well-if-only-it-were-run-by-my-kind-of-people type, that an influx of software developers would noticeably improve the quality of our laws.

During a subsequent Facebook discussion, I came up with an additional “reason” why developers might enjoy Congress: developers hate testing their code, and Congress never tests before shipping. Of course I was being flip, but the idea has since gotten stuck in my head.

In software, when you want to make changes to code, you test them. You change small pieces and use unit tests to make sure they don’t break existing functionality. You develop a spec outlining what the code is supposed to do, and then you check that it does those things before you ship it. You have code reviews so that other coders can inspect your code looking for possible unintended consequences. You let normal users try to break the code before it ships. You try using the code yourself for a while before you inflict it on your customers. When you know that people will try to “game” your final product, you model their behavior and try to account for it in your design.
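For the non-programmers: the “unit test” step above looks roughly like this in practice. This is just a minimal Python sketch, and the toy tax function and its rules are invented purely for illustration:

```python
import unittest

def tax_owed(income, rate):
    """Toy 'law': compute tax owed, rejecting nonsensical inputs
    instead of shipping them to the public."""
    if income < 0:
        raise ValueError("income cannot be negative")
    if not 0 <= rate <= 1:
        raise ValueError("rate must be between 0 and 1")
    return income * rate

class TaxOwedTests(unittest.TestCase):
    # Any later change to tax_owed that breaks existing behavior
    # fails here, before it ever "ships".
    def test_basic_case(self):
        self.assertEqual(tax_owed(100.0, 0.25), 25.0)

    def test_zero_income_owes_nothing(self):
        self.assertEqual(tax_owed(0, 0.25), 0)

    def test_rejects_negative_income(self):
        with self.assertRaises(ValueError):
            tax_owed(-1, 0.25)

    def test_rejects_absurd_rate(self):
        with self.assertRaises(ValueError):
            tax_owed(100.0, 1.5)
```

You run the whole suite (e.g. with `python -m unittest`) every time the code changes; a single red test blocks the release. Imagine that, Congress.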

In particular, if you want to stay in business you don’t show up the night before release with thousands of pages of unreviewed, untested, hodge-podge code written by the very people hoping to hack your systems, full of hidden side effects, functionality that wasn’t in the spec, backdoors, and billion-dollar bugs. Unless you’re Congress, of course, in which case you stay in business no matter how sloppy your “coding” habits are.

So while I still don’t think Congress would be particularly improved by the addition of software developers, it sure as hell could benefit from some testers.