Today on Hacker News, there’s yet another “Rewrite of X lines from popular language Y to obscure language Z” blog post near the top. There’s no point in linking it because they’re all the same: small- to medium-sized projects that can be rewritten and maintained by a single developer or a tiny team. I’ve written about this before in “Rewrite speed is a terrible, no-good, very-bad metric.”
However, today I want to tackle this subject from a slightly different angle: those posts, like virtually all new languages, are for hobby. They’re not for work.
This particular author acknowledges as much in the post, saying they did it just because it was fun. I agree: it is fun. I used to have enough spare time to enjoy trying out new languages, and I did so frequently, primarily as a hobby. I can only think of four times in my career when I’ve introduced a new language at work and had that language end up being very successful.
- Python, 1995. (For web, scripts, and Tkinter tools)
- Java, 1996. (Though I introduced it for applets!)
- Lua, 1998. (Embedded only)
- C#, 2002. (Windows/gamedev job)
Despite having explored many languages, that’s the full list I’ve successfully introduced at work, and each of those languages is still used in the context where I introduced it.
I think there’s an important commonality to recognize about these languages: when I introduced them. The general problem with languages is that they don’t gain widespread acceptance unless there is a major platform that’s growing. So far, in the history of computing, we’ve seen four major platform eras.
- Nothing -> Mainframes.
  - Timeline: 1940s -> Mid-to-late 1980s (market peak)
  - Scale: Low millions
  - Platform language adoption: Fortran and COBOL.
- Mainframes -> PCs.
  - Timeline: 1981 -> 2010 (market peak)
  - Scale: Hundreds of millions to low billions.
  - Platform language adoption: C++, C, C# (Windows only).
- PCs -> Web Browsers/Servers.
  - Timeline: 1993 -> Ongoing
  - Scale: Multiple billions
- PCs / Web -> Smartphones / Tablets
  - Timeline: 2007 -> Ongoing
  - Scale: Multiple billions
  - Platform language adoption: Java, Objective-C.
Now you may ask, “But wait, what about Ruby on Rails? What about PythonErlangScalaRustGo?”.
Here’s the thing: every era of computing I just listed, across the past 80 years, has produced many languages that people found useful at the time and that were very popular, but that had no staying power. Those languages weren’t a fundamental part of a platform’s scaling period, and so they either went away or fell into a niche.
Some examples: ALGOL, BASIC, VB, Ada, Pascal, Object Pascal, Modula, Delphi, Perl, PHP, Lisp, Scheme, OCaml, Eiffel, MUMPS, Self, Smalltalk, Tcl, and on and on and on.
I know of many successful projects built in all of these languages. But just because a successful project was written in a language doesn’t mean that language is a good choice ever again. I’ve thought about this for many years, and having unsuccessfully tried to introduce obscure languages into my work over that time, I’ve concluded there are really only two criteria for picking a language:
- Does it fit the requirements of the problem? (e.g. Python would not fit the requirements of a video game engine.)
- Is it the most popular language in that space?
And that’s really all you need. If either #1 or #2 isn’t true, then you need a migration plan. An unbelievable amount of engineering time (SWE-megaannum) has been wasted by people writing server-side code that then needs to be rewritten in a language that other people actually know and that scales. Python to Java is a typical one. So my recommendation is to always use the JVM. Instead of starting with Python, start with Kotlin, or Scala, or Clojure, or any JVM language that suits your fancy. Then your migration path to something like Java won’t require a complete rewrite, if you ever even need it (since the JVM scales well).