I recently stumbled across a file format known as Intel HEX. As far as I can
gather, Intel HEX files (which use the
.hex extension) are meant to make
binary images less opaque by encoding them as lines of hexadecimal digits.
Apparently they are used by people who program microcontrollers or need to burn
data into ROM. In any case, when I opened up a HEX file in Vim for the first
time, I discovered something shocking. Here was this file format that, at least
to me, was deeply esoteric, but Vim already knew all about it. Each line of a
HEX file is a record divided into different fields—Vim had gone ahead and
colored each of the fields a different color.
set ft? I asked, in awe.
filetype=hex, Vim answered, triumphant.
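The record structure that Vim was coloring can be sketched in a few lines. Here is a hedged illustration (a sketch, not any particular tool's implementation) that splits one record into the fields the Intel HEX format defines: a leading colon, then hex pairs for the byte count, load address, record type, data, and a trailing checksum. The sample record in the usage note is a standard documentation example.

```python
def parse_ihex_record(line):
    """Split one Intel HEX record into its fields.

    A record looks like ':10010000...40': a colon, then hex pairs
    encoding the byte count (1 byte), load address (2 bytes, big-endian),
    record type (1 byte), data (count bytes), and a checksum (1 byte).
    """
    assert line.startswith(":"), "records start with a colon"
    raw = bytes.fromhex(line[1:])
    count = raw[0]
    address = int.from_bytes(raw[1:3], "big")
    rectype = raw[3]
    data = raw[4:4 + count]
    # The checksum is chosen so that every byte in the record,
    # checksum included, sums to zero modulo 256.
    assert sum(raw) & 0xFF == 0, "bad checksum"
    return count, address, rectype, data
```

Running it on the sample record `:0B0010006164647265737320676170A7` yields a count of 11, an address of 0x0010, record type 0 (a data record), and the data bytes `b"address gap"`.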
Subscribers to Popular Electronics were a sophisticated group. The magazine’s editor, Arthur Salsberg, felt compelled to point out as much in the editorial section of the December 1974 issue. The magazine had received letters complaining that a recent article, titled “How to Set Up a Home TV Service Shop,” would inspire a horde of amateur TV technicians to go out and undercut professional repairmen, doing great damage to everyone’s TVs in the process. Salsberg thought this concern was based on a misunderstanding about who read Popular Electronics. He explained that, according to the magazine’s own surveys, 52% of Popular Electronics subscribers were electronics professionals of some kind and 150,000 of them had repaired a TV in the last 60 days. Moreover, the average Popular Electronics subscriber had spent $470 on electronics equipment ($3578 in 2018) and possessed such necessities as VOMs, VTVMs, tube testers, transistor testers, r-f signal generators, and scopes. “Popular Electronics readers are not largely neophytes,” Salsberg concluded.
GitHub was launched in 2008. If your software engineering career, like mine, is no older than GitHub, then Git may be the only version control software you have ever used. While people sometimes grouse about its steep learning curve or unintuitive interface, Git has become everyone's go-to for version control. In Stack Overflow's 2015 developer survey, 69.3% of respondents used Git, almost twice as many as used the second-most-popular version control system, Subversion. After 2015, Stack Overflow stopped asking developers about the version control systems they use, perhaps because Git had become so popular that the question was uninteresting.
Lines of code longer than 80 characters drive me crazy. I appreciate that this is pedantic. I’ve seen people on the internet make good arguments for why the 80-character limit ought to be respected even on our modern Retina-display screens, but those arguments hardly justify the visceral hatred I feel for even that one protruding 81st character.
There was once a golden era in which it was basically impossible to go over the 80-character limit. The 80-character limit was a physical reality, because there was no 81st column for an 81st character to fit in. Any programmers attempting to name a function something horrendously long and awful would discover, in a moment of delicious, slow-dawning horror, that there literally wasn't room for their whole declaration.
The miraculous thing about Tim Berners-Lee is that he was the perfect person for the job.
In 1989, when Berners-Lee first proposed the idea that would become the World Wide Web, exciting things were happening in the realm of computing. A new set of standards called TCP/IP was allowing previously isolated computer networks to talk to each other. These standards had become popular, particularly in the American scientific community. By 1989, TCP/IP was also just starting to be adopted by commercial service providers like CompuServe.
In 2001, Tim Berners-Lee, inventor of the World Wide Web, published an article in Scientific American. Berners-Lee, along with two other researchers, Ora Lassila and James Hendler, wanted to give the world a preview of the revolutionary new changes they saw coming to the web. Since its introduction only a decade before, the web had fast become the world’s best means for sharing documents with other people. Now, the authors promised, the web would evolve to encompass not just documents but every kind of data one could imagine.
It’s hard to believe today, but the relational database was once the cool new kid on the block. In 2017, the relational model competes with all sorts of cutting-edge NoSQL technologies that make relational database systems seem old-fashioned and boring. Yet, 50 years ago, none of the dominant database systems were relational. Nobody had thought to structure their data that way. When the relational model did come along, it was a radical new idea that revolutionized the database world and spawned a multi-billion dollar industry.
The Altair 8800 was a build-it-yourself home computer kit released in 1975. The Altair was basically the first personal computer, though it predated the advent of that term by several years. It is Adam (or Eve) to every Dell, HP, or MacBook out there.
Ruby has always been one of my favorite languages, though I’ve sometimes found it hard to express why that is. The best I’ve been able to do is this musical analogy: Whereas Python feels to me like punk rock—it’s simple, predictable, but rigid—Ruby feels like jazz. Ruby gives programmers a radical freedom to express themselves, though that comes at the cost of added complexity and can lead to programmers writing programs that don’t make immediate sense to other people.
In 1962, JFK challenged Americans to send a man to the moon by the end of the decade, inspiring a heroic engineering effort that culminated in Neil Armstrong’s first steps on the lunar surface. Many of the fruits of this engineering effort were highly visible and sexy—there were new spacecraft, new spacesuits, and moon buggies. But the Apollo Program was so staggeringly complex that new technologies had to be invented even to do the mundane things. One of these technologies was IBM’s Information Management System (IMS).
I’ve always found man pages fascinating. Formatted as strangely as they are and accessible primarily through the terminal, they have always felt to me like relics of an ancient past. Some man pages probably are ancient: I’d love to know how many times the man page for tee has been revised since the early days of Unix, but I’m willing to bet it’s not many. Man pages are mysterious—it’s not obvious where they come from, where they live on your computer, or what kind of file they might be stored in—and yet it’s hard to believe that something so fundamental and so obviously governed by rigid conventions could remain so inscrutable. Where did the man page conventions come from? Where are they codified? If I wanted to write my own man page, where would I even begin?
JSON has taken over the world. Today, when any two applications communicate with each other across the internet, odds are they do so using JSON. It has been adopted by all the big players: Of the ten most popular web APIs, a list consisting mostly of APIs offered by major companies like Google, Facebook, and Twitter, only one API exposes data in XML rather than JSON. Twitter, to take one example, supported XML until 2013, when it released a new API version that dropped XML in favor of using JSON exclusively. JSON has also been widely adopted by the programming rank and file: According to Stack Overflow, a question and answer site for programmers, more questions are now asked about JSON than about any other data interchange format.
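To make "data interchange" concrete, here is a minimal sketch of the round trip JSON exists for, using Python's standard library. The payload's field names and values are made up for the example; the point is only that one application serializes structured data to text and another reconstructs the same values from it.

```python
import json

# A hypothetical API response: one application serializes it,
# another one, possibly written in a different language, parses it.
payload = {"user": "twobithistory", "followers": 1234, "verified": False}

wire = json.dumps(payload)    # the JSON text that travels over the network
received = json.loads(wire)   # the other side rebuilds the same structure

print(wire)   # {"user": "twobithistory", "followers": 1234, "verified": false}
assert received == payload
```

Note how little ceremony is involved compared to a typical XML round trip: no schema, no attributes-versus-elements decisions, just a handful of types that map directly onto the data structures most languages already have.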