
Computer Languages Change - Like Spoken Languages

Are computer languages inherently "artificial" and "pure" — either like Esperanto or a dead language, such as Latin? Or are computer languages just as "living" as spoken languages?

Understand, I am not considering low-level assemblers or "dead" computer languages that exist in virtual museums (and yes, there are tech archives to explore). I mean the languages that are in wide enough use that programmers develop attachments to them and vocally argue about their futures.

In spoken languages, some people are purists. These experts like to "prescribe" grammars and the meanings of words, insisting on a rigid approach to a language. By comparison, some scholars of language act as "descriptive" researchers, trying to document a language's evolution. Most scholars, however, are a bit of both — we try to prescribe dominant rules, while accepting that change will happen.

The French try desperately to maintain an official "French" language. That's the same as having an ISO committee oversee a programming language. Yet, neither seems to work as well as intended.

The following is a rough metaphor, so please don't nit-pick at my generalities…

Many, if not most, computer languages are extensible; you can add to the language via macros, function libraries, and other means. Some languages can even be extended via other languages! (Tech trivia: The most common language for extending other languages is C.)

Eventually, the "language" as known by programmers includes libraries, plug-ins, and frameworks. We tend to forget where the core language ends and the extensions begin. Take Apple's Objective-C as an example. The Cocoa frameworks (actually several frameworks) are so integral to development that most people use "Objective-C" and "Cocoa" interchangeably. Objective-C is really a small language that doesn't do much by itself. Technically, we could state that Objective-C is C, with optional object-oriented extensions. Objective-C is a branch of the "C family tree" reflecting the ideas some programmers had about improving C.

Even in "old" traditional C, the standard I/O and math libraries have blurred with the language, too. Everyone considers the function strcpy(destination, source) part of C, but it isn't; it is a library function. One of my complaints about C is that strings were an afterthought in most older languages, and it shows. Still, the language evolved because over time the importance of dealing with textual data started to rival the value of number crunching.

Most languages end up being changed not by the purists with computer science degrees, those original compiler creators with special skills. Instead, the daily users of the languages alter and extend languages to get work done. Sometimes, the compilers are altered to reflect these pressures, and the "common usage" becomes an acceptable usage, much like spoken language. This has been problematic in scripting languages, and I challenge anyone to explain the mess that is PHP.

For another example of computer languages changing like spoken languages, consider good old-fashioned BASIC. There is a standard, but even by the mid-1980s there were countless dialects (GW-BASIC, Commodore BASIC, Atari BASIC, et cetera). Even today, we have branches from the original BASIC that, unless you know their heritages, would be as difficult to trace backwards to BASIC as English is to trace back to the Western Germanic languages. English has taken words and grammar from dozens of sources, and BASIC dialects have done the same.

Elements of C and Pascal, among others, have entered BASIC dialects such as PowerBASIC, Visual Basic (and VBA, VB.Net), PureBASIC, BlitzMax, and Real Studio (formerly REALbasic). In the case of PowerBASIC, the users kept adding functions using inline assembly code, so Bob Zale modified the compiler to keep up with what people were doing! The PowerBASIC language incorporated the elements coders were hacking into the language.

A common task in the old days was "page copying" a screen to simulate visual effects. Most of us hand-coded assembly routines and linked those into our programs. However, both QuickBASIC and PowerBASIC eventually relented and added the statement PCOPY source, destination directly to the compiler. The users changed the language.

The world of programmers is small, granted, and the number of compiler experts even smaller. But, most "normal people" don't add to dictionaries, either. A word becomes common, like a function becomes common, and eventually someone in "power" adopts the word or tries to reject it. Much of this depends on how close you are to that community that argues about what to add or not add to a language.

A more current example of language evolution can be found in HTML 5, which incorporated "div" (division) tags people were using with some regularity. These div tags, such as "article," became stand-alone tags in HTML 5; they are no longer simple named divisions on the page. We now have page elements not because the parser creators wanted them, but because so many designers were using CSS to construct libraries of common page divisions. The previous markup <div id="article"> has become <article> because that is the way users, not HTML parser developers, wanted it.

There remains an active debate between the XHTML supporters, of which I am one, and the HTML supporters. I prefer the rigid grammar of XHTML, which prevents structural errors — but the masses won out and we don't need closing tags for all elements. Why? Because languages seem to drift towards simplicity, losing complex grammar elements over time. (I compare this to the loss of "whom" and the use of "their" for "his or her" in common speech.)

As you can probably tell, I give this too much thought.


  1. The evolution of Objective-C over the last decade is another object lesson along these lines. There were standards enforced by the community gestalt, such as the getter/setter nomenclature and behavior, and then those got baked into the compiler as properties. Similarly with ARC: the majority of the programmers out there played by the same rules (which in many communities would be impossible; imagine something that depended on standardized nomenclature working in the straight-C community), which allowed some tedious work to be moved into the compiler.


