A Programmer’s Lot is Not a Happy One?


Well, I don’t know really. Most programmers that I know seem about as happy as the rest of the population, but I was thinking about programming and that variation on “A Policeman’s Lot” from the Pirates of Penzance appealed to me.

Programming is often presented as being difficult and esoteric, when in fact it is only a variation of what humans do all the time. When you read a recipe or follow a knitting pattern, you are essentially doing what a computer does when it “runs a program”.

Unix program to display running processes (Photo credit: Wikipedia)

The programmer in this analogy corresponds to the person who wrote the recipe or knitting pattern. Computer programs are not a lot more profound than a recipe or pattern, though they are, in most cases, a lot more complicated than that.

It’s worth noting that recipes and patterns for knitting (and for weaving for that matter) have been around for many centuries longer than computer programs. Indeed it could be argued that computers and programming grew out of weaving and the patterns that could be woven into the cloth.

Pattern of a traditional Norwegian Setesdal sweater. The pattern is created to be used on a punch card in a knitting machine. (Photo credit: Wikipedia)

In 1801 Joseph Marie Jacquard invented a method of using punched cards to weave a pattern into textiles automatically. It was a primitive program, which controlled the loom. I imagine that before it was invented the operators were given a sheet detailing which threads to raise and which to drop, and which colour threads to run through the tunnel (the “shed”, in weaving terms) thus formed. I can also imagine that such a manual process would lead to mistakes, and so to errors in the pattern created in the cloth. It would also be time consuming, I expect.

Jacquard’s invention, by bypassing this manual method, would have led to accurately woven patterns and a great saving in time. An added advantage was that changing to another pattern was as simple as loading a new set of punched cards.

Jacquard loom in the National Museum of Scotland, Edinburgh. (Photo credit: Wikipedia)

At around this time, maybe a little later, the first music boxes were produced. These contained a drum with pins that plucked the tines of a metal comb. However the idea for music boxes goes back a lot further as the link above tells.

The only significant difference between Jacquard’s invention and the music boxes is that Jacquard relied on holes and the music boxes relied on pins. They operated in different senses, positive and negative, but the principle is pretty much the same.
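That positive-versus-negative sense can be sketched in a few lines of Python. This is purely illustrative (the values are invented, not taken from any real card or drum): the same five-position pattern read in one sense is simply the bitwise complement of the pattern read in the other.

```python
# Illustrative only: one pattern stored in opposite senses.
# Read a hole as 1; an encoding of the same positions in the
# opposite sense is just the bitwise complement.
holes = 0b10110
pins = holes ^ 0b11111       # flip all five positions
print(format(holes, "05b"))  # 10110
print(format(pins, "05b"))   # 01001
```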

A PN junction in thermal equilibrium with zero bias voltage applied. Electron and hole concentrations are shown with blue and red lines respectively. Grey regions are charge neutral; the light red zone is positively charged and the light blue zone negatively charged. Below the junction are plots of the charge density, the electric field and the voltage. (Photo credit: Wikipedia)

Interestingly, there is a parallel in semiconductors. While current is carried by electrons, in a very real sense objects called “holes” travel in the reverse direction to the electrons. Holes are what they sound like: places where an electron is absent. However, I believe that in semiconductor theory they are much more than mere gaps, and behave like real particles.

It’s amazing how powerful programming is. Microsoft Windows is probably the most powerful program that non-programmers come into contact with, and it does so many things “under the hood” that people take for granted. It is all based on the absence or presence of things, much like Jacquard’s loom and the music boxes. While that is an analogy, it is not too far from the mark, and many people will remember having been told, more or less accurately, that computers run on ones and zeroes.
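That “ones and zeroes” remark is quite literal. A one-line Python illustration shows the letter A as the pattern of presence and absence a computer actually stores:

```python
# The letter "A" as a computer stores it: a pattern of ones and
# zeroes, not unlike holes punched (or not) in a Jacquard card.
print(format(ord("A"), "08b"))  # 01000001
```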


When a programmer sits down to write a program he or she doesn’t start writing ones and zeroes. He or she writes chunks of stuff which non-programmers would partially recognise. English words like “print”, “do”, “if” and “while” might appear. Symbols that look like maths might also appear. Depending on the language, the code might be sprinkled with dollar signs, which have nothing directly to do with money, by the way.

The programmer writes in a “language”, which is much more tightly defined than ordinary language, but which basically details, at a relatively high level, what the programmer wants to happen.

Logo for the Phoenix programming language (Photo credit: Wikipedia)

The programmer may tell the program to “read” something and, if the value read is positive, or is “Baywatch”, or is “true”, do something. The programmer has to bear in mind that often the value will NOT be what the program is looking for, and it is the programmer’s responsibility to handle not only the “positive” outcome but also the “negative” one: he or she will tell the program to do something else.
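A minimal sketch of that read-and-decide idea, written here in Python with invented names and values, might look like this:

```python
# A toy "read and decide" routine. Everything here is invented for
# illustration; the point is that both outcomes, positive and
# negative, are handled explicitly.
def classify(value):
    # the "positive" outcome: True, "Baywatch", or a positive number
    if value is True or value == "Baywatch" or (
            isinstance(value, (int, float)) and value > 0):
        return "do something"
    # the "negative" outcome, which must be handled too
    return "do something else"

print(classify("Baywatch"))  # do something
print(classify(-3))          # do something else
```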

When the programmer tells the program to “read” something, he or she essentially invokes a program that someone else has written, whose only job is to respond to the “read” command. These “utility” programs are often written in a more esoteric language than the original programmer uses (though they don’t have to be), and since they do one specific task they can be used by anyone who programs on the computer.

This program instructs other, lower level programs to do things for it. Again these lower level programs do one specific thing and can be used by other programs on the computer. It can be seen that I am describing a hierarchy of ever more specialised programs doing more and more specific tasks. It’s not quite like the Siphonaptera though, as the programs eventually reach the hardware level.
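That hierarchy can be sketched as a chain of Python functions (all the names here are invented for illustration): each layer does one narrower job and delegates the rest to the layer below.

```python
# Each layer performs one task and asks the layer below for the rest.
def fetch_from_device():    # stand-in for the hardware level
    return b"Baywatch"

def read_bytes():           # lower level: obtain the raw bytes
    return fetch_from_device()

def read_line():            # the "read" command a program invokes
    return read_bytes().decode("ascii")

print(read_line())  # Baywatch
```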

At the hardware level it will not be apparent what the programs are intended for, but the people who wrote them know the hardware and what the program needs to do. That knowledge comes partly from the hierarchy of programs above, and partly from similar programs that have already been written.

CPU (Photo credit: Wikipedia)

Without going into detail, the low level program might require a value to be supplied to the CPU of the computer. It will cause a number of conducting lines (collectively a “bus”) to be in one of two states, corresponding to a one or a zero, or it might cause a single line to vary between the states, sending a chain of states to the CPU.

In either case the states arrive in a “register”, which is a bit like a railway station. The CPU sends the chains of states (or bits) through its internal “railway system”, arranging for them to be compared, shifted, merged and manipulated in many ways. The end result is one or more chains of states arriving at registers, from whence they are picked up and used by the programs, with the end result being whatever the programmer asked for, way up in the stratosphere!
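Those comparisons, shifts and merges can be mimicked with Python’s bit operators; the CPU does the equivalent with wires and gates. The values here are arbitrary examples:

```python
# Two four-bit patterns, manipulated the way a CPU might.
a = 0b1100
b = 0b1010
print(format(a & b, "04b"))   # 1000  (kept where both have a 1)
print(format(a | b, "04b"))   # 1110  (merged)
print(format(a >> 1, "04b"))  # 0110  (shifted one place right)
```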

Model railway in Wiesbaden Hauptbahnhof (Photo credit: Wikipedia)

This is a monumental achievement, pun intended, and it is only achievable because at each level the programmer writes a program that performs one task at that level, and which doesn’t concern itself at all with any other level except to conform to the requests coming from above (technically, the interface). This is called abstraction.

Data abstraction levels of a database system (Photo credit: Wikipedia)

Grandad’s a Geek!!


The original geeks were sideshow performers who did disgusting things like disembowelling a chicken and eating it raw. They often had mental issues and lived in squalid conditions, maybe even in cages. They might be billed as “a savage from the depths of darkest Africa” or some such nonsense, but more likely they were just people who had sunk to the bottom of society and had fallen in with the carnival. Alcoholics who would work for a bottle of moonshine would reputedly sometimes act the wild man for the carnival.

The film “Nightmare Alley” tells the story of one such geek: a sane and relatively normal person who joins a carnival and works his way up to fame and fortune, only for his world to collapse around him and lead him to his final fate as the alcoholic carnival geek.

Nightmare Alley

The word “geek” (together with the similar word “gook”) has been used as a derogatory term for Asian people by Americans and others during wartime. Troops were supposedly encouraged to use such terms in order to “dehumanize” the people of the countries being fought in or over. Hence the connotations of dislike that come with the word.

The word “geek” meaning a clever person may possibly have its origins in the United Kingdom. Its use in this sense may have arisen when the word, which had been used to target overseas people, was instead used to target unlikeable people much closer to home! The person who top-scored in all the tests and had no social graces became known as a “geek”. Of course, in some cases the so-called “geek”, by virtue of his smartness, eventually became the employer of those who belittled him at school.


In the highly technical world that we live in, the “geek” naturally became a wanted person and while the term is still often used in the derogatory sense, it can be a term of back-handed admiration, and the term is often proudly asserted by the geeks themselves. Indeed, having worked in Computers and Information Technology for all of my working life, I somewhat proudly consider myself to be a “geek”.

79-365 I am a computer geek!

The techno-heroes of the current day are the likes of Bill Gates of Microsoft; Steve Jobs, Steve Wozniak and Ronald Wayne of Apple; and Bill Hewlett and Dave Packard of Hewlett-Packard. There is a sort of awe that these geeks have achieved so much.

Latter-day geeks have had films made about them. The founder of Facebook, Mark Zuckerberg, has been portrayed in a film, in a not so flattering light I understand, not having seen it myself. The school geek appears late in the film “Romy and Michele’s High School Reunion” to whisk the eponymous heroines off in his helicopter.

Cover of "Romy and Michele's High School ...

“Geeks”, “boffins”, “back room boys” have existed in every era of history, no doubt. They are relied upon to produce the technical goods while being regarded both as humorous and not quite normal. However their status has risen of late, driven by the vast technological boom that pretty much started during the Second World War. The Dambusters, the Enigma machine and the atomic bomb all came from that era and after the war the boom exploded.

ENIGMA cipher machine collection
ENIGMA cipher machine collection (Photo credit: brewbooks)

Geeks and computers go together. In the beginning, in the late 1940s, large machines started to appear in back rooms, tended by men and some women in white coats. These mysterious machines performed strange calculations and the geeks in control were treated like high priests of some mystery cult.

At this time a relatively new company called IBM rose to prominence and dominated the new field of computing. Mainframe computers, as they were called, swiftly spread to many companies, and special rooms were built to house the multitude of beige cabinets that formed a mainframe computer system. By the 1980s there were many computers performing many different tasks, and companies began to depend on them.

IBM Personal Computer model 5150 with monochrome phosphor monitor (model number 5151) and IBM PC keyboard. (Photo credit: Wikipedia)

However, smaller, simpler computers were starting to appear. Many households of that era would have had a Sinclair, Commodore or Atari computer on which to play games. IBM introduced a computer of this size, the IBM Personal Computer, but then dropped the ball. While IBM is still one of the biggest companies in the world, it did not really embrace this technology, allowing others to drive the rise of the PC.

IBM System/360 at the Computer History Museum. (Photo credit: Wikipedia)

One company did embrace the technology, realising that the way to make the big money was not to provide the hardware but the software that ran on it. Microsoft began its rise to prominence, like IBM before it, and the Microsoft operating system became dominant, and is still dominant today.

1993 – Grandad’s old computer setup, Irith – (Photo credit: Rev. Xanatos Satanicos Bombasticos (ClintJCL))

So what has this got to do with Grandad? Well, the current generation wonders whether the older generations will “get” the new technology. Consider, though: Grandad will be 60-ish, right? That means he would have been born in the 1950s. In the 1980s he would have been around thirty, just the right age to take part in the spread of computing around the globe. He may have had a Commodore or an Atari at home.

Commodore 64 (Photo credit: unloveablesteve)

In his thirties he will have seen the rise of DOS and Windows, and he may even have had a 386 machine at home. Possibly he became proficient in DOS and, early Windows being what it was, he was probably also proficient at obtaining and loading “drivers” for his machine.

It is likely that he has experienced the joys of persuading a modem to connect to a bulletin board, or through a fledgling ISP to this new thing, the Internet. He may have spent hours downloading a blocky, slow game to display on his CGA-capable monitor, transferring it down the telephone lines at a few hundred bytes a second. A megabyte download might have taken an hour or more.

Antique (古董) (Photo credit: alanine)

As the Internet grew he would have switched to the Netscape browser and accessed the Internet at 2400bps, then 4800bps, then a massive 9600bps and on to an astronomical 56kbps! Doubtless these days he uses some form of broadband or cable connection.

Today’s geeks believe that because they have grown up with the technology that Grandad (even Dad) will not be able to cope with it. They conveniently forget that while they may have grown up with the Internet and the technology, the Internet and technology have grown up with Grandad!

Blowing out Grandad's birthday candles
Blowing out Grandad’s birthday candles (Photo credit: djdpascoe)