This is the door of St Peter’s Church, Tadworth. The journey home starts here. Normal service should be resumed next Sunday.
Category: General
My Mother’s 90th Birthday.
My parents are amazing. Not content with celebrating their 70th wedding anniversary this year, we also celebrated my mother’s 90th birthday this Sunday. We gathered at a hotel in Yatton on Saturday for a big do, with a lower-key celebration on the Sunday. Like I said, they are an amazing couple.
(Normal service will be resumed when I get home. I apologise for this brief post.)
The Burren
The Burren is a wonder of the natural world. Situated in the west of Ireland, it is an area of outstanding beauty that should be on everyone’s bucket list. It consists of a huge area of limestone outcrops and pavements, with brightly flowered plants living precariously but abundantly in the joints and cracks in the limestone. As I said, everyone should visit it.
This is a brief post to keep the chain going. Normal service will resume when we get back.
The first of a few brief posts
As I am travelling, visiting relatives, my next few posts are going to be brief. I’ve flown to Singapore for two days and then on to London. In a day or two we go on to Ireland, which I have never visited.
One interesting moment on the trip was when I was trying to post a photo of the plane’s progress screen to Facebook via the on board WiFi while we were way out over the ocean! While I could post text, I couldn’t get the image to upload.
My photos are mostly not on this device, but I may move some up to post here later. To finish this brief post I will post a picture of a springtime rain shower from the window of my sister’s house. Fortunately the weather hasn’t been too bad so far.
Sinistralism

At one time banks and post offices used to tie down the pen that they provided for people to sign things, such as cheques, with a piece of string or chain to prevent customers from stealing the pens. These days you are more likely to have a bunch of pens (with the bank’s logo) pressed on you.
Anyway, I was always discomforted by these tied-down pens as I am left-handed, and the string or chain was always fixed to the right of the counter and was rarely long enough for me to sign my cheques easily with my left hand. As a result I was cramped up and twisted round as I manoeuvred the cheque book closer to the right-hand side.

This is only one of the many times that my sinistralism has left me disadvantaged. Corkscrews and scissors, even power tools, wood screws and nuts and bolts are all designed for the majority who are right-handed. (I believe the correct term would be “dextralist”.) Can openers are the things that I find most tricky to use.
However, sinistralists living in a dextral world soon learn to use right handed tools, to at least some level of competence. There are tools made specifically for left-handed people and it is quite funny to give one to a right-hander. They don’t have a clue! This is because they have not had to learn to use tools with a sinistral configuration, whereas a sinistral has at least a level of competence with right-handed tools since they are everywhere.

The Universe obviously has a basic chirality or handedness, which is a bit odd to say the least. It’s interesting to wonder why chirality exists. Is it just so that we can easily remove corks from bottles of wine? “Thanks, Deity, that’s just what we needed. Will you take a glass?”
Chirality makes it hard for some creatures. Imagine that you are a mollusc with a left-coiled shell and all the other molluscs are right-coiled. You’d find mates hard to come by. Of course, it can’t make reproduction too difficult or the left-coiled molluscs would become extinct within a generation or so. (That’s probably too simplistic, but let’s not quibble).

Chirality allows us to have streams of traffic travelling both ways on a single road, but I’m sure that Deity would have considered that and arranged for some way that we could duplex our highways. Allowing a way for solid matter to inter-penetrate would be one, but perhaps “both directions” implies chirality anyway and in a non-chiral universe there would only ever be one way.
In some universes (and maybe in this one for all I know) there may be even more complex situations. One reflection of a chiral object changes it into another object that can’t be superimposed on the original. A second reflection does allow it to be superimposed on the original. Imagine a universe where three or more reflections are needed to allow the superimposition! In that universe the motorways and autobahns would need at least three carriageways.

Some sub-atomic particles have an interesting spin characteristic. The linked Wikipedia article says in one place:
However, if s is a half-integer, the values of m are also all half-integers, giving (−1)^(2m) = −1 for all m, and hence upon rotation by 2π the state picks up a minus sign.
Rotation through 2π is a rotation through 360 degrees. So the state of a particle with a spin of ½ picks up a factor of −1 on one full rotation. At least I think that’s what that means. Two full rotations and it is back to its original state (or at least, its original phase).
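That minus-sign rule can be played with in a few lines of Python. This is nothing more than the (−1)^(2m) formula from the quote above written out as code, not a simulation of anything physical:

```python
# The phase factor picked up by a spin state with quantum number m
# under a full 2*pi rotation is (-1)**(2*m), as quoted above.
from fractions import Fraction

def rotation_phase(m):
    """Phase factor for a spin-m state after one full 2*pi rotation."""
    two_m = 2 * Fraction(m)
    assert two_m.denominator == 1, "m must be an integer or half-integer"
    return (-1) ** int(two_m)

# Integer spins come back unchanged; half-integer spins pick up a minus sign.
for m in [0, Fraction(1, 2), 1, Fraction(3, 2), 2]:
    print(m, rotation_phase(m))
```

Running it shows the pattern the article describes: every half-integer value of m gives −1, every integer value gives +1.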

In the macro world two reflections of an object create an image that can be superimposed on the original. That means that if you look at the reflection of your reflection you see yourself as others see you rather than your mirror image. You are so much more used to seeing your reflection that it can be difficult to comb your hair while looking at a doubly reflected image of yourself.
Chiral objects are not symmetrical. Reflections of symmetrical objects can be superimposed on the original object however (after being moved and turned). Symmetry and asymmetry are part of the fabric of the universe that we live in and I think that having both allows for much more complexity in our universe than would be possible in a universe without the symmetry/asymmetry dichotomy.

One of the great questions of philosophy is “Why does anything exist?” or “Why does something exist rather than nothing?” The real answer to this question is that no one knows and no one is likely to know. I suppose the Deity might, if the Deity exists Itself. Given that the universe exists, why does it embody the concept of left and right? Again, no one knows or is likely to know, or so it appears at this time.
One reason may be that if asymmetry did not exist one could probably not travel from A to B. If one set out from A towards B in a universe without the symmetry/asymmetry dichotomy, and someone else was travelling in the opposite direction, then to pass one another each traveller would have to pass on one side or the other of the other traveller.

It seems to me that the concept of “side” implies the concepts of “left” and “right”. Asymmetry comes in because one traveller passes to the left or right of the oncoming traveller, who passes to the right or left of you. Symmetry comes in because, to pass one another successfully, both travellers have to keep to their left or to their right.
Of course, that is only true for our universe. It is conceivable that a universe could exist where symmetry and asymmetry do not exist, but it is a universe which we would, most likely, be unable to conceive of, except in the broadest terms.
http://www.gettyimages.com/detail/173800213
Speed
(Posted late again! Whoops!)
Every time I write my 1,000 words it is a challenge, but some times are more of a challenge than others. As I’ve said before, sometimes I know roughly what I want to include, while other times I pick a topic and go for it. This post is one of the latter.
Speed. Anyone who has been on the Internet since the early days knows about speed. When I hear people complaining about the speed of their connection I quietly laugh as I consider the days of dial-up and of 2400 baud modems. A one-megabyte download could take half an hour to an hour, if the connection held up that long.

As an aside, the “skreee, Kaboinga, boinga, boinga, skeeee…” of a dial up modem connecting induces nostalgia in me, though I’d not go back to those days! Today’s Internet is sometimes called the Information Superhighway. The old dial up Internet was much like a dirt road. With potholes.
When one takes a journey, say from one end of the country to the other, one sets out on local roads, which may or may not be congested, then one travels over the Motorways, or the Interstates, or the Autobahns. Then one travels on the local roads at the far end. Any of these may be congested, but the local roads are most likely to be slower to traverse.

The same is true of the Internet. Your local ISP is equivalent to the local roads at your end of the trip, and the target website, or whatever you are connecting to, sits on its ISP’s local network, the local roads at the far end.
If we delve a little further, it is evident that the copper or fibre that makes up the Internet is not really a factor in the speed of the Internet. Signals in fibre travel at a large fraction of the speed of light in empty space, typically around two-thirds of it. Signals in copper also travel at a significant percentage of the speed of light. (I’d put a link in here, but the subject is complex and I found no clear explanation. YMMV).

The real reasons that fibre is preferred over copper are the huge bandwidth and the much smaller attenuation of fibre cables. Bandwidth is often described in terms of how many lanes a highway has. Obviously the more lanes the more traffic a highway can handle. Attenuation is interesting. It’s as if your car starts out in New York in pristine condition, but deteriorates en route to Los Angeles, until it arrives at its destination looking like a bucket of bolts, if it makes it that far.
Whether or not the packets are transmitted by fibre or copper, the signal must somehow be loaded onto the cable, and this takes significant time. The packet of data is placed into a register on the network connector by the computer and a special chip translates that to a stream of bits on the wire, or pulses of light in the fibre.
These pulses then whiz off onto the network. However, they don’t travel all the way in one hop. Your computer connects to a modem device that sends the signal to your ISP, where the signals hit a router. This device reads in a whole packet of data, then sends it off again towards its destination. The packet doesn’t get to its destination in one hop, and there may be a dozen or more hops before it gets there.
At the beginning and the end of each and every hop there is a device that grabs the packet of data off the wire, decides where to send it and puts it on another wire. These devices are called ‘routers’, a term which many people will have heard. As you can imagine, each hop adds a delay (or latency) to the packet of data. These delays are quite small but they add up.
So the Information Superhighway doesn’t look that flash after all. Sometimes I wonder how data actually gets through at all. It’s as if there was a multi-lane highway across the country (the world even), but it is studded with interchanges which take significant time to traverse, increasing the time taken for the trip. Light would take only a few milliseconds to get from here to Sydney if we had line of sight, but over the Internet it takes tens of milliseconds.
Satellites are even worse. To reach a geostationary communications satellite and return takes around a quarter of a second, and a request plus its reply takes close to half a second, an age in computing terms. Of course many communication satellites orbit lower than that, so the delay is not as great, but it is still significant.
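The arithmetic behind these delays is easy to sketch. The snippet below is a back-of-envelope estimate only: the Sydney distance, hop count and per-hop delay are illustrative assumptions, not measurements.

```python
# Back-of-envelope latency figures. The distances and per-hop delays
# below are illustrative assumptions, not measurements.
SPEED_OF_LIGHT_KM_S = 300_000          # in a vacuum, roughly
FIBRE_FRACTION = 2 / 3                 # typical velocity factor in fibre

def propagation_ms(distance_km, fraction=FIBRE_FRACTION):
    """One-way propagation delay over a cable, in milliseconds."""
    return distance_km / (SPEED_OF_LIGHT_KM_S * fraction) * 1000

# Assume roughly 2,200 km line of sight from New Zealand to Sydney.
fibre_only = propagation_ms(2200)

# Each router hop adds processing and queueing delay; assume 1 ms per hop.
hops = 12
total = fibre_only + hops * 1.0
print(f"fibre alone: {fibre_only:.1f} ms, with {hops} hops: {total:.1f} ms")

# A geostationary satellite sits about 35,786 km up. Going up and back
# down is about a quarter of a second, so a request plus its reply over
# a satellite link comes to nearly half a second.
geo_round_trip_s = 2 * propagation_ms(2 * 35_786, fraction=1.0) / 1000
print(f"geostationary request plus reply: {geo_round_trip_s:.2f} s")
```

Even with generous assumptions, the hops contribute a delay comparable to the fibre itself, which is the point of the interchange analogy.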
The advent of streaming services has exacerbated the problem. The basic issue is twofold. To use the motorway analogy, there are many more cars on the road and as a result the interchanges are becoming crowded resulting in congestion. In general the motorways themselves are fast and free-flowing, but the interchanges have not caught up.
An ingenious partial solution is to strategically place machines around the world which effectively distribute the stuff that people want to receive, so that the same content is available locally. These are called content delivery networks. It’s like taking copies of all the pictures in a gallery in Los Angeles and storing the copies in New York, Miami, Washington and so on. Rather than having to go all the way to Los Angeles, a New York viewer sees the picture locally.

So the sources of delay to your streaming of the latest TV shows are many. The first possibility is your own setup. Maybe your network and modem are not up to the task. Secondly there is the telecom network. Tricky stuff happens between you and your ISP which is the province of the telcos. I don’t know the ins and outs of it, but in some cases switching a couple of connections in the roadside cabinet or in the exchange helps.
Then there is your ISP. ISPs will be keeping a close eye on the traffic through their part of the network, but the rapid rise of the streaming services has caught them a little unawares, and some are scrambling to keep up. Then there is the Internet backbone. It is unlikely that there are issues here. Finally there is the target ISP’s network and the target site’s network and the site itself. Any of these could cause issues, but they are way beyond the control of the end user and his/her ISP.
Speeds on the Internet are phenomenal when compared to the early days. Things are much more complex these days. It is amazing what can be achieved, and those of us who have experienced the early days are less likely to whinge about speed issues, as we remember what it was like!
Documentation

Documentation. The “D word” to programmers. In an ideal world programs would document themselves, but this is not an ideal world, though some programmers have attempted to write programs to automatically document programs for them. I wonder what the documentation is like for such programs?

To be sure if you write a program for yourself and expect that no one else will ever look at it, then documentation, if any, is up to you. I find myself leaving little notes in my code to remind my future self why I coded something in a particular way.
Such informal documentation can be amusing and maybe frustrating at times, as when reading someone else’s comments such as “I’m not sure how this works” or “I don’t remember why I did this but it is necessary”. More frequently there will be comments like “Finish this bit later” or the even more cryptic “Fix this bit later”. Why? What is wrong with it? Who knows?

The problem with such informal in-code documentation is that you have to think about what the person reading the code will want to know at that stage. Add to this the fact that when adding the comments the programmer is probably focussing on what he/she will be coding next, and the comments are likely to be terse.
Worse, code may be changed but the comments often are not. The comment says “increment month number” while the code actually decrements it. Duh! A variable called “end_of_month” is inexplicably used as an index into an array or something.
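Comment drift of that sort looks something like this in miniature (a made-up example, with a deliberately stale comment):

```python
def previous_month(month):
    # increment month number   <-- the comment has drifted: the code
    # below actually decrements it now.
    month -= 1
    if month < 1:
        month = 12  # wrap January back round to December
    return month

print(previous_month(1))   # prints 12, despite what the stale comment says
```

A reader who trusts the comment over the code will be led badly astray, which is exactly why drifted comments can be worse than no comments at all.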

Anyone who has ever done any programming to a level deeper than the usual beginner’s “Hello World!” program will know that each and every programmer has tricks which they use when coding, and that such tricks get passed from programmer to programmer with the result that a newcomer looking at code may be bamboozled by it. The comments in the code won’t help much.
Of course such programming tricks may be specific to the programming language used. While the same task may be achieved by similar means at a high level, the lower level of code will be significantly different. While that may seem to impose another barrier to understanding, I’ve found that it is usually reasonably easy to work out what is going on in a program, even if you don’t “speak” that particular language, and the comments may even help!
While internal documentation is hit and miss, external documentation is often even more problematic. If the programmer is forced to write documents about his/her programs, you will probably find that the external documentation is incomplete, inaccurate or so terse that it is of little help in understanding the program.

In my experience each programmer will document his/her programs differently. Programmers like to program, so they will spend the least possible amount of time on documentation. He/she will only include what he/she thinks is important, and of course, the programmer is employed to program, so he/she might get dragged away to write some code and conveniently forget to return to the documentation.
If the programmer is at all interested in the documentation, and some are, he/she will no doubt organise it as he/she thinks fit. Using a template or model might help in this respect, but the programmer may add too much detail to the documentation – a flowchart may spread to several pages or more, and such flowcharts can be confusing and the source of much frustrated page turning.

Of course there are standards for documentation, but perhaps the best documentation of a program would specify the inputs, specify the outputs, and then describe at a high level how the one becomes the other. As I mentioned above, a programmer will probably give too much detail of how inputs become outputs.
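A sketch of what that inputs-and-outputs style of documentation might look like, using a made-up function (the function name and data format here are invented for illustration):

```python
def monthly_totals(transactions):
    """Summarise transactions into per-month totals.

    Inputs:
        transactions: an iterable of (date_string, amount) pairs, where
            date_string is "YYYY-MM-DD" and amount is a number.

    Outputs:
        A dict mapping "YYYY-MM" to the sum of amounts for that month.

    How: each transaction is grouped by the year-month prefix of its
    date and the amounts are accumulated; no validation is done beyond
    assuming the date format above.
    """
    totals = {}
    for date_string, amount in transactions:
        month = date_string[:7]               # "YYYY-MM"
        totals[month] = totals.get(month, 0) + amount
    return totals

print(monthly_totals([("2015-06-01", 10), ("2015-06-15", 5),
                      ("2015-07-01", 2)]))
```

The docstring says what goes in, what comes out, and how one becomes the other, and stops there; the details live in the code itself.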
Documentation tends to “decay” over time, as changes are made to the program and rarely is the documentation revisited, so the users of the program may need to fill in the gaps – “Oh yes, the documentation says that we need to provide the data in that format, but in fact that was changed two years ago, and we now need the data in this format”.

The problem is worse if the programmer has moved on to work elsewhere. Programmers tend to focus on the job in hand, to write the program to do the job required and then move on to the next programming task, so such code comments as there are will reflect the moment they were written. Such comments are likely notes from the programmer to him/herself about the issues at the time the program was being written.
So you get comments like “Create the app object” when the programmer wants a way to collect the relevant information about the data he/she is processing. Very often that is all that one gets from the programmer! No indication about why the object is needed or what it comprises. The programmer knows, but he/she doesn’t feel the need to share the information, because he/she doesn’t think about the next person to pick up the code.

I don’t want to give the impression that I think that documentation is a bad thing. I’m just pondering the topic and giving a few ideas on why documentation, as a rule, sucks. As you can imagine, this was sparked by some bad/missing documentation that I was looking for.
Open source software is particularly bad at this, as the programmer has an urge to get his/her program out there and no equal urge to document it. After all, a user can look at the code, can’t he/she? Oh, he/she could look at the code, but it is tricky to do so for large programs, which will probably be split into dozens of smaller ones, and the user has to be at least a passable programmer him/herself to make sense of it. Few users are.

So I go looking for documentation for version 3.2 of something and find only incomplete documentation for version 2.7 of it. I also know that big changes occurred in the move from the second version of the program to the third undocumented version. Ah well, there’s always the forums. Hopefully there will be others who have gone through the pain of migration from the second version to the third version and who can fill in the gaps in documentation too.

Philosophy and Science
Philosophy can be described, not altogether accurately, as the things that science can’t address. With the modern urge to compartmentalise things, we designate some problems as philosophy and others as science, and conveniently ignore the fuzzy boundary between the two disciplines.
The ancient Greek philosophers didn’t appear to distinguish much between philosophy and science as such, and the term “Natural Philosophy” described the whole field before the advent of science. The Scientific Revolution of Newton, Leibniz and the rest had the effect of splitting natural philosophy into science and philosophy.

Science is (theoretically at least) built on observations. You can’t seriously believe a theory that contradicts the facts, although there is a get-out clause. You can believe such a theory if you have an explanation as to why it doesn’t fit the facts, which amounts to having an extended theory that includes a bit that contains the explanation for the discrepancy.
Philosophy, however, is intended to go beyond the facts. Way beyond the facts. Philosophy asks questions, for example, about the scientific method and why it works, and why it works so well. It asks why things are the way they are and other so-called “deep” questions.
One of the questions that Greek philosopher/scientists considered was what everything is made of. Some of them thought that it was made up of four elements, and some people still do. Democritus had a theory that everything was made up of small indivisible particles, and this atomic theory is a very good explanation of the way things work at a chemical level.
Democritus and his fellow philosopher/scientists had, it is true, some evidence to go on, and to be fair so did those who preferred the four elements theory, but the idea was more philosophical in nature than scientific, I feel. It was evident that while many substances could be broken down into their components by chemical methods, some could not.

So Democritus would have looked at a lump of sulphur, for example, and considered it to be made up of many atoms of sulphur. The competing theory of the four elements however can’t easily explain the irreducible nature of sulphur.
My point here is that while these theories explained some of the properties of matter, the early philosopher/scientists were not too interested in experimentation, so these theories remained philosophical theories. It was not until the Scientific Revolution arrived that these theories were actually tested, albeit indirectly and the science of chemistry took off.

That’s not such a big change as you might think. Philosophy says “I’ve got some vague ideas about atoms”. Science says “Based on observations, your theory seems good and I can express your vague ideas more concretely in these equations. Things behave as if real atoms exist and that they behave that way”. Science cannot say that things really are that way, or that atoms really exist as such.

Indeed, when scientists took a closer look at these atom things they found some issues. For instance the (relative) masses of the atoms are mostly pretty close to integers. Hydrogen’s mass is about 1, Helium’s is about 4, and Lithium’s is about 7. So far so tidy. But Chlorine’s mass is measured as not being far from 35.5.
This can be resolved if atoms contain constituent particles which cannot be added or removed by chemical reactions. A Chlorine atom behaves as if it were made up of 17 positive particles and either 18 or 20 uncharged particles of more or less the same mass. If you measure the average mass of a bunch of Chlorine atoms, it will come out at 35.5 (ish). Problem solved.
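The arithmetic is easy to check. The natural abundances below are approximate real-world figures, roughly 76% chlorine-35 and 24% chlorine-37:

```python
# Chlorine's odd relative atomic mass of ~35.5 falls out of a weighted
# average over its two stable isotopes. Abundances are approximate.
isotopes = [
    (35, 0.758),   # chlorine-35: 17 protons + 18 neutrons
    (37, 0.242),   # chlorine-37: 17 protons + 20 neutrons
]
average_mass = sum(mass * abundance for mass, abundance in isotopes)
print(f"average atomic mass of chlorine: {average_mass:.2f}")  # ~35.48
```

The weighted average lands close to the measured 35.5, which is the "problem solved" of the paragraph above.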

Except that it has not been solved. Democritus’s atoms (the word means “indivisible”) are made up of something else. The philosophical problem is still there. If atoms are not indivisible, what are their component particles made of? The current answer seems to be that they are made of little twists of energy and probability. I wouldn’t put money on that being the absolute last word on it though. Some people think that they are made up of vibrating strings.
All through history philosophy has been raising issues without any regard for whether or not the issues can be solved, or even put to the test. Science has been taking issues at the edges of philosophy and bringing some light to them. Philosophy has been taking issues at the edge of science and conjecturing on them. Often such conjectures are taken back by science and moulded into theory again. Very often the philosophers who conjecture are the scientists who theorise, as in famous scientists like Einstein, Schroedinger and Hawking.

The end result is that the realm of philosophy is reduced somewhat in some places and the realm of science is expanded to cover those areas. But the expansion of science suggests new areas for philosophy. To explain some of the features of quantum mechanics some people suggest that there are many “worlds” or universes rather than just the one familiar to us.
This is really in the realm of philosophy as it is, as yet, unsupported by any evidence (that I know of, anyway). There are philosophers/scientists on both sides of the argument so the issue is nowhere near settled and the “many worlds interpretation” of quantum mechanics is only one of many interpretations. The problem is that quantum mechanics is not intuitively understandable.

The “many worlds interpretation”, at least so far as the Wikipedia article goes, views reality as a many-branched tree. This seems unlikely, as probabilities are rarely as binary as a branched tree. Probability is a continuum, like space or time, and it may be that any event is represented on dimensions of space, time, and probability.
I don’t know if such a possibility makes sense in terms of the equations, so that means that I am practising philosophy and not science! Nevertheless, I like the idea.

Television
Television as a medium is less than one hundred years old, yet in the sense of a broadcast over radio waves, it seems doomed as the rise of “streaming” sites takes over the role of providing the entertainment traditionally provided by broadcast television.
My first recollection of television was watching the televising of the coronation of Queen Elizabeth the Second. I can’t say that I was particularly interested at the time, but I do remember that what seemed like a large number of people (probably 20 or so, kids and adults) crowded on one side of the room while the television across the room showed its flickering images on its nine-inch screen.

I remember when at a later date my father brought home our first television. It was a large brown cabinet with a tiny noticeably curved screen. When it was set up properly and working, it displayed a black and white image on a screen which was smaller than the screen of an iPad.
The scan lines on the screen were easily visible, and the circuits that generated the scan were unstable, so the picture would flicker, roll from top to bottom and tear from left to right. Then someone would have to jump up and twiddle some knobs on the rear of the set to adjust it back into stability, or near stability.
To start with, many people did not have aerials on their roofs. For one thing, television was new, and for another the aerials were huge. They were generally large constructions, either in an X shape or an H shape, several feet in length. Most people started with an internal aerial, the so-called “rabbit ears” aerial.
These were small, low down and generally didn’t work too well as they were nowhere near comparable to the wavelength of the transmitted signal. Nevertheless they enabled people to, in most cases, get some sort of a picture on their new televisions.
The trouble was that with a weak signal and unstable circuits, the person leaning over the television to tune it more often than not affected the circuits and signal. With the rest of the family yelling instructions and a clear(-ish) picture on the screen, it only took the person tuning the set to step away for the picture to be lost again.
Of course soon everyone had an aerial on the roof, and the aerials shrank in size as television moved to higher frequencies and as the technology improved. The classic television receiver aerial is a bristly device, sometimes with a smallish mesh reflector: one dipole, a reflector and several directors, which pretty obviously points towards a television broadcast station.

Many towers sprang up on the tops of convenient hills to provide the necessary coverage, and it is a rare place these days where the terrain or other problems prevent the reception of a television signal. Even then, coverage could probably be obtained by the use of satellite technology.
However, after several decades of dominance the end of the broadcast network looks like it is in sight. The beginning of the end was probably signalled by the Video Cassette Recorder, which enabled people to record programs for viewing later. People were no longer tied to the schedule of a broadcaster, and if they wanted to watch something that was not on the schedule, they went to a store and hired it.

The video cassette stores look like having an even shorter lifetime than broadcast television itself. Of course most of them have switched to DVD as the medium, but that doesn’t make a significant difference.
What does make a difference is the Internet. Most people are now connected to the Internet in one way or another, and that is where they are getting a major part of their entertainment: music, news, films, games. Increasingly it is also where they are getting their TV-style entertainment, what would otherwise be called “TV series”.

TV companies produce these popular series, an example of which would be “The Big Bang Theory”. This show has run for years and is still very popular on television, but it is also available for download (legitimately) from one or more companies set up expressly for the purpose of providing these series online, on the Internet.
In countries at the end of the world, like here, it takes months or even years for the latest episodes to be broadcast, if they ever are. So more and more people are downloading the episodes directly from the US, either legitimately or illegitimately.

This obviously hits at the revenues of the companies that make these costly shows, so, equally obviously they are trying to prevent this drain on their revenues. The trouble is that there is no simple way of ensuring that those who download these programs are paying for the service. If they are paying and the supplier is legitimate then presumably the supplier will be paying the show producers.
Once an episode is downloaded, then it is out of the control of the show’s producers. The recipient’s ethics determine if he will share it around to his friends or keep it to himself. If thousands of people (legitimately) download it, then presumably some of the less ethical will then share it on, and it soon becomes available everywhere for free.

It will at some stage reach a point where broadcasting a television program is no longer economic. The producers will have to primarily distribute their programs via the Internet and somehow limit or discourage the sharing of the programs around. That would mean the end of TV broadcasting as we know it.
We are not anywhere near that situation yet, and the program production companies will have to come up with a new economic model that allows them to make a profit on the shows without broadcasting them over radio waves. The more able companies will survive, although they may be considerably smaller. TV actors will only be able to demand much smaller salaries, and budgets will be tighter.

Another factor that the program production companies will have to take into account would be loss of advertising revenue. Losing advertisers can scuttle a television show, so this is not a minor factor.
Whatever happens in the long term, as I said above, a new economic model is necessary. I’ve no idea what this will look like, but I foresee the big shows moving to the Internet in a big way.
Broadcast TV will continue for some time, I think, as there are people who would resist moving away from it, but it is likely to be much reduced, with less new content and more reruns. Broadcast TV may even be reduced to a shop window, with viewers seeing the previews and buying a series with a push of a button on their smart TVs.
http://www.gettyimages.com/detail/492696777
Cooking

Most people have a hand in food preparation at some time in the day. Even those who subsist on "instant meals" will at least zap them in the microwave for the necessary amount of time. Some people, however, cook intricate dishes, for their own amusement or for friends and families.
Most people eat cooked food, although there is something of a fad for raw food at the present time. All sorts of diets are also touted as having some sort of benefit for the food conscious, all of which seem bizarre when one considers that many, many people around the world are starving.
Cooking can be described as applied chemistry, as the aim of cooking is to change the food being cooked by treating it with heat in one way or another. All the methods of treatment are given names, like "boiling", "baking" or "roasting". In the distant past no doubt such treatments were hit and miss, but these days, with temperature-controlled ovens and ingredients which are pretty much consistent, a reasonable result can be achieved by most people.

I'd guess that the first method of cooking was to hold a piece of meat over a fire until the outside was charred and much of the inside was cooked. However, human ingenuity soon led to spit roasting and other cooking methods. A humorous account of the accidental discovery of roasting a pig was penned by Charles Lamb. In his account the discovery came when a pig sty accidentally caught fire, and consequently, as the idea of roast pork spread, this led to a rash of pig sty fires, until some sage discovered that houses and sties did not need to be burnt down and it was sufficient to hang the pig over a fire.
I suspect that while roasting may have been invented quite early by humans, cooking in water would have come along a lot later, as more technology is needed to boil anything. A container is needed, and while coconut shells and mollusc shells can hold a little water, and folded leaves would do at a pinch, it was the invention of pottery that advanced the art and science of cooking immensely.
Although the foods that we eat can pretty much all be eaten raw, most people find cooked food much more attractive. Cooked food smells nice. The texture of cooked food is different from the texture of raw food. I expect cookery experts are taught the chemical reactions that happen in cooking, but I suspect that cooking breaks down the carbohydrates, the fats, and the proteins in the food into simpler components, and that we find these simpler chemicals easier to digest.
Maybe. That doesn’t explain why cooked food smells so much nicer than raw food. If food is left to break down by itself it smells awful, rotten, and with a few exceptions we don’t eat food that has started to decay.
Maybe the organisms that rot food produce different simpler components, or maybe the organisms produce by-products that humans dislike. Other carnivores don't seem to mind eating carrion, and maybe a rotting carcass smells good to them.

The rules of cooking, the recipes, have no doubt been developed by trial and error. It is likely that the knowledge was initially passed from cook to cook as an oral tradition. After all, cooking is likely to have started a long time before reading and writing were invented. Since accurate measurements were unlikely to be obtainable, much of the lore of cooking would have been vague, and a new cook would have had to learn by cooking.
However, once the printing press was invented, after all the bibles and clerical documents had been printed, I would not be surprised to learn that the next book to be printed would have been a cook book. I’ve no evidence for this at all though!

Cooking changes the texture of meat and vegetables, making them softer and easier to eat. Connective tissues in particular break down, making a steak, for example, a lot more edible. Something similar happens to root vegetables: swedes, turnips, carrots and parsnips. These vegetables can be mashed or creamed once they are cooked, something that cannot be done to the rather solid uncooked vegetables.
Cooking is optional for some foods – berries and fruits, for example. Apples can be enjoyed raw, when they have a pleasant crunch, or cooked in a pie, when they are sweet and smooth. Babies in particular love the sweet smoothness of cooked apple, and for many of them puréed fruits or vegetables are their first "solid" foods.
Chicken eggs are cooked and eaten in many different ways. The white of an egg is made partly of albumen, and when this is cooked it changes from translucent, almost transparent, to an opaque white. Almost everyone will have seen this happen when an egg is cracked into a frying pan and cooked until the clear "white" of the egg turns to the opaque white of the cooked egg.
Many other items change colour to some extent when cooked, but the change in the white of the egg is the most apparent. Pair that with bread which is slightly carbonised on the outside, covered in the coagulated fat from cow's milk (butter), and you have a common breakfast dish – fried eggs on toast.

There's a whole other type of cooking – baking – that relies at least partly on a chemical reaction between an alkali (baking soda, or sodium bicarbonate) and an acid (often "cream of tartar", which is weakly acidic). When the two are mixed in the presence of water, carbon dioxide gas is given off, leading to gas bubbles in the dough. When the dough is cooked the bubbles are trapped inside the stiffening dough, giving the baked cake its typical spongy texture.
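For the chemically curious, that reaction can be sketched in a single line. I'm assuming cream of tartar (potassium bitartrate) as the acid here – other acids would give a slightly different equation:

```
NaHCO3 + KHC4H4O6  →  KNaC4H4O6 + H2O + CO2
(baking soda + cream of tartar → sodium potassium tartrate + water + carbon dioxide)
```

It's the carbon dioxide on the right-hand side that does the work of raising the dough.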
Some cooking utilises biological reactions. When yeast, a fungus, is placed into a liquid containing sugar, it metabolises the sugar, releasing carbon dioxide, and creates alcohol. In bread making this alcohol is baked off, but it may add to the attractiveness of the smell of newly baked bread. In brewing the alcohol is the main point of the exercise, so it is retained. It may even be enhanced by distillation.
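The yeast process above can also be summed up in one line – assuming glucose as the sugar, which is the textbook simplification – as the classic fermentation equation:

```
C6H12O6  →  2 C2H5OH + 2 CO2
(glucose → ethanol + carbon dioxide)
```

In baking it's the carbon dioxide that matters; in brewing it's the ethanol.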
http://www.gettyimages.com/detail/523818827
I’ve just touched on a few highlights as regards the mechanisms of cooking (and brewing!), but I’ve come to realise as I have been writing this that there are many, many other points of interest in this subject. The subject itself has a name and that name is “Molecular Gastronomy”. A grand name for a grand subject.




