
The Blacksmith and the Bookkeeper, Part 3

October 12, 2004


(Part 3 of a 3-part series)

Contents
The Evolving Programmer
Rules-based and Human-competitive Machine Intelligence
Hyper-human Services
Denouement

Thanks, thanks to thee, my worthy friend,
For the lesson thou hast taught!
Thus at the flaming forge of life
Our fortunes must be wrought;
Thus on its sounding anvil shaped
Each burning deed and thought.

- From "The Village Blacksmith" by Henry Wadsworth Longfellow

The blacksmith immortalized by Longfellow, once an historically vital, centrally necessary vocation that enabled social and technological innovation over epochs of human history, is now practically extinct.  The similarly vital role of bookkeeper, which shares much with the blacksmith in terms of ancient historical roots and its function as a commercial enabler, still thrives in the Network Age.  The third and final article in this series discusses the evolving role of the computer programmer and speculates on the likely fate of the postmodern programmer -- blacksmith or bookkeeper?

The Evolving Programmer

Do you write code?  Not can you write code, but do you?  When was the last time you cracked open one of Knuth's volumes (http://www-cs-faculty.stanford.edu/%7Eknuth/taocp.html)?  Do you even know who Knuth (http://www-cs-faculty.stanford.edu/%7Eknuth/) is?

There was a time when programming meant the encoding of finely-honed algorithms, parsimoniously pruned to the fewest possible instructions; algorithms which would then also utilize the least possible memory during execution.  In the early days of digital logic and microprocessors, such approaches were necessary in light of hardware limits -- what today would be considered quite draconian limits.  Imagine, for example, the programming skills that were needed to enable the onboard Apollo 11 Guidance Computer, which successfully landed the first men on the moon:

"The on-board Apollo Guidance Computer (AGC) was about 1 cubic foot with 2K of 16-bit RAM and 36K of hard-wired core-rope memory with copper wires threaded or not threaded through tiny magnetic cores. The 16-bit words were generally 14 bits of data (or two op-codes), 1 sign bit, and 1 parity bit. The cycle time was 11.7 micro-seconds. Programming was done in assembly language and in an interpretive language, in reverse Polish. Scaling was fixed point fractional.  An assembly language ADD took about 23.4 micro-seconds. The operating system featured a multi-programmed, priority/event driven asynchronous executive packed into 2K of memory."  --  "Apollo 11: 25 Years Later" by Fred H. Martin, Intermetrics, Inc., July 1994
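
The "fixed point fractional" scaling mentioned in the quote is worth a moment of unpacking for readers raised on floating point.  What follows is a minimal sketch in Java (chosen for this site's readership, and obviously not what the AGC ran) of how a word with 14 data bits and a sign bit can represent fractions in [-1, 1), and why addition on such words is cheap.  The class and its constants are my own illustration, not an emulation of actual AGC arithmetic.

    /**
     * A rough illustration of fixed-point fractional scaling: 14 data
     * bits plus a sign bit give fractions in [-1, 1) with a resolution
     * of 2^-14.  A sketch for intuition only, not AGC emulation.
     */
    public class FixedPointFraction {
        private static final int DATA_BITS = 14;
        private static final int SCALE = 1 << DATA_BITS; // 16384

        /** Encode a real number in [-1, 1) as a scaled integer. */
        static int encode(double value) {
            return (int) Math.round(value * SCALE);
        }

        /** Decode a scaled integer back to a real number. */
        static double decode(int word) {
            return (double) word / SCALE;
        }

        public static void main(String[] args) {
            int a = encode(0.25);    // 4096
            int b = encode(-0.125);  // -2048
            // Addition operates directly on the scaled integers --
            // one reason an AGC ADD could be so fast.
            System.out.println(decode(a + b)); // prints 0.125
        }
    }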

The vast majority of coding today is ignorant of such
constraints.  A 2K limit for even the simplest of applications --
even those written in Java, which was ostensibly designed to minimize
an executable's footprint -- would today be considered absurd. 
And that's just for the minimalist application.  Never mind the
JVM, which is sort of required for anything meaningful to
occur.  But an entire operating system squeezed into 2K?  It
is obvious that the skills required for a successful programmer in 1969
are very different from the skills required for a successful
programmer today.

Perhaps the "wire bender" crowd -- those programmers who today work
with embedded systems -- still exhibits traits of the first real
programmers.  Embedded systems programming today must still
sometimes acknowledge the existence of hard limits, albeit subject to
resource constraints vastly greater than those which tested our
fathers.   Indeed, embedded systems programmers seem to be
quite a href="http://www.javadevices.org/javadevices/resources/images/venn_diagram_large.pdf">different
species from the enterprise norm.  But
there can be no mistaking any sub-species of the postmodern
programmer with those pioneers responsible for the onboard Apollo-era
systems.  The pomo coder stripe has evolved into something quite
different than what it once may have been.  Generally speaking,
the programming skills required to
put men on the moon in 1969 would be as superfluous today as would
Network Age programming sensibilities have been then.  Since
innovation alters the coding fitscape, which in turn places pressure on
the resident species to adapt, it should be equally clear that further
innovation will have similar results insofar as programming orders are
concerned ... which is to say that the evolution of programming must
therefore continue.

What does it mean to "write code" today?  Using Netscape 7.1 Composer to author this article, I did sometimes click on the Source tab and make adjustments to the HTML code found there.  Am I writing code?  Strictly speaking, does HTML even qualify as "code"?  If so, then what about the encoding of a particular application using a tool like a spreadsheet?  Variables are specified, functions enjoined, and utility served ... do applications created thusly qualify as such?  Are those who produce them programmers?  If so, then don't similar arguments hold true for (for example) the producer of a PowerPoint presentation? ... or a web page built with some WYSIWYG tool?  If not, why not?
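
For the sake of the argument, consider how little separates a spreadsheet formula from what we conventionally call code.  The discount rule below is a hypothetical example of my own invention; the formula and the Java method specify the same variable, enjoin the same function, and serve the same utility.

    /**
     * A spreadsheet cell might hold a formula such as
     *
     *     =IF(B2 > 100, B2 * 0.9, B2)
     *
     * i.e., apply a 10% discount to amounts over 100 (a hypothetical
     * rule, invented purely for illustration).  The method below
     * encodes exactly the same logic.
     */
    public class DiscountCell {
        static double discountedAmount(double amount) {
            return amount > 100 ? amount * 0.9 : amount;
        }

        public static void main(String[] args) {
            System.out.println(discountedAmount(150.0)); // 135.0
            System.out.println(discountedAmount(80.0));  // 80.0
        }
    }

If the formula's author is not a programmer, it is not obvious why the method's author is.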

The point is that the nature of programming has changed considerably
since those early days, which really weren't that long ago.  Given
Moore's Law and ever-reducing resource
limitations, the need for really tight algorithms fades; thus,
programming has changed.  Given the hard costs and time required
to develop custom applications versus the increasing supply of
off-the-rack, best-practices-laden business components (which I can
lease at a fraction of the development cost), programming changes
continue.  A
dwindling percentage of the shrinking IT workforce could today even
qualify as programmers from the Apollo 11 perspective.  The
dizzying speed at which computing technology is changing the world is
having a similarly profound effect on the very vocation that is
central to its nature.  Just as the blacksmith steadily
created custom components for machines that would effectively replace
him altogether, the postmodern programmer is coding himself into a
corner(-case) job market, destined to fight for the table scraps
which the machine-intelligence purveyors will not stoop to
consume.

Rules-based and Human-competitive Machine Intelligence

In the previous two installments of this series, the history of the blacksmith trade was discussed, and a rationale for its demise proffered.  A highly-automatable Vocational Knowledge Base (see AVKB in the previous article, http://today.java.net/pub/a/today/2004/09/28/Blacksmith2.html), which the practicing smithy exposed to a machine-hungry world, made the trade low-hanging fruit from a productivity-enhancement perspective.  Indeed, the unwitting blacksmith was handmaiden to his own demise, just as the postmodern programmer (perhaps ironically) is today.  The fact that the nature of programming is changing even as the demand for programming skills has abated, despite an otherwise increasing global demand for IT-related products, should be a glaring indicator and something of a concern to skeptics; even the computer industry is not immune from the ephemeralizing virtuous cycles wrought by IT adoption.  And of course the dark side of dramatically increasing productivity must be higher unemployment, at least in the short term.  Until innovation gives rise to market demands (and skills to meet those demands) that simply didn't exist before, economic displacement is inevitable.  But there's more that Moore's Law has in store, beyond even the dream of effectively limitless memory and compute cycles ...

Artificial Intelligence ... AI ... may be a red herring.  In truth, the case for pomo smithy cum coder need not rely on anything so speculative or scorned.  But it is interesting to momentarily reflect on the thesis behind the much-maligned AI journey and where it has led and may yet lead.

In a 1948 paper entitled "Intelligent Machinery," the stepfather of modern computer science, Alan M. Turing (http://www.turing.org.uk/turing/), envisioned a machine that could rival human intelligence; virtually all AI efforts for the past 50 years have leaned on Turing's thoughts.  He identified three broad approaches by which human-competitive machines could be achieved:

  1. Intellectual (logic-driven) searches:  Turing's own work (the Turing Machine) laid the foundations for a logic-based approach to intelligence
  2. Cultural (knowledge-based) searches:  A body-of-knowledge (or Expert System) approach to intelligence
  3. Genetical (evolutionary) searches:  Intelligence discovered via a "genetic" approach (http://www.talkorigins.org/faqs/genalg/genalg.html)

Turing's observation that "intellectual activity consists of various kinds of searches" has set the tone and direction for AI research ever since, with debatable success.  While a human-competitive machine still seems a long way off, the commercial uptake of both logic-driven machines and expert systems cannot be denied.  The third approach is probably the least understood and the most compelling, if a truly human-competitive AI system is ever to emerge -- beyond Deep Blue, image analysis, and some interesting expert systems.

Eighty percent of Google hits for "human-competitive machine intelligence" also include the name John R. Koza, the Stanford University professor who has been a champion of genetic programming since the mid-1980s.  A logical cousin of genetic algorithms (spawned by John H. Holland, the world's first Ph.D. in computer science), genetic programming environments utilize evolutionary patterns such as crossover, selection, replication and mutation, often resulting in surprisingly intelligent solutions.  A teleology is required -- in other words, selection criteria must specify a goal or problem to be solved.  And sometimes the magic just doesn't work.  But with each successive generation -- constrained primarily by the compute cycles and memory available to throw at the problem -- genetic programming is getting better.  (A minimal sketch of the evolutionary loop appears after the list below.)  According to the most recent publication from Koza and company, the progressively more sophisticated results produced by genetic programming over the 15-year period from 1987 to 2002 include:

  • solving toy problems
  • producing human-competitive (creative) results (but not
    previously patented or patentable inventions)
  • duplicating 20th-century patented inventions
  • duplicating 21st-century patented inventions
  • creating patentable new inventions
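
To make the evolutionary loop concrete, here is a minimal sketch in Java of a genetic algorithm -- the simpler cousin of Koza's genetic programming, which evolves program trees rather than bit strings.  The target length, population size, and mutation rate are arbitrary illustrative choices of mine, drawn from no particular system; the point is the teleology: a fitness function specifies the goal, while selection, crossover, and mutation do the searching.

    import java.util.Random;

    /**
     * A minimal genetic-algorithm sketch.  The "teleology" is the
     * fitness function: count how many bits of a candidate match a
     * hidden target string.  Selection, crossover and mutation then
     * search for it.  All parameters are illustrative choices.
     */
    public class TinyGeneticSearch {
        static final Random RNG = new Random(42);
        static final int LENGTH = 32;             // bits per individual
        static final int POP_SIZE = 50;
        static final double MUTATION_RATE = 0.02;
        static final boolean[] TARGET = randomBits(); // the "goal"

        static boolean[] randomBits() {
            boolean[] bits = new boolean[LENGTH];
            for (int i = 0; i < LENGTH; i++) bits[i] = RNG.nextBoolean();
            return bits;
        }

        /** Fitness: number of bits matching the target. */
        static int fitness(boolean[] candidate) {
            int score = 0;
            for (int i = 0; i < LENGTH; i++)
                if (candidate[i] == TARGET[i]) score++;
            return score;
        }

        /** Tournament selection: the fitter of two random individuals. */
        static boolean[] select(boolean[][] pop) {
            boolean[] a = pop[RNG.nextInt(POP_SIZE)];
            boolean[] b = pop[RNG.nextInt(POP_SIZE)];
            return fitness(a) >= fitness(b) ? a : b;
        }

        /** Single-point crossover followed by per-bit mutation. */
        static boolean[] breed(boolean[] mom, boolean[] dad) {
            int cut = RNG.nextInt(LENGTH);
            boolean[] child = new boolean[LENGTH];
            for (int i = 0; i < LENGTH; i++) {
                child[i] = (i < cut) ? mom[i] : dad[i];   // crossover
                if (RNG.nextDouble() < MUTATION_RATE)     // mutation
                    child[i] = !child[i];
            }
            return child;
        }

        public static void main(String[] args) {
            boolean[][] population = new boolean[POP_SIZE][];
            for (int i = 0; i < POP_SIZE; i++)
                population[i] = randomBits();

            for (int gen = 0; gen < 500; gen++) {
                boolean[][] next = new boolean[POP_SIZE][];
                for (int i = 0; i < POP_SIZE; i++)
                    next[i] = breed(select(population), select(population));
                population = next;

                for (boolean[] individual : population)
                    if (fitness(individual) == LENGTH) {
                        System.out.println("Target matched at generation " + gen);
                        return;
                    }
            }
            System.out.println("No perfect match within 500 generations.");
        }
    }

Note what is absent: nobody writes the solution.  The fitness function states the goal, and the loop discovers a match.  Scale the representation up from bit strings to program trees, and you have the shape of Koza's claim.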

Note at least four interesting implications of Koza's work:

  1. patentable new inventions
    may very well be the acme of human-competitive activities -- a Turing
    Test if ever there was one
  2. some forms of creativity and innovation are not exclusively
    human activities
  3. Moore's Law all but ensures that genetic programming may
    effortlessly succeed where other approaches to AI have yet to; it is,
    perhaps,
    only a matter of more compute cycles
  4. no programming -- or programmers -- are required

Okay, yes, some programming
will be needed <sigh>.  A few geeky, old-fashioned hackers
(in the Eric Raymond sense of the word) would be nice to have around,
if for no other reason than to marvel at the results of our Darwinian
black box, and provide a few tweaks here and there.  But nothing
like the number of workers that may have been needed to (for example)
steady the world's Y2K fears.  Those days -- and the demands of
those days -- are gone.  Once the genetic machines are encoded,
they would ostensibly even have the capacity to heal themselves ...

We no longer harvest the fields with
hired hands.  We have our machines.  And our machines were
designed by machines, built by machines, and are sold, distributed and
serviced by machines.   No blacksmiths are needed.

It is happening steadily -- to some, it may not appear to be happening at all.  I'm quite sure there were still blacksmiths in 1904 who, when confronted with the informed observation that machines would soon reduce their dwindling numbers to near zero, still scoffed.  A ubiquitous abundance of imagination and absence of denial is not a hallmark of our species; like frogs, we tend to succumb to the heat when boiled slowly.  Though we may never notice our daily rate of aging, we nevertheless grow old.  Change, as such, is inevitable.

Hyper-human Services

So whither goeth the pomo coder?  How will the programming trade
evolve?  If indeed early 21st century workforce reductions are
harbingers of IT things to come, what skills will be in demand?

Futurist Richard W. Samson has proposed a set of attributes that he believes will be characteristic of "in-demand" trades of the near future.  In a recently published article in The Futurist magazine (September-October 2004) entitled "How to Succeed in the Hyper-Human Economy," Samson posits that a new class of jobs involving difficult-to-automate "hyper-human" skills (beyond know-how) will be the successor to post-Industrial Age know-how-based professions.  What are hyper-human skills?  Those (sometimes ineffable) attributes of being "alive" that are currently very difficult to automate.  Things like:

  • creativity and imagination
  • love and empathy
  • subjectivity
  • wanting, valuing, intending
  • social skills
  • hypothesizing

These hyper-human functions, according to Samson, will become increasingly valuable as know-how-based activities become as automated as agriculture and manufacturing are today.

If Samson is correct, today's coder will evolve into something more
like a concierge than an expert system.  The ability to discern a
customer's unarticulated problems will become more important than an
accurate implementation of a specification -- at least from a valued
skills perspective.  Intuition about a client's needs will replace
algorithmic agility as a premium ability.  Big picture knowledge
will be valued more than narrow-and-deep expertise.  And the
ability to write code of any kind will be about as valuable as the
ability to code in assembler today.  Yes, there will be a demand
for those skills, just as there is a demand for farm laborers
today.  But the demand for enterprise level IT coding skills will
be nowhere near what it was at its peak -- which,  in retrospect,
was probably 1999.  That party, it seems, is over.

The fate of the bookkeeper in a hyper-human economy may ultimately be just as tenuous as that of the un-evolved programmer, although to a great degree, the bookkeeping function already presumes a basis in hyper-human attributes.  My relationship with my accountant is as personal as that with my priest.  His empathy, discretion and creativity are as important to me as his topical knowledge.  Arguably, he has already evolved well beyond his know-how-based ancestor who kept the books for Longfellow's smithy.  Indeed, the difference between the blacksmith and the bookkeeper was never one of know-how, but rather the extent to which hyper-human attributes embossed their particular vocations.  It was the extent to which hyper-human attributes were enjoined that, in the end, spelled the virtual extinction of the one versus the evolved survival of the other.

Denouement

This series began with the first stanza of a poem which, when published
in 1841, romanticized and immortalized the role of a vital trade that
we might have completely forgotten otherwise.  So many other
vocations have come and gone without such celebration: dowsers, mule
skinners, higglers and scribes have all seen their day, made a living
when it was possible, and moved on as needed when economic fitscapes
demanded such evolution.  The blacksmith, however, provides a lovely metaphor in so many ways: self-reliant, centrally vital to the community, commanding iron and flame with sinew and hammer, the smithy iconified so much of what was and is vital to our notions of truth and virtue.  The metaphor is also salient in that the smithy tended the fields of his own demise.  Programmers today are ironically doing the very same thing, by shepherding the ephemeralizing wake of unrelenting IT virtuous cycles into increasingly broader spheres of economic impact.  The pomo coder is a dying breed, at least insofar as the job market is concerned.  Rapidly increasing productivity does indeed have a dark side; economic displacement is inevitable, as innovative job creation is not automatic.

In the end, the blacksmith and the bookkeeper are only convenient
metaphors, not to be confused with inherently meaningful symbols. 
Any number of contrasting metaphors may have served just as well. 
The point of this series, however, has been to examine the diminishing
role of the computer programmer in the Network Age, which presumably
has only just begun.  In an age in which IT continues to relentlessly explore the boundaries of human existence, the shrinkage of the IT workforce must be viewed with at least a modicum of interest, if not amusement, and perhaps a bit of concern.  Hopefully this series has been of some value in that regard.

Max Goff is the Chief Marketing Officer for Digital Reasoning Systems, Inc., headquartered in Nashville, Tennessee.