Nihil est in intellectu quod non sit prius in sensu
Nothing is in the intellect that is not first in the senses
In her essay "The Death of the Sensuous Chemist", Lissa Roberts tells a riveting story of a crucial period in the history of science that has Antoine Lavoisier as one of its protagonists. Prior to this time, chemists used their senses to analyze the materials they worked with. It was not uncommon for a chemist to taste urine and determine its properties from what was plainly available to the senses. The training of a chemist involved years of developing a literacy of nature through the senses: gas concentrations, for example, would be determined by factors such as smell or temperature. This education was a lengthy process in which the apprentice had to be sensitized to the point that they could use their bodies to make all the subtle discernments necessary to pursue chemistry with success.
Lavoisier was essential in formalizing the new science of chemistry, promoting an innovative synthesis of principles introduced by a network of colleagues in the eighteenth century: laboratory equipment, experimental techniques, pedagogical approaches and a language in which to discuss findings.
Analysts in the new chemistry had to master new experimental techniques that required them to subordinate and discipline their own bodies in the service of the material technologies of their laboratories.
This is not to say that chemists stopped smelling, tasting, touching or listening altogether in the course of their work. But knowledge gained in this way came to be regarded as unreliable, arrived at by intuition and, more importantly, as non-discursive, given the precise measurements demanded by the language of the new chemistry. The machine began to be seen as the primary means of attaining trustworthy scientific knowledge, and technology became the means by which knowledge could transcend the body.
The Cartesian Machine
The most decidedly modern of machines is the digital computer, but the computer is a particular kind of machine in that it is not designed to extend human limbs, make them stronger or protect humans from predators. Like the abacus and other technologies before it, the computer is a machine designed to extend the power of human reason.
The use of reason as the principal means of attaining knowledge is a major pillar of Western philosophy. As Jonathan Israel argues, "after 1650, everything, no matter how fundamental or deeply rooted, was questioned in the light of philosophic reason". The philosophical milieu of the time was perhaps best summarized by Descartes' transcendent statement "Je pense, donc je suis" ("I think, therefore I am"). It is only through intellectual reasoning that humans can know themselves. The breach between mind and body in Western philosophy widened substantially during this period.
What more evidence of extreme Cartesianism in the current technological worldview does one need than the dualism between software and hardware? The digital computer, this most ubiquitous of technologies, is a manifest case of the way technology has developed under the Cartesian worldview.
The computer can be said to be the most Cartesian of machines, for the human capacity that it codifies is logical inference itself, and its distance from embodied operation is greater than that of any technology before it.
Mechanical Turk as the last job standing
It has been one of the most elusive promises of science that, within twenty years, computers will be as smart as humans. This particular kind of techno-utopia continues to be promoted by the aging rearguard of the Artificial Intelligence (AI) community.
In the utopian landscapes proposed by this promise, humans are freed of mundane repetitive labour, work is performed by ever more intelligent machines and humans have plenty of free time to enjoy the wealth generated by the work of machines.
The currently existing relationship between human and machine is far from the promise of AI. The machine now pervades every aspect of the human’s working life, and far from making space for more free time, the machine is now portable enough, and offers sufficient connectivity, to make work possible anywhere, anytime. Humans take this feature as an opportunity and work at all times with careless abandon. The shift in labour is not only quantitative but qualitative as well. Machines perform tasks at the convenience of humans, and humans do what machines cannot yet do. As machines become more capable in a general sense, humans become more specialized to fill the gaps the machine leaves.
One example in which the machine’s capability has surpassed that of the human is cancer diagnosis. It turns out computers are more accurate in the diagnosis of breast cancer than trained humans are. It would therefore be irresponsible not to adopt machines for cancer diagnosis: more accurate detection of early signs would save a greater number of human lives, and continuing to give this job to a human would be not only suboptimal but irresponsible.
As the machine further erodes the territories that humans reigned over in years past, it becomes inefficient, unviable and at times even immoral for humans to continue doing certain types of work. Machines, however, are not yet capable of carrying out all necessary work by themselves.
The use of human intelligence to perform tasks that computers are currently unable to do is becoming a niche market. Humans performing such tasks are called Mechanical Turks. Global online retail giant Amazon offers HITs (Human Intelligence Tasks) for humans to perform in exchange for money. At the time of this writing, some of the tasks listed include: rating the credibility of a piece of content, discerning dialects from streams of Arabic text, writing an engaging callout, describing the color of a product, rating images for adult-only content, and proofreading a publication. At any given time there are around 3000 HITs listed on the Amazon marketplace. Some of these HITs are in fact perfectly doable with a computer today, and soon most of them will be.
It might seem a far-fetched idea that the last jobs in which human labour will be required will be those assisting machines, but the current trend is to incentivize this kind of human-machine relationship in labour, and there are no signs of this trend reversing.
The codification of civilization
In one of his lectures, Daniel Dennett showed a slide containing an old instruction manual for elevator operators, from back in the day when operating an elevator was a job carried out by a human. It contained clear and concise instructions such as "emergency exits at either side of the car must be closed when in motion". Dennett spoke of how these manuals have progressively been codified into technologies, as either electronic devices or software. In the process of codification a great deal is lost. Certain directives are dropped, some new ones created; ambiguity and moral judgement are replaced with feedback loops.
In the words of Simon Penny “any tool, soft or hard, is a mechanistic approximation of a narrow and codified aspect of human behavior. [...] Tasks which are simple and open to variation for a person, must be specified and constrained when embodied in a machine”.
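As an illustration of this narrowing, consider how the elevator directive quoted above might be codified in software. The sketch below is hypothetical (no real elevator controller is being quoted): the operator's standing instruction becomes a hard interlock, a boolean precondition checked on every attempt to move, with no room for judgement.

```python
# Hypothetical sketch: the manual directive "emergency exits at either
# side of the car must be closed when in motion", codified as an interlock.

class Elevator:
    def __init__(self):
        self.exits_closed = True
        self.moving = False

    def start(self):
        # The rule is no longer an instruction a human reads and weighs;
        # it is a mechanical precondition that either passes or fails.
        if not self.exits_closed:
            raise RuntimeError("interlock: emergency exits open")
        self.moving = True

car = Elevator()
car.exits_closed = False
try:
    car.start()
    blocked = False
except RuntimeError:
    blocked = True   # the car refuses to move; no judgement involved
```

Everything the directive left to the operator's discretion (is the exit *almost* closed? is this an emergency?) has been constrained away: the machine knows only the boolean it was given.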
Concealment of the rules
In the process of codification certain aspects of the codified task are lost. The process of trying to recover the rules encoded in a system by looking at nothing other than the system itself is called reverse engineering. On first encounter with a codified system, most humans find severe hurdles to understanding it. Few things about its internal workings are revealed to the untrained eye; details of a system's inner workings are elusive even to expert eyes. Some systems enclose so many encoded abstractions that at times it is impossible to fully grasp how they all play together as a whole. This is one of the reasons why codification affects comprehension: the machine cannot easily transmit knowledge about the abstractions that it codifies.
Systems can sometimes reach such levels of complexity that no single human can even begin to understand how they work. In the Flash Crash of May 6th, 2010, the Dow Jones index plunged about 9% in the course of a morning, 600 points in just 5 minutes starting at 2:45. The causes were unknown at the time, but had to do with high-frequency trading. High-frequency traders are algorithms executed by very fast computers that operate on real-time market data, sometimes buying and selling within microseconds. These tiny transactions scratch only fractions of a penny each, but because they are performed in huge numbers every day, they can amount to millions and millions of dollars worth of trade. The technological arms race that these trading conditions have created is as interesting as it is ludicrous. Each of these algorithms is triggered by certain conditions: when a particular set of interrelated shares oscillates in value, for example, one algorithm might be triggered to sell, whereas another algorithm operating in the same arena, under the same conditions, might be triggered to buy. This makes the market a vast pool of codified rule sets that affect one another, and where no single entity has an overview of how the whole works.
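The interaction of such rule sets can be sketched in miniature. The toy example below is purely hypothetical (the 1% threshold and both agents are invented for illustration, and bear no resemblance to real trading systems): two agents watch the same price stream, and the very same signal triggers one to sell and the other to buy, each following only its own codified rule.

```python
# Hypothetical toy sketch of interacting trading rules. Two agents react
# to the same condition with opposite actions; neither "knows" the other
# exists, yet together they shape the market they both read.

def seller(price, prev):
    # Rule: sell whenever the price drops more than 1% in one tick.
    return "sell" if price < prev * 0.99 else None

def buyer(price, prev):
    # Rule: buy on exactly the same signal, betting on a rebound.
    return "buy" if price < prev * 0.99 else None

prices = [100.0, 99.9, 98.0, 97.0, 97.5]   # an invented price stream
actions = []
for prev, price in zip(prices, prices[1:]):
    for agent in (seller, buyer):
        act = agent(price, prev)
        if act:
            actions.append(act)
# Two of the four ticks breach the threshold, so each fires both rules:
# actions == ["sell", "buy", "sell", "buy"]
```

Even in this two-rule miniature, the outcome is a property of the interaction, not of either rule alone; scale this to thousands of proprietary, mutually opaque rule sets and no single entity can hold the whole in view.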
The new ecosystem of the machine is an economist’s wet dream: all these trading agents performing their actions rationally, with equal access to information, with human emotions ruled out of the market. A perfectly rational system, the very essence of the science of economics.
Yet what happened in the Flash Crash was unexpected and might never be fully explained. It is now known that a glitch in price reporting might have triggered the downward spiral, which was then exacerbated by high-frequency traders, but the complexity of the system and the opacity of the rules codified in each individual trader make an accurate assessment of the causality positively impossible.
No single human being has a detailed understanding of how these systems work.
Many of the apps available today replace a technology that previously existed as a physical device, making the smartphone the ultimate generic tool, one that can perform the tasks of hundreds of other devices that previously had to be manipulated by a human in the physical world.
What before was pushing keys on a calculator is now tapping the touchscreen of a smartphone. What before was manipulating a water level is now balancing against a smartphone’s inclination sensor.
As devices and the software running on them become more capable, software simulation quickly becomes the dominant aspect of the machine. The more generic the hardware, the more specific the software seems to become.
All the codified aspects of an activity buried in the machine exist in a realm of ideas away from human consciousness, accessible only to the expert. The machine becomes a black box. With the process disembodied, the human using the machine that makes the thing hardly ever learns to make the thing itself. As Simon Penny put it, “the process which links conceptualization to physical realization is destroyed”.
As the machine specializes it is the human that becomes stereotyped, the human becomes a “standard part”, an interchangeable element in the chain, a parameterized formula in the design of machines. The more ergonomic the design, the more stereotyped the human.
Extraordinary development of the machine
F.M. Alexander found a causal relationship between technological development in the build-up to the First World War and the “crisis of consciousness” that ultimately led to war.
[...] "The extraordinary development of machinery, which demanded for its successful pursuance that the individual should be subjected to the most harmful systems of automatic training. The standardized parts of the machine made demands that tended to stereotype the human machine" [...] "The power to continue work under such conditions depended upon a process of deterioration in the individual as he is slowly being robbed of the possibility of development"
This thought of Alexander’s is what nourished the idea of what John Dewey called the Degeneration of Civilised Unconscious. It is important to note that Dewey was not talking about this process as a cultural trend, but rather as a tragic disconnect between means and ends: being subject to change but never in control of the process of change itself, or even aware of it at all; change happening below the buoyancy level of collective consciousness.
Alexander understood that awareness of this process of change, and the ability to exert some level of control over it, had to develop through the body, and had to be experienced before it could be understood.
fig. 3 - Example of a directive for human operators that cannot yet be codified into a machine.