Monday, May 11, 2015
Statisticians talk about two kinds of errors, which they call, imaginatively, Type I and Type II.
Type I errors are those in which we perceive or recognize a pattern or effect that actually doesn’t exist, a so-called “false positive.” Our data mislead us, giving evidence of a phenomenon that, with further research, we will debunk. My 9-year-old son found that the Yankees won when he sat on the floor and lost when he sat on the sofa. Further research, unfortunately, failed to produce data in support of this hypothesis. Type I errors, we could say, are errors of gullibility.
Type II errors are those in which we fail to recognize or discover a pattern or truth that actually exists, a so-called “false negative.” Our data are incomplete or insufficient, our tools too coarse, our methods too crude. The pattern is there to be discovered, but we can’t (yet) see it. We may suspect, for example, that cigarettes increase the incidence of certain cancers but have yet to generate the research that demonstrates it, as was once the case. Type II errors, we could say, are errors of skepticism.
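These two failure modes can be made concrete with a small simulation. The sketch below is purely illustrative (the coin-flip setup, the two-standard-deviation cutoff, and all names in it are my own assumptions, not anything from the statistics literature or this post): a fair coin occasionally produces a streak that looks like a pattern (Type I), while a slightly biased coin usually escapes detection in a modest sample (Type II).

```python
import random

random.seed(0)  # reproducible illustration

def count_heads(n_flips, p_heads):
    """Flip a coin with P(heads) = p_heads, n_flips times; return head count."""
    return sum(random.random() < p_heads for _ in range(n_flips))

def looks_unfair(heads, n_flips):
    """Crude test: call the coin 'unfair' if the head count strays
    more than 2 standard deviations from n/2 (roughly a 5% level)."""
    mean = n_flips / 2
    sd = (n_flips * 0.25) ** 0.5
    return abs(heads - mean) > 2 * sd

TRIALS, FLIPS = 10_000, 100

# Type I (gullibility): the coin IS fair, yet chance alone
# sometimes produces a run that looks like a real pattern.
type_1_rate = sum(
    looks_unfair(count_heads(FLIPS, 0.50), FLIPS) for _ in range(TRIALS)
) / TRIALS

# Type II (skepticism): the coin IS slightly biased (55% heads),
# yet 100 flips are usually too coarse a tool to reveal it.
type_2_rate = sum(
    not looks_unfair(count_heads(FLIPS, 0.55), FLIPS) for _ in range(TRIALS)
) / TRIALS

print(f"Type I rate:  {type_1_rate:.1%}")  # a few percent of trials
print(f"Type II rate: {type_2_rate:.1%}")  # the large majority of trials
```

Tightening the cutoff to reduce one error rate generally inflates the other; shrinking both at once requires more flips. That trade-off is one way to see why a preference for one kind of error over the other is a choice rather than a necessity.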
Researchers are generally more skeptical, comfortable living safely with Type II errors, knowing that there are things out there that we just don’t know. And Type I errors make them uncomfortable, suggesting superstition and conspiracy theories. (Statisticians know that these two errors are fundamentally the same—alter the hypothesis you are testing, and the same research will produce a different possibility for error. I’m not talking here about statisticians, however, but about interpretations of their work.)
This preference in favor of Type II errors and against Type I errors isn’t a necessary one, nor has it always held sway. Medieval minds were clearly happier with Type I errors than we are—dragons in China? A city of gold in central America? Angels on pinheads? Sure. And Medievals would likely have found a modern, skeptical, Type II-error-preferring person dry and boring, knowing with certainty but knowing little. Just as gullible children live happily with Type I errors and often find grown-ups… dry and boring.
Recognizing these errors, and our preferences for one or the other, can open us to some further considerations.
1.) There are certainly many things we don’t know, and, for anyone interested in open-minded research, either preference—for Type I errors or for Type II errors—can eventually lead to greater statistically-based truth. The Middle Ages led to modern times. Skeptics are gradually convinced; gulls are gradually enlightened.
2.) If we indulge a view that all truths are found only in statistics, however, then truths that are not found in statistics—that are found in direct experience, for example—may elude us altogether. Or be revealed in an instant.
3.) The middle ground—the truths unrecognizable by Type II thinking and within Type I thinking but not distinguishable from falsity—is likely larger than we imagine. One example may be some aspects of astrology. I’m not interested in it, particularly, and, as a careful Type II guy, I don’t find or really attempt to find evidence of any truth in it. But I’m skeptical of other Type IIs who too quickly dismiss it. I’m unwilling completely to dismiss those ancients who found value in it. For what it’s worth—which is clearly more than obvious superstitions like, “step on a crack, break your mother’s back” (I’m happy not to have access to the parallel universe of spine-injured moms)—astrology held sway in various ways for millennia and in many brilliant minds. It may well be that our data are incomplete, our methods crude, our assumptions wrong.
4.) Researchers’ preference for living with Type II errors is based on the assumption of randomness (and, with it, the value of large sample sizes). And, as I’ve written elsewhere, although we may “feel” this assumption to be correct, it is, in the end, an assumption. If that assumption is incorrect—if randomness is not as prevalent or as pervasive as we believe—then our preference for Type II errors would have to subside, and our tolerance for Type I errors would have to rise.
What would it be like to live in a world in which Type I and Type II errors were tolerated equally? If we attempted to balance healthy skepticism with healthy open-mindedness? We might know less for certain, and we might be more open-minded regarding the experience of others.
(These ruminations derive from a conversation with Andrew Hill, Collegiate Chair, Glenaeon Rudolf Steiner School, NSW, Australia, on the steps of the Sydney Opera House.)
Friday, May 8, 2015
Elementary school teachers are charged with guarding students from a world that asks them to grow up too quickly.
High school teachers are charged with liberating students in a world that asks them to remain adolescent forever.
At around the age of 13 or 14 or 15, children become young adults, and the world that asked them to grow up too quickly acts like a judo master, overthrowing expectations and pressures. Now our young adults are asked never to grow any further. This new world wants them to remain at that idealistic, malleable stage in which judgment, discernment, and executive function are not yet developed. We are to be consumers and political pawns, but not mature, thoughtful, free, creative adults.
Despite superficial similarities—delivery of curriculum, and so on—primary school teachers and high school teachers, then, have very different jobs. Elementary school provides, among other things, protection and guardianship. High school provides, among other things, guidance and liberation. These tasks are so different that they can become a source of friction and criticism among colleagues in a school.
One mundane example regards student dress. Elementary school teachers often set policies against wearing clothes with logos and images, keeping classrooms free of distracting pop and commercial imagery. These rules are relaxed in high school as students, now young adults, are capable of seeing past such superficialities in the classroom, are not distracted by them.
On a deeper level, elementary school teachers can often count on their students to behave because of their respect for a teacher’s authority. High school teachers have to earn the respect of their students, and commands for good behavior are often simply no longer effective.
In good times, younger students look up to the astonishing achievements of older students—their artwork, their science projects, their athletic feats—and the school community honors the blossoming of its older students. But I’m talking about the day-to-day friction that can occur among colleagues because of the profound change from childhood to adolescence and the demands it places on teachers.
In Waldorf or Steiner schools, primary school teachers often stay with their classes right through to this transition point. If we are not careful, teachers’ appropriately guarding, nurturing influence carries on too long, and the students begin to chafe under this increasingly inappropriate guardianship. And, when primary school teachers look up to the high school grades and observe the messy process by which teenagers gradually mature and learn to act freely and responsibly, they can become critical of high school teachers who appear not to be doing their job in controlling high school student behavior.
And high school teachers sometimes gaze down from their perches and develop disdain for the work of their colleagues in primary school, who may appear to lack expertise and sophistication because they are guarding their students beautifully.
If we understand the challenges of the world we have created and in which we live, a world that sends the message, “stay forever young”—but not too young—and the different challenges that this presents for primary school teachers and high school teachers, we can understand each other as colleagues. We can develop sympathy for colleagues whose jobs appear outwardly similar but that are in a fundamental way the inverse of each other—one necessarily guarding, the other necessarily liberating.
Friday, April 17, 2015
We are told to live in the moment.
But can we?
The future, which does not yet exist and has never existed, unfurls inexorably and then, in literally no time at all, is past, gone. From the future, the realm of potentiality and possibility, we move instantly to the past, the realm of the unalterable.
How can we understand the present?
It does not exist in itself. It is the seam, the infinitesimally fine imaginary line between the past and the future. The present gains direction and momentum from the past and creative energy from the future.
Imagine a future with no past.
Amnesiacs experience something like this, although, as far as I know, no one is completely amnesiac. The future may hold creative possibilities, but with nothing to build on, no rudder, no direction, none of them can really be brought into being.
Without a past, there is no real future, only an illimitable, meaningless present. We could call this imagination without memory.
Imagine a past with no future.
I don’t know that we have a name for this as a medical condition, although maybe it deserves one. Dwelling hopelessly in memory, no image of the future, no ideals, nothing to lead us forward. The only track is the one we’re on.
Without a future, there is simply inertia, more of the same, devoid of novelty or agency or meaningful change. We could picture it as memory without imagination.
(Memory without imagination? Isn’t that an apt description of a computer? And the world we’re building with our digital devices? Devices with astonishing amounts of “memory,” and programmed with algorithms that include every possible “future” without novelty. The creators and programmers have the thrill of invention and creation, but what they create excludes these experiences from us users, or provides only their illusion.)
Both lopsided worlds, the past-without-future and the future-without-past, are imaginary only. They are poles between which the present comes into being. Future and past interpenetrate to produce the present.
And a meaningful present exists only in the tension, mediated by human beings, between the past and the future.
Sunday, April 12, 2015
Note: I’m posting this largely unfinished—a collection of paragraphs in search of resolution—because I’m interested to know what anyone who stumbles across this thinks. Post comments. Let me know. I feel like I’m just scratching the surface of the topic…
You stand outside a burning building. A loved one is trapped inside. You are afraid. And if you cannot overcome your fear, you will watch your loved one perish.
You stand outside a burning building. A loved one is trapped inside. You must resist the urge to rush thoughtlessly, foolishly into the building. If you cannot resist your foolhardiness, you and your loved one will both perish.
You stand outside a burning building. A loved one is trapped inside. If you can act courageously, overcoming fear and withstanding the urge to rash action, you may save your loved one.
In human experience, then, courage is not some interaction of adrenaline and electrochemical neurology. It is that capacity that can arise when we allow fear and foolhardiness to “interpenetrate” within us. A courageous person experiences fear, and she also feels the impulse to act without thinking. She is able, however, to maintain a dynamic tension between these forces of fear and foolhardiness, and, in mediating them, to act courageously. The opposition of fear and foolhardiness, the polarity between them, provides the possibility for courage to arise.
Apologies for the melodramatic opening. But part of the challenge of writing about polarities is that the writing can come off as dry and logical, when the phenomena of polar interactions are so alive, multifaceted, and interesting. It’s the difference between describing an optical demonstration in which a prism is interposed between a beam of light and a projector screen and the experience of seeing the rainbow of colors that results.
Some thinkers, from Heraclitus and Plato through Giordano Bruno, Johann Wolfgang von Goethe, Samuel Taylor Coleridge, Ralph Waldo Emerson, Rudolf Steiner, and Owen Barfield, to William Butler Yeats and James Joyce, and beyond, have found value in demonstrating how capacities, qualities, and values (these are, we could say, immaterial or spiritual realities—realities that span a huge range, including courage, emotions in general, color, and art) arise through the interaction of apparent opposites, polarities.
What is a polarity? It is an opposition or tension between two qualities or capacities that, in their “interpenetration” (Coleridge, Barfield) or “intensification” (Goethe) or “held balance” (Steiner), give rise to a third quality or capacity. It is “a dynamic and generative interpenetration of opposites.” [“Dialectic And The ‘Two Forces Of One Power:’ Reading Coleridge, Polanyi, And Bakhtin In A New Key,” Elaine D. Hocks, p. 1.]
The interpenetration or intensification is dynamic, moving, we could even say “alive,” although not in a strictly biological sense (what is biological life? Have we given up on that question?). So the new creation of the interaction or mediated tension between the poles is never static or single. The interaction of light and darkness, in the color theory brilliantly elaborated by Goethe, produces color. All color, from the dullest gray-brown to the most luminous, brilliant gold, with all hues and shades in between.
Similarly, Steiner’s description of emotions arising from movement between attraction (sympathy) and repulsion (antipathy) shows how all emotions (we could say “true” emotions; more on “false” emotions below), from the gentlest glimmer of new love to the strongest expression of courage, arise from polar phenomena.
The emotion we call courage, the capacity for appropriate action in the face of fear and the tug of foolhardiness, fits within Steiner’s more general polarity in which emotions arise out of attraction and repulsion (sympathy and antipathy). Fear repels us; we must face a fear to overcome it, and resist its repulsive force. Hence, perhaps, the allure of ghost stories and horror movies. And foolhardiness draws us toward rash action.
This gives some evidence of what we might call the “holographic nature” of polarities—one nested inside another. I leave it to you to decide whether or not all polarities are in the end nested in one grand one.
Similarly, the pole in one polarity may become the capacity developed by another. Thinking and perceiving form a powerful polarity from which consciousness arises, as Steiner and Barfield demonstrate. But thinking itself is the result of a polarity between the “universalizing” and the “particularizing” movements of our minds, according to Barfield.
This gives evidence of the “net-like” web of meaning that polarities inhabit. A polarity reveals or creates a quality or truth, but, to fully comprehend its ends, we must move to another polarity. This sounds like a call to relativism and a hall of mirrors, each point of meaning leading to another, not more meaningful or revealing than the last. It’s difficult to move swiftly past this point, but suffice it to say that just because we may move from pole to pole does not mean that each polarity is equal in meaning or import.
One strange aspect of polar existence is that the central phenomenon—color, let’s say, or emotions—can be experienced, used, and studied, often fruitfully, without understanding the polarity that gives rise to the phenomenon. But, we could say, real understanding, the beginning of meaning, comes only with recognition of and experience of the polarity that gives rise to the phenomenon under investigation.
To think in polarities—to recognize their existence, their power, and, particularly, to live in the moving, multi-dimensional creations of their resolution or in the balance of tension that they produce—is to think in a living way. This may not be the only thing that Steiner meant by “living thinking,” but it is one.
It’s entirely possible, by the way, to read my description of polarities, or someone else’s—Barfield’s, Coleridge’s—to “understand” polarities in some academic or intellectual way, to be able to explain them to someone else, but not to be able creatively or imaginatively to enter their existence. It takes human activity, an act of imagination, to enter the activity of polar phenomena. I find that my ability to contend with them flickers over time—sometimes I approach more closely; other times, I meet a wall of incomprehension. Do two phenomena that appear in opposition present a polarity? Only work over time—assisted by the research of others—will tell.
“The concept of polarity … is not really a logical concept at all, but one which requires an act of imagination to grasp it. … Unlike the logical principles of identity and contradiction, it is not only a form of thought, but also the form of life. It could perhaps be called the principle of seminal identity. It is also the formal principle which underlies meaning itself and the expansion of meaning.” [Owen Barfield, Speaker’s Meaning (Middletown: Wesleyan Univ. Press, 1967), pp. 38-39.]
In this sense, polarities are akin in some ways to proofs in formal geometry. A student of geometry can be led through the steps leading to a proof, but the “QED,” the experience that these steps constitute a proof, arises only in the imaginative perception of the student. As a teacher, I can state and restate terms and relationships, trying to make clear that their relationship constitutes a proof, but only the student can generate the insight to leap over the intuitive gap presented by the final step of the proof and “see” the truth of the proof.
Similarly, an apparent opposition only becomes a generative polarity through an intuitive act of insight.
Further, a student can learn to repeat the words of a proof, uncomprehendingly, just as someone can repeat the words of, say, Goethe’s demonstration of the polarity between light and darkness that gives rise to color, or Steiner’s assertion that emotions arise from the interaction of antipathy and sympathy, without living in or comprehending the reality of this relationship. (Barfield calls this “the intellectual soul masquerading as the consciousness soul.”)
And, once this relationship has been experienced, rather than simply reiterated, its truth is unshakably present, just as is the truth of a proof in geometry, once “seen.” It may be re-experienced, almost at will, but it cannot be easily forgotten. To perceive a polarity changes the person who perceives it.
It’s also clear that what we may describe in polarities is not necessarily or not always what we mean when we describe something as “dialectical.” Dialectics, in simplest form, is the famous movement among thesis (statement), antithesis (counter-statement), and their resolution in synthesis (which becomes the new thesis). This wheel of oppositions may describe what occurs in the world (ask Karl Marx)—slavery (thesis) and anti-slavery (antithesis) clash during the U.S. Civil War, and the resulting synthesis includes the end of slavery but the continuation of racism and oppression. The new creation here is not the result of “interpenetration,” “balance,” or mediated, creative tension, but simply of strife. Dialectics may be powerful—in argument and in Marx’s dialectical materialism—but it is a wheel that churns what already exists, not a creative process that gives rise to something new.
Coleridge used the language of dialectics to describe polarity, but added a critical condition [my italics]:
“Every power in nature and in spirit must evolve an opposite as the sole means and condition of its manifestation: and all opposition is a tendency to re-union. This is the universal law of polarity or essential dualism, first promulgated by Heraclitus… The principle may be thus expressed. The identity of thesis and antithesis is the substance of all being; their opposition the condition of all existence or being manifested: and every thing or phenomenon is the exponent of a synthesis as long as the opposite energies are retained in that synthesis.” [Coleridge, The Friend I, 94. Quoted in “Dialectic And The ‘Two Forces Of One Power:’ Reading Coleridge, Polanyi, And Bakhtin In A New Key,” Elaine D. Hocks.]
In every polarity, the possibility exists that the poles will simply be allowed to meet and mix, unmediated, un-intensified, without interpenetrating. This “unmediated” mixture of poles does not produce any new third quality or value. Light and dark can simply swirl together to make shades of gray. Fear and foolhardiness can simply mix to produce anxiety or even panic. The simple mixture of thinking and will produces “false” or “dead” emotions—I think of C.S. Lewis’s Men Without Chests. The mixture of thinking and perceiving results in conventional consciousness, what Barfield calls, pejoratively, “common sense,” which includes the illusion of thinking accompanied by the illusion of perceiving. And beauty and meaning can mix to produce propaganda or advertising, a hollow shell or the illusion of art.
In each polarity, one pole tends to be more “active” and one more “passive” or “receptive.” One accords more with cognition, the other with will or action.
The circumstance of a polarity offers three possibilities for error, then. 1.) A failure to allow the ends of the polarity to interpenetrate, producing only the illusion, the gray, the base mixture, the apathetic, the conventional appearance of resolution. 2.) Yielding to the temptation of one pole—fiery egocentricity, warmth, beauty, perception, sympathy, foolishness, and light. And 3.) Yielding to the temptation of the other pole—despair, meaninglessness, cold abstraction and intellect, antipathy, and darkness.
And the human being who stands between these, allowing them to interpenetrate in thinking and perceiving, must rouse herself to (inner) activity to resolve the tension between them, to allow them to interpenetrate.
Polarities are contextual and situational, and all three “parts” of them seem to arise simultaneously. For instance, antipathy does not exist until its opposite, sympathy, is also present, and both spring into being with the feeling between them. We may imagine light and darkness sitting around, waiting (to be separated one from the other), and then color arising sometime later through their interaction. But, of course, we see color first and only in imagination perceive the generative poles of light and darkness. In schematizing a polarity, we may imagine the poles existing “first” and determining the creative intensification arising “later,” but, in reality, all three spring into being at the same time. We could say that someone who too literally sketches polarities in an abstract way simply doesn’t quite understand what he is talking about—or understands it superficially without actually living it.
Some values or qualities are not immediately obvious as poles, but resolve into a polarity when properly apprehended. Goethe’s work on color is obviously, incontrovertibly true within its conceptual framework to anyone who has studied it and understood or apprehended it, and yet it’s still controversial—seen as a matter of “belief” by those who know “about” it but do not know it. (Very briefly: It’s not necessary to choose Goethe or Newton. Both are correct within their own frameworks. Goethe’s work centers on human perception of color. Newton’s work, as he himself recognized, is not about color at all, but about what we now call electromagnetic radiation. And, either way, that Pink Floyd t-shirt image of light passing through a prism is an inaccurate representation.)
In the growth or development of a human, we may move from one to the other—will in youth, thinking in old age, for example. In human history, too, we may move from a more perceptive consciousness to a more cognitive consciousness. It’s more complicated than this, but I’m trying to keep this brief. In general, polarities present not a static creation, but one that moves. And this movement may have a meaningful direction in a human life or in the life of the world.
Not every pair of apparent opposites is necessarily a polarity. Good and evil, for example, don’t “interpenetrate” to produce a third quality or value. To paraphrase C.S. Lewis, this is because evil is simply fallen or twisted good, derivative of good, not its opposite.
Finally, a polarity is not a duality, and consideration of polarities does not perpetuate another dualistic philosophy. In fact, polarities are simultaneously both whole (one) and “triune,” and they provide a path—possibly the only path—away from dualism and irreconcilable “two-realm” theories of truth.
I will end with some examples of polarities that I have collected over the past couple of years.
1. For Plato, in the dialogue Laches, courage is the “mean” or mediated interaction or interpenetration between fear (too much thinking that precludes action) and foolhardiness (action that is not tempered by thought). I have used this as my entry to thinking about polarities at the beginning of this article. Plato does not use the word “polarity” (as I understand it, Coleridge is the first thinker to use the word in the sense in which I am using it here). He is not didactic, and the dialogue as a whole presents in a living way what I have made a bit more cut-and-dried here.
2. In his Theory of Colors, Goethe showed that color arises as the interpenetration of light and dark. Darkness seen through a light-filled medium, atmosphere or prism, appears as cyan, blue, or violet. Light seen through a dense medium, atmosphere or prism, appears as red, orange, or yellow.
3. Plant growth and development can be seen as a living process that occurs between the poles of expansion and contraction, as Goethe demonstrated in his Metamorphosis of Plants.
4. Steiner showed how the polarity of cognition (antipathy) and will (sympathy) in a human soul gives rise to feelings or emotions. Feelings lead us to put our thoughts into action; feelings likewise lead us to think about our actions. The words sympathy and antipathy are challenging in English—sympathy seems “good,” antipathy bad—and Steiner did not mean them to have these shades of meaning. Both are necessary, and the tension between them is also necessary. We might better talk about something that attracts us and something that distances us. Each is appropriate in its proper sphere.
5. Barfield, following Steiner, shows how thinking and perceiving give rise to human consciousness. We can think about the world, and we can perceive the world. Both arise simultaneously in us, producing consciousness. There is (despite William James’s thought experiment about a “blooming, buzzing confusion” or Merleau-Ponty’s hypothesis regarding the “primacy of perception”) no pure or prime perception: no perception without thought, and vice versa. Each discovers itself in the other, in consciousness.
6. “James Joyce, in A Portrait of the Artist as a Young Man, makes a distinction between what he calls ‘proper art’ and ‘improper art.’ By ‘proper art’ he means that which really belongs to art. ‘Improper art,’ by contrast, is art that’s in the service of something that is not art: for instance, art in the service of advertising. Further, referring to the attitude of the observer, Joyce says that proper art is static, and thereby induces esthetic arrest, whereas improper art is kinetic, filled with movement: meaning, it moves you to desire or to fear and loathing. Art that excites desire for the object as a tangible object he calls pornographic. Art that excites loathing or fear for the object he terms didactic, or instructive. All sociological art is didactic. Most novels since Zola’s time have been the work of didactic pornographers, who are preaching a social doctrine of some kind and fancying it up with pornographic icing. Say you are leafing through a magazine and see an advertisement for a beautiful refrigerator. There’s a girl with lovely refrigerating teeth smiling beside it, and you say, ‘I’d love to have a refrigerator like that.’ That ad is pornography. By definition, all advertising art is pornographic art.” [Joseph Campbell, quoted from Reflections on the Art of Living p. 246, 1991, D.K. Osborn, Ed.]
We can summarize this in a polarity between sensuality and dry meaning: Pornography is beauty or sensuality without meaning; pedantry is meaning without beauty or sensuality. If we think of beauty and meaning as a polarity, their interpenetration, both beautiful and meaningful, is art.
7. “Things fall apart. The center cannot hold… The best lack all conviction while the worst are full of passionate intensity.” [William Butler Yeats, “The Second Coming.”] “Lacking all conviction” is to give in to meaningless ego denial; “passionate intensity” describes the temptation of egocentricity. The polarity between too much ego and not enough ego is resolved in the healthy self.
8. Michael D’Aleo demonstrates (I won’t give away the secret of precisely how) that our apprehension of material objects in the world requires the interaction of two senses—touch and sight. We see things that are not objects—the colors at sunset, for example—and we feel things that are not objects—the wind at our back, for example. Only when our sight and touch corroborate each other do we create the concept of and perceive tangible, physical, material objects. Our visual field is built, in large measure, of the tangible, and we conceptualize the world as a unity based on the interaction of these two senses.
Friday, April 10, 2015
Long ago, life was uncertain. Famine, blight, plague, political instability, Viking raids, infant mortality. And, just, generally, infections.
But the beautiful minds of those persons living in such rancid conditions operated symbolically, trying to read through the book of nature the signs of the times. They did not objectify the world, at least not to the degree that we now do. The celestial sphere, the seasons, the sun, the progress of life—symbols for higher realities of which these things were signs. Similarly, religious images and religious conceptions were not seen as ultimate truths, but as signs through which ever higher or ever deeper mysteries could be read. There were no laws of nature, no theories of evolution or of the unconscious because their minds were not tuned to seeking objective answers to such questions.
In this world of spiritual symbols and material uncertainty, the authority of the Church rang like a bell. Dogma provided a kind of certainty on which a religious life could hang. Partly, this may be due to the uncertainty of material life, and partly, it is likely due to dogma being seen and experienced less as an assertion of ultimate or absolute truth and more as the considered and powerful expression of learned authority.
Today, material life is comparatively certain. We are unlikely to die violent deaths, to suffer famine or plague. We go to sleep at night unafraid of a Viking torch. Our children live, mostly. We have heat in our homes, a fridge in the kitchen, hot and cold running water, and lots of clothing and footwear. This sense of certainty and control extends to the world of objects—the objective world out there, the random motion of atoms and molecules we imagine beneath a microscope, the meaningless balls of gaseous fusion sparkling in the night sky. And also to the imagined objective world in our heads, which we can approach, for example, through psychology and brain science. Psychology is no longer a science of the soul; it is the study of the outside of our insides, you might say—at its worst, a phrenology of the inside of our heads. We live shrink-wrapped lives in a known and knowable universe.
But we are spiritually uncertain. Our symbols, if we have them, are poetic and personal, but not taken or mistaken for reality. The unconscious, if it exists, is a vast unknown, and largely unknowable. Our culture is plural and polyglot. Most of us are agnostics, if honest, atheists and literalists if not.
In this world of material certainty and spiritual uncertainty, dogma, the assertion of truth based on authority alone, is anathema to most of us. (The worst, however, “full of passionate intensity,” object to the dogma of others, then assert their own as genuine. Fundamentalists, literalists, ideologues.) Dogma no longer provides cultural certainty and cohesion, but the foundation for profound disagreement with no hope of resolution beyond conflict.
These changes—from a symbolic or image-based consciousness in a meaningful but uncertain world to an object-consciousness in a meaningless world of material certainty—alter our perspective on dogma—from a welcome and comforting island of truth to an unwelcome and indigestible choking hazard.
One particularly powerful way to observe the change from the former consciousness to the latter is to look at the history of art during these periods—from medieval European art through the Renaissance to the movements that succeed one another with increasing rapidity from early modern times on—Mannerism, the Baroque, Rococo, Neoclassicism, Realism, Impressionism, Postimpressionism, Expressionism, and the explosion of styles and techniques and isms of the 20th century.
Medieval art evolves slowly in a tradition of masters, journeymen, and apprentices dating back to the end of Roman times; even the turn to naturalism at the end of the middle ages continues this approach to art as craft. Beginning with post-Renaissance art, however, art becomes increasingly personal and decreasingly representational (a gross generalization, but there it is).
To look at the strange, troubling, tumultuous history of art in the past 150 years or so, then, is to see an effort to break the bonds of dogma and of objectifying consciousness. Art used to move from generation to generation, apprentice copying master in respectful tradition. And then, in a matter of a few years in the late 19th century, that changed. What had been done could no longer authentically be done again. I drip paint onto a canvas on the floor, and I do not create a new tradition. I create an object and a technique that informs only what you must NOT do. Make it new. Break with the past.
If we allow freedom, if we wish to break the bonds of the dogmatic assertion of truth by authorities, no matter how learned, then, to state the utterly obvious, we have to actually allow freedom. Freedom to offend, to experiment, to play, and to ignore the past. Freedom to be rude, shallow, cold, or wild. If we look at modernism in art as a whole (and post-modernism, in this view, is just late modernism, modernism curling back on itself), at the art of the last 150 years, can we not see the birth pangs of freedom?
Friday, March 6, 2015
A fat woodcutter, name unknown, did not attend a dinner party to which he had been invited, offending the diminutive architect Filippo Brunelleschi. Seems excessive, but, regardless, Brunelleschi planned to get his revenge by playing an elaborate prank on the woodcutter. While the woodcutter was asleep, Brunelleschi contrived to alter the woodcutter’s room so that, when he awoke, the woodcutter believed that he, the woodcutter, did not exist…
How did he do it? History doesn’t tell us, but here’s my hypothesis.
First, remember that Brunelleschi has rediscovered and systematized the representation of linear perspective in visual art. Among his other achievements, Brunelleschi has done this:
He has painted a Florentine street scene in perfect one-point perspective, building by building, on a mirror. Why a mirror? You’ll see in a moment. The horizon in the painting matches the horizon on the street. The vanishing point in the painting matches the point at which Brunelleschi stands to make the work, and then at which he has his viewers stand. At the vanishing point, Brunelleschi drills a small hole. He mounts the painting in the street that the painting portrays. He positions an unpainted mirror before the painting so that viewers standing behind the painting and looking through the small hole see, reflected, the painting and, around it, the actual street it represents (which it matches perfectly, at the right time of day). The mirror includes the actual sky, clouds scudding along, reflected twice, above the buildings. By forcing viewers to look with one eye only, peeking through his hole, Brunelleschi heightens the illusion of perspective, which otherwise would be compromised by our binocular vision. Florentines line up to see this almost miraculous vision.
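The geometry behind the panel can be sketched in modern terms. Here is a minimal illustration (my own sketch in Python, not anything Brunelleschi wrote down): the pinhole model of projection by similar triangles, in which every line receding parallel to the line of sight converges to a single vanishing point, precisely the spot where Brunelleschi drilled his hole.

```python
# A toy pinhole projection: a point in space at (x, y, z), seen by an eye at
# the origin looking down the z-axis, lands on a picture plane at distance d
# at (d*x/z, d*y/z), by similar triangles.

def project(x, y, z, d=1.0):
    """Project a 3-D point onto a picture plane at distance d from the eye."""
    return (d * x / z, d * y / z)

# A building corner 2 units to the right and 3 units up, receding down the
# street at increasing depths z:
for z in (5, 10, 100, 1000):
    print(project(2.0, 3.0, float(z)))
# As z grows, the projected point approaches (0, 0): the vanishing point,
# the one spot on the panel where all such receding lines meet.
```

The same arithmetic explains why the peephole had to sit at the vanishing point: only from there does the painted convergence line up exactly with the real street's.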
Remember that, for Italians in the Renaissance, perspective is new and wholly captivating (sort of like the first time you were mesmerized by a screen saver, I imagine, or a 3-D movie). Their vision is less schooled in perspective, less alert to the illusion of depth. They are likely more gullible and susceptible to the authority—genuine and assumed—of their political, religious, and cultural leaders. For them, the illusion is almost certainly far greater than it would be for us more skeptical, more jaded modern persons.
The fat woodcutter runs afoul of Brunelleschi. A woodcutter? Since childhood, I heard of woodcutters in tales and fables and assumed they were like Mr. Briggs, who delivered our firewood. A strenuous, not especially skilled job. A chainsaw, a hydraulic splitter, and a pickup truck. What else is there? But woodcutters before the Industrial Revolution are something else. Fell a tree, by hand, haul it, store it, dry it, plan its use, cut it without waste, plane it smooth and square. Know what wood is best for what use, fill the orders, make a living through hard work. Don’t sell wood that will warp or doesn’t meet the demands of your client. Deliver an order on time. No excuses. Don’t run afoul of a brilliant, demanding Brunelleschi. According to one account, the woodcutter rudely missed a dinner party to which Brunelleschi had invited him. Oops.
Brunelleschi plots his revenge.
While the woodcutter is asleep, perhaps aided by some strong wine, Brunelleschi steals into his room and replaces the small metal mirror, which the woodcutter prizes and hangs on the wall near his door, with a small, careful painting, light falling at a morning angle, reversed as in a mirror, in perfect perspective, of the interior of the woodcutter’s room. Large glass mirrors like those we enjoy today simply didn’t exist in the Renaissance, especially not for mere woodcutters.
When the groggy, unsophisticated woodcutter awakens and looks in his mirror, what does he see? Not himself, but his room, as if he’s not there at all. He looks for his own face. It’s not there. He moves his arms. Nothing alters. For several moments, he doubts his own existence, and Brunelleschi has his laugh.
Three mirrors, two paintings, one joke. How else could Brunelleschi have pulled this off?
Thursday, December 18, 2014
I asked my students if computers would eventually be able to think, assuming they can’t already. Some said yes. Some said no. Some said it depends on what you mean by thinking. Are neurons digital? Even if thinking isn’t digital, couldn’t something digital, perhaps, think? Software can already learn, some said. That’s not learning, others said. Round and round.
And then we learned about ENIAC, the Electronic Numerical Integrator And Computer, built at the University of Pennsylvania during World War II. ENIAC was the first “universal” computer in roughly Turing’s sense of a “complete” machine: one built to do whatever it was programmed to do, and not simply a giant calculator or tabulator. Famously room-sized and made with vacuum tubes instead of transistors, it was programmed by setting switches and plugging in cables, with punch cards serving for input and output. It had almost no internal memory and could solve between 300 and 400 multiplication problems per second. It was created to solve problems related to the trajectories of artillery shells, but its first big job was a calculation for the hydrogen bomb.
After learning about ENIAC, I asked the students, again, if they believed computers could think. Imagine a larger ENIAC, I said, as large as you like. Imagine a program of millions and millions of punch cards, as many as you like. Imagine it works more quickly, as quickly as you like. Imagine I attach some sort of conveyor belt to it so that its program will output new punch cards that can feed into the input stack and alter the existing program, I said. That’s what you mean when you tell me computers can learn, I told them. Nope, they said, not learning, not thinking. They all agreed that ENIAC would never be able to think.
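The conveyor-belt thought experiment can be made concrete. Below is a toy sketch of my own (purely illustrative; nothing ENIAC could literally do): a “program” is a deck of cards, and cards the machine emits are fed back onto the deck, so the running program mechanically rewrites itself. Nothing here thinks; it only shuffles cards by fixed rules.

```python
# A toy card machine. Each card is one instruction: "ADD n" adds n to a
# running total; "EMIT n" plays the conveyor belt, appending a new "ADD n"
# card to the end of the deck, where it will later be executed.

def run(cards, steps=100):
    """Execute a deck of instruction cards, allowing emitted cards to
    rejoin the program; stop after the deck or the step budget runs out."""
    total = 0
    i = 0
    while i < len(cards) and steps > 0:
        op, n = cards[i].split()
        if op == "ADD":
            total += int(n)
        elif op == "EMIT":          # output becomes program: self-modification
            cards.append("ADD " + n)
        i += 1
        steps -= 1
    return total

deck = ["ADD 1", "EMIT 2", "EMIT 3"]
print(run(deck))  # the emitted cards were appended, then executed in turn
```

The machine “learns” only in the sense the students rejected: its output alters its future behavior, entirely by rule, with no understanding anywhere in the loop.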
But that’s all a computer is, I told them. Your smartest phone, your fastest laptop, the bestest supercomputer.
Is there something about the hidden, electrical, solid-state nature of computers built on microprocessors connected to glowing screens that seems, well, magical, like something out of Harry Potter? Something that convinces rational students of an irrational impossibility? Isn’t this the definition of superstition?
Humans can think, still. Computers clearly cannot—if you think about it clearly.
But that’s not the end of the story.
Every technological advance brings us power and control. And every technological advance robs something from us.
Clocks bring order and regularity to a conception of time, but, having invented the clock, we can… forget about time. The clock will remind us. Invent a printing press, and we can reproduce texts by the millions. And we can forget stories. We don’t need to remember, because the story is always waiting for us in the book. (How many stories did your children know by heart before they learned to read? And how many after?) Invent a light bulb, and we can forget about natural light. Invent GPS, and we can ignore where we are. And on and on.
I’m not arguing against technology, just pointing out one way and one direction in which it changes us, always.
So, what about the computer? Calculator, library, messenger, entertainer. The computer isn’t one thing. The computer is whatever we want it to be, within the limits of its digital existence.
Of what does it threaten to rob us? Of thinking itself. Not because the computer can think, but because, in using it, too often, we don’t have to.
And if we forget how to think, then it won’t matter much if computers can think or not, will it?
(Many thanks to the Class of 2017 and the Class of 2018 of the Berkshire Waldorf High School for a lively History of Technology course, one that prompted the entry above.)