Wednesday, September 26, 2018

ANOREXIA INTELLECTUALIS


The Righteous Brothers had a hit song with a line that highlighted an important stomach-related word. “I’ve hungered” it went, then, repeating the word, “hungered for your touch.” It’s a lovely song, but it has nothing to do with eating. Still, that doesn’t matter. Listeners know that “hunger” and “hungered,” when the context is human beings, can mean more than filling an empty stomach.


Take Aristotle. One of his most cited
lines reads like this: “all men, by nature, desire to know.”


What he says is that we have an "orexis" for knowledge. The term translated “desire” can mean reach for, yearn for, or hunger for. But why can “hunger,” a biologically derived term, apply in contexts of affection and intelligence? Humans are biological creatures, and they are more than that. The metaphorical use of “hunger” offers one level of support for such continuity with difference. Aristotle’s entire philosophy was friendly to biology. This did not mean he reduced "human" to the merely biological. There are continuities and differences. He insisted that all ideals had a natural basis and all natural tendencies had an ideal culmination (to paraphrase George Santayana).


Philosophers are often accused of playing with words. Sometimes this is true. Sometimes, recognizing distinctions is meaningful. In his book Hunger, Raymond Tallis suggests that “hunger” and “appetite” should be distinguished. The former, along with the non food-centered term, “desire,” is more indicative of all-embracing humanity than is “appetite.” The difference? Narrative time. “A desire is an appetite that has a story--albeit unfolding, evolving, fragmented--at its heart. It is a hunger that narrates itself and tries to make sense of itself.”


Appetites are raw, instinctual, sated in a stimulus-response way. Desires/hungers are multi-layered. They are often occasioned, not by an object, but by a relationship. That’s certainly the case both in the Righteous Brothers’ “Unchained Melody” and in Aristotle’s claim about human hunger for knowledge. What is yearned for is ongoing (time-factored) actualizing: making love real in one case and satisfying curiosity in the other.


The culminations of such hungers occur only within spans of time. Unfolding within spans of time means not only that they are ongoing, but that they require effort, and, instead of unimprovable perfection, often result in culminations with loose ends.  They are thus precarious, subject to contingency, require that we make ourselves vulnerable, and can be frustrating.


For those reasons, there is a general tendency toward an-orexia, the suppression of hunger. This is not just the pathology associated with physiological appetite.
There is also intellectual an-orexia: the longing for absolutely secure, contingency-free, doubt-free existence. One impediment to such security: thinking. After all, thinking is bothersome, annoying, discomforting. If only the yearning, the hunger for knowing, could be suppressed, if only intellectual anorexia could become the rule, then a sense of comfort and security could be established. The paths to intellectual anorexia are well-paved and multiple. Two came to special prominence during the Modern (roughly 1600-1900) era: logical deduction and calculative rationality. Using formal logic to deduce conclusions that are certain has little to do with thinking. The latter always engages complicated situations and arrives at resolutions that are only secure within varying degrees of probability. Instrumental rationality, which starts with a pre-ordained end and then calculates the most efficient means of getting there, offers a poor substitute for thinking, which puts ends themselves into play. Other strategies have been around forever: absolutism and skepticism.
But wait, readers may be thinking, aren’t these opposed to one another? Well, yes, they are. One says, “the answers pre-exist; all you have to do is accept them.” The other says, “there are no answers.”


Despite their differences, and like logic and instrumental rationality, both offer a particular and attractive invitation: stop thinking. It might be “stop thinking because the answers are already at hand.”  It could be “stop thinking because there are no answers anyway.” In each case, the result is the same: peace of mind occasioned by sidestepping the annoying practice of thinking.


For the physiological anorexic, the body’s constant demands for edibles are felt as oppressive. For the philosophical anorexic, it is the demands of the mind that are annoying. Thinking involves “trials,” literally the processes of sifting out, of trying one path and then another. Such processes are also “trying” in the sense of being annoying, bothersome. Like hunger, thinking never leads to stasis. The practice is unending. How to eliminate such discomfort? Go intellectually an-orexic. Hide behind pure logic. Transform thinking into calculating. Pretend the answers are already obvious. Insist that finding answers is not possible. In other words, suppress a basic human orexis: stop thinking.

Wednesday, September 19, 2018

Eat/Die2




Robert Indiana’s diptych highlights two words: Eat and Die. “Eat,” as we saw in the previous blog, is complicated. Yes, we eat in order to stay alive, but human eating tells us something about what it means to be fully human.


Something similar occurs with “die.” Martin Heidegger, once again, leads the way. He suggests that animals “perish” whereas humans “die.” Conscious awareness marks the difference. Heidegger gets all worked up about this, referring to human life as Sein-zum-Tode, “being-unto-death.” On one level the phrase is a truism: we live in the awareness of our mortality. As a result, prioritization about how to spend our limited span is forced upon us. At its best, this encourages a more “authentic” existence.


So far so good, but there are limitations.


First, the “being-unto-death” focus is too individualistic. Fastening on concern about my individual death ignores an important aspect of existence: sorrow and grief. Grief at the death of others is often the earliest confrontation with mortality. The experience of grief moves us away from the isolated subject facing his (this is the pronoun Heidegger would have preferred) own death. Grief means we have loved. The love/grief awareness, in turn, reveals something important about our cares, concerns, and patterns of significance. Death experienced as grief diverts attention from the atomized subject and its authenticity. It emphasizes, rather, the worthiness of a life lived in relation with others.


Second, we are not just “beings-unto-death.” Our physiologies signal a pattern. Fully articulated, it’s a pattern of birth, puberty (because we do not live forever), birth of others, then death. Even if we do not have children of our own, we are attuned to the birth/puberty/birth/death cycle. This is where eating, and the mother imagery from the previous blog, come again to the fore. The gestating mother, the one who is food for the fetus, is about to do something very important, something inseparable from mortality: give birth to a new human being.


Isolating mortality from natality encourages insulated, self-directed reflection on how to shape our life trajectories. Admitting we are both “beings-unto-death” and “beings-unto-birth” encourages a more social set of concerns. It encourages us to reflect on the kind of world we will leave behind for the offspring who will follow (whether we actually produced some or not).


Third, life-time should not be thought of as an unending sequence of instants. When time is envisioned as a line, it makes sense to envision life as possibly going on forever. This is what the 17th-century philosopher Spinoza did when he asserted that any finite entity strives to “persevere in its being.” Experience teaches a different lesson. Finite beings realize they are part of a finite cycle.
 
Time is every bit as much a cycle as it is a line. Perhaps, for lived experience, time is actually more periodic than linear. Even philosophers fascinated by an unending line recognize nature’s cycles of seasons and years. They realize how the sun itself will, at some point, come to an end. Life-time is periodic, punctuated by beginnings and endings. For biological creatures, periodicity is marked by overlapping spans, with newer generations always in the making.



Death may be untimely and thus appear as a dark-garbed, scythe-wielding assassin. But death itself is not opposed to life. It is a necessary constituent of any life-trajectory. Our path as organisms is to live a limited span, make a kind of mark, leave offspring, and die. Emphasizing only the impulse to live on, or thinking of death as a kind of enemy, is, in an important sense, to deny our humanity. Contrary to Spinoza, we should encourage finite beings to accept their finitude.


When we think mortality, grieving, and natality together, we can recognize how thoughtful reflection can move in a direction different from the existentialist emphasis on an “authentic” life. That new direction includes a few important elements: (a) Thinking mortality/grief/natality together encourages us to think in terms of “we” rather than “I.” (b) A fundamental concern moves from “how shall I live authentically?” to “how shall we make a world for the next generation?” (c) Finally, the mortality/grief/natality awareness encourages a shift in time-consciousness. We recognize ourselves as periodic beings, part of a particular span which will impact overlapping, subsequent spans. The emphasis is thus on generosity in terms of what we bequeath. Worthiness replaces authenticity.


Eat/Die. Both unavoidable. Both complicated. Both sources of philosophical reflection. The best reflection will hold them together. We are beings unto life and death. Isolating one from the other falsifies our condition and leads to faulty thinking.

Wednesday, September 12, 2018

Eat/Die


EAT/DIE.  Nothing could be simpler.  Life reduced to two words encased in a famous set of diptychs by Robert Indiana. Organisms must eat. At some point, they die. Eat/die. Both guaranteed, both straightforward.


“Guaranteed,” sure. But “straightforward”? Maybe. At least until thinking gets involved. Then philosophy interferes with “straightforward.” It asks: what exactly is meant by “eat” and by “die”? Philosophers are not just being difficult. They take on the tough job of thinking, seeking to grasp how things are--even if “how things are” is not straightforward.


Robert Indiana said he got the diptych idea from the last word spoken by his mother: “eat.”   


As a mother’s last injunction, “eat” ranks way up there. Mothers, after all, are, for the period of gestation, food for the fetus. Mothers also, in general, have carried the burden of supplying family members with nutritious, tasty food. And this, not just as a one-off thing, but every day, multiple times a day.


“Eat” also distinguishes humans as a species. The philosopher Martin
Heidegger made much of how the German language sorted out, with two different words, what animals do, i.e. “feed,” fressen, and what humans do, “eat,” essen.  Animals relieve hunger, most often individually even if they are a pack feeding side by side. They ingest whatever is identifiable as food. Then they move on.


Essen, “eat,” as a mother’s last utterance, conveys an important message.  Keeping the fressen/essen distinction in mind, it suggests “be fully human.” And, what, in turn, does this mean?


1. Don’t eat alone. Eating alone, let’s face it, is kind of a drag. Counselors report how, for first-year college students, one of the most anxiety-inducing situations is sitting alone in the cafeteria, an anxiety made famous in Mean Girls when Lindsay
Lohan, worried about being alone at a table, retreats to the rest room with her food. We are social animals. One manifestation of that sociality is eating with others.


2. “Savor.”  “Relish.” Eating is a value-laden activity. We make distinctions between better and worse. What’s on the plate may be rated somewhere between tasty and bland as well as somewhere on the ugly to beautiful spectrum.  Preferences and avoidances are enacted.
“Taste” is something to be educated and cultivated. “Eating” can, of course, be reduced to “feeding,” to a straightforward intake of calories. The latest attempt in this direction is the “meal replacement drink” Soylent. To choose feeding (whatever rationalizations are offered in terms of efficiency and convenience) is, in effect, saying “no” to being fully human.


3. “Connect.” All food was once a plant. Even meat is from animals who fed on plants. Intelligent awareness associated with eating highlights dependencies and connections to our natural surroundings. It draws our attention to rain, sunshine, fertile soil, bacteria that keep it fertile, insects that pollinate, and, within ourselves, the bacteria that help us digest. It also links us to fellow humans, unrecognized mostly, but without whom seeds would not be planted, farmland not tended, plants not harvested, animals not butchered, fruits, vegetables, and meats not delivered.



4. “Thank.” The religious word “Eucharist” means thankfulness. It’s not clear whether pets are grateful. Some sure seem to be. But for humans, gratefulness marks a mode of relating which manifests full-bore humaneness. It’s beyond simple economic exchange, i.e. more than “I offer this service and now you owe me in return.” That exchange can take place without any sense of “thank you.” What used to be uttered before meals, “grace,” like “Eucharist,” originally indicated thankfulness. A common prayer, “Bless this food,” can, by contrast, be misleading. It suggests that food is, initially, just stuff. Such stuff would require a special effort to make it enter the realm of the sacred. If we pay attention to etymology, though, even “bless” (this is not surprising) carries within itself the sense of giving thanks.
Of course we can choose to live a life in the mode of exchange, the mode in which all objects and actions become commodities, one in which human relations devolve into contractual transactions alone. Gratitude at table helps inoculate us against such retrogression.



Today’s fast food world is one which offers temptations to which we can easily succumb: to think of ourselves as above and beyond the ordinary pleasures of the table, to prize individual autonomy, to think solely in contractual terms.


A mother’s exhortation to “eat” urges us not to succumb. It suggests that we embrace and welcome our humanity in its fullest and best possibilities. Not bad advice.

What about "die," the other half of the diptych? That's for another blog.




Wednesday, September 5, 2018

macrophages, logic and bio-logic

The French are, at least in part, a hippophagous people.  Caterpillars are phyllophagous, and, with population growing,
humans will soon be entomophagous. Yes, it’s not hard to guess: the Greek root phagein means “to eat.” It might be horses. It might be leaves. It might be insects. In each case, there is an eater and an eaten.


Something similar occurs on the cellular level. The eater in this case is a type of white blood cell. It becomes a “big eater,” a “macrophage.” Despite their formidable name, macrophages are mostly an organism’s friend. What they eat are intruders and troublemakers: nasty bacteria, parasites, rogue cells. Macrophages thus provide protection. They serve an important function.


In official philosophical terminology, this end-directed, or goal-directed, activity has a specific name: “teleology.” We shiver, for example, when cold. This helps us keep warm. Ditto, but in reverse, for perspiration. Dr. Walter B. Cannon’s well-known book The Wisdom of the Body detailed the various processes contributing to, here comes another big word, “homeostasis.” It means keeping the body’s key activities within an established, familiar (“homeo”) and stable (“stasis”) range.


Walter B. Cannon was even willing, as we saw above, to speak of “wisdom” as the body preserved its harmonious, well-functioning operations.  At one time, the “big eaters,” the macrophages, fit well within this paradigm. Their activity, gobbling up troublemakers, made of them a sort of poster child for the “wisdom of the body.” Then came a bit of trouble.  


It turns out that, for some ailments, macrophages actually aggravate the problem. Atherosclerosis, autoimmune disorders, and even cancers provide the best-known examples. When it comes to cancers, the “big eaters” should be  “tumoricidal.” Instead, they often become “tumorigenic,” fostering tumor growth.


Just as Cannon used a human descriptor, “wisdom,” to describe our physiological functioning, so Barbara Ehrenreich, in a recent book discussing macrophages, also uses human descriptors. Hers, though, are negative. She speaks of macrophages as “treasonous,” as having a “mind of their own.” The “wisdom” of the body may thus need to be rethought. Instead of the smooth functioning of harmoniously interacting components, the organism should be seen as a “battleground where its own cells and tissues meet in mortal combat.”


At the same time there is something discomforting about alternating between harmonious “wisdom” and all-out combat.  It would be good, Ehrenreich thinks, to have a new paradigm, one moving away from the sharp utopian or dystopian models.


She could be helped by taking a page from the realm of food. Salt, sugar, fat. Are they good for us? “Yes, assuredly.” Are they bad for us? “Definitely.” It’s not all that unusual for the same factor to embody both beneficent and maleficent possibilities. The technical term for this comes from ancient Greek. The word is pharmakon. It means both “drug” in the medical sense and “drug” in the sense of poison.


Humans seem to have a natural aversion to holding opposites together. In strict formal logic the contradictory claims “sugar is good” and “sugar is bad” simply can’t coexist.

Logically, a pharmakon makes no sense. Bio-logically, it’s fairly prevalent. Logic may be neat and clean, but life is often messy, context-sensitive, subject to contingencies, and full of surprises. In logic, strict determinism is assumed. In bio-logic, it’s all about probabilities.


Using metaphors is hard to avoid. But they can be misleading. An organism is not a “machine.”
At the same time, it is not a conscious being able to possess “wisdom.” It can’t be “treasonous.” The same thing, sugar or a macrophage, in conjunction with an organism, can be both beneficent and maleficent.


If a paradigm switch is needed it’s one in the direction of a favorite phrase of Aristotle, a biology-inspired philosopher: “always or for the most part.”  Today, we might tweak it a bit and say “usually, or for the most part.”

Neither pure determinism nor sheer randomness, neither peaceful harmony nor treasonous betrayal, can faithfully describe organic operations. The biological realm need not follow any other pattern, certainly not that of machines, nor that of generous-spirited or nasty humans. The biological is sui generis, unique and complicated. Teleological flexibility in light of homeostasis is real. It works well “usually or for the most part.” The pharmakon dimension is also real. What is good can also be bad.

Flexibility and pharmakon as pivots of a new paradigm can help make good sense of “usually or for the most part.” For a long time philosophers were fascinated with certitude. This might work well in deductive logic.  However, in bio-logic, it’s all about probabilities. Start with a large enough sample, recognize flexibility, admit a pharmakon dimension, and atherosclerosis, diabetes, arthritis and various cancers will appear in the population. It’s not about wisdom having failed or treason succeeding. It’s just about bio-logic.



Wednesday, August 29, 2018

MINDFULNESS or WHOLEHEARTEDNESS?

Mindfulness is everywhere.
When a movement is so popular, it (1) has something going for it and (2) will bring out detractors.  Cue Barbara Ehrenreich and a chapter entitled “The Madness of Mindfulness.”  Distractedness is a real problem. But, “mindfulness” as a solution?  The economic dimension, as usual with Ehrenreich, dominates. Mindfulness has been co-opted by commerce.  Smartphones foster distractedness. We might, therefore, be wary of purchasing any of the plentiful
mindfulness apps that bind us even more to our glowing rectangles. There is also the disconnection from roots in Buddhism. Mindfulness is now understood as a tool for solving specific problems. It’s not a practice that gives rise to an enlightened awareness that might challenge the entire context that occasioned the problems in the first place. Finally, she asserts that scientific support is weak. While recognizing “neuroplasticity,” she claims the benefits of mindfulness can be arrived at by other means: muscle relaxation, medication, and psychotherapy.


Ehrenreich overreaches with her blanket condemnation. After all, if it brings benefits without drugs or psychotherapy, why not appreciate mindfulness? She is on stronger ground when she emphasizes how mindfulness has shed much of its Buddhist context.


Her way of phrasing it, though, is questionable. Mindfulness, she says, has been “drained of all reference to the transcendent.”  This is a very Western way of identifying the religious dimension, i.e. associating it with a higher power in a different realm. One of the great Zen texts is the work of Dōgen (1200-1253) who
introduced a form of Zen to Japan. It’s called Instructions for the Zen Cook. He focuses on the here and now, on humble activities involved in food preparation.  As the book’s introduction puts it, Zazen is neither an escape from the world, nor an attempt to achieve a separate goal. In fact, the practice is “tainted” if it aims  at some external, separate gain.
Eating, if not cooking, has been central to contemporary mindfulness. The source of it all, Jon Kabat-Zinn, suggested a simple initial exercise: eating a raisin mindfully.


The raisin example  highlights a difference with  Dōgen’s cook. Learning to savor a raisin means training the mind.  Dōgen emphasizes concrete practices involving the entire person: cleaning/sorting rice, getting/heating water, actually cooking food for a group of people.  Such an involvement, in turn, is not separated from natural and social circumstances. It’s not: me, alone, learning to savor a raisin.


Since “mindfulness” can suggest a withdrawal into one’s conscious self, Dōgen’s translator suggests “wholeheartedness” as an alternative.  The entire person, situated in a natural and social context, and engaged in magnanimous and caring practices is the focus.

Ehrenreich’s criticism, focused on attention deficit, misses a deeper pathology. Today’s technological wonders do not just encourage truncated, limited attention. They also intensify something that has been central since the Renaissance: conceiving ourselves as, at bottom, self-sufficient and autonomous. Martin Luther’s criticism of this tendency was harsh, insisting that the activity of turning in on oneself, incurvatus in se, was the heart of sinfulness.


Buddhism, also concerned about incurvatus in se, challenges the tendency at its very roots. Its guiding assumption: nothing is autonomous. Everything is connected to everything else. Buddhists also emphasize anatman, “no self.” By this they do not mean that humans are not unique persons in the world. Anatman indicates that the non-connected, non-dependent self is an illusion. The so-called “self-made” man would, if he examined his life in detail, find loans from banks (with money deposited by lots of other people); architects, engineers, and contractors to build edifices and make machines; employees; clients; a legal system allowing him to operate; and many other ways of being connected to and dependent on others.


Belief in the autonomous self has been prominent in Western thought since the 16th century. There are even people who extol “individualism.” This is not to be confused with respect for this or that particular person.  It is rather the conscious belief that one is fundamentally self-sufficient. It’s a belief that culminates in today’s culture of loneliness.


When cooks practice their craft wholeheartedly, with loving attentiveness, they recognize lots of connections: to the gifts of nature (sun, soil, rain, bacteria, worms); to the gifts of other humans (people who plant, till, harvest, ship); to the gifts of technology (cleavers, pots, stoves, dishes). The tenzo (Dōgen’s “cook”) cannot work as a non-connected, non-dependent individual.

Mindfulness, isolated from its Buddhist roots, might help counteract attention deficit, but it can also worsen rather than challenge the faulty picture of ourselves as self-sufficient individuals.



Wednesday, August 22, 2018

FRANKENSTEIN: ABSTRACTION IS EVIL

One of the great scenes in Young Frankenstein is when the creature wanders into the home of a blind man (Gene Hackman at his best). Blind man and creature have lots in common: they are different (blind in one case, large and ugly in the other); because they are different, they are lonely; because they are lonely, they need companionship.
“Companionship” literally means “bread with” another. As a story, Young Frankenstein emphasizes something earlier films missed. The creature is
unloved.  He has no one to break bread with, no companions. What he longs for is a stable, loving family life.
Such domestic felicity is, sadly, closed to the creature because he is so different. What he most longs for is a companion, someone to share home and hearth, someone who, based on such sharing, will get to know him beyond the physical impressions at which others recoil. Victor Frankenstein promises that he will put together another creature, this time a female, the much-needed, much-desired companion. Alas, Victor reneges on his promise. The creature, betrayed and hurt, turns vengeful and violent.
It all goes back to “bread together,” and to the awareness that comes from time spent in loving companionship.  The book of Genesis famously has Adam
“knowing” Eve, a kind of knowing that leads to pregnancy. This use of the term also indicates a kind of knowing that emerges from companionship, from spending a life together. It’s a term that suggests awareness of what another is like based on familiarity, on relationality. When we make a claim like "I know her/him really well," it's quite different from knowing that 3 plus 2 equals 5 or that water freezes at 0 degrees Celsius.
The sad aspect of Mary Shelley’s Frankenstein is that no one takes the time to “know” the creature along the model of companionship. The people with whom the creature interacts cannot get past his “otherness.” Instead of thinking and being thoughtful, they remain on the level of abstract, immediate sense perception. What results is a truncated kind of awareness. The concrete, whole self is ignored. One aspect alone, ugliness, is highlighted.
There is an important lesson here for philosophy. Abstraction, often touted by philosophers, is actually an ally of evil. Those who judge the creature based on  his looks are reacting abstractly, just as those who would judge by skin color alone, or gender, or religion.
The surprising lesson for philosophy is one that has been articulated by Alfred North Whitehead.  Philosophy is not the arena for abstractions. Rather, philosophy
should be the critic of abstractions. Humans feel a strong tug in the direction of abstracting, of simplifying, of separating, isolating, disconnecting. Philosophy, tied to wisdom and to thoughtfulness, has a special role to play: it must provide a counterbalance.
The hard corollary: thinking and abstraction do not go together.   Logic and abstraction go together. Calculative rationality (planning in terms of fixed goals and the most efficient means to arrive at them) depends on abstraction.  “Thinking,” or being “thoughtful” is quite different. Indeed, abstraction is a hindrance to thinking and thoughtfulness. The latter’s mode of reflection results in a particular sort of claim, the “I know" that results from companionship.
Martin Heidegger made much of the link between “thinking” and “thanking.” Thinking (and its partner thoughtfulness) is a grateful response to our being in the world. It is not separated from affectivity. Its aim is full awareness about what is, an awareness not accessible to those who remain detached from what they are seeking to understand.
Thinking thus gives rise to a kind of concrete knowing, the one Latin calls cognoscere, French connaître, and German kennen (as opposed to the more abstract scire, savoir, and wissen). This awareness, this companionship-based getting-to-know, cannot happen when thinking and abstracting are confused. Abstraction, deliberately undertaken for specific purposes, plays an important role, especially in laboratory sciences. In everyday life, though, the allure of abstraction must be resisted.
In Mary Shelley’s novel, only the blind man is thoughtful in regard to the creature. Everyone else succumbs to abstract perception.  Ugliness, size, and skin texture are all they pay attention to. Abstraction is plentiful. Thoughtfulness is rare.
Young Frankenstein may be a comedy, but its take on the creature’s story is revealing. The characters in Young Frankenstein live out the knowledge of
companionship in relation to the creature. Shelley’s novel depicts the sad state that results when abstract knowledge rather than companionship knowledge dominates.