Wednesday, October 3, 2018

ADVERTISERS LOVE FREE WILL

“Data ‘R Us” could be the slogan for our era. It used to be that we, the inquirers, were on one side, and data were on the other. Scientists undertook their inquiries because they believed that, as a consequence, they would have a secure, fact-based story to tell.


Today the gap between data and us has been erased. Algorithms mining Facebook, for example, aim “to gather sensitive personal information about sexual orientation, race, gender, even intelligence and childhood trauma.”


What good is such data?  Follow the money. Scientists mine data because they want an accurate story to tell.  Marketers love the new data because they have something to sell.  


For marketers, “value” is defined as whatever will increase sales for whoever is paying them. Sometimes the admission is delivered deceptively: “The more data you collect from customers…the more value you can deliver to them.” It soon becomes clear that “value” does not mean something that is independently good: “And the more value you can deliver to them…the more revenue you can generate.” Generating revenue via data-aided manipulation, i.e. interfering with freedom, is what it’s all about.


Within the world of food, the results of successful advertising are disastrous. When people eat what is best for the revenue of food companies, they are not eating what is healthy for themselves. The result: a situation in which obesity and early-onset diabetes are far more common than they need to be.

Most people realize that marketing and advertising have a specific goal: impacting our powers of adjudication, i.e. our ability to make free, well-informed choices. Why, then, is advertising so often defended in the name of freedom rather than understood as the obstruction to freedom that it is?


Here is where philosophy plays a role.  It’s all about the meaning of “freedom.”  
We, inheritors of bad philosophy, continue to confuse freedom with “free will.” There’s the rub. Freedom and “free will” are NOT the same thing. Human freedom is the ability of reasonable beings to consider their circumstances intelligently and to elect among possibilities. “Free will,” by contrast, suggests an inner faculty, completely unconditioned and unaffected by circumstances, one that provides a power at any and all times to make uninfluenced choices. “Free will” is attractive because it offers us a self-description as somehow disincarnate. It is also dangerous. It encourages us to think in all-or-nothing terms: either we possess free will or we do not, and the forces shaping and influencing us are irrelevant. Defining freedom strictly in terms of free will thus discourages concrete questions about what factors favor informed election among options, and it directs thought away from examining the various ways in which freedom can be enhanced or minimized.


The flawed identification of freedom with “free will” provides plenty of cover for marketers. They can defend their work by saying that no matter how much money and expertise are marshalled to shape people’s selections, those people continue to have free will. At the same time they can hire psychological and data specialists for precisely one reason: to limit freedom. They can thus have it both ways: justifying their practices by saying that there is always “free will,” and adjusting those practices so that they become more and more effective in interfering with freedom. They can even block the efforts of public servants by claiming that it is they, the public servants rather than the advertisers, who seek to suppress freedom.


Freedom is a great good. It recognizes how we can chart paths for ourselves. At its best it manifests itself in appraisals and evaluations that result in well-considered judgments. Those judgments, in traditional language, are verdicts, i.e. decisions based on truths. Ver-dict, truth-saying, is, for marketing, beside the point, and, really, an obstacle. The defenders of advertising are friends of free will but not friends of freedom. Adjudication on the basis of evidence is exactly what they seek to eliminate. They are savvy manipulators who construct scenarios, not because they have something important to tell, but because they have something to sell.

Wednesday, September 26, 2018

ANOREXIA INTELLECTUALIS


The Righteous Brothers had a hit song with a line that highlighted an important stomach-related word. “I’ve hungered” it went, then, repeating the word, “hungered for your touch”. It’s a lovely song, but it has nothing to do with eating. Still, that doesn’t matter. Listeners know that “hunger” and “hungered,” when the context is human beings, can mean more than filling an empty stomach.


Take Aristotle. One of his most cited
lines reads like this: “all men, by nature, desire to know.”


What he says is that we have an "orexis" for knowledge. The term translated “desire” can mean reach for, yearn for, or hunger for. But why can “hunger,” a biologically derived term, apply in contexts of affection and intelligence? Humans are biological creatures, and they are more than that. The metaphorical use of “hunger” offers one level of support for such continuity with difference. Aristotle’s entire philosophy was friendly to biology. This did not mean he reduced "human" to the merely biological. There are continuities and differences. He insisted that all ideals had a natural basis and all natural tendencies had an ideal culmination (to paraphrase George Santayana).


Philosophers are often accused of playing with words. Sometimes this is true. Sometimes, recognizing distinctions is meaningful. In his book Hunger, Raymond Tallis suggests that “hunger” and “appetite” should be distinguished. The former, along with the non-food-centered term, “desire,” is more indicative of all-embracing humanity than is “appetite.” The difference? Narrative time. “A desire is an appetite that has a story--albeit unfolding, evolving, fragmented, at its heart.” It is a hunger that narrates itself and tries to make sense of itself.


Appetites are raw, instinctual, sated in a stimulus-response way. Desires/hungers are multi-layered. They are often occasioned, not by an object but by a relationship. That’s certainly the case in both the Righteous Brothers’ “Unchained Melody” and in Aristotle’s claim about the human hunger for knowledge. What is yearned for is ongoing (time-factored) actualizing: making love real in one case and satisfying curiosity in the other.


The culminations of such hungers occur only within spans of time. This means not only that they are ongoing, but that they require effort and, instead of unimprovable perfection, often end with loose ends. They are thus precarious, subject to contingency, require that we make ourselves vulnerable, and can be frustrating.


For those reasons, there is a general tendency to an-orexia, the suppression of hunger. This is not just the pathology associated with physiological appetite. There is also intellectual an-orexia: the longing for absolutely secure, contingency-free, doubt-free existence. One impediment to such security: thinking. After all, thinking is bothersome, annoying, discomforting. If only the yearning, the hunger for knowing, could be suppressed, if only intellectual anorexia could become the rule, then a sense of comfort and security could be established.

The paths to intellectual anorexia are well-paved and multiple. Two came to special prominence during the Modern (roughly 1600-1900) era: logical deduction and calculative rationality. Using formal logic to deduce conclusions that are certain has little to do with thinking. The latter always engages complicated situations and arrives at resolutions that are secure only within varying degrees of probability. Instrumental rationality, which starts with a pre-ordained end and then calculates the most efficient means of getting there, offers a poor substitute for thinking, which puts ends themselves into play. Other strategies have been around forever: absolutism and skepticism. But wait, readers may be thinking, aren’t these opposed to one another? Well, yes, they are. One says, “the answers pre-exist, all you have to do is accept them.” The other says, “there are no answers.”


Despite their differences, and like logic and instrumental rationality, both offer a particular and attractive invitation: stop thinking. It might be “stop thinking because the answers are already at hand.”  It could be “stop thinking because there are no answers anyway.” In each case, the result is the same: peace of mind occasioned by sidestepping the annoying practice of thinking.


For the physiological anorexic, the body’s constant demands for edibles are felt as oppressive. For the philosophical anorexic it is the demands of the mind that are annoying. Thinking involves “trials,” literally the processes of sifting out, of trying one path and then another. Such processes are also “trying” in the sense of being annoying, bothersome. Like hunger, thinking never leads to stasis. The practice is unending. How to eliminate such discomfort? Go intellectually an-orexic. Hide behind pure logic. Transform thinking into calculating. Pretend the answers are already obvious. Insist that finding answers is not possible. In other words, suppress a basic human orexis: stop thinking.

Wednesday, September 19, 2018

Eat/Die 2




Robert Indiana’s diptych highlights two words: Eat and Die. “Eat,” as we saw in the previous blog, is complicated. Yes, we eat in order to stay alive, but human eating tells us something about what it means to be fully human.


Something similar occurs with “die.” Martin Heidegger, once again, leads the way. He suggests that animals “perish” whereas humans “die.” Conscious awareness marks the difference. Heidegger gets all worked up about this, referring to human life as Sein-zum-Tode, “being-unto-death.” On one level the phrase is a truism: we live in the awareness of our mortality. As a result, prioritization about how to spend our limited span is forced upon us. At its best, it encourages a more “authentic” existence.


So far so good, but there are limitations.


First, the “being-unto-death” focus is too individualistic. Fastening on concern about my individual death ignores an important aspect of existence: sorrow and grief. Grief at the death of others is often the earliest confrontation with mortality. The experience of grief moves us away from the isolated subject facing his (this is the pronoun Heidegger would have preferred) own death. Grief means we have loved. The love/grief awareness, in turn, reveals something important about our cares, concerns and patterns of significance. Death experienced as grief diverts attention from the atomized subject and its authenticity. It emphasizes, rather, the worthiness of a life lived in relation with others.


Second, we are not just “beings-unto-death.” Our physiologies signal a pattern. Fully articulated, it’s a pattern of birth, puberty (because we do not live forever), birth of others, then death. Even if we do not have children of our own, we are attuned to the birth/puberty/birth/death cycle. This is where eating, and the mother imagery from the previous blog, come again to the fore. The gestating mother, the one who is food for the fetus, is about to do something very important, something inseparable from mortality: give birth to a new human being.


Isolating mortality from natality encourages insulated, self-directed reflection on how to shape our life trajectories. Admitting we are both “beings-unto-death” and “beings-unto-birth” encourages a more social set of concerns. It encourages us to reflect on the kind of world we will leave behind for the offspring who will follow (whether we actually produced some or not).


Third, life-time should not be thought of as an unending sequence of instants. When time is envisioned as a line, it makes sense to envision life as possibly going on forever. This is what the 17th-century philosopher Spinoza did when he asserted that any finite entity strives to “persevere in its being.” Experience teaches a different lesson. Finite beings realize they are part of a finite cycle.
 
Time is every bit as much a cycle as it is a line. Perhaps, for lived experience, time is actually more periodic than linear. Even philosophers fascinated by an unending line recognize nature’s cycles of seasons and years. They realize how the sun itself will, at some point, come to an end. Life-time is periodic, punctuated by beginnings and endings. For biological creatures, periodicity is marked by overlapping spans, with newer generations always in the making.



Death may be untimely and thus appear as a dark-garbed, scythe-wielding assassin. But death itself is not opposed to life. It is a necessary constituent of any life-trajectory. Our path as organisms is to live a limited span, make a kind of mark, leave offspring and die. Emphasizing only the impulse to live on, or thinking of death as a kind of enemy, is, in an important sense, to deny our humanity. Contrary to Spinoza, we should encourage finite beings to accept their finitude.


When we think mortality, grieving, and natality together, we can recognize how thoughtful reflection can move in a direction different from the existentialist emphasis on an “authentic” life. That new direction includes a few important elements: (a) Thinking mortality/grief/natality together encourages us to think in terms of “we” rather than “I.” (b) A fundamental concern moves from “how shall I live authentically?” to “how shall we make a world for the next generation?” (c) Finally, the mortality/grief/natality awareness encourages a shift in time-consciousness. We recognize ourselves as periodic beings, part of a particular span which will impact overlapping, subsequent spans. The emphasis is thus on generosity in terms of what we bequeath. Worthiness replaces authenticity.


Eat/Die. Both unavoidable. Both complicated. Both sources of philosophical reflection. The best reflection will hold them together. We are beings unto life and death. Isolating one from the other falsifies our condition and leads to faulty thinking.

Wednesday, September 12, 2018

Eat/Die


EAT/DIE. Nothing could be simpler. Life reduced to two words encased in a famous diptych by Robert Indiana. Organisms must eat. At some point, they die. Eat/die. Both guaranteed, both straightforward.


“Guaranteed,” sure. But “straightforward”? Maybe. At least until thinking gets involved. Then philosophy interferes with “straightforward.” It asks: what exactly is meant by “eat” and by “die”? Philosophers are not just being difficult. They take on the tough job of thinking, seeking to grasp how things are--even if “how things are” is not straightforward.


Robert Indiana said he got the diptych idea from the last word spoken by his mother: “eat.”   


As a mother’s last injunction, “eat” ranks way up there. Mothers, after all, are, for the period of gestation, food for the fetus. Mothers also, in general, have carried the burden of supplying family members with nutritious, tasty food. And this, not just as a one-off thing, but every day, multiple times a day.


“Eat” also distinguishes humans as a species. The philosopher Martin
Heidegger made much of how the German language sorted out, with two different words, what animals do, i.e. “feed,” fressen, and what humans do, “eat,” essen.  Animals relieve hunger, most often individually even if they are a pack feeding side by side. They ingest whatever is identifiable as food. Then they move on.


Essen, “eat,” as a mother’s last utterance, conveys an important message. Keeping the fressen/essen distinction in mind, it suggests “be fully human.” And what, in turn, does this mean?


1. Don’t eat alone. Eating alone, let’s face it, is kind of a drag. Counselors report how, for first-year college students, one of the most anxiety-inducing situations is sitting alone in the cafeteria, an anxiety made famous in Mean Girls when Lindsay Lohan, worried about being alone at a table, retreats to the restroom with her food. We are social animals. One manifestation of that sociality is eating with others.


2. “Savor.” “Relish.” Eating is a value-laden activity. We make distinctions between better and worse. What’s on the plate may be rated somewhere between tasty and bland as well as somewhere on the ugly-to-beautiful spectrum. Preferences and avoidances are enacted. “Taste” is something to be educated and cultivated. “Eating” can, of course, be reduced to “feeding,” to a straightforward intake of calories. The latest attempt in this direction is the “meal replacement drink” Soylent. To choose feeding (whatever rationalizations are offered in terms of efficiency and convenience) is, in effect, saying “no” to being fully human.


3. “Connect.” All food was once a plant. Even meat is from animals that fed on plants. Intelligent awareness associated with eating highlights dependencies and connections to our natural surroundings. It draws our attention to rain, sunshine, fertile soil, bacteria that keep it fertile, insects that pollinate, and, within ourselves, the bacteria that help us digest. It also links us to fellow humans, mostly unrecognized, without whom seeds would not be planted, farmland not tended, plants not harvested, animals not butchered, fruits, vegetables and meats not delivered.



4. “Thank.” The religious word “Eucharist” means thankfulness. It’s not clear whether pets are grateful. Some sure seem to be. But for humans, gratefulness marks a mode of relating which manifests full-bore humanness. It’s beyond simple economic exchange, i.e. more than “I offer this service and now you owe me in return.” That exchange can take place without any sense of “thank you.” What used to be uttered before meals, “grace,” like “Eucharist,” originally indicated thankfulness. A common prayer, “Bless this food,” can, by contrast, be misleading. It suggests that food is, initially, just stuff. Such stuff would require a special effort to make it enter the realm of the sacred. If we pay attention to etymology, though, even “bless” (this is not surprising) carries within itself the sense of giving thanks. Of course we can choose to live a life in the mode of exchange, the mode in which all objects and actions become commodities, one in which human relations devolve to contractual transactions alone. Gratitude at table helps inoculate us against such retrogression.



Today’s fast-food world offers temptations to which we can easily succumb: to think of ourselves as above and beyond the ordinary pleasures of the table, to prize individual autonomy, to think solely in contractual terms.


A mother’s exhortation to “eat” urges us not to succumb. It suggests that we embrace and welcome our humanity in its fullest and best possibilities. Not bad advice.

What about “die,” the other half of the diptych? That’s for another blog.




Wednesday, September 5, 2018

macrophages, logic and bio-logic

The French are, at least in part, a hippophagous people.  Caterpillars are phyllophagous, and, with population growing,
humans will soon be entomophagous.  Yes, it’s not hard to guess, the Greek root phagein means “to eat.”  It might be horses. It might be leaves.  It might be insects. In each case, there is an eater and an eaten.


Something similar occurs on the cellular level. The eater in this case is a type of white blood cell. It becomes a “big eater,” a “macrophage.” Despite their formidable name, macrophages are mostly an organism’s friend. What they eat are intruders and troublemakers: nasty bacteria, parasites, rogue cells. Macrophages thus provide protection. They serve an important function.


In official philosophical terminology, this end-directed, or goal-directed, activity has a specific name: “teleology.” We shiver, for example, when cold. This helps us keep warm. Ditto, but in reverse, for perspiration. Dr. Walter B. Cannon’s well-known book The Wisdom of the Body detailed the various processes contributing to, here comes another big word, “homeostasis.” It means keeping the body’s key activities within an established, familiar (“homeo”) and stable (“stasis”) range.


Walter B. Cannon was even willing, as we saw above, to speak of “wisdom” as the body preserved its harmonious, well-functioning operations.  At one time, the “big eaters,” the macrophages, fit well within this paradigm. Their activity, gobbling up troublemakers, made of them a sort of poster child for the “wisdom of the body.” Then came a bit of trouble.  


It turns out that, for some ailments, macrophages actually aggravate the problem. Atherosclerosis, autoimmune disorders, and even cancers provide the best-known examples. When it comes to cancers, the “big eaters” should be “tumoricidal.” Instead, they often become “tumorigenic,” fostering tumor growth.


Just as Cannon used a human descriptor, “wisdom,” to describe our physiological functioning, so Barbara Ehrenreich, in a recent book discussing macrophages, also uses human descriptors. Hers, though, are negative. She speaks of macrophages as “treasonous,” as having a “mind of their own.” The “wisdom” of the body may thus need to be rethought. Instead of the smooth functioning of harmoniously interacting components, the organism should be seen as a “battleground where its own cells and tissues meet in mortal combat.”


At the same time there is something discomforting about alternating between harmonious “wisdom” and all-out combat.  It would be good, Ehrenreich thinks, to have a new paradigm, one moving away from the sharp utopian or dystopian models.


She could be helped by taking a page from the realm of food. Salt, sugar, fat. Are they good for us? “Yes, assuredly.” Are they bad for us? “Definitely.” It’s not all that unusual for the same factor to embody both beneficent and maleficent possibilities. The technical term for this comes from ancient Greek. The word is pharmakon. It means both “drug” in the medical sense and “drug” in the sense of poison.


Humans seem to have a natural aversion to holding opposites together. In strict formal logic the contradictory claims “sugar is good” and “sugar is bad” simply can’t coexist.

Logically, a pharmakon makes no sense. Bio-logically it’s fairly prevalent. Logic may be neat and clean, but life is often messy, context-sensitive, subject to contingencies, and full of surprises. In logic, strict determinism is assumed. In bio-logic it’s all about probabilities.


Using metaphors is hard to avoid. But they can be misleading. An organism is not a “machine.” At the same time, it is not a conscious being able to possess “wisdom.” It can’t be “treasonous.” The same thing, sugar or a macrophage, in conjunction with an organism, can be both beneficent and maleficent.


If a paradigm switch is needed it’s one in the direction of a favorite phrase of Aristotle, a biology-inspired philosopher: “always or for the most part.”  Today, we might tweak it a bit and say “usually, or for the most part.”

Neither pure determinism nor sheer randomness, neither peaceful harmony nor treasonous betrayal, can faithfully describe organic operations. The biological realm need not follow any other pattern, certainly not that of machines, nor that of generous-spirited or nasty humans. The biological is sui generis, unique and complicated. Teleological flexibility in light of homeostasis is real. It works well “usually or for the most part.” The pharmakon dimension is also real. What is good can also be bad.

Flexibility and pharmakon as pivots of a new paradigm can help make good sense of “usually or for the most part.” For a long time philosophers were fascinated with certitude. This might work well in deductive logic.  However, in bio-logic, it’s all about probabilities. Start with a large enough sample, recognize flexibility, admit a pharmakon dimension, and atherosclerosis, diabetes, arthritis and various cancers will appear in the population. It’s not about wisdom having failed or treason succeeding. It’s just about bio-logic.

Wednesday, August 29, 2018

MINDFULNESS or WHOLEHEARTEDNESS?

Mindfulness is everywhere.
When a movement is so popular, it (1) has something going for it and (2) will bring out detractors. Cue Barbara Ehrenreich and a chapter entitled “The Madness of Mindfulness.” Distractedness is a real problem. But “mindfulness” as a solution? The economic dimension, as usual with Ehrenreich, dominates. Mindfulness has been co-opted by commerce. Smartphones foster distractedness. We might, therefore, be wary of purchasing any of the plentiful mindfulness apps that bind us even more to our glowing rectangles. There is also the disconnection from roots in Buddhism. Mindfulness is now understood as a tool for solving specific problems. It’s not a practice that gives rise to an enlightened awareness that might challenge the entire context that occasioned the problems in the first place. Finally, she asserts that scientific support is weak. While recognizing “neuroplasticity,” she claims the benefits of mindfulness can be arrived at by other means: muscle relaxation, medication, and psychotherapy.


Ehrenreich overreaches with her blanket condemnation. After all, if it brings benefits without drugs or psychotherapy, why not appreciate mindfulness? She is on stronger ground when she emphasizes how mindfulness has shed much of its Buddhist context.


Her way of phrasing it, though, is questionable. Mindfulness, she says, has been “drained of all reference to the transcendent.” This is a very Western way of identifying the religious dimension, i.e. associating it with a higher power in a different realm. One of the great Zen texts is the work of Dōgen (1200-1253), who introduced a form of Zen to Japan. It’s called Instructions for the Zen Cook. He focuses on the here and now, on the humble activities involved in food preparation. As the book’s introduction puts it, zazen is neither an escape from the world nor an attempt to achieve a separate goal. In fact, the practice is “tainted” if it aims at some external, separate gain.

Eating, if not cooking, has been central to contemporary mindfulness from the start. Its source, Jon Kabat-Zinn, suggested a simple initial exercise: eating a raisin mindfully.


The raisin example highlights a difference from Dōgen’s cook. Learning to savor a raisin means training the mind. Dōgen emphasizes concrete practices involving the entire person: cleaning and sorting rice, getting and heating water, actually cooking food for a group of people. Such an involvement, in turn, is not separated from natural and social circumstances. It’s not: me, alone, learning to savor a raisin.


Since “mindfulness” can suggest a withdrawal into one’s conscious self, Dōgen’s translator suggests “wholeheartedness” as an alternative. The entire person, situated in a natural and social context, and engaged in magnanimous and caring practices, is the focus.

Ehrenreich’s criticism, focused on attention deficit, misses a deeper pathology. Today’s technological wonders do not just encourage truncated, limited attention. They also intensify something that has been central since the Renaissance: conceiving ourselves as, at bottom, self-sufficient and autonomous. Martin Luther’s criticism of this tendency was harsh, insisting that the activity of turning in on oneself, incurvatus in se, was the heart of sinfulness.


Buddhism, also concerned about incurvatus in se, challenges the tendency at its very roots. Its guiding assumption: nothing is autonomous. Everything is connected to everything else. Buddhists also emphasize anatman, “no self.” By this they do not mean that humans are not unique persons in the world. Anatman indicates that the non-connected, non-dependent self is an illusion. The so-called “self-made” man would, if he examined his life in detail, find loans from banks (with money deposited by lots of other people); architects, engineers and contractors to build edifices and make machines; employees; clients; a legal system allowing him to operate; and many other ways of being connected to and dependent on others.


Belief in the autonomous self has been prominent in Western thought since the 16th century. There are even people who extol “individualism.” This is not to be confused with respect for this or that particular person.  It is rather the conscious belief that one is fundamentally self-sufficient. It’s a belief that culminates in today’s culture of loneliness.


When cooks practice their craft wholeheartedly, with loving attentiveness, they recognize lots of connections: to the gifts of nature (sun, soil, rain, bacteria, worms); to the gifts of other humans (people who plant, till, harvest, ship); to the gifts of technology (cleavers, pots, stoves, dishes). The tenzo (Dōgen’s “cook”) cannot work as a non-connected, non-dependent individual.

Mindfulness, isolated from its Buddhist roots, might help counteract attention deficit, but it can also worsen rather than challenge the faulty picture of ourselves as self-sufficient individuals.