Wednesday, October 24, 2018

Loss of Appetite


Albert Camus’s novel The Stranger features a scene in which a character, despite it being lunchtime, loses her appetite.   Her boyfriend, in the same circumstances, goes right back to chowing down.


Who are they? What are the circumstances? They are Meursault, the novel’s main character, and his girlfriend/fiancée Marie. Meursault, famously, does not play societal games. He does not pretend to be what he is not. A neighbor, Raymond, has asked him for help writing a letter. The aim of the letter: to entrap a young woman with whom Raymond had a relationship. On the day of the disrupted lunch, screams come from Raymond’s apartment. The letter has worked. The young woman has come by, and Raymond has beaten her.


The ruckus causes neighbors to gather in the hallway. Marie, troubled by what she has witnessed, loses her appetite. Meursault remains unaffected.


Camus sought to write a novel featuring an “authentic” central character, one who would not take on the roles expected and assigned by society. What he skillfully depicted is someone completely detached from what goes on around him. Meursault is indifference personified. He does not care that the letter he has cooperated in writing is part of a vengeful plan. He is not moved by the beating of a young woman. He lives in a self-enclosed world. Societal games might not be for him, but are all evaluative, sympathetic responses part of the false veneer of society? Meursault does not just refuse hypocritical games. He goes so far as to numb his evaluative response to surrounding events. A young woman is being beaten next door? Duly noted. A mere piece of neutral data. Now, back to lunch.


Not Marie. She cannot remain detached.  The beating scene affects her. She loses her appetite.  Basic bodily needs are eclipsed as non-physiological concerns take center stage.


Abraham Maslow famously identified a “hierarchy of needs.” It offered a neat framework that ranked the needs in a particular way: the lower needs had to be fulfilled before higher ones could be realized. It offered a clear-cut system of “axiology,” the concern with value and its ranking. Marie’s loss of appetite suggests that Maslow’s hierarchy was a bit too neat. There are times when the tug of ideals, the commitment to key values, is more important than the demands of physiology. Marie, revolted by the appalling scene of a young woman being beaten, offers one example. The injustice, unfairness, and general evil of the scene come to supersede her physiological craving for food.


The same can be said for better-known scenarios: suffragettes on a hunger strike; Bobby Sands dying as a result of refusing nutrition; religiously induced fasts; starving artists; the Russian scientists charged with preserving a seed bank who starved rather than eat the seeds entrusted to them. There is also the pathology of anorexia, where priorities have been upended and hunger is overridden.


Such scenarios suggest that Maslow’s hierarchy needs to be tinkered with in two ways. First, “self-actualization” is not a good name for the highest level. Second, for humans, the hierarchy is flexible: what is higher on the pyramid can, under certain circumstances, become more fundamental than what is at the base.


Against Self-Actualization.  Suffragettes, Bobby Sands, Russian seed bank scientists: What they held most dear was not actualizing the self. They sacrificed the self for a more communal cause, a cause they saw themselves as serving. One distinguishing  feature of humans is precisely this willingness to place a communal cause as a central ideal, even if it means loss of life.


For a flexible axiology. Humans manifest a cluster of aspirations, impulses, inclinations, wants. Living out the fullness of human life means not being limited by a hierarchy with pre-ordained, fixed levels. It means being drawn as much by transcendent ideals as by physiological necessity. The physiological and the axiological can work together. At difficult times, in certain circumstances, people have to choose. That the axiological can override the physiological is an indication of how complicated creatures we are.


Of course, humans can always, Meursault-style, numb themselves. “It’s all the same to me,” they can say. In making such an utterance they situate themselves apart from ordinary human experience. The position of complete detachment is a seriously artificial one. It requires a lot of work, either individual effort or culturally absorbed effort, to reach the point of neutering, of treating as neutral data, the occurrences which make up part of our surroundings. It is natural for us to hunger for food. It is also natural for us to hunger for fairness, justice, freedom, and respect. What characterizes us, as the special animal we are, is that the prioritization of these hungers is not rigidly fixed.

Connoisseur or Consumer? Butter and Margarine


“If we butter up the boss, maybe we’ll get a raise.” “One of the great marketing ploys was to label turkey a ‘butterball.’” “Some people, not happy with what is owed them, want their bread buttered on both sides.” Butter, metaphorical butter at least, has a good rap. For real butter, the story has not always been the same. During the “avoid cholesterol,” “avoid animal fats” era of the 1980s, butter became a public health target. Longing to savor a “buttery” croissant? Stock 1980s advice: Don’t do it.


The millennium brought about a change. Things began to shift toward natural products: real animal fats came to be preferred over trans fats, and sugar replaced fat as the new health threat. Butter sales began climbing once again.

Some obvious lessons have been drawn from this. There was a time when the butter alternative, margarine, was touted both for reasons of health and out of a general fascination with progress. One French margarine brand actually called itself l’Avenir, “the future.” As Margaret Visser explains, margarine is a “wonderful mirror of certain images of ourselves.” At least this was the case in one era. It represented modernity, the triumph of science over nature, convenience, and thrift.


The only downside: a taste that was bland and predictable. Its color, white, was also unappetizing, but this was easily remedied by artificial coloring. The taste issue proved less of a problem than might at first have seemed. This was because humans had shifted from “connoisseurs” to “consumers.” Central to being defined as a “consumer” is a two-fold move. First, relativism looms large. All taste is believed to be just subjective. Second, there is an encouragement of de-skilling, the dismantling of the ability to cook. Qualitative distinctions, especially those based on conditions operative apart from humans, begin to disappear. Humans become like little gods. Whatever they pronounce as good becomes, thereby, good. It’s a bonanza for advertisers.

They fill the “it’s all subjective anyway” void. Relativism and de-skilling present advertisers with great opportunities. Critical, evaluative faculties are weakened. Heritage- and tradition-associated tastes lose their hold. Some results: mass appeal comes to dominate over qualitative taste; “no surprises” comes to be considered a determinative factor in choice.


The new millennium brought important changes. One beneficiary: butter. Its sales grew as those of margarine shrank. The “we are essentially consumers” model continued to be strong, but some chipping away was evident. Human beings began to recapture their biological label, Homo sapiens, “man the taster” (the Latin sapere means both “to taste” and “to be wise”). “Connoisseur,” one who judges based on knowledge, one concerned with quality, could be revived as a label to contrast with “consumer.”


The altered framework shakes up the consumer-centered value system. The consumer framework prizes certain ideals: convenience, efficiency, and predictability are chief among them. The “connoisseur” framework tends to highlight taste, health, and naturalness. Philosophically, there was also a move against relativism. As a medieval adage once put it, our judgments, at their best, have a fundamentum in re, a basis in reality. Of course the individual is doing the judging. But it’s not arbitrary and solely subjective. There is interaction with conditions operative apart from ourselves. “Opinions” may be matters of arbitrary predilection. “Judgments” are not. They are distinguished by their awareness of some fundamentum in re.


The challenge to relativism is also a challenge to consumerism. The flattening, trivializing, and levelling of taste is almost a necessity for a consumer culture. Real discrimination is disastrous for mass sales. Products like beers that are not really “beer” (made with rice instead of barley, for example) can be mass-marketed and become widely popular. The consumer is, on the one hand, flattered, treated like a little god whose preferences are the final evaluative word. On the other hand, the consumer is de-skilled to the degree that powers of discrimination are lost. The resultant void opens the door for the manipulations of advertising.


The recent rise in sales of butter over margarine is indicative of the newer attitude. So is the penchant of millennials to favor real cheese over factory-produced simulacra.

Judgments, for intelligent creatures, should indeed have a fundamentum in re, be based on conditions actually operative apart from us. The temptation to be God-like is ever present. Its most recent variation: “consumers as little gods deciding, simply by their subjective preferences, what is good or bad.” This represents the latest version of the “ye shall be as gods” temptation. It appears that, although their parents might have succumbed, millennials are somewhat resistant. They understand themselves more as “connoisseurs” than as “consumers.”

Wednesday, October 17, 2018

Grazing Overtakes Eating

“Three squares.” Even today, people can recognize this as identifying three wholesome meals following a pattern of eating in the morning, at midday and in the evening.  


The expression was first used in the 1860s. Its frequency peaked in 1920 and has been in decline ever since. This makes sense, since much of the 20th century was guided by ideals that tended to diminish the ritualistic pattern of three daily breaks for food. Certain values tended to fade away: personal interaction, tradition, sociability, leisure. They were replaced by values more conducive to a consumer society: efficiency, convenience, expediency. A recent survey laid out the results: fewer and fewer people follow the three-meals-a-day pattern.


The new way of eating was parodied in David Lodge’s novel Paradise News. The main character, a quiet British guy, is visiting Hawaii. What strikes him immediately is how everyone is carrying around food and snacking. This isn’t eating, he thinks. It’s more like “grazing.” Twentieth-century folks might comment: Why not? Efficiency, convenience, and expediency provide our guiding ideals. Grazing, not having to bother with the inconvenient, time-intensive, cumbersome practice of sitting down to a meal, is just how we live out our ideals. It’s also liberating. Limits, restrictions, constraints are bad. It is always good to break free of the heavy-handed, tyrannical, burdensome hold they have upon us. Grazing is a liberatory act. No one is telling us when it is okay to eat and when it is not.


The centrality of efficiency and convenience indicates an important shift in our self-understanding. Today, that understanding, in the advanced technological West at least, is highly individualistic, inclined toward iconoclasm, caught in a wild pendulum swing between sensualism and asceticism, and fascinated with authenticity. What all of this means is that some traditional markers of the human condition are pushed to the periphery: the importance of community, recognition of how inherited traditions are bearers of wisdom, a sense of incarnation which aims at a spiritual/sensual harmonization, and a commitment to goodness that defines it as more than immediate gratification of whatever aspirations happen to float through our beings. The complex of conditions identifying our contemporary situation also plays itself out in practical nutritional terms. We know those results well, especially obesity as a public health issue.


It’s quite possible that those who embrace convenience and efficiency, who reject the pattern of “three squares,” think of themselves as emancipated from the tyranny of being forced into a particular pattern. They can envision themselves as being more free than those who continue to be constrained by the culturally approved pattern. Their days are defined by doing what they want, when they want, not by an externally imposed schema.

This sounds fine...until we start thinking about it. Plato, a long time ago, noticed a problem.

Those who say “I do what I want whenever I want” are actually the least free of individuals. Yes, counterintuitive as it may seem, this was Plato’s point. Such people are prisoners of their immediate impulses. They are controlled by any and every immediate inclination that courses through them. They have little effective freedom, that is, the ability to chart a course (say, healthy eating) and actually achieve the envisioned outcome.


Even time, for them, is oppressive. For the patterned eater, the day is divided into spans, some for engaging in projects, some for shared time eating with friends and colleagues. Time is not separated from ongoing events. It is, in fact, marked by such events: work time, lunchtime. The day is parcelled out via the quality of ongoing events. Time is not just a neutral ticking away of seconds. When people used to talk about “high noon,” they meant a particular portion of the day, the one in which the sun was at its highest point. Time and ongoing events were inseparable. Golf, tennis, and baseball still embody this notion of time.


When abstract time, i.e. clock time completely separated from actual events, dominates, the temptation to give in to immediate inclinations or impulses is harder to resist. Every moment is potentially eating time. Grazing then makes sense. What also makes sense is indulging in fast food and processed food, the kinds of edibles most congenial to grazing.



Skipping the “three squares” (or however else patterned eating has been parcelled out) has little to do with liberation from constraints.  It actually adds a new level of oppression: being imprisoned by immediate impulses and inclinations.

Wednesday, October 3, 2018

ADVERTISERS LOVE FREE WILL

“Data ‘R Us” could be the slogan for our era. It used to be that we, the inquirers, were on one side, and data were on the other. Scientists undertook their inquiries because they believed that, as a consequence, they would have a secure, fact-based story to tell.


Today the gap between data and us has been erased.  Algorithms, mining Facebook for example, aim “to gather sensitive personal information about sexual orientation, race, gender, even intelligence and childhood trauma.”  


What good is such data?  Follow the money. Scientists mine data because they want an accurate story to tell.  Marketers love the new data because they have something to sell.  


For marketers, “value” is defined as whatever will increase sales for whoever is paying them. Sometimes the admission is delivered deceptively: “The more data you collect from customers…the more value you can deliver to them.” It soon becomes clear that “value” does not mean something that is independently good: “And the more value you can deliver to them…the more revenue you can generate.” Generating revenue via data-aided manipulation, i.e. interfering with freedom, is what it’s all about.


Within the world of food, the results of successful advertising are disastrous. When people eat what is best for the revenue of food companies, they are not eating what is healthy for themselves. The result: a situation in which obesity and early-onset diabetes are far more common than they need to be.

Most people realize that marketing and advertising have a specific goal: impacting our powers of adjudication, i.e. our ability to make free, well-informed choices. Why, then, is advertising so often defended in the name of freedom rather than understood as the obstruction to freedom that it is?


Here is where philosophy plays a role. It’s all about the meaning of “freedom.” We, inheritors of bad philosophy, continue to confuse freedom with “free will.” There’s the rub. Freedom and “free will” are NOT the same thing. Human freedom is the ability of reasonable beings to consider their circumstances intelligently and to elect among possibilities. “Free will,” by contrast, suggests an inner faculty, completely unconditioned and unaffected by circumstances, one that provides a power at any and all times to make uninfluenced choices. “Free will” is attractive because it offers us a self-description as somehow disincarnate. It is also dangerous. It encourages us to think in all-or-nothing terms: either we possess free will or we do not, and the forces shaping and influencing us are irrelevant. Defining freedom strictly in terms of free will impacts our thinking. It discourages concrete questions about what factors favor informed election among options. It directs thought away from examining the various ways in which freedom can be enhanced or diminished.


The flawed identification of freedom with “free will” provides plenty of cover for marketers. They can defend their work by saying that no matter how much money and expertise is marshalled to shape people’s selections, such individuals continue to have free will. At the same time they can hire psychological and data specialists for precisely one reason: to limit freedom. They can thus have it both ways: justifying their practices by saying that there is always “free will,” and adjusting those practices so that they become more and more effective in interfering with freedom. They can even block the efforts of public servants by claiming that it is they, the public servants rather than the advertisers, who seek to suppress freedom.


Freedom is a great good. It recognizes how we can chart paths for ourselves. At its best it manifests itself in appraisals and evaluations that result in well-considered judgments. Those judgments, in traditional language, are verdicts, i.e. decisions based on truths. Ver-dict, truth-saying, is, for marketing, beside the point, and, really, an obstacle. The defenders of advertising are friends of free will but not friends of freedom. Adjudication on the basis of evidence is exactly what they seek to eliminate. They are savvy manipulators who construct scenarios, not because they have something important to tell, but because they have something to sell.

Wednesday, September 26, 2018

ANOREXIA INTELLECTUALIS


The Righteous Brothers had a hit song with a line that highlighted an important stomach-related word. “I’ve hungered,” it went, then, repeating the word, “hungered for your touch.” It’s a lovely song, but it has nothing to do with eating. Still, that doesn’t matter. Listeners know that “hunger” and “hungered,” when the context is human beings, can mean more than filling an empty stomach.


Take Aristotle. One of his most cited lines reads like this: “all men, by nature, desire to know.”


What he says is that we have an “orexis” for knowledge. The term translated “desire” can mean reach for, yearn for, or hunger for. But why can “hunger,” a biologically derived term, apply in contexts of affection and intelligence? Humans are biological creatures, and they are more than that. The metaphorical use of “hunger” offers one level of support for such continuity with difference. Aristotle’s entire philosophy was friendly to biology. This did not mean he reduced the human to the merely biological. There are continuities and differences. He insisted that all ideals had a natural basis and all natural tendencies an ideal culmination (to paraphrase George Santayana).


Philosophers are often accused of playing with words. Sometimes this is true. Sometimes, though, recognizing distinctions is meaningful. In his book Hunger, Raymond Tallis suggests that “hunger” and “appetite” should be distinguished. The former, along with the non-food-centered term “desire,” is more indicative of all-embracing humanity than is “appetite.” The difference? Narrative time. “A desire is an appetite that has a story--albeit unfolding, evolving, fragmented--at its heart. It is a hunger that narrates itself and tries to make sense of itself.”


Appetites are raw, instinctual, sated in a stimulus-response way. Desires/hungers are multi-layered. They are often occasioned not by an object but by a relationship. That’s certainly the case both in the Righteous Brothers’ “Unchained Melody” and in Aristotle’s claim about the human hunger for knowledge. What is yearned for is ongoing (time-factored) actualizing: making love real in one case and satisfying curiosity in the other.


The culminations of such hungers occur only within spans of time. Unfolding within spans of time means not only that they are ongoing, but that they require effort and, instead of unimprovable perfection, often result in culminations with loose ends. They are thus precarious and subject to contingency; they require that we make ourselves vulnerable, and they can be frustrating.


For those reasons, there is a general tendency to an-orexia, the suppression of hunger. This is not just the pathology associated with physiological appetite. There is also intellectual an-orexia: the longing for absolutely secure, contingency-free, doubt-free existence. One impediment to such security: thinking. After all, thinking is bothersome, annoying, discomforting. If only the yearning, the hunger for knowing, could be suppressed, if only intellectual anorexia could become the rule, then a sense of comfort and security could be established. The paths to intellectual anorexia are well-paved and multiple. Two came to special prominence during the Modern (roughly 1600-1900) era: logical deduction and calculative rationality. Using formal logic to deduce conclusions that are certain has little to do with thinking. Thinking always engages complicated situations and arrives at resolutions that are secure only within varying degrees of probability. Instrumental rationality, which starts with a pre-ordained end and then calculates the most efficient means of getting there, offers a poor substitute for thinking, which puts ends themselves into play. Other strategies have been around forever: absolutism and skepticism.
But wait, readers may be thinking, aren’t these opposed to one another? Well, yes, they are. One says, “the answers pre-exist; all you have to do is accept them.” The other says, “there are no answers.”


Despite their differences, and like logic and instrumental rationality, both offer a particular and attractive invitation: stop thinking. It might be “stop thinking because the answers are already at hand.”  It could be “stop thinking because there are no answers anyway.” In each case, the result is the same: peace of mind occasioned by sidestepping the annoying practice of thinking.


For the physiological anorexic, the body’s constant demand for edibles is felt as oppressive. For the philosophical anorexic, it is the demands of the mind that are annoying. Thinking involves “trials,” literally processes of sifting out, of trying one path and then another. Such processes are also “trying” in the sense of being annoying, bothersome. Like hunger, thinking never leads to stasis. The practice is unending. How to eliminate such discomfort? Go intellectually an-orexic. Hide behind pure logic. Transform thinking into calculating. Pretend the answers are already obvious. Insist that finding answers is not possible. In other words, suppress a basic human orexis: stop thinking.

Wednesday, September 19, 2018

Eat/Die 2




Robert Indiana’s diptych highlights two words: Eat and Die. “Eat,” as we saw in the previous blog, is complicated. Yes, we eat in order to stay alive, but human eating tells us something about what it means to be fully human.


Something similar occurs with “die.” Martin Heidegger, once again, leads the way. He suggests that animals “perish” whereas humans “die.” Conscious awareness marks the difference. Heidegger gets all worked up about this, referring to human life as Sein-zum-Tode, “being-unto-death.” On one level the phrase is a truism: we live in the awareness of our mortality. As a result, prioritization about how to spend our limited span is forced upon us. At its best, this awareness encourages a more “authentic” existence.


So far so good, but there are limitations.


First, the “being-unto-death” focus is too individualistic. Fastening on concern about my individual death ignores an important aspect of existence: sorrow and grief. Grief at the death of others is often our earliest confrontation with mortality. The experience of grief moves us away from the isolated subject facing his (this is the pronoun Heidegger would have preferred) own death. Grief means we have loved. The love/grief awareness, in turn, reveals something important about our cares, concerns, and patterns of significance. Death experienced as grief diverts attention from the atomized subject and its authenticity. It emphasizes, rather, the worthiness of a life lived in relation with others.


Second, we are not just “beings-unto-death.” Our physiologies signal a pattern. Fully articulated, it’s a pattern of birth, puberty (because we do not live forever), birth of others, then death. Even if we do not have children of our own, we are attuned to the birth/puberty/birth/death cycle. This is where eating, and the mother imagery from the previous blog, come again to the fore. The gestating mother, the one who is food for the fetus, is about to do something very important, something inseparable from mortality: give birth to a new human being.


Isolating mortality from natality encourages insulated, self-directed reflection on how to shape our life trajectories. Admitting we are both “beings-unto-death” and “beings-unto-birth” encourages a more social set of concerns. It encourages us to reflect on the kind of world we will leave behind for the offspring who will follow (whether we actually produced any or not).


Third, life-time should not be thought of as an unending sequence of instants. When time is envisioned as a line, it makes sense to envision life as possibly going on forever. This is what the 17th-century philosopher Spinoza did when he asserted that any finite entity strives to “persevere in its being.” Experience teaches a different lesson. Finite beings realize they are part of a finite cycle.
 
Time is every bit as much a cycle as it is a line. Perhaps, for lived experience, time is actually more periodic than linear. Even philosophers fascinated by an unending line recognize nature’s cycles of seasons and years. They realize that the sun itself will, at some point, come to an end. Life-time is periodic, punctuated by beginnings and endings. For biological creatures, periodicity is marked by overlapping spans, with newer generations always in the making.



Death may be untimely and thus appear as a dark-garbed, scythe-wielding assassin. But death itself is not opposed to life. It is a necessary constituent of any life-trajectory. Our path as organisms is to live a limited span, make a kind of mark, leave offspring, and die. Emphasizing only the impulse to live on, or thinking of death as a kind of enemy, is, in an important sense, to deny our humanity. Contrary to Spinoza, we should encourage finite beings to accept their finitude.


When we think mortality, grieving, and natality together, we can recognize how thoughtful reflection can move in a direction different from the existentialist emphasis on an “authentic” life. That new direction includes a few important elements: (a) Thinking mortality/grief/natality together encourages us to think in terms of “we” rather than “I.” (b) The fundamental concern moves from “how shall I live authentically?” to “how shall we make a world for the next generation?” (c) Finally, the mortality/grief/natality awareness encourages a shift in time-consciousness. We recognize ourselves as periodic beings, part of a particular span which will impact overlapping, subsequent spans. The emphasis is thus on generosity in terms of what we bequeath. Worthiness replaces authenticity.


Eat/Die. Both unavoidable. Both complicated. Both sources of philosophical reflection. The best reflection will hold them together. We are beings unto life and death. Isolating one from the other falsifies our condition and leads to faulty thinking.