Saturday, September 3, 2016

Eating Bugs


Warren Belasco has done a lot of thinking about the future of food. When he asks students what they think the future of food will be, one answer stands out: “probably food in a pill.”  Why not? It’s efficient, speedy, involves no need to plan meals, shop or clean up.  Still, even those who think this is where we are headed do not necessarily favor it.  The space program offers a good counterexample.  Compact foods in tubes, a “drink” like Tang, various versions of the “meal in a pill” theme, have been kind of a bust with astronauts. Nutritionists have now done a better job mimicking real meals, even going so far as to fashion a table around which astronauts could gather.


Down here on earth, the latest incarnation of the “meal-in-a-pill” is the drink Soylent. It claims to offer a nutritionally sufficient alternative to old-fashioned food. Just gulp down the drink several times a day, and there is no need to bother with meal-planning, shopping, cutting, peeling, seasoning, cooking or cleaning up. Soylent recently branched out, developing a crunchy bar as a snack supplementing the bland drink. (When my students tasted liquid Soylent, their general reaction was: it tastes like the milk left in the bowl after eating cereal.)


So here are the contradictory moves: people think we are headed in the direction of synthetic food, yet there is little evidence that synthetic food will catch on.


It could just be that the future of food lies not in the direction of artificial concoctions but rather in expanding the kinds of creatures we eat. I’m talking entomophagy here--what could very well indicate the direction of food’s future. “Entomophagy” might be a big word, but it means something simple: eating insects.


The background is by now familiar: lots and lots of humans crowding the earth, meat eating as an ecological disaster, starvation and ill-health as evils to avoid. Result: we need a new food source. Sci-fi, techno-geek types go the “meal-in-a-pill” route. Head-in-the-sand types pretend there is no problem. Others take the entomophagic gamble: we’ll just have to get used to eating insects. Edible insects are already available online, and entomophagy already has its boosters.


Plentiful and easily replenished, insects can help provide the amount of food needed for a crowded planet. They can do so in ways that are nutritious and do minimal damage to the environment. It all seems “win, win, win.” But then the “ick,” “yuck,” “gross” factor kicks in. Disgust raises its head at this point, and disgust is not to be taken lightly. Paul Rozin, the psychologist who has studied disgust, is straightforward about its association with humanity: "It's hard to imagine civilization and culture without disgust, the sense of what's inappropriate. If you could imagine a person who is free of disgust, it's sort of hard to imagine how they would be distinctly human. It's got to do with the modern sensibility; it is the -- the -- sign of civilization."

Disgust operates as a safety device. Socrates used to say that he felt a kind of inner voice which never told him what to do, but regularly warned him not to do something. Disgust is sort of like that. A lovely table, nice china, a plate full of vomit--disgusting and not to be eaten. For many of us, insects sort of cross the border from the merely distasteful to the disgusting. Can this be overcome?

Some indications say yes. First, not all humans find the ingestion of insects to be disgusting. Second, taste is educable. Third, educability arises from a simple fact: disgust is not infallible. Stinky cheese is the example regularly used to illustrate the second and third points.

Predicting the future is fraught with difficulties. Right now it looks like eating bugs is inevitable. For some of us it’s not as much of a change as it might seem. I write this from Maine where the great delicacy is lobster. Fishermen refer to them as “bugs.” The word “lobster” derives from an old word meaning “locust.” There is also actual evidence of a genetic link between land bugs and lobsters. A moral case can be made for refusing to eat lobsters. David Foster Wallace made it most famously. For those who savor the crustacean, entomophagy (albeit in disguised form) is already part of their diet. Switching from ocean bugs to earth and air bugs might not be too big a step after all.


Saturday, August 27, 2016

All You Can Eat



“All you can eat.” “Ice cold beer.” Signs meant to attract when they should repel. What is it about us that finds attractive an appeal to food in bulk, to a liquid so chilled that the subtleties of taste disappear? Naturally, as readers of this blog realize, it all goes back to philosophy.
For a long time, evaluation, determining what is most worthy of selection, was dominated by familiar themes: moderation, limit, balance.  Take the ancient Greek exhortations: “Know Thyself” and “Nothing in Excess.”  


These are closely related.  By knowing ourselves, (1) we realize the importance of  achieving moderation, the right mean between extremes.   In turn, moderation makes little sense without (2) an appreciation for limits, and (3) a sense of multiple variables that have to be kept in balance.  The food world offers ready-made examples: salt, sugar, fat.  Each of these is good, in moderation, within limits, as part of a balanced diet.
The “nothing in excess” philosophy held sway for a long time. Like any position, it could be pushed to excesses of its own. “Limits” morphed into rigid social structures and oppressive moral codes. So the Modern, post-Renaissance world was inaugurated by a challenge to limits. The Modern era was liberationist. Freedom, i.e. rejecting limits, constraints and restraints, became the highest good.
One positive result was the establishment of democratic republics. Once again, though, the “nothing in excess” rule came to be violated. Not just violated: the very rule was put into question. Limits came to be seen as inherently bad. They all needed to be transgressed, subverted, overcome. We have here an oft-repeated pattern: overreaction. Proper attempts at reform come to be dominated by simplification and short-cuts.
Nothing offers a more convenient short-cut than quantitative measures. Such a move also encourages another human temptation: evading responsibility. Instead of exercising responsibility, we seek neat algorithms, easy formulas to follow.

The “nothing in excess” position made evading responsibility difficult. Moderation, by its nature, is an elusive target. Success depends on many factors: individual circumstances, the right cluster of ideals, tradition, experience, evidence from experts. This mix hardly provides a clear-cut, knock-down, “just follow this” algorithm. In other words, uncertainty and anxiety are built in to the “nothing in excess” model.
How to circumvent uncertainty? Minimize anxiety? Reduce responsibility? Simple: abandon the “nothing in excess” philosophy and adopt a “challenge limits/seek quantitative measures” approach. “Good” can now come to mean (a) ignoring limits by seeking always to overcome them, (b) ignoring balance by maximizing one outcome, and (c) ignoring moderation by taking guidance from quantitative measures.
Daniel Yankelovich, the public opinion analyst, well described this last strategy:
 "The first step is to measure whatever can be easily measured. This is OK as far as it goes. The second step is to disregard that which can't be easily measured or to give it an arbitrary quantitative value. This is artificial and misleading. The third step is to presume that what can't be measured easily really isn't important. This is blindness. The fourth step is to say that what can't be easily measured really doesn't exist. This is suicide."
Step three is where “all you can eat” and “ice cold beer” fit in. The simplification/short-cut is clear: the best food “deal” provides the most bulk for the price; the best beer is chilled to the max. Both standards are measurable. Both isolate a single factor. Each rejects limits. Neither aims for the right mean between extremes. Both, indeed, identify good with a rejection of moderation.
So, why are we drawn by such invitations? We have bought into a “know thyself” attitude that is quite different from the “nothing in excess” one.   The self we know today is a bundle of desires seeking satisfaction.  The “good” life is one which maximizes satisfaction of desires, breaks boundaries, and finds comfort in quantitative standards.
In short, today’s response to “know thyself” is “I am a consumer.” The “citizen” of ancient and modern republics has given way to the “consumer.” “All you can eat” turns out to be a siren song. It’s a feel-good, don’t-let-others-limit-you kind of call. In the end, though, Nemesis, the Greek goddess of comeuppance, returns. Subversion of food limits comes with an inevitable non-monetary cost: weight and health suffer. Our value system is badly distorted. Signs like “All you can eat” and “ice cold beer” should send a clear signal: stay away.


Saturday, August 20, 2016

Avoid Laziness: Dispute Tastes

A poet addressing an audience of 14,000? It happened. The speaker: T.S. Eliot. The year: 1956. The place: a Minnesota athletic stadium. How things have changed! Today, beyond athletic events and rock concerts, stadiums are permeated by what has become most pervasive: advertising. The best slogan for the last 60 years: “from poetry to advertising.”


Poetry is thick with allusions, references, ambiguity, the blend of sound and sense. Its aim: revelation. The poet has something to tell. Advertising relies on stripped-down language. Its aim: manipulation. The advertiser has something to sell.


Food studies people pay attention to advertising and its impact on health.  Now we have a philosophy book whose reflections begin by noting how ads are everywhere (see urinals, airplane tray tables, gasoline pumps). Not only that, the book uses the example of a cook both prominently and positively. Matthew Crawford is the author. The World Beyond Your Head is the title.  
Crawford’s concern is with “attention,” especially how our attention has been colonized by those with something to sell. He realizes, it’s a philosophy book after all, that this state of affairs was prepared by a general understanding of who we are, a particular “philosophical anthropology.”  This anthropology describes humans as minds (the “head” of his book’s title).  Things and events become neutral items providing data for the mind. We call these data “objects,” items in the “world beyond your head.” One outcome of this picture is expressed in ordinary language as “it’s all subjective,” or the food-related  “there is no disputing about taste,” i.e. the individual mind is the source of value.  Examined psychologically, such assertions reflect  laziness and self-interest. They combine an unwillingness to scrutinize value judgments and the self-satisfaction of the status quo. To say “there is no disputing about taste” is, in effect, to say, let’s not bother thinking about this, let’s leave things as they are.  Examined philosophically, as Crawford does, such assertions are part and parcel of a picture which, forgetting hands and stomach, artificially characterizes the human situation as  “head” and “world outside your head.”   


In the realm of food it is a commonplace that the more the stomach-hand connection is interrupted, the more “de-skilled” we become. One example: the less we are able to cook. In turn, this de-skilling renders us vulnerable to those who would gladly (a) provide the service and (b) shape our judgments. Attention can be colonized because the de-skilling is accompanied by a loss of what alone can provide a defense: standards and measures for discrimination.


Crawford’s concerns are similar. The realm of “it’s all subjective” thrives in the absence of justifiable, generally accepted criteria. Why? Without agreement on standards, we are empty, blank slates, ready for colonization. The colonization provides substitutes. On one hand, as qualitative criteria disappear, quantitative benchmarks, often “narrow economic considerations,” become the default. On the other hand, we are told to reach within and find our true selves, i.e. accept uncriticizable subjective feelings. “The fact that these preferences are the object of billion-dollar, scientifically informed efforts of manipulation doesn’t square with the picture of the choosing self assumed in the idea of a ‘free market.’”
In all of this, an overarching norm takes center stage: conformity. Why do the hard work of selecting quality-driven exemplars to emulate?  An obvious and easy (laziness again) standard is now available: opinion polls.  “We cannot look to custom or established authority, so we look around to see what everyone else thinks.  The demand to be an individual makes us feel anxious, and the remedy for this, ironically enough, is conformity. We become more deferential to public opinion.”


The best way to liberate attention from its colonizers is to multiply engaged activities, like cooking. We then attend to our surroundings in ways that (1) encourage us to recognize factors of significance built into surrounding conditions, factors open to reasonable discussion and debate. Their significance may be related to ongoing projects (what the “it’s all subjective” fans emphasize), but the what and why of significance are not of our making. (2) Within any craft tradition there are masters and experts. We learn to appreciate, admire, and strive to emulate role models. It is genuine living models who serve as touchstones, not some statistical mean.

Ours is a world dominated by those who have something to sell.  Conformity is crucial for them.  Confusing “norm” and “average” helps their cause.  Having a de-skilled populace is a great boon. Cooking and other hands-on activities allow openings for our liberation.  Crawford seeks to blend hands and mind. To his credit, he wants to make room for those who have something to tell.

Monday, August 8, 2016

Unforget Stomach, Remember Socrates


Rowman and Littlefield has now made I Eat, Therefore I Think available in paperback. This comes as the number of philosophy books addressing food issues is growing. Some recent examples: Paul Thompson, From Field to Fork: Food Ethics for Everyone; Corine Pelluchon, Les Nourritures: Philosophie du Corps Politique; Julian Baggini, The Virtues of the Table: How to Eat and Think. In addition, there is the book authored by Lisa Heldke and me, Philosophers at Table.

Why the upsurge? There is renewed interest in everyday activities typically ignored by philosophers.  Food studies has burgeoned.  Both hunger and obesity offer contemporary challenges.  Issues of sustainability and the treatment of animals have also provoked reflection.

Where does my book fit into the discussion?  Perhaps the most important lesson is one the book does not explicitly mention: philosophy is not an abstract discipline.  Philosophy, as Alfred North Whitehead put it, is a critic of abstractions.  “But, wait,” readers are probably thinking, “Isn’t philosophy abstract thinking?”  “That’s what we learned in philosophy class.”  

All well and good, but partial and misguided.  In its historical trajectory, philosophy has indeed come to be associated with abstraction and mind puzzles.  I Eat, Therefore I Think aims at changing this take on philosophy.


To accomplish this, I began with a clumsy term: “unforgetting.” “Un-forgetting” literally translates the Greek word for “truth,” aletheia. First, my book seeks to unforget that we are stomach-endowed. It might seem obvious that humans are creatures for whom the stomach plays a major role, but philosophers of a particular period have tended to ignore this. Second, the book seeks to unforget, i.e. reinsert, some Socratic elements into philosophy.

A few Socrates-inspired elements stand out:

Dialogue. Socrates believed that thinking through an issue involved cooperation with others. Thinking is not an isolated activity that goes on in the head. Thinking, in the fullest sense, takes place via dialogue.

Irony. Socrates embraced “irony” in its philosophical sense: an awareness that even our best formulations somehow fall short. Philosophy’s task is unending because our articulations, while they get something right, are, at the same time, lacking in other ways.  Reality is too complex to be captured in any snapshot.

Agora.  Socrates practiced philosophy in the agora, the marketplace, the public square.  The topics he discussed were rooted in the living concerns of flesh and blood people.

What? Socrates showed how philosophy is not concerned with the question “why.” Its purview is the question “what.” What is friendship? What is love? What is virtue? Philosophy means “love of wisdom,” and the “what” questions allow for actual, helpful answers that guide “wisdom.” “Why” questions, e.g. “Why is there something rather than nothing?” “Why are we here?” are more the province of myth, non-philosophical ways to deal with questions about absolute origins or absolute ends.

How does a stomach-friendly approach help re-invigorate these four Socratic themes? A major metaphorical alteration is first needed: thinking of conceptual abstractions as recipes. Then, just as recipes, though helpful, can be critiqued and altered, so philosophy can undertake its role as “critic of abstractions.” The same move helps philosophy become Socratic once again.

Dialogue and the Agora.  Recipes involve dialogue.  We start where we are, in the agora. The cultural heritage in which we find ourselves privileges a conceptual framework, a series of recipes.  Engaging in dialogue with that tradition, we critique, we restore, we revise received recipes, we develop new ones. In this process, we are often aided by adding a dialogue with other traditions.  

Irony. Recipes are never fixed, finished and final, i.e. the slippage between formulation and reality, a slippage which identifies genuine irony, is ever-present.  What counts, in the end, is how recipes become manifest in experience.  The philosophical “criticism” of abstractions involves the back and forth between abstractions and the way they play out in lived experience.

What, not why. When we think about recipes, the “what” dimension dominates. Questions about ultimate origins and ends are unanswerable apart from storytelling, from myths. These, it must be noted, may be religious or they may be evolutionary. Either way, they are not philosophical. The recipe focus, which begins by humbly accepting the simple fact of our being here, tends rather to highlight “what” questions. What makes a meal delicious and nutritious? What makes a life good? Starting with only vegetables, what can we do to come up with a delicious/nutritious combination? What is the extent of justice? Does it encompass non-human animals?

I Eat, Therefore I Think offers itself as the kind of corrective that will re-define philosophy as both the critic of abstractions and the friend of Socrates.  Hopefully, the paperback version will allow those ends to be widely discussed and disseminated.

Friday, July 29, 2016

Festivity: Thanksgiving vs Las Vegas


Georgetown Island, where we spend our summers, is 300 years old. Let the festivities begin!   Wherever there is celebration, there is food. Georgetown’s festivities began, naturally enough, with a birthday cake.
Festivity, despite its prevalence in history, is sort of fading in our age. We have embraced the 24/7-365 world. That means nonstop work, interspersed with diversion. Such diversion, it is important to add, offers a poor and false substitute for festivity. In fact, a defining trait of our time is the substitution of entertainment, pleasure-seeking, and diversion for festivity.


Festivals are about a combination of factors:  (1) recognizing how we are part of something greater than ourselves, (2) appreciating the dependencies that characterize our lives, and (3) suspending our normal schedules to celebrate those interdependencies.


“Festivity” is at heart a religious affair. Why? Because its mode of responding is built around the nexus identified above: dependence, gratitude, celebration. The wider world includes sources of nourishment, biological ancestors, political and social ancestors, along with the ability to bring about future generations. All of this is understood as a gift. Within such a context, the festive dimension is fundamental. The regular workaday efforts at making a living remove us, much of the time, from this festive awareness. They should not, however, completely occlude it.



Such occlusion is more than ever prevalent today, dominated as we are by the 24/7-365 commerce-and-diversion lifeworld.  In the United States, Thanksgiving long remained a firewall, an inviolable witness to the festive.  Now, as that holiday gets overrun, nothing is left to block the new prototype: Las Vegas.  Here is a place where even major national and religious holidays never interrupt the commerce-and-diversion industry.  The resulting symbol for us: not the social grouping seated around a meal table, but rather the singular self seated at a slot machine.


How did we get here?  In terms of ideas,  the answer has to do with “philosophical anthropology,” the general way we define ourselves.  It is hard to deny that we are creatures of hunger. Hard also to deny that we are creatures of natality, i.e. we have been born. Hunger and natality, these set the stage for the dependence-gratitude-celebration triad that defines the festive.  


But hunger and natality mean dependence, an admission that we are beholden to forces outside ourselves. Modern philosophy, post-Medieval philosophy, was a huge attempt to escape from dependence. It was all about self-sufficiency. To get there, intellectuals had to redefine the human being. This redefinition was rooted in a fantasy about human origins. The fantasy came to be known as the “social contract” story. The story starts not with birth from an actual female, not with hungry infants suckling at the breast, not with relatives and community members who are supplying food. It begins with full-grown individuals, portrayed as “encapsulated selves,” as critics call them. These encapsulated selves then agree, rationally and in terms akin to a commercial exchange, to enter into a contract and become social, members of a community.
  
Because philosophy forgot hunger and  natality, it drifted away from the genuinely “festive.” With self-sufficiency all the rage, the encapsulated human became the default position.  In addition, to the degree that humans forgot their hunger and natality, the world to which these connected them was also transformed.  It became more and more mere matter, stuff to be manipulated.  Humans began to understand themselves, not as woven into the fabric of things, but as outsiders somehow stuck in a  natural setting that was neutral, indifferent, waiting to be transformed and mastered.  Dependence, gratitude, celebration?  Not in this newer world.


So our time and that of the ancients sort of reverse each other.  Holidays like Thanksgiving lose their vibrancy.  The draw of Las Vegas becomes ever more powerful.


Food also takes on a different meaning. Fast food is what results from a world in which festivity has lost its primordial pull.  A traditional Catholic calendar identifies each day as a feast day.  Though they may not all be major celebrations, all meals, drawn from the earth’s bounty, shared with others, accompanied by gratitude, should be somewhat celebratory. By contrast, in our 24/7-365 world, pausing to recall how each day is really a feast day and each meal should be celebratory, these become the exception, not the rule.  

Philosophical shifts in self-understanding make major differences. Start with a physiological creature who is hungry, has been born, is thankful and thoughtful.  Then festivity is primordial. Start with an encapsulated self fascinated with self-sufficiency, set over against a neutral reality, a reality simply awaiting manipulation. Then the 24/7-365 world becomes our default condition.  Goodbye Thanksgiving. Hello Las Vegas.



Friday, July 22, 2016

Appetite-Taste/Nature-Culture



Immanuel Kant once disparaged an Iroquois visitor to Paris for appreciating nothing other than “the eating houses.”  Kant wished to sort out humans guided by appetite from others guided by superior tastes.    A recent article in Aeon magazine deals with food metaphors relative to reading. It charts the trajectory from appetite to taste, suggesting how “In the 18th century, writers began to distinguish between appetite (the connection between reading and the body) and taste (connection between reading and the mind).”


“Appetite” is immediate and indiscriminate. “Taste” is selective, refined and mediated. Appetite and taste may, in ordinary minds, be intermingled. This would not do for Kant. Aided by free-standing substantives like “body” and “mind,” his ideal was not the proper integration of the bodily and the mental. It was the segregation of body and mind. Physiological taste relating to food was suspect because it remained intertwined with mere appetite. The more elevated tastes would have little to do with food.


The pattern here is familiar: (a) a need to sort out better and worse; (b) mapping better and worse isomorphically onto mind and body; (c) thinking that humanity, as the Aeon writer put it, represents little more than a “cesspit of ungoverned appetite”; and (d) subsequently, celebrating a power of domination, self-control, as the only hope for keeping appetites in check.


On the other side, there have always been philosophers championing “natural” tendencies. They warned against the impositions of artificiality and convention. The Stoics and Epicureans moved in this direction. Their “follow nature” mantra was resurrected by a thinker like Jean-Jacques Rousseau, by Romanticism, and by various communal living experiments in the 1960s.


Both tendencies, the superior-taste crowd and the follow-nature crowd, share one commonality: they are autoreferential. This allows them an escape from risk and responsibility. The superior-taste position says it’s all about the cultural conventions we have imposed. End of story. The follow-nature school says: I am just following my own true nature. End of story.


What is absent? Risk and responsibility. These emerge with dependence, with hetero-referentiality. The latter asks of us that we respond to signs and signals coming from outside of us.
The appetite/taste contrast is one version of the wider nature/culture opposition.  Are there ways to rethink this opposition? To break with autoreferentiality?  To embrace, rather than escape, vulnerability and dependence?  To emphasize the unending need for exploration, questioning, dialogue?  


The answer is yes, but the approach may be surprising--taking taste, this time actual physiological taste, seriously. Such taste is unavoidably hetero-referential. It involves responsiveness, i.e. responding to signs and signals coming from outside of us. Some results follow. First, natural appetite and taste are not two contrasting forces; they are correlative. Second, the “taste” that fosters well-being is neither a “construction,” an artificial, merely subjective imposition, nor is it merely hardwired. It is appetite cultivated, channeled, informed by experience, experiment and tradition.


“Taste” serves as a good model for hetero-referential responsibility because it cannot use either culture or nature as a final determining factor. Rather than calling on a single foundation, it always involves election among alternatives. This election, in turn, emerges from clues, indications, signs that are present in a world we have not made but on which we depend. Taste not only depends on factors apart from us, but is also, as scientists put it, “multi-modal,” involving, as it does, taste buds, smell, tactility, family and cultural practices, visual clues, temperature and even sounds.

What does all this mean? First of all, we have to reintegrate appetite and taste (also nature/culture). Appetite requires taste, and by nature we require culture. Second, when using “taste” metaphorically, we should not stray too far from its physiological, food-related associations. Wanting to get beyond responsibility defined as election among alternatives, we tend to ignore the irreducibility of the multiple and seek some single criterion. This criterion is then utilized as a simple, straightforward guide which mandates a particular behavior.

When, by contrast, we remain close to the multi-modal understanding of taste, we are always (i) dealing with a multiplicity of factors, (ii) recognizing that many of those factors involve an inseparable blend of nature/culture, (iii) responding with a melange that seeks a proper balance of factors, (iv) taking an active role in adjusting that balance, revising and polishing it, engaging in an experimental back and forth that moves from worse to better, and (v) all the while realizing that a perfect unity or un-revisable blend will never be achieved.

Kant was right to emphasize taste, just not in the way he envisioned it.

Friday, July 8, 2016

Ferran and Heston OR Julia and Alice


I’ve just returned from a conference on food aesthetics where, no surprise, the food/art connection took center stage.  Use the expression “food art” and what comes to mind?  Well, not really food.  Still life paintings can be beautiful to admire.  They do little for hunger.

During the 1960s food was used in happenings, happenings far from the experience of the table. One included a scene of “women licking jam off a car.” Another, called “Meat Joy,” involved young writhing bodies interacting with dead chickens, fish and each other. More recently, Felix Gonzalez-Torres produced a sculpture made of candy in brightly colored wrappers, a work commemorating the death of Gonzalez-Torres’ partner.


What do these have in common? Ordinary eating is bracketed. When chefs are hailed as artists they tend to be those who have “elevated” cookery from the realm of the everyday. Ferran Adria’s molecular cooking and Heston Blumenthal’s playful creations have allowed them to drift upward.


Seems sensible enough. “Sensible enough,” though, is precisely what sets philosophers going. It means that among a possible range of options, one has become so central that others are forgotten. Thinking, envisioning other possibilities, is then blocked.
“Art” offers a paradigmatic case. I often ask students to identify favorite artists. The responses: Van Gogh, Rembrandt, Goya, Pollock, Monet. I then indicate how the question was not about favorite painters, but artists. The sedimented sense of “Art” is well symbolized by the unquestioned, and thus unreflected-upon, assumption that painting is what should primarily come to mind when we discuss “Art.”


Such an understanding of Art  (upper case “A” is important) emerges within a specific philosophical take on things. This particular take characterizes humans as essentially spectators to the world.  Those arts which are most “spectatorial” then move to central prominence.    


The inherited philosophy devalorizes the everyday world of ordinary practices.  Beauty is outsourced.  To the museum, the concert hall, the theatre. This is what allows the complementary conceptualizations “fine” and “applied” to emerge. As a corollary, fine and applied are inversely related.  Works “tainted” with utility, arts like  weaving, pottery, architecture, ritualistic dancing, landscape design and cookery, fall to a secondary status.  


Changing the complementary labels, I would say that, in the inherited framework, the only way the operative arts can rise in stature is to approximate the spectatorial ones. Heston Blumenthal’s Artist credentials are evident when he re-envisions sugary, frozen treats of his youth, transforming them into savory concoctions which stun, surprise and wow his patrons. Similarly, when a diner bites down on one of Ferran Adria’s liquid “olives” there is surprise, wonderment, astonishment. Above all, the experience must move well beyond a regular, normal repast.


But that is to violate one of the great injunctions of creativity: working within constraints while making excellence real.  A vase that is exceptionally lovely but not functional falls short in this regard. When we think of operative creativity as that which offers the best combinations, combinations which include utility, then judgments about what counts as fineness become more suitably contextualized.  There is no longer need to copy the spectator arts in order to achieve fineness.  For the operative arts, living beauty in  day-to-day existence becomes a major desideratum. It also offers special challenges to creativity, challenges that cannot be bracketed.


If this is the case, the contemporary way of selecting food artists highlights the wrong exemplars. We tend to fasten on those who can most imitate and emulate the spectatorial arts, the Adrias and Blumenthals. If we emphasize day-to-day beauty, stressing the conjunction of beauty and use, insisting that fuller, more replete beauty lives in this combination, a combination requiring high levels of creativity, other models come to mind. I am thinking in particular of Julia Child and Alice Waters. What both of these chefs aimed at was food that looks like food, tastes like food, satisfies hunger, encourages conviviality, and is complex and delicious. Waters specifically named her restaurant after a hospitable, generous fictional character. She hoped a meal at her restaurant would be like a convivial dinner at home.

What is important here is how the particular experience, which is participational, tied to nutrition, and associated with sociality, is brought to its highest culmination, a culmination that does not force cookery to mimic the spectator arts. The operative arts can achieve fineness at the highest level. They serve as models for the rest of us. They also set a standard for living beauty rather than outsourcing it. In these regards it seems to me better to celebrate Julia and Alice, not Ferran and Heston.