Telling Us How to Think – WIF Mind Games


20th Century

Philosophers

(And What They Believed)

There’s a joke about a degree in philosophy where the people majoring in it get asked, “would you like fries with that?” Getting a degree in philosophy is supposed to be such a waste of time and money because philosophy ostensibly does not provide a utilitarian skill set. This claim was compellingly countered by Atlantic magazine in 2015, which found that philosophy majors had an average mid-career income of $82,000. It indicates that while philosophers can seem like marginal, even frivolous, figures, they can make their contributions felt even while we mock them.

This list will be focusing on philosophers from the previous century. Philosophers from two or three centuries ago seem to get all the attention, not to mention the philosophers from about two millennia ago. Some of these names will be familiar.

10. Ludwig Wittgenstein

Born in Vienna, Austria in 1889 to a wealthy family of musicians, Wittgenstein in his youth desired nothing more than to work in engineering, specifically as it related to the infant technology of motorized flight. He became hung up on pure mathematics instead and went to Cambridge to be taught by Bertrand Russell himself. It was during a retreat in Norway, in a cabin he built, that Wittgenstein had the inspiration for the “Picture Theory of Meaning” that would make him famous. After fighting in World War I, and having divested himself of his inheritance in 1919, he took a job as an elementary school teacher in 1920 and held it for six years.

Wittgenstein laid out the Picture Theory of Meaning in his 1921 book Tractatus Logico-Philosophicus, which in brief said that unless a statement could be translated from an abstraction into an “arrangement of objects,” it had no meaning. It was a literalism that could be expected of a philosophically minded engineer, and one he also applied to the classroom, having students perform such hands-on learning as constructing models and dissecting animals (and applying corporal punishment to a degree that compelled him to lie about it and quit his job).

Wittgenstein reinforced the point of applied philosophy with his other book of philosophy that’s held up as a classic, Philosophical Investigations from 1953. Wittgenstein claimed that ethics and logic are inextricably linked, and that the only way a person could follow their ethics was to act on them. In his own words, “It is not possible to obey a rule ‘privately’: otherwise thinking one was obeying a rule would be the same thing as obeying it.” It’s a harsh rebuke for people who claim to be above others by not participating in the world around them, or who convince themselves that what matters is who they are “on the inside.”

9. Hiratsuka Raicho

For this philosopher born in 1886, the feminist beliefs that would one day change the face of Japan were initially born more of religion than of pure humanism. She had been taught that the Buddha claimed all people were equal, and naturally that meant all women must be equal to men despite lacking key civil rights. It wasn’t until she read the work of Ellen Key that she began to think of women as deserving equal rights for the sake of autonomy and individualism. As she wrote in her autobiography, women had once been “the sun” but society had reduced them to “a wan and sickly moon, dependent on another, reflecting another’s brilliance.”

The single most significant action Raicho undertook was founding and editing Seito, a literary magazine that ran from 1911 to 1916. She continued campaigning after seeing the appalling conditions of textile factories, which tended to employ primarily female crews. In 1920 she founded the New Woman’s Association, which came close to passing women’s suffrage in 1921 and in 1922 successfully pressured the government to amend the Public Order and Police Law. Although her goal of suffrage wasn’t achieved until 1945, her efforts still got her elected president of the Women’s Federation in 1953. Back in 1908, she had scandalously accompanied her (platonic) best friend to a mountain for a ritual suicide, with an attitude of curiosity about what it was like to die and a suspicion that her partner would lose his willingness to kill himself when push came to shove. It was the sort of combination of deep conviction and indifference to social pressure that is often essential to bringing about change.

8. Noam Chomsky

There are two movies about longtime Massachusetts Institute of Technology linguist Noam Chomsky that, between them, encapsulate his two main areas of interest: Manufacturing Consent from 1992, an analysis of profit-driven and narrative-driven media and US foreign policy, and Is the Man Who Is Tall Happy? from 2013, a collection of interviews by Michel Gondry in which Chomsky answers and asks philosophical questions about how language has shaped thoughts and memories practically since his birth in 1929, all illustrated in Gondry’s intentionally rough and childlike animation style.

Chomsky’s core beliefs relate to how controlling media (news coverage, commonly employed phrases and the words behind them, etc.) can be used to create public approval of what by basic human nature would not be acceptable. Back in 1968, in a televised debate with William Buckley (the same Buckley who threatened Gore Vidal on air), he argued that the US government’s claim that the military was occupying South Vietnam for the good of the Vietnamese was an excuse that had been used since the conquests of Ancient Rome. Manufacturing Consent also devoted much of its running time to how the media withheld coverage of the Indonesian invasion of East Timor because it served elite interests to ignore it. One of the core values of Chomsky’s political commentary, and of his stated views on language, is to always question the narrative being provided. He goes so far as to credit the school he attended before high school in the 1930s as well-suited for him because it gave him freedom in class to question instead of putting him through a highly regimented curriculum.

7. Jacques Derrida

Lately you hear the word “deconstruction” thrown around a lot in regard to media with some form of meta-commentary (e.g., a superhero movie where the filmmakers have the characters comment on the supposedly fascistic power-fantasy nature of superhero narratives within the movie’s dialogue). We can attribute the popularity of that term to a man born in French Algeria in 1930, a man who flunked his exams to become a licensed Parisian philosopher in 1952. Badly flunked, too: a score of five out of twenty, and he choked disastrously on the written portion. He would need three attempts before finally passing in 1956, and after some time in the military he spent decades teaching. It was while working in education that he wrote the essays that made him famous in the English-speaking world.

If Derrida’s philosophical insight that made him so influential were to be reduced to a logline (and bear in mind that this is someone who wrote 70 books and countless essays), it would be his critique of other writers who claimed they were being objective. Derrida said objectivity was functionally impossible, as the education any analyst had received would introduce biases that would affect their views one way or another. That claim is a rebuke to every school of thought, even Chomsky’s “question everything” philosophy: it argues there are only a very limited number of questions a person will ask, and narrow-minded ways they will be asked, with the limits set by the person’s upbringing. Derrida’s seemingly detached central tenet didn’t mean he avoided controversial opinions; he was an admirer of both Karl Marx and Nazi Party member Martin Heidegger.

6. Judith Jarvis Thomson

Whatever your views on the abortion debate, there’s no denying Thomson’s influence over the issue in the United States of America. Born in 1929, by 1969 she was a professor at the Massachusetts Institute of Technology in the Department of Linguistics and Philosophy. In 1971 she wrote “A Defense of Abortion,” which went a long way toward reframing the debate in a manner that put the feminist movement behind the landmark 1973 Roe v. Wade ruling. Its influence and controversy have led to the essay being dubbed “the most widely reprinted essay in all of contemporary philosophy.”

The most momentous passage of Thomson’s essay is a metaphor. Thomson asks the reader to imagine waking up on life support, their kidneys being used to keep a comatose violinist alive because they are the only person with a matching blood type. While the violinist certainly has the right to life, Thomson asserts that the reader also has a right to their own body, and potentially their own life. In so doing she reframed the debate from focusing on the rights of the fetus to focusing on those of the parent. This is hardly her sole contribution to the philosophical landscape (her redesign of the famous Trolley Problem, the moral quandary about whether it’s inherently better to take action that kills one person to save five, is another), but the 1971 essay remains her most momentous piece of writing.

5. Jean-Paul Sartre

Born in Paris in 1905, Sartre produced a body of work that would, by the time of his death in 1980, include books and plays such as Being and Nothingness and The Flies, which were key to spreading existential philosophy around the world. His most famous play, No Exit, coined the popular expression “hell is other people.” Sartre rejected the label of existentialist for a time, and in 1964 he rejected the Nobel Prize in literature, criticizing its Eurocentrism (he came to regret this latter rejection in particular, saying he could have donated the prize money to an anti-Apartheid committee in London). Also in 1964, he renounced all literature as a substitute for taking meaningful action in the world.

Sartre was a nihilist when it came to human nature, as he outlined in Existentialism Is a Humanism. He argued that human beings, as autonomous and sentient entities, have to define themselves as they live, and that they do so through their actions (much as Wittgenstein argued). Sartre was not positive about this state of being, calling it “anguish.” Little wonder he felt hell is other people.

4. Giovanni Gentile

The inclusion of any figure on this list is not an endorsement of their views. We want that to be made especially clear in this case, as in 1932 this Italian philosopher, born in 1886, literally co-authored The Doctrine of Fascism with Benito Mussolini. That means, of course, that he indirectly helped write the blueprints for the much more destructive German fascism. He created a philosophical movement of his own, known initially as “actual idealism” and shortened to “actualism.” It was largely an extension of the work of the nineteenth century philosopher Georg Hegel.

Gentile argued that objective reality was unknowable and that individual identities were an illusion, which in turn he argued meant that the only way to find value was to bind oneself into a larger group. In a sense it’s a form of nihilism since everything outside the group is unquantifiable and thus can’t have a value, giving people within the group tacit approval to subjugate any outsiders however they please.

3. Ayn Rand

Few people are as well known for their contradictions as this bestselling author, born in Russia in 1905, who created the Objectivist movement. She is highly lauded in right-wing circles despite being aggressively pro-choice. She believed in wealth redistribution only through private charity, yet is very often mocked for accepting Social Security near the time of her passing in 1982. Her books The Fountainhead, We the Living, Anthem, and Atlas Shrugged are all endlessly derided and bought. Despite how far out of fashion her writing style and subject matter have fallen, she remains popular enough that blockbuster director Zack Snyder plans to make a film adaptation of The Fountainhead.

It is often asserted that the Soviet government’s seizure of her father’s pharmacy while she was a child inspired her to design a philosophical framework of her own, often referred to as Randianism but which she called Objectivism. Objectivism argues that the best way for humanity to proceed is for everyone to act in their rational self-interest. People will act ethically because, under capitalism, it is in their best interest to treat everyone ethically, so that others will treat them ethically in turn. Morality cannot be forced on anyone, and using the threat of physical violence to compel people to act morally (e.g. using the threat of arrest to coerce citizens into giving tax money that would be used to help the needy) is itself immoral.

2. Leo Strauss

Leo Strauss has not become a household name since his death in 1973, and even among the circle that knew him at the time he was more polarizing than most. Strauss is influential because he was read by a few at the top rather than by many at the bottom or in the middle. From Gerald Ford to the Bushes, his work was taught and discussed in the White House itself every time there was a Republican in office. Even William Galston, a domestic policy adviser to Bill Clinton for two years, was a student of his.

Strauss believed that human beings do not have natural rights, and are inherently unequal and thus shouldn’t be treated as if they are. He argued that society needed to have its “noble lies,” which was what Strauss considered religion, so that the lower classes would remain productive. He said science and philosophy must be the “preserve of a small minority” because science and philosophy are attempts to replace opinion, and opinion is “the element in which society breathes.”

1. Albert Camus

Like Derrida, he was born in Algeria, though in Camus’s case in 1913. He also shared with Derrida a soft spot for communism, though that was out of his system by the time he was in France and made his name. Just as Derrida is said to be the father of deconstruction, Camus is credited with being one of the fathers of absurdism as a philosophical movement, even if he rejected “armchair philosophy” in favor of going out and living life to the fullest.

Camus’s first published novel is 1942’s The Stranger, the story of a sociopathic man who neither grieves at his mother’s death nor understands why everyone else does. Though he is tried for premeditated murder, what actually gets him sentenced to death is his apathy and atheism. Before his execution he tears into the priest sent to receive his confession, and manages to find peace in accepting the meaninglessness of life.

His most famous book, which helped win him the Nobel Prize in literature, is 1947’s The Plague. A story of a bubonic plague epidemic in Oran, Algeria (based on a cholera epidemic that hit the town in 1849, and a metaphor for the presence of the Third Reich in France), it shows how society breaks down: some people isolate themselves in the hope of riding the plague out, while others fight against it. Even though Camus treats the struggle against the plague as absurd, it’s clear that the resisting characters have his sympathies.

Camus’s most famous work of nonfiction is The Myth of Sisyphus, a 130-page essay published in 1942 about the character from Greek mythology condemned to forever push a boulder up a mountain, a task that is impossible because in some versions the boulder always rolls back down and in others it can’t be moved in the first place. Camus argued that this was a perfect symbol for the human condition: forever struggling pointlessly, since the inevitability of oblivion hangs over everyone and everything at all times. So why not commit suicide instead? Rather than reaching a dour, nihilistic conclusion, Camus said “one must imagine Sisyphus happy.” After all, he does have an eternal sense of purpose. In its way, Camus’s absurdist philosophy is an optimistic and accepting form of nihilism.



World Wide Words Issue 902 – WIF Style


World Wide Words

Issue 902: 6 November 2014

 

Feedback, Notes and Comments

Lost in translation. A Sic! item in the last issue reproduced a report from the Sydney Morning Herald on that supposed Russian incursion into Swedish waters. Rear-Admiral Anders Grenstad was reported as saying “It could be a submarine, or a smaller submarine”. Many readers suspected, as I did, that the newspaper quote was a bad translation.

Terry Walsh went back to the Swedish original and confirmed that the English version should have been “We are certainly able to exclude a conventional submarine. Some of the observations do not allow for that depth, says Grenstad. But he is sure of one thing. There is some type of underwater vehicle, at least one.” How one gets from that to the quote in the newspaper is hard to understand.

By the way, the original text showed that the usual Swedish term for a submarine is u-båt. This comes from U-Boot, an abbreviation of German Unterseeboot, literally undersea boat, likewise the standard German term for a submarine. We took this into English as U-boat to mean specifically German military submarines of the two World Wars.

Trunk and boot. Lance Jones wrote, “I live in the wire-grass section of South Georgia. My grandmother, and other folks I have known of her age (born around the time of the First World War), call the trunk of a car the cooter hull. I think as in turtle shell. They tend to use hull more than shell but that seems to be passing as they do.” [Cooter is a regional term for a freshwater turtle in parts of the southern US.]

Boot turns out not to be entirely unknown in the US in the car sense. John C noted, “At least until the early 1950s in the mountains of North Carolina and Georgia, it was common to refer to the trunk of an automobile as the boot. My wife, who grew up in Springfield, Missouri, remembers that her grandparents used boot for their car trunk.”

Randy Sigman wrote, “It may be of interest to you that the US automaker Tesla Motors has coined the term frunk.” This is the front storage space where the engine would have been in most petrol or diesel cars. So that was what we should have called the same space in a Volkswagen!

Several readers have suggested I do a similar analysis of bonnet and hood. I’ve added it to my — alarmingly long — list of words to look at.

Rumbles and dickies. Karl Franklin wrote, “Your comments on the back storage area of a car and its etymology reminded me that in the old days we used to ride in the ‘rumble seat’ of a car. It was an outdoor seat that could fold out or in, and was in about the same area as the trunk or boot. You probably are familiar with the name but I wonder what the equivalent is in British English.”

 

In Britain it was called the dicky, or dicky seat. We may guess it’s the same word as the familiar form of Richard, but that merely takes the obscurity back a step. Dicky had been taken over for cars from a similar seat for servants at the back of horse-drawn carriages, and one writer has suggested that Richard was once a generic term for a servant. I’ve been unable to confirm this.

The experts are sure it isn’t connected to another British English sense of dicky, for something that isn’t working properly (“he had a dicky heart”) though it may be connected with dicky for a false shirt front. The dicky in a car was often used to carry luggage and the word is still in use in Indian English for the boot/trunk. Another term for it in the US was mother-in-law seat.

Incidentally, a rumble, or rumble seat, was also used of servants’ seats at the rear of carriages from early in the nineteenth century, presumably because being over the back wheels it took up the vibration of the vehicle on the road. It’s yet another oddity of automobile vocabulary that it was transferred to the car seat in the US but not in Britain.

The Language Myth, by Vyvyan Evans

For the past half-century, the dominant view in linguistics has been that human beings uniquely possess a hard-wired concept of language. This implies that all languages are related at a deep level, because all of them are created on the same fundamental grammar template. It explains how a child is able to readily learn any language.

 

The idea, called Universal Grammar, was created by the linguist Noam Chomsky in the 1950s and has been enormously influential, not only in linguistics but also in fields such as psychology and philosophy. It’s still the standard view in most textbooks and has been popularised by Steven Pinker in The Language Instinct and later books.

However, the concept that language is an instinct, and a uniquely human one, has been challenged as a result of research in a number of fields in recent decades. We now know much more about how children acquire language, the diversity of the world’s languages, the evolution of the human species, the structure and function of our brains, and the ways in which other animals communicate.

Vyvyan Evans

 

A vigorous debate is raging. Vyvyan Evans, professor of linguistics at Bangor University in north Wales, has written The Language Myth to bring together the growing evidence against Universal Grammar.

For example, Chomsky’s view that this instinct for language is unique to humans and arrived suddenly as a mutation about 100,000 years ago cannot be true. Our complicated vocal apparatus, with the sophisticated brain necessary to manipulate it to utter and remember speech, couldn’t have been the result of a single sudden change but must have evolved stage by stage among our hominin ancestors. Neanderthals had similar vocal anatomy to ours and so were very probably able to communicate through speech.

One implication of Universal Grammar is that there must be some module or faculty in the brain, present at birth, dedicated to processing grammar. Though the brain does have sections devoted to specific functions, such as Broca’s area, responsible for the creation of speech, we know now that this area does other jobs as well and that the work of processing language takes place quite widely across various parts of the brain. A grammar module as such doesn’t exist.

The truth, Professor Evans argues on the basis of current research, is very different. Babies are not born with a set of internal rules but with a universal capacity to learn about themselves and the world around them. The brains of infants are plastic: experience and discovery mould them, and acquiring a language is one aspect of this.

Professor Evans also partly rehabilitates a theory developed in the 1930s by Benjamin Whorf; a version that was developed after Whorf’s death is called the Sapir-Whorf hypothesis, after him and his mentor Edward Sapir. Whorf called it linguistic relativity, arguing that speakers of different languages conceptualize and experience the world differently. This has been denied by followers of Chomsky’s work, since if true it would refute the view that language is innate and universal. Subtle neurological experiments in the past couple of decades have suggested that at an unconscious level people can be influenced by the nature of their language.

The Language Myth is a wide-ranging polemical dismissal of the received wisdom of many linguists. It’s worth reading also as a classic case study of an orthodoxy undergoing what Thomas Kuhn called a paradigm shift.

[Evans, Vyvyan, The Language Myth: Why Language is Not an Instinct; published by Cambridge University Press in hardback, paperback and e-book; ISBN 978-1-107-04396-1 (hbk), 978-1-107-61975-3 (pbk).]

Wordface

Shining light. Kathy Atkinson asked about the term lilly-lo, which her late grandmother in Yorkshire used for a child’s nursery light, or any light if she was talking to a baby or small child. The English Dialect Dictionary, compiled at the end of the nineteenth century, has it in that spelling and also as lilly-low and confirms that it was then used throughout northern England and Scotland. The second part is from an old Scandinavian word that meant a light or a flame (a distant relative through Indo-European of our light) and, as lowe, has been used for a fire or a small candle or other naked flame. It’s still known in Scotland and parts of northern England. The longer form is a playful extension used particularly with children, source unknown, though we might guess at a nursery version of little.

Sic!

An e-newsletter from an organisation called Better Markets was sent to Todd Bernhardt. He found that it referred to “Washington’s white-color defense lawyers”.

A mailing from Damart surprised Philip Stevens: “Don’t miss our brand new TV ad with 20% off & Free Delivery!” He didn’t realise he had to pay for an ad to be delivered, especially four-fifths of one.
