
‘Truth does not belong to the order of power.’ –Michel Foucault

211.jpg

Are there really such things as artistic masterworks? That is, do works belong to the artistic canon because critics and museum curators have correctly discerned their merits? (…)

For the sake of this discussion let’s focus on two possible views: the first, call it Humeanism, is the view that when we make evaluative judgements of artworks, we are sensitive to what is good or bad. On this Humean view, the works that form the artistic canon are there for good reason. Over time, critics, curators, and art historians arrive at a consensus about the best works; these works become known as masterworks, are widely reproduced, and prized as highlights of museum collections.

However, a second view—call it scepticism—challenges these claims about the role of value in both artistic judgement and canon formation. A sceptic will point to other factors that can sway critics and curators such as personal or political considerations, or even chance exposure to particular works, arguing that value plays a much less important role than the Humean would lead us to believe. According to such a view, if a minor work had fallen into the right hands, or if a minor painter had had the right connections, the artistic canon might have an entirely different shape.

How is one to determine whether we are sensitive to value when we form judgements about artworks? In a 2003 study, psychologist James Cutting briefly exposed undergraduate psychology students to canonical and lesser-known Impressionist paintings (the lesser-known works exposed four times as often), with the result that after exposure, subjects preferred the lesser-known works more often than did the control group. Cutting took this result to show that canon formation is a result of cultural exposure over time.

{ Experimental Philosophy | Continue reading }
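
The design is simple enough to mimic in a toy simulation. The Python sketch below only illustrates the mere-exposure logic: the group sizes, exposure ratio, baseline preference, and effect size are invented assumptions, not Cutting's data, and the preference function is a deliberately crude stand-in.

```python
import math
import random

# Toy mere-exposure simulation loosely modeled on the design described above.
# All numbers (group sizes, exposure ratio, effect size) are illustrative
# assumptions, not values from Cutting's 2003 study.

random.seed(0)

BASELINE_PREF_LESSER = 0.35   # assumed baseline rate of picking the lesser-known work
EXPOSURE_BOOST = 0.06         # assumed preference gain per log-unit of extra exposure

def p_pick_lesser_known(exposures_lesser: int, exposures_canonical: int) -> float:
    """Probability of choosing the lesser-known painting in a paired comparison."""
    boost = EXPOSURE_BOOST * (math.log1p(exposures_lesser) - math.log1p(exposures_canonical))
    return min(max(BASELINE_PREF_LESSER + boost, 0.0), 1.0)

def run_group(n_subjects: int, exp_lesser: int, exp_canonical: int, trials: int = 20) -> float:
    picks = 0
    for _ in range(n_subjects):
        p = p_pick_lesser_known(exp_lesser, exp_canonical)
        picks += sum(random.random() < p for _ in range(trials))
    return picks / (n_subjects * trials)

control = run_group(50, exp_lesser=0, exp_canonical=0)
exposed = run_group(50, exp_lesser=4, exp_canonical=1)   # lesser-known shown 4x as often

print(f"control group picks lesser-known works {control:.0%} of the time")
print(f"exposed group picks lesser-known works {exposed:.0%} of the time")
```

The point of the toy is only that a modest exposure asymmetry is enough to shift a group's stated preferences, which is the pattern Cutting reports.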

artwork { Marlene Dumas, Dead Girl, 2002 }

Excuse me, but you are urinating on my car

410.jpg

Far be it from me to suggest the abolition of sociology departments. After all, I’m the chairman of one.

But even a chairman would need an inhuman capacity for denial to fail to see that:

1. The overwhelming amount of current sociology is simply nonsense–tendentious nonsense at that.

2. Nearly all of the sociology that is not tendentious nonsense is so obvious that one wonders why anyone would see the need to demonstrate it.

{ Steven Goldberg | Continue reading }

‘One must have a good memory to be able to keep the promises one makes.’ –Nietzsche

12.jpg

If sometime in the future we learn to travel in time, why hasn’t someone come back from the future to tell us how to do it?

{ Stephen Hawking | Continue reading | via Overcoming Bias }

‘Appearance’ is a word that contains many temptations, which is why I avoid it as much as possible.

47.jpg

{ United Academics | Continue reading }

The best color on a horse is fat

63.jpg

If one could rewind the history of life, would the same species appear with the same sets of traits? Many biologists have argued that evolution depends on too many chance events to be repeatable. But a new study investigating evolution in three groups of microscopic worms, including the strain that survived the 2003 Columbia space shuttle crash, indicates otherwise. When raised in a lab under crowded conditions, all three underwent the same shift in their development by losing basically the same gene. The work suggests that, to some degree, evolution is predictable.

{ Nature | Continue reading }

images { Jesse Richards | Susan Rothenberg }

When it’s suddenly gone and there’s nothing left to prove it was ever there

62.jpg

Hume’s account of the self is to be found mainly in one short and provocative section of his Treatise of Human Nature – a landmark work in the history of philosophy, published when Hume was still a young man. What Hume says here (in “Of Personal Identity”) has provoked a philosophical debate which continues to this day. What, then, is so novel and striking about Hume’s account that would explain its fascination for generations of philosophers?

One of the problems of personal identity has to do with what it is for you to remain the same person over time. In recalling your childhood experiences, or looking forward to your next holiday, it appears that in each case you are thinking about one and the same person – namely, you. But what makes this true? The same sort of question might be raised about an object such as the house in which you’re now living. Perhaps it has undergone various changes from the time when you first moved in – and you may have plans to alter it further. But you probably think that it is the same house throughout. So how is this so? It helps in this case that at least we’re pretty clear about what it is for something to be a house (namely, a building with a certain function), and therefore to be the same house through time. But what is it for you to be a person (or self)? This is the question with which Hume begins. He is keen to dismiss the prevailing philosophical answer to this question – (…) that underlying our various thoughts and feelings is a core self: the soul, as it is sometimes referred to. (…)

How, then, does Hume respond to this view of the self? (…) Hume concludes memorably that each of us is “nothing but a bundle or collection of different perceptions”. We might be inclined to think of the mind as a kind of theatre in which our thoughts and feelings – or “perceptions” – make their appearance; but if so we are misled, for the mind is constituted by its perceptions. This is the famous “bundle” theory of the mind or self that Hume offers as his alternative to the doctrine of the soul.

{ The Philosophers’ Magazine | Continue reading }

The smaller the attendance the bigger the history. There were 12 people at the Last Supper. Half a dozen at Kitty Hawk. Archimedes was on his own in the bath.

59.jpg

What is the art of immersion? The focus of the book is on how the internet is changing storytelling; and the idea is really that every time a new medium comes along, it takes people 20 or 30 years to figure out what to do with it, to figure out the grammar of that medium. The motion picture camera was invented around 1890, and it was really about 1915 before the grammar of cinema–all the things we take for granted now, like cuts and point-of-view shots and fades and pans–was consolidated into what we would first recognize as feature films, Birth of a Nation being the real landmark. It wasn’t the first film that had these characteristics, but it was the first film to use all of them, and the fact that people settled on it is what really made a difference. I think we are not quite there yet with the internet, but we can see the outlines of what is happening, what is starting to emerge; and it’s very different from the mass media that we’ve been used to for the past 150 years. (…)

NotSoSerious.com–the campaign in advance of The Dark Knight. This was what’s known as an alternate reality game–a particularly large-scale example that took place over a period of about 18 months. Essentially the purpose of it was to create an experience, one that started and largely played out online but also in the real world and elsewhere, that would familiarize people with the story and the characters of The Dark Knight–in particular with Heath Ledger as the Joker–and build enthusiasm and interest in the movie in advance of its release. On one level it was a marketing campaign; on another level it was a story in itself–a whole series of stories. It was developed by a company called 42 Entertainment, based in Pasadena and headed by a woman named Susan Bonds, who, interestingly enough, was educated and first worked as a systems engineer and spent quite a bit of time at Walt Disney Imagineering before she took this up. It’s a particularly intriguing example of storytelling because it makes it possible for–encourages, even–the audience to discover and tell the story themselves, online, to each other. For example, there was one segment of the story where a whole series of clues online led people to a series of bakeries in various cities around the United States. When they got to the bakery, the first person to get there in each of these cities was presented with a cake. On the icing of the cake were written a phone number and the words “Call me.” When they called, the cake started ringing. People would obviously cut into the cake to see what was going on, and inside they found a sealed plastic pouch with a cell phone and a series of instructions. This led to a whole new series of events that unfolded and eventually led people to screenings, in cities around the country, of the first 7 minutes of the film, where the Heath Ledger character is introduced. (…)

The thing about Lost was it was really a different kind of television show. What made it different was not the sort of gimmicks like the smoke monster and the polar bear–those were just kind of icing. What really made it different was that it wasn’t explained. In the entire history of television until quite recently, just the last few years, the whole idea of the show has been to make it really simple, to make it completely understandable so that no one ever gets confused. Dumb it down for a mass audience. Sitcoms are just supposed to be easy. Right. Lost took exactly the opposite tack, and the result was–it might not have worked 10 years ago, but now with everybody online, we live in an entirely different world. The result was people got increasingly intrigued by the essentially puzzle-like nature of the show. And they tended to go online to find out things about it. And the show developed a sort of fanatical following, in part precisely because it was so difficult to figure out.

There was a great example I came across of a guy in Anchorage, Alaska, who watched the entire first season on DVD with his girlfriend over a couple of nights leading up to the opening episode of Season 2. And then he watched the opening episode of Season 2 and something completely unexpected happened. What is going on here? So he did what comes naturally at this point, which was to go online and find out some information about it. But there wasn’t really much information to be found, so he did the other thing that’s becoming increasingly natural, which was to start his own wiki. This became Lostpedia–essentially a Wikipedia about Lost–which now has tens of thousands of entries; it’s in about 20 different languages around the world. And it’s become such a phenomenon that occasionally the people producing the show would themselves consult it–when their resident continuity guru was not available.

What had been published in very small-scale Fanzines suddenly became available online for anybody to see. (…)

The amount of time people devote to these beloved characters and stories–which are not real, which really doesn’t matter at all, and which is one of the fascinating things about this whole phenomenon–couldn’t have happened in 1500. Not because of the technology–though of course they are related–but because you’d starve to death. The fact that people can personally devote hundreds and hundreds of hours, and that millions can do this, says something about modern life that is deep and profound. Clay Shirky, who I believe you’ve interviewed in the past, has the theory that television arrived just in time to soak up the excess leisure time that was produced by the invention of vacuum cleaners and dishwashers and other labor-saving devices.

{ Frank Rose/EconTalk | Continue reading }

‘Time is an illusion, lunchtime doubly so.’ –Douglas Adams

73.jpg

Today hardly anyone notices the equinox. Today we rarely give the sky more than a passing glance. We live by precisely metered clocks and appointment blocks on our electronic calendars, feeling little personal or communal connection to the kind of time the equinox once offered us. Within that simple fact lies a tectonic shift in human life and culture.

Your time — almost entirely divorced from natural cycles — is a new time. Your time, delivered through digital devices that move to nanosecond cadences, has never existed before in human history. As we rush through our overheated days we can barely recognize this new time for what it really is: an invention. (…)

Mechanical clocks for measuring hours did not appear until the fourteenth century. Minute hands on those clocks did not come into existence until 400 years later. Before these inventions the vast majority of human beings had no access to any form of timekeeping device. Sundials, water clocks and sandglasses did exist. But their daily use was confined to an elite minority.

{ Adam Frank/NPR | Continue reading }

The 12-hour clock can be traced back as far as Mesopotamia and Ancient Egypt: Both an Egyptian sundial for daytime use and an Egyptian water clock for night time use were found in the tomb of Pharaoh Amenhotep I. Dating to c. 1500 BC, these clocks divided their respective times of use into 12 hours each.

The Romans also used a 12-hour clock: daylight was divided into 12 equal hours (of, thus, varying length throughout the year) and the night was divided into four watches. The Romans numbered the morning hours originally in reverse. For example, “3 am” or “3 hours ante meridiem” meant “three hours before noon,” compared to the modern usage of “three hours into the first 12-hour period of the day.”

{ Wikipedia | Continue reading }

Also: The terms “a.m.” and “p.m.” are abbreviations of the Latin ante meridiem (before midday) and post meridiem (after midday).
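
The reverse counting is easy to get wrong, so here is a minimal Python sketch of the conversion the Wikipedia excerpt describes. It assumes equal-length hours and noon as the reference point, ignoring the seasonal stretching of actual Roman hours.

```python
def roman_morning_to_modern(hours_before_noon: int) -> str:
    """Convert a Roman-style morning count ("N hours ante meridiem",
    i.e. N hours *before* noon) to a modern 12-hour clock reading.
    Assumes equal-length hours; real Roman hours varied with the season."""
    if not 1 <= hours_before_noon <= 12:
        raise ValueError("expected a count between 1 and 12")
    hours_after_midnight = 12 - hours_before_noon
    label = hours_after_midnight if hours_after_midnight != 0 else 12
    return f"{label}:00 a.m."

# "3 hours ante meridiem" = three hours before noon = 9:00 a.m. on a modern clock,
# whereas the modern "3 a.m." means three hours after midnight.
print(roman_morning_to_modern(3))   # -> 9:00 a.m.
```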

‘It is not reason which is the guide of life, but custom.’ –Hume

57.jpg

My topic is the shift from ‘architect’ to ‘gardener’, where ‘architect’ stands for ‘someone who carries a full picture of the work before it is made’ and ‘gardener’ for ‘someone who plants seeds and waits to see exactly what will come up’. I will argue that today’s composers are more frequently ‘gardeners’ than ‘architects’ and, further, that the ‘composer as architect’ metaphor was a transitory historical blip.

{ Brian Eno/Edge }

photos { Sid Avery, Sammy Davis Jr, Dean Martin, Frank Sinatra and Joey Bishop Stage a Fight During the Making of Ocean’s Eleven }

Sunlight is the best disinfectant

83.jpg

Kings, queens and dukes were always richer and more powerful than the population at large, and would surely have liked to use their money and power to lengthen their lives, but before 1750 they had no effective way of doing so. Why did that change? While we have no way of being sure, the best guess is that, perhaps starting as early as the 16th century, but accumulating over time, there was a series of practical improvements and innovations in health. (…)

The children of the royal family were the first to be inoculated against smallpox (after a pilot experiment on condemned prisoners), and Johansson notes that “medical expertise was highly priced, and many of the procedures prescribed were unaffordable even to the town-dwelling middle-income families in environments that exposed them to endemic and epidemic disease.” So the new knowledge and practices were adopted first by the better-off—just as today where it was the better-off and better-educated who first gave up smoking and adopted breast cancer screening. Later, these first innovations became cheaper, and together with other gifts of the Enlightenment, the beginnings of city planning and improvement, the beginnings of public health campaigns (e.g. against gin), and the first public hospitals and dispensaries, they contributed to the more general increase in life chances that began to be visible from the middle of the 19th century.

Why is this important? The absence of a gradient before 1750 shows that there is no general health benefit from status in and of itself, and that power and money are useless against the force of mortality without weapons to fight. (…)

Men die at higher rates than women at all ages after conception. Although women around the world report higher morbidity than men, their mortality rates are usually around half of those of men. The evidence, at least from the US, suggests that women experience similar suffering from similar conditions, but have higher prevalence of conditions with higher morbidity, and lower prevalence of conditions with higher mortality so that, put crudely, women get sick and men get dead.

{ Angus Deaton, Center for Health and Wellbeing, Princeton University | Continue reading | PDF }

artwork { Willem de Kooning, Queen of Hearts, 1943-46 }

In the beginning, philosophy was an anti-object-oriented enterprise

45.jpg

“Literally” (…)

It’s a word that has been misused by everyone from fashion stylist Rachel Zoe to President Obama, and linguists predict that it will continue to be led astray from its meaning. There is a good chance the incorrect use of the word eventually will eclipse its original definition.

What the word means is “in a literal or strict sense.” Such as: “The novel was translated literally from the Russian.”

“It should not be used as a synonym for actually or really,” writes Paul Brians in “Common Errors in English Usage.”

{ Boston Globe | Continue reading }

Eyewitnesses say they are ordinary-looking people. Some say they appear to be in a kind of trance. Others describe them as being misshapen monsters. At this point, there’s no really authentic way for us to say who or what to look for and guard yourself against.

56.jpg

If you ask a person when “middle age” begins, the answer, not surprisingly, depends on the age of that respondent. American college-aged students are convinced that one fits soundly into the middle-age category at 35. For respondents who are actually 35, middle age is half a decade away, with 40 representing the inaugural year. (…) Recently, a large sample of Swiss participants spanning several generations agreed with one another that middle-aged people are those who are between 35 and 53 years of age.

However, the precise chronological point at which we formally enter “middle age” is of little importance. (…) We’ve all heard of the dreaded “midlife crisis,” but what, exactly, is it? Furthermore, does it even exist as a scientifically valid concept? (…)

“Midlife crisis” was coined by Elliott Jaques in 1965. (…) According to him, the midlife crisis is such a crisis that many great artists and thinkers don’t even survive it. (…) He decided to crunch the numbers with a “random sample” of 310 such geniuses and, indeed, he discovered that a considerable number of these formidable talents—including Mozart, Raphael, Chopin, Rimbaud, Purcell, and Baudelaire—succumbed to some kind of tragic fate or another and drew their last breaths between the ages of 35 and 39. “The closer one keeps to genius in the sample,” Jaques observes, “the more striking and clear-cut is this spiking of the death rate in midlife.” (…)

Basically, argues Jaques, around the age of 35, genius can go in one of three directions. If you’re like that last batch of folks, you either die, literally, or else you perish metaphorically, having exhausted your potential early on in a sort of frenzied, magnificent chaos, unable to create anything approximating your former genius. The second type of individual, however, actually requires the anxieties of middle age—specifically, the acute awareness that one’s life is, at least, already half over—to reach their full creative potential. (…) Finally, the third type of creative genius is prolific and accomplished even in their earlier years, but their aesthetic or style changes dramatically at middle age, usually for the better. (…)

The phrase “midlife crisis” didn’t really creep into suburban vernacular as a catchall diagnosis until the late 1970s. This is when Yale’s Daniel Levinson, building on the stage theory tradition of lifespan developmentalist Erik Erikson, began popularizing tales of middle-class, middle-aged men who were struggling with transitioning to a time where “one is no longer young and yet not quite old.”

{ Scientific American | Continue reading }

Do you have a favorite-sounding word? My top-five are ointment, bumblebee, Vladivostok, banana, and testicle.

23.jpg

I know you’ve thought (and taught) about the fragment as a mode of writing. I’m wondering how your study of the form influences the way you use it.

While writing a book, I’m influenced by things the same way I would imagine most writers are: I look for what I want to steal, then I steal it, and make my own weird stew of the goods. Often while writing I’d re-read the books by Barthes written in fragments—A Lover’s Discourse, Roland Barthes by Roland Barthes—and see what he gained from an alphabetical, somewhat random organization, and what he couldn’t do that way. I mostly read Wittgenstein, and watched how he used numbered sections to think sequentially, and to jump, in turn. (…) I re-read Handke’s A Sorrow Beyond Dreams, which finally dissolves into fragments, after a fairly strong chronological narrative has taken him so far.

{ Maggie Nelson/Continent. | Continue reading }

When I was young, I invented an invisible friend called Mr Ravioli. My psychiatrist says I don’t need him anymore, so he just sits in the corner and reads.

53.jpg

While the Depravity Scale allows a ranking of just how depraved/horrific/egregious specific behaviors are, the GASP scale allows us to assess the level to which others (or ourselves) are prone to guilt and shame reactions.

There are, perhaps not surprisingly, disagreements among researchers about how to distinguish between guilt and shame. According to some, guilt is largely focused on your behavior (“I did a bad thing”) while shame is focused on your character (“I am a bad person”). Others think that private (that is, not publicly known) bad behaviors cause guilt, whereas transgressions that are made public cause shame.

{ Keen Trial | Continue reading }

photo { Christopher Payne }

How do I look? You look ready.

43.jpg

According to Maslow we have five needs [diagram]. However, many other people have thought about what human beings need to be happy and fulfilled, what we strive for and what motivates us, and they have come up with some different numbers. (…)

David McClelland (1985) proposed that, rather than being born with them, we acquire needs over time. They may vary considerably according to the different experiences we have, but most of them tend to fall into three main categories. Each of these categories is associated with appropriate approach and avoidance behaviours.

Achievement. People who are primarily driven by this need seek to excel and to gain recognition for their success. They will try to avoid situations where they cannot see a chance to gain or where there is a strong possibility of failure.

Affiliation. People primarily driven by this need are drawn towards the achievement of harmonious relationships with other people and will seek approval. They will try to avoid confrontation or standing out from the crowd.

Power. People driven by this need are drawn towards control of other people (either for selfish or selfless reasons) and seek compliance. They will try to avoid situations where they are powerless or dependent. (…)

Martin Ford and C.W. Nichols seem to have gone a bit overboard. Their taxonomy of human goals has two dozen separate factors.

{ Careers in theory | Continue reading }

The stars are real. The future is that mountain.

42.jpg

When tracked against the admittedly lofty hopes of the 1950s and 1960s, technological progress has fallen short in many domains. Consider the most literal instance of non-acceleration: We are no longer moving faster. The centuries-long acceleration of travel speeds — from ever-faster sailing ships in the 16th through 18th centuries, to the advent of ever-faster railroads in the 19th century, and ever-faster cars and airplanes in the 20th century — reversed with the decommissioning of the Concorde in 2003, to say nothing of the nightmarish delays caused by strikingly low-tech post-9/11 airport-security systems. Today’s advocates of space jets, lunar vacations, and the manned exploration of the solar system appear to hail from another planet. A faded 1964 Popular Science cover story — “Who’ll Fly You at 2,000 m.p.h.?” — barely recalls the dreams of a bygone age.


The official explanation for the slowdown in travel centers on the high cost of fuel, which points to the much larger failure in energy innovation. (…)

By default, computers have become the single great hope for the technological future. The speedup in information technology contrasts dramatically with the slowdown everywhere else. Moore’s Law, which predicted a doubling of the number of transistors that can be packed onto a computer chip every 18 to 24 months, has remained broadly true for much longer than anyone (including Moore) would have imagined back in 1965. We have moved without rest from mainframes to home computers to the Internet. Cellphones in 2011 contain more computing power than the entire Apollo space program in 1969.


{ National Review | Continue reading }
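
The arithmetic behind Moore’s Law is just compound doubling. A minimal sketch, with an arbitrary 1965 baseline count and a 24-month doubling period taken from the 18-to-24-month range quoted above (both assumptions for illustration, not historical chip data):

```python
# Moore's Law as compound doubling. The baseline transistor count and the
# 24-month doubling period are assumptions for illustration, not historical
# figures for any particular chip.

BASELINE_YEAR = 1965
BASELINE_TRANSISTORS = 64
DOUBLING_PERIOD_YEARS = 2.0   # the slow end of the 18-to-24-month range

def projected_transistors(year: int) -> float:
    doublings = (year - BASELINE_YEAR) / DOUBLING_PERIOD_YEARS
    return BASELINE_TRANSISTORS * 2 ** doublings

for year in (1969, 1985, 2011):
    print(f"{year}: ~{projected_transistors(year):,.0f} transistors (projected)")
```

Forty-six years at that rate is 23 doublings, a factor of roughly eight million, which is the kind of gap the cellphone-versus-Apollo comparison is gesturing at.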

Fab Five Freddy said everybody’s high

6.jpg

The “Big Five” factors of personality are five broad domains or dimensions of personality which are used to describe human personality. (…)

Openness to experience – (inventive/curious vs. consistent/cautious). Appreciation for art, emotion, adventure, unusual ideas, curiosity, and variety of experience.

Conscientiousness – (efficient/organized vs. easy-going/careless). A tendency to show self-discipline, act dutifully, and aim for achievement; planned rather than spontaneous behaviour.

Extraversion – (outgoing/energetic vs. solitary/reserved). Energy, positive emotions, surgency, and the tendency to seek stimulation in the company of others.

Agreeableness – (friendly/compassionate vs. cold/unkind). A tendency to be compassionate and cooperative rather than suspicious and antagonistic towards others.

Neuroticism – (sensitive/nervous vs. secure/confident). A tendency to experience unpleasant emotions easily, such as anger, anxiety, depression, or vulnerability.

{ Wikipedia | Continue reading }

related { The Psychology of Unusual Handshakes. }

screenshot { Wallace Shawn quoting Ingmar Bergman in Louis Malle’s My Dinner with Andre, 1981 }

‘Gozzi maintained that there can be but thirty-six tragic situations. Schiller took great pains to find more, but he was unable to find even so many as Gozzi.’ –Goethe

51.jpg

There are only seven plots in all of fiction — in all of human life, really — and chances are you’re living one of them.

Plenty of people have tried to boil things down for ease of interpretation or just to get their screenplay moving. Dr. Johnson was the first to suggest “how small a quantity of real fiction there is in the world; and that the same images have served all the authors who have ever written,” but he wandered off for an ale and left the idea where it lay. (…)

In 2004 journalist Christopher Booker put it in a doorstopper of a book, The Seven Basic Plots: Why We Tell Stories, and made an awfully good case that writers are all fiddling with a mere handful of stories, like songwriters with a few tunes in their pocket.

Overcoming the Monster: The Monster is your boss, mother, ex-spouse, the little voice in your head… (…)

Rags to Riches: This plot inhabits our dreams. All reality shows are rags-to-riches tryouts, as are romance novels and junior hockey games. (…)

The Quest: People go on trips, like Homer in The Odyssey and guys in road-trip movies like The Hangover. (…)

Voyage and Return: This plot sends hapless innocents into a strange landscape where they have to cope with oddity, danger and separation from all they know and love. (…)

Comedy: This is the oldest plot of all (see Aristophanes in 425 BC), based on confusion, misunderstanding and lack of self-knowledge. (…)

Tragedy: This is the most complicated plot of all.

{ Toronto Star | Continue reading }

The Thirty-Six Dramatic Situations is a descriptive list which was created by Georges Polti, a French writer from the mid-19th century, to categorize every dramatic situation that might occur in a story or performance.

{ Wikipedia | Continue reading }

‘Our destiny exercises its influence over us even when, as yet, we have not learned its nature: it is our future that lays down the law of our today.’ –Nietzsche

516.jpg

I predict that if we were to poll professional economists a century from now about who is the intellectual founder of the discipline, I say we’d get a majority responding by naming Charles Darwin, not Adam Smith. Smith, of course, would be the name out of 99% of economists if you asked the same question today. My claim behind that prediction is that in time, not next year, we’ll recognize that Darwin’s vision of the competitive process was just a lot more accurate and descriptive than Smith’s was. I say Smith’s–I really mean Smith’s modern disciples. Neoclassical economists. I think Smith was amazed that when you turn selfish people loose and let them seek their own interests, you often get good results for society as a whole from that process. I don’t think anybody had quite captured the logic of that narrative anywhere near as clearly as Smith had before he wrote. So, it’s a hugely important contribution. Smith, however, didn’t say you always got good results. He was quite circumspect about the claim. (…)

You go back and read Smith–it’s amazing his insights all across the spectrum have held up. Where I think he missed, or at any rate where his modern disciples have missed, a key feature of competition was this: he saw clearly, in a way that I don’t think others do yet, that competition favors individual actors. That’s what it does. Correct. Sometimes in the process it helps the larger group, but there are lots and lots of instances in which competition acts against the interests of the larger group. (…)

Familiar example from the Darwinian domain is the kinds of traits that have evolved to help individual animals do battle with one another for resources that are important. Think about polygynous mating species, the vertebrates; for the most part the males take more than one mate if they can. Obviously the qualifier is important; if some get more than one mate, you’ve got others left with none; and that’s the ultimate loser position in the Darwinian scheme. You don’t pass your stuff along into the next generation. So, of course the males fight with each other. If who wins the fights gets the mates, then mutations will be favored that help you win fights. So, male body mass starts to grow. Not without limit, but well beyond the point that would be optimal for males as a group.

{ Robert Frank/EconTalk | Continue reading }
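
The logic of that last example–selection on relative advantage dragging a trait past the group optimum–can be sketched in a few lines. Everything below is a toy: the fitness function, parameters, and selection rule are assumptions made for illustration, not Frank’s model.

```python
import random

# Toy arms race: contests are won on *relative* body mass, but the cost of
# carrying mass is *absolute*, so selection favors individuals who outgrow
# their rivals even though the group as a whole would be better off small.

random.seed(1)
MASS_COST = 0.05       # assumed fitness cost per unit of extra mass
GROUP_OPTIMUM = 0.0    # absent contests, zero extra mass would be best for all

def fitness(mass: float, rivals: list[float]) -> float:
    share_of_contests_won = sum(mass > r for r in rivals) / len(rivals)
    return share_of_contests_won - MASS_COST * mass

population = [random.gauss(0.0, 1.0) for _ in range(200)]
for _ in range(50):  # 50 generations
    ranked = sorted(population, key=lambda m: fitness(m, population), reverse=True)
    parents = ranked[: len(ranked) // 2]               # top half reproduce
    population = [random.choice(parents) + random.gauss(0.0, 0.1) for _ in range(200)]

mean_mass = sum(population) / len(population)
print(f"group optimum extra mass: {GROUP_OPTIMUM}")
print(f"mean extra mass after 50 generations: {mean_mass:.2f}")
```

In reality the run-up is checked by rising costs (“not without limit,” as the excerpt says); the toy leaves that out, which is why the mean keeps climbing well above the group optimum.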

No one would care, no one would cry

122.jpg

The bonds that unite another person to ourself exist only in our mind. Memory as it grows fainter relaxes them, and notwithstanding the illusion by which we would fain be cheated and with which, out of love, friendship, politeness, deference, duty, we cheat other people, we exist alone. Man is the creature that cannot emerge from himself, that knows his fellows only in himself; when he asserts the contrary, he is lying.

{ Marcel Proust, In Search of Lost Time, The Sweet Cheat Gone, 1925 }

artwork { Hamish Blakely }


