I’d say that in about half of my business conversations, I have almost no idea what other people are saying to me. The language of internet business models has made the problem even worse. When I was younger, if I didn’t understand what people were saying, I thought I was stupid. Now I realize that if it’s to people’s benefit that I understand them but I don’t, then they’re the ones who are stupid.
Group settings can diminish expressions of intelligence, especially among women
Research led by scientists at the Virginia Tech Carilion Research Institute found that small-group dynamics — such as jury deliberations, collective bargaining sessions, and cocktail parties — can alter the expression of IQ in some susceptible people. “You may joke about how committee meetings make you feel brain dead, but our findings suggest that they may make you act brain dead as well,” said Read Montague, who led the study.
In fact, it was because of my feminism that I wanted to like Erotic Capital: Whether from nature or nurture, women have traditionally excelled at “soft skills” like taking the emotional temperature of others, listening, adjusting one’s behavior to any given situation, and cooperating. These all happen to be skills that, until fairly recently, have been undercompensated in the workplace. In Hakim’s book I anticipated a deftly written argument that would reclaim the value of women’s work so that maybe we’d eventually start paying people in the professions that make use of those skills — say, teaching and nursing — their true value.
That’s the book I wanted to read. The book I actually read was more like this: Men supposedly have higher sex drives than women, creating a “male sex deficit,” which means men are always in a state of wanting more of what women supply. (…) So women who are willing to address that deficit, by either having actual sex with men suffering from it or presenting themselves in an enchanting manner to exploit it, have erotic capital that can be traded for other forms of capital.
Erotic capital has many guises: from “trophy wives” whose skilled self-presentation becomes a part of a man’s public persona, to men or women who style themselves in such a way as to garner attention at their workplace, to women with otherwise limited means who sell their erotic capacity (whether forthrightly, as with sex workers and performers, or more covertly, as with sales jobs) to establish themselves. It’s “sell yourself” meets “sex sells.” What’s most surprising about all this is that Hakim seems to think she’s saying something new. (…)
That she fails to name a single feminist who has actually come out against presenting oneself well (as opposed to presenting oneself as stereotypically feminine) indicates that she’s attacking a straw feminist, not an actual one. Where are the radical feminists urging women to not use their people skills on the job? Who are these radical feminists who blame women for wearing makeup to work instead of directing their critiques at institutions that demand women do so? Hakim falsely asserts that feminists have been fighting for the eradication of charisma and charm instead of the eradication of coyness and the deployment of sex appeal as woman’s strongest — or only — weapons.
Popular discourse portrays marriage as a source of innumerable public and private benefits: happiness, companionship, financial security, and even good health. Complementing this view, our legal discourse frames the right to marry as a right of access, the exercise of which is an act of autonomy and free will.
However, a closer look at marriage’s past reveals a more complicated portrait. Marriage has been used - and importantly, continues to be used - as state-imposed sexual discipline.
Until the mid-twentieth century, marriage played an important role in the crime of seduction. Enacted in a majority of U.S. jurisdictions in the nineteenth century, seduction statutes punished those who ‘seduced and had sexual intercourse with an unmarried female of previously chaste character’ under a ‘promise of marriage.’ Seduction statutes routinely prescribed a bar to prosecution for the offense: marriage. The defendant could simply marry the victim and avoid liability for the crime. However, marriage did more than serve as a bar to prosecution. It also was understood as a punishment for the crime. Just as incarceration promoted the internalization of discipline and reform of the inmate, marriage’s attendant legal and social obligations imposed upon defendant and victim a new disciplined identity, transforming them from sexual outlaws into in-laws.
What occurred to Newton was that there was a force of gravity, which of course everybody knew about; it’s not like he actually discovered gravity. Everybody knew there was such a thing as gravity. But if you go back into antiquity, the way that the celestial objects, the moon, the sun, and the planets, were treated by astronomy had nothing to do with the way things on earth were treated. These were entirely different realms, and what Newton realized was that there had to be a force holding the moon in orbit around the earth. This is not something that Aristotle or his predecessors thought, because they were treating the planets and the moon as though they just naturally went around in circles. Newton realized there had to be some force holding the moon in its orbit around the earth, to keep it from wandering off, and he knew also there was a force that was pulling the apple down to the earth. And so what suddenly struck him was that those could be one and the same thing, the same force.
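The quantitative check that eventually cemented that identification, the famous "moon test," can be stated in two lines (modern values, added here for illustration rather than taken from the interview). The moon sits about 60 Earth radii away, so if gravity weakens with the inverse square of distance, the apple's acceleration g diluted by 60² should match the centripetal acceleration that keeps the moon in its orbit:

$$a_{\text{moon}} \;=\; \frac{4\pi^{2}r}{T^{2}} \;\approx\; \frac{4\pi^{2}\,(3.84\times10^{8}\,\mathrm{m})}{(27.3\times 86400\,\mathrm{s})^{2}} \;\approx\; 2.7\times10^{-3}\ \mathrm{m/s^{2}}$$

$$\frac{g}{60^{2}} \;\approx\; \frac{9.8\ \mathrm{m/s^{2}}}{3600} \;\approx\; 2.7\times10^{-3}\ \mathrm{m/s^{2}}$$

The two numbers agree, which is the sense in which the force pulling the apple and the force holding the moon turn out to be one and the same.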
(…)
I’m not sure it’s accurate to say that physicists want to hand time over to philosophers. Some physicists are very adamant about wanting to say things about it; Sean Carroll for example is very adamant about saying that time is real. You have others saying that time is just an illusion, that there isn’t really a direction of time, and so forth. I myself think that all of the reasons that lead people to say things like that have very little merit, and that people have just been misled, largely by mistaking the mathematics they use to describe reality for reality itself. If you think that mathematical objects are not in time, and mathematical objects don’t change — which is perfectly true — and then you’re always using mathematical objects to describe the world, you could easily fall into the idea that the world itself doesn’t change, because your representations of it don’t.
There are other, technical reasons that people have thought that you don’t need a direction of time, or that physics doesn’t postulate a direction of time. My own view is that none of those arguments are very good. To the question as to why a physicist would want to hand time over to philosophers, the answer would be that physicists for almost a hundred years have been dissuaded from trying to think about fundamental questions. I think most physicists would quite rightly say “I don’t have the tools to answer a question like ‘what is time?’ - I have the tools to solve a differential equation.” The asking of fundamental physical questions is just not part of the training of a physicist anymore.
(…)
On earth, of all the billions of species that have evolved, only one has developed intelligence to the level of producing technology. Which means that kind of intelligence is really not very useful. It’s not actually, in the general case, of much evolutionary value. We tend to think, because we love to think of ourselves, human beings, as the top of the evolutionary ladder, that the intelligence we have, that makes us human beings, is the thing that all of evolution is striving toward. But what we know is that that’s not true. Obviously it doesn’t matter that much, if you’re a beetle, that you be really smart. If it did, evolution would have produced much more intelligent beetles. We have no empirical data to suggest that there’s a high probability that evolution on another planet would lead to technological intelligence. There is just too much we don’t know.
Many of us cling to the notion that memory is a reliable record and trawling through it can be similar to flipping through an old photo album. But what about the memories - sometimes vivid in nature - of things that never were?
Examining the false stories that we can create for ourselves is the aim of a new initiative led by artist Alasdair Hopwood. As part of a residency at the Anomalistic Psychology Research Unit led by Chris French at Goldsmiths College, University of London, Hopwood aims to explore what false memories reveal about our sense of identity.
To do this, he has created the False Memory Archive, a collection of people’s fabricated recollections either jotted down after talks he has given or submitted online at the project’s website. (…)
For Hopwood, examining the ways we deceive ourselves through memory is perhaps a natural progression. He has worked with fellow artists as part of the WITH Collective on projects that expose and poke fun at the many ways we style our public selves. “Identity is not fixed,” he says. Instead, it shifts depending on the company we are in, and even the format of the interaction - be it social media or in person. We’re extraordinarily preoccupied with sculpting our identities, as the glut of self-help books and pseudoscientific methods for personal development demonstrates.
Benford’s Law, also known as the first-digit law, says that in data sets drawn from real life (sales of coffee, say, or payments to a vendor), the number 1 appears as the first digit of a figure approximately 30% of the time, rather than the roughly 11% you would expect if first digits from one to nine were equally likely.
The rule was first noted by Simon Newcomb, who observed that the first pages of his logarithm tables showed much greater signs of use than the pages at the end. The physicist Frank Benford later stumbled on the same pattern and collected some 20,000 observations to test it.
Benford found that a wide variety of numbers in nature, like elemental atomic weights, the areas of rivers, and the figures that appeared on front pages of newspapers, started with a one more often than any other digit.
The reason lies in the percentage change needed to move from one leading digit to the next. Say a firm is valued at $1 billion. For the first digit to become a two (a market value of $2 billion), the firm has to grow by 100%. Once it reaches $2 billion, it needs to grow by only 50% to reach $3 billion, and the required gain keeps shrinking as the value increases. A steadily growing quantity therefore spends more time with a leading 1 than with any other leading digit.
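As a rough sketch (the growth process and variable names below are illustrative, not from the article), the expected share of each leading digit d is log10(1 + 1/d), about 30.1% for d = 1, and a simple multiplicative-growth simulation in Python lands close to those values:

```python
import math
import random
from collections import Counter

# Expected Benford frequency for each leading digit d: log10(1 + 1/d).
benford = {d: math.log10(1 + 1 / d) for d in range(1, 10)}

# Toy example: a quantity that grows by a random few percent each step
# (think of a firm's market value). Working in log10 space avoids overflow;
# the leading digit depends only on the fractional part of log10(value).
random.seed(0)
log_value = 0.0
counts = Counter()
steps = 100_000
for _ in range(steps):
    log_value += math.log10(1 + random.uniform(0.0, 0.10))  # multiplicative growth
    mantissa = 10 ** (log_value % 1)                         # in [1, 10)
    counts[int(mantissa)] += 1                               # leading digit 1..9

for d in range(1, 10):
    print(f"{d}: expected {benford[d]:5.1%}   simulated {counts[d] / steps:5.1%}")
```

Running it prints the nine expected and simulated frequencies side by side; the leading digit 1 should come out near 30% in both columns.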
Coordinated Universal Time (UTC) is the time scale used all over the world for time coordination. It was established in 1972, based on atomic clocks, and was the successor of astronomical time, which is based on the Earth’s rotation. But since the atomic time scale drifts from the astronomical one, 1-second steps were added whenever necessary. In 1972 everybody was happy with this decision. Today, most systems we use for telecommunications are not really happy with these “leap” seconds. So in January member states of the International Telecommunication Union will vote on dropping the leap second. (…)
We are using a system that breaks time. The quality of time is continuity. This is why a majority in the international community want to change the definition of UTC and drop the leap second. (…)
It was agreed some years ago that we should not think of any kind of adjustment in the near future, the next 100 or 200 years. In about the year 2600 we will have a half-hour divergence. However, we don’t know how time-keeping will be then, or how technology will be. So we cannot rule for the next six or seven generations.
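That half-hour figure can be sanity-checked with the standard empirical fit for the accumulated difference between uniform (atomic) time and Earth-rotation time; the coefficient below is the commonly quoted long-term approximation, quoted from memory rather than from the interview:

$$\Delta T \;\approx\; 32\,\mathrm{s}\times t^{2},\qquad t=\text{centuries since}\ {\sim}1820$$

$$t_{2600}\approx 7.8 \;\Rightarrow\; \Delta T \approx 32\times 7.8^{2}\,\mathrm{s}\approx 1950\,\mathrm{s}\approx 32\ \text{minutes},$$

or roughly the half hour mentioned above (subtracting the offset already accumulated by 1972 barely changes the result).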
Unfortunately the critical ability – both of ideology and literature – is rather limited today. There is a certain type of commentary, thinking, and writing that both is perceived as and sees itself as critical thinking without actually being either critical or thinking (literary criticism in which the feelings of the reviewer are taken as the truth, academic research that merely restates an existing doxa, predictable moves in media debates that only confirm current positions). (…) But it is difficult to resist the demand for simple answers, clear stances, confirmation of identities, and solutions that correspond to the current order. And if you do resist the demand for simplicity, there is always someone else who will deliver the preferred answers and acceptable opinions.
Observe your own mood, and that of others, in the context of how recently they have eaten. If there’s a hothead in your circle, notice that his anger is greatest before meals, when hunger is highest, and rarely does he explode during meals or just after. When you feel agitated, try eating some carbs. They’re like a miracle drug. I suspect that anger is evolution’s way of telling you to go kill something so you can eat.
In 1889, when Friedrich Nietzsche suffered the mental collapse that ended his career, he was virtually unknown. Yet by the time of his death in 1900 at the age of 55, he had become the philosophical celebrity of his age. From Russia to America, admirers echoed his estimation of himself as a titanic figure who could alter the course of history. (…)
Suffering from violent migraines, Nietzsche resigned his academic post when he was 34 and began the life of a little-heeded nomad-intellectual in European resorts. With escalating intensity, he issued innovative works of philosophy that challenged every element of European civilization. He celebrated the artistic heroism of Beethoven and Goethe; denigrated the “slave morality” of Christianity, which transfigured weakness into virtue and vital strength into sin; and called on the strong in spirit to bring about a “transvaluation of all values.” The “higher man” — or as Nietzsche sometimes called him, the “overman” or “Übermensch” — did not succumb to envy or long for the afterlife; rather he willed that his life on earth repeat itself over and over exactly as it was. (…)
If God was dead, so too were equally fictitious entities like the self. There was no objective truth, only the truth-effects engendered by the workings of power and the instabilities of language. (…) More brilliantly than anyone, Nietzsche understood the peril of modern nihilism and the need to cultivate robust souls who would strive to achieve excellence without authoritative religious belief. (…)
Several decades before Nietzsche wrote, “What does not kill me makes me stronger,” Emerson wrote, “In general, every evil to which we do not succumb, is a benefactor.”
The third (and last) time I went to New Orleans was in September of 1978. I was living in Marin County, and I took the red-eye out of San Francisco, flying on a first-class ticket paid for by Universal Pictures, the studio that was financing the movie I was contracted to write. The story was to be loosely based on an article written by Hunter Thompson that had been recently published in Rolling Stone magazine. Titled “The Banshee Screams for Buffalo Meat,” the 30,000-word piece detailed many of the (supposedly) true-life adventures Hunter had experienced with Oscar Zeta Acosta, the radical Chicano lawyer who he’d earlier canonized in Fear and Loathing in Las Vegas.
Hunter and I were in New Orleans to attend the hugely anticipated rematch between Muhammad Ali and Leon Spinks, the former Olympic champion who, after only seven fights, had defeated Ali in February. The plan was to meet up at the Fairmont, a once-elegant hotel that was located in the center of the business district and within walking distance of the historic French Quarter. Although Hunter was not in his room when I arrived, he’d instructed the hotel management to watch for me and make sure I was treated with great respect.
“I was told by Mister Thompson to mark you down as a VIP, that you were on a mission of considerable importance,” said Inga, the head of guest services, as we rode the elevator up to my floor. “Since he was dressed quite eccentrically, in shorts and a Hawaiian shirt, I assumed he was pulling my leg. The bellman who fetched his bags said he was a famous writer. Are you a writer also?” I told her I wrote movies. “Are you famous?”
“No.”
“Do you have any cocaine?”
I stared at her. Her smile was odd, both reassuring and intensely hopeful. In the cartoon balloon I saw over her head were the words: I’m yours if you do. “Yes, I do.”
“That is good.”
Inga called the hotel manager from my room and told him, in a voice edged with professional disappointment, that she was leaving early because of a “personal matter.” After she hung up, she dialed room service and handed me the phone. She directed me to order two dozen oysters, a fifth of tequila, and two Caesar salads. Then, with a total absence of modesty, she quickly stripped off her clothes, walked into the bathroom, and a moment later I heard the water running in the shower.
One day in 1945, a man named Percy Spencer was touring one of the laboratories he managed at Raytheon in Waltham, Massachusetts, a supplier of radar technology to the Allied forces. He was standing by a magnetron, a vacuum tube which generates microwaves, to boost the sensitivity of radar, when he felt a strange sensation. Checking his pocket, he found his candy bar had melted. Surprised and intrigued, he sent for a bag of popcorn, and held it up to the magnetron. The popcorn popped. Within a year, Raytheon made a patent application for a microwave oven.
The history of scientific discovery is peppered with breakthroughs that came about by accident. The most momentous was Alexander Fleming’s discovery of penicillin in 1928, prompted when he noticed how a mould that floated into his Petri dish killed off the surrounding bacteria. Spencer and Fleming didn’t just get lucky. Spencer had the nous and the knowledge to turn his observation into innovation; only an expert on bacteria would have been ready to see the significance of Fleming’s stray spore. As Louis Pasteur wrote, “In the field of observation, chance favours only the prepared mind.”
The word that best describes this subtle blend of chance and agency is “serendipity.” (…)
Today’s internet plies us with “relevant” information and screens out the rest. Two different people will receive subtly different results from Google, adjusted for what Google knows about their interests. Newspaper websites are starting to make stories more prominent to you if your friends have liked them on Facebook. We spend our online lives inside what the writer Eli Pariser calls “the filter bubble.”
Thousands of characters — letters and obscure symbols — filled the more than 100 pages of a centuries-old text that had been located in East Berlin after the end of the Cold War. No one knew what the text meant, or even what language it was in. It was a mystery that USC computer scientist Kevin Knight and two Swedish researchers sought to solve. (…)
After months of painstaking work and a few trips down the wrong path, the moment finally came when the team knew it was on to something. Out of what had been gibberish emerged one word: ceremonie — a variation of the German word for ceremony. Knight said they figured out the rest from there.
Breaking the code on the document known as the Copiale Cipher revealed the rituals and political observations of an 18th century secret German society, as well as the group’s unusual fascination with eye surgery and ophthalmology.
But the larger significance of the team’s work wasn’t necessarily the discovery, it was how they arrived at it. (…)
“You start to see patterns, then you reach the magic point where a word appears,” he said. It was then, he said, “you no longer even care what the document’s about.”
The team ran statistical analyses of 80 languages, initially believing that the code lay in the Roman letters between the symbols that dotted the pages. Using a combination of brain power and computer wizardry, they broke the code by figuring out the symbols.
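The article doesn’t give the team’s actual algorithm, but the basic building block of statistical attacks like this is frequency analysis. A minimal, hypothetical sketch in Python (the symbol tokens and the German letter frequencies are rough illustrative values, not the Copiale data):

```python
from collections import Counter

# Approximate relative frequencies of the most common letters in German text
# (rough textbook values, used only for illustration).
GERMAN_FREQ = [("e", 0.174), ("n", 0.098), ("i", 0.076), ("s", 0.073), ("r", 0.070),
               ("a", 0.065), ("t", 0.062), ("d", 0.051), ("h", 0.048), ("u", 0.044)]

def rank_symbols(symbols):
    """Rank cipher symbols by relative frequency, the first step in guessing
    which plaintext letters they might stand for."""
    counts = Counter(symbols)
    total = sum(counts.values())
    return [(sym, n / total) for sym, n in counts.most_common()]

# Hypothetical ciphertext, represented as a list of symbol tokens.
ciphertext = ["lip", "tri", "arr", "lip", "oh", "lip", "tri", "uh", "lip", "arr"]

# Pair the most frequent cipher symbols with the most frequent German letters
# to get initial guesses; real solvers then refine these guesses until
# recognisable words (such as "ceremonie") start to appear.
for (symbol, freq), (letter, expected) in zip(rank_symbols(ciphertext), GERMAN_FREQ):
    print(f"{symbol}: {freq:.2f}  ->  candidate '{letter}' ({expected:.3f})")
```

The real work lies in scoring such candidate mappings against many languages at once, which is where the team’s statistical analyses of 80 languages came in.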
In a new study from the Centre for Addiction and Mental Health (CAMH), people with schizophrenia showed greater brain activity during tests that induce a brief, mild form of delusional thinking. This effect wasn’t seen in a comparison group without schizophrenia.
“We studied a type of delusion called a delusion of reference, which occurs when people feel that external stimuli such as newspaper articles or strangers’ overheard conversations are about them,” says CAMH Scientist Dr. Mahesh Menon, adding that this type of delusion occurs in up to two-thirds of people with schizophrenia. “Then they come up with an explanation for this feeling to make sense of it or give it meaning.”
The study was an initial exploration of the theory that the overactive firing of dopamine neurons in specific brain regions is involved in converting neutral, external information into personally relevant information among people with schizophrenia. This may lead to delusional symptoms. “We wanted to see if we could find a way to ‘see’ these delusions during Magnetic Resonance Imaging scanning,” says Dr. Menon.
Schizophrenia is a mental disorder characterized by a breakdown of thought processes and by poor emotional responsiveness. It most commonly manifests itself as auditory hallucinations, paranoid or bizarre delusions, or disorganized speech and thinking, and it is accompanied by significant social or occupational dysfunction. The onset of symptoms typically occurs in young adulthood, with a global lifetime prevalence of about 0.3–0.7%.
Genetics, early environment, neurobiology, and psychological and social processes appear to be important contributory factors; some recreational and prescription drugs appear to cause or worsen symptoms. Current research is focused on the role of neurobiology, although no single isolated organic cause has been found. The many possible combinations of symptoms have triggered debate about whether the diagnosis represents a single disorder or a number of discrete syndromes.
Despite the etymology of the term from the Greek roots skhizein (“to split”) and phren- (“mind”), schizophrenia does not imply a “split mind” and it is not the same as dissociative identity disorder—also known as “multiple personality disorder” or “split personality”—a condition with which it is often confused in public perception.
The mainstay of treatment is antipsychotic medication, which primarily suppresses dopamine (and sometimes serotonin) receptor activity.
Carissa Kowalski Dougherty explores how album covers moved from the purely functional to graphic works of art that conveyed the tone, mood, and feel of the lyric-less jazz music contained within. Dougherty also investigates how race is designated on the covers, an issue she says is inextricably linked to the music itself.
During the postwar period, African-American artists and musicians were confronting the same issues in their respective fields: how to retain their identity as black Americans while being recognized as skilled artists regardless of race; how to convey their own personal experiences; how to overcome discrimination; how to succeed in their field, and how to express pride in their African heritage—all without the aid of words.
Professor Trifonov analyzes the vocabulary of 123 existing definitions of life in order to chart a possible path toward minimal agreement among scientists. To this end, he compares the definitions from a linguistic point of view and ranks the terms used in them according to their frequency. (…)
The outcome of this analysis is a definition of life as “self-reproduction with variations.” (…)
Is “self-reproduction with variations” a good definition? Can this definition actually provide a minimal basis of consensus?
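A minimal sketch of that kind of term-frequency ranking in Python (the toy definitions and the stop-word list are illustrative stand-ins, not Trifonov’s corpus or method):

```python
import re
from collections import Counter

# Toy stand-ins for the corpus of definitions (Trifonov used 123 of them).
definitions = [
    "Life is a self-sustaining chemical system capable of Darwinian evolution.",
    "Life is matter that reproduces itself and evolves as survival dictates.",
    "Life is metabolism plus reproduction with heritable variation.",
]

# Words too generic to be informative; a real analysis would use a fuller list.
stop_words = {"life", "is", "a", "the", "of", "that", "and", "as", "with", "plus", "itself"}

counts = Counter()
for definition in definitions:
    words = re.findall(r"[a-z-]+", definition.lower())
    counts.update(w for w in words if w not in stop_words)

# The most frequently used terms across the definitions; in Trifonov's analysis
# the top-ranked terms cluster around reproduction and variation.
for term, n in counts.most_common(10):
    print(term, n)
```

Ranking shared vocabulary this way only surfaces what the definitions have in common; whether that common core amounts to a definition is exactly the question raised above.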
Names of countries in foreign languages (exonyms) often bear no relationship to the names of the same countries in their own official language or languages (endonyms). Such differences are generally accepted without complaint; the fact that English speakers refer to Deutschland as Germany and Nihon as Japan is not a problem for the governments or the people of those countries.
Occasionally, however, diplomats from a given country request that other governments change its name. (…)
Over the past several years, Georgia has been trying to convince a number of countries to call it “Georgia,” even though the Georgian name for the country is Sakart’velo.