nswd



ideas

Adam’s age at death is given as 930 years

64.jpg

life expectancy for men in 1907 was 45.6 years; by 1957 it rose to 66.4; in 2007 it reached 75.5. Unlike the most recent increase in life expectancy (which was attributable largely to a decline in half of the leading causes of death, including heart disease, homicide, and influenza), the increase in life expectancy between 1907 and 2007 was largely due to a decreasing infant mortality rate, which was 9.99 percent in 1907; 2.63 percent in 1957; and 0.68 percent in 2007.

But the inclusion of infant mortality rates in calculating life expectancy creates the mistaken impression that earlier generations died at a young age; Americans were not dying en masse at the age of 46 in 1907. The fact is that the maximum human lifespan — a concept often confused with “life expectancy” — has remained more or less the same for thousands of years. The idea that our ancestors routinely died young (say, at age 40) has no basis in scientific fact. […]

If a couple has two children and one of them dies in childbirth while the other lives to be 90, stating that on average the couple’s children lived to be 45 is statistically accurate but meaningless.

{ LiveScience | Continue reading | BBC }
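The couple example is just the arithmetic-mean trap. A minimal Python sketch (hypothetical cohort numbers, not figures from the article) shows how a high infant mortality rate drags the mean far below the age most adults actually reached:

```python
# Hypothetical cohort: 10% die in infancy (age 0), the rest at age 70.
# The mean ("life expectancy at birth") lands nowhere near either group.
cohort = [0] * 10 + [70] * 90
life_expectancy = sum(cohort) / len(cohort)
print(life_expectancy)  # 63.0: no one in the cohort died anywhere near 63

# The two-child example from the excerpt: one death at birth, one at 90.
children = [0, 90]
print(sum(children) / len(children))  # 45.0: accurate but meaningless
```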

‘If people remembered the same they would not be different people.’ –Nabokov

551.jpg

Say you travelled in time in an attempt to stop COVID-19’s patient zero from being exposed to the virus. However, if you stopped that individual from becoming infected, that would eliminate the motivation for you to go back and stop the pandemic in the first place. This is a paradox, an inconsistency that often leads people to think that time travel cannot occur in our universe. […] In the coronavirus patient zero example, you might try to stop patient zero from becoming infected, but in doing so you would catch the virus and become patient zero, or someone else would. No matter what you did, the salient events would just recalibrate around you. Try as you might to create a paradox, the events will always adjust themselves to avoid any inconsistency.

{ Popular Mechanics | Continue reading | More: Classical and Quantum Gravity }

What with reins here and ribbons there all your hands were employed so she never knew was she on land or at sea or swooped through the blue like Airwinger’s bride

62.jpg

I examine the relationship between unhappiness and age using data from eight well-being data files on nearly 14 million respondents across forty European countries and the United States and 168 countries from the Gallup World Poll. […] Unhappiness is hill-shaped in age and the average age where the maximum occurs is 49 with or without controls.

{ Journal of Economic Behavior & Organization | Continue reading }

A large empirical literature has debated the existence of a U-shaped happiness-age curve. This paper re-examines the relationship between various measures of well-being and age in 145 countries. […] The U-shape of the curve is forcefully confirmed, with an age minimum, or nadir, in midlife around age 50 in separate analyses for developing and advanced countries as well as for the continent of Africa. The happiness curve seems to be everywhere.

{ Journal of Population Economics | PDF }

photo { Joseph Szabo }

‘It ain’t what they call you… it’s what you answer to.’ –W.C. Fields

6.jpg

In the year 1930, John Maynard Keynes predicted that, by century’s end, technology would have advanced sufficiently that countries like Great Britain or the United States would have achieved a 15-hour work week. There’s every reason to believe he was right. In technological terms, we are quite capable of this. And yet it didn’t happen. Instead, technology has been marshaled, if anything, to figure out ways to make us all work more. In order to achieve this, jobs have had to be created that are, effectively, pointless. […]

productive jobs have, just as predicted, been largely automated away […] But rather than allowing a massive reduction of working hours to free the world’s population to pursue their own projects, pleasures, visions, and ideas […] It’s as if someone were out there making up pointless jobs just for the sake of keeping us all working. And here, precisely, lies the mystery. In capitalism, this is precisely what is not supposed to happen. Sure, in the old inefficient socialist states like the Soviet Union, where employment was considered both a right and a sacred duty, the system made up as many jobs as they had to (this is why in Soviet department stores it took three clerks to sell a piece of meat). But, of course, this is the very sort of problem market competition is supposed to fix. According to economic theory, at least, the last thing a profit-seeking firm is going to do is shell out money to workers they don’t really need to employ. Still, somehow, it happens.

{ David Graeber | Continue reading }

what I am calling “bullshit jobs” are jobs that are primarily or entirely made up of tasks that the person doing that job considers to be pointless, unnecessary, or even pernicious. Jobs that, were they to disappear, would make no difference whatsoever. Above all, these are jobs that the holders themselves feel should not exist.

Contemporary capitalism seems riddled with such jobs.

{ The Anarchist Library | Continue reading }

image { Alliander, ElaadNL, and The incredible Machine, Transparent Charging Station, 2017 }

The sun is there, the slender trees, the lemon houses

Moringa oleifera, an edible tree found worldwide in the dry tropics, is increasingly being used for nutritional supplementation. Its nutrient-dense leaves are high in protein quality, leading to its widespread use by doctors, healers, nutritionists and community leaders, to treat under-nutrition and a variety of illnesses. Despite the fact that no rigorous clinical trial has tested its efficacy for treating under-nutrition, the adoption of M. oleifera continues to increase. The “Diffusion of Innovations” theory describes well the evidence for growth and adoption of dietary M. oleifera leaves, and it highlights the need for a scientific consensus on the nutritional benefits. […]

The regions most burdened by under-nutrition (in Africa, Asia, Latin America, and the Caribbean) all share the ability to grow and utilize an edible plant, Moringa oleifera, commonly referred to as “The Miracle Tree.” For hundreds of years, traditional healers have prescribed different parts of M. oleifera for treatment of skin diseases, respiratory illnesses, ear and dental infections, hypertension, diabetes, and cancer, and for water purification, and have promoted its use as a nutrient-dense food source. The leaves of M. oleifera have been reported to be a valuable source of both macro- and micronutrients, and the tree is now found growing within tropical and subtropical regions worldwide, congruent with the geographies where its nutritional benefits are most needed.

Anecdotal evidence of benefits from M. oleifera has fueled a recent increase in adoption of and attention to its many healing benefits, specifically the high nutrient composition of the plant’s leaves and seeds. Trees for Life, an NGO based in the United States, has promoted the nutritional benefits of Moringa around the world, and their nutritional comparison has been widely copied and is now taken on faith by many: “Gram for gram fresh leaves of M. oleifera have 4 times the vitamin A of carrots, 7 times the vitamin C of oranges, 4 times the calcium of milk, 3 times the potassium of bananas, ¾ the iron of spinach, and 2 times the protein of yogurt” (Trees for Life, 2005).

Feeding animals M. oleifera leaves results in both weight gain and improved nutritional status. However, scientifically robust trials testing its efficacy for undernourished human beings have not yet been reported. If the wealth of anecdotal evidence (not cited herein) can be supported by robust clinical evidence, countries with a high prevalence of under-nutrition might have at their fingertips a sustainable solution to some of their nutritional challenges. […]

The “Diffusion of Innovations” theory explains the recent increase in M. oleifera adoption by various international organizations and certain constituencies within undernourished populations, in the same manner as it has explained the adoption of many innovative agricultural practices of the 1940s–1960s. […] A sigmoidal curve (Figure 1) illustrates the adoption process, starting with innovators (traditional healers, in the case of M. oleifera), who communicate with and influence early adopters (international organizations), who then broadcast new information on M. oleifera over time, in the wake of which the adoption rate steadily increases.

{ Ecology of Food and Nutrition | Continue reading }

The operation was a success, but the patient died

61.jpg

Currently, we produce ∼10²¹ digital bits of information annually on Earth. Assuming a 20% annual growth rate, we estimate that after ∼350 years from now, the number of bits produced will exceed the number of all atoms on Earth, ∼10⁵⁰. After ∼300 years, the power required to sustain this digital production will exceed 18.5 × 10¹⁵ W, i.e., the total planetary power consumption today, and after ∼500 years from now, the digital content will account for more than half of Earth’s mass, according to the mass-energy-information equivalence principle. Besides the existing global challenges such as climate, environment, population, food, health, energy, and security, our estimates point to another singular event for our planet, called the information catastrophe.

{ AIP Advances | Continue reading }
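The paper’s headline number is a compound-growth calculation. A quick sketch (using only the figures quoted above: ∼10²¹ bits per year, 20% annual growth, ∼10⁵⁰ atoms) reproduces the order of magnitude:

```python
import math

# Figures from the excerpt (assumed exact here for illustration).
bits_per_year = 1e21   # digital bits produced annually today
growth = 1.20          # 20% annual growth rate
atoms = 1e50           # approximate number of atoms on Earth

# Years until annual bit production exceeds the number of atoms:
# bits_per_year * growth**n >= atoms  =>  n = log(atoms/bits) / log(growth)
years = math.log(atoms / bits_per_year) / math.log(growth)
print(round(years))  # 366 -- the same order as the paper's ~350 years
```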

It is estimated that a week’s worth of the New York Times contains more information than a person was likely to come across in a lifetime in the 18th century. […] The amount of new information is doubling every two years. By 2010, it’s predicted to double every 72 hours. […] The lunatic named Bobby Fischer “despised the media”: “They’re destroying reality, turning everything into media.” “News exceed reality,” writes Thomas Bernhard somewhere. The saturation and repetitions in Basquiat’s paintings. The high-frequency trading. “An immense accumulation of nothing” (Imp Kerr, 2009). An immense accumulation of ignorance. […]

From what precedes it necessarily follows that the inescapable future of knowledge is banality, falsehood, and overabundance, which sum is a form of ignorance.

{ The New Inquiry | Continue reading }

‘Never stop dreaming.’ –Freddy Krueger

5.jpg

Enjoying short-term pleasurable activities that don’t lead to long-term goals contributes at least as much to a happy life as self-control, according to new research. […]

simply sitting about more on the sofa, eating more good food and going to the pub with friends more often won’t automatically make for more happiness.

{ UZH | Continue reading }

‘Tout vainqueur insolent à sa perte travaille.’ –Jean de La Fontaine

5.jpg

Parrondo’s paradox has been described as: A combination of losing strategies becomes a winning strategy. […]

Consider two games, Game A and Game B, with the following rules:

1. In Game A, you simply lose $1 every time you play.
2. In Game B, you count how much money you have left. If it is an even number, you win $3. Otherwise you lose $5.

Say you begin with $100 in your pocket. If you start playing Game A exclusively, you will obviously lose all your money in 100 rounds. Similarly, if you decide to play Game B exclusively, you will also lose all your money in 100 rounds.

However, consider playing the games alternately, starting with Game B, followed by A, then by B, and so on (BABABA…). It should be easy to see that you will steadily earn a total of $2 for every two games.

Thus, even though each game is a losing proposition if played alone, because the results of Game B are affected by Game A, the sequence in which the games are played can affect how often Game B earns you money, and subsequently the result is different from the case where either game is played by itself.

{ Wikipedia | Continue reading }
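The deterministic example above is easy to verify by simulation. A short Python sketch (function names are mine, not Wikipedia’s):

```python
def play_a(money):
    """Game A: always lose $1."""
    return money - 1

def play_b(money):
    """Game B: win $3 if your holdings are even, otherwise lose $5."""
    return money + 3 if money % 2 == 0 else money - 5

def run(sequence, money=100, rounds=100):
    """Play `rounds` games, cycling through the given sequence of A/B."""
    for i in range(rounds):
        game = sequence[i % len(sequence)]
        money = play_a(money) if game == "A" else play_b(money)
    return money

print(run("A"))   # 0: playing A alone loses all $100 in 100 rounds
print(run("B"))   # 0: B alone alternates +$3/-$5, also ruin in 100 rounds
print(run("BA"))  # 200: BABA... nets +$2 per pair of games
```

Alternating wins precisely because Game A's $1 loss flips the parity that Game B's payoff depends on, keeping holdings even whenever B is played.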

Give me a ho, if you’ve got your funky bus fare, ho

What is the feasibility of survival on another planet and being self-sustaining? […] I show here that a mathematical model can be used to determine the minimum number of settlers and the way of life for survival on another planet, using Mars as the example. […] The minimum number of settlers has been calculated and the result is 110 individuals.

{ Nature | Continue reading }

supp? hating myself rn. lol.

Do women need to have children in order to be fulfilled? […] motherhood status did not explain differences in self-reported life satisfaction, and mothers reported only slightly greater happiness than women who were not mothers.

{ Social Psychological and Personality Science | Continue reading }

‘“I am a Microsoft Word man.” Says the human dressed like Microsoft Word.’ –David A Banks

32.jpg

David Silver [the creator of AlphaZero] hasn’t answered my question about whether machines can set up their own goals. He talks about subgoals, but that’s not the same. That’s a certain gap in his definition of intelligence. We set up goals and look for ways to achieve them. A machine can only do the second part.

So far, we see very little evidence that machines can actually operate outside of these terms, which is clearly a sign of human intelligence. Let’s say a machine has accumulated knowledge in one game. Can it transfer this knowledge to another game, which might be similar but not the same? Humans can. With computers, in most cases you have to start from scratch.

{ Garry Kasparov/Wired | Continue reading }

photo { Kelsey Bennett }

Arms apeal with larms

31.jpg

The madman theory is a political theory commonly associated with U.S. President Richard Nixon’s foreign policy. He and his administration tried to make the leaders of hostile Communist Bloc nations think Nixon was irrational and volatile. According to the theory, those leaders would then avoid provoking the United States, fearing an unpredictable American response.

{ Wikipedia | Continue reading }

The author finds that perceived madness is harmful to general deterrence and is sometimes also harmful in crisis bargaining, but may be helpful in crisis bargaining under certain conditions.

{ British Journal of Political Science | Continue reading }

black smoke shells fitted with computer chips { Cai Guo-Qiang, Wreath (Black Ceremony), 2011 }

‘And now it goes as it goes and where it ends is Fate.’ –Aeschylus

27.jpg

Founded in 1945 by University of Chicago scientists who had helped develop the first atomic weapons in the Manhattan Project, the Bulletin of the Atomic Scientists created the Doomsday Clock two years later, using the imagery of apocalypse (midnight) and the contemporary idiom of nuclear explosion (countdown to zero) to convey threats to humanity and the planet. The decision to move (or to leave in place) the minute hand of the Doomsday Clock is made every year by the Bulletin’s Science and Security Board in consultation with its Board of Sponsors, which includes 13 Nobel laureates. The Clock has become a universally recognized indicator of the world’s vulnerability to catastrophe from nuclear weapons, climate change, and disruptive technologies in other domains.

To: Leaders and citizens of the world
Re: Closer than ever: It is 100 seconds to midnight
Date: January 23, 2020

{ Bulletin of the Atomic Scientists | Continue reading }

my wife said I never listen to her, or something like that

[W]hile time moves forward in our universe, it may run backwards in another, mirror universe that was created on the “other side” of the Big Bang.

{ PBS (2014) | Continue reading }

‘Nous avons exagéré le superflu, nous n’avons plus le nécessaire.’ –Proudhon

bambi.png

{ Sergei Eisenstein, On Disney }

I’m in that black on black Porsche Panamera, in the back like “ooh wee”

21.jpg

Using Music as Medicine – finding the optimum music listening ‘dosage’

There was a general agreement on dosage time across 3 of the 4 domains, with 11 minutes being the most common amount of time it took for people to receive the therapeutic benefit from their self-selected music preferences. The only exception was the domain of happiness, where the most common length of time for people to become happier after listening to their chosen music was reduced to 5 minutes, suggesting that happy music takes less time to take effect than other music.

{ British Academy of Sound Therapy | PDF | More }

photo { Sarah Illenberger }

As you spring so shall you neap

pierpaolo-ferrari.jpg

Most of the research on happiness has documented that income, marriage, employment and health affect happiness. Very few studies examine whether happiness itself affect income, marriage, employment and health. […] Findings show that happier Indonesians in 2007 earned more money, were more likely to be married, were less likely to be divorced or unemployed, and were in better health when the survey was conducted again seven years later.

{ Applied Research in Quality of Life | Continue reading }

image { Maurizio Cattelan and Pierpaolo Ferrari, Toilet Paper #1, June 2010 }

Olobobo, ye foxy theagues!

44.jpg

English speakers have been deprived of a truly functional second-person plural pronoun since we let “ye” fade away a few hundred years ago.

“You” may address one person or a bunch, but it can be imprecise and unsatisfying. “You all”—as in “I’m talking to you all,” or “Hey, you all!”—sounds wordy and stilted. “You folks” or “you gang” both feel self-conscious. Several more economical micro-regional varieties (youz, yinz) exist, but they lack wide appeal.

But here’s what’s hard to explain: The first (“y’all”), a gender-neutral option, mainly thrives in the American South and hasn’t been able to steal much linguistic market share outside of its native habitat. The second (“you guys”), an undeniable reference to a group of men, is the default everywhere else, even when the “guys” in question are women, or when the speaker is communicating to a mixed-gender group.

“You guys” rolls off the tongues of avowed feminists every day, as if everyone has agreed to let one androcentric pronoun pass, while others (the generic “he” or “men” as stand-ins for all people) belong to the before-we-knew-better past. […]

One common defense of “you guys” that Mallinson encounters in the classroom and elsewhere is that it is gender neutral, simply because we use it that way. This argument also appeared in the New Yorker recently, in a column about a new book, The Life of Guy: Guy Fawkes, the Gunpowder Plot, and the Unlikely History of an Indispensable Word by writer and educator Allan Metcalf.

“Guy” grew out of the British practice of burning effigies of the Catholic rebel Guy Fawkes, Metcalf explains in the book. The flaming likenesses, first paraded in the early 1600s, came to be called “guys,” which evolved to mean a group of male lowlifes, he wrote in a recent story for Time. Then, by the 18th century, “guys” simply meant “men” without any pejorative connotations. By the 1930s, according to the Washington Post, Americans had made the leap to calling all persons “guys.”

{ Quartz | Continue reading }

Tony: [to Lady and Tramp with an Italian accent] Now-a, first-a we fix the table-a.

6.jpg

related { Disney Plus warns users of ‘outdated cultural depictions’ in old movies }

Lee Jun-fan (November 27, 1940 – July 20, 1973), known professionally as Bruce Lee

51.jpg

of course there is no behind the scenes, no real self, no authenticity, etc. just a precession of simulacra; influencers sort of serve the same function Baudrillard thought Disneyland served: to make everyone else feel “authentic”

{ Rob Horning }



kerrrocket.svg