Tuesday, December 11, 2012

On the 12th day of Christmas ... your gift will just be junk | George Monbiot | Comment is free | The Guardian


While I find some of his examples a bit too frivolous, our addiction to consumption (for both psychological and, perhaps more importantly, economic reasons) is a serious issue. If I were to name some of our behaviours which might be viewed by our descendants as morally reprehensible, then I think this consumption would rank highly, for two reasons: first, due to its impact on the world's resources and climate change, and hence the consequences for innumerable future generations; and secondly, due to the near-slavelike cheap labour system that underpins it all. Yet we are oblivious to the scale and moral nature of the problem.


The only possible silver lining is that modern technology does allow new, immaterial forms of consumption and commerce, with the classic example being apps. The creation and consumption of virtual objects could provide sustainable growth. Alas, while in principle this sounds promising, in reality the relative volumes probably make it insignificant compared to the inevitable material economy.


Thursday, November 29, 2012

Clare Carlisle on Evil

An interesting series on the concept of 'evil' in the Guardian: Clare Carlisle | The Guardian
While I personally have, of course, no interest in the theological uses of the term, I do think it is still important as a psychological concept – not because it is intrinsically valid, but because humanity so often reaches for it to explain/deal with the horrors of life, natural and man-made.

part 1: how can we think about evil?
The religious idea that thinking about evil involves coming to terms with a darkness in all our hearts provides food for thought

part 2: does it exist?
St Augustine's theory was that evil was 'nothing other than the absence of good' – an idea supported by modern science
  • Surprisingly, though, the basic insight of Augustinian theodicy finds support in recent science. In his 2011 book Zero Degrees of Empathy, Cambridge psychopathology professor Simon Baron-Cohen proposes "a new theory of human cruelty". His goal, he writes, is to replace the "unscientific" term "evil" with the idea of "empathy erosion": "People said to be cruel or evil are simply at one extreme of the empathy spectrum," he writes. (He points out, though, that some people at this extreme display no more cruelty than those higher up the empathy scale – they are simply socially isolated.)  Loss of empathy resembles the Augustinian concept of evil in that it is a deficiency of goodness – or, to put it less moralistically, a disruption of normal functioning – rather than a positive force. In this way at least, Baron-Cohen's theory echoes Augustine's argument, against the Manicheans, that evil is not an independent reality but, in essence, a lack or a loss
part 3: does freedom make us evil?
Kierkegaard believed that human sin was a result of a combination of pride and fear in the face of freedom
  • Many are suspicious of the Christian concept of sin, but Kierkegaard's reinterpretation of the traditional doctrine is illuminating from a psychological perspective as well as from a religious one. While Augustine thought that Adam and Eve's first sin was transmitted to their descendants biologically – through sexual intercourse – Kierkegaard rejects this literal explanation of sin. Our failure to be good, he argues, is due to the way we deal with being both less and more free than we wish to be. Like stroppy, insecure teenagers, we crave independence, resent authority, and are scared to take responsibility for ourselves. By emphasising the importance of both humility and courage, Kierkegaard suggests a way to cope with this predicament – a non-moralistic basis for morality. And by pointing to the fear that lies beneath evil, he uncovers something common to both victims and perpetrators.
Reading this raised an interesting thought: from a theological point of view, is freedom of the will a good thing? Assuming any creator could make us good/bad or free, and would punish us accordingly, then how is being free – having the chance to fail – much better than being created bad and destined to fail from the start? And of course, since 'free to choose' implies an innate character which does the choosing, this character itself can only be good/bad, or else (to avoid infinite regress) random. Again, no fair basis for ultimate judgement.

part 4: the social dimension
Does contemporary society give rise to conditions more conducive to evil than in the past?

part 5: making sense of suffering
One of the basic purposes of our culture is to interpret suffering, to make it meaningful. Myth, art and religion all do this job

part 6: the trial of Eichmann (1)
In finding Hitler's transport administrator guilty, the court recognised him as a free, morally responsible human being

part 7: the trial of Adolf Eichmann (2)
At the heart of Eichmann's banality was not thoughtlessness but evasiveness, and the 'interplay between knowing and willing'

Monday, October 15, 2012

Evil, part one: how can we think about evil? | Clare Carlisle | Comment is free | guardian.co.uk


I find the concept of 'evil' an intriguing and important one, since in the modern world – aware as we are of how environment and genetics influence, and yet do not determine, our behaviour – it is important to debate whether such a term is usable, or useful.

While this is something I need to think about and write on at greater length, my first thoughts are that 'evil' is often used as a way to give up thinking about, or seeking further, the causes of action. To consider someone evil is to render them unknowable and unchangeable – almost to strip them of their humanity – and I have an instinctive revulsion towards something which so stinks of over-simplification. Given that the worst inhumanities we are capable of generally have their roots in such a dehumanizing of other groups or people, it has to be an approach we are wary of using. 'With us or against us' may be psychologically soothing, but we have to be careful and justified when choosing such dichotomies.

That said, I also think it is highly likely that there are some people who technically are not human in moral respects – psychopaths, for example. Without dealing with the cause or curability of this condition, at the very least it can be said that there are cases of psychopathic behaviour which are not just chosen, but firmly grounded in differing brain states – blindness to the emotions of others, etc. In such cases, these people would indeed seem to qualify as 'evil' – beyond the pale of reason or (social) redemption, uninfluenced by either deterrent or punishment.

However the problem is that 'evil', while on the one hand removing people from the moral sphere, is itself a moral judgement. While we may consider evil people as acting like animals, we don't judge them as animals. Since morality implies choice, it is as if they choose to be animals, choose not to be human. But to me this raises a fundamental question: what is the difference between having the biological setup which makes one choose to act like a psychopath, and having the (presumably morally neutral) misfortune to be a psychopath? What is the difference between being born as someone who is evil, and being born as someone who chooses to be evil? If there is responsibility, there is a character, but whence the blame for the character?

So it seems to me that the term 'evil' is not something which has a place in the modern world. It is a moral sop, a relic of a religious heritage. The point of course is that this might make no practical difference – we will still need to protect ourselves from, and punish, those who 'do' evil – but there is no place in moral judgement for horrific nature, even if it's human.


Tuesday, October 9, 2012

Tories go back to basics on burglars | Politics | The Guardian

Of all the recent policy announcements, there are few that can be as primitive or populist as the latest 'batter a burglar' (as the Sun put it) policy from the UK Conservatives.

Currently householders are allowed to use 'proportionate' force, but now the Tories are pushing for (literally) 'disproportionate' force to be acceptable as well. The very language itself highlights the lack of logic: if the law is there in general to prescribe suitable, i.e. proportionate, actions and reactions, how can it be validly used to justify unsuitable actions as well?

Of course the standard argument is that people confronted by a burglar are afraid to react with force at all, in case they are later judged to have overreacted. But since there are presumably still some limits as to what can be meted out to an intruder (I think the phrase will be 'grossly disproportionate'), surely the same ambiguity remains, just shifted to the more violent end of the spectrum. In fact the consequences for homeowners could then be much worse, since overdoing a severe beating is always going to be a more serious offence (with risk of death or permanent injury) than overdoing a minor one. And of course if burglars can expect to be attacked, then they will be prepared for it, whether by bringing weapons, attacking the homeowner first, or being more vicious when they do so.

And what all of this misses is the moral argument. If a homeowner can use (again, literally) excessive force, then it moves from the realm of defence and prevention into punishment and vengeance. Is it then a fair punishment for the crime of trying to steal an Xbox to be severely beaten or injured? Or killed? Surely it is this reasoning that lies behind non-US prohibitions on a free-for-all against home invaders – a realization that having your house robbed, while upsetting, annoying and possibly in some way traumatizing, is still in the great scheme of things not the worst of crimes, and hence should be treated reasonably by the courts, not left dependent on the inflamed passions of a scared homeowner.

Of course it would be nice to think that the mere deterrent would put burglars off, but the sad fact is that most are probably driven to it by situation or addiction, and are not going to make such a rational decision. To see that this is the case, one only needs to look at the US, where death sentences and gun-toting homeowners don't seem to have resulted in some calm paradise where break-ins and robberies have been eradicated. As someone who shouldn't have been famous might have said, 'how's that shooty fry-ey thing working out for y'all?'





Wednesday, September 19, 2012

Facebook and Twitter: the art of unfriending or unfollowing people | Technology | The Guardian


I am particularly intrigued by the idea that friend clutter relates to an intrinsic problem in dealing with endings: beginnings (marriages, births) are culturally celebrated, even though every beginning also marks a different ending.
some extracts:
  • Technology exposes us to vastly more opportunities for making social connections, and far more effortlessly than even a stroll down the street and a handshake. Yet an etiquette for terminating those links, should they outlive their mutual benefit – if they ever had any – remains as absent as ever.
  • Physical clutter...We think we want this stuff, but, once it becomes clutter, it exerts a subtle psychological tug. It weighs us down. The notion of purging it begins to strike us as appealing, and dumping all the crap into bin bags feels like a liberation. "Friend clutter", likewise, accumulates because it's effortless to accumulate it: before the internet....Friend clutter exerts a similar psychological pull.
  • Last year, a writer of romance novels from Illinois named ArLynn Presser embarked upon what you might call an audit of her so-called friends... she made a New Year's resolution to visit them all, to find out why – or, indeed, whether – they were friends.
  • [however] according to an ever-growing body of evidence, social media isn't making us lonelier or less deeply connected. Instead, study after study endorses the idea of "media multiplexity": people who communicate lots via one medium, it turns out, are the kind of people who communicate lots via others as well. Regular emailers are more likely also to be regular telephoners, one study found; people who use Facebook multiple times a day, according to another investigation, have 9% more close ties in their overall social network, on average, than those who don't. Social media builds social capital, rather than degrading it.
  • Even the chilling statistic about more Americans lacking a confidant now looks dubious: a new analysis by the sociologist Claude Fischer concluded that the finding arose because of a change in how the questions were asked.
  • The anthropologist and evolutionary psychologist Robin Dunbar famously calculated "Dunbar's number" – the notion that the largest number of meaningful social relationships that any one person can maintain is somewhere around 150.
  • Online networks have a tendency to obliterate the nuances between different kinds of relationships. Despite Facebook's lists, privacy settings and the rest, Mullany points out, "ultimately, somebody is either your friend on Facebook or they're not. In real life, we're very political about our friendships, and I don't mean that in a bad way." There are friendships we'll let fade to nothing; others for which we'll put on a facade for a few hours at Christmas; or friendships of necessity, where we'll give the impression of intimacy without the reality. In contrast, "Facebook essentially doesn't allow us to be political."
  • The more profound truth behind friend clutter may be that, as a general rule, we don't handle endings well. "Our culture seems to applaud the spirit, promise and gumption of beginnings," writes the sociologist Sara Lawrence-Lightfoot in her absorbing new book, Exit: The Endings That Set Us Free, whereas "our exits are often ignored or invisible". We celebrate the new – marriages, homes, work projects – but "there is little appreciation or applause when we decide (or it is decided for us) that it's time to move on". We need "a language for leave-taking", Lawrence-Lightfoot argues, and not just for funerals.

Sunday, August 19, 2012

Insight: The dark side of Germany's jobs miracle | Reuters

Wage restraint and labor market reforms have pushed the German jobless rate down to a 20-year low, and the German model is often cited as an example for European nations seeking to cut unemployment and become more competitive. But critics say the reforms that helped create jobs also broadened and entrenched the low-paid and temporary work sector, boosting wage inequality.

  • Labor office data show the low wage sector grew three times as fast as other employment in the five years to 2010, explaining why the "job miracle" has not prompted Germans to spend much more than they have in the past. Pay in Germany, which has no nationwide minimum wage, can go well below one euro an hour, especially in the former communist east German states.
  • Trade unions and employers in Germany traditionally opt for collective wage agreements, arguing that a legal minimum wage could kill jobs, but these agreements only cover slightly more than half the population and can be circumvented
  • Critics say Germany's reforms came at a high price as they firmly entrenched the low-wage sector and depressed wages, leading to a two-tier labor market. New categories of low-income, government-subsidized jobs - a concept being considered in Spain - have proven especially problematic. Some economists say they have backfired. They were created to help those with bad job prospects eventually become reintegrated into the regular labor market, but surveys show that for most people, they lead nowhere.
  • While wage inequality used to be as low in Germany as in the Nordic countries, it has risen sharply over the past decade. "The poor have clearly lost out to the middle class, more so in Germany than in other countries," said OECD economist Isabell Koske. Depressed wages and job insecurity have also kept a lid on domestic demand, the Achilles heel of the export-dependent German economy, much to the exasperation of its neighbors.
  • ILO's Ernst says Germany can only hope that other European countries do not emulate its own wage-deflationary policies too closely, as demand will dry up: "If everyone is doing the same thing, there won't be anyone left to export to."

Thursday, August 9, 2012

Decision Quicksand

Should you read this story? Why you're having trouble deciding - Red Tape
some extracts:
  • Little decisions cause a big problem precisely because they are surprisingly hard. Faced with too many options, consumers unconsciously connect difficulty with importance, and their brains are tricked into heavy deliberation mode.
  • Instead of realizing that picking a toothbrush is a trivial decision, we confuse the array of options and excess of information with decision importance, which then leads our brain to conclude that this decision is worth more time and attention.
  • Research shows that time spent in decision quicksand before a choice correlates with dissatisfaction after the fact. And of course, there's all that wasted time and emotional energy.
  • Set decision rules and stick to them. In other words, start with a time limit that reflects the true importance of the choice. For example, "I will book a flight in 5 minutes, no matter what."
  • Breaks can also help. Spending time away from a decision-making process can free the brain from an obsessive loop. "Even minor interruptions, short breaks, or momentary task switching can change information processing from a local, bottom-up focus to a top-down, goal-directed mode."
I wonder if this is in any way related to the phenomenon described by Antonio Damasio in his book Descartes' Error, where a subject with frontal lobe damage, which meant he had limited emotional engagement in decisions, couldn't decide between two roughly equal dates for an appointment. It seemed the lack of an emotional weight one way or the other (or to end the process) resulted in an endless rational reasoning loop. Could another factor in decision quicksand be that trivial decisions lack emotional weight, and hence are vulnerable to such infinite analysis? And does this say anything about those of us (like me) who seem particularly inclined to fall into it?

Wednesday, August 8, 2012

The paradox of behaviour tests

Failure rate of 50% a worrying statistic for drivers - The Irish Times - Wed, Aug 08, 2012



It seems 50% of cars fail their yearly roadworthiness test in Ireland, which raises some interesting paradoxes. Should this be viewed as a good thing, in that so many dangerous cars are identified and taken off the road? Or is the testing contributing to the problem, since, knowing that there will be a yearly test, people are skipping regular maintenance and services? This is an interesting example of a common undesired side-effect of checking human behaviour: it removes responsibility for the problem from the individual, which means both bad and good behaviour is reduced.

I could see something similar occurring in France, where motorists are now legally obliged to carry breathalysers with them. Initially, and overall, I think this is a good thing, since most people do not want to drive over the limit, but might often (even wilfully) assume they are not. However, if confronted with clear evidence (from their own breathalyser) that they are, they are left no moral wiggle room, and can only continue to drive if they make a clear decision to break the law, as opposed to just hoping they aren't. And of course those that would drive regardless would do so anyway, test or not. The corollary of the test, though, is that there are probably many cases where people overestimate their blood alcohol content, and don't drive because they think they are over the limit when they actually aren't (the limit is apparently surprisingly high for some people). These people would never have driven when fully inebriated, but now would be encouraged to do so when partially so. Given that any alcohol in the system affects performance, there is a likelihood that this will actually result in these people crashing when they otherwise wouldn't have.

So while testing is overall needed, since it clamps down on the extremes, it is never without unintended consequences.

Eat, fast, live longer

Another interesting (if not yet reliably useful) food/health documentary last night: Horizon's "Eat, fast, live longer" which reported on recent studies suggesting occasional extreme fasting (~50 cals per day for 4 days every few months) or regular (alternate-day) near-fasting (~400 cals per day) could provide major health benefits, even if people ate normally (or even badly) the rest of the time.

(BBC link here, guardian review here)
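For a rough sense of scale, here is a back-of-the-envelope calculation (a minimal Python sketch) of what the alternate-day regime implies for average intake. Note that the 2,000 kcal 'normal day' baseline is purely my own illustrative assumption, not a figure from the programme:

normal_day_kcal = 2000   # assumed typical unrestricted day (illustration only)
fast_day_kcal = 400      # the near-fast days reported in the documentary
average_kcal = (normal_day_kcal + fast_day_kcal) / 2
print(f"{average_kcal:.0f} kcal/day, {1 - average_kcal / normal_day_kcal:.0%} below baseline")
# -> 1200 kcal/day, 40% below baseline

So even with completely unrestricted eating every other day, the regime would amount to a sizeable overall calorie cut, which may be part of why it echoes the older calorie-restriction findings.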

This is not the first time I've heard of how low calorie intake is linked with longevity, but until now I only knew about people who were (without a lot of human-based evidence) living long-term with massively reduced energy intakes, which didn't seem very worthwhile: even if it did make them live longer, it reduced their energy levels so much it was debatable whether they were able to live life at all. The recent research, however, though presumably arising from the same evidence, suggests that it might not (just) be consistent reductions in calories that bring benefits, but even short-term occasional fasts. There seem to be signs that the body enters a kind of 'repair' mode when starving, and there are hopeful indicators of how this might reduce or even prevent illnesses such as heart disease or stroke.

While I can't comment on the scientific basis (which was admitted to be new and limited, albeit promising), there are perhaps a few general common-sense points that can be made. My own personal 'gut instinct' is that any diet which matches our evolution has to make sense; hence modern artificial concentrations of sugars, salt, and fat have to be viewed with suspicion. Beyond this though, as incredibly versatile omnivores, it is probably quite hard to work out what our ideal pre-modern diet actually was. It's likely that life was hard and resources scarce, so it does make sense that it was largely plant-based, since presumably hunting was harder and more unpredictable. However, given that we can consume animal meat, and in certain cultures (Eskimos) can survive on it, I would be wary of excluding it altogether, although it might make sense to limit it. So I would think a balanced diet, with an emphasis on plants, sounds like a good idea, and I have never seen strong evidence against this.

However, leaving aside the type of food and focusing on the quantity, such a boom-and-bust cycle of nutrition could also fit well with the latest fasting research, since our bodies would have done well to adapt to (and even make use of) such times of shortage. The problem, however, as with all neat evolutionary stories, is that it is important to keep in mind the arena in which evolution operates, something which might not align well with modern life expectancies. Evolution works on the fitness of the animal primarily until it procreates, and then perhaps for some while after as it supports its offspring/descendants. The problem is, given that ancient man might only have lived till 30 or 40, and heart attacks and cancer are mainly illnesses of later years (perversely, rising cancer rates can be a good thing, since they indicate a society living longer), it's very possible that there is nothing evolutionary in us to combat them. This doesn't mean there aren't optimum ways of extending our lifespan, but rather that we are inventing them, not re-discovering them. It might be that the issue is not making the machine run as it should, but finding ways to make it run as it could.

However, this then raises the question as to whether the same approaches fit all stages of life. It could be that in earlier, fitter, more active years one type of diet is optimum (and matches evolutionary history), but in later years a change of strategy is needed. For example, I wonder about the link between growth and cancer. Several times recently I have seen evidence about diets which seem to reduce cancer rates, but perhaps also impact normal cell growth. This of course makes sense, since cancer ultimately is excessive growth, so it is logical that what would stop growth normally would stop cancer as well. But what is far less clear is whether this is a desirable thing under normal healthy circumstances. Like constant restriction of calories, could the cure (limited life) be worse than the disease (risk of limited lifetime)? It could even be the case that in certain stages of life one approach makes sense (run the body at full power, albeit putting it under stress) but at other, older, stages another is appropriate (scale back to focus on preservation, not performance). Ultimately of course a balance is needed, but making the judgement needs more evidence and consideration than just jumping on the latest diet that shows (even if verified) improvement in one range of parameters. Better cholesterol and lower glucose levels are good things, but what else changes, and what are the overall effects? From what I remember, the studies (such as those mentioned in the film Forks over Knives) involved people who were already unhealthy – older, obese etc. It is one thing to provide a solution for people who clearly have a problem; it is another to show that solution is applicable to preventing that problem in the first place.

In the spirit of balance and moderation, I therefore think fasting diets sound so extreme that it would be prudent to wait for more evidence. One probably reliable maxim that can be taken is that overconsumption, even without obesity – of meats etc. – might be a problem. So it is probably a safe bet to limit, but definitely not exclude, meats; ramp up plant elements; and overall keep consumption down and in line with activity. Still, who knows how this research will develop, and all thought about food is valuable food for thought.


Thursday, July 19, 2012

Are Believers Really Happier Than Atheists?



Who is better off: the religious or atheists? Cultural values determine the answer

http://www.scientificamerican.com/article.cfm?id=healthy-skepticism


  • Being religious is often linked with greater well-being. New research suggests that the effect is culture-specific.
  • A strong predictor of a person’s religiosity is the condition of the society in which he or she lives.
  • Finding communities and social groups that align with your beliefs can improve life satisfaction.


Monday, July 16, 2012

The Money-Empathy Gap


New research suggests that more money makes people act less human. Or at least less humane.


  • Earlier this year, Piff, who is 30, published a paper in the Proceedings of the National Academy of Sciences that made him semi-famous. Titled “Higher Social Class Predicts Increased Unethical Behavior,” it showed through quizzes, online games, questionnaires, in-lab manipulations, and field studies that living high on the socioeconomic ladder can, colloquially speaking, dehumanize people. It can make them less ethical, more selfish, more insular, and less compassionate than other people. 
  • In a country that likes to think that class doesn’t matter, these social scientists are beginning to prove just how determinative money is.
  • Nor does it attempt to apply its conclusions about the selfishness and solipsism of a broad social stratum to every member within it: Gateses and Carnegies have obviously saved lives and edified generations, and one of the biggest predictors of a person’s inclination to donate to charity is how much money he has.
  • studies of ethical behavior indicate a strong correlation between high socioeconomic status and interpersonal disregard. It’s an “additive” effect; the fever line points straight up. “People higher up on the socioeconomic ladder are about three times more likely to cheat than people on the lower rungs,” he says. Piff’s research also suggests that people who yearn to be richer or more prominent make different choices than those more content with their present level of material comfort.
  • Americans across the board can have a high tolerance for inequality if they believe it is meritocratic. The research by Piff and his colleagues points to a different possible explanation for the income gap: that it may be at least in part psychologically destined. This in turn raises the ancient conundrum of chicken and egg. If getting or having money can make you hard-hearted, do you also have to be hard-hearted to become well-off in the first place? The bulk of the new research points decisively in the direction of the former
  • “Upper-class drivers were the most likely to cut off other vehicles even when controlling for time of day, driver’s perceived sex, and amount of traffic.” When Piff designed a similar experiment to test drivers’ regard for pedestrians, in which a researcher would enter a zebra crossing as a car approached it, the results were more staggering
  • In experiments she published in the journal Science in 2006, Vohs "primed" her subjects to think about money, which is to say she planted the idea of money in their minds without their knowledge before observing their social interactions compared with a control group... Every subject in the study bent down to pick up the mess. But the money-primed subjects picked up 15 percent fewer pencils than the control group. In a conversation in her office in May, Vohs stressed that money-priming did not make her subjects malicious—just disinterested. "It's not a bad analogy to think of them as a little autistic," she said. "I don't think they mean any harm, but picking up pencils just isn't their problem."
  • Over and over, Vohs has found that money can make people antisocial. She primes subjects by seating them near a screen-saver showing currency....Vohs showed that money-primed subjects gave less time to a colleague in need of assistance and less money to a hypothetical charity. 
  • “Money,” says Vohs, “brings you into functionality mode. When that gets applied to other people, things get mucked up. You can get things done, but it does come at the expense of people’s feelings or caring about them as individuals.”
  • The corollaries to this poverty work are potentially explosive: Wealth may give you a better brain. It may make you a more strategic thinker, a savvier planner... And the cognitive benefits of affluence may accrue incrementally, speculates Dovidio, so that very rich people have better brain functioning than moderately rich people. These hypotheses are at the untested frontier of the new science: "I think in ten years we'll have a compelling story on this," says Dacher Keltner, the psychologist who oversees the work of Piff and his colleagues. But already the outline is becoming clear. Princeton University psychologist Eldar Shafir has shown that in environments of abundance, people make better financial decisions—it's not that rich people tend to be better educated and can afford better advice, but that people living paycheck to paycheck don't have the mental space to make the smartest long-term moves. The efficiencies of the affluent brain may trigger the shutting down of what the researchers call "pro-social" impulses and lead people toward the kinds of behaviors that a hedge-fund manager I spoke to characterized as "ruthless."
  • This is Hazel Markus’s main research interest: the mind-sets of class. She and her colleagues have found, broadly speaking, that the affluent value individuality—uniqueness, differentiation, achievement—whereas people lower down on the ladder tend to stress homogeneity, harmonious interpersonal relationships, and group affiliation
  • The American Dream is really two dreams. There’s the Horatio Alger myth, in which a person with grit, ingenuity, and hard work succeeds and prospers. And there’s the firehouse dinner, the Fourth of July picnic, the common green, in which everyone gives a little so the group can get a lot. Markus’s work seems to suggest the emergence of a dream apartheid, wherein the upper class continues to chase a vision of personal success and everyone else lingers at a potluck complaining that the system is broken. (Research shows that the rich tend to blame individuals for their own failure and likewise credit themselves for their own success, whereas those in the lower classes find explanations for inequality in circumstances and events outside their control.) But the truth is much more nuanced. Every American, rich and poor, bounces back and forth between these two ideals of self, calibrating ambitions and adjusting behaviors accordingly. Nearly half of Americans between 18 and 29 believe that it’s “likely” they’ll get rich, according to Gallup—in spite of all evidence to the contrary

Monday, July 9, 2012

TED talk: consumers create jobs, not the rich


TED talk by Nick Hanauer:
(with transcript at http://lybio.net/tag/nick-hanauer-ted-talks-the-inequality-speech-transcription/)

Extract:
"I have started, or helped start, dozens of companies and initially hired lots of people. But if there was no one around who could afford to buy what we had to sell, all those companies and all those jobs would have evaporated. Source: LYBIO.net
That's why I can say with confidence that rich people don't create jobs, nor do businesses, large or small. Jobs are a consequence of a circle of life-like feedback loop between customers and businesses. And only consumers can set in motion this virtuous cycle of increasing demand and hiring. In this sense, an ordinary consumer is more of a job creator than a capitalist like me.
That's why when business people take credit for creating jobs, it's a little bit like squirrels taking credit for creating evolution. It's actually the other way around.
Anyone who's ever run a business knows that hiring more people is a course of last resort for capitalists. It's what we do if, and only if, rising customer demand requires it. And in this sense, calling yourselves job creators isn't just inaccurate, it's disingenuous."

Sunday, July 8, 2012

Longer prison terms really do cut crime, study shows | Law | The Observer


Since the impact seemed to be greatest with repeat offenders, however, it does indicate that the longer sentences are simply reducing the number of criminals at large, rather than acting as deterrence or cure. Furthermore, it could then just provide a temporary lull due to an extended prison pipeline, and hence not be a long-term solution at all. Executions would work even better, but would be no more justifiable.

Tuesday, July 3, 2012

Bankers and the neuroscience of greed | Ian Robertson | Comment is free | guardian.co.uk


  • power is one of the most potent brain-changing drugs known to humankind; unconstrained power has enormously distorting effects on behaviour, emotions and thinking.
  • Researchers at Tilburg University showed that people made to feel more powerful cheated more when they believed themselves to be unobserved. Power also made ordinary people more hypocritical when making judgments about moral dilemmas, being much more strict in applying rules to others, but much more lax in applying them to themselves. Even tiny amounts of artificial power, in other words, increased both immorality and hypocrisy.
  • Paul Piff of the University of California, Berkeley found in a US-based study that, compared with lower-class people, upper-class individuals were more likely to break the law while driving, to show unethical tendencies in decision-making, to take valued goods from others, to lie in a negotiation, to cheat in order to improve their chances of winning a prize, and to endorse unethical behaviour in a work situation.
  • It has become a cliche to explain the behaviour of bankers in terms of greed, but cliches are not always wrong. Power and money both act on the brain's reward system, which if over-stimulated for long periods develops appetites that are difficult to satisfy, just as is the case for drug addiction. We call these appetites greed, and greedy people are never satisfied. That is the challenge for politicians and regulators.

Thursday, May 17, 2012

Feel bad to feel good

Alain de Botton on self-help books: "[They] make the grave assumption that the best way to cheer someone up is to tell them that all will be well. They are utterly cut off from the spirit of their more noble predecessors, who knew that the fastest way to make someone feel well is to tell her that things are as bad as, and possibly much worse than, she could ever have thought. Or, as Seneca put it so well, 'What need is there to weep over parts of life? The whole of it calls for tears.'"

Tuesday, May 15, 2012

The Irrationality of Irrationality: The Paradox of Popular Psychology | Guest Blog, Scientific American Blog Network

Extracts:
  • People do not compensate sufficiently for missing information even when it is painfully obvious that the information available to them is incomplete.
  • we humans love narratives...But narratives are also irrational because they sacrifice the whole story for one side of a story that conforms to one's worldview. Relying on them often leads to inaccuracies and stereotypes... rarely do we ask: 'What more would I need to know before I can have a more informed and complete opinion?'
  • The last several years have seen many popular psychology books that touch on this line of research....[BUT] when people learn about how we irrationally jump to conclusions they form new opinions about how the brain works from the little information they recently acquired. They jump to conclusions about how the brain jumps to conclusions and fit their newfound knowledge into a larger story that romantically and naively describes personal enlightenment
  • The crux of the problem, as Cowen points out [http://www.youtube.com/watch?v=RoEEDKwzNBw], is that it's nearly impossible to understand irrationalities without taking advantage of them. And, paradoxically, we rely on stories to understand why they can be harmful
  • Brenner, Koehler and Tversky...reduced conclusion jumping by getting people to consider the other information at their disposal
  • Ultimately, we need to remember what philosophers get right. Listen and read carefully; logically analyze arguments; try to avoid jumping to conclusions; don't rely on stories too much. The Greek playwright Euripides was right: Question everything, learn something, answer nothing.

Wednesday, May 9, 2012

Ten things you need to know about tax | Money | The Guardian


extracts:
  • "Taxes are the price we pay for civilisation" – US supreme court justice Oliver Wendell Holmes
  •  Philosopher Peter Singer addresses this issue in his book on American politics, The President of Good and Evil. "It makes no sense to talk of the money you would have if the government did not levy taxes," he writes. Imagine, he suggests, you're working for a car manufacturer and get $1,000 a week, $200 of which is taken in taxes. Why can't I donate that $200 to the donkey sanctuary or use it to destroy my septum with illegal drugs, professor? Well, says Singer, your car company could not make cars without a legal system that protects mining rights, private ownership of land, accepted currency, transport systems, energy production, an educated labour force, patent protection, judicial resolution of disputes, national defence, protection of trading routes
  • Nobel prize-winning economist Herbert Simon once estimated that it is such social capital – or the social environment in a wealthy country such as the US or UK – that enables its residents to generate 90% of its income.

Friday, May 4, 2012

the wrong point in the rights argument

Watching the BBC2 programme on the furore over certain human rights cases (Rights gone wrong?), it seemed to me that most of the controversial cases had the same basic problem: the primary mechanism to handle these cases was inadequate to provide the sense of justice the public desired, but public anger was directed at the way follow-up approaches were then often blocked by the European Court of Human Rights.

So, for example, the case of the failed asylum seeker who ran over a 12-year-old and left her dying in the street. After serving his (unbelievably short) four-month sentence he was allowed to stay in the country (on the grounds of the right to a family life) because he had in the meantime married an English woman. While the length of the sentence sounds ridiculous, the point is that it was only this which related to his crime, and it was this that was actually lacking. Once he had served his time he couldn't be punished in further ways on the basis of it, yet this is what was basically being demanded in deporting him and breaking up his family. Of course it is abhorrent to think of him leading a normal life after having paid such a small penalty for killing a young girl and bringing untold grief to her family, but the problem is his original sentence. It would be as if such a criminal won the lottery afterwards, and suddenly was seen to live a life of luxury. It would be frustratingly unfair in the grand scheme of things, but there couldn't really be a process by which such winnings would be taken off him 'just because'.

The same misplaced reaction is to be seen in the other cases mentioned. There was the issue of forced marriages, and how raising the age limit of people entering the UK to be married had (until struck down by Strasbourg) reduced the phenomenon. Again, the point is that something should be done to tackle the problem directly, not rely on indirect methods (if no marriages were allowed at all, then the number of forced ones would also drop). Similarly with the Abu Qatada case: the fundamental problem here is that nothing can be done to him legally in the UK for what he is doing, so they want to deport him instead (and can't, because of rights against shipping people off to kangaroo courts). If the UK has a problem with his behaviour it needs to be able to bring in laws to deal with it, not rely on indirect workarounds (and of course the point is there are no such laws because it is such a complicated and sensitive topic).

What I do admit is more complicated is the case of prisoners having the right to vote. When it comes to ex-convicts, I would think the logic should be that if they have served their sentence, then afterwards they should get back their normal rights. Either they have paid their price or they haven't, and if they haven't, then there is something wrong with the original sentencing. What I don't perhaps agree with is the insistence on rights while in prison. If society has the right to take away some rights (their liberty), then I don't see why it can't take away others, for a specified amount of time. Maybe it is the blanket nature of the denial which is the problem, but surely this could be resolved. And of course there could be a case for certain extra limitations – for example, psychological analysis – to refrain from returning the right to vote after prison, but this would perhaps be too slippery a slope to start down, since having to meet certain conditions set by the state in order to be allowed to vote for it could lead to unwanted places.

What was most informative in the programme was the history of the convention on human rights, and how it had been, of all people, conservatives like Winston Churchill who had pushed most for it in the post-war era. And it is of course very debatable whether (as is the case in the UK) the democratically chosen laws of a country should be subject to a foreign court. The main point in this argument, I would think, is that it is in general a good thing to have a semi-inviolate set of rules (constitution, bill of rights, etc.) which cannot be easily discarded by lawmakers, even when that would be popular at certain times; but these rules need to be something society as a whole signs up for, and perhaps repeatedly, and one problem with the European convention is that too much unanticipated interpretation and stress might be being placed on a too-old set of rules.


However, as stated above, it seems most of the bad reputation the convention gets is from a minority of cases with unappealing outcomes, in which the convention is involved, but not primarily at fault. And when one considers the great bulwark the convention acts as against unfair intrusion by the powers that be into the lives of individuals, and as the rules of a club into which countries with serious problems can be coerced, then much careful debate and analysis would be required before tinkering with it.



Wednesday, May 2, 2012

On writing, memory, and forgetting

I only recently encountered the Zeigarnik Effect (that people remember uncompleted or interrupted tasks better than completed tasks) for the first time, in a book on concentration, and had only thought about it in negative terms, since it could lead to constant distraction as unfinished tasks repeatedly popped back into mind. So it is intriguing (and after consideration very plausible) to think that it might have a more general and positive role in maintaining memories, or at least the ones that matter (which maybe are just those that are still 'open'). In this article, Maria Konnikova wonders if this effect was something recognized as far back as Socrates, and whether his warnings then against the written word might be relevant today with respect to our embracing of online tools and databases. In both cases we are delegating the mental effort of memory to something external to us, and while this is useful and necessary for the preservation of the data itself, it is perhaps worth considering what effect it has on our own remaining awareness of that information, once we have successfully 'shelved' it. Indeed, she even mentions a study which suggested that people are far less able to recall information that they expect to be able to access in the future. Socrates on Google; now that's prescient....

On writing, memory, and forgetting: Socrates and Hemingway take on Zeigarnik
Some extracts from Konnikova's article:
  • I can't help but think of an admonition that came centuries before Ms. Zeigarnik sat down to her Viennese coffee: Socrates' reproach in The Phaedrus that the written word is the enemy of memory. In the dialogue, Socrates recounts the story of the god Theuth, or Ammon, who offers the king Thamus the gift of letters:
 This, said Theuth, will make the Egyptians wiser and give them better memories; it is a specific both for the memory and for the wit. Thamus replied: O most ingenious Theuth, the parent or inventor of an art is not always the best judge of the utility or inutility of his own inventions to the users of them. And in this instance, you who are the father of letters, from a paternal love of your own children have been led to attribute to them a quality which they cannot have; for this discovery of yours will create forgetfulness in the learners' souls, because they will not use their memories; they will trust to the external written characters and not remember of themselves. The specific which you have discovered is an aid not to memory, but to reminiscence, and you give your disciples not truth, but only the semblance of truth; they will be hearers of many things and will have learned nothing; they will appear to be omniscient and will generally know nothing; they will be tiresome company, having the show of wisdom without the reality.
  • In this [Hemingway's] view, talking something through – completing it, so to speak, off the page – impedes the ability to actually create it to its fullest potential. Somehow, that act of closure, of having talked through a piece of work, takes away the motivation to finish.
  • the advice offered by the author Justin Taylor: "Don't take notes. This is counterintuitive, but bear with me. You only get one shot at a first draft, and if you write yourself a note to look at later then that's what your first draft was – a shorthand, cryptic, half-baked fragment"
  • Hemingway seems to be, in many ways, on the same page as Socrates, and on the same page as Zeigarnik and her foundational studies of our memories' curious quirks. What's more, the more we know about memory, the more true it seems to be that we somehow let go of the information that we no longer feel we absolutely must hold on to. Last year, a study by Betsy Sparrow and colleagues, published in Science, suggested that people are far less able to recall information that they expect to be able to have access to in the future. Instead, they remember where and how to find that information. I would never give up the ability to record, to access, to research endless topics at the click of a button. But, with Hemingway and Socrates never far from mind, I may be slightly more cautious about how I use that ability.



Note: the paper "Google Effects on Memory" is discussed in this article, and the abstract is:


Google Effects on Memory: Cognitive Consequences of Having Information at Our Fingertips
Betsy Sparrow, Jenny Liu, Daniel M. Wegner
The advent of the Internet, with sophisticated algorithmic search engines, has made accessing information as easy as lifting a finger. No longer do we have to make costly efforts to find the things we want. We can 'Google' the old classmate, find articles online, or look up the actor who was on the tip of our tongue. The results of four studies suggest that when faced with difficult questions, people are primed to think about computers and that when people expect to have future access to information, they have lower rates of recall of the information itself and enhanced recall instead for where to access it. The Internet has become a primary form of external or transactive memory, where information is stored collectively outside ourselves.

Tuesday, May 1, 2012

news nibbling - think before you read

Throughout history, information has been the driver of progress – the more we knew, the more we could do. While in previous ages it was only a lucky elite who had the time and resources to be involved in this process, the impetus was always a common human drive, which is why the internet, perhaps the ultimate information-gathering and disseminating technology, has been seized upon and spread into every area of modern life. We are hard-wired by evolution, and encouraged by culture, to discover and learn, and the one hunger which is considered always 'good' is the hunger for knowledge.

But is it possible that we could have too much of a good thing? Reading a piece on concentration recently, I identified all too well with the statistics on how much time is diverted into online distraction. While of course we all click occasionally on light and meaningless topics, funny YouTube links etc., the real point is that most of this time is not necessarily spent on such frivolous 'junk' but on seemingly worthwhile matter, like news, comment or reports. But the article's description of 'the long tail of information porn' made me realise that ultimately there might not be too much difference between this supposedly worthy, weighty stuff and the lighter dross, since both tap into innate urges for the novel, the desire for new facts to tickle our fancy, even if an intellectual one. These cravings are now amply catered to via the internet, and it is perhaps worth considering if there are parallels between the way we satisfy, and satiate, ourselves informationally, and how modern advances allowed us to go from simply providing for, to pandering to, other urges. Our dispositions were shaped by our evolutionary history, and the problem arises when a drive matched to a scarce natural resource is confronted with artificial plenty. The classic example of this is our taste for sugary and fatty things, and how this healthy drive is driven unhealthily haywire by the hyper-sources of these substances modern society has created, from fast food to sweets. Even though we know too much is bad for us, we are driven to start for evolutionary, emotional reasons, and find it hard to stop for rational ones.

It is possible that a similar (if much weaker) story is now starting to play out with information. Last year I encountered for the first time the phrase 'psychological obesity', and it struck me as encapsulating the dangers perfectly. Could it be that we are getting so used to gorging ourselves on information, and over-relying on the mechanisms which deliver it, that we are at risk of missing out on the real benefits that underlie it, and which made it an evolutionary goal? It seems ludicrous to suggest that more information is anything but a good thing, but maybe in harsher times and societies the same could be thought about food, and what would seem to be the insane possibility of eating too much. The point is that what we learn is only important in so far as it contributes to what we actually know, and what we can do with it. Just as food is only a resource to enable the production of energy, to be able to DO things, information and even knowledge itself is only a means to an end. Since it worked for most of our evolutionary history, nature has used the shortcut of embedding in us the emotional drive for the means, but now that the normally required levels can be surpassed, we must use our rational self-control to focus on the meaning instead. We need to realise that we cannot drift through the new oceans of information scooping up data like whales with plankton, since, as in all areas, we can over-consume, and like the obese body disabled by too many resources, we won't be able to use even some of them properly.

There is, however, probably a psychological side to this problem, since apart from the drive for gain, there is our powerful aversion to loss. Faced with a deluge of data we need to choose what is most important and relevant to us, but to choose some things is to discard others, and this we hate doing. In the material domain we (mostly) have grown to appreciate that we can't actually have it all at the same time, but we are less used to such limitations in our mental worlds. Furthermore, this involves one of the more personal and poignant kinds of loss, since to choose what to know is to choose what to be, and we are particularly reluctant to close off possible futures, possible selves. Choice is hard, which is why we find it easier to surf endlessly on the tsunami of information rather than swim in any particular direction. We feel like we're doing valid travelling, but really we're going nowhere.

To really move forward we not only need to obtain information, we need to process it, and apart from the requirement of manageable amounts, we also need to develop the mental mechanisms and habits to do so properly. Again, the internet and other technology are a double-edged sword, since while they provide us with valid and valuable tools and shortcuts, they do not (yet, or for the foreseeable future) provide the complete answer. To use data we need to know more than just how to access it, which webpage or search tool will find it for us, but how to interpret it, and how it relates. While I may not need to know the exact dates of the events leading up to, say, the First World War, knowing where to find them will never allow me to apply this information in another area. Only if I have analyzed the history, thought about the chronology, and recognized the patterns, will I actually UNDERSTAND what is otherwise simply a sequence of events, and be able, for example, to spot parallels and resonances in current situations. Wikipedia means I do not need to clutter my mind with the minutiae, but it can only help me once I have already understood the meaning.

Unfortunately this takes time, and effort. We must choose what we want to know, and then educate ourselves, and there is no progress bar which can speed this up for us, since we need to allow our minds to mull over and manipulate the data, and bring their unparalleled pattern-matching powers to bear. But in this too the mood of the internet is against us, because the constant drive for the new means constant turnover, and little time for such consideration. I was struck once by a comment a travel blogger made to me about how, even when writing about places and societies that were hundreds or even thousands of years old, in the blogosphere there was a mentality that if it was not immediate, not in real time, it was somehow less relevant. When writing about a trip along the ancient route of the silk road, he felt there was a demand for him to post while on the move, from phone or net café, even though the thing being written about had not changed for centuries. The web audience feels that anything written after the fact, in retrospect, is somehow already dated, and this is not just removed from what is needed for proper knowledge, but anathema to it. I admit I succumb to this prejudice myself often enough: when choosing a psychology book, for example, anything more than ten years old feels somehow outdated and less worthwhile than a newer text. While of course theories develop, a classic remains a classic precisely because it achieves an insight that lasts, and it could be argued that a decade-old book that is still around has proved its worth by its endurance, whereas something new might be a flash in the pan. Ironically, the effect of the modern zeitgeist is to trap us in the now, while the point of deep knowledge is to allow us to escape to the past and future as well.

So the upshot of all this is that I feel it wouldn't be a bad thing to impose on myself a bit of a 'digital diet'. Just as one can count calories to manage one's weight, one needs to become an information connoisseur. To know anything I need to accept that I cannot know everything.

Tuesday, April 24, 2012

Missing Children and handling the media

As a parent, nothing is more unnerving than reading about a case of child abduction. The problem is that these stories are so emotionally disturbing, so vivid, and on such a close and important theme, that the usual human biases in handling media stories kick in to maximum effect. For all the vague awareness of the actual improbabilities, there's always a voice that says, "yes, unlikely, but what if", which is very hard to counter. One problem is this 'vague awareness', and the media itself is often to blame for it. For example, this Guardian article highlights that 2,185 children are reported missing in the US every day. This sounds horrific, until one realises that, given the sensitivity of the topic, many a scared mother rings the police when little Johnny isn't home on time, even though he's probably just thoughtlessly gone round to little Billy's house without telling her. At least (to the article's credit, and a great advantage of reading online) the source of this statistic (the US National Center for Missing Children) is provided by hyperlink, and while still not good (since not zero), the actual statistics for the number and type of abductions are:


The U.S. Department of Justice reports
  • 797,500 children (younger than 18) were reported missing in a one-year period of time studied resulting in an average of 2,185 children being reported missing each day.
  • 203,900 children were the victims of family abductions.
  • 58,200 children were the victims of non-family abductions.
  • 115 children were the victims of “stereotypical” kidnapping. (These crimes involve someone the child does not know or someone of slight acquaintance, who holds the child overnight, transports the child 50 miles or more, kills the child, demands ransom, or intends to keep the child permanently.)
This would indicate that only about a third (~33%) of those reported missing were actual abductions (still ~700 a day), but most importantly that only 115, or ~0.014%, were 'real' kidnappings (though the description is not perfectly clear here).
Looking at the demographics for the US, it seems there are ~74.7 million children under 18, so 115 per year works out at ~1 in 650,000, or between 1 and 2 in a million. Of course this still means over a hundred children a year, and each individual case is a horrific tragedy if it really comes to the worst, but it has to be remembered that there are many, many horrible events at this order of probability (even focused on children – for example, more than 1 in 6,000 children under 15 in the US have cancer, making it roughly 100 times more likely than a stereotypical abduction). The point is not to fear these things, but to fear them in proportion to their likelihood, since if we really dwelt on everything that could happen we would be paralyzed by fear. So while it is horrible to read about such stories as Etan Patz or Madeleine McCann, we can at least be comforted by the fact that they make the news because they are news, something out of the ordinary. It's a scary world, but (most of the time) the odds are with us.
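For anyone who wants to check the arithmetic, here is a minimal Python sketch using only the figures quoted above (the annual totals from the list, the ~74.7 million under-18 population, and the rough 1-in-6,000 cancer prevalence; everything is an approximation):

```python
# Back-of-the-envelope check on the missing-children figures quoted above.

reported_missing = 797_500          # children reported missing per year
family_abductions = 203_900
nonfamily_abductions = 58_200
stereotypical_kidnappings = 115
children_under_18 = 74_700_000      # approximate US under-18 population

abductions = family_abductions + nonfamily_abductions
print(f"Abductions as share of reports: {abductions / reported_missing:.1%}")  # ~32.9%
print(f"Abductions per day: {abductions / 365:.0f}")                           # ~718

kidnap_share = stereotypical_kidnappings / reported_missing
print(f"'Stereotypical' share of reports: {kidnap_share:.3%}")                  # ~0.014%

annual_risk = stereotypical_kidnappings / children_under_18
print(f"Annual risk per child: 1 in {1 / annual_risk:,.0f}")                    # ~1 in 650,000

# Comparison with the ~1-in-6,000 childhood-cancer figure cited above:
print(f"Cancer vs kidnapping risk: ~{(1 / 6_000) / annual_risk:.0f}x")          # ~108x
```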

Monday, April 23, 2012

The future internet, no dogs allowed?

Online identity: is authenticity or anonymity more important?
Before Facebook and Google became the megaliths of the web, the most famous online adage was, "on the internet, no one knows you're a dog". It seems the days when people were allowed to be dogs are coming to a close

http://www.guardian.co.uk/technology/2012/apr/19/online-identity-authenticity-anonymity

The argument (and trend) seems to be that real identities enable more people to trust the web, and thus take part in it:
"Allan believes the benefits of authentic identity outweigh the costs. Facebook and other services with an assurance of security and credibility are more inclusive, and open up the web to new audiences who never would have gone online before, he says. "We're optimists. Facebook enables hundreds of millions of people to express themselves online because they didn't have or know how to use the tools they needed." Facebook, he believes, is a stepping stone to the rest of the web."

My personal position is that while it would be good to have consistent identities online (regular use of the same pseudonym, or a limited set of pseudonyms, across platforms), there are many reasons why this should not mean one's real-life identity. The biggest reason is perhaps the danger people would expose themselves to, from the obvious cases of antagonizing a repressive (or even democratic) system's laws, to the more widespread risk of 'peer-to-peer' persecution and mob justice. The internet often brings out the worst in both people and states in how they react to perceived violations of norms and values, and revealing one's true identity would expose one to this. Of course it may be argued that if everyone were identifiable then the risk from trolls etc. would also be removed, but while this might temper it, and prevent outright illegal persecution, it would not prevent plenty of malign and vindictive behaviour. Furthermore, the nastiest people are often the most passionate, and hence the most likely to go to the effort of bypassing any authentication system (something which will always be possible, no matter how well implemented).

Furthermore, on a lower level, but still important, there is the problem of interaction with real-world relationships – friends, acquaintances, co-workers, employers etc. The sharing and linking of the web means anything one says or does can easily be passed on, taking it out of the intended social circle, and out of context. The result would be that, if real identities were required on services such as Blogger, one would have the option of either limiting one's audience to a (hopefully) trustworthy small circle of friends, or resorting to comments so bland and inane as to be worthless (and even this might not work, since even a fair and balanced statement might anger a racist or bigoted associate).

It is interesting that such an anonymous-but-traceable approach is supported even in the restrictive regimes of China and South Korea:
"An online identity can be as permanent as an offline one: pseudonymous users often identify themselves in different social networks using the same account name. But because their handles aren't based on real names, they can deliberately delineate their identity accordingly, and reassert anonymity if they wish. Psychologists argue that this is valuable for the development of a sense of who one is, who one can be, and how one fits into different contexts. This kind of activity is allowed even in countries where social network account holders are required to register for a service using a national ID, as in South Korea and China; their online public identities are still fabrications. Even with this explicit link with the state, when users are aware that their activities online are traceable, identity play continues."

This would make it seem that in some ways China is more permissive than Google: although the state might want to know if you're a dog, it will still let you bark around the virtual park if you want; when it comes to Google+, however, no dogs allowed.

Sunday, April 22, 2012

how many freeloaders in a trusting society?

I always think it is a good sign of the overall health of their societies that cities like Munich and Vienna, despite their size, can still run an 'honesty' ticket system on public transport. While of course there are regular checks, my own experience is to have travelled many times and had my ticket checked only once or twice, and to have only once seen someone else in the carriage forced to pay during such a check, so the level of conformity is obviously quite high. But it is interesting to get some hard numbers on just how many 'cheats' these checks turn up. The following story from the Austrian news site ORF reports 60,000 violators out of approximately 2 million people checked. While 60,000 sounds like a lot, it works out at just 3% (60,000 / 2,000,000), which is remarkably low. One would also expect the checks to be carried out at the times when people are most likely to cheat (I would guess there's more temptation at night than on a regular morning commute), so the overall percentage is probably even slightly lower. Good to hear, and a good example of how people can be trusted to do the right thing... at least in Vienna!
http://wien.orf.at/news/stories/2529980/

Share and enjoy

Now for the good news – sharing can make you happy. Pass it on
some extracts:
-Social media have tapped into something quite fundamental, and the sharing urge in human nature may stem from something more basic than anything else: simple arousal and the fight-or-flight response that we share with our distant ancestors.
-A group of students was asked either to sit still or jog in place for 60 seconds and then to read a neutral news article that they could email to anyone if they so desired. Fully 75% of the joggers decided that the article was fascinating enough to email to someone else. And the non-joggers? Only 33% thought sharing was the right option. Perhaps Facebook's next major acquisition should be StairMaster.
-being virtually rejected actually activates the same brain areas that are associated with physical distress, the dorsal anterior cingulate cortex and the anterior insula.
-Psychologist and novelist Charles Fernyhough once referred to Twitter – on Twitter, of course – as "a great example of what Piaget called 'collective monologues'. Lots of people chattering away with no attention to each other."
-Indeed, a recent study suggested that individuals who ranked higher on emotional instability were more likely to share online, though not in person, echoing the findings of psychologist John Cacioppo that a greater proportion of online interactions correlates with increased loneliness and isolation. Clearly, not all sharing is created equally.
-According to unpublished results by Eva Buechel, now at the University of Miami, online sharing can actually make us feel better, serving as a very real form of emotional therapy. It's as if every tweet that gets passed on, every link that is re-shared activates our brains' pleasure centres, releasing endorphins in much the same way as physical pleasure, exercise, excitement or strong sensory stimulation.

Wednesday, April 18, 2012

Moral behaviour in Animals

Excellent TED talk from Frans de Waal on moral behaviour in animals:

"Empathy, cooperation, fairness and reciprocity -- caring about the well-being of others seems like a very human trait. But Frans de Waal shares some surprising videos of behavioral tests, on primates and other mammals, that show how many of these moral traits all of us share."
http://www.ted.com/talks/frans_de_waal_do_animals_have_morals.html

The basic point is that the foundations for the pillars of our morality – fairness/reciprocity and compassion/empathy – can be seen in animal behaviour, and thus have a long evolutionary history.

In this talk de Waal shows examples of co-operation even when there is no immediate gain for one partner, and, perhaps more importantly, of awareness of another, and even concern. So while it is perhaps unsurprising that chimpanzees will work together to obtain a treat, or even that one will help out when not hungry (presumably in anticipation of the favour being returned in the future), it is less expected that, even when they obtain the same treat either way, they will favour a mechanism whereby another chimp obtains something as well over one in which only they are rewarded. The most amusing example was of two Capuchin monkeys being handed different treats. Initially the first monkey was happy to be rewarded with some cucumber, but when it saw its neighbour being rewarded with a grape for the same task, it not only got upset, but hurled the cucumber back at the experimenter. Merely being upset, while still acting economically rationally, would already indicate awareness of how the other monkey is being treated, but the fact that the first monkey is so upset as to reject its treat completely is irrational, and fits well with a concept of fairness so emotionally ingrained that its violation trumps immediate self-interest (as captured in the human phrase 'to cut off one's nose to spite one's face').

The ability to co-operate in light of future interaction, awareness of others, and an ingrained and, importantly, emotionally powerful sense of fairness are key to a functioning moral system, and it is fascinating to see these at work in other species, showing their clear evolutionary basis.

While not shown in the talk, de Waal even suggested the Capuchin monkeys once demonstrated the ultimate moral element – self-denial in solidarity with others – when one monkey was seen to refuse the grape unless the other got one too. While this might be aberrant Capuchin behaviour, it too must have some evolutionary history, and so could well be present, if only partially and easily overridden, in such animals.


Tuesday, April 17, 2012

You choose your friends, but only 150 of them

Interesting talk from TEDxObserver given by Robin Dunbar:

http://www.youtube.com/watch?v=07IpED729k8&list=PL880EAF3F736F99AC&index=6&feature=plpp_video
"Robin, currently director of the Institute of Cognitive and Evolutionary Anthropology of the University of Oxford, is renowned for creating a formula which is now known as 'Dunbar's number' - and that number is 150. This calculates the 'cognitive limit' of the number of people we can hold meaningful friendships with. When it was first formulated it created a fevered debate about the nature of and the differences between, online and real 'friendships'.
Robin will explore the psychology and ethology of romantic love to find out if the brain - and science - can help us explain how and why we fall in love."


In the talk he also mentions that we have concentric circles of friendship, with the average number of close friends being five, then a next level of about fifteen, and so on. Perhaps not surprisingly, being in a romantic relationship costs two close friends, due mainly to the lack of time available to devote to such friendships. Indeed, time spent is a key factor in how friendships are maintained, though there are, it seems, gender differences, with women spending more time in verbal communication (the average female phone call was something close to an hour) than men (the average male phone call something like seven seconds: 'meet you at 10? yip! grand so'). And face-to-face time, meeting in person or even over Skype, is the most valuable of all, which suggests that online relationships face major hurdles in being maintained.
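As a rough illustration of these nested circles, here is a minimal Python sketch assuming the scaling ratio of about three between successive layers that Dunbar's work is often summarised with (the quoted sizes of 5, 15, 50 and 150 are empirical averages, not the output of any formula of his):

```python
# Hypothetical sketch of Dunbar's nested friendship circles, assuming
# each layer is roughly three times the size of the one inside it.

def friendship_layers(innermost: int = 5, ratio: int = 3, layers: int = 4) -> list[int]:
    """Cumulative layer sizes, e.g. ~5 close friends out to ~150 acquaintances."""
    sizes = [innermost]
    for _ in range(layers - 1):
        sizes.append(sizes[-1] * ratio)
    return sizes

print(friendship_layers())  # [5, 15, 45, 135] -- close to the quoted 5 / 15 / 50 / 150
```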

Also interesting, from this Guardian interview with him, is his answer to the question of whether Dunbar's number can be increased:

"We're caught in a bind: community sizes were designed for hunter-gatherer- type societies where people weren't living on top of one another. Your 150 were scattered over a wide are, but everybody shared the same 150. This made for a very densely interconnected community, and this means the community polices itself. You don't need lawyers and policemen. If you step out of line, granny will wag her finger at you.
Our problem now is the sheer density of folk – our networks aren't compact. You have clumps of friends scattered around the world who don't know one another: now you don't have an interwoven network. It leads to a less well integrated society. How to re-create that old sense of community in these new circumstances? That's an engineering problem. How do we work around it?"

Since a major issue in the modern world is how to view, shape and even sign up to 'society', the fact that our brains evolved for such small social groupings will have to be taken into account. Perhaps one element of a solution would be to consider 'groups' as 'persons'. So maybe if one viewed 'other commuters' or 'Germany' or 'bankers' as one of the 150 individuals one can keep track of, it would allow the rest of society to be somehow kept socially 'in mind' rather than remaining a blurry 'other'. Of course the main problem with this would be the risk of stereotyping and blanket generalization, but since we can know complex and deep individuals, maybe we can similarly have 'friendships' with complex and complicated groups. Of course this is just a spur-of-the-moment idea, but the point is we need to take our natural dispositions into account if we are to adapt to the modern world.

Sunday, April 15, 2012

UK Tax report: 25% who cheat means 75% who don't

Tax – who should pay what, and who actually does – is a cornerstone of modern society, so I was interested to see some statistics on tax payments for higher earners.

Treasury reveals how little tax the super-rich pay | Society | The Guardian

The article reports "The new Treasury figures show 10,000 UK taxpayers earn between £1m and £5m, and, of those, 10% pay between 30% and 40% in tax, 5% pay between 20% and 30% tax, and 3% pay less than 10%. Of those earning between £250,000 and £500,000, 27% were paying tax of less than 40%. All the figures cover the financial year 2010-11."
While it is disgraceful that significant numbers seem to be avoiding paying what they should in tax, these numbers at least suggest that roughly 75-80% of higher earners are paying above 40%, which I assume is roughly the appropriate rate (the arithmetic is sketched below). Furthermore, the most suspicious cases (paying tax of less than 10% on vast income) seem to correspond to about 5% at all levels (£250,000-£500,000, £1m-£5m, above £5m etc.).
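To spell out that estimate, here is a minimal sketch using only the percentages quoted for the £1m-£5m band; since the article gives no figure for the 10-20% band, the result is only an upper bound:

```python
# Rough bound on the share of GBP 1m-5m earners paying above 40% tax,
# using only the percentages quoted in the article.

pct_paying_30_to_40 = 10
pct_paying_20_to_30 = 5
pct_paying_under_10 = 3

known_below_40 = pct_paying_30_to_40 + pct_paying_20_to_30 + pct_paying_under_10
upper_bound = 100 - known_below_40
print(f"At most {upper_bound}% pay above 40%")  # 82%

# The 10-20% band is not reported; if it accounts for a few percent more,
# the true figure lands around the 75-80% suggested in the text.
```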

While of course this is a disgrace, and apart from the moral problem corresponds to significant lost revenue, one can always find ~5% of people with some kind of objectionable behaviour or outlook (for example, in most countries far-right parties attract roughly this level of support or more), so this would tend to suggest it's not a failure of morals in society (the UK is not a nation of cheats) but a failure of enforcement and policy. One could probably expect at least 5% to try to cheat anywhere; the issue is how many get away with it.

Perhaps newspapers should report the 80% number as well, since I once read of a case in Australia where tax payments rose after the revenue commission there reported that 'most' people pay their taxes fairly. If people think everyone else is doing the right thing, then they are more likely to do so themselves, even when not being observed. Similarly, they are probably more likely to do the wrong thing if they think everyone else is. So while it is worth pointing out that 5% of people cheat to extremes, it's also worth highlighting that 95% don't.

Wednesday, April 4, 2012

The price of your soul: How your brain decides whether to 'sell out'

A neuro-imaging study shows that personal values that people refuse to disavow, even when offered cash to do so, are processed differently in the brain than those values that are willingly sold
http://www.emory.edu/EMORY_REPORT/stories/2012/01/esc_brain_decides_sell_out.html
Some points from the study :
  • The brain imaging data showed a strong correlation between sacred values and activation of the neural systems associated with evaluating rights and wrongs (the left temporoparietal junction) and semantic rule retrieval (the left ventrolateral prefrontal cortex), but not with systems associated with reward.
  • "Most public policy is based on offering people incentives and disincentives," Berns [ Gregory Berns, director of the Center for Neuropolicy at Emory University and lead author of the study] says. "Our findings indicate that it?s unreasonable to think that a policy based on costs-and-benefits analysis will influence people?s behavior when it comes to their sacred personal values, because they are processed in an entirely different brain system than incentives."
  • Research participants who reported more active affiliations with organizations, such as churches, sports teams, musical groups and environmental clubs, had stronger brain activity in the same brain regions that correlated to sacred values. "Organized groups may instill values more strongly through the use of rules and social norms" Berns says.
  • The experiment also found activation in the amygdala, a brain region associated with emotional reactions, but only in cases where participants refused to take cash to state the opposite of what they believe."Those statements represent the most repugnant items to the individual" Berns says, "and would be expected to provoke the most arousal, which is consistent with the idea that when sacred values are violated, that induces moral outrage."
  • Future conflicts over politics and religion will likely play out biologically, Berns says. Some cultures will choose to change their biology, and in the process, change their culture, he notes. He cites the battles over women's reproductive rights and gay marriage as ongoing examples.
The full report is available here

Virtuous Behaviors Sanction Later Sins

People are quick to treat themselves after a good deed or healthy act:

"the study, published in the journal Addiction, is the first to examine the health ramifications of the licensing effect, but others have shown its influence on moral behavior. In 2009 a study found that reminding people of their humanitarian attributes reduced their charitable giving. Last year another experiment showed that when individuals buy ecofriendly products, they are more likely to cheat and steal.
"Sometimes after we behave in line with our goals or standards, it's as if our action has earned ourselves some moral credit," says psychologist Nina Mazar of the University of Toronto, an author of the green products study. "This credit can then subsequently be used to engage in self-indulgent or selfish behaviors without feeling bad about it."

 http://www.scientificamerican.com/article.cfm?id=license-to-sin


While I think it is already clear that morality relies on feeling ‘good’ about something (this is not to denigrate it, just to point out that its mechanism is tied to our emotions), maybe these findings indicate that the emphasis is more on feeling good about oneself, as opposed to feeling good about any particular action. That is, the emotional drives work not at the low level of individual moral choices, but at the higher level of our overall self-image – whether we see ourselves as ‘good’ people. Seen in this way, if one boosts one’s self-perception, by for example giving to charity or making an effort to recycle, then there is temporarily less drive to boost it any further. This ‘I’ve done my bit’ approach is of course insidious, since it can lead to a slippery slope of moral abrogation, and it is our actions in all moral spheres that matter.

Are there any possible solutions? If this really is the case, then the most effective one might be to try to make the right actions less ‘special’, and instead consider them the basic minimum of what we should do, and hence less ‘boosting’ to our overall self-image. So for example, if green behavior is the exception, it might lead to excessive moral self-congratulation, whereas if it is the norm, this effect is limited. Nobody feels particularly good about not having dropped litter, since it is simply expected, and such ‘baseline’ attitudes need to be expanded. Of course ‘norms’ rely on society at large, and are slow, and hard, to change. Incidentally, this is perhaps another argument against the ‘makes no difference’ response, since even if the action itself (recycling one’s own waste) results in no tangible effects, it has a social effect by serving as an example, and might help change the norms so that eventually enough people act to make a difference (and of course arguing against it has an even greater corrosive effect, since it too contributes to what the ‘norm’ will be).

But what can the individual do? As suggested in the report, self-awareness seems the best candidate for compensating for the ‘license to sin’ effect. Knowing that we are predisposed to such behavior may make us analyze each future choice that bit more, and by centering our focus on the now and not the past, our self-perception will perhaps be less biased by the good things we’ve done previously.

"You may be able to avoid the pitfall simply by remembering that the feeling of having "earned it" leads down a path of iniquity."
 
Hopefully this will work, since to twist the famous phrase, it seems all that is needed for evil to triumph is for good people to have done some good before.

Tuesday, April 3, 2012

Aristotle: actions speak louder than reasons

From Story, by Robert McKee:
"As Aristotle observed, why a man does a thing is of little interest once we see the thing he does. A character is the choices he makes to take the actions he takes. Once the deed is done his reasons why begin to dissolve into irrelevancy."

Monday, April 2, 2012

How stereotypes influence us, even if we don't accept them

Interesting analysis of the recent Trayvon Martin incident, based on psychology studies involving 'the Police Officer's Dilemma': tests in which participants must make split-second shoot/don't-shoot decisions.
http://blogs.scientificamerican.com/guest-blog/2012/03/26/trayvon-martins-psychological-killer-why-we-see-guns-that-arent-there/?WT_mc_id=SA_WR_20120330
Some points :
  • First of all, no matter how racist the participants were (or were not), they were equally likely to shoot unarmed Black targets; outright levels of racism did not predict the results at all. However, one thing did predict performance on the task – the participants' level of awareness that there is prejudice towards Black people in American society, even if the participant adamantly did not support those stereotypes. Simply being highly aware of prejudice in the world, even if you don't agree with, support, or like that prejudice, makes it more likely that you might make the fateful mistake of shooting an unarmed target when making split-second decisions in uncertain conditions. The more aware you are of cultural stereotypes, the more likely you are to make a biased mistake.
  • Correll's research demonstrated that everyone – even an upstanding college undergraduate lacking any racial prejudice – is vulnerable to making racially biased decisions, particularly under the split-second pressures of the Police Officer's Dilemma. Did racism motivate George Zimmerman's actions against Trayvon Martin? Yes. But does a person have to be racist to make the same split-second decision? No.
  • When the 'shooting game' task was given to Black participants, they turned out to be just as likely to accidentally shoot unarmed Black targets as the White participants were.
  • At the end of the day, it's not always about whether or not you are racist, or whether or not you think that Black people are violent. Cultural stereotypes can become automatically activated and applied to our behaviors even when we don't actually endorse them; the sheer knowledge that these stereotypes exist can be enough to influence our judgments, especially when it comes to split-second decisions. Because of cultural stereotypes, the shooters in Correll's games had a lower threshold for when they would decide it was OK to shoot at Black targets, although most of them probably could not have told you that this was happening, and most of them would have been appalled to find out about their biases.

Sunday, April 1, 2012

David Mitchell on spectator sport

As usual a nicely humorous piece from Mitchell, but it also hits on something important: how many sports have become so professionally perfected that they lose some of the human element that is essential to spectator sport.

Without Jocky Wilson, Subo would still be singing in the bath | Comment is free | The Observer

"Spectator sport seems to have changed a lot over my lifetime. I was watching the rugby a few weeks ago and they showed a clip of Bill Beaumont's Grand Slam-winning England team of 1980. It didn't look like sport looks nowadays. They were just normal men. Wearing rugby kit, in relatively good shape, and quite big and burly, but recognisable as people you might see walking down the street, having a pint in a pub or wearing a suit and tie in a meeting.
When rugby union was an amateur sport, it was undoubtedly played to a much lower standard, but no one felt that at the time. The crowds watching Bill Beaumont weren't missing the rugby union of today, ruthlessly played by 30 versions of Mr Incredible. Professionalism has brought remuneration to players, and the greater corporate involvement required to fund that, but it hasn't done much for spectators except put more adverts on the pitch. A higher standard of play isn't in the interests of sport any more than inflation is in the interests of commerce.
With greater demands on their time and physique, it's no surprise that the sportspeople of today can seem one-dimensional – and I don't just mean they're thinner. Like most contemporary politicians, our elite athletes haven't lived normal lives, so there's something alien about them. Simon Cowell, among others, spotted this change. His primetime TV formats are plugging the emotional gap that sport used to fill – replacing Jimmy White and Jocky Wilson in the same way that astrology and homeopathy are supplanting religion."

Saturday, March 31, 2012

Teaching kids code, not computers – but to other levels as well

" Starting in primary school, children from all backgrounds and every part of the UK should have the opportunity to: learn some of the key ideas of computer science; understand computational thinking; learn to program; and have the opportunity to progress to the next level of excellence in these activities.
We'll get to why this is important and necessary in a moment, but first we need to face up to a painful fact. It is that almost everything we have done over the last two decades in the area of ICT education in British schools has been misguided and largely futile. Instead of educating children about the most revolutionary technology of their young lifetimes, we have focused on training them to use obsolescent software products. And we did this because we fell into what the philosopher Gilbert Ryle would have called a "category mistake" – an error in which things of one kind are presented as if they belonged to another. We made the mistake of thinking that learning about computing is like learning to drive a car, and since a knowledge of internal combustion technology is not essential for becoming a proficient driver, it followed that an understanding of how computers work was not important for our children"
A radical manifesto for teaching computing | Education | The Observer

Overall of course I agree with this, since perhaps the most notable feature of modern technology is how little specialized skill is needed to use it. Motion sensors, voice recognition and touch screens mean even the input mechanisms are almost "natural", and one barely needs to be able to use a keyboard and mouse any more. This trend indicates that the real need will not be training in tools, since the tools require less and less training, but education about the tools and their hidden internal structures – not only to foster an environment that develops more and better tools, but to maintain an independence from them, since a common downside of ease of use is over-reliance and too much trust. The less we have to think about using our tools, the less able we are to handle situations where they go wrong, or, more importantly, even to spot when they might.
The only reservation I have is that while incredibly useful, computational, algorithmic thinking can also be restrictive, and I am always wary of it being exalted as the paradigm of "intelligence"; there is a risk that introducing these skills early, without appropriate balance, may have limiting effects on creative development. The point is that just as education should focus on the structures behind the tools, it must also include an analysis of the foundations of those very structures themselves. Education must not only go beyond the how to the why, but to the why not as well. (A toy example of the how/why distinction is sketched below.)
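To make the 'training in tools' versus 'education about the tools' distinction concrete, here is a minimal, purely hypothetical classroom example in Python: the first line merely uses a built-in, while the function underneath exposes the idea the built-in hides.

```python
# Using the tool: the answer appears, but the mechanism stays hidden.
print(max([7, 2, 9, 4]))  # 9

# Understanding the idea: the same result, but now a pupil can reason about
# how and why it works -- and notice when the tool might go wrong
# (e.g. on an empty list).
def largest(numbers):
    best = numbers[0]        # assume the first value is the largest so far
    for n in numbers[1:]:    # compare against every remaining value
        if n > best:
            best = n
    return best

print(largest([7, 2, 9, 4]))  # 9
```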