10 years of Skype: your stories | Technology | theguardian.com
Have been using Skype myself since 2005... first to keep in contact with parents and to chat with friends in other countries (before Facebook and Google Chat!), then for the kids with their grandparents - it meant they could have a great relationship with them, so that meeting in person felt like after a few days' break instead of a few months...
Favourite Skype moment - watching live rugby matches with my father in another country...
Most unusual usage - a live video feed from a LEGO Mindstorms robot being controlled remotely...
The HalfTalk Post - Society & Psychology
Friday, August 30, 2013
Wednesday, February 20, 2013
Making the best of a bad job
An interesting analysis programme on the denigration of lower-paid jobs. One point made is that increased social mobility increases the stigma attached to those who don't move up, and that for 'good' jobs to be promoted (e.g. for macro policy reasons) there need to be 'bad' jobs to contrast them with. If everyone should strive to be above average, what does that mean for the other 49%?
http://www.bbc.co.uk/programmes/b01qlmlg
David Goodhart considers whether the declining status of basic jobs can be halted and even reversed.
Successive governments have prioritised widening access to higher education to try to drive social mobility, without giving much thought to the impact this has on the expectations of young people who, for whatever reason, are not going to take that path.
But even in a knowledge-based economy, the most basic jobs survive. Offices still need to be cleaned, supermarket shelves stacked, and care home residents looked after.
The best employers know how to design these jobs to make them more satisfying. Are politicians finally waking up to the problem?
Wednesday, January 16, 2013
Gestalt Shift
We can see the two-dimensional figure below as a [three-dimensional] cube. There are [at least] two ways to do this, and the switch between the two is called a 'Gestalt shift'. It is not a voluntary shift, although one can try to bring it about.
Tuesday, December 11, 2012
On the 12th day of Christmas ... your gift will just be junk | George Monbiot | Comment is free | The Guardian
While I find some of his examples a bit too frivolous, our addiction to consumption (for psychological and, perhaps more importantly, economic reasons) is a serious issue. If I were to name some of our behaviours which might be viewed by our descendants as morally reprehensible, I think this consumption would rank highly, for two reasons: first, its impact on the world's resources and on climate change, and hence the consequences for innumerable future generations; and second, the near-slave-like cheap labour system that underpins it all. But we are oblivious to the scale and moral nature of the problem.
The only possible silver lining is that modern technology does allow new, immaterial forms of consumption and commerce, the classic example being apps. Creation and consumption of virtual objects could provide sustainable growth. Alas, while in principle this sounds promising, in reality the relative volumes probably make it insignificant beside the inevitable material economy.
Thursday, November 29, 2012
Clare Carlisle on Evil
An interesting series on the concept of 'evil' in the Guardian: Clare Carlisle | The Guardian
While I personally have no interest in the theological uses of the term, I do think it is still important as a psychological concept - not because it is intrinsically valid, but because humanity so often reaches for it to explain and deal with the horrors of life, natural and man-made.
part 1: how can we think about evil?
The religious idea that thinking about evil involves coming to terms with a darkness in all our hearts provides food for thought
part 2: does it exist?
St Augustine's theory was that evil was 'nothing other than the absence of good' – an idea supported by modern science
- Surprisingly, though, the basic insight of Augustinian theodicy finds support in recent science. In his 2011 book Zero Degrees of Empathy, Cambridge psychopathology professor Simon Baron-Cohen proposes "a new theory of human cruelty". His goal, he writes, is to replace the "unscientific" term "evil" with the idea of "empathy erosion": "People said to be cruel or evil are simply at one extreme of the empathy spectrum," he writes. (He points out, though, that some people at this extreme display no more cruelty than those higher up the empathy scale – they are simply socially isolated.) Loss of empathy resembles the Augustinian concept of evil in that it is a deficiency of goodness – or, to put it less moralistically, a disruption of normal functioning – rather than a positive force. In this way at least, Baron-Cohen's theory echoes Augustine's argument, against the Manicheans, that evil is not an independent reality but, in essence, a lack or a loss
Kierkegaard believed that human sin was a result of a combination of pride and fear in the face of freedom
- Many are suspicious of the Christian concept of sin, but Kierkegaard's reinterpretation of the traditional doctrine is illuminating from a psychological perspective as well as from a religious one. While Augustine thought that Adam and Eve's first sin was transmitted to their descendents biologically – through sexual intercourse – Kierkegaard rejects this literal explanation of sin. Our failure to be good, he argues, is due to the way we deal with being both less and more free than we wish to be. Like stroppy, insecure teenagers, we crave independence, resent authority, and are scared to take responsibility for ourselves. By emphasising the importance of both humility and courage, Kierkegaard suggests a way to cope with this predicament – a non-moralistic basis for morality. And by pointing to the fear that lies beneath evil, he uncovers something common to both victims and perpetrators.
part 4: the social dimension
Does contemporary society give rise to conditions more conducive to evil than in the past?
part 5: making sense of suffering
One of the basic purposes of our culture is to interpret suffering, to make it meaningful. Myth, art and religion all do this job
part 6: the trial of Eichmann (1)
In finding Hitler's transport administrator guilty, the court recognised him as a free, morally responsible human being
part 7: the trial of Adolf Eichmann (2)
At the heart of Eichmann's banality was not thoughtlessness but evasiveness, and the 'interplay between knowing and willing'
Monday, October 15, 2012
Evil, part one: how can we think about evil? | Clare Carlisle | Comment is free | guardian.co.uk
I find the concept of 'evil' an intriguing and important one. In a modern world aware of how environment and genetics influence, and yet do not determine, our behaviour, I think it is important to debate whether such a term is still usable, or useful.
While this is something I need to think about and write on at greater length, my first thoughts are that 'evil' is often used as a way to give up thinking about, or seeking further, the causes of an action. To consider someone evil is to render them unknowable and unchangeable - almost to strip them of their humanity - and I have an instinctive revulsion towards something which so reeks of over-simplification. Given that the worst inhumanities we are capable of generally have their roots in such dehumanizing of other groups or people, it has to be an approach we are wary of using. 'With us or against us' may be psychologically soothing, but we have to be careful and justified when choosing such dichotomies.
That said, I also think it highly likely that there are some people who technically are not human in moral respects - psychopaths, for example. Without dealing with the cause or curability of this condition, at the very least it can be said that there are cases of psychopathic behaviour which are not just chosen, but firmly grounded in differing brain states - blindness to the emotions of others, and so on. In such cases, these people would indeed seem to qualify as 'evil' - beyond the pale of reason or (social) redemption, uninfluenced by either deterrent or punishment.
However, the problem is that 'evil', while on the one hand removing people from the moral sphere, is itself a moral judgement. While we may consider evil people as acting like animals, we don't judge them as animals. Since morality implies choice, it is as if they choose to be animals, choose not to be human. But to me this raises a fundamental question: what is the difference between having the biological make-up which makes one choose to act like a psychopath, and having the (presumably morally neutral) misfortune to be a psychopath? What is the difference between being born as someone who is evil, and being born as someone who chooses to be evil? If there is responsibility, there is a character - but whence the blame for the character?
So it seems to me that the term 'evil' does not have a place in the modern world. It is a moral sop, a relic of a religious heritage. The point, of course, is that this might make no practical difference - we will still need to protect ourselves from, and punish, those who 'do' evil - but there is no place in moral judgement for a horrific nature, even if it's human.
Wednesday, October 10, 2012
Tuesday, October 9, 2012
Tories go back to basics on burglars | Politics | The Guardian
Of all the recent policy announcements, there are few as primitive or populist as the latest 'batter a burglar' (as the Sun put it) policy from the UK Conservatives.
Currently householders are allowed to use 'proportionate' force, but now the Tories are pushing for (literally) 'disproportionate' force to be acceptable as well. The very language itself highlights the lack of logic - if the law is there in general to prescribe suitable, i.e. proportionate, actions and reactions, how can it validly be used to justify unsuitable actions as well?
Of course the standard argument is that people confronted by a burglar are afraid to react with force at all, in case they are later judged to have overreacted. But since there are presumably still some limits to what can be meted out to an intruder (I think the phrase will be 'grossly disproportionate'), surely the same ambiguity remains, just shifted to the more violent end of the spectrum. In fact the consequences for homeowners could then be much worse, since overdoing a severe beating is always going to be a more serious offence (with risk of death or permanent injury) than overdoing a minor one. And of course if burglars can expect to be attacked, then they will be prepared for it, whether by bringing weapons, attacking the homeowner first, or being more vicious when they do so.
And what all of this misses is the moral argument. If a homeowner can use (again literally) excessive force, then it moves from the realm of defence and prevention into that of punishment and vengeance. Is it then a fair punishment for the crime of trying to steal an Xbox to be severely beaten or injured? Or killed? Surely it is this reasoning that is behind non-US prohibitions on a free-for-all against home invaders - a realization that having your house robbed, while upsetting, annoying and possibly in some way traumatizing, is still in the great scheme of things not the worst of crimes, and hence should be dealt with reasonably by the courts rather than left to the inflamed passions of a scared homeowner.
Of course it would be nice to think that the mere deterrent would put burglars off, but the sad fact is that most are probably driven to it by circumstance or addiction, and are not going to make such a rational decision. To see that this is the case, one only needs to look at the US, where death sentences and gun-toting homeowners don't seem to have resulted in some calm paradise where break-ins and robberies have been eradicated. As someone who shouldn't have been famous might have said, 'how's that shooty fry-ey thing working out for y'all?'
Wednesday, September 19, 2012
Facebook and Twitter: the art of unfriending or unfollowing people | Technology | The Guardian
I am particularly intrigued by the idea that friend clutter relates to an intrinsic problem in dealing with endings, even though beginnings (marriages, births) are culturally celebrated, and every beginning marks a different ending.
some extracts:
- Technology exposes us to vastly more opportunities for making social connections, and far more effortlessly than even a stroll down the street and a handshake. Yet an etiquette for terminating those links, should they outlive their mutual benefit – if they ever had any – remains as absent as ever.
- Physical clutter...We think we want this stuff, but, once it becomes clutter, it exerts a subtle psychological tug. It weighs us down. The notion of purging it begins to strike us as appealing, and dumping all the crap into bin bags feels like a liberation. "Friend clutter", likewise, accumulates because it's effortless to accumulate it: before the internet....Friend clutter exerts a similar psychological pull.
- Last year, a writer of romance novels from Illinois named ArLynn Presser embarked upon what you might call an audit of her so-called friends... she made a New Year's resolution to visit them all, to find out why or, indeed, whether they were friends.
- [however] according to an ever-growing body of evidence, social media isn't making us lonelier or less deeply connected. Instead, study after study endorses the idea of "media multiplexity": people who communicate lots via one medium, it turns out, are the kind of people who communicate lots via others as well. Regular emailers are more likely also to be regular telephoners, one study found; people who use Facebook multiple times a day, according to another investigation, have 9% more close ties in their overall social network, on average, than those who don't. Social media builds social capital, rather than degrading it:
- Even the chilling statistic about more Americans lacking a confidant now looks dubious: a new analysis by the sociologist Claude Fischer concluded that the finding arose because of a change in how the questions were asked.
- The anthropologist and evolutionary psychologist Robin Dunbar famously calculated "Dunbar's number" – the notion that the largest number of meaningful social relationships that any one person can maintain is somewhere around 150.
- Online networks have a tendency to obliterate the nuances between different kinds of relationships. Despite Facebook's lists, privacy settings and the rest, Mullany points out, "ultimately, somebody is either your friend on Facebook or they're not. In real life, we're very political about our friendships, and I don't mean that in a bad way." There are friendships we'll let fade to nothing; others for which we'll put on a facade for a few hours at Christmas; or friendships of necessity, where we'll give the impression of intimacy without the reality. In contrast, "Facebook essentially doesn't allow us to be political."
- The more profound truth behind friend clutter may be that, as a general rule, we don't handle endings well. "Our culture seems to applaud the spirit, promise and gumption of beginnings," writes the sociologist Sara Lawrence-Lightfoot in her absorbing new book, Exit: The Endings That Set Us Free, whereas "our exits are often ignored or invisible". We celebrate the new – marriages, homes, work projects – but "there is little appreciation or applause when we decide (or it is decided for us) that it's time to move on". We need "a language for leave-taking", Lawrence-Lightfoot argues, and not just for funerals.
Sunday, August 19, 2012
Insight: The dark side of Germany's jobs miracle | Reuters
Wage restraint and labor market reforms have pushed the German jobless rate down to a 20-year low, and the German model is often cited as an example for European nations seeking to cut unemployment and become more competitive. But critics say the reforms that helped create jobs also broadened and entrenched the low-paid and temporary work sector, boosting wage inequality.
- Labor office data show the low wage sector grew three times as fast as other employment in the five years to 2010, explaining why the "job miracle" has not prompted Germans to spend much more than they have in the past. Pay in Germany, which has no nationwide minimum wage, can go well below one euro an hour, especially in the former communist east German states.
- Trade unions and employers in Germany traditionally opt for collective wage agreements, arguing that a legal minimum wage could kill jobs, but these agreements only cover slightly more than half the population and can be circumvented
- Critics say Germany's reforms came at a high price as they firmly entrenched the low-wage sector and depressed wages, leading to a two-tier labor market. New categories of low-income, government-subsidized jobs - a concept being considered in Spain - have proven especially problematic. Some economists say they have backfired. They were created to help those with bad job prospects eventually become reintegrated into the regular labor market, but surveys show that for most people, they lead nowhere.
- While wage inequality used to be as low in Germany as in the Nordic countries, it has risen sharply over the past decade. "The poor have clearly lost out to the middle class, more so in Germany than in other countries," said OECD economist Isabell Koske. Depressed wages and job insecurity have also kept a lid on domestic demand, the Achilles heel of the export-dependent German economy, much to the exasperation of its neighbors.
- ILO's Ernst says Germany can only hope that other European countries do not emulate its own wage deflationary policies too closely, as demand will dry up: "If everyone is doing same thing, there won't be anyone left to export to."
Thursday, August 9, 2012
Decision Quicksand
Should you read this story? Why you're having trouble deciding - Red Tape
some extracts:
- Little decisions cause a big problem precisely because they are surprisingly hard. Faced with too many options, consumers unconsciously connect difficulty with importance, and their brains are tricked into heavy deliberation mode.
- Instead of realizing that picking a toothbrush is a trivial decision, we confuse the array of options and excess of information with decision importance, which then leads our brain to conclude that this decision is worth more time and attention.
- Research shows that time spent in decision quicksand before a choice correlates with dissatisfaction after the fact. And of course, there's all that wasted time and emotional energy.
- Set decision rules and stick to them. In other words, start with a time limit that reflects the true importance of the choice. For example, "I will book a flight in 5 minutes, no matter what."
- Breaks can also help. Spending time away from a decision-making process can free the brain from an obsessive loop. "Even minor interruptions, short breaks, or momentary task switching can change information processing from a local, bottom-up focus to a top-down, goal-directed mode."
Wednesday, August 8, 2012
The paradox of behaviour tests
Failure rate of 50% a worrying statistic for drivers - The Irish Times - Wed, Aug 08, 2012
It seems 50% of cars fail their yearly roadworthiness test in Ireland, which raises some interesting paradoxes. Should this be viewed as a good thing, in that so many dangerous cars are identified and taken off the road? Or is the testing contributing to the problem, since, knowing that there will be a yearly test, people are skipping regular maintenance and services? This is an interesting example of a common undesired side-effect of checking human behaviour: it removes responsibility for the problem from the individual, which means both bad and good behaviour is reduced.
I could see something similar occurring in France, where motorists are now legally obliged to carry breathalysers with them. Initially, and overall, I think this is a good thing, since most people do not want to drive over the limit, but might often (even wilfully) assume they are not. However, if confronted with clear evidence (from their own breathalyser) that they are, they are left no moral wiggle room, and can only continue to drive if they make a clear decision to break the law, as opposed to just hoping they aren't over the limit. And of course those that would drive regardless would do so anyway, test or not. The corollary of the test, though, is that there are probably many cases where people overestimate their blood alcohol content and don't drive because they think they are over the limit when they actually aren't (the limit is apparently surprisingly high for some people). These people would never have driven when fully inebriated, but now would be encouraged to do so when partially. Given that any alcohol in the system affects performance, there is a likelihood that this will actually result in these people crashing when they otherwise wouldn't.
So while testing is overall needed, since it clamps down on the extremes, it is never without unintended consequences.
Labels:
laws,
psychology
Eat, fast, live longer
Another interesting (if not yet reliably useful) food/health documentary last night: Horizon's "Eat, fast, live longer" which reported on recent studies suggesting occasional extreme fasting (~50 cals per day for 4 days every few months) or regular (alternate-day) near-fasting (~400 cals per day) could provide major health benefits, even if people ate normally (or even badly) the rest of the time.
(BBC link here, guardian review here)
This is not the first time I've heard that low calorie intake is linked with longevity, but until now I only knew about people who were (without a lot of human-based evidence) living long-term with massively reduced energy intakes, which didn't seem very worthwhile: even if it did make them live longer, it reduced their energy levels so much that it was debatable whether they were able to live life at all. The recent research, however, though presumably arising from the same evidence, suggests that it might not (just) be consistent reductions in calories that bring benefits, but even short-term occasional fasts. There seem to be signs that the body enters a kind of 'repair' mode when starving, and there are hopeful indicators of how this might reduce or even prevent illnesses such as heart disease or stroke.
While I can't comment on the scientific basis (which was admitted to be new and limited, albeit promising), there are perhaps a few general common-sense points that can be made. My own personal 'gut instinct' is that any diet which matches our evolution has to make sense; hence modern artificial concentrations of sugar, salt and fat have to be viewed with suspicion. Beyond this though, as incredibly versatile omnivores, it is probably quite hard to work out what our ideal pre-modern diet actually was. It's likely that life was hard and resources scarce, so it does make sense that it was largely plant-based, since presumably hunting was harder and more unpredictable. However, given that we can consume animal meat, and in certain cultures (the Inuit, for example) can survive on it, I would be wary of excluding it altogether, although it might make sense to limit it. So I would think a balanced diet, with the emphasis on plants, sounds like a good idea, and I have never seen strong evidence against this.
However, leaving aside the type of food and focusing on the quantity, such a boom-and-bust cycle of nutrition could also fit well with the latest fasting research, since our bodies would have done well to adapt to (and even make use of) such times of shortage. The caveat, as with all neat evolutionary stories, is that it is important to keep in mind the arena in which evolution operates, which might not align well with modern life expectancies. Evolution works on the fitness of the animal primarily until it procreates, and then perhaps for some while after as it supports its offspring and descendants. Given that ancient man might only have lived until 30 or 40, and heart attacks and cancer are mainly illnesses of later years (perversely, rising cancer rates can be a good thing, since they indicate a society living longer), it's very possible that there is nothing evolutionary in us to combat them. This doesn't mean there aren't optimum ways of extending our lifespan, but rather that we are inventing them, not re-discovering them. It might be that the issue is not making the machine run as it should, but finding ways to make it run as it could.
However, this then raises the question of whether the same approaches fit all stages of life. It could be that in earlier, fitter, more active years one type of diet is optimum (and matches evolutionary history), but in later years a change of strategy is needed. For example, I wonder about the link between growth and cancer. Several times recently I have seen evidence about diets which seem to reduce cancer rates, but perhaps also impact normal cell growth. This of course makes sense, since cancer ultimately is excessive growth, so it is logical that what would stop growth normally would stop cancer as well. But it is far less clear whether this is desirable under normal healthy circumstances. As with constant restriction of calories, could the cure (a limited life) be worse than the disease (the risk of a limited lifetime)? It could even be the case that at certain stages of life one approach makes sense (run the body at full power, albeit putting it under stress) while at other, older stages another is appropriate (scale back to focus on preservation, not performance). Ultimately of course a balance is needed, but making that judgement needs more evidence and consideration than just jumping on the latest diet that shows (even if verified) improvement in one range of parameters. Better cholesterol and lower glucose levels are good things, but what else changes, and what are the overall effects? From what I remember, the studies (such as those mentioned in the film Forks Over Knives) involved people who were already unhealthy - older, obese, and so on. It is one thing to provide a solution for people who clearly have a problem; it is another to show that the solution is applicable to preventing that problem in the first place.
In the spirit of balance and moderation, I therefore think fasting diets sound extreme enough that it would be prudent to wait for more evidence. One probably reliable maxim is that overconsumption, even without obesity - of meat especially - might be a problem. So it is probably a safe bet to limit, but definitely not exclude, meat, ramp up the plant elements, and overall keep consumption down and in line with activity. Still, who knows how this research will develop, and all thought about food is valuable food for thought.
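As a rough back-of-envelope illustration of the distinction made above - that occasional fasting is not the same thing as a lower average intake - here is a small sketch comparing the long-run averages of the two regimes. All calorie figures are assumptions chosen for easy arithmetic (e.g. 2000 kcal as a 'normal' day), not numbers from the programme or any study.

```python
# Illustrative arithmetic only: the calorie figures below are assumptions,
# not data from the Horizon programme or any study.

NORMAL_DAY = 2000        # assumed typical daily intake (kcal)
NEAR_FAST_DAY = 400      # near-fast day in the alternate-day regime (kcal)
EXTREME_FAST_DAY = 50    # extreme-fast day, ~4 days every ~3 months (kcal)

# Regular alternate-day near-fasting: every second day is a near-fast.
alternate_avg = (NORMAL_DAY + NEAR_FAST_DAY) / 2

# Occasional extreme fasting: 4 fast days in a ~90-day cycle, normal otherwise.
CYCLE_DAYS = 90
occasional_avg = (4 * EXTREME_FAST_DAY + (CYCLE_DAYS - 4) * NORMAL_DAY) / CYCLE_DAYS

print(f"alternate-day regime:   ~{alternate_avg:.0f} kcal/day on average")   # ~1200
print(f"occasional-fast regime: ~{occasional_avg:.0f} kcal/day on average")  # ~1913
```

The occasional-fast regime barely dents the long-run average (only a few per cent below a normal diet), which is exactly why the interesting claim is about the fasting state itself - the 'repair mode' - rather than about total calorie intake.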
Thursday, July 19, 2012
Are Believers Really Happier Than Atheists?
Who is better off: the religious or atheists? Cultural values determine the answer
http://www.scientificamerican.com/article.cfm?id=healthy-skepticism
- Being religious is often linked with greater well-being. New research suggests that the effect is culture-specific.
- A strong predictor of a person’s religiosity is the condition of the society in which he or she lives.
- Finding communities and social groups that align with your beliefs can improve life satisfaction.
Monday, July 16, 2012
The Money-Empathy Gap
New research suggests that more money makes people act less human. Or at least less humane.
- Earlier this year, Piff, who is 30, published a paper in the Proceedings of the National Academy of Sciences that made him semi-famous. Titled “Higher Social Class Predicts Increased Unethical Behavior,” it showed through quizzes, online games, questionnaires, in-lab manipulations, and field studies that living high on the socioeconomic ladder can, colloquially speaking, dehumanize people. It can make them less ethical, more selfish, more insular, and less compassionate than other people.
- In a country that likes to think that class doesn’t matter, these social scientists are beginning to prove just how determinative money is.
- Nor does it attempt to apply its conclusions about the selfishness and solipsism of a broad social stratum to every member within it: Gateses and Carnegies have obviously saved lives and edified generations, and one of the biggest predictors of a person’s inclination to donate to charity is how much money he has.
- studies of ethical behavior indicate a strong correlation between high socioeconomic status and interpersonal disregard. It’s an “additive” effect; the fever line points straight up. “People higher up on the socioeconomic ladder are about three times more likely to cheat than people on the lower rungs,” he says. Piff’s research also suggests that people who yearn to be richer or more prominent make different choices than those more content with their present level of material comfort.
- Americans across the board can have a high tolerance for inequality if they believe it is meritocratic. The research by Piff and his colleagues points to a different possible explanation for the income gap: that it may be at least in part psychologically destined. This in turn raises the ancient conundrum of chicken and egg. If getting or having money can make you hard-hearted, do you also have to be hard-hearted to become well-off in the first place? The bulk of the new research points decisively in the direction of the former
- “Upper-class drivers were the most likely to cut off other vehicles even when controlling for time of day, driver’s perceived sex, and amount of traffic.” When Piff designed a similar experiment to test drivers’ regard for pedestrians, in which a researcher would enter a zebra crossing as a car approached it, the results were more staggering
- In experiments she published in the journal Science in 2006, Vohs “primed” her subjects to think about money, which is to say she planted the idea of money in their minds without their knowledge before observing their social interactions compared with a control group. ..... Every subject in the study bent down to pick up the mess. But the money-primed subjects picked up 15 percent fewer pencils than the control group. In a conversation in her office in May, Vohs stressed that money-priming did not make her subjects malicious—just disinterested. “It’s not a bad analogy to think of them as a little autistic,” she said. “I don’t think they mean any harm, but picking up pencils just isn’t their problem.”
- Over and over, Vohs has found that money can make people antisocial. She primes subjects by seating them near a screen-saver showing currency....Vohs showed that money-primed subjects gave less time to a colleague in need of assistance and less money to a hypothetical charity.
- “Money,” says Vohs, “brings you into functionality mode. When that gets applied to other people, things get mucked up. You can get things done, but it does come at the expense of people’s feelings or caring about them as individuals.”
- The corollaries to this poverty work are potentially explosive: Wealth may give you a better brain. It may make you a more strategic thinker, a savvier planner... And the cognitive benefits of affluence may accrue incrementally, speculates Dovidio, so that very rich people have better brain functioning than moderately rich people. These hypotheses are at the untested frontier of the new science: “I think in ten years we’ll have a compelling story on this,” says Dacher Keltner, the psychologist who oversees the work of Piff and his colleagues. But already the outline is becoming clear. Princeton University psychologist Eldar Shafir has shown that in environments of abundance, people make better financial decisions—it’s not that rich people tend to be better educated and can afford better advice, but that people living paycheck to paycheck don’t have the mental space to make the smartest long-term moves. The efficiencies of the affluent brain may trigger the shutting down of what the researchers call “pro-social” impulses and lead people toward the kinds of behaviors that a hedge-fund manager I spoke to characterized as “ruthless.
- This is Hazel Markus’s main research interest: the mind-sets of class. She and her colleagues have found, broadly speaking, that the affluent value individuality—uniqueness, differentiation, achievement—whereas people lower down on the ladder tend to stress homogeneity, harmonious interpersonal relationships, and group affiliation
- The American Dream is really two dreams. There’s the Horatio Alger myth, in which a person with grit, ingenuity, and hard work succeeds and prospers. And there’s the firehouse dinner, the Fourth of July picnic, the common green, in which everyone gives a little so the group can get a lot. Markus’s work seems to suggest the emergence of a dream apartheid, wherein the upper class continues to chase a vision of personal success and everyone else lingers at a potluck complaining that the system is broken. (Research shows that the rich tend to blame individuals for their own failure and likewise credit themselves for their own success, whereas those in the lower classes find explanations for inequality in circumstances and events outside their control.) But the truth is much more nuanced. Every American, rich and poor, bounces back and forth between these two ideals of self, calibrating ambitions and adjusting behaviors accordingly. Nearly half of Americans between 18 and 29 believe that it’s “likely” they’ll get rich, according to Gallup—in spite of all evidence to the contrary
Monday, July 9, 2012
TED talk : consumers create jobs, not the rich
TED talk by Nick Hanauer :
(with transcript at http://lybio.net/tag/nick-hanauer-ted-talks-the-inequality-speech-transcription/)
Extract :
"I have started, or helped start, dozens of companies and initially hired lots of people. But if there was no one around who could afford to buy what we had to sell, all those companies and all those jobs would have evaporated. Source: LYBIO.net
That's why I can say with confidence that rich people don't create jobs, nor do businesses, large or small. Jobs are a consequence of a circle of life-like feedback loop between customers and businesses. And only consumers can set in motion this virtuous cycle of increasing demand and hiring. In this sense, an ordinary consumer is more of a job creator than a capitalist like me.
That's why when business people take credit for creating jobs, it's a little bit like squirrels taking credit for creating evolution. It's actually the other way around.
Anyone who's ever run a business knows that hiring more people is a course of last resort for capitalists. It's what we do if, and only if, rising customer demand requires it. And in this sense, calling yourselves job creators isn't just inaccurate, it's disingenuous."
Labels:
tax
Sunday, July 8, 2012
Longer prison terms really do cut crime, study shows | Law | The Observer
Since the impact seemed to be greatest with repeat offenders, however, it does indicate that the longer sentences are simply reducing the number of criminals at large, rather than acting as a deterrent or cure. Furthermore, it could then just provide a temporary lull due to an extended prison pipeline, and hence not be a long-term solution at all. Executions would work even better, but would be no more justifiable.
Tuesday, July 3, 2012
Bankers and the neuroscience of greed | Ian Robertson | Comment is free | guardian.co.uk
- power is one of the most potent brain-changing drugs known to humankind; unconstrained power has enormously distorting effects on behaviour, emotions and thinking.
- Researchers at Tilburg University showed that people made to feel more powerful cheated more when they believed themselves to be unobserved. Power also made ordinary people more hypocritical when making judgments about moral dilemmas, being much more strict in applying rules to others, but much more lax in applying them to themselves. Even tiny amounts of artificial power, in other words, increased both immorality and hypocrisy.
- Paul Piff of the University of Berkeley found in a US-based study that, compared with lower class people, upper class individuals were more likely to break the law while driving, to show unethical tendencies in decision-making, to take valued goods from others, to lie in a negotiation, to cheat in order to improve their chances of winning a prize, and to endorse unethical behaviour in a work situation.
- It has become a cliche to explain the behaviour of bankers in terms of greed, but cliches are not always wrong. Power and money both act on the brain's reward system, which if over-stimulated for long periods develops appetites that are difficult to satisfy, just as is the case for drug addiction. We call these appetites greed and greedy people are never satisfied. That is the challenge for politicians and regulators.
Monday, July 2, 2012