Monday, February 28, 2011

Is lack of sleep and water giving ecstasy a bad name?

ALL-NIGHT ravers who take ecstasy might not be harming their brains any more than drug-free party animals.

So say John Halpern and colleagues at Harvard Medical School in Boston, who argue that many studies apparently showing that ecstasy use can lead to memory loss and depression were flawed as they did not take account of the rave culture associated with ecstasy use. Lack of sleep and dehydration resulting from all-night dancing can cause cognitive problems on their own, they say.

Halpern's team compared ecstasy users with non-users who had a history of all-night dancing with limited exposure to alcohol and drugs. Both groups completed tests for verbal fluency, memory, depression and other factors.

The team found no significant differences in cognitive performance between the two groups, even when they compared non-users with heavy users of the drug.


Sunday, February 27, 2011

Scientists create one-dimensional ferroelectric ice

By Lisa Zyga

The researchers, including Hai-Xia Zhao, Xiang-Jian Kong and La-Sheng Long, along with their coauthors from Xiamen University in Xiamen, China, and Hui Li and Xiao Cheng Zeng from the University of Nebraska in the US, have published their study in a recent issue of the Proceedings of the National Academy of Sciences.

Every water molecule carries a tiny electric field. But because water molecules usually freeze in a somewhat random arrangement, with their bonds pointing in different directions, the ice’s total electric field tends to cancel out. In contrast, the bonds in ferroelectric ice all point in the same direction at low enough temperatures, so that it has a net polarization in one direction that produces an electric field.
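The cancellation-versus-alignment point can be illustrated with a toy model (a sketch of my own, not from the paper): sum unit dipole vectors in two dimensions, once with random orientations (ordinary ice) and once fully aligned (ferroelectric ice).

```python
import math
import random

def net_polarization(angles):
    """Sum unit dipole vectors (2D toy model); return the net magnitude."""
    angles = list(angles)
    x = sum(math.cos(a) for a in angles)
    y = sum(math.sin(a) for a in angles)
    return math.hypot(x, y)

N = 10_000
random.seed(0)

# Ordinary ice: bond directions effectively random -> fields largely cancel.
random_ice = net_polarization(random.uniform(0, 2 * math.pi) for _ in range(N))

# Ferroelectric ice: all dipoles aligned -> magnitudes add coherently.
ferro_ice = net_polarization(0.0 for _ in range(N))

print(ferro_ice)   # exactly N: all N dipoles add
print(random_ice)  # of order sqrt(N): a random walk, so mostly cancelled
```

The random case scales like the square root of the number of molecules, so for macroscopic samples the residual field is negligible, which is why ordinary ice shows no net polarization.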

Ferroelectric ice is thought to be extremely rare; in fact, scientists are still investigating whether or not pure three-dimensional ferroelectric ice exists in nature. Some researchers have proposed that ferroelectric ice may exist on Uranus, Neptune, or Pluto. Creating pure 3D ferroelectric ice in the laboratory seems next to impossible, since it would take an estimated 100,000 years to form without the assistance of catalysts. So far, all ferroelectric ices produced in the laboratory have been lower-dimensional and in mixed phases (heterogeneous).

In the new study, the scientists have synthesized a one-dimensional, single-phase (homogeneous) ferroelectric ice by freezing a one-dimensional water ‘wire.’ As far as the scientists know, this is the first single-phase ferroelectric ice synthesized in the laboratory.

To create the water wire, the researchers designed very thin nanochannels that can hold just 96 H2O molecules per crystalline unit cell. By lowering the temperature from a starting point of 350 K (77°C, 171°F), they found that the water wire undergoes a phase transition below 277 K (4°C, 39°F), transforming from 1D liquid to 1D ice. The ice also exhibits a large dielectric anomaly at this temperature and at 175 K (-98°C, -144°F).
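The parenthesised conversions above follow directly from the standard Kelvin-to-Celsius-to-Fahrenheit formulas; a quick sketch (the small differences from the article's Fahrenheit figures come from rounding the Celsius value first):

```python
def kelvin_to_celsius(k):
    return k - 273.15

def kelvin_to_fahrenheit(k):
    return kelvin_to_celsius(k) * 9 / 5 + 32

# The three temperatures reported in the study:
for k in (350, 277, 175):
    print(f"{k} K = {kelvin_to_celsius(k):.0f} C = {kelvin_to_fahrenheit(k):.0f} F")
```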


Saturday, February 26, 2011

Mobiles 'increase brain activity'

“Mobile phones are a brain cell killer,” reported The Sun. The newspaper claimed that a study of hundreds of mobile users found that the signals emitted during calls can cause a 7% rise in chemical changes in the brain. It said that these may boost the chances of developing cancer. Other papers also reported the study in a more balanced way.

The laboratory-based study recruited 47 healthy volunteers who had their brain activity measured while they had mobile phones fixed to both sides of their head. One of the handsets received a call on silent for 50 minutes. Brain scans showed there was a 7% increase in brain activity in the area closest to that phone’s antenna.

The Sun over-interpreted the findings of this study and put an alarming spin on it that is not supported by the findings. The study did not show that mobile phones kill brain cells or cause cancer. The size of the effect was small, and the researchers themselves say that the findings are of “unknown clinical significance”. They state that it is not possible to tell from their findings whether or not these effects are harmful. Further research is needed.


Friday, February 25, 2011

People with low self-esteem show more signs of prejudice

When people are feeling bad about themselves, they're more likely to show bias against people who are different. A new study published in Psychological Science, a journal of the Association for Psychological Science, examines how that works. "This is one of the oldest accounts of why people stereotype and have prejudice: It makes us feel better about ourselves," says Jeffrey Sherman of the University of California, Davis, who wrote the study with Thomas Allen. "When we feel bad about ourselves, we can denigrate other people, and that makes us feel better about ourselves."

Sherman and Allen used the Implicit Association Test (IAT)—a task designed to assess people's automatic reactions to words and/or images—to investigate this claim. In order to reveal people's implicit prejudice, participants are asked to watch a computer monitor while a series of positive words, negative words, and pictures of black or white faces appear. In the first part of the test, participants are asked to push the "E" key for either black faces or negative words and the "I" key for white faces or positive words. For the second task, the groupings are reversed—participants are now supposed to associate positive words with black faces and negative words with white faces.

Determining prejudice in the IAT is pretty straightforward: If participants have negative associations with black people, they should find the second task more difficult. This should be especially true when people feel bad about themselves.
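The logic of the measurement can be sketched in a few lines. This is a deliberate simplification of my own, not the published scoring algorithm (the standard IAT uses Greenwald's full D-score procedure), and the reaction times are hypothetical:

```python
from statistics import mean, stdev

def iat_effect(compatible_rts, incompatible_rts):
    """Simplified IAT score: difference in mean response time between the
    two block pairings, divided by the pooled sample standard deviation.
    A larger positive score indicates the reversed pairing was harder,
    i.e. stronger implicit negative association. (A rough sketch of
    Greenwald's D-score, not the full algorithm.)"""
    all_rts = compatible_rts + incompatible_rts
    return (mean(incompatible_rts) - mean(compatible_rts)) / stdev(all_rts)

# Hypothetical reaction times in milliseconds:
compatible = [620, 650, 600, 640, 610]    # black+negative / white+positive
incompatible = [780, 820, 760, 800, 790]  # black+positive / white+negative

print(round(iat_effect(compatible, incompatible), 2))  # -> 1.85
```

On this toy data the second task is clearly slower, which is exactly the pattern the researchers expected to intensify when participants felt bad about themselves.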

But what psychologists don't agree on is how this works. "People were using the exact same data to make completely different arguments about why," Sherman says. There are two possibilities: either feeling bad about yourself activates negative evaluations of others, or it makes you less likely to suppress those biases.

In their experiment, Sherman and Allen asked participants to take a very difficult 12-question test that requires creative thinking. No one got more than two items correct. About half of the participants were given their test results and told that the average score was nine, to make them feel bad about themselves. The other half were told that their tests would be graded later. All of the participants then completed the IAT and, as expected, those who were feeling bad about their test performance showed more evidence of implicit prejudice.

But Sherman and Allen took it a step further. They also applied a mathematical model that reveals the processes that contribute to this effect. By plugging in the data from the experiment, they were able to determine that people who feel bad about themselves show enhanced prejudice because negative associations are activated to a greater degree, but not because they are less likely to suppress those feelings.

The difference is subtle, but important, Sherman says. "If the problem was that people were having trouble inhibiting bias, you might try to train people to exert better control," he says. But his results suggest that's not the issue. "The issue is that our mind wanders to more negative aspects of other groups. The way around that is to try and think differently about other people. When you feel bad about yourself and catch yourself thinking negatively about other groups, remind yourself, 'I may be feeling this way because I just failed a test or something.'"


Wednesday, February 23, 2011

UK’s Chief Scientific Adviser criticizes “journalists wilfully misusing science, distorting evidence by cherry-picking data that suits their view, giving bogus authority to people who misrepresent the absolute basics of science, and worse”

Government Chief Scientific Adviser John Beddington is stepping up the war on pseudoscience with a call to his fellow government scientists to be “grossly intolerant” if science is being misused by religious or political groups.

In closing remarks to an annual conference of around 300 scientific civil servants on 3 February, in London, Beddington said that selective use of science ought to be treated in the same way as racism and homophobia. “We are grossly intolerant, and properly so, of racism. We are grossly intolerant, and properly so, of people who [are] anti-homosexuality…. We are not—and I genuinely think we should think about how we do this—grossly intolerant of pseudo-science, the building up of what purports to be science by the cherry-picking of the facts and the failure to use scientific evidence and the failure to use scientific method,” he said.

Beddington said he intends to take this agenda forward with his fellow chief scientists and also with the research councils. “I really believe that… we need to recognise that this is a pernicious influence, it is an increasingly pernicious influence and we need to be thinking about how we can actually deal with it.”

I first reported on Beddington back in 2009 when he warned that by 2030, “A ‘perfect storm’ of food shortages, scarce water and insufficient energy resources threaten to unleash public unrest, cross-border conflicts and mass migration as people flee from the worst-affected regions.” See “When the global Ponzi scheme collapses (circa 2030), the only jobs left will be green” for an amazing speech explaining why.

No doubt Beddington is thinking of UK journalists like David Rose and Richard North (see links below) — and James Delingpole, who recently melted down on the BBC and said, “It is not my job to sit down and read peer-reviewed papers because I simply haven’t got the time…. I am an interpreter of interpretations.”

Here’s more from the UK’s Chief Scientific Adviser:

“We should not tolerate what is potentially something that can seriously undermine our ability to address important problems.

“There are enough difficult and important problems out there without having to… deal with what is politically or morally or religiously motivated nonsense.”

Beddington also had harsh words for journalists who treat the opinions of non-scientist commentators as being equivalent to the opinions of what he called “properly trained, properly assessed” scientists. “The media see the discussions about really important scientific events as if it’s a bloody football match. It is ridiculous.”

His call has been welcomed by science groups, including the Campaign for Science and Engineering.

Edzard Ernst, professor of the study of complementary medicine at Exeter University, whose department is being closed down, said he was “delighted that somebody in [Beddington’s] position speaks out”. In an interview with Research Fortnight Ernst said that the analogy with racism was a good one and that he, like Beddington, questioned why journalists have what he called “a pathological need” to balance a scientific opinion with one from outside of science.

“You don’t have that balance in racism,” he said. “You’re not finishing [an article] by quoting the Ku Klux Klan when it is an article about racist ideas,” Ernst said.

“This is strong language because the frustration is so huge and because scientists are being misunderstood. For far too long we have been tolerant of these post-modern ideas that more than one truth is valid. All this sort of nonsense does make you very frustrated in the end.”

Ben Goldacre, a science journalist and medical doctor, agrees. “Society has been far too tolerant of politicians, lobbyists, and journalists wilfully misusing science, distorting evidence by cherry-picking data that suits their view, giving bogus authority to people who misrepresent the absolute basics of science, and worse,” he told Research Fortnight. “This distorted evidence has real world implications, because people need good evidence to make informed decisions on policy, health, and more. Beddington is frustrated, and rightly so: for years I’ve had journalists and politicians repeatedly try to brush my concerns on these issues under the carpet.” Scientists need to fight back, he says.


Tuesday, February 22, 2011

More Intelligent People Are More Likely to Binge Drink and Get Drunk

by Satoshi Kanazawa

Not only are more intelligent individuals more likely to consume more alcohol more frequently, they are more likely to engage in binge drinking and to get drunk.

In an earlier post, I show that, consistent with the prediction of the Hypothesis, more intelligent individuals consume larger quantities of alcohol more frequently than less intelligent individuals. The data presented in the post come from the National Child Development Study in the United Kingdom. The NCDS measures the respondents’ general intelligence before the age of 16, and then tracks the quantity and frequency of alcohol consumption throughout their adulthood in their 20s, 30s, and 40s. The graphs presented in the post show a clear monotonic association between childhood general intelligence and both the frequency and the quantity of adult alcohol consumption. The more intelligent they are in childhood, the more and the more frequently they consume alcohol in their adulthood.

There are occasional medical reports and scientific studies which tout the health benefits of mild alcohol consumption, such as drinking a glass of red wine with dinner every night. So it may be tempting to conclude that more intelligent individuals are more likely to engage in such mild alcohol consumption than less intelligent individuals, and the positive association between childhood general intelligence and adult alcohol consumption reflects such mild, and thus healthy and beneficial, alcohol consumption.

Unfortunately for the intelligent individuals, this is not the case. More intelligent children are more likely to grow up to engage in binge drinking (consuming five or more units of alcohol in one sitting) and getting drunk.

The National Longitudinal Study of Adolescent Health (Add Health) asks its respondents specific questions about binge drinking and getting drunk. For binge drinking, Add Health asks: “During the past 12 months, on how many days did you drink five or more drinks in a row?” For getting drunk, it asks: “During the past 12 months, on how many days have you been drunk or very high on alcohol?” For both questions, the respondents can answer on a seven-point ordinal scale: 0 = none, 1 = 1 or 2 days in the past 12 months, 2 = once a month or less (3 to 12 times in the past 12 months), 3 = 2 or 3 days a month, 4 = 1 or 2 days a week, 5 = 3 to 5 days a week, 6 = every day or almost every day.
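The coding scheme above can be written out as a simple mapping from a raw yearly count to the ordinal category. The boundary counts here are my approximate readings of the survey labels (e.g. "2 or 3 days a month" taken as up to 36 days a year), chosen purely for illustration:

```python
def code_frequency(days_per_year):
    """Map a raw 'days in the past 12 months' count onto the Add Health
    ordinal scale. Boundaries are illustrative approximations of the
    survey labels, not the official codebook."""
    if days_per_year == 0:
        return 0   # none
    if days_per_year <= 2:
        return 1   # 1 or 2 days in the past 12 months
    if days_per_year <= 12:
        return 2   # once a month or less
    if days_per_year <= 36:
        return 3   # 2 or 3 days a month
    if days_per_year <= 104:
        return 4   # 1 or 2 days a week
    if days_per_year <= 260:
        return 5   # 3 to 5 days a week
    return 6       # every day or almost every day

print(code_frequency(6))  # "once every other month" -> category 2
```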

As you can see in the following graph, there is a clear monotonic positive association between childhood intelligence and adult frequency of binge drinking. “Very dull” Add Health respondents (with childhood IQ < 75) engage in binge drinking less than once a year. In sharp contrast, “very bright” Add Health respondents (with childhood IQ > 125) engage in binge drinking roughly once every other month.

The association between childhood intelligence and adult frequency of getting drunk is equally clear and monotonic, as you can see in the following graph. “Very dull” Add Health respondents almost never get drunk, whereas “very bright” Add Health respondents get drunk once every other month or so.

In a multiple ordinal regression, childhood intelligence has a significant (ps < .00001) effect on adult frequency of both binge drinking and getting drunk, controlling for age, sex, race, ethnicity, religion, marital status, parental status, education, earnings, political attitudes, religiosity, general satisfaction with life, taking medication for stress, experience of stress without taking medication, frequency of socialization with friends, number of sex partners in the last 12 months, childhood family income, mother’s education, and father’s education. I honestly cannot think of any other variable that might be correlated with childhood intelligence than those already controlled for in the multiple regression analyses. It is very likely that it is childhood intelligence itself, and not anything else that is confounded with it, which increases the adult frequency of binge drinking and getting drunk.

Note that education is controlled for in the ordinal multiple regression analysis. Given that Add Health respondents in Wave III (when the dependent measures are taken) are in their early 20s, it may be tempting to conclude that the association between childhood intelligence and adult frequency of binge drinking and getting drunk is mediated by college attendance. More intelligent children are more likely to go to college, and college students are more likely to engage in binge drinking and get drunk. The significant partial effect of childhood intelligence on the adult frequency of binge drinking and getting drunk, net of education, shows that this indeed is not the case. It is childhood intelligence itself, not education, which increases the adult frequency of binge drinking and getting drunk.

In fact, in both equations, education does not have a significant effect on binge drinking and getting drunk. Net of all the other variables included in the ordinal multiple regression equations, education is not significantly correlated with the frequency of binge drinking and getting drunk. Among other things, it means that college students are more likely to engage in binge drinking, not because they are in college, but because they are more intelligent.


Thursday, February 17, 2011

Humans Living in East Africa 200,000 Years Ago Were as Complex in their Behavior as Humans Living Today

John Shea, Ph.D., Refutes Long-Standing Myth About Human Origins

Stone points dating to at least 104,000 years ago from Omo Kibish, Ethiopia. These points, shaped by pressure-flaking and likely used as projectile points, are more than 65,000 years older than the oldest similar artifacts from the European Upper Paleolithic Period. The Omo Kibish toolmakers showed equal skill at making similar points out of very different kinds of stone.

STONY BROOK, N.Y., February 17, 2011— In a paper recently published in Current Anthropology, SBU Professor John Shea disproves the myth that the earliest humans were significantly different from us. The idea that human evolution follows a progressive trajectory is one of the most deeply-entrenched assumptions about Homo sapiens evolution. In fact, archaeologists have long believed that modern human behaviors emerged tens of thousands of years after our species first evolved. And while scientists disagreed over whether the process was gradual or quick, they have agreed that earlier Homo sapiens populations were very different from us.

“Archaeologists have been focusing on the wrong measurement of early human behavior,” says John Shea, Ph.D, professor of Anthropology at SBU and a Research Associate with the Turkana Basin Institute in Kenya. “The search has been for evidence of ‘behavioral modernity,’ a quality supposedly unique to Homo sapiens, when archaeologists ought to have been investigating ‘behavioral variability,’ a quantitative dimension to the behavior of all living things.”

Early humans were not “behaviorally modern,” meaning they did not collect difficult-to-procure foods, nor did they use complex technologies like traps and nets. But, according to Shea, there is now evidence that some of the behaviors associated with modern humans—specifically our capacity for wide behavioral variability—did occur among early humans.

The European Upper Paleolithic archaeological record has long been the standard against which the behavior of earlier and non-European humans is compared. During the Upper Paleolithic (45,000-12,000 years ago), Homo sapiens fossils first appeared, together with complex tool technology, carved bone tools, complex projectile weapons, advanced techniques for using fire, cave art, beads and other personal adornments. Similar behaviors are either universal or very nearly so among recent humans, and, thus, archaeologists cite evidence for these behaviors as evidence of human behavioral modernity.

Yet, the oldest Homo sapiens fossils occur between 100,000-200,000 years ago in Africa and southern Asia and in contexts lacking clear and consistent evidence for such behavioral modernity. For decades anthropologists contrasted these earlier “archaic” African and Asian humans with their “behaviorally-modern” Upper Paleolithic counterparts, explaining the differences between them in terms of a single “Human Revolution” that fundamentally changed human biology and behavior. Archaeologists disagree about the causes, timing, pace, and characteristics of this revolution, but there is a consensus that the behavior of the earliest Homo sapiens was significantly different from that of more-recent “modern” humans.

Professor Shea tested the hypothesis that there were differences in behavioral variability between earlier and later Homo sapiens using stone tool evidence dating to between 250,000-6,000 years ago in eastern Africa, which features the longest continuous archaeological record of Homo sapiens behavior. “A systematic comparison of variability in stone tool making strategies over the last quarter-million years shows no single behavioral revolution in our species’ evolutionary history,” notes Professor Shea. “Instead, the evidence shows wide variability in stone tool making strategies over the last quarter-million years and no single behavioral revolution. Particular changes in stone tool technology are explicable in terms of principles of behavioral ecology and the costs and benefits of different tool making strategies.”

The study, entitled “Homo sapiens Is as Homo sapiens Was: Behavioral Variability vs. ‘Behavioral Modernity’ in Paleolithic Archaeology,” has important implications for archaeological research on human origins. “Comparing the behavior of our most ancient ancestors to Upper Paleolithic Europeans holistically and ranking them in terms of their ‘behavioral modernity’ is a waste of time,” argues Shea. “There are no such things as modern humans, just Homo sapiens populations with the capacity for a wide range of behavioral variability. Whether this range is significantly different from that of earlier and other hominin species remains to be discovered, but the best way to advance our understanding of human behavior is by researching the sources of behavioral variability.”


Baby gorilla takes its first steps

Remember the tiny baby gorilla born at London Zoo last October? Well, he's back and this time zookeepers were on hand with a video camera when Tiny - as he's now called - took his first steps this week (see video, above).

The infant has been clinging to his mother since his birth, so it's no surprise she gave him an encouraging shove in the right direction when he tried to return for a cuddle. The keepers are now trying to come up with a permanent name, as he is fast outgrowing his nickname.

Wednesday, February 16, 2011

Supermassive black holes not so big after all

BRISTOL: Supermassive black holes are a factor of 2 to 10 less massive than previously thought, according to new calculations published by German astrophysicists.

At the centre of most galaxies, including our own, sit supermassive black holes, believed to be between 100,000 and several billion times more massive than the Sun. Previous estimates of black hole masses had contradicted theory, particularly for far away or young black holes. But new research shows that these estimates were wrong.

“It caused problems for the theory of galactic evolution that young galaxies should have these massive black holes,” said lead researcher Wolfram Kollatschny of the University of Göttingen in Germany. “Knowing the rotational velocity of surrounding material we could calculate the central black hole masses unambiguously.”

Probing the black holes

Supermassive black holes are thought to grow from massive star supernovae, sucking in so much surrounding gas that they eventually gravitate to the centre of their galaxy. They are surrounded by bright hot discs of material - called accretion discs - waiting to fall into the abyss.

Emission spectra – which identify the elements in matter - emanating from these discs contain important information about the black holes they surround. Scientists use one line in these spectra to estimate young and distant black hole masses and another for closer black holes.

What the latest research published in the journal Nature shows, however, is that one line “is always broader” than the other. “If we don’t correct for this effect we overestimate the masses of distant and young black holes,” said Kollatschny.

All previous calculations overestimated

Kollatschny and Matthias Zetzl, also from the University of Göttingen, dissected spectra from 37 active galactic nuclei and found that the line widths of broad emission lines are caused by a combination of turbulence and rotational speed.

“We could separate their shares in individual emission lines,” explained Kollatschny. “Only the rotational velocity should be used to derive the central black hole masses.”

When they did this they found that previously calculated masses had all been overestimated. Furthermore, they found that “the ratio of the turbulence with respect to the rotational speed gives detailed information on the accretion disk geometrical structure.”
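The basic arithmetic behind such a correction can be sketched with the standard virial estimate, M ≈ v²R/G. The numbers and the quadrature removal of turbulent broadening below are illustrative assumptions of mine, not the paper's actual decomposition, which the article does not spell out:

```python
import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30     # solar mass, kg
LIGHT_DAY = 2.59e13  # metres

def virial_mass(v_rot, radius):
    """Order-of-magnitude virial estimate M ~ v^2 R / G. Real estimates
    include a geometry-dependent virial factor omitted here."""
    return v_rot**2 * radius / G

def rotational_velocity(v_observed, v_turbulence):
    """Remove turbulent broadening in quadrature (an assumption for this
    sketch) to recover the rotational component of the line width."""
    return math.sqrt(v_observed**2 - v_turbulence**2)

# Hypothetical broad-line region: observed line width 3000 km/s, of which
# 1500 km/s is turbulence, at a radius of 10 light-days.
v_obs, v_turb = 3.0e6, 1.5e6  # m/s
r = 10 * LIGHT_DAY

m_uncorrected = virial_mass(v_obs, r) / M_SUN
m_corrected = virial_mass(rotational_velocity(v_obs, v_turb), r) / M_SUN

print(f"uncorrected: {m_uncorrected:.1e} solar masses")
print(f"corrected:   {m_corrected:.1e} solar masses")
print(f"overestimate factor: {m_uncorrected / m_corrected:.2f}")
```

Because mass scales with the square of the velocity, even a modest turbulent contribution to the line width inflates the inferred mass noticeably, which is the sense in which earlier estimates were systematically too high.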

A clue to galaxy formation

According to Emmanuele Berti from the University of Mississippi it is important to have accurate estimates of black hole masses: “It is widely believed that the black hole mass is intimately related to other properties of their galactic environment.”

He continued, “If we can measure this at different times during cosmic history, we may learn something about how the Universe became what we see today.”

Since a black hole has only two properties - mass and angular momentum - an accurate estimate of the mass is essential if astrophysicists are to have any hope of understanding what is going on now and what went on during galaxy formation.

Gas can potentially corrupt results

Scientists are confident that they know the mass of the supermassive black hole at the centre of the Milky Way, as Berti explained, “Observing the orbits of stars at the centre of our own galaxy yielded the most precise supermassive black hole mass measurement to date.”

However, David Ballantyne of Georgia Institute of Technology in the U.S. urged caution when using other methods.

“For active galactic nuclei it is tricky to estimate the black hole mass because they are relatively rare (and farther away), and they emit a lot of light which blocks our view of the nucleus,” he said.

Hence, said Ballantyne, scientists are forced to use gas tracers, but gas can be compressed, heated and/or shocked, which can corrupt its velocity signature. “The accuracy of these methods is not fully known.”