Thursday, June 29, 2006

Making eyes at each other

A few years ago, I was at a restaurant with a friend, and we were getting routinely ignored by the waitstaff. So I proceeded to stare intently at the back of the waiter's neck. He came over in a jiffy.

My friend, of course, wanted to know how I did that. The answer, in all likelihood, is that it was a probability bias -- a sort of variation on the gambler's fallacy, in which the guy was statistically likely to attend to us within a certain time frame, and my staring only produced the illusion of causality. That's not what I said, because people get a little grumpy when I'm constantly deflating their cognitive biases. But the explanation I offered is also true: humans are better than any other animal at telling when someone is looking at them, because the whites of our eyes (the sclera) are so distinct from the pupils and irises. In other words, it's really easy to tell whether a human eye is looking directly at you or slightly askance. You can't tell this with, for instance, a cat eye, which is almost all iris. Other primates' eyes work this way too, but our whites are whiter and our brights are brighter. In short, humans can signal more with simple eye redirection than can any other creature.

What do we signal? Well, submission, when the eyes are cast down. Where a dog might make an elaborate show of submission, we barely need to move the rest of our bodies. The eyes have it. We can also signal fear; seeing an eye with an unusually large proportion of white (i.e., an eye open wide in terror) triggers responses in the amygdala, passing on the "danger" signal. Even if the eye images are shown for subliminally brief periods, you end up with amygdala activation. This can indicate danger ahead, or that the person is anxious about something -- maybe worried about being caught in a lie. And we process these interpretations without even noticing it.

And, as it now turns out, a direct gaze -- just a different proportion and direction of pupils and whites -- can make people more honest... even if the eyes aren't real.
Melissa Bateson and colleagues at Newcastle University, UK, put up new price lists each week in their psychology department coffee room. Prices were unchanged, but each week there was a photocopied picture at the top of the list, measuring 15 by 3 centimetres, of either flowers or the eyes of real faces. The faces varied but the eyes always looked directly at the observer.

In weeks with eyes on the list, staff paid 2.76 times as much for their drinks as in weeks with flowers.
Of course everybody, even New Scientist, is jumping in with the Big Brother jokes, and you have to admit, this experiment implies that that iconic poster would have been pretty effective in ensuring compliance among the masses. Likewise, perhaps, the eyes of Dr. T.J. Eckleburg. I see it, though, as a paean to our sclera, perhaps the most underrated part of the visible eye. We get colored contacts because we want our eyes to be more striking, or maybe because we want to look like vampires and cats. We worry -- well, some of us worry -- about finding eye makeup that will play up our eye color. We maintain a mythology of Carries and Clarks Kent, whose prom dates don't think they're beautiful (or whose populace doesn't think they're super) until the glasses come off and the irises are in view. But it's in the sclera that our powers of communication lie.

We've got armadillos in our trousers

PZ has put up an amusing repost about fish and spider penises. The findings may surprise you. I find the spider penis-analogues a little creepy -- something about translucent genitalia doesn't do it for me -- but the final conclusion about fish packages is not to be missed.

Wednesday, June 28, 2006

Oh Richard Feynman we love you get up

While Dan made his QSOs this weekend, I read QED. Overall I found it pretty congenial, though one or two of the metaphors were actually too simplified for me -- I could understand how I was supposed to use them, but not what they actually represented. I was most struck, however, by Feynman's very matter-of-fact explanations of what physics is actually for, and what it actually does. As you know, this is knowledge that I feel is crucial, whether you're in the sciences or the humanities (and especially if you're a politician!). So I wanted to share a couple of quotes.

On page 8 in my edition:
We physicists are always checking to see if there is something the matter with the theory. That's the game, because if there is something the matter, it's interesting! But so far, we have found nothing wrong with the theory of quantum electrodynamics.
Replace "quantum electrodynamics" in that quote with, say, "evolution" -- or any other well-supported theory -- and you're left with a pretty concise, parsimonious explanation of the goals and methods of science.

My favorite quote on the subject of general science, though, is on page 10 in my edition:
Finally, there is this possibility: after I tell you something, you just can't believe it. A little screen comes down and you don't listen anymore. I'm going to describe to you how Nature is -- and if you don't like it, that's going to get in the way of your understanding it. It's a problem that physicists have learned to deal with: They've learned to realize that whether they like a theory or they don't like a theory is not the essential question. Rather, it is whether or not the theory gives predictions that agree with experiment. It is not a question of whether a theory is philosophically delightful, or easy to understand, or perfectly reasonable from the point of view of common sense. The theory of quantum electrodynamics describes Nature as absurd from the point of view of common sense. And it agrees fully with experiment. So I hope you can accept Nature as She is -- absurd.
Of course, everyone reading this knows that this is the case. But Feynman was talking to people who didn't know, presumably, and I'm willing to bet that he convinced them. Do we have people doing this now? Great scientists, widely acknowledged to be great scientists, who bother to sit people down and say "here's how science works -- not just what we've discovered, but why it means something that we discovered it"? Daniel Dennett is the closest thing I can think of, and of course he's an amazing writer and thinker, but his status as a philosopher (rather than a scientist) is potentially an Achilles heel. People are more than willing to speak of him dismissively. Even the New York Times Magazine didn't give him a fucking break.

I think perhaps the problem is that great scientists are no longer widely (i.e. near-universally) acknowledged as great scientists. Certainly it's a rare scientist who has the force of personality that Feynman had, but I can't think offhand of any current scientists who enjoy "public intellectual" status. There was an article in a recent New Scientist, which I also read in the tent this weekend, complaining about "the fall of reason in the West":
The other challenge was external: a much more critical view of science adopted by the rest of society. Science revealed a darker side. Suspicions arose that it was dehumanising and the tool of dictators. Then came the atom bomb. Since the 1960s, evidence has begun to pile up that science's triumphs are poisoning the planet.

The result is a widespread western, and especially American, descent into superstition. About 40 per cent of Americans believe that Genesis accurately describes the creation. There is an apparent belief in magic that has had no parallel since the Middle Ages. The growing anti-intellectualism has no western precedent at all. We are witnessing the elevation of emotion over reason, of personal conviction over hard thinking.

...

But pause. Reflect on the inspirations for modern science: belief in God and belief in humanity, a rational world view, and optimism about humanity's place in the cosmos. Science, it seems, has disposed of much of what made it successful. It has eaten away at its thought-foundations: its contribution to human meaning, the human spirit and the non-material richness of civilisation has shrivelled.
This article gave me pause, although I didn't fully agree with all of it. Primarily, I'm not sure I accept the active voice when the authors say that "science...has disposed of much of what made it successful." I tend to think that the fault lies less with the scientists, who after all are just doing what they've always done, and more with the state of public education -- most people are being fed misinformation about science and its role, and the public education system is ill-equipped to counter that misinformation, let alone stanch it at its source. But I wonder whether our side just needs a charismatic, plain-talking genius in the Feynman mold.

So... nominations?

(P.S., yes, it's a Frank O'Hara reference. My MA in English is basically useless on the job front, so I might as well get some fun out of it.)

Wednesday, June 21, 2006

The Internet and I are fighting

So I haven't posted since last Friday, but I'll be extending my break for at least another few days. Basically, it has become impossible to ignore the fact that the internet is full of jerks, and it's stressing me out. I haven't actually been fighting with anybody (flamewars lost their novelty in the 90s), but in order to keep up with what's going on in the blogosphere, I have to deal with things like Ann Coulter's book and the Stop the ACLU dipshits -- and even though I mostly read our side's responses (like those two links), it's enough to make you want to sever relations with humanity. Then there's this spiteful old man on the eHam forums, who tried to pick a fight with Dan (and everyone else in the vicinity who suggested that an "article" should probably include some useful information). He couldn't argue on a technical level, so he got personal, which is a nasty move in itself -- never mind that all of his insinuations were untrue. Oh, and then he went on to say that anyone with a tech-class (entry-level) license was automatically unqualified to discuss any aspect of ham radio (keep in mind that I am not a ham at all and even I could tell that his article was nonsense). Not even the new Carnival of the Liberals -- which links Bee Policy and Truth Tables and is hilarious and is borne up on fluttering left-wings -- made me feel substantially better about the overall composition of cyberspace.

I've been mulling over some ideas for blog posts, including (ironically) one about emergent behavior and how it relates to internet content. But the communications morass we call the Web is so peppered with unpleasant people and vile ideas that any time I sit down at the computer, I lose my will to compose. Right now, the internet doesn't feel like a conversation in which I want to participate. I know it's a skewed and incomplete view, but in order to regain perspective, I just need to avoid everything but email and webcomics for a while.

Luckily, we're going camping this weekend, because Saturday is Field Day. This should be just the break I need -- woods, campfire (probably in one of those little campsite containers), maybe reading some New Yorker by the propane lamp... and most importantly, a very old and regimentally polite communications medium. Some hams may be jerks on the internet, but nobody's a jerk on the radio, especially not on Field Day when everyone is just trying to make as many contacts as possible. It's a very relaxing sort of communication -- you exchange callsigns, locations, and information about what kind of power source and how many transmitters you're using, and then you move on to the next person. Just "CQ Field Day, this is November Three Oscar X-ray, you're one alpha in Mike Delta Charlie, QRZ" (or even better: -.-. --.- ..-. -.. -. ...-- --- -..- .---- .- -- -.. -.-. --.- .-. --..), pretty much over and over again, allowing of course for a lot of repetition of the call sign. I can cope with that.
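For the curious: that string of dits and dahs is nothing fancier than a letter-by-letter table lookup. Here's a minimal Python sketch -- my own illustration, not anything Dan or the ARRL would hand you -- with the table trimmed to just the characters that exchange needs:

    # Letter-to-Morse lookup, covering only the characters in the exchange above.
    MORSE = {
        'C': '-.-.', 'Q': '--.-', 'F': '..-.', 'D': '-..', 'N': '-.',
        'O': '---',  'X': '-..-', 'A': '.-',  'M': '--',  'R': '.-.',
        'Z': '--..', '1': '.----', '3': '...--',
    }

    def to_morse(message):
        # Encode letter by letter; drop the word spaces, as in the string above.
        return ' '.join(MORSE[ch] for ch in message.replace(' ', ''))

    print(to_morse('CQ FD N3OX 1A MDC QRZ'))
    # -.-. --.- ..-. -.. -. ...-- --- -..- .---- .- -- -.. -.-. --.- .-. --..

Run it and you get back exactly the string above. Copying it by ear at speed is, of course, another matter entirely.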

I might even operate a little, which you're allowed to do without a license on Field Day. I haven't had much reason to get a ham radio license, though Dan's been pushing for me to get one before they totally nix the Morse code requirement. I don't copy Morse very well, and I've never been a particular tinkerer (though I have now learned how to solder), so I'd never seriously considered it. But I can't say that I'm not craving a return to an older, more civil sort of communication.

So I'll be out of commission for at least the rest of the week. Meanwhile, if you are a tinkerer (or just interested), you should swing by a Field Day operation site. Dan and I are going to be hiding in the woods, but not all hams are so antisocial; while Field Day is technically intended as emergency communications practice, a lot of people see it as a chance to introduce amateur radio to the public. Plus, you get 100 extra points for media coverage, so a lot of local papers have been receiving press releases about area operations, and they may have blurbs. Amateur radio appears to be endless fun for people who enjoy activities like building circuits, bolting things together, and getting postcards from overseas. Even just living with a ham, I've learned all sorts of stuff about coaxial cables and how the ionosphere works. A lot of the hams on the online forums are big jerks, but a lot of people on any online forum are big jerks too. I recommend it for anyone geeky enough not to care that it's a geeky hobby. Or anyone who wants an alternative to the internet.

Friday, June 16, 2006

Feeling 0.2 pounds too heavy? Time for micro-lipo

Part of the problem I discussed yesterday, about non-comprehensive approaches to obesity, stems from the fact that many people don't understand the pervasiveness and harmfulness of our cultural attitude towards fat. We're so thoroughly acculturated that the idea becomes transparent. If you're not a woman, if you're not American, or if somehow you've managed to have a perfectly culture-approved body all your life (not too fat, not too thin, no lumps or stretch marks -- yeah, like you exist), it's possible to be incompletely attuned, even insensible, to the very real tyranny of looks. Things that fat activists take as axiomatic -- that it's seen as acceptable and even expected to make fun of fat people, that fat people are perceived as sexless or unattractive, that magazines and television promote unrealistic goals that young women genuinely expect themselves to achieve, that doctors are unresponsive to legitimate health complaints from fat people, that gyms are inhospitable places for the overweight -- may strike others as overblown. They are not. American culture's relation to looks and weight is a terrifically dysfunctional one, and is visited on almost all women (and many men) who grow up here.

As a conveniently-timed illustration, here's an article from the New York Times about increasingly specific plastic surgery. Where desperate and unhappy people once resorted to liposuction for their spare tires and badonkadonks, now they're going for their bra fat or their overly fleshy knees:
Last year, Americans had about 455,000 liposuction operations, making fat removal the most popular cosmetic surgery procedure, according to the American Society for Aesthetic Plastic Surgery. But in the last two to three years, liposuction, once used predominantly to reduce the flabby abdomens, hips and thighs of average Americans, has become a tool to enhance the near-perfect body parts of the already fit.

For this designer-body approach, an increasing number of doctors are using a technique known variously as precision, selective or micro liposuction. The goal is to remove an ounce or three of fat from ankles, knees, chins, necks, backs and upper arms, according to some prominent plastic surgeons and dermatologists.
There's an example of the feared and hideous "bra fat" (the term seems to be applied to flesh near both the armpit and the back strap) at the top of this post. I got that image from a site that was advertising, instead of highly specific liposuction, a procedure called mesotherapy, in which chemicals are injected under the surface of the skin. I'm not sure about the risks of this procedure, since many of its practitioners are outside the US and thus outside the jurisdiction of the FDA. Complications from lipo include infection, necrosis, and embolism, and smaller and trickier procedures carry more risk: "Ankles have superficial nerves and arteries that can be damaged, [Dr. Fodor] said. Fat on the back or kneecap is very fibrous and can be difficult to remove evenly. And kneecaps have sac-like cavities that can be easily traumatized." But come on... you might have a knee pooch!

Even more upsetting, some of the "fat deposits" that these women are spending thousands of dollars to remove seem to be the result of ill-fitting clothes. The woman in the lede, for instance:
"I had a little roll of fat hanging over the back of my jeans, like a spare bicycle tire in the back," said Dana Conte, a bartender in Manhattan. It was so obvious that her mother constantly came up behind her and pulled her shirt down over it, Ms. Conte said. "When your mother is doing that, it means there's a problem."
...
Last August, she had liposuction on her lower back around her waistline, and in January, she had liposuction again, this time on her mid- and upper-back to eliminate "bra fat," bulges that can occur when "your bra pushes lumps of fat down your back and up over the bra fastening and to the sides right near your arms," Ms. Conte said.

The total fee for both procedures, $10,000, was well worth it, she said.
Yes, when your mother pulls your shirt down to cover your back bump, it means there's a problem: your shirt is too short and your jeans are too tight. And that so-called "bra fat" is the result of wearing a bra with a too-small band size. Cost of a new bra, a complimentary bra fitting, and a new pair of jeans? Oh, probably $150 tops, if your jeans are not redonkulous. Not spending $10,000 on risky cosmetic procedures? Priceless.

What I found particularly interesting was that one of the women in the article requested a procedure to remove fat from her mons pubis. She claimed that "a little bit of fat stuck out over her bikini." In this case the doctor refused to do the surgery, but it brings up an interesting question: where do you draw the line between culturally unacceptable fat deposits and parts of your body? The mons pubis is a body part, not a body flaw -- it's intended to provide cushioning during sex. It's as though someone went to the doctor and said "I have these fleshy excrescences hanging off the bottoms of my ears -- can you lipo them out?" Cosmetic surgery is usually considered to be up to the individual, but doctors can't legally remove healthy body parts. When we become so image-obsessed that we start removing functional fat, have we crossed that line?

My overall point is this: when you think about obesity, don't just think about the headless bodies that lumber through news reports. If you want a picture of weight in America, imagine a skinny woman staring at her kneecap chub forever.

Thursday, June 15, 2006

Too fat for the comments box

Kevin Beck at Dr. Joan Bushwell's Chimpanzee Refuge is posting a five-part series on the dangers of fat acceptance. My mother is working on an article for the NYT Magazine about potential microbial explanations for obesity, which I've been editing and discussing with her, and of course I've been wondering how much weight I'll turn out to have lost when I next go to the endocrinologist, and whether she'll increase my medication or not. Plus I'd just been discussing this exact issue with Laura. In other words, I've been thinking about this stuff a lot, so my response got too long for the comments, and I'm posting it here instead.

This began as a response to a comment (rather an unnecessarily sarcastic and reactionary response, I thought, but I'm giving Kevin the benefit of the doubt -- and later information shows that first impression to be mistaken), in which he stated that "the reason I didn't bother [to discuss the impact of food marketing] is because I'm not addressing the causes of obesity in the first place."

--

Why aren't you addressing the causes? Buddha-belly was right to point out that these are multivalent, not to mention supremely important in considering whether we can treat obesity as a disease with a moral component. Which is really what NAAFA is about in the first place.

I'm constantly struck by people's inability to be moderate on this issue. Fat activists totally ignore evidence that being fat can be very unhealthy, or that some people really are fat because they eat too much and don't exercise enough; people decrying the obesity epidemic tend to ignore socioeconomic and non-habit-based causes, and downplay evidence that being fat doesn't have to be unhealthy. The underlying problem in both cases is the unwillingness to differentiate on the basis of cause.

There are several problems with NAAFA's approach, as you say. For one thing, if you refuse to accept that many people are obese because of bad eating and exercising habits, you blithely allow them to pass these habits on to their kids, and that's not acceptable behavior for a supposedly activist group. It's one thing to fight for people's self-respect, but quite another to endanger children. In general, NAAFA conveniently ignores the fact that just because obesity isn't necessarily unhealthy doesn't mean it can't be unhealthy. The NAAFA activists are so terrified of making fat into a blame issue that they resist acknowledging the existence of obesity-related health problems, which as you point out is extremely dangerous.

But while they ignore a great deal of pertinent information, a lot of their assertions are not wrong, and that's not coming through here. You admit that yo-yo dieting is unhealthy; eating disorders haven't been mentioned, but I doubt anyone would classify them as salutary. Fat people are ridiculed, and their health problems ignored or sidelined -- I've had heavy friends go to the doctor with acute infections, and have to wait for treatment until after they'd had a diabetes test. (Anecdotal, I know, but I'm sure I could dig up studies on the phenomenon -- it's widespread.) There are serious consequences, psychological and physiological, to the exaggerated societal attitude towards fat -- and it is exaggerated, beyond what is reasonable for something potentially caused by unhealthy habits. We don't see people making nasty jokes about the employability or attractiveness of folks with emphysema, just because they were probably smokers; it's inappropriate to brook nasty jokes about fat people simply because they might be gluttons.

And I say "might," because obesity is not infrequently caused not by habits but by genetics, slowed metabolism, hormonal imbalances such as hypothyroidism, antidepressants or other medication, and potentially gut flora imbalances and adenoviruses. For people who are heavy at an early age, these disorders can actually be the result of the very real psychological consequences of anti-fat sentiment -- eating disorders, for instance, can significantly slow metabolism, and childhood and adolescent misery can require the prescription of weight-boosting antidepressants. NAAFA is wrong to ignore the contribution of bad habits to obesity, and its potential health repercussions; we would also be wrong to ignore the possible irrelevance of bad habits to obesity, and its potential health-related causes.

And neither of these positions acknowledges the socioeconomic underpinnings of the obesity epidemic, which are habit-based and health-endangering but not due to greed or indolence. Although fast-food companies -- in the wake of one of those lawsuits you present as so frivolous -- are offering less detrimental options, the fact remains that healthy food is expensive, and time for exercise is a luxury many can't afford. If you work two sedentary jobs and can't afford not to eat at McDonald's, you'll get fat. NAAFA's position will do you no good in that case -- you should be exercising, and you are at a health risk, and it's dangerous to pretend you aren't -- but neither will an unwillingness to treat obesity as anything but a moral failing. In fact, I'm willing to throw my lot in with NAAFA to this degree: we should never treat obesity as a moral failing. Where NAAFA goes wrong is in imagining that this means we shouldn't treat it as a genuine health risk.

Tuesday, June 13, 2006

Your whole future is behind you

Quick: "Wednesday's meeting was moved forward two days." Was the meeting on Monday or on Friday?

In English, the likelihoods are about equal. While we generally think of the future being ahead of us and the past behind us -- you need only consider the mnemonic "spring forward, fall back" -- there are circumstances in which the two are reversed:
You can use the word "ahead" to signify an earlier point in time, saying "We are at 20 minutes ahead of 1 p.m." to mean "It’s now 12:40 p.m." Based on this evidence alone, a Martian linguist could then justifiably decide that English speakers... put the past in front.

There are also in English ambiguous expressions like "Wednesday’s meeting was moved forward two days." Does that mean the new meeting time falls on Friday or Monday? Roughly half of polled English speakers will pick the former and the other half the latter. And that depends, it turns out, on whether they’re picturing themselves as being in motion relative to time or time itself as moving. Both of these ideas are perfectly acceptable in English and grammatical too, as illustrated by "We’re coming to the end of the year" vs. "The end of the year is approaching."
Of course, these are exceptions, for the most part. In our culture and in most cultures we're familiar with, we consider ourselves to be oriented with the future at our helm. "Let's put this behind us." "Your whole life is ahead of you." "So we beat on, boats against the current, borne back ceaselessly into the past."
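If you want the ambiguity spelled out mechanically, here's a toy Python sketch -- my own illustration, not the researchers' -- of the two metaphors behind "moved forward." In the "ego-moving" frame you travel toward the future, so forward means later; in the "time-moving" frame events stream toward you, so forward means earlier:

    # Two metaphors, two meanings of "moved forward two days."
    from datetime import date, timedelta

    WEDNESDAY = date(2006, 6, 14)  # an arbitrary Wednesday

    def move_forward(meeting, days, frame):
        if frame == 'ego-moving':
            # I move through time toward the future: forward = later.
            return meeting + timedelta(days=days)
        if frame == 'time-moving':
            # Events stream toward me: forward = earlier.
            return meeting - timedelta(days=days)
        raise ValueError(frame)

    print(move_forward(WEDNESDAY, 2, 'ego-moving').strftime('%A'))   # Friday
    print(move_forward(WEDNESDAY, 2, 'time-moving').strftime('%A'))  # Monday

Same sentence, same arithmetic, opposite signs -- which is exactly why the poll splits down the middle.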

Now, according to an article in Cognitive Science, scientists have happened upon a South American people, the Aymara, who conceive of time in the reverse direction. According to an excellent writeup in UCSD News, both linguistic and gestural data implied that the Aymara consider themselves to be metaphorically oriented with the future at their backs:
The linguistic evidence seems, on the surface, clear: The Aymara language recruits "nayra," the basic word for "eye," "front" or "sight," to mean "past" and recruits "qhipa," the basic word for "back" or "behind," to mean "future." So, for example, the expression "nayra mara" -- which translates in meaning to "last year" -- can be literally glossed as "front year."
...
Analysis of the gestural data proved telling: The Aymara, especially the elderly who didn't command a grammatically correct Spanish, indicated space behind themselves when speaking of the future -- by thumbing or waving over their shoulders -- and indicated space in front of themselves when speaking of the past -- by sweeping forward with their hands and arms, close to their bodies for now or the near past and farther out, to the full extent of the arm, for ancient times. In other words, they used gestures identical to the familiar ones -- only exactly in reverse.
Once you think about it, this makes perfect sense. My first thought, and the researchers involved seem to concur, was that the past is "ahead" of you because you can "see" it by remembering it. The analogy isn't to the direction of movement, but to the direction of sight. It's an interesting indicator of cultural differences. If the future is ahead, then you're charging on blindly; if the past is ahead, then you're gazing back in memory.

More important than the putative explanations, though, is the fact that such a transparent idea can actually be culturally specific. The whole "meeting moved ahead two days" thing is a bit of a semantic trick, intended to illustrate the fact that linguistic data alone can be misleading; in truth, we would almost never think to question the idea that the future is in front of us. And for a lot of people, that simple fact -- that we wouldn't think to question it -- would be enough to mark such a concept as "innate." If everyone believes it, if nobody would ever consider believing otherwise, then it must be the natural way of things.

We should resist this thinking. It leads to a blithe inability to deconstruct such "natural" ideas as "thin people are more attractive" or "women are more emotional than men," or (in earlier times) "homosexuality is perverted" and "black people are dumb." Though this is of course a more benign situation, it should stand as a reminder that there's precious little that's purely "innate," whether we think to question it or not. And we should think to question it.

When explaining to my students what a "transparent technology" was, I used the example of a cup. Nobody thinks of it as technology, because everybody knows how to use it. One of my students gleefully interrupted to say that I was wrong, he'd just seen a show on Lifetime or something about princesses and one of them didn't know how to pour liquid into a cup! Of course, this only proved my point: when you take away the cultural familiarity that allows the technology to go unquestioned, it ceases to be transparent. We should be aware of this transparency, even in such simple things as cups and time-related language. It's easy to mistake "unquestioned" for "unquestionable."

Monday, June 12, 2006

A circadian classic

Coturnix has just reposted an excellent post about sleep and circadian rhythms. It includes handy tips about correcting a late-night, late-morning ("owl") sleep schedule, such as:
  • If you are an extreme owl, when you first get up in the morning, immediately go out in the sunlight (that is thousands of lux of light energy, compared to hundreds from a lightbox) for a jog with your dog. If you do not have a dog, buy one.

  • Work at night, sleep during the day (in a pitch-dark, light-tight, sound-proof room) and enjoy life in all its quirkiness.

  • If you wake up in the middle of the night, do not get up or switch on the light. Have sex instead. Hopefully your partner will enjoy being woken up by your kinky activities.
There's also a consideration of why teenagers have different sleep schedules than adults, some sociobiological explanations for human sleep patterns, a primer on "sleep etiquette," thoughts on the future of sleep, and a consideration of the role of sleep in our Puritan society:
I see some striking parallels between the way this society treats sleep and the way it treats sex. Both are sinful activities, associated with one of the Seven Deadly Sins (Sloth and Lust). Both are associated with the most powerful biological needs. Both are supposed to be a taboo topic. Both are supposed to be done in private, at night, with a pretense that it is never actually happening. Education in sleep hygiene and sex hygiene are both slighted, one way or another (the former passively, the latter actively opposed). Both are thought to interfere with one's productivity - ah, the good old Protestant work ethic! Why are Avarice and Greed not treated the same way? Raking in money by selling mega-burgers is just fine, and a decent topic of conversation, even a point of pride. Why are we still allowing Puritan Calvinist way of thinking, coupled with capitalist creed, to still guide the way we live our lives, or even think about life. Sleeping, whether with someone or alone, is a basic human need, thus a basic human right. Neither really detracts from the workplace productivity - au contraire: well rested and well satisfied people are happy, energetic, enthusiastic and productive. It is sleep repressed people, along with the dour sex repressed people, who are the problem, making everyone nervous. How much longer are we going to hide under the covers?
Coturnix claims that this isn't his best post, which just shows how good his other posts are. But for someone who tends towards an atypical sleep schedule, it was very enlightening. Don't read it right before bed -- too interesting -- but do read it, even if you're a perfectly normal sleeper. In fact, it might make you question what "perfectly normal sleeper" really means.

My ears are 18

My mother sent me a link to the NYT article about "teen-only" ringtones, lamenting that she had listened to the audio file and couldn't hear a thing. The idea here is that kids have made ringtones out of a noise, called "Mosquito," that was originally developed to disperse loitering youngsters. Kids can hear it, adults can't. I tried to reassure Mom that I'd actually heard it was a hoax, since I had read on Boing Boing that cell phone speakers couldn't make that sound. Cory had expressed admiration: "I had similar doubts -- which suggests that these kids have done something even more subversive than creating an adult-proof ringtone: they've convinced adults that there's an inaudible sound that they can all hear" -- and like Cory, I kind of wanted the story to be one of psychological, rather than technological, ingenuity. But it turns out that piezoelectric speakers can handily reproduce the "Mosquito" tone, so the story is plausible. And besides, I can hear the thing.

It's pretty horrible, actually. As I described it to Mom, "imagine someone playing a kicky rhythm on the blackboard with their nails." Actually, if anyone remembers the high-pitched squeal made by a roomful of Apple II-es, it's much like that -- and that noise used to drive me out of the computer lab. Which raises some questions. The NYT article claims that presbycusis, the natural aging-related hearing loss that makes adults immune to "Mosquito," starts in "early middle age." In other words, I should be able to hear this awful thing. So why was it ever economically feasible to blast it from speakers outside stores? Sure, you'd do away with teenage loiterers, but is that really worth losing all of your customers under 40? It's not the brightest of choices for a teen-only ringtone, either. I had a lot of middle-aged teachers, but a fair number of young ones, too... and if your teacher can hear your ringtone AND finds it exceedingly unpleasant, you are not exactly ahead of the game.
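Incidentally, if you want to test your own ears, the tone is easy enough to fake at home. Here's a quick Python sketch that writes a few seconds of sine wave to a WAV file; the 17,000 Hz figure is my assumption, based on the frequency usually reported for the Mosquito, so nudge it up or down to find your own cutoff:

    # Generate a few seconds of high-pitched sine wave for an ad hoc hearing test.
    # 17 kHz is an assumed ballpark for the Mosquito tone, not an official spec.
    import wave
    import numpy as np

    RATE = 44100     # samples per second; comfortably above twice 17 kHz
    FREQ = 17000.0   # hertz; lower this if (like most adults) you hear nothing
    SECONDS = 3

    t = np.arange(RATE * SECONDS) / RATE
    samples = (0.5 * np.sin(2 * np.pi * FREQ * t) * 32767).astype(np.int16)

    with wave.open('mosquito.wav', 'wb') as f:
        f.setnchannels(1)    # mono
        f.setsampwidth(2)    # 16-bit samples
        f.setframerate(RATE)
        f.writeframes(samples.tobytes())

One caveat: cheap computer speakers can roll off well before 17 kHz, so silence on playback doesn't necessarily mean old ears.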

In any case, while I was sad to lose my story of psychological warfare, the technological ingenuity is actually quite admirable. It's a nice example of turning a weakness into a strength; if someone is going to exploit your ability to hear high-pitched noises, use high-pitched noises to keep your activities secret. And it indicates that the real difference between kids and adults isn't presbycusis -- it's techie know-how.

Sunday, June 11, 2006

New on the blogroll

I'm very grateful to ScienceBlogs for bringing in a huge new crop of bloggers, including many people I read previously. Now, when I want to check up on Neurotopia or Science and Politics or Framing Science or The Loom, I can just stop in at ScienceBlogs. Plus, they've reorganized the front page, so that you can navigate the new denizens (and the old favorites) in different ways. It's true that I've had... opinions about SEED in the past, but they are doing a gorgeous job with their little blogoverse.

But not everybody worth reading can ScienceBlog, and while I've whittled down my sidebar due to SB's move towards one-stop shopping, I've also added a new link. Sex Drugs and DNA follows science policy, with a concentration rarely seen. Usually you have political blogs that also cover some science, or science blogs that also cover some politics. Here we have a blog entirely devoted to science-related bills, ethics, activist groups, and the uneasy bedfellowship between a public that sometimes believes in science and an administration that almost never does. You can watch Bill Frist equivocate about stem cells, or track the progress of the proposed science integrity amendment. Highly recommended reading.

Saturday, June 10, 2006

The least tasty way to support evolution

When Adam Scott got it into his head to eat monkey chow for a week, I don't think he was particularly trying to prove our kinship with nonhuman primates. In fact, his stated reason is that he hates food shopping and doing the dishes:
Imagine going to the grocery store only once every 6 months. Imagine paying less than a dollar per meal. Imagine never washing dishes, chopping vegetables or setting the table ever again. It sounds pretty good, doesn't it?

But can a human subsist on a constant diet of pelletized, nutritionally complete food like puppies and monkeys do? For the good of human kind, I'm about to find out. On June 3, 2006, I began my week of eating nothing but monkey chow: "a complete and balanced diet for the nutrition of primates, including the great apes."
Still, while this is an effective (if foolhardy) way to get out of doing the dishes, it's also an interesting (if foolhardy) experiment in human-monkey kinship. In a sense, Adam is martyring himself for all of us, staking his health on his innocent conviction that what's good for our monkey brethren is good for us. It's touching, really.

It makes some sense, too. The stuff is labelled "Primate Food," we're primates, so it's good for us, right? The answer is that I'm not sure. Monkey teeth are pretty close to ours, and the only significant jaw difference I could find had to do with minor disparities in stretchiness and bite force. It's not as though you can't have speciation based on food choices -- Galapagos finches, anyone? -- but I would think that the dental similarity would indicate that we're akin to other primates in that particular. Nutrition-wise, it's not at all tough to imagine that we could survive on the fruits and leaves that monkeys eat in the wild; in fact, this rather obsolete press release indicates that we might be better off. In short: besides the fact that we enjoy nice tastes, there's no obvious reason why monkey chow shouldn't be human chow.

But to understand that, you have to be fully committed to the deep and abiding brotherhood between man and monkey. So Adam, thank you; you're not just avoiding the dishes, you're lending credence to evolution. In the words of interviewer Steve Johnson (not that Steven Johnson): "This is a great thing you're doing, both for science and for my personal enjoyment."

Sunday, June 04, 2006

We Have Always Been Posthuman

First, let me assure you that if you're an English scholar with a sufficiently postmodern focus, the title of this post is a riotous, riotous joke. See, it references Latour's We Have Never Been Modern and Kate Hayles' How We Became Posthuman! So funny. (This is what literature scholarship does to your sense of humor. Anyone who's still wondering why I took a break from academia, wonder no more.)

Anyway, the week's Big Duh award goes to Reuters for reporting that we may not be made up entirely of human cells. Anyone who grew up on Madeleine L'Engle is currently rolling their eyes and saying "uh, yeah, the mitochondria and the farandolae? Please." Even those of us who have refined our science knowledge past the first-grade reading level, and who know there's no such thing as farandolae, are probably pretty comfortable with the idea that mitochondria have their own DNA. We also know that bacteria have genetic material, whether or not they live in our intestines. So when we read about a TIGR researcher explaining to a reporter that "We are somehow like an amalgam, a mix of bacteria and human cells," we can almost hear the tone of exaggerated patience.

Still, the sheer volume of bacterial cells, and the ratio of bacterial to human DNA in some parts of our bodies, is pretty impressive. The digestive system, in particular, seems to be more microbe than man:
They [a TIGR team] compared the gene sequences to those from known bacteria and to the human genome and found this so-called colon microbiome -- the entire sum of genetic material from microbes in the lower gut -- includes more than 60,000 genes.

That is twice as many as found in the human genome.

"Of all the DNA sequences in that material, only 1 to 5 percent of it was not bacterial," Gill said.
What's not mentioned in this quote: the "so-called colon microbiome" was extrapolated from feces. So I'm not sure how they ruled out DNA from masticated critters, or how much human DNA they really expected to find. Clearly there is something about poop analysis that I don't quite understand (and am not sure I want to). But the upshot is, humans are ecosystems.

But does this mean, as the Reuters headline claims, that we are not entirely human? Well, it depends what you mean by "we" and what you mean by "human" (how's that for an if-by-whiskey argument?). Are our bodies entirely human? Well, DNA found within the boundaries of the human body isn't necessarily human DNA, and never has been. But DNA in human germ cells is, no matter how many bacteria live in our guts. So, does "completely human" mean "containing, at any given moment, only human DNA," or does it mean "having only human DNA govern your initial genetic development"?

Certainly the presence or absence of intestinal flora has a huge effect on an organism, potentially an effect that could alter later gene expression. But I'm not sure this makes us "not completely human." The Earth is inhabited by all sorts of flora and fauna, all of which have profound effects on it (none more than us). None of these are planets. Is the Earth not a planet either?

If the question you're asking is whether our minds are completely human, the answer at first seems simpler. Gut flora probably don't have an enormous effect on our proprioception, emotions, consciousness -- the things that make us us. Again, the presence or absence of these bacteria may cause environmental effects profound enough to affect cognitive development or even gene expression in the brain, but I don't see any evidence that they can make us psychologically or neurologically nonhuman. And yet once again, one has to consider the fact that in a sense, we've never been purely human on the cognitive level either. In our highly technological age, we're becoming more aware of the "cyborg consciousness" that occurs when human brains rewire themselves to accommodate iPods or laptops or cell phones. There's nothing natural about using a computer or an MP3 player, but these technologies become transparent as we train our brains to think of them as extensions of ourselves. It's even more obvious with cars, where we actually use our proprioceptive sense to avoid obstacles or parallel park -- we treat this unwieldy metal casing as a part of our own bodies. This is what it means to be "posthuman" -- a fundamental interrelation between technology and embodiment.

Freaky 21st century stuff, right? But we've been absorbing technology like this since we, as a species, were in short pants. Consider eyeglasses. Bicycles. Crutches. Musical instruments. Weapons. Flint arrowheads. Rocks and sticks. The very earliest cultural and societal advances, the ones we think about as setting us apart from less civilized animals, involved using tools as improved fingernails or teeth, longer arms or digits, stronger fists. Since we've been human, our adaptable plastic brains have allowed us to incorporate technology into our self-perception. Without this ability to have our tools rewire our brains, we wouldn't have the dexterity and control that would allow us to throw a spear or manipulate a paintbrush. Thus, in what might look like a paradox, we've been posthuman as long as we've been human.

In other words, part of what it means to have a human mind is to have a mind that incorporates and adjusts to peripherals. Part of what it means to have a human body is to have a body that works symbiotically with other organisms like bacteria. Does this multiplicity make us "not completely human"? Quite the opposite.

Saturday, June 03, 2006

Move over, Butter

Despite technically living half a mile from the Zoo, I never did make it down to see Butterstick, and now he's been supplanted in cuteness. The Sumatran tigers have had three cubs, as yet unnamed, and of unknown sex since they haven't yet had their first medical exam. These, to the left, are not they; these are just some cute-ass baby tigers I found on the internet. Want to see the real thing? Well, like Butterstick, the tigers have a webcam. Pretty soon the National Zoo is going to have all their newborn animals blogging, too.

Of course, Butterstick was an achievement for science because it's so damn difficult to get pandas to have sex correctly; even when they do, you can't easily tell whether they're pregnant, and they're unlikely to bring the fetus to term. And even then, panda infants are so fragile that they often die. So Butterstick is not just a mind-alteringly adorable little fuzzball -- he's also a scientific triumph. The baby tigers? Well, they're just darned cute.

Thursday, June 01, 2006

Escherichia oompaloompa

It was a relief to find, when I clicked on this article from New Scientist, that the author had already made the Willy Wonka reference. When you read about researchers feeding caramel to E. coli, and then powering a fuel cell with the excreted hydrogen, it's hard to pass up the image of Oompa-Loompas in coveralls tipping chocolate into the gas tank of a candy car.
The team fed Escherichia coli bacteria diluted caramel and nougat waste. The bacteria consumed the sugar and produced hydrogen, which they make with the enzyme hydrogenase, and organic acids. The researchers then used this hydrogen to power a fuel cell, which generated enough electricity to drive a small fan.
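The chemistry here, as best I understand it -- this is my gloss, not the article's: E. coli ferments the sugar to pyruvate, some of which ends up as formate, and the formate hydrogenlyase complex (which contains the hydrogenase mentioned above) splits that into hydrogen and carbon dioxide. Roughly:

    glucose  -> 2 pyruvate                (glycolysis)
    pyruvate -> formate + acetyl-CoA      (pyruvate formate-lyase)
    formate  -> H2 + CO2                  (formate hydrogenlyase)

Since each glucose yields at most two formates, the theoretical ceiling is about two molecules of hydrogen per molecule of sugar -- take my stoichiometry with a grain of, er, sugar.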
Now, of course, this has the same problem as any other energy source: namely, it takes energy to make caramel and nougat, just like it takes energy to make biodiesel and ethanol. And ultimately, at least right now, all our energy comes from oil. But if there really is a significant "waste sweeties" backlog, maybe the Wonka-car is a good candidate for limiting our oil use -- operate our surrealistic fantasy chocolate factories off of oil, and use bacteria-fueled hydrogen cells for things like running people off the roads with our candy-powered SUVs.