30 November 2009

For new brain cells, go to the wild

For a while, people thought that vertebrate brains did not get new neurons throughout life. Songbirds provided one of the first known counterexamples, when it was discovered that some birds generated new neurons, associated with song centers, every year. Later, mammals were discovered to generate new neurons in adulthood, most interestingly in the hippocampus.

The hippocampus is deeply involved in all manner of learning and memory, but particularly spatial learning and memory. The natural hypothesis is that making new neurons might be related to learning.

Demonstrating what causes new neurons to form is tricky, though. For example, one standard spatial learning test, the Morris water maze, is set up so that animals (rats, actually) that learn the task move around less (swim, actually) than animals that don’t learn the task. Learning and the physical activity are inextricably linked, so you cannot easily say which caused new neural growth.

Researcher Lara LaDage and colleagues tried to address this problem by using a different animal and a different learning task. Chickadees are very good at spatial learning because they store food in little hiding spots, and come back for it later. Many birds are dramatically better at these kinds of spatial memory tasks than humans.

LaDage and company captured a group of young chickadees (Poecile gambeli) in the wild, and split them into three groups.

  1. Banded, let loose to live free in the wild, and recaptured three months later.
  2. Captured and kept in captivity for three months, where they were allowed to cache food and do associative learning.
  3. Captured and kept in captivity for three months, where they were not allowed to cache food.

A lot of the difficulty in running an experiment like this is trying to make sure everything is the same except one variable. With the free-living chickadees, you just have to accept that you can’t control everything. The trick is in making everything the same for the two groups of lab animals, a key one being keeping the animals’ general activity levels the same. LaDage and colleagues measured the chickadees’ perching in the lab, and were able to show they moved about the same distance, eliminating the confound between activity and learning.

All of this was done to see the impact on the brain. LaDage and colleagues identified new neurons by labeling a protein called doublecortin. Doublecortin shows up fairly specifically in new neurons, so by using an antibody to it, they could look in various regions of the hippocampus to see how many new neurons had formed in roughly the last month.

All the birds grew new neurons, but the free-ranging birds had more new hippocampal neurons – both in sheer numbers and proportionately – than the captive birds allowed to cache food. The captive birds that cached food, in turn, had more new neurons than the captive birds that did not. A simplistic summary might be that animals’ brains show greater capacity to change and adapt when they get to be animals in the environment that natural selection has shaped them for.

Reference

LaDage, L., Roth, T., Fox, R., & Pravosudov, V. (2010). Ecologically relevant spatial memory use modulates hippocampal neurogenesis Proceedings of the Royal Society B: Biological Sciences DOI: 10.1098/rspb.2009.1769

Picture by user jerryoldenettel on Flickr, used under a Creative Commons license.

28 November 2009

More reality science ideas

Quite a while ago (sometimes I forget how long I’ve been blogging), I wrote an article about reality television shows for scientists.

A while back, Bitesize Bio took up the torch with a new list of science reality shows we should have, but don’t. I particularly liked:

Research project with the stars: B-list celebrities become first year graduate students and experience rotating through three labs, presenting their work to their department and trying to meet all of the expectations of any other PhD student. Watch them struggle with error bars and statistics, freak out at the MSDS for dihydrogen monoxide and laugh at their tantrums when the acetone destroys their manicures.

Of course, I like this idea in part because of the cheap help in the lab...

27 November 2009

Life in the holodeck

I’ve written a bit about how evolutionary theory might help resolve Fermi’s paradox. A new article by Geoffrey Miller in Seed focuses on another part of the Drake equation, culture:

I think the aliens don’t blow themselves up; they just get addicted to computer games.

Of course, as is often the case, the article really isn’t about SETI, but our own culture.
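For readers who want the reference point, the Drake equation is just a product of factors estimating the number of detectable civilizations in the galaxy. A minimal sketch follows; the factor values are purely illustrative placeholders of mine, not estimates from the article:

```python
def drake(R_star, f_p, n_e, f_l, f_i, f_c, L):
    """N = R* x fp x ne x fl x fi x fc x L: star formation rate, fraction of
    stars with planets, habitable planets per system, fractions developing
    life, intelligence, and detectable communication, and civilization
    lifetime in years."""
    return R_star * f_p * n_e * f_l * f_i * f_c * L

# Miller's argument bears on the last factors: civilizations that turn inward
# to virtual worlds transmit less (smaller f_c) or fade sooner (smaller L),
# shrinking N. Placeholder numbers only:
print(drake(R_star=7, f_p=0.5, n_e=2, f_l=0.33, f_i=0.01, f_c=0.01, L=10_000))
```
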
Post title pinched from bfchirpy.

Let’s get personal

The Dr. Jekyll & Mrs. Hyde blog had quite a good post on writing personal statements.

There’s a tricky thing about personal statements: it’s almost impossible to find good and bad examples of them. Because personal statements are part of competitive applications for programs or positions, they’re confidential, and rightfully so. That means there’s almost no way to see how other people write them.

Thus, people don’t have an easy way to learn that something that is highly important to them personally – like having seen a close relative struggle with illness – is not that unusual. It’s easy for a professor or administrator to joke about the “grandmother story” being a cliché, but someone trying to write a personal statement for the first time has no easy way of knowing that their story is common.

The moral, for someone trying to write a personal statement, is to find someone who reads them regularly and get feedback before you send it off to that big program you’re interested in.

26 November 2009

Shell shock: Is the new way to dispatch a lobster a better way?

When you’re a crustacean neurobiologist, cooking a lobster is a topic you’d better be familiar with, because you will be asked about it. (See posts in February 2005; May 2003; maybe this subject needs to get its own label.)

The Daily Mail has an article on the latest effort to deal with concerns that boiling lobster alive is inhumane. The title claims it’s a way “to kill a lobster with kindness.” This potentially more humane alternative to boiling?

Electrocution.

And it’s electrocution with a cutesy name. The device is called “CrustaStun,” which I’ll wager is supposed to be pronounced “cru-stay-stun” so that it rhymes with “crustacean.”

I never would have thought of electrocution as a means of killing a lobster, because earlier this year, I read a pair of papers by Robert Elwood and Miriam Appel relating to electricity and crustaceans. They were studying hermit crabs rather than lobsters, but most decapod crustaceans have very similar nervous systems and broadly respond to the same kinds of stimuli.

These two papers are a matched pair. Indeed, the two should have been combined into one paper. Each is very slight, with one experiment that is a variation of the other. In both, they implanted wires into the shells that hermit crabs live in, so they could deliver small electric shocks to the hermit crab’s abdomen. In one (Appel and Elwood), they varied the intensity of the electrical stimulus. In the other (Elwood and Appel), they kept the intensity of the electrical stimulus the same, but gave the animals options for new shells to enter should they leave the shells they had been shocked in.

Hermit crabs are unarmored and do not like being outside of their shells. In both studies, the electric shock significantly changed the behaviour of the hermit crabs. The crabs left the shells they normally inhabit, and the authors saw some strange behaviours, including some aggressive behaviour directed at the shell. In other words, after getting a shock from the shell, some crabs started treating their shells like an enemy.

Elwood and Appel argue that their results show evidence for electrical shock eliciting pain responses in the hermit crabs. In the context of cooking lobsters, there is more experimental evidence supporting the idea that electric shock is noxious to crustaceans than evidence that high temperatures are noxious.

Nevertheless, there may still be an advantage to the new method in the speed. The Daily Mail article claims that:

The machine can knock a large crustacean unconscious in less than 0.3 seconds and kill it in five to ten. Crabs take four to five minutes to die in boiling water, while lobsters take three minutes.

I can’t help but think of the guillotine, which also had the goal of a quick, humane death. The actual practice of using the guillotine was, I understand, not always so tidy, and I wonder about how consistently the machine will accomplish its task.

The article claims in passing that this device may improve the taste. I can think of no particular reason why that should be, but the flavour issue is something I will leave for others to decide. But I do find it sad that a question that is so often asked generates strong opinions more than well-designed experiments.

Reference

Appel, M., & Elwood, R. (2009). Motivational trade-offs and potential pain experience in hermit crabs Applied Animal Behaviour Science, 119 (1-2), 120-124 DOI: 10.1016/j.applanim.2009.03.013

Elwood, R., & Appel, M. (2009). Pain experience in hermit crabs? Animal Behaviour, 77 (5), 1243-1246 DOI: 10.1016/j.anbehav.2009.01.028

Is a syllabus that hard?

Things you rarely read on RateMyProfessors.com:

“This professor had a great syllabus.”

At a meeting about faculty development last week, a few people brought up “How to write a syllabus” as a skill that new faculty could benefit from receiving some instruction in. This surprised me, because I never found putting one together to be particularly challenging. The main thing students care about is how they’re going to be evaluated, and I do think it’s important to spell that out.

Most of the rest of a syllabus is legal fine print, which does prevent instructors from screwing over students. But the fine print has been getting longer over the years. That’s why I don’t think a syllabus matters that much: it’s not mainly about teaching or learning, it’s about administrative butt-covering. I am reminded of the TED talk below that discusses the relentless emphasis on creating standards, in which Barry Schwartz says:

Scripts like these are insurance policies against disaster. And they prevent disaster. But what they assure in its place is mediocrity.

Similarly, Johnnie Moore writes:

It often seems to me that everytime we experience a crisis, the solution is to write more rules. ... (T)he practical effect is to engulf people in explicit, complicated systems and reduce their freedom - based on an unconscious assumption that everyone is not to be trusted.



Has anyone encountered a lengthy detailed syllabus anywhere outside of formal educational institutions?

25 November 2009

Come on, let me hear you sing with tailfeathers

Hummingbirds are amazing animals, but one of their less appreciated talents is their vocal abilities. Hummingbirds are one of the few animals that engage in vocal learning, for instance, which puts them in a small group with humans and a few other bird groups. The males learn songs because, as in other species, the lady birds find a good song... sexy.

Authors Clark and Feo had previously shown that during courtship, male hummingbirds in one genus, Calypte (Calypte costae shown above; Calypte anna below), make sounds with their tailfeathers that are similar to sounds they make when singing. This paper tries to figure out how this particular signal evolved. One of the strengths of this paper is that the authors have four hypotheses about how the tail sound might have originated, three of which generate different predictions. The song and feather sound could:

  1. Be unrelated as far as the female is concerned; two entirely different cues. If so, there would be no reason for the sounds to be the same as each other in the same species, or to be the same in different species.
  2. Both be exploiting a female preference. If so, there is strong reason for the sounds to be the same as each other in both a single species, and between different species.
  3. Have evolved sequentially, with female preference for one sound shaping the other. If so, you would predict the sounds to be the same in one species, but to differ between the two species.

Strong inference all the way, baby!
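The strong-inference logic above can be written down as a toy lookup table. This is purely my own simplified encoding of the post’s three predictions (the labels are mine, not the paper’s notation):

```python
# Each hypothesis predicts whether song and tail sound should match within a
# species, and whether the two species' sounds should match each other.
PREDICTIONS = {
    "unrelated cues":       {"same_within_species": False, "same_between_species": False},
    "shared female bias":   {"same_within_species": True,  "same_between_species": True},
    "sequential evolution": {"same_within_species": True,  "same_between_species": False},
}

def supported_hypotheses(observation):
    """Return the hypotheses whose predictions match the observed pattern."""
    return [h for h, pred in PREDICTIONS.items() if pred == observation]

# The observed pattern: song and tail sound match within each species,
# but the two species' sounds differ from each other.
print(supported_hypotheses({"same_within_species": True,
                            "same_between_species": False}))
# → ['sequential evolution']
```

The point of strong inference is exactly this: the observation rules out two hypotheses and leaves one standing.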

This paper does several things. First, the authors nail down exactly which feathers are responsible for the sound by experimentally removing feathers and artificially running blasts of air over them. In both species, the same feathers make the sound, suggesting that both birds inherited this feature from their common ancestor. Interestingly, while hummingbirds outside the genus dive for females, few sing, suggesting that the diving display is more ancient than singing in this group.

They also show that the tailfeather sound of each species most strongly resembles the song that species makes the normal vocal way. Interestingly, a hybrid between the two species generates a sound intermediate between the two.

That there is an “intermediate” tail sound made by a hybrid suggests where we’re going with the matching. The two species make rather different sounds with their feathers. The differences in song patterns are fairly complex, but, for instance, the sounds made by C. costae are around the 7 kHz range, whereas the sounds made by C. anna tend to be lower-pitched.

All of which suggests that one of the two sounds the males make evolved as a sort of “enhancement” to the earlier evolved sound. Given that singing seems to be the more recent innovation in this particular group of hummingbirds, it looks like the males started to sing to enhance the sounds made by their tailfeathers, rather than the song having primacy, as we might intuitively expect.

Reference

Clark, C., & Feo, T. (2009). Why Do Hummingbirds “Sing” with Both Their Tail and Their Syrinx? An Apparent Example of Sexual Sensory Bias The American Naturalist DOI: 10.1086/648560

Picture of C. costae by users Lance and Erin on Flickr, used under a Creative Commons license.

Picture of C. anna by user wolfpix on Flickr, used under a Creative Commons license.

24 November 2009

Tuesday Crustie: “I hate a Barnacle as no man has done before”

Today is the 150th anniversary of the publication of On the origin of species by means of natural selection, or the preservation of favoured races in the struggle for life by Charles Darwin. Although I’ve already featured a crustacean that Darwin collected, it seems appropriate to feature a couple more Darwinian crustaceans.

On his famed trip around the world on the H.M.S. Beagle, Darwin collected many crustaceans. When he returned, the collection was scattered and partly lost. But here is just one crab collected by Darwin himself, one of many from the Oxford collection of Darwin’s crustaceans:



Darwin’s crab collection recently went on tour to Australia.

Crustaceans are mentioned in the Origin:

(E)ven the illustrious Cuvier did not perceive that a barnacle was, as it certainly is, a crustacean; but a glance at the larva shows this to be the case in an unmistakeable manner.

It is no accident to find reference to barnacles, since Darwin had some years earlier published major monographs on barnacles, a plate from which is the second Tuesday Crustie, which shows some of the larvae that showed that barnacles were crustaceans:




Description of the plate is here. For comparison, here’s a live one, taken from here:



Darwin spent years trying to sort out the barnacles. Jonathan Weiner argues in The Beak of the Finch that barnacles forced Darwin to confront the problem of variation, and the difficulties of determining what is a species and what is merely a “variety.” And it was not an easy task, as indicated in the title of this post, which was taken from a letter Darwin wrote to W.D. Fox.

fMRI in the courtroom

From ScienceInsider:

fMRI scans of brain activity have been used as evidence in the sentencing phase of a murder trial.

I’m guessing we’re only a few years out from a trial that will rule on whether fMRI passes precedents for routine use in American courts.

Hat tip: mocost

Is it time to beg yet?


The nomination deadline for The Open Laboratory anthology, the annual compilation of the best science writing on blogs, has only one week to go. If there’s a post from this blog that you liked, please consider nominating it for the anthology. Posts from 1 December 2008 to 30 November 2009 are eligible.

Here are some suggestions, in roughly chronological order. If there are other posts you like, by all means, submit those.

Click here to submit an entry. A Blog Around the Clock regularly updates the list of entries; here’s a recent one (probably out of date by the time you read this).

In case you’re wondering why I don’t just nominate myself: it feels gauche to nominate a bunch of your own stuff.

23 November 2009

Dear students, about this coming holiday

Dear students,

I know that the university is closed on Thursday and Friday this week. I’m told it’s some kind of American holiday thing.

Should you happen to ask me if we are having classes on Wednesday, please forgive me if I reply with a little more sharpness in my voice than is really necessary. Please forgive me if I narrow my eyes, close them for a second, or even roll them.

You and I both know full well that the university is not closed on Wednesday. Wednesday is a regularly scheduled class day. It would be unprofessional of me not to have classes Wednesday. I’m sure you value learning (you paid to take this class with me, after all), and I would hate for you to feel you did not get full value for your dollar because I just canceled classes at random. On a whim. Because, you know, I just didn’t feel like coming in, or it would be inconvenient to me.

I know you don’t intend to impugn my professionalism, but that’s how it feels to me when you ask if I’m having class on a day the university is open. And because I have professional pride, I sometimes tend to overreact a little to questions like that.

Additional: See here.

22 November 2009

Texas textbooks toast?

I’ve frequently had reason to talk about Texas science standards here, thanks to the dubious actions of the Texas State Board of Education. It’s often been noted by commentators around the nation that because Texas is such a huge textbook market, textbook publishers make their books conform to the Texas standards. So when Texas adopts bad standards, the rest of the United States tends to feel the effects.

The Fort Worth Star-Telegram reports that the state is seriously considering moving to online materials, and that it could happen very quickly – maybe in less than a year. The Texas Tribune has a longer article detailing the same subject.

I think this is probably good news in many ways (some of my thoughts on textbooks are here). People in other states will probably have less fear of bad decisions bleeding over into their jurisdictions.

That said, I am worried that this might take the spotlight off the Texas State Board of Education. Their decisions might be subjected to less scrutiny and criticism nationally, creating the possibility that they might do more damage, albeit to fewer people.

21 November 2009

Who doesn’t love a good Venn diagram?



That said, this only covers the main science portion of the blog, not the various other concerns addressed here.

20 November 2009

Taming the backchannel beast

Over at Speaking About Presenting, Olivia Mitchell has written a free ebook about working with Twitter and similar online tools during presentations. It’s a great snapshot of a fairly fast-moving aspect of presentations, and has many good, practical ideas. And it documents some of the harshtag horror shows that have been happening in the last few months.

Extinction through fornication

Normally, we think of extinction happening because organisms fail to reproduce. In the case of Canadian sticklebacks, some incipient species are going extinct because they are reproducing all too well.

Three-spined sticklebacks have many populations around the world. The main population lives in marine systems for most of its life. But over and over, small pockets of animals have been caught in freshwater lakes, and have started off on their own evolutionary trajectories, separating from their marine ancestors.

Within the individual lakes, the sticklebacks can continue to branch off further from each other. Some specialize in lake bottoms; some near the surface. In my old stomping grounds of Vancouver Island, this has been going on in Enos Lake. Back in the early 1990s, the bottom- and surface-dwellers were said to be separate species, as they were very morphologically distinct and didn’t interbreed with each other. That said, they were never formally described as separate species and given new names, as far as I can tell.

This may turn out to be a good thing, because when Behm and colleagues sampled the stickleback populations in 2005, the two distinct groups were gone. To be clear, there were still stickleback – but instead of two clearly distinct clusters with no interbreeding, there was one cluster with lots of interbreeding.

Previously, the two different body shapes had different diets, as measured both by direct observation of the prey types in stomachs, and by carbon isotopes found in the fish. You’ll forgive me, I hope, if I skip the details of how lake food webs influence the ratio of different carbon types the fish ingest. In any case, that link between body shape and food type had also vanished.

It’s possible that there are a lot of intermediate-shaped fish that just aren’t doing so well. It turns out that, as of right now, there’s no evidence of that. The intermediates don’t seem to be small, seem to be growing fine, and so on.

All of this raises the question: What the heck has suddenly caused these two distinct groups to start merging back together?

The authors’ surprising suggestion is that it’s because of crayfish. Signal crayfish (Pacifastacus leniusculus) were introduced into Enos Lake around 1990. Behm and company don’t have a clear reason why having crayfish in the lake should have this dramatic impact on the stickleback populations, but the timing of the two events is mighty suspicious. There’s clearly some interesting ecology to track down in this system.

In the last few decades, we’ve really come to appreciate how dynamic and quick evolutionary processes, like speciation, can be. This research shows that speciation, at least in the early stages, can also be a fairly fragile thing, easily falling apart for reasons that are not entirely clear at first glance.

Reference

Behm, J., Ives, A., & Boughman, J. (2009). Breakdown in Postmating Isolation and the Collapse of a Species Pair through Hybridization The American Naturalist DOI: 10.1086/648559

Stickleback picture by user SuperIDR on Flickr, and used under a Creative Commons license.

19 November 2009

Overqualified

Sincere question, not rhetorical.

How and when and, for goodness’ sake, why did “overqualified” become a reason to turn someone away from employment?

This is starting to bother me a lot. I mean, I help run a Master’s program, am an advisor on a major grant to create graduate opportunities for Hispanic students, am the lead on one undergraduate research program, and participate in another.

But I have my doubts. Reading things like this or this... doesn’t help.

The concept of “overqualified” just gnaws away at the whole reason for those programs. Not just ours, but nationally. Internationally.

How are you going to create a technically skilled workforce in a society where there’s a threat that that very training can be held against people?

I understand that when there are too many applications to read, you’ve got to cut something somewhere. But it seems to me that “Oh, they’ll just leave when the economy is better” is short-sighted. This is a fantastic opportunity to get amazingly smart people into your company. Imagine the energy and talent a hiring business could recruit today if it said, “Nobody is overqualified.” Even if those hires do leave, you’ll probably have a better company at the end than when you started.

Somehow, I can’t help but think that what’s really leading to the concept of “overqualified” is that an employer doesn’t want an employee smarter than they are. Too likely to upset delicate workplace power relationships. Or something.

I clearly don’t get it.

18 November 2009

The Zen of Presentations, Part 29: The shirt on your back

Steve Jobs is good at presentations. Garr Reynolds has written about this a lot over on his Presentation Zen blog. Now, a whole new book has been written about him. This slideshow says:

Steve Jobs can wear a black mock turtleneck, blue jeans, and running shoes because, quite simply, he has earned the right to dress anyway he wants. For most communicators, it’s best to dress a little better than everyone in your audience.

I can’t help but find the rationalization funny. Author Carmine Gallo spends the book looking at what makes one person a great speaker, but shies away from the possibility that maybe Jobs is great partly because of how he dresses, not in spite of it.

Maybe people are responding to seeing someone they can relate to. Maybe people are responding to someone who is not relying on artifice. Maybe people are responding to seeing someone who is genuine.

Audiences crave authenticity. It’s a driver behind the success of so-called “reality” shows or YouTube videos: people are looking for the unscripted, the immediate.

With too many presenters, you can tell their dress for their presentation is an act. A total put-on. A sham. It’s not real, it’s not who they are, and they’re not comfortable.

Soon after, I spotted this post by Kathy Reiffenstein on what to wear during a presentation. This also struck me as greatly over-stressing formality and business wear, but I appreciated Chris Atherton’s response to it:

Love how much of this is really about attention (yours and audience's).

Right. Be worried not so much about how you look as whether that look will distract you or the audience.

I spelled out my own take on dressing for presentations over on Better Posters.

17 November 2009

Tuesday Crustie: Termite of the sea

 

This squat lobster, Munida andamanica, has been in the news recently for its peculiar ability to eat wood. Picture from here.

Additional:  For more on the science behind the discovery of this animal’s diet, see this post on Deep-Sea News.

For your consideration



I am not too proud to beg.

The nomination deadline for The Open Laboratory anthology, the annual compilation of the best science writing on blogs, is in two weeks. If there’s a post from this blog that you liked, please consider nominating it for the anthology. Posts from 1 December 2008 to 30 November 2009 are eligible.

Here are some suggestions, in roughly chronological order. If there are other posts you like, by all means, submit those.

Click here to submit an entry. A Blog Around the Clock regularly updates the list of entries; here’s a recent one (probably out of date by the time you read this).

In case you’re wondering why I don’t just nominate myself, I think it’s better for readers to decide what’s good than writers. And because it feels gauche to nominate a bunch of your own stuff.

16 November 2009

How to act when you might be eaten

If you don’t have to worry about finding a mate, life gets a little simpler. You have to worry about eating. You have to worry about being eaten. And with that, you’re pretty close to done.

Aspidoscelis uniparens is a parthenogenetic whiptail lizard (formerly Cnemidophorus uniparens), and Eifler and colleagues tested how these lizards allocate their time under threat. To do this, they built six large enclosures (225 square meters each) and put six of these small lizards in each one. In half of the enclosures, they also included two leopard lizards, which prey on the smaller species. In fact, the leopard lizards successfully caught a couple of the subjects during the experiment. Unfortunately, the authors describe the predators in much less detail than the whiptails, which makes it a bit difficult to interpret their findings.

The whiptails, not surprisingly, acted differently when enclosed with the leopard lizards. They spent less time moving when housed with leopard lizards, and they were visible less often to human observers. The whiptails also seemed to shift their activity more towards the early hours of the morning (say, 7:00 am), seemingly because the leopard lizards’ activity starts to pick up around 8:00 am.

But the whiptails didn’t change their behaviour as much as you might think. Time spent in vegetation? Time spent digging? How fast the whiptails move? No differences between the control enclosures and the predator enclosures. These are small animals, only a few centimeters long, in a very big, natural enclosure with lots of places to hide. There’s not really any way to predict how “worried” the whiptails would have to be – that is, how strong the predator cues would have to be – for them to start changing their behaviour more dramatically, or to start changing it at all.

Reference

Eifler, D., Eifler, M., & Harris, B. (2007). Foraging under the risk of predation in desert grassland whiptail lizards (Aspidoscelis uniparens) Journal of Ethology, 26 (2), 219-223 DOI: 10.1007/s10164-007-0053-0

15 November 2009

Comments for first half of November 2009

The Byte Sized Biology blog asks questions about submission fees for Open Access journals. I point out there is a distinct lack of any rules, as far as I can see, about who gets to have their fees waived.

I point out that Canadians don’t spell like either Americans or British over at Science after Sunrise.

Speaking of home, I had to express my sympathy for Canadian Girl Postdoc in Canada, who had a run-in with the American health care system.

On Deep Sea News, I correct a description of one of my blog posts. I wouldn’t have minded if it wasn’t for a contest.

At Pleiotropy, in a post about an ophthalmologist who pulls out the ol’ “Evolution... just a theory” argument, I note that the journal did publish a reply.

Maniactive talks about the importance of an audience to making a great presentation, which I’ve discussed here once or thrice.

A student asks Dr. Isis about authorship over at On Becoming a Domestic and Laboratory Goddess... I point out that there are guidelines for determining who should be an author. “Because I’m the boss” or “Because I wrote the grant that pays you” are not acceptable reasons for putting your name on a scientific paper.

13 November 2009

Neuroethology on numb3rs

Whoa! Just had a major geek out at seeing some neuroethology research I wrote about only a few months ago show up in the opening scenes of numb3rs. Video clip and everything. Good on you, Ken Catania!

Additional:  A video clip from last night’s episode with Catania’s snake research is now up.

Hitting the target

Spotted this on Facebook...



... and I was off to the races:

Philosophy argues about the target but never takes the shot.

Math assumes the target is a perfect sphere on a frictionless surface.

Science shoots at the target, misses, and says it hit the target within an order of magnitude.

Engineering tears down the old target and builds a new, larger, easier to hit target.

Religion believes in the target.

Crazy draws the target wherever the shot lands.

Sawed-off shotgun ignores the target.

Canada fires at the target, hits it perfectly, then apologizes.

Got more? There’s a comments section!

Quitting the game

Yesterday, I talked about why an instructor should care about whether students come to class. Yesterday was also the deadline for students to drop classes, which is another reason instructors should care whether students come to class or not: not coming is a warning sign that people are just going to give up.

Like so many aspects of teaching (see here, here, and here) there are important parallels with gaming. This article talks about people who don’t finish video games, and there is so much to think about in terms of teaching. For instance, replace “player” with “student” and see how it reads:

Keeping players motivated is difficult. ... The goal is to strike the right balance between difficulty and player ability, thereby always keeping the player within arm's reach of a new achievement.

Despite these attempts to balance difficulty for a wide range of people, the players will still experience failure. More importantly, many of these folks will stop playing because of these failures. It’s rare for people to leave a restaurant because they don’t like the food, and it’s not too common for people to walk out of a movie because it’s bad – but game players do put down the controller and leave the game all the time.

I’m not a die-hard video gamer by any stretch of the imagination, but I have finished some fairly lengthy video games, and there are a few differences between those I’ve finished and those I haven’t finished yet.

Sudden ramp up: You’re playing the game, and then suddenly it gets harder. Way harder. You have no opportunity to practice the skills you need to continue. You’ve been needing two or three tries to complete each level, then suddenly you hit a level where you’ve had eight goes at it and feel no nearer success. Teaching lesson: Increase difficulty gradually.

Guides and hints: You hit a puzzle or task that is necessary to progress in the game, and you just get stuck. I loved the video game Okami because you had a guide who gave you hints. You were always pretty clear on what the task was. Teaching lesson: Give people clear goals, and give them feedback before they attempt high-stakes tasks.

Bad save points: Pointlessly replaying what you can do just to get to what you can’t do. In de Blob, to get to the final boss battle (which also has the ramp up problem), I have to start the entire level from the beginning, and it feels like it takes forever. Teaching lesson: Don’t make people keep re-proving themselves.

Also consider what both gamers and students are tempted to do when they get stuck: go online and get a walkthrough or a cheat code.

12 November 2009

Elizabethan anti-intellectualism

Over at The Intersection, Chris Mooney reminds me of another academic argument that isn’t really an argument: Who wrote Shakespeare?

While I’ve written much about denialism here, I had completely forgotten about other disciplines. Interestingly, for a movement that contradicts so much scholarship, most of the arguments I’ve read on this seem to start with, “William Shakespeare was barely literate,” or “poorly educated.” I agree with Neil Gaiman’s take on people who argue that Shakespeare didn’t write Shakespeare:

I don't have much time for the Shakespeare wasn't Shakespeare people overall, as the engine that seems to drive so many of them is snobbery. They want Shakespeare to have been an aristocrat, and not a base-born playwright.

What other “denial” fringe theories are out there in the humanities and other disciplines?

We are our brains

Raymond Tallis argues over at New Humanist that neuroscience is being oversold. He writes a lot of sensible things about overinterpreting research findings.

The fundamental assumption is that we are our brains and this, I will argue presently, is not true.

Then... what are we? Tallis appears to be a Cartesian dualist:

The brain, as understood by neuroscience, is a piece of matter tingling with electrochemical activity. There is nothing in this activity that would make the stand-alone brain capable of making the material objects around it have an appearance to it or able to have the sense of itself as the subject to whom these objects appear. ...

Some may argue that, mysterious or not, this is what the brain does. However, there are other aspects of human consciousness – the unity of the self, the formulation of intentions, the performance of voluntary actions – that are even further out of reach of neuroscientific explanation. This has led some neuroscientists and their philosophical followers to deny their existence: the self and free will belong to a pre-scientific “folk psychology”.

This gets even stronger later (emphasis added):

The human world is an entirely new realm created by all the means we have of joining attention and consciousness. It is unknown to nature, though it creates a mirror in which nature is reflected.

Tallis implies that there is a non-material, supernatural explanation for our sense of self: that humans have a soul. So chuck it all, scientists: There are things that are beyond the reach of science, that (cue lightning flash and ominous music) man was not meant to know!

Maybe organizers of the next Neuroscience meeting should add a new registration category after “Member,” “Postdoc member,” and “Student member”: “Angry villager.”

Tallis undermines his credibility on scientific issues by invoking things “unknown to nature.” The soul is not a scientific concept.

Regardless of this, one does not have to be a dualist to agree with this claim:

(O)ne would like to know how it would be possible for people to formulate social policies based on neuroscience.

That’s a fair comment. One of the greatest issues of the day is how to turn scientific knowledge into public policy. Economic policies related to climate change, teaching policies related to creationism, and public health policies related to fear of vaccines are all examples of how difficult this is. I absolutely agree that we are not at a point where neuroscience should be strongly informing policy. But being rightfully skeptical of the relationship between science and policy does not mean that the underlying scientific premise – that our brains are responsible for our mental lives – is wrong.

Additional: Mind Hacks also responds to Tallis’s article. Thanks to Julie Dirksen for spotting it.

Why do we care if students come to class?

Last week, I wrote about why instructors might want to lecture even though the ability to make good recordings and post them online now exists. One of my colleagues asked how we educate students about how to use recorded lectures to their best advantage, and not use recordings as a reason to miss scheduled class sessions. (Incidentally, Tegrity claims that the fears of students not coming to class are misplaced.)

Students have a very simple strategy, I think, for determining what we instructors think is valuable: Marks. If it's not graded, it's not important. If we want to send a message to students that their presence matters, we have to give them points towards their final grade that they can only earn by being physically present.

That said, I don't suggest points just for attendance. I much prefer giving points that are related in some way to the material and the subject that people are trying to learn. I use in-class clicker questions, so students have to be there to answer clicker questions to get the points for them.

A question worth thinking about is why we want students to come to class at all. I hear my colleagues complain about students not showing up, and I’ve done my fair share of it, too. But if students are completing all their assignments, showing understanding of the material, and communicating well, why do we want their bums in the seats?

One reason is that showing up to scheduled meetings on time, every time is a mark of professionalism. That’s worth encouraging.

Another line of argument to students goes: “You should show up to class because it’s good for you. Our experience as instructors is that students who show up do learn more and get higher grades.” But this is about as appealing as being told to “Eat your vegetables.”

But a nearly empty lecture hall or classroom can just be damn depressing to the instructor. I have told students that if I got upset every time someone missed class, I’d never stop being upset. Which is true... to a certain degree. When only a quarter of the students from the first day of class remain, yeah, it gets to you. I know that I should be trying to make my classes so amazing that people who aren’t even registered for the class will just come in and listen.

That’s a short and unappealing list. Anyone have more?

Note to students: If you miss half your scheduled classes, you lose the right to complain about aloof instructors.

11 November 2009

You smell like chicken (to a mosquito)

This post was chosen as an Editor’s Selection for ResearchBlogging.org. Although swine flu is the virus of the year, there are plenty of other viruses out there, like West Nile virus. This new paper looks at the interplay between the virus, birds, humans, and the sensory systems of mosquitoes.

Insects live in a different sensory world than humans. Insects react to molecules (smells) the way we react to photons (vision). Female mosquitoes use their sense of smell to track down larger animals so that they can get a blood meal, which they need to reproduce. Syed and Leal look at the sense of smell in a mosquito (Culex pipiens; pictured).

Because it’s already known that these mosquitoes feed on both humans and birds, the authors extracted odourants from both. Strangely, they present data from five different ethnicities... but three of them are represented by only one or two subjects. I’m not convinced that sample sizes of one or two tell us anything useful. The authors claim that there is no difference, so I would rather have just seen the pooled data, leaving ethnicity out of the picture.

The antennae have three kinds of small sensory hairs (sensilla).

  • A1 sensilla (sharp trichoid sensilla), which come in long and short varieties. Each sensillum has two different sensory neurons in it, which can usually be distinguished by the size of their spikes.
  • A2 sensilla (blunt trichoid sensilla), which also have two different sensory neurons in them.
  • A3 sensilla (grooved pegs).

Much of the paper consists of throwing chemical after chemical on the antennae and looking for neural responses to the chemicals. Their big finding is that one of the sensory neurons in A2 sensilla is highly sensitive to a chemical called nonanal. These sensory neurons respond to nonanal at concentrations about a hundred times lower than any other chemical tested.

That matters, because they show nonanal is the major chemical in the odours of the birds they tested, and it’s a major chemical in human odours, too. That this chemical is present in the animals mosquitoes feed on, and that a whole class of neurons is highly tuned to it, strongly suggests nonanal is used in blood-seeking behaviour. It also neatly explains why this mosquito could be a major conduit for viruses jumping from bird populations to humans, and vice versa.

Once you know a bias, you can exploit it. Syed and Leal also baited mosquito traps, which normally use carbon dioxide to lure in the mozzies, with nonanal. Nonanal alone didn’t work as well as carbon dioxide alone... but the combination worked better than either alone. Although trapping mosquitoes probably couldn’t be used for mosquito control, it could improve mosquito monitoring.

Unfortunately, the organization of the paper is slightly disjointed. For instance, alternating between the names “trichoid sensilla” and “A2 sensilla” causes more re-reading than would otherwise be needed. The paper uses the acronym “EAD” in the Results figures well before spelling out what it means, a few pages later in the Discussion section. (It means “electroantennographic detection,” a needlessly fancy way of saying extracellular recording.) Not cool.

Reference

Syed, Z., & Leal, W. (2009). Acute olfactory response of Culex mosquitoes to a human- and bird-derived attractant. Proceedings of the National Academy of Sciences, 106(44), 18803-18808. DOI: 10.1073/pnas.0906932106

10 November 2009

Tuesday Crustie: Strigose

Galathea strigosa
Strigose (from the New Latin strigosus, from striga row of bristles, from Latin, furrow): having appressed bristles or scales <a strigose leaf>

And you needed to know that to fully appreciate the name for this squat lobster, Galathea strigosa. (It’s the one in front, not that chordate interloper in back.)

The view from the top of the Texas uni system

One of the administrative issues I’ve been dealing with is grad student class size. Basically, we’re running into issues because our graduate classes are small. This photo caption does not fill me with hope:

Raymond A. Paredes, Commissioner of Higher Education, lays out the many challenges facing higher ed, including dismal graduation rates, a disconnect between what’s expected of high school students compared to what universities expect, the challenge of attracting Latinos to college programs, and the need to cut costs by dropping low-production courses and asking faculty members to take on heavier teaching loads.

Part of a longer article here.

The Zen of Presentations, Part 28: Sour notes

One of my classes is also used for a student seminar class. This morning, I found a set of abandoned note cards. This one filled me with disappointment:



No. Please, no.

(In fairness, most of the other 28 note cards were better.)

09 November 2009

ScienceOnline evolution competition

A few weeks back, I put up a post that was part of a blogging evolution competition sponsored by NESCent for the Science Online 2010 conference. The list of entries is here. Lots of fine reading!

I’d encourage you to go vote for me, but it’s not that kind of contest. So just enjoy.

Can I get this on Guitar Hero?



From a Discover Magazine contest to explain evolution in two minutes or less.

Less than a year to go for Texas State Board of Education vote

Ah, Texas State Board of Education member and former chair Don McLeroy has competition when he runs for re-election next year. Tom Ratliff wants to challenge him in the Republican primary, and has McLeroy pegged:

“I truly believe he (McLeroy) thinks he knows better” than educators what should be taught and how, Ratliff said. “I am one hundred and eighty degrees from that mentality.”

I would extend that. McLeroy doesn’t just think he knows better than educators: he thinks he knows better than everybody. His now infamous “Somebody has to stand up to these experts” quote was proof of that.

McLeroy, as usual, delivers some eyebrow-raising quotes:

McLeroy makes no apologies for grafting a political agenda onto education.  “The culture war over science education, the teaching of evolution, is going to be there, no matter what,” he said. “Education is too important not to politicize.”

On the particular matter of evolution, he says:

McLeroy brushes off the controversy over science curriculum. The media, he said, seeks to pigeonhole him and his allies on the board as “religious fanatics.”

“I’m here on a social equity issue,” he said. “As a Christian with strong Christian beliefs … I know all these children are created in God’s image, and we need to help these kids. It’s a moral responsibility.”

I wonder how the media made Mr. McLeroy mention Christianity twice and God once in his answer. What social equity issues McLeroy is interested in remains a mystery.

07 November 2009

I want to be Carl Sagan, but can’t

I am nearly ready to throttle the next person who mentions Carl Sagan. There seems to be a new cliché emerging in science writing: “We need more Carl Sagans.” It shows up in the books Don’t be Such a Scientist and Unscientific America, and I just spotted it again from Erik Klemetti, who wrote on the Eruptions blog:

We need our new Carl Sagans, Arthur C. Clarkes or Stephen Goulds - people who understand science and can advocate for it. I have trouble thinking of anyone filling those roles anymore.

And it’s very likely that nobody ever will fill those roles again. Look at the names that people toss around as science communicators (with an admitted bias toward my own field of biology).

Carl Sagan: Pulitzer Prize-winning book The Dragons of Eden published in 1977.

Stephen Jay Gould: Breakthrough paper on punctuated equilibrium published in 1972.

E.O. Wilson: Breakthrough book Sociobiology: The New Synthesis published in 1975.

Richard Dawkins: Breakthrough book The Selfish Gene published in 1976.

David Attenborough: Major television series Life on Earth debuted in 1979.

David Suzuki: Began as host of The Nature of Things in 1979.

See a pattern? All these people rose to prominence for their work in the 1970s or so. Forget that the internet didn’t exist. This was a world where you could count the television channels on one hand and have fingers left over. This was a time of true mass media rather than pervasive media (which is what we now have).

You see the same thing in music. U2 may be the last band to achieve true “rock star” status. You see the same thing in television. Look at the list of highest rated television shows in the U.S., and you’ve got to go down into the 30s before hitting a show from this decade. People have more choices of what they can read and watch, and people are less famous than they used to be.

How could someone today get the opportunity to talk to people about science at the level that Carl Sagan did? There are a thousand weird, intangible factors that would all have to align in a way that you could not plan or predict. Neil deGrasse Tyson said in an interview on This Week in Science (1 Sept 2009 episode; free on iTunes) that one reason he gets contacted by the media so much is:

My office is eight blocks north of all the news-gathering headquarters of the nation. ... When the universe flinches, they just send up an action cam. So I’m an easy date for them.

Don’t get me wrong, Tyson is fantastic at what he does (great interview here), but I’d wager there are many other scientists who would be just as effective on The Daily Show as Tyson was. But not everyone can get to multiple television studios by taking a taxi after work.

Plus, most of the names on that list had done significant technical work that made them reasonably well-known among their peers. With financial support for science going down and administrative responsibilities increasing, that isn’t exactly easy, either. This means the huge number of scientists at universities with modest research programs are right out of the picture, no matter how much they might want to do outreach and how good at it they might be.

The conditions that allowed Carl Sagan and Stephen Jay Gould and the others to thrive have pretty much gone away. I’m irritated when people say that scientists don’t want to come down from their ivory towers. There’s an implication that the reason we haven’t seen another Sagan or Gould is that nobody wants to do it, and that scientists could do it if they put their minds to it.

I would bloody love to have Carl Sagan’s gig. I would love to be able to tell a lot of people very cool stories, and help them understand what science is about. But it’s not going to happen, no matter how hard I work.

Note: I swear to you, I started writing this post and all but finished it before I realized today was Carl Sagan day.

06 November 2009

From backchannel to blackboard

Back in March, I wrote:

Consider this scenario. An instructor is giving a talk. When the instructor wants a question answered, students Twitter their response instead of using clickers. Ideally, this would involve some software that could recognize some special symbol or code associated with a class, so that responses could be tallied and a graph could be generated on the fly. I’m guessing that such software isn't out of reach for good programmers. Conceptually, it seems actually pretty simple.

I’ve since learned that such technology does indeed exist, and reckoned I should report on my (tiny) experience with it.

I heard about Poll Everywhere over on Slide:ology. Basically, you can ask people to vote by phone, Twitter, or web, and the responses get compiled and updated into a bar graph on the fly.

I used it to create polls (one shown at right) and make them into PowerPoint slides. Since my students don’t bring laptops to class, and apparently none of them have Twitter accounts (at least, nobody has admitted to it), I chose to set it up so that they could send a text message using their mobile phones.

It worked as advertised. Nobody had any problems sending the text messages, although the numbers they have to key in are arbitrary, so it’s not easy to do without carefully looking at the slide and double checking the digits. The animation of the bars when new information is coming in is nice and smooth. It was certainly not perfect. Importing the slide into a larger presentation was a little twitchy. And I consciously did not write that the graph updated in “real time,” because the updates were a trifle slow.

For comparison, I’ve been using a clicker system in my classes for several years now. The system is a bit more elaborate, in that it requires a dedicated clicker, receiver, and software. But if you’re doing a lot of in-class questions, the difference between pushing a single button on a clicker and punching about a dozen buttons on a mobile phone adds up. Combine that with the lag in the Poll Everywhere system, and the clicker system comes out feeling more natural, more agile, and less intrusive.

The clicker system does not compile results on the fly like Poll Everywhere does. Instead, it takes everyone’s answers, waits, then draws a graph. This is an important consideration in an educational setting. You want people to have a chance to think on their own for a moment, and not be influenced by other people’s answers.
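The difference between the two systems – compiling votes as they arrive versus holding everything until the poll closes – comes down to a few lines of logic. Here’s a toy sketch in Python of the deferred approach (my own made-up class and names, not any vendor’s actual code):

```python
from collections import Counter

class DeferredPoll:
    """Collect votes silently, then reveal the tally all at once.

    A toy sketch, not Poll Everywhere's or any clicker vendor's API:
    votes stay hidden while the poll is open so students answer
    independently, then get compiled into one result when it closes.
    """

    def __init__(self, options):
        self.options = list(options)
        self._votes = []
        self.open = True

    def vote(self, choice):
        # Ignore votes after closing, and ignore invalid options.
        if self.open and choice in self.options:
            self._votes.append(choice)

    def close(self):
        """Stop accepting votes and return the compiled tally."""
        self.open = False
        counts = Counter(self._votes)
        return {opt: counts.get(opt, 0) for opt in self.options}

poll = DeferredPoll(["A", "B", "C"])
for answer in ["A", "B", "A", "C", "A"]:
    poll.vote(answer)
print(poll.close())  # {'A': 3, 'B': 1, 'C': 1}
```

The key design choice is simply that nothing is drawn until the poll closes, so early answers can’t sway late ones.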

Poll Everywhere is great for instructors who have the occasional one-off poll they want to run in class. So far, I wouldn’t use it to replace a dedicated clicker system, but I could see it doing so in the near future.

And, incidentally, in the little test poll shown? None of my students got the name of their own university right. It’s supposed to be the second from the bottom.

More backchannel and Twitter tools are covered over at Olivia Mitchell’s blog. Watch for lots of clever ideas using these tools in the near future.

05 November 2009

The Zen of Presentations, Part 27: Coping with anxiety

I had a student in my office this week for advising, and I noted that she hadn’t taken Biology Seminar, a required class for all our majors. She said she had been putting it off, and putting it off, and was deliberately taking it at the last possible time. She was absolutely terrified of giving a talk. Even as I was talking to her, I could see her getting wound up at the prospect of something that might be weeks, if not months, away.

Before getting to the advice, let me preface what I’m about to say with a general principle:

There is no virtue in suffering.

We often treat people who are genuinely anxious about presenting with little sympathy. They are told to suck it up and keep practicing. There is more than a little “You should suffer for your art” attitude out there.

If you are truly frantic about the prospect of speaking in public, why not make an appointment with your physician and see about getting a prescription drug to help with the anxiety?

I don’t say this lightly. There’s a reason that some drugs are only available by prescription, and I only recommend this as a last resort for extreme cases.

I know one person who had to give a lot of presentations, and hated every second. The stress was quite debilitating, so this individual got a prescription for a beta blocker, and took a pill before giving a talk. The talk I saw this person give was fine, and I’m convinced the audience wouldn’t have known this person was dealing with high anxiety.

In most cases, you’re better off practicing and learning the skill of presentation than getting medication. But not everyone is the same, and some people may need more help than practice and preparation alone can give.

04 November 2009

Why lecture?

Our university started using a lecture recording system called Tegrity this semester. I’ve used it a bit, and it has some sweet features. It captures voice, video if you’re so inclined, and everything on the computer screen. Once recorded, the lecture can be viewed online, downloaded as an audio-only podcast, or streamed to an iPhone. My favourite feature is that all the text on the screen – whether a PowerPoint slide or something in a web browser – is searchable. Can’t remember where in the lecture a technical term was mentioned? Type in the text and it’ll locate it.
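That search feature is conceptually straightforward: as the screen is captured, map every word that appears to the time it appeared. Here’s a toy sketch in Python of that kind of term-to-timestamp index (my own invention, not Tegrity’s actual implementation):

```python
def build_term_index(captured_slides):
    """Map each on-screen word to the times it appeared.

    captured_slides: list of (seconds_into_lecture, on_screen_text)
    pairs. A toy sketch of the kind of index a lecture-capture
    search feature might build; not Tegrity's actual implementation.
    """
    index = {}
    for timestamp, text in captured_slides:
        for word in text.lower().split():
            # Record every timestamp at which this word was visible.
            index.setdefault(word, []).append(timestamp)
    return index

slides = [
    (120, "Action potentials and the sodium channel"),
    (840, "Sodium channel inactivation"),
]
index = build_term_index(slides)
print(index["sodium"])  # [120, 840]
```

Searching for a term then just means looking it up in the index and jumping the playback to the returned timestamps.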

First response one of my colleagues had to it:

“Just another excuse for students to be lazy.”

You might say there’s a little diversity of opinion on the subject of lecture recording. So this post by Donald Clark, with the bracing title of “Lecturing – stupidest profession?”, is timely. There are a lot of comments to read, too.

Lecturing is a weird historical artifact. To the best of my knowledge, the main reason lecturing became a method for transferring information is that it originated before the printing press, when there were literally not enough copies of books to go around. Despite books being widely available for centuries now, lecturing has persisted. On the face of it, lecturing is stupid.

As you might be able to tell from the comments in the first paragraph, I think there’s a lot to be gained from recording lectures. The major thing is that attention drifts, no matter how compelling an instructor might try to be. There are just times you miss a point, and even if you get it the first time, it often takes multiple listens to remember or understand something. And students are sometimes unable to make a lecture due to some sort of happenstance.

On the other hand, I think there’s way more to a lecture than a YouTube video. I’ve written before about videotaped presentations:

Human beings are very good at conversation, very good at face-to-face interactions. We like it. We crave it. Trying to take the information out of that social context almost always kills it.

When I first started moving to partly online classes, I asked students if they wanted me to work up a class as a completely online class. The answer was largely, “No”: they still considered the lecture time valuable. “Face time” still matters for people, even if it looks like a simple one-way flow of information. If we were just interested in pure information transfer, we’d just give students books.

The danger for both the student and the instructor is doing recorded lectures not because they’re a new resource for learning, but because they’re convenient. It’s easy for an instructor to get up and talk for an hour, and easy for students to listen. It’s not very threatening. It’s not very demanding.

Structuring lectures to take full advantage of the immediacy and social interaction inherent in the format is time consuming and hard. And university students frequently don’t want to come along for the ride. University students often got into university because they figured out how to make lectures work for them.

While my colleague considered recording lectures an excuse for students to be lazy, it’s also an excuse for instructors to be lazy.

Recording lectures is a great opportunity, but you have to do it right. It’s one thing to record a new lecture every session of a class that is never taught the same way twice, giving students something to review; it’s quite another to pre-record a lecture once in your office, with no audience, and never change it again.

03 November 2009

Rating journals and articles

Richard Smith argues that having measurements (or, as fashion calls them, “metrics”) for individual scientific articles means that impact factors for scientific journals are going to go away very soon.

I don’t think it’s going to be quite that easy.

Scientists have had “article level” measurements for a long time: the number of citations an article receives in other papers. To be fair, Smith mentions citations:

Citations are used to calculate the impact factor, but these citations come from only one (expensive) database. It’s better to use more than one database.

Putting aside the impact factor calculation for a second, has Smith not seen that Google Scholar has citation information? I also do not know why Smith claims more databases are better (apart from the possibility that one might be free). We want the actual number of times an article has been cited, so multiple databases should all converge on the same count.
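For reference, the impact factor calculation itself is simple: citations received this year by a journal’s articles from the previous two years, divided by the number of citable articles published in those two years. A quick sketch in Python, with made-up numbers for a hypothetical journal:

```python
def impact_factor(citations, articles_published, year):
    """Classic two-year journal impact factor.

    citations: dict mapping (citing_year, cited_year) -> count
    articles_published: dict mapping year -> number of citable articles
    A sketch of the standard formula, not tied to any database.
    """
    window = (year - 1, year - 2)
    cites = sum(citations.get((year, y), 0) for y in window)
    items = sum(articles_published.get(y, 0) for y in window)
    return cites / items if items else 0.0

# Hypothetical journal: 240 citations in 2009 to its 2007-2008
# articles, which numbered 150 in total -> impact factor of 1.6.
cites = {(2009, 2008): 140, (2009, 2007): 100}
pubs = {2008: 80, 2007: 70}
print(impact_factor(cites, pubs, 2009))  # 1.6
```

Which is part of the point: the number any database gives you depends entirely on how completely it has counted those citations.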

But as long as there are multiple venues for researchers to publish their research, editors and publishers will use some kind of measure of their publications to establish credibility. Does it matter if a journal promotes itself by a high impact factor, a high number of combined article downloads, or some other aggregate number?

And researchers need some sort of way to determine a journal’s credibility. Otherwise, we’re left with individuals pulling hoaxes to figure out if a journal is the real deal or not.*

More and faster measurements of articles are interesting and valuable, though the underlying question remains unanswered: What measurement of an article should we be most interested in? Page views are different from downloads, which are different from blog mentions, and all are different from citations. In an open access arena, you can probably crank up the number of page views and downloads by publishing anything with “dinosaur” in the title. (And I say this because I love me some dinosaurs!) In terms of scientific impact, an article read by 500 people might be more influential than one read by 5,000 people, if it’s the right 500 people; i.e., if it hits the target audience squarely.

In the end, all of these numbers are about money: grant money, salary money, merit money, and so on. If it weren’t for that, nobody would care about impact factors – or any other measurement of a scientific article’s worth – in the first place.

* Though to hear some of the advocates for Open Access projects like PLoS One, you could be forgiven for thinking that what they want to see is a situation where there are no other journals.