Hopefully I didn't traumatize too many of the students in the seminar, ruining their booze-filled orientation week. But hopefully I did shake them up a little – welcome to college!
Archive for: August, 2010
I came across this visualization via Flowing Data. It was generated by Scott Manley at the Armagh Observatory and shows the discovery of new asteroids between 1980 and 2010. Whenever a new asteroid is discovered it is shown in white and usually fades to green, but remains in orbit. The ones that fade to yellow are so-called earth-approaching asteroids – those that come near earth's orbit but do not intersect it. The ones that fade to red are the ones that cross earth's orbit and could one day kill us all once and for all in a fiery cataclysm. As a result of automated sky-scanning projects in the 1990s, the rate of asteroid discovery greatly increased. The caption states that there are currently about half a million known asteroids orbiting the sun, and many more are expected to be discovered. It's amazing how much stuff is just floating around in the solar system, and amazing we're all still here.
A few months ago an article came out stating that in certain tasks, individuals with Tourette's syndrome show superior "timing control" – meaning that they were better at predicting certain time intervals than the non-Tourette's control group. This article received a bit of media attention, as well as the attention of a couple of science bloggers, here and here.
One thing that tends to pop out when the media or blogs report on these sorts of articles is the tendency to ascribe some kind of genius superpower to people with certain neurological conditions, perhaps to make them more interesting or exotic. Think of the character Dustin Hoffman plays in the movie "Rain Man", or the dude that plays Rachmaninoff in "Shine". In both blog posts about the timing control study, there's a mention of US soccer team goalie Tim Howard, who has Tourette's, and they seem to imply that this superior timing control is what makes him such a good goalie. First of all, while Tim Howard is a decent goalie, there are plenty of other goalies just as good or better who don't have Tourette's, and plenty of people with Tourette's who would make terrible goalies (myself included). Speaking from personal experience, as someone with Tourette's, I have a terrible sense of timing. I suck at videogames. When I try to play an instrument, my lack of a sense of rhythm makes me sound terrible. I'm bad at most sports. I'm uncoordinated. I know I am just a single example, but this idea of people with Tourette's having superior cognitive control and awesome timing powers just does not ring true.
If I were to characterize my sense of timing, I would call it odd and irregular. I read somewhere that jazz musician Thelonious Monk had Tourette's. I don't know if this is true, but if you have ever seen a video of the dude performing you can sort of see it. His music definitely has an odd sense of timing which to me rings much more familiar. And that's maybe why I like James Brown. I often find myself with the random urge to scream "Haah!" or "Hot Pants!" in the middle of a faculty meeting, or "Pop-Corn!" during a seminar. In the book "Motherless Brooklyn" by Jonathan Lethem, the main character is a detective with Tourette's syndrome. While for the most part I think Lethem got the essence of the disorder wrong, and the character is mostly a caricature, there's one bit I think he got right. The main character loves the song "Kiss" by Prince. I have to agree that the tempo, timing and changing cadence of the song definitely have a tic-ish quality that somehow resonates with my odd sense of timing. So no, unfortunately I am not endowed with awesome superpowers, but at least I can say I have interesting musical tastes.
On the other hand... maybe I should exploit this. Maybe I can cultivate an aura of an exotic twitchy genius who will rule the world with his superior cognitive control and impeccable timing. Hot pants! Haah!
Basically, because the earth is rotating, the oceans bulge around the equator by about 5 miles. In the absence of this rotation, the water levels would redistribute toward the poles, submerging most of North America, Europe and northern Asia. It would also cause land that was previously in the bottom of the ocean around the equator to be above the surface, creating a continuous ring of land separating a Northern and Southern ocean. I love things like this!
So I'm finally back after spending some quality time at a more northerly locale! I came back to work to find that: my email inbox had about 300 messages, that we had two papers rejected, that my university's animal care committee wants me to re-write a large part of my animal protocol which had been fine up to now, and that the semester starts a week sooner than I expected. Shit!
More importantly, I was also informed by my department's chair that they sent out requests for letters of support for my tenure case. Gulp… These letters are supposed to be, so I've been told, what can make or break your tenure case. They are letters from experts in your field at so-called "peer institutions" who are asked to evaluate your work up to now. This of course is somewhat oblivious to how science is actually done, since the world expert in your field may find themselves at a small university, while a large research university may have nobody working in your field. In this case, it is your departmental tenure committee and department chair's job to convince the university tenure committee and administration that the letter from the super big expert at the little university should carry more weight than the one from the not-so-big expert at the super fancy university. But what I find most troubling is this idea of what a "peer institution" is. For example, if you are at a medium-sized research university with a heavy emphasis on undergraduate education, although the faculty members in your department may be leaders in their fields, as a junior faculty member you will not have access to the resources you would have at a university affiliated with a major medical center. The core facilities you have access to are much more limited, the graduate programs are likely smaller so you will have less access to graduate students, startup packages are smaller, there's less possibility for collaboration, and you will likely have to teach a lot more than your colleagues at a major medical school. Which means that unless you have superhuman powers, your productivity will be somewhat less. Also, smaller universities tend to have shorter tenure clocks, so again, by the time your colleagues come up for tenure they will have amassed a greater number of publications, grants and fame.
Despite all this, from what we've been told at the tenure workshops our administration holds for junior faculty, they consider these universities with major medical centers to be our peer institutions. That might be the case for some of the humanities, but it's certainly not the case for the life sciences. One question the letter-writers get asked is "would this person get tenure at your institution?", and of course we are comparing apples and oranges here. OK, maybe more like apples and pears, but still. Particularly if the tenure clock at their institution is twice as long as that at your own. One would hope that an experienced letter-writer will highlight these differences, but they might not, or might be oblivious to them. You never know, and to me this is a bit of a worry. In which case it will rest on my department's chair, if my department decides to recommend me for tenure, to explain these subtleties to the university tenure committee, which can contain people ranging from anthropology to Slavic studies. And to also explain these subtleties to the administration – as to why major medical center X is not really a peer institution.
So in any case - it's somewhat out of my hands now. Let's hope the letter writers can tell the difference between an apple and a pear and maybe even a quince. Speaking of apples, it's almost apple picking season! Yum.
So you've invited your lab members and their families/significant others and a few other colleagues over to your place for a barbecue. What are you going to do now? Rush over to Costco to buy a buttload of hot dogs, frozen patties and a few veggie burgers for the vegetarians? Put out some fluorescent green pickle slices, booger-yellow mustard, ketchup and stale buns to go with it? No, mis amigos. You will be heading out to your local butcher and will be serving them carnitas! Not only will they be impressed with your cooking wizardry and your ethnic sophistication, but afterwards they will all sit lazily on your lawn, enjoying a sunny afternoon and another beer, rather than rushing back to the lab with half-digested burgers in their stomachs. Any vegetarians will be converted. Carnitas, a typical dish from Mexico, are one of the easiest, most delicious things you will ever make, and you cannot screw them up. Here's how to make 'em:
About 5 lbs of boneless pork shoulder. Ask for it at the meat counter; once home, trim off the extra fat, rub it with salt (preferably kosher salt) and stick it in the fridge for up to 24 hrs. But you can also cook it immediately.
1 tsp oregano
3 bay leaves
pinch of cumin
5-6 Garlic cloves, peeled and chopped
2 cinnamon sticks
A little oil for browning
Cut the salted and trimmed pork shoulder into 3-5 inch pieces. Heat the oil (you don't need much) in a thick-bottomed skillet/pot and brown the pork pieces, ideally until they are quite dark. Make sure the skillet is large enough that all the meat fits in one layer. If it doesn't fit, use 2 skillets. Once the pork is browned, add enough water to come a little more than halfway up the sides of the chunks of pork, but make sure they are not totally submerged. Scrape any browned bits from the bottom of the pan and add all the spices, bay leaves, garlic and cinnamon sticks. You can substitute half a bottle of beer for some of the water (your favorite kind; drink the rest). Bring to a boil and then reduce the heat to a low simmer. Let the meat braise, uncovered, for 3-4 hours, turning the pork over occasionally (say every hour or so). By the end, most of the liquid should have evaporated and the pork will be falling apart. If the water level drops early on, you can add a little more water (or beer) as you go along to prevent sticking. Remove the pork from the pan, shred it into large-ish chunks, and cut off any remaining bits of fat. Return everything to the pan. If the remaining pan juices are too fatty you can skim some of the fat off too. Turn up the heat and finish cooking the pork until it gets crispy and the liquid thickens. This'll feed about 8 people.
And that's it! Serve with warm tortillas, and some rice, black beans, guacamole and green salsa on the side. If this isn't barbecue-ish enough, buy a few skirt or flank steaks, add salt (and only salt) and grill them as a side dish. Buen provecho!
I was really fascinated by this amazing set of close-up pictures of various eyes; I just had to link to it. Can anyone spot the Crypts of Fuchs?
(link via BB)
An interesting paper was recently published in PLoS ONE, which follows from a long line of experiments regarding how brain cells encode information. But before I get into it, and I want to get into it, let me provide a little context. One thing that makes neurons special among different cell types is that they can generate electrical impulses called action potentials, or spikes. Several other cell types can also generate electrical impulses, such as muscle cells in the heart and skeletal muscle, and even some plant cells. In the canonical view, a neuron receives inputs from other neurons, and if these inputs are sufficiently large, they will cause the neuron to generate an action potential. The action potential then travels down a specialized protrusion called an axon and ends at what is known as a synaptic terminal. The action potential causes the release of chemicals called neurotransmitters from the synaptic terminal. Neurotransmitters then activate other neurons, causing these to generate more action potentials and ultimately communicate with more downstream neurons. Furthermore, axons can be very long – the nerves that run throughout the body are bundles of many axons and can be more than a meter long. In order for action potentials to travel down nerves, they are continually regenerated by a series of proteins called ion channels.
One important aspect of how neurons encode information is that the strength of a stimulus is represented by the rate of action potentials – a stronger stimulus will cause a neuron to fire action potentials more frequently. This is known as rate coding and was first described by Edgar Adrian, also known as Lord Adrian (back in the days when science was often done by aristocratic types who could fund their own work), in the 1920s. Adrian used a device (a triode thermionic valve amplifier and a capillary electrometer, for aficionados) in which he mounted an isolated muscle from a frog leg with the sensory nerve still attached. Muscles have all sorts of sensory receptors that help them detect things like stretch or pressure, and this information is conveyed to the brain via sensory nerves attached to these receptors. In Adrian's experiment, he recorded electrical impulses from the sensory nerve while he hung different weights from one end of the muscle, creating increasing amounts of tension. Here's his diagram of the device:
What he found was that as he stretched the muscle, he could record discrete electrical impulses in the sensory nerve, and that the rate of these impulses increased as he increased the size of the weight. He therefore concluded that information about tension was being conveyed by the rate of these impulses, and that stronger stimuli resulted in faster generation of spikes. He also made a point of noting that these spikes were discrete events, mostly about the same size, and were therefore likely to be all-or-nothing events. A few years later the experiments of Hodgkin and Huxley described the ionic basis of these action potentials, furthering the view that action potentials are discrete events and that neurons communicate with each other by varying the rate at which action potentials are generated. If a neuron fired more action potentials it would release more neurotransmitter; if it fired fewer, it would release less. This became one of the fundamental tenets of neuroscience.
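Rate coding is simple enough to sketch in a few lines of code. Here's a toy simulation of my own making (not Adrian's actual data or analysis, and the numbers are made up for illustration) in which a sensory neuron fires random spikes at a rate proportional to the applied tension:

```python
import random

def spike_count(tension, gain=2.0, duration=1.0, dt=0.001, seed=0):
    """Toy rate-coding model: the neuron fires Poisson-like spikes at a
    rate proportional to stimulus strength (here, muscle tension)."""
    rng = random.Random(seed)
    rate = gain * tension  # firing rate in spikes per second
    count = 0
    for _ in range(int(duration / dt)):
        # probability of a spike occurring in each small time bin
        if rng.random() < rate * dt:
            count += 1
    return count

# Heavier weights -> higher spike rates, as Adrian observed
for weight in (10, 50, 100):
    print(weight, spike_count(weight))
```

The stimulus intensity is carried entirely by how many spikes occur per second, not by the size of any individual spike – that's the essence of the rate code.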
However, nothing is ever as clear-cut. If you ask anyone who performs electrophysiological recordings from individual neurons, they will tell you that although action potentials are discrete events, they can vary widely in size and shape within a single neuron. For example, when you cause many neurons to fire a train of action potentials, you will sometimes see that the size of the action potentials diminishes the further along you are in the train. This led people to speculate that it might not just be the rate of action potentials that affects the output of the neuron, but also their size and shape. In 2006 a pair of papers from independent labs came out in Science and Nature showing that, in fact, smaller action potentials could result in less neurotransmitter release and bigger, longer-lasting action potentials could result in more neurotransmitter release. Thus the coding of information was not just digital, but could have a graded, analogue component. These papers received a lot of attention because they seemed to overturn one of the long-held dogmas in neuroscience.
OK, now back to the PLoS paper. In this paper, Chen and colleagues essentially reach the completely opposite conclusion from the Science and Nature articles. They show that even if they record action potentials of different sizes and shapes in the cell body of a neuron, the output of the neuron, as measured by the release of neurotransmitter, does not correlate with action potential size. To find out why, they record electrical activity from within a neuron in the cell body and at the tip of the axon simultaneously. What they found was that although action potential size can vary significantly within the cell body, by the time the action potentials reach the end of the axon they are all roughly the same size. This suggests that there is something the axon does to normalize the size of the action potentials. The authors go on to demonstrate that this is due to specific properties of the ion channels in the axon that allow action potentials to be regenerated as all-or-nothing events. If you look at the figure below, in panel A you see action potentials (the two bumps) recorded in the same cell over several trials; notice how the second one varies a lot in size. In panel B, notice that for the most part, the size and shape of the second action potential is the same across all the trials.
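A cartoon version of this normalization might look like the sketch below. This is my own toy illustration, not the paper's model, and the threshold and amplitude values are invented for the example: somatic spikes arrive with variable amplitudes, but anything that clears the axon's regeneration threshold emerges at the terminal as a stereotyped, full-sized event.

```python
def axonal_output(somatic_amplitudes, threshold=20.0, full_size=100.0):
    """Toy model of axonal spike normalization: spikes of varying somatic
    amplitude are regenerated as all-or-nothing events of uniform size,
    provided they exceed the axon's regeneration threshold."""
    outputs = []
    for amp in somatic_amplitudes:
        if amp >= threshold:
            outputs.append(full_size)  # regenerated to uniform amplitude
        else:
            outputs.append(0.0)        # subthreshold: fails to propagate
    return outputs

# Somatic spikes of wildly different sizes...
print(axonal_output([35.0, 80.0, 55.0, 10.0]))
# → [100.0, 100.0, 100.0, 0.0] – uniform at the terminal, or nothing at all
```

The point of the cartoon is that the variability seen at the soma is erased before it can affect neurotransmitter release, which is why release tracks spike rate rather than spike size.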
Thus, the authors conclude that neurons have a built-in process by which they can amplify even small action potentials in the cell body and faithfully transmit them down the axon. This not only ensures that rate coding can work efficiently, but also eliminates some of the variability in the output of the cell that could be introduced by action potentials of different sizes. As I said before, this is exactly the opposite conclusion from the 2006 papers. This doesn't mean that those papers are wrong; what might vary is the type of cell recorded from, the exact recording conditions, or the age of the animals used. What surprises me is that the authors of the PLoS paper only mention the other papers in passing in the introduction, even though their work is directly relevant. It also surprises me that they published this in such a low-impact journal. The experiments seem well done and there is a lot of work and novelty in them. This may go to show that if a lab is not well known, its work will not get published in a high-visibility journal. Finally, I think this set of papers illustrates how even concepts that are accepted as central tenets can change as different experimental techniques become available, and that the controversies that were relevant 90 years ago are still alive and well, and like all good scientific theories, they evolve over time.
Chen, N., Yu, J., Qian, H., Ge, R., & Wang, J. (2010). Axons Amplify Somatic Incomplete Spikes into Uniform Amplitudes in Mouse Cortical Pyramidal Neurons. PLoS ONE, 5(7). DOI: 10.1371/journal.pone.0011868
Adrian, E.D. (1926). The impulses produced by sensory nerve endings: Part I. J Physiol, 61(1): 49-72.
I have to say that unlike many people who write about graduate and postdoctoral life, I really enjoyed being a graduate student and postdoc. I consider these to have been highly productive and formative years and I always look back on them fondly. It may be that I happened to have two very good mentors, who ran small to medium-sized labs and who were willing to give me almost complete independence to pursue my projects. However, they were also very accessible: if I got stuck or needed to discuss data or a next step, they were always available to talk and their doors were always open. Since each person in the lab was working on a different project, there was no internal competition and there was a lot of collaboration. People got along and were willing to teach each other how to perform different types of experiments and to set up collaborations amongst themselves. Unlike labs where John is the specialist in anatomy, Ringo is the molecular biologist, George is the electrophysiologist and Paul is the computer whiz, and everyone contributes their specialty to the common project, in the labs I worked in everyone had their own project and learned to perform all or most of the techniques necessary. Sort of like the difference between the A-Team and MacGyver. While these types of labs might not be ideal for someone who is not self-driven, for the most part I think most of my colleagues would feel the same way about their experiences there, and I try to model the mentoring style in my lab after both of my previous mentors.
So while environment played a huge part in it, what I liked most about both graduate school and my postdoc was the feeling of starting something new. You get to pick an interesting problem, and you have to do whatever it takes to solve it. In front of you is a completely open field, you can go anywhere, and you have the resources provided for you to do this. You have to figure out which experiments you need to do, which techniques you have to use and, if none are available, which ones you have to develop. You know that anywhere you go will be someplace new, that you are discovering things that have never been known, that for a little while you, and maybe a couple of other people, are the only ones who have seen the new knowledge your data represent. And this is a really cool feeling. By the time you are done, you are the world expert on that little subfield, but more importantly you have learned how to ask the questions, solve the problems, seek help when necessary and interpret your results. Now you can do anything. And then, just as you finish your PhD and are starting to get tired of your project, as a postdoc you get to start over with something new! Another open field, a new pasture! But this time you are wiser and smarter.
This feeling of elation was not quite as fulfilled when I started my own lab. Finally you are at a point where you are completely independent and a new adventure begins. But unlike being a postdoc or a graduate student in a well-funded, established lab with ample resources, starting your own lab is like becoming captain of a creaky old ship, partially sunk, with an inexperienced and somewhat dysfunctional crew and no compass. Ahoy! So while in principle you can sail in any direction you want, most of your time is spent keeping the ship afloat and the crew from killing each other. Trying to recruit competent people, writing grants to fund your research, finding your place in the field and in your institution, teaching, filing animal protocols and fighting with the grants office leave you numb and tired, with little energy and excitement left for the actual science. So you creak along, following the default course you set off on as you finished your postdoctoral training, watching all your exciting new ideas and directions get shot down by grant reviewers or scooped by competitors. And then you get very, very depressed. But then things pick up, papers come out, they start to improve in quality, your crew resolves its issues, the bilge pumps get repaired, you get new sails, you get funded, students stop complaining about your lectures and off you go. And now you have the resources and the crew to take your lab wherever you want. You are like a cow (going back to that analogy – the ship one was getting a bit cliché) in a fresh pasture. New grasses and weeds to munch on, lots of space to spread your manure around! Moo!
And that is sort of the point I am reaching right now in my career. The lab seems to be running OK and I'm getting excited about some new projects and directions we are taking. Next week I will be going to an epilepsy meeting at which I was invited to speak. While my research has always skirted issues that may be relevant for understanding epilepsy, I was never really in the epilepsy field, so I was quite excited when I got invited to present at this meeting. I am looking forward to learning a lot more about this field, meeting new people, and hopefully coming up with new ideas for projects and collaborations we can embark on. And this feeling of an open pasture is coming back, my four-chambered stomach churning, the cud-chewing accelerating. Let's just hope I get tenure...