Science Blog (Weebly)
Sat, 06 Feb 2016

Asparagus pee
Thu, 05 Nov 2015
http://www.jessicapjohnson.com/science-blog/asparagus-pee

Asparagus was dinnertime torture for me as a kid, but my mom loved it and served it frequently. Somehow she was able to overlook its limp, grey sliminess as she emptied it from cans covered in pictures that bore no resemblance to the actual contents. Dinnertime negotiations ensued, but they always ended with me gulping down a few bites while holding my nose with one hand.
My aversion to asparagus was so strong that at least two decades passed before I gave it a second chance. During dinner at a friend’s house one night, I discovered that not only does asparagus not have to come from a can, but it can be delicious. It wasn’t long before I also discovered that my new love came with an unexpected digestive consequence – asparagus pee.
The urinary odor is often described as similar to that of cooked cabbage or reconstituted asparagus, and some people report smelling it as soon as 15 minutes after eating asparagus. The French novelist Marcel Proust had a different take. Asparagus "...transforms my chamber-pot into a flask of perfume," he wrote. I don’t know what perfumes Proust was used to smelling, but I think most people today would disagree with his characterization.
The asparagus pee phenomenon may not be popular dinner conversation, but a 2010 study by scientists at the Monell Chemical Senses Center in Philadelphia, Penn., suggests that at least 94 percent of you have experienced it. Marcia Pelchat and colleagues found that the vast majority of test subjects could smell asparagus odor in their own urine or the urine of others. Only six percent of people lacked the ability to smell it, and some lacked the ability even when they produced the odor in their own urine. Only 8 percent of subjects did not produce the odor at all.
Ultimately, Pelchat identified a single gene – an olfactory receptor gene – that is linked to people’s ability to smell or not smell the asparagus odor. As for what causes the odor itself – scientists still don’t know. But if you’re one of the few who has never experienced the smell of asparagus pee, count yourself lucky. Take it from someone who inherited the asparagus smelling gene – you’re not missing much.
Tracking Disease with Unusual Tools: How the Internet Could Speed Detection of Disease Outbreaks
Sat, 10 Dec 2011
http://www.jessicapjohnson.com/science-blog/tracking-disease-with-unusual-tools-how-the-internet-could-speed-detection-of-disease-outbreaks

When a cholera epidemic struck the London neighborhood of Soho in 1854, physician John Snow hit the streets, knocking on the doors of the dead. He asked surviving family members about the victims’ daily routines so he could identify similarities among them that might point to the cause of the disease. He began to suspect that a common source of drinking water was the key, and when he drew a map showing which city water pump residents used, he finally discovered the pattern he was looking for—the dead had all lived close to the same pump. By the time Snow convinced the city council to disable the pump by removing its handle, nine days after the outbreak began, the disease had claimed over 500 lives and 75 percent of the neighborhood’s residents had fled. Snow lamented the fact that his efforts had done little to diminish a death toll that would ultimately reach 616.

Over a century and a half later, epidemiologist John Brownstein doesn’t have to knock on a single door. Rather, he tracks the spread of infectious diseases by analyzing the symptoms that people search for on the Internet. Brownstein’s photo on the Children’s Hospital Boston website shows a serious-looking man in his mid-30s with a professional haircut, wearing a button-down shirt and tie. But on this early fall morning, he’s in jeans and shaggy-haired, drinking from a jumbo cup from Subway. His relaxed look and demeanor belie the seriousness of his work—spotting disease outbreaks before they become epidemics.
Brownstein makes maps too, but his are different from the one that John Snow created, and even from those created just a few years ago by the Centers for Disease Control and Prevention (CDC), the nation’s most reliable source for up-to-date disease information. Unlike theirs, Brownstein’s maps cut out the middleman. Rather than wait for public health officials to confirm disease cases and the beginning of an outbreak, a process that can take weeks, Brownstein’s team strives to get ahead of the spread by collecting data from what he calls “informal” sources. Formal sources include laboratory tests and physician-confirmed clinical cases. But the reality is that many people experience symptoms long before they ever get tested or seek treatment. Brownstein’s project—HealthMap.org—collects data from news reports, emergency room visits, and self-reported symptoms. The maps are updated hourly and show the progress of a disease as it spreads.

As a disease travels through a population, the number of people affected over time looks like a bell curve. Early on, only a few people fall ill each day, but as time passes, the number of new victims climbs. At its peak, the population is well-infiltrated and there are fewer people left to get sick, so the disease slows down, infecting fewer and fewer people each day until it trails off to pre-outbreak levels. Though there is as yet no crystal ball to predict outbreaks, early detection could dramatically flatten and shorten the bell curve, reducing both the peak number of infected individuals and the duration of the outbreak. Early detection means early public warning: physicians prepare themselves to treat symptoms, tell the sick to stay indoors, and tutor the healthy in how to avoid contracting the disease, including by getting vaccinated. These efforts can’t eliminate outbreaks, but they can reduce illness and death.
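The rise-and-fall curve described above is the classic epidemic curve, and a minimal SIR (susceptible-infected-recovered) simulation reproduces its shape. The population size and the transmission and recovery rates below are illustrative values for the sketch, not numbers from HealthMap or the CDC:

```python
# Minimal SIR epidemic sketch: daily new infections trace a bell-shaped curve.
# beta (transmission rate) and gamma (recovery rate) are illustrative values.

def sir_daily_new_cases(population=100_000, beta=0.4, gamma=0.2, days=120):
    s, i, r = population - 1.0, 1.0, 0.0  # one initial case
    new_cases = []
    for _ in range(days):
        infections = beta * s * i / population  # new infections today
        recoveries = gamma * i
        s -= infections
        i += infections - recoveries
        r += recoveries
        new_cases.append(infections)
    return new_cases

curve = sir_daily_new_cases()
peak_day = curve.index(max(curve))
print(f"Outbreak peaks around day {peak_day}")
```

Lowering `beta` (what early warnings and vaccination effectively do) both flattens the peak and stretches out the curve, which is exactly the effect early detection aims for.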

For the last ten years, the CDC has also been experimenting with the use of informal disease reports to improve disease surveillance, but Brownstein’s team is pushing things one step further. He has just completed his second test run of a new tool for the early detection of contagious disease—tracking Google searches. Brownstein and his colleagues partnered with the CDC and Google.org, the company’s philanthropic arm, to analyze disease symptoms typed into Google search and look for patterns that matched the actual symptoms of a particular disease. They began with flu tracking and found that spikes in searches for symptoms like fever, nausea, and body aches correlated with 90 percent accuracy to confirmed cases of flu from formal reporting sources. Brownstein’s team developed a mathematical algorithm that collects and analyzes search terms as people enter them, allowing potential flu cases to be identified on the spot. And because searches are associated with an IP address, the searcher’s location can be pinpointed down to their zip code. Google strips the search data of all other identifying information in order to protect individuals’ privacy.
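The 90 percent figure describes how closely the two time series move together. Here is a toy sketch of that comparison using a Pearson correlation; all of the weekly counts are invented for illustration, and the real Flu Trends model is far more elaborate, regressing many search terms against CDC surveillance data:

```python
# Toy sketch: how closely do weekly symptom-search counts track weekly
# confirmed flu cases? The numbers below are invented for illustration.
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

confirmed_cases  = [12, 18, 30, 55, 90, 140, 120, 80, 45, 20]
symptom_searches = [300, 420, 700, 1200, 2100, 3000, 2700, 1800, 1000, 500]

r = pearson(symptom_searches, confirmed_cases)
print(f"correlation: {r:.2f}")
```

A correlation near 1.0 means search volume can stand in for case counts almost in real time, which is the whole appeal: search data arrives immediately, while formal confirmation can take weeks.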

When search criteria match flu symptoms, a pinpoint is added to Brownstein’s map. If a large number of cases cluster in one area, it signals the beginning of an outbreak. “By detecting disease early, you can buy yourself a week or two to get ready,” says Taha Kass-Hout, a deputy director with the CDC’s Public Health Surveillance Program Office. “You can catch up with your resources and direct your public policy and communication to warn and treat the public. You can delay the disease’s progression.” In 2008, Google Flu Trends launched its first map based on Brownstein’s work with the search term data. The CDC has incorporated Flu Trends into what Kass-Hout calls its “surveillance mosaic,” using the new technique in combination with other informal and formal reports. He says that the public also benefits by having direct access to up-to-date disease information, empowering them to make important health decisions. The flu tracking tool is so new that the CDC has not yet calculated the number of cases of illness prevented, “but we do see great potential in the new technology,” says Ashley Fowlkes, an epidemiologist with the CDC’s National Center for Immunization and Respiratory Disease, Influenza Division.

Last year Brownstein’s team began developing a similar tool to track Dengue fever. The CDC estimates that one-third of the world’s population is at risk of contracting Dengue, a mosquito-borne illness characterized by sudden high fever, severe headache and pain behind the eyes, rash, muscle and joint pain, and bleeding. Though the disease is tropical and therefore rare in the continental United States, it is on the rise even here as mosquito populations move north due to climate change. In 2010, the CDC reported the first US case of Dengue since 1945, in Florida. Dengue virus mutates as it is carried by mosquitoes from person to person, and this constant transformation of the virus makes vaccine development very challenging. But the sooner researchers can reach an infected population and study the disease, the better the chance of developing an effective vaccine. Earlier this year, Brownstein’s team showed that the algorithm used to track flu also works with Dengue fever, and Google launched Dengue Trends. “These tools do not predict whether or not a disease will hit, but they do detect its progress in real-time,” says Corrie Conrad, a program manager at Google.org who worked on the Flu Trends team.

Though Brownstein’s new tracking techniques can reduce the lag time between the onset and detection of an outbreak, all parties agree that they will never replace formal reporting methods. The reason is the informal nature of the data behind Google-based early detection. Suspected disease cases are never confirmed by a physician, and not all symptom searches indicate actual illness, so not all of the pinpoints on Brownstein’s maps will be true cases of disease. For example, you never know whether someone is searching out of panic or looking up the symptoms of a relative in a distant country, says Justin Stoler, a spatial epidemiologist in the geography department at San Diego State University. For this reason, Brownstein’s team monitors their data to minimize erroneous outbreak spikes. False spikes in search term frequency can be distinguished from an actual outbreak spike because the search term's rate of increase is much faster than the rate at which the disease normally spreads through the population. In a sense, panic is a much faster-moving virus than flu.
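That rate-of-increase heuristic can be sketched in a few lines. The growth threshold and the daily counts below are invented for illustration; they are not HealthMap's actual filtering method:

```python
# Sketch of the rate-of-increase heuristic: panic-driven search spikes
# grow far faster than a real outbreak plausibly can.
# The day-over-day growth threshold is an invented, illustrative value.

def looks_like_panic(daily_counts, max_daily_growth=1.5):
    """Flag a spike whose day-over-day growth exceeds what an
    outbreak could plausibly produce."""
    for prev, curr in zip(daily_counts, daily_counts[1:]):
        if prev > 0 and curr / prev > max_daily_growth:
            return True
    return False

outbreak_spike = [100, 115, 135, 160, 190, 230]    # ~15-20% daily growth
panic_spike    = [100, 110, 600, 2500, 2000, 900]  # news-driven surge

print(looks_like_panic(outbreak_spike))  # False
print(looks_like_panic(panic_spike))     # True
```

A real system would need to be subtler, which is White's point below: in real time, a fast-but-genuine outbreak and a slow-building panic can look alike for days.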

The problem is, when you’re monitoring the data in real time, it may take a while to see the difference in rate between an outbreak spike and a panic spike, says Laura White, associate professor of biostatistics at Boston University’s School of Public Health. “You don’t want to be constantly sounding the alarm because people may stop listening.” White also points out that flu symptoms overlap with those of many other diseases, so flu is often hard to distinguish from a common cold, especially without a doctor’s diagnosis. “It’s a good monitoring tool for the flu,” she says, “but I haven’t seen a lot of evidence yet that it’s a good outbreak detection tool.”

Another problem is that not everyone has access to the Internet. While 65 percent of Europeans and 79 percent of Americans are online, only 9.6 percent of Africans are, the International Telecommunication Union reported last year. But limited Internet access will not be a hindrance for long, says Kass-Hout. Advancing mobile phone technologies are the key. Over 90 percent of the world’s population has access to mobile phone networks, and nearly three quarters of all subscribers live in the developing world. Of course, a cell phone subscription does not guarantee access to the Internet. For that, one needs the money to pay for both a smartphone and a monthly data plan. Nor do all countries have access to a 3G network, the minimum technology required to support mobile Internet access. But at least 130 of the world’s 196 countries do, and that number continues to grow. Mobile subscriptions in developing countries cost one tenth as much as fixed Internet connections, and the infrastructure is far easier to install, so the potential exists for even the poorest individuals in the least-developed countries to get online in the very near future. Just as with fixed Internet searches, location can be linked to mobile searches by tracking the cell towers the phone is using at the time of the search.

Until then, even mobile subscribers without smart phones can benefit from instant access to public health information. During the 2009 H1N1 swine flu pandemic, residents of many Mexican towns received text messages from public health officials asking whether they were experiencing flu symptoms. Many people responded and officials were able to track the pandemic’s progress through Mexico. In return, the ill were sent medical advice on how to treat their symptoms. “I think that these new tools have a lot of promise,” says White, “but there are still a lot of hurdles to overcome since this data can be challenging to work with. That said, I think that innovative approaches, such as this, are the way we need to go in surveillance.”

Kass-Hout agrees that Brownstein’s flu and Dengue tracking techniques are works in progress. But as they evolve and improve, he says, other Internet-based data sources, such as social media sites, could be used to track and target even non-contagious public health problems. Brownstein and the CDC are already experimenting with using these data sources to track depression and mental illness. And Marcel Salathé, a computer scientist and biology professor at The Pennsylvania State University, is interested in behavioral patterns. He recently showed that Twitter updates could be used to track anti-vaccination sentiment in the US during the 2009 flu pandemic. “I don’t want to stretch the analogy too far, but behaviors can sometimes spread as if they were a disease,” he said.
Conducting Properties in Bacterial Nanowires Discovered
Mon, 08 Aug 2011

So-called "microbial nanowires" in the bacterium Geobacter sulfurreducens can transport electrons over long distances. This property could be a game-changer for nanotechnology and bioelectronics by providing a source of lower-cost, nontoxic conductive materials. Read the news coverage of the Nature Nanotechnology research paper that I co-authored.
Herd Immunity: It's Not About You, It's About Your Grandmother
Wed, 09 Feb 2011
http://www.jessicapjohnson.com/science-blog/herd-immunity-its-not-about-you-its-about-your-grandmother

Whooping cough cases in California last year reached numbers not seen since 1947, when vaccination for this disease was first implemented on a large scale. This and other localized outbreaks of measles and bacterial meningitis across the U.S. have the medical community scrambling to get people vaccinated. Vaccines prevent the spread of contagious bacterial or viral infections. When enough people in a population are vaccinated, even unvaccinated individuals enjoy a lower risk of contracting the disease. How does this work? And how many people need to be vaccinated in order for others to be protected?
The phenomenon is called “herd immunity.” Most people don’t see themselves as members of a herd. And yet, it’s our herd (or community) lifestyle that makes us prone to passing disease to our fellow herd-members. But when the majority of a population is vaccinated, the chance of an unvaccinated individual encountering an unvaccinated, infected person is very small. Herd immunity can slow the spread of disease and sometimes completely eliminate it: smallpox, for example, was eradicated worldwide in 1979 through intense vaccination efforts.

Herd immunity protects those too young or frail to be vaccinated, a group that includes infants, chemotherapy patients, organ transplant recipients, the elderly, and people with immune diseases such as AIDS.  

To achieve herd immunity, the number of vaccinated individuals in a population must reach a critical mass. The Centers for Disease Control sets different vaccination goals for each disease depending on how contagious it is and how effective the vaccine for it is. For example, whooping cough is highly contagious and the vaccine is not very good, so the goal is to have 92 to 94 percent of the population vaccinated. In contrast, mumps is less contagious so only 75 to 86 percent of the population needs to be vaccinated in order to protect unvaccinated individuals.
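Those disease-specific goals reflect a standard piece of epidemiological arithmetic, not spelled out in the post: if each case infects R0 others in a fully susceptible population, herd immunity requires vaccinating at least a fraction 1 - 1/R0, and more when the vaccine is imperfect. The R0 values below are illustrative textbook ranges, not official CDC estimates:

```python
# Standard herd-immunity arithmetic (a textbook formula, not from the
# CDC figures above): with basic reproduction number R0 and vaccine
# effectiveness E, the critical vaccinated fraction is (1 - 1/R0) / E.
# The R0 values below are illustrative, not official estimates.

def critical_vaccination_fraction(r0, effectiveness=1.0):
    """Fraction of the population that must be vaccinated for herd immunity."""
    return (1 - 1 / r0) / effectiveness

# Whooping cough is far more contagious (R0 often cited near 12-17)
# than mumps (R0 near 4-7), hence the higher vaccination goal.
for disease, r0 in [("whooping cough", 14), ("mumps", 5)]:
    print(f"{disease}: vaccinate at least {critical_vaccination_fraction(r0):.0%}")
```

With these illustrative R0 values the sketch prints thresholds of 93 percent and 80 percent, which land close to the CDC's 92-to-94 and 75-to-86 percent goals; plugging in an effectiveness below 1.0 pushes the required coverage even higher.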

Whooping cough vaccination in the United States has met or exceeded the minimum herd immunity levels since 1994. But localized outbreaks of the disease show what happens when immunization drops below critical mass. In the last year, the number of whooping cough cases in California exploded to four times that of the previous year. A healthy adult can usually survive the “100-day cough” typical of the disease, but young or unhealthy people risk death. As of October 2010, ten infants had died in California from this vaccine-preventable disease.

The primary cause of these outbreaks, experts say, is a trend toward fewer childhood vaccinations and the failure of adults to get their booster shots. In California, vaccination rates fell to a level that compromised herd immunity. The vulnerable population, including infants too young to be vaccinated, was left unprotected.

Health officials say that fear over vaccine safety is a leading cause of the drop in vaccinations. A 1998 study in The Lancet claimed to link the measles-mumps-rubella (MMR) vaccine to autism. Although the study was widely criticized, it wasn’t officially retracted until 2010. In the meantime, some parents refused to vaccinate their children against MMR and other vaccine-preventable diseases. This has led to outbreaks of whooping cough, measles, and bacterial meningitis in communities across the U.S.

Thanks to Dr. Christopher Gill, an infectious disease specialist with the Boston University School of Public Health, and to Thomas Skinner, Press Officer specializing in immunity and vaccinations with the Centers for Disease Control and Prevention.

Got Biscuits? My new little brother's disgusting behavior.
Mon, 22 Nov 2010
http://www.jessicapjohnson.com/science-blog/got-biscuits-my-new-little-brothers-disgusting-behavior
Say hello to Petey, the little mutt who has officially replaced my brother and me in my dad’s heart.  We were never this adorable. 

They got him as a puppy last August.  My dad, never much of a cuddler with us kids, now carries Petey around the house speaking a type of man-baby talk that I don’t remember ever hearing come out of his mouth before.  He and Petey even take naps together every afternoon.

But on a recent visit, I discovered Petey’s fatal flaw.  His most favorite snack is…(pause for dramatic effect)…his own poo.

“Right on,” I thought.  “I never did that!”  I may still win this popularity contest.

You really have to keep an eye on the guy when you take him out for walks.  He’s smart.  He’ll run to a spot where you can’t see him, do his business, and then munch away on it directly.  If I’m making you a little sick with my description, I’m sorry, but imagine my pain in having to witness this little horror scene every day of my two-week visit. 

My dad is sad that he can’t let Petey lick his face anymore, knowing where his mouth has been.  So when I recently asked my family to give me a list of science questions they want to learn more about, Petey’s taste in snacks was at the top of their list.

The technical term is “autocoprophagy,” eating one’s own feces. A number of animals do this, and generally the purpose seems to be to gain additional nutrients. Rabbits, for example, have inefficient digestive systems, so they send their food through twice, eating the soft first-pass pellets but never the hard waste from the second pass.

Autocoprophagy and coprophagy (that is, the eating of other animals’ feces) by dogs is not well-studied and therefore not well-understood.  Veterinarian Erik Hofmeister and colleagues at Washington State University list several theories as to why dogs may do this:

1) Dietary deficiency.
2) The dog is fed too frequently or too infrequently.
3) Medical problems such as exocrine pancreatic insufficiency, pancreatitis, intestinal infections, or malabsorptive syndromes.
4) Attention-seeking behavior: the dog gets attention from its owner by being reprimanded for eating feces.
5) Mimicry: the dog observes the owner picking up feces or sees other dogs eating it and mimics them.
6) Keeping the living space clean: this may arise if owners fail to clean up feces frequently enough.
7) Dominance behavior: some submissive dogs will consume the feces of dominant dogs in the household.
8) It simply tastes good.

While this behavior is disgusting for the owner to observe, the good news is that it does not harm the dog. 

Hofmeister recommends bringing the dog to a veterinarian in order to rule out any medical reasons for the behavior.  If there is no underlying medical cause, the only recommendations are to make sure that the dog’s living space is clean and to use positive reinforcement to train the dog not to eat feces. 

In other words, train the dog to eat the type of biscuits that come in a box, not from its own personal conveyor belt.  If Petey takes to his training and stops this gross habit, my brother and I will be forever in second place in the eyes of my dad.  But it’s hard to see Petey’s cute little face and be too upset about it.