Tracking the past of the ocean’s tiny organisms

Spotlight on Research is the research blog I author for Hokkaido University, highlighting different topics being studied at the University each month. These posts are published on the Hokkaido University website.

While many people head to the ocean to spot Japan’s impressive marine life, graduate student Norico Yamada had a more unusual goal when she joined the research vessel ‘Toyoshio Maru’ at the end of May. Norico was after samples of sea slime, which she has been collecting from around the world.

As part of Hokkaido University’s Laboratory of Biodiversity II, Norico’s research focusses on the evolution of plankton: tiny organisms that are a major food source for many ocean animals.

Plankton, Norico explains, are ‘Eukaryotes’, members of one of the three major domains into which all forms of life on Earth can be divided. To make it into the Eukaryote group, an organism’s cells must contain DNA bound up in a nucleus. In fact, the esoteric group name is just an allusion to the presence of the nucleus, since ‘karyon’ comes from the Greek meaning ‘nut’ or ‘kernel’.

The Eukaryote group is so large that it contains most of what we think of as ‘life’. Humans belong to a branch of Eukaryotes called ‘Opisthokonta’, a category we share with all forms of fungus. This leads to the disconcerting realisation that, to a biodiversity expert like Norico, the difference between yourself and a mushroom is small.

Of the five other branches of Eukaryote, one contains all forms of land plants along with the sea-dwelling green and red algae. Named ‘Archaeplastida’, these organisms all photosynthesise, meaning that they can convert sunlight into food. To do this, their cells contain ‘chloroplasts’, which capture the sunlight’s energy and change it into nutrients for the plant.

This seems very logical until we reach Norico’s plankton. These particular organisms also photosynthesise, but they do not belong to the Archaeplastida group. Instead, they are part of a third group called ‘SAR’, whose members did not originally have this ability; many acquired it later in their evolution. So how did Norico’s plankton gain the chloroplasts in their cells that let them photosynthesise?

Norico explains that chloroplasts began as free-living, single-celled bacteria known as ‘cyanobacteria’. Over a billion years ago, cyanobacteria started to be engulfed by other cells, living on inside their hosts’ outer walls. The two cells would initially exist independently in a mutually beneficial relationship: the engulfing cell provided nutrients and protection for the cyanobacteria, which in turn provided the ability to use sunlight as an energy source. Over time, DNA from the cyanobacteria became incorporated into the engulfing cell’s nucleus, creating a single larger organism that could photosynthesise. The result was the Archaeplastida group.

To form the photosynthesising plankton in the SAR group, this process was repeated at a later point in history. This time, the cell being engulfed was not a simple cyanobacterium but an Archaeplastida such as a red alga. The engulfing cell was already part of the SAR group, but it then gained the Archaeplastida’s ability to photosynthesise.

To understand this process in more detail, Norico has been studying a SAR group organism that seems to have undergone a relatively recent merger. Dubbed a ‘dinotom’, this plankton takes its name from its most recent heritage: a photosynthesising ‘diatom’ engulfed by a ‘dinoflagellate’. The merger of the two algae is so recent that the different components of the two cells can still be seen inside the dinotom, although they can no longer be separated to live independently.

From samples collected in the seas around Japan, South Africa and the USA, Norico identified multiple species of dinotom. Every dinotom was the product of a recent merger between a diatom and a dinoflagellate, but the species of diatom engulfed varied, giving different types of dinotom. As an analogy, imagine it were possible to merge a rodent and an insect to form a ‘rodsect’. One species of rodsect might come from a gerbil and a bee, while another might come from a gerbil and a beetle.

Norico identified the species by looking at the dinotom’s cell surface, which is covered by a series of plates that act as protective armour. By comparing the position and shape of the plates, Norico could match the dinotoms in her samples to species already known. However, when she examined the cells more closely, she found some surprises.

During her South Africa expedition, Norico had collected samples from two different locations: Marina Beach and Kommetjie. Both places are on the coast, but they are separated by approximately 1,500 km. An examination of the plates suggested that Norico had found the same species of dinotom in both locations, but when she examined the genes, she discovered an important difference: the engulfed diatoms that had provided the cells with the ability to photosynthesise were not identical. In our ‘rodsect’ analogy, Norico had effectively found two gerbil-beetle hybrids, but one contained a water beetle and the other a stag beetle. Norico therefore concluded that the Marina Beach dinotom was an entirely separate species from the Kommetjie dinotom.

This discovery did not stop there. Repeating the same procedure with dinotom species collected in Japan and the USA, Norico again found that the engulfed diatoms differed. In total, she found six species of dinotom in her samples, three of which were entirely new.

From this discovery, Norico concluded that the engulfing process that grants the ability to photosynthesise likely happens many times over the evolution of an organism. This means that the variety of species that can mix is much greater, since the merger does not happen at a single point in history. Multiple merging events also mean that an organism can sometimes gain, lose and then regain this ability during its evolution.

Next year, Norico will graduate from Hokkaido University and travel to Hyogo prefecture to begin her postdoctoral work at Kwansei Gakuin University, where she hopes to uncover still more secrets of these minute lifeforms that provide so much of the ocean’s diversity.

Fighting disease with computers

Spotlight on Research is the research blog I author for Hokkaido University, highlighting different topics being studied at the University each month. These posts are published on the Hokkaido University website.

When it comes to students wishing to study the propagation of diseases, Heidi Tessmer is not your average new intake.

“I did my first degree in computer science and a masters in information technology,” she explains. “And then I went to work in the tech industry.”

Yet it is this background in computing that Heidi wants to meld with her new biological studies, tackling questions that require the help of some serious data crunching.

Part of the inspiration for Heidi’s change in career came from her time working in the UK, where she lived on a sheep farm. It was there that she witnessed first-hand the devastating results of an outbreak of the highly infectious ‘foot and mouth’ disease. This particular virus affects cloven-hoofed animals, and outbreaks result in the mass slaughter of farm stock to stem the disease’s spread.

“The prospect of culling animals was very hard on the farmers,” she describes. “I wanted to know if something could be done to save animal lives and people’s livelihoods. That was when I began to look at whether computers could be used to solve problems in the fields of medicine and disease.”

This idea drove Heidi back to the USA, where she began taking classes in biological and genetic sciences at the University of Wisconsin-Madison. After improving her background knowledge, she came to Hokkaido last autumn to begin her PhD program at the School of Veterinary Medicine in the Division of Bioinformatics.

“Bioinformatics is about finding patterns,” Heidi explains.

Identifying trends in data seems straightforward enough until you realise that the data sets involved can be humongous. Heidi explains this by citing a recent example she has been studying, involving the spread of the influenza virus. While often no more than a relatively brief sickness in a healthy individual, the ease with which influenza spreads and mutates gives it the ongoing potential to become a global pandemic, bringing with it a mortality figure in the millions. Understanding and controlling influenza is therefore a high priority across the globe.

Influenza appears in two main types: influenza A and B. The ‘A’ type is the more common of the two, and its individual variations are named after the two proteins that sit on the virus’s surface. For example, H1N1 has a subtype 1 HA (hemagglutinin) protein and a subtype 1 NA (neuraminidase) protein on its outer layer, while H5N1 differs by having a subtype 5 HA protein.

The inner region of each influenza A virus contains 8 segments of the genetic encoding material RNA. Similar to DNA, it is this RNA that forms the virus’s genome and allows it to harm its host. To multiply, the virus takes over a normal cell in the body and injects its own genome, forcing the cell to begin making the RNA segments needed for new viruses. However, the process that gathers the RNA segments up into the correct group of 8 has been a mystery to researchers studying the virus’s reproduction.

In the particular case study Heidi was examining (published by Gog et al. in the journal Nucleic Acids Research in 2007), researchers proposed that this assembly could be performed using a ‘packaging signal’ incorporated into each RNA segment. The packaging signal would tell other RNA segments whether they belonged in the same group. If this signalling could be disrupted, the researchers proposed, then the virus would assemble incorrectly, potentially rendering it harmless.

This, explains Heidi, is where computers come in. Each RNA segment is made up of organic molecules known as ‘nucleotides’, which bunch together in groups of three called ‘codons’. The packaging signal was expected to be a group of one or more codons that were always found in the same place on the RNA segment, marking a crucial part of the encoding. To find it, codon positions had to be compared across thousands of RNA sequences. The job is far too laborious to do by hand, but it is a trivial calculation for the right bit of computer code. Analysing massive biological data sets efficiently in this way is the basis of bioinformatics.
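As a rough illustration of the idea, and not the actual analysis from the Gog et al. paper, a few lines of Python can scan a set of aligned RNA sequences and flag the codon positions that never vary. The sequences below are invented for the example.

```python
from collections import Counter

def codons(seq):
    """Split an aligned RNA sequence into successive three-letter codons."""
    usable = len(seq) - len(seq) % 3
    return [seq[i:i + 3] for i in range(0, usable, 3)]

def conserved_positions(alignment, threshold=1.0):
    """Return (position, codon) pairs where the most common codon appears
    in at least `threshold` of the sequences."""
    conserved = []
    for pos, column in enumerate(zip(*(codons(s) for s in alignment))):
        codon, count = Counter(column).most_common(1)[0]
        if count / len(column) >= threshold:
            conserved.append((pos, codon))
    return conserved

# Three toy 'segments': only the first codon position is fully conserved.
segments = ["AUGGCAUUU", "AUGCCAUUA", "AUGGGAUUU"]
print(conserved_positions(segments))  # [(0, 'AUG')]
```

Run across thousands of real segment sequences, the same loop picks out candidate packaging-signal regions in seconds.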

In addition to the case above, Heidi cites the spread of epidemics as another area that is greatly benefiting from bioinformatics analysis. Using resources as common as Google, new cases of disease can be compared with historical data and even weather patterns.

“The hardest part about bioinformatics is knowing what questions to ask,” Heidi concludes. “We have all this data which contains a multitude of answers, but you need to know what question you’re asking to write the code.”

It does sound seriously difficult. But Heidi’s unique background and skill set is one that just might turn up some serious answers.

Reactions in the coldest part of the galaxy

Spotlight on Research is the research blog I author for Hokkaido University, highlighting different topics being studied at the University each month. These posts are published on the Hokkaido University website.

Professor Naoki Watanabe with graduate student, Kazuaki Kuwahata.

Within the coldest depths of our galaxy, Naoki Watanabe explores chemistry that is classically impossible.

At a staggering -263°C, gas and dust coalesce into clouds that may one day birth a new population of stars. The chemistry of these stellar nurseries is therefore incredibly important for understanding how stars such as our own Sun were formed.

While the cloud gas is made primarily from molecular (H2) and atomic (H) hydrogen, it also contains tiny dust grains suspended like soot in chimney smoke. It is on these minute surfaces that the real action can begin.

Most chemical reactions require heat. In a similar way to being given a leg-up to get over a wall, a burst of energy allows two molecules to rearrange their bonds and form a new compound. The minimum energy needed is known as the reaction’s ‘activation energy’. The problem is that at these incredibly low temperatures there is very little heat on offer, and yet, mysteriously, reactions are still happening.
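To put a rough number on the problem (my own illustration, not a figure from the interview): the rate of a thermally driven reaction typically follows the Arrhenius law,

\[ k = A \, e^{-E_a / (k_B T)} \]

where \(E_a\) is the activation energy, \(T\) the temperature and \(k_B\) Boltzmann’s constant. At a cloud temperature of 10 K, even a modest barrier of 0.1 eV gives an exponential factor of about \(e^{-116}\), which is zero for all practical purposes. Thermally driven chemistry should be frozen solid.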

It is this problem that Professor Naoki Watanabe in the Institute of Low Temperature Science wants to explore and he has tackled it in a way different from traditional methods.

“Most astronomers want to work from the top down,” Naoki describes. “They prepare a system that is as similar to a real cloud as possible, then give it some energy and examine the products.”

The problem with this technique, Naoki explains, is that it is hard to see exactly which molecule has contributed to forming the new products: there are too many different reactions that could be happening. Naoki’s approach is therefore to simplify the system to consider just one possible reaction at a time. This allows his team to find out how likely the reaction is to occur and thereby judge its importance in forming new compounds.

This concept originates from Naoki’s background as an atomic physicist, the field in which he did his graduate and postdoctoral work before moving into astrochemistry. Bringing together expertise from different areas is a primary goal for the Institute of Low Temperature Science, and Naoki’s group is the leading one of only three in the world working in this field.

Back in the laboratory, Naoki shows the experimental apparatus that can lower temperatures down to those found in the galactic clouds. Dust grains in the clouds are observed to be covered with molecules such as water ice, carbon monoxide (CO) and methanol (CH3OH). Some of these can be explained easily: for instance, carbon monoxide can form from a positively charged carbon atom, known as an ion (C+). The charge on the ion makes it want to bond to other atoms and become neutral, so no activation energy is required. A second possible mechanism is a carbon atom (C) attaching to a hydroxyl molecule (OH): a particularly reactive compound known as a ‘radical’ due to its possessing a free bond. Like the charge on the C+, this loose bond wants to be used, allowing the reaction to proceed without an energy input. These paths allow CO to form in the cloud even at incredibly low temperatures and freeze onto the dust.

However, methanol is more of a mystery. Since the molecule contains carbon, oxygen and hydrogen (CH3OH), it is logical to assume that CO forms first and hydrogen atoms then join in the fun. The problem is that adding a hydrogen atom to form HCO requires a significant activation energy, and there is no heat in the cloud to trigger that reaction. So how does the observed methanol get formed?

Naoki discovered that while the low temperature prevented HCO from forming through a thermal (heat-initiated) reaction, that same coldness opened the door to a different mechanism: quantum tunnelling. According to quantum mechanical theory, all matter can behave as both a particle and a wave. As a wave, your position becomes uncertain: your most likely location is at the wave peak, but there is a small chance you will be found to either side of that, in the wave’s wings.

Normally, we do not notice the wave nature of objects around us: their large mass and energy make their wavelengths so small that this slight uncertainty goes unnoticed. Yet in the incredibly cold conditions inside a cloud, the tiny hydrogen atom starts to show its wave behaviour. This means that when the hydrogen atom approaches the carbon monoxide molecule, the edge of its wave form overlaps the activation energy barrier, giving a chance that it may be found on the barrier’s other side. In this case, the HCO molecule can form without first acquiring enough heat to jump over the activation barrier; the atoms have tunnelled through.
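A standard textbook estimate (not a figure from Naoki’s lab) shows why hydrogen is the natural candidate. The probability of tunnelling through an energy barrier of height \(V\) and width \(d\) falls off roughly as

\[ P \sim \exp\!\left(-\frac{2d}{\hbar}\sqrt{2m(V-E)}\right) \]

for a particle of mass \(m\) and energy \(E\). Because the mass sits inside the exponential, light particles tunnel vastly more readily than heavy ones: hydrogen, the lightest atom of all, can sneak through barriers that stop heavier atoms dead.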

Such reactions are completely impossible in classical physics, and the chance of them occurring in quantum physics is not high, even at such low temperatures. This is where the dust grain becomes important. Stuck to the surface of the grain, a carbon monoxide molecule and a hydrogen atom can encounter one another many, many times, increasing the likelihood that a hydrogen atom will eventually tunnel and create HCO.

After measuring the frequency of such reactions in the lab, Naoki was then able to use simulations to predict the abundance of the molecules over the lifetime of a galactic cloud. The results agreed so well with observations that it seems certain that this strangest of mechanisms plays a major role in shaping the chemistry of the galactic clouds.

Are you likely to be eaten by a bear?

Spotlight on Research is the research blog I author for Hokkaido University, highlighting different topics being studied at the University each month. These posts are published on the Hokkaido University website.

“The Hokkaido Prefectural Government,” announced an article published on the Mainichi newspaper’s English website at the end of September, “has predicted that brown bears will make more encroachments on human settlements than usual this fall.”

While written in calm prose, it was a headline to make you hesitate before stepping outside to visit the convenience store: exactly how many bears were you prepared to face down to get that pint of milk?

Yet how serious was this threat, and what can be done to protect both the bears and the humans from conflict?

Professor Toshio Tsubota in the Department of Environmental Veterinary Sciences is an expert on the Hokkaido brown bear population. He explains that bears normally live in the mountains and only come into residential areas if there is a shortage of food. Sapporo City’s mountainous surroundings make it a popular destination when times are tough, and bears have been sighted as far into the city as the library in Chuo Ward.

The problem this year came down to acorns. Wild water oak acorns are a major source of food for bears in the autumn months, and this year the harvest was low. Despite the flurry of concern, a low acorn crop is not an uncommon phenomenon: the natural cycle of growth produces good years with a high yield of nuts and bad years with a lower count, in rough alternation. However, a bad-yield year does increase the risk of bear appearances in the city.

With food being the driving desire, bears entering residential streets typically raid trees and garden vegetables. The quality of your little allotment notwithstanding, bears will return to the mountains once the food supply improves, meaning their presence in the city is a short-lived, intermittent issue.

That is, unless the bears get into the garbage.

Garbage is high in nutrients and calories, and bears that begin to eat it will not return to their normal foraging habits. This is when more serious problems between humans and bears start to develop. Since yellow-bagged waste does not strongly resemble the bears’ normal food, they will not initially know that garbage is worth eating. It is therefore essential, Toshio says, to ensure your garbage is properly stored before collection.

In Japan, brown bears are found exclusively in Hokkaido, with Honshu hosting a black bear population. The brown bear is the larger of the two species, with females weighing in at around 150 kg and males at between 200 and 300 kg. The exact population number in Hokkaido is unknown, but estimates put it at around 3,000 – 3,500 bears on the island.

On the far eastern shore of Hokkaido is the Shiretoko World Heritage Site. Free from hunting, bears living in this area have little to fear from humans, making it the ideal location to study bear behaviour and habitat. It is here, able to approach within 10 – 20 m of the bears, that Toshio collects data for his research into bear ecology.

The autumn acorn feast is the final meal the bears will eat before hibernation. Hokkaido’s long winters are passed in dens, with bears beginning their deep sleep near the end of November and not emerging until April or May. In warmer climates, this hibernation time is greatly reduced: black bears on Honshu sleep for a shorter spell, and bears in southern Asia avoid hibernation altogether.

For bears that do hibernate, this sleepy winter is also the time when expectant mothers give birth. Despite this rather strenuous sounding activity, the mother bear’s metabolic rate remains low during this period, even while she suckles her cubs. This is in stark contrast to the first few months after a human baby’s arrival where the mother typically gets very little sleep at all.

Another feature human expectant mothers may envy is the bear’s ability to put her pregnancy on hold for several months. Bears mate in the summer, whereas the cubs are born in the middle of winter, making the official bear pregnancy clock in at 6 – 7 months. However, the foetus only actually takes about two months to develop: roughly the same timescale as for a cat or dog. This pregnancy pausing is known as ‘delayed implantation’; it stops the pregnancy at a very early stage, waiting until the mother has gained weight and begun her hibernation before allowing the cubs to develop.

Brown bears typically have two cubs per litter, and the babies stay with their mother for the first 1.5 – 2.5 years of their lives. With an infant mortality rate of roughly 40 – 50%, drops in the bear population are difficult to replenish. Toshio’s work in understanding brown bear ecology is therefore particularly important for preserving the species in Hokkaido.

As hunting drops in popularity, Hokkaido’s bear population has recovered, yet this recovery could be seriously damaged if humans and bears come into conflict.

“A bear does not want to attack a person,” Toshio explains. “He just wants to find food. But sometimes bears may encounter people and this leads to a dangerous situation.”

To date, bears venturing into Sapporo have not resulted in serious accidents, although people have been killed by bears in the mountains during the spring. These attacks occur when both people and bears seek out wild vegetables, producing a conflict over a food source.

Toshio believes that education is of key importance to keeping both bear and human parties safe, and advocates that a wildlife program be introduced in schools.

“If there are no accidents, people love bears! They are happy with their life and population,” Toshio points out. “But if people see bears in residential areas they will become scared and want to reduce their number.”

If you do see a bear in the street, Toshio says that keeping quiet and not running are key. In most cases, the bear will ignore you, allowing you to walk slowly away and keep your distance.

“We talk about ‘one health’,” Toshio concludes. “Human health, animal health and ecological health. These are important and we need to preserve them all.”

Baby Galaxies

Spotlight on Research is the research blog I author for Hokkaido University, highlighting different topics being studied at the University each month. These posts are published on the Hokkaido University website.

Hold an American dime (a coin slightly smaller than a 1 yen piece) at arm’s length and you can just make out the eye of Franklin D. Roosevelt, the 32nd President of the USA, whose profile is etched on the coin’s silver surface. Lift that eye up to the sky and you are looking at a region containing over 10,000 galaxies.

In 1995, the Hubble Space Telescope was trained on a small patch of space in the constellation Ursa Major. So tiny was this region that astronomers could only make out a few stars belonging to our Milky Way. To all intents and purposes, the space telescope was pointing at nothing at all.

After 10 days, scientists examined the results. Instead of a pool of empty blackness, they saw the light of 3,000 galaxies packed into a minute one 24-millionth of the entire sky. In 2003, the observation was repeated for an equally tiny region of space in the constellation Fornax. The result was over 10,000 galaxies. These images, known as the ‘Hubble Deep Field’ and the ‘Hubble Ultra Deep Field’, became some of the most famous results in the space telescope’s legacy.

For their second public Sci-Tech talk, the Faculty of Science’s Office of International Academic Support invited Professor John Wise from the Georgia Institute of Technology in the USA to talk about his search to understand the formation of the very first galaxies in the Universe. 

John explained that when you look at the Hubble Deep Field, you do not see the galaxies as they are now. Rather, you are viewing galaxies born throughout the Universe’s life, stretching back billions of years. The reason for this strange time alignment is that light travels at a fixed, finite speed. This means that we see an object not as it is now, but how it was at the moment when light left its surface. 

Since light travels 300,000 km every second, the delay between light leaving an object and reaching your eyes is infinitesimally small for daily encounters. For light to travel to you from a corner shop 100 metres away takes a tiny 0.0000003 seconds. However, step into the vastness of space and this wait becomes incredibly important.

Light travelling to us from the Moon takes 1.3 seconds. From the Sun, 8 minutes. For light to reach us from the Andromeda Galaxy, we have to wait 2.4 million years. The Andromeda we see is therefore the galaxy as it was 2.4 million years ago, before humans had ever evolved on Earth.
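These waits are simple arithmetic: distance divided by the speed of light. A quick sketch, using rounded textbook distances that I have filled in myself:

```python
# Light-travel times for the distances mentioned above.
C_KM_PER_S = 299_792                 # speed of light in km/s
SECONDS_PER_YEAR = 365.25 * 24 * 3600

distances_km = {
    "corner shop (100 m)": 0.1,
    "Moon": 384_400,
    "Sun": 149_600_000,
    "Andromeda Galaxy": 2.3e19,      # about 2.4 million light-years
}

for name, km in distances_km.items():
    t = km / C_KM_PER_S              # travel time in seconds
    if t < SECONDS_PER_YEAR:
        print(f"{name}: {t:.3g} seconds")
    else:
        print(f"{name}: {t / SECONDS_PER_YEAR:.3g} years")
```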

At a mere 24,000,000,000,000,000,000 km away, Andromeda is one of our nearest galactic neighbours. Look further afield, and we start to view galaxies living in a much younger universe; a universe that has only just begun to build galaxies.

“At the start of the Universe,” Wise explains. “There were no stars or galaxies, only a mass of electrons, protons and particles of light. But when it was only 380,000 years old, the Universe had cooled enough to allow the electrons and protons to join and make atoms.”

This point in the Universe’s 13.8-billion-year history is known as ‘Recombination’, a descriptive name marking the moment when electrons combined with nuclei to form neutral atoms of the lightest elements: hydrogen, helium and a sprinkling of lithium.

Then, the Universe goes dark. 

For the next 100 million years, the Universe consisted of neutral atoms that neither produced light nor scattered the existing population of light particles. Astrophysicists, who rely on such scattered light to see, refer to this time as the ‘Dark Ages’, during which they are blind to the Universe’s evolution.

Then, something big happens.

An event big enough that it swept across the Universe, separating electrons from their atoms’ nuclei in a process called ‘Reionisation’.

“It was like a phase transition,” Wise describes. “Similar to when you go from ice to water. Except the Universe went from neutral to ionised and we were now able to see it was full of young galaxies.”

But what caused this massive phase transition of the Universe? Wise suspected that the first galaxies were the cause, but with nothing visible during the Dark Ages, how could this be proved?

To tackle this problem, Wise turned to computer models to map the Universe’s evolution through its invisible period. Before he could model the formation of the first galaxy, however, Wise had to first ask when the first stars appeared.

“Before any galaxies formed, the first stars had to be created,” Wise elaborates. “These were individual stars, forming alone rather than in galaxies as they do today.”

Although the pool of electrons and protons joined at ‘Recombination’ to create atoms, small differences in density were already in place from the first fraction of a second after the Big Bang. The regions of slightly higher density began to exert a stronger gravitational pull on the atoms, drawing the gas towards them. Eventually, these clouds of gas grew massive enough to begin to collapse, and a star was born at their centre.

Their isolated location wasn’t the only thing that made the first stars different from our own Sun. With only the light elements of hydrogen, helium and lithium from which to form, these stars lacked the heavier atoms astrophysicists refer to as ‘metals’. Due to their more complicated atomic structure, metals are excellent at cooling gas. Without them, the gas cloud was unable to collapse as effectively, producing very massive stars with short lifetimes.
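The link between poor cooling and massive stars can be made semi-quantitative with a standard result (my addition, not part of the talk): a clump of gas collapses under its own gravity only once its mass exceeds the ‘Jeans mass’, which scales as

\[ M_J \propto \frac{T^{3/2}}{\sqrt{\rho}} \]

for gas of temperature \(T\) and density \(\rho\). Metal-free gas cannot cool much below a few hundred kelvin, so the Jeans mass stays high, and the smallest clumps that can collapse, and hence the stars they form, are enormous.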

“We don’t see any of these stars in our galaxy today because their lifetimes were so short,” Wise points out. “Our Sun will live for about 10 billion years, but a star 8 times heavier would only live for 20 million years; hundreds of times shorter. That’s a huge difference! The first stars might have been ten times as massive, with even shorter lifetimes.”
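Those numbers follow from a rule of thumb (my back-of-the-envelope check, not Wise’s calculation): a star’s lifetime is roughly its fuel supply divided by the rate it burns, \(t \propto M/L\), and luminosity climbs steeply with mass, somewhere between \(M^3\) and \(M^4\). Together these give

\[ t \approx 10\,\text{Gyr} \times \left(\frac{M}{M_\odot}\right)^{-2.5\ \text{to}\ -3}, \]

so a star of 8 solar masses lives only some 20 to 50 million years: a few hundred times shorter than the Sun.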

In the heart of each newly formed star, a key process then began: the creation of the heavier elements. Compressed by gravity, hydrogen and helium fused to form elements such as carbon, nitrogen and oxygen in a process known as ‘stellar nucleosynthesis’. Reaching the ends of their short lives, the first stars exploded in events called ‘supernovae’, spewing the heavier metals cooked within their centres out into the Universe.

The dense regions of space that formed the first stars now begin to merge with one another. Multiple pools of gas come together to produce a larger reservoir from which the first galaxies begin to take shape.

Consisting of dense, cold gas filled with metals from the first stars, these first galaxies turn star formation into a business. More stars form, now in much closer proximity, and the combined heat they produce radiates out through the galaxy. This new source of energy hits the neutral atoms and strips their electrons, ionising the gas.

As the fields of ionised gas surrounding the first galaxies expand, they overlap and merge, resulting in the whole Universe becoming ionised. Wise’s simulations show this in action, with pockets of ionised gas and metals spreading to fill all of space. 

While this uncovers the source of the Universe’s ionisation, the impact of the first galaxies does not stop there. In the same way that the regions of first star formation had combined to form galaxies, further mergers pull the galactic siblings together. The result is the birth of a galaxy like our own Milky Way. 

These first galaxies, Wise concludes, therefore form a single but important piece of the puzzle that ultimately leads to our own existence here in the Milky Way. 

Can you catch your dog's illness?

Spotlight on Research is the research blog I author for Hokkaido University, highlighting different topics being studied at the University each month. These posts are published on the Hokkaido University website.

The Research Center for Zoonosis Control is concerned with the probability that you will catch a disease from your dog. Or –to put it more generally– zoonosis looks at the transmission of diseases between humans and animals. 

While we do not normally consider an illness in an animal a risk to humans, you can almost certainly name a number of exceptions: Severe Acute Respiratory Syndrome (SARS), avian influenza (bird flu), rabies, Escherichia coli (E. coli) and Creutzfeldt-Jakob disease (CJD), to offer a few examples.

Diseases can be produced by different types of agents: viruses, bacteria, prions (a type of protein), fungi and parasites. Postdoctoral researcher Jung-Ho Youn is interested in the second of these types: diseases caused by bacteria.

Bacteria, Jung-Ho explains, are everywhere and most are harmless or even essential to our health. For the bacteria which do cause disease, treatment usually involves antibiotics. 

In 1928, the Scottish scientist Alexander Fleming discovered that a sample of the disease-producing bacterium Staphylococcus aureus (S. aureus) had become contaminated by a mould that was inhibiting its growth. This fungus turned out to be producing a substance that would later be known as the antibiotic penicillin. So successful was penicillin at treating previously fatal bacterial diseases that it was hailed as the ‘miracle drug’.

Yet, 85 years later, we are far from celebrating the demise of all bacterial illness. Indeed, a variant of the very bacterium Fleming was studying can be a serious problem in hospitals due to its strong resistance to many antibiotics.

So what went wrong?

The problem, Jung-Ho explains, is that bacteria are continuously changing. With the ability to divide in minutes, bacterial numbers can rise exponentially, and these microorganisms are not always identical. Random mutations to the genetic structure of a bacterium usually have little effect, but a small number of cases can arise that produce a spontaneous resistance to the drug set to kill them. For example, the bacterium’s surface may change to prevent the antibiotic bonding, or it could reduce the build-up of the drug by creating a pump to expel it from its system. These changes follow the same principle as evolution in mammals, but on a much faster timescale due to the bacteria’s rapid reproduction. This means that when a sick person takes a course of antibiotics such as penicillin, the drugs kill most, but not all, of the bacteria.
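To see how fast that selection can act, here is a toy back-of-the-envelope model; every number in it is invented for illustration:

```python
# Toy model of antibiotic selection: a rare resistant mutant in a large
# population, a drug that kills only susceptible cells, then regrowth.
susceptible, resistant = 1_000_000, 10

print(f"before treatment: {resistant / (susceptible + resistant):.4%} resistant")

susceptible = int(susceptible * 0.001)   # the drug kills 99.9% of susceptible cells
for _ in range(10):                      # ten generations of regrowth
    susceptible *= 2                     # both strains double each generation
    resistant *= 2

share = resistant / (susceptible + resistant)
print(f"after treatment and regrowth: {share:.1%} resistant")
```

One course of the drug has raised the resistant share roughly a thousand-fold, from 0.001% to about 1%; a few more rounds of the same pressure and the ‘bullet-proof’ strain dominates.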

“It’s like people wearing bullet-proof vests,” Jung-Ho described. “The antibiotics are the guns. They kill most of the bacteria but not all because a few are wearing these bullet-proof vests. This is why bacteria have survived until now.” 

But the problems with bacteria do not stop at the ability to develop resistance once exposed to a drug. Not only can a resistant bacterium multiply to produce more bacteria with the same protection, it can also share this information with different types of bacteria. The bacteria that gain this protection may never have been exposed to the antibiotic and –a key issue for zoonosis– they may not even cause disease in the same species.

Jung-Ho’s research is on Staphylococcus pseudintermedius (S. pseudintermedius), a bacterium commonly found on dogs. While you play with your pet, this bacterium can transfer onto your skin, where it can meet its human counterpart, S. aureus: the same bacterium Fleming was studying during his discovery of penicillin.

“Normally there are no problems,” Jung-Ho assures animal lovers. “You don’t need to get rid of your pet. It’s just in some cases…”

In some cases, the type of S. pseudintermedius on your dog may be resistant to an antibiotic and pass this resistance on to the S. aureus. The result is a disease-spreading human bacterium that is resistant to an antibiotic it has never been exposed to. Likewise, the reverse may occur, with a resistant human bacterium passing its resistance on to an animal disease.

How can this spread of resistance between animals and humans be prevented? The first step, Jung-Ho explains, is to find out where these antibiotic resistant bacteria are and how they are spreading. 

Jung-Ho describes an experiment conducted in a veterinary waiting room. Swabs are taken from the patient pets, the veterinary surgeons and the environment, such as the waiting room chairs and computers. The bacteria are then analysed to see if the same microorganism variant (or ‘strain’) is found in more than one place. For example, if the same bacterial strain is found on a vet’s skin and on a dog, this suggests transmission is occurring between the two parties.

The direction of this transmission (vet to dog or dog to vet) is not possible to determine, but in the case of S. pseudintermedius, Jung-Ho knows it is predominantly found on dogs, so it must be moving from animal to human. If a large quantity of this bacterium is also found on the furnishings of the clinic, it is likely that significant extra transmission is occurring through people touching a keyboard or chair, even if they have never interacted with an animal. This kind of spread can then be prevented by sterilising the area more thoroughly: a finding that can reduce disease and the spread of resistant bacteria.

This approach sounds straightforward until the sheer number of bacteria is considered. Even when narrowed down to a specific strain, it is not immediately obvious whether the bacteria found share the same source or originated from entirely different locations. Jung-Ho described two main methods for checking whether the samples are the result of a single transmitted population:

(1) The identification of ‘housekeeping genes’. These genes are particular to a bacterial species and essential to its survival, so they are present in every cell of the microorganism. For Staphylococcus bacteria, seven housekeeping genes are compared, and the exact sequence of each can differ slightly between bacterial populations. By identifying these gene sequences in bacteria found in different locations, scientists can establish whether they were transmitted between the two places. This technique is known as ‘multilocus sequence typing’ or MLST (a code sketch of the idea follows this list).

(2) The use of another molecule that cuts the bacterial DNA into pieces. Called ‘pulsed-field gel electrophoresis’ (PFGE), this technique is also used in genetic fingerprinting to identify an individual from their DNA. In this case, the bacterial DNA is sliced using a restriction enzyme, which only produces the same set of fragments if the bacterial populations are identical.
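As a rough code sketch of the MLST idea from point (1): each isolate is reduced to a profile of allele numbers, one per housekeeping gene, and two isolates belong to the same ‘sequence type’ only if every allele matches. The seven gene names below follow the published S. aureus MLST scheme, but all of the profiles are invented:

```python
# MLST comparison sketch; the allele profiles are invented for illustration.
GENES = ("arcC", "aroE", "glpF", "gmk", "pta", "tpi", "yqiL")

isolates = {
    "dog_01":   (1, 4, 1, 4, 12, 1, 10),
    "vet_hand": (1, 4, 1, 4, 12, 1, 10),   # identical profile to dog_01
    "chair":    (2, 2, 2, 2, 6, 3, 2),
}

def allele_profile(name):
    """Pair each housekeeping gene with its allele number for one isolate."""
    return dict(zip(GENES, isolates[name]))

def same_sequence_type(a, b):
    """Two isolates share a sequence type only if all seven alleles match."""
    return allele_profile(a) == allele_profile(b)

print(same_sequence_type("dog_01", "vet_hand"))  # True  -> consistent with transmission
print(same_sequence_type("dog_01", "chair"))     # False -> likely a different source
```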

Jung-Ho points out that such methods –while effective– are cumbersome compared with the prospect of being able to sequence a bacterium’s whole genome. He hopes that with the new generation of machines capable of quickly reading complete gene sequences, it will become easier to track antibiotic-resistant bacteria. This will not only allow scientists to minimise infection but also warn doctors that a disease may be resistant to certain drugs. Such early detection of the bacterial type can save vital time when treating a patient.

One of the difficulties with this research is that it sits between human and animal medicine. Jung-Ho’s own background is in veterinary science, yet to understand the bacterial spread across species he must now gain expertise on the medical side. 

“It is a big question in zoonosis research,” he explains. “Who should do it? Doctors or vets? Someone has to, I think.”

 

Were Dinosaurs adaptable enough to survive the meteor?

Spotlight on Research is the research blog I author for Hokkaido University, highlighting different topics being studied at the University each month. These posts are published on the Hokkaido University website.


[Photo: Yoshitsugu Kobayashi standing in front of the skeleton of an Afrovenator dinosaur.]

Based on the third floor of the Hokkaido University Museum, Assistant Professor Yoshitsugu Kobayashi is a dinosaur hunter. His searches for the fossilised remains of Earth’s previous rulers have taken him on a pan-Pacific sweep of countries that has most recently landed him in Alaska.

With its freezing winters and long nights, Alaska is not the first place one might expect to find the remains of beasts resembling giant lizards. Nor has the landmass moved from more comfortable tropics during the 66 million years since the dinosaurs became extinct. To understand how these creatures could have thrived in such a hostile environment, Yoshitsugu suggests that our view of dinosaurs needs to change.

Dinosaurs have long been considered enormous, cold-blooded reptiles. Unable to control their own body temperature, such creatures would need to live in warm climates to maintain a reasonable metabolism. Finding fossils in Alaska therefore presents scientists with some problems.

“To survive winters in Alaska, the dinosaurs would need to be warm blooded,” Yoshitsugu explains. “This allows them to adapt more easily to different environments.”

Which brings us to Yoshitsugu’s next big question: to what extent were the dinosaurs able to flourish in different living conditions?

To investigate this, Yoshitsugu and his team examined the fossils found in Alaska and compared them with known dinosaur groups elsewhere in the world. There were three main possibilities for their findings:

The Alaskan population of dinosaurs could match that of North American dinosaurs at lower latitudes, implying that the same group had migrated north.

[Photo: The skull of a Triceratops.]

Rather than North American, the dinosaurs could be kin to those found in Asia. This would require a migration from Russia to Alaska, and its likelihood is determined by the condition of the Bering Strait: an 82 km stretch between Russia’s Cape Dezhnev and Alaska’s Cape Prince of Wales. Today this gap is a sea passage, but in the past a natural land bridge existed and is thought to be responsible for the first human migration into America. Could the dinosaurs have taken the same path millions of years before?

The third option is that the Alaskan dinosaurs resemble neither their North American nor their Russian cousins, and that a different kind of dinosaur inhabited these frozen grounds.

Examining their finds from this region, Yoshitsugu’s team concluded that what they had were North American dinosaurs. This meant that dinosaurs --like humans-- were capable of living across an incredibly wide range of environments, with different climates, food sources and dangers.  

“People often picture the dinosaurs as having small brains,” Yoshitsugu amends. “But the same dinosaurs lived in very different places. They were intelligent and adaptable.”

This adaptability has one very important consequence: the meteor that hit the Earth 66 million years ago is unlikely to have been solely responsible for the dinosaurs’ demise.

[Photo: The carnivorous Afrovenator dinosaur.]

As it crashed into the Earth, the meteor raised a huge cloud of dust that blotted out the sun, creating a cold and dark new world. Had the dinosaurs been cold-blooded, such a massively extended winter would have proved fatal, but Yoshitsugu’s research supports suggestions that the dinosaurs were not only warm-blooded but also equipped to deal with this harsh new environment.

This doesn’t mean the meteor had no impact on the dinosaur population. With the drop in sunlight, vegetation would have decreased, reducing the food source for the herbivores and, subsequently, for the carnivores who fed on them. Yet it does appear that the dinosaurs could have survived had this been the only disaster.

So what else could have happened to destroy these animals?

[Photo: The herbivorous Camarasaurus.]

Yoshitsugu pointed to two other possible causes. The first is major volcanic activity in the vast volcanic province (today’s Deccan Traps) that covers large parts of India. In a series of prolonged eruptions that persisted for thousands of years, lava poured over hundreds of miles of land, and the ejected carbon dioxide and sulphur dioxide combined to acidify the oceans. The impact on life was catastrophic and longer-lasting than that of the meteor.

The second possible cause of the dinosaurs’ extinction is linked to the drop in sea levels that occurred towards the 66-million-year mark. As the water receded, land bridges formed between the continents, allowing dinosaurs to travel freely across the globe. The result was a drop in genetic diversity among the dinosaur species. The same effect is seen today when a foreign breed of animal is introduced into a new ecosystem: in the UK, the introduction of the eastern grey squirrel has driven the native red squirrel to the point of extinction, since the grey squirrel digests local food sources more effectively and also transmits a disease fatal to red squirrels. This uniformity in dinosaur genetics made the dinosaurs highly susceptible to eradication, since a physiological or environmental impact would equally affect the entire, near-identical population.

It was most likely a combination of these three catastrophes that caused the death of the dinosaurs, Yoshitsugu concludes. 

[Photo: The skull of a giant crocodile that lived alongside the dinosaurs.]

However, dinosaurs were not the only giants walking the Earth 230 million years ago. Living alongside these warm-blooded creatures were genuine cold-blooded reptiles: the giant crocodiles. With limbs that came out from the body sideways rather than straight down, these huge beasts were a group quite distinct from the dinosaurs. Exactly how these monsters dwelt together is currently being explored in the new exhibit at the Hokkaido University Museum, ‘Giant Crocodiles vs Dinosaurs’, which opened last Friday. The exhibit includes full-sized casts of both adult and juvenile crocodiles and dinosaurs, describing where each lived and where they overlapped. There is also a movie sequence, which Yoshitsugu was involved in creating, showing a simulated fight between a giant crocodile and a T. rex over prey.

This unique view of the world of the dinosaurs is open until October 27th. We hope to see you there!


The trouble with bicycles

Spotlight on Research is the research blog I author for Hokkaido University, highlighting different topics being studied at the University each month. These posts are published on the Hokkaido University website.

Walking through the Hokkaido University campus is a risky business. In winter, snow and ice lie ready to send the even slightly unbalanced into a horizontal slide. In the summer, you are likely to be crushed by a bicycle.

An astounding 19% of daily trips in Japan are made by bike, compared with less than 2% of trips in the UK. We live in a nation that loves the pedal-powered friend.

While such a mode of transport is great for the environment, it is rather less friendly for those on foot. Unlike in other countries, where the bike is considered a road vehicle, here in Japan cyclists are commonly treated the same as pedestrians, with both groups generally using the same sidewalks. Not only is this problematic for pedestrians, it is also unsustainable, explains Assistant Professor Katia Andrade from the Department of Engineering.

“People use bikes because they feel secure,” Katia tells us. “They’re on the sidewalk which seems safe, but the accident rate is actually booming.”

In the last 20 years, the number of accidents involving bicycles and pedestrians has more than quadrupled. Yet this huge increase has been largely overlooked by the police because the incidents are typically small, with few (although not zero) fatalities. However, as bicycle usage continues to increase, there will soon be no safe route for cyclists or pedestrians unless action is taken.

Action is not easy. Katia points out that a change to the way bicycles are handled in Sapporo requires the expenditure of public money: always a limited asset. Additionally, implementing a system such as bicycle lanes is unlikely to be immediately effective, since people must feel confident in the available infrastructure before they will use it.

“If there is less infrastructure, less people will use it,” Katia elaborates. “But, even if you increase the infrastructure, people won’t immediately use it because they need time to feel secure.”

This means the task of safe cycling cannot be resolved by just laying down bike lanes all through the city; it’s too expensive and the uptake is too small. So how can a system be introduced that has a high impact on safety for a reasonable investment? Katia’s research suggests that a key consideration might be land use.

Katia has been exploring the connection between the way land is physically used –for example by commercial buildings, school areas, residential housing blocks or industry– and the different modes of transport and resulting accidents.

To explain this connection, Katia offers a simple example of a large shopping mall being built in a suburban area. Prior to its construction, only a single bus route might have been needed to connect the surrounding houses and the centre of town. However, with more people now travelling to the mall, extra bus routes or a subway line might be required. The change in the land use has resulted in a change in the transportation.

Carry the same example another five years down the line, and smaller shops have sprung up near the mall while the housing has expanded. The number of people walking or cycling to the mall has now increased, bringing these groups into conflict with each other and with the stream of motorised vehicles. The balance between the transportation modes has now changed and, with it, the types of accidents occurring.

Such a correlation is often ignored in present-day research. When an accident occurs, human factors such as drink driving or age are considered, along with vehicle issues such as an old or faulty car. Yet such accidents could possibly be avoided by better transport planning based on the given land use.

For instance, areas of mixed land use, where the distances between amenities such as shops, houses and workplaces are small, are likely to be popular with cyclists. By anticipating this, city planners could keep this group safe by reducing the speed limit or providing cycle routes in these areas. Investing in appropriate measures where there is high demand will encourage cyclists to come off the crowded sidewalks and onto the safe, designated routes.

Reducing accidents by consideration of the commercial, residential and industrial composition is a key area in Katia’s current research. She explains though, that the results from her research will not be a worldwide ‘one size fits all’ due to the social nature of the problem.

Coming from Brazil, Katia understands that bicycle usage differs greatly between countries. Here in Japan, it is common to see men cycling to work in their business suits and women in high heels heading off for a night’s dancing. In Brazil, both those trips would be made by car, which is associated with a higher social status. This social dimension makes the topic both vexing and fascinating. Nevertheless, at the end of the day, the goal is the same:

“We need to reduce the conflict between different modes of transport,” Katia concludes. “The less conflict, the less accidents.”