Chronic exposure to dim light at night can lead to depressive symptoms in rodents -- but these negative effects can be reversed simply by returning to a standard light-dark cycle, a new study suggests. While hamsters exposed to light at night for four weeks showed evidence of depressive symptoms, those symptoms essentially disappeared after about two weeks if they returned to normal lighting conditions.
Even changes in the brain that occurred after hamsters lived with chronic light at night reversed themselves after returning to a more normal light cycle.
These findings add to the growing evidence suggesting that chronic exposure to artificial light at night may play some role in the rising rates of depression in humans over the past 50 years, said Tracy Bedrosian, lead author of the study and a doctoral student in neuroscience at Ohio State University.
"The results we found in hamsters are consistent with what we know about depression in humans," Bedrosian said.
But the new study, published online in the journal Molecular Psychiatry, also offers some hope.
"The good news is that people who stay up late in front of the television and computer may be able to undo some of the harmful effects just by going back to a regular light-dark cycle and minimizing their exposure to artificial light at night," Bedrosian said. "That's what the results we found in hamsters would suggest."
Bedrosian conducted the study with Ohio State colleagues Randy Nelson, professor of neuroscience and psychology, and Zachary Weil, research assistant professor in neuroscience.
This study is the latest in a series out of Nelson's lab that has linked chronic exposure to light at night to depression and obesity in animal models.
The new study found that one particular protein present in the brain of hamsters -- and humans -- may play a key role in how light at night leads to depression.
The researchers found that blocking the effects of that protein, called tumor necrosis factor, prevented the development of depressive-like symptoms in hamsters even when they were exposed to light at night.
The study involved two experiments using female Siberian hamsters, which had their ovaries removed to ensure that hormones produced in the ovary would not interfere with the results.
In the first experiment, half of the hamsters spent eight weeks in a standard light-dark cycle of 16 hours of light (150 lux) and 8 hours of total darkness each day. The other half spent the first four weeks with 16 hours of normal daylight (150 lux) and 8 hours of dim light -- 5 lux, or the equivalent of having a television on in a darkened room.
Then, these hamsters were moved back to a standard light cycle for either one week, two weeks or four weeks before testing began.
They were then given a variety of behavior tests. Results showed that hamsters exposed to chronic dim light at night showed less total activity during their active period each day when compared to those in standard lighting conditions.
Those hamsters exposed to dim light also showed greater depressive symptoms than did the others -- such as less interest in drinking sugar water that they usually enjoy.
But within two weeks of returning to a standard light cycle, hamsters exposed to dim night light showed no more depressive-like symptoms than did hamsters that always had standard lighting. In addition, they returned to normal activity levels.
After the behavioral testing, the hamsters were sacrificed and the researchers studied a part of their brains called the hippocampus, which plays a key role in depressive disorders. Findings showed that hamsters exposed to dim light showed a variety of changes associated with depression.
Most importantly, hamsters that lived in dim light showed increased expression of the gene that produces tumor necrosis factor. TNF is one of a large family of proteins called cytokines -- chemical messengers that are mobilized when the body is injured or has an infection. These cytokines cause inflammation in their effort to repair an injured or infected area of the body. However, this inflammation can be damaging when it is constant, as happens in hamsters exposed to dim light at night.
"Researchers have found a strong association in people between chronic inflammation and depression," said Nelson, who is a member of Ohio State's Institute for Behavioral Medicine Research.
"That's why it is very significant that we found this relationship between dim light at night and increased expression of TNF."
In addition, results showed that hamsters that lived in dim light had a significantly reduced density of dendritic spines -- hairlike growths on brain cells which are used to send chemical messages from one cell to another.
Changes such as this have been linked to depression, Bedrosian said.
However, hamsters that were returned to a standard light-dark cycle after four weeks of dim light at night saw their TNF levels and even their density of dendritic spines return essentially to normal.
"Changes in dendritic spines can happen very rapidly in response to environmental factors," Bedrosian said.
In a second experiment, the researchers tested just how important TNF might be in causing the negative effects seen in hamsters exposed to light at night. In this experiment, some hamsters were given a drug called XPro1595, which is a TNF inhibitor -- meaning that it negates the effects of some forms of TNF in the brain.
Results showed that hamsters exposed to dim light at night did not show any more depressive-like symptoms than standard-light hamsters if they were given XPro1595. However, the drug did not seem to prevent the reduction of dendritic spine density in hamsters exposed to dim light.
These results provide further evidence of the role TNF may play in the depressive symptoms seen in hamsters exposed to dim light. But the fact that XPro1595 did not affect dendritic spine density means that more needs to be learned about exactly how TNF works, Nelson said.
The study was supported by grants from the National Science Foundation and the U.S. Department of Defense. The XPro1595 used in the study was provided by Xencor, Inc.
--------------------------------------------------------------------------------------
I find this article very interesting, as I did not know of any effects of light at night. In the past, I did not realise that chronic exposure to artificial light might actually contribute to the rising rates of depression in humans. This includes people who stay up late in front of the television and computer, and this actually concerns us students. Most of us have many enrichment programmes after school, reach home only at night, and then use computers to either complete our homework or do our projects. It is very shocking that this can lead to depression and be bad for our health. However, I was very much relieved that we might be able to undo some of these harmful effects by going back to a regular light-dark cycle and minimising our exposure to artificial light at night.
Artificial light, made practical by Thomas Alva Edison's incandescent bulb, is essential for modern societies to function; it has clearly brightened up our world and increased the productivity of work. We would not be able to live with only natural light sources such as moonlight at night. However, we should also limit the time we spend with artificial light sources like the television and computer at night, in order to reduce the adverse effects on our health. To achieve this, we must learn how to strike a balance between work and rest, so that we can complete as much work as possible in a short period of time. We must never work too late at night on the computer, and should also try to minimise our exposure to artificial light as much as possible, to prevent ourselves from suffering depression due to chronic exposure to artificial light.
In conclusion, I feel that this article is very interesting and useful, and has warned me about the adverse effects of exposure to artificial light. Although it may be hard, I will try my best not to work too late into the night with my eyes fixed on the computer screen. I also hope that this article can be shared with other HCI students so that they can be aware of the effects of artificial light, and hopefully, after reading it, they will plan their time more wisely and finish their work in the afternoon instead of staying up late to burn the midnight oil.
2. Scientists discover cancer drug could lead to cure for AIDS
A team of scientists is hopeful that the first steps towards a cure for AIDS have been uncovered, after finding that a drug used to treat cancer can flush out dormant HIV cells.
David Margolis MD, of the University of North Carolina at Chapel Hill, is the co-author of a study published in the journal Nature, which has identified a cancer drug that can flush out dormant HIV cells. Naharnet reported that Margolis announced "It is the beginning of work toward a cure for AIDS," as he spoke at the International AIDS Conference in Washington.
The study found that the chemotherapy drug vorinostat, which is used to treat lymphoma, can rouse the dormant HIV virus so that it can be eliminated. Until now, the dormant virus could turn into an infection that attacks the immune system.
Margolis explained, "After a single dose of the drug, at least for a moment in time, (vorinostat) is flushing the virus out of hiding." He added, "This is proof of the concept, of the idea that the virus can be specifically targeted in a patient by a drug, and essentially opens up the way for this class of drugs to be studied for use in this way."
According to Discover Magazine, "If all the hidden viruses could be activated, it should be possible to completely drain the reservoir" and then use drugs to stop fresh viruses infecting healthy cells. The study indicates the research could lead to a safer version of the toxic cancer drug being developed to mimic its effects.
Naharnet reported that HIV researcher Steven Deeks said the research provided "the first evidence that ... a cure might one day be feasible".
--------------------------------------------------------------------------------------
After reading this article, I am glad that scientists have discovered a cancer drug that can possibly cure AIDS, but I am not very optimistic about it. First, a brief introduction to AIDS. AIDS stands for Acquired Immune Deficiency Syndrome and is caused by a virus called HIV. HIV attacks the human immune system itself and thus makes a person more vulnerable to infections, as the immune system might not be able to fight them off. It was first identified in the early 1980s, and since then 30 million people have died from AIDS. Although I am glad that a cancer drug has been discovered which could possibly be a cure for AIDS, more studies must be done to confirm this, which is why I remain cautious.
Since the 1990s, many public health officials have been optimistic that a vaccine for the disease would be developed. However, to this day there is no cure for AIDS, and unproven cures have been circulating for decades. Developing a vaccine is very difficult because it must be capable of preventing not only the disease but also the infection itself: HIV attacks the immune system and thus prevents the body from recognising and fighting it. HIV also mutates rapidly, so a vaccine effective against a particular strain might be useless a few years later. Hence, we can see that a cure for AIDS is extremely hard to achieve and will require years and years of research. Many supposed cures have been promoted without any rigorous tests to confirm them.
I feel that although it might take many years for an AIDS vaccine to be found, we must never give up. However, before jumping to conclusions and developing cures for AIDS straightaway, research has to be done on the HIV virus first. There must be greater understanding of how a cure for HIV could work, the barriers that must be overcome, and the potential strategies that could be used to overcome them. To achieve this, there must be strong collaboration between scientists and researchers so that there is a greater chance of the cure being cost-effective. For now, more should be done to educate the public and raise awareness about AIDS, to prevent this deadly disease from continuing to spread.
3. Genetically Modified Flowers That Can Smell Like Anything
Ever wanted a rose that smelled like bananas? Maybe a petunia that reeked of root beer? Researchers at the University of Florida Gainesville have isolated 13 genes in flowers that are key to a blossom's fragrance. These same genes hold the secrets to improving the tastes of some fruits. According to a news release from UF and an interview in Discovery News, these scientists have already started work on tastier tomatoes, and their first crop of petunias that smell like roses is scheduled to blossom this summer. The genetic modification of flowers for scent was detailed in Plant Journal as well as Phytochemistry, and could herald a new era of designer blossoms. Imagine going to a florist and asking for roses that smelled like bacon. By discovering the genes that code for scent, these scientists have opened the door to genetically modified plants that smell and taste better than ever before.
When you select for one trait, you tend to sacrifice others. The race to breed better blossoms over the past fifty years has improved size and beauty at the cost of scent. The same holds true for food crops -- fruit and veggies are getting bigger, but they aren't getting tastier. Even projects that take advantage of genetic modification typically focus only on resistance to parasites and improved yield. The University of Florida work, headed by David Clark, was itself focused on improving pollination by increasing the lifespan of petunia petals. Discovering the dozen or so fragrance genes was an accidental find -- the team has examined more than 8000 such genes over the past decade.
According to the UF news release, Clark and his colleagues are first looking to restore the "lost" fragrances of many flowers that have been bred for other characteristics in the last century. Eventually, however, the same genes that could return a flower to its ancestral scent could also be used to create entirely artificial smells. Flowers could be made to smell like other species, other foods, maybe even inorganic compounds. The implications extend outside the florist's shop. Perhaps flowers that smelled strongest whenever there were too many heavy metals in the water supply? As genetic modification finds its way into blossoms, these plants could move from ornamentation to practical applications. GM has its downside, but you can bet that discoveries like this one are leading to a better (smelling) future.
--------------------------------------------------------------------------------------
Before going into this article, we have to first know what genetic modification is and how it is applied in this case. Genetic modification is the use of modern biotechnology techniques to change the genes of an organism, such as a plant or animal. It can change the genes of an organism in ways not possible through traditional breeding techniques, providing opportunities for new plant varieties and animal breeds. In this case, flowers are genetically modified for scent by altering the genes in the plant that code for scent.
This is definitely a step towards a better-smelling future. We can actually look forward to roses smelling like bacon, hibiscus smelling like bananas, and even a petunia smelling like root beer. Food can also be genetically modified to taste better than it does now. Children who do not like to eat a particular food could then enjoy it, thanks to this change in taste.
However, I do not think that flowers should be genetically modified to alter their fragrance, as there are actually many more disadvantages than advantages. Firstly, it can have undesirable effects on the plant itself. If the fragrance of a plant is modified, features like scent that originally helped pollination by attracting insects such as bees will be gone. This will in turn leave the plant unable to be pollinated, and will thus affect the sexual reproduction of flowers. In the worst-case scenario, this can lead to the extinction of a particular species if the entire process is not properly managed. Furthermore, when these plants reproduce, there is less certainty about which genes will be passed on to the next generation and which will not. This can also affect the appearance of a particular species of plant. Lastly, genetically modified plants might also become too expensive due to the work involved. Genetic engineers can alter the plants in ways that prevent gardeners from harvesting the seeds and regrowing the plants, and thus they can take total control of the price of these plants, which might not be very good for consumers.
In conclusion, I feel that plants should stay the way they are now. The appearance and fragrance of a plant should be decided by natural selection instead of genetic modification by humans, as some genetically modified plants might not appear very natural. Hence, I do not support genetic engineering.
4. How Usain Bolt Could Break his World Record Without Running Faster in the Olympics
Usain Bolt is the best human sprinter there has ever been. Yet few would have guessed that he would run so fast over 100m after he started out running the 400m and 200m races in his mid-teens. His coach decided to shift him down to running the 100m for one season so as to improve his basic sprinting speed.
No one expected him to shine there. Surely he was too big to be a 100m sprinter? How wrong they were. Instead of shaving the occasional hundredth of a second off the world record, he took big chunks out of it, first reducing Asafa Powell's time of 9.74 s down to 9.72 in New York in May 2008, and then down to 9.69 (actually 9.683) at the Beijing Olympics later that year, before dramatically reducing it again to 9.58 (actually 9.578) at the 2009 Berlin World Championships. His progression in the 200m was even more astounding: reducing Michael Johnson's 1996 record of 19.32 s to 19.30 (actually 19.296) in Beijing and then to 19.19 in Berlin. These jumps are so big that people have started to calculate what Bolt's maximum possible speed might be.
Unfortunately, all the commentators have missed the two key factors that would permit Bolt to run significantly faster without any extra effort or improvement in physical conditioning. "How could that be?" I hear you ask.
The recorded time of a 100m sprinter is the sum of two parts: the reaction time to the starter's gun and the subsequent running time over the 100m distance. An athlete is judged to have false-started if he reacts by applying foot pressure to the starting blocks within 0.10 s of the start gun firing. Remarkably, Bolt has one of the longest reaction times of leading sprinters -- he was the second slowest of all the finalists to react in Beijing and third slowest in Berlin when he ran 9.58. Allowing for all this, Bolt's average running speed in Beijing was 10.50 m/s and in Berlin (where he reacted faster) it was 10.60 m/s. Bolt is already running faster than the ultimate maximum speed of 10.55 m/s that a team of Stanford human biologists recently predicted for him.
In the Beijing Olympic final, where Bolt's reaction time was 0.165 s for his 9.69 run, the other seven finalists reacted in 0.133, 0.134, 0.142, 0.145, 0.147, 0.165 and 0.169 s. From these stats it is clear what Bolt's weakest point is: he has a very slow reaction to the gun. This is not quite the same as having a slow start. A very tall athlete, with longer limbs and larger inertia, has more moving to do in order to rise upright from the starting blocks. If Bolt could get his reaction time down to 0.13, which is very good but not exceptional, then he would reduce his 9.58 record run to 9.56. If he could get it down to an outstanding 0.12 he is looking at 9.55, and if he responded as quickly as the rules allow, with 0.1, then 9.53 is the result. And he hasn't had to run any faster!
This is the first key factor that has been missed in assessing Bolt's future potential. What is the second? Sprinters are allowed to receive the assistance of a following wind that must not exceed 2 m/s in speed. Many world records have taken advantage of that, and the most suspicious set of world records in sprints and jumps were those set at the Mexico Olympics in 1968, where the wind gauge often seemed to record 2 m/s when a world record was broken. But this is certainly not the case in Bolt's record runs. In Berlin his 9.58 s time benefited from only a modest 0.9 m/s tailwind, and in Beijing there was no wind, so he has a lot more still to gain from advantageous wind conditions. Many years ago, I worked out how the best 100m times are changed by wind. A 2 m/s tailwind is worth about 0.11 s compared to a no-wind performance, and a 0.9 m/s tailwind 0.06 s, at a low-altitude site.
So, with the best possible legal wind assistance and reaction time, Bolt's Berlin time comes down from 9.53 s to 9.47 s and his Beijing time becomes 9.51 s. And finally, if he were to run at a high-altitude site like Mexico City, then he could go faster still and effortlessly shave off another 0.07 s. So he could improve his 100m time to an amazing 9.4 s without needing to run any faster.
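The article's bookkeeping can be sketched in a few lines of Python. This is a rough illustration only: the exact Berlin reaction time is not stated in the text, so 0.15 s is an assumption, and the other constants are taken from the article.

```python
# Rough illustration: the recorded time is reaction time plus running time,
# so a faster reaction, extra legal tailwind, and altitude all lower the
# clock without any faster running. RT_ACTUAL below is an ASSUMED value
# (the article only says Bolt was among the slowest reactors in Berlin).

RECORD = 9.58            # Berlin 2009 recorded time (s)
RT_ACTUAL = 0.15         # assumed Berlin reaction time (s)
RT_LEGAL_MIN = 0.10      # fastest reaction the rules allow (s)
WIND_MAX_GAIN = 0.11     # saving from a full 2 m/s tailwind (s)
WIND_BERLIN_GAIN = 0.06  # saving already banked from the 0.9 m/s tailwind (s)
ALTITUDE_GAIN = 0.07     # extra saving at a high-altitude venue (s)

def adjusted_time(record, rt_actual, rt_new, extra_wind=0.0, altitude=0.0):
    """Recorded time with a faster reaction and extra legal assistance."""
    return record - (rt_actual - rt_new) - extra_wind - altitude

print(round(adjusted_time(RECORD, RT_ACTUAL, RT_LEGAL_MIN), 2))  # 9.53
print(round(adjusted_time(RECORD, RT_ACTUAL, RT_LEGAL_MIN,
                          WIND_MAX_GAIN - WIND_BERLIN_GAIN), 2))  # 9.48
print(round(adjusted_time(RECORD, RT_ACTUAL, RT_LEGAL_MIN,
                          WIND_MAX_GAIN - WIND_BERLIN_GAIN,
                          ALTITUDE_GAIN), 2))  # 9.41
```

Working from the unrounded 9.578 s instead of 9.58 s shifts these figures by roughly a hundredth of a second, which is why the article quotes 9.47 s and 9.4 s.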
--------------------------------------------------------------------------------------
Usain Bolt, a 21-year-old Jamaican, is the current world record holder for the 100 metres sprint, with a timing of 9.58 seconds. But as records are broken again and again, this raises the question of how much faster a human can actually run. The 100 metres record has been broken 19 times since an American called Don Lippincott ran 10.6 seconds in 1912, with the current mark being the eighth new 100 metres record set since 1991. The 10-second barrier was broken in 1968, the 9.90 barrier in 1991, the 9.80 barrier in 1999, the 9.70 barrier in 2008, and the 9.60 barrier in 2009.
In my opinion, this record has the potential to be broken again and again, down to the fastest that man can go, which some scientists predict lies between 9.4 and 9.5 seconds. Certainly, I believe that Usain Bolt is capable of going faster. The recorded time of a 100m sprinter is the sum of two parts: the reaction time to the starter's gun and the subsequent running time over the 100m distance. Bolt has one of the longest reaction times of leading sprinters -- he was the second slowest of all the finalists to react in Beijing and third slowest in Berlin when he ran 9.58. Hence, I feel that he has to improve on his reaction time if he really wants to go faster. Bolt is already running faster than the maximum speed some scientists predicted for him, so rather than trying to gain much more raw speed, he should improve his reaction time.
Wind can also allow Bolt to run faster. If he wants to shave even more off his world record, he should compete with the maximum legal tailwind of 2 m/s behind him. Running at a high-altitude site also reduces the air density in the wind-drag calculation, which likewise translates to better timings.
In the quest to achieve better timings, I feel that we should still maintain our integrity. We should never take any drug that helps to boost our performance. Ben Johnson was infamously stripped of his 1988 Olympic 100-metre title -- and his world record of 9.79 sec with it -- for taking banned steroids. I feel that athletes should give their genuine best and not resort to such tactics just to improve their timings.
In conclusion, I feel that this is a good example of how math and science can provide insight into sports performance, which has not been done previously to such a degree of accuracy. In the meantime, Bolt still has a long way to go to reach the ultimate limit of how fast man can go -- 9.45 seconds, as scientists predict.
5. 1.5 Million Years of Climate History Revealed After Scientists Solve Mystery of the Deep
Scientists have announced a major breakthrough in understanding Earth's climate machine by reconstructing highly accurate records of changes in ice volume and deep-ocean temperatures over the last 1.5 million years.
The study, which is reported in the journal Science, offers new insights into a decades-long debate about how shifts in Earth's orbit relative to the sun have taken Earth into and out of an ice-age climate.
Being able to reconstruct ancient climate change is a critical part of understanding why the climate behaves the way it does. It also helps us to predict how the planet might respond to human-made changes, such as the injection of large quantities of carbon dioxide into the atmosphere, in the future.
Unfortunately, scientists trying to construct an accurate picture of how such changes caused past climatic shifts have been thwarted by the fact that the most readily available marine geological record of ice ages -- changes in the ratio of oxygen isotopes (Oxygen-18 to Oxygen-16) preserved in tiny calcareous deep-sea fossils called foraminifera -- is compromised.
This is because the isotope record shows the combined effects of both deep-sea temperature changes and changes in the amount of ice volume. Separating these has in the past proven difficult or impossible, so researchers have been unable to tell whether changes in Earth's orbit were affecting the temperature of the ocean more than the amount of ice at the Poles, or vice versa.
The new study, which was carried out by researchers in the University of Cambridge Department of Earth Sciences, appears to have resolved this problem by introducing a new set of temperature-sensitive data. This allowed them to identify changes in ocean temperatures alone, subtract that from the original isotopic data set, and then build what they describe as an unprecedented picture of climatic change over the last 1.5 million years -- a record of changes in both oceanic temperature and global ice volume.
Included in this is a much fuller representation of what happened during the "Mid-Pleistocene Transition" (MPT) -- a major change in Earth's climate system which took place sometime between 1.25 million and 600 thousand years ago. Before the MPT, the alternation between glacial periods of extreme cold and warmer interglacials happened at intervals of approximately 41,000 years. After the MPT, the major cycles became much longer, regularly taking 100,000 years. The second pattern of climate cycles is the one we are in now. Interestingly, this change occurred with little or no orbital forcing.
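The MPT's signature -- a shift in the dominant cycle length from about 41,000 to about 100,000 years -- is the kind of feature spectral analysis makes visible. The sketch below is purely illustrative, using synthetic sine waves rather than the study's record, with window lengths chosen so each period divides the window exactly:

```python
import numpy as np

# Illustrative only: two synthetic proxy series sampled every 1,000 years
# (1 sample per kyr), one with a 41-kyr cycle (pre-MPT) and one with a
# 100-kyr cycle (post-MPT). An FFT recovers each dominant period.
dt = 1.0  # sample spacing in kyr

def dominant_period(signal, dt):
    """Period (in kyr) of the strongest non-zero frequency component."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=dt)
    peak = np.argmax(spectrum[1:]) + 1  # skip the zero-frequency (mean) bin
    return 1.0 / freqs[peak]

t = np.arange(0, 4100, dt)               # 4100 kyr: both periods divide evenly
older = np.sin(2 * np.pi * t / 41.0)     # pre-MPT regime: 41-kyr cycles
younger = np.sin(2 * np.pi * t / 100.0)  # post-MPT regime: 100-kyr cycles

print(round(dominant_period(older, dt)))    # 41
print(round(dominant_period(younger, dt)))  # 100
```

With a real, unevenly sampled sediment-core record one would typically use a Lomb-Scargle periodogram rather than a plain FFT, but the principle is the same.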
"Previously, we didn't really know what happened during this transition, or on either side of it," Professor Harry Elderfield, who led the research team, said. "Before you separate the ice volume and temperature signals, you don't know whether you're seeing a climate record in which ice volume changed dramatically, the oceans warmed or cooled substantially, or both."
"Now, for the first time, we have been able to separate these two components, which means that we stand a much better chance of understanding the mechanisms involved. One of the reasons why that is important is because we are making changes to the factors that influence the climate now. The only way we can work out what the likely effects of that will be in detail is by finding analogues in the geological past, but that depends on having an accurate picture of the past behaviour of the climate system."
Researchers have developed more than 30 different models for how these features of the climate might have changed in the past, in the course of a debate which has endured for more than 60 years since pioneering work by Nobel Laureate Harold Urey in 1946. The new study helps resolve these problems by introducing a new dataset to the picture -- the ratio of magnesium (Mg) to calcium (Ca) in foraminifera. Because it is easier for magnesium to be incorporated at higher temperatures, larger quantities of magnesium in the tiny marine fossils imply that the deep-sea temperature was higher at that point in geological time.
The Mg/Ca dataset was taken from the fossil record contained in cores drilled on the Chatham Rise, an area of ocean east of New Zealand. It allowed the Cambridge team to map ocean temperature change over time. Once this had been done, they were able to subtract that information from the oxygen isotopic record. "The calculation tells us the difference between what water temperature was doing and what the ice sheets were doing across a 1.5 million year period," Professor Elderfield explained.
The resulting picture shows that ice volume has changed much more dramatically than ocean temperatures in response to changes in orbital geometry. Glacial periods during the 100,000-year cycles have been characterised by a very slow build-up of ice which took thousands of years, the result of ice volume responding to orbital change far more slowly than the ocean temperatures reacted. Ocean temperature change, however, reached a lower limit, probably because the freezing point of sea water put a restriction on how cold the deep ocean could get.
In addition, the record shows that the transition from 41,000-year cycles to 100,000-year cycles, the characteristic changeover of the MPT, was not as gradual as previously thought. In fact, the build-up of larger ice sheets, associated with longer glacials, appears to have begun quite suddenly, around 900,000 years ago. The pattern of Earth's response to orbital forcing changed dramatically during this "900,000 year event," as the paper puts it.
The research team now plan to apply their method to the study of deep-sea temperatures elsewhere to investigate how orbital changes affected the climate in different parts of the world.
"Any uncertainty about Earth's climate system fuels the sense that we don't really know how the climate is behaving, either in response to natural effects or those which are man-made," Professor Elderfield added. "If we can understand how earlier changes were initiated and what the impacts were, we stand a much better chance of being able to predict and prepare for changes in the future."
--------------------------------------------------------------------------------------
After reading this article, I am very glad that after years and years of research, we have finally found out how the shifts in Earth's orbit relative to the sun have taken Earth into and out of an ice-age climate, which is a major breakthrough in understanding Earth's climate and why it behaves the way it does.
One of the major concepts in this breakthrough is the Mid-Pleistocene Transition, which is "a shift in the periodicity of northern hemisphere glaciations from low amplitude 41-kyr to large amplitude 100-kyr glacial-interglacial cycles", or in simplified terms, a major change in Earth's climate system where the alternation between glacial periods changed from once every 41,000 years to once every 100,000 years. Before this study, scientists had been unable to explain this unusual phenomenon, which occurred with little or no orbital forcing. Through this study, scientists have now found that the build-up of larger ice sheets, associated with longer glacials, appears to have begun quite suddenly, around 900,000 years ago, and was not as gradual as previously thought. It also revealed that ice volume has changed much more dramatically than ocean temperatures in response to changes in orbital geometry during this change.
This breakthrough is very important because we are making changes to the factors that influence the climate now, and scientists have to know the effects of those changes. After this research, scientists can understand the past behaviour of the climate system and have a more accurate picture of these effects. They are now able to understand how earlier changes were initiated, and they will thus be able to predict and prepare for changes in our climate in the future. This future may not belong to us, nor to our next generation, but it still belongs to mankind.