According to a new study, the ‘Cry’ toxins that Monsanto’s GMO crops have been genetically modified to produce are far more toxic to mammals than previously thought, particularly to the blood.
Dr. Mezzomo and his team from the Department of Genetics and Morphology at the Institute of Biological Sciences, University of Brasilia, recently published a study testing Bacillus thuringiensis toxin (Bt toxin) on Swiss albino mice. This is the same toxin built into Monsanto’s GMO Bt crops such as corn and soy as a pesticide. While Bt toxin has been used quite safely in conventional and organic farming as an occasional spray for pest problems, it has now been engineered to be produced by, and present throughout, every cell and intercellular space of the plants themselves, which is why the team chose to undertake the study. It should also be noted that bacteria exchange genetic material through lateral transfer, making it possible for this genetic material to become part of the bacterial community in the human body that we depend on for our health (our bodies contain more bacterial cells than human cells by number).
“…advances in genetic engineering promise the expression of multiple Cry toxins in Bt-plants, known as gene pyramiding. Therefore, studies on non-target species are requirements of international protocols to verify the adverse effects of these toxins, ensuring human and environmental biosafety.
Due to its growing use in agricultural activities, Bt presence has already been detected in different environmental compartments such as soil and water. Consequently, the bioavailability of Cry proteins has increased, and for biosafety reasons their adverse effects might be studied, mainly for non-target organisms. Studies are therefore needed to evaluate Bt toxicity to non-target organisms; the persistence of Bt toxin and its stability in aquatic environments; and the risks to humans and animals exposed to potentially toxic levels of Bt through their diet.
Thus, we aimed to evaluate, in Swiss albino mice, the hematotoxicity and genotoxicity of four Bt spore-crystals…”
The scientists already knew that Bt toxin was very toxic and potentially deadly at levels above 270 milligrams per kilogram of body weight, so they instead tested doses of 27 mg/kg, 136 mg/kg and 270 mg/kg over one to seven days (each of the Cry toxins was separated out and tested individually to maximize accuracy and total information). It was clear right away that these Cry toxins were hemotoxic even at the lowest dose of 27 mg/kg administered just once: they had clearly damaged the blood, particularly the red blood cells. Both the quantity and the size of the erythrocytes (RBCs) were significantly reduced, as were overall levels of hemoglobin, the molecule that oxygen attaches to. All major RBC measures showed some level of damage at every dose and for every Cry protein, although there were clear variances between different proteins and dose levels for certain measures. The white blood cell count was also noticeably raised, and as expected it increased dramatically with the duration of exposure. The tests clearly demonstrated that Cry proteins were cytotoxic to bone marrow cells, accounting for a portion of the measured effects. It should also be noted that a previous study found that these proteins caused hemolysis (they killed blood cells) in vitro, apparently targeting the cell membranes of red blood cells in particular.
“Cry1Ab (the protein produced in common Bt corn and soy) induced microcytic hypochromic anemia in mice, even at the lowest tested dose of 27 mg/Kg, and this toxin has been detected in blood of non-pregnant women, pregnant women and their fetuses in Canada, supposedly exposed through diet. These data, as well as increased bioavailability of these MCA in the environment, reinforce the need for more research, especially given that little is known about spore crystals’ adverse effects on non-target species.”
While Bt toxin is not known to bioaccumulate in fat cells and internal organs, it is notable that the study clearly demonstrated a significant increase in measurable negative effects of the toxin as time progressed, especially at the higher doses. Also of note was an increased inflammatory response; while quite minor, the scientists consider it statistically significant given the particulars of their chosen test subjects’ biology. No measurable genotoxicity was found.
The full results of the study and a more detailed explanation, along with full citations for this article, can be found at: http://gmoevidence.com/wp-content/uploads/2013/05/JHTD-1-104.pdf
(NaturalNews) A taxpayer-funded government task force has issued new guidelines that literally urge healthy women to take toxic cancer drugs “preventively” in order to allegedly decrease their risk of developing breast cancer. As recently promoted by The New York Times (NYT), these shocking new recommendations from the U.S. Preventive Services Task Force (USPSTF) have been issued despite a complete lack of evidence that the dangerous cancer drugs being recommended have any preventive efficacy whatsoever.
Fortifying earlier recommendations from 2002 that encouraged both tamoxifen and raloxifene as so-called preventive breast cancer treatments, USPSTF now says that healthy women with either a personal or family history of breast cancer, or who are considered “high risk,” should consider taking either of the two drugs for at least five years, even though doing so could cause major side effects like blood clots or stroke. USPSTF is also now pressing doctors to begin actively prescribing such drugs to their healthy female patients, particularly those between the ages of 40 and 70.
The task force says it recently evaluated a host of new data on the subject of breast cancer prevention and determined that taking either tamoxifen or raloxifene while healthy may help block estrogen, a hormone that feeds roughly 75 percent of the breast cancers that women face today. The group estimates that for every 1,000 healthy women who take either of the two drugs, roughly eight of them will avoid developing breast cancer in the following five years.
Target group for preventive cancer therapy will not even benefit from it, admits report
However, as many as seven additional women per 1,000 taking tamoxifen or raloxifene will also admittedly develop blood clots during the same time frame, according to the report, while about four others per 1,000 will develop uterine cancer from the drugs. These figures represent a doubled risk of both conditions as a result of taking either of the two drugs preventively rather than doing nothing at all, which means millions of women are now at substantial health risk due to USPSTF’s recommendations.
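To put those per-1,000 figures side by side, here is a quick back-of-envelope tally. It is only an illustrative calculation using the numbers cited above, not data taken from the USPSTF report itself:

```python
# Quick tally of the per-1,000 figures cited above (illustrative only).
breast_cancers_avoided = 8       # per 1,000 healthy women taking the drugs
additional_blood_clots = 7       # per 1,000, over the same five years
additional_uterine_cancers = 4   # per 1,000, attributed to the drugs

serious_harms = additional_blood_clots + additional_uterine_cancers
print(f"Per 1,000 women treated: {breast_cancers_avoided} breast cancers avoided")
print(f"Per 1,000 women treated: {serious_harms} serious harms "
      f"({additional_blood_clots} blood clots + {additional_uterine_cancers} uterine cancers)")
```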
Beyond this, the very target group that USPSTF is now urging to take cancer drugs preventively is actually the least likely to derive any benefits from the “treatment.” As it turns out, the vast majority of healthy women considered to be at high risk of developing breast cancer will never develop breast cancer, according to the report. And most breast cancer cases occur in women who were never identified as being “high risk” in the first place, which makes USPSTF’s new recommendations laughable.
“Most women identified as ‘high risk’ will not develop breast cancer,” explains the report, which also duplicitously states that “high risk,” healthy women should be first in line to take the drugs preventively. “[T]he majority of breast cancer cases will arise in women who are not identified as having increased risk.”
Let your voice be heard: Comment on proposed new guidelines before May 13
The primary entity that will benefit from USPSTF’s obvious affront to common sense is the cancer industry, which will have the opportunity to sell its toxic cancer pills to a whole new market of healthy women who do not need them. The losers, of course, will be those gullible members of the public who fall for these ridiculous shenanigans.
But you can help stop the recommendations from gaining official status by making your voice heard on the published draft form. Public comments will be accepted until May 13, and you can leave them at the following link:
This may seem like the sort of statement usually delivered by an overblown narrator as rockets and lasers go zooming* by, but here goes: In the world of journalism, the future is now! Granted, it’s the kind of future that often makes waves in the present and raises at least as many questions as it answers, but if you wanted a bright, problem-free future, you’d have to travel back to the divergence point somewhere between Philip K. Dick and The Jetsons… and then eliminate the dystopians.
*Yes, I realize lasers don’t make noise or “zoom” by, but that hasn’t prevented George Lucas from becoming insanely rich, has it?
But you can’t, so here we are, discussing journalism… by robots! [INS FANFARE/LASER NOISES]
Journalist Ken Schwencke has occasionally awakened in the morning to find his byline atop a news story he didn’t write.
No, it’s not that his employer, The Los Angeles Times, is accidentally putting his name atop other writers’ articles. Instead, it’s a reflection that Schwencke, digital editor at the respected U.S. newspaper, wrote an algorithm — that then wrote the story for him.
Instead of personally composing the pieces, Schwencke developed a set of step-by-step instructions that can take a stream of data — this particular algorithm works with earthquake statistics, since he lives in California — compile the data into a pre-determined structure, then format it for publication.
His fingers never have to touch a keyboard; he doesn’t have to look at a computer screen. He can be sleeping soundly when the story writes itself.
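For readers curious what such an algorithm might look like, here is a minimal, hypothetical sketch of template-based story generation in the spirit of what Schwencke describes. The field names, template wording and sample values are invented for illustration; this is not the L.A. Times’ actual code.

```python
# A minimal, hypothetical sketch of template-driven story generation.
# Field names, thresholds and sample values are invented for illustration.

TEMPLATE = (
    "A magnitude {magnitude} earthquake struck {distance} miles from "
    "{place} at {time}, according to preliminary data. "
    "The quake occurred at a depth of {depth} miles."
)

def write_quake_story(event: dict) -> str:
    """Fill a pre-determined sentence structure with values from a data feed."""
    return TEMPLATE.format(
        magnitude=event["magnitude"],
        distance=event["distance_miles"],
        place=event["nearest_town"],
        time=event["local_time"],
        depth=event["depth_miles"],
    )

if __name__ == "__main__":
    sample_event = {  # stand-in for one record from an earthquake data stream
        "magnitude": 3.2,
        "distance_miles": 4,
        "nearest_town": "Westwood, California",
        "local_time": "6:25 a.m. Tuesday",
        "depth_miles": 5.6,
    }
    print(write_quake_story(sample_event))
```

The real pipeline would simply poll the data feed and push the formatted text into the publishing system, which is why the journalist can be asleep when the story appears.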
This isn’t exactly new news. (Then again, neither is the morning paper, but that’s a discussion for another time…) Algorithmic story generation has been around for a few years now, with Narrative Science leading the field. A couple of years ago, Narrative Science was the story, rather than just the automated recap. George Washington University’s website had covered a GWU baseball game with a longish recap that only got around to mentioning the opposing pitcher’s perfect game in the seventh (out of eight) paragraph. Observers speculated that a bot was behind this “missing the forest for the trees” recap. Narrative Science’s techies were highly offended and responded by producing two algorithmically generated recaps — one from the home team’s POV and one more neutral piece.
The first concern with robo-journalism is often expressed by the journalists themselves: are we getting pushed out?
Kristian Hammond, co-founder of Narrative Science, doesn’t see it that way.
This robonews tsunami, he insists, will not wash away the remaining human reporters who still collect paychecks. Instead the universe of newswriting will expand dramatically, as computers mine vast troves of data to produce ultracheap, totally readable accounts of events, trends, and developments that no journalist is currently covering.
This is somewhat echoed by L.A. Times reporter Schwencke, who sees the algorithmic output as a boon for busy journalists.
Schwencke says the use of algorithms on routine news tasks frees up professional reporters to make phone calls, do actual interviews, or dig through sophisticated reports and complex data, instead of compiling basic information such as dates, times and locations.
“It lightens the load for everybody involved,” he said.
Schwencke’s “bot” is rather simple, functioning best with a limited dataset and a minimum of formatting. Narrative Science’s output is a bit more complex, allowing customers to adjust the “slant” of the generated stories. Not only that, but the software can cop an attitude, if requested.
The Narrative Science team also lets clients customize the tone of the stories. “You can get anything, from something that sounds like a breathless financial reporter screaming from a trading floor to a dry sell-side researcher pedantically walking you through it,” says Jonathan Morris, COO of a financial analysis firm called Data Explorers, which set up a securities newswire using Narrative Science technology. (Morris ordered up the tone of a well-educated, straightforward financial newswire journalist.) Other clients favor bloggy snarkiness. “It’s no more difficult to write an irreverent story than it is to write a straightforward, AP-style story,” says Larry Adams, Narrative Science’s VP of product. “We could cover the stock market in the style of Mike Royko.”
This leads to the ethical quandary presented by the use of bots. Is robo-generated journalism really journalism, and is the use of algorithms a betrayal of readers’ trust, especially when a familiar name is on the byline? If factual errors are discovered, does the blame lie with the software, or with the journalist who agreed to let the article “write itself?”
The answer here isn’t simple (and the question likely isn’t even fully formed yet), but the key is transparency.
“People are already reading automated data reports that come to them, and they don’t think anything of it,” said Ben Welsh, a colleague of Schwencke’s at the Times.
Welsh says that responsibility for accuracy falls where it always has: with publications, and with individual journalists.
“The key thing is just to be honest and transparent with your readers, like always,” he said. “I think that whether you write the code that writes the news or you write it yourself, the rules are still the same.”
“You need to respect your reader. You need to be transparent with them, you need to be as truthful as you can… all the fundamentals of journalism just remain the same.”
Questions involving intellectual property are also raised, although they aren’t discussed in these articles. Who holds the copyright on the generated articles? In Schwencke’s case, those rights are likely retained by the L.A. Times. In the case of Narrative Science, ownership is probably defined by contractual terms with the end user, perhaps with copyright in the generated articles reverting to the end user once the contract is up.
Schwencke’s homebrewed algorithm is a different IP animal. If he switches papers, does he retain the rights to the “bot”? Or is that algorithm, developed while employed with the L.A. Times, considered a “work for hire,” and thus the paper’s property? Arguably, his algorithm is an extension of him, covering his area of expertise and designed to emulate his reporting. What if Schwencke generates a similar piece of software for his new employer? Would he be permitted to do this, or would this be prevented by additions to “non-compete” clauses? Is it patentable?
The more ubiquitous “robo-journalism” becomes, the more issues like these will arise. Hopefully, IP turf wars will remain at a minimum, allowing for the expansion of this promising addition to the journalist’s toolset. With bots handling basic reporting, journalists should be freed up to pursue the sort of journalism you can’t expect an algorithm to handle — longform, investigative, etc. This is good news for readers, even if they may find themselves a little unnerved (at first) by the journalistic uncanny valley.
Colorado, and apparently Texas next, are being targeted with an attempt to set up a federal-authority framework that would enable Secret Service agents (not just those guarding the president), along with others in the U.S. Secret Service including uniformed division officers, physical security technicians and specialists, and other ‘special officers’, to arrest and remove an elected sheriff who refuses to enforce the law (or to arrest anyone breaking the law).
The bills being introduced define ‘law’ as including any rule, regulation, executive order, court order, statute or constitutional provision.
Why are they doing this? Here’s why…
It would establish federal police powers within a state, creating an enforcement arm reporting directly to the president (the Secret Service).
It could potentially enable the president and executive branch to override the actions and preventive measures now being taken by many states throughout the country that are trying to preserve 2nd Amendment gun rights and are prohibiting the enforcement of unconstitutional laws passed by Congress or pushed by executive order.
As some of you may know, a growing list of sheriffs (more than 340 so far) across the country have expressed that they will not enforce a Washington mandate that clearly violates the Second Amendment.
Many State laws to preserve gun rights are gaining momentum. States include Montana, Ohio, Kentucky, Idaho, Louisiana, Oklahoma, Texas, Arizona, Michigan, Utah, and New Mexico.
However, in Colorado, Senate Bill SB-13-013 has evidently just passed the Senate, and will be heading on to its potential signing by the governor, giving police powers and arrest authority to the executive branch of federal government (Secret Service) within the State. In Texas a similar bill has just been introduced in the State legislature.
The president and Vice President Biden have been actively lobbying state legislatures and pushing for passage of the bills. Obama is scheduled to visit Colorado in just a few days. “Colorado is a pawn for the Obama-Biden plan,” and then it’s on to the next state… at least those that won’t fall into line.
Quoted from Rep. Lori Saine of Colorado, who says she believes the bill is intended to be used as a foundation for later legislation that will surrender still greater control to federal officials…
“There’ve been so many explanations for the reasons they really need this bill passed. So what is it really?” “I believe it is intended to be used for setting up a framework so that at some other time they could expand it to possibly include being able to arrest a sheriff who is refusing to enforce unconstitutional laws. They would justify it by saying that since we’ve already given the Secret Service this ability, why not give them just one more?”
It is a full court press by the executive branch of the federal government to empower itself even further by inserting itself as a police authority within the state, to eliminate opposition.
…thought you’d like to know
Some of the data for this report has been sourced from WND.
Researchers at The University of Auckland have proposed a new method for finding Earth-like planets, and they anticipate that the number will be in the order of 100 billion. The method uses gravitational microlensing, currently employed by a Japan-New Zealand collaboration called MOA (Microlensing Observations in Astrophysics) at New Zealand’s Mt John Observatory. Their work will appear in the Oxford University Press journal Monthly Notices of the Royal Astronomical Society.
Lead author Dr Phil Yock from the University of Auckland’s Department of Physics explains that the work will require a combination of data from microlensing and the NASA Kepler space telescope. “Kepler finds Earth-sized planets that are quite close to parent stars, and it estimates that there are 17 billion such planets in the Milky Way. These planets are generally hotter than Earth, although some could be of a similar temperature (and therefore habitable) if they’re orbiting a cool star called a red dwarf.”
“Our proposal is to measure the number of Earth-mass planets orbiting stars at distances typically twice the Sun-Earth distance. Our planets will therefore be cooler than the Earth. By interpolating between the Kepler and MOA results, we should get a good estimate of the number of Earth-like, habitable planets in the Galaxy. We anticipate a number in the order of 100 billion.”
“Of course, it will be a long way from measuring this number to actually finding inhabited planets, but it will be a step along the way.” The first planet orbiting a Sun-like star was not found until 1995, despite strenuous efforts by astronomers. Dr Yock explains that this reflects the difficulty of detecting from a distance a tiny non-luminous object like Earth orbiting a bright object like the Sun. The planet is lost in the glare of the star, so indirect methods of detection must be used. Whereas Kepler measures the loss of light from a star when a planet orbits between us and the star, microlensing measures the deflection of light from a distant star that passes through a planetary system en route to Earth – an effect predicted by Einstein in 1936. In recent years, microlensing has been used to detect several planets as large as Neptune and Jupiter.
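For reference, the strength of a microlensing event is usually characterised by the Einstein radius; the standard textbook expression is quoted below for context (it is not taken from the Auckland paper itself). A planet orbiting the lensing star shows up as a brief extra perturbation in the magnified light curve.

$$\theta_E = \sqrt{\frac{4GM}{c^{2}}\,\frac{D_{LS}}{D_{L}\,D_{S}}}$$

Here M is the mass of the lensing star, and D_L, D_S and D_LS are the observer-lens, observer-source and lens-source distances; the smaller the planet's contribution, the shorter and subtler the deviation, which is why dense, continuous monitoring is needed.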
Dr Yock and colleagues have proposed a new microlensing strategy for detecting the tiny deflection caused by an Earth-sized planet. Simulations carried out by Dr Yock and his colleagues – students and former students from The University of Auckland and France – showed that Earth-sized planets could be detected more easily if a worldwide network of moderate-sized, robotic telescopes was available to monitor them.
Coincidentally, just such a network of 1m and 2m telescopes is now being deployed by Las Cumbres Observatory Global Telescope Network (LCOGT) in collaboration with SUPA/St Andrews (Scottish Universities Physics Alliance), with three telescopes in Chile, three in South Africa, three in Australia, and one each in Hawaii and Texas. This network is used to study microlensing events in conjunction with the Liverpool Telescope in the Canary Islands which is owned and operated by Liverpool John Moores University.
It is expected that the data from this suite of telescopes will be supplemented by measurements using the existing 1.8m MOA telescope at Mt John, the 1.3m Polish telescope in Chile known as OGLE, and the recently opened 1.3m Harlingten telescope in Tasmania.
For more information: the new work will appear in Monthly Notices of the Royal Astronomical Society (Oxford University Press): dx.doi.org/10.1093/mnras/stt318
The Daily Galaxy via Royal Astronomical Society
Big brother to log your drinking habits and waist size as GPs are forced to hand over confidential records
GPs are to be forced to hand over confidential records on all their patients’ drinking habits, waist sizes and illnesses.
The files will be stored in a giant information bank that privacy campaigners say represents the ‘biggest data grab in NHS history’.
They warned the move would end patient confidentiality and hand personal information to third parties.
The data includes weight, cholesterol levels, body mass index, pulse rate, family health history, alcohol consumption and smoking status.
Diagnosis of everything from cancer to heart disease to mental illness would be covered. Family doctors will have to pass on dates of birth, postcodes and NHS numbers.
Officials insisted the personal information would be made anonymous and deleted after analysis.
Read more: http://www.dailymail.co.uk/news/article-2272166/Big-brother-log-drinking-habits-waist-size.html#ixzz2JlSGvuMw
The government is once again promoting the idea that “those who are reverent of individual liberty” are terrorists, with a new study funded by the Department of Homeland Security.
The study and related data were recently produced by the National Consortium for the Study of Terrorism and Responses to Terrorism, or START, at the University of Maryland. START was launched with a $12 million grant from DHS and is recognized by the organization as one of its “Centers for Excellence.” In December, DHS announced it was renewing START’s funding with another $3.6 million.
Now that the election is over, the propaganda media can ease off on burying those critical stories that they couldn’t be bothered to report in the lead-up to the re-election of President Obama.
What are we talking about?
For starters, the Federal Reserve’s recent report, which received nary a comment from the political and financial pundits on television.
While the economy was on the minds of most voters last night, what they didn’t know may have very well swung the election to one candidate over another.
And this particular tidbit of data is as important as it gets when we’re talking about economic health:
Here’s an interesting new data point that the St Louis Fed has put together to calculate recession probabilities:
“Recession probabilities for the United States are obtained from a dynamic-factor markov-switching model applied to four monthly coincident variables: non-farm payroll employment, the index of industrial production, real personal income excluding transfer payments, and real manufacturing and trade sales. “
What’s interesting about this index is the current reading. At 20%, the index is at a level that has ALWAYS been followed by a recession. As you can see below, the index has never approached 20% without a subsequent recession. All 6 recessions since 1967 have coincided with 20%+ readings in the US Recession Probabilities index.
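As a rough illustration of the threshold logic being described, here is a minimal sketch that flags any reading at or above the 20% level the index has historically never reached without a recession following. The monthly values below are made up for illustration; the code does not download the actual St. Louis Fed series.

```python
# Minimal sketch of the 20% threshold check described above (illustrative only).
RECESSION_THRESHOLD = 0.20  # the 20% level the article highlights

def months_above_threshold(series, threshold=RECESSION_THRESHOLD):
    """Return the (month, probability) pairs at or above the threshold."""
    return [(month, p) for month, p in series if p >= threshold]

if __name__ == "__main__":
    hypothetical_series = [  # made-up monthly recession probabilities
        ("2012-05", 0.04),
        ("2012-06", 0.07),
        ("2012-07", 0.12),
        ("2012-08", 0.20),
    ]
    for month, prob in months_above_threshold(hypothetical_series):
        print(f"{month}: recession probability {prob:.0%} -- at or above 20%")
```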
It’s no secret that the economy is still hurting. According to this report, we are on the verge of another recession in the midst of a broader ‘depression.’ Contrarian analysts have already suggested this is the case, with many saying we’ve been in recession since at least the summer.
Moreover, if you look at the real numbers behind the numbers, like the rate of real inflation, and bounce those against this purported economic growth, you may be surprised to find that we never exited the recession!
Look at the chart below. You see that red line? That’s the government’s official GDP, a measure of economic growth. The government shows it in positive territory, and it’s been heralded without question by the mainstream machine as proof of an economic recovery.
Now look at the blue line. That’s the unofficial GDP as calculated by economist John Williams using algorithms that account for distortions in the way government calculates inflation.
A recession, as defined by most traditional measures of economics, is a period of two consecutive quarters with negative economic growth.
That’s right — this whole time during which millions were losing jobs and homes, and as food stamp usage doubled, we have been in recession. That’s over four years now.
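As a rough sketch of the reasoning above: real growth is approximately nominal growth minus inflation, so a higher estimate of inflation can flip reported growth negative and trigger the two-consecutive-quarters rule. The numbers below are purely hypothetical, not John Williams’ actual figures or official data.

```python
# Sketch: real growth ~ nominal growth minus inflation, plus the
# two-consecutive-negative-quarters recession rule. Numbers are hypothetical.

def real_growth(nominal_growth, inflation):
    """Approximate annualized real growth as nominal growth minus inflation (%)."""
    return nominal_growth - inflation

def in_recession(quarterly_real_growth):
    """Two consecutive quarters of negative real growth -> recession."""
    return any(
        a < 0 and b < 0
        for a, b in zip(quarterly_real_growth, quarterly_real_growth[1:])
    )

if __name__ == "__main__":
    nominal = [3.5, 3.8, 4.0, 3.6]           # hypothetical quarterly nominal growth, %
    official_cpi = [2.0, 2.2, 2.1, 2.0]      # hypothetical official inflation, %
    alternative_cpi = [5.5, 5.8, 6.0, 5.7]   # hypothetical alternative inflation, %

    official = [real_growth(n, i) for n, i in zip(nominal, official_cpi)]
    alternative = [real_growth(n, i) for n, i in zip(nominal, alternative_cpi)]

    print("Official-style real growth:", official, "recession?", in_recession(official))
    print("Alternative real growth:  ", alternative, "recession?", in_recession(alternative))
```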
But did we really need a report from the Federal Reserve to confirm that for us?
On another (related) note, stock markets are down over 300 points as of this writing. Apparently Europe is in shambles (again).
It seems this is how financial markets around the world are celebrating the re-election of a President who has presided over the largest cumulative debt increase in the nation’s history.
Now that the election is over, we can return to our regularly scheduled programming.
Costas Vaxevanis, a 46-year-old veteran television journalist who now publishes a magazine, has insisted he was doing his job and accused ministers responsible for vetting the list for possible tax evasion of doing nothing for two years.
“We will endure this. Will they?” Vaxevanis tweeted ahead of the trial.
Vaxevanis, who was arrested on Sunday, was charged with breach of privacy and had faced a maximum three-year prison sentence if convicted.
Calling for his conviction, the prosecutor said: “You have publicly ridiculed a series of people, you have delivered these people to a society that is thirsty for blood.”
“The solution to the problems that the country is facing is not cannibalism,” added the prosecutor.
But after 12 hours of trial, the court acquitted Vaxevanis.
The ruling was met with applause, while a visibly emotional Vaxevanis thanked the judge.
Several media workers had testified on behalf of Vaxevanis, including the head of the International Federation of Journalists, Jim Boumelha, who called the trial an “absurd farce”.
“Colleagues from all over the world will be keeping an eye on this. If something happens to Costas, we will gather all of the forces that we have got, wherever we are, to campaign for his release,” he told reporters ahead of the verdict.
The head of the Athens union of journalists, Dimitris Trimis, also took the stand.
“I would have done the same thing,” Trimis told the court, according to excerpts posted on a blog operated by Vaxevanis.
“A bank account is not personal data, we live in an era of transparency,” Trimis said.
A radical leftist lawmaker whose father is on the journalist’s legal team denounced the case as a “blow to democracy”.
Amnesty International’s deputy program director for Europe and Central Asia, Marek Marczynski, said ahead of the ruling that it was “deeply troubling” that Vaxevanis is facing charges “for disclosing information in the public interest”.
“This step increases the risk that other journalists will censor themselves and refrain from legitimate criticism of the government to avoid prosecution,” he said.
Vaxevanis has accused the Greek state of hypocrisy and says the justice system is bowing to a corrupt political system.
“Our politicians declare themselves to be democrats. I see no evidence of this,” he wrote in Britain’s The Times newspaper on Thursday.
“I wonder if Greek justice will show that the law safeguards the public interest and freedom of speech… in journalism you must do what you think is right without worrying about the consequences,” he wrote.
The journalist has also accused Greek media of burying the story.
Vaxevanis’ “Hot Doc” magazine on Saturday published the names of more than 2,000 Greeks, allegedly from a controversial list of HSBC account holders that was originally leaked by a bank employee and passed to Greece in 2010 by France’s then finance minister Christine Lagarde, who is now IMF chief.
Greek authorities took no action given that the list was considered stolen data that could not be used against potential tax evaders.
When the case resurfaced last month, it took several days for officials to even locate a copy of the so-called “Lagarde List”.
Among those named are prominent businessmen, shipowners, lawyers, doctors, journalists and a former minister, as well as companies, housewives and students, although no deposit sums were published.
The data has been the subject of intense discussion, with the government facing calls to use it to crack down on potential tax cheats as the country grapples with a massive debt crisis.
On Thursday, a special economic prosecutor asked parliament to investigate whether previous finance ministers could be faulted for failing to take action on the list, media reports said.
Evangelos Venizelos, the leader of the socialist Pasok party and a former finance minister, told a parliament committee that he had ordered the finance ministry’s tax police to investigate, a claim which the department’s chief at the time denies.
Ex-finance minister George Papaconstantinou, the first recipient of the data in 2010, said he did not know what had happened to the original version of the list, raising speculation that it could have been tampered with.
Current finance chief Yannis Stournaras has asked France to re-send the list.
Vaxevanis says he got the information in an anonymous letter whose sender claimed to have received it from a politician.
On Wednesday, police arrested another journalist who claimed to have in his possession a list of finance ministry documents allegedly stolen by hackers from the state general accounting office.
Editor’s note: For years we have heard that global warming is going to destroy our planet, and I, along with many others, have said, “Well, the numbers don’t reveal this, and the methods of the global warming shills at the University of East Anglia were flawed and not to be trusted.”
Al Gore is heavily invested in the carbon credit scam and Ponzi scheme. Phil Jones and the Climatic Research Unit at East Anglia have been completely discredited.
This is not to say that our global resources are not horribly mismanaged by people who get rich off this mismanagement while regular people starve and learn to do more with less.
It is time for a global awakening, time for 6 billion peaceful revolutions.
I will provide a link to a very interesting article from the Daily Mail in the UK.
The world stopped getting warmer almost 16 years ago, according to new data released last week.
The figures, which have triggered debate among climate scientists, reveal that from the beginning of 1997 until August 2012, there was no discernible rise in aggregate global temperatures.
This means that the ‘plateau’ or ‘pause’ in global warming has now lasted for about the same time as the previous period when temperatures rose, 1980 to 1996. Before that, temperatures had been stable or declining for about 40 years.
The new data, compiled from more than 3,000 measuring points on land and sea, was issued quietly on the internet, without any media fanfare, and, until today, it has not been reported.
This stands in sharp contrast to the release of the previous figures six months ago, which went only to the end of 2010 – a very warm year.
Ending the data then means it is possible to show a slight warming trend since 1997, but 2011 and the first eight months of 2012 were much cooler, and thus this trend is erased.
Some climate scientists, such as Professor Phil Jones, director of the Climatic Research Unit at the University of East Anglia, last week dismissed the significance of the plateau, saying that 15 or 16 years is too short a period from which to draw conclusions.
Others disagreed. Professor Judith Curry, who is the head of the climate science department at America’s prestigious Georgia Tech university, told The Mail on Sunday that it was clear that the computer models used to predict future warming were ‘deeply flawed’.
Even Prof Jones admitted that he and his colleagues did not understand the impact of ‘natural variability’ – factors such as long-term ocean temperature cycles and changes in the output of the sun. However, he said he was still convinced that the current decade would end up significantly warmer than the previous two.
Disagreement: Professor Phil Jones, from the University of East Anglia, dismissed the significance of the plateau. Professor Judith Curry, from Georgia Tech university in America, disagreed, saying the computer models used to predict future warming were ‘deeply flawed’
The regular data collected on global temperature is called HadCRUT 4, as it is jointly issued by the Met Office’s Hadley Centre and Prof Jones’s Climatic Research Unit.
Since 1880, when worldwide industrialisation began to gather pace and reliable statistics were first collected on a global scale, the world has warmed by 0.75 degrees Celsius.
Some scientists have claimed that this rate of warming is set to increase hugely without drastic cuts to carbon-dioxide emissions, predicting a catastrophic increase of up to a further five degrees Celsius by the end of the century.
The new figures were released as the Government made clear that it would ‘bend’ its own carbon-dioxide rules and build new power stations to try to combat the threat of blackouts.
At last week’s Conservative Party Conference, the new Energy Minister, John Hayes, promised that ‘the high-flown theories of bourgeois Left-wing academics will not override the interests of ordinary people who need fuel for heat, light and transport – energy policies, you might say, for the many, not the few’ – a pledge that has triggered fury from green activists, who fear reductions in the huge subsidies given to wind-turbine firms.
Flawed science costs us dearly
Here are three not-so-trivial questions you probably won’t find in your next pub quiz. First, how much warmer has the world become since a) 1880 and b) the beginning of 1997? And what has this got to do with your ever-increasing energy bill?
You may find the answers to the first two surprising. Since 1880, when reliable temperature records began to be kept across most of the globe, the world has warmed by about 0.75 degrees Celsius.
From the start of 1997 until August 2012, however, figures released last week show the answer is zero: the trend, derived from the aggregate data collected from more than 3,000 worldwide measuring points, has been flat.
Not that there has been any coverage in the media, which usually reports climate issues assiduously, since the figures were quietly released online with no accompanying press release – unlike six months ago when they showed a slight warming trend.
The answer to the third question is perhaps the most familiar. Your bills are going up, at least in part, because of the array of ‘green’ subsidies being provided to the renewable energy industry, chiefly wind.
They will cost the average household about £100 this year. This is set to rise steadily higher – yet it is being imposed for only one reason: the widespread conviction, which is shared by politicians of all stripes and drilled into children at primary schools, that, without drastic action to reduce carbon-dioxide emissions, global warming is certain soon to accelerate, with truly catastrophic consequences by the end of the century – when temperatures could be up to five degrees higher.
Hence the significance of those first two answers. Global industrialisation over the past 130 years has made relatively little difference.
And with the country committed by Act of Parliament to reducing CO2 by 80 per cent by 2050, a project that will cost hundreds of billions, the news that the world has got no warmer for the past 16 years comes as something of a shock.
It poses a fundamental challenge to the assumptions underlying every aspect of energy and climate change policy.
This ‘plateau’ in rising temperatures does not mean that global warming won’t at some point resume.
But according to increasing numbers of serious climate scientists, it does suggest that the computer models that have for years been predicting imminent doom, such as those used by the Met Office and the UN Intergovernmental Panel on Climate Change, are flawed, and that the climate is far more complex than the models assert.
‘The new data confirms the existence of a pause in global warming,’ Professor Judith Curry, chair of the School of Earth and Atmospheric Science at America’s Georgia Tech university, told me yesterday.
‘Climate models are very complex, but they are imperfect and incomplete. Natural variability [the impact of factors such as long-term temperature cycles in the oceans and the output of the sun] has been shown over the past two decades to have a magnitude that dominates the greenhouse warming effect.
‘It is becoming increasingly apparent that our attribution of warming since 1980 and future projections of climate change needs to consider natural internal variability as a factor of fundamental importance.’
Professor Phil Jones, director of the Climatic Research Unit at the University of East Anglia, who found himself at the centre of the ‘Climategate’ scandal over leaked emails three years ago, would not normally be expected to agree with her. Yet on two important points, he did.
The data does suggest a plateau, he admitted, and without a major El Nino event – the sudden, dramatic warming of the southern Pacific which takes place unpredictably and always has a huge effect on global weather – ‘it could go on for a while’.
Like Prof Curry, Prof Jones also admitted that the climate models were imperfect: ‘We don’t fully understand how to input things like changes in the oceans, and because we don’t fully understand it you could say that natural variability is now working to suppress the warming. We don’t know what natural variability is doing.’
Yet he insisted that 15 or 16 years is not a significant period: pauses of such length had always been expected, he said.
Yet in 2009, when the plateau was already becoming apparent and being discussed by scientists, he told a colleague in one of the Climategate emails: ‘Bottom line: the “no upward trend” has to continue for a total of 15 years before we get worried.’
But although that point has now been passed, he said that he hadn’t changed his mind about the models’ gloomy predictions: ‘I still think that the current decade which began in 2010 will be warmer by about 0.17 degrees than the previous one, which was warmer than the Nineties.’
Only if that did not happen would he seriously begin to wonder whether something more profound might be happening. In other words, though five years ago he seemed to be saying that 15 years without warming would make him ‘worried’, that period has now become 20 years.
Meanwhile, his Met Office colleagues were sticking to their guns. A spokesman said: ‘Choosing a starting or end point on short-term scales can be very misleading. Climate change can only be detected from multi-decadal timescales due to the inherent variability in the climate system.’
He said that for the plateau to last any more than 15 years was ‘unlikely’. Asked about a prediction that the Met Office made in 2009 – that three of the ensuing five years would set a new world temperature record – he made no comment. With no sign of a strong El Nino next year, the prospects of this happening are remote.
Why all this matters should be obvious. Every quarter, statistics on the economy’s output and models of future performance have a huge impact on our lives. They trigger a range of policy responses from the Bank of England and the Treasury, and myriad decisions by private businesses.
Yet it has steadily become apparent since the 2008 crash that both the statistics and the modelling are extremely unreliable. To plan the future around them makes about as much sense as choosing a wedding date three months hence on the basis of a long-term weather forecast.
Few people would be so foolish. But decisions of far deeper and more costly significance than those derived from output figures have been and are still being made on the basis of climate predictions, not of the next three months but of the coming century – and this despite the fact that Phil Jones and his colleagues now admit they do not understand the role of ‘natural variability’.
The most depressing feature of this debate is that anyone who questions the alarmist, doomsday scenario will automatically be labelled a climate change ‘denier’, and accused of jeopardising the future of humanity.
So let’s be clear. Yes: global warming is real, and some of it at least has been caused by the CO2 emitted by fossil fuels. But the evidence is beginning to suggest that it may be happening much slower than the catastrophists have claimed – a conclusion with enormous policy implications.
Read more: http://www.dailymail.co.uk/sciencetech/article-2217286/Global-warming-stopped-16-years-ago-reveals-Met-Office-report-quietly-released–chart-prove-it.html#ixzz29N2LmkUv