
Al Gore Invented The Internet

January 22, 2012 by staff 

For over 350 years, some of the greatest minds in mathematics struggled to prove what is known as Fermat’s Last Theorem: the claim that a certain simple equation has no whole-number solutions.
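For readers who have not seen it, the theorem itself is simple to state (this formulation is standard, and not part of the original post):

```latex
\[
  a^n + b^n = c^n \quad \text{has no solutions in positive integers } a,\ b,\ c
  \text{ for any integer } n > 2.
\]
```

For n = 2 there are infinitely many solutions (the Pythagorean triples, such as 3² + 4² = 5²); Fermat’s claim is that none exist once the exponent rises above 2. The claim, scribbled by Fermat in a margin in 1637, was finally proved by Andrew Wiles in the 1990s.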

Imagine if I told you that I had just solved Fermat’s Last Theorem, but was keeping the proof to myself for proprietary reasons. “I’m an expert, take my word for it!” Of course, blind faith is not how mathematics or science works. The hard sciences are infused with mathematics, and this extends to the medical, engineering, and industrial disciplines. Standards of professional conduct have broken down, however, in Climate Science.

Since the Enlightenment, scientific progress has relied upon the publication of logical and empirical “proofs” for new theories and theorems. This practice disseminates new knowledge and enables peer review. Specialists in the field have the greatest interest in reading particular publications; they potentially have the most to gain from the new learning, and they will likely have the most relevant things to say in response. Honest scholars seek the truth and are happy when an error in the record is corrected. However, it is not necessarily known a priori who among the readership is most expert, or who will deliver the most cogent replies, whether concurring or differing with a proposed new theory. Therefore true scientists seek the widest possible audience for their work. In an intellectual process akin to crowd-sourcing, the most reliable, truthful theories emerge over time.

In our illustrative case:

“Despite large prizes being offered for a solution, Fermat’s Last Theorem remained unsolved. It has the dubious distinction of being the theorem with the largest number of published false proofs.”

So we see that publication of a theorem’s proof cannot, by itself, be construed as passing peer review or qualify the result to be recognized widely as true. Peer review is essentially a two-step process: 1) journal editors review submissions for rejection or publication; and then 2) the readership weighs in on the subject matter, offering criticism in letters to the editors and in follow-on publications. In this manner, errors in the published false proofs of Fermat’s Last Theorem were identified: errors that had escaped the attention of the editors. This process occurs constantly in all scientific disciplines.

Quality control of this nature has been subverted in Climate Science. First, editorial peer review was corrupted, as exposed in the Climategate e-mails. What is revealed therein is an IPCC cabal covertly pressuring the editors of various journals to exclude authors who did not toe the party line, in a repellent manner reminiscent of a claque of high school cheerleaders rigging the election for prom queen. In sum, they reveal themselves to be wholly unserious about their obligations to open scholarly debate, teaching, and learning.

Second, we learn that the proprietary “greenhouse gas” climate data and models behind the theory of Anthropogenic Global Warming — computer codes and measurements that serve essentially as the mathematical and empirical proofs of “Climate Crisis” — have NOT been fully published. The US Dept. of Energy is complicit in this reprehensible circumstance, which is at odds with the scholarly traditions of integrity, honesty and openness.

Third, according to the Government Accountability Office (GAO), the existing network of ground-level atmospheric temperature measurement stations (weather stations) has been largely mismanaged by the cognizant agency, the National Oceanic and Atmospheric Administration (NOAA), according to its own standards. In the report entitled “CLIMATE MONITORING NOAA Can Improve Management of the U.S. Historical Climatology Network” we learn on page 14 that “Close to Half of USHCN Stations Do Not Meet NWS Siting Standards” (NWS = National Weather Service).

Chart 1 is reproduced from another GAO report; it indicates that the Federal Government spends $3–5 billion per year on “Climate Change,” yet an adequate temperature measurement network cannot be completed.

Chart 1 reproduced from “CLIMATE CHANGE Federal Reports on Climate Change Funding Should Be Clearer and More Complete” GAO Report Aug 2005 http://www.gao.gov/new.items/d05461.pdf

In light of Chart 1, let’s examine some of the deficiencies in temperature measurement uncovered by the GAO in the report cited previously. First, we learn on page 4 that NOAA relies on “Volunteer observers at the stations [who] generally record daily maximum and minimum temperatures and 24-hour precipitation totals and submit the data to NWS over the telephone, by Internet, or by mail.” Let’s ponder that for a moment: $3–5 billion spent per year, yet the essential temperature measurements are collected by volunteers! And some of the data is sent by mail! Another deficiency, cited on page 18, is as follows:

“Limitations due to temperature-measuring equipment. The use of temperature-measuring equipment that is connected by a cable to an indoor readout device can require installing equipment closer to buildings than specified in the standards, according to our survey. Weather forecast office staff must dig trenches for the cables, and paved surfaces such as sidewalks and driveways, as well as the cost of cable for trenching, can limit the length of trenches and consequently the ability to locate stations so that they adhere to the siting standards. According to data from NCDC, about three-quarters of stations in the USHCN use such equipment.”


_________________________________________
If you have any questions regarding this post, you can contact us at

usspost@gmail.com

Disclaimer: The views expressed on this site are those of the authors and not necessarily those of U.S.S.POST.
