Here is the question that guides my research...

The greed displayed by supranational corporations and certain oligarchic groups, together with their control of natural resources, is steering humanity toward a new feudal-style world order, or even toward its ruin. Faced with this situation, human beings are invited to refuse to accept at face value the pseudo-truths conveyed by media that may be in the pay of these corporations and groups. They are invited instead to engage in a process of discernment and awareness, so as to build their own truth by confronting their necessarily subjective reality with objective data, such as those revealed by science.

The penalty that good men pay for not being interested in politics is to be governed by men worse than themselves. - Plato

Wednesday, December 26, 2012

Have we lost all critical thinking?

Today on Radio-Canada.ca, as on most mass-media outlets, we read the following news item:

  1. A study published by Ohio State University reveals that West Antarctica is warming almost twice as fast as previously thought.
  2. Researchers found that the average annual temperature recorded at the American station on Marie Byrd Land had risen by 2.4 degrees Celsius since the 1950s, three times the global average.
  3. This increase heightens concerns about the accelerated melting of polar ice and the resulting rise in sea level. Melting of all the ice covering West Antarctica could raise sea level by at least 3.30 metres.
  4. A group of experts reporting to the United Nations estimates that sea level could rise by 18 cm to nearly 60 cm this century, without taking into account accelerated ice melt in Antarctica and Greenland. Over the 20th century, sea level rose by about 20 cm.
What does this news item tell us, and what do we not know?
1. We have a study from Ohio State University
  • Has it been peer-reviewed?
  • Who are the authors?
  • Do we have the sources of the information?
  • What measurement method was used?
2. The average temperature has risen by 2.4 degrees Celsius since 1950, three times the global average
  • How was this average temperature measured?
  • Three times the global average, but which average? Several exist.
  • What influences these measurements, and what is their quality?
3. This increase heightens concerns... sea level... melting of all the ice... a 3.3-metre rise.
  • Here, from the information in the previous paragraph, we are to conclude that something worrying is happening and that sea level could climb by 3.3 metres?
  • What is the average temperature of this continent?
  • How long would the ice take to melt? It is 1.7 km thick.

4. UN expert group... sea level rising 18 to 60 cm... a 20 cm rise over the 20th century

  • Which expert group? The IPCC?
  • 18 to 60 cm: an enormous range. Why so much imprecision?
  • 20th century, 20 cm: where do these figures come from?

I do not claim to have all the answers, but I have a few questions and observations of my own. My impression is that the vast majority of people will simply read this "news" and conclude that man is responsible and that we must act quickly.

To begin, let us see whether we can find some additional information.
More information is available in English on this site. Here is a free translation of a few interesting points:

  • Of course, Antarctic warming is a relative concept. West Antarctica remains a very cold place, with annual average temperatures in the centre of the ice sheet near minus 45 degrees Celsius.
    • It will take a long time to get above zero and melt 1.7 km of ice at a warming rate of 1-2 degrees per 100 years.
  • To try to shed light on the question, David H. Bromwich of Ohio State University assembled a team centred on a single temperature record. At an isolated outpost called Byrd Station, in west-central Antarctica, people and automated equipment have tracked temperature and other weather variables since the 1950s.
    • A scientific study based on a single measurement point! Not very reliable.
  • It is by far the longest temperature record in the region, but it has intermittent gaps and other problems that have led some researchers to distrust it. The Bromwich group decided to try to salvage the Byrd data.
    • So we have only one temperature record, and it is intermittent and has other problems? The quality of these data is not improving!
  • Assuming the research holds up, it suggests that the 2009 paper, far from overestimating West Antarctic warming, had probably underestimated it, especially in summer.
    • So we have a piece of research that has evidently not been validated, yet it travelled around the planet in less than 24 hours as established truth?
How can we conclude that a large region of Antarctica is warming from a single measurement point, when that point is not even reliable and is full of gaps and problems?
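
To put a rough number on the melt question raised above, here is a back-of-envelope Python sketch. The 1.7 km thickness is the figure quoted above; the density and latent heat of fusion are standard values; the sustained net melt flux of 1 W/m² is purely an illustrative assumption, and the energy needed to first warm the ice up to 0 °C (roughly another quarter on top) is ignored:

    # Energy needed to melt a 1.7 km thick column of ice, per square metre
    rho_ice = 917.0        # kg/m^3, density of glacier ice
    latent_heat = 334e3    # J/kg, latent heat of fusion of water
    thickness = 1700.0     # m, ice thickness quoted in the article

    energy_per_m2 = rho_ice * thickness * latent_heat   # about 5.2e11 J/m^2
    melt_flux = 1.0                                      # W/m^2, illustrative net melt flux
    years = energy_per_m2 / melt_flux / (365 * 24 * 3600)
    print(f"{years:,.0f} years at {melt_flux} W/m^2")    # roughly 16,500 years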

Another important point to consider is that this region of Antarctica sits on an active volcano! Yes, in that very region, Marie Byrd Land.

As far as I know, when there is an active volcano, there is normally more heat in the vicinity... So perhaps we should carry out more thorough and precise studies of this region of the world, and have our research validated, before concluding anything?

One more point: the planet has been warming since the last ice age, which is entirely natural and part of the planet's great cycles... So keep your critical thinking when you read this kind of news.

And here are a few other references:
Sea level:
Ice data:


Monday, October 8, 2012

Gentilly-2 or wind turbines: a quick back-of-envelope calculation

The PQ and Hydro-Québec are closing Gentilly-2 and promoting wind power. Is this an economic choice?

According to the mayor of Trois-Rivières, Yves Lévesque, Hydro will pay 14 ¢/kWh for wind power, and pay it to private interests. Kruger's wind project in Montérégie signed with Hydro at 10.55 ¢/kWh.

So, a quick calculation: to replace Gentilly's 625 MW of production for 30 years, how much will it cost Quebec?

625 MW = 625,000 kW @ 14 ¢/kWh = $87,500 per hour of production. Over 30 years, we multiply by 24 h/day × 365 days/year × 30 years = $22,995,000,000, so about 23 billion dollars over 30 years! Compare that with a $4.3 billion refurbishment of Gentilly, and well-paid jobs in an institution that remains public.

So over 30 years, we would pay a premium of 23 − 4.3 = $18.7 billion to private interests for our electricity? This surprises me, coming from the PQ.
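
Here is a minimal Python sketch of the arithmetic above; the prices and the refurbishment figure are the ones quoted in this post, and, like the original calculation, it assumes continuous production at full capacity:

    # Cost of replacing Gentilly-2's output with wind power at 14 cents/kWh
    power_kw = 625_000          # 625 MW expressed in kW
    price_per_kwh = 0.14        # wind purchase price quoted by the mayor, $/kWh
    hours = 24 * 365 * 30       # 30 years of production

    wind_cost = power_kw * price_per_kwh * hours    # $22,995,000,000
    refurb_cost = 4.3e9                             # Gentilly-2 refurbishment estimate
    print(f"Wind over 30 years: ${wind_cost / 1e9:.1f} billion")
    print(f"Premium over refurbishment: ${(wind_cost - refurb_cost) / 1e9:.1f} billion")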

If I am wrong, let me know in a comment.


Sunday, September 30, 2012

Gentilly-2: some information and analysis

The Gentilly-2 nuclear power station needs to be refurbished.

Several debates are under way in the public arena about the real cost of such a refurbishment. The most recent figure, from Le Devoir, puts the cost at $4.3 billion. I am eager to read the report and learn why the cost went from $1.8 billion to $4.3 billion in a few years.

At $4.3 billion, according to Le Devoir, this works out to 9.7 ¢/kWh produced. According to the article, this cost:
is still too much to ensure the profitability of the nuclear plant.

By comparison, the projected cost per kilowatt-hour of the La Romaine complex, currently under construction, is 6.2 cents. Some even consider that unit cost too high to ensure the profitability of the project.
It is interesting to note that private wind-farm projects have a guaranteed purchase price for their electricity of 10.55 ¢/kWh or more. Here is an excerpt from the website of Kruger's wind project in Montérégie.

What is the price per kWh sold to Hydro-Québec Distribution?

The electricity price set in the contract is $105.50/MWh, which is equivalent to 10.55 ¢/kWh. This value was established on January 1, 2007 and will be indexed essentially to inflation (the price formula can be consulted in the electricity purchase contract available in the Contracts section of this website).
So for Hydro-Québec and the political class, it is acceptable to finance the private sector at 10.55 ¢/kWh, but not public job creation at 9.7 ¢/kWh? Is this favoritism toward private interests? Answers are needed.

Moreover, wind farms have a capacity factor of 28% to 35% according to the private companies; the real figures are closer to 25%. This compares with more than 90% for nuclear plants. It means that Hydro-Québec must compensate for 75% of this power, on an intermittent and unpredictable schedule. The grid's ability to compensate for intermittent sources is limited, and surely carries a cost for the operator.
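
To see what these capacity factors mean in practice, here is a minimal sketch comparing the annual energy delivered by the same nameplate capacity; the 25% and 90% factors are the ones quoted above, and the 625 MW figure is Gentilly-2's capacity, used here purely for illustration:

    # Annual energy delivered by 625 MW of nameplate capacity at each capacity factor
    nameplate_mw = 625
    hours_per_year = 24 * 365

    for source, capacity_factor in [("wind", 0.25), ("nuclear", 0.90)]:
        energy_gwh = nameplate_mw * capacity_factor * hours_per_year / 1000
        print(f"{source}: {energy_gwh:,.0f} GWh/year")
    # wind: 1,369 GWh/year; nuclear: 4,928 GWh/year. Matching the nuclear output
    # would take about 3.6 times the nameplate capacity in wind.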

Nuclear is very good at supplying stable, firm power during consumption peaks. Wind farms, by contrast, often idle or stop when the wind does not blow during winter cold snaps or summer heat waves.


So why close a plant that can supply better energy at lower cost than wind, while creating stable, well-paid jobs for a region?

What are the most vocal arguments against nuclear power?

#1 Nuclear is dangerous.
This argument is used in several ways, with major accidents cited to demonstrate the dangers of nuclear power; but let the actual numbers speak for themselves. This website compiled the number of deaths per TWh produced (a TWh is a trillion watt-hours). I turned the data into a small chart, which shows that nuclear ranks very well compared with other energy sources.


#2 The radiation released causes cancer in surrounding populations.
This argument comes up often, most recently in the documentary "Gentilly or not to be". As the Canadian Nuclear Safety Commission wrote, that film is perhaps not the best scientific reference; go read their article for the details. They list six falsehoods propagated in the documentary.
Gentilly or Not To Be: setting the record straight!

- Counter-argument: low-level radiation is good for you!
Contrary to popular belief, low-level radiation from a radioactive source is good for you, within a certain range. Of course, almost anything at too high a dose is fatal to a human; that applies to many substances, not just radiation. A striking example is death from drinking too much water!

For several years, I have been researching and documenting this phenomenon, which is called hormesis: "what does not kill us makes us stronger".

I recommend reading my article on the subject, published in September 2011:

Low level radiation and Linear no threshold (LNT) theory.

We should revisit our exposure regulations because our regulatory history is founded on a deception.
#3 Nuclear is too expensive.
This argument comes up often because the numbers are big: billions of dollars for nuclear versus millions for wind farms. But that kind of comparison is not apples to apples! We should instead compare production, efficiency, capacity factor, lifespan, and also the land area covered.

For example, here is a very simple little calculation (see the sketch after this list):
  • Gentilly produces 625 MW of electricity at an average capacity factor of 75%
  • That gives 468 MW
  • Cost of electricity: two scenarios, one at 7 ¢/kWh and the other at 10.55 ¢/kWh, like the wind farm mentioned above.
  • Over a 30-year lifespan, I calculate a profit (revenue minus the $4.3 billion refurbishment) of $4.3 billion at 7 ¢/kWh and $8.6 billion at 10.55 ¢/kWh.
So if this were a private project... it would be very, very profitable!
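
A minimal sketch of that calculation, under the post's own assumptions (75% capacity factor, 30-year life, and the $4.3 billion refurbishment as the only cost):

    # Profit over 30 years: revenue at each price minus the refurbishment cost
    avg_power_kw = 625_000 * 0.75   # 625 MW at a 75% capacity factor
    hours = 24 * 365 * 30           # 30 years
    refurb_cost = 4.3e9             # $4.3 billion refurbishment

    for price in (0.07, 0.1055):    # $/kWh scenarios from the list above
        revenue = avg_power_kw * price * hours
        print(f"{price * 100:.2f} cents/kWh: profit = ${(revenue - refurb_cost) / 1e9:.1f} billion")
    # 7.00 cents/kWh: profit = $4.3 billion
    # 10.55 cents/kWh: profit = $8.7 billion (the post rounds to 8.6)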

Looking for more information?


Saturday, April 21, 2012

Kony 2012, our education system and economic system.

You may have heard about the "Kony 2012" campaign?

Kony 2012 is the title of a campaign launched by the organization Invisible Children Inc., focused for now on the half-hour video of the same name, which spread virally on the internet, reaching almost one hundred million views within a few days (it was published only on March 5). The campaign aims to support the arrest of Joseph Kony, a Ugandan guerrilla leader accused of "crimes against humanity" by the International Criminal Court (ICC) in The Hague.

Many articles clearly explain the doubtful aspects of this campaign and the possible connections of this organization to goals not at all related to saving children in Uganda.

We should save ALL the children and stop ALL the abusers of the world, but is this the solution?

As Jean Ziegler explained in his latest book, Betting on Famine: Why the World Still Goes Hungry:
Jean Ziegler, U.N. special rapporteur on the right to food, alerted the more than 500 delegates that while 854 million people went without food in the world last year, enough food was produced to feed 12 billion people. "This is why a child that dies from famine is murder," Ziegler said. The prices of corn, rice and wheat are literally exploding because of market speculation on the basic commodities.
So you see, there are more urgent problems in the world, caused by speculation on many things; the world's economies are about to explode because of out-of-control speculation. Why, then, is the world focusing on this "campaign"?


Here are some sample articles from serious and reputable sources:

Beyond “Kony 2012”. What is Happening in Uganda? America's "Invisible" Military Agenda by Daniele Scalea
The United States has intervened in Uganda within the framework of increased militarization in its relationship with the continent, made necessary by the political and trading penetration of China in Africa.

The sending of military advisers to Museveni, possibly a prelude to military escalation (maybe what the Kony 2012 viral campaign wants to achieve?), is to be taken in conjunction with drone bombardments in Somalia, intervention in Libya to overthrow Gaddafi, French intervention in Ivory Coast to depose Gbagbo. Julien Teil’s documentary The Humanitarian War has shown the role, not too clear, of NGOs in preparing the ground for NATO intervention in Libya.

Invisible Children emphasizes the need to send US troops to Uganda at a time when the LRA seems weakened and, according to many people, Kony hasn’t been in the country for years. At this point it does not seem rash to include also Kony 2012 in the arsenal of US soft power that should support the spread – not necessarily in a peaceful way – of Washington’s influence in Africa.

Democracy now: Kony 2012: Ugandans Criticize Popular Video for Backing U.S. Military Intervention in Central Africa

The relationship between Invisible Children and the U.S. government and the Ugandan regime disturbed me a lot. And we started doing more investigation and more research, and that’s how we came out with the revelation that the U.S. ambassador to Uganda in 2009 had actually written a cable, which was revealed through WikiLeaks, in February 2009, indicating that Invisible Children had approached the U.S. embassy and made it known to them that they were going to be conducting campaigns that would advocate and promote the military solution. That was one of the memos.

A second memo, which was actually much more disturbing, said that Invisible Children had provided a tip to Ugandan intelligence services, leading to the arrest of a suspected regime opponent. This suspected regime opponent used to be a child soldier, as well, and had been taken under the care of Invisible Children and was, in fact, staying at one of their facilities in Uganda. He was arrested. And as many people familiar with Uganda know, people that are arrested by intelligence services in Uganda are subjected to torture. Subsequent to his arrest, nine other Ugandans were arrested, and now they face treason charges. And treason in Uganda is punishable by the death penalty.

So the relationship between Invisible Children does not seem to be independent. It seems to be involved in line with the U.S. administration and the Ugandan regime in advocating and pushing the military solution as the only approach, and disregarding the voices of Ugandans such as Bishop Odama, Bishop Ochola, who come from the war-affected region, who have been pushing the resumption of a negotiated solution to this war.
This website, World Under Control (Helping to expose corruption by the global elite, one truth at a time), has many interesting links and articles debunking this campaign:


All this makes me ANGRY and makes me think that there is something wrong with our education system. I have seen my local school and many kids involved in this campaign, with the movie's emotional appeal used to indoctrinate them.

You then wonder why so many of us "buy" this stuff without asking any hard questions, without being critical and skeptical.

Go read this article on education:
The Purpose of Education: Social Uplift or Social Control? - The Path to Deconstructing Democracy
Fichte laid it down that education should aim at destroying free will, so that, after pupils have left school, they shall be incapable, throughout the rest of their lives, of thinking or acting otherwise than as their schoolmasters would have wished.
Wake up, people. There are better ways of helping Africa.

Saturday, March 31, 2012

Radio-Adaptive Response to Environmental Exposures at Chernobyl

Here is an interesting piece of research pointing to the possibility of a hormesis effect of low-level radiation. Basically, it suggests that low-level radiation (up to 1,000 times the normal level) for 45 days can protect you from high-intensity radiation and from future damage.

The genetic consequences resulting from environmental exposure to ionizing radiation have a significant impact on both radiation regulatory policies and the comprehension of the human health risks associated with radiation exposure. The primary objectives of the study were to assess 1) genotoxicity of exposure to radiation as a function of absorbed dose and dose rate, and 2) induction of a radio-adaptive response following a priming dose at varying dose rates. Results demonstrated that sub-acute environmental exposures of 10cGy gamma radiation resulted in indistinguishable levels of chromosomal damage as compared to controls. A radio-adaptive response was observed in all experimental groups, exposed to a subsequent acute challenge dose of 1.5 Gy, demonstrating that low dose rates of low linear energy transfer (LET) radiation are effective in reducing genetic damage from a subsequent acute low-LET radiation exposure. Furthermore, the data presented herein demonstrate a potential beneficial effect of sub-chronic exposure to low levels of low-LET radiation in an environmental setting and do not support the Linear No Threshold (LNT) hypothesis.

Here's a video (in French) explaining this research.


Another study shows the same conclusion: MIT study suggests that at low dose-rate, radiation poses little risk to DNA
"It's believed that all radiation is bad for you, and any time you get a little bit of radiation, it adds up and your risk of cancer goes up," says Boreham, who was not involved in this study. "There's now evidence building that that is not the case."

Tuesday, January 31, 2012

Composting toilets... a good green solution?

I was watching "La vie en vert" on Radio-Québec, and the January 31, 2012 segment covered composting toilets. Being curious by nature, here is a quick little analysis.

  • The toilet at Home Depot sells for $1,565
  • The segment talks about saving 30,000 litres of water per year
  • In the same show, we are told that each cubic metre of water (1,000 litres) costs $0.30 to produce
So: 30,000 litres of water saved, divided by 1,000 litres at 30 cents, gives us $9 of savings per year; therefore $1,565 / $9 ≈ 174 years to get our money back... That is a long time! And this does not account for the cost of treating the wastewater...

For example, the Drummondville water treatment plant can treat 24,163,000,000 litres per year. A quick evaluation gives an average of $7.1M per year (cost + operation) between 1982 and 1999. That works out to a treatment cost of about $0.30 per 1,000 litres...

So treating 30,000 litres of water costs $9... giving $18 of total savings per year... which cuts our payback to 87 years... Still not profitable!
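
A minimal Python sketch of this payback arithmetic, using the figures quoted above:

    # Payback time for the composting toilet, from the post's figures
    toilet_price = 1565.0      # $, Home Depot price
    litres_saved = 30_000      # litres of water saved per year
    cost_per_m3 = 0.30         # $ to produce 1,000 litres of drinking water

    production_savings = litres_saved / 1000 * cost_per_m3   # $9/year
    treatment_savings = litres_saved / 1000 * 0.30           # another $9/year if treatment is avoided
    print(toilet_price / production_savings)                   # about 174 years
    print(toilet_price / (production_savings + treatment_savings))  # about 87 years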

I think it would be more profitable for everyone to install OHAP plants like the one in Salaberry-de-Valleyfield.


This process has several advantages over the other sludge disposal methods, namely agricultural reuse, landfill, composting, and incineration. The main advantages are:
  • It is a complete, final solution, applied directly at the source
  • Total elimination of health risks (pathogens, viruses, undigested bacteria, etc.)
  • Minimal equipment footprint, and it can operate continuously regardless of weather conditions
  • Low operating temperature, hence less NOx generation and less maintenance and replacement of refractory materials
  • Not subject to market fluctuations (e.g., rising landfill costs, supply and demand for compost, etc.)
  • Recovery of the heat and ash allows them to be reused
  • The process is insensitive to the physico-chemical characteristics of the sludge (pH, concentrations of heavy metals and other toxic substances)
  • It is a simple, efficient, and reliable technology
  • It produces no odour
Never forget that for every dollar spent in Quebec, about $0.62 goes to fossil energy. So spending $1,565 on a toilet amounts to spending $968 on fossil energy... not very green!

Saturday, January 21, 2012

The next green revolution


Incredible solutions already exist for food security, with vertical farming and aquaponics. But while we wait for such technologies to be promoted, what can the modern farmer do right now to improve the yield of his land while reducing his ecological footprint? Let us push a little further: how could he also restore land degraded by massive pesticide use, monocultures, and premature soil erosion?

At present, 40% of the world's soils are exploited by agriculture. Soil degradation threatens food security to such an extent that the United States went from 2 years of reserves in 1985 to only 43 days in 2010. The problem is worrying, but solutions and techniques do exist. If these solutions were put into practice, we would not need to clear more forests to feed a growing population; we could produce more on far less space and restore devastated land. These methods have already proven themselves and can lead to greater prosperity.

Solution #1: Permaculture

The concept, invented by Bill Mollison in the 1970s, is to recreate a self-sufficient ecosystem through a symbiosis of several species of plants, insects, and fungi. Geoff Lawton proved in the early 2000s that it is possible to re-green the desert in the West Bank with only one-fifth of the water required by conventional irrigated agriculture.

The technique consists of digging a trench that collects rainfall runoff, ensuring a water reserve all year round. On each side of the trench, mounds covered with compost and residues from previous crops, which would normally have been burned, are converted back into nutrients for the soil. Fruit trees and other crops are then planted in tiers along the bank and irrigated by a drip system going directly to the roots. As time passes, the salt stored in the soil is eliminated by fungi and insects, making the soil fertile again. The land, even after being abandoned for 5 years for lack of funds, retained all its vitality; that is what Lawton observed on his return in 2009. For more details, here is the video that tells the story:



Here is another example of permaculture at work, this time by Willie Smits, who in 4 years, with the collaboration of several farming families, restored a near-desert plain into 500 acres of rainforest:



Some precepts of permaculture have also been introduced into the agriculture of certain third-world countries. This technique, agroecology, combines the properties of several plants to increase subsistence-farming yields fivefold (commercial agriculture, by comparison, increases that capacity tenfold). But it is entirely organic, and therefore a good avenue to explore.



Solution #2: Technological progress and optimized yields

Here is how a simple wheel that digs waffle-like furrows manages to prevent premature soil erosion, without compacting the earth, while conserving well-distributed water. According to the figures, this invention increases a soil's efficiency by 30%.



Alternatively, reorganizing fields into a circular layout, together with a simple invention that enables "agrocircle" farming, reduces the effort required of a single farmer by 70% while eliminating greenhouse gas emissions.
Irrigation in the heart of the Sahara

The video is subtitled in English (click on CC).



A McGill study has also shown that yields could be optimized if crops were matched to their climate. Some corn crops around the world achieve only 20 to 40% of their potential yield.

Solution #3: Fungi and bacteria

Industrial agriculture uses many nitrogen-based chemical fertilizers, which is what has ensured the commercial yields we see today. Even if we have so far managed to feed the current world population (and could feed more, counting the shameless waste), this has not been without consequences. These fertilizers, by running off into our waterways, contribute to the proliferation of algae, which creates dead zones for biodiversity.
What matters now is to find new methods that improve plant yields while capturing more CO2, so that soils regenerate.

Well, have no fear: research is already under way, with rather astonishing results.

Fungi contribute extensively to converting organic matter into nutrients for the soil, and therefore, indirectly, for plants. Here is how, with the help of certain types of fungi, we could revitalize degraded soils, recreate ecosystems by letting nature regenerate, and create natural pesticides as well as antibiotics. Even better, these fungi could be used to clean up soils after an oil spill. You have to see it to believe it. This research, carried out by Paul Stamets, is very promising.



But why stop at research when results already demonstrate a clear improvement in crop yields? Here is a TED video (like many of the previous ones), presented by Guy Webb, showing techniques that encourage the extension of plant roots and use bacteria and fungi to regenerate the soil by capturing CO2. Here, they go from conventional agriculture's capture of 1.5 tonnes of CO2 per hectare to 281 tonnes of CO2 per hectare... Food for thought.



In conclusion, much remains to be done, of course, but these discoveries are only the tip of an iceberg of possibilities. Imagine the progress made in barely 20 years; imagine another such period, given the speed at which our technology evolves. The future will not be as dark as some would have us believe. Instead of looking at a problem and leaving it for someone else to handle, the cross-pollination of ideas will be the remedy for the challenges humanity will face in the years to come.

Vincent Blanchette

Sunday, January 8, 2012

LNT from the point of view of the Academy of Medicine of France.

Statement by the Academy of Medicine of France, December 4, 2001  source
4 December 2001
Medical Irradiation, Radioactivity Releases, and Disinformation:
An opinion by the Academy of Medicine
The Academy of Medicine, preoccupied by the problems that arose in the public about medical exposure to X rays and radioactive releases in the environment, and erroneous information that these subjects give rise to, wishes to give an opinion on this subject.
Humanity is exposed to ionizing radiation
Since the beginning, life developed in a bath of ionizing radiation to which it is adapted. These radiations have a cosmic origin or originate in the earthly crust where, since the creation of the earth, the unstable isotopes of the elements of very long physical half-lives remain: thorium, uranium, potassium, and rubidium. Natural exposure results therefore from internal and external sources, both characterized by various physical properties and different effects on human body.
The presence of radionuclides in the environment results in an average radioactivity of 10,000 Bq in the human body, essentially from carbon-14 and from potassium-40 whose concentration is regulated by homeostatic control of intracellular potassium content. Average exposure of humans to natural sources is evaluated to 2.4 mSv per year expressed as effective dose. It exists nevertheless with important variations according to the altitude and nature of the rock and soils in the ground, generally varying from 1 to 10 mSv, and attaining more than 100 mSv in wide regions such as Kerala in India, or the city of Ramsar in Iran (1). These natural variations involve different target tissues for the dose being delivered, such as lung for radon, kidney for uranium, bones for radium, and bones, hepatic and systemic phagocytes for thorium, of which the behavior and the radiological characteristics are similar to those of plutonium.
To natural background irradiation is added, since the end of the 19th century, a diagnostic medical irradiation which delivers an average of about 1 mSv per year, but with variations from less than 1 mSv to more than 20 mSv per year.
And last, since 1950, it is necessary to add irradiations of industrial origin - notably the one from producing electricity by nuclear energy (extraction and treatment of uranium, functioning of reactors, etc.) corresponding to an exposure of the order 0.01 to 0.02 mSv per year - and one of the other natural sources, from coal extraction and burning to 0.01 mSv per year. In addition, radioactivity releases to the atmosphere contribute to an average exposure of 0.005 mSv/yr, and the Tchernobyl accident to about 0.002 mSv/yr (1).
In measuring dose effectiveness, the biological effects of the different types of ionizing radiation are identical whether their origin is natural or artificial.
Exposure of workers to ionizing radiation (200,000 in France, of which more than half are in the medical sector) results, in France, in an average exposure of 2 mSv per year (OPRI annual report) with less than 1% surpassing the average statutory limit of 20 mSv per year. Except for diagnostic irradiations, these exposures are characterized by low dose rate, chronic irradiation doses. This aspect distinguishes them clearly from accidental and therapeutic irradiations that are performed at high dose-rate, causing instantaneous accumulation of damaged molecules that perturb components of cellular repair mechanisms, with as little as a few absorbed mGy in a few minutes (2).
Dismantling of nuclear power plants and nuclear waste storage are activities that make small dose increases to the populations at very low dose rates (about 0.005 µSv per year for iodine-129 for example) (1), essentially by transfer in the food chain of various man-made radionuclides of long half-lives leading to either: –homogenous exposure of the whole body (as in the case of natural potassium-40), or to –selective organ exposure, e.g., to the large intestine, bone, liver and kidney, as in the case of the natural isotopes of uranium and thorium. It is therefore legitimate to infer their possible effects on human health from those known to result from natural sources which expose populations of many millions of residents.
The health consequences of the exposure of humans to a few mSv.
There exists data (3) establishing that high natural exposure is associated in adults to an increased rate of chromosome aberrations of the circulating lymphocytes, a biological indicator of exposure. It cannot be concluded, however, that this is an index of harm since there are detected no global increase of cancer risk (4), increase of congenital malformations (5), or abnormalities induced by cytogenetic effects with newborns (6), in the well-studied population of the particular highly-exposed region of Kerala India to external irradiation and to internal contamination. Identical conclusions are obtained in the exposed Chinese populations (7-8). And last, as stated by the NCRP in the United States (9): «It is important to notice that the incidence of cancers in most of the exposed populations to low-dose radiation has not been found to be increased, and that in most of the cases this incidence seems to have been reduced».
 The hypothesis of the risks of cancer induced by low doses and dose-rates is founded on the extrapolation of data of highly-exposed human groups, applying the risk as being constantly proportional to the received dose without being limited by a threshold, the linear no-threshold (LNT) assumption. This hypothesis conflicts with itself and has many scientific objections (10); and it is contradicted by experimental data (11) and epidemiology.
In the groups having received more than 200 mSv as adults, and 100 mSv as infants, excesses of cancer have been observed: in e.g., Japanese atomic bomb survivors in Hiroshima and Nagasaki, irradiated medical patients, nuclear workers, and residents of the former-USSR contaminated by nuclear wastes. No excess of cancers has been observed for doses lower than 100 mSv. A doubt remains nevertheless in the case of irradiation for x-ray in utero from 10 mSv because the epidemiological data are contradictory (12).
Having not observed excess cancer does not allow an effect for low doses to be excluded because of statistical limitations. Nevertheless it is necessary to recall that the linear theory with no threshold (LNT): -is contradicted by the observation of thresholds for bone cancers induced by radium-226, and cancers of the liver induced by Thorotrast; -is not compatible with induced leukemias in Hiroshima, nor with the patients treated by radioactive iodine (1,10,13). Besides, the historic epidemiological study of the British radiologists for the period 1897-1997 (14) showed that for the registered radiologists after 1954 these practitioners have no excess of cancers in comparison with their non-radiologist colleagues, with a tendency to a lower cancer rate, as in the case of populations described by the NCRP (9). Similar deficits were observed for many groups of exposed professional workers to ionizing radiation, notably radiologic technicians: while the frequency of cancers increased in their jobs during the period when there was limited radiation protection, the excesses of cancers disappeared when regulatory limits were reduced to 50 mSv/yr, as enforceable up to 1990 (12).
These observations, associated with the recent biological data, showing complexity and the variety of molecular and cellular mechanisms that control cell survival and mutagenesis according to the dose and dose-rate (1,2,11,13), remove all scientific rationale to a linear extrapolation that overestimates very widely the effects of low doses and dose-rates. One cannot add the exposures of a few mSv/yr, and a fortiori lower than 0.02 mSv/yr, delivered to a large number of individuals (as done with the use of collective doses) to estimate the risk of excess cancers (15). The Academy of Medicine, joining the position of the large international institutions, strongly affirms that such calculations have no scientific validity, notably to evaluate the associated risks to radiation, such as the effects claimed outside the former-USSR from the fallout from Tchernobyl.
The UNSCEAR 2000 report and the controversy with the OCHA.
The Tchernobyl catastrophe has caused to this day about 2,000 cancers of the thyroid in children, essentially from exposure to iodine-131 and the short-lived iodine isotopes. The delivered doses to the thyroid were on average of the order of 1 Gy, and of 3 Gy on average in the most exposed regions (16). This carcinogenic effect is therefore in keeping with the sum total of our knowledge of radiation risks.
In 2000, UNSCEAR concluded that there is an absence of excess leukemias and of cancer other than thyroid cancer in the population around Tchernobyl. It also did not find a relationship between the exposures to radiation and congenital malformations in these populations (1). This conclusion was questioned in 2001 by the OCHA, the humanitarian organization of the UN, but the OCHA publication was refuted in a response by the UNSCEAR committee, which alone has the medical and scientific competence to speak in the name of the UN and of the WHO on this subject (17). A conference was therefore held in Kiev in June 2001, with the WHO, OCHA, UNSCEAR, ICRP and IAEA, and the conclusions have been published (annex). These conclusions find that the health conditions are alarming because of the general deterioration of the health and social conditions, notably in Belarus, but do not contradict the UNSCEAR conclusions. In fact, this deterioration is probably caused by the living conditions of the relocated populations, associated with psycho-sociological factors. Different questions have been raised that do appear to necessitate epidemiological research into the conditions of the catastrophe, with its multiple factors susceptible of altering the health of populations: this is the recommendation of the Kiev conference.
It is possible to reduce human exposure to ionizing radiation, in particular of radiation medical origin, with the necessary means.
Radiological examinations represent, by very far, the principal cause of irradiation of human origin (effective average dose of about 1 mSv/yr in France). The recent direction of the European Union introduces two notions to this subject: -cost-optimization (to reduce as much as possible the dose per examination), -and justification (to evaluate the benefit and the risk of each examination, and to not practice it unless it is advantageous). These principles necessitate therefore the evaluation of effective doses received by the examined subject and the relevant risks. Now, according to the examinations and the techniques used, the effective doses vary from a fraction of a mSv to several tens of mSv (examinations by x-ray scanners or radiological interventions) and the risks vary widely according to age. An over-evaluation of risks could deprive a child of a useful examination; inversely, an under-evaluation could favor the multiplication of medical X ray examinations that are not useful. The Academy counsels therefore, in a first step: 1) to focus on the study and evaluation of examinations from which the potential risks are the largest: x-ray scans with young subjects, multiple radiological examinations with premature interventional angiography; 2) to promote the likely techniques to reduce or to eliminate irradiation without harm to the quality of clinical information and to stimulate the technical and basic research in this area; 3) to conduct epidemiological studies on groups of patients, notably infants, which have received the most important doses from radiological examinations; and 4) to favor the initial and continuing training of clinicians in matters of radiation protection.
It is unacceptable that, while irradiation of medical origin represents, in France, 95% of the irradiation added to the natural background, reductions of little benefit are pursued in the industrial environment by applying radiation protection at very high cost.
It is necessary to define health priorities in the matter of releases.
Outside of this context, some recommendations can be undertaken concerning the problem of radiation releases in the matter of health. It appears essential to support epidemiological studies concerning the populations exposed naturally to high-level background radiation, and even concerning the populations of the ex-USSR that were massively exposed to radioactivity releases and to other pollution. In the framework of studies dealing with potential health effects of nuclear waste management, the priority isotopes should not be selected according to the collective dose that some would use, but according to the potential doses to individuals because the calculated collective doses from low individual doses to a few microSieverts cannot have any effect on health. A significant national effort should be undertaken, as the one undertaken in the framework of the programs of the U.S. DOE, on the biological mechanisms in the cellular response to doses below 100 mSv, in particular, health effects from DNA repair, cell signaling, and the hereditary transmission in DNA sequence encoding of parental DNA modified by irradiation.
 
– Recommendations –
The Academy of Medicine:
1 – recommends increased effort for radiation protection in the area of radiological examinations, on the one hand to reduce received doses from certain types of examinations (x-ray scans with infants, interventional angiography, lung X ray examinations with premature treatments, etc…), and on the other hand, to allow radiology services, notably in radio-pediatrics, to obtain benefits of well-educated physicists for dosimetry and quality control of the devices, in a way similar to that previously undertaken with mammography in breast cancer surveys. It recommends to this end to support clinical and technical research in this area.
2 – recommends an effort of basic research: on the biological mechanisms activated by the repair of DNA damage after low doses up to 100 mSv; and on the effects of these doses on the exchanges of intra- and inter-cellular molecular signals.
3 – denounces utilization of the linear no-threshold (LNT) relation to estimate the effect of low doses to a few mSv (of the order of magnitude of variations of natural radiation in France) and a fortiori of doses hundreds of times lower, such as those caused by radioactive releases, or 20 times lower, such as those resulting in France from the fallout of radioactive materials from the Tchernobyl accident. It associates with many international institutions to denounce improper utilization of the concept of the collective dose to this end. These procedures are without any scientific validity, even if they appear be convenient to administrative ends.
4 – subscribes to the conclusions of the 2000 Report of the Scientific Committee on the Effects of Atomic Radiation of the United Nations (UNSCEAR) concerning the analysis of health consequences of the Tchernobyl accident, and denounces the propagation of allegations concerning excesses of other cancers than the thyroid cancer, and excesses of congenital malformations.
5 – recommends introduction of the ADIR (Annual Dose of Incorporated Radioactivity, being equivalent to 0.2 mSv, resulting from homogeneous exposure of the human body to natural potassium-40 and carbon-14) as this dose equivalent is almost constant whatever the size of the individual and the geographic region.
6 – The Academy of Medicine, in accordance with its October 3rd 2000 statement, continues to recommend maintaining without modification the European directive concerning regulatory limits (100 mSv/5 yr). Substituting dose limits of 20 mSv/yr would reduce the flexibility of the European norm, without any health advantage, and would harm the functioning of medical radiology services while making the improvement of applicable techniques more difficult.
Glossary
-Bq or becquerel, the unit of radioactivity corresponding to one disintegration per second. In the human body, 10,000 Bq from natural sources represent 1 ADIR, which is equivalent by convention to a dose equivalent of 0.2 mSv
-Gy or gray, the absorbed dose corresponding to 1 joule per kg.
-Sv or sievert, the unit of equivalent dose obtained from the product of the dose absorbed by the weighting factor for radiation quality (1 for X, beta and gamma radiations … 20 for alpha radiation). The effective dose, also expressed in Sv, is the product of the dose equivalent by the weighting factor for organs (0.05 for the thyroid… 1 for the entire body).
IAEA: International Atomic Energy Agency
ADIR: Annual Dose of Incorporated Radioactivity, recommended by G. Charpak.
DOE: Department of Energy, U.S.
ICRP: International Commission on Radiation Protection
NCRP: National Council on Radiation Protection and Measurements (USA)
OCHA: Office for the Co-ordination of Humanitarian Affairs
WHO: World Health Organization
UNSCEAR: United Nations Scientific Committee on the Effects of Atomic Radiation

References:
  1. UNSCEAR: Sources and effects of ionizing radiation, Report to the General Assembly, with annexes, United Nations, 2000.
  2. Feinendegen L, Pollycove M, Biologic Responses to Low Doses of Ionizing Radiation: Detriment Versus Hormesis, J Nuclear Medicine, 42, 7, 17N-27N and 26N – 37N, 2001.
  3. BEIR V: Committee on the Biological Effects of Ionizing Radiation. Health effects of exposure to low levels of ionizing radiations. National US Academy of Sciences, National Research Council, Washington 1990.
  4. Nair MK, Nambi KS, Amma NS, Gangadharan P, Jayalekshmi P, Jayadevan S, Cherian V, Reghuram KN Population study in the high natural background radiation area of Kerala, India. Radiat Res. 152, 145-148S, 1999
  5. Jaikrishnan JS et al. Genetic monitoring of the human population from high-level natural radiation areas of Kerala on the southwest coast of India. Prevalence of congenital malformations in newborns. Radiat Res 152, 149-153S, 1999.
  6. Cheryan VD et al. Genetic monitoring of the human population from high level natural radiation areas of Kerala on the southwest coast of India incidence of numerical structural and chromosomal aberrations in the lymphocytes of newborns. Radiat Res. 152, 154-158S, 1999.
  7. Tao Z J Radiat Res (Tokyo) 41 Suppl:31-4, 2000.
  8. Wei LX, Sugahara T. High background radiation area in china. J Rad. Research (Tokyo) 41, Suppl. 1-76, 2000.
  9. National Council on Radiation Protection and Measurements – Evaluation of the linear non-threshold model for ionizing radiation – NCRP-136, Bethesda MD, USA, 2001.
  10. Academy of Sciences – Problems of the effects of low doses of ionizing radiations. Report 34, Oct 1995.
  11. Tanooka H. Threshold dose-response in radiation carcinogenesis: an approach from chronic alpha-irradiation experiments and a review of non-tumour doses. Int. J Radiat. Biol., 77, 541-551, 2001
  12. IARC 2000 – Monographs on the evaluation of carcinogenic risks to humans, Vol. 75, Ionizing radiation - IARC, Lyon, France
  13. Academy of Sciences – Symposium on risk due to carcinogens from ionizing radiation – Report, Academy of Sciences, Series III, 322, 81-256, 1999
  14. Berrington HAS. Darby SC, Weiss HA., Doll R. – 100 years of observation on British radiologists mortality from cancer and other causes 1897-1997. British Journal of Radiology, 74, 507-519, 2001
  15. BRPS Symposium, Warrenton: Bridging radiation policy and science (K.L. Mossman et al. Ed.) 2000
  16. IAEA, Final Report, Belarus, Ukrainian and Russian 2001: Health effects of the Tchernobyl accident.
  17. Holm LE (UNSCEAR Chairman) Chernobyl effects. Lancet, 356, 344, 2000
  18. European Directive 97/43 on radiological examinations, 1997

Monday, January 2, 2012

Climate debate at the Canadian Senate - Why Canada is out of Kyoto

A very important and interesting debate on climate, CO2, and global warming took place in mid-December 2011. After listening to these very knowledgeable and serious researchers, it is easy to see why the Canadian government pulled out of the Kyoto Protocol.

Invitee list:

  • Ross McKitrick, Professor, Department of Economics, University of Guelph
  • Ian D. Clark, Professor, Department of Earth Sciences, University of Ottawa
  • Jan Veizer, Professor Emeritus, Department of Earth Sciences, University of Ottawa
  • Timothy Patterson, Professor of Geology, Department of Earth Sciences, Carleton University
Here's the full presentation on YouTube:


Videos in other languages are available here.

The transcripts will be available here.

Ross McKitrick's testimony is available here.
Copied here for quick reference:

Testimony
For delivery to the Senate Standing
Committee on Energy, the Environment
and Natural Resources,
Parliament of Canada
Ottawa Ontario
December 15, 2011
Ross McKitrick, Ph.D.
Professor of Economics
College of Management and Economics
University of Guelph
Guelph Ontario Canada
ross.mckitrick@uoguelph.ca

1. My name is Ross McKitrick, and I am a Full Professor of Economics at the University of Guelph, where I specialize in environmental economics. I have published on both the economics of climate change and statistical analysis in climatology. I was an expert reviewer for the Intergovernmental Panel on Climate Change (IPCC) Fourth Assessment Report, and in 2006 I was one of 12 experts from around the world asked to brief a panel of the US National Academy of Sciences examining paleoclimate reconstruction methodology.

2. The global warming issue is often described with emphatic claims that the "science is settled", the situation is urgent, and the necessary actions are obvious. The reality is that there are deep disagreements about underlying scientific issues, there is reason to believe the problem has been exaggerated, and most policy proposals simply do not pass objective cost-benefit tests. Amidst the disputes and controversies of the past few years, I believe two points have emerged with clarity.

3. First, the economics of climate change do not favour Kyoto-type commitments. Under current and foreseeable technologies, the greenhouse gas policies we can afford to undertake would have such small climatic impacts as to be pointless. The same kinds of models that are used to forecast global warming predict that, if all signatories to the Kyoto Protocol complied with their commitments, the level of carbon dioxide (CO2) in the atmosphere that we would have observed by 2100 would instead have been reached by about 2105, a trivial difference. And Kyoto was too costly for countries to reach. When a policy is proposed that is too costly to implement and yields benefits that are too small to measure, you would expect reasonable people to see it as a bad idea. Instead we observed a dogmatic elite consensus emerge in support of Kyoto. In my mind this never validated Kyoto, it merely discredited the elite consensus, and suggested to me that the international political milieu in charge of the climate issue was unduly susceptible to groupthink.

4. Unlike such air pollutants as sulphur dioxide and particulates, CO2 is not easy to capture, and once captured, there is no obvious way to dispose of it. There appears to be no way to cut CO2 emissions on a large scale without cutting energy consumption and impeding economic activity. Despite their enthusiasm for embracing targets, policymakers around the world have not been able to cut CO2 emissions while pursuing economic growth. Simply put, with regard to climate policy, the cure is worse than the disease.

5. Second, the official process for assessing technical and scientific information on climate change for the purpose of advising policymakers has become untrustworthy due to bias and partisanship. As a member of the expert review team for the last IPCC Report, I saw things take place that violated longstanding principles of peer review. I documented some of them in various publications since 2007, but the issues never received much attention until the fall of 2009, when thousands of emails from top IPCC scientists were leaked onto the internet. The so-called Climategate emails confirmed the reality of bias and cronyism in the IPCC process. The new leaks last month provided even more confirmation that climate scientists privately express greater doubts and disagreement about climate science than is reflected in IPCC reports.

6. Earlier this year I was asked by the London-based Global Warming Policy Foundation to review IPCC procedures and to make recommendations for reform. My report was published last month, and includes a foreword written by John Howard, the former Prime Minister of Australia. I have included a copy with this submission. I mainly focus on how the IPCC handled issues with which I have first-hand knowledge as a contributor to the peer-reviewed literature on the subject, and on which I worked closely on the IPCC text in my capacity as an expert reviewer.

7. The IPCC is not a neutral observer of the scientific process; instead it has a party line. It is controlled by a relatively small Bureau in Geneva, consisting of a small core surrounded by a network of supportive academics and government officials. The oversight body, called the IPCC plenary panel, is passive, inattentive, and overly deferential to the Bureau. In effect there is no oversight.
8. The Bureau picks Lead Authors who share their views. They are routinely placed in the position of reviewing their own work and that of their critics, and are free to rule in their own favour. Lead Authors are also free to reject reviewer comments, over-ride Review Editors, and even rewrite text after the close of the peer review process. The combination of Bureau control over the selection of Lead Authors and a toothless peer-review process means that IPCC Assessments are guaranteed merely to repeat and reinforce a set of foregone conclusions that make up the party line.

9. In my report I document some disturbing cases where the IPCC violated proper peer review practices. These include:
• Manipulating prominent graphs so as to conceal known flaws in the statistical basis of paleoclimate reconstructions and to exaggerate evidence that modern climate change is historically exceptional. This is the so-called "hide the decline" scandal.
• Fabricating a statistical test result to provide a rationale for dismissing published evidence of urbanization-related contamination of the surface temperature record on which key IPCC conclusions were based.
• Waiting until the close of peer review, then removing text that had initially, and correctly, cautioned readers that the IPCC method of calculating warming trends likely exaggerated their significance, and replacing it with unsupported text saying the opposite.

10. My report documents these and other incidents that, in my view, suffice to discredit the IPCC's claims to rigour and objectivity, and point to the urgent need for procedural reform.

11. In 2010 the InterAcademy Council reviewed IPCC procedures and drew attention to many of the same problems as my report does. Unfortunately the IPCC's internal reform process has gone nowhere. I discuss this problem in Section 4 of my report.

12. At this point we could simply muddle along for another 20 years, enacting more and more costly and wasteful schemes based on the increasingly biased and unreliable guidance of the international climate policy milieu. That would be the easiest course of action, but would not serve the public interest. The more difficult option would be to begin the hard work of improving the decision-making process itself, beginning with reform of the IPCC.

13. My published research has led me to believe that the IPCC has overstated the global warming issue. I have shown that the spatial pattern of warming trends in the surface temperature record is strongly correlated with the spatial pattern of industrialization, even though this pattern is not predicted by climate models as a response to greenhouse gases. This indicates that the standard climate data sets likely have a warm bias due to their failure to correct for disturbances of the land surface from urbanization, agriculture, and so forth.

14. I have also shown that climate models predict significantly more warming over the past 30 years in the tropical troposphere than is observed in satellite or weather balloon records. This is a key region for measuring the water vapour feedbacks that control the magnitude of greenhouse warming. Despite this being the region that models say should be warming fastest in response to greenhouse gases, the 50-year balloon record actually shows no positive trend once the effect of ocean circulation changes in the late 1970s is removed from the record. One of the most telling emails in the so-called Climategate 2.0 archive that was just released last month involves one IPCC expert warning another that their efforts to finesse this issue by deceptive trend analysis are a "fool's paradise."

15. Today you have a chance to hear from a number of serious Canadian scientists about work that they and their colleagues have done that also calls into question aspects of the IPCC party line. The fact that you have learned little of what they are about to tell you does not indicate any deficiencies in the research they or their colleagues have done. Instead it points to the deficiencies in the process that was supposed to have brought this information to your attention long before now.