Here is the question that guides my research...

The greed displayed by supranational corporations and certain oligarchic groups, together with their control of natural resources, is steering humanity toward a new feudal-style world order, or even toward its downfall. Faced with this situation, human beings are invited to refuse to accept at face value a pseudo-truth conveyed by media that may well be in the pay of these corporations and groups. Instead, they are invited to engage in a process of discernment and awareness in order to build their own truth by confronting their necessarily subjective reality with objective data, such as that revealed by science, for example.

The penalty that good men pay for not being interested in politics is to be governed by men worse than themselves. - Plato

Tuesday, January 31, 2012

Composting toilets... a good green solution?

I was watching "La vie en vert" on Radio-Québec and its January 31, 2012 report on composting toilets. Curious by nature, here is a quick little analysis.

  • The toilet at Home Depot sells for $1,565
  • The report mentions saving 30,000 litres of water per year
  • In the same show, we're told that each cubic metre of water (1,000 litres) costs $0.30 to produce
So, 30,000 litres of water saved, divided by 1,000 litres at 30 cents each, gives $9 of savings per year; therefore $1,565 / $9 ≈ 174 years to recoup the investment... That's a long time! And this doesn't even account for the cost of treating the wastewater...

For example, the Drummondville wastewater treatment plant can treat 24,163,000,000 litres per year. A quick estimate gives an average of $7.1M per year (capital cost + operation) between 1982 and 1999. That works out to a treatment cost of roughly $0.30 per 1,000 litres...

So treating 30,000 litres of water costs $9... which brings the total savings to $18 per year, and cuts the payback period to 87 years... Still not profitable!
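The payback arithmetic above can be sketched in a few lines of Python. The figures are the ones quoted earlier in the post; the assumption that treatment costs about the same $0.30 per 1,000 litres as production is the rough estimate from the Drummondville numbers:

```python
# Back-of-the-envelope payback for a composting toilet, using the
# figures quoted above (all of them rough estimates from the report):
PRICE = 1565.00          # toilet price at Home Depot, $
WATER_SAVED = 30_000     # litres of water saved per year
COST_PRODUCE = 0.30      # $ per 1,000 L to produce drinking water
COST_TREAT = 0.30        # $ per 1,000 L to treat wastewater (estimate)

savings_per_year = WATER_SAVED / 1000 * (COST_PRODUCE + COST_TREAT)
payback_years = PRICE / savings_per_year

print(f"Annual savings: ${savings_per_year:.2f}")    # $18.00
print(f"Payback period: {payback_years:.0f} years")  # ~87 years
```

Changing any one input (a cheaper toilet, pricier water) shifts the payback proportionally, but even generous assumptions leave it far beyond the life of the product.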

I think it would be more beneficial for everyone to install OHAP plants like the one in Salaberry-de-Valleyfield.


This process has several advantages over the other sludge-disposal methods, namely agricultural land application, landfilling, composting and incineration. The main advantages are as follows:
  • It is a complete, final solution, directly at the source
  • Total elimination of sanitary risks (pathogens, viruses, undigested bacteria, etc.)
  • Minimal equipment footprint, and it can operate continuously regardless of weather conditions
  • Low operating temperature, hence less NOx generation and less maintenance and replacement of refractory materials
  • Not subject to market fluctuations (e.g. rising landfill costs, supply and demand for compost, etc.)
  • Recovery of the heat and ashes allows them to be put to use
  • The process is insensitive to the physico-chemical characteristics of the sludge (pH, concentrations of heavy metals and other toxic substances)
  • It is a simple, efficient and reliable technology
  • It produces no odour
Never forget that for every dollar spent in Quebec, roughly $0.62 goes to fossil energy. So spending $1,565 on a toilet is equivalent to spending roughly $970 on fossil fuels... not very green!
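As a quick check, here is that last calculation spelled out; the $0.62-per-dollar fossil-energy share is the post's quoted figure, not an independently verified number:

```python
# Rough fossil-energy content embedded in the purchase, assuming the
# quoted $0.62 of fossil energy per dollar spent in Quebec.
PRICE = 1565.00      # toilet price, $
FOSSIL_SHARE = 0.62  # $ of fossil energy per $ spent (quoted estimate)

fossil_dollars = PRICE * FOSSIL_SHARE
print(f"${fossil_dollars:.0f}")  # ~$970
```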

Saturday, January 21, 2012

The next green revolution


Incredible solutions already exist for food security, such as vertical farming and aquaponics. But until such technologies are widely promoted, what can the modern farmer do right now to improve the yield of his land while reducing his ecological footprint? Let's push a bit further: how could he also restore land degraded by massive pesticide use, monocultures, and premature soil erosion?

At present, 40% of the world's soils are exploited for agriculture. Soil degradation threatens food security to such an extent that the United States went from 2 years of reserves in 1985 to only 43 days in 2010. The problem is worrying, but solutions and techniques do indeed exist. If they were put into practice, we would not need to clear more forests to feed a growing population; we could produce more on much less land and restore devastated soils. These methods have already proven themselves, and can lead to greater prosperity.

Solution #1: Permaculture

The concept, invented by Bill Mollison in the 1970s, is to recreate a self-sufficient ecosystem through a symbiosis of several species of plants, insects and fungi. In the early 2000s, Geoff Lawton demonstrated that it is possible to re-green the desert in the West Bank with only one-fifth of the water required by conventional irrigated agriculture.

The technique consists of digging a swale that collects rainwater runoff, ensuring a water reserve all year round. On each side of the swale, berms covered with compost and residues from previous crops, which would normally have been burned, are converted back into nutrients for the soil. Fruit trees and other crops are then planted in tiers along the bank and irrigated by a drip system feeding the roots directly. As time passes, the salt stored in the soil ends up being eliminated by fungi and insects, making the soil fertile again. Even after being abandoned for 5 years for lack of funds, the land kept all its vitality; that is what Lawton was able to observe on his return in 2009. For more details, here is the video that tells the story:



Here is another example of permaculture at work, this time by Willie Smits, who, over 4 years and with the collaboration of several farming families, restored a near-desert plain into 500 acres of tropical rainforest:



Otherwise, some precepts of permaculture have been introduced into the agriculture of certain third-world countries. This technique, agroecology, combines the properties of several plants to multiply the yield of subsistence farming five-fold (for comparison, commercial agriculture multiplies that capacity ten-fold). But it is entirely organic, so it's a good avenue to explore.



Solution #2: Technological progress and optimized yields

Here is how a simple wheel that presses waffle-like basins into the soil manages to prevent premature soil erosion, without compacting the earth, while saving water through even distribution. According to the figures, this invention increases a soil's efficiency by 30%.



Otherwise, reorganizing fields in a circular layout, together with a simple invention enabling the practice of "agrocircle" farming, reduces the effort required of a single farmer by 70%, in addition to eliminating greenhouse gas emissions.
Irrigation in the heart of the Sahara

The video is subtitled in English (click on CC).



A McGill study has also shown that yields could be optimized if crops were adapted to their climates. Some corn crops around the world deliver only 20 to 40% of their potential yield, when they could deliver 100%.

Solution #3: Fungi and bacteria

Industrial agriculture relies on large amounts of nitrogen-based chemical fertilizers, which has secured the commercial yields we see today. Even though we have so far managed to feed the current world population (and then some, if we count the shameful waste), it has not been without side effects. These fertilizers, washing into our waterways, fuel the spread of algae, creating dead zones for biodiversity.
What we need to find are new methods to improve plant yields while capturing more CO2 so that the soils can regenerate.

Well, have no fear: research is already under way, with results that are nothing short of astonishing.

Fungi contribute largely to converting organic matter into nutrients for the soil, and thus, indirectly, for plants. Here is how, with the help of certain types of fungi, we could revitalize degraded soils, recreate ecosystems by letting nature regenerate itself, and create natural pesticides as well as antibiotics. Even better, these fungi could be used to clean up soils after an oil spill. You have to see it to believe it. This research, carried out by Paul Stamets, is very promising.



But why limit ourselves to research alone when results already demonstrate a clear improvement in crop yields? Here is a TED video (like many of the previous ones), presented by Guy Webb, showing techniques that promote the extension of plant roots and use bacteria and fungi to regenerate the soil by capturing CO2. Here, they go from conventional agriculture's capture of 1.5 tonnes of CO2 per hectare to 281 tonnes of CO2 per hectare... Food for thought.



In conclusion, there is certainly still a long way to go, but these discoveries are only the tip of an iceberg of possibilities. Imagine the progress accomplished in barely 20 years; now imagine another such span, at the speed our technology evolves. The future will not be as dark as some would have us believe. Instead of looking at a problem and leaving someone else to deal with it, the cross-pollination of ideas will be the answer to the challenges humanity will face in the years to come.

Vincent Blanchette

Sunday, January 8, 2012

LNT from the point of view of the Academy of Medicine of France

Statement by the Academy of Medicine of France, December 4, 2001  source
Medical Irradiation, Radioactivity Releases, and Disinformation:
An opinion by the Academy of Medicine
The Academy of Medicine, concerned by the questions raised among the public about medical exposure to X-rays and radioactive releases into the environment, and by the erroneous information to which these subjects give rise, wishes to give an opinion on this matter.
Humanity is exposed to ionizing radiation
Since its beginnings, life has developed in a bath of ionizing radiation to which it is adapted. These radiations have a cosmic origin or originate in the Earth's crust, where unstable isotopes of elements with very long physical half-lives have remained since the creation of the Earth: thorium, uranium, potassium, and rubidium. Natural exposure therefore results from internal and external sources, each characterized by various physical properties and different effects on the human body.
The presence of radionuclides in the environment results in an average radioactivity of 10,000 Bq in the human body, essentially from carbon-14 and from potassium-40, whose concentration is regulated by homeostatic control of intracellular potassium content. Average exposure of humans to natural sources is evaluated at 2.4 mSv per year, expressed as effective dose. There are nevertheless important variations according to the altitude and the nature of the rocks and soils, generally ranging from 1 to 10 mSv, and reaching more than 100 mSv in wide regions such as Kerala in India, or the city of Ramsar in Iran (1). These natural variations involve different target tissues for the dose being delivered, such as the lung for radon, the kidney for uranium, bone for radium, and bone, hepatic and systemic phagocytes for thorium, whose behavior and radiological characteristics are similar to those of plutonium.
To natural background irradiation is added, since the end of the 19th century, a diagnostic medical irradiation which delivers an average of about 1 mSv per year, but with variations from less than 1 mSv to more than 20 mSv per year.
Last, since 1950, it is necessary to add irradiations of industrial origin: notably that from producing electricity by nuclear energy (extraction and treatment of uranium, operation of reactors, etc.), corresponding to an exposure on the order of 0.01 to 0.02 mSv per year, and that from coal extraction and burning, about 0.01 mSv per year. In addition, radioactivity releases to the atmosphere contribute an average exposure of 0.005 mSv/yr, and the Tchernobyl accident about 0.002 mSv/yr (1).
At equal effective dose, the biological effects of the different types of ionizing radiation are identical whether their origin is natural or artificial.
Exposure of workers to ionizing radiation (200,000 in France, of whom more than half are in the medical sector) results, in France, in an average exposure of 2 mSv per year (OPRI annual report), with less than 1% surpassing the statutory limit of 20 mSv per year. Except for diagnostic irradiations, these exposures are characterized by chronic, low-dose-rate irradiation. This clearly distinguishes them from accidental and therapeutic irradiations, which are delivered at high dose rate, causing an instantaneous accumulation of damaged molecules that perturbs components of the cellular repair mechanisms, with as little as a few mGy absorbed in a few minutes (2).
Dismantling of nuclear power plants and nuclear waste storage are activities that add small doses to the populations at very low dose rates (about 0.005 µSv per year for iodine-129, for example) (1), essentially through transfer along the food chain of various man-made radionuclides of long half-lives, leading either to homogeneous exposure of the whole body (as in the case of natural potassium-40), or to selective organ exposure, e.g., of the large intestine, bone, liver and kidney, as in the case of the natural isotopes of uranium and thorium. It is therefore legitimate to infer their possible effects on human health from those known to result from natural sources, which expose populations of many millions of residents.
The health consequences of the exposure of humans to a few mSv.
There exist data (3) establishing that high natural exposure is associated in adults with an increased rate of chromosome aberrations in circulating lymphocytes, a biological indicator of exposure. It cannot be concluded, however, that this is an index of harm, since no global increase of cancer risk (4), no increase of congenital malformations (5), and no cytogenetically induced abnormalities in newborns (6) have been detected in the well-studied population of the particular highly-exposed region of Kerala, India, subject to external irradiation and internal contamination. Identical conclusions are obtained in the exposed Chinese populations (7-8). Last, as stated by the NCRP in the United States (9): «It is important to note that the incidence of cancers in most populations exposed to low-dose radiation has not been found to be increased, and that in most cases this incidence seems to have been reduced».
The hypothesis of cancer risks induced by low doses and dose rates is founded on extrapolation from data on highly-exposed human groups, treating the risk as constantly proportional to the received dose without any threshold: the linear no-threshold (LNT) assumption. This hypothesis faces many scientific objections (10), and it is contradicted by experimental data (11) and by epidemiology.
In the groups having received more than 200 mSv as adults, and 100 mSv as infants, excesses of cancer have been observed: e.g., in Japanese atomic bomb survivors in Hiroshima and Nagasaki, irradiated medical patients, nuclear workers, and residents of the former USSR contaminated by nuclear wastes. No excess of cancers has been observed for doses lower than 100 mSv. A doubt nevertheless remains in the case of in utero X-ray irradiation from 10 mSv, because the epidemiological data are contradictory (12).
The absence of observed excess cancers does not allow an effect at low doses to be excluded, because of statistical limitations. Nevertheless it is necessary to recall that the linear no-threshold (LNT) theory is contradicted by the observation of thresholds for bone cancers induced by radium-226 and for cancers of the liver induced by Thorotrast, and is not compatible with the leukemias induced in Hiroshima, nor with patients treated by radioactive iodine (1,10,13). Besides, the historic epidemiological study of British radiologists for the period 1897-1997 (14) showed that practitioners registered after 1954 have no excess of cancers in comparison with their non-radiologist colleagues, with a tendency toward a lower cancer rate, as in the case of the populations described by the NCRP (9). Similar deficits were observed for many groups of professional workers exposed to ionizing radiation, notably radiologic technicians: while the frequency of cancers increased in their jobs during the period when radiation protection was limited, the excesses of cancers disappeared when regulatory limits were reduced to 50 mSv/yr, as enforceable up to 1990 (12).
These observations, together with recent biological data showing the complexity and variety of the molecular and cellular mechanisms that control cell survival and mutagenesis according to dose and dose rate (1,2,11,13), remove all scientific rationale for a linear extrapolation that very widely overestimates the effects of low doses and dose rates. One cannot add up exposures of a few mSv/yr, and a fortiori lower than 0.02 mSv/yr, delivered to a large number of individuals (as is done with collective doses) to estimate the risk of excess cancers (15). The Academy of Medicine, joining the position of the large international institutions, strongly affirms that such calculations have no scientific validity, notably for evaluating the risks associated with radiation, such as the effects claimed outside the former USSR from the fallout from Tchernobyl.
The UNSCEAR 2000 report and the controversy with the OCHA.
The Tchernobyl catastrophe has caused to this day about 2,000 cancers of the thyroid in children, essentially from exposure to iodine-131 and the short-lived iodine isotopes. The delivered doses to the thyroid were on average of the order of 1 Gy, and of 3 Gy on average in the most exposed regions (16). This carcinogenic effect is therefore in keeping with the sum total of our knowledge of radiation risks.
In 2000, UNSCEAR concluded that there is an absence of excess leukemias and of cancers other than thyroid cancer in the population around Tchernobyl. It also did not find a relationship between the exposures to radiation and congenital malformations in these populations (1). This conclusion was questioned in 2001 by the OCHA, the humanitarian organization of the UN, but the OCHA publication was refuted in a response by the UNSCEAR committee, which alone has the medical and scientific competence to speak in the name of the UN and of the WHO on this subject (17). A conference was therefore held in Kiev in June 2001, with the WHO, OCHA, UNSCEAR, ICRP and IAEA, and the conclusions have been published (annex). These conclusions find that the health conditions are alarming because of the general deterioration of health and social conditions, notably in Belarus, but do not contradict the UNSCEAR conclusions. In fact, this deterioration is probably caused by the living conditions of the relocated populations, associated with psycho-sociological factors. Different questions have been raised that do appear to necessitate epidemiological research into the conditions of the catastrophe, comprising multiple factors susceptible of altering the health of populations: this is the recommendation of the Kiev conference.
It is possible to reduce human exposure to ionizing radiation, in particular that of medical origin, given the necessary means.
Radiological examinations represent, by very far, the principal cause of irradiation of human origin (average effective dose of about 1 mSv/yr in France). The recent European Union directive (18) introduces two notions on this subject: optimization (reducing the dose per examination as much as possible) and justification (evaluating the benefit and the risk of each examination, and not performing it unless it is advantageous). These principles therefore necessitate evaluation of the effective doses received by the examined subject and of the relevant risks. Now, according to the examinations and the techniques used, the effective doses vary from a fraction of a mSv to several tens of mSv (examinations by X-ray scanners or interventional radiology), and the risks vary widely according to age. An over-evaluation of risks could deprive a child of a useful examination; inversely, an under-evaluation could favor the multiplication of medical X-ray examinations that are not useful. The Academy counsels therefore, as a first step: 1) to focus on the study and evaluation of the examinations whose potential risks are the largest: X-ray scans of young subjects, multiple radiological examinations of premature infants, and interventional angiography; 2) to promote techniques likely to reduce or eliminate irradiation without harming the quality of clinical information, and to stimulate technical and basic research in this area; 3) to conduct epidemiological studies on groups of patients, notably infants, who have received the largest doses from radiological examinations; and 4) to favor the initial and continuing training of clinicians in matters of radiation protection.
It is unacceptable that, while irradiation of medical origin represents in France 95% of the irradiation added to the natural background, radiation protection resources are applied at very high cost to achieve reductions of little benefit in the industrial environment.
It is necessary to define health priorities in the matter of releases.
Outside of this context, some recommendations can be made concerning the health aspects of radioactive releases. It appears essential to support epidemiological studies of populations naturally exposed to high-level background radiation, and even of the populations of the ex-USSR that were massively exposed to radioactivity releases and to other pollution. In the framework of studies dealing with potential health effects of nuclear waste management, the priority isotopes should not be selected according to the collective dose, as some would have it, but according to the potential doses to individuals, because collective doses calculated from low individual doses of a few microsieverts cannot have any effect on health. A significant national effort should be undertaken, like the one under way in the framework of the programs of the U.S. DOE, on the biological mechanisms of the cellular response to doses below 100 mSv: in particular, DNA repair, cell signaling, and the hereditary transmission of parental DNA modified by irradiation.
 
– Recommendations –
The Academy of Medicine:
1 – recommends an increased effort in radiation protection in the area of radiological examinations: on the one hand, to reduce the doses received from certain types of examinations (X-ray scans of infants, interventional angiography, lung X-ray examinations of premature infants, etc.); and on the other hand, to allow radiology services, notably in radio-pediatrics, to benefit from physicists well trained in dosimetry and quality control of the devices, in a way similar to what was previously done with mammography in breast cancer screening. It recommends to this end supporting clinical and technical research in this area.
2 – recommends an effort of basic research: on the biological mechanisms activated by the repair of DNA damage after low doses up to 100 mSv; and on the effects of these doses on the exchanges of intra- and inter-cellular molecular signals.
3 – denounces the use of the linear no-threshold (LNT) relation to estimate the effect of low doses of a few mSv (of the order of magnitude of the variations of natural radiation in France), and a fortiori of doses hundreds of times lower, such as those caused by radioactive releases, or 20 times lower, such as those resulting in France from the fallout of radioactive materials from the Tchernobyl accident. It joins many international institutions in denouncing the improper use of the concept of collective dose to this end. These procedures are without any scientific validity, even if they appear to be convenient for administrative ends.
4 – subscribes to the conclusions of the 2000 Report of the United Nations Scientific Committee on the Effects of Atomic Radiation (UNSCEAR) concerning the analysis of the health consequences of the Tchernobyl accident, and denounces the propagation of allegations concerning excesses of cancers other than thyroid cancer, and excesses of congenital malformations.
5 – recommends introduction of the ADIR (Annual Dose of Incorporated Radioactivity, equivalent to 0.2 mSv, resulting from homogeneous exposure of the human body to natural potassium-40 and carbon-14), as this dose equivalent is almost constant whatever the size of the individual and the geographic region.
6 – The Academy of Medicine, in accordance with its October 3rd, 2000 statement, continues to recommend maintaining without modification the European directive concerning regulatory limits (100 mSv/5 yr). Substituting dose limits of 20 mSv/yr would reduce the flexibility of the European norm, without any health advantage, and would harm the functioning of medical radiology services while making the improvement of applicable techniques more difficult.
Glossary
-Bq or becquerel, the unit of radioactivity corresponding to one disintegration per second. In the human body, the 10,000 Bq from natural sources represent 1 ADIR, which is equivalent by convention to a dose equivalent of 0.2 mSv.
-Gy or gray, the absorbed dose corresponding to 1 joule per kg.
-Sv or sievert, the unit of equivalent dose, obtained as the product of the absorbed dose and the weighting factor for radiation quality (1 for X, beta and gamma radiation … 20 for alpha radiation). The effective dose, also expressed in Sv, is the product of the equivalent dose and the weighting factor for organs (0.05 for the thyroid … 1 for the whole body).
IAEA: International Atomic Energy Agency
ADIR: Annual Dose of Incorporated Radioactivity, a recommendation of G. Charpak.
DOE: Department of Energy, U.S.
ICRP: International Commission on Radiological Protection
NCRP: National Council on Radiation Protection and Measurements (USA)
OCHA: Office for the Co-ordination of Humanitarian Affairs
WHO: World Health Organization
UNSCEAR: United Nations Scientific Committee on the Effects of Atomic Radiation

References:
  1. UNSCEAR: Sources and effects of ionizing radiation, Report to the General Assembly, with annexes, United Nations, 2000.
  2. Feinendegen L, Pollycove M, Biologic Responses to Low Doses of Ionizing Radiation: Detriment Versus Hormesis, J Nuclear Medicine, 42, 7, 17N-27N and 26N – 37N, 2001.
  3. BEIR V: Committee on the Biological Effects of Ionizing Radiation. Health effects of exposure to low levels of ionizing radiations. National US Academy of Sciences, National Research Council, Washington 1990.
  4. Nair MK, Nambi KS, Amma NS, Gangadharan P, Jayalekshmi P, Jayadevan S, Cherian V, Reghuram KN. Population study in the high natural background radiation area of Kerala, India. Radiat Res. 152, 145-148S, 1999.
  5. Jaikrishnan J et al. Genetic monitoring of the human population from high-level natural radiation areas of Kerala on the southwest coast of India. Prevalence of congenital malformations in newborns. Radiat Res. 152, 149-153S, 1999.
  6. Cherian VD et al. Genetic monitoring of the human population from high-level natural radiation areas of Kerala on the southwest coast of India: incidence of numerical and structural chromosomal aberrations in the lymphocytes of newborns. Radiat Res. 152, 154-158S, 1999.
  7. Tao Z J Radiat Res (Tokyo) 41 Suppl:31-4, 2000.
  8. Wei LX, Sugahara T. High background radiation area in china. J Rad. Research (Tokyo) 41, Suppl. 1-76, 2000.
  9. National Council on Radiation Protection and Measurements – Evaluation of the linear non-threshold model for ionizing radiation – NCRP-136, Bethesda MD, USA, 2001.
  10. Academy of Sciences – Problems linked to the effects of low doses of ionizing radiations. Report 34, Oct 1995.
  11. Tanooka H. Threshold dose-response in radiation carcinogenesis: an approach from chronic alpha-irradiation experiments and a review of non-tumour doses. Int. J Radiat. Biol., 77, 541-551, 2001
  12. IARC 2000 – Monographs on the evaluation of carcinogenic risks to humans, Vol. 75, Ionizing radiation - IARC, Lyon, France
  13. Academy of Sciences – Symposium on risk due to carcinogens from ionizing radiation – Report, Academy of Sciences, Series III, 322, 81-256, 1999
  14. Berrington HAS. Darby SC, Weiss HA., Doll R. – 100 years of observation on British radiologists mortality from cancer and other causes 1897-1997. British Journal of Radiology, 74, 507-519, 2001
  15. BRPS Symposium, Warrenton: Bridging radiation policy and science (K.L. Mossman et al. Ed.) 2000
  16. IAEA, Final Report, Belarus, Ukrainian and Russian 2001: Health effects of the Tchernobyl accident.
  17. Holm LE (UNSCEAR Chairman) Chernobyl effects. Lancet, 356, 344, 2000
  18. European Directive 97/43 on radiological examinations, 1997

Monday, January 2, 2012

Climate debate at the Canadian Senate - Why Canada is out of Kyoto

A very important and interesting debate took place in mid-December 2011 on climate, CO2 and global warming. After listening to these very knowledgeable and serious researchers, it's easy to see why the Canadian government pulled out of the Kyoto Protocol.

Invited witnesses:

  • Ross McKitrick, Professor, Department of Economics, University of Guelph
  • Ian D. Clark, Professor, Department of Earth Sciences, University of Ottawa
  • Jan Veizer, Professor Emeritus, Department of Earth Sciences, University of Ottawa
  • Timothy Patterson, Professor of Geology, Department of Earth Sciences, Carleton University
Here's the full presentation on YouTube:


Other videos in other languages are available here.

The transcripts will be available here.

Ross McKitrick's testimony is available here.
Copied here for quick reference:

Testimony
For delivery to the Senate Standing
Committee on Energy, the Environment
and Natural Resources,
Parliament of Canada
Ottawa Ontario
December 15, 2011
Ross McKitrick, Ph.D.
Professor of Economics
College of Management and Economics
University of Guelph
Guelph Ontario Canada
ross.mckitrick@uoguelph.ca

1. My name is Ross McKitrick, and I am a Full Professor of Economics at the University of Guelph, where I specialize in environmental economics. I have published on both the economics of climate change and statistical analysis in climatology. I was an expert reviewer for the Intergovernmental Panel on Climate Change (IPCC) Fourth Assessment Report, and in 2006 I was one of 12 experts from around the world asked to brief a panel of the US National Academy of Sciences examining paleoclimate reconstruction methodology.

2. The global warming issue is often described with emphatic claims that the “science is
settled”, the situation is urgent, and the necessary actions are obvious. The reality is
that there are deep disagreements about underlying scientific issues, there is reason to
believe the problem has been exaggerated, and most policy proposals simply do not
pass objective cost-benefit tests. Amidst the disputes and controversies of the past few
years, I believe two points have emerged with clarity.

3.  First, the economics of climate change do not favour Kyoto-type commitments. Under
current and foreseeable technologies, the greenhouse gas policies we can afford to
undertake would have such small climatic impacts as to be pointless. The same kinds of
models that are used to forecast global warming predict that, if all signatories to the
Kyoto Protocol complied with their commitments, the level of carbon dioxide (CO2) in the
atmosphere that we would have observed by 2100, would instead have been reached by
about 2105, a trivial difference. And Kyoto was too costly for countries to reach. When a
policy is proposed that is too costly to implement and yields benefits that are too small to
measure, you would expect reasonable people to see it as a bad idea. Instead we
observed a dogmatic elite consensus emerge in support of Kyoto. In my mind this never
validated Kyoto, it merely discredited the elite consensus, and suggested to me that the
international political milieu in charge of the climate issue was unduly susceptible to
groupthink.

4. Unlike such air pollutants as sulphur dioxide and particulates, CO2 is not easy to
capture, and once captured, there is no obvious way to dispose of it. There appears to
be no way to cut CO2 emissions on a large scale without cutting energy consumption
and impeding economic activity. Despite their enthusiasm for embracing targets,
policymakers around the world have not been able to cut CO2 emissions while pursuing
economic growth. Simply put, with regard to climate policy, the cure is worse than the
disease.

5.  Second, the official process for assessing technical and scientific information on climate
change for the purpose of advising policymakers has become untrustworthy due to bias
and partisanship. As a member of the expert review team for the last IPCC Report, I
saw things take place that violated longstanding principles of peer review. I documented
some of them in various publications since 2007, but the issues never received much
attention until the fall of 2009, when thousands of emails from top IPCC scientists were
leaked onto the internet. The so-called Climategate emails confirmed the reality of bias
and cronyism in the IPCC process. The new leaks last month provided even more
confirmation that climate scientists privately express greater doubts and disagreement
about climate science than is reflected in IPCC reports.

6. Earlier this year I was asked by the London-based Global Warming Policy Foundation
to review IPCC procedures, and to make recommendations for reform. My report was
published last month, and includes a foreword written by John Howard, the former Prime
Minister of Australia. I have included a copy with this submission. I mainly focus on how
the IPCC handled issues of which I have first-hand knowledge as a contributor to the
peer-reviewed literature on the subject, and as an expert reviewer who worked closely
on the corresponding IPCC text.

7. The IPCC is not a neutral observer of the scientific process; instead, it has a party line. It
is controlled by a relatively small Bureau in Geneva, consisting of a small core
surrounded by a network of supportive academics and government officials. The
oversight body, called the IPCC plenary panel, is passive, inattentive, and overly
deferential to the Bureau. In effect there is no oversight.
8. The Bureau picks Lead Authors who share their views. They are routinely placed in the
position of reviewing their own work and that of their critics, and are free to rule in their
own favour. Lead Authors are also free to reject reviewer comments, over-ride Review
Editors, and even rewrite text after the close of the peer review process. The
combination of Bureau control over the selection of Lead Authors, and a toothless peer-review process, means that IPCC Assessments are guaranteed merely to repeat
and reinforce a set of foregone conclusions that make up the party line.

9. In my report I document some disturbing cases where the IPCC violated proper peer
review practices. These include:
• Manipulating prominent graphs so as to conceal known flaws in the statistical basis
of paleoclimate reconstructions and to exaggerate evidence that modern climate
change is historically exceptional. This is the so-called “hide the decline” scandal.
• Fabricating a statistical test result to provide a rationale for dismissing published
evidence of urbanization-related contamination of the surface temperature record on
which key IPCC conclusions were based.
• Waiting until the close of peer review, then removing text that had initially, and
correctly, cautioned readers that the IPCC method of calculating warming trends likely
exaggerated their significance, and replacing it with unsupported text saying the
opposite.

10. My report documents these and other incidents that, in my view, suffice to discredit the
IPCC's claims to rigour and objectivity, and point to the urgent need for procedural reform.

11. In 2010 the InterAcademy Council reviewed IPCC procedures and drew attention to
many of the same problems as my report does. Unfortunately the IPCC’s internal reform
process has gone nowhere. I discuss this problem in Section 4 of my report.

12. At this point we could simply muddle along for another 20 years enacting more and
more costly and wasteful schemes based on the increasingly biased and unreliable
guidance of the international climate policy milieu. That would be the easiest course of TESTIMONY: Senate of Canada, Ross McKitrick, December 15 2011
action, but would not serve the public interest. The more difficult option would be to
begin the hard work of improving the decision-making process itself, beginning with
reform of the IPCC.

13. My published research has led me to believe that the IPCC has overstated the global
warming issue. I have shown that the spatial pattern of warming trends in the surface
temperature record is strongly correlated with the spatial pattern of industrialization,
even though this pattern is not predicted by climate models as a response to
greenhouse gases. This indicates that the standard climate data sets likely have a warm
bias due to their failure to correct for disturbances of the land surface from urbanization,
agriculture, and so forth.

14. I have also shown that climate models predict significantly more warming over the past
30 years in the tropical troposphere than is observed in satellite or weather balloon
records. This is a key region for measuring the water vapour feedbacks that control the
magnitude of greenhouse warming. Despite this being the region that models say
should be warming fastest in response to greenhouse gases, the 50-year balloon record
actually shows no positive trend once the effect of ocean circulation changes in the late
1970s is removed from the record. One of the most telling emails in the so-called
Climategate 2.0 archive that was just released last month involves one IPCC expert
warning another that their efforts to finesse this issue by deceptive trend analysis are a
“fool's paradise.”

15. Today you have a chance to hear from a number of serious Canadian scientists about
work that they and their colleagues have done that also calls into question aspects of
the IPCC party line. The fact that you have learned little of what they are about to tell
you does not indicate any deficiencies in the research they or their colleagues have
done. Instead it points to the deficiencies in the process that was supposed to have
brought this information to your attention long before now.
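The five-year figure in point 3 invites a quick back-of-envelope check. A minimal sketch, with my own assumptions (not McKitrick's): CO2 concentration grows roughly linearly under business as usual, concentration growth scales with emissions, and full Kyoto compliance amounts to a fixed permanent fractional cut in global emissions.

```python
# Back-of-envelope check of the "2100 vs 2105" claim in point 3.
# Assumptions (mine, for illustration): linear CO2 growth, and
# concentration growth that scales with a permanent emissions cut.

def kyoto_delay_years(horizon_years, emission_cut_fraction):
    """Years of delay before the business-as-usual CO2 level is reached,
    if concentration growth is scaled down by the emissions cut."""
    return horizon_years / (1.0 - emission_cut_fraction) - horizon_years

# From ~2012 to 2100 is 88 years; a permanent ~5% global cut gives:
delay = kyoto_delay_years(88, 0.05)
print(round(delay, 1))  # about 4.6 years
```

Under these crude assumptions, even a permanent 5% cut in global emissions only delays the 2100 concentration level by roughly five years, which is the order of magnitude the testimony cites.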