Here is the question that guides my research...

The greed displayed by supranational corporations and certain oligarchic groups, together with their control of natural resources, is steering humanity toward a new feudal world order, or even toward its ruin. Faced with this situation, human beings are invited to refuse to accept at face value a pseudo-truth conveyed by media that may be in the pay of these corporations and groups. Instead, they are invited to engage in a process of discernment and awareness, in order to form their own truth by confronting their necessarily subjective reality with objective data, such as that revealed by science.

The penalty that good men pay for not being interested in politics is to be governed by men worse than themselves. - Plato

Friday, December 30, 2011

It's the end of the world... Again... Hum...

Today, December 30, 2011, we read this propaganda piece in The Gazette, distributed without analysis by Radio-Canada on Twitter.

They did not even copy the right information. Is it 2018 or 2020? It's important to get the end date right ;-)


ABOVE-AVERAGE TEMPERATURES WERE RECORDED DURING EVERY SEASON OF 2011. And while winter-weary Quebecers might welcome the warmer weather, it sends a more sinister message to scientists who see it as evidence that, by 2020, temperatures in Quebec will exceed the 2-degree tipping point they say will lead to catastrophic climate change. William Marsden reports, Page A3. WHAT KIND OF ENERGY IS QUEBEC USING - and what does it need? Our appetite will fuel debate in 2012, Lynn Moore writes. Page B1
A couple of points to note when you read this:
  • "Above-average temperatures..."
    • Which data set did they use? There are so many out there; without this vital information, it's hard to know what they are talking about.
    • Here's a graph of all the data sets for the last 10 years...
  • "evidence that, by 2020, temperatures in Quebec will exceed the 2-degree tipping point"
    • Evidence? By 2020? What data set extrapolation did they use? What models? Why 2 degrees? All of those are assumptions and extrapolations, based on mathematical models that are often unproven and unable to track real-life data.
  • "lead to catastrophic climate change"
    • Catastrophic climate change!!! Wow, 2 more degrees would be a catastrophe? If this ever happens, where is the proof that it would be a catastrophe? Continue reading... we have seen worse in the past.
  • The rest of the small article, the last 2 lines, does not make sense at all... No references are provided on The Gazette's web page.
I classify this small article as pure propaganda... When I read this I think of Mulder in The X-Files... I WANT TO BELIEVE!

Here's some real information that can help you judge for yourself, if you are not the type who believes everything the mass media throws at you!

Of all the data sets, in my humble opinion the most precise and accurate is the UAH dataset, which started in 1978 and is based on microwave sounding units aboard polar-orbiting satellites. Here's all the data since it started. I plotted a green line for a 13-month running average. I also plotted two regression lines, before and after the Mount Pinatubo event. Do you see in there a maximum average temperature higher than in other years? You also need to note the range of change since 1978, that is, over the last 33 years...
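A 13-month centered running average like the green line described above takes only a few lines of Python; the anomaly values below are made-up placeholders for illustration, not actual UAH data:

```python
def running_mean(series, window=13):
    """Centered running average; None where a full window does not fit."""
    half = window // 2
    out = []
    for i in range(len(series)):
        if i < half or i + half >= len(series):
            out.append(None)  # edges: not enough points for a full window
        else:
            out.append(sum(series[i - half:i + half + 1]) / window)
    return out

# Toy monthly anomalies in degrees C -- placeholders, NOT real UAH values
anomalies = [0.1, 0.2, 0.0, -0.1, 0.3, 0.2, 0.1, 0.0,
             0.2, 0.1, 0.3, 0.2, 0.1, 0.0, 0.2]
smoothed = running_mean(anomalies)
```

A centered window is what makes the smoothed curve line up with the raw data instead of lagging behind it, which is why an odd window such as 13 months is a natural choice for monthly series.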


But today's temperature data need to be put in perspective on a longer time scale... Here's a study of the last 2,485 years in the China region; it clearly shows that temperatures in the past were higher than today. If the tipping-point theory were right, we should have seen runaway temperatures a couple of times, with the planet burning to a crisp! But if you read up on the term "tipping point," it's not even clear what it means!
Tree-ring-based temperature reconstruction for the central-eastern Tibetan Plateau during the past 2485 years (gray line), the 40-year moving average (thick black line) and the 40-year running standard deviation (thin black line); the horizontal line is the mean temperature for the 2485 years.

Here's another study, covering the last 4,000 years in Greenland... As you can see, many times, again, the climate was warmer.
Past 4000 years of Greenland temperature. Thick blue line and band are the same as above. Thick green line represents 100‐year moving averages. Black and red lines are the Summit [Box et al., 2009] and AWS [Stearns and Weidner, 1991; Shuman et al., 2001; Steffen and Box, 2001; Vaarby‐Laursen, 2010] decadal average temperature, respectively. Blue and pink rectangles are the periods of 1000–2010 C.E. (Figure 1, middle) and 1840–2010 C.E. (Figure 1, top), respectively. Present temperature is calculated from the inversion-adjusted AWS decadal average temperature (2001–2010) as −29.9°C.
From Wikipedia, I highlighted in red what I think is missing in most climate-related journalism reports.

The elements of journalism

According to The Elements of Journalism, a book by Bill Kovach and Tom Rosenstiel, there are nine elements of journalism.[2] In order for journalists to fulfill their duty of providing people with information, they must be free and self-governing. They must follow these guidelines:
  1. Journalism's first obligation is to the truth.
  2. Its first loyalty is to the citizens.
  3. Its essence is discipline of verification.
  4. Its practitioners must maintain an independence from those they cover.
  5. It must serve as an independent monitor of power.
  6. It must provide a forum for public criticism and compromise.
  7. It must strive to make the news significant, interesting, and relevant.
  8. It must keep the news comprehensive and proportional.
  9. Its practitioners must be allowed to exercise their personal conscience.
In the April 2007 edition of the book,[3] they added a last element, the rights and responsibilities of citizens, for a total of ten elements of journalism.



So when you read stuff in the mass media, ask yourself a couple of questions and always be skeptical! Your way of life, and that of future generations, may depend on it.

Thursday, December 29, 2011

New research on low-dose radiation disproves LNT

Following on my previous articles on LNT, which you should read before this one, here's some good news about important research that could change the way we approach low-dose radiation.



New Take on Impacts of Low Dose Radiation
Berkeley Lab Researchers Find Evidence Suggesting Risk May Not Be Proportional to Dose at Low Dose Levels

Researchers with the U.S. Department of Energy (DOE)’s Lawrence Berkeley National Laboratory (Berkeley Lab), through a combination of time-lapse live imaging and mathematical modeling of a special line of human breast cells, have found evidence to suggest that for low dose levels of ionizing radiation, cancer risks may not be directly proportional to dose. This contradicts the standard model for predicting biological damage from ionizing radiation – the linear-no-threshold hypothesis or LNT – which holds that risk is directly proportional to dose at all levels of irradiation.

Imaging of a cell’s DNA damage response to radiation shows that 1.5 minutes after irradiation, the sizes and intensities of radiation induced foci (RIF) are small and weak, but 30 minutes later damage sites have clustered into larger and brighter RIF, probably reflecting DNA repair centers.

“Our data show that at lower doses of ionizing radiation, DNA repair mechanisms work much better than at higher doses,” says Mina Bissell, a world-renowned breast cancer researcher with Berkeley Lab’s Life Sciences Division. “This non-linear DNA damage response casts doubt on the general assumption that any amount of ionizing radiation is harmful and additive.”

Bissell was part of a study led by Sylvain Costes, a biophysicist also with Berkeley Lab’s Life Sciences Division, in which DNA damage response to low dose radiation was characterized simultaneously across both time and dose levels. This was done by measuring the number of RIF, for “radiation induced foci,” which are aggregations of proteins that repair double strand breaks, meaning the DNA double helix is completely severed.

“We hypothesize that contrary to what has long been thought, double strand breaks are not static entities but will rapidly cluster into preferred regions of the nucleus we call DNA repair centers as radiation exposure increases,” says Costes. “As a result of this clustering, a single RIF may reflect a center where multiple double strand breaks are rejoined. Such multiple repair activity increases the risks of broken DNA strands being incorrectly rejoined and that can lead to cancer.”

Berkeley Lab biophysicist Sylvain Costes is generating 3D time lapse of DNA repair centers in human cells to understand better how cancer may arise from DNA damage. (Photo by Roy Kaltschmidt, Berkeley Lab)


Costes and Bissell have published the results of their study in the Proceedings of the National Academy of Sciences in a paper titled “Evidence for formation of DNA repair centers and dose-response nonlinearity in human cells.” Also co-authoring the paper were Teresa Neumaier, Joel Swenson, Christopher Pham, Aris Polyzos, Alvin Lo, PoAn Yang, Jane Dyball, Aroumougame Asaithamby, David Chen and Stefan Thalhammer.

The authors believe their study to be the first to report the clustering of DNA double strand breaks and the formation of DNA repair centers in human cells. The movement of the double strand breaks across relatively large distances of up to two microns led to more intensely active but fewer RIF. For example, 15 RIF per gray (Gy) were observed after exposure to two Gy of radiation, compared to approximately 64 RIF/Gy after exposure to 0.1Gy. One Gy equals one joule of ionizing radiation energy absorbed per kilogram of human tissue. A typical mammogram exposes a patient to about 0.01Gy.
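To see why these numbers break from linearity, convert the reported per-gray rates into total RIF counts, a quick back-of-the-envelope check using only the two data points quoted above:

```python
# RIF rates reported in the Berkeley Lab study: dose (Gy) -> RIF per Gy
rif_per_gy = {0.1: 64.0, 2.0: 15.0}

def total_rif(dose_gy):
    """Total radiation-induced foci observed at a given dose."""
    return rif_per_gy[dose_gy] * dose_gy

low = total_rif(0.1)    # 6.4 RIF at 0.1 Gy
high = total_rif(2.0)   # 30.0 RIF at 2.0 Gy

# A strictly linear response would give 20x the foci for 20x the dose;
# the observed ratio high / low is only about 4.7x.
ratio = high / low
```

In other words, if damage foci scaled linearly with dose, the RIF/Gy rate would be constant across doses; the fact that it drops from 64 to 15 as dose rises is precisely the nonlinearity the paper reports.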

Corresponding author Costes says the DNA repair centers may be a logical product of evolution.

“Humans evolved in an environment with very low levels of ionizing radiation, which makes it unlikely that a cell would suffer more than one double strand break at any given time,” he says. “A DNA repair center would seem to be an optimal way to deal with such sparse damage. It is like taking a broken car to a garage where all the equipment for repairs is available rather than to a random location with limited resources.”

However, when cells are exposed to ionizing radiation doses large enough to cause multiple double strand breaks at once, DNA repair centers become overwhelmed and the number of incorrect rejoinings of double strand breaks increases.

“It is the same as when dozens of broken cars are brought to the same garage at once, the quality of repair is likely to suffer,” Costes says.

The link between exposure to ionizing radiation and DNA damage that can give rise to cancerous cells is well-established. However, the standards for cancer risks have been based on data collected from survivors of the atomic bomb blasts in Japan during World War II. The LNT model was developed to extrapolate low dose cancer risk from high dose exposure because changes in cancer incidence following low dose irradiation are too small to be measurable. Extrapolation was done on a linear scale in accordance with certain assumptions and the laws of physics.

“Assuming that the human genome is a target of constant size, physics predicts DNA damage response will be proportional to dose leading to a linear scale,” Costes explains. “Epidemiological data from the survivors of the atomic bombs was found to be in agreement with this hypothesis and showed that cancer incidence increases with an increase in ionizing radiation dose above 0.1 Gy. Below such dose, the picture is not clear.”
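The extrapolation Costes describes can be written as a one-line model: under LNT, excess risk is assumed proportional to dose all the way down to zero. The slope used here is a made-up illustrative number, not a value from the epidemiological data:

```python
# Hypothetical slope: excess cancer risk per Gy. Illustrative only --
# not an actual value derived from the atomic-bomb survivor cohort.
SLOPE = 0.05

def lnt_excess_risk(dose_gy, slope=SLOPE):
    """Linear-no-threshold model: risk strictly proportional to dose."""
    return slope * dose_gy

# The bomb-survivor data constrain the line above ~0.1 Gy; LNT simply
# extends the same straight line down through the low-dose region,
# where changes in cancer incidence are too small to measure directly.
risk_high = lnt_excess_risk(2.0)
risk_low = lnt_excess_risk(0.1)
```

The Berkeley Lab result challenges exactly this assumption: if repair works better at low doses, the true low-dose curve would fall below the straight line rather than on it.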

Previous studies failed to detect the clustering of double break strands and the formation of DNA repair centers because they were based on single-time or single-dose measurements of RIF at a discrete time after the initial exposure to ionizing radiation. This yields a net number of RIF that does not account for RIF that have not yet appeared or RIF that have already made repairs and disappeared. The time-lapse imaging used by Costes, Bissell and their co-authors showed that RIF formation continues to occur well beyond the initial radiation exposure and after earlier repair issues have been resolved. Time-lapse imaging also indicates that double strand break clustering takes place before any RIF are formed.

“We hypothesize that double strand break clustering occurs rapidly after exposure to ionizing radiation and that RIF formation reflects the repair machinery put in place around a single cluster of double strand breaks,” Costes says. “Our results provide a more accurate model of RIF dose response, and underscore fundamental concerns about static image data analysis in the dynamic environment of the living cell.”

Previous studies also mostly involved fibroblast cells whereas Costes, Bissell and their colleagues examined epithelial cells, specifically an immortalized human breast cell line known as MCF10A, which has a much higher background of RIF than fibroblasts, even without ionizing irradiation. To compensate for this higher background, Costes developed a mathematical method that enables background to be corrected for on a per- nucleus basis in unirradiated cells. Still the use of a special line of immortalized breast cells is an issue that Costes and his colleagues plan to address.

“We are now looking at primary breast epithelial cells that have been removed from healthy donors to determine if our results are repeated beyond just a single cell line and under more realistic physiological conditions,” Costes says. “We’d also like to know if our findings hold true for fibroblasts as well as epithelial cells. Also, we’d like to know if double strand break clustering is the result of a random coalescence or if there is an active transport mechanism that moves these double strand breaks towards pre-existing DNA repair centers.”

Working in collaboration with Rafael Gomez-Sjoberg of Berkeley Lab’s Engineering Division, Costes and his group are also developing a special microfluidics lab-on-a-chip device that is integrated into an X-ray microbeam. The goal is to provide a means by which cells can be kept in a controlled microenvironment while being irradiated with multiple doses. This microfluidic array will be used to characterize DNA damage response in breast and blood cells collected from human donors.

“By characterizing DNA damage response in cells from many different human donors,” Costes says, “we should be able to determine the variation across humans and gain a better understanding of how sensitivity to DNA damage from ionizing radiation might vary from individual to individual.”

This research was supported by the DOE Office of Science.

Obama: Business as usual with the military industrial complex and removal of constitutional rights.

Appointing financiers involved with military contractors to the FED

While on vacation in Hawaii, Obama designated Jerome Powell to serve on the Federal Reserve Board of Governors.

Powell served as the undersecretary for finance under President George H. W. Bush and was a partner at The Carlyle Group. The Carlyle Group is a massive private equity firm and one of the largest defense contractors in the world.

Its ranks have included some of the most influential policymakers of the last five administrations, including both Bush presidents, former Secretary of State James Baker III, former Secretary of Defense Frank Carlucci, former Clinton Chief of Staff Mack McLarty, and former SEC Chairman Arthur Levitt, to name a few.

Other notable investors in The Carlyle Group include the bin Laden family and the Saudi Royal Family. Coincidentally, George H. W. Bush was meeting at the Ritz Carlton Hotel in Washington on the morning of September 11th with one of Osama Bin Laden's brothers.

Watch the documentary below for some more background on the Carlyle Group:


Eisenhower warns us of the military industrial complex.



Obama Signing Statement Says He isn’t Bound by 20 Provisions of General Spending Bill

Some examples, the president objected to:
  • A ban on spending money to move terrorist detainees from Guantánamo to prisons in the U.S.
  • Restrictions on placing American troops involved in United Nations peacekeeping missions under foreign command
  • Thirty days advance notice to Congress for any use of the military that would involve more than $100,000 in construction costs
  • “Disclosure to the Congress of information regarding ongoing diplomatic negotiations”
  • Language that blocks health, climate, auto policy and urban affairs “czars” from being employed by the White House
  • A provision that bars health officials from advocating for gun control
  • Congressional approval of funds transferred from the Library of Congress to the Copyright Office.


National Defense Authorization Act - Tyranny Vs. Liberty

Subtitle D of the National Defense Authorization Act (NDAA) is headed to the president’s desk. These under-the-radar provisions, co-sponsored by Senators John McCain and Carl Levin, would allow for the indefinite military detention of any person alleged to be a member of Al Qaeda, the Taliban or “associated forces.” The provisions also apply to any person who supports or aids “belligerent” acts against the United States, whether the person is apprehended here or abroad.




President Obama and the Spread of Security Theater

The TSA was created in response to the September 11, 2001, attacks. Its first administrator, John Magaw, was nominated by President Bush on December 10, 2001, and confirmed by the Senate the following January. The agency's proponents, including Transportation Secretary Norman Mineta, argued that a single federal agency would better protect air travel than the private companies that operated under contract to individual airlines or to groups of airlines that used a given terminal facility.

Now the TSA is being expanded to all modes of travel.


The Globalization of War

The world's attention is increasingly focused on Syria and Iran as the region continues to move toward military confrontation. Less noticed, however, is that the pieces are being put into place for a truly global conflict, with military buildup taking place in every region and threatening to draw in all of the world's major powers.




The Pentagon and its Sock Puppets


An internal Department of Defense review has concluded that a Rumsfeld-era program under which retired military officers who appeared on American broadcast media were given special briefings and access was consistent with Pentagon rules. The New York Times reports:

The inquiry found that from 2002 to 2008, Mr. Rumsfeld’s Pentagon organized 147 events for 74 military analysts. These included 22 meetings at the Pentagon, 114 conference calls with generals and senior Pentagon officials and 11 Pentagon-sponsored trips to Iraq and Guantánamo Bay, Cuba. Twenty of the events, according to a 35-page report of the inquiry’s findings, involved Mr. Rumsfeld or the chairman of the Joint Chiefs of Staff or both. One retired officer, the report said, recalled Mr. Rumsfeld telling him: “You guys influence a wide range of people. We’d like to be sure you have the facts.”

The inspector general’s investigation grappled with the question of whether the outreach constituted an earnest effort to inform the public or an improper campaign of news media manipulation. The inquiry confirmed that Mr. Rumsfeld’s staff frequently provided military analysts with talking points before their network appearances. In some cases, the report said, military analysts “requested talking points on specific topics or issues.” One military analyst described the talking points as “bullet points given for a political purpose.” Another military analyst, the report said, told investigators that the outreach program’s intent “was to move everyone’s mouth on TV as a sock puppet.”

The internal review also apparently found no fault with the exclusion of four individuals precisely because they refused to be sock puppets, speaking critically of some Pentagon decisions. One of them, General Wesley Clark, apparently lost his position as an analyst for CNN because of Pentagon and White House displeasure with what he had to say.

The investigation was prompted by David Barstow’s Pulitzer Prize–winning exposé of the Pentagon program. Barstow wrote:

Records and interviews show how the Bush Administration has used its control over access and information in an effort to transform the analysts into a kind of media Trojan horse—an instrument intended to shape terrorism coverage from inside the major TV and radio networks.

Analysts have been wooed in hundreds of private briefings with senior military leaders, including officials with significant influence over contracting and budget matters, records show. They have been taken on tours of Iraq and given access to classified intelligence. They have been briefed by officials from the White House, State Department and Justice Department, including Mr. Cheney, Alberto R. Gonzales and Stephen J. Hadley.

In turn, members of this group have echoed administration talking points, sometimes even when they suspected the information was false or inflated. Some analysts acknowledge they suppressed doubts because they feared jeopardizing their access.

The Barstow exposé revealed two of the most important media scandals to emerge from the Iraq War period. The first went to the Rumsfeld Pentagon’s deft use of its enormous public-affairs resources to influence the American media, often for blatantly political purposes. These operations were plainly illegal. Since World War II, Congress has imposed clear limits, written into defense-appropriations measures, on the Pentagon’s ability to engage in domestic public-relations operations. The Department of Defense is permitted to run recruitment campaigns and give press briefings to keep Americans informed about its operations, but it is not permitted to engage in “publicity or propaganda” at home. The internal DoD review exonerating the practice of mobilizing and directing theoretically independent analysts apparently focuses on the fact that the program conforms with existing department rules, but it overlooks the high-level prohibition on “publicity or propaganda,” which was plainly violated.

The second scandal goes to the broadcasters themselves. They apparently recruited these analysts anticipating access to the Pentagon and a steady conduit of information. Their compromise highlights the Achilles heel of the Beltway media: access, not critical or objective coverage, is everything. There is little evidence to suggest that the broadcasters took any meaningful steps to assert their independence or objectivity—indeed, the dismissal of Wesley Clark by CNN shows precisely the opposite. The net result is that American viewers were sold on independent analysis and instead got individuals, often with ongoing contractor relationships with the Pentagon, who read from pre-prepared Pentagon talking points.

In his 1961 farewell address, President Dwight Eisenhower warned against the “acquisition of unwarranted influence, whether sought or unsought, by the military-industrial complex.” Although he was persuaded that this new relationship between the Pentagon and its contractors was “a vital element in keeping the peace,” he was deeply troubled by the relationship’s potential to disrupt the delicate balance of interests that is fundamental to a modern democracy. David Barstow’s investigation provided some of the most subtle and compelling evidence of this process to appear in recent years. The Pentagon’s self-exonerating report, by contrast, suggests that media sock puppets may become a modus operandi.

Wednesday, December 28, 2011

Vertical farming and other technologies in the service of humanity

Vertical farming

You have no doubt read the recent news: on October 31 we passed the mark of 7 billion human beings on Earth. Well, while the little prophets of doom continue their diatribes about the environmental impact of what they call the population explosion and the imminent arrival of an era of famine, the fact remains that, at present, 925 million human beings lack food.



When we look at the facts, the current economic system instead puts its priorities into shameless consumption by a small percentage of the world's population, and into biofuel production that uses arable land to make ethanol. And that is without counting the constant reminders of soil erosion and the coming shortage of arable land for agriculture. The impact of these practices could grow as the population rises, according to U.N. predictions, to 9 billion by 2050. An area equivalent to that of Brazil would then be needed to feed everyone.

Savings of 30 to 60% of food would be possible by improving food distribution systems and prioritizing direct human consumption. That is the share of food wasted every year, according to the United Nations Food and Agriculture Organization (FAO).


The harmful impact of today's commercial agriculture:

The massive use of pesticides and insecticides must also be addressed, because they are gradually killing pollinating insects (see the debates about bees over the last five years) and are potentially carcinogenic. The massive exploitation of the land causes impoverishment and premature erosion of the soil, and increased desertification.

Aside from all this "good" news, how can we solve all these problems in one stroke while beautifying our environment? While we're at it, we might as well show that human innovation knows only the limits imposed by its own creativity.

The idea comes from a geologist, Gilbert Ellis Bailey, who in 1915 published a book on "vertical farming". But it is Dickson Despommier, professor of microbiology and environmental sciences at Columbia University in New York, who was unquestionably the founder of this recent movement. It started from a crazy idea, as he himself described it in 1999, when, in one of his classes, he talked about practicing urban agriculture on building rooftops.



The movement spread like wildfire around the entire world in less than a decade. He then launched the idea of implementing the process throughout an entire 30-storey building. Architects then set out to design eco-environments, in which a closed cycle of processes ensures an intelligent use of resources. In such a closed loop, nothing is lost and nothing is created. Although the idea is still only at the experimental stage, there are currently six projects of this kind:

  1. The Plant in Chicago
  2. Alpha Farm in Manchester, which will be the pièce de résistance of England's 2013 international exposition
  3. A research laboratory in Suwon, South Korea, laid out over three floors; there are also several vegetable factories in Japan, as well as companies such as
  4. Terrasphere
  5. Aerofarms
  6. Valcent Verticrop.
From a very simple idea emerged what many are calling the next great green revolution.
These trials are only in their infancy. Of course, some obstacles remain regarding the feasibility of such a project on a large scale. As I always say, it is enough to combine current knowledge to find solutions to our problems.

One of the big challenges is the electrical supply for such structures. According to some calculations, it would take eight times the capacity of current U.S. power plants to produce the lighting they would need. However, given recent developments and progress in energy production, it might be possible to meet the energy needs of a single building with waste pyrolysis or plasma gasification systems, such as those supplied by the company Terragon or by Plasco Energy Group. Otherwise, unless there is an energy revolution that makes power cheaper and less dangerous to produce (see thorium nuclear plants or low-energy nuclear reactions, LENR), this is likely to remain a challenge.

New advances at McGill University with LED lights make it possible to use light spectra of different colours to stimulate crop productivity while consuming 10% of the energy of current lighting.

Moreover, the buildings could be built with nature's genius, as presented by Michael Pawlyn. The windows could be made very strong and economical to build using a new inexpensive, ecological material called ETFE(2). It is a polymer sheet that can be stretched over a steel structure in three layers and inflated with air. It costs 24 to 70% less to install than glass, supports 400 times its own weight, and is self-cleaning and recyclable. Natural light could thus be maximized, limiting the use of artificial light.

The volume of fruits and vegetables produced could also be boosted by a higher CO2 concentration, on the order of 1,200 ppm, potentially increasing growth by 44%.



Now let's talk about the advantages, because they are legion:
  • We are talking about fully organic production, 365 days a year, without pesticides or insecticides, independent of the seasons.
  • We are talking about a business that could create thousands of jobs in cities and provide fast, local distribution of fresh produce while limiting the use of fossil fuels.
  • We are talking about yields 5 to 10 times higher than conventional agriculture, on 10 times less land.
  • We are talking about a process that can be implemented in any climate, in any country.
  • We are talking about water savings on the order of 5 times compared with conventional agriculture.
  • In short, we are talking about a way to feed a growing population healthily while limiting its environmental impact.

Comme mentionné plus haut, tout serait pensé pour fermer le cycle de la consommation : de l'hydroponie combinée à de l'aquaculture, faisant l'élevage de plusieurs espèces de poissons tout en cultivant des laitues. Les déjections des poissons nourrissent les plantes en nutriments, l'eau étant également traitée, filtrée et recyclée. Les poissons seraient nourris grâce aux restes des cultures et des déchets des préparations dans les usines situées en dessous.

Un système de pyrolyse et/ou gazéification utiliserait les déchets non recyclables pour alimenter une partie du bâtiment en électricité, tout en récupérant de l’eau et autre matière. Un biodigesteur de biométhanisation pourrait aussi être utilisé pour prendre le relais avec les restes des usines de transformation des produits.

Des installations aéroponiques pourraient maximiser l'utilisation de l'eau – utilisant seulement 10 % d'eau comparée à l'agriculture intensive — des cultures maraîchères, de tomates, de concombres, de tous les types de salades, d'épinards et de laitues ainsi que des herbes et des épices. Ils peuvent également faire le même traitement grâce à un système de compte-gouttes et de cultures verticales, distribuant l'eau directement aux racines, et s'égouttant à d'autres plateaux superposés.

Is all this a utopia? The projects already under way are only the first steps toward a better future, if the idea keeps spreading as it already has. It is possible to make such infrastructure economically viable while raising human dignity. The potential of these technologies is no longer a dream but a reality. Thinkers of the world, you can still find other ways to improve life for everyone, for humankind as much as for the planet's biodiversity.

A video that gives a good summary of the possibilities:


Vincent Blanchette

dimanche 11 décembre 2011

Remember cold fusion... Now LENR, Low Energy Nuclear Reactions

There have been many new developments since Pons and Fleischmann's cold fusion announcement.

Now we are talking about LENR, or Low Energy Nuclear Reactions.

Here are three presentations given at NASA on September 22nd, along with some interesting slides extracted from them. More information and discussion here.

Zawodny Slides



Nelson Slides

Bushnell Slides
In Short, LENR, depending upon the TBD performance, appears to be capable of Revolutionizing Aerospace across the board. No other single technology even comes close to the potential impacts of LENR upon Agency Missions.


Some good videos on the technology:

Low Energy Nuclear Revolution (English Version)


Cold Fusion More than Junk Science 60minutes 9-4-19 2 of 2


Dr. Robert Duncan on Cold Fusion at the Missouri Energy Summit 2009 Part 1-3 playlist

dimanche 27 novembre 2011

Interview: Clinton Bastin - Iran Has a Nuclear Power Program, Not a Weapons Program

Full text of the interview

Interview #1/2

Interview: Clinton Bastin #1/2
by: Astr0o0o0o

Iran Has a Nuclear Power Program, Not a Weapons Program

Interview #2/2
Interview: Clinton Bastin #2/2
by: Astr0o0o0o

Iran Has a Nuclear Power Program, Not a Weapons Program

dimanche 20 novembre 2011

How much bailout money was spent, and what is the impact on workers?

When Obama allowed banks to be bailed out at a cost in excess of 3 trillion dollars, rescuing institutions that gambled instead of letting them fail and reforming a system doomed to fail, he put more than 22 thousand dollars of additional debt on each of roughly 140 million US workers.

We would have to levy 5% of the average salary of every US worker for almost 10 years just to pay for this single bailout.
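
A quick check of the arithmetic (the average-salary figure is an assumed round number, not from the text):

```python
# Sanity-check the bailout figures above (salary is a hypothetical round number).
bailout = 3e12          # ~3 trillion dollars
workers = 140e6         # ~140 million US workers
avg_salary = 45_000     # assumed average annual salary, illustrative only

per_worker = bailout / workers
years_at_5_percent = per_worker / (0.05 * avg_salary)
print(f"Debt per worker: ${per_worker:,.0f}")           # about $21k
print(f"Years of a 5% salary levy: {years_at_5_percent:.1f}")
```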

What a mess!

dimanche 6 novembre 2011

The more energy you have, the longer you live and the richer you are

Energy and economy (source)

Energy is the most fundamental requirement of every society or nation as it progresses through the ladder of development. Of course, once it reaches a relative degree of development, the energy demand becomes more stable. There is a distinct and categorical correlation between the energy consumption and income of a nation — each reinforcing the other. Look around you: every step into progress comes with an addition of demand for energy — cars, ships and aircraft to move, hospitals to give quality healthcare, education, as it follows the model of e-connectivity, production of more and better goods, irrigation for better farming. In fact, every element of our lives is increasingly going to become energy-intensive — that is a necessary prerequisite for development. This is clearly reflected in the average energy consumption per person across nations


Click on play!

Nuclear power is EXTREMELY dangerous, but compared to the alternatives, EXTREMELY safe!




As of March 2011, Naval Reactors have run 6300 reactor-years, driving 528 reactor cores on 220 ships over 145,000,000 miles without a single radiological incident  or injurious radiation exposure to crew or public.  Because of the shielding from the hull and the seawater, crews at sea generally get less radiation, living within 100 meters of an operating nuclear reactor, than their families at home.  All of the radiological information about the ships and associated shore facilities is released to the public in documents in which the detailed data are accumulated without a break since 1954.

With respect to the international nuclear power industry at large, John Ritch, the Director-General of the World Nuclear Association, made the following statement to the science editor of station NDTV on 24 October 2011:  “Perhaps I would think this problem is more serious if we had been besieged by many large fatality accidents in nuclear power. But I think I am correct in saying that in fourteen thousand five hundred reactor-years of civil nuclear power production we have not seen a fatality apart from the limited number of deaths that occurred as a result of the Chernobyl accident...Very few industries have produced such beneficial results with such an extremely low toll of damage to the environment or the public.  This industry has an amazing record of safe performance and beneficial contribution. That basic fact is much too little appreciated by the public.”

During the same period, the following non-nuclear accidents occurred:

Banqiao Dam Failure:  One of 62 hydroelectric dams in Zhumadian Prefecture in China that failed catastrophically or were intentionally destroyed in 1975 during Typhoon Nina.  An estimated 172,000 people were killed, 11 million people lost their homes, and about one-third of the electric power capacity of the national grid was destroyed.  The resulting damage to the farmland is not reported.

Bhopal Pesticide Factory Release:  A leak of methyl isocyanate gas from a Union Carbide pesticide plant in Bhopal, Madhya Pradesh, India in 1984 led to 558,125 injuries, including 38,478 temporary and partial, and 3,900 severe and permanent.  An estimated 3,000 died within weeks and another 8,000 have since died from the incident.  Some believe these official estimates grossly understate the situation.

Deaths from Coal:  From coal’s air pollution alone, there have been 30,000 deaths per year in the US and 500,000 per year in China.  These figures do not include the deaths of coal miners, the destruction of stream beds by pushing mountain-tops into them, the effect of mercury and other toxins on fish, etc.
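
For perspective, the mortality figures quoted in this post can be turned into a rough deaths-per-terawatt-hour comparison. The US coal generation total (~2,000 TWh/yr) and the ~7 TWh produced per reactor-year are my own assumptions; the 50-death Chernobyl toll is the figure cited above:

```python
# Rough deaths-per-TWh comparison using the figures quoted in this post,
# plus two assumptions of mine: ~2,000 TWh/yr of US coal generation and
# ~7 TWh of electricity per civil reactor-year.
coal_deaths_per_year_us = 30_000
coal_twh_per_year_us = 2_000            # assumption
nuclear_deaths_total = 50               # Chernobyl toll cited above
nuclear_twh_total = 14_500 * 7          # reactor-years x assumed TWh each

coal_rate = coal_deaths_per_year_us / coal_twh_per_year_us
nuclear_rate = nuclear_deaths_total / nuclear_twh_total
print(f"Coal:    {coal_rate:.1f} deaths/TWh")
print(f"Nuclear: {nuclear_rate:.4f} deaths/TWh")
print(f"Ratio:   ~{coal_rate / nuclear_rate:,.0f}x")
```

Under these assumed inputs coal comes out tens of thousands of times deadlier per unit of energy; the exact ratio moves with the assumptions, but the orders of magnitude do not.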

BP Oil Spill:  The environmental and health impact of this event has not been estimated.  And there are many other spills that have received little attention. The fact is that it is simply not true that nuclear radiation is uniquely hazardous, even when totally uncontrolled releases occasionally occur.

Another fact is that unwarranted fear of harmless levels of radiation has caused unprecedented damage.  People are afraid to return to their homes and businesses.  They’ve terrified themselves, their friends and their children.  The health effect of such widely enforced terrorism is itself devastating.  The effect on the economy is paralyzing.  

In Fukushima, amid thousands of non-nuclear deaths, an international investigation under the IAEA concluded: "To date no health effects have been reported in any person as a result of radiation exposure from the nuclear accident." Yet the government is concerned about letting people return to their homes.

There is no defensible scientific basis for discouraging people from living where radiation levels are “high,” when they are still lower than the highest natural radiation levels in Iran, Brazil, Norway, India, China and other regions where people have dwelt healthfully for countless generations with backgrounds hundreds of times higher than deemed “permissible.” 

More fundamentally, why should radiation level be the prime consideration as to where and how one chooses to live?  Many people make decisions that increase their radiation dose many-fold by moving to mountainous regions, or by cladding their houses in brick or stone, or by visiting radioactive health spas.  By what authority does the radiation protection police's particular concern outrank all others?  Are we going to let them strip the natural soil off the ground in Japan, to lower the radiation background to some arbitrary number?

Why should we fear “nuclear waste”?  The only way it can harm anyone is if it is eaten.  It is not in soluble form, so we store it in shielded cans until it is needed to be recycled as fuel in a reactor designed for that purpose.  This is not difficult; the process has been demonstrated, but it is currently cheaper to just store the used fuel until needed.  Non-nuclear industry produces millions of times more lethal doses of other poisons.  The main difference is that the nuclear material gets less toxic every day, and after a few hundred years, becomes no more toxic than some natural ores.  But the non-nuclear wastes maintain full toxicity forever.  Fukushima and 9/11 have shown that we should design the plants to perform under even more extreme conditions, and these improvements have been underway in America since immediately after 9/11.

Putting radiation numbers in perspective:
Marshall Brucer, “the father of nuclear medicine,” in his canonical Chronology of Nuclear Medicine, shows how widely radiation backgrounds vary.  On page 323, he lists various radiation background levels (with cosmic ray contribution removed) from New York City at 0.62 mSv/year to SW France up to 876; to the potash fertilizer area in Florida up to 1750.  He notes, “If you live in one place on earth, your background may vary from day to day by a factor of ten, or even 100…The inside exposure rate can change by a factor of 10 within hours, just by opening windows.”  He notes that building with brick, rather than wood, can nearly double your daily radiation dose, but that the radioactivity of bricks and concrete is also highly variable: from 0.05 to 4.93 mSv/yr for bricks, and from 0.29 to 25.4 for concretes.  “A factor of 10 daily variation [in radiation dose] marks the diets of most people.”  [mR in original, converted here to mSv]
People have lived healthily for millennia with natural radiation up to the following levels (mSv/yr):
Ramsar, Iran (260); Kerala, India (35); Guarapari, Brazil (35); Yangjiang, China (5.4)
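
Those backgrounds can be set against a typical 1 mSv/yr public dose limit; the limit value is a common regulatory figure that I am supplying for comparison, not one given in the text:

```python
# Natural backgrounds quoted above vs. a typical 1 mSv/yr public dose limit.
backgrounds_msv = {
    "Ramsar, Iran": 260,
    "Kerala, India": 35,
    "Guarapari, Brazil": 35,
    "Yangjiang, China": 5.4,
    "New York City": 0.62,
}
public_limit_msv = 1.0  # common regulatory limit for the public (my assumption)

for place, dose in backgrounds_msv.items():
    print(f"{place}: {dose / public_limit_msv:g}x the public limit")
```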


Other articles on nuclear power and radiation:
Comments?

vendredi 28 octobre 2011

Population to reach 7 billion this year

Population growth is going down everywhere.


So is the fertility rate.


Good article:
The world’s population will reach 7 billion at the end of October. Don’t panic

vendredi 14 octobre 2011

Good ideas for public transit in Quebec.



The TrensQuebec monorail as a national transportation enterprise.
http://www.trensquebec.qc.ca/

Building a modular nuclear power plant in 11 weeks!

Looking at what we accomplished in the 1960s, one wonders why it now takes 10 years and billions of dollars to build a nuclear power plant.

In seventy-seven days, the Army team assembled the prefabricated reactor. Just nine hours after fuel elements containing forty-three pounds of enriched Uranium-235 were inserted into the reactor, electricity was produced.

WARNING: STRONG POLITICAL VIEW IN THIS VIDEO.


More info:
The US Army Nuclear Power Program was created to develop small nuclear power reactors for use at remote sites. Most were based on existing US Naval reactor designs. Eight reactors were built in all, and six of the eight produced useful power. The nuclear reactor at Camp Century was the first of the US Army's portable reactors to actually produce power.

The portable nuclear power plant at Camp Century was designated PM-2A. Its designation indicates: “P” for Portable; “M” for Medium Power; “2” for the sequence number; and the letter “A” indicates field installation. The PM-2A was rated two megawatts for electrical power and also supplied steam to operate the water well. The PM-2A was built by Alco Products, Inc. of Schenectady, New York. The USNS Marine Fiddler transported the reactor from Buffalo, New York to Thule Air Base in Greenland, arriving on July 10, 1960. Up to this time, it was the most valuable cargo ever shipped out of the port of Buffalo. In addition, the Army flew one of the three blast coolers to Thule on a C-124 Globemaster to demonstrate the practicality of air transport. Four hundred tons of pipes, machinery, and components were then carefully transported over the ice in twenty-seven packages. Special care was taken not to damage the parts, since intensely cold metal can become dangerously brittle. As a credit to superb packaging, a ceramic top to a lab cabinet was the only item damaged during transport.

In seventy-seven days, the Army team assembled the prefabricated reactor. Just nine hours after fuel elements containing forty-three pounds of enriched Uranium-235 were inserted into the reactor, electricity was produced. It was soon discovered that additional shielding would be necessary. This shielding was accomplished by adding a layer of two inch thick lead bricks to the primary shield tank. Except for downtime for routine maintenance and repairs, the reactor operated for thirty-three months, until July 9, 1963, when it was deactivated pending a decision to remove it. This decision stemmed from plans to discontinue year-round operations at Camp Century to reduce costs. In addition, the tunnel support structure sheltering the reactor was suffering from recurring damage due to compacting snow. A conventional diesel powered plant would have consumed over one million gallons of fuel over the same period. While the power plant was designed to provide 1560 kilowatts of power, Camp Century's power needs peaked at 500 kilowatts, and gradually declined from there. During the reactor's operational life, a total of 47,078 gallons of radioactive liquid waste was discharged into the icecap. The PM-2A was removed in the summer of 1964 by the 46th Engineers based at Fort Polk, Louisiana. No military service was willing to accept the plant at another location so the PM-2A's components were put into storage. The reactor vessel was subjected to destructive testing in order to study neutron embrittlement of carbon steel. Phillips Petroleum Company conducted the testing for the US Atomic Energy Commission in 1966. After extreme testing, it was found to be much more durable than expected. Failure of the vessel finally occurred at minus twenty degrees Fahrenheit and 4,475 pounds per square inch pressure after hydrochloric acid was added to a machined defect.
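
The "over one million gallons" of diesel can be roughly reproduced; every input below is an assumption of mine chosen for illustration, not a figure from the article:

```python
# Rough reconstruction of the diesel-fuel comparison (all inputs assumed).
hours = 33 * 730            # ~33 months of operation
avg_load_kw = 500           # the camp's peak demand, used here as the load
kwh_per_gallon = 12         # typical electrical output per gallon of diesel

gallons = hours * avg_load_kw / kwh_per_gallon
print(f"~{gallons:,.0f} gallons of diesel over the same period")  # ~1,003,750
```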

Dr. Edward Calabrese: The Fraud of LNT and Future of Radiation

As I explained before, the LNT, or Linear No-Threshold model, seems to be based on fraudulent science.  Dr. Calabrese explains the history of the LNT and what the future of radiation policy could look like if it were based on science.

mardi 11 octobre 2011

If Vermont Yankee had an Incident like Fukushima

Good presentation on nuclear radiation and related topics.

Jones Seminar on Science, Technology, and Society.

"If Vermont Yankee had an Incident like Fukushima, What Would be the Responsibilities and Public Expectations of the Scientific Community?"

Presented by Harold Swartz, Dartmouth Medical School.



source


  The purpose of this presentation is:

  1. To understand the medical implications of a total body exposure of an individual to high levels of ionizing radiation (greater than 1 Gray), and also, potential long-term effects.
  2. To understand how to interpret the information and counsel colleagues and the public about radiation exposures in the environment.
  3. To understand public health implications and actions that should be taken when there are potential exposures of large numbers of individuals to ionizing radiation as might occur with a major accident from a nuclear power plant or an attack by terrorists that involves ionizing radiation.
  4. Delineate and differentiate between medical needs and risks between a nuclear device and an incident at a nuclear power plant.
  5. Provide guidance for your responses to a radiation event (because people will believe you!).
  6. Explain context and needs for dosimetry in large scale event.
  7. Describe our current research and activities in responses to major radiation events.

About the Speaker

Harold Swartz, MD, MSPH, PhD is a Professor of Radiology, Medicine (Radiation Oncology), Physiology, Community and Family Medicine, Chemistry, and Engineering and the director of the Dartmouth EPR Center and the Dart-Dose CMCR. He is an internationally recognized biophysicist and radiation biologist who has especially been involved in the development of magnetic resonance for preclinical and clinical applications. He has done research and teaching on the biological effects of ionizing radiation since 1962 at Walter Reed, Medical College of Wisconsin (MCW), University of Illinois at Urbana-Champaign (UIUC), and Dartmouth. The EPR Center at Dartmouth is especially focused on developing and applying in vivo EPR for measurements of physiological and pathophysiological parameters in vivo. He has developed, at Dartmouth, the first clinical program in EPR, which has a special emphasis on using the technique to measure radiation dose after the fact for purposes of triage and to advance cancer treatment by using repetitive measurements of oxygen to optimize delivery of cancer therapy. He has founded and directed national EPR Centers at the Medical College of Wisconsin, University of Illinois at Urbana-Champaign, and at Dartmouth. He is the PI of one of the CMCR Centers, DART-DOSE CMCR, focusing on Physical Biodosimetry. He is the author/coauthor of approximately 450 papers and four books. He has received several international awards including the Zavoisky prize.

vendredi 7 octobre 2011

GreenPeace: Anti Science, Fear, Anti Nuclear, Climate Alarmists

Here follow some stories about Greenpeace. Click on the links for the full stories.

Patrick Moore: From Greenpeace Dove to Nuclear Power Phoenix
Summary

If energy, food and education are the building blocks of civilization, Greenpeace Cofounder Patrick Moore is using his role of "sensible environmentalist" to build support for the concept of sustainable power generation. In this exclusive interview with The Energy Report, Patrick proposes that industry and government work together to advance nuclear power in the United States as the most effective way to supply continuous energy to homes, businesses and institutions.
Excerpt
They [GreenPeace] are more aptly described as political or social activists, which is fine in its own right, but when you are starting to deal with complex issues of chemistry and biology, you do need a little grounding in science in order to make good decisions.
Greenpeace tended to take a more black-and-white approach to many of these issues, and today they are opposed to all nuclear energy, even though it's a safe and clean alternative to fossil fuels. They are opposed to genetic engineering even though this could help eliminate micronutrient deficiency or malnutrition around the world. They are opposed to sustainable forestry, even though it's the most renewable resource in the world. They are opposed to farming fish in the ocean, which is a way of taking pressure off of wild stocks, which are overfished.
I think we made the mistake of lumping nuclear energy in with nuclear weapons. Our original campaign was against nuclear war, and we painted everything nuclear as evil. Today that makes no sense at all to me. Nuclear medicine is obviously not evil. Those nuclear isotopes used in nuclear medicine are created in nuclear reactors. That's one of the things we can do with nuclear technology. Another one is to make energy that's clean and safe. 

Why is the New York Academy of Sciences allowing its name to be used in an anti-science FUD campaign?
Summary:
Greenpeace wrongly used a published volume from the NYAS (New York Academy of Sciences) to support unproven claims that close to 1 million people died from the Chernobyl disaster. The real number of deaths directly attributable to the materials released by the accident will end up being very close to 50.
Excerpt
Greenpeace claims that “based on now available medical data, 985,000 people died as a result of the Chernobyl disaster.” The authority for this statement is “the book recently published by the New York Academy of Sciences.” That death-toll is not supportable by scientific evidence.
A 2005 review of the data by an informal group called the Chernobyl Forum included a suggestion that, based on the LNT premise (that even a single gamma ray could cause a cancer), 4000 additional deaths might ultimately occur. Since there is no indication that these deaths are likely, and since “prediction” of deaths by adding up thousands of small individual radiation doses has been repeatedly forbidden as scientifically unsound, the suggested 4000 deaths has not been widely accepted.
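
The disputed 4000-death figure shows how LNT-style projections are generated: a collective dose is multiplied by a fixed risk coefficient. A minimal sketch with illustrative numbers (the ~5% per person-sievert coefficient is typical of LNT-based assessments; the collective dose here is chosen to reproduce the quoted figure):

```python
# LNT-style projection: predicted deaths = collective dose x risk coefficient.
risk_per_person_sv = 0.05            # ~5% fatal cancers per person-Sv (typical LNT value)
collective_dose_person_sv = 80_000   # illustrative Chernobyl-scale collective dose

predicted = risk_per_person_sv * collective_dose_person_sv
print(f"Projected deaths: {predicted:.0f}")   # Projected deaths: 4000
```

This multiplication is precisely the kind of collective-dose arithmetic that the excerpt describes as scientifically unsound.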

Greenpeace’s Fear Machine

Summary:

Art Horn, writing at the Energy Tribune, calls our attention to a Greenpeace document titled The Climate Time Bomb. It was written 17 years ago – in 1994 – but it may as well have been yesterday.
The dramatic language, the glass-is-always-half-empty perspective on the world, the blind faith in the Intergovernmental Panel on Climate Change (IPCC) – it’s all there.
Excerpt:
One of the more distressing parts of that 1994 Greenpeace report is the section on human health. It’s important to remember that the IPCC’s first health chapter didn’t appear until 1995. It was so badly botched that Paul Reiter, who has devoted his entire professional life to the study of diseases spread by mosquitoes – including malaria – later described it as “amateurish.” Nevertheless, in 1994 Greenpeace just knew that climate change would lead to more disease. Its report includes a closeup photo of a person with a large open sore on their face, and declares that:
Our health is threatened by climate change. Malaria, asthma, encephalitis, tuberculosis, leprosy, dengue fever and measles are all expected to become more common.
How does Greenpeace know this?
  • because Australian officials “believe hotter summer temperatures may be contributing” to an increase in malaria
  • because US researchers think the discovery of a new strain of mosquito suggests that these mosquitoes “may spread rapidly in a global warming world” 
  • because a UK government report “suggests that malaria and other tropical diseases, and even bubonic plague, could be reintroduced to the UK as a result of global warming.”
This is all that Greenpeace requires. Beliefs and suggestions – nothing more. Greenpeace is happy to take mere possibilities and translate them into the confident statement that Our health is threatened by climate change. It feels no shame in adding a lurid photograph which, one supposes, is intended to imply that we’re all fated to become disfigured if we don’t get with the Greenpeace program.

mardi 4 octobre 2011

Nuclear risk management - Testimony by John D. Boice

Source

Testimony on Nuclear Energy Risk Management after Fukushima, May 13, 2011

John D. Boice, Jr., Sc.D.
Professor of Radiation Epidemiology
Department of Medicine
Vanderbilt University

United States House of Representatives
Committee on Science, Space and Technology
Subcommittees on Energy & Environment and Investigations and Oversight

Washington, DC
May 13, 2011

Testimony for the Record

Good morning, Mr. Chairmen, ranking Members, and Members of the Subcommittee. I am pleased to discuss the possible health implications of radiation from the Fukushima Daiichi nuclear power plant accident in Japan. Just a few days before the natural disasters struck on March 11, 2011, I was in Hiroshima, Japan as a member of the Radiation Effects Research Foundation's Science Council, reviewing the study of atomic bomb survivors. I would like to begin by expressing my heartfelt sympathy for the families of the tens of thousands who lost their lives as a result of the tsunami and earthquake and for the hundreds of thousands who have been displaced from their homes and livelihoods. The health consequences associated with the radiation exposures emanating from the Fukushima Daiichi plant pale in comparison.

As background, I am a radiation epidemiologist and Professor in the Department of Medicine at Vanderbilt University and Scientific Director of the International Epidemiology Institute. I have spent my career studying human populations exposed to radiation, including Chernobyl clean-up workers, patients receiving diagnostic and therapeutic radiation, underground miners exposed to radon, nuclear energy workers, atomic veterans, persons living in areas of high background radiation and U.S. populations living near nuclear power plants and other facilities. I am also a commissioner of the International Commission on Radiological Protection, an emeritus member of the National Council on Radiation Protection and Measurements, a U.S. delegate to the United Nations Scientific Committee on the Effects of Atomic Radiation, and a member of the Congressionally-mandated Veterans Advisory Board on Dose Reconstruction.

My remarks will cover five areas:
  • Fukushima is not Chernobyl.
  • The health consequences for Japanese workers and public appear to be minor.
  • The health consequences for United States citizens are negligible to nonexistent.
  • We live in a radioactive world.
  • There is a pressing need to learn more about the health consequences of radiation in humans when exposures are spread over time at low levels and not received briefly at high doses such as in atomic bomb survivors.

Fukushima is not Chernobyl

The Chernobyl accident on April 26, 1986, resulted in massive radiation exposures, both to the emergency workers putting out the ensuing fire and to the environment. There was no containment vessel and after the explosion a fire burned for ten days and spewed radioactive particles continuously into the environment. The emergency workers, the first responders and fire fighters, received so much radiation that 28 of them died of acute radiation sickness within a few months of exposure. Those who survived developed cataracts at a high rate and several subsequently died of myelodysplastic disorders. Radioactive iodines were deposited on large areas throughout the Ukraine, Belarus and Russian Federation and were ingested by cows who gave milk that was drunk by children, and an epidemic of thyroid cancer ensued beginning about five years after the accident. Over 520,000 recovery workers were sent to clean up the environment and build the so-called sarcophagus to contain the damaged nuclear reactor. To date there is little conclusive evidence for adverse health effects associated with radiation received during these clean-up operations. There have, however, been indications of severe psychological stress and increased rates of suicide.

In contrast, while the radiation releases from Fukushima are estimated to be up to 10% of that from Chernobyl, there appears to be substantially less worker and public exposure. The Japanese authorities relaxed the allowable annual limit of worker exposure from 2 to 25 rem for this emergency situation, but only about 21 workers received more than 10 rem and only two workers received between 20 and 25 rem. These levels are far below the hundreds of rem needed to cause acute radiation sickness. Those workers who experienced levels over 10 rem to their entire body, however, have an increased lifetime risk of developing cancer of about 1-2% over the expected normal lifetime rate of about 42%. There were reports of high radiation fields in the vicinity of the damaged reactors and spent fuel storage ponds and with the contaminated water, but apparently the Japanese authorities rotated workers in such a way that cumulative exposures to individuals were minimized. Three workers received beta particle exposures to their legs from an estimated 200-300 rem to the skin, but the health consequences of these localized exposures were minimal and resulted in only a reddening of the skin.

Exposure to the public was minimal in large part because of the prevailing winds and the quick action taken by the Japanese authorities. The prevailing winds were generally to the east and over the ocean and thus did not result in meaningful radiation exposures to the Japanese public. In contrast to the circumstances around Chernobyl where the authorities failed to alert or evacuate the surrounding populations until several days had passed, the Japanese government quickly evacuated persons living within 20 km of the Fukushima Daiichi plant and recommended that those living within 30 km stay indoors to minimize any possible exposure to radioactive releases. In addition, they immediately monitored the food and water supplies and banned the shipment of foodstuffs and milk where the radiation levels exceeded allowable standards.

These protective action measures, including the distribution of stable iodine pills (or syrup for children), minimized public doses and suggest that there will be minimal health consequences associated with any radiation exposures to the Japanese public. This is borne out in one survey of over 1,000 children who had their thyroids measured for possible uptakes of radioactive iodine. Not one child had a measurement above detectable limits. This is in contrast to children living near Chernobyl for whom large numbers had extremely high levels of radioactive iodine detected in their thyroids from drinking contaminated milk shortly after the accident.

Nonetheless, some of the prevailing winds did blow toward populated areas shortly after the accident and during the hydrogen explosions, and to the north-west in particular. Rain, snow and hail deposited radioactive particles in certain regions, including some beyond 20 km, and these areas will be a concern for remediation before allowing public access or return. The Japanese authorities are considering regular medical examinations for workers and inhabitants who received more than 10 rem. To reduce anxiety, they are considering medical check-ups for those who may have received between 2 to 10 rem. They are also grappling with important issues as to when and how to allow evacuated inhabitants to return to their homes. Childhood exposures are of particular concern and topsoil is already being removed from some school playgrounds.

Thus, while Fukushima is clearly a major reactor accident, the potential health consequences associated with radiation exposures in terms of loss of life and future cancer risk are small, particularly in contrast with those resulting from the Chernobyl accident some 25 years ago.

For completeness, the 1979 reactor accident at Three Mile Island did not release appreciable amounts of radioactive substances into the environment, and public and even worker exposures were minimal. The average dose to people in the area was only about 1 millirem, about what the surrounding population would receive in three days from natural background radiation.

The health consequences for United States citizens are negligible to nonexistent
Fukushima is 5,000 miles away from the United States and the radiation that has been detected was substantially diluted after traveling such a long distance. The detection of trace amounts of radiation speaks more about the extreme sensitivity of our radiation detectors than about the potential health consequences from the radiation itself. In addition to EPA’s RadNet system that monitors water, milk and the atmosphere, the Department of Energy has radiation monitoring equipment that can detect minute quantities of radioactive particles from the other side of the world as part of the Comprehensive Nuclear Test Ban Treaty. The tiny amounts of detected radioactive materials from Fukushima pose no threat to human health. They represent, at most, only a tiny fraction of what we receive each day from natural sources, such as the sun, the food we eat, the air we breathe and the houses we live in.

It is impressive that radiation monitors can detect levels of radioactive iodine-131 as low as 0.03 Bq/L (0.8 pCi/L) in milk in Washington State; this is the decay of one radioactive atom per second in about 33 gallons of milk. Such a level is 5,000 times below the Derived Intervention Level set by the Food and Drug Administration to trigger concern over radionuclides in food. An infant would have to drink hundreds of gallons of milk to receive a radiation dose equivalent to a day’s worth of natural background radiation exposure. Such tiny levels of radiation are inconsequential compared with the levels we experience in daily life.

Interestingly, the radiation monitoring stations in Washington State had to detect radionuclides other than iodine-131 in order to distinguish radiation from Fukushima from that at any local hospital in the area. Most nuclear medicine departments use radioactive iodine for imaging the thyroid and to treat thyroid diseases, and patients are discharged shortly after intake and remain radioactive for several months, releasing small but detectable levels of radioactive iodine into the environment.

The trivial levels of radiation from Japan, while detectable, should not be of concern, and Americans should not take stable iodine (potassium iodide pills, KI) as a preventive measure to block the thyroid's uptake of radioactive iodine. There are potential adverse health effects from taking KI pills, and these risks have to be balanced against a nonexistent benefit.

We live in a radioactive world
To place the radiation levels from Fukushima in perspective, it is important to recognize that we live in a radioactive world. A banana, for example, has about 10 Bq of activity; that is, 10 radioactive potassium atoms in it decay every second. All the foodstuffs we eat that contain potassium also contain a small amount of radioactive potassium, a primordial radionuclide with a billion-year half-life. There are no concerns and no health consequences from such exposures.
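The banana figure can be reproduced from the physics of potassium-40 alone. A sketch under stated assumptions (the ~0.4 g potassium content of a banana is a typical nutritional value, not a figure from the testimony):

```python
import math

# Estimate the radioactivity of a banana from potassium-40 decay.
HALF_LIFE_K40_YR = 1.25e9      # K-40 half-life, years
ABUNDANCE_K40 = 1.17e-4        # fraction of natural potassium that is K-40
MOLAR_MASS_K = 39.1            # g/mol for natural potassium
AVOGADRO = 6.022e23
SECONDS_PER_YEAR = 3.156e7

# Decay constant (per second) and specific activity of natural potassium.
lam = math.log(2) / (HALF_LIFE_K40_YR * SECONDS_PER_YEAR)
atoms_k40_per_gram_k = (AVOGADRO / MOLAR_MASS_K) * ABUNDANCE_K40
bq_per_gram_k = lam * atoms_k40_per_gram_k   # roughly 30 Bq per gram of potassium

banana_k_grams = 0.4                         # assumed potassium content of one banana
print(f"Banana activity: about {bq_per_gram_k * banana_k_grams:.0f} Bq")
```

The result lands on the order of 10 Bq, consistent with the figure quoted above.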

We breathe radioactive radon, which contributes about 210 millirem per year to our natural background radiation. Bricks and granite contain radioactive materials that expose the public to radiation (about 20 millirem per year). The Capitol Building was constructed with granite and is frequently cited as having some of the highest radiation levels in the United States, about 85 millirem per year. Water contains small amounts of radioactive radium, thorium and uranium, all within allowable limits.

Not only do we live in a radioactive world, our bodies are themselves radioactive (30 millirem per year). Each second, over 7,000 radioactive atoms in our bodies decay and can irradiate those sitting next to us; the atoms are largely radioactive potassium in our muscles and carbon-14 in our tissues. The amount of radiation we receive each year from medical sources (300 millirem), such as CT and medical imaging, now equals the amount received from natural sources (300 millirem). International travel increases our exposure to cosmic rays and space radiation: a roundtrip flight from Dulles to Tokyo results in about 20 millirem. Living in Denver for a year results in 450 millirem of radiation dose, or 45% more than the U.S. average of 310 millirem from natural sources. About 2.5 million Americans (0.8% of the population) receive more than 2,000 millirem per year from natural sources.
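The everyday doses listed above can be lined up for a side-by-side check. A small sketch using only the figures quoted in the testimony (all values in millirem per year):

```python
# Annual radiation doses quoted above, in millirem, for comparison.
annual_mrem = {
    "radon (inhaled)": 210,
    "bricks and granite": 20,
    "our own bodies": 30,
    "medical imaging (U.S. avg)": 300,
    "U.S. natural background (avg)": 310,
    "living in Denver (natural)": 450,
}

# How much more a Denver resident receives than the U.S. natural average.
excess = annual_mrem["living in Denver (natural)"] / annual_mrem["U.S. natural background (avg)"] - 1
print(f"Denver excess over U.S. average: {excess:.0%}")

# A single Dulles-Tokyo roundtrip (~20 mrem) as a fraction of a year's background.
flight_fraction = 20 / annual_mrem["U.S. natural background (avg)"]
print(f"Dulles-Tokyo roundtrip: {flight_fraction:.1%} of annual background")
```

The computed Denver excess (450 vs. 310 millirem) works out to about 45%, and one transpacific roundtrip amounts to only a few percent of a year's natural background.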

These examples are not to minimize the health consequences of high-level exposures which are clearly demonstrable in human populations and include acute radiation sickness at very high doses in excess of 200 rem and an increase in cancer at moderate doses above about 10 rem (10,000 millirem). The examples do indicate, however, that we live in a world of low-level radiation for which the possible health consequences are of little concern. The exposures to the U.S. population from Fukushima are tiny and thousands of times below U.S. standards or guidelines where remedial action would be triggered.

What research is needed?
Although we know much about the health effects of high levels of radiation when received briefly, as was the case for atomic bomb survivors, the risk following exposures experienced gradually over time is uncertain and remains the major unanswered question in radiation epidemiology.

One untapped opportunity is to study our own U.S. radiation workers and veterans. The Low Dose Radiation Program within the Department of Energy had the foresight to initiate pilot investigations of over one million such workers and this comprehensive work should continue. Cooperating agencies include the National Cancer Institute, the Department of Defense, the Department of Veterans Affairs, the Nuclear Regulatory Commission and others. The study populations include early DOE and Manhattan Project workers, atomic veterans who participated in nuclear weapons testing in the 1940s and 1950s, nuclear utility workers, medical workers and others involved in the development of radiation technologies, as well as nuclear navy personnel.

Such a large study in the United States is critically important to understand scientifically the health consequences of low-dose radiation experienced over time and is directly relevant to the setting of protection standards for workers and the public; the assessment of possible risks from enhanced medical technologies such as CT and nuclear medicine imaging; the expansion of nuclear power; the handling of nuclear waste; the compensation of workers with prior exposures to radiation; and even the possible consequences of the radiation released from reactor accidents such as at Fukushima. To date, no direct study of these issues has been large enough to provide convincing answers and extrapolations from the atomic bomb exposures in 1945 have to be relied upon.

Summary
Fortunately, the health consequences from the radiation releases from the Fukushima Daiichi power plant appear to be minimal and are of little importance with regard to the U.S. public. The Japanese authorities acted quickly to evacuate over 200,000 inhabitants living near the damaged reactors; they monitored food and water and took rapid action to ban foodstuffs with increased radiation levels; they distributed stable iodine pills and syrup; and they made measurements on over 175,000 persons. The lasting effects upon the Japanese population will most likely be psychological with increased occurrence of stress-related mental disorders and depression associated not necessarily with the concern about reactor radiation, but with the horrific loss of life and disruption caused by the tsunami and earthquake. There is a need for better public understanding and better communications on the health effects of radiation exposures. Finally, there is now the opportunity in the United States to learn directly about low-dose, long-term radiation health effects by studying our workers and veterans.
Thank you for this opportunity to testify. I welcome any questions that you may have.

Relevant References
Boice JD Jr. Lauriston S. Taylor lecture: radiation epidemiology--the golden age and future challenges. Health Physics 100(1):59-76, 2011.

Christodouleas JP, Forrest RD, Ainsley CG, Tochner Z, Hahn SM, Glatstein E. Short-Term and Long-Term Health Risks of Nuclear-Power-Plant Accidents. New England Journal of Medicine, April 20, 2011.

Idaho National Laboratory. Oversight Program: Guide to Radiation Doses and Limits. [http://www.deq.idaho.gov/inl_oversight/radiation/radiation_guide.cfm]

International Atomic Energy Agency. Fukushima Nuclear Accident Update Log
[http://www.iaea.org/newscenter/news/tsunamiupdate01.html]

National Council on Radiation Protection and Measurements, NCRP Report No. 160, Ionizing Radiation Exposure of the Population of the United States, March 2009.

Report of the President's Commission on the Accident at Three Mile Island, Washington, D.C. (The Kemeny Commission Report), October 1979.

Smith J. A long shadow over Fukushima. Nature, April 5, 2011.

UNSCEAR. United Nations Scientific Committee on the Effects of Atomic Radiation. Sources and Effects of Ionizing Radiation, UNSCEAR 2008 Report to the General Assembly, with Scientific Annexes, Volume II, Annex D, Health Effects due to Radiation from the Chernobyl Accident (United Nations Publications, New York), 2011.

U.S. Army Corps of Engineers. [http://www.lrb.usace.army.mil/fusrap/docs/fusrap-fs-uranium-2008-09.pdf]

Wakeford R. And now, Fukushima (editorial). Journal of Radiological Protection (in press).