Here is the question that guides my research...

The greed displayed by supranational corporations and certain oligarchic groups, together with their control of natural resources, is steering humanity toward a new feudal-style world order, or even toward its demise. Faced with this situation, human beings are invited to refuse to accept at face value a pseudo-truth conveyed by media that may be in the pay of these corporations and groups. Instead, they are invited to engage in a process of discernment and awareness in order to build their own truth by confronting their necessarily subjective reality with objective data, such as that revealed by science.

The penalty that good men pay for not being interested in politics is to be governed by men worse than themselves. - Plato

Friday, December 30, 2011

It's the end of the world... Again... Hum...

Today, December 30th, 2011, we read this propaganda piece in The Gazette, which was also distributed, without analysis, by Radio-Canada on Twitter.

They did not even copy the right information. Is it 2018 or 2020? It's important to get the end date right ;-)


ABOVE-AVERAGE TEMPERATURES WERE RECORDED DURING EVERY SEASON OF 2011. And while winter-weary Quebecers might welcome the warmer weather, it sends a more sinister message to scientists who see it as evidence that, by 2020, temperatures in Quebec will exceed the 2-degree tipping point they say will lead to catastrophic climate change. William Marsden reports, Page A3. WHAT KIND OF ENERGY IS QUEBEC USING - and what does it need? Our appetite will fuel debate in 2012, Lynn Moore writes. Page B1
A couple of points to note when you read this:
  • "Above-average temperatures..."
    • Which data set did they use? There are so many out there; without this vital information, it's hard to know what they are talking about.
    • Here's a graph of all the data sets for the last 10 years...
  • "evidence that, by 2020, temperatures in Quebec will exceed the 2-degree tipping point"
    • Evidence? By 2020? What data-set extrapolation did they use? What models? Why 2 degrees? These are all assumptions and extrapolations based on mathematical models that are often unproven and unable to track real-life data. (See the sketch after this list.)
  • "lead to catastrophic climate change"
    • Catastrophic climate change! Wow, 2 more degrees would be a catastrophe? If this ever happens, where is the proof that it would be a catastrophe? Keep reading... we have seen worse in the past.
  • The rest of the short article, the last two lines, does not make sense at all... No references are provided on The Gazette's web page.
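
To illustrate how sensitive a "by 2020" projection is to the data set and the fit window chosen, here is a minimal sketch in Python. The anomaly values below are made up for illustration only; they are not UAH, Environment Canada or any other real series, and the point is only to show how two equally "valid" extrapolations can disagree.

    import numpy as np

    # Hypothetical annual temperature anomalies (degrees C) for 2002-2011.
    # Illustrative numbers only, not an actual data set.
    years = np.arange(2002, 2012)
    anomalies = np.array([0.31, 0.27, 0.19, 0.34, 0.26, 0.29, 0.05, 0.26, 0.40, 0.13])

    def extrapolate(x, y, target_year):
        """Fit an ordinary least-squares line and extend it to target_year."""
        slope, intercept = np.polyfit(x, y, 1)
        return slope * target_year + intercept, slope

    # Same series, two different fit windows.
    proj_10, slope_10 = extrapolate(years, anomalies, 2020)
    proj_5, slope_5 = extrapolate(years[-5:], anomalies[-5:], 2020)

    print(f"10-year fit: trend {slope_10*10:+.2f} C/decade, 2020 projection {proj_10:+.2f} C")
    print(f" 5-year fit: trend {slope_5*10:+.2f} C/decade, 2020 projection {proj_5:+.2f} C")

With these made-up numbers the two fits already point in opposite directions, which is exactly why a claim like "2 degrees by 2020" means little unless the data set, the window and the model are named.
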
I classify this short article as pure propaganda... When I read it, I think of Mulder in The X-Files... I WANT TO BELIEVE!

Here's some real information that can help you judge for yourself, if you are not the type who believes everything the mass media throws at you!

Of all the data sets, in my humble opinion the most precise and accurate is the UAH data set, which started in 1978 and is based on microwave sounding units aboard polar-orbiting satellites. Here is all the data since it started. I plotted a green line for a 13-month running average, and two regression lines, one before and one after the Mount Pinatubo eruption. Do you see a maximum average temperature higher than in other years? Also note the total range of change since 1978, that is, over the last 33 years...
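For readers who want to reproduce that kind of plot themselves, here is a minimal sketch of a 13-month centred running average and two regression lines split at the June 1991 Mount Pinatubo eruption. It assumes you have saved the UAH lower-troposphere anomalies as a simple two-column CSV (decimal year, anomaly); the file name and column layout are assumptions for the example, not the actual UAH distribution format.

    import numpy as np
    import matplotlib.pyplot as plt

    # Assumed CSV layout: column 0 = decimal year, column 1 = anomaly in degrees C.
    t, anom = np.loadtxt("uah_tlt_anomalies.csv", delimiter=",", unpack=True)

    # 13-month centred running average (the green line in the figure above).
    window = np.ones(13) / 13.0
    running = np.convolve(anom, window, mode="valid")
    t_running = t[6:-6]  # centre of each 13-month window

    # Separate least-squares trends before and after the Pinatubo eruption (mid-1991).
    pinatubo = 1991.5
    for label, mask in [("pre-Pinatubo", t < pinatubo), ("post-Pinatubo", t >= pinatubo)]:
        slope, intercept = np.polyfit(t[mask], anom[mask], 1)
        plt.plot(t[mask], slope * t[mask] + intercept,
                 label=f"{label}: {slope*10:+.2f} C/decade")

    plt.plot(t, anom, color="lightgray", label="monthly anomaly")
    plt.plot(t_running, running, color="green", label="13-month running average")
    plt.legend()
    plt.show()

Splitting the regression at the eruption is a judgment call; the same code with a different split point or window length will give different trends, which is worth keeping in mind whenever a single trend number is quoted.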


But today's temperature data need to be put in perspective on a longer time scale... Here is a study of the last 2,485 years in the China region. It clearly shows that temperatures in the past were higher than today; if the tipping-point theory were right, we should have seen runaway temperatures a couple of times, with the planet burning to a crisp! And if you read up on the term "tipping point," it is not even clear what it means.
Tree-ring-based temperature reconstruction for the central-eastern Tibetan Plateau during the past 2485 years (gray line), the 40-year moving average (thick black line) and the 40-year running standard deviation (thin black line); the horizontal line is the mean temperature for the 2485 years.

Here's another study covering the last 4,000 years in Greenland... As you can see, many times, again, the climate was warmer.
Past 4000 years of Greenland temperature. Thick blue line and band are the same as above. Thick green line represents 100-year moving averages. Black and red lines are the Summit [Box et al., 2009] and AWS [Stearns and Weidner, 1991; Shuman et al., 2001; Steffen and Box, 2001; Vaarby-Laursen, 2010] decadal average temperature, respectively. Blue and pink rectangles are the periods of 1000–2010 C.E. (Figure 1, middle) and 1840–2010 C.E. (Figure 1, top), respectively. Present temperature is calculated from the inversion adjusted AWS decadal average temperature (2001–2010) as −29.9°C.
From Wikipedia; I have highlighted in red what I think is missing from most climate-related journalism reports.

The elements of journalism

According to The Elements of Journalism, a book by Bill Kovach and Tom Rosenstiel, there are nine elements of journalism.[2] In order for journalists to fulfill their duty of providing people with the information they need to be free and self-governing, they must follow these guidelines:
  1. Journalism's first obligation is to the truth.
  2. Its first loyalty is to the citizens.
  3. Its essence is discipline of verification.
  4. Its practitioners must maintain an independence from those they cover.
  5. It must serve as an independent monitor of power.
  6. It must provide a forum for public criticism and compromise.
  7. It must strive to make the news significant, interesting, and relevant.
  8. It must keep the news comprehensive and proportional.
  9. Its practitioners must be allowed to exercise their personal conscience.
In the April 2007 edition of the book,[3] they added a last element, the rights and responsibilities of citizens, bringing the total to ten elements of journalism.



So when you read something in the mass media, ask yourself a couple of questions and always be skeptical! Your way of life, and that of future generations, may depend on it.

Thursday, December 29, 2011

New research on low-dose radiation disproves LNT

This follows up on my previous articles on LNT, which you should read before this one. Here is some good news about important research that could change the way we approach low-dose radiation.



New Take on Impacts of Low Dose Radiation
Berkeley Lab Researchers Find Evidence Suggesting Risk May Not Be Proportional to Dose at Low Dose Levels

Researchers with the U.S. Department of Energy (DOE)’s Lawrence Berkeley National Laboratory (Berkeley Lab), through a combination of time-lapse live imaging and mathematical modeling of a special line of human breast cells, have found evidence to suggest that for low dose levels of ionizing radiation, cancer risks may not be directly proportional to dose. This contradicts the standard model for predicting biological damage from ionizing radiation – the linear-no-threshold hypothesis or LNT – which holds that risk is directly proportional to dose at all levels of irradiation.

Imaging of a cell’s DNA damage response to radiation shows that 1.5 minutes after irradiation, the sizes and intensities of radiation induced foci (RIF) are small and weak, but 30 minutes later damage sites have clustered into larger and brighter RIF, probably reflecting DNA repair centers.

“Our data show that at lower doses of ionizing radiation, DNA repair mechanisms work much better than at higher doses,” says Mina Bissell, a world-renowned breast cancer researcher with Berkeley Lab’s Life Sciences Division. “This non-linear DNA damage response casts doubt on the general assumption that any amount of ionizing radiation is harmful and additive.”

Bissell was part of a study led by Sylvain Costes, a biophysicist also with Berkeley Lab’s Life Sciences Division, in which DNA damage response to low dose radiation was characterized simultaneously across both time and dose levels. This was done by measuring the number of RIF, for “radiation induced foci,” which are aggregations of proteins that repair double strand breaks, meaning the DNA double helix is completely severed.

“We hypothesize that contrary to what has long been thought, double strand breaks are not static entities but will rapidly cluster into preferred regions of the nucleus we call DNA repair centers as radiation exposure increases,” says Costes. “As a result of this clustering, a single RIF may reflect a center where multiple double strand breaks are rejoined. Such multiple repair activity increases the risks of broken DNA strands being incorrectly rejoined and that can lead to cancer.”

Berkeley Lab biophysicist Sylvain Costes is generating 3D time lapse of DNA repair centers in human cells to understand better how cancer may arise from DNA damage. (Photo by Roy Kaltschmidt, Berkeley Lab)


Costes and Bissell have published the results of their study in the Proceedings of the National Academy of Sciences in a paper titled “Evidence for formation of DNA repair centers and dose-response nonlinearity in human cells.” Also co-authoring the paper were Teresa Neumaier, Joel Swenson, Christopher Pham, Aris Polyzos, Alvin Lo, PoAn Yang, Jane Dyball, Aroumougame Asaithamby, David Chen and Stefan Thalhammer.

The authors believe their study to be the first to report the clustering of DNA double strand breaks and the formation of DNA repair centers in human cells. The movement of the double strand breaks across relatively large distances of up to two microns led to more intensely active but fewer RIF. For example, 15 RIF per gray (Gy) were observed after exposure to two Gy of radiation, compared to approximately 64 RIF/Gy after exposure to 0.1Gy. One Gy equals one joule of ionizing radiation energy absorbed per kilogram of human tissue. A typical mammogram exposes a patient to about 0.01Gy.
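To make the nonlinearity in those numbers explicit, here is a small sketch that uses only the two figures quoted above (about 64 RIF per gray at 0.1 Gy and 15 RIF per gray at 2 Gy); the comparison with a strictly proportional, LNT-style expectation is simple arithmetic on those two values, not anything taken from the paper itself.

    # RIF yields quoted in the press release (radiation-induced foci per gray).
    rif_per_gy = {0.1: 64.0, 2.0: 15.0}  # dose in Gy -> observed RIF/Gy

    for dose in sorted(rif_per_gy):
        observed_total = rif_per_gy[dose] * dose   # foci actually observed at this dose
        proportional = rif_per_gy[0.1] * dose      # what strict proportionality to the
                                                   # low-dose yield would predict
        print(f"dose {dose:>4} Gy: observed ~{observed_total:5.1f} RIF, "
              f"proportional extrapolation from 0.1 Gy predicts ~{proportional:5.1f} RIF")

At 2 Gy the cells show roughly a quarter of the foci that a straight-line extrapolation from the low-dose yield would predict, which is the clustering into shared repair centers that the authors describe.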

Corresponding author Costes says the DNA repair centers may be a logical product of evolution.

“Humans evolved in an environment with very low levels of ionizing radiation, which makes it unlikely that a cell would suffer more than one double strand break at any given time,” he says. “A DNA repair center would seem to be an optimal way to deal with such sparse damage. It is like taking a broken car to a garage where all the equipment for repairs is available rather than to a random location with limited resources.”

However, when cells are exposed to ionizing radiation doses large enough to cause multiple double strand breaks at once, DNA repair centers become overwhelmed and the number of incorrect rejoinings of double strand breaks increases.

“It is the same as when dozens of broken cars are brought to the same garage at once, the quality of repair is likely to suffer,” Costes says.

The link between exposure to ionizing radiation and DNA damage that can give rise to cancerous cells is well-established. However, the standards for cancer risks have been based on data collected from survivors of the atomic bomb blasts in Japan during World War II. The LNT model was developed to extrapolate low dose cancer risk from high dose exposure because changes in cancer incidence following low dose irradiation are too small to be measurable. Extrapolation was done on a linear scale in accordance with certain assumptions and the laws of physics.

“Assuming that the human genome is a target of constant size, physics predicts DNA damage response will be proportional to dose leading to a linear scale,” Costes explains. “Epidemiological data from the survivors of the atomic bombs was found to be in agreement with this hypothesis and showed that cancer incidence increases with an increase in ionizing radiation dose above 0.1 Gy. Below such dose, the picture is not clear.”
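As a concrete picture of what the linear-no-threshold extrapolation does, here is a minimal sketch. The risk coefficient is a placeholder chosen only to make the arithmetic visible; it is not a value from this paper or from the atomic-bomb survivor studies.

    # Hypothetical excess cancer risk per gray, as might be estimated from
    # high-dose (> 0.1 Gy) epidemiological data.  Placeholder value only.
    RISK_PER_GY = 0.05

    def lnt_excess_risk(dose_gy: float) -> float:
        """Linear no-threshold: risk is assumed proportional to dose at every dose."""
        return RISK_PER_GY * dose_gy

    # A high dose, the ~0.1 Gy limit of clear data, and a typical mammogram.
    for dose in (2.0, 0.1, 0.01):
        print(f"{dose:>5} Gy -> assumed excess risk {lnt_excess_risk(dose):.4%}")

The whole controversy is about whether that straight line can be trusted below roughly 0.1 Gy, where, as Costes notes, "the picture is not clear."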

Previous studies failed to detect the clustering of double break strands and the formation of DNA repair centers because they were based on single-time or single-dose measurements of RIF at a discrete time after the initial exposure to ionizing radiation. This yields a net number of RIF that does not account for RIF that have not yet appeared or RIF that have already made repairs and disappeared. The time-lapse imaging used by Costes, Bissell and their co-authors showed that RIF formation continues to occur well beyond the initial radiation exposure and after earlier repair issues have been resolved. Time-lapse imaging also indicates that double strand break clustering takes place before any RIF are formed.

“We hypothesize that double strand break clustering occurs rapidly after exposure to ionizing radiation and that RIF formation reflects the repair machinery put in place around a single cluster of double strand breaks,” Costes says. “Our results provide a more accurate model of RIF dose response, and underscore fundamental concerns about static image data analysis in the dynamic environment of the living cell.”

Previous studies also mostly involved fibroblast cells whereas Costes, Bissell and their colleagues examined epithelial cells, specifically an immortalized human breast cell line known as MCF10A, which has a much higher background of RIF than fibroblasts, even without ionizing irradiation. To compensate for this higher background, Costes developed a mathematical method that enables background to be corrected for on a per- nucleus basis in unirradiated cells. Still the use of a special line of immortalized breast cells is an issue that Costes and his colleagues plan to address.

“We are now looking at primary breast epithelial cells that have been removed from healthy donors to determine if our results are repeated beyond just a single cell line and under more realistic physiological conditions,” Costes says. “We’d also like to know if our findings hold true for fibroblasts as well as epithelial cells. Also, we’d like to know if double strand break clustering is the result of a random coalescence or if there is an active transport mechanism that moves these double strand breaks towards pre-existing DNA repair centers.”

Working in collaboration with Rafael Gomez-Sjoberg of Berkeley Lab’s Engineering Division, Costes and his group are also developing a special microfluidics lab-on-a-chip device that is integrated into an X-ray microbeam. The goal is to provide a means by which cells can be kept in a controlled microenvironment while being irradiated with multiple doses. This microfluidic array will be used to characterize DNA damage response in breast and blood cells collected from human donors.

“By characterizing DNA damage response in cells from many different human donors,” Costes says, “we should be able to determine the variation across humans and gain a better understanding of how sensitivity to DNA damage from ionizing radiation might vary from individual to individual.”

This research was supported by the DOE Office of Science.

Obama: Business as usual with the military-industrial complex and the removal of constitutional rights.

Appointing financiers involved with military contractors to the FED

While on vacation in Hawaii, Obama designated Jerome Powell to serve on the Federal Reserve Board of Governors.

Powell served as Under Secretary of the Treasury for Finance under President George H. W. Bush and was a partner at The Carlyle Group. The Carlyle Group is a massive private equity firm and one of the largest defense contractors in the world.

Its ranks have included some of the most influential policymakers of the last five administrations, including both Bush presidents, former Secretary of State James Baker III, former Secretary of Defense Frank Carlucci, former Clinton Chief of Staff Mack McLarty, and former SEC Chairman Arthur Levitt, to name a few.

Other notable investors in The Carlyle Group include the bin Laden family and the Saudi Royal Family. Coincidentally, George H. W. Bush was meeting at the Ritz Carlton Hotel in Washington on the morning of September 11th with one of Osama Bin Laden's brothers.

Watch the documentary below for some more background on the Carlyle Group:


Eisenhower warns us about the military-industrial complex.



Obama Signing Statement Says He isn’t Bound by 20 Provisions of General Spending Bill

Some examples of provisions the president objected to:
  • A ban on spending money to move terrorist detainees from Guantánamo to prisons in the U.S.
  • Restrictions on placing American troops involved in United Nations peacekeeping missions under foreign command
  • Thirty days advance notice to Congress for any use of the military that would involve more than $100,000 in construction costs
  • “Disclosure to the Congress of information regarding ongoing diplomatic negotiations”
  • Language that blocks health, climate, auto policy and urban affairs “czars” from being employed by the White House
  • A provision that bars health officials from advocating for gun control
  • Congressional approval of funds transferred from the Library of Congress to the Copyright Office.


National Defense Authorization Act - Tyranny Vs. Liberty

Subtitle D of the National Defense Authorization Act (NDAA) is headed to the president’s desk. These under-the-radar provisions, co-sponsored by Senators John McCain and Carl Levin, would allow for the indefinite military detention of any person alleged to be a member of Al Qaeda, the Taliban or “associated forces.” The provisions also apply to any person who supports or aids “belligerent” acts against the United States, whether the person is apprehended here or abroad.




President Obama and the Spread of Security Theater

The TSA was created in response to the September 11, 2001, attacks. Its first administrator, John Magaw, was nominated by President Bush on December 10, 2001, and confirmed by the Senate the following January. The agency's proponents, including Transportation Secretary Norman Mineta, argued that a single federal agency would better protect air travel than the private companies that operated under contract to individual airlines or groups of airlines using a given terminal facility.

Now the TSA is being expanded to all modes of travel.


The Globalization of War

The world's attention is increasingly focused on Syria and Iran as the region continues to move toward military confrontation. Less noticed, however, is that the pieces are being put into place for a truly global conflict, with military buildup taking place in every region and threatening to draw in all of the world's major powers.




The Pentagon and its Sock Puppets

The net result is that American viewers were sold on independent analysis and instead got individuals, often with ongoing contractor relationships with the Pentagon, who read from pre-prepared Pentagon talking points.

An internal Department of Defense review has concluded that a Rumsfeld-era program under which retired military officers who appeared on American broadcast media were given special briefings and access was consistent with Pentagon rules. The New York Times reports:

The inquiry found that from 2002 to 2008, Mr. Rumsfeld’s Pentagon organized 147 events for 74 military analysts. These included 22 meetings at the Pentagon, 114 conference calls with generals and senior Pentagon officials and 11 Pentagon-sponsored trips to Iraq and Guantánamo Bay, Cuba. Twenty of the events, according to a 35-page report of the inquiry’s findings, involved Mr. Rumsfeld or the chairman of the Joint Chiefs of Staff or both. One retired officer, the report said, recalled Mr. Rumsfeld telling him: “You guys influence a wide range of people. We’d like to be sure you have the facts.”

The inspector general’s investigation grappled with the question of whether the outreach constituted an earnest effort to inform the public or an improper campaign of news media manipulation. The inquiry confirmed that Mr. Rumsfeld’s staff frequently provided military analysts with talking points before their network appearances. In some cases, the report said, military analysts “requested talking points on specific topics or issues.” One military analyst described the talking points as “bullet points given for a political purpose.” Another military analyst, the report said, told investigators that the outreach program’s intent “was to move everyone’s mouth on TV as a sock puppet.”

The internal review also apparently found no fault with the exclusion of four individuals precisely because they refused to be sock puppets, speaking critically of some Pentagon decisions. One of them, General Wesley Clark, apparently lost his position as an analyst for CNN because of Pentagon and White House displeasure with what he had to say.

The investigation was prompted by David Barstow’s Pulitzer Prize–winning exposé of the Pentagon program. Barstow wrote:

Records and interviews show how the Bush Administration has used its control over access and information in an effort to transform the analysts into a kind of media Trojan horse—an instrument intended to shape terrorism coverage from inside the major TV and radio networks.

Analysts have been wooed in hundreds of private briefings with senior military leaders, including officials with significant influence over contracting and budget matters, records show. They have been taken on tours of Iraq and given access to classified intelligence. They have been briefed by officials from the White House, State Department and Justice Department, including Mr. Cheney, Alberto R. Gonzales and Stephen J. Hadley.

In turn, members of this group have echoed administration talking points, sometimes even when they suspected the information was false or inflated. Some analysts acknowledge they suppressed doubts because they feared jeopardizing their access.

The Barstow exposé revealed two of the most important media scandals to emerge from the Iraq War period. The first went to the Rumsfeld Pentagon’s deft use of its enormous public-affairs resources to influence the American media, often for blatantly political purposes. These operations were plainly illegal. Since World War II, Congress has imposed clear limits, written into defense-appropriations measures, on the Pentagon’s ability to engage in domestic public-relations operations. The Department of Defense is permitted to run recruitment campaigns and give press briefings to keep Americans informed about its operations, but it is not permitted to engage in “publicity or propaganda” at home. The internal DoD review exonerating the practice of mobilizing and directing theoretically independent analysts apparently focuses on the fact that the program conforms with existing department rules, but it overlooks the high-level prohibition on “publicity or propaganda,” which was plainly violated.

The second scandal goes to the broadcasters themselves. They apparently recruited these analysts anticipating access to the Pentagon and a steady conduit of information. Their compromise highlights the Achilles heel of the Beltway media: access, not critical or objective coverage, is everything. There is little evidence to suggest that the broadcasters took any meaningful steps to assert their independence or objectivity—indeed, the dismissal of Wesley Clark by CNN shows precisely the opposite. The net result is that American viewers were sold on independent analysis and instead got individuals, often with ongoing contractor relationships with the Pentagon, who read from pre-prepared Pentagon talking points.

In his 1961 farewell address, President Dwight Eisenhower warned against the “acquisition of unwarranted influence, whether sought or unsought, by the military-industrial complex.” Although he was persuaded that this new relationship between the Pentagon and its contractors was “a vital element in keeping the peace,” he was deeply troubled by the relationship’s potential to disrupt the delicate balance of interests that is fundamental to a modern democracy. David Barstow’s investigation provided some of the most subtle and compelling evidence of this process to appear in recent years. The Pentagon’s self-exonerating report, by contrast, suggests that media sock puppets may become a modus operandi.

Wednesday, December 28, 2011

Vertical farming and other technologies in the service of humanity.

Vertical farming

You have no doubt read the recent news: we passed the milestone of 7 billion human beings on Earth on October 31. Well, while the little prophets of doom continue their diatribes about the environmental impact of what they call the population explosion and the imminent arrival of eras of famine, the fact remains that, at present, 925 million human beings lack food.



When we look at the facts, the current economic system instead puts its priorities into shameless consumption by a small percentage of the world's population and into the production of biofuels that use arable land for ethanol. And that is without counting the constant reminders of soil erosion and the looming shortage of arable land for agriculture. The impact of these practices could grow as the population rises, according to U.N. projections, to 9 billion people by 2050. An area equivalent to that of Brazil would then be needed to feed everyone.

Savings of 30 to 60% of food would be possible by improving food distribution systems and prioritizing direct human consumption. That is the share that is wasted each year, according to the United Nations Food and Agriculture Organization (FAO).


The harmful impact of today's commercial agriculture:

The massive use of pesticides and insecticides must also be addressed, because they are gradually killing pollinating insects (see the debates about bees over the last five years) and are potentially carcinogenic. The massive exploitation of the land causes soil impoverishment, premature erosion, and increased desertification.

Aside from this "good" news, how can we solve all these problems at once while also beautifying our environment? While we're at it, we might as well show that human innovation knows only the limits imposed by its own creativity.

The idea comes from a geologist, Gilbert Ellis Bailey, who published a book on "vertical farming" in 1915. But it is Dickson Despommier, professor of microbiology and environmental sciences at Columbia University in New York, who was unquestionably the founder of this recent movement. It started from a crazy idea, as he himself described it in 1999: in one of his classes, he talked about doing urban agriculture on building rooftops.



The movement caught on like wildfire and spread around the world in less than a decade. He then put forward the idea of implementing the process across an entire 30-storey building. Architects set about designing eco-environments, where a closed cycle of processes ensures intelligent use of resources. In such a closed loop, nothing is lost and nothing is created. Although the idea is still only at the experimental stage, there are currently six projects of this kind:

  1. The Plant in Chicago
  2. Alpha farm in Manchester, which will be the centrepiece of England's international exhibition in 2013
  3. A research laboratory in Suwon, South Korea, spread over three floors; beyond these, there are also several vegetable factories in Japan, as well as companies such as
  4. Terrasphere
  5. Aerofarms
  6. Valcent Verticrop.
From what was originally a very simple idea has emerged what many call the next great green revolution.
These trials are still in their infancy. Of course, some obstacles remain regarding the feasibility of such a project on a large scale. As I always say, it is enough to combine existing knowledge to find solutions to problems.

One of the major challenges is powering such structures. According to some calculations, it would take eight times the capacity of current U.S. power plants to supply the lighting they would need. However, given recent developments and progress in energy production, it would be possible to meet the energy needs of a single building with waste pyrolysis or plasma gasification systems, such as those supplied by the company Terragon or Plasco Energy Group. Otherwise, unless there is an energy revolution that makes energy cheaper and less dangerous to produce (see thorium nuclear reactors or low-energy nuclear reactions, LENR), this is likely to remain a challenge.

New advances at McGill University with LED lighting make it possible to use spectra of different colours to stimulate crop productivity while consuming 10% of the energy of current lighting.

Moreover, the buildings could be built with nature's own engineering, as presented by Michael Pawlyn. The windows could be made very strong and economical to build using a new, inexpensive and ecological material called ETFE. It is a polymer sheet that can be stretched over a steel structure in three layers and inflated with air. It costs 24 to 70% less to install than glass, supports 400 times its own weight, and is self-cleaning and recyclable. Natural light could thus be maximized, limiting the use of artificial light.

The volume of fruit and vegetables produced could also be boosted by a higher CO2 concentration, on the order of 1,200 ppm, potentially increasing growth by 44%.



Now let's talk about the advantages, because they are legion:
  • We are talking about fully organic, flawless production, 365 days a year, without pesticides or insecticides, regardless of the seasons.
  • We are talking about a commercial model that could create thousands of jobs in cities and fast, local distribution of fresh produce while limiting the consumption of fossil fuels.
  • We are talking about yields 5 to 10 times higher than conventional agriculture on 10 times less land (see the sketch after this list).
  • We are talking about a process that can be implemented in any climate, in any country.
  • We are talking about water savings on the order of 5 times compared with conventional agriculture.
  • In short, we are talking about a way to feed a growing population healthily while limiting its environmental impact.
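
As a rough back-of-envelope check on those figures, here is a minimal sketch that simply applies the ratios quoted in the list above to a hypothetical demand. The conventional-farming baselines (yield per hectare and irrigation water per tonne) are illustrative assumptions, not measured data.

    # Ratios quoted in the list above (midpoint of the "5 to 10 times" yield claim).
    YIELD_FACTOR = 7.5    # yield relative to conventional agriculture
    LAND_FACTOR = 10.0    # land footprint: 10 times less
    WATER_FACTOR = 5.0    # water use: roughly 5 times less

    # Illustrative conventional baselines (assumptions, not measured data).
    FIELD_YIELD_T_PER_HA = 30.0   # tonnes of vegetables per hectare per year
    FIELD_WATER_M3_PER_T = 130.0  # cubic metres of irrigation water per tonne

    demand = 100_000.0            # tonnes per year for a hypothetical mid-sized city

    field_land = demand / FIELD_YIELD_T_PER_HA
    field_water = demand * FIELD_WATER_M3_PER_T

    # Applying the post's ratios at face value.
    vertical_land = field_land / LAND_FACTOR
    vertical_water = field_water / WATER_FACTOR
    vertical_yield_per_ha = FIELD_YIELD_T_PER_HA * YIELD_FACTOR

    print(f"Conventional: {field_land:,.0f} ha, {field_water / 1e6:.1f} million m3 of water")
    print(f"Vertical:     {vertical_land:,.0f} ha, {vertical_water / 1e6:.1f} million m3 of water")
    print(f"Implied vertical yield: {vertical_yield_per_ha:,.0f} t per hectare of footprint")

Taken at face value, the "5 to 10 times the yield" and "10 times less land" figures do not quite agree with each other, which is itself a reason to treat such round numbers as rough orders of magnitude rather than design parameters.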

As mentioned above, everything would be designed to close the consumption loop: hydroponics combined with aquaculture, raising several species of fish while growing lettuce. The fish waste supplies the plants with nutrients, and the water is treated, filtered and recycled. The fish would be fed with crop residues and preparation waste from the processing facilities located below.

A pyrolysis and/or gasification system would use non-recyclable waste to supply part of the building with electricity while recovering water and other materials. An anaerobic-digestion (biomethanation) digester could also be used to take over with the leftovers from the product-processing facilities.

Aeroponic installations could maximize water efficiency (using only 10% of the water of intensive agriculture) for market-garden crops: tomatoes, cucumbers, all types of salad greens, spinach and lettuce, as well as herbs and spices. The same can also be done with a drip system and vertical growing, delivering water directly to the roots and letting it trickle down to other stacked trays.

Is all this a utopia? The current projects are only the first steps toward a better future, if the idea keeps making headway as it already has. It is possible to make such infrastructure economically viable while raising the level of human dignity. The potential of these technologies is no longer a dream but a reality. Thinkers of the world, you can still find other ways to improve life for everyone, for humankind as well as for the planet's biodiversity.

A video that sums up the possibilities well:


Vincent Blanchette

Sunday, December 11, 2011

Remember cold fusion... Now LENR, Low Energy Nuclear Reactions

There have been lots of new developments following Pons & Fleischmann's cold fusion announcement.

Now we are talking about LENR or Low Energy Nuclear Reactions.

Here are 3 presentations that were given at NASA on September 22nd, along with some interesting slides extracted from them. More information and discussion here.

Zawodny Slides



Nelson Slides

Bushnell Slides
In Short, LENR , depending upon the TBD performance, appears to be capable of Revolutionizing Aerospace across the board. No other single technology even comes close to the potential impacts of LENR upon Agency Missions.


Some good videos on the technology:

Low Energy Nuclear Revolution (English Version)


Cold Fusion More than Junk Science 60minutes 9-4-19 2 of 2


Dr. Robert Duncan on Cold Fusion at the Missouri Energy Summit 2009 Part 1-3 playlist