Consider the nuclear option. Whilst it offers a low-carbon means of generating power, it carries the risk of meltdown and the release of nuclear material into the environment. Such a release could have dire consequences for flora and fauna near the emission source. However, because radiation obeys the inverse square law, its intensity falls by a factor of four each time the distance from the source doubles. So the risk is highest closest to the source.
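To make the inverse square law concrete, here is a minimal sketch (the function name is my own, introduced purely for illustration) of how intensity falls away with distance:

```python
def relative_intensity(distance_ratio):
    """Relative intensity after scaling the distance from the source
    by distance_ratio, per the inverse square law: I is proportional to 1/r^2."""
    return 1.0 / distance_ratio ** 2

# Doubling the distance cuts the intensity to a quarter:
print(relative_intensity(2))   # 0.25

# At ten times the distance, only 1% of the original intensity remains:
print(relative_intensity(10))  # 0.01
```

This steep fall-off is why the danger from a point release of radiation is largely confined to the area around the source.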
Now consider the CCS option. Let's imagine that the Fukushima Power Station in Japan had actually been coal-fired and had been injecting CO2 into the rock strata for the last 40 years. Had the earthquake ruptured the strata and released the CO2 into the atmosphere, the consequences would not have been localised like a nuclear release, but global. Thus the whole world would be exposed, rather than a localised community.
This to me is the dilemma – and I'm not deliberately scaremongering. Sure, I am pro-nuclear, but for this very reason: it is not a matter of opinion whether or not CO2 absorbs IR radiation; it is a matter of scientific fact. However, the consequences of increased amounts of CO2 in the atmosphere are (in my view) a matter of opinion – will sea levels rise? Will the trans-Atlantic conveyor shut down? Will more land become desert?
There are 50 years of nuclear expertise and evolution behind us, yet I'm unaware of a single full-scale CCS plant in operation. The ageing coal-fired and gas-fired power stations need upgrading – surely it is a case of better the devil we know, at least for the next 50 years?