The problem is that heat is being continuously generated in the Earth by the breakdown of radioactive elements. It's the reason that Jupiter gives off more heat than it receives from the Sun.
Radiogenic heat has certainly extended the time the Earth needs to cool toward the ideal blackbody (BB) on which 'GHE' models are based.
That would be odd, since UVB on skin produces vitamin D.
I will accept that correction. I should have said that significantly less UVB reached the surface in the early '70s.
The dramatic rise in UVB exposure from 1979 to 2005 has been well documented, as have its effects on organisms.
for example:
https://phys.org/newman/gfx/news/hires/1-uvexposureha.jpg
The issue is how much, if at all, it warms the Earth. The greatest amount of warming comes from the longer IR wavelengths.
Hmmm. It is my understanding that the greatest amount of solar-induced warming (of the surface and ocean) comes from the visible light spectrum. Do you have a source that suggests it's IR?
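As a rough sanity check, here's a small blackbody estimate of how the Sun's output splits across bands (Python); the 5778 K temperature and the 380/750 nm band edges are conventional values I'm assuming, not numbers from this thread. It lands around 10% UV, roughly 45% visible, and the rest in the IR:

import numpy as np

# Approximate the solar spectrum with a 5778 K blackbody and see what
# fraction of the energy falls in UV, visible, and IR bands.
h, c, k = 6.626e-34, 2.998e8, 1.381e-23
T_SUN = 5778.0  # K, approximate solar effective temperature

lam = np.linspace(10e-9, 100e-6, 200_000)   # wavelengths, 10 nm .. 100 um
B = (2 * h * c**2 / lam**5) / np.expm1(h * c / (lam * k * T_SUN))
dlam = lam[1] - lam[0]
total = (B * dlam).sum()

bands = {
    "UV (< 380 nm)":        (10e-9, 380e-9),
    "visible (380-750 nm)": (380e-9, 750e-9),
    "IR (> 750 nm)":        (750e-9, 100e-6),
}
for name, (lo, hi) in bands.items():
    mask = (lam >= lo) & (lam < hi)
    print(f"{name}: ~{(B[mask] * dlam).sum() / total:.0%} of blackbody output")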
I will also point out that UV photons carry far more energy than IR photons. I think Dr. Ward's site that I linked you to has a pretty good discussion of this (although his ideas on how that energy propagates through space are certainly unorthodox).
For example, it doesn't matter how much 15-micron IR you have, you can't dissociate O2 with it...
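To put numbers on that: the O=O bond takes about 498 kJ/mol (~5.15 eV) to break, so a single photon needs a wavelength shorter than roughly 240 nm. A quick comparison (Python):

# Compare photon energies against the ~5.15 eV O2 bond dissociation energy.
h, c = 6.626e-34, 2.998e8      # Planck constant (J s), speed of light (m/s)
eV = 1.602e-19                 # joules per electronvolt
O2_BOND_EV = 5.15              # ~498 kJ/mol, O=O bond dissociation energy

def photon_ev(wavelength_m):
    return h * c / wavelength_m / eV

for label, lam in [("15 um IR", 15e-6), ("300 nm UVB", 300e-9), ("240 nm UVC", 240e-9)]:
    e = photon_ev(lam)
    verdict = "can" if e >= O2_BOND_EV else "cannot"
    print(f"{label}: {e:.2f} eV per photon -> {verdict} dissociate O2")

A 15-micron photon carries less than a tenth of an electronvolt, so no amount of them (short of exotic multi-photon effects) breaks O2 apart; even UVB falls short, which is why it takes UVC.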
Further, IR only penetrates ocean water a short distance (millimeters?), while UVB penetrates tens of meters below the surface. (Which do you think is more likely to warm the oceans?)
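A crude Beer-Lambert comparison of e-folding depths (1 / absorption coefficient) illustrates the point; the coefficients below are rough, assumed order-of-magnitude values for clear water, not measurements:

# e-folding penetration depth = 1 / absorption coefficient (Beer-Lambert).
# Coefficients are rough, assumed values for clear water; real seawater
# varies a lot with dissolved organics, plankton, and turbidity.
absorption_per_m = {
    "thermal IR (~10 um)": 1e5,   # assumed: absorbed within ~10 micrometers
    "near-IR (~1 um)":     30.0,  # assumed: absorbed within a few centimeters
    "UVB (~310 nm)":       0.1,   # assumed: e-folding depth around 10 m
}
for band, a in absorption_per_m.items():
    print(f"{band}: e-folding depth ~{1.0 / a:.2g} m")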
Regardless, I hope you would agree that where solar radiation is absorbed, there will be an increase in temperature (heating).
But I'd be pleased to hear it.
Cool.
Now I think Dr. Ward did a fine job showing the link between vulcanism (his area of expertise) and ozone depletion (from outgassed chlorine and bromine), and the correlation of ozone depletion with the observed temperature rise, so I'm not going to rehash that here.
I will start by saying that if it is indeed UVB (and where it is absorbed) that has been driving the recent (30-40 year) temperature trend, then the following should hold (summed up in the small sketch after the three cases):
If the ozone concentration recovers significantly, stratospheric temperatures must increase, followed by decreasing ocean and tropospheric temperatures.
If the ozone concentration remains relatively stable, temperatures will remain relatively stable.
If the ozone concentration decreases, stratospheric temperatures must fall, followed by increasing ocean and tropospheric temperatures.
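Here is the small sketch I mentioned, just restating those three cases as sign relationships (+1 up, -1 down, 0 stable); the names are mine and there is no physics in it:

# Sign-only restatement of the three cases above; not a model.
def predicted_response(ozone_trend):
    return {
        "stratospheric_T": +ozone_trend,  # more ozone -> more UVB absorbed aloft
        "ocean_T":         -ozone_trend,  # ...and less UVB reaching/warming the ocean
        "tropospheric_T":  -ozone_trend,
    }

for trend, label in [(+1, "ozone recovers"), (0, "ozone stable"), (-1, "ozone declines")]:
    print(label, predicted_response(trend))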
The missing piece here is ozone production!
Earth has an abundance of O2 thanks to plant life; the limiting factors for the creation of ozone are the available amount of UVC to dissociate O2 and thus the supply of atomic O (UVB just makes O3 and O2 play hot potato).
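The standard Chapman-cycle bookkeeping makes the hot-potato point explicit: UVB photolysis of O3 doesn't change the total "odd oxygen" (O + O3); only UVC photolysis of O2 creates it. A tiny tally (Python):

# Chapman-cycle bookkeeping: odd oxygen = O + O3.
# Only O2 photolysis by UVC (< ~242 nm) adds odd oxygen; UVB photolysis of
# O3 just swaps O3 for O, i.e. the hot potato.
reactions = {
    "O2 + UVC -> O + O":     {"O": +2, "O3": 0},    # source of odd oxygen
    "O + O2 + M -> O3 + M":  {"O": -1, "O3": +1},   # no net change
    "O3 + UVB -> O2 + O":    {"O": +1, "O3": -1},   # no net change (hot potato)
    "O + O3 -> O2 + O2":     {"O": -1, "O3": -1},   # sink of odd oxygen
}
for rxn, d in reactions.items():
    print(f"{rxn}: net odd oxygen {d['O'] + d['O3']:+d}")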
However, there is another source of O that CO2 has an effect on!
Infalling O from the mesopause...
CO2 governs cooling in the upper mesosphere, preventing some O (and N, etc.) from escaping the TOA (which I will define here as the mesopause).
If you increase the concentration of CO2, you increase the number of collisions between O and colder, slower CO2, decreasing the amount exiting the TOA and increasing the amount slowing and falling back toward the stratopause.
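One way to picture the temperature dependence is the scale height H = kT/(mg) for atomic oxygen near the mesopause: a colder upper mesosphere keeps the O column more compressed. The temperatures below are assumed, illustrative values, and this obviously isn't a model of escape itself:

# Scale height H = k*T / (m*g) for atomic oxygen near the mesopause.
# Colder gas -> smaller scale height -> O concentrated lower down.
k = 1.381e-23          # Boltzmann constant, J/K
m_O = 16 * 1.661e-27   # mass of atomic oxygen, kg
g = 9.5                # m/s^2, approximate gravity near 90-100 km

for T in (180.0, 200.0, 220.0):
    H = k * T / (m_O * g)
    print(f"T = {T:.0f} K -> scale height ~{H / 1000:.1f} km")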
So the natural cycle works like this (a toy sign-only sketch of the loop follows the list):
1. Ozone depletion causes less UVB absorption in the stratosphere, which subsequently cools.
2. More UVB reaches the Earth's (and ocean's) surface.
3. Increased absorption by the oceans raises ocean temperatures.
4. Warmer oceans dissolve less CO2.
5. Atmospheric CO2 concentration rises.
6. The upper mesosphere cools; the infalling O concentration increases.
7. Ozone concentration rises, with subsequent stratospheric warming from increased UVB absorption.
8. The oceans cool due to less UVB absorption.
9. Colder oceans absorb more CO2.
10. Atmospheric CO2 concentrations fall.
11. The upper mesosphere warms; more O escapes the TOA.
12. Ozone concentration decreases... go to step 1.
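And here is the toy sign-only sketch of that loop: each call to half_cycle() just propagates the chain of up/down relationships from the twelve steps (+1 = higher/more, -1 = lower/less), with no magnitudes, physics, or timescales attached; the variable names are mine:

# Toy, purely qualitative walk through the proposed cycle. Two calls to
# half_cycle() correspond to one full pass through steps 1-12.
state = {"ozone": -1, "strat_T": -1, "surface_UVB": +1,
         "ocean_T": +1, "CO2": +1, "mesosphere_T": -1, "infalling_O": +1}

def half_cycle(s):
    s = dict(s)
    s["strat_T"]      = s["ozone"]          # less ozone -> cooler stratosphere
    s["surface_UVB"]  = -s["ozone"]         # less ozone -> more UVB at the surface
    s["ocean_T"]      = s["surface_UVB"]    # more UVB absorbed -> warmer ocean
    s["CO2"]          = s["ocean_T"]        # warmer ocean holds less CO2
    s["mesosphere_T"] = -s["CO2"]           # more CO2 -> cooler upper mesosphere
    s["infalling_O"]  = -s["mesosphere_T"]  # cooler mesopause -> more infalling O
    s["ozone"]        = s["infalling_O"]    # more O supply -> ozone recovers
    return s

for step in range(4):
    print(step, state)
    state = half_cycle(state)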
If this is the case, then yes, there was human-induced global warming when we pumped a bunch of chlorine into the stratosphere...
The irony, of course, is that if a long-term CO2 rise leads to cooling, we would be slowing that cooling by restricting our CO2 output. ;)
Peace!