Has anyone quantified or estimated the amount of heat generated by a given amount of surface area during a wildfire event? Hint: my Vermont Castings wood stove gives off 50,000 BTU of heat per hour while in use — 50,000 BTU per hour for roughly the equivalent of a 2 sq. ft. campfire. Another hint: a 1,500-watt electric heater gives off about 5,000 BTU of heat per hour. How many square feet of sunlight would that equal?
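A quick sketch of that unit check, using the standard conversion (1 watt sustained for an hour is about 3.412 BTU) and borrowing the 1,360 W/m² sunlight figure mentioned further down:

```python
# Convert the heater's electrical power to BTU per hour and compare
# with direct sunlight, hour for hour.
BTU_PER_WATT_HOUR = 3.412          # standard conversion factor

heater_watts = 1_500
heater_btu_per_hr = heater_watts * BTU_PER_WATT_HOUR       # ~5,100 BTU/hr

solar_w_per_m2 = 1_360             # solar constant (top of atmosphere)
sqft_per_m2 = 10.764               # square feet in one square meter
solar_btu_per_hr_per_sqft = solar_w_per_m2 * BTU_PER_WATT_HOUR / sqft_per_m2

# Square feet of full sunlight that match the heater's output:
sqft_equiv = heater_btu_per_hr / solar_btu_per_hr_per_sqft
print(f"{heater_btu_per_hr:.0f} BTU/hr ~ {sqft_equiv:.1f} sq ft of full sun")
```

So the heater roughly matches the sunlight falling on about a dozen square feet, while the sun is up.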
Of course peat, tundra, and forests give off varying amounts of heat when burning. Sunlight is measured in watts per square meter hitting the land surface; the solar constant is about 1,360 W/m² — an instantaneous flux at the top of the atmosphere, not a daily total.
So if a fire is burning in a forest, each square meter of surface is generating about 50,000 BTU × 5 (there are roughly 10.5 sq. ft. per square meter, so about five campfire-equivalents) ≈ 250,000 BTU per square meter per hour. Multiply by roughly 4,150 square meters per acre (43,560 sq. ft. in an acre, divided by 10.5) and you get slightly over one billion BTU per hour per acre! Versus roughly 20 million BTU per hour of sunlight falling on that same acre.
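The whole back-of-envelope comparison can be checked in a few lines. This is a sketch using the figures from the post (50,000 BTU/hr stove ≈ 2 sq. ft. campfire, 1,360 W/m² sunlight) plus standard conversions:

```python
# Fire vs. sunlight heat flux per acre, using the post's assumptions.
BTU_PER_WATT_HOUR = 3.412           # 1 W sustained for 1 h ~ 3.412 BTU

stove_btu_per_hr = 50_000           # wood stove ~ a 2 sq. ft. campfire
sqft_per_m2 = 10.764                # square feet in one square meter
campfires_per_m2 = sqft_per_m2 / 2  # ~5 campfire-equivalents per m^2

fire_btu_per_m2_hr = stove_btu_per_hr * campfires_per_m2   # ~270,000

acre_sqft = 43_560
m2_per_acre = acre_sqft / sqft_per_m2                      # ~4,050

fire_btu_per_acre_hr = fire_btu_per_m2_hr * m2_per_acre    # ~1.1 billion

solar_w_per_m2 = 1_360              # solar constant (top of atmosphere)
sun_btu_per_acre_hr = solar_w_per_m2 * BTU_PER_WATT_HOUR * m2_per_acre

print(f"fire: {fire_btu_per_acre_hr:.2e} BTU per acre per hour")
print(f"sun:  {sun_btu_per_acre_hr:.2e} BTU per acre per hour")
print(f"fire/sun ratio: ~{fire_btu_per_acre_hr / sun_btu_per_acre_hr:.0f}x")
```

Under these assumptions the burning acre puts out on the order of fifty times the heat that full sunlight delivers to the same acre in the same hour.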
Houston, are you still there? And we have another problem: thousands of acres of burning Siberian peat exhausting sweltering plumes into the Arctic Circle each day. That heat is also part of a problem that nobody considers. The CO2 and clouds keep it in the atmosphere; it is dynamically drawn toward one of the coldest parts of the planet and is melting ice as I type. Large areas along the Siberian coast are already shallow, open blue water. The stage is set, and there are many warm/hot days left in the Arctic before this year's melt stops in the fall!
One more point: it will not take a complete melt of the Arctic ice cap to generate a calamity.