If anyone is curious about how to calculate the point in the transition into fall at which parts of the arctic can be expected to switch from a net heat gain from solar radiation (which melts ice) to a net heat loss, here are some tools I've found. First, you start with this well-known graph showing latitude vs. date vs. insolation:
Then you can go to this handy Stefan-Boltzmann law calculator here:
http://www.endmemo.com/physics/radenergy.php

To find out the emissivity of the Earth (under clear skies), I went here:
http://rabett.blogspot.com/2008/02/why-it-pays-to-have-clever-anonymice.html

So, if you punch "0.96" into the emissivity field, 273 K into the temperature field, and "1" into the meters^2 field, you get...302 Watts. So, the Earth will lose 302 W/m^2 at 273 K (0 Celsius).
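If you'd rather not use the web form, here's a minimal Python sketch of the same Stefan-Boltzmann calculation (the function name and the rounded constant are mine, not from the calculator page):

```python
# Gray-body radiated flux: P = emissivity * sigma * T^4, in W/m^2
SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def radiated_flux(emissivity, temp_k):
    """Outgoing thermal flux in W/m^2 for a surface at temp_k kelvin."""
    return emissivity * SIGMA * temp_k ** 4

print(radiated_flux(0.96, 273))  # clear-sky Earth at 0 C: ~302 W/m^2
```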
So, really, to be gaining net heat to melt ice, a part of the Earth needs to be receiving more than about 300 W/m^2. (Otherwise, melting can only happen through heat transported in from other parts of the ocean and atmosphere that are getting more than 300 W/m^2, or that have heat stored up in water significantly above 0 Celsius.)
In our first graph, 90-North dips below the 300 W/m^2 insolation mark at about August 20th. 60-North dips below 300 W/m^2 by about September 5th. We can guess that 75-North would be about August 30th.
There's only about another 1-2 weeks left where the sun still matters, folks! After that, clouds become the friend of (next year's) sea ice melt. A cloudy sky drops the effective emissivity to space all the way down to about 0.5, which nearly halves the heat lost to space. If you drop the temperature down to 243 K (-30 Celsius) and assume cloudy skies, the heat lost to space in the winter is only about 98 W/m^2.
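Plugging the cloudy winter case into the same sketch gives roughly the number quoted above (small differences are just rounding):

```python
SIGMA = 5.67e-8
print(0.5 * SIGMA * 243 ** 4)  # cloudy sky at -30 C: ~99 W/m^2
```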
Here's a math problem: let's imagine that the arctic were totally cut off from the rest of the climate system, so whatever ice it melted in the summer, it had to melt on its own yearly energy budget. How cold would it need to get in the winter to balance that budget?
Let's say that the arctic gets an average of about 350 W/m^2 over the six months of the melt season, and that it loses 300 W/m^2 during those six months (because the temperature is brought up to 0 Celsius). That's a net gain of 50 W/m^2 for six months (roughly 4,320 hours), so the arctic would soak up about 216,000 Watt-hours/m^2 over the melting season.
On the flip side, there are six months where the average insolation is maybe 50 W/m^2. To shed those accumulated 216,000 Watt-hours/m^2, the arctic could afford to lose only 100 W/m^2 on average, giving the needed 50 W/m^2 net loss over those six months.
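Spelled out as arithmetic (a rough sketch using the same round numbers and six 30-day months):

```python
HOURS_PER_HALF_YEAR = 6 * 30 * 24            # ~4,320 hours

# Melt season: ~350 W/m^2 in, ~300 W/m^2 out -> 50 W/m^2 net gain
melt_season_gain_wh = (350 - 300) * HOURS_PER_HALF_YEAR   # ~216,000 Wh/m^2

# Dark season: ~50 W/m^2 in; to balance, shed the same 216,000 Wh/m^2,
# i.e. a 50 W/m^2 net loss, so outgoing radiation can only average 100 W/m^2
required_net_loss = melt_season_gain_wh / HOURS_PER_HALF_YEAR  # 50 W/m^2
max_outgoing = 50 + required_net_loss                          # 100 W/m^2

print(melt_season_gain_wh, max_outgoing)
```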
One way to lose 100 W/m^2 is to have cloudy skies (emissivity = 0.5) and an average temperature of 243 K (-30 Celsius). But what if the skies are clear (emissivity = 0.96)? Then you'd need the average temperature in the arctic in the winter to get down to about 205 K (about -68 Celsius) to lose only ~100 W/m^2. What if the emissivity were 0.64 (which is what some estimate to be the average emissivity of the atmosphere, factoring in cloudy and clear days)? Then you could get away with having an average arctic temperature of 228 K (-45 Celsius) and still manage to lose only an average of 100 W/m^2 over those six months (50 W/m^2 net loss).
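To get those temperatures, you just invert the Stefan-Boltzmann relation, T = (F / (emissivity * sigma))^(1/4). A quick sketch (it lands within a couple of kelvin of the figures above, depending on rounding):

```python
SIGMA = 5.67e-8

def temp_for_flux(flux_w_m2, emissivity):
    """Temperature (K) at which a gray body radiates the given flux."""
    return (flux_w_m2 / (emissivity * SIGMA)) ** 0.25

for eps in (0.5, 0.96, 0.64):
    print(eps, round(temp_for_flux(100, eps)))  # ~244 K, ~207 K, ~229 K
```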
Since the average arctic temperature from September to March is probably well above -45 Celsius, we can tell that the arctic is receiving substantial heat inputs from the lower latitudes.
Next math problem: how much more would first-year ice thicken over a six-month freezing season at an average temperature of -45 Celsius, compared to how much it thickens now at whatever the current September-to-March average is?
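One hedged way to frame that problem in code (my own choice of method, not something from the graphs above): the textbook Stefan ice-growth approximation, which ties thickness to the square root of the accumulated temperature deficit. It ignores snow cover and ocean heat flux, so the absolute numbers run high, but the ratio between scenarios is instructive. The -20 Celsius "current" average is purely an assumed placeholder:

```python
from math import sqrt

K_ICE = 2.2       # W/(m K), approximate thermal conductivity of sea ice
RHO_ICE = 917.0   # kg/m^3
L_FUSION = 334e3  # J/kg, latent heat of fusion
SECONDS = 6 * 30 * 24 * 3600  # six-month freezing season

def stefan_thickness(delta_t_c):
    """Idealized first-year ice thickness (m) for a constant surface-to-water
    temperature difference of delta_t_c held over the freezing season."""
    return sqrt(2 * K_ICE * delta_t_c * SECONDS / (RHO_ICE * L_FUSION))

print(stefan_thickness(45))  # ~3.2 m at an average of -45 C
print(stefan_thickness(20))  # ~2.1 m at an assumed current average of -20 C
```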