Saturday, October 7, 2017

Relative Time

'Rel Time'

Leap years don't really cause a problem -- except for 1900, which is a century year and so not a leap year.


These errors are due to the use of monthly averages without accounting for the differing number of days in each month.
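
As a toy illustration of the point (my own sketch, not the spreadsheet's calculation): placing a monthly value at the midpoint of its month using twelve equal-length months gives a slightly different relative time than using the actual day counts, and Python's calendar module applies the century rule to 1900 correctly:

#
# rel_time.py
# Toy comparison of two ways to convert (year, month) to a relative/decimal time.
#
import calendar

def naive_rel_time(year, month):
    # midpoint of the month assuming twelve equal-length months
    return year + (month - 0.5) / 12.0

def exact_rel_time(year, month):
    # midpoint of the month using the actual number of days in each month
    days_in_year = 366.0 if calendar.isleap(year) else 365.0
    start = sum(calendar.monthrange(year, m)[1] for m in range(1, month))
    length = calendar.monthrange(year, month)[1]
    return year + (start + length / 2.0) / days_in_year

for y, m in [(1900, 2), (2000, 2), (2016, 8)]:
    print(y, m, naive_rel_time(y, m), exact_rel_time(y, m))
# calendar.isleap(1900) is False -- the century rule is applied here,
# unlike Excel's 1900 date system, which treats 1900 as a leap year.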

Friday, October 6, 2017

ENSO modeling

Paul Pukite has developed an ENSO model and has given me a copy of his spreadsheet.  The spreadsheet was designed to find a best model fit to the Southern Oscillation using one of the ENSO indexes; Paul was using the NINO34 index.

I decided to expand the choice of index to include all that I could find on the web.  I found 10 in total: five that stretch all the way back to 1880 and five that run from 1950 to the present.  The five that go back to 1880 are the NINO34 index, the 'Cold Tongue Index' (CTI), the SOI_signal index, the SOI_noise index, and SOI_boma (Bureau of Meteorology, Australia).

I trained the model on the period from 1885 to 1935 using the CTI dataset.  I then calculated the correlation coefficients for the model output versus the various datasets over various time periods.  Here are the results.



This result was not wholly unexpected: the different ENSO datasets are highly correlated - except for one outlier in each group.
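
As a rough sketch of how this sort of comparison can be set up (my own illustration, not Paul's spreadsheet), assuming the model output and the various indexes are monthly series aligned on a common date index in a pandas DataFrame -- the column names and the default periods below are mine:

import pandas as pd

def correlation_table(df, model_col="model", periods=None):
    # Pearson correlation of the model column against every other column,
    # computed over each (start, end) period; rows = indexes, columns = periods.
    if periods is None:
        periods = [("1885", "1935"), ("1936", "1980"), ("1880", "2017")]  # placeholders
    results = {}
    for start, end in periods:
        window = df.loc[start:end]               # works with a DatetimeIndex
        results["{}-{}".format(start, end)] = window.corr()[model_col].drop(model_col)
    return pd.DataFrame(results)

# print(correlation_table(df))   # df columns: 'model', 'NINO34', 'CTI', 'SOI_signal', ...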





Sunday, October 1, 2017

Natural Variability in the GMST record

What is the natural variability in the surface temperature record?

One way to explore this is to take the temperature record and subtract all the known non-random forcings and fluctuations.  What's left over is our best guess at natural variation.

I did this with the HADCRUT global land/ocean temperature dataset from 1881 thru 2010.  The known forcings that I used were:

  • CO2
  • TSI
  • LOD
  • ENSO (using two different ENSO indexes for comparison purposes)
The result looks like this:


Using the NINO34 ENSO dataset, the standard deviation is 0.105503°C.
Using the CTI ('Cold Tongue Index'), the standard deviation is 0.107541°C.
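
A minimal sketch of this subtract-the-known-forcings approach (my own illustration, not the actual spreadsheet behind the figures), assuming annual-mean series for GMST and each forcing are already aligned in a pandas DataFrame with column names of my choosing:

import numpy as np
import pandas as pd

def residual_variability(df, temp_col="gmst", forcing_cols=("co2", "tsi", "lod", "enso")):
    # Fit GMST as a linear combination of the known forcings (plus an intercept)
    # and return the residual series; its standard deviation is one estimate of
    # the remaining 'natural' variability.
    y = df[temp_col].to_numpy()
    X = np.column_stack([df[c].to_numpy() for c in forcing_cols] + [np.ones(len(df))])
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    residual = y - X @ coeffs
    return pd.Series(residual, index=df.index)

# resid = residual_variability(df)   # df spans 1881-2010
# print(resid.std())                 # swap NINO34 or CTI in as the 'enso' column to compare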

With global warming since pre-industrial at approximately 1°C, natural variation is not the answer.  It's known forcings -- and CO2 is most of that answer.

We should also check the sensitivity of the analysis to the various datasets used.  In the first analysis we looked at two different ENSO indexes.  A second analysis replaces the HADCRUT GMST data with the BEST GMST dataset.  The results are the same to within a few thousandths of a degree:


One other analysis that might be of interest is to gauge how 'The Tidal Model of ENSO' that Paul Pukite has developed fares in this regard.  I have one of his model spreadsheets at hand, so I went back to the first analysis using HADCRUT and substituted the model output for the CTI:


For this application, the model output works as well as the NINO34 dataset (to within four ten-thousandths of a degree) and actually slightly better than the CTI.

Thursday, September 21, 2017

Reproducing L&W 2005

Luckman & Wilson 2005, Fig 1a (red) overlaid with my reproduction (blue)


Wednesday, September 6, 2017

WHUT's Tidal Model of ENSO

Paul Pukite (aka WHUT or geoenergymath) has a Tidal Model of ENSO based on solar / lunar gravitational forcing.  I've been following his travails in working through his models since he first began writing about CSALT back in 2013.

WHUT can annoy some people with his constant peddling of his ENSO model, but I really can't blame him.  He's found that ENSO is a largely deterministic process that can be both hindcast and forecast with pretty damn good accuracy -- especially when one considers that many believe ENSO cannot be predicted more than 4 to 6 months in advance and that, as a result, natural variability in climate is chaotic and impossible to predict.

I've seen many dismiss WHUT's claims on general grounds, but never actually dispute any of the math or results.  I can only assume they've never actually taken the 10 or 15 minutes to look at his model and its results.

He has been treated like a crackpot in some venues, and either I'm a fool or they are.  Someone needs to tell me what's actually *wrong* with his model as opposed to just dismissing it out-of-hand.  A simple graph like this should be enough to catch the eye of anyone who's paying any attention whatsoever.




ENSO model for predicting El Nino and La Nina events

Thursday, August 31, 2017

How do we put tides in GCMs?

How do we put tides in GCMs?

'The accuracy of surface elevations in forward global barotropic and baroclinic tide models' (Arbic et al., 2004, doi:10.1016/j.dsr2.2004.09.014) lists 10 different parameters that need to be set.  The anomalistic month appears as Mm, but geoenergymath/WHUT's Tidal Model of ENSO also requires the tropical and draconic numbers.
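
For reference, a small note-to-self sketch with the mean lengths of the three lunar months in question (the values are standard astronomical means; the code is mine, not from either paper):

# Mean lunar month lengths in days (standard mean values)
ANOMALISTIC_MONTH = 27.55455   # perigee to perigee; this is the Mm period
TROPICAL_MONTH    = 27.32158   # return to the same equinox-referenced longitude
DRACONIC_MONTH    = 27.21222   # node to node

TROPICAL_YEAR = 365.2422       # days

# Cycles per year for each lunar month; the fractional parts are what matter
# when a monthly-sampled index is involved.
for name, period in [("anomalistic", ANOMALISTIC_MONTH),
                     ("tropical", TROPICAL_MONTH),
                     ("draconic", DRACONIC_MONTH)]:
    print(name, round(TROPICAL_YEAR / period, 4), "cycles per year")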



The limitations, computational expense, and difficulty in achieving this are probably best spelled out by reading 'Concurrent simulation of the eddying general circulation and tides in a global ocean model' (Arbic et al., 2010, https://doi.org/10.1016/j.ocemod.2010.01.007; pre-print pdf).

Thursday, August 24, 2017

A 2000 year record of climate variations reconstructed from Haukadalsvatn, West Iceland
Áslaug Geirsdóttir, Gifford H. Miller, Thorvaldur Thordarson, Kristín B. Ólafsdóttir
Fig 13


Monday, May 29, 2017

More snakes and Excel

In the previous post I listed the code to find external links in Excel WorkBooks.  In practice, though, I want to loop thru every Excel WorkBook in a directory and collect each WorkBook's name, its Last Modification Date, whether the WorkBook performs a specific function, the total # and names of all WorkSheets in each WorkBook, and the total # and names of all Defined Names in the WorkBook.

This is what I ended up with:

#
# excel_info.py
# Find worksheets, defined names and CheckBox1 value in Excel WorkBooks.
# Loop thru directory and write each WorkBook's name, Last Modification Date,
# CheckBox1 value (to see if uncertainties are calculated), Total # of
# WorkSheets, the names of all Worksheets, Total # of Defined Names,and a
# list of all Defined Names into CSV
#
# In the CSV file:
#    Column 1 = WorkBook name
#    Column 2 = Last Modified Date
#    Column 3 = CheckBox1 value
#    Column 4 = Total # of WorkSheets in WorkBook
#    Column 5 = All Worksheet names
#    Column 6 = Total # of Defined Names in WorkBook
#    Column 7 = All WorkBook Defined Names
#
import win32com.client as win32
import os
import glob
import csv
import datetime
from collections import Counter

wb_def_names = Counter()
ws_names = Counter()

os.chdir(r'C:\origin')  # raw string so the backslash isn't treated as an escape
xlfiles = sorted(glob.glob("*.xls"))
outputFile = open('output.csv', 'w', newline='')
outputWriter = csv.writer(outputFile)

xlApp = win32.gencache.EnsureDispatch('Excel.Application')
xlApp.EnableEvents = False
#xlApp.ScreenUpdating = False
cwd = os.getcwd()
this_file = 0
for xlfile in xlfiles:
    showunc_val = 0
    unccalc_val = False
    showunc = 0
    found = 0
    checkbox1_caption = "None"
    this_file = this_file + 1
    if this_file%100 == 0:
        print("Processing ",this_file)
    if this_file >= 1:
        open_ok = 1
        try:
            xlApp.DisplayAlerts = False
            xlwb = xlApp.Workbooks.Open(cwd+"\\"+ xlfile)
        except:
            open_ok = -1
            print("error in open xlfile")
        try:
            xlwb_name = xlApp.ActiveWorkbook.Name
        except:
            open_ok = -1
            print("error in open ActiveWorkbook.Name")
        if open_ok:
            cntr = 1
            try:
                for n in xlApp.ActiveWorkbook.Names:
                    wb_def_names[n.Name] += 1   # count each defined name
                    found = n.Name.find("showunc")
                    found2 = n.Name.find("unccalc")
                    if found == 0 or found2 == 0:
                        this_n = str(n)
                        dummy = this_n.split("!")
                        dummy2 = str(dummy[0]).replace("=", "")
                        dummy2 = str(dummy2).replace("'", "")
                        if found == 0:
                            dummy3 = str(dummy[1])
                            showunc_val = xlApp.Sheets(dummy2).Range(dummy3).Value
                        if found2 == 0:
                            checkbox1_caption = xlApp.Sheets(dummy2).OLEObjects("CheckBox1").Object.Caption
                            unccalc_val = xlApp.Sheets(dummy2).OLEObjects("CheckBox1").Object.Value
                    cntr = cntr + 1
                list_wb_def_names = list(wb_def_names.keys())
            except:
                open_ok = -1
                print("error in ActiveWorkbook.Names",n,n.Name,cntr)
            cntr2 = 1
            try:
                for s in xlApp.ActiveWorkbook.Sheets:
                    ws_names[s.Name] += 1   # count each worksheet name
                    cntr2 = cntr2 + 1
                list_ws_names = list(ws_names.keys())
            except:
                open_ok = -1
            try:
                xlApp.ActiveWorkbook.Saved = True
                xlApp.ActiveWorkbook.Close()
            except:
                open_ok = -1
                print("error in closing workbook")
            try:
                # getmtime = last-modified time (getctime is creation time on Windows)
                mtime = os.path.getmtime(cwd + "\\" + xlfile)
                last_modified_date = datetime.datetime.fromtimestamp(mtime).strftime("%d-%m-%Y")
            except OSError:
                last_modified_date = ""
            try:
                outputWriter.writerow([xlwb_name, last_modified_date,unccalc_val,cntr2-1,list_ws_names,cntr-1,list_wb_def_names])
            except:
                print("error writing to outputfile")
            print("Workbook = ", xlwb_name)
            wb_def_names.clear()
            ws_names.clear()
            checkbox1_caption = "None"
        else:
            print("Unable to process ",xlfile)
    else:
        dummy5 = 0
#assert isinstance(xlApp.Application, object)
#xlApp.Application.Quit()
outputFile.close()

I found that everything worked better if Excel was already open before I ran the program - I'm not sure why.  I'm retrieving the value of CheckBox1 without validating it only because, when testing the code, the CheckBox Caption property showed that - in my instance - they were all what I thought they were - thank you Dennis Green :)


It doesn't work flawlessly.  During runtime, quite a few of the files (2 or 3%?) report "error in ActiveWorkbook.Names" from one of my error handlers.  It doesn't seem to affect the output, so I haven't bothered to figure out why some files generate the error.  Eventually I will store just the filenames that create the error and see why.

A more significant problem is that certain poorly written Excel macros generate errors when their WorkBook is opened.  As I suspected when writing the previous post, moving the statement xlApp.DisplayAlerts = False inside the main loop helped a lot.  I added xlApp.EnableEvents = False to try to stop macros from causing a problem, but it too may need to be moved inside the main loop to be effective.
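
If I do move it, the change would look something like this (just a sketch: re-asserting both flags right before each Open so an earlier COM error can't leave them switched back on):

# inside the main loop, just before opening each workbook
try:
    xlApp.DisplayAlerts = False   # suppress Excel dialog boxes
    xlApp.EnableEvents = False    # keep Workbook_Open macros from firing
    xlwb = xlApp.Workbooks.Open(cwd + "\\" + xlfile)
except Exception as e:
    open_ok = -1
    print("error in open xlfile:", xlfile, e)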

All in all, I know my code isn't up to 'professional' standards, but it works, does what I set out to do, and gives me enough knowledge to use it for many other similar projects.  Eventually, if I use Python often enough, I'll start building class libraries and user-defined functions, but I'm happy with the results as they are.


Sunday, May 28, 2017

Snakes on a computer

Wikipedia has a lot to say about Python, in part:
Since 2003, Python has consistently ranked in the top ten most popular programming languages as measured by the TIOBE Programming Community Index. As of March 2017, it is the fifth most popular language. It was ranked as Programming Language of the Year for the year 2007 and 2010. It is the third most popular language whose grammatical syntax is not predominantly based on C...

So, why has it taken me so long to play around with it?  Yesterday I remedied my ignorance.

My project was to collect information on 8000 Excel Workbooks.  Doing it manually was simply out of the question.  In the past I've used GUI macros to automate opening and inspecting 1000s of files, but those were text files.  Excel files are - obviously - a different critter altogether. Hence the need for a different approach.  Greg Laden has posted some articles regarding Python and those spurred me to investigate.

Here's the newbie code I wrote to inspect each file for external links:

1:  #  
2:  # links.py  
3:  # Find all external links referenced in Excel WorkBook  
4:  # Loop thru directory and write each WorkBook's name, and  
5:  # the names of all external links.  
6:  #  
7:  import win32com.client as win32  
8:  import os  
9:  import glob  
10:  import csv  
11:  from collections import Counter  
12:  os.chdir(r'C:\origin\XLSM')  
13:  xlfiles = sorted(glob.glob("*.xlsm"))  
14:  outputFile = open('output.csv', 'w', newline='')  
15:  outputWriter = csv.writer(outputFile)  
16:  xlApp = win32.gencache.EnsureDispatch('Excel.Application')  
17:  xlApp.ScreenUpdating = False  
18:  xlApp.DisplayAlerts = False  
19:  cwd = os.getcwd()  
20:  wb_links = []  
21:  this_file = 0  
22:  for xlfile in xlfiles:  
23:    this_file = this_file + 1  
24:    if (this_file >= 1):  # Mostly for debug purposes
25:      open_ok = 1  
26:      try:  
27:        xlwb = xlApp.Workbooks.Open(cwd+"\\"+ xlfile)  
28:      except:  
29:        open_ok = -1  
30:      try:  
31:        xlwb_name = xlApp.ActiveWorkbook.Name  
32:      except:  
33:        open_ok = -1  
34:      if open_ok:  
35:        try:  
36:          wb_links = xlApp.ActiveWorkbook.LinkSources()  
37:          print("Workbook = ", xlwb_name, wb_links)  
38:        except:  
39:          open_ok = -1  
40:          print("Unable to process ",xlfile)  
41:        try:  
42:          xlApp.ActiveWorkbook.Saved = True  
43:        except:  
44:          open_ok = -1  
45:        try:  
46:          xlApp.ActiveWorkbook.Close()  
47:        except:  
48:          open_ok = -1  
49:        outputWriter.writerow([xlwb_name, wb_links])  
50:      else:  
51:        print("Unable to process ",xlfile)  
52:    else:  
53:      dummy = 0  
54:  assert isinstance(xlApp.Application, object)  
55:  xlApp.Application.Quit()  
56:  outputFile.close()  

While this code just collects the external links, in the end my program collected the filename of each Workbook, file size, last modification date, names of all Worksheets, names and ranges of all Defined Names, external links and the cell value for a specific Defined Name of interest and wrote it all to a CSV file.

The only real problem is line 18: xlApp.DisplayAlerts = False.  When the program is first run this seems to work fine, but inevitably there is a file that causes an error in Excel that isn't covered by 'DisplayAlerts'.  When that happens, 'DisplayAlerts' seems to get turned back on.

I might try to move Line 18 into the main loop to see if that works better, but otherwise I collected all the information I was interested in.  Total runtime (from within PyCharm) was close to two hours.

Saturday, May 20, 2017

Clueless

I think the woman on the left may be grandma - and possibly aunt Kelly in the middle with the glasses.  But that's just a WAG.  It could be grandma with her brothers and sisters - but I don't know how many she actually had.

The men in the 2nd photo must be some relation on mom's side, but I haven't the faintest idea who they are.

These were scanned in from old negatives, the palette inverted, then brightness and contrast adjusted to get something viewable.
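
For what it's worth, the same invert-and-adjust steps can be scripted with Pillow (a sketch only; the filenames and enhancement factors are made up, and this isn't the tool I actually used):

from PIL import Image, ImageOps, ImageEnhance

scan = Image.open("negative_scan.png").convert("RGB")       # hypothetical filename
positive = ImageOps.invert(scan)                            # invert the scanned negative
positive = ImageEnhance.Brightness(positive).enhance(1.2)   # starting-guess factors
positive = ImageEnhance.Contrast(positive).enhance(1.3)
positive.save("positive_adjusted.png")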



More old photos

Impossible to say if that's Blanche or Tudy with (I assume) their dad - George Hammond.

Love the glasses on that state ID photo.  Surprised it took her 4 months after her 18th birthday to get an ID.


Thursday, May 18, 2017

Thursday, April 20, 2017

Wednesday, February 8, 2017

Means and mean error

Three sets of synthetic data:

Set L is a linear function increasing in value with each year, Set R is a random normal distribution, and Set S is a sine function.  Do the RMSEs in and of themselves tell us anything about these functions? No.  Only by looking at the actual data would we be able to see the differences.
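
Here's a quick sketch of the idea (my own toy numbers, not the data behind the figure; I'm taking 'RMSE' here as the root-mean-square deviation about each set's own mean):

import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1900, 2000)

# Three very different synthetic series with comparable spreads
set_L = 0.01 * (years - years.mean())                      # linear trend
set_R = rng.normal(0.0, 0.3, size=years.size)              # random normal noise
set_S = 0.4 * np.sin(2 * np.pi * (years - 1900) / 20.0)    # 20-year sine

def rmse_about_mean(x):
    # root-mean-square deviation from the series mean
    return np.sqrt(np.mean((x - x.mean()) ** 2))

for name, s in [("L", set_L), ("R", set_R), ("S", set_S)]:
    print(name, round(rmse_about_mean(s), 3))
# Similar numbers, completely different behaviour -- you have to look at the data.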



Tuesday, February 7, 2017

Someone forgot to tell the Arctic to wait for Karl to bust the pause.



I was going to add a text label and arrow for "Curry jumps shark", but there would have been too many arrows and the graph would have been too cluttered.

Saturday, February 4, 2017

Annual FDDs per DMI N80

Annual Total Accumulated Freezing Degree Days
(August 1st thru May 31st)

Does not include this freezing season as it is not yet complete.

So, what's happening during the summer months of June & July?


Odd, melt season temperatures are actually *decreasing* - now why would that be?




Thursday, February 2, 2017

N80 Freezing Degree Days thru Feb 2, 2017

Freezing Degree Days thru Feb 2, 2017
(Climatology = 1958 to 2002)

Accumulated FDDs 
Climatology: 3095.9
2016-7:          1833.0
Anomaly:     -1262.9
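
For reference, a minimal sketch of how an accumulated-FDD series like this can be computed from a daily mean-temperature series (my own illustration; the base temperature, the date range, and the column name are assumptions, not the DMI processing):

import pandas as pd

def accumulated_fdd(daily_temps, base=0.0):
    # Cumulative freezing degree days from a daily mean-temperature Series (deg C).
    # Each day below `base` contributes (base - T); warmer days contribute nothing.
    deficit = (base - daily_temps).clip(lower=0.0)
    return deficit.cumsum()

# Example, assuming df has a DatetimeIndex and an 'n80_temp' column in deg C:
# fdd = accumulated_fdd(df.loc["2016-08-01":"2017-02-02", "n80_temp"])
# print(fdd.iloc[-1])   # accumulated FDDs to date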


Implied new ice thickness to date:
Per Lebedev:
Climo:   1.755 m
2016-7:  1.288 m

Per Bilello:
Climo:   1.408 m
2016-7:  1.039 m

Thursday, January 26, 2017

Accumulated Freezing Degree Days N80 thru Jan 26, 2017

Accumulated Freezing Degree Days N80 thru Jan 26, 2017:
Climatology: 2899.4
2016:            1683.9
Anomaly:     -1215.4



Implied new ice thickness to date:
Per Lebedev:
Climo: 1.688 m
2016:   1.225 m

Per Bilello:
Climo:  1.355 m
2016:    0.989 m


Note: Anyone claiming a "recovery" in extent is ignoring that new ice thickness this year is 27% less than climatology and 10 to 15% less than the average of the past decade.  The accumulated FDD anomaly is now greater than at *any* time in the DMI N80 dataset, with 40% of the freezing season still to go.