Author Topic: nc netcdf data  (Read 37922 times)

sidd

  • First-year ice
  • Posts: 6774
    • View Profile
  • Liked: 1047
  • Likes Given: 0
Re: nc netcdf data
« Reply #1 on: February 24, 2015, 07:40:19 AM »
qgis is here

http://www2.qgis.org/en/site/

gdal is here

http://www.gdal.org/

GRASS is here

http://grass.osgeo.org/

installation can take a while, there are multiple prerequisites

mmm, seems easier these days, for example

http://grass.osgeo.org/download/software/linux/

let me know of any problems, will try to help

sidd

Laurent

  • Young ice
  • Posts: 2546
    • View Profile
  • Liked: 13
  • Likes Given: 50
Re: nc netcdf data
« Reply #2 on: February 24, 2015, 10:10:04 AM »
I have looked in the Ubuntu repositories and GRASS is in it, so those with Linux Mint or Ubuntu can install it from the package manager. (Though you may want to check the version that gets installed, because it may not be up to date.)
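
The command-line equivalent would be something along these lines (a sketch; exact package names can vary between releases):

Code: [Select]
sudo apt-get update
sudo apt-get install gdal-bin grass qgis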

A-Team

  • Young ice
  • Posts: 2977
    • View Profile
  • Liked: 944
  • Likes Given: 35
Re: nc netcdf data
« Reply #3 on: February 24, 2015, 12:27:58 PM »
Here I have obtained some non-results on this Morlighem package, using Panoply as the open-source .nc file viewer.

Jim Hunt

  • First-year ice
  • Posts: 6268
  • Don't Vote NatC or PopCon, Save Lives!
    • View Profile
    • The Arctic sea ice Great White Con
  • Liked: 893
  • Likes Given: 87
Re: nc netcdf data
« Reply #4 on: February 24, 2015, 01:13:30 PM »
I'd already got NetCDF and the EPEL repo installed, but getting the basics into Scientific Linux (basically CentOS 6) would seem to involve (as root) doing:

Code: [Select]
yum update
yum install netcdf-devel
yum install qgis
yum install grass

What should I try next?
"The most revolutionary thing one can do always is to proclaim loudly what is happening" - Rosa Luxemburg

A-Team

  • Young ice
  • Posts: 2977
    • View Profile
  • Liked: 944
  • Likes Given: 35
Re: nc netcdf data
« Reply #5 on: February 24, 2015, 02:40:49 PM »
Keep going guys.

Not recommending Panoply at this point, as it is rather peculiar in providing only a small window and seemingly very low resolution views, though it does give access to the raw netCDF numeric data. Version 4.11, and nothing (I do mean nothing) by way of documentation/user manual/help/mouseover hints. It grinds out animations, but these seem to consist of the one frame showing rather than stepping through a parameter.

But then it has this peculiar ability to export quite decent kmz files that seem perfectly registered in Google Earth projection. The one below shows contoured below-sea-level bed elevations around Petermann from the Morlighem dataset, overlying the CReSIS radar flight tracks.

The second image shows another crazy-making journal image in which gratuitous text and grid are plopped down for no good reason right on top of the velocity data. Note once again the logarithmic key has little or nothing to do with the actual colors used in the speed map -- the reds on the right end are never used. And vice versa: some colors on the map don't correspond to anything in the key.

Apparently the general reader is supposed to find and download some gigabyte .nc file and re-do the image from scratch, as they read the article on their iPad on an airplane. Few will. It would make more sense to simply provide the images as primordial lossless grayscales in the supplemental material.

Anybody seen an online tool for converting standard Greenland polar stereographic to Google Earth kmz? The tool provided by Google for this is very much still in beta. Oops, never mind, I see that all Panoply is doing is plugging in the lat,lon of the four corners and pointing to the image; Google Earth is doing the rescaling (without knowing the projection of the image):

<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2" xmlns:gx="http://www.google.com/kml/ext/2.2" xmlns:kml="http://www.opengis.net/kml/2.2" xmlns:atom="http://www.w3.org/2005/Atom">
<GroundOverlay>
   <name>bed in MCdataset-2014-11-19</name>
   <Icon>
      <href>/Users/A-Team/Desktop/bed in MCdataset-2014-11-19.kmz/bed in MCdataset-2014-11-19.png</href>
   </Icon>
   <LatLonBox>
      <north>81.29537130722932</north>
      <south>76.69766046742312</south>
      <east>-23.46179769273073</east>
      <west>-78.00420230726927</west>
   </LatLonBox>
</GroundOverlay>
</kml>
« Last Edit: February 24, 2015, 05:44:47 PM by A-Team »

sidd

  • First-year ice
  • Posts: 6774
    • View Profile
  • Liked: 1047
  • Likes Given: 0
Re: nc netcdf data
« Reply #6 on: February 24, 2015, 10:04:04 PM »
"what should i try next"

i think that should do it. after the install, just invoke qgis or GRASS from the commandline and see what happens ...

in qgis, you can just say "add raster layers", choose the nc file, choose the layers you want to import.
if you are running 32 bit linux you may run outta memory if you try to import too many layers at once.

in grass the procedure is similar but i don't remember exactly.

or you can pretty much do everything from the command line with the gdal toolset

ncdump will dump out data in text format, which will help A-Team.
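
for example (just a sketch, using the Morlighem file name as a stand-in), a first look at the file could be:

Code: [Select]
ncdump -h MCdataset-2014-11-19.nc                 # header and metadata only
gdalinfo NETCDF:"MCdataset-2014-11-19.nc":bed     # size, projection and metadata of the bed layer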

sidd

Jim Hunt

  • First-year ice
  • Posts: 6268
  • Don't Vote NatC or PopCon, Save Lives!
    • View Profile
    • The Arctic sea ice Great White Con
  • Liked: 893
  • Likes Given: 87
Re: nc netcdf data
« Reply #7 on: February 24, 2015, 10:18:41 PM »
I may easily have missed something somewhere sidd, but which nc file(s), obtained from where?

I have loads of Arctic sea ice concentration .nc files on my hard drive, but I don't think that's what A-Team is interested in at this juncture?! Where's the "Morlighem package" hiding?
"The most revolutionary thing one can do always is to proclaim loudly what is happening" - Rosa Luxemburg

sidd

  • First-year ice
  • Posts: 6774
    • View Profile
  • Liked: 1047
  • Likes Given: 0
Re: nc netcdf data
« Reply #8 on: February 25, 2015, 04:27:47 AM »

A-Team

  • Young ice
  • Posts: 2977
    • View Profile
  • Liked: 944
  • Likes Given: 35
Re: nc netcdf data
« Reply #9 on: February 25, 2015, 12:53:31 PM »
sidd, do you recall your coordinate box for that Jakobshavn contour map? I am looking for the bare minimum that will pick up the south branch calving front, big bend, and a ways east.

I am finally seeing some numbers in Panoply's array window for ftp://sidads.colorado.edu/DATASETS/IDBMG4_BedMachineGr/MCdataset-2014-11-19.nc

The plan is to grab the sub-array numbers and open them as a grayscale graphic in ImageJ in raw format (or BMP) and be done with these funky interfaces. Seems like someone could just make a graphical web front end for this and put netCDF out of business.

Jim Hunt

  • First-year ice
  • Posts: 6268
  • Don't Vote NatC or PopCon, Save Lives!
    • View Profile
    • The Arctic sea ice Great White Con
  • Liked: 893
  • Likes Given: 87
Re: nc netcdf data
« Reply #10 on: February 25, 2015, 01:36:17 PM »
Sorry, the link was posted by A-Team in the Jakobshavn thread

Thanks sidd - I have that data now, and 8 cores with not a lot else to do tonight (UTC). What numbers do you and A-Team suggest I set them off crunching?
"The most revolutionary thing one can do always is to proclaim loudly what is happening" - Rosa Luxemburg

A-Team

  • Young ice
  • Posts: 2977
    • View Profile
  • Liked: 944
  • Likes Given: 35
Re: nc netcdf data
« Reply #11 on: February 25, 2015, 02:37:34 PM »
Quote
I have that data now, and 8 cores with not a lot else to do
That's fantastic, Jim.

I am looking for a grayscale bedrock DEM (not contoured) of Jakobshavn south branch at whatever bit depth the Morlighem data actually has (8, 16, 32, 64?), same region that sidd did the contour map for over at http://forum.arctic-sea-ice.net/index.php/topic,154.msg45799.html#msg45799 and below.

Ditto for the error map.

If this works out, ditto the duo for the Petermann subglacial forum and Zachariae (for Espen and Wipneus); approximate areas shown in the maps below.

Thanks for looking into this, it would be a huge help if this works out!

Jim Hunt

  • First-year ice
  • Posts: 6268
  • Don't Vote NatC or PopCon, Save Lives!
    • View Profile
    • The Arctic sea ice Great White Con
  • Liked: 893
  • Likes Given: 87
Re: nc netcdf data
« Reply #12 on: February 25, 2015, 09:13:21 PM »
I'm trying to follow this tutorial (using the Greenland data, not Mt. Everest!):

http://www.qgistutorials.com/en/docs/working_with_terrain.html

First I had to install some more stuff:

Code: [Select]
yum install qgis-python qgis-grass qgis-mapserver
I try to clip to the Jakobshavn area, but keep getting this error message:

Code: [Select]
Computed -srcwin falls outside raster size of 10018x17946.
Any suggestions about how to proceed?
"The most revolutionary thing one can do always is to proclaim loudly what is happening" - Rosa Luxemburg

sidd

  • First-year ice
  • Posts: 6774
    • View Profile
  • Liked: 1047
  • Likes Given: 0
Re: nc netcdf data
« Reply #13 on: February 25, 2015, 09:48:19 PM »
will try to find the co-ordinates

in the meantime, try ncdump from the command line, dump the bed layer as text (which will be a hefty-ish file) and put it up for A-Team to grab? (That will be the whole bed layer ...)

sorry, i can't be more helpful right now, will try to get back to this tomorrow.

sidd

sidd

  • First-year ice
  • Posts: 6774
    • View Profile
  • Liked: 1047
  • Likes Given: 0
Re: nc netcdf data
« Reply #14 on: February 25, 2015, 11:09:42 PM »
forgot to say, if qgis is driving you nuts, try GRASS, it's a different kind of nuts, perhaps more to your liking.

say grass from the command line, and wait; about 3 windows will eventually pop up, one of them a command prompt from GRASS. at that command prompt say something like:

r.in.gdal input=NETCDF:"/home/sidd/Morlinghem-2015/MCdataset-2014-11-19.nc":bed output=MCgrassbed

at this point you should be able to add the layer MCgrassbed from the grass GUI

(sorry this is all from memory)
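
if the import works, a couple of follow-up commands (GRASS 7 syntax, older releases use rast= instead of raster=; again a sketch from the manuals, not verified on this file) will set the region and write the layer back out as a GeoTIFF:

Code: [Select]
g.region raster=MCgrassbed -p
r.out.gdal input=MCgrassbed output=MCbed.tif format=GTiff type=Int16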

A-Team, the data as i remember is 16 bit integer for the bed, as is the error

sidd

Jim Hunt

  • First-year ice
  • Posts: 6268
  • Don't Vote NatC or PopCon, Save Lives!
    • View Profile
    • The Arctic sea ice Great White Con
  • Liked: 893
  • Likes Given: 87
Re: nc netcdf data
« Reply #15 on: February 26, 2015, 12:13:26 AM »
There's now a 148 MB .gz file available at:

http://www.4shared.com/archive/Gw5xDQUwba/jakcdl.html

which hopefully contains the desired bed data in text format. It's the result of:

Code: [Select]
ncdump -v bed MCdataset-2014-11-19.nc > jak.cdl
« Last Edit: February 26, 2015, 09:57:08 AM by Jim Hunt »
"The most revolutionary thing one can do always is to proclaim loudly what is happening" - Rosa Luxemburg

sidd

  • First-year ice
  • Posts: 6774
    • View Profile
  • Liked: 1047
  • Likes Given: 0
Re: nc netcdf data
« Reply #16 on: February 26, 2015, 01:09:15 AM »
also from the dim recesses of memory ... r.contour from the GRASS prompt goes off and works busily and fails in a few seconds with no errors and no contours created, while gdal_contour from the unix command line actually works, in a few hours on a small and pitiful laptop
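
the gdal_contour call is along the lines of this sketch (the 100 m interval and file names are placeholders):

Code: [Select]
gdal_contour -a elev -i 100 bed.tif bed_contours.shp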

A-Team

  • Young ice
  • Posts: 2977
    • View Profile
  • Liked: 944
  • Likes Given: 35
Re: nc netcdf data
« Reply #17 on: February 26, 2015, 10:09:53 AM »
So far so good: download and unarchiving went smoothly, and I now have a 789 MB .cdl file, a format I had never heard of. It turns out to be NetCDF CDL metadata (Unidata), where CDL is the network Common Data form Description Language; there is more explanation here:

http://www.unavco.org/software/visualization/idv/IDV_for_GEON_netcdf.html
https://www.unidata.ucar.edu/software/netcdf/docs/netcdf.html
https://www.unidata.ucar.edu/software/netcdf/docs/ncdump-man-1.html

Quote
     If you already have data files in ASCII, an easier way is to write an ASCII equivalent of a NetCDF file, called a CDL file, and then convert that CDL file to the binary NetCDF file using a NetCDF utility program called ncgen. Ncgen is available for many platforms. CDL files are easily read and understood by humans, and can be revised with a text editor. (Why not just use the CDL file in NetCDF applications? - they are not machine independent, and they require much more disk space than the equivalent binary NetCDF file.)

    You can also invert a binary NetCDF file to the equivalent ASCII CDL form so it is readable, using the NetCDF "ncdump" utility. This lets you examine and reuse the format in any working NetCDF binary file. The command to convert a file "sample.nc" to ASCII CDL form is "ncdump sample.nc > sample.cdl" on Linux. To just see the header (metadata), do "ncdump -h sample.nc > sampleheader.cdl"

    NetCDF files have dimensions, variables, and attributes. "Dimensions" are named integers such as latitude=1200, giving the size of a related coordinate variable array. "Variables" have one-dimensional arrays of observed data, and of location data - the dimension values - where the observed data occur. For the IDV, every data value must have a latitude, longitude, and depth value, and in some cases, a time value. (There is also a way to put an x-y-z grid of data into NetCDF (not lat-long-altitude), if you know the mapping to a lat-long-altitude.) A variable with the same name as a dimension (number of coordinates) is a "coordinate," a location variable. It defines a physical coordinate (such as latitude) with that dimension. Typical coordinate variables are latitude, longitude, depth, and time. "Attributes" are metadata about variables, such as a variable's unit name, e.g. kilometer/s. "Global attributes" are metadata about the entire data set, such as information about the origin of the data, campaign or station remarks, etc.

CDL

    The NetCDF package includes an ascii format, called CDL, to make ascii files that are equivalent to NetCDF files. CDL files are complete inverses of the equivalent NetCDF, and vice versa. CDL is convenient since you can create, read and edit an ascii file, and then convert it to a binary NetCDF file. For the NetCDF documentation about CDL, see Section 2.1.2 Network Common Data Form Language (CDL), CDL Syntax, and CDL data types. Other parts of the complete NetCDF documentation may be useful to you.

I am going to post this while the getting is good and then try to open...
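
Boiled down, the round trip described in that quote is just (file names are placeholders):

Code: [Select]
ncdump -h sample.nc > sample_header.cdl     # metadata only
ncdump sample.nc > sample.cdl               # full ASCII dump
ncgen -o sample_copy.nc sample.cdl          # back to binary netCDF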

A-Team

  • Young ice
  • Posts: 2977
    • View Profile
  • Liked: 944
  • Likes Given: 35
Re: nc netcdf data
« Reply #18 on: February 26, 2015, 11:28:15 AM »
That .cdl file opens readily in a basic text editor. It consists of a page of metadata, of which the most interesting is flag_meanings = "none gimpdem mass_conservation none interpolation hydrodstatic_equilibrium kriging", then many pages of what I take to be fill-in of the rectangular grid where there was no actual data (which will correspond to the alpha channel in an eventual image).

Then come many pages of positive and negative numbers, 24 to a line, which seem to have a limited range of roughly -330 to 1159, and so could be the long-sought bedrock elevations themselves, in meters.

I was expecting a semi-colon to be used as line break but it is only used as end of file. The grid is x = 10018 by y = 17946 which, depending on how they are stepping through the array, says the line break symbol should come at 416 or 747 lines. It is the former, as they are using blocks of 10018 that are distinguished by two spaces at the start and four spaces that begin each subsequent line. The end of the block has an ordinary single carriage return, as does each of the internal lines.

This gives a satisfactory enough handle to quickly munge the data blocks into Excel-readable comma-separated value (CSV) format: delete the four-space indents, change comma-space to comma, delete all carriage returns, then change the remaining double spaces to a carriage return (or other end-of-block symbol).
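
A rough command-line version of that recipe (GNU sed assumed, untested at this file size and probably memory-hungry, with the .cdl header and trailing semicolon trimmed off beforehand):

Code: [Select]
sed -e 's/^    //' -e 's/, /,/g' bed_block.txt | tr -d '\r\n' | sed -e 's/  /\n/g' > bed.csv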

From there it should be openable as a grayscale in ImageJ as a BMP or raw file. I'm thinking elevations have to be normalized to fit the 16 bit range and format conditions. That would involve offsetting by the smallest negative value to make everything positive, then dividing by the (offset) maximum and multiplying by 255 to put everything in [0,255] for 8 bit. Sounds simple enough but could be challenging given the sheer size of the array.
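
If the numbers end up in a GeoTIFF instead, gdal_translate can do that offset-and-stretch in one step; a sketch, with the source min/max as placeholders for the real values:

Code: [Select]
gdal_translate -ot Byte -scale -1500 1200 0 255 bed16.tif bed8.tif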

The resulting image would be 10018 x 17946 = 179,783,028 pixels, or about 179 MB at one byte per pixel, which is about a fifth of a Landsat 15 m 16 bit file, so quite manageable. However at 72 dpi it would take an 11 x 20 foot monitor to see it all, and a blog width of 700 pixels by 1254 height would require rescaling by 14:1, to 7%.

So here the first thing I would do is dumb it down in size by bicubic interpolation: 25% would give a 4x4 = 16-fold reduction in file size, with dimensions of 2504 by 4486, and 10% would just fill my monitor. It would be better to do this much earlier on in the netCDF process, pre-dumping.

I'm not clear on what the lat,lon corners are, so not clear what the ground resolution is, for example if the calving front is 6 km wide, how many y numbers do we have on a transect of the channel? I suppose the resolution is in the Morlighem paper which probably uses the same grid as the Bamber paper.

It might be better to somehow tile the ncdump so that particular pieces of the Jakobshavn channel could be done at higher resolution. That is what Howat is doing on these gigantic 2 m resolution WorldView files. Then the end user  downloads and tiles up just what they need, rather than cutting down a giant file.

Great progress, much thanks to jim and sidd here.

The data looks like this ...

 bed =
  _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _,
    _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _,
    _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _,

    _, _, _, _, _, 12, 14, 19, 25, 37, 55, 79, 100, 113, 118, 114, 109, 101,
    90, 76, 60, 47, 33, _, _, _, _, _, 16, 14, 13, _, 12, 11, _, _, _, _, _,
    _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, 13, 14, 24, 47, 63, 100,
    134, 165, 195, 220, 242, 264, 283, 298, 310, 319, 326, 330, 332, 332, 
    285, 233, 195, 161, 124, 82, 44, 0, -43, -84, -121, -144, -164, -178,
    -185, -193, -194, -197, -198, -201, -203, -202, -200, -199, -189, -181,
    -169, -156, -140, -124, -111, -100, -91, -85, -77, -68, -64, -61, -64,


netcdf MCdataset-2014-11-19 {
dimensions:
   x = 10018 ;
   y = 17946 ;
variables:
   int x(x) ;
      x:long_name = "Cartesian x-coordinate" ;
      x:standard_name = "projection_x_coordinate" ;
      x:units = "meter" ;
   int y(y) ;
      y:long_name = "Cartesian y-coordinate" ;
      y:standard_name = "projection_y_coordinate" ;
      y:units = "meter" ;
   byte mask(y, x) ;
      mask:long_name = "ma
...

A-Team

  • Young ice
  • Posts: 2977
    • View Profile
  • Liked: 944
  • Likes Given: 35
Re: nc netcdf data
« Reply #19 on: February 26, 2015, 12:09:53 PM »
Here is a direct way to open netCDF files as images, which could then be cropped to an area of interest. However, since Jim's ncdump of just Jakobshavn is already quite large, the main bedrock file will be gigantic and the likely outcome would just be tying up the computer for hours. ImageJ has a number of other promising import capabilities such as xy coordinates and raw.

ImageJ Plugins developed in Freiburg

 NetCDF Plugin... Load and save files in Unidata NetCDF format.
 HDF5 Plugin... Load and save files in HDF5 format.
 PCA Plugin ... Perform 2D/3D Principal Component Analysis.

http://lmb.informatik.uni-freiburg.de/resources/opensource/imagej_plugins.en.html
http://lmb.informatik.uni-freiburg.de/resources/opensource/imagej_plugins/netcdf.html

The plugin uses the NetCDF Java Library for reading and writing NetCDF files. Requirements:

    ImageJ, plugins tested with Version 1.38 and newer.
    Java, at least Version 1.4 (for NetCDF Java Library).

Known issues: Color images of type != byte will be read, but cropped to 8 bit. No intelligence is used when original values larger than 255 are found.

Jim Hunt

  • First-year ice
  • Posts: 6268
  • Don't Vote NatC or PopCon, Save Lives!
    • View Profile
    • The Arctic sea ice Great White Con
  • Liked: 893
  • Likes Given: 87
Re: nc netcdf data
« Reply #20 on: February 26, 2015, 12:15:18 PM »
However since Jim's ncdump of just Jakobshavn is already quite large

My dump is of all Greenland! I've still not worked out how to "clip" successfully using qgis, or anything else for that matter!
"The most revolutionary thing one can do always is to proclaim loudly what is happening" - Rosa Luxemburg

A-Team

  • Young ice
  • Posts: 2977
    • View Profile
  • Liked: 944
  • Likes Given: 35
Re: nc netcdf data
« Reply #21 on: February 26, 2015, 12:34:02 PM »
Looks like it may be feasible to clip by scraping off the data display in Panoply ... if only I knew what part of Greenland I was clipping to. It does not copy cleanly as a block, though: there are some very weird unselectable characters between the numbers that most, but not all, text editors think is a carriage return. These turn out to be 13 10 in ASCII, which is CR (carriage return) followed by LF (line feed), so you just get a long list of one value per line, no break for row and column. Panoply also has CSV export capability but I can't get the menu item to light up.

« Last Edit: February 26, 2015, 12:49:04 PM by A-Team »

A-Team

  • Young ice
  • Posts: 2977
    • View Profile
  • Liked: 944
  • Likes Given: 35
Re: nc netcdf data
« Reply #22 on: February 26, 2015, 02:05:28 PM »
Bingo! I am going to write this down before I forget just how this came about. In the freeware netCDF viewer called Panoply, I opened Morlighem's .nc bedrock file, went to the array view and selected a block of values. Pasted into a freeware demo text editor called PageSpinner and replaced whatever they were using as spacers with tabs. Copied over to another freeware text editor called TextEdit, set to plain text, and saved as raww.txt

Then, big breakthrough: found that the freeware Fiji image editor (ImageJ2) had no issues importing it via the "Text Image" option, converting the numeric array to a conventional grayscale and determining the correct bit depth on the way. Because the array was very small, the resulting grayscale is zoomed out and screen-captured.

I have no idea yet which steps are essential, nor where this is in Greenland, but it does appear to be an exceedingly simple way of converting a netCDF layer to a grayscale image. The full bit depth is only needed during a few early steps of contrast adjustment or contouring or whatever; it is then dropped down to 8 bits to escape ImageJ and process in Gimp for forum display.

Here is the image as text:

92   112   146   179   186   209   239   256   266   283   293   292   289   294   294   290
236   250   282   286   285   308   318   327   341   352   350   348   348   344   343   344
366   386   392   394   393   399   398   405   417   413   409   406   402   399   402   397
459   460   454   460   464   468   481   485   482   477   470   463   456   451   452   445
536   527   519   524   524   526   538   542   532   526   518   509   505   500   493   488
581   580   580   585   585   585   587   587   579   572   561   548   545   539   537   528
616   618   620   623   629   627   621   616   608   600   591   581   575   574   573   569
652   656   658   656   656   648   639   632   623   613   603   594   588   589   588   594
676   680   682   681   672   665   654   645   634   625   616   607   604   603   603   609
695   695   693   686   679   668   657   648   639   631   622   618   615   613   615   623
710   705   698   691   685   673   661   650   642   636   630   625   622   621   624   630
714   709   703   696   688   677   664   654   647   640   635   631   629   631   638   636
717   712   707   700   690   679   668   659   651   643   636   630   632   636   635   639
717   712   706   700   691   682   671   662   654   646   639   634   634   632   633   635
710   704   696   690   682   676   666   659   652   644   639   635   634   634   634   635
702   695   689   683   675   669   659   653   649   643   637   636   638   636   632   635
691   686   681   677   671   662   655   649   645   641   638   637   635   635   634   634
680   676   671   666   661   655   647   642   637   635   631   632   630   633   632   633
672   667   663   655   650   645   639   634   633   628   626   630   629   629   630   629
672   665   659   650   643   637   632   629   627   627   624   627   627   628   626   627
671   663   658   650   642   638   632   628   627   625   625   625   625   628   629   630
674   666   660   653   644   639   635   630   626   624   622   623   625   628   633   632
689   681   673   664   654   647   638   631   628   627   625   627   630   635   642   643
705   696   688   681   672   663   650   639   633   635   634   634   635   643   648   652
731   718   707   697   689   679   663   652   643   646   649   651   652   656   660   664
770   756   744   732   718   699   688   677   669   665   665   669   671   674   679   683
808   794   780   764   746   726   712   703   694   693   691   692   694   694   696   701
870   849   828   808   782   751   738   731   722   724   723   722   722   718   717   720
936   919   902   894   865   825   781   768   759   756   754   756   750   744   741   741
999   992   982   969   943   888   831   817   804   802   795   787   780   771   765   764
985   973   961   947   934   921   897   864   850   837   833   829   820   812   797   793
1016   1005   993   980   966   950   928   905   885   869   871   868   861   852   837   829
1041   1032   1020   1005   992   974   952   931   913   896   882   872   865   861   878   869
1057   1048   1038   1027   1014   997   977   954   935   920   906   896   888   884   881   877
1061   1060   1053   1043   1032   1014   999   974   956   941   928   916   910   908   901   898
1067   1068   1064   1057   1042   1026   1008   990   972   958   947   938   930   925   918   914
1079   1079   1073   1063   1050   1036   1017   1002   985   972   963   952   943   936   935   926
1084   1086   1080   1068   1055   1042   1026   1010   996   982   973   965   956   949   946   940
1089   1087   1083   1068   1056   1044   1029   1015   1000   991   983   978   970   963   958   952
1089   1085   1080   1067   1056   1043   1029   1018   1008   1000   992   986   981   979   972   966
1085   1081   1076   1063   1054   1040   1031   1023   1018   1009   1000   993   991   985   981   980
1078   1075   1069   1061   1053   1040   1036   1029   1025   1015   1009   1004   1001   994   986   984
1076   1070   1065   1060   1052   1046   1041   1034   1028   1022   1016   1013   1005   1002   993   988
1070   1071   1067   1061   1056   1052   1045   1040   1032   1026   1020   1020   1011   1007   1000   995
1067   1071   1069   1061   1057   1054   1048   1041   1035   1029   1025   1019   1013   1007   1004   997
1067   1067   1066   1062   1058   1054   1049   1043   1035   1025   1022   1018   1012   1009   1004   999
1067   1064   1061   1059   1057   1052   1050   1044   1033   1026   1022   1018   1013   1008   1003   1007
1066   1065   1062   1059   1058   1052   1045   1040   1032   1026   1023   1018   1013   1008   1003   1002
1065   1063   1060   1062   1054   1051   1047   1039   1034   1026   1024   1019   1014   1008   1003   1004
1063   1058   1061   1060   1056   1051   1048   1040   1033   1028   1024   1018   1009   1006   999   1000
« Last Edit: February 26, 2015, 02:34:44 PM by A-Team »

A-Team

  • Young ice
  • Posts: 2977
    • View Profile
  • Liked: 944
  • Likes Given: 35
Re: nc netcdf data
« Reply #23 on: February 26, 2015, 05:46:15 PM »
Quote
My dump is of all Greenland! I've still not worked out how to "clip" successfully using qgis, or anything else for that matter!
That could be a blessing in disguise. If the image file size is just ~140 MB, that is no worse than a 30 m Landsat channel, so maybe I should intentionally go for all of Greenland. The advantage would be loading all the layers in ImageJ, which can put them in a single co-registered stack (not much worse than a Landsat B8 file), allowing them all to be cropped at once to whatever the current region of interest is and, after manipulations, saved out to Gimp as a layered 8-bit gif without significant loss of information.

In other words, they never needed to wrap it up as a netCDF number crunch in the first place; they could have just posted the image stack to the cloud. In fact the Morlighem group (who are very GIS-minded) have already done this on their web page, just at so-so resolution in so-so palettes. ImageJ will export an image back to a variety of convenient numerical formats should that ever be needed.

sidd

  • First-year ice
  • Posts: 6774
    • View Profile
  • Liked: 1047
  • Likes Given: 0
Re: nc netcdf data
« Reply #24 on: February 26, 2015, 10:06:49 PM »
gnuplot will eat and process layers of the ascii dump also, if you chop out whichever layer you want
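
something like this, as a sketch, assuming a whitespace-separated matrix chopped out of the dump:

Code: [Select]
gnuplot -persist -e "plot 'bed_block.dat' matrix with image"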

Jim Hunt

  • First-year ice
  • Posts: 6268
  • Don't Vote NatC or PopCon, Save Lives!
    • View Profile
    • The Arctic sea ice Great White Con
  • Liked: 893
  • Likes Given: 87
Re: nc netcdf data
« Reply #25 on: February 26, 2015, 10:14:59 PM »
Shock news! A YouTube video relevant to Developer's Corner?



Thanks for all the info A-Team. Having just had a long chat with the Managing Editor of the Mail on Sunday I now need a stiff drink! Then I'll see if I can get any further with qgis.
"The most revolutionary thing one can do always is to proclaim loudly what is happening" - Rosa Luxemburg

Jim Hunt

  • First-year ice
  • Posts: 6268
  • Don't Vote NatC or PopCon, Save Lives!
    • View Profile
    • The Arctic sea ice Great White Con
  • Liked: 893
  • Likes Given: 87
Re: nc netcdf data
« Reply #26 on: February 27, 2015, 02:22:43 AM »
After some trial and error I discovered that replacing the auto-generated -projwin with a manual -srcwin did the trick. Thus:

Code: [Select]
gdal_translate -srcwin 2955 10650 600 250 -of GTiff HDF5:"MCdataset-2014-11-19.nc"://bed jak1.tif
produces this:

http://www.4shared.com/archive/gRS7KZuOce/jak1tif.html

Is that any use to you A-Team? I have dozens of other output formats to choose from!
"The most revolutionary thing one can do always is to proclaim loudly what is happening" - Rosa Luxemburg

sidd

  • First-year ice
  • Posts: 6774
    • View Profile
  • Liked: 1047
  • Likes Given: 0
Re: nc netcdf data
« Reply #27 on: February 27, 2015, 07:17:18 AM »
The Morlighem nc file latitude of origin is 70 N, longitude of central meridian is 45 W (-45).
These are the lat_0 and lon_0 params in the projection string (lon_0=-45) in the Coordinate Reference System (CRS)



 sidd


Jim Hunt

  • First-year ice
  • Posts: 6268
  • Don't Vote NatC or PopCon, Save Lives!
    • View Profile
    • The Arctic sea ice Great White Con
  • Liked: 893
  • Likes Given: 87
Re: nc netcdf data
« Reply #28 on: February 27, 2015, 10:10:15 AM »
Morlighem nc file latitude of origin is 70 N

My CRS currently looks like

Code: [Select]
+proj=stere +ellps=WGS84 +datum=WGS84 +lat_0=90 +lat_ts=70 +lon_0=-45 +k_0=1.0 +x_0=0 +y_0=0
Which gives me the clipping problem. Are you saying I should have +lat_0=70 ?
"The most revolutionary thing one can do always is to proclaim loudly what is happening" - Rosa Luxemburg

A-Team

  • Young ice
  • Posts: 2977
    • View Profile
  • Liked: 944
  • Likes Given: 35
Re: nc netcdf data
« Reply #29 on: February 27, 2015, 11:07:44 AM »
Quote
Morlighem nc file latitude of origin is 70 N, longitude of central meridian is 45 W (-45)
These are the lat_0 and lon_0 params in the projection string (lon_0=-45) in the Coordinate Reference System (CRS)
Helpful explanation of the proj string CRS numbers, sidd. The Jakobshavn calving front is at 69.140, -49.640. It seems everyone is using the same projection for Greenland now, after some flirtations with Mercator and others, called Polar Stereographic North (70ºN, 45ºW), or officially EPSG 3413: http://www.spatialreference.org/ref/epsg/wgs-84-nsidc-sea-ice-polar-stereographic-north/
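
For reference, the proj.4 string usually quoted for EPSG 3413 is the one below; note it has lat_0=90 with lat_ts=70, i.e. the same stereographic parameters as in Jim's CRS string above:

Code: [Select]
+proj=stere +lat_0=90 +lat_ts=70 +lon_0=-45 +k=1 +x_0=0 +y_0=0 +datum=WGS84 +units=m +no_defs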

According to Morlighem's home page, all the data use the same 150 m-resolution grid although the 'true' resolution of the bedrock topography and ice thickness is 400 m. There is no use yet of 2014 data -- more Jakobshavn grids were flown but may not have added much unless the radar was improved.

That's nice to have everything at the same resolution but what does 400 m mean for the Jakobshavn channel topography? Only 2.5 pixels per km, 6 km width so 15 pixels for a transect or 400*15/150=40 pixels after bumping the resolution by an unknown method, probably bicubic or sinc. I am thinking the mass conservation interpolation is limited to experimental so remains at 400 m.

For sills and troughs, consider a 2 km cross-channel swath. At 400 m this results in a tiny grayscale of dimensions 5 x15 pixels; at 150 m, 12 x37. Not a lot there to contour.

The question here is whether bicubic might actually have some physical landscape sense, not just be a way of weighting neighborhood pixels to make smooth interpolative pixels. Maybe so but for real street smarts, optimal interpolation of this type should be triply anisotropic, that is, take the direction of the ice stream current into account as well as the speed profile across the channel for the different contributing ice streams and legacy topography from the good old days when JI was grinding away at the edge of the continental shelf.

I suppose we could compare interpolation methods by looking at where 2014 radar contributed new data, normalizing to where it coincided with old data.

Quote
After some trial and error I discovered that replacing the auto-generated -projwin with a manual -srcwin did the trick.  Is that any use to you A-Team? I have dozens of other output formats to choose from!

Fantastic, Jim! This proved to be a 16 bit tif file of dimensions 600x250. The question is, has any information been lost? I am thinking not, this is the 150 m. But can you list these other formats?

This use of -9999 m for areas with no data has a bizarre effect on the initial appearance of the image, pushing all the data way off to the right in the histogram. I'm concerned that it is compressing it too much, losing information. The way to explore this would be to replace all the -9999 values with a number, say, 10 m less than the minimum (i.e. the maximum actual depth below sea level). That would create a uniform low gray spike in the histogram instead of the black patch. Since it is the only place that gray occurs in the image, it can be lifted out and replaced by anything, including transparency.
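
One way to do that replacement would be gdal_calc.py from the GDAL Python utilities; a sketch only, with -1400 as a placeholder for "a little deeper than the true minimum":

Code: [Select]
gdal_calc.py -A jak1.tif --outfile=jak1_filled.tif --calc="(A==-9999)*(-1400) + (A!=-9999)*A"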

Next move? If you have a chance, Jim:

-- Let's check out these other format products and verify no information is being lost. After that, I'll process the grayscale into various products for the JI forum.

-- Apply the same process, for the same Jakobshavn region, to the other Morlighem layers in the netCDF folder: surface elevation, ice thickness, and error map. This will give us a mini GIS stack for Jakobshavn that serves as the basis for modeling. This will level the playing field -- we'll have the same starting point as the people pushing out the academic papers.
« Last Edit: February 27, 2015, 11:16:32 AM by A-Team »

A-Team

  • Young ice
  • Posts: 2977
    • View Profile
  • Liked: 944
  • Likes Given: 35
Re: nc netcdf data
« Reply #30 on: February 27, 2015, 11:26:38 AM »
It seems like this is reproducing the channel features that sidd found with the qgis contour tool. That was set at 100 m so must have done some interpolating on its own initiative from the posted 150 m that was derived from 400 m sparse radar tracks. Below I false-colored with the 'gem' palette after contrast adjustment and 6x vertical exaggeration of the channel.

sidd

  • First-year ice
  • Posts: 6774
    • View Profile
  • Liked: 1047
  • Likes Given: 0
Re: nc netcdf data
« Reply #31 on: February 27, 2015, 08:48:35 PM »
projection parameter discussion

http://trac.osgeo.org/proj/wiki/GenParms

Jim Hunt

  • First-year ice
  • Posts: 6268
  • Don't Vote NatC or PopCon, Save Lives!
    • View Profile
    • The Arctic sea ice Great White Con
  • Liked: 893
  • Likes Given: 87
Re: nc netcdf data
« Reply #32 on: February 27, 2015, 09:45:34 PM »
projection parameter discussion

I'm very familiar with all that now sidd, but what specific parameter settings are you using in qgis for the Morlighem .nc?

What's wrong with my proj4 settings above?
"The most revolutionary thing one can do always is to proclaim loudly what is happening" - Rosa Luxemburg

sidd

  • First-year ice
  • Posts: 6774
    • View Profile
  • Liked: 1047
  • Likes Given: 0
Re: nc netcdf data
« Reply #33 on: February 28, 2015, 02:48:06 AM »
again from memory, i am fairly sure i have lat_0=70, lon_0=-45,units=m

cannot check till later

sidd

  • First-year ice
  • Posts: 6774
    • View Profile
  • Liked: 1047
  • Likes Given: 0
Re: nc netcdf data
« Reply #34 on: February 28, 2015, 05:45:40 AM »
i seem to be using

+proj=stere +lat_0=70  +lon_0=-45 +k=1 +x_0=0 +y_0=0 +ellps=WGS84 +datum=WGS84 +units=m +no_defs

Jim Hunt

  • First-year ice
  • Posts: 6268
  • Don't Vote NatC or PopCon, Save Lives!
    • View Profile
    • The Arctic sea ice Great White Con
  • Liked: 893
  • Likes Given: 87
Re: nc netcdf data
« Reply #35 on: February 28, 2015, 09:34:13 AM »
i seem to be using

+proj=stere +lat_0=70  +lon_0=-45 +k=1 +x_0=0 +y_0=0 +ellps=WGS84 +datum=WGS84 +units=m +no_defs

Thanks sidd. I still reckon lat_0 should be 90, based on this from the .nc:

Code: [Select]
byte polar_stereographic;
  :ellipsoid = "WGS84";
  :false_easting = 0.0; // double
  :false_northing = 0.0; // double
  :grid_mapping_name = "polar_stereographic";
  :latitude_of_projection_origin = 90.0; // double
  :standard_parallel = 70.0; // double
  :straight_vertical_longitude_from_pole = -45.0; // double

Even if I copy and paste your proj4 settings into my CRS I still get the same error when I try to clip :(

What else could possibly be going wrong?
"The most revolutionary thing one can do always is to proclaim loudly what is happening" - Rosa Luxemburg

sidd

  • First-year ice
  • Posts: 6774
    • View Profile
  • Liked: 1047
  • Likes Given: 0
Re: nc netcdf data
« Reply #36 on: February 28, 2015, 10:36:43 PM »
you may be right ...

from the proj.4 param discussion page, lat_0 should be the latitude of origin and lon_0 should be the center meridian ?

"+lat_0     Latitude of origin"
"+lon_0     Central meridian"

so are they talking about the latitude of the projection origin, or the map origin?

lat_ts is the true scale latitude, which may not apply here

let's say lat_0 is 90
lat_1 should be the first standard parallel, 70 N (?)

next time i have a chance i will use those and see, i suspect it makes little difference for my purposes.
i have done some playing with no CRS at all specified, seems to work also

also there is a clipper tool in qgis under the Raster menu, in the terrain analysis tab i think, same place as you see the contour tool

sidd

PS: i see a link

http://www.remotesensing.org/geotiff/proj_list/random_issues.html#stereographic

which points out several issues ...
« Last Edit: February 28, 2015, 10:51:45 PM by sidd »

Jim Hunt

  • First-year ice
  • Posts: 6268
  • Don't Vote NatC or PopCon, Save Lives!
    • View Profile
    • The Arctic sea ice Great White Con
  • Liked: 893
  • Likes Given: 87
Re: nc netcdf data
« Reply #37 on: March 01, 2015, 09:50:41 PM »
there is a clipper tool in qgis under the Raster menu in terrain analysis tab i think, same place as you see the contour tool

That's the method I am attempting to use. Is there another one within qgis? I copied the auto-generated gdal invocation from qgis to a command line, which revealed this:

Code: [Select]
gdal_translate -projwin 2949.09474727 -10650.8231476 3777.13943528 -10946.6906231 -of GTiff HDF5:"MCdataset-2014-11-19.nc"://bed jak1.tif
Input file size is 10018, 17946
Computed -srcwin 2949 -10650 828 -295 from projected window.
Computed -srcwin falls outside raster size of 10018x17946.

Does that suggest what the problem might be to you? As far as I can see the computed srcwin IS within the raster, apart from the minus signs. Is that the source of the confusion? If I remove them things appear at first sight to work as expected.
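
(For what it's worth, -projwin expects georeferenced corner coordinates in the projection's units, metres here, while -srcwin takes pixel/line offsets plus a window size in pixels, so pixel numbers belong in -srcwin. Sign-stripped, the computed window above becomes something like this, as a sketch:)

Code: [Select]
gdal_translate -srcwin 2949 10650 828 295 -of GTiff HDF5:"MCdataset-2014-11-19.nc"://bed jak1.tif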

"The most revolutionary thing one can do always is to proclaim loudly what is happening" - Rosa Luxemburg

A-Team

  • Young ice
  • Posts: 2977
    • View Profile
  • Liked: 944
  • Likes Given: 35
Re: nc netcdf data
« Reply #38 on: March 01, 2015, 11:01:59 PM »
If clipping is the problem, why not just make one big GeoTIFF out of Morlighem's layers and put it up in cloud storage?

It would not be any worse (after tar.gz) than a single Landsat download. Those individual files are 160 MB each plus 639 MB for the 15 meter, all 16 bit grayscale tifs (except for some plaintext metadata).

We've been opening those in ImageJ, making a stack (which will be co-registered) and then cropping down to special regions from there, keeping the all important registration and saving out as a gif stack for gimp import.

LC80370022014268LGN00_MTL.txt
LC80370022014268LGN00_BQA.TIF
LC80370022014268LGN00_B11.TIF
LC80370022014268LGN00_B10.TIF
LC80370022014268LGN00_B9.TIF
LC80370022014268LGN00_B8.TIF
LC80370022014268LGN00_B7.TIF  --> etc
LC80370022014268LGN00_B6.TIF  --> etc
LC80370022014268LGN00_B5.TIF  --> etc
LC80370022014268LGN00_B4.TIF  --> surface elevation
LC80370022014268LGN00_B3.TIF  --> ice thickness
LC80370022014268LGN00_B2.TIF  --> bedrock
LC80370022014268LGN00_B1.TIF  --> bedrock error

Jim Hunt

  • First-year ice
  • Posts: 6268
  • Don't Vote NatC or PopCon, Save Lives!
    • View Profile
    • The Arctic sea ice Great White Con
  • Liked: 893
  • Likes Given: 87
Re: nc netcdf data
« Reply #39 on: March 02, 2015, 12:02:53 PM »
Quote
I have dozens of other output formats to choose from!

Fantastic, Jim! This proved to be a 16 bit tif file of dimensions 600x250. The question is, has any information been lost? I am thinking not, this is the 150 m. But can you list these other formats?

I can! Please see:

http://www.gdal.org/formats_list.html
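
The list that your local build actually supports can also be printed with:

Code: [Select]
gdal_translate --formats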
"The most revolutionary thing one can do always is to proclaim loudly what is happening" - Rosa Luxemburg

A-Team

  • Young ice
  • Posts: 2977
    • View Profile
  • Liked: 944
  • Likes Given: 35
Re: nc netcdf data
« Reply #40 on: March 02, 2015, 07:28:33 PM »
Quote
formats see:http://www.gdal.org/formats_list.html

Wow, never seen such a silly list in my life. Unless it be the preposterous list of 80 projections in Panoply that no one has ever used twice in the history of cartography.

Buried in the gdal list is PNG. Just use that. It is a universal graphics format that every graphics and GIS program can read and write. It compresses files reversibly but does not throw out any information.

Somehow they are missing the very basic concept that getting from A to B is most easily and safely done as A --> PNG <-- B. That way A does not need to know anything about B's format (or changes in it that B makes without notifying A, whom B does not even know).

Jim Hunt

  • First-year ice
  • Posts: 6268
  • Don't Vote NatC or PopCon, Save Lives!
    • View Profile
    • The Arctic sea ice Great White Con
  • Liked: 893
  • Likes Given: 87
Re: nc netcdf data
« Reply #41 on: March 03, 2015, 12:21:58 PM »
Buried in the gdal list is PNG. Just use that. It is a universal graphics format that every graphics and GIS program can read and write. It compresses files reversibly but does not throw out any information

I tried that:

Code: [Select]
gdal_translate -projwin 2987 10597 3627 10896 -of PNG HDF5:"MCdataset-2014-11-19.nc"://bed jak.png
Input file size is 10018, 17946
Computed -srcwin 2987 10597 640 299 from projected window.
Warning 6: PNG driver doesn't support data type Int16. Only eight bit (Byte) and sixteen bit (UInt16) bands supported. Defaulting to Byte

0...10...20...30...40...50...60...70...80...90...100 - done.

Is 8 bits any good to you? Assuming not then I'll have to RTFM when I have a spare 5 minutes, unless sidd has a better suggestion?
« Last Edit: March 03, 2015, 08:29:46 PM by Jim Hunt »
"The most revolutionary thing one can do always is to proclaim loudly what is happening" - Rosa Luxemburg

A-Team

  • Young ice
  • Posts: 2977
    • View Profile
  • Liked: 944
  • Likes Given: 35
Re: nc netcdf data
« Reply #42 on: March 03, 2015, 03:29:34 PM »
Quote
Warning 6: PNG driver doesn't support data type Int16. Only eight bit (Byte) and sixteen bit (UInt16) bands supported. Defaulting to Byte
Sure, let's see what 8 bit kicks out. It sounds like sixteen bit (UInt16) refers to unsigned integers whereas the data seems to be signed + and - values, so hard to say what it would do with it. Good grief, seems like a whole lot of trouble just to extract an x,y array that would import directly in ImageJ.

I'm not understanding why the method you used to make a good grayscale of Jakobshavn is not working for Pete and Zach. RTF = rich text format? Just a plain text array would suffice, tab or comma delimited.

Jim Hunt

  • First-year ice
  • Posts: 6268
  • Don't Vote NatC or PopCon, Save Lives!
    • View Profile
    • The Arctic sea ice Great White Con
  • Liked: 893
  • Likes Given: 87
Re: nc netcdf data
« Reply #43 on: March 03, 2015, 08:30:45 PM »
RTF = rich text format?

Sorry, no. A typo now corrected!
"The most revolutionary thing one can do always is to proclaim loudly what is happening" - Rosa Luxemburg

sidd

  • First-year ice
  • Posts: 6774
    • View Profile
  • Liked: 1047
  • Likes Given: 0
Re: nc netcdf data
« Reply #44 on: March 04, 2015, 04:16:56 AM »
"unless sidd has a better suggestion"

sorry, better suggestion for ...?  i thought A-team had the whole output of ncdump as text already ? can easily slice and dice that to get any layer you want

Jim Hunt

  • First-year ice
  • Posts: 6268
  • Don't Vote NatC or PopCon, Save Lives!
    • View Profile
    • The Arctic sea ice Great White Con
  • Liked: 893
  • Likes Given: 87
Re: nc netcdf data
« Reply #45 on: March 04, 2015, 10:04:25 AM »
"unless sidd has a better suggestion"

sorry, better suggestion for ...?


Persuading qgis/gdal to convert signed 16 bit to unsigned.
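
Something along these lines might do it, though it's just a sketch I haven't actually tried: let gdal_translate shift and rescale into the unsigned range while writing the 16 bit PNG (the -1500 and 1200 are placeholders for the real minimum and maximum):

Code: [Select]
gdal_translate -ot UInt16 -scale -1500 1200 0 65535 -of PNG HDF5:"MCdataset-2014-11-19.nc"://bed jak16.png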

Quote
i thought A-team had the whole output of ncdump as text already ? can easily slice and dice that to get any layer you want

Currently he has the entire bed layer only as text. It appears he prefers .PNGs though
"The most revolutionary thing one can do always is to proclaim loudly what is happening" - Rosa Luxemburg

A-Team

  • Young ice
  • Posts: 2977
    • View Profile
  • Liked: 944
  • Likes Given: 35
Re: nc netcdf data
« Reply #46 on: March 04, 2015, 11:18:19 AM »
 
Quote
i thought A-team had the whole output of ncdump as text already ? can easily slice and dice that to get any layer you want
Currently he has the entire bed layer only as text. It appears he prefers .PNGs though

Right: with a lossless image format, I can see where it is that I am clipping to. As ncdump text, I have to scrape through their coordinate system.

In short, .nc --> stack of co-registered grayscale images.

I'm thinking it may need to be .tif because it is 16 bit; I believe .png is just 8 bit. However, it may be posted as 16-bit but not make real use of it. That is, it may make poor use of the 16-bit space and, after a linear change in contrast, fit losslessly in 8-bit.

They are talking about 50 m error in bedrock in a 2000 m range, 1 part in 40. Why post to 0.1% when there is 2.5% ambient error? (Actually I see this and worse done all the time in Greenland glaciology.)

sidd

  • First-year ice
  • Posts: 6774
    • View Profile
  • Liked: 1047
  • Likes Given: 0
Re: nc netcdf data
« Reply #47 on: March 04, 2015, 06:46:19 PM »
"As ncdump text, have to scrape through their coord system."

Ah. I wrote a program to do that once, when i was feeding data to gnuplot to plot as a matrix. Will see if i can find it.

sidd

Tor Bejnar

  • Young ice
  • Posts: 4606
    • View Profile
  • Liked: 879
  • Likes Given: 826
Re: nc netcdf data
« Reply #48 on: April 11, 2016, 05:45:07 PM »
I'm not sure where to post this, but "data" was in the thread title, so...

Tamino, on his 'Open Mind' blog, is offering a data service for climate data.  The start of his "original" post on this was (emphasis added)
Quote
We love climate data. We love to see it for ourselves, plot it in new ways, just go exploring for what we can find. Fortunately, a great deal of it is freely available on the internet.

 But — it’s not always so easy to work with. Everybody’s data file is in a different format. Some are easy to use and import to various programs, but some are a royal pain. And, there are some things we always want to do — like transforming data to anomaly values — which we then have to do ourselves. We may want to compare different data sources, which can require loading multiple files, formatting them, computing anomalies, aligning the data sources. And we have to go through the whole thing again when new data are released. Winnie the Pooh would describe it as a “bother.”

Wouldn’t it be nice if the major sources of climate data were retrieved for you on a regular basis? If different sources were combined into a small number of files, properly time-aligned? If anomalies were already computed where appropriate? If they were delivered in a friendly form — as csv files?

I’ve decided to offer a new service, a climate data subscription service. All the data are available online, what this service does is make them easy as pie to play with, and deliver them to your inbox twice a month. If you subscribe by April 30th, the fee is only $25 for a 1-year subscription, which works out to a mere $2.08 per month. That will get you data files twice a month for twelve months … at which time you can renew your subscription if you wish.
...

I don't know how many of you follow Tamino, but I enjoy the way he describes his statistical approach to the issues he follows.
Arctic ice is healthy for children and other living things because "we cannot negotiate with the melting point of ice"

magnamentis

  • Guest
Re: nc netcdf data
« Reply #49 on: April 14, 2016, 04:39:47 PM »
thanks for the link, i like his approach too, we need more of that