EQUIX#

import cf_xarray
import dcpy
import distributed
import hvplot.xarray
import matplotlib as mpl
import matplotlib.pyplot as plt
import numpy as np
import pandas as pd
import pump
import xgcm
from dcpy.oceans import read_osu_microstructure_mat
from IPython.display import Image
from scipy.io import loadmat

import eddydiff as ed
import xarray as xr

xr.set_options(keep_attrs=True)

plt.rcParams["figure.dpi"] = 140
plt.rcParams["savefig.dpi"] = 200
plt.style.use("ggplot")

Todo#

  • [x] See if IOP variance terms match Chameleon

  • [x] Is there a consolidated TAO data file?

  • [x] Are they in the PMEL TAO files?

  • [ ] Use 10-minute averages to calculate KT, Jq

  • [ ] What is happening with the top few bins?

  • [x] Take out the Chameleon MLD properly, not just slice it out

  • [x] Filter χpod bad estimates using dT/dz - already done

  • [x] Filter out binned estimates using count

  • [ ] Pick bins that don’t outcrop?

  • [ ] Try Chameleon dTdz using xgcm transform

  • [ ] Look at T-S diagrams north and south of the equator

Data file notes#

indus/chipod/eq08_long/processed/chi_analysis/deglitched/

  1. allchi_eq08_long.mat: EQUIX EOP APL mooring; renamed to equix_eop.mat

  2. mean_chi_*.mat seem to be IOP files per χpod

EOP χpod units: 204, 205, 304, 312, 314, 315, 328

mean_chi vs summary files

  1. summary files: T1, T2, AZ etc.

  2. mean_chi files: T1, T2, eps, chi, KT, Jq

Image("../images/equix-chipods.png")
Image("../images/equix-tao-chipods.png")

Chameleon#

chameleon = xr.open_dataset("/home/deepak/datasets/microstructure/osu/equix.nc")
del chameleon["dTdz"]  # not potential temperature?
chameleon = ed.sections.add_ancillary_variables(chameleon)
# take out ML
chameleon = chameleon.where(chameleon.depth > chameleon.mld)
chameleon = chameleon.cf.set_coords(["latitude", "longitude"])
bins = ed.sections.choose_bins(
    chameleon.cf["neutral_density"], depth_range=np.arange(20, 200, 15)
)
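`ed.sections.choose_bins` is project-specific; conceptually it converts a set of target depths into density-bin edges by reading the time-mean density profile at those depths. A minimal numpy sketch of that idea (the function name and toy profile here are made up, not the eddydiff implementation):

```python
import numpy as np

def choose_density_bins(density, depth, depth_edges):
    """Map target depth edges onto density-bin edges via the mean profile.

    density: (depth, time) array; depth, depth_edges: 1-D arrays [m].
    """
    mean_profile = np.nanmean(density, axis=1)  # time-mean density at each depth
    return np.interp(depth_edges, depth, mean_profile)

rng = np.random.default_rng(0)
depth = np.arange(1.0, 201.0)  # 1..200 m
# toy stratification: density increases downward, plus small noise
density = 23 + 0.02 * depth[:, None] + 0.005 * rng.standard_normal((200, 5))
bins = choose_density_bins(density, depth, np.arange(20, 200, 15))
```

As long as the mean profile is statically stable, the resulting bin edges are monotonically increasing and can be fed to the density-space bin averaging below.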

chameleon
<xarray.Dataset>
Dimensions:      (depth: 200, time: 2624, zeuc: 80)
Coordinates:
  * depth        (depth) uint64 1 2 3 4 5 6 7 8 ... 194 195 196 197 198 199 200
    lon          (time) float64 -139.9 -139.9 -139.9 ... -139.9 -139.9 -139.9
    lat          (time) float64 0.06246 0.0622 0.06263 ... 0.06317 0.06341
  * time         (time) datetime64[ns] 2008-10-24T20:36:23 ... 2008-11-08T19:...
  * zeuc         (zeuc) float64 -200.0 -195.0 -190.0 ... 185.0 190.0 195.0
Data variables: (12/46)
    pmax         (time, depth) float64 nan nan 205.9 205.9 ... 203.9 203.9 203.9
    castnumber   (time, depth) float64 nan nan 16.0 ... 2.668e+03 2.668e+03
    AX_TILT      (depth, time) float64 nan nan nan nan ... 0.7031 4.232 1.28
    AY_TILT      (depth, time) float64 nan nan nan nan ... -2.623 -2.121 0.05032
    AZ2          (depth, time) float64 nan nan nan ... 2.612e-06 3.208e-06
    C            (depth, time) float64 nan nan nan nan ... 4.135 4.137 4.164
    ...           ...
    chi_masked   (depth, time) float64 nan nan nan ... 4e-09 2.592e-09 9.245e-09
    Krho         (depth, time) float64 nan nan nan ... 2.642e-05 9.09e-06
    KrhoTz       (depth, time) float64 nan nan nan ... 2.195e-06 4.328e-07
    eps_chi      (depth, time) float64 nan nan nan ... 4.389e-11 5.913e-10
    Kt           (depth, time) float64 nan nan nan ... 1.877e-07 2.039e-06
    KtTz         (depth, time) float64 nan nan nan ... 1.559e-08 9.708e-08
Attributes:
    starttime:  ['Time:20:34:29 298   ' 'Time:20:42:18 298   ' 'Time:20:52:14...
    endtime:    ['Time:20:38:29 298   ' 'Time:20:46:29 298   ' 'Time:20:56:29...
    name:       EQUIX

Read climatologies#

argograd = xr.open_zarr(
    "../datasets/argo_monthly_iso_gradients.zarr", decode_times=False
)
argograd.Smean.attrs = {"standard_name": "sea_water_salinity"}
argograd.Tmean.attrs = {"standard_name": "sea_water_potential_temperature"}
argograd["temp"] = dcpy.eos.temp(argograd.Smean, argograd.Tmean, argograd.pres, 0)
argograd = argograd.cf.guess_coord_axis()
argograd.pres.attrs.update({"positive": "down", "standard_name": "sea_water_pressure"})
argograd = argograd.cf.add_bounds("pres")
argo = (
    argograd.sel(lon=220, method="nearest")
    .sel(lat=slice(-3, 8), pres=slice(300))
    .mean("time")
)
argo["pden"] = ed.jmd95.dens(argo.Smean, argo.Tmean, 0)
argo["gamma_n"] = dcpy.oceans.neutral_density(argo)
argo
<xarray.Dataset>
Dimensions:      (bounds: 2, lat: 11, pres: 25)
Coordinates:
  * lat          (lat) float32 -2.5 -1.5 -0.5 0.5 1.5 2.5 3.5 4.5 5.5 6.5 7.5
    lon          float32 220.5
  * pres         (pres) float64 2.5 10.0 20.0 30.0 ... 240.0 260.0 280.0 300.0
    pres_bounds  (bounds, pres) float32 -1.25 6.25 15.0 ... 270.0 290.0 310.0
Dimensions without coordinates: bounds
Data variables:
    Smean        (pres, lat) float32 dask.array<chunksize=(25, 11), meta=np.ndarray>
    Tmean        (pres, lat) float32 dask.array<chunksize=(25, 11), meta=np.ndarray>
    dSdia        (pres, lat) float64 dask.array<chunksize=(25, 11), meta=np.ndarray>
    dSdz         (pres, lat) float32 dask.array<chunksize=(25, 11), meta=np.ndarray>
    dSiso        (pres, lat) float64 dask.array<chunksize=(25, 11), meta=np.ndarray>
    dTdia        (pres, lat) float64 dask.array<chunksize=(25, 11), meta=np.ndarray>
    dTdz         (pres, lat) float32 dask.array<chunksize=(25, 11), meta=np.ndarray>
    dTiso        (pres, lat) float64 dask.array<chunksize=(25, 11), meta=np.ndarray>
    ρmean        (pres, lat) float32 dask.array<chunksize=(25, 11), meta=np.ndarray>
    temp         (pres, lat) float32 dask.array<chunksize=(25, 11), meta=np.ndarray>
    pden         (pres, lat) float32 dask.array<chunksize=(25, 11), meta=np.ndarray>
    gamma_n      (lat, pres) float32 dask.array<chunksize=(11, 25), meta=np.ndarray>
Attributes:
    dataset:  argo
    name:     Mean fields and isopycnal, diapycnal gradients from Argo
ecco = (
    ed.read_ecco_clim()
    .sel(lon=220, method="nearest")
    .sel(lat=slice(-3, 8), pres=slice(300))
    .mean("time")
)
ecco["gamma_n"] = dcpy.oceans.neutral_density(ecco)
ecco
<xarray.Dataset>
Dimensions:      (bounds: 2, lat: 22, pres: 20)
Coordinates:
  * lat          (lat) float64 -2.75 -2.25 -1.75 -1.25 ... 6.25 6.75 7.25 7.75
    lon          float64 220.2
  * pres         (pres) float64 5.0 15.0 25.0 35.0 ... 194.7 222.7 257.5 299.9
    pres_bounds  (bounds, pres) float64 0.0 10.0 20.0 30.0 ... 236.7 274.8 321.2
Dimensions without coordinates: bounds
Data variables:
    RHOAnoma     (pres, lat) float64 dask.array<chunksize=(20, 6), meta=np.ndarray>
    Smean        (pres, lat) float64 dask.array<chunksize=(20, 6), meta=np.ndarray>
    Tmean        (pres, lat) float64 dask.array<chunksize=(20, 6), meta=np.ndarray>
    dSdia        (pres, lat) float64 dask.array<chunksize=(20, 6), meta=np.ndarray>
    dSdz         (pres, lat) float64 dask.array<chunksize=(20, 6), meta=np.ndarray>
    dSiso        (pres, lat) float64 dask.array<chunksize=(20, 6), meta=np.ndarray>
    dTdia        (pres, lat) float64 dask.array<chunksize=(20, 6), meta=np.ndarray>
    dTdz         (pres, lat) float64 dask.array<chunksize=(20, 6), meta=np.ndarray>
    dTiso        (pres, lat) float64 dask.array<chunksize=(20, 6), meta=np.ndarray>
    pden         (pres, lat) float64 dask.array<chunksize=(20, 6), meta=np.ndarray>
    temp         (pres, lat) float64 dask.array<chunksize=(20, 6), meta=np.ndarray>
    gamma_n      (lat, pres) float64 dask.array<chunksize=(6, 20), meta=np.ndarray>
Attributes:
    dataset:  ecco
    name:     Mean fields and isopycnal, diapycnal gradients from ECCO v4r3
colegrad = xr.open_dataset("../datasets/cole-clim-gradient.nc")
colegrad
<xarray.Dataset>
Dimensions:             (lat: 145, lon: 360, pres: 58)
Coordinates:
  * lon                 (lon) float32 20.5 21.5 22.5 23.5 ... 377.5 378.5 379.5
  * lat                 (lat) float32 -64.5 -63.5 -62.5 -61.5 ... 77.5 78.5 79.5
  * pres                (pres) float32 2.5 10.0 20.0 ... 1.9e+03 1.975e+03
    reference_pressure  int64 0
Data variables:
    dSiso               (pres, lat, lon) float32 ...
    dTiso               (pres, lat, lon) float32 ...
    ρmean               (pres, lat, lon) float32 ...
    gamma_n             (lat, lon, pres) float32 ...
cole = ed.read_cole()
cole
<xarray.Dataset>
Dimensions:                (lat: 131, lon: 360, pres: 101, scalar: 1, sigma: 96)
Coordinates:
  * lat                    (lat) float64 -65.0 -64.0 -63.0 ... 63.0 64.0 65.0
  * lon                    (lon) float64 1.5 2.5 3.5 4.5 ... 358.5 359.5 360.5
  * pres                   (pres) float64 0.0 20.0 40.0 ... 1.98e+03 2e+03
  * sigma                  (sigma) float64 20.0 20.1 20.2 ... 29.3 29.4 29.5
    start_year             (scalar) float64 2.005e+03
    end_year               (scalar) float64 2.015e+03
    mixing_efficiency      (scalar) float64 0.16
    minimum_points         (scalar) float64 15.0
    maximum_mixing_length  (scalar) float64 6e+05
Dimensions without coordinates: scalar
Data variables: (12/16)
    mixing_length_sig      (lat, lon, sigma) float64 ...
    salinity_mean_sig      (lat, lon, sigma) float64 ...
    salinity_std_sig       (lat, lon, sigma) float64 ...
    salinity_gradient_sig  (lat, lon, sigma) float64 ...
    depth_mean_sig         (lat, lon, sigma) float64 ...
    number_points_sig      (lat, lon, sigma) float64 ...
    ...                     ...
    density_mean_depth     (lat, lon, pres) float64 ...
    number_points          (lat, lon, pres) float64 ...
    velocity_std           (lat, lon, pres) float64 ...
    diffusivity            (lat, lon, pres) float64 nan nan nan ... nan nan nan
    process_date           (scalar) timedelta64[ns] 8 days 12:43:27.466400463
    diffusivity_first      (lat, lon) float64 nan nan nan nan ... nan nan nan
Attributes:
    title:    This dataset contains calculations related to the estimation of...

Read χpod data#

equix_iop = xr.load_dataset("/home/deepak/datasets/microstructure/osu/equix/iop.nc")
equix_iop.coords["kind"] = ("depth", ["APL"] * equix_iop.sizes["depth"])

tao_iop = (
    xr.load_dataset("/home/deepak/datasets/microstructure/osu/equix/tao_may08.nc")
    .reindex(time=equix_iop.time)
    .drop_sel(depth=[39, 84])
)
tao_iop.coords["kind"] = ("depth", ["TAO"] * tao_iop.sizes["depth"])

# depths are quite different so simple merge works.
# clusters of 3 χpods are also separated
iop_ = xr.merge([tao_iop, equix_iop])

iop = iop_[["theta", "chi", "eps", "dTdz"]].coarsen(time=10, boundary="trim").mean()
iop["salt"] = 35 * xr.ones_like(iop.theta)
iop["salt"].attrs = {"standard_name": "sea_water_salinity"}
iop["theta"].attrs = {"standard_name": "sea_water_temperature"}
iop["pres"] = iop.depth
iop["pres"].attrs = {"standard_name": "sea_water_pressure"}
iop.coords["latitude"] = 0
iop.coords["longitude"] = -140
iop = iop.cf.guess_coord_axis()
iop = ed.sections.add_ancillary_variables(iop)
iop
<xarray.Dataset>
Dimensions:             (depth: 12, time: 2256)
Coordinates:
  * depth               (depth) int64 18 24 28 49 51 53 59 62 69 80 124 150
  * time                (time) datetime64[ns] 2008-10-24T07:05:00 ... 2008-11...
    unit                (depth) float64 313.0 nan nan nan ... nan 327.0 321.0
    kind                (depth) object 'TAO' 'APL' 'APL' ... 'APL' 'TAO' 'TAO'
    actual_depth        (depth, time) float64 nan nan nan nan ... nan nan nan
    latitude            int64 0
    longitude           int64 -140
    reference_pressure  int64 0
Data variables: (12/15)
    theta               (depth, time) float64 24.49 24.51 24.5 ... 16.74 16.88
    chi                 (depth, time) float64 1.617e-06 3.078e-06 ... 2.275e-10
    eps                 (depth, time) float64 7.304e-06 1.578e-05 ... 1.673e-10
    Tz                  (depth, time) float64 0.002167 0.0036 ... 0.1626 0.01216
    salt                (depth, time) float64 35.0 35.0 35.0 ... 35.0 35.0 35.0
    pres                (depth) int64 18 24 28 49 51 53 59 62 69 80 124 150
    ...                  ...
    chi_masked          (depth, time) float64 1.617e-06 3.078e-06 ... 2.275e-10
    Krho                (depth, time) float64 0.02864 0.05469 ... 2.409e-07
    KrhoTz              (depth, time) float64 6.205e-05 0.0001969 ... 2.93e-09
    eps_chi             (depth, time) float64 4.392e-05 3.427e-05 ... 5.343e-10
    Kt                  (depth, time) float64 0.1722 0.1188 ... 7.692e-07
    KtTz                (depth, time) float64 0.0003731 0.0004276 ... 9.355e-09
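The `coarsen(time=10, boundary="trim").mean()` step above is plain block averaging: with roughly 1-minute native sampling (an assumption), 10 samples per block gives the 10-minute averages from the todo list. A toy sketch on a synthetic series:

```python
import numpy as np
import pandas as pd
import xarray as xr

# synthetic 1-minute time series; 105 samples so "trim" drops the partial block
ts = xr.DataArray(
    np.arange(105.0),
    dims="time",
    coords={"time": pd.date_range("2008-10-24", periods=105, freq="1min")},
)
# 10-sample (10-minute) block means; boundary="trim" discards the leftover 5
avg = ts.coarsen(time=10, boundary="trim").mean()
```

The first block mean is the average of samples 0..9, i.e. 4.5, and the trailing partial block is dropped rather than averaged over fewer samples.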
iop.chi.cf.plot(robust=True, norm=mpl.colors.LogNorm())
<matplotlib.collections.QuadMesh at 0x7fccfd0e8ee0>

Do ancillary variables#

iop.Tz.cf.plot(robust=True)
<matplotlib.collections.QuadMesh at 0x7fccfcf57b20>
iop.KtTz.cf.plot(robust=True)
<matplotlib.collections.QuadMesh at 0x7fccfce35a00>

IOP: χpod vs Chameleon#

Not sure why this does not fully agree with Perlin & Moum.

chameleon.chi.mean("time").rolling(depth=5, center=True).mean().cf.plot()
iop.chi.mean("time").cf.plot(marker="x", xscale="log")
[<matplotlib.lines.Line2D at 0x7fccfcd17ca0>]

bin-averaging in density space#

cham_dens = ed.sections.bin_average_vertical(chameleon, "neutral_density", bins)

iop_dens = ed.sections.bin_average_vertical(
    iop.where(iop.Tz > 5e-3), "neutral_density", bins, skip_fits=True
)
tao_dens = ed.sections.bin_average_vertical(
    iop.where(iop.Tz > 5e-3).query(depth="kind == 'TAO'"),
    "neutral_density",
    bins,
    skip_fits=True,
)

apl_dens = ed.sections.bin_average_vertical(
    iop.where(iop.Tz > 5e-3).query(depth="kind == 'APL'"),
    "neutral_density",
    bins,
    skip_fits=True,
)
chipod_dens = xr.concat([iop_dens, tao_dens, apl_dens], dim="kind")
chipod_dens["kind"] = ["iop", "TAO", "APL"]

# χpod depths are useless as a mean pressure
# we need to remap using the mean; here we use chameleon for now
# could use Argo et al later
chipod_dens = chipod_dens.rename({"pres": "χpod_pres"})
chipod_dens.coords["pres"] = cham_dens["pres"].reset_coords(drop=True)

for var in cham_dens:
    cham_dens[var].attrs["coordinates"] = "pres"
cham_dens.num_obs.cf.plot(y="pres")
chipod_dens.num_obs.sel(kind="iop").cf.plot(y="pres", xscale="log")
dcpy.plots.linex([cham_dens.num_obs.median() / 3, chipod_dens.num_obs.median() / 3])
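The next section keeps only density bins with at least a third of the median observation count (`num_obs > num_obs.median() / 3`). A toy sketch of that count filter:

```python
import xarray as xr

# toy binned estimates: sparse bins (30 and 4 obs) carry noisy χ values
num_obs = xr.DataArray([1200.0, 900.0, 30.0, 850.0, 4.0, 1000.0], dims="gamma_n")
chi = xr.DataArray([1e-7, 2e-7, 9e-6, 1.5e-7, 8e-6, 1.2e-7], dims="gamma_n")

# keep bins with more than a third of the median count; others become NaN
mask = num_obs > num_obs.median() / 3
chi_masked = chi.where(mask)
```

Here the median count is 875, so the threshold is ~292 and the two sparse bins are masked out.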

final result#

This was wrong earlier: I forgot to divide χ by 2 for the χpod estimate.

The χpods say there is no difference, i.e. there doesn’t seem to be any eddy stirring.

Old notes: Mostly agree.

  • It looks like the biggest disagreement is between 100m and 120m ≈ EUCmax.

  • Note that I have used the chameleon \(∂_zθ^m\)

  • crosses need to be moved one half-bin width down

chipodmask = chipod_dens.num_obs > chipod_dens.num_obs.median() / 3
chammask = cham_dens.num_obs > cham_dens.num_obs.median() / 3

ed.sections.plot_var_prod_diss(cham_dens.where(chammask))
(chipod_dens.chi / 2).where(chipodmask).sel(kind="iop").cf.plot(
    y="pres", color="r", ls="none", marker="x"
)
(chipod_dens.where(chipodmask).KtTz * cham_dens.dTdz_m).sel(kind="iop").cf.plot(
    y="pres", marker="x", ls="none", color="k"
)
plt.title("EQUIX IOP")
hdl = dcpy.plots.liney(chameleon.eucmax.mean().values, label="$z_{EUC}$")
chipod_dens.where(chipod_dens.num_obs > 600).sel(kind="iop").chi.cf.plot.step(y="pres")
chipod_dens.where(chipod_dens.num_obs > 250).sel(kind="TAO").chi.cf.plot.step(y="pres")
chipod_dens.where(chipod_dens.num_obs > 250).sel(kind="APL").chi.cf.plot.step(y="pres")

EQUIX EOP#

  • [ ] need salinity

  • [ ] merge in TAO

  • [ ] need to preserve time dimension while grouping

equix_eop = xr.open_dataset(
    "/home/deepak/datasets/microstructure/osu/equix/hourly_eop.nc"
).rename({"dTdz": "Tz"})
equix_eop["depth"] = equix_eop.depth.data.astype(int)
equix_eop["salt"] = 35 * xr.ones_like(equix_eop.theta)
equix_eop["salt"].attrs = {"standard_name": "sea_water_salinity"}
equix_eop["T"] = dcpy.eos.temp(equix_eop.salt, equix_eop.theta, equix_eop.depth)
equix_eop.coords["pres"] = dcpy.eos.pres(equix_eop.depth, 0)
equix_eop.coords["latitude"] = 0
equix_eop.coords["longitude"] = -140
equix_eop = equix_eop.cf.guess_coord_axis()
equix_eop["gamma_n"] = dcpy.oceans.neutral_density(equix_eop)
equix_eop["pden"] = dcpy.eos.pden(equix_eop.salt, equix_eop.theta, 0)
equix_eop.attrs["name"] = "EQUIX"

tao = xr.open_dataset("/home/deepak/datasets/microstructure/osu/equix/hourly_tao.nc")
tao.attrs["name"] = "TAO"

tao_eop = dcpy.util.slice_like(tao, equix_eop.time).drop_sel(depth=[39, 84])
eop = xr.merge([tao_eop, equix_eop])
eop
<xarray.Dataset>
Dimensions:             (depth: 11, time: 2692)
Coordinates:
  * depth               (depth) int64 18 25 29 47 49 58 59 69 76 124 150
  * time                (time) datetime64[ns] 2008-11-11T05:00:00 ... 2009-03...
    unit                (depth) float64 313.0 315.0 328.0 ... 304.0 327.0 321.0
    pres                (depth) float64 nan 25.15 29.17 47.28 ... 76.47 nan nan
    latitude            int64 0
    longitude           int64 -140
    reference_pressure  int64 0
Data variables:
    theta               (time, depth) float64 24.77 24.62 24.51 ... 18.66 15.89
    chi                 (time, depth) float64 2.222e-07 7.709e-08 ... 2.5e-07
    eps                 (time, depth) float64 4.368e-07 6.169e-08 ... 7.051e-09
    Kt                  (time, depth) float64 0.008625 0.0002651 ... 3.797e-06
    Jq                  (time, depth) float64 98.86 13.11 40.81 ... 1.153 2.633
    dTdz                (time, depth) float64 0.008565 nan ... 0.06073 0.2235
    Tz                  (time, depth) float64 nan 0.01206 0.005363 ... nan nan
    KtTz                (time, depth) float64 nan 3.196e-06 ... nan nan
    salt                (time, depth) float64 nan 35.0 35.0 ... 35.0 nan nan
    T                   (time, depth) float64 nan 24.62 24.52 ... 21.78 nan nan
    gamma_n             (time, depth) float64 nan 23.46 23.49 ... 24.29 nan nan
    pden                (time, depth) float64 nan 1.023e+03 ... nan nan
eccoT_full, edges = ed.estimate_gradients(ecco, bins, bin_var="gamma_n", debug=True)
eccoT = eccoT_full.interp(lat=0)
argoT_full, _ = ed.estimate_gradients(argo, bins, bin_var="gamma_n", debug=True)
argoT = argoT_full.interp(lat=0)
eccoT
<xarray.Dataset>
Dimensions:         (bounds: 2, gamma_n: 11)
Coordinates:
    lon             float64 220.2
  * gamma_n         (gamma_n) float64 23.54 23.67 23.84 ... 25.96 26.16 26.28
    gamma_n_bounds  (bounds, gamma_n) float64 23.49 23.59 23.74 ... 26.23 26.33
    dz_remapped     (gamma_n) float64 7.004 8.579 10.3 13.2 ... 21.8 18.04 17.01
    pres            (gamma_n) float64 57.03 64.82 74.26 ... 174.0 193.9 211.4
    y               float64 0.0
    lat             int64 0
Dimensions without coordinates: bounds
Data variables:
    Tmean           (gamma_n) float64 24.9 24.59 24.12 ... 15.3 14.32 13.66
    dTdy            (gamma_n) float64 -3.753e-06 -3.315e-06 ... -1.896e-06
    dTdz            (gamma_n) float64 0.03738 0.04391 ... 0.04075 0.03307
chipod_dens_df = ed.eddydiff.bin_avg_in_density_time(
    eop.where(eop.Tz > 5e-3), bins={"neutral_density": bins}, strftime="%Y"
)
chipod_dens = ed.sections.bin_average_vertical(
    eop.where(eop.Tz > 1e-2), "neutral_density", bins, skip_fits=True
)
chipod_dens
<xarray.Dataset>
Dimensions:             (bounds: 2, gamma_n: 11)
Coordinates:
  * gamma_n             (gamma_n) float64 23.54 23.66 23.84 ... 26.16 26.28
    pres                (gamma_n) float64 40.65 42.21 43.99 47.9 ... nan nan nan
    latitude            int64 0
    longitude           int64 -140
    reference_pressure  int64 0
    num_obs             (gamma_n) float64 1.412e+03 2.154e+03 3e+03 ... nan nan
    gamma_n_bounds      (bounds, gamma_n) float64 23.49 23.59 ... 26.23 26.33
Dimensions without coordinates: bounds
Data variables:
    theta               (gamma_n) float64 24.34 23.95 23.39 ... nan nan nan
    chi                 (gamma_n) float64 1.007e-06 1.207e-06 ... nan nan
    eps                 (gamma_n) float64 5.64e-07 6.304e-07 ... nan nan
    Kt                  (gamma_n) float64 0.0006162 0.0005668 ... nan nan
    Jq                  (gamma_n) float64 64.93 65.87 85.84 ... nan nan nan
    dTdz                (gamma_n) float64 nan nan nan nan ... nan nan nan nan
    Tz                  (gamma_n) float64 0.02894 0.03726 0.04573 ... nan nan
    KtTz                (gamma_n) float64 1.584e-05 1.607e-05 ... nan nan
    salt                (gamma_n) float64 35.0 35.0 35.0 35.0 ... nan nan nan
    T                   (gamma_n) float64 24.34 23.96 23.4 22.55 ... nan nan nan
    pden                (gamma_n) float64 1.024e+03 1.024e+03 ... nan nan
    unit                (gamma_n) float64 263.4 268.7 280.9 ... nan nan nan
eccoT["gamma_n"] = argoT["gamma_n"] = chipod_dens.gamma_n
(chipod_dens.chi / 2).cf.plot(y="gamma_n")
(chipod_dens.KtTz * eccoT.dTdz).cf.plot(y="gamma_n")
[<matplotlib.lines.Line2D at 0x7f05e3244820>]
cole = cole.interp(lat=0, lon=220).set_coords("density_mean_depth")
cole_diff = xr.DataArray(
    np.interp(eccoT.gamma_n, cole.density_mean_depth, cole.diffusivity),
    dims="gamma_n",
    coords={"gamma_n": eccoT.gamma_n},
)

colegrad_pt = colegrad.interp(lat=0, lon=220)
cole_gradT = xr.DataArray(
    np.interp(eccoT.gamma_n, colegrad_pt.gamma_n, colegrad_pt.dTiso),
    dims="gamma_n",
    coords={"gamma_n": eccoT.gamma_n},
)

This plot shows local variance dissipation \(⟨χ⟩/2\) (χpod), local variance production \(⟨K_T θ_z⟩⟨θ⟩_z\) (χpod + climatology), and an estimate of the variance produced by eddy stirring, \(K_e |⟨θ⟩_y|²\), using the Cole et al (2015) Argo diffusivity (Cole + climatology).

The story seems to be consistent in that eddy stirring variance production is small.

Here I have used \(⟨θ⟩_y\) at the equator from 1D spline fits to the mean field along isopycnals. Technically we would want a gradient between about 1°S and 5°N. But the gradient generally looks larger at the equator, so this might even be an overestimate.

I could redo the Cole estimate using Argo points between 1°S and 5°N, combined with the appropriate gradient (using their methodology), to possibly get a better value.
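For scale, the three budget terms compared in the figure can be checked with order-of-magnitude arithmetic. All numbers below are made up for illustration, not EQUIX values; they only show why the eddy-stirring term plots so far to the left:

```python
# temperature-variance budget terms, all in °C²/s

chi = 1.0e-6                  # ⟨χ⟩ [°C²/s]
dissipation = chi / 2         # local variance dissipation

KtTz = 1.6e-5                 # ⟨K_T θ_z⟩ [°C m/s]
dTdz_mean = 0.03              # climatological ⟨θ⟩_z [°C/m]
vertical_production = KtTz * dTdz_mean

Ke = 500.0                    # Cole-style isopycnal eddy diffusivity [m²/s]
dTdy = 2.0e-6                 # isopycnal ⟨θ⟩_y [°C/m]
lateral_production = Ke * dTdy ** 2
```

With these magnitudes, vertical production roughly balances dissipation, while the lateral (eddy stirring) term is a couple of orders of magnitude smaller, consistent with the story above.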

(chipod_dens.chi / 2).cf.plot(
    y="gamma_n", label="$⟨χ⟩/2$", color="r", size=4.5, aspect=1 / 1.4
)
# (chipod_dens.KtTz * cham_dens.dTdz_m).cf.plot(
#    y="gamma_n", xscale="log", label="$⟨K_T θ_z⟩ ⟨θ⟩_z^{cham}$"
# )
(chipod_dens.KtTz * eccoT.dTdz).cf.plot(
    y="gamma_n", xscale="log", color="k", ls="-", label="$⟨K_T θ_z⟩ ⟨θ⟩_z^{ecco}$"
)
(chipod_dens.KtTz * argoT.dTdz).cf.plot(
    y="gamma_n", xscale="log", color="k", ls="--", label="$⟨K_T θ_z⟩ ⟨θ⟩_z^{argo}$"
)
(cole_diff * cole_gradT**2).plot(
    y="gamma_n",
    color="k",
    ls="dashed",
    _labels=False,
    label="$Κ_e^{argo} |⟨θ⟩_y|²_{cole}$",
    ylim=(25.75, 23),
    xlim=(1e-10, None),
)
(cole_diff * argoT.dTdy**2).plot(
    y="gamma_n",
    color="k",
    ls=":",
    _labels=False,
    label="$Κ_e^{argo} |⟨θ⟩_y|²_{argo}$",
    ylim=(25.75, 23),
    xlim=(1e-10, None),
)
(cole_diff * eccoT.dTdy**2).plot(
    y="gamma_n",
    color="k",
    ls="-.",
    _labels=False,
    label="$Κ_e^{argo} |⟨θ⟩_y|²_{ecco}$",
    ylim=(25.75, 23),
    xlim=(1e-9, None),
)
plt.xticks([1e-9, 1e-8, 1e-7, 1e-6])
plt.legend(loc="upper left")
plt.title("")
plt.xlabel("Variance production/dissipation [°C²/s]")
Text(0.5, 0, 'Variance production/dissipation [°C²/s]')

Uncertainties in estimating \(Κ_e^{argo} |⟨θ⟩_y|²\)#

Using the spline gradient at the equator gives a large value. We would want the gradient estimated between 1°S and 5°N, so the variance production estimate is likely an overestimate.

xr.concat(
    [eccoT_full.dTdy, argoT_full.dTdy.interp(lat=eccoT_full.lat)], dim="kind"
).assign_coords(kind=["ecco", "argo"]).plot(col="kind", yincrease=False)
<xarray.plot.facetgrid.FacetGrid at 0x7fa55875d910>

n-var groupby#

# scratch: scatter grouped results back onto a 2D grid;
# uses tuple_groups, mask, and result defined in the cells below
garr = np.array(tuple(tuple_groups))
out = np.arange(17 * 5, dtype=float).reshape((17, 5))
result = np.arange(55, dtype=float)
result[mask] = np.nan
out[(garr[:, 0], garr[:, 1])] = result
out
v1 = eop.gamma_n
v2 = eop.time.dt.month

to_group = ((v1, bins), v2)

factorized = []
groups = []
for var in to_group:
    if isinstance(var, tuple):
        # groupby_bins case; note this rebinds the name `bins` locally
        var, bins = var
        partitioned = pd.cut(var.data.ravel(), bins)
        cat = partitioned.codes
        uniques = partitioned.categories.values
    else:
        # plain groupby case
        assert isinstance(var, xr.DataArray)  # really any duck array
        cat, uniques = pd.factorize(var.data.ravel())

    groups.append(uniques)
    factorized.append(var.copy(data=cat.reshape(var.shape)))

factorized = xr.broadcast(*factorized)

ngroups = len(to_group)
iters = tuple(map(lambda x: x.flat, tuple(g.data for g in factorized)))
tuples = tuple(zip(*iters))
group_idx, tuple_groups = pd.factorize(tuples)
group_idx = group_idx.reshape(factorized[0].shape)
actual_groups = [tuple(g[i] for g, i in zip(groups, idx)) for idx in tuple_groups]
group_idx
from dask_groupby.core import chunk_reduce
result = chunk_reduce(eop.chi.data, group_idx, func=("mean",))
# result["groups"] = np.array(actual_groups)#[mask]
# result["mean"] = result["mean"]#[mask]

mask = [-1 not in tup for tup in tuple_groups.flat]
midx = pd.MultiIndex.from_tuples(actual_groups, names=["gamma_n_bins", "month"])[mask]
mean_chi = xr.DataArray(
    result["mean"][mask], dims="dim", coords={"dim": midx}
).unstack()
mean_chi
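The cells above hand-roll an n-variable groupby by factorizing each grouping variable, then factorizing the tuples of codes into a single flat group index. A compact toy version of the same idea for a (density bin × month) grouping:

```python
import numpy as np
import pandas as pd

# toy data: one value per observation, grouped by density bin and month
values = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
dens = np.array([23.1, 23.2, 24.1, 24.2, 23.15, 24.15])
month = np.array([1, 1, 1, 2, 2, 2])

# factorize each grouping variable separately
dens_codes = pd.cut(dens, [23.0, 24.0, 25.0]).codes      # binned variable
month_codes, month_groups = pd.factorize(month)          # categorical variable

# fuse the two code arrays into one flat group index
group_idx, tuple_groups = pd.factorize(list(zip(dens_codes, month_codes)))

# reduce per flat group (chunk_reduce does this efficiently for dask arrays)
means = np.array([values[group_idx == i].mean() for i in range(len(tuple_groups))])
```

Four (density, month) combinations occur here, so `means` has four entries; unstacking `tuple_groups` back into a MultiIndex recovers the 2D (gamma_n_bins × month) result, as in the `mean_chi` cell above.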

Comparing various \(T_z\) estimates#

First calculate dT/dz using Chameleon after remapping to isopycnal bins.

Find mean T, mean z on the isopycnal surfaces. Then do dT/dz.
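The same remap-then-difference step can be sketched with numpy alone (toy linear profiles, not EQUIX data): interpolate θ and depth onto target density surfaces, then take the ratio of differences between adjacent surfaces.

```python
import numpy as np

# toy profiles: density increases downward, temperature decreases downward
depth = np.arange(5.0, 200.0, 5.0)        # positive-down [m]
gamma = 23.0 + 0.015 * depth              # neutral density [kg/m³]
theta = 28.0 - 0.05 * depth               # potential temperature [°C]

# remap θ and depth onto target density surfaces
targets = np.array([23.5, 24.0, 24.5, 25.0])
T_on_gamma = np.interp(targets, gamma, theta)
z_on_gamma = np.interp(targets, gamma, depth)

# depth is positive-down, so dT/dz (z positive-up) needs the -1 factor,
# matching the dTdz_m_iso cell below
dTdz_iso = -np.diff(T_on_gamma) / np.diff(z_on_gamma)
```

For these linear profiles the isopycnal-space gradient recovers the depth-space value of 0.05 °C/m exactly, which is the sanity check the Chameleon/climatology comparison figure relies on.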

import xgcm

chamgrid = xgcm.Grid(chameleon, periodic=False, coords={"Z": {"center": "depth"}})
isoT = chamgrid.transform(
    chameleon.theta,
    axis="Z",
    target_data=chameleon.gamma_n,
    target=bins,
    method="linear",
)
zT = chamgrid.transform(
    chameleon.depth.broadcast_like(chameleon.gamma_n),
    axis="Z",
    target_data=chameleon.gamma_n,
    target=bins,
    method="linear",
)

cham_dens["dTdz_m_iso"] = (
    "gamma_n",
    (-1 * isoT.mean("time").diff("gamma_n") / zT.mean("time").diff("gamma_n")).data,
)
cham_dens.dTdz_m_iso.cf.plot(y="pres")
/home/deepak/miniconda3/envs/dcpy/lib/python3.8/site-packages/numba/np/ufunc/gufunc.py:151: RuntimeWarning: invalid value encountered in _interp_1d_linear
  return self.ufunc(*args, **kwargs)
[<matplotlib.lines.Line2D at 0x7f0c08438f70>]

This figure compares estimates of the gradient in depth space and isopycnal space, from Chameleon and various mean fields. In summary, the Polzin & Ferrari fitting procedure looks like it underestimates the mean gradient, while all the other estimates are basically the same. That said, even though the fit yields a lower value, it is not that low compared to the other estimates.

Most interestingly, the Chameleon two-week estimate is basically the same as the Argo and ECCO climatological estimates. This means that we can’t partition the dissipated variance :/

f, ax = plt.subplots(1, 2, sharex=True, constrained_layout=True)

y = "gamma_n"
plt.sca(ax[0])
(cham_dens.Tz).plot(y=y, label="cham mean Tz")
(cham_dens.dTdz_m).plot(y=y, label="cham fit Tz")
cham_dens.dTdz_m_iso.cf.plot(y=y, label="cham iso Tz")
eccoT.dTdz.plot.step(y=y, label="ecco iso Tz")
argoT.dTdz.plot.step(y=y, label="argo iso Tz")
ax[0].legend()

y = "pres"
plt.sca(ax[1])

chameleon.theta.mean("time").cf.differentiate("Z", follow_positive=True).cf.plot(
    label="cham diff"
)
ecco.Tmean.interp(lat=0).cf.differentiate("Z", follow_positive=True).cf.plot(
    label="ecco diff"
)
argo.Tmean.interp(lat=0).cf.differentiate("Z", follow_positive=True).cf.plot(
    label="argo diff"
)
(cham_dens.Tz).plot(y=y, label="cham mean Tz")
(cham_dens.dTdz_m).plot(y=y, label="cham fit Tz")
cham_dens.dTdz_m_iso.cf.plot(y=y, label="cham iso Tz")
eccoT.dTdz.plot.step(y=y, label="ecco iso Tz")
argoT.dTdz.plot.step(y=y, label="argo iso Tz")
ax[1].legend()
<matplotlib.legend.Legend at 0x7f0bb30fd370>

Cole et al (2015) estimate#

Let’s see what they used for variance and gradients at the same σ levels.

cole = ed.read_cole()
cole
<xarray.Dataset>
Dimensions:                (lat: 131, lon: 360, pres: 101, scalar: 1, sigma: 96)
Coordinates:
  * lat                    (lat) float64 -65.0 -64.0 -63.0 ... 63.0 64.0 65.0
  * lon                    (lon) float64 1.5 2.5 3.5 4.5 ... 358.5 359.5 360.5
  * pres                   (pres) float64 0.0 20.0 40.0 ... 1.98e+03 2e+03
  * sigma                  (sigma) float64 20.0 20.1 20.2 ... 29.3 29.4 29.5
    start_year             (scalar) float64 2.005e+03
    end_year               (scalar) float64 2.015e+03
    mixing_efficiency      (scalar) float64 0.16
    minimum_points         (scalar) float64 15.0
    maximum_mixing_length  (scalar) float64 6e+05
Dimensions without coordinates: scalar
Data variables: (12/16)
    mixing_length_sig      (lat, lon, sigma) float64 ...
    salinity_mean_sig      (lat, lon, sigma) float64 ...
    salinity_std_sig       (lat, lon, sigma) float64 ...
    salinity_gradient_sig  (lat, lon, sigma) float64 ...
    depth_mean_sig         (lat, lon, sigma) float64 ...
    number_points_sig      (lat, lon, sigma) float64 ...
    ...                     ...
    density_mean_depth     (lat, lon, pres) float64 ...
    number_points          (lat, lon, pres) float64 ...
    velocity_std           (lat, lon, pres) float64 ...
    diffusivity            (lat, lon, pres) float64 nan nan nan ... nan nan nan
    process_date           (scalar) timedelta64[ns] 8 days 12:43:27.466400463
    diffusivity_first      (lat, lon) float64 nan nan nan nan ... nan nan nan
Attributes:
    title:    This dataset contains calculations related to the estimation of...
def plot_pac(da):
    sigma_levels = [24.0, 25.0]
    fg = (
        da.sel(lat=slice(-10, 10), lon=slice(110, 360 - 70))
        .sel(sigma=sigma_levels, method="nearest")
        .plot(
            robust=True,
            # cbar_kwargs={"orientation": "horizontal"},
            col="sigma",
            col_wrap=2,
            aspect=1.4,
        )
    )
    for ax in fg.axes.flat:
        ax.plot(220, 0, marker="o", color="w")


plot_pac(cole.salinity_gradient_sig)
plot_pac(cole.salinity_std_sig)
plot_pac(cole.mixing_length_sig)

Profiles at (0, 140W)#

These look OK.

  1. The rms salinity anomaly \(\sqrt{\overline{S'S'}}\) is an order of magnitude larger between 40 m and 150 m than between 250 m and 400 m.

  2. That matches a larger |∇S| in the same depth range.

Next step: is the longitudinal gradient bigger than the meridional gradient?

import eddydiff as ed

ed.plot.plot_cole_profile(lat=0, lon=[221, 220, 219])
plt.gca().set_ylim(400, 0)
(400.0, 0.0)

Todo#

  • [ ] Seasonal cycle in iso Tz from ecco, argo?

  • [ ] look for stirring signal in Argo?
