{
  "cells": [
    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "\n# Writing Simple Datasets\n\nWrite 1D, 2D, and 3D arrays to PSI-style HDF5 or HDF4 files.\n\nThis example demonstrates the basics of writing PSI-style HDF files using\n:func:`~psi_io.psi_io.write_hdf_data` for generic n-dimensional datasets, and\n:func:`~psi_io.psi_io.wrhdf_1d`, :func:`~psi_io.psi_io.wrhdf_2d`,\n:func:`~psi_io.psi_io.wrhdf_3d` for dimensionality-specific writing.\n"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "collapsed": false
      },
      "outputs": [],
      "source": [
        "import tempfile\nfrom pathlib import Path\nimport numpy as np\nfrom psi_io import write_hdf_data, wrhdf_1d, wrhdf_2d, wrhdf_3d, read_hdf_data"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "Construct a simple 3D dataset representing a hypothetical scalar field on a\nrectilinear (r, \u03b8, \u03c6) grid. Following PSI's Fortran ordering convention, the\ndata array is C-ordered as (n\u03c6, n\u03b8, nr) while the scales are supplied in\nphysical dimension order (r, \u03b8, \u03c6):\n\n"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "collapsed": false
      },
      "outputs": [],
      "source": [
        "nr, nt, np_ = 10, 20, 30\nr = np.linspace(1.0, 5.0, nr, dtype=np.float32)\nt = np.linspace(0.0, np.pi, nt, dtype=np.float32)\np = np.linspace(0.0, 2*np.pi, np_, dtype=np.float32)\nf3d = np.random.default_rng(0).random((np_, nt, nr)).astype(np.float32)"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "Write the dataset to an HDF5 file with :func:`~psi_io.psi_io.write_hdf_data`.\nScales are passed as positional arguments in physical dimension order (r, \u03b8, \u03c6):\n\n"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "collapsed": false
      },
      "outputs": [],
      "source": [
        "with tempfile.TemporaryDirectory() as tmpdir:\n    out_h5 = Path(tmpdir) / \"output.h5\"\n    write_hdf_data(out_h5, f3d, r, t, p)\n\n    f_back, r_back, t_back, p_back = read_hdf_data(out_h5)\n    print(f\"Data shape (read back) : {f_back.shape}\")\n    print(f\"r scale : shape={r_back.shape}, range=[{r_back[0]:.2f}, {r_back[-1]:.2f}]\")\n    print(f\"t scale : shape={t_back.shape}, range=[{t_back[0]:.4f}, {t_back[-1]:.4f}]\")\n    print(f\"p scale : shape={p_back.shape}, range=[{p_back[0]:.4f}, {p_back[-1]:.4f}]\")\n    print(f\"Round-trip exact match : {np.array_equal(f3d, f_back)}\")"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "Writing to an HDF4 file uses the identical call \u2013 only the file extension changes.\nDispatch to the correct backend is handled automatically based on the extension:\n\n"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "collapsed": false
      },
      "outputs": [],
      "source": [
        "with tempfile.TemporaryDirectory() as tmpdir:\n    out_hdf = Path(tmpdir) / \"output.hdf\"\n    write_hdf_data(out_hdf, f3d, r, t, p)\n\n    f_back_h4 = read_hdf_data(out_hdf, return_scales=False)\n    print(f\"HDF4 data shape (read back) : {f_back_h4.shape}\")\n    print(f\"Round-trip exact match      : {np.array_equal(f3d, f_back_h4)}\")"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "For 1D and 2D datasets the same conventions apply \u2013 the data array comes\nfirst, followed by its scale(s) in physical dimension order:\n\n"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "collapsed": false
      },
      "outputs": [],
      "source": [
        "x = np.linspace(0.0, 2*np.pi, 64, dtype=np.float32)\nf1d = np.sin(x)\n\ny = np.linspace(0.0, np.pi, 32, dtype=np.float32)\n# Per the Fortran ordering convention above, the 2D data array is C-ordered (ny, nx)\nf2d = np.outer(np.cos(y), np.sin(x)).astype(np.float32)\n\nwith tempfile.TemporaryDirectory() as tmpdir:\n    write_hdf_data(Path(tmpdir) / \"out1d.h5\", f1d, x)\n    write_hdf_data(Path(tmpdir) / \"out2d.h5\", f2d, x, y)\n\n    r1d = read_hdf_data(Path(tmpdir) / \"out1d.h5\")\n    r2d = read_hdf_data(Path(tmpdir) / \"out2d.h5\")\n    print(f\"1D : data={r1d[0].shape}, scale={r1d[1].shape}\")\n    print(f\"2D : data={r2d[0].shape}, x={r2d[1].shape}, y={r2d[2].shape}\")"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "The legacy dimension-specific writers (:func:`~psi_io.psi_io.wrhdf_1d`,\n:func:`~psi_io.psi_io.wrhdf_2d`, :func:`~psi_io.psi_io.wrhdf_3d`) provide a\nbackward-compatible interface for existing PSI codes.\n\n.. attention::\n   The argument order of these writers differs from :func:`~psi_io.psi_io.write_hdf_data`:\n   scales are interleaved between the filename and the data array\n   (*filename*, *x*, [*y*, [*z*,]] *f*). They also default to ``sync_dtype=True``,\n   which silently casts scale arrays to match the data dtype, and always write to\n   PSI's standard dataset identifiers (``'Data-Set-2'`` for HDF4, ``'Data'`` for HDF5).\n\n"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "collapsed": false
      },
      "outputs": [],
      "source": [
        "with tempfile.TemporaryDirectory() as tmpdir:\n    wrhdf_1d(Path(tmpdir) / \"legacy1d.h5\", x, f1d)\n    wrhdf_2d(Path(tmpdir) / \"legacy2d.h5\", x, y, f2d)\n    wrhdf_3d(Path(tmpdir) / \"legacy3d.h5\", r, t, p, f3d)\n\n    rl3d = read_hdf_data(Path(tmpdir) / \"legacy3d.h5\")\n    print(f\"Legacy 3D : data={rl3d[0].shape}, r={rl3d[1].shape}, t={rl3d[2].shape}, p={rl3d[3].shape}\")\n    print(f\"Round-trip exact match : {np.array_equal(f3d, rl3d[0])}\")"
      ]
    }
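,
    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "To illustrate the ``sync_dtype=True`` default noted above: when the scale and data\ndtypes differ, the legacy writers cast the scale arrays to match the data dtype\nbefore writing. The following is a minimal sketch of that behavior, reusing the\narrays defined earlier \u2013 a float64 scale paired with float32 data should come\nback as float32:\n\n"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "collapsed": false
      },
      "outputs": [],
      "source": [
        "with tempfile.TemporaryDirectory() as tmpdir:\n    # Supply a float64 scale with float32 data; sync_dtype=True casts the scale on write.\n    wrhdf_1d(Path(tmpdir) / \"sync.h5\", x.astype(np.float64), f1d)\n\n    f_s, x_s = read_hdf_data(Path(tmpdir) / \"sync.h5\")\n    print(f\"Data dtype  : {f_s.dtype}\")\n    print(f\"Scale dtype : {x_s.dtype}\")"
      ]
    }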
  ],
  "metadata": {
    "kernelspec": {
      "display_name": "Python 3",
      "language": "python",
      "name": "python3"
    },
    "language_info": {
      "codemirror_mode": {
        "name": "ipython",
        "version": 3
      },
      "file_extension": ".py",
      "mimetype": "text/x-python",
      "name": "python",
      "nbconvert_exporter": "python",
      "pygments_lexer": "ipython3",
      "version": "3.13.9"
    }
  },
  "nbformat": 4,
  "nbformat_minor": 0
}