Module 7 Wavelet Packet - igheyas/WaveletTransformation GitHub Wiki
📋 Week 7: Wavelet Packets & Lifting Scheme

📝 Outline
- Goals
- Wavelet Packet Decomposition
- Best-Basis Selection
- Lifting Scheme: Split–Predict–Update
- Filter Bank Factorization
- Example: 1D Wavelet Packet in Python
- Exercises
🎯 Goals
By the end of this week you will:
Extend the DWT to a full binary tree of subbands (wavelet packets)
Learn how to pick an optimal basis from that tree via cost functions
Understand the lifting scheme for in-place, integer-to-integer, customizable wavelet transforms
See how any FIR filter bank can be factored into lifting steps
🧠 What DWT does first: In DWT, each step splits the signal into:
Low-frequency part (approximation)
High-frequency part (detail)
But only the low-frequency part gets split again at the next level, so the tree grows down one side only.
✅ Wavelet packets, by contrast, also split the high-frequency parts!
That means:
Both low and high parts are decomposed at every level
The result is a full binary tree
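This contrast is easy to verify with PyWavelets (assuming it is installed): a 3-level DWT returns level + 1 subbands, while a 3-level wavelet packet tree has 2³ leaves.

```python
import numpy as np
import pywt

x = np.random.randn(64)

# Plain DWT: only the approximation is split again -> level + 1 subbands
coeffs = pywt.wavedec(x, 'haar', level=3)        # [cA3, cD3, cD2, cD1]

# Wavelet packets: every node is split again -> a full binary tree
wp = pywt.WaveletPacket(x, 'haar', maxlevel=3)
leaves = wp.get_level(3)                         # all nodes at depth 3

print(len(coeffs))   # 4
print(len(leaves))   # 8
```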
🎯 Why do this? Wavelet packets let you:
Analyze both slow and fast changes in more detail
Better match the signal’s structure (like audio or EEG)
Choose the best path for compression or classification
Here's how it works: at every level, both the low and the high signals are passed through:
A low-pass filter → gives the new low part
A high-pass filter → gives the new high part
So each part splits into 2 new parts:
📦 Low → splits into: Low–Low (LL), Low–High (LH)
📦 High → splits into: High–Low (HL), High–High (HH)
✅ So each split produces two parts.
One level deeper, each of those splits into 2 again. Since each level doubles the number of parts, level 3 gives 2³ = 8 subbands. 😊
So what happens next? Take the HL part as an example. It passes through:
a low-pass filter → becomes HLL
a high-pass filter → becomes HLH
The same happens to LL, LH, and HH, so level 3 ends with eight subbands.
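For orientation, PyWavelets labels packet nodes by their path of 'a' (low-pass) and 'd' (high-pass) steps, so the HL part above corresponds to the path 'da', and HLL/HLH to 'daa'/'dad'. A small sketch, assuming PyWavelets is available:

```python
import numpy as np
import pywt

wp = pywt.WaveletPacket(np.arange(16, dtype=float), 'haar', maxlevel=3)

node = wp['da']                       # HL: high-pass first, then low-pass
children = [node['a'].path, node['d'].path]
print(children)                       # ['daa', 'dad']
```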
🛠️ How do we divide a signal into low and high parts? We use filters:
Low-pass filter: Lets slow changes pass through → gives the approximation
High-pass filter: Keeps fast changes (edges, detail) → gives the detail
These are like smart sieves that split the signal into:
🐢 Smooth part (low)
🐇 Jumpy part (high)
```python
low  = downsample(convolve(x, low_pass_filter))
high = downsample(convolve(x, high_pass_filter))
```
This gives:
- A shorter version of the smooth signal (low)
- A shorter version of the details (high)

And we repeat the process on each part!
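The two pseudocode lines above can be made concrete with the Haar filters. A minimal sketch; `analysis_step` is a hypothetical helper, not part of any library:

```python
import numpy as np

def analysis_step(x, lo, hi):
    """Convolve with each filter, then keep every second sample."""
    low = np.convolve(x, lo)[1::2]    # approximation (smooth part)
    high = np.convolve(x, hi)[1::2]   # detail (jumpy part)
    return low, high

s = 1 / np.sqrt(2)
lo = np.array([s, s])    # Haar low-pass: scaled averaging
hi = np.array([s, -s])   # Haar high-pass: scaled differencing

x = np.array([2.0, 4.0, 6.0, 8.0])
low, high = analysis_step(x, lo, hi)
print(low)    # scaled pairwise sums: [4.2426..., 9.8994...]
print(high)   # scaled pairwise differences: [1.4142..., 1.4142...]
```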
Let’s explain the Best-Basis Selection in wavelet packets in simple, clear steps. 🌳✨
🎯 What’s the problem? In a wavelet packet tree, we get lots of nodes (lots of time-frequency components).
But:
Not all nodes are equally useful! Some contain very little signal, others are noisy or redundant.
💡 What’s the goal? Choose the best set of nodes (a “basis”) to represent the signal efficiently.
We want the fewest pieces with the most important information.
🧠 What does the algorithm do? This is called the Coifman–Wickerhauser Best-Basis Algorithm. Here's how it works, step by step:
1. Decompose the signal into a full wavelet packet tree.
2. Compute a cost (for example, entropy) for every node.
3. Working up from the leaves, compare each parent's cost with the summed cost of its two children.
4. Keep the parent if it is cheaper; otherwise keep the children's best bases.
5. The surviving nodes form the best basis.
🌳 What do we get? A pruned wavelet packet tree — a best-basis — that:
Minimizes some cost (like entropy or energy)
Keeps only the most informative nodes
Gives you a highly efficient and adaptive signal representation
🧾 In plain words: The algorithm automatically picks the best “blocks” of the wavelet tree to represent the signal — using fewer pieces but keeping the important stuff.
Let's go step by step through some common cost functions used in best-basis selection, explaining every symbol along the way. 🧠✨
🎯 Goal of a cost function: We want to assign a number (cost) to each node — lower cost means "better" for keeping in the final wavelet packet basis.
📏 Common Cost Functions (all additive over the coefficients cᵢ of a node, so children's costs can simply be summed):
- Shannon-type entropy: −Σᵢ cᵢ² log cᵢ² (small when the energy is concentrated in a few coefficients)
- Threshold count: the number of coefficients with |cᵢ| above a threshold T (used in the example below)
- ℓᵖ cost: Σᵢ |cᵢ|ᵖ with 0 < p < 2 (smaller p rewards sparsity)
- Log-energy: Σᵢ log cᵢ²
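The entropy and threshold costs fit in a few lines. A minimal sketch; the function names and the 0·log 0 := 0 convention are illustrative, not from this page:

```python
import numpy as np

def shannon_entropy_cost(coefs):
    """Coifman-Wickerhauser entropy: -sum(c_i^2 * log(c_i^2)), 0*log 0 := 0."""
    c2 = coefs**2
    c2 = c2[c2 > 0]                       # drop zeros so log() is defined
    return float(-np.sum(c2 * np.log(c2)))

def threshold_cost(coefs, thresh=0.05):
    """Number of coefficients whose magnitude exceeds thresh."""
    return int(np.count_nonzero(np.abs(coefs) > thresh))

concentrated = np.array([1.0, 0.0, 0.0, 0.0])   # all energy in one coefficient
spread = np.full(4, 0.5)                        # same energy, spread out

print(shannon_entropy_cost(concentrated))  # 0.0 -> cheap, keep this node
print(shannon_entropy_cost(spread))        # log(4) ~ 1.386 -> expensive
print(threshold_cost(concentrated))        # 1
```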
🔧 What is the Lifting Scheme? The lifting scheme is a simple, efficient way to build wavelet transforms — without complex filters.
It breaks the transform into 3 small steps: 👉 Split – Predict – Update
And it helps you do:
Faster computation
Less memory use
Custom wavelets (like CDF 5/3 for JPEG-2000)
🎯 Why is this powerful? ✅ It's fast — just adds, subtracts, multiplies ✅ It works in-place (less memory) ✅ It's invertible — you can go back exactly ✅ It’s the foundation for JPEG-2000 lossless compression
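The Split–Predict–Update recipe can be sketched with the simplest case, an (unnormalized) Haar lifting step; the function names are illustrative. Every step is exactly invertible by running it backwards with the signs flipped:

```python
import numpy as np

def haar_lift_forward(x):
    even, odd = x[0::2].copy(), x[1::2].copy()   # Split into even/odd samples
    detail = odd - even                          # Predict: odd ~ even; keep residual
    approx = even + detail / 2                   # Update: preserve the local mean
    return approx, detail

def haar_lift_inverse(approx, detail):
    even = approx - detail / 2                   # undo Update
    odd = detail + even                          # undo Predict
    x = np.empty(even.size + odd.size)
    x[0::2], x[1::2] = even, odd                 # Merge back
    return x

x = np.array([2.0, 4.0, 6.0, 8.0])
a, d = haar_lift_forward(x)
print(a, d)                                      # [3. 7.] [2. 2.]
print(np.allclose(haar_lift_inverse(a, d), x))   # True: exact inversion
```

Note that nothing beyond adds, subtracts, and a halving is needed, which is why lifting also supports in-place and integer-to-integer transforms (as in JPEG-2000's CDF 5/3).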
Wavelet Packet Coefficients
```python
# wavelet_packet_threshold_best_basis_with_tree.py
import numpy as np
import pywt
import matplotlib.pyplot as plt

# 1) Generate a toy non-stationary signal: two sine bursts + noise
fs = 512
T = 1.0
t = np.linspace(0, T, int(fs*T), endpoint=False)
x = np.sin(2*np.pi*5 * t) * (t < 0.5) \
  + np.sin(2*np.pi*20 * t) * (t >= 0.5)
x += 0.1 * np.random.randn(len(t))  # small noise

# 2) Define a threshold cost: count of coefficients above T_thresh
T_thresh = 0.05

def threshold_cost(coefs):
    """Cost = number of 'significant' coeffs above threshold T_thresh."""
    return np.count_nonzero(np.abs(coefs) > T_thresh)

# 3) Recursive best-basis selector using threshold_cost
def best_basis(node, cost_func):
    if node.level < node.maxlevel:
        node.decompose()
        L_basis, L_cost = best_basis(node['a'], cost_func)
        R_basis, R_cost = best_basis(node['d'], cost_func)
        cost_children = L_cost + R_cost
    else:
        L_basis = []; R_basis = []
        cost_children = np.inf
    cost_node = cost_func(node.data)
    if cost_node < cost_children:  # keep this node; children win a tie
        return [node], cost_node
    else:
        return L_basis + R_basis, cost_children

# 4) Build the wavelet-packet tree to level 3
wp = pywt.WaveletPacket(
    data=x,
    wavelet='db2',
    mode='periodization',
    maxlevel=3
)

# 5) Print the full tree of coefficients in an ASCII hierarchy
def print_wp_ascii(node, prefix='', is_last=True):
    """
    Recursively prints a WaveletPacket subtree with ASCII branches.
    - node: a WaveletPacket node
    - prefix: accumulated leading characters
    - is_last: True if this node is the last child of its parent
    """
    name = node.path or 'root'
    # choose branch or end-branch connector
    branch = '└── ' if is_last else '├── '
    print(prefix + branch + f"{name}: {np.round(node.data, 3)}")
    # if we can go deeper, decompose and recurse on 'a' and 'd'
    if node.level < node.maxlevel:
        node.decompose()
        children = [node['a'], node['d']]
        for i, child in enumerate(children):
            # extend the prefix with a vertical bar or blanks, as needed
            extension = '    ' if is_last else '│   '
            print_wp_ascii(child,
                           prefix + extension,
                           is_last=(i == len(children) - 1))

print("Wavelet Packet Tree (coefficients at each node):")
print_wp_ascii(wp)

# 6) Select best basis with threshold_cost
basis_nodes, _ = best_basis(wp, threshold_cost)
paths = [n.path for n in basis_nodes]
print("\nBest-basis packet paths:", paths)

# 7) Print coefficients of best-basis nodes
print("\nBest-basis coefficients:")
for node in basis_nodes:
    print(f"  Path {node.path:3s} → {node.data}")

# 8) Reconstruct signal from best-basis packets
new_wp = pywt.WaveletPacket(
    data=None,
    wavelet='db2',
    mode='periodization',
    maxlevel=3
)
for node in basis_nodes:
    new_wp[node.path] = wp[node.path].data
x_rec = new_wp.reconstruct(update=False)

# 9) Measure reconstruction performance
mse = np.mean((x - x_rec)**2)
snr = 10 * np.log10(np.sum(x**2) / np.sum((x - x_rec)**2))
print(f"\nReconstruction MSE: {mse:.2e}")
print(f"Reconstruction SNR: {snr:.2f} dB")

# 10) Plot original vs reconstructed
plt.figure(figsize=(10, 4))
plt.plot(t, x, label='Original', alpha=0.7)
plt.plot(t, x_rec, label='Reconstructed', alpha=0.7)
plt.xlabel('Time [s]')
plt.ylabel('Amplitude')
plt.title(f"Best-Basis Reconstruction (threshold={T_thresh})")
plt.legend()
plt.tight_layout()
plt.show()
```
Output (the long coefficient arrays are truncated with "..."):

```
Wavelet Packet Tree (coefficients at each node):
└── root: [-0.04   0.092  0.218 ... -0.616 -0.52  -0.213]
    ├── a: [-0.143  0.271  0.208 ... -1.361 -1.085 -0.775]
    │   ├── aa: [-0.46   0.28   0.72  ...  0.141 -1.646 -1.72 ]
    │   │   ├── aaa: [-1.246  0.798  2.086 ...  2.528 -1.635]
    │   │   └── aad: [ 0.213  0.029 -0.045 ...  0.709  0.301 -0.866]
    │   └── ad: [ 0.258 -0.174  0.062 ... -0.035 -0.217 -0.16 ]
    │       ├── ada: [ 0.092 -0.049 -0.11  ...  0.085  0.035 -0.268]
    │       └── add: [-0.213 -0.055  0.173 ...  0.16   0.072 -0.205]
    └── d: [ 0.008  0.103 -0.123 ...  0.179  0.007  0.037]
        ├── da: [ 0.064 -0.053 -0.024 ...  0.111  0.118  0.1  ]
        │   ├── daa: [ 0.093 -0.048  0.005 ...  0.17   0.081  0.166]
        │   └── dad: [-0.06   0.004  0.189 ... -0.077  0.017  0.012]
        └── dd: [ 0.139  0.015 -0.238 ... -0.093  0.125  0.002]
            ├── dda: [ 0.151 -0.212  0.081 ... -0.011  0.042]
            └── ddd: [ 0.096 -0.024  0.069 ...  0.08  -0.145 -0.081]

Best-basis packet paths: ['aaa', 'aad', 'ada', 'add', 'daa', 'dad', 'dda', 'ddd']

Best-basis coefficients:
  Path aaa → [-1.24625142  0.79789861 ...  2.52791286 -1.63479516]
  Path aad → [ 0.21253122  0.02886883 ...  0.30086208 -0.86577242]
  Path ada → [ 0.09179559 -0.0494069  ...  0.03528194 -0.26763188]
  Path add → [-0.213049   -0.05499506 ...  0.07200834 -0.20521617]
  Path daa → [ 0.09314007 -0.04794966 ...  0.08116499  0.16575626]
  Path dad → [-5.97025434e-02  3.69687407e-03 ...  1.65254930e-02  1.18008258e-02]
  Path dda → [ 0.15145003 -0.21190944 ... -0.01083679  0.04224708]
  Path ddd → [ 0.0958979  -0.02436756 ... -0.14513661 -0.08095537]

Reconstruction MSE: 5.90e-32
Reconstruction SNR: 309.42 dB
```

Because the threshold cost kept all eight level-3 leaves here, the reconstruction is a complete inverse transform, so the error is at machine precision.