4.3 Data Normalization & Feature Engineering
Prior to AI inference, all incoming data passes through a three-stage normalization and feature-engineering pipeline (illustrative sketches of each stage follow the list):
Zero-Center Scaling: Standardizes numeric telemetry to zero mean and unit variance for numerical stability in gradient-based learning systems (first sketch below).
Temporal Windowing: Segments time-series data into overlapping sliding windows to capture temporal correlations (second sketch below).
Derived Feature Construction: Generates secondary metrics (e.g., moving averages, variance trends, spectral entropy) from each window to enhance anomaly-detection sensitivity (third sketch below).
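The zero-center scaling stage corresponds to standard z-score normalization. A minimal sketch, assuming per-channel statistics over a column-oriented NumPy array; the function name, epsilon guard, and per-channel choice are illustrative assumptions rather than part of the documented pipeline:

```python
import numpy as np

def zero_center_scale(telemetry: np.ndarray, eps: float = 1e-8) -> np.ndarray:
    """Rescale a (timesteps, channels) telemetry array to zero mean and unit
    variance per channel; `eps` avoids division by zero on constant channels."""
    mean = telemetry.mean(axis=0)
    std = telemetry.std(axis=0)
    return (telemetry - mean) / (std + eps)
```

In a deployed pipeline the mean and standard deviation would typically be fitted on historical or training data and reused at inference time, rather than recomputed on each incoming batch.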
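Overlapping sliding windows can be produced with a simple stride-based segmentation. A minimal sketch, assuming a (timesteps, channels) array and hypothetical `window` and `stride` parameters; the document does not specify window length or overlap:

```python
import numpy as np

def sliding_windows(series: np.ndarray, window: int, stride: int) -> np.ndarray:
    """Segment a (timesteps, channels) series into overlapping windows.

    Returns an array of shape (num_windows, window, channels); windows overlap
    whenever stride < window. Trailing samples that do not fill a complete
    window are dropped.
    """
    if len(series) < window:
        raise ValueError("series is shorter than one window")
    num_windows = (len(series) - window) // stride + 1
    return np.stack([series[i * stride : i * stride + window]
                     for i in range(num_windows)])
```

For example, window=128 with stride=64 gives 50% overlap between consecutive windows; these values are chosen purely for illustration.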
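The derived metrics named above can be computed per window. A minimal sketch for a single 1-D window, using Welch's method for the power spectral density; the sampling frequency, the half-window split used as a variance-trend proxy, and the feature names are assumptions made for illustration:

```python
import numpy as np
from scipy.signal import welch
from scipy.stats import entropy

def derived_features(window: np.ndarray, fs: float = 1.0) -> dict:
    """Compute secondary metrics for one 1-D telemetry window."""
    # Moving average: here the plain window mean; a rolling mean over
    # sub-windows is an equally plausible reading of the document.
    moving_avg = float(np.mean(window))

    # Variance trend: crude proxy comparing the variance of the second
    # half of the window against the first half.
    half = len(window) // 2
    variance_trend = float(np.var(window[half:]) - np.var(window[:half]))

    # Spectral entropy: Shannon entropy of the normalized power spectral
    # density estimated with Welch's method.
    _, psd = welch(window, fs=fs, nperseg=min(len(window), 64))
    psd_norm = psd / (psd.sum() + 1e-12)
    spectral_entropy = float(entropy(psd_norm))

    return {
        "moving_avg": moving_avg,
        "variance_trend": variance_trend,
        "spectral_entropy": spectral_entropy,
    }
```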