Informational Inertia
Quantifying structure within data beyond conventional limits.
Pitch

Informational Inertia provides a novel approach to measuring the irreducible structure in data, going beyond traditional notions of entropy. Using minimal estimators, the project analyzes the core components of data signals, revealing their inherent complexity and resistance to simplification.

Description

Informational Inertia (I) provides a robust framework for quantifying the irreducible structure within data, extending beyond traditional measures of entropy and compression. At its core, Informational Inertia seeks to address the question of how resistant a signal remains to simplification after all apparent structure has been removed. This approach is distinct from notions of meaning or intelligence, focusing solely on the quantitative aspects of data.

Key Features

  • Quantitative Measures: The analysis yields three primary outputs (see the estimator sketch after this list):

    • Entropy: A histogram-based entropy estimate, used as a reference proxy for the analysis.
    • I_bar: The normalized irreducible variance fraction, i.e., the variance that remains after models and trends are removed.
    • I_comp: A proxy for compression resistance, normalized against entropy.
  • Composite Estimates: A composite measure is formed as

    I = Σᵢ wᵢ · Z(Iᵢ)
    

    where Z(Iᵢ) are the normalized estimators and wᵢ are precision-weighted coefficients. This weighting mitigates individual estimator bias and improves uncertainty propagation; a sketch of the composite step also follows this list.
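
As a rough illustration, here is a minimal sketch of how such estimators could look, assuming a histogram entropy, a linear detrender for I_bar, and a zlib-based compression proxy for I_comp. All three implementation choices are assumptions for illustration, not the project's actual code:

import zlib
import numpy as np

def entropy_proxy(x, bins=64):
    # Histogram-based Shannon entropy in bits (the reference proxy).
    counts, _ = np.histogram(x, bins=bins)
    p = counts[counts > 0] / counts.sum()
    return float(-(p * np.log2(p)).sum())

def i_bar(x):
    # Fraction of variance left after removing a fitted model;
    # a linear trend is used here purely for illustration.
    t = np.arange(len(x))
    trend = np.polyval(np.polyfit(t, x, 1), t)
    return float((x - trend).var() / x.var())

def i_comp(x, bins=64):
    # Compression-resistance proxy: compressed-to-raw size ratio of a
    # quantized signal, normalized by the entropy proxy (zlib assumed).
    q = np.digitize(x, np.histogram_bin_edges(x, bins=bins)).astype(np.uint8)
    raw = q.tobytes()
    ratio = len(zlib.compress(raw)) / len(raw)
    h = entropy_proxy(x, bins) / np.log2(bins)  # entropy scaled to [0, 1]
    return float(ratio / max(h, 1e-12))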
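
And a sketch of the composite step itself. The reference means and standard deviations used for Z-normalization (e.g., drawn from surrogate data) and the per-estimator variances are assumed inputs here:

import numpy as np

def composite_inertia(I, mu, sigma, var):
    # I: raw estimator values I_i; mu, sigma: reference statistics used
    # to form Z(I_i); var: per-estimator uncertainties for the weights.
    I, mu, sigma, var = (np.asarray(a, dtype=float) for a in (I, mu, sigma, var))
    Z = (I - mu) / sigma                   # normalized estimators Z(I_i)
    w = (1.0 / var) / np.sum(1.0 / var)    # precision weights w_i (sum to 1)
    return float(np.sum(w * Z))            # I = Σᵢ wᵢ · Z(Iᵢ)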

Application Overview

To try Informational Inertia quickly, analyze a sample signal from Python as shown:

import numpy as np
from inertia.analyze import analyze

# Generate a pure sinusoid and write it to CSV.
x = np.sin(np.linspace(0, 10 * np.pi, 2000))
np.savetxt("signal.csv", x, delimiter=",")

# Run the analysis and report the three measures.
r = analyze("signal.csv")

print("Entropy:", r.entropy)
print("I_bar:", r.I_bar)
print("I_comp:", r.I_comp)

# Visualize the results.
r.plot()

Observations from Testing

Informational Inertia has been tested on a range of signal structures; the snippet after this list reproduces these qualitative patterns:

  • Structured Signals: Yield lower entropy and I_bar.
  • Random Noise: Typically yields higher entropy and I_bar.
  • Linear Trends: Show very low I_bar, indicating removable structure.
  • Combinations of Trends and Structure: Produce intermediate I_bar values.
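
A quick way to reproduce these patterns with the quick-start API (the signal constructions below are illustrative, and exact values will vary with the estimators):

import numpy as np
from inertia.analyze import analyze

t = np.linspace(0, 10 * np.pi, 2000)
signals = {
    "structured": np.sin(t),                                  # periodic structure
    "noise": np.random.default_rng(0).standard_normal(2000),  # random noise
    "trend": np.linspace(0, 1, 2000),                         # removable trend
    "trend_plus_structure": np.linspace(0, 1, 2000) + 0.3 * np.sin(t),
}

for name, x in signals.items():
    np.savetxt(f"{name}.csv", x, delimiter=",")
    r = analyze(f"{name}.csv")
    print(f"{name}: entropy={r.entropy:.3f} I_bar={r.I_bar:.3f} I_comp={r.I_comp:.3f}")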

Limitations

It is important to note that Informational Inertia is not a universal detector of structure. Some limitations include:

  • Purely random noise may still show nonzero I_comp due to compressor artifacts.
  • Trivial nonlinear signals can falsely appear irreducible under linear detrending (illustrated below).
  • Scores are sensitive to representation choices such as scaling and encoding.
  • Adversarial examples might inflate scores without reflecting genuine structure.
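
The linear-detrending caveat is easy to see directly: a plain quadratic is trivially structured, yet a linear fit barely reduces its variance, so it would register as almost fully irreducible. A self-contained illustration, independent of the package:

import numpy as np

t = np.linspace(-1, 1, 1000)
x = t**2                        # trivial nonlinear structure

# Remove the best linear fit, as a purely linear detrender would.
trend = np.polyval(np.polyfit(t, x, 1), t)
resid = x - trend

# Nearly all of the variance survives (fraction ~1.0), even though a
# quadratic model would remove the structure completely.
print("residual variance fraction:", resid.var() / x.var())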

Informational Inertia is a diagnostic tool, not a philosophical claim: it asserts nothing about semantic meaning or intelligence. Its intended use is to provide measurable insight beyond traditional entropy and compression methods.
