
Artificial Neural Computing

At the Intersection of Sciences


Research

There are billions upon billions of organisms currently living on planet Earth. All of them came into existence through the process known as biological evolution, and all of them are continuously performing information processing or computational tasks. For all practical purposes, these organisms can be regarded as computers. But what exactly is their architecture? Do they follow the classical von Neumann model, or is there something fundamentally different—something we can identify and perhaps use to build better computers? Do they exploit quantum effects or quantum computation, or are such effects irrelevant at the macroscopic scale of individual organisms?


At Artificial Neural Computing, we are developing a new kind of bio-inspired computer by integrating recent advances in physics, biology, and machine learning.

Machine learning

  • Toward a theory of machine learning develops a thermodynamic theory of neural networks by applying the principle of maximum entropy to derive equilibrium properties and learning analogues of thermodynamics laws, showing that optimal architectures maximize the Laplacian of free energy, with deep networks offering enhanced learning efficiency.
  • Emergent scale invariance in neural networks shows that learning systems, such as neural networks, naturally evolve towards a self-organized critical state, suggesting that scale invariance in physical and biological systems may emerge from learning dynamics.
  • Bio-inspired machine learning: programmed death and replication explores computational analogues of biological replication and programmed death in machine learning, proposing algorithms for adding and removing neurons to improve neural network performance and efficiency (see the first sketch after this list).
  • Autonomous particles shows that agents can identify relevant information through intrinsic symmetries, such as Galilean symmetry in autonomous driving, and that only a few relevant invariants are needed for learning, as demonstrated by a simple model of autonomous vehicles (see the invariance check after this list).
  • Dataset-learning duality and emergent criticality describes a duality between non-trainable dataset variables and trainable variables in artificial neural networks, and shows how this duality can be used to study emergent criticality, where the fluctuations of trainable variables develop power-law distributions.
  • Covariant Gradient Descent formulates a covariant gradient descent method that unifies and generalizes popular optimization algorithms by defining optimization dynamics through covariant force and metric tensors, enabling consistent learning across curved parameter spaces (a minimal update-rule sketch follows this list).
  • Geometric Learning Dynamics introduces a unified geometric framework for modeling learning dynamics across physical, biological, and machine learning systems, revealing three fundamental regimes based on the relationship between the metric tensor and noise covariance matrix, with the intermediate regime explaining biological complexity.
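
To make the neuron addition and removal idea concrete, here is a minimal sketch, not the paper's actual algorithm: hidden neurons are scored by the product of their incoming and outgoing weight norms (an assumed utility measure), the weakest are removed ("programmed death"), and the strongest are duplicated with small perturbations ("replication"). The scoring rule and quantile thresholds are illustrative assumptions.

    import numpy as np

    def evolve_layer(W_in, W_out, death_q=0.1, rep_q=0.9, noise=1e-2, rng=None):
        """Illustrative programmed-death-and-replication step for one hidden layer.

        W_in:  (n_hidden, n_inputs)  incoming weights
        W_out: (n_outputs, n_hidden) outgoing weights
        """
        rng = np.random.default_rng(0) if rng is None else rng
        # Assumed utility score: product of incoming and outgoing weight norms.
        utility = np.linalg.norm(W_in, axis=1) * np.linalg.norm(W_out, axis=0)

        # Programmed death: drop neurons below the death quantile.
        keep = utility > np.quantile(utility, death_q)
        W_in, W_out, utility = W_in[keep], W_out[:, keep], utility[keep]

        # Replication: duplicate neurons above the replication quantile,
        # perturbing the copies so they can differentiate during training.
        clone = utility > np.quantile(utility, rep_q)
        W_in = np.vstack([W_in, W_in[clone] + noise * rng.standard_normal(W_in[clone].shape)])
        W_out = np.hstack([W_out, W_out[:, clone] + noise * rng.standard_normal(W_out[:, clone].shape)])
        return W_in, W_out

    rng = np.random.default_rng(1)
    W_in, W_out = rng.normal(size=(16, 8)), rng.normal(size=(4, 16))
    W_in, W_out = evolve_layer(W_in, W_out, rng=rng)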

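The symmetry argument in the autonomous-particles entry is easy to check numerically: features built from differences of positions and velocities are unchanged by a Galilean boost, so an agent that learns only from such invariants is automatically independent of the reference frame. The toy verification below is our illustration, not the paper's model.

    import numpy as np

    def galilean_invariants(pos, vel):
        """Pairwise relative positions and velocities between agents.

        A Galilean boost shifts every velocity by u and every position by
        u * t, so these pairwise differences are unchanged by construction.
        """
        dpos = pos[:, None, :] - pos[None, :, :]
        dvel = vel[:, None, :] - vel[None, :, :]
        return dpos, dvel

    # Sanity check: boosting the whole scene leaves the invariants unchanged.
    rng = np.random.default_rng(0)
    pos, vel = rng.normal(size=(4, 2)), rng.normal(size=(4, 2))
    u, t = np.array([5.0, -2.0]), 3.0
    inv = galilean_invariants(pos, vel)
    inv_boosted = galilean_invariants(pos + u * t, vel + u)
    assert all(np.allclose(a, b) for a, b in zip(inv, inv_boosted))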

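The covariant gradient descent update itself fits in one line: the step is the loss gradient (the covariant force) contracted with the inverse of a metric tensor, so the identity metric recovers plain gradient descent and other metrics give preconditioned, natural-gradient-like behavior. The quadratic loss and the particular metric below are illustrative assumptions, not the paper's experiments.

    import numpy as np

    def covariant_gd_step(theta, grad_fn, metric_fn, lr=0.1):
        """One covariant step: theta <- theta - lr * g^{-1}(theta) grad(theta)."""
        g = metric_fn(theta)
        return theta - lr * np.linalg.solve(g, grad_fn(theta))

    # Toy quadratic loss L(theta) = 0.5 * theta^T A theta, minimum at the origin.
    A = np.array([[3.0, 0.0], [0.0, 1.0]])
    grad_fn = lambda th: A @ th
    # Choosing the curvature itself as the metric (an illustrative choice)
    # rescales the step so both directions decay at the same rate.
    theta = np.array([1.0, 1.0])
    for _ in range(20):
        theta = covariant_gd_step(theta, grad_fn, lambda th: A)
    print(theta)  # decays toward the minimum at the origin
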
Physics

  • The World as a Neural Network proposes that the universe could be a neural network, where trainable variables follow the Madelung and Hamilton-Jacobi equations (the Madelung equations are written out after this list), exhibiting both quantum and classical behaviors, while non-trainable variables evolve in a curved emergent space-time, with entropy production described by the Einstein-Hilbert equation.
  • Emergent quantumness in neural networks explores the connection between quantum mechanics and neural network dynamics by considering a grand canonical ensemble of neural networks, deriving the Schrödinger equation with a "Planck's constant" determined by the chemical potential of non-trainable variables.
  • Towards a theory of quantum gravity from neural networks demonstrates that a neural network can be described by the Madelung and Schrödinger equations for trainable variables, and the geodesic and Einstein equations for non-trainable variables, showing that the quantum and gravitational descriptions of the system are dual and emerge from the interplay between entropy production and destruction during learning.
  • Emergent field theories from neural networks establishes a duality between Hamiltonian systems and neural network-based learning systems, showing that the Hamilton-Jacobi equations correspond to the activation and learning dynamics of neural networks, and applies this duality to model various field theories, including Klein-Gordon and Dirac fields.
  • Molecular Learning Dynamics applies the physics-learning duality to molecular systems by modeling each particle as an agent minimizing a loss function, demonstrating the approach with a learning-based simulation of water molecules that matches the accuracy of traditional physics-based methods at lower computational cost.
  • Geometric Learning Dynamics introduces a unified geometric framework for modeling learning dynamics across physical, biological, and machine learning systems, revealing three fundamental regimes based on the relationship between the metric tensor and noise covariance matrix, with the intermediate regime explaining biological complexity.
  • Neural Relativity proposes that the learning dynamics of individual agents follows geodesics in a curved space, and that their collective behavior can be described by general relativity, suggesting that gravity may be an emergent phenomenon of learning.
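
For readers unfamiliar with the Madelung equations that recur in the entries above: they are the standard hydrodynamic form of the Schrödinger equation, obtained by writing the wave function as psi = sqrt(rho) exp(iS/hbar); in the papers above the effective Planck constant is not fundamental but is set by the chemical potential of the non-trainable variables. In LaTeX form:

    % Madelung (hydrodynamic) form of the Schrodinger equation,
    % with \psi = \sqrt{\rho}\, e^{iS/\hbar} and flow velocity v = \nabla S / m:
    \begin{aligned}
      \partial_t \rho + \nabla \cdot (\rho\, v) &= 0, \\
      \partial_t v + (v \cdot \nabla)\, v &= -\frac{1}{m} \nabla (V + Q),
      \qquad Q = -\frac{\hbar^2}{2m} \frac{\nabla^2 \sqrt{\rho}}{\sqrt{\rho}} .
    \end{aligned}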

Biology

  • Toward a theory of evolution as multilevel learning develops a theory of biological evolution, including the origin of life, as multilevel learning by formulating seven fundamental principles of evolution, analyzing them using neural network frameworks, and deriving a generalized version of the Central Dogma of molecular biology.
  • Thermodynamics of evolution and the origin of life presents a phenomenological theory of evolution and the origin of life by combining classical thermodynamics and statistical learning, using the maximum entropy principle to model evolutionary processes and major transitions, including the origin of life.
  • Bio-inspired machine learning: programmed death and replication explores computational aspects of biological phenomena like replication and programmed death in machine learning, proposing algorithms for neuron addition and removal to improve neural network performance and efficiency.
  • Quasi-equilibrium states and phase transitions in biological evolution develops a macroscopic description of evolutionary dynamics by analyzing the Shannon entropy and Hamming distance of biological sequences, demonstrating phase transitions in SARS-CoV-2 data and suggesting the framework's potential for early pandemic warning systems (a minimal sketch of both observables follows this list).
  • Geometric Learning Dynamics introduces a unified geometric framework for modeling learning dynamics across physical, biological, and machine learning systems, revealing three fundamental regimes based on the relationship between the metric tensor and noise covariance matrix, with the intermediate regime explaining biological complexity.
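
The two macroscopic observables in the quasi-equilibrium entry are straightforward to compute from an alignment. The sketch below uses a toy alignment of four short sequences rather than SARS-CoV-2 data, and the per-site averaging is our illustrative choice.

    import numpy as np
    from itertools import combinations

    def per_site_entropy(seqs):
        """Mean per-site Shannon entropy (in bits) of aligned sequences."""
        arr = np.array([list(s) for s in seqs])
        H = 0.0
        for col in arr.T:
            _, counts = np.unique(col, return_counts=True)
            p = counts / counts.sum()
            H += -(p * np.log2(p)).sum()
        return H / arr.shape[1]

    def mean_hamming(seqs):
        """Mean pairwise Hamming distance between aligned sequences."""
        pairs = list(combinations(seqs, 2))
        return sum(sum(a != b for a, b in zip(s, t)) for s, t in pairs) / len(pairs)

    seqs = ["ACGTAC", "ACGTTC", "ACCTAC", "TCGTAC"]
    print(per_site_entropy(seqs), mean_hamming(seqs))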

ANC Journal Club

Copyright © 2025 Artificial Neural Computing Corp. 

All Rights Reserved by the Universe.

Temporarily powered by digital computers.