SAMIBC2026 Presentation Announcement
Gradient-Accelerated Cosmological Inference:
A JAX-Based Framework for Differentiable Bayesian Computation in Astrophysics

As data-driven decision environments become increasingly complex, traditional statistical workflows often struggle to scale. High-dimensional parameter spaces, computationally expensive likelihood functions, and slow convergence can limit the effectiveness of conventional Bayesian inference methods. This presentation introduces a fully differentiable computational framework designed to accelerate and stabilize high-dimensional inference using gradient-based optimization.

Presented virtually by Mayank Jha of Amazon Robotics within the Educational Research and Measurement track, this session explores how automatic differentiation, GPU-accelerated computation, and vectorized probabilistic programming can transform Bayesian workflows. Built on JAX, the framework enables exact gradient computation for complex likelihood functions that were previously infeasible to differentiate reliably with finite-difference methods. This shift from approximation to exact gradient evaluation dramatically improves computational stability and efficiency.
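
As a minimal sketch of what exact gradient evaluation looks like in practice, jax.grad transforms a log-likelihood function into a function that returns its exact derivative, with no finite-difference step size to choose. The toy model and data below are hypothetical placeholders, not the framework's actual cosmological likelihood:

```python
import jax
import jax.numpy as jnp

# Hypothetical Gaussian log-likelihood: linear model with unit noise variance.
# A placeholder for illustration, not the framework's real likelihood.
def log_likelihood(theta, x, y):
    mu = theta[0] + theta[1] * x
    return -0.5 * jnp.sum((y - mu) ** 2)

# jax.grad returns a new function evaluating the exact gradient of
# log_likelihood with respect to theta -- no finite differences involved.
grad_log_like = jax.jit(jax.grad(log_likelihood))

x = jnp.linspace(0.0, 1.0, 50)
y = 1.0 + 2.0 * x                      # synthetic data for the sketch
print(grad_log_like(jnp.array([0.5, 1.5]), x, y))
```

Because the gradient is obtained by tracing the program itself rather than perturbing inputs, the same code scales to likelihoods with many parameters and runs unchanged on GPU.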

The presentation demonstrates how gradient information unlocks advanced inference techniques such as Hamiltonian Monte Carlo, the No-U-Turn Sampler (NUTS), and Stochastic Variational Inference with normalizing flows. These approaches allow models to explore parameter spaces more efficiently, improving convergence speed while reducing the need for manual tuning. Rather than relying on random-walk sampling strategies, the framework leverages gradient directionality to guide probabilistic exploration in a principled and scalable manner.
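
To make the connection concrete, the sketch below runs NUTS on a toy model using NumPyro, a JAX-based probabilistic programming library. The announcement does not name the specific sampler implementation the framework uses, so treat the library choice and the model as illustrative assumptions:

```python
import jax
import jax.numpy as jnp
import numpyro
import numpyro.distributions as dist
from numpyro.infer import MCMC, NUTS

# Hypothetical toy regression model standing in for a real cosmological
# likelihood; NUTS follows exact gradients of the log joint density.
def model(x, y=None):
    slope = numpyro.sample("slope", dist.Normal(0.0, 10.0))
    intercept = numpyro.sample("intercept", dist.Normal(0.0, 10.0))
    mu = intercept + slope * x
    numpyro.sample("obs", dist.Normal(mu, 1.0), obs=y)

x = jnp.linspace(0.0, 1.0, 50)
y = 1.0 + 2.0 * x                      # synthetic data for the sketch

# No hand-tuned proposals: step size and mass matrix adapt during warmup.
mcmc = MCMC(NUTS(model), num_warmup=500, num_samples=1000)
mcmc.run(jax.random.PRNGKey(0), x, y=y)
mcmc.print_summary()
```

The point of the example is the absence of manual tuning knobs: the gradient-informed trajectory replaces the proposal distribution a random-walk sampler would require.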

A detailed case study based on a Dark Energy Survey Year 1 3×2pt analysis highlights order-of-magnitude improvements in sampling efficiency. The framework also enables rapid Fisher matrix estimation without manual step-size tuning, significantly reducing the time required for sensitivity analysis and uncertainty quantification. These results demonstrate how differentiable programming provides a powerful foundation for scalable, data-driven modeling in high-dimensional systems.
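
The Fisher-matrix claim follows directly from automatic differentiation: for a Gaussian likelihood, the Fisher matrix is the negative Hessian of the log-likelihood at the fiducial parameters, and jax.hessian computes it exactly. A minimal sketch, again with a hypothetical toy likelihood rather than the actual 3×2pt pipeline:

```python
import jax
import jax.numpy as jnp

# Same hypothetical toy likelihood as above; for a Gaussian likelihood the
# Fisher matrix is the negative Hessian of log L at the fiducial parameters.
def log_likelihood(theta, x, y):
    mu = theta[0] + theta[1] * x
    return -0.5 * jnp.sum((y - mu) ** 2)

x = jnp.linspace(0.0, 1.0, 50)
y = 1.0 + 2.0 * x
theta_fid = jnp.array([1.0, 2.0])      # hypothetical fiducial point

# jax.hessian yields exact second derivatives: no step size to tune, unlike
# finite-difference Fisher estimates.
fisher = -jax.hessian(log_likelihood)(theta_fid, x, y)
covariance = jnp.linalg.inv(fisher)    # forecast parameter covariance
print(covariance)
```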

While grounded in cosmological inference, the implications extend far beyond astrophysics. Any domain involving complex probabilistic modeling, machine learning pipelines, or large-scale predictive analytics can benefit from gradient-based inference workflows. By integrating automatic differentiation with probabilistic programming, this framework offers a blueprint for modernizing Bayesian computation across scientific and analytical disciplines.

Author and Affiliation
Mayank Jha, Amazon Robotics

This research contributes to the evolving intersection of machine learning, probabilistic modeling, and strategic, data-driven decision making. If you are interested in scalable inference, differentiable programming, or next-generation analytics, this session provides both conceptual clarity and practical direction. Learn more about this presentation and register to attend the SAM International Business Conference at www.samnational.org/conference.