ISDCI Research & Development
Streaming Edge Processing at SLAC
SLAC has a decades-long history of leading real-time streaming data processing for detectors and accelerator systems, leveraging custom ASICs, FPGAs, and high-throughput firmware to operate directly at the sensor edge where data signals are born. Building on this expertise, SLAC is now advancing these pipelines into the AI domain by deploying embedded inference engines on FPGAs and other heterogeneous hardware platforms to enable intelligent filtering, feature extraction, and autonomous decision-making in situ. These developments bring machine learning directly into the data production stream, reducing latency, enabling autonomous experimental adaptability, and laying the foundation for fully intelligent edge-to-exascale workflows.
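Embedded inference engines on FPGAs typically run neural-network layers in fixed-point rather than floating-point arithmetic. The sketch below illustrates that idea in NumPy: weights and activations are quantized to int8, the dense layer accumulates in a wide integer type (as an FPGA DSP block would), and a float rescale recovers the result. This is a generic quantized-inference sketch, not SLAC's actual firmware; all names and scales are illustrative.

```python
import numpy as np

def quantize_int8(values):
    """Symmetric per-tensor int8 quantization: return (codes, scale)."""
    scale = np.max(np.abs(values)) / 127.0
    codes = np.clip(np.round(values / scale), -128, 127).astype(np.int8)
    return codes, scale

def int8_dense(x_q, x_scale, w_q, w_scale):
    """Dense layer in integer arithmetic with a float rescale at the end,
    mimicking how an FPGA DSP block would compute it."""
    acc = x_q.astype(np.int32) @ w_q.astype(np.int32)  # wide accumulator
    return acc * (x_scale * w_scale)                   # dequantize

rng = np.random.default_rng(0)
w = rng.normal(size=(16, 4))          # toy layer weights
x = rng.normal(size=(1, 16))          # one input "event"
w_q, w_s = quantize_int8(w)
x_q, x_s = quantize_int8(x)
y_ref = x @ w                         # float reference
y_q = int8_dense(x_q, x_s, w_q, w_s)  # quantized result tracks it closely
```

The appeal on hardware is that the inner loop is pure int8/int32 arithmetic, which maps directly onto FPGA multiply-accumulate resources.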
Autonomous Accelerator Tuning at SLAC
SLAC has pioneered machine-learning–driven approaches to automate accelerator tuning, replacing manual, expert-driven optimization with adaptive control algorithms that can rapidly respond to drifting machine conditions. This work integrates reinforcement learning, surrogate modeling, and Bayesian optimization to tune complex beamline parameters in real time, improving beam quality, stability, and operational efficiency. These efforts demonstrate how AI-assisted controls can reduce downtime, enhance reproducibility, and lay the groundwork for fully self-optimizing accelerator facilities.
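The surrogate-model-plus-Bayesian-optimization loop can be sketched in a few lines: fit a Gaussian-process surrogate to the settings tried so far, then probe the setting where an acquisition function (here an upper confidence bound) balances predicted quality against uncertainty. This is a minimal 1-D illustration with a made-up quadratic objective standing in for a beamline diagnostic; it is not SLAC's production tuner.

```python
import numpy as np

def beam_quality(k):
    """Hypothetical stand-in for a measured beam-quality objective;
    in practice this would come from beamline diagnostics."""
    return -(k - 1.3) ** 2

def rbf(a, b, length_scale=0.3):
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length_scale) ** 2)

def gp_posterior(xs, ys, grid, jitter=1e-4):
    """Gaussian-process surrogate: posterior mean/variance on a grid."""
    K_inv = np.linalg.inv(rbf(xs, xs) + jitter * np.eye(len(xs)))
    K_star = rbf(grid, xs)
    mean = K_star @ K_inv @ ys
    var = 1.0 - np.einsum("ij,jk,ik->i", K_star, K_inv, K_star)
    return mean, np.maximum(var, 0.0)

# Optimization loop with an upper-confidence-bound acquisition:
# probe where the surrogate is promising or still uncertain.
grid = np.linspace(0.0, 2.0, 201)
xs = np.array([0.2, 1.0, 1.8])        # initial probe settings
ys = beam_quality(xs)
for _ in range(15):
    mean, var = gp_posterior(xs, ys, grid)
    k_next = grid[np.argmax(mean + 2.0 * np.sqrt(var))]
    xs = np.append(xs, k_next)
    ys = np.append(ys, beam_quality(k_next))
k_best = xs[np.argmax(ys)]            # converges near the optimum at 1.3
```

The key property this shows is sample efficiency: each machine measurement updates the surrogate everywhere, so the optimizer homes in on a good setting in far fewer shots than a grid scan.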
Autonomous Intelligent Detector Systems at SLAC
SLAC is developing next-generation detector systems that incorporate embedded intelligence—AI algorithms, adaptive firmware, and near-sensor processing—to autonomously clean, filter, and interpret raw data at the point of acquisition. These systems leverage custom ASICs, FPGAs, and edge-AI architectures to identify features of interest, correct sensor artifacts, and dynamically adjust operating parameters based on evolving experimental conditions. This work enables detectors that self-optimize, reduce data volumes, and provide real-time insights, paving the way for more efficient and autonomous scientific instruments across SLAC’s experimental programs.
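Two of the simplest near-sensor cleanup steps such systems perform are per-pixel pedestal subtraction and row-wise common-mode correction. The sketch below shows both on a synthetic frame; the median-based common-mode estimate is robust to sparse photon hits. This is a generic illustration of the technique, with made-up detector geometry and noise levels, not a specific SLAC firmware algorithm (production systems typically add gain maps and outlier rejection).

```python
import numpy as np

def correct_frame(raw, pedestal):
    """Pedestal subtraction plus row-wise common-mode correction,
    the kind of cleanup applied near the sensor before readout."""
    frame = raw.astype(np.float64) - pedestal
    # The median of each row estimates its common-mode shift; it is
    # barely perturbed by a few bright signal pixels.
    frame -= np.median(frame, axis=1, keepdims=True)
    return frame

rng = np.random.default_rng(1)
pedestal = rng.normal(1000.0, 10.0, size=(8, 64))   # per-pixel baselines
truth = np.zeros((8, 64))
truth[3, 10] = 500.0                                # one photon hit
common_mode = rng.normal(0.0, 20.0, size=(8, 1))    # per-row drift
noise = rng.normal(0.0, 5.0, size=(8, 64))
raw = pedestal + truth + common_mode + noise
clean = correct_frame(raw, pedestal)                # hit survives near 500 ADU
```

After correction the quiet pixels sit near zero and the injected hit is recovered, which is exactly what downstream zero-suppression and feature extraction rely on.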
Data Cataloging and AMSC API Development
SLAC is building advanced data cataloging and automated transfer systems that efficiently move massive LCLS experimental datasets to exascale computing facilities for real-time analysis and long-term stewardship. This work integrates high-performance data pipelines, intelligent metadata services, and workflow-aware routing that ensures data is discoverable, prioritized, and delivered to HPC resources with minimal latency. These capabilities enable seamless edge-to-exascale collaboration, supporting rapid scientific feedback loops and scalable processing for the world’s fastest x-ray science experiments.
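Workflow-aware routing amounts to attaching metadata to each transfer request and dispatching by priority rather than arrival order, so live data bound for real-time analysis jumps ahead of bulk archival moves. The sketch below shows that pattern with a priority queue; the record fields and dataset names are purely illustrative and do not reflect the actual AMSC schema.

```python
import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class TransferRequest:
    """Hypothetical catalog record; only `priority` drives ordering."""
    priority: int                          # lower value = dispatched first
    dataset: str = field(compare=False)
    size_gb: float = field(compare=False)
    destination: str = field(compare=False)

class TransferQueue:
    """Priority dispatch: urgent, workflow-critical transfers go first."""
    def __init__(self):
        self._heap = []
    def submit(self, req):
        heapq.heappush(self._heap, req)
    def dispatch(self):
        return heapq.heappop(self._heap)

q = TransferQueue()
q.submit(TransferRequest(2, "run-0042-archive", 850.0, "tape"))
q.submit(TransferRequest(0, "run-0043-live", 12.5, "hpc-realtime"))
q.submit(TransferRequest(1, "run-0041-reprocess", 300.0, "hpc-batch"))
first = q.dispatch()   # the live dataset, despite being submitted second
```

In a real system the priority would itself be computed from workflow metadata (beamtime status, analysis deadlines, destination load) rather than assigned by hand.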
Advanced Test Bed Development at SLAC
SLAC designs and operates a wide range of sophisticated test beds that integrate sensors, detectors, edge computing, and control systems to prototype next-generation scientific instrumentation. These test environments enable rapid experimentation with AI-driven controls, real-time data-processing pipelines, and novel hardware architectures under realistic beamline or laboratory conditions. By providing flexible, high-fidelity platforms for validation and iteration, SLAC’s test beds accelerate technology maturation and reduce risk for deployment in major scientific facilities.
AI-Driven LCLS Data Processing at SLAC
SLAC has advanced the use of artificial intelligence to accelerate LCLS data processing, enabling real-time interpretation of ultrafast x-ray experiments at unprecedented scale. These efforts combine deep learning, compressed sensing, and edge-to-exascale streaming frameworks to rapidly classify events, denoise images, and extract scientifically meaningful features directly from detector outputs. These AI-enabled pipelines dramatically reduce data bottlenecks, support self-driving experiments, and unlock faster scientific discovery across a broad range of LCLS user programs.
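The simplest form of in-stream event classification is a veto that discards empty shots before they consume bandwidth or storage. The sketch below uses a "lit pixel" count as a stand-in for the learned classifiers such pipelines deploy; the thresholds and detector dimensions are illustrative only.

```python
import numpy as np

def lit_pixel_veto(frame, adu_threshold=100.0, min_hits=3):
    """Keep a frame only if enough pixels exceed a photon threshold.
    A toy stand-in for learned event classifiers; thresholds are
    illustrative, not calibrated values."""
    return int(np.count_nonzero(frame > adu_threshold)) >= min_hits

rng = np.random.default_rng(2)
frames = rng.normal(0.0, 10.0, size=(100, 32, 32))  # mostly empty shots
for i in (7, 21, 90):                               # inject a few real hits
    frames[i, 5:9, 5:9] += 400.0
kept = [i for i, f in enumerate(frames) if lit_pixel_veto(f)]
# only the three frames with injected signal survive the veto
```

Even this crude filter cuts the stream by orders of magnitude when most shots miss the sample, which is why pushing such decisions upstream, ultimately into the detector readout itself, pays off.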
AI-Enhanced Data Processing for the Rubin Observatory
SLAC contributes to the Rubin Observatory’s data processing pipeline by developing advanced algorithms and AI-driven methods to manage and analyze the observatory’s massive, rapidly acquired sky survey data. This work integrates machine learning for image characterization, anomaly detection, and photometric refinement, ensuring high-fidelity extraction of astrophysical signals from complex, time-variable observations. These efforts enable near–real-time alert generation and support Rubin’s mission to map the dynamic Universe with unprecedented accuracy and scale.
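At its core, alert generation asks whether a new measurement of a source is statistically inconsistent with its history. A minimal, robust version of that test uses the median and MAD of past photometry; this is a toy illustration of the idea, not the Rubin alert pipeline, and all numbers are synthetic.

```python
import numpy as np

def alert_score(flux_history, new_flux):
    """Robust z-score of a new photometric point against a source's
    history. Median/MAD resist outliers in the history itself."""
    med = np.median(flux_history)
    mad = np.median(np.abs(flux_history - med))
    sigma = 1.4826 * mad              # MAD -> Gaussian-equivalent sigma
    return abs(new_flux - med) / max(sigma, 1e-12)

rng = np.random.default_rng(3)
history = rng.normal(100.0, 2.0, size=50)  # quiescent source, 50 visits
quiet = alert_score(history, 101.0)        # ordinary fluctuation: low score
flare = alert_score(history, 130.0)        # ~15-sigma brightening: alert
```

Production systems layer difference imaging and learned real/bogus classifiers on top, but the robust-deviation idea is the statistical backbone of time-domain alerting.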
AI-Accelerated Data Processing for High Energy Physics at SLAC
SLAC applies advanced data processing and AI techniques to enhance the reconstruction, simulation, and interpretation of high-energy physics data across collider, neutrino, and cosmic-ray experiments. These efforts include developing fast machine-learning–based event reconstruction, sophisticated noise-reduction and pattern-recognition algorithms, and accelerated simulation frameworks that dramatically reduce computational cost. This work enables higher-precision physics measurements, faster turnaround from data to insight, and scalable analysis pipelines that support the next generation of HEP discoveries.
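The elementary building block beneath event reconstruction, whether classical or ML-accelerated, is fitting particle trajectories to detector hits. The sketch below fits a straight-line track through hits in successive tracker layers by least squares; the geometry and resolution figures are invented for illustration, and real reconstruction adds magnetic-field curvature, hit-to-track assignment, and Kalman filtering.

```python
import numpy as np

def fit_track(z, x):
    """Least-squares straight-line fit x = slope*z + intercept
    to hit positions recorded at layer depths z."""
    A = np.vstack([z, np.ones_like(z)]).T
    (slope, intercept), *_ = np.linalg.lstsq(A, x, rcond=None)
    return slope, intercept

rng = np.random.default_rng(4)
z = np.linspace(0.0, 10.0, 12)          # 12 layer positions (cm, toy values)
true_slope, true_intercept = 0.4, 1.0
x = true_slope * z + true_intercept + rng.normal(0.0, 0.02, size=z.size)
slope, intercept = fit_track(z, x)      # recovers the true parameters
```

ML-based reconstruction replaces or seeds stages like this one, trading hand-tuned combinatorics for learned pattern recognition while keeping the same geometric fit at the bottom.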