
Arbabian Lab


Welcome to the Arbabian Lab!

Our group designs and builds end-to-end intelligent sensing systems that bridge the physical and digital worlds. These systems span the entire stack—from physics and high-performance/high-frequency circuits to algorithms and edge intelligence—combining RF, ultrasound, and optical modalities. By tightly coupling hardware, signal processing, and inference, we create new capabilities for autonomy, healthcare, the Internet of Things, and scientific discovery. Our research leverages multi-physics and multi-disciplinary approaches, integrating electromagnetics, acoustics, and hybrid wave interactions with custom electronic design and machine learning to introduce new sensing systems for physical AI. 

Check out our research page for more detailed information about our active projects!

Autonomy and Physical AI

While AI has revolutionized the digital economy, its impact on the physical world—factories, hospitals, transportation, and energy systems—remains limited. Bringing AI into these domains requires sensing and inference systems that are far more capable, power-efficient, and scalable than today’s. The key bottleneck lies in how machines perceive and interpret complex, dynamic environments.

Our research develops new architectures that co-design sensing and compute, enabling closed-loop autonomy that is adaptive, robust, and efficient. We view autonomy not as a property of isolated robots, but as "infrastructure as a robot"—factories, hospitals, and cities operating as living, intelligent systems. Achieving this vision demands rethinking the entire sensor-to-compute stack, from novel physical transducers and low-power edge processors to learning algorithms that fuse multimodal data in real time. To manage the growing complexity of these systems, we also study design autonomy for chips and systems, developing methods that automatically explore, optimize, and adapt architectures across hardware, software, and algorithms. By enabling cross-layer co-design—from devices and circuits to models and control policies—we unlock new regimes of performance, efficiency, and scalability that cannot be reached through manual or siloed design approaches.

The goal is to build the foundations for Physical AI—where the physical world becomes programmable, self-optimizing, and responsive to human intent.

Biomedical Systems: Nyquist Health 

Despite remarkable advances in therapeutics and diagnostics, modern healthcare remains largely reactive. Patients are monitored intermittently, and intervention begins only after symptoms appear—by which point treatment is costlier and outcomes are worse. Our work in "Nyquist Health" aims to fundamentally shift this paradigm from post-symptom care to predictive and preventive health. We develop continuous "Nyquist-rate" sensing and imaging systems that sample physiological signals at their natural bandwidth, providing early indicators of disease progression.
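The "Nyquist-rate" idea can be sketched with a toy example (illustrative values only, not lab code): a signal sampled at or above twice its highest frequency can be recovered faithfully, while undersampling aliases it to a false, lower frequency—exactly the failure mode of intermittent monitoring.

```python
import numpy as np

def dominant_freq(signal, fs):
    """Dominant frequency (Hz) of a real signal sampled at rate fs (Hz)."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    return freqs[np.argmax(spectrum)]

def sample_tone(f, fs, duration):
    """Sample a pure tone of frequency f (Hz) at rate fs (Hz)."""
    n = int(round(duration * fs))
    t = np.arange(n) / fs
    return np.sin(2 * np.pi * f * t)

f_true = 1.2  # Hz: e.g. a ~72 beat-per-minute cardiac component (illustrative)

# Sampling above the Nyquist rate (fs >= 2 * f_true) preserves the frequency.
x = sample_tone(f_true, fs=4.0, duration=60.0)
print(dominant_freq(x, fs=4.0))  # ≈ 1.2 Hz

# Sampling below it aliases: the tone folds down to fs - f_true = 0.3 Hz.
y = sample_tone(f_true, fs=1.5, duration=60.0)
print(dominant_freq(y, fs=1.5))  # ≈ 0.3 Hz
```

The undersampled measurement looks like a clean, slow signal—there is no way to tell from the samples alone that the true dynamics were faster, which is why sampling at the signal's natural bandwidth matters.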

We pursue two complementary directions: (1) continuous, edge-intelligent screening systems that act as persistent sensors for human health, and (2) smart implants and ingestibles that monitor, interpret, and intervene in real time. These systems borrow principles from the Internet of Things—distributed, adaptive, and always connected—but apply them to the body, transforming health monitoring into a high-resolution feedback loop. In doing so, we seek to create a future where disease is detected and treated before it manifests.

Environmental Sensing

Oceans regulate Earth's climate, absorb vast amounts of carbon, and sustain most of the planet's biodiversity—yet over 80% of the ocean remains unmapped and unobserved. Similarly, the below-ground root systems that determine crop resilience and yield remain largely invisible to conventional sensing approaches. This lack of visibility in both marine and terrestrial environments limits our ability to model the climate, monitor ecosystems, and manage natural resources.

Our research develops multi-modal sensing systems that merge electromagnetic and acoustic waves to image beneath opaque surfaces. In marine applications, we generate and detect acoustic waves from the sky to enable scalable, real-time underwater sensing. For terrestrial applications, we use microwave thermoacoustic imaging to non-destructively map root architectures in soil. Together, these approaches pave the way for comprehensive environmental monitoring—from a "Google Earth for the oceans" to high-throughput below-ground phenotyping for sustainable agriculture.