A deep learning framework for marine acoustic and seismic monitoring with distributed acoustic sensing
Authors

Abstract
Distributed Acoustic Sensing (DAS) enables high-resolution, long-duration monitoring of marine acoustic and seismic activity by turning existing fiber-optic cables into dense sensor arrays. However, extracting diverse signals from continuous DAS data remains challenging because of the massive data volumes and the complexity of the recorded signals. Here, we present DASNet, a deep learning framework for automated detection, classification, and arrival-time picking of diverse marine signals in DAS data. The model is trained with a semi-supervised pipeline on continuous recordings and jointly predicts spatiotemporal bounding boxes and segmentation masks for each detected event. Applied to three years of data from the Seafloor Fiber-Optic Array in Monterey Bay (SeaFOAM), DASNet identified over 500,000 events spanning multiple signal categories. For seismic monitoring, the model detects the majority of cataloged local earthquakes within 100 km of the array and identifies T-waves generated by distant earthquakes, with beamforming analysis revealing source azimuths clustered toward the southwestern Pacific and along mid-ocean ridge systems. For bioacoustic monitoring, DASNet detects and tracks more than 400,000 blue and fin whale calls, revealing seasonal and interannual variability consistent with independent hydrophone records. For anthropogenic activity, DASNet detects and localizes vessel traffic near the cable, with estimated positions validated against Automatic Identification System (AIS) tracks. These results demonstrate that combining DAS with deep learning provides a scalable, high-resolution approach to marine environmental observation. As submarine DAS deployments expand, this framework could substantially enhance seismic, bioacoustic, and anthropogenic monitoring in regions where conventional instrumentation remains sparse.