Oceanography, The Official Magazine of The Oceanography Society
Volume 36, Supplement 1, Pages 36–37
Monitoring Algal Blooms with Complementary Sensors on Multiple Spatial and Temporal Scales

David R. Williamson, Glaucia M. Fragoso, Sanna Majaneva, Alberto Dallolio, Daniel Ø. Halvorsen, Oliver Hasler, Adriënne E. Oudijk, Dennis D. Langer, Tor Arne Johansen, Geir Johnsen, Annette Stahl, Martin Ludvigsen, and Joseph L. Garrett

Climate change and other human-induced impacts are severely increasing the intensity and frequency of algal blooms in coastal regions (IPCC, 2022). Ocean warming, marine heatwaves, and eutrophication promote conditions suitable for rapid phytoplankton growth and biomass accumulation. An increase in such primary producers provides food for marine organisms, and phytoplankton play an important global role in fixing atmospheric carbon dioxide and producing much of the oxygen we breathe. But harmful algal blooms (HABs) can also form, and they may adversely affect the ecosystem by reducing oxygen availability in the water, releasing toxic substances, clogging fish gills, and diminishing biodiversity. Understanding, forecasting, and ultimately mitigating HAB events could reduce their impact on wild fish populations, help aquaculture producers avoid losses, and support a healthy ocean.

Phytoplankton respond rapidly to changes in the environment, and measuring the distribution of a bloom and its species composition and abundance is essential for determining its ecological impact and potential for harm. Satellite remote sensing of chlorophyll concentration has been used extensively to observe the development of algal blooms. Although this tool has wide spatial and temporal (nearly daily) coverage, it is limited to surface ocean waters and cloud-free days. Microscopic analyses of water and net samples allow much closer examination of the species present in a bloom and their abundance, but this is a time-consuming process that yields only discrete point samples, sparsely distributed in space and time. Neither of these methods alone captures the rapid evolution of algal blooms, the spatial and temporal patchiness of their distributions, or their high local variability. In situ optical devices and imaging sensors mounted on mobile platforms such as autonomous underwater vehicles (AUVs) and uncrewed surface vehicles (USVs) capture fine-scale temporal trends in plankton communities, while uncrewed aerial vehicles (UAVs) complement satellite remote sensing. Such autonomous platforms offer the flexibility to react to local conditions with adaptive sampling techniques in order to examine the marine environment in real time.
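Satellite chlorophyll retrieval of the kind described above is commonly done with blue-to-green band ratios of remote-sensing reflectance. As a rough illustration only: the sketch below uses the OC3-style polynomial form with the published MODIS coefficients, but operational coefficients are sensor-specific, so this should not be read as the algorithm applied in this project.

```python
import math

# Illustrative OC3-style band-ratio chlorophyll estimate. The coefficients
# are the published NASA OC3 values for MODIS; operational values differ
# per sensor, so treat this purely as a sketch of the technique.
OC3_COEFFS = [0.2424, -2.7423, 1.8017, 0.0015, -1.2280]

def chlorophyll_oc3(rrs_443, rrs_488, rrs_547):
    """Estimate chlorophyll-a (mg m^-3) from remote-sensing reflectances.

    Uses the maximum blue-to-green band ratio: higher blue reflectance
    relative to green indicates clearer water and lower chlorophyll.
    """
    ratio = max(rrs_443, rrs_488) / rrs_547
    x = math.log10(ratio)
    exponent = sum(c * x**i for i, c in enumerate(OC3_COEFFS))
    return 10.0 ** exponent
```

In this form, waters where blue reflectance dominates (ratio above 1) map to low chlorophyll estimates, while green-shifted spectra, typical of dense blooms, map to high ones.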

Here we present an integrated approach to observing blooms—an “observational pyramid”—that includes both classical and newer, complementary observation methods (Figure 1). We aim to identify trends in phytoplankton blooms in a region with strong aquaculture activity on the Atlantic coast of mid-Norway. Field campaigns were carried out in consecutive springs (2021 and 2022) in Frohavet, an area of sea sheltered by the Froan archipelago (Figure 2). The region is a shallow, highly productive basin with abundant fishing and a growing aquaculture industry. Typically, there are one or more large algal blooms here during the spring months. We use multiple instruments spanning macroscale to microscale perspectives, combined with oceanographic modeling and ground truthing, to provide tools for early algal bloom detection.


FIGURE 1. The observational pyramid concept offers simultaneous, integrated monitoring of the marine environment from space to seabed and from scales of hundreds of square kilometers to the microscopic. After Dallolio et al. (2019).


FIGURE 2. (inset) Fieldwork location relative to Norway. (upper panel) The larger Frohavet region is overlain here with the path and coverage of the plane equipped with a hyperspectral camera during 2021 fieldwork, and the path of the long-endurance USV in the weeks around 2022 fieldwork. (lower panel) The locations of net and water samples and the paths of the UAV, AUV, and USV missions in the main sampling area in 2022. Elevation data from Kartverket, satellite images from Norge i bilder/Kartverket and CNES/Airbus, Landsat/Copernicus, Maxar Technologies via Google Maps.


At the very largest scale, we used images from the multispectral Sentinel-2 and hyperspectral PRISMA satellites in 2021, and from the Norwegian University of Science and Technology's HYPSO-1 hyperspectral imaging satellite (Grøtte et al., 2022) in 2022, to monitor ocean color in the area of interest over several weeks surrounding the main fieldwork. A long-endurance USV, equipped with a payload of ocean sensors including a CTD, an ADCP, an oxygen optode, and a fluorometer, also monitored the area before, during, and after the missions. SINMOD, a coupled physical-chemical-biological ocean model, was used to simulate environmental conditions in the area of interest (Figure 3). During the fieldwork, the satellite imagery was supplemented by hyperspectral data collected by cameras mounted on a plane (in 2021) and on a UAV, covering smaller areas at higher resolution; these sensors are also less affected by cloud cover. At the same time, we launched an AUV (Saad et al., 2020) that covered hundreds of cubic meters of ocean while taking high-magnification underwater images (Figure 4) and collecting CTD and chlorophyll data. Finally, we deployed a Niskin water sampler at a number of locations within the AUV and UAV observation area and at several depths to collect samples for microscopy and eDNA analysis.
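Combining these layers requires pairing discrete in situ measurements with the satellite pixels that overlap them in space and time. The sketch below is a hypothetical matchup routine of the kind commonly used when ground-truthing ocean-color products; the class, function names, and distance/time thresholds are illustrative, not taken from this project.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Observation:
    """A chlorophyll measurement, from either a water sample or a satellite pixel."""
    lat: float
    lon: float
    time: datetime
    chlorophyll: float  # mg m^-3

def match_up(samples, pixels, max_deg=0.02, max_hours=3.0):
    """Pair each in situ sample with the nearest satellite pixel
    observed within a time window. Thresholds are illustrative."""
    pairs = []
    for s in samples:
        best, best_dist = None, float("inf")
        for p in pixels:
            # Reject pixels observed too far from the sampling time.
            if abs((p.time - s.time).total_seconds()) > max_hours * 3600:
                continue
            # Simple planar distance in degrees; fine at this small scale.
            dist = ((p.lat - s.lat) ** 2 + (p.lon - s.lon) ** 2) ** 0.5
            if dist <= max_deg and dist < best_dist:
                best, best_dist = p, dist
        if best is not None:
            pairs.append((s, best))
    return pairs
```

The resulting sample-pixel pairs are what allow a reflectance-based chlorophyll product to be checked against microscopy and fluorometer measurements.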


FIGURE 3. A chlorophyll forecast made on April 19 during the 2022 fieldwork using SINMOD, a coupled physical-chemical-biological ocean model (spatial resolution 160 m). The white box shows the fieldwork area.


FIGURE 4. A collage of cropped images from the AUV-mounted silhouette camera includes copepods and a fish larva. The abundance of such grazers impacts the timing and sizes of algal blooms. Image resolution is approximately 30 µm per pixel.


The multisensor, multiscale operations will allow us to assess phytoplankton health and divide the growth of their populations into stages: pre-bloom (cells begin to grow), bloom phase (exponential growth), and post-bloom (grazing and decay). While remote sensing provides a broad view of such growth through ocean color, net and water sampling give us insight into the changing species composition within a bloom. Simultaneous AUV imaging provides information on grazers that feed on the phytoplankton—thought to be an important factor in the evolution of algal blooms. Multisensor operations also provide ground truth for remote sensing and help us link hyperspectral observations from novel aerial and satellite sensors with conditions in the water. All of these data sources will be used to validate and improve the SINMOD model of the ocean, allowing it to better predict the occurrence and composition of HABs and other algal blooms. Reliable prediction and automated observation also improve monitoring by telling us where and when expensive fieldwork with small-scale, high-resolution sensors can be most effective.
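The three stages above can be illustrated with a minimal logistic-growth sketch: slow initial growth (pre-bloom), near-exponential growth (bloom), and decline when a loss term such as grazing outpaces growth (post-bloom). All parameter values here are illustrative, not fitted to Frohavet data or to SINMOD.

```python
def simulate_bloom(days=60, dt=1.0, r=0.35, capacity=20.0,
                   grazing=0.0, biomass=0.05):
    """Return a daily biomass series (mg chl m^-3, illustrative units)
    under logistic growth minus a simple linear grazing loss.

    r        -- intrinsic growth rate per day
    capacity -- nutrient-limited carrying capacity
    grazing  -- loss rate per day from grazers
    """
    series = []
    for _ in range(int(days / dt)):
        growth = r * biomass * (1.0 - biomass / capacity)
        biomass = max(biomass + dt * (growth - grazing * biomass), 0.0)
        series.append(biomass)
    return series
```

Running this with `grazing=0.0` traces the sigmoid pre-bloom and bloom phases, while raising `grazing` above the growth rate reproduces the post-bloom decay, which is why grazer abundance from the AUV imagery matters for predicting bloom trajectories.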

Observational efforts that combine state-of-the-art monitoring technology, multi- and hyperspectral remote sensing, ecosystem modeling, traditional water sampling, and integrated taxonomy via microscopic and molecular (eDNA) species identification are paramount for a holistic understanding of bloom formation, as well as of marine primary production overall. Our project demonstrates the advantages of this approach and promises to enable more effective ocean monitoring in the future.



This work was supported by the Research Council of Norway and industry partners through the Centre of Excellence funding scheme (NTNU AMOS, grant no. 223254), MASSIVE (270959), MoniTare (315514), Nansen Legacy (276730), AILARON (262741), HYPSCI (325961), SeaBee (296478), and the Green-platform project 328674. Figure 1 UAV model by Pablo Sánchez (grabcad.com), workboat model by Lewist123 (thingiverse.com), and plane model from downloadfree3d.com.

Williamson, D.R., G.M. Fragoso, S. Majaneva, A. Dallolio, D.Ø. Halvorsen, O. Hasler, A.E. Oudijk, D.D. Langer, T.A. Johansen, G. Johnsen, A. Stahl, M. Ludvigsen, and J.L. Garrett. 2023. Monitoring algal blooms with complementary sensors on multiple spatial and temporal scales. In Frontiers in Ocean Observing: Emerging Technologies for Understanding and Managing a Changing Ocean. E.S. Kappel, V. Cullen, M.J. Costello, L. Galgani, C. Gordó-Vilaseca, A. Govindarajan, S. Kouhi, C. Lavin, L. McCartin, J.D. Müller, B. Pirenne, T. Tanhua, Q. Zhao, and S. Zhao, eds, Oceanography 36(Supplement 1):36–37, https://doi.org/10.5670/oceanog.2023.s1.11.


Dallolio, A., L. Bertino, L. Chrpa, T.A. Johansen, M. Ludvigsen, K.A. Orvik, L.H. Smedsrud, J. Sousa, I.B. Utne, P. Johnston, and K. Rajan. 2019. Long-duration autonomy for open ocean exploration: Preliminary results & challenges. RSS 2019 Workshop on Robots in the Wild: Challenges in Deploying Robust Autonomy for Robotic Exploration, June 22–26, 2019, Freiburg, Germany.

Grøtte, M.E., R. Birkeland, E. Honoré-Livermore, S. Bakken, J. Garrett, E.F. Prentice, F. Sigernes, M. Orlandić, J.T. Gravdahl, and T.A. Johansen. 2022. Ocean color hyperspectral remote sensing with high resolution and low latency—The HYPSO-1 CubeSat mission. IEEE Transactions on Geoscience and Remote Sensing 60:1–19, https://doi.org/10.1109/TGRS.2021.3080175.

IPCC. 2022. Climate Change 2022: Impacts, Adaptation and Vulnerability. Contribution of Working Group II to the Sixth Assessment Report of the Intergovernmental Panel on Climate Change, H.-O. Pörtner, D.C. Roberts, M. Tignor, E.S. Poloczanska, K. Mintenbeck, A. Alegría, M. Craig, S. Langsdorf, S. Löschke, and others, eds, Cambridge University Press, Cambridge, UK, and New York, NY, USA, 3,056 pp.

Saad, A., A. Stahl, A. Våge, E. Davies, T. Nordam, N. Aberle, M. Ludvigsen, G. Johnsen, J. Sousa, and K. Rajan. 2020. Advancing ocean observation with an AI-driven mobile robotic explorer. Oceanography 33(3):50–59, https://doi.org/10.5670/oceanog.2020.307.

Copyright & Usage

This is an open access article made available under the terms of the Creative Commons Attribution 4.0 International License (https://creativecommons.org/licenses/by/4.0/), which permits use, sharing, adaptation, distribution, and reproduction in any medium or format as long as users cite the materials appropriately, provide a link to the Creative Commons license, and indicate the changes that were made to the original content. Images, animations, videos, or other third-party material used in articles are included in the Creative Commons license unless indicated otherwise in a credit line to the material. If the material is not included in the article’s Creative Commons license, users will need to obtain permission directly from the license holder to reproduce the material.