{"id":1708,"date":"2020-06-23T17:48:06","date_gmt":"2020-06-23T17:48:06","guid":{"rendered":"https:\/\/meridian.cs.dal.ca\/?page_id=1708"},"modified":"2023-02-01T14:11:19","modified_gmt":"2023-02-01T14:11:19","slug":"detecting-underwater-sounds-with-deep-learning","status":"publish","type":"page","link":"https:\/\/meridian.cs.dal.ca\/fr\/detecting-underwater-sounds-with-deep-learning\/","title":{"rendered":"Detecting underwater sounds with deep learning"},"content":{"rendered":"<div class=\"wp-block-columns alignwide is-layout-flex wp-container-core-columns-is-layout-28f84493 wp-block-columns-is-layout-flex\">\n<div class=\"wp-block-column content_column is-layout-flow wp-block-column-is-layout-flow\" style=\"flex-basis:66.66%\">\n<p>Many marine species have evolved to rely primarily on sound for underwater navigation, prey detection, and communication. Marine biologists therefore listen to the sounds generated by marine animals, for example to detect the presence of an endangered species or to study social behavior. Increasing numbers of underwater listening and recording devices are being deployed worldwide, generating vast amounts of data that easily exceed our capacity for manual analysis. While algorithms exist to automatically analyze acoustic data in search of signals of interest, these algorithms often make many mistakes and generally do not achieve the same level of accuracy as human analysts. <\/p>\n\n\n\n<p>In recent years, a new breed of algorithms known as deep neural networks has gained immense popularity, especially in fields such as computer vision and speech recognition, where they outperform existing algorithms and even human analysts. Such deep learning algorithms exhibit an impressive ability to learn from (large amounts of) data and require far less feature engineering, making them well suited to cope with the increasing amounts of ocean acoustics data. 
Recently, such methods have been successfully applied to solve detection and classification problems in marine bioacoustics.&nbsp;<\/p>\n\n\n\n<p>Motivated by these recent developments, MERIDIAN has developed a software package called <em>Ketos<\/em> that makes it easier for marine bioacousticians to train deep-learning algorithms to solve sound detection and classification tasks. We are also working closely with researchers to develop detection and classification algorithms for selected marine species, including mammals and fish, and various types of motorized vessels.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">Fish<\/h3>\n\n\n\n<p>Hundreds of fish species are known to produce sounds. We are developing deep learning models to automatically detect fish in hydrophone data. Such models will help researchers analyze large amounts of data that currently remain unexplored. The first species we are working with are Arctic cod and sablefish. These are a few of the species studied by Francis Juanes and Amalis Riera, biologists in the MERIDIAN team based at the University of Victoria, who closely guide our development efforts and work to gather the necessary data in natural and controlled environments.<\/p>\n\n\n\n<figure class=\"wp-block-image size-large\"><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"594\" src=\"https:\/\/meridian.cs.dal.ca\/wp-content\/uploads\/2019\/12\/Projects_Detection-1024x594.jpeg\" alt=\"\" class=\"wp-image-1458\" srcset=\"https:\/\/meridian.cs.dal.ca\/wp-content\/uploads\/2019\/12\/Projects_Detection-1024x594.jpeg 1024w, https:\/\/meridian.cs.dal.ca\/wp-content\/uploads\/2019\/12\/Projects_Detection-300x174.jpeg 300w, https:\/\/meridian.cs.dal.ca\/wp-content\/uploads\/2019\/12\/Projects_Detection-768x446.jpeg 768w, https:\/\/meridian.cs.dal.ca\/wp-content\/uploads\/2019\/12\/Projects_Detection.jpeg 1280w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" \/><figcaption 
class=\"wp-element-caption\">Waveforms representing the characteristic sounds of pollock, Arctic cod, and sablefish<\/figcaption><\/figure>\n\n\n\n<h3 class=\"wp-block-heading\">Marine Mammals<\/h3>\n\n\n\n<p>Using a combination of convolutional and recurrent neural networks, we are developing deep learning models to detect and classify several whale species, including right, sei, fin, and humpback whales. We collaborate with several groups and institutions that provide us with data and expertise. Dr. Chris Taggart and Dr. Kim Davies from the oceanography department at Dalhousie University, for example, have been studying the endangered North Atlantic right whale using underwater acoustics and a variety of other methods. They provided data collected by autonomous underwater vehicles, which we are using to develop and evaluate our detectors. Similarly, the MERIDIAN team leader in Rimouski, Dr. Yvan Simard, and our collaborators at the Bedford Institute of Oceanography and Ocean Networks Canada are all providing data sets and guidance to make sure our efforts are well aligned with the needs of the underwater acoustics community.<\/p>\n\n\n\n<p>Our first deep learning model, released in 2020, has been trained to detect the characteristic &#8220;upcall&#8221; of the endangered North Atlantic right whale. 
You can learn more about the model and its performance in the paper <a href=\"https:\/\/asa.scitation.org\/doi\/10.1121\/10.0001132\">&#8220;Performance of a deep neural network at detecting North Atlantic right whale upcalls&#8221;<\/a>, published in the Journal of the Acoustical Society of America.<\/p>\n\n\n<div class=\"wp-block-image\">\n<figure class=\"aligncenter size-large is-resized\"><img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/meridian.cs.dal.ca\/wp-content\/uploads\/2019\/10\/sho-hatakeyama-Cu6I_d8gw5A-unsplash-1024x683.jpg\" alt=\"\" class=\"wp-image-25\" width=\"512\" height=\"342\" srcset=\"https:\/\/meridian.cs.dal.ca\/wp-content\/uploads\/2019\/10\/sho-hatakeyama-Cu6I_d8gw5A-unsplash-1024x683.jpg 1024w, https:\/\/meridian.cs.dal.ca\/wp-content\/uploads\/2019\/10\/sho-hatakeyama-Cu6I_d8gw5A-unsplash-300x200.jpg 300w, https:\/\/meridian.cs.dal.ca\/wp-content\/uploads\/2019\/10\/sho-hatakeyama-Cu6I_d8gw5A-unsplash-768x512.jpg 768w, https:\/\/meridian.cs.dal.ca\/wp-content\/uploads\/2019\/10\/sho-hatakeyama-Cu6I_d8gw5A-unsplash-150x100.jpg 150w\" sizes=\"auto, (max-width: 512px) 100vw, 512px\" \/><\/figure><\/div>\n\n\n<h3 class=\"wp-block-heading\">Motorized Marine Vessels<\/h3>\n\n\n\n<p>\n\nIn the United States, a research team led by Dr. Neil Hammerschlag of the Rosenstiel School of Marine and Atmospheric Science at the University of Miami is studying the effects of coastal urbanization on the distribution, movements, and health of sharks. The study, known as the Urban Sharks project, focuses on the northern part of Biscayne Bay. 
This region is heavily affected by Miami&#8217;s population of 2.5 million, notably through intense recreational boating traffic.\n<\/p>\n\n\n\n<p>Together with the Ocean Tracking Network (OTN), Neil and his team have deployed a tracking array, which is being used to study the distribution and movements of individual tagged sharks. Moreover, they have deployed a number of broadband hydrophones, which they use to listen to underwater sounds, both man-made and from marine life.<\/p>\n\n\n<div class=\"wp-block-image\">\n<figure class=\"aligncenter size-large is-resized\"><img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/meridian.cs.dal.ca\/wp-content\/uploads\/2019\/12\/Projects_AcousticMonitoring-1-1024x728.jpg\" alt=\"\" class=\"wp-image-1465\" width=\"512\" height=\"364\" srcset=\"https:\/\/meridian.cs.dal.ca\/wp-content\/uploads\/2019\/12\/Projects_AcousticMonitoring-1-1024x728.jpg 1024w, https:\/\/meridian.cs.dal.ca\/wp-content\/uploads\/2019\/12\/Projects_AcousticMonitoring-1-300x213.jpg 300w, https:\/\/meridian.cs.dal.ca\/wp-content\/uploads\/2019\/12\/Projects_AcousticMonitoring-1-768x546.jpg 768w, https:\/\/meridian.cs.dal.ca\/wp-content\/uploads\/2019\/12\/Projects_AcousticMonitoring-1.jpg 1200w\" sizes=\"auto, (max-width: 512px) 100vw, 512px\" \/><figcaption class=\"wp-element-caption\">Hammerhead sharks are among the shark species found in Biscayne Bay<\/figcaption><\/figure><\/div>\n\n\n<p>\nMERIDIAN is supporting Neil and his team in analyzing the broadband hydrophone data and has developed software to detect the underwater noise produced by boats. This &#8220;boat detector&#8221; has been successfully applied to quantify recreational boating activity in Biscayne Bay. 
Valuable data have been collected on the disturbances caused by boats, which can then be compared with shark habitat and activity levels.\n<\/p>\n\n\n\n<p>We are now working on extending the capabilities of our boat detector to distinguish between different types of boating activity and different engine types. We will also be taking a closer look at the sounds made by marine animals in an attempt to quantify the biodiversity in the bay.<\/p>\n\n\n\n<figure class=\"wp-block-gallery has-nested-images columns-default is-cropped aligncenter blog_gallery wp-block-gallery-1 is-layout-flex wp-block-gallery-is-layout-flex\">\n<figure class=\"wp-block-image size-large\"><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"768\" data-id=\"642\" src=\"https:\/\/meridian.cs.dal.ca\/wp-content\/uploads\/2019\/10\/DSCN2598-1024x768.jpg\" alt=\"\" class=\"wp-image-642\" srcset=\"https:\/\/meridian.cs.dal.ca\/wp-content\/uploads\/2019\/10\/DSCN2598-1024x768.jpg 1024w, https:\/\/meridian.cs.dal.ca\/wp-content\/uploads\/2019\/10\/DSCN2598-300x225.jpg 300w, https:\/\/meridian.cs.dal.ca\/wp-content\/uploads\/2019\/10\/DSCN2598-768x576.jpg 768w, https:\/\/meridian.cs.dal.ca\/wp-content\/uploads\/2019\/10\/DSCN2598.jpg 2000w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" \/><\/figure>\n\n\n\n<figure class=\"wp-block-image size-large\"><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"768\" data-id=\"643\" src=\"https:\/\/meridian.cs.dal.ca\/wp-content\/uploads\/2019\/10\/DSCN4016-1024x768.jpg\" alt=\"\" class=\"wp-image-643\" srcset=\"https:\/\/meridian.cs.dal.ca\/wp-content\/uploads\/2019\/10\/DSCN4016-1024x768.jpg 1024w, https:\/\/meridian.cs.dal.ca\/wp-content\/uploads\/2019\/10\/DSCN4016-300x225.jpg 300w, https:\/\/meridian.cs.dal.ca\/wp-content\/uploads\/2019\/10\/DSCN4016-768x576.jpg 768w, 
https:\/\/meridian.cs.dal.ca\/wp-content\/uploads\/2019\/10\/DSCN4016.jpg 2000w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" \/><\/figure>\n<\/figure>\n\n\n\n<p class=\"has-text-align-center caption has-small-font-size\">Hydrophone on the sandy bottom of Biscayne Bay (left); deployment of a hydrophone (right)<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">Kedgi &#8211; An interactive training app<\/h3>\n\n\n\n<p>One of the benefits of using deep learning to develop detectors and classifiers is that we can take advantage of the high plasticity of neural networks to create a model that can be tailored as needed. MERIDIAN is developing an application called <em>Kedgi<\/em> that will allow users to interact with a neural network as it is training, providing feedback on its performance and letting the model use the expert&#8217;s input to achieve better results. Starting with a pre-trained model, the user can apply it to a sample of a new dataset and evaluate its performance. If the results are not satisfactory, the user can feed the model more data and tell it whether its outputs are correct. During this process, the application collects the user&#8217;s inputs to improve the model&#8217;s detection and classification abilities. This can be very useful when a pre-trained model performs well in a given scenario (e.g., detecting humpback whales in an area with shipping noise) but performs worse in a new environment (e.g., an area with high levels of seismic noise). With the help of the user, the model can adapt to the new environment more quickly. An important feature of this application is that it also works as an annotation platform, annotation being a common task in the bioacoustician&#8217;s workflow. This way, a neural network can also learn by watching how the analyst does their work. 
The current version of <em>Kedgi<\/em> provides only a set of basic functionalities, allowing users to run models on audio files and inspect the detections, but we are working hard to implement more interactive features!<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">Ketos &#8211; A deep learning software library<\/h3>\n\n\n\n<p>Most of our tools are directed at users who don&#8217;t have advanced machine learning and software development skills: the pre-trained models and apps we are working on do not require any programming or advanced computer skills. But we also want to make sure that developers working with underwater acoustics can take advantage of our work. To that end, we release all the code we use to develop our models as an open-source library. It contains not only the neural network architectures that we find most useful for producing detectors and classifiers, but also algorithms for data augmentation, utilities for dealing with large datasets, several signal processing algorithms, and more. Those who want to train a new model from scratch, try a variation of a network architecture, or simply dig into the code base to see how things are done can find out more <a href=\"https:\/\/docs.meridian.cs.dal.ca\/ketos\/\">here<\/a>. 
Developers will also find extensive documentation with tutorials, a testing suite, and instructions on how to contribute.<\/p>\n\n\n<div class=\"wp-block-image\">\n<figure class=\"aligncenter size-large is-resized\"><img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/meridian.cs.dal.ca\/wp-content\/uploads\/2019\/11\/ketos_logo.png\" alt=\"\" class=\"wp-image-971\" width=\"256\" height=\"204\"\/><\/figure><\/div>\n\n\n<p> <\/p>\n<\/div>\n\n\n\n<div class=\"wp-block-column link_column is-layout-flow wp-block-column-is-layout-flow\" style=\"flex-basis:33.33%\">\n<h4 class=\"wp-block-heading\">Tools &amp; Services<\/h4>\n\n\n\n<p><a href=\"https:\/\/meridian.cs.dal.ca\/fr\/2015\/04\/12\/ketos\/\">Ketos<\/a><\/p>\n\n\n\n<p><a href=\"https:\/\/gitlab.meridian.cs.dal.ca\/public_projects\/boat_detector\">Boat detector<\/a><\/p>\n\n\n\n<p><a href=\"https:\/\/docs.meridian.cs.dal.ca\/kedgi\/\">Kedgi<\/a><\/p>\n\n\n\n<p><a href=\"https:\/\/meridian.cs.dal.ca\/fr\/north-atlantic-right-whale-narw-call-detection-tool\/\">NARW Call Detection Tool<\/a><\/p>\n\n\n\n<p><\/p>\n<\/div>\n<\/div>\n\n\n\n<p><\/p>","protected":false},"excerpt":{"rendered":"<p>Many marine species have evolved to rely primarily on sound for underwater navigation, prey detection, and communication. Therefore, marine biologists listen to the sounds generated by marine animals, for example, to detect the presence of an endangered species or to study social behavior. 
Increasing numbers of underwater listening and recording [&hellip;]<\/p>\n","protected":false},"author":4,"featured_media":0,"parent":0,"menu_order":0,"comment_status":"closed","ping_status":"closed","template":"","meta":{"footnotes":""},"class_list":["post-1708","page","type-page","status-publish","hentry"],"_links":{"self":[{"href":"https:\/\/meridian.cs.dal.ca\/fr\/wp-json\/wp\/v2\/pages\/1708","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/meridian.cs.dal.ca\/fr\/wp-json\/wp\/v2\/pages"}],"about":[{"href":"https:\/\/meridian.cs.dal.ca\/fr\/wp-json\/wp\/v2\/types\/page"}],"author":[{"embeddable":true,"href":"https:\/\/meridian.cs.dal.ca\/fr\/wp-json\/wp\/v2\/users\/4"}],"replies":[{"embeddable":true,"href":"https:\/\/meridian.cs.dal.ca\/fr\/wp-json\/wp\/v2\/comments?post=1708"}],"version-history":[{"count":14,"href":"https:\/\/meridian.cs.dal.ca\/fr\/wp-json\/wp\/v2\/pages\/1708\/revisions"}],"predecessor-version":[{"id":2664,"href":"https:\/\/meridian.cs.dal.ca\/fr\/wp-json\/wp\/v2\/pages\/1708\/revisions\/2664"}],"wp:attachment":[{"href":"https:\/\/meridian.cs.dal.ca\/fr\/wp-json\/wp\/v2\/media?parent=1708"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}