{"id":616,"date":"2015-10-31T17:16:22","date_gmt":"2015-10-31T17:16:22","guid":{"rendered":"https:\/\/sites.cs.dal.ca\/meridian-rebuild\/?p=616"},"modified":"2019-12-10T15:16:50","modified_gmt":"2019-12-10T15:16:50","slug":"directional-acoustic-data-visualization","status":"publish","type":"post","link":"https:\/\/meridian.cs.dal.ca\/fr\/2015\/10\/31\/directional-acoustic-data-visualization\/","title":{"rendered":"Visualisation de donn\u00e9es acoustiques directionnelles"},"content":{"rendered":"\n<div class=\"wp-block-columns alignwide is-layout-flex wp-container-core-columns-is-layout-28f84493 wp-block-columns-is-layout-flex\">\n<div class=\"wp-block-column content_column is-layout-flow wp-block-column-is-layout-flow\" style=\"flex-basis:66.66%\">\n<p>\nMERIDIAN is working with JASCO to design a tool for visualizing \nunderwater acoustic data with a directional component that indicates \nbearing for use by biologists and other researchers. This system will \nvisualize data collected from hydrophone arrays that are capable of \ndiscerning the bearing of sound sources, which can help users \ndistinguish between animals and track movement of vessels and wildlife. \nWhere acoustic data is normally presented to biologists in the form of \n2D Time versus Frequency spectrograms which use colour to indicate \nintensity, this system must also present the directional information in a\n way that is intuitive for the user. Experimentation with directional \ncolours and filtering techniques has shown promise, but we are \ninvestigating solutions that take advantage of interaction, \n3-dimensional views, and accompanying Time versus Direction plotting and\n animations. 
This system will allow biologists to analyze, classify, and annotate data, and to prepare reports for presentation.\n<\/p>\n\n\n\n<div class=\"wp-block-image\"><figure class=\"alignleft size-large is-resized\"><img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/meridian.cs.dal.ca\/wp-content\/uploads\/2019\/12\/Projects_DirectionalDiagram.png\" alt=\"\" class=\"wp-image-1459\" width=\"217\" height=\"267\" srcset=\"https:\/\/meridian.cs.dal.ca\/wp-content\/uploads\/2019\/12\/Projects_DirectionalDiagram.png 770w, https:\/\/meridian.cs.dal.ca\/wp-content\/uploads\/2019\/12\/Projects_DirectionalDiagram-243x300.png 243w, https:\/\/meridian.cs.dal.ca\/wp-content\/uploads\/2019\/12\/Projects_DirectionalDiagram-768x949.png 768w\" sizes=\"auto, (max-width: 217px) 100vw, 217px\" \/><figcaption>The application structure: a web-based front-end using the open-source D3.js and ThreeJS libraries with a Flask\/Python back-end.<\/figcaption><\/figure><\/div>\n\n\n\n<p>The application presents acoustic data in a traditional web-based spectrogram viewer built with D3. Using ThreeJS, a secondary view provides a 3D Time vs Bearing plot. In this plot, the magnitudes associated with each bearing at each time step are summed, indicating the directionality of sound sources. This plot allows the user to filter by specific bearing angles and update the corresponding spectrogram with only the sound associated with those bearings. All calculations are done in the back-end with Flask and Python. Users can adjust time and frequency spans, filter by bearing angle, and download snapshots of both views. Users can also drag back and forth along the spectrogram\u2019s time axis to move the camera\u2019s position in the 3D viewer. 
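The summation and filtering steps described above can be sketched in back-end Python. This is a minimal illustration, not the project's actual code: the array layout (time step x frequency bin x bearing angle) and the function names `time_vs_bearing` and `filter_by_bearing` are assumptions made for the example.

```python
import numpy as np

def time_vs_bearing(mag):
    """Sum magnitudes over frequency bins, giving one value per
    (time step, bearing) pair for the 3D Time vs Bearing plot."""
    return mag.sum(axis=1)

def filter_by_bearing(mag, bearings_deg, lo, hi):
    """Rebuild a Time vs Frequency spectrogram using only the energy
    arriving from bearings in the interval [lo, hi] degrees."""
    mask = (bearings_deg >= lo) & (bearings_deg <= hi)
    return mag[:, :, mask].sum(axis=2)

# Toy example: 4 time steps, 3 frequency bins, bearings every 90 degrees.
bearings = np.array([0.0, 90.0, 180.0, 270.0])
mag = np.ones((4, 3, 4))

tb = time_vs_bearing(mag)                             # shape (4, 4)
spec = filter_by_bearing(mag, bearings, 45.0, 225.0)  # keeps 90 and 180
```

A filtered spectrogram computed this way can be sent back to the D3 front-end whenever the user adjusts the bearing selection in the 3D view.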
<\/p>\n\n\n\n<figure class=\"wp-block-gallery columns-2 is-cropped alignwide blog_gallery wp-block-gallery-1 is-layout-flex wp-block-gallery-is-layout-flex\"><ul class=\"blocks-gallery-grid\"><li class=\"blocks-gallery-item\"><figure><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"560\" src=\"https:\/\/meridian.cs.dal.ca\/wp-content\/uploads\/2019\/10\/dav-website2-1-1024x560.jpg\" alt=\"\" data-id=\"617\" data-link=\"https:\/\/meridian.cs.dal.ca\/dav-website2-1\/\" class=\"wp-image-617\" srcset=\"https:\/\/meridian.cs.dal.ca\/wp-content\/uploads\/2019\/10\/dav-website2-1-1024x560.jpg 1024w, https:\/\/meridian.cs.dal.ca\/wp-content\/uploads\/2019\/10\/dav-website2-1-300x164.jpg 300w, https:\/\/meridian.cs.dal.ca\/wp-content\/uploads\/2019\/10\/dav-website2-1-768x420.jpg 768w, https:\/\/meridian.cs.dal.ca\/wp-content\/uploads\/2019\/10\/dav-website2-1.jpg 1643w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" \/><\/figure><\/li><li class=\"blocks-gallery-item\"><figure><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"558\" src=\"https:\/\/meridian.cs.dal.ca\/wp-content\/uploads\/2019\/10\/dav-website3-1-1024x558.jpg\" alt=\"\" data-id=\"618\" data-link=\"https:\/\/meridian.cs.dal.ca\/dav-website3-1\/\" class=\"wp-image-618\" srcset=\"https:\/\/meridian.cs.dal.ca\/wp-content\/uploads\/2019\/10\/dav-website3-1-1024x558.jpg 1024w, https:\/\/meridian.cs.dal.ca\/wp-content\/uploads\/2019\/10\/dav-website3-1-300x164.jpg 300w, https:\/\/meridian.cs.dal.ca\/wp-content\/uploads\/2019\/10\/dav-website3-1-768x419.jpg 768w, https:\/\/meridian.cs.dal.ca\/wp-content\/uploads\/2019\/10\/dav-website3-1.jpg 1641w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" \/><\/figure><\/li><\/ul><\/figure>\n\n\n\n<p class=\"has-text-align-center has-small-font-size caption\">Screenshots of the spectrogram and 3D views (left), and a spectrogram filtered for bearing angle (right). 
<\/p>\n\n\n\n<p>\nThe machine learning portion of this project will investigate whether directional acoustic data alone can be used to accurately estimate the position (range and bearing) of a sound source, using Automatic Identification System (AIS) satellite data to generate the training data set for a fixed mooring. Research between 2017 and 2019 suggests that sound source localization using acoustic data alone is possible using neural networks [1][2][3]. \n<\/p>\n\n\n\n<p class=\"has-small-font-size footnotes\">\n[1] H. Niu, E. Ozanich, P. Gerstoft, \u201cShip localization in Santa Barbara Channel using machine learning classifiers\u201d, The Journal of the Acoustical Society of America 142, EL455, 2017, <a href=\"https:\/\/doi.org\/10.1121\/1.5010064\" target=\"_blank\" rel=\"noreferrer noopener\">https:\/\/doi.org\/10.1121\/1.5010064<\/a><br>\n[2] Y. Wang, H. Peng, \u201cUnderwater acoustic source localization using generalized regression neural network\u201d, The Journal of the Acoustical Society of America 143, 2018, <a href=\"https:\/\/doi.org\/10.1121\/1.5032311\" target=\"_blank\" rel=\"noreferrer noopener\">https:\/\/doi.org\/10.1121\/1.5032311<\/a><br>\n[3] H. Niu, Z. Gong, E. Ozanich, P. Gerstoft, H. Wang, Z. Li, \u201cDeep learning for ocean acoustic source localization using one sensor\u201d, The Journal of the Acoustical Society of America, 2019. [Online]. 
\nAvailable: <a href=\"https:\/\/arxiv.org\/abs\/1903.12319\" target=\"_blank\" rel=\"noreferrer noopener\">https:\/\/arxiv.org\/abs\/1903.12319<\/a>,  [Accessed: April 7, 2019]\n\n<\/p>\n<\/div>\n\n\n\n<div class=\"wp-block-column link_column is-layout-flow wp-block-column-is-layout-flow\" style=\"flex-basis:33.33%\">\n<h5 class=\"wp-block-heading\">Partners<\/h5>\n\n\n\n<ul class=\"link_list wp-block-list\"><li><a href=\"https:\/\/www.jasco.com\/\">JASCO<\/a><\/li><\/ul>\n<\/div>\n<\/div>\n","protected":false},"excerpt":{"rendered":"<p>MERIDIAN is working with JASCO to design a tool for visualizing underwater acoustic data with a directional component that indicates bearing for use by biologists and other researchers. This system will visualize data collected from hydrophone arrays that are capable of discerning the bearing of sound sources, which can help [&hellip;]<\/p>\n","protected":false},"author":2,"featured_media":559,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[8],"tags":[13,14],"class_list":["post-616","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-activities","tag-data-visualization","tag-web-service"],"_links":{"self":[{"href":"https:\/\/meridian.cs.dal.ca\/fr\/wp-json\/wp\/v2\/posts\/616","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/meridian.cs.dal.ca\/fr\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/meridian.cs.dal.ca\/fr\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/meridian.cs.dal.ca\/fr\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/meridian.cs.dal.ca\/fr\/wp-json\/wp\/v2\/comments?post=616"}],"version-history":[{"count":9,"href":"https:\/\/meridian.cs.dal.ca\/fr\/wp-json\/wp\/v2\/posts\/616\/revisions"}],"predecessor-version":[{"id":1490,"href":"https:\/\/meridian.cs.dal.ca\/fr\/wp-json\/wp\/v2\/posts\/616\/revisions\
/1490"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/meridian.cs.dal.ca\/fr\/wp-json\/wp\/v2\/media\/559"}],"wp:attachment":[{"href":"https:\/\/meridian.cs.dal.ca\/fr\/wp-json\/wp\/v2\/media?parent=616"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/meridian.cs.dal.ca\/fr\/wp-json\/wp\/v2\/categories?post=616"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/meridian.cs.dal.ca\/fr\/wp-json\/wp\/v2\/tags?post=616"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}