# Audio

The Audio view allows you to display audio-type data. New browser APIs such as the WebCodecs API are being released, allowing more flexible support for audio streaming.

Details of the API can be found at https://github.com/WICG/web-codecs and at https://wicg.github.io/web-codecs/.

If the browser does not support the WebCodecs API, the view falls back to the ffmpeg.js library.

This fallback is based on ffmpeg.js, a port of the native FFmpeg library to JavaScript using Emscripten. More information: https://github.com/mdhsl/ffmpeg.js
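As a rough sketch of the fallback logic (not the actual osh-js internals), the availability of the WebCodecs decoder can be feature-detected at runtime before choosing a decoding path:

```js
// Choose the decoding path at runtime: WebCodecs if the browser
// exposes AudioDecoder, otherwise the ffmpeg.js fallback.
const useWebCodecs = typeof AudioDecoder !== 'undefined';
console.log(useWebCodecs ? 'using WebCodecs' : 'falling back to ffmpeg.js');
```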

The view is composed of a decoder (either the WebCodecs API or ffmpeg.js) and one or more audio visualizers: equalizer, spectrogram, etc.


API Reference

# Supported layers

The view supports the following layer types:

  • audioData
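For illustration, here is how an audioData layer is typically configured (a sketch: the accessor functions depend on your datasource's record format, and the values below match the full example at the end of this page):

```js
import AudioDataLayer from 'osh-js/core/ui/layer/AudioDataLayer';

// `audioDataSource` is assumed to be an existing datasource producing
// audio records (see the full example below).
const audioLayer = new AudioDataLayer({
    dataSourceId: audioDataSource.id,
    getSampleRate: (rec) => rec.sampleRate, // sample rate in Hz
    getFrameData: (rec) => rec.samples,     // raw audio samples
    getTimestamp: (rec) => rec.timestamp    // record timestamp
});
```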

# Decoders

The view uses whichever decoder the browser supports: either the WebCodecs API or the ffmpeg.js library. By default, the 'aac' codec is used.
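For instance, support for the default codec can be probed with the standard `AudioDecoder.isConfigSupported()` call of the WebCodecs API (a sketch; the sample rate and channel count are illustrative):

```js
// Returns true if the browser can decode AAC-LC via WebCodecs.
async function isAacSupported() {
    if (typeof AudioDecoder === 'undefined') return false;
    const {supported} = await AudioDecoder.isConfigSupported({
        codec: 'mp4a.40.2',   // AAC-LC codec string
        sampleRate: 44100,    // illustrative value
        numberOfChannels: 1   // illustrative value
    });
    return supported;
}
```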

# WebCodecs API

The WebCodecs API uses hardware acceleration and is still experimental. It is necessary to enable the corresponding flags in Chrome as indicated at https://github.com/WICG/web-codecs
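For reference, a minimal decoding setup with the standard WebCodecs API looks like this (a sketch of the raw API, not the osh-js wrapper):

```js
// AudioDecoder hands each decoded chunk to the output callback.
const decoder = new AudioDecoder({
    output: (audioData) => {
        // audioData holds decoded PCM; osh-js would forward it
        // to the registered visualizers at this point.
        audioData.close();
    },
    error: (e) => console.error('decode error:', e)
});
decoder.configure({
    codec: 'mp4a.40.2',    // AAC-LC
    sampleRate: 44100,
    numberOfChannels: 1
});
// Encoded frames are then submitted as EncodedAudioChunk objects:
// decoder.decode(new EncodedAudioChunk({type: 'key', timestamp: 0, data}));
```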

# ffmpeg.js

Decoding is processed frame by frame by the ffmpeg.js library, following the principle shown in https://ffmpeg.org/doxygen/3.3/decode__audio_8c_source.html.
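Conceptually, the round-trip resembles the following (the worker script name and message format are hypothetical; check the ffmpeg.js README for the real interface):

```js
// Hypothetical frame-by-frame decode loop backed by a Web Worker.
const worker = new Worker('ffmpeg-audio-worker.js'); // hypothetical script
worker.onmessage = (event) => {
    const pcmFrame = event.data; // one decoded frame of PCM samples
    // ...forward to the visualizers
};
// Each encoded frame is posted individually, mirroring the
// send-packet / receive-frame loop of FFmpeg's decode_audio.c.
function decodeFrame(encodedFrame) {
    worker.postMessage(encodedFrame);
}
```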

# Visualizers

The view allows visualizing the audio data in both the frequency and time domains.

One or more visualizers can be added to the audio view. The audio view is responsible for decoding the data and forwarding the decoded AudioBuffer to every visualizer, each of which then displays the data independently. The audio view itself has no anchor in the DOM; it is the visualizers that must be attached to a div of the application.

To do this, we use the `container` parameter of the visualizer constructor.
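For example (a sketch reusing the container id from the full example below):

```js
import AudioFrequencyCanvasVisualizer
    from 'osh-js/core/ui/view/audio/visualizer/frequency/AudioFrequencyCanvasVisualizer';

// The visualizer renders into an existing <div> identified by its id,
// passed through the `container` parameter.
const visualizer = new AudioFrequencyCanvasVisualizer({
    fftSize: 32,
    container: 'canvas-frequency' // id of a <div> already present in the page
});
audioView.addVisualizer(visualizer);
```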

API Reference

There are several audio visualizers provided by default and grouped by type: time, frequency or spectrogram. Each group can have several implementations: chart.js, pure HTML5 canvas, three.js etc.

Here is the list:

Frequency:

  • AudioFrequencyCanvasVisualizer

    API Reference

  • AudioFrequencyChartJsVisualizer

    API Reference

Time:

  • AudioTimeCanvasVisualizer

    API Reference

  • AudioTimeChartJsVisualizer

    API Reference

Spectrogram:

  • AudioSpectrogramVisualizer

    API Reference

# Example

```js
import AudioView from 'osh-js/core/ui/view/audio/AudioView';
import AudioFrequencyCanvasVisualizer
    from 'osh-js/core/ui/view/audio/visualizer/frequency/AudioFrequencyCanvasVisualizer';
import AudioTimeCanvasVisualizer from 'osh-js/core/ui/view/audio/visualizer/time/AudioTimeCanvasVisualizer';
import AudioFrequencyChartJsVisualizer
    from 'osh-js/core/ui/view/audio/visualizer/frequency/AudioFrequencyChartJsVisualizer';
import AudioTimeChartJsVisualizer from 'osh-js/core/ui/view/audio/visualizer/time/AudioTimeChartJsVisualizer';
import AudioSpectrogramVisualizer from 'osh-js/core/ui/view/audio/visualizer/spectrogram/AudioSpectrogramVisualizer';
import DataSynchronizer from 'osh-js/core/timesync/DataSynchronizer';

// #region snippet_audio_datasource
import SosGetResult from 'osh-js/core/datasource/sos/SosGetResult.datasource.js';
import AudioDataLayer from 'osh-js/core/ui/layer/AudioDataLayer';
import {Mode} from 'osh-js/core/datasource/Mode';

// SOS datasource providing the recorded audio stream
let audioDataSource = new SosGetResult("alex-audio", {
    endpointUrl: "sensiasoft.net/sensorhub/sos",
    offeringID: "urn:android:device:dd90fceba7fd5b47-sos",
    observedProperty: "http://sensorml.com/ont/swe/property/AudioFrame",
    startTime: "2021-04-12T10:48:45Z",
    endTime: "2021-04-12T10:49:45Z",
    mode: Mode.REPLAY,
    tls: true
});

// Synchronizes datasource playback against a master clock
const dataSynchronizer = new DataSynchronizer({
  replaySpeed: 1.0,
  masterTimeRefreshRate: 250,
  startTime: "2021-04-12T10:48:45Z",
  endTime: "2021-04-12T10:49:45Z",
  dataSources: [
      audioDataSource
  ]
});

// The audio view decodes the stream and forwards it to the visualizers
let audioView = new AudioView({
    name: "Audio",
    css: 'audio-css',
    container: 'audio-chart-container',
    gain: 5,
    playSound: true,
    layers: [
        new AudioDataLayer({
            dataSourceId: audioDataSource.id,
            getSampleRate: (rec) => rec.sampleRate,
            getFrameData: (rec) => rec.samples,
            getTimestamp: (rec) => rec.timestamp
        })
    ]
});
// #endregion snippet_audio_datasource

// Each visualizer renders into its own container element
const audioCanvasFrequencyVisualizer = new AudioFrequencyCanvasVisualizer({
    fftSize: 32,
    barWidth: 20,
    css: 'audio-canvas',
    container: 'canvas-frequency'
});
const audioCanvasTimeVisualizer = new AudioTimeCanvasVisualizer({
    fftSize: 1024,
    css: 'audio-canvas',
    container: 'canvas-time'
});

const audioChartFrequencyVisualizer = new AudioFrequencyChartJsVisualizer({
    css: 'audio-canvas',
    fftSize: 32,
    container: 'chart-frequency',
    options: {},
    datasetOptions: {
        borderColor: 'rgba(0,0,0,0.5)',
        backgroundColor: 'rgba(210,210,210,0.8)',
        barThickness: 20,
        borderWidth: 1
    },
});

const audioChartTimeVisualizer = new AudioTimeChartJsVisualizer({
    css: 'audio-canvas',
    fftSize: 1024,
    container: 'chart-time'
});

const audioSpectrogramVisualizer = new AudioSpectrogramVisualizer({
    fftSize: 2048,
    container: 'spectrogram'
});

// Register all visualizers with the audio view
audioView.addVisualizer(audioCanvasFrequencyVisualizer);
audioView.addVisualizer(audioCanvasTimeVisualizer);
audioView.addVisualizer(audioChartFrequencyVisualizer);
audioView.addVisualizer(audioChartTimeVisualizer);
audioView.addVisualizer(audioSpectrogramVisualizer);

document.getElementById("listen").onclick = () => {
    dataSynchronizer.connect();
}

// Bind the gain slider to the view's gain
const inputChartElt = document.getElementById("input-range-chart");
inputChartElt.onchange = (event) => {
    document.getElementById("range-value-chart").innerText = inputChartElt.value;
    audioView.setGain(parseInt(inputChartElt.value));
}
```
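
The snippet above assumes the page already contains elements with the referenced ids. A minimal scaffold could be created programmatically like this (illustrative; static HTML with the same ids works just as well):

```js
// Containers the view and visualizers attach to, plus the
// "listen" button and the gain slider used by the handlers above.
for (const id of ['audio-chart-container', 'canvas-frequency', 'canvas-time',
                  'chart-frequency', 'chart-time', 'spectrogram']) {
    const div = document.createElement('div');
    div.id = id;
    document.body.appendChild(div);
}
const listenBtn = Object.assign(document.createElement('button'),
    {id: 'listen', innerText: 'Listen'});
const gainInput = Object.assign(document.createElement('input'),
    {id: 'input-range-chart', type: 'range', min: '1', max: '10', value: '5'});
const gainValue = Object.assign(document.createElement('span'),
    {id: 'range-value-chart'});
document.body.append(listenBtn, gainInput, gainValue);
```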