# VideoView

The video view can play several kinds of video streams:

  • compressed video: H264, H265, VP9, VP8, etc.
  • MJPEG

To do this, it first detects the compression in use (provided by the schema description or GetResultTemplate) and then delegates to one of these three internal views (see the sketch after this list):

  • Mjpeg: for MJPEG data
  • FFMPEG: for all other types of compressed data
  • VideoCodecApi: still experimental in Chrome
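
As an illustration only, the selection logic can be sketched as follows; the function and return values are hypothetical, not the actual osh-js internals:

```js
// Hypothetical sketch of the decoder selection, based on the description above.
// The names below are illustrative, not the actual osh-js internals.
function selectInternalView(compression, useWebCodecApi) {
  if (/jpeg/i.test(compression)) {
    return 'Mjpeg';          // MJPEG frames are plain JPEG images
  }
  if (useWebCodecApi) {
    return 'VideoCodecApi';  // experimental, hardware-accelerated path
  }
  return 'FFMPEG';           // software decoding for all other codecs
}
```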

# FFMPEG

The FFMPEG view allows you to display video-type data. The lib is based on FFMPEG.js, which is a port of the native FFmpeg C library compiled using Emscripten.

More information: https://github.com/mdhsl/ffmpeg.js


API Reference

# Supported layers

The view supports the following layer types:

  • videoData

# Limitation

The view can theoretically decode all video stream types supported by FFMPEG. However, due to the limitations of Emscripten and WebAssembly, the JavaScript library does not support hardware decoding, so all processing is handled by the CPU.

Even so, it is possible to display almost twenty H264 videos in parallel, since the software decoder takes advantage of optimized CPU instructions.

# Example

```js
import SosGetResult from 'osh-js/core/datasource/sos/SosGetResult.datasource.js';
import VideoView from 'osh-js/core/ui/view/video/VideoView.js';
import VideoDataLayer from "osh-js/core/ui/layer/VideoDataLayer";
import DataSynchronizer from "osh-js/core/timesync/DataSynchronizer";
import {Mode} from "osh-js/core/datasource/Mode";

const REPLAY_SPEED = 2.5;

// create data source for UAV camera
let videoDataSource = new SosGetResult("drone-Video", {
  endpointUrl: "sensiasoft.net/sensorhub/sos",
  offeringID: "urn:mysos:solo:video2",
  observedProperty: "http://sensorml.com/ont/swe/property/VideoFrame",
  startTime: "2015-12-19T21:04:30Z",
  endTime: "2015-12-19T21:09:19Z",
  mode: Mode.REPLAY,
  tls: true
});

// show it in video view using FFMPEG JS decoder
let videoView = new VideoView({
  container: 'video-h264-container',
  css: 'video-h264',
  name: 'UAV Video',
  framerate: 25,
  showTime: true,
  showStats: true,
  useWebCodecApi: false,
  layers: [
    new VideoDataLayer({
      dataSourceId: videoDataSource.id,
      getFrameData: (rec) => rec.videoFrame,
      getTimestamp: (rec) => rec.timestamp
    })
  ]
});

// start streaming
const dataSynchronizer = new DataSynchronizer({
  masterTimeRefreshRate: 250,
  startTime: "2015-12-19T21:04:30Z",
  endTime: "2015-12-19T21:09:19Z",
  replaySpeed: REPLAY_SPEED,
  dataSources: [
    videoDataSource
  ]
});
dataSynchronizer.connect();
```



# MJPEG

The MJPEG view displays a video stream made up of images in JPEG format.
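
Since each MJPEG frame is a standalone JPEG image, a frame can be displayed without any dedicated decoder. A minimal sketch, independent of osh-js:

```js
// Minimal sketch (independent of osh-js): render one MJPEG frame,
// i.e. a standalone JPEG image, into an <img> element.
function drawJpegFrame(imgElement, frameBytes) {
  const blob = new Blob([frameBytes], {type: 'image/jpeg'});
  const url = URL.createObjectURL(blob);
  imgElement.onload = () => URL.revokeObjectURL(url); // free memory once drawn
  imgElement.src = url;
}
```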


API Reference

# Supported layers

The view supports the following layer types:

  • videoData

# Example

```js
import SosGetResult from 'osh-js/core/datasource/sos/SosGetResult.datasource.js';
import VideoView from 'osh-js/core/ui/view/video/VideoView';
import VideoDataLayer from 'osh-js/core/ui/layer/VideoDataLayer';
import {Mode} from 'osh-js/core/datasource/Mode';
import DataSynchronizer from 'osh-js/core/timesync/DataSynchronizer';

let startTime = "2015-02-16T07:57:59.447Z";
let endTime = "2015-02-16T08:09:00Z";

// create data source for Android phone camera
let videoDataSource = new SosGetResult("android-Video", {
    endpointUrl: "sensiasoft.net/sensorhub/sos",
    offeringID: "urn:android:device:060693280a28e015-sos",
    observedProperty: "http://sensorml.com/ont/swe/property/VideoFrame",
    startTime: startTime,
    endTime: endTime,
    mode: Mode.REPLAY,
    tls: true,
    prefetchBatchDuration: 5000,
    timeShift: -16000
});

// show it in video view
let videoView = new VideoView({
    container: "video-mjpeg-container",
    css: "video-mjpeg",
    name: "Android Video",
    keepRatio: true,
    showTime: true,
    layers: [
        new VideoDataLayer({
            dataSourceId: videoDataSource.id,
            getFrameData: (rec) => rec.videoFrame,
            getTimestamp: (rec) => rec.timestamp
        })
    ]
});

// start streaming
const dataSynchronizer = new DataSynchronizer({
    masterTimeRefreshRate: 250,
    replaySpeed: 1.0,
    startTime: startTime,
    endTime: endTime,
    dataSources: [
        videoDataSource
    ]
});
dataSynchronizer.connect();
```




# Video Codec API beta

New APIs, such as the WebCodecs API, are being released that allow more flexible support for video streaming.

Details of the API can be found here: https://github.com/w3c/webcodecs and here: https://web.dev/webcodecs/.

Full specification: https://www.w3.org/TR/webcodecs/

Among other things, this new API makes it possible to decode video frames using hardware acceleration.
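
For illustration, here is a minimal sketch of the raw WebCodecs API, independent of osh-js; the canvas element and the callback wiring are assumptions:

```js
// Minimal WebCodecs sketch (independent of osh-js): decode H264 chunks
// and paint each decoded frame on a canvas.
const canvas = document.getElementById('video-canvas'); // assumed element
const ctx = canvas.getContext('2d');

const decoder = new VideoDecoder({
  output: (frame) => {
    ctx.drawImage(frame, 0, 0, canvas.width, canvas.height);
    frame.close(); // release the frame's memory as soon as possible
  },
  error: (e) => console.error(e)
});
decoder.configure({codec: 'avc1.42e01e'}); // default H264 profile (see below)

// called for each encoded access unit received from the data source
function onEncodedFrame(data, timestampMicros, isKeyFrame) {
  decoder.decode(new EncodedVideoChunk({
    type: isKeyFrame ? 'key' : 'delta',
    timestamp: timestampMicros,
    data
  }));
}
```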


API Reference

# Supported layers

The view supports the following layer types:

  • data

# Limitation

WARNING

This experimental version can be used in Chrome >= 94 and supports only H264, VP8 & VP9. The VideoDecoder uses a lot of memory, which may lead to issues when playing multiple videos at the same time.

# Codec

By default, the following profiles are used for the codecs:

  • vp9: 'vp09.02.10.10.01.09.16.09.01'
  • vp8: 'vp08.00.41.08'
  • h264: 'avc1.42e01e'

The codec name can be passed as a parameter, but the profile cannot be overridden yet.

By default, the h264 codec is used.
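
For example, a hedged sketch of selecting another codec; it assumes the view exposes a `codec` option as described above:

```js
import VideoView from 'osh-js/core/ui/view/video/VideoView.js';

// Hedged sketch: request the VP9 profile listed above by passing the codec
// name to the view (assumes a `codec` option, per the description above).
let vp9View = new VideoView({
  container: 'video-vp9-container',
  name: 'VP9 Video',
  useWebCodecApi: true,
  codec: 'vp9', // resolves to 'vp09.02.10.10.01.09.16.09.01' by default
  layers: [
    // VideoDataLayer as in the examples above
  ]
});
```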

# Architecture

The WebCodecs API is used to decode frames using the VideoDecoder. In addition, the VideoDecoder runs inside a web worker to optimize performance.
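
A minimal sketch of that pattern, independent of the actual osh-js worker (the file names and message shape are assumptions):

```js
// main.js — transfer encoded chunks to a worker that owns the VideoDecoder.
const worker = new Worker('decoder.worker.js'); // assumed file name
function onVideoData(buffer, timestamp) {
  // transfer the ArrayBuffer instead of copying it
  worker.postMessage({buffer, timestamp}, [buffer]);
}

// decoder.worker.js — WebCodecs is also available in dedicated workers.
const decoder = new VideoDecoder({
  output: (frame) => {
    // e.g. draw to an OffscreenCanvas, then release the frame
    frame.close();
  },
  error: (e) => console.error(e)
});
decoder.configure({codec: 'avc1.42e01e'});
onmessage = (e) => {
  decoder.decode(new EncodedVideoChunk({
    type: 'key', // real code must distinguish key and delta frames
    timestamp: e.data.timestamp,
    data: e.data.buffer
  }));
};
```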