{"id":472,"date":"2024-09-20T15:49:42","date_gmt":"2024-09-20T15:49:42","guid":{"rendered":"https:\/\/blog.mozilla.org\/webrtc\/?p=472"},"modified":"2025-11-20T14:56:58","modified_gmt":"2025-11-20T14:56:58","slug":"unbundling-mediastreamtrackprocessor-and-videotrackgenerator","status":"publish","type":"post","link":"https:\/\/blog.mozilla.org\/webrtc\/unbundling-mediastreamtrackprocessor-and-videotrackgenerator\/","title":{"rendered":"Unbundling MediaStreamTrackProcessor and VideoTrackGenerator"},"content":{"rendered":"<p><meta name=\"twitter:image\" content=\"http:\/\/blog.mozilla.org\/webrtc\/files\/2024\/09\/Thumbnail.png\"><\/p>\n<p>Earlier this year we promised more blog posts about new and exciting WebRTC functionality in standards. So today we&#8217;re talking about <a href=\"https:\/\/www.w3.org\/TR\/mediacapture-transform\/#track-processor\"><code>MediaStreamTrackProcessor<\/code><\/a> and <a href=\"https:\/\/www.w3.org\/TR\/mediacapture-transform\/#video-track-generator\"><code>VideoTrackGenerator<\/code><\/a>! These two APIs let web developers convert a real time video <code>MediaStreamTrack<\/code> to and from a stream of <code>VideoFrame<\/code>s respectively. 
This simplifies real-time video processing in JavaScript.</p>
<p>These APIs are video-only and <a href="https://w3ctag.github.io/design-principles/#worker-only">worker-only</a>, which avoids <a href="https://docs.google.com/document/d/18OFTfCuZvcEbq_Xmt9VxMqEflV02Z8_2YODDXD_XdFg/edit#heading=h.m504bcvia657">jank</a> when scrolling: the browser's real-time media pipeline never blocks on a busy main thread. They're available in <a href="https://webkit.org/blog/15443/news-from-wwdc24-webkit-in-safari-18-beta/#webrtc">Safari 18 Tech Preview</a> and <a href="https://bugzil.la/1749532">will be in Firefox</a> by the beginning of next year.</p>
<p>Chrome, on the other hand, <a href="https://groups.google.com/a/chromium.org/g/blink-dev/c/oo6MQoRbDXk/m/7Kjjfe9NAwAJ">shipped</a> proprietary versions of these APIs back in 2021. Those are instead exposed on the main thread, and also handle audio. This is obviously not great for web interoperability. We'll look at both versions, and propose solutions for web developers wondering how to manage this discrepancy.</p>
<p>Both versions promise convenience and performance. But are they equally necessary? And what about audio? Let's explore.</p>
<h2>The standard APIs</h2>
<p>In the spec, <code>MediaStreamTrackProcessor({track})</code> must be created in a worker, so you <code>postMessage</code> your video track to the worker first.</p>
<p>In this simple example, the worker then pipes the processor's readable <code>VideoFrame</code>s through a transform and into a <code>VideoTrackGenerator</code>, whose <code>.track</code> member is <code>postMessage</code>d back to main and assigned to a video element.</p>
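<p>On the main thread, that setup might look like the following hedged sketch; the worker file name, function name, and message shape are assumptions for illustration, not taken from the fiddle:</p>

```javascript
// Main-thread sketch using spec-shaped APIs. "worker.js" is hypothetical.
// Browser-only wiring: defined here, only called from a page with a camera.
async function startSelfView(videoElement) {
  const worker = new Worker("worker.js"); // hypothetical worker file
  const stream = await navigator.mediaDevices.getUserMedia({ video: true });
  const [track] = stream.getVideoTracks();
  // The second argument *transfers* the track instead of cloning it.
  worker.postMessage({ track }, [track]);
  // The worker posts back the generator's track once its pipeline is up.
  worker.onmessage = ({ data }) => {
    videoElement.srcObject = new MediaStream([data.track]);
  };
}
```

<p>Nothing here runs until <code>startSelfView(video)</code> is called from a page with camera permission.</p>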
<p>Once this has been set up, the media flows in real time through the transform, off the main thread.</p>
<p><img decoding="async" loading="lazy" width="2832" height="582" src="http://blog.mozilla.org/webrtc/files/2024/09/DiagramMirroredSepia3.png" alt="" class="alignright size-full wp-image-476" srcset="https://blog.mozilla.org/webrtc/files/2024/09/DiagramMirroredSepia3.png 2832w, https://blog.mozilla.org/webrtc/files/2024/09/DiagramMirroredSepia3-250x51.png 250w, https://blog.mozilla.org/webrtc/files/2024/09/DiagramMirroredSepia3-700x144.png 700w, https://blog.mozilla.org/webrtc/files/2024/09/DiagramMirroredSepia3-768x158.png 768w, https://blog.mozilla.org/webrtc/files/2024/09/DiagramMirroredSepia3-1536x316.png 1536w, https://blog.mozilla.org/webrtc/files/2024/09/DiagramMirroredSepia3-2048x421.png 2048w, https://blog.mozilla.org/webrtc/files/2024/09/DiagramMirroredSepia3-120x25.png 120w" sizes="(max-width: 2832px) 100vw, 2832px" /><br />
If you're in Safari (18 Tech Preview or newer), click the <em>"Result"</em> tab to see two self-views, one regular and one <strong>sepia-toned and mirrored</strong>. Otherwise, don't worry: you'll see what it looks like in the <em>subsequent</em> example, which works in all browsers!</p>
<p><iframe loading="lazy" src="https://jsfiddle.net/jib1/hwxyzbc4/60/embedded/js,result/" allow="camera" width="100%" height="379px" frameborder="0"></iframe></p>
<p>See the <em>"JavaScript"</em> tab for the code.</p>
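<p>For a sense of the per-pixel work in such a transform, here is a hedged sketch of sepia toning as a pure function. The coefficient matrix is the commonly circulated sepia approximation, not necessarily the exact values used in the fiddle:</p>

```javascript
// Pure per-pixel sepia: weighted sums of the RGB channels, clamped to 255.
// Coefficients are the widely used sepia matrix (an assumption here).
function sepia([r, g, b]) {
  const clamp = (v) => Math.min(255, Math.round(v));
  return [
    clamp(0.393 * r + 0.769 * g + 0.189 * b), // new red
    clamp(0.349 * r + 0.686 * g + 0.168 * b), // new green
    clamp(0.272 * r + 0.534 * g + 0.131 * b), // new blue
  ];
}

// In the real transform you'd draw each VideoFrame to an OffscreenCanvas,
// run math like this over ctx.getImageData(...).data, and build a new
// VideoFrame from the canvas. Mirroring needs no pixel loop at all: drawing
// with a negative horizontal scale on the canvas flips the image.
```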
<p>The second argument to <code>postMessage({track}, [track])</code> <em>transfers</em> the track to the worker. This is due to <code>MediaStreamTrack</code>'s custom cloning semantics, which might be simplified <a href="https://github.com/w3c/mediacapture-extensions/issues/58">in the future</a>.</p>
<p>The worker then hooks a JavaScript transform into the browser's real-time media pipeline like this:</p>
<pre>
await mstp.readable.pipeThrough(new TransformStream({transform})).pipeTo(vtg.writable);
</pre>
<p>The example's <code>transform</code> processes pixels to create a sepia-toned, mirrored version of the video using <code>OffscreenCanvas</code> (scroll down in the code above to reveal it).</p>
<p>That's it! The above APIs have consensus in the W3C, and Firefox plans to implement them.</p>
<h2>The proprietary APIs</h2>
<p>Chrome exposes an early version of <code>MediaStreamTrackProcessor</code> and an incompatible, differently named <code>MediaStreamTrackGenerator</code>. Both are main-thread only. This unfortunately violates W3C design principle <a href="https://w3ctag.github.io/design-principles/#worker-only">§ 10.2.1. Some APIs should only be exposed to dedicated workers</a>, which exists to avoid blocking rendering.</p>
<p>While web developers can still transfer the resulting <code>readable</code> and <code>writable</code> to a worker themselves, many don't. Even when they do, transferring a property of a source is not the same as transferring that source: the source and sink remain on the main thread <a href="https://github.com/whatwg/streams/issues/1124">according to the Streams spec</a>. Chrome tries to move the source and sink implicitly anyway, violating that spec. This likely breaks down in general. For example, it's unclear what happens if you split the stream in two using <a href="https://streams.spec.whatwg.org/#rs-tee"><code>tee()</code></a> and transfer one of the halves.
Unpredictable browser optimizations can lead to <a href="https://en.wikipedia.org/wiki/Action_at_a_distance_(computer_programming)">action at a distance</a>.</p>
<p>Another mismatch: while the spec's <code>VideoTrackGenerator</code> <strong><em>has a</em></strong> track, <code>MediaStreamTrackGenerator</code> <strong><em>is a</em></strong> track. And last, but not least, as the different name suggests, it also does audio.</p>
<p>Firefox is unlikely to implement these proprietary APIs, for the reasons given above. No convincing main-thread use cases have been demonstrated, and audio data is already exposed through the more powerful <a href="https://developer.mozilla.org/en-US/docs/Web/API/AudioWorkletNode">AudioWorkletNode</a> API.</p>
<p>The good news is that these main-thread APIs are redundant: they're easily polyfilled to work in all browsers today!</p>
<h2>Polyfilling the proprietary APIs, part 1: video</h2>
<p>This exercise shows how to do without the proprietary APIs by achieving functional parity with them.
Here's an example where processing is done on the main thread for brevity:</p>
<p><iframe loading="lazy" src="https://jsfiddle.net/jib1/jqm079a6/57/embedded/result,js/" allow="camera" width="100%" height="300px" frameborder="0"></iframe></p>
<p>It relies on the following polyfills in non-Chromium browsers (under <em>"Resources"</em> if you <em>"Edit in JSFiddle"</em>):</p>
<ol>
<li><a href="https://jan-ivar.github.io/polyfills/mediastreamtrackprocessor.js">mediastreamtrackprocessor.js</a></li>
<li><a href="https://jan-ivar.github.io/polyfills/mediastreamtrackgenerator.js">mediastreamtrackgenerator.js</a></li>
</ol>
<p>The polyfills handle both audio and video, and the performance is decent. For video, they rely on <code>OffscreenCanvas</code> and <code>canvas.captureStream</code> (not off-screen <a href="https://github.com/w3c/mediacapture-fromelement/issues/65">yet</a>), respectively. They can no doubt be optimized, e.g. by using workers to perform better in background tabs.</p>
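<p>Whether you need the polyfills at all can be feature-detected. A hedged sketch (the helper name is made up; load the polyfill script however fits your build):</p>

```javascript
// Returns true when the MediaStreamTrackProcessor class is missing from
// the given global scope, i.e. when the polyfill is needed. Passing the
// scope in makes the check testable outside a browser.
function needsTrackProcessorPolyfill(scope = globalThis) {
  return typeof scope.MediaStreamTrackProcessor === "undefined";
}

// In a page you might then conditionally inject a <script> tag pointing at
// the polyfill, e.g. (browser-only, not run here):
//   if (needsTrackProcessorPolyfill()) { /* append script element */ }
```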
<p>Hopefully they're helpful until Firefox and Chrome implement the spec.</p>
<h2>Polyfilling the proprietary APIs, part 2: audio</h2>
<p>Here's an audio example using the same polyfills:</p>
<p><iframe loading="lazy" src="https://jsfiddle.net/jib1/x54d0ayk/69/embedded/result,js/" allow="microphone" width="100%" height="300px" frameborder="0"></iframe><br />
<sub>1.) <strong>Update 2025:</strong> Requires Safari 26+, which has <code>AudioData</code> support.</sub></p>
<p>For audio, the polyfills use <code>AudioWorkletNode</code> in both directions. They could likely be optimized, e.g. by using <code>SharedArrayBuffer</code> to avoid copies, or workers to perform better in background tabs.</p>
<p>This performs quite well in Firefox, staying well below its audio callback budget. The amount of garbage collected might be reduced with reusable <code>SharedArrayBuffer</code>s given to <code>AudioData</code>, but for most practical purposes it shouldn't matter.</p>
<p>In any case, if performance is an issue, you're likely better off using <code>AudioWorkletNode</code> directly. You'll have more tools at your disposal, and more direct control over things like threading.</p>
<h2>Can the standard APIs be polyfilled?</h2>
<p>We think the standard <a href="https://www.w3.org/TR/mediacapture-transform/#track-processor"><code>MediaStreamTrackProcessor</code></a> and <a href="https://www.w3.org/TR/mediacapture-transform/#video-track-generator"><code>VideoTrackGenerator</code></a> APIs add value.
They satisfy W3C design principles, are more likely to perform reliably under heavy load across implementations, and don't suffer from action-at-a-distance problems.</p>
<p>In the future, we might be able to polyfill them the same way using a new <a href="https://github.com/w3c/mediacapture-fromelement/issues/65"><code>OffscreenCanvas.captureStream()</code></a> method, but it doesn't exist today.</p>
<p>A core advantage of the standard APIs is that they guide applications to move <code>MediaStreamTrack</code> off the main thread. <a href="https://bugzil.la/1749543">Firefox</a> and <a href="https://crbug.com/40058526">Chrome</a> still need to implement transfer of <code>MediaStreamTrack</code>, and that part is not easily polyfilled.</p>
<p>We look forward to the standard APIs being implemented in all browsers! Until then, perhaps the polyfills can help bridge the gap. We hope this has given you some ideas! As always, if you have comments, feel free to <a href="https://x.com/jibrewery/status/1837172179211014425">reach out on X</a>!</p>
These APIs are [&hellip;]","protected":false},"author":1399,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[1],"tags":[],"coauthors":[301098],"acf":[],"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v22.5 - https:\/\/yoast.com\/wordpress\/plugins\/seo\/ -->\n<title>Unbundling MediaStreamTrackProcessor and VideoTrackGenerator - Advancing WebRTC<\/title>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/blog.mozilla.org\/webrtc\/unbundling-mediastreamtrackprocessor-and-videotrackgenerator\/\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"Unbundling MediaStreamTrackProcessor and VideoTrackGenerator - Advancing WebRTC\" \/>\n<meta property=\"og:description\" content=\"Earlier this year we promised more blog posts about new and exciting WebRTC functionality in standards. So today we&#8217;re talking about MediaStreamTrackProcessor and VideoTrackGenerator! These two APIs let web developers convert a real time video MediaStreamTrack to and from a stream of VideoFrames respectively. This simplifies real time video processing in JavaScript. 
These APIs are [&hellip;]\" \/>\n<meta property=\"og:url\" content=\"https:\/\/blog.mozilla.org\/webrtc\/unbundling-mediastreamtrackprocessor-and-videotrackgenerator\/\" \/>\n<meta property=\"og:site_name\" content=\"Advancing WebRTC\" \/>\n<meta property=\"article:published_time\" content=\"2024-09-20T15:49:42+00:00\" \/>\n<meta property=\"article:modified_time\" content=\"2025-11-20T14:56:58+00:00\" \/>\n<meta property=\"og:image\" content=\"http:\/\/blog.mozilla.org\/webrtc\/files\/2024\/09\/DiagramMirroredSepia3.png\" \/>\n<meta name=\"author\" content=\"Jan-Ivar Bruaroey\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:label1\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data1\" content=\"Jan-Ivar Bruaroey\" \/>\n\t<meta name=\"twitter:label2\" content=\"Est. reading time\" \/>\n\t<meta name=\"twitter:data2\" content=\"5 minutes\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\/\/schema.org\",\"@graph\":[{\"@type\":\"WebPage\",\"@id\":\"https:\/\/blog.mozilla.org\/webrtc\/unbundling-mediastreamtrackprocessor-and-videotrackgenerator\/\",\"url\":\"https:\/\/blog.mozilla.org\/webrtc\/unbundling-mediastreamtrackprocessor-and-videotrackgenerator\/\",\"name\":\"Unbundling MediaStreamTrackProcessor and VideoTrackGenerator - Advancing 
WebRTC\",\"isPartOf\":{\"@id\":\"https:\/\/blog.mozilla.org\/webrtc\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\/\/blog.mozilla.org\/webrtc\/unbundling-mediastreamtrackprocessor-and-videotrackgenerator\/#primaryimage\"},\"image\":{\"@id\":\"https:\/\/blog.mozilla.org\/webrtc\/unbundling-mediastreamtrackprocessor-and-videotrackgenerator\/#primaryimage\"},\"thumbnailUrl\":\"http:\/\/blog.mozilla.org\/webrtc\/files\/2024\/09\/DiagramMirroredSepia3.png\",\"datePublished\":\"2024-09-20T15:49:42+00:00\",\"dateModified\":\"2025-11-20T14:56:58+00:00\",\"author\":{\"@id\":\"https:\/\/blog.mozilla.org\/webrtc\/#\/schema\/person\/f2eb9712b8d85b70aebe1faf24e731fd\"},\"breadcrumb\":{\"@id\":\"https:\/\/blog.mozilla.org\/webrtc\/unbundling-mediastreamtrackprocessor-and-videotrackgenerator\/#breadcrumb\"},\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\/\/blog.mozilla.org\/webrtc\/unbundling-mediastreamtrackprocessor-and-videotrackgenerator\/\"]}]},{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/blog.mozilla.org\/webrtc\/unbundling-mediastreamtrackprocessor-and-videotrackgenerator\/#primaryimage\",\"url\":\"https:\/\/blog.mozilla.org\/webrtc\/files\/2024\/09\/DiagramMirroredSepia3.png\",\"contentUrl\":\"https:\/\/blog.mozilla.org\/webrtc\/files\/2024\/09\/DiagramMirroredSepia3.png\",\"width\":2832,\"height\":582},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\/\/blog.mozilla.org\/webrtc\/unbundling-mediastreamtrackprocessor-and-videotrackgenerator\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Home\",\"item\":\"https:\/\/blog.mozilla.org\/webrtc\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"Unbundling MediaStreamTrackProcessor and VideoTrackGenerator\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\/\/blog.mozilla.org\/webrtc\/#website\",\"url\":\"https:\/\/blog.mozilla.org\/webrtc\/\",\"name\":\"Advancing WebRTC\",\"description\":\"Committed to moving 
Firefox and WebRTC forward\",\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\/\/blog.mozilla.org\/webrtc\/?s={search_term_string}\"},\"query-input\":\"required name=search_term_string\"}],\"inLanguage\":\"en-US\"},{\"@type\":\"Person\",\"@id\":\"https:\/\/blog.mozilla.org\/webrtc\/#\/schema\/person\/f2eb9712b8d85b70aebe1faf24e731fd\",\"name\":\"Jan-Ivar Bruaroey\",\"image\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/blog.mozilla.org\/webrtc\/#\/schema\/person\/image\/5f3d49a61b032619d0d33c4cc7c7433f\",\"url\":\"https:\/\/secure.gravatar.com\/avatar\/16d7e05dc9f8a855a02e0796b00aad3f?s=96&d=mm&r=g\",\"contentUrl\":\"https:\/\/secure.gravatar.com\/avatar\/16d7e05dc9f8a855a02e0796b00aad3f?s=96&d=mm&r=g\",\"caption\":\"Jan-Ivar Bruaroey\"},\"url\":\"https:\/\/blog.mozilla.org\/webrtc\/author\/jbruaroeymozilla-com\/\"}]}<\/script>\n<!-- \/ Yoast SEO plugin. -->","yoast_head_json":{"title":"Unbundling MediaStreamTrackProcessor and VideoTrackGenerator - Advancing WebRTC","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/blog.mozilla.org\/webrtc\/unbundling-mediastreamtrackprocessor-and-videotrackgenerator\/","og_locale":"en_US","og_type":"article","og_title":"Unbundling MediaStreamTrackProcessor and VideoTrackGenerator - Advancing WebRTC","og_description":"Earlier this year we promised more blog posts about new and exciting WebRTC functionality in standards. So today we&#8217;re talking about MediaStreamTrackProcessor and VideoTrackGenerator! These two APIs let web developers convert a real time video MediaStreamTrack to and from a stream of VideoFrames respectively. This simplifies real time video processing in JavaScript. 
These APIs are [&hellip;]","og_url":"https:\/\/blog.mozilla.org\/webrtc\/unbundling-mediastreamtrackprocessor-and-videotrackgenerator\/","og_site_name":"Advancing WebRTC","article_published_time":"2024-09-20T15:49:42+00:00","article_modified_time":"2025-11-20T14:56:58+00:00","og_image":[{"url":"http:\/\/blog.mozilla.org\/webrtc\/files\/2024\/09\/DiagramMirroredSepia3.png"}],"author":"Jan-Ivar Bruaroey","twitter_card":"summary_large_image","twitter_misc":{"Written by":"Jan-Ivar Bruaroey","Est. reading time":"5 minutes"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"WebPage","@id":"https:\/\/blog.mozilla.org\/webrtc\/unbundling-mediastreamtrackprocessor-and-videotrackgenerator\/","url":"https:\/\/blog.mozilla.org\/webrtc\/unbundling-mediastreamtrackprocessor-and-videotrackgenerator\/","name":"Unbundling MediaStreamTrackProcessor and VideoTrackGenerator - Advancing WebRTC","isPartOf":{"@id":"https:\/\/blog.mozilla.org\/webrtc\/#website"},"primaryImageOfPage":{"@id":"https:\/\/blog.mozilla.org\/webrtc\/unbundling-mediastreamtrackprocessor-and-videotrackgenerator\/#primaryimage"},"image":{"@id":"https:\/\/blog.mozilla.org\/webrtc\/unbundling-mediastreamtrackprocessor-and-videotrackgenerator\/#primaryimage"},"thumbnailUrl":"http:\/\/blog.mozilla.org\/webrtc\/files\/2024\/09\/DiagramMirroredSepia3.png","datePublished":"2024-09-20T15:49:42+00:00","dateModified":"2025-11-20T14:56:58+00:00","author":{"@id":"https:\/\/blog.mozilla.org\/webrtc\/#\/schema\/person\/f2eb9712b8d85b70aebe1faf24e731fd"},"breadcrumb":{"@id":"https:\/\/blog.mozilla.org\/webrtc\/unbundling-mediastreamtrackprocessor-and-videotrackgenerator\/#breadcrumb"},"inLanguage":"en-US","potentialAction":[{"@type":"ReadAction","target":["https:\/\/blog.mozilla.org\/webrtc\/unbundling-mediastreamtrackprocessor-and-videotrackgenerator\/"]}]},{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/blog.mozilla.org\/webrtc\/unbundling-mediastreamtrackprocessor-and-videotrackgenerator\/#primaryimag
e","url":"https:\/\/blog.mozilla.org\/webrtc\/files\/2024\/09\/DiagramMirroredSepia3.png","contentUrl":"https:\/\/blog.mozilla.org\/webrtc\/files\/2024\/09\/DiagramMirroredSepia3.png","width":2832,"height":582},{"@type":"BreadcrumbList","@id":"https:\/\/blog.mozilla.org\/webrtc\/unbundling-mediastreamtrackprocessor-and-videotrackgenerator\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Home","item":"https:\/\/blog.mozilla.org\/webrtc\/"},{"@type":"ListItem","position":2,"name":"Unbundling MediaStreamTrackProcessor and VideoTrackGenerator"}]},{"@type":"WebSite","@id":"https:\/\/blog.mozilla.org\/webrtc\/#website","url":"https:\/\/blog.mozilla.org\/webrtc\/","name":"Advancing WebRTC","description":"Committed to moving Firefox and WebRTC forward","potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/blog.mozilla.org\/webrtc\/?s={search_term_string}"},"query-input":"required name=search_term_string"}],"inLanguage":"en-US"},{"@type":"Person","@id":"https:\/\/blog.mozilla.org\/webrtc\/#\/schema\/person\/f2eb9712b8d85b70aebe1faf24e731fd","name":"Jan-Ivar Bruaroey","image":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/blog.mozilla.org\/webrtc\/#\/schema\/person\/image\/5f3d49a61b032619d0d33c4cc7c7433f","url":"https:\/\/secure.gravatar.com\/avatar\/16d7e05dc9f8a855a02e0796b00aad3f?s=96&d=mm&r=g","contentUrl":"https:\/\/secure.gravatar.com\/avatar\/16d7e05dc9f8a855a02e0796b00aad3f?s=96&d=mm&r=g","caption":"Jan-Ivar 
Bruaroey"},"url":"https:\/\/blog.mozilla.org\/webrtc\/author\/jbruaroeymozilla-com\/"}]}},"_links":{"self":[{"href":"https:\/\/blog.mozilla.org\/webrtc\/wp-json\/wp\/v2\/posts\/472"}],"collection":[{"href":"https:\/\/blog.mozilla.org\/webrtc\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/blog.mozilla.org\/webrtc\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/blog.mozilla.org\/webrtc\/wp-json\/wp\/v2\/users\/1399"}],"replies":[{"embeddable":true,"href":"https:\/\/blog.mozilla.org\/webrtc\/wp-json\/wp\/v2\/comments?post=472"}],"version-history":[{"count":0,"href":"https:\/\/blog.mozilla.org\/webrtc\/wp-json\/wp\/v2\/posts\/472\/revisions"}],"wp:attachment":[{"href":"https:\/\/blog.mozilla.org\/webrtc\/wp-json\/wp\/v2\/media?parent=472"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/blog.mozilla.org\/webrtc\/wp-json\/wp\/v2\/categories?post=472"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/blog.mozilla.org\/webrtc\/wp-json\/wp\/v2\/tags?post=472"},{"taxonomy":"author","embeddable":true,"href":"https:\/\/blog.mozilla.org\/webrtc\/wp-json\/wp\/v2\/coauthors?post=472"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}