Wim Taymans has a long track record in the Linux community. He was one of the two original developers of the GStreamer multimedia framework and he was the principal maintainer for most of the project's initial existence. He joined Red Hat in 2013 and has helped maintain GStreamer and PulseAudio for Red Hat since. In 2015 he started working on PipeWire: a project that has come to full fruition in Fedora Workstation 34, where it handles both audio and video. In addition to that, it also merges the world of pro-audio with mainstream Linux. In this interview we will talk about where PipeWire came from, where it is at, and where Wim sees it going from here.

Christian Schaller & Wim Taymans testing PipeWire video filters at Flock

Christian Schaller: What was the origin of PipeWire in terms of what problems you wanted to resolve?

Wim Taymans: PipeWire really evolved out of two earlier ideas. The first one was PulseVideo, which was written by William Manley back in 2015. It was a small server that would send the video from a v4l2 camera to one or more other processes. It used GStreamer, DBus and file descriptor (fd) passing to do this fairly efficiently. It resulted in a bunch of patches to GStreamer regarding fdmemory. Around that time we started to think about screen capture for Wayland. The idea was then to take the PulseVideo idea and implement the possibility for clients to provide streams as well (not just v4l2 devices). Another requirement was to make this secure and work well with Flatpak and Flatpak's concept of portals to handle things that have potential security concerns.

CS: Ah right, because when PipeWire was originally introduced to Fedora in Fedora 27 it was only dealing with video, right? Providing a way to do screen sharing in GNOME Shell?

WT: Yes, there were only wild ideas about trying to handle audio as well. The version that ended up in Fedora 27 needed another rewrite to make that happen, really.

CS: Can you talk a little about how PipeWire interacts with things like Wayland and GNOME Shell?

WT: GNOME Shell will send a stream to PipeWire when screen sharing is activated. PipeWire will route this stream to the applications like Firefox or the screen recorder. We have some more advanced features implemented, such as DMABUF passing and metadata for the cursor and clipping regions when sharing a single window. There is also the volume control that interacts through the PulseAudio API with PipeWire to manage the volumes.

CS: So there was no real PipeWire precursor for video, as most stuff just interacted directly with v4l, so I assume it must have been a big task porting over things like GNOME Shell and the web browsers to start using it?

WT: There was nothing for screen sharing; it was just X11 calls to grab the screen content. Jan Grulich worked with the upstream WebRTC project to add code to interact with the new portal APIs defined for Wayland, to negotiate screen sharing options, and then native PipeWire support to fetch the screen content. Then Martin Stransky backported that work into the Firefox copy of WebRTC, and Jan Grulich and Tomas Popela ensured the changes got merged into Chromium/Chrome. For webcams there is not much progress yet.