I like the "What's PipeWire" intro, I had no idea what it was, and now I get it. It starts with this nice explanation:
"PipeWire is a media processing graph, this might not ring a bell so let me rephrase. PipeWire is a daemon that provides the equivalent of shell pipes but for media: audio and video."
I've seen so many projects that still made no sense after reading the ABOUT or README; this one does a nice job of explaining.
It's not clear to me what Pipewire does with respect to video. The audio part makes sense, most talk of pipewire that I've seen focuses on using pipewire as a replacement to pulseaudio. But what exactly does it do for video? I assume pipewire doesn't replace video playing applications like mpv. Would mpv use pipewire in place of some other component? Would/could mpv use pipewire in place of ffmpeg? Or is pipewire for video something like a wayland compositor?
The fact that it's a filter graph for shuttling buffers between sources and sinks lends itself to both buffers for audio and buffers for video. An application that generates video frames could decide to send its video frames to pipewire, and an application that accepts video frames could accept them from pipewire.
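The shell-pipe comparison from the quoted intro can be made literal. A toy illustration of the graph idea (ordinary text streams standing in for media buffers; this is the concept, not PipeWire's API):

```shell
# A linear "graph": source -> filter -> sink, exactly like a shell pipe.
# seq generates buffers (numbers), awk is a processing node, and stdout
# is the sink:
seq 1 3 | awk '{ print $1 * 2 }'
# prints:
#   2
#   4
#   6

# The simplest non-linear graph, one source fanned out to two sinks:
seq 1 3 | tee /tmp/sink_a.txt | awk '{ s += $1 } END { print "sum:", s }'
# prints "sum: 6" and also writes 1..3 to /tmp/sink_a.txt
```

PipeWire generalizes exactly this: buffers instead of lines of text, an arbitrary graph instead of a linear pipeline, and nodes that can be attached or detached while everything keeps running.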
This is how screensharing through xdg-desktop-portal (linked in my sibling comments) works. The xdp implementation gets screencapture frames from the compositor and pushes them into pipewire, and the screensharing application (browser, etc) pulls the frames from pipewire.
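If you're curious what that looks like at runtime, PipeWire's CLI tools can show the graph. A sketch (assumes a running PipeWire session with the command-line tools and `jq` installed; node names vary by compositor and portal backend):

```shell
# List every node currently in the PipeWire graph:
pw-cli ls Node

# Or dump the graph as JSON and keep only video nodes; during a
# screenshare, a "Video/Source" node from the portal should appear:
pw-dump | jq -r '.[]
  | select((.info.props["media.class"]? // "") | startswith("Video"))
  | .info.props["node.name"]'
```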
This isn't new; DirectShow on Windows, and its successor Media Foundation, are also based on graphs. Put "windows graphedit" and "windows topoedit" in an image search engine for examples.
The original use case for pipewire was video, it was originally named "PulseVideo". It allows things like using an application running on Wayland as a video source and piping it into Firefox so you can share your screen on Zoom or something similar.
It does the same thing it does for audio: routes and multiplexes streams. Video devices on Linux have the same problem as audio devices, where multiple applications can't have them open at the same time. Pushing the video through a daemon also lets you do the same interesting things you can do with audio streams in pulseaudio/jack: transparently change the inputs and outputs around without having to restart or reconfigure the application, and route video through arbitrary effect chains before it reaches the destination.
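That transparent rerouting can be driven from the command line with `pw-link` (assumes a running PipeWire session; the port names below are made-up examples, so list your own first):

```shell
# List the available output and input ports (names vary per system):
pw-link --output
pw-link --input

# Connect a hypothetical player's output port to a sound card's input
# port (both names are examples; substitute names from the listings):
pw-link "mpv:output_FL" "alsa_output.pci-0000_00_1f.3.analog-stereo:playback_FL"

# Undo the connection:
pw-link --disconnect "mpv:output_FL" "alsa_output.pci-0000_00_1f.3.analog-stereo:playback_FL"
```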
If the application can use v4l devices directly (I think this includes all ffmpeg players such as mpv) or can output a video stream somewhere, then it might make sense to also add a pipewire backend there. Maybe somebody will make this easy by adding a pipewire encoder/decoder to ffmpeg? Then the applications won't have to do much at all to support this. If the application uses gstreamer, you may already be able to get it to work automatically by using pipewiresink/pipewiresrc.
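For GStreamer this already works today with the pipewiresrc/pipewiresink elements mentioned above. A sketch (requires GStreamer with the PipeWire plugin and a running session):

```shell
# Pull video frames from the PipeWire graph and display them:
gst-launch-1.0 pipewiresrc ! videoconvert ! autovideosink

# Push a test tone into the graph, where it can be routed like any
# other PipeWire stream:
gst-launch-1.0 audiotestsrc ! audioconvert ! pipewiresink
```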
Arnavion got it right, but a simpler case is that you would be able to share your webcam or video streams with multiple applications at the same time, while also allowing you to insert filters or modification of that video on the fly. Similar to how you can do that with audio servers right now.
I like the excitement around pipewire. I was already into Linux when similar transitions were happening with pulseaudio or wayland or systemd, and the excitement today is so different from the negativity (deserved or not) around these other projects.
While I'm not a pro by any means, I've been playing with this use case for a bit now. The JACK interface "just works". If you're running an RT or full-preempt kernel, the delay is really minimal.
It's silly how much easier it is to set up. No changes needed, and your qjackctl / Carla / ... will show all applications. You can connect your browser, guitarix, Ardour, and REAPER channels as you please - whether they're JACK-aware or not.
I've switched from Jack to Pipewire on my main desktop and it performs perfectly, much better than the jack / pulseaudio combo (using an RME Multiface II).
I recently switched to PipeWire on the NixOS 21.05 release. It was mostly seamless (except for some issues with it not running at realtime priority via rtkit, something that PulseAudio handles automatically), and having access to new codecs like mSBC and SBC-XQ for Bluetooth is great.
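For anyone wanting to reproduce this, the NixOS side is only a few lines of configuration (a sketch based on the options in the 21.05 module; the rtkit line addresses the realtime scheduling issue mentioned above):

```nix
{
  # Let PipeWire acquire realtime priority via rtkit:
  security.rtkit.enable = true;

  services.pipewire = {
    enable = true;
    alsa.enable = true;   # ALSA client compatibility
    pulse.enable = true;  # PulseAudio replacement (pipewire-pulse)
  };

  # PipeWire takes over PulseAudio's job:
  hardware.pulseaudio.enable = false;
}
```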
It is very early though, and it lacks documentation like the article posted above says. I wouldn't recommend it for everyone yet, but it sure is nice to see almost no issues in a project that is still in its early stages.
Same here. The mSBC codec, which started working for me around Fedora 33 / PipeWire 0.3.2x, is a night/day change for me. I can use my Jabra Bluetooth headset with high quality audio instead of really crappy sound. I think neither Windows nor OSX can do that with the regular Bluetooth adapter built into laptops.
Does that mean you do not have pulseaudio running? I am still confused whether all my apps would just run fine if I install pipewire and remove pulseaudio, or whether pipewire currently takes over only some of the responsibilities of pulseaudio.
Zero PulseAudio, just the `pipewire-pulse` daemon (which replaces it). Better yet, all PulseAudio clients that I tried just work (including things like `pavucontrol`).
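A quick way to confirm which daemon is actually serving PulseAudio clients (assumes `pactl` from the PulseAudio utilities and a running pipewire-pulse):

```shell
pactl info | grep "Server Name"
# With pipewire-pulse this typically prints something like:
#   Server Name: PulseAudio (on PipeWire 0.3.30)
```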
I've tried with Ubuntu for a while but not had luck. Does installation work easily with Ubuntu now, or did you have to hack a bit? If so, is this written up somewhere?
Looks like it is super easy with Nix but I've not yet had success on Ubuntu.
It took a bit of hacking but it was most likely the mess I created earlier (half-way through I realized I had some leftovers from when I tried to build it manually some time ago).
In the end I managed to remove all traces of PipeWire from my system and followed some tutorial which referenced the PPA mentioned above, without any further problems.
True, a project that consists of just a bunch of C files that you can drop into your project and that works with (almost) any build settings is just as good as, or even better than, "header only" libraries IMO.
But both approaches are preferable to complicated libraries that are hard to build and force the use of a certain build framework.