I got a number of replies to my previous call for feedback, which is absolutely wonderful. I figure I might as well address one of the most common requests: gapless playback in Amarok.
Just so you all know, a solution is in the works. It isn’t an easy fix, though: large portions of the GStreamer backend are being retooled, redesigned, and rewritten to make it happen. Regardless, progress is being made, and has been since I started on this adventure about three weeks ago.
I’ll admit, Phonon GStreamer’s code isn’t exactly pretty. Sure, it plays every file format known to mankind, offers to install any codecs you’re missing, and performs very well while doing so, but the design is far from optimal.
Part of GStreamer’s base plugins package is a nifty little element called “playbin2”. Using it is really easy, as a quick demonstration at the command line shows:
$ gst-launch playbin2 uri="file:///srv/media/Music/Paul and Storm - Your Love Is....flac"
That replaces the previously lengthy process of building a pipeline by hand:
$ gst-launch filesrc location="/srv/media/Music/Paul and Storm - Your Love Is....flac" ! \
    decodebin2 ! autoaudiosink
decodebin2 and the auto*sink elements hide away another layer of mess involving codec detection, audio device configuration, massaging of raw audio data, and so forth.
Getting to the point of discussion, here’s a rough layout of phonon-gstreamer’s average pipeline, taken from our debugging page:
filesrc -> decodebin2 -> queue -> audioresample -> audioconvert -> pulsesink
                      \-> ffmpegcolorspace -> queue -> xvimagesink
Each pipeline is hand-assembled with a pile of calls to gst_element_factory_make, property setting, linking of decoded audio/video streams to the proper sub-graphs, and so on. That hand-assembly is the big problem: we’re basically writing our own playbin2. It means automatically connecting decoded audio/video streams to the proper output graphs, keeping track of the elements, setting properties across different interfaces, and, worst of all, duplicating effort.
By now you’re likely wondering where gapless playback comes into this. It turns out that playbin2 handles all the trickiness of switching input streams at exactly the right moment, feeding new buffers in just as the current ones are pumped out to the sound card. All that’s needed is to set the “uri” property at exactly the right time, namely when the “about-to-finish” signal is emitted. As a proof of concept, have a look at my hacks repository and play with my ‘gapless’ hack:
$ make gapless-test
Work has been ongoing inside the ‘plumbing’ branch in phonon-gstreamer’s git. The first few steps have been refactoring the massive MediaObject implementation into two components: one to handle the state engine, and another to handle interfacing with GStreamer. That bit is as done as I can comfortably make it without either adding a zillion tiny functions to enforce total encapsulation or introducing an insane degree of coupling between the two classes. The next step is converting the handmade pipeline to a playbin2-based one. Earlier today, I was successful in hacking together a prototype that handles simple file/URL MediaSources. Here are some off-the-cuff metrics to demonstrate how significant an improvement this really is:
$ wc -l pipeline.cpp mediaobject.cpp
  224 pipeline.cpp
 1952 mediaobject.cpp
For reference, no pipeline code has been removed from the MediaObject yet. The Pipeline object currently handles the total life cycle of the GStreamer pipeline, including codec detection, creating the appropriate transport source, plugin installation, and feeding data to VideoWidgets and AudioOutputs.