Editing video files produced by my Garmin dashcam is sometimes really annoying. Not only does it split every darn trip into 60-second chunks, forcing you to edit a bunch of small files, but it will also occasionally produce video with the first frame completely black. At least that’s how it looks in Vegas Movie Studio 15 and, while annoying, it was easy enough to remove. In DaVinci Resolve these videos would have a few seconds’ worth of corrupted data, and that’s a bigger problem.
As I’m moving most of my video editing to Resolve due to its cross-platform capabilities, I decided to figure it out. One way to see what’s going on with an MP4 is by looking at the video file in MediaInfo.
The first confusing thing was that the MP4 contained only streams with IDs 2 and 3. What happened to the stream with ID 1 is anybody’s guess. The second source of confusion was that all streams combined amounted to a smidgen over 60 MB. The whole video file was more than 70 MB. While one can expect the MP4 container format to take some space, the overhead is generally measured in KB, not MB.
However, both these things were present in both the valid and the invalid video file. It took going into Advanced mode to reveal more curiosities. At last it let me know where the remaining 10 MB were: in the header. More interestingly, it showed multiple stream size calculations for the video stream. The file that contained the black frame had one of its six video stream sizes listed as 59.97 MB, while the fully working file had all video stream sizes set to 60.00 MB.
Either due to a crappy encoder or bad coding, Garmin not only bloats the header to an unreasonable level but can also miscalculate stream sizes. Because MP4 contains stream size data in multiple places, it depended on the decoder whether the error would be noticeable or not.
Knowing I was dealing with a corrupt container and a seemingly correct stream (albeit one frame shorter), I decided to simply repackage the MP4 without recompression using ffmpeg:
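The original command isn’t preserved here, but a plain stream-copy remux along these lines matches the description that follows (file names are placeholders):

```shell
# remux without re-encoding; copies the video and audio streams into a new container
ffmpeg -i input.mp4 -c:v copy -c:a copy output.mp4
```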
This copies both video and audio stream (irrelevant if there is no audio stream) into a new file. For normal videos this results in a direct stream copy. Videos where one frame was corrupted end up with 59.967 seconds worth of frames. Essentially the broken frame will be removed. And this repackaging solved the black frame issue for both Vegas Movie Studio and DaVinci Resolve.
Unfortunately, while DaVinci Resolve did recognize the files now, the exported result had a stutter. For some reason all these files were recognized as 15 fps. And no, this wasn’t due to the stream copy, as the original videos were misidentified too. It took me a while to give up and ask about it on the Blackmagic forum, only to find out I had stumbled upon a bug.
As a workaround until the bug is resolved, I went on to converting the stream to the DNxHD LB codec:
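Again, the exact command isn’t preserved; using ffmpeg’s dnxhd encoder, a conversion to the low-bandwidth profile would look roughly like this (file names and the choice of PCM audio are assumptions):

```shell
# re-encode video to the DNxHR LB profile in a MOV container, audio as uncompressed PCM
ffmpeg -i input.mp4 -c:v dnxhd -profile:v dnxhr_lb -pix_fmt yuv422p -c:a pcm_s16le output.mov
```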
Single-instance applications are fun in any programming language. Let’s take Qt as an example. One could just create a server and, depending on whether it can listen, determine if another instance is running. Something like this:
```cpp
server = new QLocalServer();
bool serverListening = server->listen("SomeName");
if (!serverListening) {
    // hey, I'm walkin' over here
}
```
And that’s it. If only it were that easy.
This code might fail on Unix with AddressInUseError if a previous instance of the application crashed. This means we need to complicate the code a bit:
```cpp
server = new QLocalServer();
bool serverListening = server->listen(serverName);
if (!serverListening && (server->serverError() == QAbstractSocket::AddressInUseError)) {
    QLocalServer::removeServer(serverName);        // cleanup
    serverListening = server->listen(serverName);  // try again
}
if (!serverListening) {
    // hey, I'm walkin' over here
}
```
But the fun wouldn’t be complete if that was all. You see, Windows has issues of its own. As implemented in Qt, you can actually have multiple listeners at the same time, so failure to listen will never happen there.
Unfortunately this is a bit more complicated, and you can really go wild solving this issue - even so far as to involve QSystemSemaphore with its portability and thread-blocking issues.
Or you can go with a solution that works 99% of the time: directly calling into the CreateMutexW API.
Modifying code in the following manner will do the trick:
```cpp
server = new QLocalServer();
bool serverListening = server->listen(serverName);
if (!serverListening && (server->serverError() == QAbstractSocket::AddressInUseError)) {
    QLocalServer::removeServer(serverName);        // cleanup
    serverListening = server->listen(serverName);  // try again
}

#if defined(Q_OS_WIN)
if (serverListening) {
    CreateMutexW(nullptr, true, reinterpret_cast<LPCWSTR>(serverName.utf16()));
    if (GetLastError() == ERROR_ALREADY_EXISTS) {  // someone else has this mutex
        server->close();                           // disable server
        serverListening = false;
    }
}
#endif

if (!serverListening) {
    // hey, I'm walkin' over here
}
```
Now on Windows we try to create our very own mutex. If that succeeds, all is normal. If GetLastError reports the mutex already exists, we simply close our server because we know some other instance owns it.
Not ideal but it covers single-instance scenario reasonably well.
If you want to use this in your application, you can download the example and use it as follows:
```cpp
if (!SingleInstance::attach()) {
    return static_cast<int>(0x80004004);  // exit immediately if another instance is running
}
```
As a headphone user, I find nothing more annoying than the computer asking me every single freaking time what exactly I plugged in. While the Windows drivers for Dell XPS 15 audio do allow you to select a default, one is not so lucky under Linux.
However, Linux being configurable to a fault does offer a workaround.
You can append the following options to the end of /etc/modprobe.d/alsa-base.conf, followed by a reboot:
options snd-hda-intel model=headset-mic
This will lie a bit to the sound driver and stop the darn questions.
I wanted a simple system tray application that would work on both Windows and Linux. As C# doesn’t really have a proper GUI for Linux (albeit you can come a long way using Windows Forms), I decided to go with Qt.
Under Windows it worked beautifully. Under Ubuntu, not so much. The Qt example for a tray icon was pretty much equivalent and worked flawlessly. But my simple example just wouldn’t. It took me a while, but I traced the issue.
Upon click I wanted to display a context menu. It seemed innocent enough to create it dynamically:
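The original snippet isn’t preserved, but the dynamic approach described would look roughly like this (the `trayIcon` variable and menu contents are placeholders):

```cpp
// hypothetical sketch: build the menu on demand when the icon is activated
QObject::connect(trayIcon, &QSystemTrayIcon::activated,
                 [](QSystemTrayIcon::ActivationReason) {
    QMenu menu;
    menu.addAction("Exit", qApp, &QCoreApplication::quit);
    menu.exec(QCursor::pos());  // show at the current cursor position
});
```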
And this works under Windows. But Ubuntu and its Unity GUI don’t really know what to do with a tray icon without a preassigned context menu. And thus the tray icon is never actually displayed.
Once I figured that out, the solution was simple. Just assign the menu statically:
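Again a sketch rather than the original code; the key difference is that the menu already exists and is attached to the icon before it is shown:

```cpp
// hypothetical sketch: the menu is created up front and assigned to the icon
auto* menu = new QMenu();
menu->addAction("Exit", qApp, &QCoreApplication::quit);
trayIcon->setContextMenu(menu);  // Unity needs this before show()
trayIcon->show();
```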
While I already wrote about expanding Dropbox’s ext4 volume on ZFS, I never actually wrote how to create one in the first place. I guess it’s time to fix that injustice.
First you need to create a volume of sufficient size. While you can just make it as big as your Dropbox allowance, I would advise going with at least double that. Not only does this help if you are doing ZFS snapshots (remember, it’s copy-on-write), but it also helps if you are moving files around, as Dropbox fully releases space only once the new files are created.
Whatever you decide, you need to create a volume and format it:
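The exact commands aren’t preserved; assuming a pool named `pool`, a 20 GB volume, and a user home directory, the steps would look something like this:

```shell
# create a 20 GB ZFS volume (zvol) and format it as ext4
zfs create -V 20G pool/dropbox
mkfs.ext4 /dev/zvol/pool/dropbox

# mount it where Dropbox expects its directory; _netdev delays
# mounting until after ZFS has mounted its own datasets
echo '/dev/zvol/pool/dropbox /home/user/Dropbox ext4 defaults,_netdev 0 0' >> /etc/fstab
mount /home/user/Dropbox
```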
Do note the _netdev part, as it ensures the Dropbox volume is mounted well after ZFS has already mounted its own datasets. Without it you might have a race condition, and the volume mounting might prevent subpools from being mounted under the same path.
Finally you can install Dropbox as you usually would. While it will complain about directory already being present, you can simply cancel directory selection and it will start syncing regardless.