Why I Keep My Home Servers in UTC

Except for desktop computers and mobile phones, all my networked devices live in the UTC time zone (sometimes incorrectly referred to as GMT).

First, the most obvious reason is that my servers and devices live in two very different locations. Most of them are in the USA, but a few still remain in Croatia (yep, I have a transcontinental offsite backup). For anything that needs time synchronization, I would have to calculate the time difference manually. And not only once: thanks to differing daylight saving schedules, there are four different offsets throughout the year. With multiple devices around, mistakes are practically assured.

However, I would use UTC even with all devices in the same location. And the reason is the aforementioned daylight saving time. Before I switched to UTC, every year after daylight saving started or ended, something would be an hour off. Bigger devices (e.g. a NAS) would usually switch time automatically, but smaller IoT devices would not.

Since my network has centralized logging, I can be sure that some devices will be an hour off at any given time. And I am sure to notice this only when I actually need the logs, adding mental arithmetic to an already annoying troubleshooting task. Even if I remember to reconfigure a device, I can be sure the damn daylight saving screws it up again later.
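
With everything logging in UTC, a timestamp can be converted into any local view on demand instead of in my head. With GNU date (the zone names below are just examples) it is a one-liner per zone:

```shell
# Render one UTC log timestamp in two local time zones on demand
TZ="America/Los_Angeles" date -d "2017-08-10 14:00 UTC"   # Pacific view
TZ="Europe/Zagreb"       date -d "2017-08-10 14:00 UTC"   # Croatian view
```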

And yes, in the grand scheme of things it might not be strictly necessary for all my servers and devices to share the same time. But UTC makes it easy enough, and adjusting to it is reasonably painless.

If you have the same issues, jump in - you’ll not be sorry.

PS: The only downside is that my server sends me its e-mail health report at a different time depending on whether it is winter or summer.

PPS: Why the heck do we still use daylight saving time?

Seattle Code Camp 2017


Seattle Code Camp organizers have finalized the selection of this year's talks, and I am proud to say I got two sessions.

My first topic will be Crash Course In Foreign Language Support For ÜS Developer. If the title looks familiar, it is indeed a rerun of last year's talk under, surprise-surprise, the same title. :) Of course, it won't be exactly the same talk: the session will be 15 minutes longer and, hopefully, more polished.

My hope is that after this talk an average developer will understand different regional environments, how complicated things can get, what C# has to offer in regard to regionalization, where C# fails, and what the most common mistakes are.

My second session will be Path Over the Desktop Bridge, a suspenseful tale of a desktop application visiting the Windows Store for the first time. I will talk about my experience, lessons learned, and how to talk with testers in 1000 characters or less. It will be a wild ride of back and forth, culminating in a happy ending. If this innuendo doesn't pique your interest, I don't know what will.

Seattle Code Camp will be held on Saturday, September 9th at Seattle University. You will be able to register for attendance soon.

[2017-08-10: Registration is now open.]

Boot Linux ISO From USB


Let’s face it - nobody uses DVD drives for installations any more. Even if your computer has one, chances are it also has USB support. And a USB drive is MUCH faster than a DVD.

There are many different ways to get Linux ISO onto USB for the purpose of Penguinification. My favorite desktop distribution - Linux Mint - has instructions for quite a few of them. However, with great selection comes great confusion.

Assuming you have a Windows computer lying around, I will describe what I’ve found to be the least intrusive method - one that leaves no permanent traces on Windows and requires no application installs.

Assuming you have already downloaded a Linux ISO file, you will also need to download the PORTABLE version of Rufus. Yes, you could install it instead, but we are looking for the least intrusive way, and portable reflects that philosophy better.
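
Before writing anything to USB, it is worth checking that the ISO downloaded intact by comparing it against the hash published on the distribution's download page. Assuming a shell is available (e.g. Git Bash or WSL on the Windows side), that is a one-liner; the file name and contents below are just a self-contained stand-in:

```shell
# Stand-in demo: create a dummy "ISO" so the check can run anywhere;
# in practice you would use the real ISO file and the SHA-256 value
# published on the distribution's download page.
printf 'not a real iso' > linuxmint.iso
EXPECTED="$(sha256sum linuxmint.iso | cut -d' ' -f1)"
echo "$EXPECTED  linuxmint.iso" | sha256sum -c -
```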

What you will see is a trivial interface with all defaults set properly for any modern Linux distribution, whether you need a UEFI or BIOS installation. The only thing left is selecting the appropriate ISO image, hidden behind the button next to the combo box saying ISO Image. If you forget this, you will find yourself booting into FreeDOS - good for BIOS firmware updates and not much more.

If you are installing a slightly newer version of Linux, you will probably get a warning that different ldlinux.sys and ldlinux.bss files are needed. Answering yes will let Rufus download them from the Internet.

The next question might be (depending on options selected) about a method of USB creation. USB mode worked for me every time.

After answering Yes to the final warning of imminent data destruction on the destination drive, your USB drive will get the ISO applied to it and you are ready to install a Linux of your choice.

PS: I personally tested this with Linux Mint and Fedora, but I don’t believe there is a distribution that will not work.

Cheap Cybersecurity Books

Those into cybersecurity, rejoice.

Humble has a new book bundle and, unlike their recent book offerings, this one is actually good and extremely cheap considering the books included. Frankly, it would be a good deal if only Applied Cryptography were included.

Yes, the lowest tier is useless, and the middle tier essentially lives on Cryptography Engineering, with Mitnick’s The Art of Deception adding a bit of flair.

But the most expensive $15 tier more than makes up for it with Applied Cryptography, an aged book that still somehow manages to stay current in its approach to security, if not in all of its examples. And there is Secrets and Lies, proving that Schneier is getting all philosophical as he ages.

Based on my picks, you can already see that they might as well have called this the Schneier bundle and I would be equally interested. The only book I wish were also here is The Twofish Encryption Algorithm (yes, I know how old it is).

If you have any interest in security, do consider this bundle. It is probably the cheapest (legal) way to get some real classics and a good read.

My SSH Crypto Settings

With an ever-expanding number of scripts on my NAS, I noticed that pretty much every one used similar, but not quite the same, parameters. For example, my automatic replication would use one set of encryption parameters, my Mikrotik router backup script another, and my website backup script a third variant.

So I decided to see if I could keep reasonable security but consolidate all of these into a single set.

For key exchange, I had a choice of diffie-hellman-group-exchange-sha256, diffie-hellman-group-exchange-sha1, diffie-hellman-group14-sha1, and diffie-hellman-group1-sha1. Unfortunately, there is no curve25519-sha256@libssh.org or a similar algorithm considered more secure.

For a while I considered diffie-hellman-group14-sha1 as it uses a 2048-bit prime, but its abandonment by modern SSH versions made me go with diffie-hellman-group-exchange-sha256. As this method allows custom groups, it should theoretically be better, but it also allows the server to set up a connection with known weak parameters. Since the servers are under my control, that should not pose a huge issue here.

For the cipher, my hands were extremely tied - Mikrotik, my router of choice, supports only aes256-ctr and aes192-ctr. Both offer acceptable security, so I went with the faster one: aes192-ctr.

For message authentication, Mikrotik was again extremely limited - only hmac-sha2-256 and hmac-sha1 were supported. While I was tempted to go with hmac-sha1, which is still secure enough despite SHA-1 being broken (the HMAC construction really does make a difference), I went with hmac-sha2-256 as the former might get deprecated soon.
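
To see what your own OpenSSH client has to offer before narrowing things down, it can enumerate its supported algorithms (the -Q flag assumes OpenSSH 6.3 or newer):

```shell
# List algorithms supported by the local OpenSSH client
ssh -Q kex     # key exchange methods
ssh -Q cipher  # ciphers
ssh -Q mac     # message authentication codes
```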

My final set of “standard” parameters is as follows:

-2 -o KexAlgorithms=diffie-hellman-group-exchange-sha256 -c aes192-ctr -o MACs=hmac-sha2-256

The additional -2 parameter is not strictly encryption related, but I find it very reasonable to enforce SSH protocol version 2.
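
For scripts that call ssh or scp directly, the same thing can also be kept in one place as a host entry in ~/.ssh/config instead of repeating the flags everywhere (the host name below is just an example):

```
Host router.example.com
    Protocol 2
    KexAlgorithms diffie-hellman-group-exchange-sha256
    Ciphers aes192-ctr
    MACs hmac-sha2-256
```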