Noisy Microphone on Lenovo P70 Under Linux

I use a Lenovo P70 as my Ubuntu workstation and most of the time I am reasonably happy with it. However, the moment I join a call, my microphone is either too quiet or there is way too much background noise. And that’s true whether I use the internal microphone or a headset.

I tried playing with the volume slider but that’s annoying beyond belief. The sweet spot seems to be in the bottom fifth, but there is a really fine line between silencing the background noise and making my voice inaudible. Since there is no numerical value next to the slider, eyeballing the perfect level is close to impossible.

As often happens in Linux, the solution comes courtesy of the command line.

amixer set Capture 80%

I found that the 80% level is where background noise becomes drastically less noticeable while still allowing my voice to come through.
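If you want to double-check what the slider actually corresponds to, amixer get Capture prints the numeric level. A minimal sketch of extracting the percentage (the exact output format varies by card and driver, so the sample line below is just an illustration, not guaranteed output):

```shell
# Sample line as printed by `amixer get Capture` (format varies by card/driver)
line="  Front Left: Capture 52428 [80%] [on]"

# Pull out the number between the square brackets with the % sign
level=$(echo "$line" | sed -n 's/.*\[\([0-9]*\)%\].*/\1/p')
echo "Capture level: ${level}%"   # prints: Capture level: 80%
```

With a numeric readout like this, the "bottom fifth" guesswork turns into a repeatable setting.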


PS: Interestingly, the 100% level from the command line puts the slider at 1/5th of the scale. That means that (over)amplification can go up to 500%. A bit much…

Easy Type-C, Finally

It has been a while since USB Type-C became popular, but I still avoided it for my hobby designs. It wasn’t due to its performance - oh no, performance was much better than what I needed. You see, I still use USB 2.0 for literally all my electronics projects. While not fast in the grand scheme of USB, it’s plenty fast for serial communication, and that’s what I end up designing my gadgets around 90% of the time. My issues with the connector were due to soldering. Hand soldering, to be precise.

Most type-C receptacles are of the surface-mount kind. On its own that’s not an issue - I’ve hand-soldered my share of finicky narrow-pitch ICs. The issue is the location of the 24 pins that are needed. While 12 pins are almost reachable with a thin soldering iron, the other 12 pins are usually under the device. Not a problem if you are making boards professionally, or even reflowing them in an oven, but an impossibility for soldering by hand.

And no, you cannot solder just one side of the connector and be compliant. It doesn’t matter if you’re only interested in USB 2.0, as I was - you still need to connect both sides because the 5.1K resistor has to be in place for the device to be properly recognized. And yes, even if you fail to do that, it will work with a type-A to type-C cable. It will sometimes even work with a type-C to type-C one. However, things that work only sometimes have a nasty habit of failing at the most unfortunate time.

Connectors that use a combination of through-hole pins and surface-mount pads looked promising on paper. The unreachable pads underneath were converted to pins and the soldering problem was solved. Unfortunately, a lot of such connectors have a shield over the rear pads, so now the rear pads became the problem. I did find a few with more open space that were easier to hand-solder, but at a pricey $3.

But I finally found a type-C connector that suits my use case perfectly - the USB4085-GF-A. It’s a USB 2.0 connector that brings out 16 of the 24 type-C pins. The sacrificed pins mean you won’t ever be able to use it with high-speed devices (thus the 2.0 moniker). However, the pins that remain include not only all the standard USB 2.0 power and data connections but also CC1 and CC2. These two pins are really important if you want your device to be part of the type-C universe and pull 1.5 A of current. The only thing needed in addition to the old type-A design is a 5.1K pull-down resistor on each of those pins and you’re golden.

Mind you, the connector is not perfect. At $1.50 it’s a bit pricey, but not excessively so. If you are using a standard 1.6 mm PCB, you will need to take great care to position it flush in order for the short pins to even reach the other side. The hole spacing means you will need 6/6 PCB manufacturing rules which, while often supported, are still a bit less common than the bog-standard 8/8. But these are all minor issues, and hand soldering is a breeze.

And despite all this goodness, I have only used it in a few test contraptions and not in any of my proper projects. This is due to its biggest drawback - there is only one manufacturer and, outside of my initial purchase, it is pretty much on backorder the whole time. Even at the time of this writing, the quantity on DigiKey is 0.

Once this connector becomes more available, I am looking forward to using it.


PS: For those wanting a type-C plug directly on the board, there is the Molex 105444-0001. Just don’t forget to order a 0.80 mm PCB.

PPS: For a 90° type-C connector, the KUSBX-SL-CS1N14-B looks promising as it also brings out both CC1 and CC2 - a necessity for a proper type-C device. I haven’t tried it myself though.

Using TLS 1.3 from .NET 4.0 Application

Due to the ubiquitous out-of-box support for .NET 4.0 on all Windows platforms (including Windows 10), I still keep most of my freeware apps on it. Yes, I lose some fancy new features, but not dealing with a .NET Framework download makes it really convenient. But there is one thing that proved to be a problem - TLS 1.3.

When I changed my web server to use only TLS 1.2 and above, the built-in upgrade suddenly stopped working. Digging a bit showed the reason: .NET 4.0 supports only TLS 1.0 by default, while the web server hosting my upgrades now required TLS 1.2 at a minimum.

For the latest .NET versions, updating to a higher TLS version is easy enough:

ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls13
                                     | SecurityProtocolType.Tls12
                                     | SecurityProtocolType.Tls11
                                     | SecurityProtocolType.Tls;

But the Tls11, Tls12, and Tls13 enum values are not part of .NET 4.0. However, because .NET is so closely integrated with Windows, sometimes it’s worth asking the OS directly - by specifying the enum’s numerical value (768 is 0x300 for TLS 1.1, 3072 is 0xC00 for TLS 1.2, and 12288 is 0x3000 for TLS 1.3):

ServicePointManager.SecurityProtocol = (SecurityProtocolType)12288
                                     | (SecurityProtocolType)3072
                                     | (SecurityProtocolType)768
                                     | SecurityProtocolType.Tls;

If you run this code before making the first HTTP request, suddenly you are no longer limited to SSL and the ancient TLS 1.0.

As this code still requires a bit of error checking (the underlying OS might not support all the newer protocols), I finally ended up with the code below:

try { //try TLS 1.3
    ServicePointManager.SecurityProtocol = (SecurityProtocolType)12288
                                         | (SecurityProtocolType)3072
                                         | (SecurityProtocolType)768
                                         | SecurityProtocolType.Tls;
} catch (NotSupportedException) {
    try { //try TLS 1.2
        ServicePointManager.SecurityProtocol = (SecurityProtocolType)3072
                                             | (SecurityProtocolType)768
                                             | SecurityProtocolType.Tls;
    } catch (NotSupportedException) {
        try { //try TLS 1.1
            ServicePointManager.SecurityProtocol = (SecurityProtocolType)768
                                                 | SecurityProtocolType.Tls;
        } catch (NotSupportedException) { //TLS 1.0
            ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls;
        }
    }
}

This code ensures the highest supported TLS version is used, even from poor old .NET 4.0.

Serial Number In Intel Hex - Bash Edition

When creating USB devices, having a unique serial number for each one is quite important. Yes, things will work with a duplicate (or missing) serial number, but you might notice Windows getting confused. With serial devices, the most noticeable effect will be a different port assignment on each plug-in. Not a deal breaker, but a bit annoying.

The easiest way to deal with this is having a dummy serial number in code and then using a post-build action to change it in the final Intel HEX file. For the TmpUsb project I even created a script to do exactly this. However, that script was written in PowerShell, which until recently (March 2020) didn’t work on Linux and definitely doesn’t come installed there by default. This and a few other reasons were why I decided to rewrite it in Bash for my CyberCard project.

The first step is setting up the serial number. The most crucial part here is the __at() macro that ensures the serial number always ends up at a predictable location. For my example, I selected a location close to the end of the addressable space (0x1F00). One could use special characters here, but I would highly advise having a valid USB serial number in place just in case the script fails. That means at least 12 alphanumeric characters, but I always go for numbers only as it makes the script a bit easier.

//Serial string descriptor
const struct {
    uint8_t bLength;
    uint8_t bDscType;
    uint16_t string[13];
} sd003 __at(0x1F00) = {
    sizeof (sd003), USB_DESCRIPTOR_STRING, {
        '1','9','7','9','0','1','2','8','1','8','1','5','0'
    }
};

In the final Intel HEX file, that location will end up multiplied by 2 (thus at offset 0x3E00) and further shifted by 4 hexadecimal characters. The script then needs to replace every 4th hexadecimal byte, up to a count of 13 in my case (as my serial number is 13 digits long).
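The core of that replacement can be sketched as follows. This is just an illustration of the encoding, not the actual update-serial.sh (which additionally has to locate the right records and fix up the Intel HEX checksums): each UTF-16LE digit occupies 4 hex characters, with its ASCII code in the low byte and 00 in the high byte, which is exactly why only every 4th byte needs changing.

```shell
# Sketch only: encode a 13-digit serial the way it appears in the HEX file.
# Each digit becomes its ASCII code followed by a 00 high byte (UTF-16LE).
serial="1979012818150"
hex=""
for c in $(echo "$serial" | fold -w1); do
    hex="$hex$(printf '%02X00' "'$c")"
done
echo "$hex"   # 3100390037003900300031003200380031003800310035003000
```

Generating a fresh random serial and splicing its encoding over the old one at the known offset is then a matter of plain string substitution, plus recalculating the checksum byte of each modified record.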

The only task remaining is calling it once the project is done building. Previously I would do this using the project properties dialog (Project Properties, Conf, Building, Execute this line after build), but that proved to be an issue if you want to compile on both Windows and Linux. While the Linux setup was quite straightforward, on Windows it would get confused if you had any Linux-based toolset installed (e.g. Git Bash) and throw an error:

"User defined post-build step: [update-serial.sh]"
nbproject/Makefile-default.mk:105: recipe for target '.build-conf' failed
make[1]: Leaving directory 'Q:/Projects/Electronics/CyberCard/Firmware/Source'
nbproject/Makefile-impl.mk:39: recipe for target '.build-impl' failed
process_begin: CreateProcess(NULL, bash Q:\Projects\Electronics\CyberCard\Firmware\Source\update-serial.sh, ...) failed.
make (e=2): The system cannot find the file specified.

One workaround was to simply wrap the call to this script in a .cmd (or .bat) file, but that would cause issues when compiling under Linux. If only there were a way to execute one command on Windows and another when running on Linux… Cue the Makefile.

While you cannot differentiate between operating systems from the GUI, you can edit the MPLAB project Makefile directly. There we can point toward the .cmd file on Windows and toward the .sh script when running on Linux.

.build-post: .build-impl
# Add your post 'build' code here...
ifeq ($(OS),Windows_NT)
        update-serial.cmd
else
        ./update-serial.sh
endif

All that is left is actually placing the script(s) in the same directory where the Makefile is situated (download here) and you’re good, assuming you defined the serial number as I did. If you didn’t, just adjust the LOCATIONS variable inside.


PS: This assumes you have Git (or some other bash-capable environment) installed in Windows.

PPS: Don’t forget to use a <Tab> for indentation in the Makefile. Spaces won’t do here.

Moving Mercurial Repository from BitBucket to GitHub

I already wrote once about moving from Mercurial to Git. That was way back in 2015 when I moved most of my repositories. However, I didn’t move all of them. Quite a few old electronics projects remained on BitBucket, my Mercurial home of choice. With BitBucket removing Mercurial support starting June 1st, it was time to move the remainder.

First, you will need Ubuntu Linux with a few tools:

sudo apt install --yes git mercurial wget

mkdir ~/bin
wget https://raw.github.com/felipec/git-remote-hg/master/git-remote-hg -O ~/bin/git-remote-hg
chmod +x ~/bin/git-remote-hg

Here git-remote-hg will do most of the heavy lifting, as it enables cloning of a Mercurial repository (if it fails, try adding the --config format.sparse-revlog=0 option):

git clone hg::ssh://hg@bitbucket.org/<user>/<project>

With the repository now in Git format, we can treat it as any other repository and push it to our server. In the case of GitHub, that would look something like this:

git remote rm origin
git remote add origin git@github.com:<user>/<project>.git
git push --all

That’s all. Rinse and repeat until there are no Mercurial repos left.

PS: Considering how great Mercurial is for beginners, I am sad to see its primary online home gone. There are other places that allow hosting Mercurial repositories, but BitBucket was among the first and it actually worked remarkably well.

PPS: Yes, this works for local repositories too.