Programming in C#, Java, and god knows what not

Living in High DPI

What’s the problem?

Most applications are pretty ignorant of DPI settings (including a majority of my own). Developers tend to optimize for 96 DPI and leave it at that. With more and more high-DPI LCDs it is very hard for users to stay at that setting, so a lot of them are working at 120%.

On Windows XP this would look very bad if an application was not high-DPI aware, but on Vista Microsoft decided to stretch the client area of those misbehaving applications to match the user-selected DPI setting. The form is a little bit blurry, but that is better than misplaced elements and clipped text. Old misbehaving applications look almost decent there.

The problem is that on Vista even applications that took the effort to look good at higher DPI settings are affected by that scaling. All that work was done for nothing since Vista makes your application think it is running at 96 DPI.

Telling Vista that I am a smart guy

There is a way to let Vista know that we took that additional effort. One just needs to call the SetProcessDPIAware API (Vista only!) and Windows will leave us alone to manage our own interface. Since P/Invoke is not my favorite way of doing things from .NET, I am happy that there is also the possibility of embedding this in the manifest.
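If you do take the P/Invoke route, a minimal sketch might look like this (the declaration is the real user32.dll export; the version check and MainForm are my own assumptions):

    using System;
    using System.Runtime.InteropServices;
    using System.Windows.Forms;

    static class Program
    {
        [DllImport("user32.dll")]
        private static extern bool SetProcessDPIAware();

        [STAThread]
        static void Main()
        {
            // Vista only! Must be called before any window is created.
            if (Environment.OSVersion.Version.Major >= 6)
            {
                SetProcessDPIAware();
            }
            Application.EnableVisualStyles();
            Application.Run(new MainForm()); // MainForm is a hypothetical form
        }
    }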

If you have Visual Studio 2008, embedding this manifest is as easy as creating a new file, but in older versions you will have some more work to do.
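For reference, the relevant manifest section looks roughly like this - the dpiAware element is the part that matters, the rest is standard manifest boilerplate:

    <?xml version="1.0" encoding="utf-8"?>
    <assembly xmlns="urn:schemas-microsoft-com:asm.v1" manifestVersion="1.0">
      <asmv3:application xmlns:asmv3="urn:schemas-microsoft-com:asm.v3">
        <asmv3:windowsSettings xmlns="http://schemas.microsoft.com/SMI/2005/WindowsSettings">
          <!-- Tells Vista the process manages its own DPI scaling. -->
          <dpiAware>true</dpiAware>
        </asmv3:windowsSettings>
      </asmv3:application>
    </assembly>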

Some time later I will lead you through the actual creation of a well-behaving application.

Default System Font

.NET Framework 1.0 and above

By Microsoft’s design guidelines you should be using the system-defined font for displaying user interface elements in your program. Of course, Microsoft found it very helpful to ignore that and assign Microsoft Sans Serif at 8.25 points as the default font for all new .NET applications. Since both XP and Vista brought us new fonts (Tahoma at 8.25 points and Segoe UI at 9 points, respectively), most .NET applications just look slightly out of place.

The solution is quite simple - you only need to manually assign the new font to the form, and all elements will pick up that setting (if you left them at their defaults). The only problem is retrieving the font to use.

Proper way

Microsoft states that we should use the GetThemeFont API to get this information. The problem is that this function does not exist in the managed world, you don’t always have the option to make P/Invoke calls (e.g. because of security settings), and portability also comes into question (Windows versions older than XP, and Mono).

SystemFonts.???

One would think that SystemFonts.DefaultFont would return what we need, but for some reason known only to the person who wrote that function, it returns Microsoft Sans Serif in almost all cases. Why it ignores the real Windows default font is unknown to me. Some digging with reflection is probably needed, but I tend to be on the lazy side when there is no actual possibility of changing anything.

Strangely enough, a good choice is SystemFonts.MessageBoxFont, since it returns the proper font on all platforms I have used it on. It is not the official source for that information, but it should be good enough if you really want (or need) to stay in the managed world.

If one wants to see an example of it, I am happy to provide one.
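Something like this, for instance - a minimal sketch where the font is assigned in the form’s constructor and all child controls left at their defaults inherit it:

    using System.Drawing;
    using System.Windows.Forms;

    public class MainForm : Form
    {
        public MainForm()
        {
            // Tahoma 8.25pt on XP, Segoe UI 9pt on Vista, and so on.
            this.Font = SystemFonts.MessageBoxFont;
        }
    }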

How to Choose Barcode Symbology

For one project I needed to select a proper barcode symbology (a way of encoding data). The requirements were clear: variable length (which excludes the EAN/UPC subset), some form of checksum, and it needs to work with standard (a.k.a. cheap) equipment. That left me with a few candidates which I will try to describe.

2 of 5

This symbology is well supported since it is quite old (circa 1960) and it is used within industry. It is a numeric-only code and there is optional support for a checksum (modulo 10) digit, but that digit needs to be encoded and decoded by software (this may be a problem if some devices are not under your control). Another problem is the width of the code, since this one encodes data only in the bars, which makes for a lot of wasted space. Its low density could be a problem if you are “space challenged”.

To overcome the excessive width of standard 2 of 5, somebody thought of encoding data in both bars and spaces. That variant (Interleaved 2 of 5) is effectively twice as dense as the standard one (or twice as short for the same amount of data), but it only supports an even number of digits. Everything else said for the standard version holds for this one too.
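Since the check digit is left to software, here is a minimal sketch of the usual modulo 10 scheme (the same weight 3/1 calculation as EAN/UPC, applied from the rightmost digit; verify against your equipment before relying on it):

    static int Mod10CheckDigit(string digits)
    {
        int sum = 0;
        for (int i = 0; i < digits.Length; i++)
        {
            // Weight 3 on the rightmost digit, then alternate 1, 3, 1, ...
            int value = digits[digits.Length - 1 - i] - '0';
            sum += value * ((i % 2 == 0) ? 3 : 1);
        }
        return (10 - (sum % 10)) % 10;
    }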

Codabar (NW-7)


Used mostly within blood banks, this symbology allows numbers and a limited set of symbols (- $ : / . +). Since it is self-checking, no additional checksum is needed, but one can always use a software one. The code can start and end in four different ways (usually called A, B, C, and D), so there is also a possibility of differentiating codes on that basis (but be aware of the lock-in, since no other symbology has that option). Since characters are separated by spaces, the code is not among the shortest.

Code 3 of 9 (Code 39)

This is an alphanumeric code that can encode numbers, uppercase letters, and some symbols (space - . $ / + % *). It is self-checking, so a checksum is not necessary, but there is a defined way of adding one if more security is needed. There is also a possibility of concatenating multiple barcodes together, but that feature is rarely used (just don’t start your code with a space). An extended variant exists that can encode the whole ASCII (0-127) range.

Code 128


This symbology is a three-in-one deal. Its three different encodings not only allow the full ASCII (0-127) range to be encoded, but there is also a special double-density mode which encodes numeric-only data in half the width. A checksum (modulo 103) is a mandatory part of the encoding, so no additional support is needed within software. Since the symbols are also self-checking, this gives very high confidence in reading. There were small problems with reading Code 128 on old barcode readers, but everything recent on the market supports it. Since there are three different ways of encoding data (and one can switch between them within a single code), writing an optimal encoder is not an easy task.
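To illustrate just the checksum part, here is a minimal sketch of the modulo 103 calculation, assuming the encoder has already mapped the data to symbol values:

    static int Code128Checksum(int startCodeValue, int[] symbolValues)
    {
        // The start code counts with weight 1; every following symbol
        // is weighted by its 1-based position in the code.
        int sum = startCodeValue;
        for (int i = 0; i < symbolValues.Length; i++)
        {
            sum += symbolValues[i] * (i + 1);
        }
        return sum % 103;
    }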

Conclusion

In the end, I selected Code 128. Not only does it give the highest possible level of security, but it also produces the shortest code (with numeric data at double density). The slightly complex encoding was just another challenge that needed overcoming. C# source code is available.

Useful pages

Here are a few pages where you can find more information - enough to write your own encoder.

Bug With CenterParent

One may think that centering a window is an easy task. Just set StartPosition to CenterParent and everything should work. This is true in most cases; however, there is one bug that makes things more annoying than they should be.

Reproducing the bug

The first step should always be to reproduce the buggy behaviour. In this case it is fairly easy. Just make a new application with two forms: one parent which we will leave at its default settings, and one child for which we will change StartPosition to CenterParent. On the main form create two buttons. The first button should show a modal window (ShowDialog) and the second should create just an owned window (Show), like in this example.
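A minimal sketch of the two handlers (the button names are mine; ChildForm is assumed to have StartPosition set to CenterParent in the designer):

    private void buttonModal_Click(object sender, EventArgs e)
    {
        using (var form = new ChildForm())
        {
            form.ShowDialog(this); // centered over the parent, as expected
        }
    }

    private void buttonOwned_Click(object sender, EventArgs e)
    {
        var form = new ChildForm();
        form.Show(this); // not centered - this is the bug
    }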

When you click the second button you will notice that the form isn’t centered at all. Every window created just goes to the next default position.

Although this is definitely a bug, Microsoft decided not to fix it. The explanation is rather strange: “a fix here would be a breaking change to the behavior of WinForms 1, 1.1 and 2.” I am not disputing that fixing it would be a change, but how many people would it really affect in a negative manner? I personally cannot imagine a single person setting a child form to CenterParent and then being angry because the form is centered.

Workaround

Like in the “good old days”, we need to center it manually. Do not forget to set StartPosition to Manual and then just use some good old mathematics to fix it.
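A minimal sketch of that mathematics, assuming it runs inside the parent form and ChildForm is the hypothetical child:

    var form = new ChildForm();
    form.StartPosition = FormStartPosition.Manual;
    // Center the child over this form before showing it.
    form.Location = new Point(
        this.Left + (this.Width - form.Width) / 2,
        this.Top + (this.Height - form.Height) / 2);
    form.Show(this);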

Resolution

The real solution would be to fix it in the framework, but since the bug report has already been rejected, I wouldn’t hold my breath. An easier solution (.NET Framework 3.5) is to create an extension method and solve it there. It is not a resolution as such, but it makes the whole thing look nice when you need it.
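A sketch of such an extension method (the name ShowCenteredOn is my own invention):

    using System.Drawing;
    using System.Windows.Forms;

    public static class FormExtensions
    {
        public static void ShowCenteredOn(this Form form, Form owner)
        {
            // Same mathematics as above, packaged for reuse.
            form.StartPosition = FormStartPosition.Manual;
            form.Location = new Point(
                owner.Left + (owner.Width - form.Width) / 2,
                owner.Top + (owner.Height - form.Height) / 2);
            form.Show(owner);
        }
    }

After that, showing a centered child is a one-liner: new ChildForm().ShowCenteredOn(this);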

X86, X64, IA64, AMD64, EM64T...

x86

Back in the “good old times” Intel named its processors with model numbers: 8086, 80286, 80386, 80486, 80586 (ok, this one didn’t really exist - the Pentium was introduced instead). Since they were all pretty much the same architecture, at one point somebody just started saying x86 processors. To make things more interesting, although x86 could refer to the 80286, which is a 16-bit processor, these days we use it to refer to the 32-bit architecture only. Intel itself referred to it as IA-32, but that name never really sounded as good.

IA64

Intel decided to make a new 64-bit processor. Since IA-32 was heavy with compatibility baggage going as far back as the 8086, they made a clean cut. Everything on this processor was bigger, better, and incompatible with the old IA-32 architecture. Since the server market needed 64-bit, some advances were made there, but market penetration wasn’t as good as Intel hoped. The problem was that native applications came in small numbers, while compatibility with the old IA-32 (or x86) instruction set was really slow. The architecture still lives on in the Itanium processor.

x64

AMD noticed the problem and made its own version of a 64-bit processor. They just added some new 64-bit instructions to the already existing x86. The solution was not as clean as IA-64, but old applications worked at the same speed (no emulation was needed) and new applications could address a 64-bit space. That solution is now known as x86-64.

It took a while for Intel to see that its Itanium was going nowhere near consumer machines. When they finally took notice, an unbelievable thing happened - Intel adopted the AMD64 instruction set. This made the two architectures the same in the sense of programming support: one doesn’t need to care whether one writes for Intel or AMD (not entirely true for early Intel models, since they lacked some instructions). Intel’s early name for its version was EM64T, just in case you were interested.

64-Bit - How Hard Can It Be?

If you do your programming in the .NET world, the answer is clear: it is not hard at all. As long as you keep your hands out of the cookie jar (also known as Win32), moving to 64-bit is just none to a few clicks away.

Of course, some prerequisites are needed.

You need to have a 64-bit operating system

This is pretty obvious - a 32-bit operating system on 64-bit hardware will work in 32-bit mode. No big surprise here.

You need to work on .NET Framework 2.0 or above

.NET Framework 1.0 and 1.1 exist in 32-bit versions only. They will run on 64-bit platforms without any problem, but they will do so through a compatibility layer called WOW64 and thus get no support for the 64-bit address space - everything above 2 GB stays unavailable.

You need to tell your favorite .NET compiler that you want that


By default, .NET will compile code for a target platform called “Any CPU”. Although one could think that this would produce code that is the common denominator of all - 32-bit - it will actually mark the executable as both 32-bit and 64-bit capable. This is possible since the code is compiled to processor-agnostic CIL. On 32-bit systems it will run as 32-bit code; on 64-bit systems it will run as 64-bit code. In case you need insane amounts of memory (for me insane is above 2 GB) at all times, you can select x64 or Itanium as your target and make your code unable to run in 32-bit mode at all.
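If you want to check at run time how an Any CPU build actually ended up running, here is a quick sketch (IntPtr.Size is 4 in a 32-bit process and 8 in a 64-bit one; this works on .NET Framework 2.0, unlike the later Environment.Is64BitProcess):

    using System;

    static class BitnessCheck
    {
        static void Main()
        {
            Console.WriteLine(IntPtr.Size == 8
                ? "Running as a 64-bit process."
                : "Running as a 32-bit process.");
        }
    }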

If you use an installer, let it know about your bitness also.


If you pack your code into an MSI installer, you have a problem. There is no way to tell it that your code is both 32-bit and 64-bit (Any CPU).

If you select x86 as your platform, it will install correctly on both 32-bit and 64-bit Windows, but on 64-bit it will install into the Program Files (x86) folder. That folder is reserved for legacy 32-bit code, and having your application there sends a clear signal to users that something is wrong, although it will run as a 64-bit application when you start it.

If you select x64 or Itanium as your target platform, you will end up with an installer that shows an error message and refuses to proceed if the system is 32-bit (or the other 64-bit one), even though the code would run just fine.

There are two solutions: either make a separate MSI package for every platform or switch installers. Neither of the two is a nice one. :(