Programming in C#, Java, and god knows what not

HRESULT

HRESULT (a.k.a. SCODE) is one of the data types you will see when playing with COM objects.

Although the data type is called a result handle, it is not a handle to anything. While this may seem like nitpicking, there is one very important difference on 64-bit systems: this value is always 32 bits in length (UInt32), while a true handle would be either 32 or 64 bits (IntPtr).

Another thing that may strike you as unexpected is that the result of a successful operation is not necessarily 0, as you might expect from your Windows API experience. The value is divided into a total of seven sub-fields, but in reality only three are used.

Severity

If all you need is to check whether the operation was successful, you can just look at the state of the highest bit (31), which holds the severity (SEVERITY_SUCCESS or SEVERITY_ERROR). If that bit is cleared, you are fine; in case of trouble, that bit is set:

bool IsSuccess(uint error) {
    // Bit 31 is the severity bit; cleared means success, set means error.
    return (error & 0x80000000) != 0x80000000;
}

If the operation was successful, you usually don’t care about the other two fields. They typically become handy only when an error occurs.

Facility

The second field is the facility code, which occupies bits 16 to 26, so some bit-magic is needed:

ushort GetFacilityCode(uint error) {
    // The facility lives in bits 16 to 26; mask it out and shift it down.
    return (ushort)((error & 0x07FF0000) >> 16);
}

If you get a facility number less than or equal to 8, you can be pretty sure that the error originated from Windows. If you have SEVERITY_ERROR and the facility value is 0 (FACILITY_NULL), the developer was too lazy to assign a facility code.

Error code

The third field is the most useful one. Bits 0 to 15 give you the real error code:

ushort GetErrorCode(uint error) {
    // The error code sits in the lowest 16 bits.
    return (ushort)(error & 0x0000FFFF);
}

For errors that originated somewhere in the Windows API (and that is a lot of them), this will give you the system error code (e.g. ERROR_FILE_NOT_FOUND). Even if the error originated from some custom source, you can be almost sure it reused the value for the same purpose. From this code you can pretty much deduce where the problem lies.
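
To see how the pieces fit together, here is what the helpers above report for 0x80070002, which is ERROR_FILE_NOT_FOUND wrapped into an HRESULT:

uint hr = 0x80070002;                     // ERROR_FILE_NOT_FOUND wrapped into an HRESULT

Console.WriteLine(IsSuccess(hr));         // False - the severity bit (31) is set
Console.WriteLine(GetFacilityCode(hr));   // 7     - FACILITY_WIN32, so it came from Windows
Console.WriteLine(GetErrorCode(hr));      // 2     - ERROR_FILE_NOT_FOUND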

Gadget's MIME Type

Once upon a time I created a Windows gadget. Since I wanted to distribute it, I just uploaded it to my site and thought I was done with it. However, once I tried to download it, I got the famous 404 “The page cannot be found” message. I double-checked everything and the file was there. What wasn’t there was the ability of IIS to handle it.

In order for IIS to know what to do with it, there is a little piece of configuration called the MIME type. I will not go into deep definitions here; it is sufficient to say that the “gadget” extension should be mapped to “application/x-windows-gadget”. Once that task is done, the file can be downloaded without further problems.
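
For reference, on IIS 7 and later the same mapping can also be added through web.config (a minimal sketch; adding it through the IIS manager works just as well):

<configuration>
  <system.webServer>
    <staticContent>
      <!-- Let IIS serve .gadget files -->
      <mimeMap fileExtension=".gadget" mimeType="application/x-windows-gadget" />
    </staticContent>
  </system.webServer>
</configuration>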

Virtual Disk API in Windows 7


I already wrote about the Windows 7 API for virtual disk support (for the beta and the release candidate).

Just as a reminder, this is a .NET wrapper for the Windows virtual disk API. It enables you to create and attach (among a few other things) virtual disks (.vhd) directly from C#.

This version can be downloaded here. For a shorter and more focused view, you can check the open and create examples. Although they were written while Windows 7 was still a release candidate, they remain valid.

Windows Marketplace


Windows Marketplace is a new idea from Microsoft to collect all Windows Mobile applications in one place (like Apple’s App Store). Today it started accepting applications from vendors. While this is definitely a good thing, I do not understand some of the road-blocks.

I, as a developer from Croatia, cannot use it. Only developers from some twenty countries are deemed worthy of having their applications distributed. If you happen to be outside of those countries - tough luck.

The annual subscription fee is set at $99. While this is not much, it is only the start. There is also a paid certification process, the intent of which is to ensure applications are good enough for users. Part of passing it is having a proper code-signing certificate, and that doesn’t come cheap either. This is a fine platform for companies with a nice little budget assigned, but it will definitely “scare away” freeware developers. Applications that they give away for free would now lose them money.

As you can guess, those two reasons were enough to drive me away from something that looks like a good idea.

GetHashCode

GetHashCode is an approximate method for checking equality that can end up being used by other classes (the most popular one being Hashtable).

.NET documentation gives us these rules:

  • Equal objects should return the same hash code.
  • The same object should return the same hash code every time until the object is changed.
  • The distribution of hash codes should be random.

Equal objects should return the same hash code

Sometimes checking for equality can be expensive, and it may be several times faster to first check whether the hash codes are equal. This will not give you a definitive answer, but it will allow you to remove quite a few objects from consideration. Hashtable does this through its bucket mechanism. If you insert an object that doesn’t follow this rule, Hashtable may be unable to return the correct result (e.g. for the ContainsKey method).

Note that this doesn’t mean that different objects must return different hash codes.

The same object should return the same hash code every time until the object is changed

This one seems logical since it follows the same reasoning as the first rule. Whenever a property that can make your Equals or Compare methods return a different result changes value, you should recalculate your hash code. Small changes to the object (something that doesn’t affect these methods) should not generate a new hash code (they can, but there is no reason to).

The distribution of hash codes should be random

This one is introduced to help classes that use buckets for divide-and-conquer (Hashtable is our example again). It ensures that every bucket is filled to approximately the same level, so that no search needs to check every object.

The worst-case scenario is every object returning the same hash code value (e.g. return 0). While this follows rules one and two, performance-wise it is awful, since every check will need to take all objects into consideration.

This is an important rule, but you should not go through too much effort for it. Since the GetHashCode method can get called a lot during a search through a collection, you should try to make it as fast as possible. A fast GetHashCode with a less-than-optimal distribution will often out-perform elaborate and slow code.

What happens if I don’t override GetHashCode?

You will get a default hash code that is based on the memory address of your object (in future implementations this may change). While this works well enough, it may fail to recognize two objects as being the same if they were created separately (e.g. retrieving the same data row two times). In most cases it will work, but it is generally a bad idea to rely on it with collections (most of them use buckets), and it can lead to bugs that are difficult to find.
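
To illustrate the kind of bug this can produce, here is a small made-up example where Equals is overridden but GetHashCode is not:

using System;
using System.Collections;

class Person {
    public string Key;
    public Person(string key) { this.Key = key; }

    public override bool Equals(object obj) {
        Person other = obj as Person;
        return (other != null) && (this.Key == other.Key);
    }
    // GetHashCode is intentionally not overridden; the default (identity-based) one is used.
}

class Program {
    static void Main() {
        Hashtable table = new Hashtable();
        table.Add(new Person("42"), "some value");

        // Equals considers these two objects equal, but their default hash codes differ,
        // so Hashtable looks into the wrong bucket and misses the key.
        Console.WriteLine(table.ContainsKey(new Person("42")));   // False
    }
}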

How to generate it then?

If you intend to use a class with a collection, you have probably already overridden the Equals method (or implemented one of the comparison interfaces, e.g. IComparer). Whatever you use there to check for equality, use it in GetHashCode also. E.g. if your application uses a property named Key for Equals, write:

public override int GetHashCode() {
    return this.Key.GetHashCode();
}

This makes it both simple and fast (if the type of Key is one of the built-in .NET types) while following all the rules.

A slightly more complicated situation is when you check against more than one property. One path you could take is to base GetHashCode on the element that changes more frequently. This will cause a few collisions (different objects will have the same hash code) but it will not cause bugs. Depending on how many properties you have, it may not even have a big hit on performance.

The other approach is combining two hash codes into one, e.g.:

public override int GetHashCode() {
    return this.Key.GetHashCode() ^ this.Key2.GetHashCode();
}

If you go that way, always measure the speed. In more than one case you will find that a GetHashCode method which takes all elements into consideration is slower than one that allows some collisions. It all depends on the objects you use. From my experience, I would recommend avoiding calculating the hash code from more than two properties.

Caching?

While caching may sound like a good idea, there is no need for it if you rely on the GetHashCode of the .NET Framework’s own classes (as we did in the examples above). Those classes either already cache the value or use an operation that is fast enough that caching is not needed.

Only if you have your own hashing mechanism should you consider caching the results. Do not forget to also update the hash code when the object changes.
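
A minimal sketch of what such caching could look like, assuming a hypothetical class with an expensive custom hash (all names here are made up):

class Document {
    private string _content = "";
    private int _hashCode;
    private bool _hashIsValid;       // cache marker; cleared whenever relevant data changes

    public string Content {
        get { return _content; }
        set {
            _content = value;
            _hashIsValid = false;    // content changed, cached hash code is stale
        }
    }

    public override bool Equals(object obj) {
        Document other = obj as Document;
        return (other != null) && (this._content == other._content);
    }

    public override int GetHashCode() {
        if (!_hashIsValid) {
            _hashCode = CalculateExpensiveHash(_content);
            _hashIsValid = true;
        }
        return _hashCode;
    }

    private static int CalculateExpensiveHash(string text) {
        // Stand-in for your own hashing mechanism; a real one would do the expensive work here.
        return text.GetHashCode();
    }
}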

Is it worth it?

If you are using anything from the System.Collections namespaces, the answer is yes. Almost everything there either already uses GetHashCode or may use it in the future. Even the simplest of all hash codes will help performance.

Single Parent

Quite a few applications have both a menu (MenuStrip control) and a toolbar (ToolStrip control). Both of these controls have their own menu containers: in the case of MenuStrip this is ToolStripMenuItem, and in the case of ToolStrip we can use ToolStripSplitButton to get the same effect. Both of those containers share the DropDownItems property, so you might think you could make one ToolStripMenuItem and add it to both:

var newItem = new ToolStripMenuItem("Test");
newItem.Click += new EventHandler(newItem_Click);
toolStripMenuItem1.DropDownItems.Add(newItem);
toolStripSplitButton1.DropDownItems.Add(newItem);

This code looks nice but it does not work. In this case Test gets added to toolStripSplitButton1 only.

The culprit is the SetOwner method (as seen with Reflector):

private void SetOwner(ToolStripItem item) {
    if (this.itemsCollection && (item != null)) {
        if (item.Owner != null) {
            item.Owner.Items.Remove(item);
        }
        item.SetOwner(this.owner);
        if (item.Renderer != null) {
            item.Renderer.InitializeItem(item);
        }
    }
}

As you can see, if an item already has an owner, it is first removed from that owner’s collection, and only then is the new owner set.

The only solution is to create two separate items and assign each to its own parent control:

var newItem1 = new ToolStripMenuItem("Test");
newItem1.Click += new EventHandler(newItem_Click);
toolStripMenuItem1.DropDownItems.Add(newItem1);

var newItem2 = new ToolStripMenuItem("Test");
newItem2.Click += new EventHandler(newItem_Click);
toolStripSplitButton1.DropDownItems.Add(newItem2);

While this means you have two pieces of the same code, you can find consolation in the fact that the event handler method can be reused.
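
If the duplication bothers you, a small factory method (the name CreateTestItem is mine) can keep it in one place while still giving each container its own item:

private ToolStripMenuItem CreateTestItem() {
    var item = new ToolStripMenuItem("Test");
    item.Click += new EventHandler(newItem_Click);   // same handler for every copy
    return item;
}

// Each container gets its own instance, so SetOwner has nothing to steal.
toolStripMenuItem1.DropDownItems.Add(CreateTestItem());
toolStripSplitButton1.DropDownItems.Add(CreateTestItem());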

One Step Back

I was more than surprised to find that Visual Studio 2010 (beta 1) has moved the default platform target from “Any CPU” to “x86”. In short, every application you create with Visual Studio 2010 will not run on 64-bit Windows without the compatibility layer (WOW64).

While this may seem like a minor change, I consider it a major problem for 64-bit programming. When Visual Studio 2008 defaulted to Any CPU, that did cause a whole range of bugs and errors. However, by resolving those errors, the developer was at least somewhat introduced to the problems his code has in a 64-bit environment. If he could not resolve them, he could still switch the target to 32-bit manually.

This change just seems like allowing careless developers to stay careless. In the short term it will not cause too many problems, but in the long run you will end up with a bunch of code that requires the 32-bit compatibility layer (and 32-bit Windows will not be with us forever).
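
For what it is worth, the .NET Framework 4 that ships with Visual Studio 2010 makes it easy to check which situation your application ended up in (both properties below are new in .NET 4):

// An x86-targeted build on 64-bit Windows reports a 32-bit process on a 64-bit OS,
// which means it is running under the WOW64 compatibility layer.
Console.WriteLine(Environment.Is64BitProcess);
Console.WriteLine(Environment.Is64BitOperatingSystem);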

Why Shouldn't I Sign My Code

A while ago I talked about reasons for signing your executables. Now I will say something about the reasons against it.

Speed issues

In certain situations the loading of signed .NET assemblies may be delayed. While this is usually not that noticeable, it is something to keep in mind.

Not having a trusted root certificate

If you don’t have a certificate from one of the trusted root authorities (e.g. VeriSign), you cannot count on Windows recognizing your signature. You will get the same prompt as unsigned applications do (on Windows Vista and 7). While your application may be signed, your users will not see that.

What do I do?

I do not sign my assemblies. I did try it for a while, but without a proper (expensive) certificate there is just no reason to do it.

However, I do strong-name my assemblies. This is not exactly the same as signing them, but it does provide an integrity check, which is the only thing I really need.
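
For completeness, strong-naming boils down to signing the assembly with a key pair generated by the sn.exe tool (e.g. sn -k MyKey.snk). One minimal way to wire it up is through AssemblyInfo.cs, although the project’s Signing tab does the same thing:

using System.Reflection;

// Points the compiler at the key pair; newer project templates set this through
// project properties instead of the attribute, but the end result is the same.
[assembly: AssemblyKeyFile("MyKey.snk")]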

Reordering TabPages Inside TabControl

When using the TabControl in my programs, there is one thing users always ask for - “let me reorder those tabs”. Since reordering tabs is such a logical operation, one would think that the mighty .NET 2.0 Framework has it built in. But no luck there. Visual Studio 2008 does have that feature, but framework mortals are not blessed with it.

Since Visual Basic programmers waited for inheritance for far too long, they tend to use it for solving every possible problem (when you have a hammer, all problems look like nails). May I say that the same inheritance approach will be used here. :)

What we want to do here is extend the standard TabControl. Inside it, we are interested in the MouseDown, MouseMove and MouseUp events. Some people like to use drag and drop here, but that seems to me like using a bomb to kill a fly.

MouseDown

Upon receiving a MouseDown event, we check whether the left button is used and whether the TabControl has a tab selected. If those requirements are fulfilled, we initialize the SourceTabPage variable. With that, we are ready for further events.

MouseMove

When the mouse is moved, we must first check whether reordering is in progress. If it is, we look at which TabPage is under the cursor. If it is not the starting one - we have a show.

One of the things we must discover is which TabPage we are hovering over. Since, unfortunately, we cannot use some sort of HitTest, we must iterate through all TabPages and select the one that contains our coordinates.

After that, we check which side of the tab we are hovering over. This is only important if you wish to display a different cursor for each side. These two are the ones I use, but you can come up with your own scheme.

MouseUp

Here, we must know which side we are on, since the new location (before or after the hovered TabPage) depends on this information. This code is basically the same as the one used for determining the cursor type. Once the state is cleared, our work is done.
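
To tie the three handlers together, here is a minimal sketch of how such an extended control could look. The member names (SourceTabPage, GetTabIndexAt) and the two cursors are my own choices and not necessarily what the downloadable source uses:

using System.Drawing;
using System.Windows.Forms;

public class ReorderableTabControl : TabControl {

    private TabPage SourceTabPage;   // tab under the mouse when the drag started

    protected override void OnMouseDown(MouseEventArgs e) {
        base.OnMouseDown(e);
        if ((e.Button == MouseButtons.Left) && (this.SelectedTab != null)) {
            this.SourceTabPage = this.SelectedTab;
        }
    }

    protected override void OnMouseMove(MouseEventArgs e) {
        base.OnMouseMove(e);
        if (this.SourceTabPage == null) { return; }   // reordering is not in progress

        int hoverIndex = GetTabIndexAt(e.Location);
        if ((hoverIndex < 0) || (this.TabPages[hoverIndex] == this.SourceTabPage)) {
            this.Cursor = Cursors.Default;
        } else {   // different cursor for each side of the hovered tab
            Rectangle rect = this.GetTabRect(hoverIndex);
            bool leftHalf = e.X < rect.Left + rect.Width / 2;
            this.Cursor = leftHalf ? Cursors.PanWest : Cursors.PanEast;
        }
    }

    protected override void OnMouseUp(MouseEventArgs e) {
        base.OnMouseUp(e);
        if (this.SourceTabPage == null) { return; }

        int hoverIndex = GetTabIndexAt(e.Location);
        if ((hoverIndex >= 0) && (this.TabPages[hoverIndex] != this.SourceTabPage)) {
            Rectangle rect = this.GetTabRect(hoverIndex);
            bool insertBefore = e.X < rect.Left + rect.Width / 2;

            TabPage hoverPage = this.TabPages[hoverIndex];
            this.TabPages.Remove(this.SourceTabPage);
            int newIndex = this.TabPages.IndexOf(hoverPage);
            if (!insertBefore) { newIndex += 1; }
            this.TabPages.Insert(newIndex, this.SourceTabPage);
            this.SelectedTab = this.SourceTabPage;
        }

        this.SourceTabPage = null;       // clear the state, reordering is finished
        this.Cursor = Cursors.Default;
    }

    private int GetTabIndexAt(Point location) {
        // There is no public hit-test on TabControl, so check every tab rectangle.
        for (int i = 0; i < this.TabCount; i++) {
            if (this.GetTabRect(i).Contains(location)) { return i; }
        }
        return -1;
    }
}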

Conclusion

This extended control offers a good solution for reordering TabPages inside a TabControl. I think the only thing one might hold against it is that the order is updated only on MouseUp, but that decision was made for the sake of speed and code clarity. It is also very easy to implement changes by further extending the control, since it uses the protected OnMouseX procedures.

The source code is available in both VB.NET and C#. Hope you like it.

P.S.

This is a redo of my Code Project article.

Intellisense 10-4


While C#’s IntelliSense has always been good, in Visual Studio 2010 it is enhanced even more. When you write e.g. “reason”, it will also match CloseReason. While this may not seem like much, it is a gem when you do not remember the exact property name.

A small feature that helps a lot.