A Case Study in Producing Unique Hash Values

As I was working on updates to The Krewe app, I found myself needing to re-implement GeoCoordinate. Before you ask, I'm building a PCL library, and the PCL doesn't have a shared version of the object. You can have a GeoCoordinate on the phone, or a GeoCoordinate on the desktop or tablet, but not one type that works everywhere. I figured the object was lightweight enough that it wouldn't be too difficult to implement in a portable way.
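
For reference, the portable class itself can be as simple as this sketch; the names and shape here are placeholders, not the actual Krewe code.

public class GeoCoordinate {
    // Latitude in degrees, -90 to 90; longitude in degrees, -180 to 180.
    public double Latitude { get; private set; }
    public double Longitude { get; private set; }

    public GeoCoordinate( double latitude, double longitude ) {
        Latitude = latitude;
        Longitude = longitude;
    }
}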

Being the good programmer I am, I went to override the 3 overridable methods from Object. Of course, Equals and ToString were trivial. Then, I came to GetHashCode. At this point, I needed to turn two doubles into a unique integer.
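
For context, the two trivial overrides might look something like this sketch, using the properties from the class above; the real implementations may differ.

public override bool Equals( object obj ) {
    var other = obj as GeoCoordinate;
    if( other == null ) return false;
    return Latitude == other.Latitude && Longitude == other.Longitude;
}

public override string ToString() {
    return string.Format( "{0}, {1}", Latitude, Longitude );
}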

This gave me some pause. The easy answer would be to take the hash codes of the latitude and the longitude and then do some weird math on those two values to produce my new, "unique" value. However, something about this left me unsatisfied. I was already compressing something with a HUGE range into a much smaller bucket. If I wanted to do something truly effective, I needed to produce a unique value before the compression.
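
For illustration, that easy answer is usually some variation on the following sketch; this is a common hash-combining pattern, not necessarily the exact math I had in mind.

// The "easy answer": hash each component, then mix the two hashes.
public override int GetHashCode() {
    return ( Latitude.GetHashCode() * 397 ) ^ Longitude.GetHashCode();
}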

My first thought was to normalize the values and add them. I put a little thought behind this and quickly discovered a flaw in the idea. Two GeoCoordinates that differed in reciprocal ways would collide. In other words, a lat/lon pair of ( 9, 11 ) would collide with ( 11, 9 ). Of course, I'm really worried about values down at the 0.00001 level. You might think this is a bit extreme, but with geocoordinates come geofences. Now, pairs of coordinates with offset values don't seem so unlikely.
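
A quick, hypothetical sketch makes the collision obvious:

// "Normalize and add": the sum is the same for ( 9, 11 ) and ( 11, 9 ),
// so the resulting hash codes collide.
static int NaiveSumHash( double latitude, double longitude ) {
    return ( latitude + longitude ).GetHashCode();
}

// NaiveSumHash( 9.0, 11.0 ) == NaiveSumHash( 11.0, 9.0 )   -- both sums are 20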

My next thought was to use the leading digits of the latitude and longitude to build an integer. Based on the maximum size of an integer, that gave me only 2 or 3 decimal places to work with, and 3 decimal places is only considered accurate to the neighborhood. This would definitely lead to collisions.
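
Roughly, the back-of-the-envelope math behind that limit looks like this; it's my reading of the constraint, not a precise derivation.

// int.MaxValue is 2,147,483,647 -- about 9 usable decimal digits.
// Packing the whole-degree parts of both values ( up to 90 and 180 )
// already spends about 5 of those digits, leaving only 4 or 5 digits
// to split between the two fractional parts: 2 or 3 places each.
Console.WriteLine( int.MaxValue );   // 2147483647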

Finally, the light bulb went off. Between the two doubles, the highest magnitude either would have is 180. Doubles are "accurate" to such an astronomically high value that I had plenty of room to play. I began by normalizing the latitude and longitude to be positive. The value range for a latitude is -90 to 90; the value range for a longitude is -180 to 180. Thus, I added 90 to the latitude and 180 to the longitude.
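
In code, the normalization is just two shifts; a sketch using the properties from earlier:

// Shift both values so they are always non-negative:
// latitude:  -90..90   becomes 0..180
// longitude: -180..180 becomes 0..360
double normalizedLatitude = Latitude + 90.0;
double normalizedLongitude = Longitude + 180.0;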

Next, I needed to preserve the significant digits I desired. 8 decimal places is accurate to about 1 millimeter. At that point, if I get a collision from my little trick, the two points might as well be the same place. With this in mind, I multiplied the longitude by 100,000,000.

Now, the magic happens. I further multiplied the longitude by 1,000. This gave me a number that has no significant digits in any place that conflicts with the significant digits of the latitude. As such, I was able to add the longitude to the latitude to produce an uber double that is essentially unique for every place on the planet.
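
Spelled out step by step, the scaling and combining look roughly like this expanded sketch of the one-liner below:

// Preserve 8 decimal places of the normalized longitude...
double scaledLongitude = ( Longitude + 180.0 ) * 100000000.0;

// ...then shift it 3 more places so its significant digits sit above
// the latitude's integer part ( which never exceeds 180 after normalization ).
scaledLongitude *= 1000.0;

// Drop everything past the digits we care about and fold in the latitude.
double combined = Math.Truncate( scaledLongitude ) + ( Latitude + 90.0 );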

My final step was simple: ask .NET to give me the hash code for that double. I figure they have a few people smarter than me; I'll get them that far and let them handle the rest. The resulting function looked like this:

public override int GetHashCode() {
    // Shift the longitude's significant digits above the latitude's, then
    // let the framework hash the combined double.
    return ( Math.Truncate( ( Longitude + 180.0 ) * 1E11 ) + ( Latitude + 90.0 ) ).GetHashCode();
}
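
As a quick sanity check, the reciprocal pair from earlier no longer collides; this assumes the constructor sketched above.

var a = new GeoCoordinate( 9.0, 11.0 );
var b = new GeoCoordinate( 11.0, 9.0 );

// The combined doubles are 19,100,000,000,099 and 18,900,000,000,101,
// so the pair that used to collide now hashes differently.
Console.WriteLine( a.GetHashCode() == b.GetHashCode() );   // False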

Now, I know a lot of you may be thinking that I wasted a lot of thought on such a silly little problem. However, I think people discount the importance of actually overriding the 3 overridable functions from Object. They are there because all the cool tricks the framework provides are built on top of those 3 functions. A little effort up front can yield huge performance gains later.
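
For example, it's those overrides that let the type behave correctly as a key in hash-based collections; a sketch, assuming the class from earlier:

// Requires System.Collections.Generic.
var visited = new HashSet<GeoCoordinate>();
visited.Add( new GeoCoordinate( 34.73037, -86.58610 ) );

// Lookup by value only works because Equals and GetHashCode agree
// on what "the same coordinate" means.
bool seen = visited.Contains( new GeoCoordinate( 34.73037, -86.58610 ) );   // true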

Happy hashing...

Posted on Friday, January 16, 2015 1:20 PM | .NET, C#




Copyright © Chris Gardner