I know it's an old question, but just to add a bit of empirical data...
Running 50,000,000 look-ups on a dictionary with 10,000 entries and comparing relative times to complete:
..if every look-up is successful:
- a straight (unchecked) run takes 1.2 seconds
- a guarded (ContainsKey) run takes 2 seconds
- a handled (try-catch) run takes 1.21 seconds
..if 1 out of every 10,000 look-ups fail:
- a guarded (ContainsKey) run takes 2 seconds
- a handled (try-catch) run takes 1.37 seconds
..if 16 out of every 10,000 look-ups fail:
- a guarded (ContainsKey) run takes 2 seconds
- a handled (try-catch) run takes 3.27 seconds
..if 250 out of every 10,000 look-ups fail:
- a guarded (ContainsKey) run takes 2 seconds
- a handled (try-catch) run takes 32 seconds
..so a guarded test adds a small constant overhead and nothing more, while a try-catch test runs almost as fast as no test at all when it never fails, but kills performance in proportion to the number of failures.
Code I used to run tests:
using System;
using System.Collections.Generic;

namespace ConsoleApplication1
{
    class Program
    {
        static void Main(string[] args)
        {
            Test(0);
            Test(1);
            Test(16);
            Test(250);
        }

        private static void Test(int failsPerSet)
        {
            // Build a 10,000-entry dictionary, leaving out the first
            // failsPerSet keys so that many lookups per cycle will miss.
            Dictionary<int, bool> items = new Dictionary<int, bool>();
            for (int i = 0; i < 10000; i++)
                if (i >= failsPerSet)
                    items[i] = true;

            // The raw (unchecked) run only makes sense when no lookup can fail.
            if (failsPerSet == 0)
                RawLookup(items, failsPerSet);
            GuardedLookup(items, failsPerSet);
            CaughtLookup(items, failsPerSet);
        }

        private static void RawLookup(Dictionary<int, bool> items, int failsPerSet)
        {
            int found = 0;
            Console.Write("Raw (");
            Console.Write(failsPerSet);
            Console.Write("): ");
            DateTime start = DateTime.Now;
            for (int i = 0; i < 50000000; i++)
            {
                int pick = i % 10000;
                if (items[pick])
                    found++;
            }
            Console.WriteLine(DateTime.Now - start);
        }

        private static void GuardedLookup(Dictionary<int, bool> items, int failsPerSet)
        {
            int found = 0;
            Console.Write("Guarded (");
            Console.Write(failsPerSet);
            Console.Write("): ");
            DateTime start = DateTime.Now;
            for (int i = 0; i < 50000000; i++)
            {
                int pick = i % 10000;
                // Check first, then index: two hash lookups per successful hit.
                if (items.ContainsKey(pick))
                    if (items[pick])
                        found++;
            }
            Console.WriteLine(DateTime.Now - start);
        }

        private static void CaughtLookup(Dictionary<int, bool> items, int failsPerSet)
        {
            int found = 0;
            Console.Write("Caught (");
            Console.Write(failsPerSet);
            Console.Write("): ");
            DateTime start = DateTime.Now;
            for (int i = 0; i < 50000000; i++)
            {
                int pick = i % 10000;
                // Index directly and swallow the KeyNotFoundException on a miss.
                try
                {
                    if (items[pick])
                        found++;
                }
                catch { }
            }
            Console.WriteLine(DateTime.Now - start);
        }
    }
}
Rather than wrapping the check in a helper like `exists(myPosition, myDictionary)`, callers could simply use `ContainsKey` explicitly: `myDictionary.ContainsKey(myPosition)`. That way anyone reading the code doesn't have to go look up a mysterious `exists`, which doesn't add anything useful (it is not any simpler to call). Then look at the places where `exists` or `ContainsKey` is used (the callers of `exists`). If any of those callers are performance-critical, ask whether they are making multiple method calls on `ToCheck` that could be replaced with fewer calls. The classic example is replacing `ToCheck.ContainsKey(key)` followed by `... = ToCheck[key]` with a single `TryGetValue` call. That is where there is some hope of a performance gain.
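A minimal sketch of that refactor (the dictionary and key names here are placeholders, not from the benchmark above): `TryGetValue` does the existence check and the retrieval in one hash lookup, and it returns false instead of throwing on a miss, so it avoids both the guarded run's double lookup and the caught run's exception cost.

using System;
using System.Collections.Generic;

class TryGetValueDemo
{
    static void Main()
    {
        var items = new Dictionary<int, bool> { [1] = true };

        // Guarded pattern: hashes the key twice (ContainsKey, then the indexer).
        bool found = false;
        if (items.ContainsKey(1))
            found = items[1];

        // TryGetValue: one hash lookup, and no exception when the key is absent.
        if (items.TryGetValue(2, out bool value))
            found = value;                        // not reached: key 2 is absent
        else
            Console.WriteLine("miss, no exception thrown");
    }
}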