The dynamic keyword, introduced in C# 4.0, can be really handy. But what's the cost?
I was a bit nervous after reading the incredible Mr. Lippert. So I tested it.
I created two console projects based on my Replace Conditional with Polymorphism example. There are 7 subtypes, the overloads call Trace.WriteLine (to avoid I/O overhead), and the caller never sends an unknown type (to avoid exception overhead and match the real world). I compiled in debug mode and verified that the calls did not get optimized out. The programs use a Stopwatch around a loop, which calls the MakeAnimalNoise method once for each subtype.
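A rough sketch of what such a benchmark looks like (the Animal, Dog, and Cat types, the Zoo class, and the string return values are illustrative assumptions, not the original project's code, and only two of the seven subtypes are shown):

```csharp
using System;
using System.Diagnostics;

// Hypothetical slice of the animal hierarchy (the original has 7 subtypes).
abstract class Animal { }
class Dog : Animal { }
class Cat : Animal { }

static class Zoo
{
    // One overload per concrete subtype; Trace.WriteLine keeps console I/O
    // out of the timed loop. Returning the noise is added for illustration.
    public static string MakeAnimalNoise(Dog d) { Trace.WriteLine("Woof"); return "Woof"; }
    public static string MakeAnimalNoise(Cat c) { Trace.WriteLine("Meow"); return "Meow"; }

    public static void Main()
    {
        Animal[] animals = { new Dog(), new Cat() };
        var sw = Stopwatch.StartNew();
        for (int trial = 0; trial < 100000; trial++)
        {
            foreach (Animal a in animals)
            {
                // Casting to dynamic defers overload resolution to run time;
                // the non-dynamic baseline presumably uses virtual dispatch instead.
                MakeAnimalNoise((dynamic)a);
            }
        }
        sw.Stop();
        Console.WriteLine($"Time for 100000 trials with dynamic: {sw.Elapsed}");
    }
}
```

The key line is the cast to dynamic: the compiler emits a DLR call site there, and the correct overload is chosen at run time based on the actual subtype of each element.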
The results surprised me.
Time (hh:mm:ss.fractions) for 100000 trials without dynamic: 00:00:10.3912497
Time (hh:mm:ss.fractions) for 100000 trials with dynamic: 00:00:11.0632324
Timing the runs with a physical stopwatch, I didn’t see any significant difference in startup times, either. (My measurements here were far from exact.)
Binary file sizes are comparable. I did not have the tools to measure memory footprints.
Based on these tests, I'm comfortable using dynamic code from a performance standpoint. Thank you, Microsoft!
Update - March 6, 2011
I suspect this example takes full advantage of internal DLR caching: the DLR caches the result of overload resolution at each call site, so after the first call with a given runtime type, most of the binding cost is avoided on subsequent calls. A workload with more varied types or call sites may fare worse; your mileage may vary.