I am looking at the pricing of various cloud computing platforms, particularly Amazon's EC2, and much of the pricing is quoted in a unit called the instance-hour.
I am trying to get a handle on the exact definition of an instance-hour so I can better compare the cost of continuing to host a web application versus moving it out to the cloud.
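For concreteness, here is the rough cost model I have been using so far. It is only a sketch under my own assumptions: I am guessing that billing rounds each partial hour up to a full instance-hour, and the hourly rate is a placeholder, not a real EC2 price.

```python
import math

# My working assumptions (please correct me if these are wrong):
#   - billing rounds each partial hour up to a full instance-hour
#   - HOURLY_RATE is a placeholder, not an actual EC2 price
HOURLY_RATE = 0.10  # USD per instance-hour (hypothetical)

def monthly_cost(instances: int, hours_per_day: float, days: int = 30) -> float:
    """Cost = instances x billed hours x hourly rate."""
    billed_hours_per_day = math.ceil(hours_per_day)  # partial hours round up
    return instances * billed_hours_per_day * days * HOURLY_RATE

# One always-on web server over a 30-day month:
print(monthly_cost(instances=1, hours_per_day=24))  # 1 * 24 * 30 * 0.10 = 72.0
```

Two things about this model are unclear to me: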
(1) Does an instance-hour correspond to any of the Windows performance counters, in such a way that I could benchmark our current implementation and feed the results into the providers' pricing calculators?
(2) How does a multi-processor instance figure into the instance-hour calculation? For example, does one hour of wall-clock time on an eight-core instance count as one instance-hour or as eight?
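To make question (2) concrete, here are the two readings I can imagine; the per-core interpretation and the rate are my own assumptions, not anything I have found documented:

```python
cores, wall_clock_hours, rate = 8, 1.0, 0.10  # hypothetical values

# Reading A: an instance-hour is billed per instance, regardless of core count.
per_instance = wall_clock_hours * rate        # 1 instance-hour -> $0.10

# Reading B: an instance-hour is billed per virtual core.
per_core = cores * wall_clock_hours * rate    # 8 instance-hours -> $0.80

print(per_instance, per_core)
```

Which reading is correct changes the estimate by a factor of the core count, so it matters a great deal when sizing the comparison against our current hosting costs.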