Now I'm ready to get out my checkbook.
What factors influence the cost of such a computation in today's cloud computing market?
How does the price scale with N and with the rate at which I want to complete the problem? (For example, if the job takes 1,000 AWS instances 10 months, or 10,000 AWS instances 1 month, will I pay roughly the same either way? I'm assuming the total running time is less than a year, so not long enough that prices would drop significantly during the computation.)
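Here's a minimal sketch of the pricing model I have in mind when I ask this, namely that cost is linear in total instance-hours; the hourly rate is just a placeholder, not an actual AWS quote:

```python
# Naive assumption behind the question: with flat on-demand pricing, cost
# depends only on total instance-hours, so the two schedules below should
# cost the same.  The hourly rate is a made-up placeholder, not a real price.

HOURS_PER_MONTH = 730          # average hours in a month
HOURLY_RATE_USD = 0.10         # hypothetical on-demand price per instance-hour

def flat_rate_cost(instances, months, rate=HOURLY_RATE_USD):
    """Total cost assuming the price is linear in instance-hours."""
    instance_hours = instances * months * HOURS_PER_MONTH
    return instance_hours * rate

slow = flat_rate_cost(instances=1_000, months=10)
fast = flat_rate_cost(instances=10_000, months=1)
print(f"1,000 instances for 10 months: ${slow:,.0f}")
print(f"10,000 instances for 1 month:  ${fast:,.0f}")  # same total under this model
```

What I'd like to know is how far real-world pricing deviates from this linear model at either end of the scale.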
Are there aspects of flexibility that would decrease the price? (For example: the cloud provider can put my computation on hold or throttle the CPU whenever it wants in the short term, as long as my computation runs at, say, more than 80% of full speed on average over any given day.)
Lenstra used the term "dollardays" to describe the cost of breaking cryptographic keys as something that scales with the capital cost of the computing equipment (40 million dollardays = 40 days on $1 million worth of computers, or 1 day on $40 million worth of computers), but with cloud computing it seems more like it would be a piecework pricing structure.
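To make that contrast concrete, here's the dollarday arithmetic from Lenstra's example next to the kind of per-instance-hour (piecework) billing I'd expect from a cloud provider; the cloud rate is again a hypothetical placeholder:

```python
# Lenstra's "dollardays" metric fixes the product
# (capital cost of the hardware) x (days of use), whereas cloud billing is
# piecework: a flat price per instance-hour, regardless of who owns the hardware.

def days_on_hardware(dollardays, capital_cost_usd):
    """Days of computation on hardware worth `capital_cost_usd`."""
    return dollardays / capital_cost_usd

JOB = 40_000_000  # 40 million dollardays, as in Lenstra's example
print(days_on_hardware(JOB, 1_000_000))    # 40.0 days on $1M worth of computers
print(days_on_hardware(JOB, 40_000_000))   # 1.0 day on $40M worth of computers

def cloud_cost(instance_hours, rate_usd_per_hour):
    """Piecework-style cloud bill: pay only for the hours actually used."""
    return instance_hours * rate_usd_per_hour

# e.g. 1,000 instances running for 40 days at a hypothetical $0.10/hour
print(cloud_cost(instance_hours=1_000 * 24 * 40, rate_usd_per_hour=0.10))
```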