If you have an application where the performance and growth patterns are well known, cloud usually isn't the lowest-cost choice. It may still be the best value for other reasons.
I've done ROI studies for several applications like that, and usually cloud has a higher total cost unless there are specific availability requirements or you don't have a facility that can meet a 99.9% SLA.
The key assumption is that you know a lot about the app and it's in a steady operating mode. If you're in hyper-growth mode, have fluctuating or seasonal demand, or have no capital funds, cloud is a no-brainer.
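The kind of ROI math I mean is back-of-envelope stuff like the sketch below: capex plus colo rent over a 3-year life vs. renting equivalent instances 24/7. Every number here is a hypothetical placeholder (plug in your own quotes); the point is just the shape of the comparison for a steady-state app.

```python
# Hypothetical 3-year TCO comparison, steady-state workload.
# All prices are made-up placeholders, not real vendor quotes.

def colo_tco(servers, server_capex=8_000, colo_per_server_month=250, years=3):
    """Buy hardware up front, pay colo rent monthly."""
    capex = servers * server_capex
    opex = servers * colo_per_server_month * 12 * years
    return capex + opex

def cloud_tco(servers, instance_per_month=600, years=3):
    """Rent equivalent instances around the clock."""
    return servers * instance_per_month * 12 * years

if __name__ == "__main__":
    n = 20
    print(f"colo : ${colo_tco(n):,}")   # $340,000 with these placeholder prices
    print(f"cloud: ${cloud_tco(n):,}")  # $432,000 with these placeholder prices
```

With these (made-up) numbers colo wins by a wide margin over 3 years, which matches what the studies I've done tend to show when utilization is high and predictable. The gap closes fast once you add staffing, refresh cycles, or real availability requirements.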
For sure, I think you can often save a decent amount of $$ with dedicated hardware. It does have downsides of course...
If your demand is 50% stable / 50% fluctuating (say some base load + a big spike at US prime time), I still think you can win with a hybrid cloud... i.e. serve base load from a colo, and serve spike load from the cloud. That does mean you need to configure at least 2 networks, but it's not a terrible idea from a DR standpoint anyway (Main DC fail? Push a button and run off the cloud until it is fixed.)
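The hybrid math works out roughly like this sketch: pay colo rates for the always-on base, and pay cloud hourly rates only for the hours the spike actually runs, vs. provisioning for peak 24/7 in the cloud. Again, all the prices and the 6-hour spike window are hypothetical assumptions, not quotes.

```python
# Hybrid (colo base + cloud burst) vs. all-cloud provisioned for peak.
# Prices and spike duration are hypothetical placeholders.

HOURS_PER_MONTH = 730

def all_cloud_monthly(peak_servers, per_server_hour=0.80):
    """Run enough instances for peak, 24/7."""
    return peak_servers * per_server_hour * HOURS_PER_MONTH

def hybrid_monthly(base_servers, spike_servers, spike_hours_per_day,
                   colo_per_server_month=350, cloud_per_server_hour=0.80):
    """Base load on owned colo hardware; spike servers rented only when needed."""
    colo = base_servers * colo_per_server_month
    cloud = spike_servers * cloud_per_server_hour * spike_hours_per_day * 30
    return colo + cloud

if __name__ == "__main__":
    # 20-server peak = 10 base + 10 spike, spike lasts ~6 hours/day
    print(f"all-cloud: ${all_cloud_monthly(20):,.0f}/mo")
    print(f"hybrid   : ${hybrid_monthly(10, 10, 6):,.0f}/mo")
```

The bigger the peak-to-base ratio and the shorter the spike window, the better the hybrid looks; conversely, if the "spike" runs most of the day, you're just paying for two environments.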