A datacenter's power consumption is a major contributor to its operational expenditure (op-ex), largely because major electric utilities bill based on the peak demand drawn over a billing period. There is growing interest in reducing a datacenter's electricity costs through demand-throttling techniques and/or energy storage devices (batteries, which are readily available at most datacenters as a backup energy source). For the latter, we present a Markov Decision Process framework that accounts for power-demand uncertainty and a linearized battery degradation model. The framework also explicitly models the risk of over- or under-charging the battery, yielding higher cost savings (up to 2×) at tractable risk. We provide a complete characterization of the risk-cost trade-off and of the cost per unit of risk as a function of the datacenter's workload characteristics. We also study the linearized battery degradation model empirically: the model is accurate for bursty workloads, but shows some discrepancy from observed behavior for workloads with lower variability.
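To make the peak-demand billing setting concrete, the following is a minimal sketch, not the paper's MDP policy: it simulates a simple threshold-based peak-shaving heuristic in which the battery discharges when demand exceeds a threshold and recharges when demand is below it. All function names, demand values, and prices are illustrative assumptions.

```python
# Hypothetical illustration of peak-demand billing and battery peak-shaving.
# All names and numbers are assumptions for illustration, not the paper's model.

def peak_demand_cost(demand_kw, rate_per_kw):
    """Bill is driven by the single highest demand sample in the period."""
    return max(demand_kw) * rate_per_kw

def shave_peaks(demand_kw, threshold_kw, capacity_kwh, max_rate_kw):
    """Greedy heuristic: discharge above the threshold, recharge below it."""
    charge = capacity_kwh  # assume the battery starts full
    grid_draw = []
    for d in demand_kw:
        if d > threshold_kw:
            # Cover the excess from the battery, limited by rate and charge.
            discharge = min(d - threshold_kw, max_rate_kw, charge)
            charge -= discharge
            grid_draw.append(d - discharge)
        else:
            # Recharge opportunistically without exceeding the threshold.
            recharge = min(threshold_kw - d, max_rate_kw, capacity_kwh - charge)
            charge += recharge
            grid_draw.append(d + recharge)
    return grid_draw

demand = [300, 320, 480, 520, 310, 300, 450, 330]  # kW samples over the period
rate = 12.0  # $ per kW of peak demand (illustrative)
baseline = peak_demand_cost(demand, rate)
shaved = peak_demand_cost(shave_peaks(demand, 400, 200, 150), rate)
print(baseline, shaved)  # prints 6240.0 4800.0
```

A fixed threshold ignores demand uncertainty and battery degradation, which is precisely the gap the MDP formulation addresses: if the threshold is set too low the battery is over-drained before the true peak, and if too high the savings are left on the table.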