Abstract
Everyone loves a large caching tier in their multitier cloud-based web service because it both alleviates database load and provides lower request latencies. Even when load drops severely, administrators are reluctant to scale down their caching tier. This paper makes the case that (i) scaling down the caching tier is viable with respect to performance, and (ii) the savings are potentially huge; e.g., a 4x drop in load can result in 90% savings in the caching tier size.
| Original language | English (US) |
|---|---|
| State | Published - 2012 |
| Event | 4th USENIX Workshop on Hot Topics in Cloud Computing, HotCloud 2012 - Boston, United States (Jun 12, 2012 → Jun 13, 2012) |
Conference

| Conference | 4th USENIX Workshop on Hot Topics in Cloud Computing, HotCloud 2012 |
|---|---|
| Country/Territory | United States |
| City | Boston |
| Period | 6/12/12 → 6/13/12 |
All Science Journal Classification (ASJC) codes
- Computer Science Applications
- Computer Networks and Communications
- Computer Science (miscellaneous)