Fair Benchmarking for Cloud Computing Systems
Lead Research Organisation:
University of Surrey
Department Name: Computing Science
Abstract
Compute resource benchmarks are an established part of the high performance computing (HPC) research landscape, and are also in evidence in non-HPC settings. As Cloud Computing technology becomes more widely adopted, there will be an increasing need for well-understood benchmarks that offer fair evaluation of such generic systems in comparison with other kinds of computing systems that have been optimised for specific purposes. Cloud Computing benchmarks need to account for all parts of the lifecycle of cloud system use, and most existing benchmarks do not allow for this. Cloud-specific benchmarks will increase in importance because clouds have a wider range of possible applications than HPC offers, and because the variety of options and configurations of cloud systems, and the effort needed to reach the point at which traditional benchmarks can be run, have various effects on the fairness of comparison. In this pilot, we set out to create an academically focused cloud benchmark site that accounts fairly for such variations. The principal outcome will be a web portal that embodies such considerations, which can be used to access data about benchmark runs, and potentially to adapt benchmarks to run on other Cloud systems. The portal will offer a service that returns to a knowledgeable user the closest matches, based on the closest portfolio of benchmark elements, to a set of requirements specified about their own application as a Service Level Agreement (SLA). The portal will also offer access to bundled benchmark tests (virtual machines containing such applications) constructed during the project, which will alleviate the need for other researchers to repeat such work and to incur the associated costs, in Cloud usage and in effort, of doing so.
Planned Impact
The increasing use of Cloud Computing systems provides increased flexibility of computing use at cost-efficient levels. This offers potential for return on research investment in academic expenditure, hence demonstrating benefit to research funders. There are large numbers of Cloud Computing researchers and users outside academia who also stand to benefit from such efforts. Our routes to assuring the impact of our research are:

1. An accessible website on the benchmarking of Cloud systems. Researchers and industrial users will gain by being able to access impartial and transparent advice on Cloud Computing performance and features. This stands in stark contrast to the effort currently required of each interested individual and organisation, who must learn about such benchmarks and attempt to determine whether specific performance biases are introduced in vendor-run benchmarks. The website will reduce the associated costs to industry of developing in-house benchmarking skills, or of repeating such benchmarks. Impact horizon: within 6 months of project completion.

2. Runnable virtual machines (VMs) containing the benchmarks. The website will also offer users the opportunity to download fully runnable images of computer systems (VMs) that can be used on internal systems. It should also be possible to offer instructions for creating derivative software applications, allowing industry to avoid wholesale dependence on a single system provider. Impact horizon: within 6 months of project completion.

3. Publications. We will describe Cloud uses and the relationship of benchmarking to such uses on the website, and through publications submitted to learned journals and prestigious conferences. This will help Cloud users across all sectors to gain knowledge about assessing various benchmarks on different Cloud systems.
Published lists of the performance and features of Cloud systems should encourage providers to innovate in order to attract and retain customers, with consequent value to the research community. Furthermore, capturing the needs of users will help providers to target new products and services at a range of academic and industrial users. Impact horizon: 6-12 months following project completion.

4. Events. We will attend key conferences and workshops to build awareness of the project's activities. Notable conferences are the IEEE Cluster, Cloud and Grid Computing (CCGrid) symposium in Newport Beach, CA, USA, 23-26 May 2011, and the 1st International Conference on Cloud Computing and Services Science (CLOSER 2011), 7-9 May 2011 in the Netherlands. Impact horizon: during the project.
Organisations
People
- Lee Gillam (Principal Investigator)
- Mark Baker (Co-Investigator)
Publications
- Gillam L (2014) Benchmarking cloud performance for service level agreement parameters, in International Journal of Cloud Computing
- Gillam L (2013) Fair Benchmarking for Cloud Computing systems, in Journal of Cloud Computing: Advances, Systems and Applications
- Lee Gillam (Author) (2012) Fair Benchmarking for Cloud Computing Systems
- Lee Gillam (Author) (2012) Adding Cloud Performance to Service Level Agreements
- Lee Gillam (Creator) (2012) Visualization of benchmark data - web demonstrator
Description | In this pilot project, we set out to offer a web portal embodying searchable results of benchmark runs. We developed a dynamic visualization of benchmark data such that results across both providers and benchmarks can be displayed, with all values scaled to the best performance in that benchmark.
Exploitation Route | This should lead towards negotiability and automation of QoS for Service Level Agreements (SLAs), and subsequently feed into Cloud Brokerage.
Sectors | Aerospace, Defence and Marine; Agriculture, Food and Drink; Chemicals; Communities and Social Services/Policy; Construction; Creative Economy; Digital/Communication/Information Technologies (including Software); Education; Electronics; Energy; Environment; Financial Services, and Management Consultancy; Healthcare; Leisure Activities, including Sports, Recreation and Tourism; Government, Democracy and Justice; Manufacturing, including Industrial Biotechnology; Culture, Heritage, Museums and Collections; Pharmace
URL | http://www.cs.surrey.ac.uk/BIMA/People/L.Gillam/downloads/publications/Fair%20Benchmarking%20for%20Cloud%20Computing%20Systems.pdf |
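The scaling described above (normalising every result to the best performance in that benchmark, so results can be compared across providers and benchmarks) can be sketched as follows. This is a minimal illustration only: the provider names, scores, and the `scale_to_best` function are hypothetical, not taken from the project's portal or data.

```python
# Illustrative sketch of per-benchmark normalisation: each raw score is
# scaled so that the best-performing provider in the benchmark scores 1.0.
# All names and numbers here are made up for demonstration purposes.

def scale_to_best(results, higher_is_better=True):
    """results: dict mapping provider name -> raw score for one benchmark.

    Returns a dict with each score scaled to the best result (best = 1.0).
    """
    best = max(results.values()) if higher_is_better else min(results.values())
    if higher_is_better:
        return {provider: score / best for provider, score in results.items()}
    # For lower-is-better metrics (e.g. latency), best/score keeps 1.0 as best.
    return {provider: best / score for provider, score in results.items()}

# Hypothetical throughput results for one benchmark:
throughput = {"ProviderA": 120.0, "ProviderB": 150.0, "ProviderC": 90.0}
scaled = scale_to_best(throughput)
# ProviderB scores 1.0; the others appear as fractions of the best.
```

Scaling to the per-benchmark best, rather than to an absolute unit, is what allows heterogeneous benchmarks (throughput, latency, cost) to be displayed side by side on a common 0-1 axis.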
Description | IP Protecting Cloud Services in Supply Chains (IPCRESS) |
Amount | £107,507 (GBP) |
Funding ID | 101416 |
Organisation | Innovate UK |
Sector | Public |
Country | United Kingdom |
Start | 04/2013 |
End | 09/2014 |
Description | Surrey's EPSRC IAA funding |
Amount | £19,787 (GBP) |
Funding ID | EP/I034408/1 |
Organisation | Engineering and Physical Sciences Research Council (EPSRC) |
Sector | Public |
Country | United Kingdom |
Start | 09/2014 |
End | 09/2015 |