TY - GEN

T1 - A new mathematical model for optimizing the performance of parallel and discrete event simulation systems

AU - Rizvi, Syed S.

AU - Elleithy, Khaled M.

AU - Riasat, Aasia

PY - 2008

Y1 - 2008

N2 - The null message algorithm (NMA) is an important conservative time management protocol in parallel discrete event simulation systems; it provides synchronization among distributed computers and is capable of both avoiding and resolving deadlock. However, the excessive generation of null messages prevents the widespread use of this algorithm. This excessive generation results from the improper use of critical parameters such as the frequency of transmission and Lookahead values. If the generation of null messages could be minimized, most parallel discrete event simulation systems would likely take advantage of this algorithm to gain increased system throughput and minimal transmission delays. In this paper, a new mathematical model for optimizing the performance of parallel and distributed simulation systems is proposed. The proposed model employs optimization techniques such as the variance of null message elimination to improve the performance of parallel and distributed simulation systems. For the simulation results, we consider both uniform and non-uniform distributions of Lookahead values across the multiple output lines of a logical process (LP). Our experimental verification demonstrates that an optimal NMA offers better scalability in parallel discrete event simulation systems when it is used with a proper selection of the critical parameters.

AB - The null message algorithm (NMA) is an important conservative time management protocol in parallel discrete event simulation systems; it provides synchronization among distributed computers and is capable of both avoiding and resolving deadlock. However, the excessive generation of null messages prevents the widespread use of this algorithm. This excessive generation results from the improper use of critical parameters such as the frequency of transmission and Lookahead values. If the generation of null messages could be minimized, most parallel discrete event simulation systems would likely take advantage of this algorithm to gain increased system throughput and minimal transmission delays. In this paper, a new mathematical model for optimizing the performance of parallel and distributed simulation systems is proposed. The proposed model employs optimization techniques such as the variance of null message elimination to improve the performance of parallel and distributed simulation systems. For the simulation results, we consider both uniform and non-uniform distributions of Lookahead values across the multiple output lines of a logical process (LP). Our experimental verification demonstrates that an optimal NMA offers better scalability in parallel discrete event simulation systems when it is used with a proper selection of the critical parameters.

UR - http://www.scopus.com/inward/record.url?scp=70249123479&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=70249123479&partnerID=8YFLogxK

U2 - 10.1145/1400549.1400690

DO - 10.1145/1400549.1400690

M3 - Conference contribution

AN - SCOPUS:70249123479

SN - 1565553195

SN - 9781565553194

T3 - Proceedings of the 2008 Spring Simulation Multiconference, SpringSim'08

BT - Proceedings of the 2008 Spring Simulation Multiconference, SpringSim'08

T2 - 2008 Spring Simulation Multiconference, SpringSim'08

Y2 - 14 April 2008 through 17 April 2008

ER -