Optimal engineering of computer and communication systems crucially depends on their analytical understanding under random input. A key characteristic of modern systems, however, is that their input exhibits various forms of correlation, which most performance-analysis theories cannot capture well; moreover, approximating real input with analytically amenable models can be highly misleading about the systems' actual performance.
To better understand systems with correlated input, this talk introduces an elementary technique based on the analysis of stopping times. In a nutshell, stopping times are random times that depend only on the past, and they naturally represent notable events in systems (e.g., the time of a QoS violation, or the time when a requested object is not found in a cache). The applicability of the proposed method will be illustrated with several classical examples, e.g., bandwidth dimensioning, admission control in wireless networks, and the design of cache algorithms.
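To make the notion concrete, the following is a minimal, hypothetical sketch (not the speaker's method): a queue is fed by a Markov-modulated on/off source, a simple model of correlated input, and the stopping time of interest is the first step at which the backlog exceeds a threshold, a stylized QoS violation. All names and parameters (`first_violation_time`, the 0.95 persistence probability, the rates) are illustrative assumptions.

```python
import random

def first_violation_time(threshold=20.0, rate=0.9, horizon=10_000, seed=1):
    """Stopping time: the first step at which the queue backlog exceeds
    `threshold` (a stylized QoS violation). Arrivals come from an on/off
    Markov-modulated source, so the input is correlated over time; the
    server drains at constant `rate`. Returns the step index, or None if
    no violation occurs within `horizon` steps."""
    rng = random.Random(seed)
    backlog = 0.0
    state = "on"                         # state of the on/off arrival source
    for t in range(1, horizon + 1):
        # Correlated input: the source keeps its state with probability 0.95.
        if rng.random() < 0.05:
            state = "off" if state == "on" else "on"
        arrivals = 2.0 if state == "on" else 0.0
        backlog = max(backlog + arrivals - rate, 0.0)
        if backlog > threshold:          # event defining the stopping time:
            return t                     # it depends only on the past trajectory
    return None
```

Note that the decision to stop at step `t` uses only the trajectory up to `t`, which is exactly what qualifies it as a stopping time.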