Modelling and Analysis of
Networked and Distributed Systems
A SICSA Workshop
17th June 2010, University of Stirling


Context

This workshop is endorsed by SICSA, the Scottish Informatics and Computer Science Alliance. In particular, it is financially supported by the SICSA themes on 'Modelling and Abstraction' and 'Next Generation Internet'.

This workshop is also synchronised with the workshop on Mathematics of Networks that will take place on 18th June 2010 in St. Andrews. The bus trip from Stirling to St. Andrews takes about two hours. There may also be the possibility of informal lift-sharing or a shared taxi/minibus from Stirling to St. Andrews (about 1.5 hours).

Objectives

Audience

It is expected that this workshop will mainly be attended by contributors to the SICSA themes of 'Modelling and Abstraction' and 'Next Generation Internet'. However, the workshop is definitely not closed: participation from other SICSA themes and from outside SICSA is welcome. Participants should be permanent staff, RAs or PhD students with an interest in one or both of the following areas:

Call for Talks

As this is a workshop, no formal proceedings will be published. The workshop will consist of invited and other talks from speakers in the field. Proposals are solicited for talks of around 20 minutes. With speakers' agreement, PDFs of talks will be made available digitally to delegates. Submissions will not be refereed, but they will be vetted by the organisers to make sure that they fit the scope of the workshop.

Speakers are expected to help bridge the gap between modelling/analysis and networked/distributed systems rather than just describe previous work. The ideal talk would describe a formal technique with thoughts about its application to networking, or a networking problem with thoughts about what would benefit from a formal technique.

Key dates are as follows:

Organisation

The workshop is organised by Computing Science and Mathematics, University of Stirling:

Programme Co-Chair
Logistics Co-Chair

Registration

Thanks to sponsorship by SICSA, there is no fee for this event. Registrations should be made by emailing name, institution and contact details to .

To help in booking catering, registrations are requested by 3rd June 2010, though late registration may be possible. As a courtesy, registered delegates should notify Marwan Fayed as soon as possible if they are unable to attend.

Venue

The workshop will take place in Lecture Theatre B4, Cottrell Building, at the University of Stirling. The online directions provide information about getting to the campus. B4 can be found on the interior map. Tea/coffee will be served nearby.

Programme

09.45 Tea/Coffee

Served in Room 2B74.

10.05 Welcome and Introduction to the Workshop

Prof. Ken Turner and Dr. Marwan Fayed, Computing Science and Mathematics, University of Stirling

10.15 Invited Talk: Formal Testing of Distributed Systems

Prof. Rob Hierons, Brunel University

Rob Hierons is Professor of Computing and a founder of the ATeSST research group at Brunel. His research interests include a variety of formally-based approaches to testing distributed and other systems based on specifications, models and mutations.

Formal approaches to testing have received significant attention over many years and are at the heart of several test automation techniques. In addition to automation, the formalisation of testing allows one to reason about what the outcome of testing says about the system under test. The talk will start by reviewing the main approaches to formalising testing in the context of distributed systems, and some of the resultant conformance relations and test generation algorithms. It will then consider distributed testing, where we test a system that has distributed interfaces by placing a separate tester at each interface. If the local testers cannot interact with one another and there is no global clock, then each local tester observes only the sequence of inputs and outputs at its own interface (a local trace). This makes it hard, and sometimes impossible, to reconstruct the global trace that occurred. Importantly, if users are also affected by the restriction (only local traces are observed) then we should recognise this in the conformance relation used. This talk will explore conformance relations for distributed testing and some interesting properties of these relations.
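A toy illustration of the observability problem (hypothetical traces, not taken from the talk): with two ports and no global clock, distinct global traces project onto identical local traces, so the local testers cannot tell them apart.

```python
# Hypothetical sketch: enumerate the global traces consistent with a pair
# of local traces observed at two ports. Event names are made up.

def interleavings(a, b):
    """All global traces whose projections onto the two ports are a and b."""
    if not a:
        return [b]
    if not b:
        return [a]
    return ([[a[0]] + rest for rest in interleavings(a[1:], b)] +
            [[b[0]] + rest for rest in interleavings(a, b[1:])])

local1 = ["port1!x"]   # tester at port 1 saw output x
local2 = ["port2!y"]   # tester at port 2 saw output y
globals_ = interleavings(local1, local2)
print(globals_)        # two global orderings, indistinguishable locally
```

Even this one-event-per-port case admits two global orders; longer local traces yield combinatorially many candidates, which is what makes reconstructing the global trace hard or impossible.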

11.00 Invited Talk: Exploring the Stratified Shortest Paths Problem

Dr. Tim Griffin, University of Cambridge

Tim Griffin is a Senior Lecturer and Fellow at King's College, University of Cambridge. His research interests include network protocol design and analysis, with a focus on Internet routing protocols.

The Border Gateway Protocol (BGP) is the keystone among Internet routing protocols as it maintains connectivity in the entire global Internet. BGP has evolved into a rather mysterious beast. Theoretical work over the last ten years has made it clear that BGP represents something novel in the context of routing protocols: BGP does not compute globally optimal paths, but only locally optimal paths. The talk will explain how this exotic type of routing can be understood in an algebraic setting. The Stratified Shortest-Paths Problem (SSPP) is presented as a very simple example of this type of algebra. The SSPP is meant to capture the essence of a BGP-like path problem without BGP's many operational complexities.
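As a loose illustration of the algebraic view (an assumption for this page, not Griffin's exact SSPP definition), the sketch below runs a Bellman-Ford-style fixed point over a simple routing algebra whose path weights are (stratum, hops) pairs compared lexicographically; all names are hypothetical.

```python
# Illustrative sketch of routing as an algebraic fixed point: each node
# repeatedly adopts the best offer from its neighbours, as in vectoring
# protocols. The algebra here (max stratum, then hop count) is made up
# for illustration.

INF = (float("inf"), float("inf"))

def extend(stratum, weight):
    """Extend a path weight across an edge labelled with a stratum."""
    s, hops = weight
    return (max(stratum, s), hops + 1)

def fixed_point(nodes, edges, dest):
    """Iterate until no node can improve on its current best route."""
    best = {v: INF for v in nodes}
    best[dest] = (0, 0)
    changed = True
    while changed:
        changed = False
        for (u, v, label) in edges:   # edge u -> v with stratum label
            offer = extend(label, best[v])
            if offer < best[u]:       # lexicographic comparison
                best[u] = offer
                changed = True
    return best

# Small example: nodes route towards 'd'; lower strata are preferred,
# so 'a' prefers the two-hop stratum-1 path via 'b' over its direct edge.
nodes = ["a", "b", "d"]
edges = [("a", "d", 2), ("a", "b", 1), ("b", "d", 1)]
print(fixed_point(nodes, edges, "d"))
```

Whether such a fixed point is globally optimal or only locally optimal depends on the properties of the algebra; characterising BGP-like algebras where only local optimality survives is exactly the kind of question the talk addresses.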

11.45 Simulation and Modelling Approaches for Large-Scale Structured Peer-to-Peer Network Overlays

Dr. Mario Kolberg and Jamie Furness, University of Stirling

Structured Peer-to-Peer (P2P) overlay networks are becoming increasingly popular, and extensive research activity is taking place in this domain. However, new algorithms and approaches are very hard to validate. This is largely due to the highly dynamic nature of such systems, which involve very large numbers of nodes (often hundreds of thousands or even millions). On top of this, nodes join and leave the network throughout its lifetime. Thus the exact state of such a network at particular points in time is very hard to trace.

Implementations using network simulator engines, such as NS2 and OMNeT++, are commonly used, but are hampered by the complexity of the implementations. Complexity is an issue in two ways: firstly, the coding is often very detailed; secondly, the run-time complexity is such that simulation runs can take a long time even on powerful machines.

This talk will explore the complexity of such simulations using some recent examples by the authors. The talk is expected to stimulate a discussion on how modelling approaches can improve this situation.

12.05 A Temporal Model and Distance Metrics for Network Analysis

John Tang, University of Cambridge

The analysis of social and technological networks has attracted a lot of attention as social networking applications and mobile sensing devices have given us a wealth of real data. Classic studies looked at analysing static or aggregated networks, i.e. networks that do not change over time or are built as the result of aggregating information over a certain period of time. Given the soaring collections of measurements related to very large, real network traces, researchers are quickly starting to realise that connections are inherently varying over time and exhibit more dimensionality than static analysis can capture.

In this talk we will discuss new temporal distance metrics to quantify and compare the speed (delay) of information diffusion processes, taking into account the evolution of a network from a global view. Using real contact traces, we will show how these metrics are able to capture the temporal characteristics of time-varying graphs, such as delay, duration and time order of contacts (interactions). From this, we explore the concepts of reachability and small-world behaviour in time-varying graphs.
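As a concrete illustration (a common definition of temporal distance, assumed here rather than taken from the talk), the sketch below computes earliest-arrival times over a list of timestamped contacts:

```python
# Sketch of a temporal distance metric on a time-varying graph given as
# timestamped contacts (u, v, t). Names and data are illustrative only.

def earliest_arrival(contacts, src, t_start=0):
    """Earliest time each node can be reached from src, respecting the
    time order of the contacts (information only flows forward in time)."""
    arrival = {src: t_start}
    for (u, v, t) in sorted(contacts, key=lambda c: c[2]):
        # a contact at time t can relay only from a node already reached
        # at or before t; contacts are treated as symmetric here
        for a, b in ((u, v), (v, u)):
            if arrival.get(a, float("inf")) <= t < arrival.get(b, float("inf")):
                arrival[b] = t
    return arrival

contacts = [("a", "b", 1), ("b", "c", 2), ("a", "c", 5)]
print(earliest_arrival(contacts, "a"))  # b reached at time 1, c at time 2
```

Swapping the times of the first two contacts (so b meets c before a meets b) leaves c reachable from a only via the late direct contact, illustrating why the time order of contacts matters in a way a static aggregated graph cannot capture.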

12.25 Lunch

Served in Stirling Management Centre.

14.00 Agent-Based Modelling and Simulation of Wireless Sensor Networks

Muaz Niazi and Dr. Amir Hussain, University of Stirling

Agent-based models are a relatively new paradigm gaining rapid acceptance in domains ranging from economics and social systems to computer networks and even biological systems. In the domain of computer networks, the use of agent-based modelling and simulation allows the designer to focus on the key aspects of the network and not worry about other layers of abstraction.

In this talk, we will show how an agent-based model can be developed to represent an emergent phenomenon in wireless sensor networks. With large-scale sensor networks and a failure rate dependent on battery lifetime, sensors start to behave as a complex adaptive system. With the passage of time, sensors disappear from the network map, limiting the availability of network paths. We will explain how this diminishing phenomenon can be easily represented using an agent-based model.
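A toy agent-based sketch of this diminishing phenomenon (hypothetical parameters, not the speakers' model): each sensor agent drains its battery at an individually varying rate, and the live population shrinks over time.

```python
# Minimal agent-based illustration: 1000 sensor agents, each with its own
# battery drain rate; we record how many remain alive every 50 time steps.
import random

random.seed(1)  # fixed seed so the run is repeatable

class Sensor:
    def __init__(self):
        self.battery = 100.0
        self.drain = random.uniform(0.5, 2.0)  # per-step consumption

    def step(self):
        self.battery -= self.drain

    @property
    def alive(self):
        return self.battery > 0

sensors = [Sensor() for _ in range(1000)]
history = []
for t in range(0, 201, 50):
    history.append(sum(s.alive for s in sensors))
    for _ in range(50):
        for s in sensors:
            s.step()

print(history)  # population of live sensors at t = 0, 50, 100, 150, 200
```

Even this crude model shows the network-level effect emerging from purely local agent rules: nothing global is programmed, yet the live population, and hence the set of available network paths, steadily collapses.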

14.20 Knowledge Networks as Socio-Technical Systems

Dr. Paola Di Maio, University of Strathclyde

In my research I investigate the socio-technical aspects of knowledge sharing on the Web in different types of organisations. In particular, I work on the dependency networks and 'problem entanglement' that account for most of the bottlenecks, as well as on some socio-technical metrics.

In this talk I will outline the details of a framework used in combination with techniques such as 'root cause analysis' that can help devise systemic solutions to complex, systemic knowledge-sharing challenges.

14.40 Modelling Interacting Networks in the Brain

Prof. Philippe De Wilde, Heriot-Watt University

The Networking Problem

The brain consists of three types of interacting networks: neural networks, astrocyte (glial) networks, and a cerebrovascular (blood flow) network. The interaction mechanisms are gradually becoming clear, and are an active research topic in neurophysiology. The astrocytes provide crosstalk between the neurons, and can make learning more robust. The cerebrovascular network provides a metabolic function. If this network fails when somebody has a stroke, parts of the neural network fail too. There are no models of the three interacting networks. Such a model would provide huge medical benefit, and would also inspire better machine learning and pattern recognition algorithms. Astrocytes, for example, play a role in vision and smell. Finally, such a model would improve our understanding of human intelligence.

Benefits from Formal Techniques

The appropriate level of abstraction is unknown. Some researchers use graphs, some Petri nets, some partial differential equations. As the brain is a complex system of interacting elements, other researchers use statistical mechanics techniques. It is also unknown how to make the model scalable: with 10^11 neurons and 10^12 astrocytes, only a small part of the brain can be simulated. Classical neural computation has been successful since the 1980s, and a model of the three interacting networks in the brain will extend the range of applications.

In this talk I will give an overview of the problem, and will show what formal techniques have been attempted.

15.00 Tea/Coffee

Served in Room 2B74.

15.15 An Investigation into Modelling Networks with a Spatial Stochastic Process Algebra

Dr. Vashti Galpin, University of Edinburgh

Stochastic process algebras are used for the quantitative modelling of computing phenomena: by assigning durations to actions, they make it possible to understand the performance of the whole system. Recently, the stochastic process algebra PEPA has been extended with spatial notions. This presentation will illustrate how networks can be modelled using this spatial stochastic process algebra. The inherent spatial aspects of a network can be represented by a graph. Since the formalism captures spatial information as a graph with weighted edges, this is a good match to the problem area.

The presentation will describe the formalism, followed by the model and an assessment of its utility.

15.35 Computational Stochastic Modelling for Large-Scale Systems: Methods and Applications

Dr. Rashid Mehmood, University of Swansea

Modelling and simulation are important tools in the design and analysis of systems. Most systems of interest are very large and rich in detail. Both the amount of memory required and the time to compute the solution pose a major difficulty: a problem known as the Curse of Dimensionality or State Space Explosion.

In this talk, we will present our work on Computational Stochastic Modelling (probabilistic model checking) for large-scale systems, discuss the methods involved in the modelling process and give a few applications. In particular, we will focus on Continuous-Time Markov Chains (CTMCs), which have been widely used as a stochastic formalism for the design and analysis of systems in engineering and science. In this context, traditional methods for performance analysis typically require the generation, storage and processing of the underlying state space for the numerical solution. For steady-state problems, the numerical solution phase reduces to the solution of linear equation systems of the form Ax = 0. Transient solutions for the CTMC can be obtained by solving a differential equation. Both transient and steady-state solutions involve large-scale matrix computations.
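As a minimal worked instance of the steady-state problem described above (a toy two-state CTMC, far from the billion-state models the talk discusses), one can solve pi Q = 0 subject to the normalisation sum(pi) = 1:

```python
# Toy steady-state solve for a 2-state CTMC. The rates are invented for
# illustration; at this scale a direct dense solve is trivial, whereas the
# talk concerns compact data structures for billions of states.
import numpy as np

# Generator matrix: state 0 -> 1 at rate 2, state 1 -> 0 at rate 3.
Q = np.array([[-2.0,  2.0],
              [ 3.0, -3.0]])

# pi Q = 0 is Q^T pi^T = 0; replace one (redundant) balance equation
# with the normalisation constraint pi_0 + pi_1 = 1.
A = Q.T.copy()
A[-1, :] = 1.0
b = np.zeros(2)
b[-1] = 1.0
pi = np.linalg.solve(A, b)
print(pi)  # approximately [0.6 0.4]
```

The balance equations are singular on their own (the rows of Q^T sum to zero), which is why one equation must be swapped for the normalisation constraint; the same structure, at vastly larger scale and in sparse form, underlies the parallel solvers described next.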

Subsequently, we will present parallel numerical computing techniques and compact data structures which we have developed to handle large systems. Using the discussed techniques, we will demonstrate analysis of stochastic models with over 1.2 billion states and 16 billion transitions on single and multiple workstations. (Equivalently, this involves the solution of a sparse linear system with 1.2 billion equations and unknowns.) To the best of our knowledge, these are the largest models solved by the academic community working in this area.

Finally, we will discuss a few applications in telecommunication systems (Wireless and Optical Networks, Computational Grids), and overview the ongoing work.

15.55 Using PEPA to study Network Selection Strategies in 3G-WLAN Internetworking Networks

Prof. Jane Hillston, University of Edinburgh

The stochastic process algebra PEPA has been used for modelling a variety of computer and communication systems over the last 15 years. In this talk I will outline a recent set of models which investigated the average throughput, handover rate and network blocking probabilities for mobile nodes in a 3G-WLAN heterogeneous network under a number of commonly used network selection strategies.

This talk will present joint work with Hao Wang and Dave Laurenson, IDCOM, University of Edinburgh. In particular I will discuss how the compositional structure of the process algebra made it particularly straightforward to compare the different strategies in a common framework.

16.15 Plan for SICSA Collaborations

Initial identification of shared problems, possible collaborations and road map

17.00 Workshop Close

Informal discussions among delegates.