School of Computing - Research - Working Papers - 1993

Working Papers for 1993

Modelling the Software Process in Terms of the System-Representations and Transformation-steps Used,
T. Moynihan

I view the software development process as the evolution of an initial representation of the required system through a sequence of transformation steps and intermediate representations, culminating in the delivered software. In the general case, the intermediate representations are based on different linguistic systems. I argue that any representation, and any transformation step, can usefully be characterised in terms of its degree of possession of each of a small number of properties. I then show that the degree of possession of these properties can influence important software process outcomes. I conclude by claiming that the proposed paradigm brings one closer to the "core" of the software development process than do most other approaches to software process modelling.

Jupiter: An Interoperator for Heterogeneous Database Systems,
J.P. Murphy and J.B. Grimson.

This paper describes the Jupiter Interoperator prototype system being developed at Trinity College Dublin and at Dublin City University. The goal of the Jupiter interoperator is to provide a lightweight transport which allows interoperation of heterogeneous database systems which may have different data models. At the core of the Jupiter architecture are a distributed dictionary system, which provides the mapping information needed to allow interoperation between the participant DBMSs, and an interoperator language, the Jupiter Interoperator Language (JIL), which allows a user to formulate queries to the federation. The participant DBMSs form a loosely coupled federation which interoperates by using the services provided by the Jupiter interoperator. The interoperator is itself constructed above the Amadeus distributed persistent object store (DPOS). To our knowledge, using a DPOS to manage a federation of heterogeneous DBMSs is a novel approach which has not been attempted before. In this paper we outline the goals of our research and describe the implementation architecture of Jupiter. The interoperator language JIL is also described.

Network Management in a Heterogeneous Internet,
M. Connolly and M. Scott.

The TCP/IP protocol suite is today's de facto open standard for computer communications in heterogeneous networks. However, it is the opinion of many that the OSI protocols from ISO will eventually dominate and replace TCP/IP. The existence in today's networks of both sets of standards gives rise to the question of how to manage such heterogeneous network elements, particularly in this phase of transition from one standard to the other. This paper first examines the reasons for heterogeneity, discussing general TCP/IP to OSI transition strategies. It goes on to identify specific strategies for the transition of the management application itself, as well as strategies for the management of a network in transition, discussing in more detail the implementation of one of those strategies, namely the proxy management of OSI network elements using a native SNMP management system.

The Application of Fuzzy Logic to Image Processing and Analysis,
R.B. Lawlor and J. Power.

The theory of fuzzy sets was developed as an extension of classical Boolean logic and has been applied to both low-level and mid-level computer vision problems. The view of a digital image as an array of singletons, where each pixel's membership represents the strength of some property at that point, allows the use of fuzzy logic in image processing. The paper contains a comparison of the image enhancement, thresholding, noise suppression and edge detection problems using fuzzy and non-fuzzy approaches. We then describe an attempt at image analysis using fuzzy geometric properties, in particular fuzzy perimeter, fuzzy compactness and index of area coverage. It was hoped that, on extracting these properties for simple 2-dimensional shapes, shape identification could be achieved. Fuzzy perimeter is then examined in some detail and experimental results are presented and discussed.
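The fuzzy geometric properties mentioned can be sketched for a membership array as follows; this follows Rosenfeld's standard definitions (an assumption — the paper's exact formulations may differ):

```python
def fuzzy_perimeter(mu):
    """Fuzzy perimeter (after Rosenfeld): sum of absolute membership
    differences over horizontally and vertically adjacent pixels."""
    rows, cols = len(mu), len(mu[0])
    total = 0.0
    for r in range(rows):
        for c in range(cols):
            if c + 1 < cols:
                total += abs(mu[r][c] - mu[r][c + 1])
            if r + 1 < rows:
                total += abs(mu[r][c] - mu[r + 1][c])
    return total

def fuzzy_area(mu):
    """Fuzzy area: sum of all pixel memberships."""
    return float(sum(sum(row) for row in mu))

def fuzzy_compactness(mu):
    """Index of compactness: area / perimeter^2, maximised by compact shapes."""
    p = fuzzy_perimeter(mu)
    return fuzzy_area(mu) / (p * p) if p else 0.0
```

For a crisp shape (memberships all 0 or 1) these reduce to the ordinary perimeter and area.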

Comparison of Effective and Longest Relaxation Times of Single Domain Ferromagnetic Particles,
J. Waldron.

Potentially, a Single Domain Magnetic Particle can store one bit of information, with the two orientations of the magnetism serving as the on/off states of the memory element. For the purpose of increasing the speed of computers and their memory density, the density of memory elements will have to be increased and their size decreased; however, the particle must not be so small that the magnetic memory will be unstable against thermal agitation at room temperature. This paper describes numerical analysis undertaken to compare several mathematical models of the behaviour of Single Domain Particles and shows the most suitable method of calculation of the relaxation time for all magnitudes of the energy barrier.

Design and Implementation of a Comprehensive Security System for Use in Distributed Network Systems,
R. Moran and J. Dunnion.

Among commercial organisations there is a growing need to provide adequate levels of protection to computer systems, especially where those systems and the applications are distributed. This paper describes the design of a Comprehensive Security System (CSS) for use in typical computer networks. CSS requirements are reviewed under the headings of protection, detection, accountability, data integrity, data confidentiality and network availability. A prototype system has been implemented over a BSD4.3 Unix domain network and this is compared to similar systems developed for OSI-based networks.

Partitioning Methods for Signature Files: Experiments on Searching Large Volumes of Text,
F. Kelledy and A.F. Smeaton.

Searching through large volumes of text can be implemented by generating a signature file from the input text using techniques of superimposed coding and performing the search on the signature file rather than on the text itself. Overall search time can be reduced by deterministically partitioning the signature file using one of a number of known techniques. In this paper we evaluate the effectiveness of some known deterministic partitioning algorithms using real rather than simulated data of over 276 Mbytes of original text, and using real rather than simulated user queries. We present a selection of results to illustrate trends and highlight important aspects of the performance of these methods which could not be determined by simulation or experiments on small data sets alone. We also propose a new method of deterministically partitioning a signature file and compare it empirically with existing algorithms.
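A minimal sketch of the superimposed coding step, assuming a hashed word-signature scheme (the signature width, hash and bit count are illustrative, not the paper's parameters):

```python
import hashlib

SIG_BITS = 256      # signature width in bits (illustrative)
BITS_PER_WORD = 3   # bit positions set per word (illustrative)

def word_bits(word):
    """Hash a word to a small set of bit positions."""
    digest = hashlib.md5(word.encode()).digest()
    return {int.from_bytes(digest[2 * i:2 * i + 2], "big") % SIG_BITS
            for i in range(BITS_PER_WORD)}

def make_signature(words):
    """Superimposed coding: OR together the bit patterns of every word."""
    sig = 0
    for w in words:
        for b in word_bits(w):
            sig |= 1 << b
    return sig

def may_contain(sig, query_words):
    """True if the block signature covers all query-word bits."""
    q = make_signature(query_words)
    return sig & q == q
```

Because signatures are lossy, a match only says the block *may* contain the query words; false drops are eliminated by checking the matching text itself.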

Pseudo-Dynamic Load Balancing,
D. Sinclair and M. O'hEigeartaigh.

Two major barriers prevent the widespread, common usage of parallel and distributed computing systems:
(1) A language which expresses parallelism without reference to the underlying hardware configuration
(2) A user-invisible method for effectively distributing the tasks that form the parallel/distributed program among the available processing nodes. This is known as the load balancing problem.
This paper proposes a new approach to solving the load balancing problem. This new method, called pseudo-dynamic load balancing, provides an efficient real-time allocation of n tasks among m processing nodes.

Non-Symmetric Formulation of Load Balancing on Parallel Distributed Systems,
D. Sinclair and M. O'hEigeartaigh.

This paper presents a non-symmetric mathematical formulation for the allocation problem of n inter-communicating tasks among m homogeneous processing nodes when the optimisation criterion is to minimise response time. This formulation shows that the problem is not only an integer optimisation problem, it is also quadratic in nature. A novel relaxation technique is presented which reduces the problem to a mixed integer linear problem (MILP). This relaxed MILP formulation is implemented in the SCICONIC mathematical programming package and is applied to an example problem. The results demonstrate that this new relaxation technique generates strong upper bounds on the execution time of a program executing on a parallel/distributed computing system.

Statistical Inference for Texture Characterisation Using Second Order Statistics,
P. Kiernan.

The maximum likelihood method proposed by Kashyap for determination of conditional autoregressive moving average (CARMA) model parameters for texture is modified for uncorrelated white Gaussian noise driven quarter-plane simultaneous ARMA (SARMA) models. Test results based on modelling and synthesis of real textures are presented and analysed.

Worst Case Analysis of Pseudo-Dynamic Load Balancing,
D. Sinclair and M. O'hEigeartaigh.

The Pseudo-Dynamic Load Balancing algorithm is a novel technique for generating near-optimal allocations of n inter-communicating tasks among m processing nodes. This paper derives the worst case ratio, R, for the Pseudo-Dynamic Load Balancing algorithm. If the ratio of the maximum communications load to the maximum computational load is a, then the upper bound on R is R <= 1+2a.

The Universal Random Bit Tester,
M. Scott.

What do the successful gambler, wiretapper and data compressor have in common? The answer is the ability to predict the output of a data source. Of course, if the data source is truly random, its output cannot be predicted with an accuracy exceeding pure chance. In this article a short C program which implements a new Universal statistical test for randomness, due to Maurer, is presented.
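A sketch of Maurer's statistic in Python (the paper presents a C program; the parameter names and the fallback distance for a block never seen during initialisation are assumptions here):

```python
import math

def maurer_statistic(bits, L=6, Q=640):
    """Maurer's universal test statistic fn (a sketch, not Maurer's
    exact program).  bits is a 0/1 sequence split into L-bit blocks;
    the first Q blocks initialise a table of last occurrences, and
    each of the remaining K blocks contributes log2 of the distance
    back to the previous occurrence of the same block."""
    n_blocks = len(bits) // L
    K = n_blocks - Q
    if K <= 0:
        raise ValueError("need more than Q blocks of data")
    last = {}
    for i in range(Q):
        last[tuple(bits[i * L:(i + 1) * L])] = i
    total = 0.0
    for i in range(Q, n_blocks):
        block = tuple(bits[i * L:(i + 1) * L])
        prev = last.get(block)
        total += math.log2(i - prev if prev is not None else i + 1)
        last[block] = i
    return total / K  # ~5.2177 expected for truly random bits at L=6
```

A predictable source (e.g. all zeros) scores near 0, since every block repeats at distance 1, while a random source approaches the expected value.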

The Denotational Semantics of a Database Language,
J.P. Murphy and J.B. Grimson.

This paper presents a semantic description of the database language T/QL. T/QL is a relational subset of the JIL multidatabase language with most of the syntactic sugar removed. The semantics of a relational database language is usually given via the relational algebra or the relational calculus. In this paper a different approach is taken: we define the abstract syntax of T/QL and an algebra, and then describe the valuation functions which map the abstract syntax to the algebra. This allows us to assign a meaning, or denotation, to a program in T/QL. A feature of the semantics is that it is defined using the Z notation.

AIDS in Ireland: Limitations of 'Deterministic' Back-Projection,
C.M. Comisky and H.J. Ruskin.

The method of back-projection uses knowledge of the numbers of AIDS cases (after adjustment for reporting delays) and information on the incubation period distribution to derive estimates of those previously infected with HIV. Although initially widely used by mathematicians and statisticians as a means of providing essential estimates of the numbers of unknown HIV infections and short-term predictions of future numbers with AIDS, it now appears clear that estimates are highly variable depending on the nature of the assumptions made. We briefly discuss results obtained on the Irish AIDS data for a parameterised version of the back-projection method (AIDS counts represented by a smooth continuous curve). Special features of the Irish data compared to those of the U.K. and U.S. are also briefly discussed.

On the Derivation of Asymptotic Formulae for the Dielectric Correlation Times of a Single Axis Rotator from the Exact Solution,
W.T. Coffey, D.S.F. Crothers, J.T. Waldron.

It is shown how the exact formulae for the longitudinal and transverse correlation times of a single axis rotator with two equivalent sites, which have been previously given as a series of modified Bessel functions, may be rendered in integral form using Watson's integral formula for the product of two modified Bessel functions. The method of steepest descents is applied to these solutions in order to obtain asymptotic formulae for the correlation times in the high potential barrier limit.

Navigating through Dynamic Environments Using Graph Theory,
J. O'Duinn and M. O'hEigeartaigh

Graph Theory is an established technique for modelling static objects in a domain. Modelling mobile objects requires that the domain be viewed as a series of static domains at consecutive time-slices. Conventionally, the entire network would be rebuilt at the end of the current time-slice so as to correctly represent the domain in the next time-slice. This paper outlines how to update only the relevant parts of the graph network, which represent the changed parts of the domain, and still ensure that the network correctly represents the domain. For domains which contain a mixture of mobile and static objects, experiments carried out show that this partial updating technique can rebuild a network much more quickly than the conventional technique.

Counting Hamiltonian Paths and Tours in Regular Graphs,
M. O'hEigeartaigh and M.E.J. O'Kelly.

We let Kn be the complete graph on n nodes and Gm,n the set of regular graphs of degree m with n nodes. We regard graphs G in Gm,n as having been derived from Kn by deleting the arcs of the regular graph S. Duality results relating the chains in S to the number of Hamiltonian paths and tours in G are derived.

Multidatabase Language Semantics in Jupiter,
J.P. Murphy and J.B. Grimson.

This article presents a semantic description of the relational multidatabase language component of the Jupiter Interoperator, which is known as the Jupiter Interoperator Language (JIL). Multidatabase research has been motivated by the proliferation of databases and the consequent need to jointly manipulate the information in them. Standard database management systems do not possess the capabilities required for this type of manipulation, as they typically exhibit single-relation semantics, i.e. the result of an operation is a relation. Multidatabases, on the other hand, require manipulations of more than one relation at a time, resulting in (possibly) more than one relation in the result, i.e. they have multi-relation semantics. The contribution of this paper is that it describes a novel semantics of a relational multidatabase language in a uniform formal framework; by this we mean that we also describe update, deletion and modification semantics. To the best of our knowledge there is no other work which gives such a uniform semantics; the closest in approach does not also describe update, deletion and modification semantics.

Novel Chaining Methods for Block Ciphers,
M. Scott and M. Shafa'Amry.

Several new chaining techniques are described for use with secret-key Block Ciphers. In particular, new forms of error-propagating chaining are introduced which sacrifice the self-healing properties of the classic Cipher Feedback (CFB) and Cipher Block Chaining (CBC) modes, which are of dubious advantage in many applications, in return for much enhanced security against known- and chosen-plaintext attacks. In addition it is demonstrated that a standard one-way hash function, such as MD5 or NIST-SHS, normally used in conjunction with a digital signature system, can also be used to encrypt.
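The final claim can be illustrated by running a one-way hash in counter mode to produce a keystream; this is a standard illustrative construction, not necessarily the scheme proposed in the paper:

```python
import hashlib

def hash_stream_encrypt(key, data, nonce=b""):
    """XOR data with a keystream of MD5(key || nonce || counter) blocks.
    Decryption is the same operation with the same key and nonce."""
    stream = bytearray()
    counter = 0
    while len(stream) < len(data):
        block = hashlib.md5(key + nonce + counter.to_bytes(8, "big")).digest()
        stream.extend(block)
        counter += 1
    return bytes(b ^ k for b, k in zip(data, stream))
```

Because the hash is one-way, knowing keystream blocks does not reveal the key; a fresh nonce per message keeps keystreams from repeating.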

The Modelling of Mutual Alternative Routing in Circuit-Switched Networks,
P. Duggan and T. Curran.

This paper reports on a project which simulated the concept of mutual alternative routing on the Telecom Eireann national trunk telecommunications network. OPNET, a Unix-based package, was used for the simulations.

Meaning Representation in Connectionist Systems,
J. Connelly.

The aim of the work outlined in this paper is to examine the suitability of connectionist architectures to the problem of lexical meaning representation. Conceptual structure is adopted as the semantic theory, in which the meaning of individual lexical items is seen in terms of how they relate to other lexical items. To accommodate this, the semantic feature approach to meaning representation is proposed, i.e. each lexical item is described in terms of its properties or attributes. Thus similarity between lexical items can be measured by the degree of overlap in their respective feature profiles. Conceptual hierarchies are employed in an effort to represent world knowledge at a conceptual level and also to make the production of a feature profile for each lexical item more consistent.
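The feature-profile overlap measure can be sketched as below; the feature names and the Jaccard form of the measure are illustrative assumptions:

```python
def overlap_similarity(profile_a, profile_b):
    """Similarity between two lexical items, measured as the degree of
    overlap of their semantic feature profiles (Jaccard coefficient)."""
    a, b = set(profile_a), set(profile_b)
    return len(a & b) / len(a | b) if (a or b) else 1.0

# Hypothetical feature profiles for three lexical items.
dog = {"animate", "concrete", "canine", "domestic"}
cat = {"animate", "concrete", "feline", "domestic"}
car = {"concrete", "vehicle", "artifact"}
```

Under this measure, "dog" is closer to "cat" than to "car", matching the intuition that overlap in attributes tracks semantic relatedness.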

Scheduling Algorithms for the Yearly Extension of the Telephone Transmission Network,
D. Logue and M. O'hEigeartaigh.

This paper introduces the issues in routing flows across a telephone transmission network. It describes the development of a number of heuristic algorithms utilising search techniques to achieve an optimum flow routing. The basic routing heuristics which were developed are explained: a link-based heuristic, a demand-based heuristic and a chain-based heuristic. A computer model called OPTNET, developed to implement and evaluate the heuristics, is described along with the data structures used in the model. The effectiveness of each heuristic is evaluated by applying them to a number of test data sets that represent different network topologies. Results from applying the heuristic algorithms to the set of test networks are given in tabular form along with an evaluation of their performances. Based on the test results the conjecture is made that, of the three heuristic approaches studied, the path-based heuristic is most effective.

Producing LOTOS Specifications from STATEMATE,
S. Parker and T. Moynihan.

Formal methods allow specifiers to ensure the correctness, completeness and consistency of specifications. As such they can be an invaluable aid in the development of complex software systems. A main flaw in the use of formal methods, however, is their unsuitability as instruments of communication, particularly with people from a non-technical background. The graphical nature of informal techniques leads to a much more intuitive understanding of system requirements. This paper presents the idea of using a rigorous technique (STATEMATE) as a precursor to the development of a formal specification (in LOTOS). STATEMATE provides a good means of communicating requirements with users. It also enables completeness and correctness checks to be performed on systems at the requirements capture stage of system development. The use of STATEMATE as an initial step in the development of LOTOS leads to a much more highly structured and modular LOTOS specification.

Implementing Metrics for Process Improvement,
A. McAuley and R. Verbruggen.

There is increasing interest in the use of metrics to control the software development process, to demonstrate productivity and value, and to identify areas for process improvement. Research work completed to date is based on the implementation of metrics in a 'standard' software development environment, and follows either a top-down or a bottom-up approach. With the advent of further European unity, many companies are producing localised products, i.e. products which are translated and adapted to suit each European country. Metrics systems need to be customised to the processes and environment of each company. This paper describes a 12-step process for metrics implementation, using an optimum approach which combines the top-down and bottom-up approaches, with a set of applicable metrics covering the software development process which can be adapted for any development environment. For the case study, a software localisation company, the suggested implementation process is followed and relevant measures are adapted to suit the different environment, with a particular emphasis on quality metrics. A metrics system is itself subject to continuous improvement: rather than being a once-off implementation, it is an evolutionary process, changing as the software development process comes under control.

An Approach for the Motion Planning of Mobile Robots,
B. Wang, C. Daly, M. Scott and M. Ryan.

Planning future actions is one of the fundamental components of intelligent mobile robot research. Motion planning problems have attracted many researchers to find efficient methods so that real-time processing can be realised. In this paper an approximate approach for motion planning of a mobile robot in a binary-represented environment is introduced. The approach uses fuzzy logic methods for digital image processing to process the original digital image taken by a CCD camera and determine the Principal Rectangle for every object in the image. The Principal Rectangle is used to represent every obstacle approximately. An A*-like search algorithm (using sector theory) is used to find the shortest path from the starting point to the destination point. Experimental and simulation results show that the approach is efficient and effective.
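As an illustration of the search step, here is a minimal A* on a binary occupancy grid; it uses a plain Manhattan heuristic on a 4-connected grid rather than the paper's sector theory or Principal Rectangles:

```python
import heapq

def astar(grid, start, goal):
    """A* shortest path on a binary occupancy grid (1 = obstacle).
    4-connected moves, Manhattan-distance heuristic; returns the list
    of cells from start to goal, or None if no path exists."""
    rows, cols = len(grid), len(grid[0])
    def h(p):
        return abs(p[0] - goal[0]) + abs(p[1] - goal[1])
    open_set = [(h(start), 0, start, [start])]
    best = {}
    while open_set:
        f, g, pos, path = heapq.heappop(open_set)
        if pos == goal:
            return path
        if best.get(pos, float("inf")) <= g:
            continue  # already reached this cell at lower cost
        best[pos] = g
        r, c = pos
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and not grid[nr][nc]:
                nxt = (nr, nc)
                heapq.heappush(open_set, (g + 1 + h(nxt), g + 1, nxt, path + [nxt]))
    return None
```

The Manhattan heuristic never overestimates the true cost on a 4-connected grid, so the first path popped at the goal is shortest.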

DCU-Cipher: A Secret-Key Block Cipher System,
M. Shafa'Amry and M. Scott.

The recent results of using the new type of chosen-plaintext attack called differential cryptanalysis make most published secret-key block cipher systems vulnerable. We introduce in this paper a secret-key block cipher that resists all known cryptanalysis methods, including differential cryptanalysis. The proposed method is workable for either 64-bit plaintext/64-bit ciphertext blocks or 128-bit plaintext/128-bit ciphertext blocks. The secret key in both styles is 128 bits long. This method has only four rounds, and the main transformation function in this cipher algorithm is based on four mixed operations. The proposed method is suitable for both hardware and software implementation. It is also suitable for cryptographic hash function implementation.

Using Linear Programming Techniques to Solve Telecommunications Problems,
B. McCarthy, M. O'hEigeartaigh and G. Keogh.

This paper attempts to use linear programming techniques coupled with common network flow concepts to generate formulations for two telecommunications problems. The first of these is based on optimal channel allocation in a network with a discrete cost function while the second is based on the network design problem. Both problems are fully formulated and solved using the SCICONIC package. Numerical results for a number of datasets are included. The formulations of the two problems in MGG code and the tables of results are given in the appendices.

Navigation in a Hypertext Using Dynamically Constructed Guided Tours,
C. Dunne and R. Verbruggen.

When searching by browsing in a large hypertext the user can become lost or disoriented. Effective access to information in a large hypertext requires query-based access and knowledge of the user and domain to complement browsing and combat the problem of disorientation. In this paper we describe a technique for implementing an intelligent guide to aid the user in navigating a large hypertext. The technique involves identifying information-seeking and useful patterns in the user's behaviour and exploiting these patterns to direct users toward relevant information in response to a query. The guide's behaviour is context-sensitive and adapts to individual users.

Statistical Modelling of a Disk Server in a Client/Server Environment,
B. Stone, M. O'hEigeartaigh and R. Verbruggen.

In this paper we analytically model the performance of a server in a client/server environment and in particular we concentrate on the modelling of disk access to the server machine considering such aspects as the layout of a disk. We are especially interested in measuring the proportion of the network bandwidth being utilised by the disk server which is indicative of the scalability of the system.

Counting Chains in Single and Double Hamiltonian Tours,
M. O'hEigeartaigh and M.E.J. O'Kelly.

Let G be a Hamiltonian tour or two edge-disjoint tours with n nodes. By a chain of a arcs and b components in G, we mean a collection of b disjoint paths of total length a arcs in G. We let n(a,b) be the number of these chains. Tables for n(a,b) are derived and relationships with Stirling numbers of the second kind and the number of partitions of an integer are investigated. Algorithms for calculating n(a,b) for small values of a are presented.
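A brute-force enumeration can illustrate n(a,b) for small cases on a single tour (purely illustrative; the paper derives tables and closed-form relationships instead):

```python
from itertools import combinations

def count_chains(n, a, b):
    """Count chains with a arcs and b components in a single
    Hamiltonian tour on n nodes, with arcs labelled 0..n-1 around
    the cycle.  A chosen arc set splits into components wherever an
    arc's predecessor around the cycle is not chosen."""
    if not 0 < a < n:
        raise ValueError("need 0 < a < n (choosing all n arcs gives the tour itself)")
    count = 0
    for subset in combinations(range(n), a):
        chosen = set(subset)
        # one component starts at each chosen arc whose cyclic predecessor is absent
        components = sum(1 for i in chosen if (i - 1) % n not in chosen)
        if components == b:
            count += 1
    return count
```

Summing over all b recovers the total number of a-arc subsets, C(n, a), since every subset of fewer than n arcs forms some number of disjoint paths.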

The Crossing Number Problem: A Comparative Study of Simulated Annealing, TABU Search and MYOPIC Algorithms,
G.I. Leeson and M. O'hEigeartaigh.

This paper describes and contrasts the results of some experiments applying "strong" and "weak" search strategies to the NP-complete crossing number problem. The algorithms chosen for this set of experiments were simulated annealing, TABU search and the myopic search algorithm. The myopic search algorithm presented here is a search technique developed at Dublin City University.

Automated Storage Protocol Identification: Mechanisms and Algorithms,
G.I. Leeson and M. O'hEigeartaigh.

This paper is an attempt to fill a gap in the current computing literature concerning identification theory, i.e. file identification. The aim is to describe the various techniques that can be employed to identify a given file, and to propose a hybrid algorithm that can be employed by software packages to overcome the limitations of each of the basic identification techniques.
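One of the basic identification techniques, matching a file's leading magic bytes and falling back to a content heuristic, can be sketched as follows (the signature table and the text/binary fallback are illustrative, not the paper's algorithm):

```python
# A small table of well-known magic-byte prefixes (illustrative subset).
MAGIC = {
    b"\x89PNG\r\n\x1a\n": "PNG image",
    b"GIF87a": "GIF image",
    b"GIF89a": "GIF image",
    b"%PDF-": "PDF document",
    b"PK\x03\x04": "ZIP archive",
    b"\x1f\x8b": "gzip compressed data",
}

def identify(data):
    """Identify a file from its leading magic bytes; fall back to a
    crude printable-text heuristic when no signature matches."""
    for magic, name in MAGIC.items():
        if data.startswith(magic):
            return name
    if all(32 <= b < 127 or b in (9, 10, 13) for b in data[:512]):
        return "plain text"
    return "unknown binary"
```

A hybrid scheme of the kind the paper proposes would combine such signature lookup with other techniques (extensions, structural checks) to resolve the cases where any single method fails.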