As the computing world embraces the trend towards object orientation, recent years have seen the introduction of object-oriented databases (OODBMSs) onto the commercial market. Many application developers now face decisions regarding not just which OODBMS to use but whether to use an object-oriented database at all. In this paper we briefly review the emergence of benchmarking suites for OODBMSs and list some guidelines which the potential OODBMS user should bear in mind.
This paper describes the dictionary component of the Jupiter Interoperator, a system which supports interoperability between a federation of autonomous heterogeneous database systems. By heterogeneous we mean syntactic and semantic heterogeneity between autonomous databases with homogeneous data models. A feature of Jupiter is that it has been constructed with reference to a formal framework: the multidatabase language JIL has been specified formally using a denotational approach. The purpose of this paper is to describe the features of the Jupiter distributed multidatabase dictionary, Jupiter/MDD. The description is formal and is given using the Z notation. The contribution of this paper relates to schema evolution for federated databases; new types of user privileges and their meaning; and the characterisation of legacy systems along three related dimensions.
At a recent appeal court hearing in London the author presented evidence contradicting that of a leading British forensic scientist, leading to the overturning of the conviction. The same scientist presented crucial evidence at the Gibraltar Inquest and at other controversial trials, and was the principal forensic investigator in the Lockerbie disaster.
We propose a 2-D extension of the Minimum Free Energy (MFE) parameter estimation method which may be used to determine autoregressive (AR) model parameters for 2-D spectral estimation. The performance of the technique for spectral estimation of 2-D sinusoids in white noise is demonstrated by numerical example. Better results can be achieved than with the multidimensional Levinson algorithm.
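For orientation, the classical 1-D Levinson-Durbin recursion, which the multidimensional Levinson algorithm above generalises, can be sketched as follows. This Python fragment is illustrative and not taken from the paper; it solves the Yule-Walker equations for AR model coefficients from an autocorrelation sequence.

```python
def levinson_durbin(r, order):
    """Solve the Yule-Walker equations for AR(order) coefficients
    given the autocorrelation sequence r[0..order] (1-D illustration).
    Returns (a, err): polynomial a[0..order] with a[0] = 1, and the
    final prediction error variance."""
    a = [0.0] * (order + 1)
    a[0] = 1.0
    err = r[0]
    for m in range(1, order + 1):
        # reflection coefficient for order m
        acc = r[m]
        for k in range(1, m):
            acc += a[k] * r[m - k]
        k_m = -acc / err
        # symmetric coefficient update
        new_a = a[:]
        for k in range(1, m):
            new_a[k] = a[k] + k_m * a[m - k]
        new_a[m] = k_m
        a = new_a
        err *= (1.0 - k_m * k_m)
    return a, err
```

With the convention used here, an AR(1) process x_t = 0.5 x_{t-1} + e_t with normalised autocorrelation r = [1, 0.5, 0.25] yields a[1] = -0.5.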
This paper describes an econometric analysis of the demand for residential telephone connections in Ireland. The aim is to develop a method of estimating future growth in the number of residential connections by firstly identifying the factors which influence medium to long term trends in these quantities in an Irish context and then devising a planning/forecasting model incorporating these factors.
The factors considered were derived from both the Irish experience and that of other countries; they included social, financial, economic and pricing influences. Models developed in other countries were investigated and adaptations were made where relevant.
A new test for serial correlation in multiple regression is proposed. The power of the test approximates that of the full Durbin-Watson procedure very well in cases where the latter is known to have optimal power properties. The test also has good power properties in circumstances where the power of the Durbin-Watson test is low. Unlike point optimal tests for serial correlation, there is no requirement on the researcher to pre-specify a likely alternative to the null hypothesis of white noise. The construction of the test employs results regarding high-leverage observations in multiple regression and, as a by-product, it is shown that it is the presence of high-leverage observations in a different co-ordinate system that lowers the power of the Durbin-Watson statistic in those circumstances where it is weak.
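For reference, the Durbin-Watson statistic against which the proposed test is compared can be computed from regression residuals as follows (an illustrative sketch, not part of the paper's own construction):

```python
def durbin_watson(residuals):
    """Durbin-Watson statistic on a sequence of regression residuals.
    Values near 2 indicate no first-order serial correlation; values
    near 0 indicate strong positive correlation, near 4 strong
    negative correlation."""
    num = sum((residuals[t] - residuals[t - 1]) ** 2
              for t in range(1, len(residuals)))
    den = sum(e * e for e in residuals)
    return num / den
```

For example, perfectly alternating residuals (strong negative correlation) push the statistic towards 4, while constant residuals give 0.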
This paper describes a prototyping system which facilitates the use of ESML (Extended Systems Modelling Language), a Real-Time Structured Analysis and Design (RTSA/SD) language, as a graphical executable specification language. A set of execution rules (guidelines) is defined for ESML to allow prediction of the behaviour of ESML specifications over time. The ESML/LOOPN prototyping system advocates the translation of non-executable ESML specifications into executable LOOPN (Language of Object-Oriented Petri Nets) specifications; LOOPN nets are used to define an execution semantics for ESML according to the guidelines set down by the ESML execution rules. ESML can therefore be used to prototype real-time systems. For brevity this paper limits its scope to a subset of the ESML language.
We concentrate in this report on large-scale simulations of physical and related systems. Consideration is given to the various hardware capabilities, algorithms and current software availability. We also review language and machine performance, in particular for DCU systems. We discuss, as illustrations, sandpile models of single cellular systems and a cellular network model for froth. We also present results of our own studies on sandpiles, using the simulation approach. Two examples of sandpile systems are considered, namely the undirected and directed height models. For the latter, we also investigate the effect of leakage of sand through holes in the sandpile base. Results on the undirected model for exponent values of the size and duration of avalanches are in reasonably good agreement with theoretical work and previous studies. For the directed model, estimates of exponent values clearly do not agree with those for the undirected case, supporting the view that these models belong to different "universality" classes. The effect of leakage in the system is less clear cut. There is some evidence that self-organization of the critical state is lost, but it seems clear that larger systems and a wider range of values for hole concentration must be investigated before this result can be confirmed.
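As an illustration of the kind of simulation described above, a minimal undirected height (Bak-Tang-Wiesenfeld-style) sandpile can be sketched as follows. The grid size, toppling threshold of 4 and single-grain random driving are assumptions made for this sketch, not the report's actual parameters; the avalanche sizes returned by the driver are the quantity whose distribution exponents would be estimated.

```python
import random

def topple(grid, n, threshold=4):
    """Relax an n x n sandpile: any site at or above the threshold
    sends one grain to each of its four neighbours; grains falling
    off the edge are lost (open boundary). Returns the avalanche
    size, i.e. the number of toppling events."""
    size = 0
    stack = [(i, j) for i in range(n) for j in range(n)
             if grid[i][j] >= threshold]
    while stack:
        i, j = stack.pop()
        if grid[i][j] < threshold:
            continue  # already relaxed via a duplicate stack entry
        grid[i][j] -= threshold
        size += 1
        if grid[i][j] >= threshold:
            stack.append((i, j))  # may need to topple again
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ni, nj = i + di, j + dj
            if 0 <= ni < n and 0 <= nj < n:
                grid[ni][nj] += 1
                if grid[ni][nj] >= threshold:
                    stack.append((ni, nj))
    return size

def drive(grid, n, steps, rng=random.Random(1)):
    """Drive the pile: drop single grains at random sites, relaxing
    fully after each drop; collect the resulting avalanche sizes."""
    sizes = []
    for _ in range(steps):
        i, j = rng.randrange(n), rng.randrange(n)
        grid[i][j] += 1
        sizes.append(topple(grid, n))
    return sizes
```

In a study such as the one reported, the histogram of the returned avalanche sizes would be fitted to a power law to estimate the size exponent.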
In this paper we explore how coloured Petri nets (CPNs) can be used to model multimedia presentations. We investigate the possibilities of modelling the spatial data of a presentation alongside the temporal data, using the extra capabilities that CPNs offer over traditional Place/Transition Petri nets. Like Time Petri nets, CPNs are an extension of the basic Place/Transition Petri net model and are generally used to describe complex, real-world systems which would be too unwieldy for the original model. For modelling multimedia presentations, CPNs can be used to give an indication of the amount of data that needs to be stored and manipulated, which is particularly important for distributed multimedia presentations where network transmission can be a bottleneck. Basic Place/Transition nets provide an excellent means of specifying the temporal aspects of presentations; however, they fail to provide a mechanism to model the spatial or visual characteristics of multimedia presentations. In this paper we outline the advantages of using CPNs to do this and we give some worked examples.
The conventional Electronic Code Book (ECB) and Cipher Block Chaining (CBC) modes for use with a block cipher are open to known plaintext attack. Almost all successful cryptanalytic attacks on modern cipher systems are of this type. In this paper a simple and computationally inexpensive model is suggested to `harden' these modes against such attack.
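The structural difference between the two modes can be illustrated with a deliberately toy block "cipher" (a plain XOR, chosen only to expose the mode structure and emphatically not secure; the hardening model proposed in the paper is not reproduced here). Under ECB, identical plaintext blocks always produce identical ciphertext blocks, which is what a known-plaintext attacker exploits; CBC chains each block with the previous ciphertext.

```python
def toy_block_encrypt(block, key):
    # stand-in for a real block cipher: XOR with the key (NOT secure)
    return bytes(b ^ k for b, k in zip(block, key))

def ecb_encrypt(blocks, key):
    """Electronic Code Book: each block enciphered independently."""
    return [toy_block_encrypt(b, key) for b in blocks]

def cbc_encrypt(blocks, key, iv):
    """Cipher Block Chaining: XOR each plaintext block with the
    previous ciphertext block (or the IV) before enciphering."""
    out, prev = [], iv
    for b in blocks:
        chained = bytes(p ^ q for p, q in zip(b, prev))
        c = toy_block_encrypt(chained, key)
        out.append(c)
        prev = c
    return out
```

Encrypting two identical plaintext blocks shows the leak: the two ECB ciphertext blocks coincide, while the CBC ones differ.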
This article defines a formal model for multidatabase interoperability. A mathematical model and multidatabase language are introduced and the formal semantics of the multidatabase language is defined in terms of the mathematical model. The development of the model is incremental in approach, starting with relational model semantics and extending this semantics to a multi-relational model and in our current work to a canonical model for federated systems.
The major contribution of this article is that it defines a uniform formal framework for the investigation of properties of multidatabase systems and their languages. Major language extensions relating to binding, negotiation and modification semantics are proposed which are not found elsewhere in the literature. Legacy systems are characterised in this article along three related dimensions: the legacy applications themselves; the subsystems which support the legacy applications and interoperability; and the models and languages of the legacy applications and subsystems. We develop an extended algebra for multirelations and a new calculus for multirelations.
In this paper we propose the use of WordNet as a knowledge base in an information retrieval task. The area in which we work is data sharing in large-scale autonomous distributed database systems (also known as Federated Database Systems, FDBSs). However, the applicability of our ideas is not restricted to this application domain.
The WordNet derived knowledge base makes semantic knowledge available which can be used in overcoming many problems associated with the richness of natural language. A semantic similarity measure is also proposed which can be used as an alternative to pattern matching in the comparison process.
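One simple path-based measure of the kind proposed can be sketched over a tiny hand-coded is-a taxonomy, used here as a stand-in for the WordNet hypernym hierarchy; the terms and the exact measure are illustrative assumptions, not the paper's measure.

```python
# illustrative hand-coded is-a (hypernym) links; a real system would
# derive these from WordNet
HYPERNYM = {
    "car": "vehicle", "truck": "vehicle", "bicycle": "vehicle",
    "vehicle": "artifact", "artifact": "entity",
    "dog": "animal", "cat": "animal", "animal": "entity",
}

def path_to_root(term):
    """Follow is-a links from a term up to the taxonomy root."""
    path = [term]
    while path[-1] in HYPERNYM:
        path.append(HYPERNYM[path[-1]])
    return path

def path_similarity(a, b):
    """1 / (1 + length of shortest is-a path between two terms),
    in the spirit of WordNet path-based similarity measures."""
    pa, pb = path_to_root(a), path_to_root(b)
    common = set(pa) & set(pb)
    if not common:
        return 0.0
    dist = min(pa.index(c) + pb.index(c) for c in common)
    return 1.0 / (1.0 + dist)
```

Such a graded score ("car" is closer to "truck" than to "dog") is what allows the comparison process to rank candidate matches rather than merely test string equality.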
Since the earliest days of computing it has been a goal to automatically process natural language text with a view to identifying its content. Information retrieval has traditionally relied on keyword-based indexing to deliver retrieval for users in response to queries, a basis which is inadequate given the vagaries of natural language. In this article we examine why natural language processing techniques have not yet delivered efficient, domain-independent, robust and accurate analysis of text for information retrieval tasks. We analyse where work to date has placed us in the context of a roadmap or plan of where research should go in the future.
In this paper we describe how a hypertext corpus for an undergraduate course on database systems was converted from a proprietary system into HTML (HyperText Mark-up Language). The courseware was then enriched with non-text media and made available through Mosaic on the World Wide Web (WWW). A previously developed mechanism to dynamically generate a guided tour in response to a user's query was then superimposed onto the Mosaic interface, and the courseware was distributed to a number of sites on a metropolitan area network (MAN). This work demonstrates the feasibility of delivering multimedia courseware over a MAN using the WWW and Mosaic, enhanced with a guided tour creation facility.
It was the conclusion of the "definition phase" of the SPECS project (one of four RACE software technology projects) that none of the current specification languages, such as SDL and LOTOS, supports all the needs of the specification of telecommunication systems, but that most desirable features are available somewhere amongst the languages. Although both FDTs can be handled by the semantic layer of the tools developed by SPECS, an additional language, to define the interconnection between them, has to be designed and implemented to enable the handling of complete mixed-FDT systems. Thus SPECS has defined a scheme for combining specifications in LOTOS and SDL, bridging the worlds of synchronous and asynchronous specification languages. This paper describes the SPECS approach to mixing two different FDTs, SDL and LOTOS, and presents a set of rules to automate this mixing.
In this paper the accessibility graph approach for motion planning of a mobile robot which can accelerate from zero speed to its maximum speed in no time in a dynamic environment is discussed. The concepts of improved accessibility polygon and improved accessibility graph are introduced. A new approach based on the concept of the improved accessibility graph is used to solve the problem of planning the motion of a mobile robot which cannot accelerate from zero speed to its maximum speed in no time. The improved accessibility graph approach takes the same calculation time as the accessibility graph approach, that is, polynomial order, whereas the space-time approach used previously, which converts a two-dimensional problem into a three-dimensional one, has a calculation time of exponential order. The improved accessibility graph approach can also be extended to solve similar motion planning problems in three-dimensional dynamic environments. Simulation results show that the new approach is effective and efficient.
Planning future action is very important in mobile robot control. In this paper, shortest path and minimum time path planning are first addressed. A new method for minimum time path planning is presented. The method is based on smoothing the path at turning points by arcs, so that the robot travels along the lines and arcs which make up the path. Along a line it accelerates to maximum velocity and then decelerates as it approaches an arc in order to reach the "optimal velocity". The "optimal velocity" for the robot to travel along an arc depends on the potential function values and the radius of the arc. A fuzzy logic method, or other methods, can be used to obtain the "optimal velocity" at a turning point. Simulation results show that the method is reasonable and effective.
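The two ingredients of such a line-and-arc scheme can be sketched as follows. Note the sketch substitutes a simple lateral-acceleration limit (v^2 / r <= a_lat) for the paper's potential-function-based choice of "optimal velocity", so it is an illustrative assumption only.

```python
import math

def arc_velocity(radius, a_lat_max, v_max):
    """Velocity bound on an arc of the given radius, limited by the
    maximum lateral acceleration v^2 / r <= a_lat_max and capped at
    the robot's maximum velocity."""
    return min(v_max, math.sqrt(a_lat_max * radius))

def line_time(length, v_in, v_out, v_max, a_max):
    """Travel time along a straight segment under a trapezoidal
    (accelerate / cruise / decelerate) velocity profile, entering at
    v_in and leaving at v_out."""
    # distance needed to reach v_max from v_in, and to brake to v_out
    d_acc = (v_max ** 2 - v_in ** 2) / (2 * a_max)
    d_dec = (v_max ** 2 - v_out ** 2) / (2 * a_max)
    if d_acc + d_dec <= length:
        d_cruise = length - d_acc - d_dec
        return ((v_max - v_in) / a_max + d_cruise / v_max
                + (v_max - v_out) / a_max)
    # segment too short to reach v_max: triangular profile
    v_peak = math.sqrt((2 * a_max * length + v_in ** 2 + v_out ** 2) / 2)
    return (v_peak - v_in) / a_max + (v_peak - v_out) / a_max
```

For example, a 100 m segment traversed from rest to rest with v_max = 10 m/s and a_max = 1 m/s^2 spends exactly 10 s accelerating and 10 s decelerating.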
This article sets itself the task of resolving the tension arising from the complexity of linguistic theory vis-à-vis the often ad hoc simplifications made in effective applied computational linguistics, via a cognitive theory of language processing (LP). The contraformalistic notions of the later Wittgenstein are discussed, followed by a consideration of Piaget's positing of the priority of non-linguistic over linguistic knowledge. From this discussion, a cognitive model of context emerges.
The consequences of this model are spelled out in an overall theory of human LP. It is argued that the processes involved vary with respect to the orthogonal factors of degree of restriction of context and task involved. The consequences of this model for future CL, both theoretical and applied, are discussed.
Despite significant advances in computer hardware, the designer of human-computer interfaces remains singularly dependent upon individual skill and experience. Contemporary software interface design is largely based upon emulation of a previously successful formula for a product. A more formal theory of the design of human-computer interaction (HCI) is required so that we may understand what is done during design and codify it for application.
This paper discusses the pitfalls involved in formulating a theory of interaction. The pros and cons of a phenomenological-based approach are reasoned and the complexities of devising a cognitive-processing model are described.
There is a growing awareness of the need for computer security. In order to address this need, companies have developed security policies, governments have enacted laws, and standards organisations are developing standards specifically relating to computer security. In this document, we look at the broad management issues, the standards, the laws, the threats to security and the countermeasures which can be put in place to deal with the threats. We address the need for security, defining its basic requirements: to guarantee the availability of computer systems, the integrity of data and systems, and the confidentiality of data and systems. We review the reasons for the growth in the need for security and the consequences of loss of security. Because of the current concern about viruses, we examine virus programs in some detail, along with the growth of malicious software during the past twenty years, including some of the hacking incidents. The document comprises two parts. In the first part, we discuss the problems: viruses, poor operating system security, poor procedures and unrecognised risks. In the second part, we address the solutions to these problems, most of them tried, tested and workable. Finally, we discuss the areas of current research in computer security.
This thesis is concerned with query optimisation techniques in a distributed database system. One such technique, known as Semantic Query Optimisation (SQO), is introduced, and studies that have been carried out on this technique are summarised and discussed. An outline of the current status of relational and distributed database systems is given and some of the major issues in these fields are discussed. A summary of a number of different approaches that have been adopted to achieve semantic query optimisation in a centralised database is given. Some of the issues that must be considered when designing a query optimiser for a distributed database system are investigated and explained in detail. A current implementation of a distributed database system is reviewed and some emerging market trends in this area are also discussed. Some tests on the usefulness of SQO techniques were carried out in a commercial environment and the results of these tests are discussed in detail.
In this paper I will consider a number of hidden trapdoors which have
been suggested for public key systems and also propose two possible
methods of incorporating a practical hidden trapdoor for the RSA
system. Both methods are concerned with the prime generation routine
of an implementation of the RSA algorithm.
The first is a modification of a recent suggestion by Ross Anderson and makes use of a secret prime, known only to the designer/manufacturer, which enables the RSA modulus to be easily factored.
In the second method, the primes generated are powers of a primitive root of a prime, specially chosen so as to significantly reduce the computation required to solve the Discrete Logarithm problem. The task of factoring the RSA modulus involves the calculation of a discrete logarithm modulo this prime. In this way, knowledge of this secret prime allows the designer/manufacturer to find the original primes with relative ease.
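For orientation, a minimal RSA key generation routine with an ordinary, untampered prime generator is sketched below; the trapdoor methods described above would operate by biasing precisely this gen_prime step. The parameter sizes are toy values for illustration only, far too small for real use.

```python
import math
import random

def is_probable_prime(n, rounds=20, rng=random.Random(0)):
    """Miller-Rabin probabilistic primality test."""
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13):
        if n % p == 0:
            return n == p
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for _ in range(rounds):
        a = rng.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False
    return True

def gen_prime(bits, rng=random.Random(1)):
    """The prime generation routine -- the step a hidden trapdoor
    would subvert by biasing the choice of candidate primes."""
    while True:
        p = rng.getrandbits(bits) | (1 << (bits - 1)) | 1
        if is_probable_prime(p):
            return p

def rsa_keygen(bits=64, e=65537):
    """Generate a toy RSA key pair: public (n, e), private (n, d)."""
    while True:
        p, q = gen_prime(bits // 2), gen_prime(bits // 2)
        if p != q and math.gcd(e, (p - 1) * (q - 1)) == 1:
            break
    n, phi = p * q, (p - 1) * (q - 1)
    d = pow(e, -1, phi)  # modular inverse (Python 3.8+)
    return (n, e), (n, d)
```

A trapdoored implementation would return primes of the special forms described above while remaining indistinguishable to a user who only sees n and e.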
The approach presented here, by which the metadata of a database can be replicated to copies of the database stored on sites running the same or a different DBMS, concentrates on a simplified and idealised version of the heterogeneity problem and deals only with schematic heterogeneity.
The changes to the metadata are specified using the local data
definition language of the site on which the change is originating.
The commands are translated into the equivalent commands of a
common language, transferred to the other sites and then translated
into the commands of the local data definition language on each site.
The synchronisation mechanism used is based on designating one site to be the controlling site. This site runs a process called an update manager to synchronise all the changes to the metadata.
A prototype implementation of these techniques was developed to replicate the metadata of two relational DBMSs, IBM's Database 2 (DB2) and Database 2 OS/2 (DB2/2). A representative subset of the commands in the data definition languages of these two DBMSs, namely index and table commands, was chosen to be handled by the prototype.
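The translate-to-common-language step can be illustrated with a toy fragment. The dialect type mappings below are invented for illustration and do not reflect the actual DB2 or DB2/2 data definition languages.

```python
# hypothetical type mappings between two SQL dialects and a common form
TO_COMMON = {"VARCHAR": "STRING", "CHAR": "STRING",
             "INTEGER": "INT", "INT": "INT"}
FROM_COMMON = {"db2":   {"STRING": "VARCHAR", "INT": "INTEGER"},
               "db2/2": {"STRING": "CHAR",    "INT": "INT"}}

def to_common(ddl):
    """Parse a local CREATE TABLE command into a dialect-neutral form
    (the 'common language' shipped between sites)."""
    head, cols = ddl.split("(", 1)
    table = head.split()[2]
    columns = []
    for col in cols.rstrip(") ").split(","):
        name, ctype = col.split()
        columns.append((name, TO_COMMON[ctype.upper()]))
    return {"table": table, "columns": columns}

def from_common(cmd, dialect):
    """Re-emit the common form in a target site's local DDL dialect."""
    cols = ", ".join(f"{n} {FROM_COMMON[dialect][t]}"
                     for n, t in cmd["columns"])
    return f"CREATE TABLE {cmd['table']} ({cols})"
```

In the prototype's terms, the originating site runs to_common, the update manager ships the neutral form, and each receiving site runs from_common for its own dialect.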
This paper looks at ways of using vision to improve the accuracy of a robotic system used in an SMT production environment. These improvements are obtained by developing routines to calibrate the robot, to calibrate the robot grippers, to find the part centre and to find the PCB fiducial marks.
This paper proposes a new methodology for the development of Knowledge Based Systems (KBS). The key characteristics of knowledge based systems are examined, with particular emphasis on those aspects of developing such systems which differentiate them from more conventional development approaches. Various existing KBS development methodologies are then considered. Some of the important aspects of these methodologies are adopted or adapted within the new methodology, which seeks to define a methodological approach that satisfies the key requirements of knowledge-based systems development while ensuring that the key managerial aspects of a methodological approach are also catered for. The proposed methodology is then assessed in the light of the particular difficulties of the paradigm which it seeks to address.
The purpose of this dissertation is to evaluate how effectively sound is used in graphical user interface (GUI) designs for multimedia applications. I have approached this from two angles - in terms of general design guidelines for GUIs which incorporate sound, and also in terms of how to formally evaluate GUIs which incorporate sound.
In this paper I aim to look at the area of software quality, to understand what it means to produce high quality software products and how to go about it. The best way to learn about any subject is to take a practical example and work it through. In this respect I will take a customer project currently in development and use it as a case study to understand the difficulties of producing quality software, even with the availability of new tools and environments which their producers claim will make us more productive and produce better quality products.
By studying the current research in the area of software quality and software development processes, by critically examining the development process used for the customer project, and by examining the defects produced by this process, I aim to come to a better understanding of what is required to improve the testing and development processes for a Windows-based, event-driven system, and hopefully to glean enough knowledge to be able to adapt and improve the development process to suit the new development environments of the future.
As businesses become more dynamic and flexible, they frequently find that their existing information systems act as bottlenecks to change. These information systems, based on monolithic architectures, and running on outdated platforms can handicap a business. This paper examines approaches to legacy system migration. Approaches considered include redeveloping all applications from scratch based on a new architecture, and incremental migration approaches which plot a migration path from the existing legacy systems to the desired new platform and architecture.
Objective: given a geographic region, expressed as a polygonal area, and a radio cellular network coverage map, represented by a point for each cell's antenna location and a polygonal area (around this point) for each cell's radio coverage area: what is the best subset of cells to choose out of the complete set of cells which will provide the optimal coverage of the geographic region?
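Discretising the region into demand points turns this into a set cover problem, which is NP-hard; a standard greedy heuristic can be sketched as follows. The discretisation and the toy cell sets in the usage note are illustrative assumptions, not the project's actual formulation.

```python
def greedy_cover(region, cells):
    """Greedy approximation for the cell-selection problem: repeatedly
    pick the cell covering the most still-uncovered demand points.
    `region` is a set of demand points; `cells` maps a cell id to the
    set of points it covers. The greedy heuristic carries the classic
    ln(n) approximation guarantee for set cover.
    Returns (chosen cell ids, points left uncovered)."""
    uncovered = set(region)
    chosen = []
    while uncovered:
        best = max(cells, key=lambda c: len(cells[c] & uncovered))
        if not cells[best] & uncovered:
            break  # no remaining cell covers anything new
        chosen.append(best)
        uncovered -= cells[best]
    return chosen, uncovered
```

For example, with demand points {1..5} and cells A = {1,2,3}, B = {3,4}, C = {4,5}, the greedy choice is A then C, covering everything with two cells.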
In this report, we concentrate on investigating the evolutionary behaviour and the statistical properties of a typical 2D froth cellular network. We introduce the current four main methods of constructing 2D froth models and concentrate in particular on obtaining results for the direct simulation and the Monte Carlo method for various different system sizes. We compare the results of related distribution functions and correlation functions obtained by these methods with known experimental results. Overall agreement is achieved for small system size; for the larger system size, the Monte Carlo method is found to be more flexible than the direct simulation. For the further consideration of the 3D case, we discuss the theory and possibilities of extending the computer simulations.
This paper describes an approach to information retrieval based on a syntactic analysis of the document texts and user queries and, from that analysis, the construction of tree structures (TSAs) to encode and capture language ambiguities. TSAs are constructed at the clause level, and thus each document can yield many TSAs and each query may be represented by several TSAs. The TSAs from documents and from queries are then matched, and the degrees of overlap between individual TSAs are computed and then aggregated to yield a score for each document, which is then used in ranking the collection. This paper presents the system description when benchmarking our retrieval strategy on category B of TREC-3, i.e. on c.550 Mbytes of the Wall Street Journal newspaper texts. The implementation is based on a two-stage retrieval in which a statistically-based pre-fetch retrieves the set of WSJ articles for the more computationally expensive language-based processing. The results of our retrieval system in terms of precision and recall are disappointing, and an analysis of why this is so is also included. Part of this analysis includes a direct comparison between our system and some mainstream IR approaches. In addition to performing ad hoc retrieval on texts in English, we have also performed ad hoc retrieval on texts in Spanish using a weighted trigram approach; this is outlined and performance results are given in an appendix.
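One simple weighted trigram matching scheme, of the general kind used for the Spanish texts, can be sketched as a Dice coefficient over character trigram counts; the paper's exact weighting may well differ, so this is illustrative only.

```python
from collections import Counter

def trigrams(text):
    """Character trigram counts, with padding so that word
    boundaries also contribute trigrams."""
    t = f"  {text.lower()} "
    return Counter(t[i:i + 3] for i in range(len(t) - 2))

def trigram_similarity(a, b):
    """Dice coefficient over trigram multisets: 2|A & B| / (|A| + |B|).
    Robust to inflectional variation and accents/typos, which makes it
    attractive for morphologically rich languages such as Spanish."""
    ta, tb = trigrams(a), trigrams(b)
    overlap = sum((ta & tb).values())
    total = sum(ta.values()) + sum(tb.values())
    return 2.0 * overlap / total if total else 0.0
```

Identical strings score 1.0, near-variants score high, and unrelated terms score near 0, giving a graded ranking signal without any linguistic analysis.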
Despite the advantages associated with the use of formal (mathematical) techniques for the specification of computerised systems, industry has been slow to abandon existing techniques. There are several reasons for this, not least of which are the lack of training in formal techniques and the perceived difficulty associated with their use. One area of much recent research aimed at overcoming these difficulties has focused on the possibility of integrating formal techniques with currently used specification techniques, particularly the semi-formal graphical techniques of Structured Analysis. An integration of formal techniques and graphical techniques (or bridging of the formality gap) would allow for an easy evolution to the use of formal techniques for systems development. It would also allow the advantages of both techniques to be exploited. This paper describes a method for integrating Real-Time Structured Analysis with the formal specification technique LOTOS. The approach suggested uses the Real-Time Structured Analysis specification as a cognitive vehicle for producing a LOTOS specification, and is illustrated by means of a small case study.