Computing and Information Systems - Theses

Search Results

Now showing 1 - 10 of 13
  • Item
    Modelling human behaviour with BDI agents
    Norling, Emma Jane ( 2009)
    Although the BDI framework was not designed for human modelling applications, it has been used with considerable success in this area. The work presented here examines some of these applications to identify the strengths and weaknesses of the use of BDI-based frameworks for this purpose, and demonstrates how these weaknesses can be addressed while preserving the strengths. The key strength that is identified is the framework's folk-psychological roots, which facilitate the knowledge acquisition and representation process when building models. Unsurprisingly, because the framework was not designed for this purpose, several shortcomings are also identified. These fall into three different classes. Firstly, although the folk-psychological roots mean that the framework captures a human-like reasoning process, it is at a very abstract level. There are many generic aspects of human behaviour - things that are common to all people across all tasks - which are not captured in the framework. If a modeller wishes to take these things into account in a model, they must explicitly encode them, replicating this effort for every model. To reduce modellers' workload and increase consistency, it is desirable to incorporate such features into the framework. Secondly, although the folk-psychological roots facilitate knowledge acquisition, there is no standardised approach to this process, and without experience it can be very difficult to gather the appropriate knowledge from the subjects to design and build models. And finally, these models must interface with external environments in which they 'exist.' There are often mismatches in the data representation level which hinder this process. This work makes contributions to dealing with each of these problems, drawing largely on the folk-psychological roots that underpin the framework. 
The major contribution is a systematic approach to extending the BDI framework to incorporate further generic aspects of human behaviour, demonstrated with two different extensions. A further contribution is a knowledge acquisition methodology that gives modellers a structured approach to this process. The problems at the agent-environment interface are not straightforward to solve, because sometimes the problem lies in the way that the environment exchanges data. Rather than offering a universal solution to this problem, the contribution provided here is to highlight the different types of mismatch that may occur, so that modellers may recognise them early and adapt their approach to accommodate them.
  • Item
    A framework for valuing the quality of customer information
    Hill, Gregory ( 2009)
    This thesis addresses a widespread, significant and persistent problem in Information Systems practice: under-investment in the quality of customer information. Many organisations require clear financial models before they will invest in their information systems and related processes. However, there is no widely accepted approach to rigorously articulating the costs and benefits of potential quality improvements to customer information. The result can be poor-quality customer information that undermines wider organisational goals. To address this problem, I develop and evaluate a framework for producing financial models of the costs and benefits of customer information quality interventions. These models can be used to select and prioritise among multiple candidate interventions across various customer processes and information resources, and to build a business case for the organisation to make the investment. The research process involved:
- the adoption of Design Science as a suitable research approach, underpinned by a Critical Realist philosophy;
- a review of scholarly research in the Information Systems sub-discipline of Information Quality, focusing on measurement and valuation, along with topics from relevant reference disciplines in economics and applied mathematics;
- a series of semi-structured context interviews with practitioners (including analysts, managers and executives) in a number of industries, examining information quality measurement, valuation and investment;
- a conceptual study using the knowledge from the reference disciplines to design a framework incorporating models, measures and methods to address these practitioner requirements;
- a simulation study to evaluate and refine the framework by applying synthetic information quality deficiencies to real-world customer data sets and decision processes in a controlled fashion; and
- an evaluation of the framework against published criteria recommended by scholars, establishing that it is a purposeful, innovative and generic solution to the problem at hand.
  • Item
    Proactive traffic control strategies for sensor-enabled cars
    Wang, Ziyuan ( 2009)
    Traffic congestion and accidents are major concerns in today’s transportation systems. This thesis investigates how to improve traffic throughput by reducing or eliminating bottlenecks on highways, in particular for merging situations such as intersections where a ramp leads onto the highway. In our work, cars are equipped with sensors that can measure the distance to neighboring cars, and communicate their velocity and acceleration readings to one another. Sensor-enabled cars can locally exchange sensed information about the traffic and adapt their behavior much earlier than regular cars. We propose proactive algorithms for merging different streams of sensor-enabled cars into a single stream. A proactive merging algorithm decouples the decision point from the actual merging point: sensor-enabled cars allow us to decide where and when a car merges before it arrives at the actual merging point. This leads to a significant improvement in traffic flow, as velocities can be adjusted appropriately. We compare proactive merging algorithms against the conventional priority-based merging algorithm in a controlled simulation environment. Experimental results show that proactive merging algorithms outperform the priority-based merging algorithm in terms of flow and delay. More importantly, imprecise information (errors in sensor measurements) is a major challenge for merging algorithms, because inaccuracies can potentially lead to unsafe merging behaviors. In this thesis, we investigate how the accuracy of sensors impacts merging algorithms, and design robust merging algorithms that tolerate sensor errors. Experimental results show that one of our proposed merging algorithms, based on the theory of time geography, guarantees safe merging while tolerating two to four times more imprecise positioning information, and can double the road capacity and increase the traffic flow by 25%.
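The central idea of proactive merging — deciding where and when a car merges before it reaches the merge point — can be sketched in a few lines. The following toy is illustrative only (not the thesis's algorithm): each car's speed bounds imply an arrival-time window at the merge point, and merge slots are assigned greedily with a fixed safety headway.

```python
def arrival_window(distance, v_min, v_max):
    """Earliest/latest possible arrival time at the merge point."""
    return distance / v_max, distance / v_min

def assign_merge_slots(cars, headway=2.0):
    """Greedily assign non-conflicting merge times, or return None.

    Each car is (name, distance_to_merge_m, v_min_mps, v_max_mps).
    Cars are served in order of earliest possible arrival; consecutive
    slots are separated by at least `headway` seconds.
    """
    windows = []
    for name, d, v_min, v_max in cars:
        earliest, latest = arrival_window(d, v_min, v_max)
        windows.append((earliest, latest, name))
    windows.sort()

    slots, t = {}, 0.0
    for earliest, latest, name in windows:
        t = max(t, earliest)   # cannot arrive sooner than physics allows
        if t > latest:         # window missed: no safe proactive plan exists
            return None
        slots[name] = t
        t += headway           # enforce safe headway at the merge point
    return slots
```

When no feasible slot assignment exists, the function reports failure instead of producing an unsafe plan — a crude analogue of the safety guarantee the thesis derives from time geography.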
  • Item
    Mining surprising patterns
    KUO, YEN-TING ( 2009)
    From the perspective of an end-user, patterns derived during the data mining process are not always interesting. The mining of unexpected patterns is a computational technique introduced in earlier work to address this problem. However, such unexpected patterns are not necessarily surprising to the user. In this thesis, we show that the quality of a user's knowledge, encoded in computational form, is key to bridging the gap between unexpected and surprising patterns. The thesis presents an approach that reduces this gap by exploiting a synergy between existing techniques utilised in data mining. Key to the new approach are (1) the employment of a domain ontology to guide the mining of association rules, (2) an encoding of users' knowledge using a Bayesian network representation, and (3) a probabilistic model to generate explanations for unexpected rules. The methods are tested on real-world data in two domains with users who are domain experts. In the medical domain, a dataset of chronic kidney disease patients is mined with a nephrologist; in the educational domain, a dataset of a decimal comparison test taken by children is mined with two education researchers. Surprising patterns were successfully discovered in both domains. Further gaps, identified during the investigation, are captured in the discussion of the case studies. Overall, the surprisingness problem needs to be tackled from the aspects of knowledge representation, knowledge acquisition, interpretation assistance, and prevention of meaningless rules. A lack of sufficient information about rules is found to be a major cause of meaningless rules, a problem that arises outside the scope of rule ranking. We conclude that the surprisingness problem warrants further research beyond the scope of this thesis.
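To make the gap between "unexpected" and "surprising" concrete, here is a minimal sketch with hypothetical numbers, not drawn from the thesis: the user's encoded knowledge (in the thesis, a Bayesian network) yields an expected confidence for a mined rule, and the rule becomes a candidate surprising pattern when observed confidence diverges from that expectation. The chain network A → B → C below stands in for a real belief network.

```python
def p_c_given_a(p_b_given_a, p_c_given_b, p_c_given_not_b):
    """P(C|A) in a belief chain A -> B -> C, marginalising over B."""
    return (p_b_given_a * p_c_given_b
            + (1 - p_b_given_a) * p_c_given_not_b)

def is_surprising(observed_conf, believed_conf, threshold=0.2):
    """Flag a mined rule A -> C when its observed confidence diverges
    strongly from the confidence implied by the user's beliefs."""
    return abs(observed_conf - believed_conf) > threshold
```

For example, if the user believes P(B|A)=0.9, P(C|B)=0.8 and P(C|¬B)=0.1, the expected confidence of A → C is 0.73, so an observed confidence of 0.30 is surprising while 0.70 is not.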
  • Item
    Utility-oriented internetworking of content delivery networks
    Pathan, Al-Mukaddim Khan ( 2009)
    Today’s Internet content providers primarily use Content Delivery Networks (CDNs) to deliver content to end-users with the aim to enhance their Web access experience. Yet the prevalent commercial CDNs, operating in isolation, often face resource over-provisioning, degraded performance, and Service Level Agreement (SLA) violations, thus incurring high operational costs and limiting the scope and scale of their services. To move beyond these shortcomings, this thesis sets out to establish the basis for developing advanced and efficient content delivery solutions that are scalable, high performance, and cost-effective. It introduces techniques to enable coordination and cooperation between multiple content delivery services, which we term “CDN peering”. In this context, this thesis addresses five key issues ― when to peer (triggering circumstances), how to peer (interaction strategies), whom to peer with (resource discovery), how to manage and enforce operational policies (request-redirection and load sharing), and how to demonstrate peering applicability (measurement study and proof-of-concept implementation).
Thesis Contributions: To support the thesis that the resource over-provisioning and degraded performance problems of existing CDNs can be overcome, thus improving the Web access experience of Internet end-users, we have:
- identified the key research challenges and core technical issues for CDN peering, along with a systematic understanding of the CDN space covering relevant applications, features and implementation techniques, captured in a comprehensive taxonomy of CDNs;
- developed a novel architectural framework that provides the basis for CDN peering, formed by a set of autonomous CDNs that cooperate through an interconnection mechanism, providing the infrastructure and facilities to virtualize the services of multiple providers;
- devised Quality-of-Service (QoS)-oriented analytical performance models that demonstrate the effects of CDN peering and predict end-user perceived performance, helping a CDN provider make concrete QoS guarantees;
- developed enabling techniques, i.e. resource discovery, server selection, and request-redirection algorithms, for CDN peering to achieve service responsiveness; these techniques are exercised to alleviate imbalanced load conditions while minimizing redirection cost;
- introduced a utility model for CDN peering that measures its content-serving ability by capturing the traffic activities in the system, evaluated through extensive discrete-event simulation analysis; the findings provide incentives to exploit critical parameters for a better CDN peering system design; and
- demonstrated a proof-of-concept implementation of the utility model and an empirical measurement study on MetaCDN, a global overlay for Cloud-based content delivery, augmented with a utility-based redirection scheme to improve traffic activities across MetaCDN's world-wide distributed network.
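The flavour of utility-based request-redirection between peered CDNs can be conveyed with a toy sketch. The utility function below — load headroom weighted against network proximity — is an illustrative stand-in for the thesis's utility model, and all weights and field names are hypothetical.

```python
def utility(server, weight_load=0.7, weight_proximity=0.3):
    """Score a surrogate: idle, nearby servers score highest."""
    load_headroom = 1.0 - server["load"]            # 0 (saturated) .. 1 (idle)
    proximity = 1.0 / (1.0 + server["rtt_ms"] / 100.0)
    return weight_load * load_headroom + weight_proximity * proximity

def redirect(servers):
    """Send the request to the peered surrogate with the highest utility."""
    return max(servers, key=utility)
```

With these weights a heavily loaded local surrogate loses to a more distant but lightly loaded peer — the load-sharing behaviour that peering is meant to enable.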
  • Item
    ARTS: Agent-Oriented Robust Transactional System
    WANG, MINGZHONG ( 2009)
    Internet computing enables the construction of large-scale and complex applications by aggregating and sharing computational, data and other resources across institutional boundaries. The agent model can address the ever-increasing challenges of scalability and complexity, driven by the prevalence of Internet computing, through its intrinsic properties of autonomy and reactivity, which support the flexible management of application execution in distributed, open, and dynamic environments. However, the non-deterministic behaviour of autonomous agents leads to a lack of control, which complicates exception management and thus threatens the robustness and reliability of the system, because improperly handled exceptions may cause unexpected system failures and crashes. In this dissertation, we investigate and develop mechanisms to integrate intrinsic support for concurrency control, exception handling, recoverability, and robustness into multi-agent systems. The research covers agent specification, planning and scheduling, execution, and overall coordination, in order to reduce the impact of environmental uncertainty. Simulation results confirm that our model can improve the robustness and performance of the system, while relieving developers of the low-level complexity of exception handling. We first provide a survey, along with a taxonomy, of existing proposals and approaches for building robust multi-agent systems, highlighting the merits and limitations of each category. Next, we introduce the ARTS (Agent-Oriented Robust Transactional System) platform, which allows agent developers to compose recursively-defined, atomically-handled tasks to specify scoped and hierarchically-organized exception-handling plans for a given goal. ARTS then supports automatic selection, execution, and monitoring of appropriate plans in a systematic way, for both normal and recovery executions.
Moreover, we propose multiple-step backtracking, which extends existing step-by-step plan reversal, to serve as the default exception handling and recovery mechanism in ARTS. This mechanism utilizes previous planning results in determining the response to a failure, and allows a substitute path to start prior to, or in parallel with, the compensation process, letting an agent achieve its goals more directly and efficiently. ARTS helps developers focus on high-level business logic and frees them from the low-level complexity of exception management. One reason exceptions occur in a multi-agent system is that agents are unable to adhere to their commitments. We propose two scheduling algorithms for minimising such exceptions when commitments are unreliable. The first is trust-based scheduling, which incorporates the concept of trust – the probability that an agent will comply with its commitments – along with the constraints of system budget and deadline, to improve the predictability and stability of the schedule. Trust-based scheduling supports runtime adaptation and evolution of the schedule by interleaving the processes of evaluation, scheduling, execution, and monitoring in the life cycle of a plan. The second is commitment-based scheduling, which focuses on the interaction and coordination protocol among agents, and augments agents with the ability to reason about and manipulate their commitments. Commitment-based scheduling supports the refactoring and parallel execution of commitments to maximize the system's overall robustness and performance. While the first algorithm must be performed by a central coordinator, the second is designed to be distributed and embedded in individual agents. Finally, we discuss the integration of our approaches into Internet-based applications, to build flexible but robust systems.
Specifically, we discuss the designs of an adaptive business process management system and of robust scientific workflow scheduling.
  • Item
    The adoption of advanced mobile commerce services by individuals: investigating the impact of the interaction between the consumer and the mobile service provider
    AlHinai, Yousuf Salim (The University of Melbourne, 2009)
    This research investigates the impact of the interaction between the consumer and the mobile service provider on the adoption of advanced mobile commerce services by existing consumers of mobile technology. Two interaction factors are investigated: 1) Perceived Relationship Quality (PRQ), the consumer’s evaluation of the quality of his/her relationship with the mobile service provider, and 2) Perceived Value of the Adoption Incentive (PVI), the consumer’s evaluation of the value of incentives offered by the service provider to entice him/her to adopt the mobile service. The influence of these factors on consumer attitudes and intentions towards adopting mobile commerce services is studied and compared with three other well-known adoption factors: perceived usefulness, ease of use and subjective norm. This study was undertaken in three parts. Firstly, a conceptual study was conducted to investigate and analyse the existing literature on consumer adoption of mobile commerce services. This phase started with a general review of the existing studies using a novel model: the Entities-Interactions Framework (EIF). The EIF explains adoption behaviour in terms of interactions between the consumer and the other entities, including the mobile service, the service provider and the social system. This framework was used to analyse the extent to which important adoption factors have been covered by past research and thereby identify the research questions. The conceptual study resulted in the development of a research model and relevant hypotheses. Secondly, a large-scale questionnaire survey was conducted to test the research model and the proposed hypotheses. This part of the research helped give a broad picture of the influence of consumer-service provider factors on consumer adoption of mobile commerce services.
Thirdly, face-to-face interviews with mobile phone users were conducted to validate the survey results and provide an understanding of the mechanisms that govern the impact of the investigated factors. The research found that PRQ and PVI have an important influence on the attitudes and intentions of existing mobile phone users towards accepting and using advanced mobile commerce services. Furthermore, the research found that these newly introduced factors influence consumer adoption perceptions more strongly than other well-established factors. The study enriches our understanding of technology adoption by individuals because it explains why an existing user of a technology, such as mobile technology, will or will not adopt advanced versions of that technology. The findings affirm that in the context of communication technologies, which are interactive by nature, understanding the interaction between consumers and service providers is key to understanding the progressive adoption by consumers of advanced forms of these technologies. The thesis provides practitioners (particularly mobile service providers) with a better understanding of the impact and implications of their interaction with consumers on consumers’ acceptance and use of mobile services. The study emphasises the importance of incorporating this understanding throughout the mobile service provision process, from the conceptualisation of the service to its actual provision to the market. The study also offers a novel view of each mobile service offer as a consequence of the previous offer and a precursor of the next, in order to enhance consumer adoption of mobile services in the short and long run.
  • Item
    Provider recommendation based on client-perceived performance
    Thio, Niko ( 2009)
    In recent years the service-oriented design paradigm has enabled applications to be built by incorporating third-party services. With the increasing popularity of this paradigm, many companies and organizations have adopted the technology, increasing the number and variety of third-party providers. With the vast improvement of global networking infrastructure, a large number of providers offer their services to worldwide clients. As a result, clients are often presented with a number of providers that offer services with the same or similar functionality but differ in non-functional attributes (or Quality of Service – QoS), such as performance. In this environment, provider recommendation has become more important in assisting clients to choose the provider that meets their QoS requirements. In this thesis we focus on provider recommendation based on one of the most important QoS attributes – performance. Specifically, we investigate client-perceived performance: the application-level performance measured at the client side every time the client invokes the service. This metric has the advantage of accurately representing client experience, compared with the server-side metrics widely used in current frameworks (e.g. Service Level Agreements, or SLAs, in the Web Services context). As a result, provider recommendation based on this metric is favourable from the client’s point of view. In this thesis we address two key research challenges related to provider recommendation based on client-perceived performance: performance assessment and performance prediction. We begin by identifying heterogeneity factors that affect client-perceived performance among clients in a global Internet environment. We then perform extensive real-world experiments to evaluate the significance of each factor for client-perceived performance.
From our findings on heterogeneity factors, we then develop a performance estimation technique to address performance assessment for cases where direct measurements are unavailable. This technique is based on generalization, i.e. estimating performance from measurements gathered by similar clients. A two-stage grouping scheme based on the heterogeneity factors identified earlier is proposed to determine client similarity. We then develop an estimation algorithm and validate it using synthetic data as well as real-world datasets. With regard to performance prediction, we focus on medium-term prediction to address an emerging requirement: distinguishing providers based on medium-term (e.g. one to seven days) performance. Such applications arise when providers require clients to subscribe before accessing the service. Another situation where medium-term prediction is important is temporal-aware selection: providers need to be differentiated based on the expected performance over a particular time interval (e.g. during business hours). We investigate the applicability of classical time series prediction methods – ARIMA and exponential smoothing, as well as their seasonal counterparts, seasonal ARIMA and Holt-Winters. Our results show that these existing models lack the ability to capture the important characteristics of client-perceived performance, and thus produce poor medium-term predictions. We then develop a medium-term prediction method that is specifically designed to account for the key characteristics of a client-perceived performance series, and show that it produces higher accuracy for medium-term prediction than the existing methods.
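Why a non-seasonal smoother struggles at medium horizons can be shown with a minimal stdlib sketch on synthetic data (the thesis's own predictor is more elaborate): simple exponential smoothing forecasts a flat line, which flattens any daily cycle that a seasonal-naive forecast preserves.

```python
def ses_forecast(series, horizon, alpha=0.3):
    """Simple exponential smoothing: a flat forecast at the last level."""
    level = series[0]
    for x in series[1:]:
        level = alpha * x + (1 - alpha) * level
    return [level] * horizon

def seasonal_naive_forecast(series, horizon, period):
    """Repeat the last full seasonal cycle."""
    cycle = series[-period:]
    return [cycle[i % period] for i in range(horizon)]

def mae(pred, actual):
    """Mean absolute error of a forecast against observed values."""
    return sum(abs(p - a) for p, a in zip(pred, actual)) / len(actual)
```

On a perfectly periodic series the seasonal-naive forecast has zero error while the flat SES forecast cannot, which is the intuition behind favouring seasonality-aware models for one- to seven-day horizons.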
To demonstrate the applicability of our solution in practice, we developed a provider recommendation framework based on client-perceived performance (named PROPPER), which utilizes our findings on performance assessment and prediction. We formulated the recommendation algorithm and evaluated it through a mirror selection case study. Our framework is shown to produce better outcomes in most cases than country-based or geographic distance-based selection schemes, the prevailing approaches to mirror selection.
  • Item
    The case for mobile trajectory – a practical 'theory' for mobile work
    GRAHAM, CONNOR CLIVE ( 2009)
    This thesis progressively evolves and presents a practical 'theory' for mobile work – mobile trajectory – through three case studies conducted using fieldwork. The three cases examine tram travellers finding their way around a city centre (Case A), health care workers looking after people with mental illness in a residential setting (Case B) and mobile clinicians caring for young people with mental illness in a community setting (Case C). My concern is to develop a 'theory' for mobile work that is both practical and theoretical: at once supporting the practical action of completing field and analytic work while abstracting away from the ordinary affairs of society. The contribution of this 'theory' is to synthesise ideas from studies of ICTs and mobile work to support description, rhetoric, inference and application for mobile work. This 'theory' has particular COMPONENTS, FEATURES, PROPERTIES, CONCERNS and ASSOCIATED NOTIONS. A mobile trajectory has a CORE TRAJECTORY that involves particular work: the CORE WORK. There are ALIGNED TRAJECTORIES that feed, and are part of, the CORE TRAJECTORY. The FEATURES of mobile trajectory are CYCLES, TRANSITIONS, TRAVERSALS, STREAMS, SCHEMES, POSSIBILITIES, HISTORICITY and SHAPE. The PROPERTIES are PHYSICALITY, LOCALITY, INSTRUMENTALITY, SYNCHRONICITY, INTERDEPENDENCY, PREDICTABILITY and PALPABILITY. Important CONCERNS are RECONCILIATION CONCERNS, ALIGNMENT CONCERNS, RECIPROCAL CONCERNS and CONTINGENCY CONCERNS. Key ASSOCIATED NOTIONS are SOCIAL SPHERES with particular WORLDS and SUB-WORLDS comprising MEMBERS with particular ROLES and INVOLVEMENT. SOCIAL SPHERES have particular BOUNDARIES, RESOURCES and MEDIA and shared KNOWLEDGE and PRACTICES. MEDIA and RESOURCES have particular AVAILABILITY and MUTABILITY. MEMBERS have particular BIOGRAPHIES, TIES and OBLIGATIONS and AWARENESS of others.
Through the case material presented I demonstrate how this 'theory' supports the work of describing and discussing mobile work for the purpose of conceptualising, selecting, recommending and critically evaluating everyday Information and Communication Technologies. At the end of the thesis I compare mobile trajectory to three alternative approaches and two alternative theories with regard to supporting the same kind of work.
  • Item
    Acquiring plans within situated, resource-bounded agents: a hybrid, BDI-based approach
    Karim, Samin M. R. ( 2009)
    The BDI model is a widely accepted model for situated, resource-bounded agents. Intentions are a key component of the BDI model: they constrain the agent’s commitment to achieving its desires and its execution of behaviours. These behaviours are fulfilled via plans, an abstract representation of behaviour. Plans specify a course of action to achieve a particular goal in a given context. Acquiring plans is a complex problem, owing to the complex relationships between a plan's goal and context and the presence of multiple action steps in a plan. The BDI model does not feature a plan acquisition capability or, more generally, a knowledge acquisition capability. Acquiring atomic knowledge is comparatively straightforward, as reinforcing and managing such knowledge is less complex than for plans. This thesis presents an approach to plan acquisition for agents based on the BDI model combined with a ‘bottom-level’ learner. The system, which we call PGS (plan generation system), essentially transforms knowledge from the bottom level into the BDI-based top level. PGS manages the top-level and bottom-level interactions whilst managing action execution. These processes all occur at run-time. We first explain concepts that are central to the thesis, and then describe related work that goes towards achieving the thesis aims. We then describe the PGS architecture, the results from case study experiments that have been conducted, and a discussion of these results and the main thesis outcomes.
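For readers unfamiliar with BDI plans, the goal/context/body structure the abstract refers to can be sketched generically (this is the standard BDI plan-selection pattern, not PGS itself; names and beliefs are invented for the example):

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Plan:
    goal: str                           # the goal this plan achieves
    context: Callable[[dict], bool]     # applicability test over beliefs
    body: Callable[[dict], str]         # the course of action

def select_plan(plans, goal, beliefs):
    """Return the first plan for `goal` whose context holds in `beliefs`."""
    for plan in plans:
        if plan.goal == goal and plan.context(beliefs):
            return plan
    return None  # no applicable plan: this is where a learned plan could be acquired
```

The `return None` branch marks the gap the thesis addresses: a standard BDI agent has no way to generate a new plan here, whereas PGS hands the situation to a bottom-level learner.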