Abstracts and Presentations

This page lists the abstracts of all presentations at Complex'07. Where authorisation has been given, the presentations are also available, usually in Adobe Acrobat (pdf) and Apple QuickTime movie (mov) formats.

The abstracts can also be obtained in Adobe Acrobat (pdf) format by downloading the Handbook & Abstracts.

Permission is granted for this material, presented at the 8th Asia-Pacific Complex Systems Conference, to be shared for non-commercial, educational purposes. To disseminate otherwise or to republish requires written permission from the author(s).

Contents

Plenaries
Anomalous Diffusion
Business and Economics
Complex Systems Engineering
Complex Systems in the Earth Sciences
Complexity in Energy, Water, and Urban Development
Computational Modelling for Biology and Chemistry
Defence and Security
Social Networks and Epidemiology
Social Science and Management
Turbulence
General Track
Poster Session
Software Demonstrations

Plenaries

Are fractal skeletons the explanation for the plankton paradox and narrowing of arteries due to cell trapping in a disturbed blood flow?
Celso Grebogi (Institute for Complex Systems, King's College, University of Aberdeen, Scotland, United Kingdom)

Carbon-climate-human interactions as a complex system [pdf]
Mike Raupach (CSIRO)

Complex systems challenges in health care [pdf]
Michael Ward (Central Clinical Division, School of Medicine, The University of Queensland)

Complex Systems Perspective on the Revolution in Human Performance Optimization [pdf]
Kenneth Boff (Air Force Research Laboratory)

Controlling complex resources over different timeframes in process control
Penelope Sanderson (Cognitive Engineering Research Group, The University of Queensland)

Evolution Toward Enterprise Systems Engineering (If You don't Have a Billion Years, Will a Billion Dollars Do?) [pdf]
Joseph DeRosa (Director Systems Engineering, MITRE Corporation, United States of America)

Exponential random graph models for social networks [mov] [pdf]
Philippa Pattison (School of Behavioural Science, University of Melbourne)

From words to meanings - Human knowledge as a complex system
Andrew Smith (Leximancer)

Structure and dynamics of complex networks [mov]
Hawoong Jeong (Department of Physics, Korea Advanced Institute of Science and Technology, Republic of Korea)

Synchronisation and emergent intelligence in networked agents [pdf]
Akira Namatame (Department of Computer Science, National Defense Academy of Japan)

Systems-level metabolic engineering of bacteria using genome-scale in silico models
Sang Yup Lee (Dept of Chemical & Biomolecular Engineering, Korea Advanced Institute of Science and Technology, Republic of Korea)

The colourful geometry of nature
Michael Barnsley (The Australian National University)

Anomalous Diffusion

A Fractional Cable Equation for Anomalous Electrodiffusion in Nerve Cells [pdf]
Bruce Henry (The University of New South Wales), Trevor Langlands (The University of New South Wales), Susan Wearne (Mount Sinai School of Medicine)

Anomalous diffusion with linear reaction dynamics [pdf]
Trevor Langlands (The University of New South Wales), Bruce Henry (The University of New South Wales), Susan Wearne (Mount Sinai School of Medicine)

Fractional reaction diffusion along flow lines [mov] [pdf]
Boris Baeumer (The University of Otago, New Zealand)

Heat Conduction in Nonlinear Systems
Bambi Hu (Hong Kong Baptist University and University of Houston)

Nonlocal heat transport in laser-produced plasmas [pdf]
Frank Detering (The Australian National University)

Simulation study on heat conduction and beyond
Nobuyasu Ito (The University of Tokyo)

Business and Economics

'Discovering' Small Worlds in Potentially Biased Networks: A Methodological Critique [mov] [pdf]
Sam MacAulay (ARC Centre for Complex Systems, The University of Queensland), John Steen (ARC Centre for Complex Systems, The University of Queensland), Tim Kastelle (ARC Centre for Complex Systems, The University of Queensland)

A Review of Design Approaches Within Schumpeterian Economic Simulations [mov] [pdf]
Craig Lynch (Macquarie University)

ACE modelling: does size matter?
Paul Davis (Macquarie University)

Agent-based design considerations to ensure behaviour is emergent: A Labour market simulation using RePast
Paul Davis (Macquarie University)

Automatic Extraction and Modelling of Human Knowledge Networks from Natural Language using a Complex Systems Approach
Andrew Smith (Leximancer), Michael Humphreys (School of Psychology, The University of Queensland), Bettina Cornwell (School of Business, The University of Queensland)

Breaching Walras's Law: a first step to modelling endogenous money [pdf]
Steve Keen (School of Economics & Finance, The University of Western Sydney)

Combining System Dynamics and Choice Modelling to Simulate Demand Effects of Integrated Customer-Centric Marketing and Revenue Management [mov] [pdf]
Christine Mathies (The University of New South Wales)

Introduction to an Agent-Based Model of Development Processes in Tanzania
Brett Parris (Dept of Econometrics & Business Statistics, Monash University; and World Vision Australia)

Mean Bad Birds versus Kind Friendly Chickens: Group Selection and the Evolution of Cooperation [mov] [pdf]
Ian Wilkinson (The University of New South Wales), Dan Ladley (The University of Leeds, United Kingdom), Louise Young (The University of Technology, Sydney)

Minimalism and model-building: an assured model of the exchanges between consumers, retailers and manufacturers
David Midgley (INSEAD, France), Robert Marks (The Australian Graduate School of Management), Daniel Klapper (The University of Frankfurt, Germany), Dinesh Kunchamwar (Barclays Capital, Singapore)

Modeling innovation changes in business networks [mov] [pdf]
Sharon Purchase (The University of Western Australia), Doina Olaru (The University of Western Australia), Sara Denize (The University of Western Sydney)

Performance metrics: Towards an uncertainty principle for organizations [pdf]
Bill Lawless (Paine College)

The Cost of Information Acquisition in a Supply Network of Rationally Bounded Negotiating Agents [pdf]
Rodolfo Garcia-Flores (CSIRO Mathematical and Information Sciences), Nectarios Kontoleon (CSIRO Mathematical and Information Sciences), Rene Weiskircher (CSIRO Mathematical and Information Sciences), Simon Dunstall (CSIRO Mathematical and Information Sciences)

The Emergence of New Markets [pdf]
Ella Reeks (The University of Queensland)

The Use of Genetic Algorithms for Modelling the Behaviour of Boundedly Rational Agents in Economic Environments: Some Theoretical and Computational Considerations [pdf]
Janice Gaffney (The University of Adelaide), Charles Pearce (The University of Adelaide), Scott Wheeler (Defence Science and Technology Organisation)

Using Kauffman
Ian Wilkinson (The University of New South Wales), James Wiley (Victoria University of Wellington, New Zealand)

Complex Systems Engineering

"So...," asks the Chief Engineer "What do I go do?"
Douglas Norman (The MITRE Corporation)

A Self-Organising Sensing System for Structural Health Management [mov] [pdf]
Nigel Hoschke (CSIRO Industrial Physics and The University of New South Wales), Don Price (CSIRO Industrial Physics)

Architecture Characteristics Required to Support an Evolvable Model Based Systems Engineering Environment [mov] [pdf]
Christian Ross, Peter Campbell (Defence and Systems Institute, The University of South Australia)

Architecture Trade-Off Analysis and Multi-Objective Optimization Strategies
Lars Grunske (The University of Queensland)

Designing complex interactions - a people centred perspective [pdf]
Peter Johnson (University of Bath, United Kingdom)

Evaluation of Conceptual Models for Agent Based Representation of Behaviour in a Simulation of the Capability Development Process [mov] [pdf]
Kathy Darzanos (Defence and Systems Institute, The University of South Australia), Peter Campbell (Defence and Systems Institute, The University of South Australia), Stephen Cook (Defence and Systems Institute, The University of South Australia)

Exploiting CAS as a 'Force Multiplier' - Its Application to Policy, Acquisition, Assessment and Operational Employment [pdf]
Patrick Beautement (QinetiQ, United Kingdom)

Exploration of non-reductionist models of service ecosystems
Peter Bruza (Queensland University of Technology), Kirsty Kitto, Alistair Barros

Exploring Social complexity and concept evolution in war games using Dialogue Mapping [pdf]
Cherylne Fleming (Defence Science and Technology Organisation)

Implementing Adaptive Campaigning Through Force Development
Tim Pickford (Force Development Group)

Morphogenic Systems Engineering for Self-Configuring Networks [extended abstract] [mov] [pdf]
Shane Magrath (Defence Science and Technology Organisation)

Simulation engine for strategic planning of health services using Agent Based Modeling
Ashok Kanagarajah (ARC Centre for Complex Systems, The University of Queensland), Peter Lindsay (ARC Centre for Complex Systems, The University of Queensland), David Parker (The University of Queensland Business School)

Towards A Complex Systems Model Of Australian Air Traffic Management
Ariel Liebman (ARC Centre for Complex Systems, The University of Queensland), Peter Lindsay (ARC Centre for Complex Systems, The University of Queensland), Colin Ramsay (ARC Centre for Complex Systems, The University of Queensland), Martijn Mooij (The Key Centre for Human Factors, The University of Queensland)

Complex Systems in the Earth Sciences

Complexity of earthquake occurrence and implications for earthquake prediction
Dion Weatherley (Earth Systems Science Computational Centre)

Ensemble Prediction of Atmospheric Blocking Regime Transitions [pdf]
Jorgen Frederiksen (CSIRO Marine and Atmospheric Research), Terry O'Kane (Antarctic Climate and Ecosystems Cooperative Research Centre)

Limitations to scaling up a biophysical domain within agent based models. [pdf]
Geoffrey Carlin (CSIRO), Freeman Cook (CSIRO)

Non-equilibrium thermodynamics of coupled systems
Bruce Hobbs (CSIRO Exploration and Mining), Alison Ord (CSIRO), Klaus Regenauer-Lieb

Practical Aspects of Symbolisation and Subsequent Analysis of Weather Data [pdf]
Jay Larson (The Australian National University Supercomputing Facility; and The Australian National University)

Southern Hemisphere Climate Transitions and Storm Track Changes
Jorgen Frederiksen (CSIRO Marine and Atmospheric Research), Carsten Frederiksen (Bureau of Meteorology Research Centre)

The emergence of patterned shear bands and fracture systems in granular materials - A numerical study.
Alison Ord (CSIRO), Bruce Hobbs (CSIRO Exploration and Mining), Klaus Regenauer-Lieb

Complexity in Energy, Water, and Urban Development

The Complex Dynamics of Urban Systems Project at CSIRO [pdf]
Tim Baynes (CSIRO)

Complex Behaviour among many Heating, Ventilation, and Air-Conditioning Systems
Jiaming Li, Geoffrey Poulton, Geoffrey James (CSIRO ICT Centre)

Complexity of Urbanization Patterns and Resource Use in Sea Change Communities across Australia: the interplay between pattern and structure [pdf]
Kostas Alexandridis (CSIRO Sustainable Ecosystems), Heinz Schandl (CSIRO Sustainable Ecosystems)

Impacts of Vehicle-to-Grid (V2G) technologies on electricity market operations [pdf]
Ariel Liebman (ARC Centre for Complex Systems, The University of Queensland), Geoff Walker (School of Information Technology & Electrical Engineering, The University of Queensland)

NEMSIM: Towards practical deployment of an agent-based, scenario exploration simulation tool for the National Electricity Market [mov] [pdf]
George Grozev (CSIRO Sustainable Ecosystems), Marcus Thatcher (CSIRO Marine and Atmospheric), Per da Silva (CSIRO Sustainable Ecosystems), Geoff Lewis (CSIRO Sustainable Ecosystems), Chi-hsiang Wang (CSIRO Sustainable Ecosystems)

Optimal GENCO's Bidding Strategies under Price Uncertainty with Bilateral Contracts [pdf]
Xia Yin (The University of Queensland), Zhaoyang Dong (The University of Queensland), Tapan Kumar Saha (The University of Queensland)

The Intelligent Grid Project [pdf]
Simon Dunstall (CSIRO Mathematical and Information Sciences), Rodolfo Garcia-Flores (CSIRO Mathematical and Information Sciences), Nectarios Kontoleon (CSIRO Mathematical and Information Sciences), Bill Lilley (CSIRO Energy Technology), Rene Weiskircher (CSIRO Mathematical and Information Sciences)

The interaction between water and energy supply and use [pdf]
Tim Baynes (CSIRO), Steven Kenway (CSIRO), Graham Turner (CSIRO), Jim West (CSIRO)

Tuning the cognitive complexity in participatory modelling
Nils Ferrand (Cemagref), Géraldine Abrami (Cemagref - Unite Mixte de Recherche Gestion de l'Eau, Acteurs et Usages), Nicolas Becu, Katherine Daniell, Natalie Jones, Pascal Perez (CIRAD; and The Australian National University), Jean-Emmanuel Rougier

Water service delivery in Tarawa: creating an agent based model for integrated analysis [mov] [pdf]
Magnus Moglia (CSIRO Land and Water; and The Australian National University), Pascal Perez (CIRAD; and The Australian National University), Stewart Burn (CSIRO Land and Water)

Computational Modelling for Biology and Chemistry

Identifying fundamental principles to effectively shape social institutions in natural resource management
Ryan McAllister (CSIRO), Luis R. Izquierdo (University of Burgos)

Importance sampling strategies for forwards and backwards processes in population genetics [pdf]
Martin O'Hely (ARC Centre of Excellence for Mathematics and Statistics of Complex Systems; and The University of Queensland), Mark Beaumont (University of Reading, United Kingdom), Lounes Chikhi (CNRS Toulouse), Robert Cope (ARC Centre of Excellence for Mathematics and Statistics of Complex Systems; and The University of Queensland), Jean-Marie Cornuet (INRA Montpellier), Leesa Wockner (ARC Centre of Excellence for Mathematics and Statistics of Complex Systems; and The University of Queensland)

Interactively exploring distributed computational models of biology [mov] [pdf]
James Watson (ARC Centre for Complex Systems and ARC Centre in Bioinformatics, The University of Queensland), Janet Wiles (The University of Queensland)

Modeling the Import of Nuclear Proteins [mov] [pdf]
John Hawkins (ARC Centre for Complex Systems, The University of Queensland), Mikael Boden (School of Information Technology & Electrical Engineering, The University of Queensland)

Modelling population processes with random initial conditions [pdf]
Philip Pollett (ARC Centre of Excellence for Mathematics and Statistics of Complex Systems, University of Queensland), Anthony Dooley (ARC Centre of Excellence for Mathematics and Statistics of Complex Systems; and The University of New South Wales), Joshua Ross (University of Warwick, United Kingdom)

Monte Carlo without Markov chains for model polymers
Aleks Owczarek (The University of Melbourne)

Simplified models for vibrational energy transfer in proteins [pdf]
Steven Lade (The Australian National University), Yuri Kivshar (The Australian National University)

Suddenly-stopped flow in a curved pipe
Richard Clarke (The University of Adelaide), James Denier (The University of Adelaide)

Defence and Security

A Blueprint for Reform: Towards a Coherent National Security Strategy [mov] [pdf]
Charlie Edwards (Demos)

A Comparative Study of Networked Teaming [mov] [pdf]
Victor Fok (Defence Science and Technology Organisation), Martin Wong (Defence Science and Technology Organisation), Alex Ryan (Defence Science and Technology Organisation)

A Framework for Decision Superiority in Complex Organisations [mov] [pdf]
Chris Murray (ThoughtWeb Pty Ltd)

A multi-objective genetic algorithm based method for designing low-cost complex networks resilient to targeted attacks. [mov] [pdf]
George Leu (National Defense Academy of Japan), Akira Namatame (Department of Computer Science, National Defense Academy of Japan)

Adaptive Campaigning [mov] [pdf]
Wade Stothart (Future Land Warfare, Australian Defence Forces)

Agent based models---finding the right tool for the right job. [mov] [pdf]
Matthew Berryman (Defence Science and Technology Organisation), Anne-Marie Grisogono (Defence Science and Technology Organisation), Alex Ryan (Defence Science and Technology Organisation)

Autonomic MANET Management Through the Use of Self Organizing UAVs [pdf]
Robert Hunjet (Defence Science and Technology Organisation)

Dealing with battlefield complexity: A decision maker's perspective [pdf]
Damien Armenis (Defence Science and Technology Organisation)

Diversity and complexity in pictures
Michael Barnsley (The Australian National University)

Effectiveness of closed-loop congestion controls for DDoS attacks [pdf]
Takanori Komatsu (National Defence Academy of Japan), Akira Namatame (Department of Computer Science, National Defense Academy of Japan)

Examining 'Federation Learning': Assisting Development of Australia's National Security Systems
Rick Nunes-Vaz (Land Operations Division, Defence Science and Technology Organisation), Dawn Hayter (IGOR Human Science Consultancy, Urban Providore Pty Ltd, Adelaide, Australia), Dion Grieger (Land Operations Division, Defence Science and Technology Organisation), Gary Hanly (Land Operations Division, Defence Science and Technology Organisation), Chad Prior, Monique Kardos (Land Operations Division, Defence Science and Technology Organisation)

Human aspects of managing complex systems [mov] [pdf]
Alex Wearing (The University of Melbourne)

Hurricane Katrina, A Whole of Government Case Study
Mark Clemente (The Boeing Company)

Implementing an Adaptive Approach in Non-Kinetic Counterinsurgency Operations
Mick Ryan (Australian Army)

Improving wargames using complex system practices
Anthonie van Lieburg (Netherlands Organisation for Applied Scientific Research), Peter Petiet (Netherlands Organisation for Applied Scientific Research), Nanne le Grand (Netherlands Organisation for Applied Scientific Research)

Network-Centric Complex Warfare
Mervyn Cheah (Singapore Armed Forces Centre for Military Experimentation, Future Systems Directorate)

Robots for Warfighting - Simplifying Complexity in Right & Wrong Ways
Patrick Hew (Defence Science and Technology Organisation)

Signatures of Game Dynamics for Intelligence and Information Operation
Hussein Abbass (Defence and Security Applications Research Centre)

Team Composition: Linking Individual and Team Characteristics to Team Decision-Making and Performance [pdf]
Sebastian Schaefer (Institute of Technology of Intelligent Systems, Universität der Bundeswehr)

The Adaptive Stance [pdf]
Anne-Marie Grisogono (Defence Science and Technology Organisation)

The Essential Thing: Enabling Effective Action in a Complex Security Environment
Roger Noble (Australian Command and Staff College)

The Importance of Complexity Theory to Modelling and Analysis, Using a NCW example [mov] [pdf]
Michael Lauren (Defence Technology Agency, New Zealand)

The Importance of Contextually Sensitive Processes In Supporting Ad-hoc Collaborative Working In Complex Systems [pdf]
Neil Carrigan (The University of Bath, United Kingdom)

The Rosetta-II Project: Measuring National Differences in Complex Cognition [mov] [pdf]
Richard Warren (Air Force Research Laboratory)

Whole of Government Operations [mov] [pdf]
Ed Smith (Boeing)

Social Networks and Epidemiology

A Hybrid Agent-Based and Network Modelling Framework of Contagion Diffusion in Social Networks
Paul Box (CSIRO Sustainable Ecosystems), Yiheyis Maru (CSIRO Sustainable Ecosystems, Alice Springs)

Assessing Uncertainty In The Structure Of Ecological Models Through A Qualitative Analysis Of System Feedback And Bayesian Belief [pdf]
Jeff Dambacher (CSIRO Land and Water)

Boolean networks as models of social behaviour
Tania Leishman (Monash University), David Green (Monash University)

Complexity and the obesity epidemic
Matthew Beaty (CSIRO Sustainable Ecosystems), Cathy Baker (ACT Health), Cathy Banwell (National Centre for Epidemiology and Population Health, The Australian National University), Guy Barnett (CSIRO Sustainable Ecosystems), Helen Berry (National Centre for Epidemiology and Population Health, The Australian National University), Jane Dixon (National Centre for Epidemiology and Population Health, The Australian National University), Rob Dyball (The Australian National University), Sharon Friel (National Centre for Epidemiology and Population Health, The Australian National University), Amy Griffin (The University of New South Wales; and Australian Defence Force Academy), Katrina Proust (The Australian National University)

Epidemic Spread Modelling: Alignment of Agent-based Simulation with a SIR Mathematical Model. [pdf]
Alex Skvortsov (HPP Division, Defence Science and Technology Organisation), Russell Connell (Defence Science and Technology Organisation), Peter Dawson (HPP Division, Defence Science and Technology Organisation), Ralph Gailis (HPP Division, Defence Science and Technology Organisation)

Local versus global processes on networks: Simple processes with complex dynamics
Peter Whigham (University of Otago)

Missing data in social network analysis [mov] [pdf]
Johan Koskinen (The University of Melbourne)

Social influence models [mov] [pdf]
Galina Daraganova (School of Behavioural Science, University of Melbourne), Philippa Pattison (School of Behavioural Science, University of Melbourne), Garry Robins (School of Behavioural Science, University of Melbourne), Peng Wang (School of Behavioural Science, University of Melbourne)

Social network approaches to hepatitis C virus epidemiology: outcomes from consecutive studies in Melbourne, Australia [mov] [pdf]
Campbell Aitken (Burnet Institute), Rhonda McCaw (Victorian Infectious Diseases Reference Laboratory), Scott Bowden (Victorian Infectious Diseases Reference Laboratory), Mandvi Bharadwaj (The University of Melbourne), Margaret Hellard (Burnet Institute)

State Estimation of Complex Social Networks [mov] [pdf]
Daniel McMichael (CSIRO), John Ward (CSIRO)

Understanding social networks in free-ranging cattle
Dave Swain (CSIRO)

Social Science and Management

A way for leading in times of turbulent and consistent change
Roderick Cross (Dept Primary Industries & Fisheries (Qld)), Gillian Ching (Department Primary Industries & Fisheries)

Complexity of the Australian public's orientation towards low emission technologies
Simone Carr Cornish (CSIRO), Stephen Fraser (CSIRO), John Gardner (CSIRO), Peta Ashworth (CSIRO), Anna Littleboy (CSIRO)

Governance by Social Network: A Multiscale Analysis of Communication Efficiency
Adam Dunn (Alcoa Research Centre for Stronger Communities, Curtin University of Technology), Daniela Stehlik (Alcoa Research Centre for Stronger Communities, Curtin University of Technology)

Hop, Step and Jump! - The application of a Self Organising Map (SOM) approach to the measurement of change in organizations
W.J. Parry (ChangeTrack Research), Stephen J. Fraser (CSIRO Exploration and Mining), Bruce Hobbs (CSIRO Exploration and Mining), C.W. Peake (ChangeTrack Research)

Turbulence

A world tour of dynamical systems, stability, and chaos. [pdf]
Rowena Ball (The Australian National University), Philip Holmes (Princeton University)

Bifurcation in Resistive Drift Wave Turbulence [extended abstract] [pdf]
Ryusuke Numata (The Australian National University), Rowena Ball (The Australian National University), Robert Dewar (The Australian National University), Linda Stals (The Australian National University)

Eddy structure in the Roughness Sub-Layer
John Finnigan (CSIRO Centre for Complex Systems Science), Roger Shaw (University of California, United States of America), Ned Patton (National Center for Atmospheric Research, USA), Ian Harman (CSIRO Centre for Complex Systems Science)

On subgrid-scale parameterizations of the eddy viscosity, stochastic backscatter and the eddy-topographic force
Terry O'Kane (Antarctic Climate and Ecosystems Cooperative Research Centre), Jorgen Frederiksen (CSIRO Marine and Atmospheric Research)

Statistical Models of a Tracer Plume in the Complex Urban Canopies [pdf]
Alex Skvortsov (HPP Division, Defence Science and Technology Organisation), Ralph Gailis (HPP Division, Defence Science and Technology Organisation), Michael Borgas (CSIRO Marine and Atmospheric Research), Peter Dawson (HPP Division, Defence Science and Technology Organisation), Michael Roberts (HPP Division, Defence Science and Technology Organisation)

General Track

A Preliminary Model for Studying the Interactions Between Nephrons [extended abstract] [pdf]
Robert Moss (The University of Melbourne)

A Tool to Analyse the Long-term Viability of an Agricultural Region by Considering the Interactions of Socio-Economic, Ecological Factors [pdf]
Xianfeng Su (CSIRO), Senthold Asseng (CSIRO), Freeman Cook (CSIRO), Peter Campbell (Defence and Systems Institute, The University of South Australia), Geoff Carlin (CSIRO)

Advanced Data Analysis Methods to Detect and Predict the Non-technical Losses based on Customer Behaviour Changes for Power Utilities
Anisah Nizar (The University of Queensland), Zhaoyang Dong (The University of Queensland)

Australia 2007 to 2025 - A Complex Systems Approach
Bruce Hobbs (CSIRO Exploration and Mining), Klaus Regenauer-Lieb

Complexity in Speciation: Effects of disasters on adaptive radiation in a Dual Phase Evolution model [extended abstract] [pdf]
Greg Paperin (Monash University), David Green (Monash University), Suzanne Sadedin, T. G. Leishman (Monash University)

Decision processes used to describe turn-off of Beef Cattle from Australian grazing farms. [extended abstract]
Graham Donald (CSIRO Livestock Industries), David Miron (CSIRO Livestock Industries), Irina Emelyanova (CSIRO Livestock Industries)

Democracy's event horizon
Roger Bradbury (Resource Management in Asia-Pacific Program)

Distributed task allocation with autonomous selected agents
Kenta Oomiya (Future University - Hakodate), Keiji Suzuki (Future University - Hakodate)

Emerging Network Structure in a Darwinised Data-Oriented Parser
Dave Cochran (University of St. Andrews)

Evolution of node-clusters in space associated with processes determined by Voronoi maps: statistics from simulations.
David Odell (ARC Centre of Excellence for Mathematics and Statistics of Complex Systems), Konstantin Borovkov (University of Melbourne; and ARC Centre of Excellence for Mathematics and Statistics of Complex Systems)

Finite Time Ruin Probability with Heavy-Tailed Claims and Constant Interest Rate
Dingcheng Wang (The Australian National University)

Forecasting Commodities Markets using Agent Based Modelling Techniques. [mov] [pdf]
David Miron (CSIRO Livestock Industries), Graham Donald (CSIRO Livestock Industries), Irina Emelyanova (CSIRO Livestock Industries)

Harvesting Heterogeneous Renewable Resources: Uncoordinated, Selfish, Team-, and Community-Oriented Strategies
Markus Brede (CSIRO Marine and Atmospheric Research), Fabio Boschetti (CSIRO Marine and Atmospheric Research), Burt de Vries (Utrecht University, Copernicus Institute)

Heuristics, complexity and belief networks: a case study of an outback livelihood system [mov] [pdf]
Thomas Measham (CSIRO Sustainable Ecosystems), Kostas Alexandridis (CSIRO Sustainable Ecosystems), Samantha Stone-Jovicich (CSIRO Sustainable Ecosystems)

Information Contagion and Financial Prices [pdf]
Mark Bowden (ARC Centre for Complex Systems, The University of Queensland)

Is the fractal geometry of nature a coincidence? [mov] [pdf]
Julianne Halley (CSIRO Molecular and Health Technologies), Dave Winkler (CSIRO Molecular and Health Technologies)

Islets in an Ocean: Towards a Philosophy of Complexity [extended abstract]
Tony Smith (Meme Media)

Managing Linguistic Complexity [mov] [pdf]
David Butt (Macquarie University)

Mapping model complexity [pdf]
Fabio Boschetti (CSIRO Marine and Atmospheric Research), Nicky Grigg (CSIRO Land & Water), David McDonald (CSIRO Marine and Atmospheric Research)

Mimosa: using ontologies for modeling and simulation of complex systems [extended abstract]
Jean-Pierre Muller (CIRAD)

Mythological Archetypes as an Emergent Process [mov] [pdf]
Victor MacGill

Non-Hamiltonian and fractional dynamics in complex plasma [extended abstract] [pdf]
James Stokes (The University of Sydney), Alex Samarian (The University of Sydney), Sergey Vladimirov (The University of Sydney)

On Efficient Management of Complex Systems [extended abstract] [mov] [pdf]
Victor Korotkikh (Central Queensland University), Galina Korotkikh (Central Queensland University)

Perils of Power Laws [pdf]
Russell Standish (Mathematics and Statistics, The University of New South Wales)

Resilient extraction of renewable resources
Cameron Fletcher (CSIRO Sustainable Ecosystems), David Hilbert (CSIRO Sustainable Ecosystems)

Self-avoiding walk enumeration via the lace expansion [pdf]
Nathan Clisby (ARC Centre of Excellence for Mathematics and Statistics of Complex Systems, The University of Melbourne), Richard Liang (University of California, Berkeley, United States of America), Gordon Slade (University of British Columbia, Canada)

Social Capital: a complex of dynamic relationships
Angela Wardell-Johnson (Centre for Rural and Regional Innovation, Qld; and The University of Queensland)

The Evolution of the World Trade Web as a Complex System
Tim Kastelle (ARC Centre for Complex Systems, The University of Queensland)

The Optimal Design of Profitable Renewable Energy Systems; A Hamiltonian Based Approach [mov] [pdf]
Frank Horowitz (CSIRO Exploration and Mining), Peter Hornby (CSIRO Exploration and Mining)

Value as a Driving Factor in Complex Human Enterprises [pdf]
Patrick Beautement

Variational principle for relaxed states of a plasma confined by a nonintegrable magnetic field [pdf]
Robert Dewar (The Australian National University), Matthew Hole (The Australian National University), Stuart Hudson (Princeton Plasma Physics Laboratory), Mathew McGann (The Australian National University)

Wetland ecosystems as manifestation of complex systems [pdf]
Changhao Jin (Arthur Rylah Institute)

Zonal flow generation by modulational instability [pdf]
Robert Dewar (The Australian National University), R Farzand Abdullatif (The Australian National University)

Poster Session

Computational Models for Studying Signalling Control Mechanisms behind Legume Autoregulation of Nodulation
Liqi Han (ARC Centre of Excellence for Integrative Legume Research and ARC Centre for Complex Systems, School of Information Technology and Electrical Engineering, The University of Queensland), Peter M. Gresshoff (ARC Centre of Excellence for Integrative Legume Research), Jim Hanan (ARC Centre of Excellence for Integrative Legume Research, ARC Centre for Complex Systems and Advanced Computational Modelling Centre, The University of Queensland)

A computational model of C. elegans locomotion dynamics and motor control circuitry
Mark Wakabayashi (Thinking Systems, Queensland Brain Institute and School of Information Technology and Electrical Engineering, The University of Queensland), David Carrington (School of Information Technology & Electrical Engineering, The University of Queensland), Janet Wiles (The University of Queensland)

Approximating extinction times and probabilities for absorbing birth-death processes
David Sirl (ARC Centre of Excellence for Mathematics and Statistics of Complex Systems, University of Queensland), Hanjun Zhang (ARC Centre of Excellence for Mathematics and Statistics of Complex Systems, The University of Queensland), Philip Pollett (ARC Centre of Excellence for Mathematics and Statistics of Complex Systems, University of Queensland)

Computational techniques for modeling complex biological systems
James Watson (ARC Centre for Complex Systems and ARC Centre in Bioinformatics, The University of Queensland), Mark Wakabayashi (Thinking Systems, Queensland Brain Institute and School of Information Technology and Electrical Engineering, The University of Queensland), Jared Moore (The University of Queensland), Andres Sanin Montoya (The University of Queensland), Kai Willadsen, Nic Geard (ECS, University of Southampton), Daniel Bradley (ARC Centre for Complex Systems, School of Information Technology and Electrical Engineering, The University of Queensland), Janet Wiles (The University of Queensland)

Constructing complexity based rules for C. elegans from recursive networks
Frank Burden (CSIRO Molecular and Health Technologies; and Scimetrics Ltd), Julianne Halley (CSIRO Molecular and Health Technologies), Dave Winkler (CSIRO Molecular and Health Technologies)

Critical regions and phase changes in simulations of spiking neurons
Peter Stratton (Queensland Brain Institute), Janet Wiles (The University of Queensland)

Do stylized facts of order book markets need strategic behaviour?
Dan Ladley (The University of Leeds, United Kingdom), Klaus Schenk-Hoppe (University of Leeds, United Kingdom)

Electricity market planning and management [pdf]
John Lu (The University of Queensland)

Identifying Emergent Social Behavioural Networks in Domesticated Livestock
Kym Patison (CSIRO), Dave Swain (CSIRO), Philippa Pattison (School of Behavioural Science, University of Melbourne), Garry Robins (School of Behavioural Science, University of Melbourne)

Modelling the Omics Network of Hepatocellular Carcinoma using NetMap®
David Fung (The University of Sydney), John Galloway

Network models for embryonic stem cell self-renewal and differentiation in mice and men
Julianne Halley (CSIRO Molecular and Health Technologies), Frank Burden (CSIRO Molecular and Health Technologies; and Scimetrics Ltd), Dave Winkler (CSIRO Molecular and Health Technologies)

Novel sparse Bayesian methods for stem cell microarray analysis and cancer diagnosis
Dave Winkler (CSIRO Molecular and Health Technologies), Frank Burden (CSIRO Molecular and Health Technologies; and Scimetrics Ltd), Julianne Halley (CSIRO Molecular and Health Technologies)

Optimal Active Learning in Gaussian Process Regression: an Empirical Study
Flora Yu-Hui Yeh (ARC Centre for Complex Systems, School of Information Technology & Electrical Engineering, The University of Queensland), Marcus Gallagher (ARC Centre for Complex Systems, School of Information Technology and Electrical Engineering, The University of Queensland)

Reinforcement learning in complex computer game environments
Michelle McPartland (ARC Centre for Complex Systems, School of Information Technology & Electrical Engineering, The University of Queensland), Marcus Gallagher (ARC Centre for Complex Systems, School of Information Technology and Electrical Engineering, The University of Queensland)

Robots and the Evolution of Spatial Language
Ruth Schulz (School of Information Technology & Electrical Engineering, The University of Queensland), David Prasser (School of Information Technology & Electrical Engineering, The University of Queensland), Mark Wakabayashi (Thinking Systems, Queensland Brain Institute and School of Information Technology and Electrical Engineering, The University of Queensland), Janet Wiles (The University of Queensland)

Searching Concept Spaces using Physical Navigation Strategies
Paul Stockwell (Thinking Systems, School of Information Technology and Electrical Engineering, The University of Queensland), Andrew Smith (The Institute for Social Science Research, The University of Queensland), Janet Wiles (The University of Queensland)

Simulkit: a software toolkit aiming towards a unified network-based view of complex systems
Daniel Bradley (ARC Centre for Complex Systems, School of Information Technology and Electrical Engineering, The University of Queensland), Ariel Liebman (ARC Centre for Complex Systems, The University of Queensland), Leighton Brough (ARC Centre for Complex Systems, The University of Queensland)

Theme Park Problem on Complex Network Models
Yasushi Yanagita (Future University - Hakodate), Keiji Suzuki (Future University - Hakodate)

Virtual Kiwifruit: Modelling Annual Growth Cycle
Mikolaj Cieslak (The University of Queensland), Alla N. Seleznyova (HortResearch, New Zealand), Jim Hanan (ARC Centre of Excellence for Integrative Legume Research, ARC Centre for Complex Systems and Advanced Computational Modelling Centre, The University of Queensland)

Software Demonstrations

A Netlogo simulation of dynamic configurability in combined arms teams.
Martin Wong (Defence Science and Technology Organisation), Victor Fok (Defence Science and Technology Organisation)

EcoLab 5
Russell Standish (Mathematics and Statistics, The University of New South Wales)

LiveGraph - a tool for data visualisation, analysis and logging in complex systems simulations.
Greg Paperin (Monash University)

Multi-tasking, specialisation and adaptability in a logistics problem
Matthew Berryman (Defence Science and Technology Organisation), Vanja Radenovic (Defence Science and Technology Organisation), David Batten (CSIRO), Anne-Marie Grisogono (Defence Science and Technology Organisation), Alex Ryan (Defence Science and Technology Organisation)

VLAB - An online virtual laboratory for complexity and artificial life
Alex Tee Neng Heng (Faculty of IT, Monash University), David Green (Monash University)


Plenaries

Are fractal skeletons the explanation for the plankton paradox and narrowing of arteries due to cell trapping in a disturbed blood flow?

Celso Grebogi (Institute for Complex Systems, King's College, University of Aberdeen, Scotland, United Kingdom)

Nature is permeated by phenomena in which active processes, such as chemical reactions and biological interactions, take place in environmental flows. They include the dynamics of growing populations of plankton in the oceans and the evolving distribution of ozone in the polar stratosphere. I will show that if the dynamics of active particles in flows is chaotic, then necessarily the concentration of particles has the observed fractal filamentary structures. These structures, in turn, are the skeletons and the dynamic catalysts of active processes, yielding an unusual, singularly enhanced productivity. I will argue that this singular productivity could be the hydrodynamic explanation for the plankton paradox, in which an extremely large number of species are able to coexist, negating the competitive exclusion principle that asserts the survival of only the most perfectly adapted to each limiting resource. I will then suggest that the presence of such fractal skeletons in arterial flow could be the explanation for the eventual restenosis (recurrence of narrowing) of arteries after a stent-assisted angioplasty.
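
As a purely illustrative aside (not material from the talk): filamentary tracer patterns of this kind are easy to reproduce numerically. The sketch below advects passive tracers in the time-periodic "double gyre" flow, a standard toy model of chaotic advection, and prints a crude box-counting estimate of the dimension of the resulting pattern. All parameter values are arbitrary illustrative choices.

```python
# Illustrative sketch: passive tracers in the periodically perturbed double
# gyre flow on [0,2]x[0,1]. An initially compact blob is stretched into thin
# filaments; a box-counting estimate between 1 and 2 signals a fractal-like
# filamentary distribution.
import numpy as np

A, eps, om = 0.1, 0.25, 2 * np.pi / 10.0   # amplitude, perturbation, frequency

def velocity(x, y, t):
    """Velocity field of the time-periodic double gyre."""
    a = eps * np.sin(om * t)
    b = 1.0 - 2.0 * eps * np.sin(om * t)
    f = a * x**2 + b * x
    dfdx = 2.0 * a * x + b
    u = -np.pi * A * np.sin(np.pi * f) * np.cos(np.pi * y)
    v = np.pi * A * np.cos(np.pi * f) * np.sin(np.pi * y) * dfdx
    return u, v

rng = np.random.default_rng(0)
x = 1.0 + 0.05 * rng.standard_normal(20000)  # small blob of tracers
y = 0.5 + 0.05 * rng.standard_normal(20000)
dt, steps = 0.05, 400
for k in range(steps):                       # simple Euler advection
    u, v = velocity(x, y, k * dt)
    x, y = x + dt * u, y + dt * v

def boxes(n):
    """Number of occupied boxes on an n x n grid over the domain."""
    i = np.clip((x / 2.0 * n).astype(int), 0, n - 1)
    j = np.clip((y * n).astype(int), 0, n - 1)
    return len(set(zip(i.tolist(), j.tolist())))

n1, n2 = 64, 128                             # crude two-scale estimate
dim = np.log(boxes(n2) / boxes(n1)) / np.log(n2 / n1)
print(f"box-counting dimension estimate: {dim:.2f}")
```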

References

Chemical and biological activity in open flows: A dynamical system approach, T. Tél, A. Moura, C. Grebogi and G. Károlyi, Phys. Reports 413, Issues 2-3, pages 91-196 (2005).

Carbon-climate-human interactions as a complex system

Mike Raupach (CSIRO)

Carbon-climate-human interactions can be modelled as a low-dimensional dynamical system. This system is now exhibiting two kinds of disturbance-amplifying feedback.

  1. Biophysical vulnerabilities: Land-air and ocean-air exchanges of carbon (both as CO2 and CH4) are being modified by changes in climate and land use. Together, these effects create a potential climate change impact which is not negligible in comparison with projected direct consequences of anthropogenic emissions. New observational evidence shows that the land and ocean CO2 sinks (which now take up over half the total annual CO2 emission flux from human activities) are already weakening.
  2. Vulnerabilities arising from human behaviour: CO2 emissions from fossil-fuel burning and industrial processes have been accelerating at the global scale. The emissions growth rate since 2000 was greater than that for the most fossil-fuel-intensive of the IPCC emissions scenarios developed in the late 1990s. Growth since 2000 was driven by a reversal of earlier declining trends in the carbon intensity of the economy, coupled with continuing increases in population and per-capita GDP.

These interacting feedbacks place the carbon-climate-human system at three crossroads: the Gaian crossroads (the coevolution of climate, biogeochemical cycling and the living biosphere); the Kyoto crossroads (representing choices about how to deal with human impacts on the planet); and the crossroads of Robert Johnson, who according to legend sold his soul to the Devil in exchange for musical genius. Our challenge here is to reshape the bargain that, through energy derived from detrital biospheric carbon, has fuelled the genius of the human race.
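
To make the "low-dimensional dynamical system" framing concrete, here is a loudly hypothetical toy sketch (not from the talk): atmospheric carbon forces temperature, and warming weakens the natural sink, amplifying the disturbance. Every functional form and constant is an invented placeholder.

```python
# Hypothetical two-variable caricature of a disturbance-amplifying
# carbon-climate feedback. All numbers are arbitrary placeholders.
def step(C, T, E, dt=1.0):
    sink = 0.02 * C * max(0.0, 1.0 - 0.1 * T)  # sink efficiency falls as T rises
    dC = E - sink                              # emissions minus weakening sink
    dT = 0.01 * C - 0.05 * T                   # temperature relaxes toward forcing
    return C + dt * dC, T + dt * dT

C, T = 100.0, 0.0
for year in range(200):
    E = 1.0 * 1.03**year                       # accelerating emissions (point 2 above)
    C, T = step(C, T, E)
print(f"after 200 steps: carbon {C:.0f}, temperature anomaly {T:.2f} (arbitrary units)")
```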

Complex systems challenges in health care

Michael Ward (Central Clinical Division, School of Medicine, The University of Queensland)

Research based advances in healthcare over the past 40-50 years have been substantial and unequivocal. This has resulted in an extensive and ever increasing catalogue of powerful diagnostic, therapeutic and procedural interventions. These have led to significant decreases in the mortality and morbidity of many diseases. However, there is increasing concern amongst clinicians and health service managers about the quality and safety of healthcare delivery. This is based upon evidence of variations of several orders of magnitude in the outcomes of treating the same condition; of little correlation between these outcomes and the relevant costs; of a failure to deliver interventions of proven efficacy to around half of those in need; and of a 10% chance of harm caused by the processes of healthcare. These problems have resisted conventional service improvement initiatives for over a decade. It seems clear that some of this intractability may be due to the complex non-linear interactions amongst the many specialised component parts of healthcare that contribute to the benefits. These interactions generate complex adaptive systems that are difficult both to understand and to manage. This presentation will focus upon those components of this complexity where the techniques of non-linear data analysis seem to have potential value both for understanding key human dynamics and for improving systems management. The presentation will be followed by a panel discussion of researchers and clinicians with relevant experience and expertise, with the objective of providing a framework for future collaborative research.

Complex Systems Perspective on the Revolution in Human Performance Optimization

Kenneth Boff (Air Force Research Laboratory)

The revolutions in Info, Bio and Nano Technologies have spawned a revolution in human enhancement technologies that fundamentally challenge the notion of what it means to be human. Human performance optimization has the potential to bestow unprecedented competitive and military advantages in future human-technology systems. Over the past decade, classical human factors engineering (Gen 1) has rapidly given way to 'cognitive systems engineering' (Gen 2) which, in turn, has spawned a growing investment in Cognitive Systems Integration (Gen 3) wherein silicon-based technologies are neurally-coupled to symbiotically enhance cognitive performance. The biological optimization of human performance capabilities (Gen 4) is emergent and indicates that the means to redesign our basic human factors - that is, how we think, how we feel, how we look, how we age, and how we communicate with one another - are close at hand. Taken together, the revolution in human performance optimization has profound implications for the design consideration of the human as a variable in complex systems where traditional boundary constraints may no longer be valid. This presentation will review and explore the options and implications of this revolutionary paradigm shift for the design and deployment of complex systems.

Controlling complex resources over different timeframes in process control

Penelope Sanderson (Cognitive Engineering Research Group, The University of Queensland)

Monitoring and controlling complex systems becomes particularly challenging when human controllers must satisfy constraints operating in multiple timeframes. Examples of systems with such temporal complexity are air traffic control, chemical process control, and power generation. We outline how the problem of temporal complexity emerges in different domains and imposes challenges to effective control. To illustrate our work we focus on hydropower system control. A hydropower company generates power according to targets handed down by the central market organisation and responds to market opportunities, but at the same time the company must preserve its ability to meet corporate strategic and tactical objectives. The human controller at the 'sharp end' of hydropower system operations must coordinate water, generation, and transmission resources consistently with those objectives. In this presentation we describe the information needed by hydropower system human controllers for effective control. Specifically, we outline the general form of analyses that identify the purposes, priorities and functions of the hydropower domain, and that identify the timeframes in which different subsystems function. From such analyses we develop the general form of interface displays that help human controllers solve the problem of coordinating resources over different timeframes. In particular, such displays make the temporal boundaries of safe operating regions visible. Finally, we draw conclusions for analysis and design of human-system interfaces for complex systems in which resources must be coordinated by different parties across multiple timeframes.

Evolution Toward Enterprise Systems Engineering (If You don't Have a Billion Years, Will a Billion Dollars Do?)

Joseph DeRosa (Director Systems Engineering, MITRE Corporation, United States of America)

Corporations are like living organisms - they evolve and adapt. They can start out simple, and go through successive stages of increased complexity and organization. They have their own niche in the business ecosystem. Individual companies, as well as the entire business segment to which they belong, face dangers of extinction from competitors and the environment. At the same time some adapt and thrive. This is the story about the on-going evolution of a company known for its expertise in traditional systems engineering toward the murky waters of complex systems and enterprise systems engineering.

Exponential random graph models for social networks

Philippa Pattison (School of Behavioural Science, University of Melbourne)

In this talk I describe a program of work whose aim is to develop statistical models for social networks. Global network structure is hypothesised to arise as the outcome of interactive processes occurring within local neighbourhoods of a network. Each neighbourhood is conceived as a possible site of interaction and is associated with a subset of possible network ties. An illustration of the approach is presented using a model specification that relies on Markovian neighbourhoods (Frank & Strauss, 1986) as well as generalized realisation-dependent neighbourhoods that are generated, in part, by interactive network processes themselves (Snijders, Pattison, Robins & Handcock, 2006). The model is estimated from an observation of a complete network. I then consider conditional maximum likelihood estimation of models of this form from partial network data obtained from multi-wave snowball sampling schemes. Snowball sampling schemes are those in which nodes adjacent to one or more "seed" nodes are identified, as are nodes adjacent to each of those nodes, and so on. I describe a separation condition for the case of multiple seed nodes, report some simulations assessing the estimation approach, and discuss potential applications.
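
For readers unfamiliar with the model class: an exponential random graph model assigns a probability to a whole network x of the general form

    \Pr(X = x) = \frac{1}{\kappa(\theta)} \exp\Big( \sum_{A} \theta_A z_A(x) \Big)

where the sum runs over configuration types A, z_A(x) counts the configurations of type A present in x, \theta_A are the corresponding parameters, and \kappa(\theta) is a normalising constant. This is the standard form from the literature cited below; the choice of neighbourhoods determines which configuration types enter the sum.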

References

Frank, O., & Strauss, D. (1986). Markov graphs. Journal of the American Statistical Association, 81, 832-842. Snijders, T., Pattison, P., Robins, G., and Handcock, M. (2006). New specifications for exponential random graph models. Sociological Methodology, 36, 99-153.

From words to meanings - Human knowledge as a complex system

Andrew Smith (Leximancer)

Researchers have characterised human knowledge in terms of semantic networks and concept maps for several decades. Concept Maps, or Mind Maps, are often created directly by the humans who possess the knowledge in question. However, the systematic generation of semantic networks is generally based on language. Natural language data - text and speech - is regarded as the principal diagnostic output of the human cognitive system.

Natural language shows several key indicators of a complex system. Humans habitually use language creatively. People's knowledge representations of a given system are known to depend on their current goal, context and background. Key components of the human knowledge representation are: the concept, or category, such as a chair as opposed to a table, and the relationship, such as the observation that the chair is under the table. Concepts could also be referred to as classes, or as agents, and have multiple slowly changing attributes. Relationships between concepts, on the other hand, are much more dynamic. Object Role Modelling is one systematic formalism for expressing knowledge as a flexible network of more primitive conceptual agents.

Our Leximancer system employs a two-level process to automatically extract knowledge networks from text. The first level is a recursive algorithm for identifying concept agents and discovering their various lexical attributes - this is the extracted Thesaurus. The second level identifies the co-occurrence network of concept agents in the text, then uses these relational observations to find the emergent conceptual phase space via a recursive and dissipative nonlinear algorithm. This phase space is the resulting concept map. A key part of our work is to test the reproducibility of the emergent patterns using parallel data sets.
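
A minimal sketch of the second level described above, assuming nothing about Leximancer's actual implementation: build a co-occurrence network of concept terms from text using a sliding window over tokens. The concept list and window size are illustrative placeholders.

```python
# Toy co-occurrence network: count how often pairs of concept terms
# appear within a fixed token window of each other.
from collections import Counter
from itertools import combinations

def cooccurrence_network(tokens, concepts, window=10):
    """Return edge weights between concept terms co-occurring within a window."""
    edges = Counter()
    positions = [(i, t) for i, t in enumerate(tokens) if t in concepts]
    for (i, a), (j, b) in combinations(positions, 2):
        if a != b and abs(i - j) <= window:
            edges[tuple(sorted((a, b)))] += 1
    return edges

text = "the chair is under the table near the window chair table".split()
print(cooccurrence_network(text, {"chair", "table", "window"}))
```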

Structure and dynamics of complex networks

Hawoong Jeong (Department of Physics, Korea Advanced Institute of Science and Technology, Republic of Korea)

Complex systems as diverse as the Internet or the cell can be described by networks with complex topology. Traditionally it has been assumed that these networks are random. However, recent studies indicate that such complex systems emerge as a result of self-organizing processes governed by simple but generic laws, resulting in inhomogeneous scale-free topologies strikingly different from those predicted by random networks. Such studies also lead to a paradigm shift regarding our approach to complex systems, allowing us to view them as dynamical systems rather than static graphs. I will review the historical development of complex network studies and discuss the implications of these findings for the error and attack tolerance of the Internet and the robustness of the cell. I will also present recent research activities, especially on the dynamical aspects of complex networks, including large-scale data analysis of a social networking service (SNS) and the price of anarchy of transportation networks.
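
One of the "simple but generic laws" known to yield scale-free topologies is preferential attachment; a minimal sketch of the Barabasi-Albert growth process follows (the abstract does not prescribe this particular model).

```python
# Barabasi-Albert-style growth: each new node attaches to m existing nodes
# chosen with probability proportional to their current degree.
import random
from collections import Counter

def preferential_attachment(n, m=2, seed=0):
    random.seed(seed)
    seeds = list(range(m))        # initial nodes
    stubs = []                    # each node repeated once per unit of degree
    edges = []
    for new in range(m, n):
        chosen = set()
        while len(chosen) < m:
            # degree-proportional pick; uniform fallback before any edge exists
            chosen.add(random.choice(stubs) if stubs else random.choice(seeds))
        for t in chosen:
            edges.append((new, t))
            stubs.extend([new, t])  # both endpoints gain one degree unit
    return edges

edges = preferential_attachment(1000)
deg = Counter(u for e in edges for u in e)
print("max degree:", max(deg.values()), "mean degree:", sum(deg.values()) / len(deg))
```

The heavy-tailed degree distribution (a few hubs, many low-degree nodes) is what distinguishes these networks from classical random graphs.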

Synchronisation and emergent intelligence in networked agents

Akira Namatame (Department of Computer Science, National Defense Academy of Japan)

The understanding of emergent collective phenomena in natural and social systems has driven the interest of scientists from different disciplines. Among emergent collective phenomena, the synchronization of a set of interacting individuals occupies a privileged position because of its ubiquity in natural systems. For instance, the flocking behavior of fish schools, birds and many other animals is a good example. The dynamics of each flocking member is controlled by the local interactions between that member and its neighbors, which control the spatial and velocity coherence of the system. Although a substantial element of this research has successfully concentrated on building mathematical and simulation models of flocking behavior, controlling flocking dynamics in order to achieve a specific objective is difficult. Previous research on synchronization only measures the stability of a system's dynamics and does not consider the level of coherence or desirability in the behaviors exhibited by a system.

The fact that selfish behavior may not achieve full efficiency at the aggregate level has been well known in the literature. As the complexity of networked systems has increased, the presence of undesirable behaviors resulting from strategic interactions among agents with different interests has grown. These collective behaviors can have catastrophic consequences, as in a sudden collapse. Therefore we need to cope with networked agent systems by attempting to stack the deck in such a way that agents with selfish incentives have to do what is the desirable thing. Of particular interest is the question of how interactions and couplings among agents can be restructured so that they are free to choose their actions while avoiding outcomes that none would have chosen.

In this talk, we discuss a study that provides a new perspective and tools to control emerging collective behavior. We discuss how local patterns of synchronization emerge differently in networked agents, driving the process towards a certain global synchronization degree along desirable paths. The dependence of synchronization dynamics on the coupling rules among agents and on the interaction topology is discussed. We discuss the relationship between synchronization in networked agents and the emergence of desirable outcomes as collective intelligence. We can observe the wisdom of collective agents at the microscopic level as evolved local rules that constitute constraints on agents' behaviors to achieve desirable outcomes.
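
The canonical toy for how coupling rules and interaction topology shape synchronisation is the Kuramoto model of phase oscillators coupled along network edges; a minimal sketch follows (the talk's own models are not specified here).

```python
# Kuramoto oscillators coupled along the edges of a given network.
# The order parameter r in [0, 1] measures the degree of synchronisation.
import numpy as np

def kuramoto_order(edges, n, K=1.5, dt=0.05, steps=2000, seed=0):
    rng = np.random.default_rng(seed)
    theta = rng.uniform(0, 2 * np.pi, n)   # initial phases
    omega = rng.normal(0, 0.5, n)          # heterogeneous natural frequencies
    for _ in range(steps):
        drive = np.zeros(n)
        for i, j in edges:                 # coupling only along network edges
            drive[i] += np.sin(theta[j] - theta[i])
            drive[j] += np.sin(theta[i] - theta[j])
        theta += dt * (omega + K * drive)  # Euler step
    return abs(np.exp(1j * theta).mean())

ring = [(i, (i + 1) % 50) for i in range(50)]
print(f"synchronisation level on a ring: r = {kuramoto_order(ring, 50):.2f}")
```

Swapping the ring for other edge lists is one way to probe the topology dependence the abstract refers to.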

Systems-level metabolic engineering of bacteria using genome-scale in silico models

Sang Yup Lee (Dept of Chemical & Biomolecular Engineering, Korea Advanced Institute of Science and Technology, Republic of Korea)

The recent availability of the complete genome sequences of numerous organisms is making it possible to reconstruct in silico genome-scale metabolic models. Metabolic engineering aims at the purposeful modification of cellular and metabolic networks in order to achieve several goals, including enhanced production of desired products, production of novel bioproducts, and broadening the substrate utilization range, among others. In this lecture, I will present the results obtained in my group regarding the development of new metabolic engineering strategies for the production of primary and secondary metabolites using in silico genome-scale metabolic simulation coupled with omics studies.
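
The abstract does not name a specific method, but a common workhorse behind genome-scale in silico metabolic simulation is flux balance analysis (FBA): maximise a target flux subject to steady-state mass balance. The sketch below uses an invented three-reaction toy network, not any real model from this work.

```python
# Toy flux balance analysis: maximise biomass flux subject to S v = 0.
import numpy as np
from scipy.optimize import linprog

# Stoichiometric matrix S (rows: metabolites A, B; columns: reactions
# r1: -> A, r2: A -> B, r3: B -> biomass). Steady state requires S v = 0.
S = np.array([[1, -1, 0],
              [0, 1, -1]])
bounds = [(0, 10), (0, None), (0, None)]   # uptake reaction r1 capped at 10

# maximise v3, i.e. minimise -v3, subject to mass balance and flux bounds
res = linprog(c=[0, 0, -1], A_eq=S, b_eq=[0, 0], bounds=bounds)
print("optimal fluxes:", res.x)            # expect all fluxes pinned at 10
```

In a genome-scale model, S simply has thousands of columns; gene knockouts and metabolic engineering strategies can be screened by adjusting the bounds.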

The colourful geometry of nature

Michael Barnsley (The Australian National University)

Arrays of data from biology, astronomy, physics, meteorology, etc. often display complex geometries of form and colour which are hard to quantify. This lecture will showcase a new fractal-based method for identifying regularities in such data: a coupled chaos game is used to construct mappings between intricate structures which appear to be different but are inherently related. The method is simple to apply in diverse situations.
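
The chaos game itself is simple to state: iterating a randomly chosen contraction drives a point onto a fractal attractor. A minimal sketch of the ordinary (uncoupled) version follows; the coupled construction used in the talk to map between structures is not reproduced here.

```python
# Ordinary chaos game: repeatedly move halfway toward a randomly chosen
# vertex of a triangle; the visited points fill the Sierpinski triangle.
import random

vertices = [(0.0, 0.0), (1.0, 0.0), (0.5, 0.866)]
x, y = 0.3, 0.3
points = []
random.seed(1)
for i in range(10000):
    vx, vy = random.choice(vertices)      # pick one of three contraction maps
    x, y = (x + vx) / 2.0, (y + vy) / 2.0
    if i > 100:                           # discard the transient before recording
        points.append((x, y))
print(f"{len(points)} points on the Sierpinski attractor, e.g. {points[0]}")
```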

Anomalous Diffusion

A Fractional Cable Equation for Anomalous Electrodiffusion in Nerve Cells

Bruce Henry (The University of New South Wales), Trevor Langlands (The University of New South Wales) and Susan Wearne (Mount Sinai School of Medicine)

We describe a fractional cable equation to model anomalously slow electrodiffusion of ions in nerve cells. Fundamental solutions are presented and results for firing rates and voltage attenuation are obtained in terms of the anomalous diffusion parameters. A particular application to model the passive propagation of a postsynaptic potential along a spiny dendrite is described.
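
For reference, a representative dimensionless form of such a fractional cable equation (reconstructed from the related literature; notation may differ from the authors' own and should be checked against the paper) is

    \frac{\partial V}{\partial T} = {}_{0}D_{T}^{1-\gamma_1}\!\left(\frac{\partial^2 V}{\partial X^2}\right) - \mu^2\, {}_{0}D_{T}^{1-\gamma_2} V

where V is the membrane potential, \gamma_1, \gamma_2 \in (0, 1] are the anomalous diffusion exponents, \mu^2 collects the membrane parameters, and {}_{0}D_{T}^{1-\gamma} denotes the Riemann-Liouville fractional derivative; \gamma_1 = \gamma_2 = 1 recovers the standard cable equation.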

Anomalous diffusion with linear reaction dynamics

Trevor Langlands (The University of New South Wales), Bruce Henry (The University of New South Wales) and Susan Wearne (Mount Sinai School of Medicine)

We consider anomalously sub-diffusing species, modelled at the mesoscopic level using continuous time random walks, with the model extended to include linear reaction dynamics. If a constant proportion of walkers is added or removed instantaneously at the start of each step, then the long time asymptotic limit yields a fractional reaction-diffusion equation with a fractional order temporal derivative operating on both the standard diffusion term and a linear reaction kinetics term. If the walkers are added or removed at a constant per capita rate during the waiting time between steps, then the long time asymptotic limit has a standard linear reaction kinetics term but a fractional order temporal derivative operating on a non-standard diffusion term.
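
In symbols, one consistent reading of the two limits described above (with u the walker density, k the rate constant, \alpha the anomalous exponent and {}_{0}D_{t}^{1-\alpha} the Riemann-Liouville derivative; the exact forms should be checked against the paper) is

    \frac{\partial u}{\partial t} = {}_{0}D_{t}^{1-\alpha}\!\left[ D_\alpha \frac{\partial^2 u}{\partial x^2} - k\,u \right]

for instantaneous addition or removal at each step, and

    \frac{\partial u}{\partial t} = D_\alpha \frac{\partial^2}{\partial x^2}\!\left[ e^{-kt}\, {}_{0}D_{t}^{1-\alpha}\!\left( e^{kt} u \right) \right] - k\,u

for addition or removal at a constant per capita rate during the waiting times.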

We also consider a two-particle variant of the latter model, where the first particle type is transformed into the second particle type when it is removed. Both particle types still undergo anomalous sub-diffusion. Here the equation for the first particle type remains the same as in the single-particle case, but the equation for the second particle type differs in form from that of the first.

Results from the single particle cases are compared with a phenomenological model with standard linear reaction kinetics and a fractional order temporal derivative operating on a standard diffusion term. We also compare the results with Monte Carlo simulations.

Fractional reaction diffusion along flow lines

Boris Baeumer (The University of Otago, New Zealand)

We model anomalous dispersion using power-law jump distributions; in the scaling limit this leads to a fractional diffusion equation. Adding a reaction term allows us to model the invasion of species or the spread of diseases. We then generalise the dispersion model to account for regional features, using an old idea of Bochner's, namely subordination. This leads to a PDE involving fractional powers of the generator of the underlying flow group.
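
As an illustration (the specific form is assumed here, not taken from the abstract), a one-dimensional fractional dispersion equation with a Fisher-KPP reaction term reads

    \frac{\partial u}{\partial t} = -v\,\frac{\partial u}{\partial x} + D\,\frac{\partial^{\alpha} u}{\partial |x|^{\alpha}} + r\,u(1-u), \qquad 1 < \alpha \le 2,

where the fractional derivative \partial^{\alpha}/\partial|x|^{\alpha} arises as the scaling limit of the power-law jumps and \alpha = 2 recovers classical diffusion.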

Heat Conduction in Nonlinear Systems

Bambi Hu (Hong Kong Baptist University and University of Houston)

Heat conduction is an old yet important problem. Two hundred years after Fourier introduced the law bearing his name, a first-principles derivation of this law from statistical mechanics is still lacking. Worse still, the validity of this law in low dimensions, and the necessary and sufficient conditions for its validity, are still far from clear. In this talk I'll review recent work on this subject. I'll also report our recent work on asymmetric heat conduction in nonlinear systems. The study of heat conduction is not only of theoretical interest but also of practical interest: I'll discuss various designs of thermal rectifiers and thermal diodes.

Nonlocal heat transport in laser-produced plasmas

Frank Detering (The Australian National University)

Deviations from classical, local heat diffusion are typical in laser-produced plasmas due to the sharp temperature gradients and the large collisional mean free paths. The heating process itself leads to non-Maxwellian particle velocity distributions, which cause further departures from classical collisional transport. A practical approximation is often needed to account for these deviations in the large-scale hydrodynamic simulation codes commonly used to study entire experiments. In the present study we attempt to examine and justify different nonlocal transport approximations for these applications. Using Fokker-Planck and particle-in-cell (PIC) simulations of the evolution of the electron distribution, a single hot spot has been studied in a laser-produced plasma. A practical formula for the nonlocal heat flux has been derived as a generalization of the nonlocal linear approach [V. Yu. Bychenkov et al., Phys. Rev. Lett. 75, 4405 (1995)] and tested in simulations. The electron distribution function is studied at different spatial locations with respect to the localized heating source, and is found to depend on the interplay between the effects of collisional heating and nonlocal transport. Significant non-Maxwellian high-energy tails of the electron distribution function are found, which may have a strong impact on the behaviour of other important processes in non-uniformly heated laser plasmas [O. V. Batishchev, V. Yu. Bychenkov, F. Detering, W. Rozmus, R. Sydora, C. E. Capjack, and V. N. Novikov, Phys. Plasmas 9, 2302 (2002)].

Simulation study on heat conduction and beyond

Nobuyasu Ito (The University of Tokyo)

The relation between microscopic dynamics and macroscopic heat conduction is one of the most fundamental problems in nonequilibrium statistical physics and nonlinear science. Recent computer simulations have finally succeeded in reproducing Fourier's law in three-dimensional systems [1-3], and this has made theoretical and simulational studies of nonlinear nonequilibrium structures and phenomena possible [4]. On the other hand, some nonlinear lattices turn out to show anomalous heat conduction at mesoscopic scales even when they are three-dimensional [5]. In this talk, such heat-conduction and transport simulations will be presented, together with a future perspective.

References

[1] T. Shimada, T. Murakami, S. Yukawa, K. Saito and N. Ito, J. Phys. Soc. Jpn. 69 (2000) 3150.

[2] T. Murakami, T. Shimada, S. Yukawa and N. Ito, J. Phys. Soc. Jpn. 72 (2003) 1049.

[3] F. Ogushi, S. Yukawa and N. Ito, J. Phys. Soc. Jpn. 74 (2005) 827.

[4] F. Ogushi, S. Yukawa and N. Ito, J. Phys. Soc. Jpn. 75 (2006) 073001.

[5] H. Shiba, S. Yukawa and N. Ito, J. Phys. Soc. Jpn. 75 (2006) 103001.

Business and Economics

'Discovering' Small Worlds in Potentially Biased Networks: A Methodological Critique

Sam MacAulay (ARC Centre for Complex Systems, The University of Queensland), John Steen (ARC Centre for Complex Systems, The University of Queensland) and Tim Kastelle (ARC Centre for Complex Systems, The University of Queensland)

This paper argues that much of the empirical work identifying small-world properties within social and economic systems is, unlike that studying physical networks, potentially misestimating the degree of 'small-world-ness' within these systems. It is conjectured that this misestimation largely stems from a systematic bias in methodological design, which in turn results from the disparity in complexity and cost between collecting systemic network data (e.g. both strong and weak ties) for socio-economic networks (e.g. interpersonal collaboration) and for physical networks (e.g. the internet). This disparity exists because the socio-economic literature is primarily aimed at examining the structural properties of socially embedded interdependencies between agents, rather than physical interdependencies (e.g. neural, electricity, electronic communication and transportation networks) whose physical nature in many ways makes them inherently less complex and costly to identify and analyse.

Within socio-economic networks, the interdependencies that are least costly to identify and measure are strong ties (e.g. membership of boards, alliance members, research collaborations), due to their relatively stable, structured and systemic nature (Montgomery, 1994). Conversely, weak interdependencies within a network (weak ties, in sociological parlance) are more problematic to identify, due to their inherently dynamic, quasi-random and subtle nature (Granovetter, 1973). Therefore, when carrying out methodological design, researchers are much more likely to 'satisfice' by selecting a data collection model biased towards the collection of 'strong tie' data at the expense of 'weak tie' data. Whilst the ultimate effect of this bias on the small-world statistic is likely to be contingent on a range of factors (e.g. the distribution of agent degree centrality within the sample and the population), as weak ties are known to be more likely to bridge local clusters within networks of diverse agents (e.g. Burt, 2004; Uzzi and Spiro, 2005), it can be hypothesised that the identified bias will be systematic. This argument is supported by a meta-analysis of methodology within the field and by reference to a number of recent articles that examine the sensitivity of the small-world statistic to missing nodes (Deng, Zhao and Li, 2007), weight randomization of network ties (Li, Fan, Wang, Li, Wu and Di, 2007) and changes in the proportion of weak-to-strong ties within a network (Shi, Adamic and Strauss, 2007). The implications for the study of socio-economic networks are then discussed.
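
To make the quantity under critique concrete, one common formalisation of 'small-world-ness' is the coefficient sigma = (C/C_rand)/(L/L_rand), which compares clustering C and mean path length L against a same-size random graph. The Python sketch below is a hypothetical illustration (not from the paper; the network and parameters are invented) of how discarding edges at random, a crude proxy for unobserved weak ties, shifts that statistic.

    # Hypothetical illustration: sensitivity of the small-world coefficient
    # sigma = (C/C_rand)/(L/L_rand) to randomly missing "weak tie" edges.
    import random
    import networkx as nx

    def sigma(G):
        """Small-world coefficient of G against an equal-size random graph."""
        Gc = G.subgraph(max(nx.connected_components(G), key=len))
        R = nx.gnm_random_graph(Gc.number_of_nodes(), Gc.number_of_edges(), seed=1)
        Rc = R.subgraph(max(nx.connected_components(R), key=len))
        C, L = nx.average_clustering(Gc), nx.average_shortest_path_length(Gc)
        Cr, Lr = nx.average_clustering(Rc), nx.average_shortest_path_length(Rc)
        return (C / Cr) / (L / Lr)

    random.seed(2)
    G = nx.watts_strogatz_graph(200, 8, 0.1, seed=2)  # stand-in social network
    print("full network:  sigma =", round(sigma(G), 2))

    # Drop 30% of edges at random, a crude proxy for unobserved weak ties.
    H = G.copy()
    H.remove_edges_from(random.sample(list(H.edges()), int(0.3 * H.number_of_edges())))
    print("biased sample: sigma =", round(sigma(H), 2))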

A Review of Design Approaches Within Schumpeterian Economic Simulations

Craig Lynch (Macquarie University)

Schumpeter's Theory of Economic Development has been an object of attention and in-depth analysis since it was introduced nearly a century ago. Despite it being a representation of Schumpeter's reason and logic, supported by some empirical research, it is in essence a clearly defined dynamic system that ties together almost all components and behaviours within a typical economy. Over the past few decades the means to simulate and explore this theory have emerged through the availability of several computing methodologies and tools. As a consequence a number of models have been constructed that focus on some aspect of the theory, and the term 'Schumpeterian' has become a much-used adjective within this arena.

The question of whether these models adequately simulate the complete theory, given its breadth and its reliance upon the combination of innovation, entrepreneurial action, the supply of credit, and interactions amongst all economic agents, is the foundation of this paper. A range of models, with varying methodological approaches, are analysed to determine how closely they encompass all aspects of Schumpeter's theory. The implicit design within each model is reviewed for the degree to which transactions generated from individual behaviour can be aggregated at an industry or economy level, and a first approximation of an overall design framework is distilled from this exercise. Of particular interest is the ability of entrepreneurs and firms to make strategic choices and succeed in improving or sustaining profit accordingly. The paper also contrasts agent-based modeling with a recent adaptation of computable general equilibrium approaches in the simulation of an economy operating under Schumpeter's behavioural rules.

ACE modelling: does size matter?

Paul Davis (Macquarie University)

Agent-based Computational Economics (ACE) is a relative newcomer to the field of economic research. ACE modelling involves the creation of computer-based economic models comprising heterogeneous agents and environmental rules. Models are executed, enabling the modeller to study the emergent (macroeconomic) behaviour resulting from complex inter-agent dynamics (microeconomics).

This presentation is part of a program of work within a PhD focused on Schumpeterian dynamics, in particular economic development and the emergence of cyclicality. The question considered here is whether the size of a model, reflected in the number of agents, impacts the modelling processes of design, execution and analysis; or, stated colloquially, ACE modelling: does size matter?

The methodological analysis is performed against two models: a model with no fixed size, and an extant model. The questions asked are aligned with the model development life-cycle of design, execution and analysis, and include: Is model transparency influenced by agent quantity? What is the impact of parallel or sequential processing? How does the modeller determine whether a behaviour is emergent or pre-programmed?

Findings show that the capability of ACE modelling to embrace and understand complexity enables models with large agent populations to be utilised effectively. This has the potential to re-establish the linkages between microeconomic behaviour and macroeconomic observation, as well as to connect the results of ACE modelling with real-world economic analysis.

Agent-based design considerations to ensure behaviour is emergent: A labour market simulation using RePast

Paul Davis (Macquarie University)

Joseph Schumpeter, in his 1912 book The Theory of Economic Development, argued that labour and land are the fundamental building blocks of economies, with entrepreneurial utilisation of capital to fund innovations driving economic growth in a "creative destruction" manner: new business lines destroy the old, resulting in the appearance of business cycles. A question has arisen during a program of work within a PhD focused on Schumpeterian dynamics, economic development and the emergence of cyclicality: how does one model a Schumpeterian economy and ensure that, if the model generates cycles, they are emergent (not pre-programmed) phenomena?

This presentation is a PhD progress report examining the impact that ensuring emergence has on the design and implementation of an agent-based model under construction. The model is being developed using RePast (Java). Currently it comprises two types of agents, Persons and Firms, and a market mechanism enabling agents to trade labour. As production, represented as a function of the consumption of labour, has not yet been introduced, the model does not yet display cyclicality; however, the work to date has demonstrated to the researcher that:

  1. object-oriented methodologies provide atomic modelling at the cost of added complexity
  2. sequential "event-based" modelling requires effective use of randomisation (a minimal sketch of such an event loop follows this list)
  3. hard-coded limits (e.g. population size) increase the risk that emergent behaviour may not appear
  4. careful design of the factors influencing decision variables is required, to ensure that their interaction does not merely produce outcomes that could have been predicted in advance
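
The following Python sketch is illustrative only (the author's model is written in RePast/Java, and all names and parameters here are invented); it shows the kind of randomised sequential event loop the second point refers to, with Persons and Firms trading labour:

    import random

    class Person:
        def __init__(self, wage_ask):
            self.wage_ask = wage_ask
            self.employer = None

    class Firm:
        def __init__(self, wage_offer, vacancies):
            self.wage_offer = wage_offer
            self.vacancies = vacancies

    def labour_market_step(people, firms, rng):
        # Shuffle activation order each tick so no agent is structurally
        # favoured by a fixed sequential schedule.
        order = people[:]
        rng.shuffle(order)
        for person in order:
            if person.employer is not None:
                continue
            hiring = [f for f in firms
                      if f.vacancies > 0 and f.wage_offer >= person.wage_ask]
            if hiring:
                firm = rng.choice(hiring)
                firm.vacancies -= 1
                person.employer = firm

    rng = random.Random(42)
    people = [Person(rng.uniform(8, 12)) for _ in range(100)]
    firms = [Firm(rng.uniform(9, 11), vacancies=5) for _ in range(10)]
    for tick in range(10):
        labour_market_step(people, firms, rng)
    print("employed:", sum(p.employer is not None for p in people))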

Automatic Extraction and Modelling of Human Knowledge Networks from Natural Language using a Complex Systems Approach

Andrew Smith (Leximancer), Michael Humphreys (School of Psychology, The University of Queensland) and Bettina Cornwell (School of Business, The University of Queensland)

Some researchers have recognised that the knowledge contained in unstructured natural language is complex. Humans appear to be capable of maintaining contradictory beliefs and definitions, reinterpreting identities and roles subject to context and goal, using vocabulary and metaphor creatively, and modifying entities and relationships with almost infinite degrees of variation. Since language reflects to some degree our knowledge of the complex world around us, and since the human organism brings its own internal cognitive complexity and dynamic requirements, the conclusion that knowledge expressed as natural language is a complex system might appear unavoidable.

And yet, in disciplines such as Knowledge Management, Cataloging, Semantic Web, and Intelligence Extraction, it is common for fixed ontologies to be designed, and subsequent language data to be forced into alignment with the specified ontological rules. This mechanistic approach is analogous to clear-felling a rain forest without understanding what we are destroying.

This paper will present two case studies which use the new text analysis system Leximancer to extract knowledge networks from natural language collections in an emergent fashion. Leximancer's recursive algorithms satisfy the definition of a complex system: while the algorithmic rules which define the interactions between words and between concepts are fairly simple, the resulting models capture a high degree of the complexity of the data. Cross-validation of emergent patterns is an important aspect of these studies.

The first study will present a new method for Brand Concept Mapping. This method involves prompting individual subjects with minimal stimulus material designed to maximise the thoroughness of their response. The premise of this study is to discover the internal cognitive map that operates when potential consumers make a product choice on their own. For this reason, and to avoid the experiment being confounded by social interaction effects, knowledge elicitation was conducted as an isolated essay-writing task rather than as an interview or focus group. Data was collected using several different stimulus materials so that the sensitivity of the method to manipulation of the experimental conditions could be assessed.

The data was analysed using Leximancer to automatically extract aggregate concept maps from the material. Maps were extracted from different samples of the data, both within and between groups, to test for sampling validity and sensitivity.

The second case study will employ the dynamic (temporal) mapping feature of Leximancer to examine the dynamics of complexity in a news feed. The dynamic map presents as an animated concept map which shows concepts and themes emerging and fading. The map shows periods of equilibrium where there is a stable attractor and also periods of instability. It is hypothesized that there may be both areas of chaos and of transitional complexity in the unstable regions, but this is the subject of ongoing research.

Dynamic conceptual mapping will be useful for visually monitoring the shared knowledge state in several key situations: trading and financial services (using news wire, email, and corporate reporting data); organisational and project team communications (with email, chat text, and meeting minutes); extended and distributed planning meetings (using multi-channel text transcription); ongoing investigations (using statements, interview transcripts, and investigator notes); and market or opinion research.

Breaching Walras's Law: a first step to modelling endogenous money

Steve Keen (School of Economics & Finance, The University of Western Sydney)

Walras's Law has been a mainstay of general equilibrium mathematical modelling in economics for over a century, and much of the post-WWII development of macroeconomics was undertaken to try to make Keynes's arguments about an unemployment equilibrium consistent with this Law.

In this paper I argue that Walras's Law is invalidated in a truly monetary framework, and present a simple linear model of a monetary exchange economy in which Walras's "Law" is shown only to apply in a situation of zero growth. From a complex systems perspective, this means that the full nonlinear dynamics of a monetary production economy are dissipative, not conservative.
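
For reference, Walras's Law in its standard textbook form (the paper's own notation is not given in the abstract) states that the value of aggregate excess demand vanishes identically at any price vector p:

    \sum_{i=1}^{n} p_{i}\, z_{i}(p) = 0,

where z_i is the excess demand for good i, so that if n-1 markets clear the n-th must clear as well. The paper's claim is that, once money and growth are modelled explicitly, this identity holds only in the zero-growth case.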

Combining System Dynamics and Choice Modelling to Simulate Demand Effects of Integrated Customer-Centric Marketing and Revenue Management

Christine Mathies (The University of New South Wales)

Stated preference choice experiments were conducted in the airline and hotel industries to examine how the current practice of simultaneous yet unintegrated customer-centric marketing and revenue management affects customers' choices. An unintegrated approach was found to create perceived conflicts and unfairness for customers, which manifest in their purchase choices. The choice analyses form the basis for a series of simulations of how an integrated approach to customer-centric marketing (CCM) and revenue management (RM) can avoid conflicts and alter customers' choices. Firstly, the preference estimates from the choice experiments were used for basic what-if predictions of choice probabilities when one or more attributes are changed. The findings suggest that service firms can achieve substantial revenue increases. A more sophisticated approach to predicting demand from a basic integrated CCM-RM system, however, combines system dynamics with discrete choice modelling to help airlines and hotels decide on the best strategy for incorporating CCM and RM. The simulation models for each industry include the market, represented by three market segments with distinct preference structures, and the providers, represented by the focal airline and hotel respectively plus three competitors. Preference estimates are employed to specify the decision rules of the simulation model. While some existing research brings together conjoint analysis and simulation models with the objective of adding precision to the model formulation, this paper extends the idea to stated choice analysis and compares the simulation outcomes from the two techniques.
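
The what-if predictions of choice probabilities follow from the estimated utilities. Assuming the standard multinomial logit form commonly used for stated choice experiments (the abstract does not state the exact specification), the probability that respondent n chooses alternative i from choice set C_n is

    P_{n}(i) = \frac{\exp(V_{ni})}{\sum_{j \in C_{n}} \exp(V_{nj})}, \qquad V_{ni} = \beta^{\top} x_{ni},

so changing an attribute vector x_{ni} (e.g. a fare or a cancellation condition) changes the predicted shares.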

Introduction to an Agent-Based Model of Development Processes in Tanzania

Brett Parris (Dept of Econometrics & Business Statistics, Monash University; and World Vision Australia)

Policy makers increasingly require models that can integrate the socio-economic, political, epidemiological and environmental dimensions of development. Agent-based models can achieve this integration using interacting agents in computer simulations to represent individuals, households, firms, governments and land types. This paper presents progress on the development of an agent-based model of a Tanzanian region developed using RepastJ. It uses a range of data sources, including the national household survey, World Bank development data and a Social Accounting Matrix (SAM).

People are modelled as individuals, who are born, grow up and eventually die. They must remain healthy, and need to be educated as children. They may marry, have children, divorce, get sick and recover. They are related to other specific individuals and are members of households. These households may be rural or urban, very poor, poor or non-poor and may have access to transport, and other household amenities such as sanitation, electricity, and telecommunications. The characteristics of the households determine their relative power in bargaining over the prices of the goods and services. Richer, healthier households, with higher levels of education, and good access to fertile land, transport and information, have greater bargaining power than poorer, more isolated households on marginal land. The model enables some of the dynamics of poverty and wealth to be explored.

Mean Bad Birds versus Kind Friendly Chickens: Group Selection and the Evolution of Cooperation

Ian Wilkinson (The University of New South Wales), Dan Ladley (The University of Leeds, United Kingdom) and Louise Young (The University of Technology, Sydney)

The development of collaborative relations and networks among and within organisations is an important source of competitive advantage for firms, industries, regions and nations. The problem is that firms often begin from a state of adversarial relations, and it is difficult to turn these into more collaborative forms. Various theories of the emergence of cooperation have been proposed based on kinship ties and on repeated interaction and reciprocity, but they cannot account for the emergence of large-scale cooperation among people and organisations that are 'strangers', which characterises many types of business cooperation. Group selection mechanisms offer a way forward. A renewed focus on group selection has been championed by leading evolutionary biologists such as Edward O. Wilson at Harvard and David Sloan Wilson, who see it as opening up 'a whole new ballgame' in research. Group selection mechanisms can provide another type of explanation for, and a means of facilitating, the emergence of cooperative behaviour in social and economic systems, including business relations and networks. We examine the impact of group selection on the outcomes of mixed-motive game simulations, revisiting research done in the 80s and 90s. We demonstrate that group selection evolves fitter (better performing) strategies than does individual selection, because individual strategies as well as the mixes of interacting strategies in groups co-evolve. Lastly, given the focus on producing high-performing researchers in university departments as a result of the RQF funding scheme, the relevance of this research to that context has not gone unnoticed!
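
A minimal Python sketch of the group selection mechanism (not the authors' code; the game, strategy space and parameters are assumed for illustration): whole groups reproduce in proportion to their total payoff in a repeated prisoner's dilemma, which favours groups of cooperators even though defection pays within any one interaction.

    import random

    PAYOFF = {(1, 1): 3, (1, 0): 0, (0, 1): 5, (0, 0): 1}  # (me, other) -> my payoff

    def play(p1, p2, rounds, rng):
        """Repeated game between two 'cooperate with probability p' strategies."""
        s1 = s2 = 0
        for _ in range(rounds):
            a, b = rng.random() < p1, rng.random() < p2
            s1 += PAYOFF[(a, b)]
            s2 += PAYOFF[(b, a)]
        return s1, s2

    def group_fitness(group, rng):
        total = 0
        for i in range(len(group)):
            for j in range(i + 1, len(group)):
                s1, s2 = play(group[i], group[j], 10, rng)
                total += s1 + s2
        return total

    rng = random.Random(0)
    groups = [[rng.random() for _ in range(6)] for _ in range(20)]
    for gen in range(200):
        # Group selection: the better half of groups survives and spawns
        # mutated offspring groups, so cooperative mixes co-evolve.
        survivors = sorted(groups, key=lambda g: group_fitness(g, rng),
                           reverse=True)[:10]
        groups = survivors + [[min(1.0, max(0.0, p + rng.gauss(0, 0.05)))
                               for p in g] for g in survivors]
    print("mean cooperation:", sum(sum(g) for g in groups) / (20 * 6))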

Minimalism and model-building: an assured model of the exchanges between consumers, retailers and manufacturers

David Midgley (INSEAD, France), Robert Marks (The Australian Graduate School of Management), Daniel Klapper (The University of Frankfurt, Germany) and Dinesh Kunchamwar (Barclays Capital, Singapore)

Model assurance combines ideas from software proof, destructive testing and empirical validation. Previously we raised the philosophical issue of whether social scientists should take a traditional scientific approach to building agent-based models or should prefer a minimalist approach. Taking our own advice, we developed a minimalist version of our model, and here we present the results of assuring it. The specific steps taken to assure the model were:

  1. Verification: (a) two external experts inspected the RePast code to discover whether it follows the model specification correctly; (b) we used the Genetic Algorithm as an optimizer to test the bounds of the model by seeking implausible results.
  2. Validation: (a) a real supermarket chain provided two databases which we use to validate the model; (b) we follow a hybrid approach, using one database to calibrate consumer agents at the micro-level and then fitting the retailer and manufacturer models to the other database at the macro-level, again using the Genetic Algorithm.

We report the results of this model assurance exercise and use it to define 'minimalism' more tightly, arguing that it is more restrictive than 'parsimony'. We also extend this debate by discussing the practical barriers that currently prevent ABMs from reaching their full potential in the social sciences, including the costs of software proof and the lack of data to validate many aspects of the agents.

Modeling innovation changes in business networks

Sharon Purchase (The University of Western Australia), Doina Olaru (The University of Western Australia) and Sara Denize (The University of Western Sydney)

Actors' choices concerning their communication processes (intensity and richness), cooperation, and knowledge sharing with other business actors change existing network structures, resulting in improved overall 'network innovation' (Möller and Svahn, 2003). The extant literature establishes the need to elaborate the interplay between these actor choices and performance, in particular via longitudinal studies (Ferguson et al., 2005; Walter et al., 2006). However, it is difficult to secure cooperation for such research, and real-world cases do not allow the researcher to extract the detail necessary to capture the most complex interactions. This research uses simulation to investigate how changes in communication processes produce variations in the innovative performance of the network.

The 'outcome' variable in this conceptualisation is the 'network innovation' that results from the communication processes developing cooperation, knowledge sharing, learning and adaptation in business networks (Hummond, 2000). Specifically, actors modify the strength of their connections with other actors by altering the amount and type of information they share with them. By changing input variables such as communication intensity and richness within similar network structures, emerging patterns of network innovation are examined. Numerous simulations of network structures are assessed for a particular network size (a parameter of the model).

The complexity of the problem, the nonlinear effects arising from structural asymmetry, the nature of the communication, and the self-organising features of social networks make fuzzy sets an attractive paradigm to use within the simulation process (Robinson, 2003; Bonabeau, 2002). The model uses seven inputs, each with 3-5 linguistic adjectives, to capture flexibility in learning and adaptation, background/environment conditions, communication and network position, and the fuzzy-rule system models ways in which dynamics in network structure and relations can trigger the development of new ideas within the network.

Both academic and managerial implications emerge from the modelling exercise: understanding how individuals take advantage of clustering and 'short-cuts' in the network to improve their central or broker ('actor-in-the-middle') position is required in order to influence or moderate both radical and marginal changes in innovation in business networks.

References

Bonabeau, E. (2002) Agent-based modeling: Methods and techniques for simulating human systems, PNAS, 99 (Suppl. 3), 7280-7287.

Ferguson, R. J., Paulin, M., Möslien, K. and Müller, C. (2005) Relational governance, communication and the performance of biotechnology partnerships, Journal of Small Business and Enterprise Development, 12(3), 395-408.

Hummond (2000) Utility and Dynamic Social Networks, Social Networks, 22, 221-249.

Möller, K. and Svahn, S. (2003) Managing Strategic Nets: A capability perspective, Marketing Theory, 3(2), 209-234.

Robinson, V.B. (2003) A Perspective on the Fundamentals of Fuzzy Sets and their Use in Geographic Information Systems, Transactions in GIS, 7(1), 3-30.

Walter, A., Auer, M. and Ritter, T. (2006) The impact of network capabilities and entrepreneurial orientation on university spin-off performance, Journal of Business Venturing, 21, 541-567.

Performance metrics: Towards an uncertainty principle for organizations

Bill Lawless (Paine College)

I am currently working on two projects: the first more conceptual, applied to seven military Medical Department Research Centers and one Central Business University; the second more mathematical.

New organizational metrics: A quantum approach

A future goal of robot teams and agent-based models (ABMs) is to field systems of organizations based on first principles derived from their human counterparts. The failure of traditional organizational theory has forestalled that opportunity, but at the same time it has opened the way to innovative theories of organizations and change. Inspired by Bohr's and Heisenberg's ideas about interdependent uncertainty in the interaction between action and observation, which makes organizations bistable, we have begun to construct a theory of organizations based on uncertainty in energy level (resources) and belief/action consensus, leading to preliminary metrics of organizational performance that we have applied in field studies. Our goal in this project is to address the problem posed by organizations through the development of new theory, field tests of new metrics for organizations, and the development of quantum ABMs set within a social circuit as a building block for an organization. Should we be successful, our research would represent a fundamental departure from traditional observational methods of social science by forming the basis of a predictive science of organizations. We expect that replacing the traditional method of observation with a predictive science must account for when cognitive observations work and when they do not ("illusions"). Examples with mergers are provided and discussed.

An evolvable game theory: A bistable or quantum approach

A new approach to game theory has been designed to resolve two of its major problems: the arbitrariness of valuing cooperation above competition in determining social welfare, and the lack of interdependent uncertainty. The approach is to develop quantum or bistable agents and relationships. The quantum approach means that agents in relationships are more likely to be found in bistable states that correspond to their energy levels: the more complex, competitive, or conflictual the state, the greater the energy required. In our view, games are initialized, evolved to a state that solves a target problem, and then measured, consequently creating a measurement problem. In past research we resolved this measurement problem, which led to the development of metrics that have been applied to organizations in the field (we briefly illustrate an application to military Medical Department Research Centers). In this paper, we focus on modeling control in bistable close and market relationships to produce evolvable systems.

The Cost of Information Acquisition in a Supply Network of Rationally Bounded Negotiating Agents

Rodolfo Garcia-Flores (CSIRO Mathematical and Information Sciences), Nectarios Kontoleon (CSIRO Mathematical and Information Sciences), Rene Weiskircher (CSIRO Mathematical and Information Sciences) and Simon Dunstall (CSIRO Mathematical and Information Sciences)

Industrial problems that arise where responsibility is shared and knowledge is incomplete often do not yield clear-cut optimisation problems that can be solved "monolithically". A new paradigm for distributed problem solving is needed, requiring problem decomposition by independent entities that are able to optimise with local models and/or data which are concealed, inaccessible or incompatible, and which must learn or infer about their environment using the incomplete information available to them. Learning is often achieved through negotiation: two parties with some apparent conflict interact in a potentially opportunistic way to arrive at a mutually beneficial solution. Learning through negotiation implies bounded rationality, a limitation that is often disregarded in the optimisation literature; thus the actual cost of negotiation as a search process is seldom considered in optimisation studies. The aims of this paper are, firstly, to propose and test a computational agent-based model for explicitly assessing the cost of information acquisition in a network of economically motivated trading agents, allowing a quantitative trade-off against the bargaining gains that negotiation, as a search process, provides; and secondly, to assess the effect of different agent strategies and network arrangements (topologies) on the system's welfare.

The Emergence of New Markets

Ella Reeks (The University of Queensland)

Markets hold a central place in economic theory, and their operation provides ample opportunities to explore the principle of emergence. Whilst emergence has typically been explored as a phenomenon occurring within markets, this paper seeks to expand the agenda to the emergence of markets themselves. Specifically, the research examines the processes whereby the macro-system properties of a 'new market' emerge from the micro-interactions of entrepreneurial organisations seeking to exchange a unique good or service. New markets are characterised by the emergence of a distinctive and coherent set of diverse market rules, ranging from informal business practices, through trade-association-sponsored standards, to designed trading systems, known collectively as market institutions. The paper argues that the emergence of a new market depends upon early market participants creating, maintaining, and transforming exchange relationships that exhibit positive feedback attributes. The research focuses on the contractual interactions between buyers and sellers and identifies collaborative learning, joint problem-solving, mutual experimentation, and cooperative adaptation as necessary behaviours for the emergence of unique market rules and institutions. The paper utilises a case study of a real-world market, Australia's wind energy market, to assist in developing and illustrating the theoretical sketch of market emergence.

The Use of Genetic Algorithms for Modelling the Behaviour of Boundedly Rational Agents in Economic Environments: Some Theoretical and Computational Considerations

Janice Gaffney (The University of Adelaide), Charles Pearce (The University of Adelaide) and Scott Wheeler (Defence Science and Technology Organisation)

We consider a class of models of adaptive learning by a population of boundedly rational agents in an economic environment. The agents revise their decisions in accordance with a genetic algorithm, and the genetic algorithm gives the dynamics of the system. We discuss the recent paper of Wheeler et al., which employed a Markov chain framework to analyse such a system of heterogeneous agents interacting through a genetic algorithm in an economic context. The application of Markov chain theory is not immediate because the fitness function is state dependent. Wheeler et al. show that there is an intimate relation between grid spacing in the decision space and convergence behaviour in the cobweb model with boundedly rational agents and a genetic algorithm. In this paper we develop these ideas further and provide illustrative computer simulations.
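
The following Python sketch (parameter values and operators are assumed for illustration, not taken from the paper) shows the kind of system in question: a genetic algorithm over a discretised decision grid in a cobweb-style market, where fitness is state dependent because each agent's profit depends on the price determined by all agents' current outputs.

    import random

    GRID = [i * 0.25 for i in range(41)]      # quantity grid, spacing 0.25
    A, B, COST = 20.0, 0.1, 5.0               # inverse demand p = A - B*Q

    def profits(quantities):
        price = max(0.0, A - B * sum(quantities))
        return [price * q - COST * q for q in quantities]

    rng = random.Random(7)
    pop = [rng.choice(GRID) for _ in range(50)]
    for gen in range(300):
        fit = profits(pop)                    # state-dependent fitness
        lo = min(fit)
        weights = [f - lo + 1e-9 for f in fit]  # shift so weights are positive
        parents = rng.choices(pop, weights=weights, k=len(pop))
        # Mutation: with small probability, hop to another grid point.
        pop = [q if rng.random() > 0.05 else rng.choice(GRID) for q in parents]
    print("mean quantity:", sum(pop) / len(pop))
    # Benchmark where price equals marginal cost: Q* = (A - COST) / B.
    print("competitive aggregate benchmark:", (A - COST) / B)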

Using Kauffman

Ian Wilkinson (The University of New South Wales) and James Wiley (Victoria University of Wellington, New Zealand)

The aim of our work is to apply complex adaptive systems perspectives to the study of industrial market systems (IMSs). IMSs consist of interrelated organizations involved in creating and delivering products and services to end-users. The modeling effort has two interrelated but distinct purposes. The first is to help us better understand the processes that shape the creation and evolution of firms and networks in IMSs; this will provide a base both for predicting, and perhaps influencing, the evolution of industrial marketing systems. Secondly, the models may be used for optimizing purposes, to help us design better-performing market structures.

The specific objectives of the present research are to examine: (a) the processes by which structure evolves in IMSs; (b) the factors driving these processes; and (c) the conditions under which better-performing structures may evolve. The paper describes computer models capable of mimicking the dynamic processes of IMSs, drawing in particular on the NK models developed by Stuart Kauffman at the Santa Fe Institute [Kauffman 1992, 1995].
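
A minimal Python sketch of an NK fitness landscape and an adaptive walk on it (a generic illustration of Kauffman's construction under assumed parameters, not the authors' implementation):

    import random

    def make_nk(N, K, rng):
        """Each locus i depends on itself and K neighbours; fitness
        contributions are random values cached per neighbourhood state."""
        tables = [dict() for _ in range(N)]
        def fitness(genome):
            total = 0.0
            for i in range(N):
                key = tuple(genome[(i + j) % N] for j in range(K + 1))
                if key not in tables[i]:
                    tables[i][key] = rng.random()
                total += tables[i][key]
            return total / N
        return fitness

    rng = random.Random(1)
    N, K = 12, 3
    fitness = make_nk(N, K, rng)

    # Adaptive walk: accept single-bit flips that improve fitness, stopping
    # at a local optimum; higher K makes the landscape more rugged.
    genome = [rng.randint(0, 1) for _ in range(N)]
    improved = True
    while improved:
        improved = False
        for i in range(N):
            trial = genome[:]
            trial[i] ^= 1
            if fitness(trial) > fitness(genome):
                genome, improved = trial, True
    print("local optimum fitness:", round(fitness(genome), 3))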

Complex Systems Engineering

"So...," asks the Chief Engineer "What do I go do?"

Douglas Norman (The MITRE Corporation)

Acquiring large collections of evolvable, net-centric elements which can be endlessly fashioned and refashioned into chains of valued capabilities demands that we rethink some of our fundamental approaches to Systems Engineering. Complexity theory has taken center stage in this reformulation of Systems Engineering, offering the central themes of variation, environment shaping, selection, and adaptation. Yet the writings and discussions to date have focused mostly on the descriptive and comparative; what is often asked by those producing or acquiring systems is "what do I go do?" The author offers some heuristics which move us toward that goal.

While incomplete (for example, appropriate incentive structures and business models need to be discovered and become the accepted norms), these heuristics don't require the people who are using them to understand complex systems per se, or reinterpret their jobs from that point of view (although it can help). Rather, the heuristics come from operationalizing practical interpretations of the complex systems principles; thus they can be used on their own, and can result in "good enterprise citizen" behavior for a system which supplies composable elements for the agile assembly of new capabilities which extend beyond their system borders.

The heuristics are intended for the Chief Engineer or Program Manager who is charged with the design, development, and fielding of a particular system; and help place the system, or elements of it, into an enterprise context. These heuristics are:

  1. Focus on the fundamental unique value a system brings to the enterprise;
  2. Develop and offer simple, casual methods for clients (people or machines) to get access to, and interact with, a system's fundamental unique value;
  3. Use quick and continual feedback from the field user(s) to assess "realized value," suggest changes, and uncover opportunities. Note: this feedback can be direct or indirect, where direct feedback is "what they say" and indirect feedback is "what they use." Direct feedback tends to pass through many filters; indirect feedback tends to be purer, but more difficult to gather.

The switch of focus to an explicit understanding of one's position and offering(s) to the enterprise, along with attention to actual feedback along the dimension of realized operational value, can change the overall dynamic in a way which coheres with the goals of netcentricity and is informed by complexity.

A Self-Organising Sensing System for Structural Health Management

Nigel Hoschke (CSIRO Industrial Physics and The University of New South Wales) and Don Price (CSIRO Industrial Physics)

This paper describes a new approach to structural health monitoring and management (SHM) that aims to diagnose and respond to damage using the self-organization of a complex system of distributed sensors and processing cells. To develop and evaluate the approach, an experimental SHM test-bed system has been developed, with the aim of detecting and characterising the damage from high-velocity impacts such as those due to micrometeoroids on a space vehicle. An important new feature of the system is an ability to support mobile (robotic) agents that can roam the exterior surface of the test-bed, obtaining additional damage information and providing a crude repair capability. The focus of this paper is the development of a self-organised approach to the operation of such a robotic agent, for which it obtains local information by direct communication with the fixed agents embedded in the underlying structure. A short video clip showing the operation of the experimental test-bed will be shown.

(*) This paper is an updated version of a paper presented at the 10th International Conference on Knowledge-Based & Intelligent Information & Engineering Systems (KES2006), Bournemouth, UK, 9-11 October 2006.

Architecture Characteristics Required to Support an Evolvable Model Based Systems Engineering Environment

Christian Ross and Peter Campbell (Defence and Systems Institute, The University of South Australia)

Classical Systems Engineering methods, based on working from fixed requirements towards a final product, have been shown to be inadequate for large present-day complex systems of systems. Such projects are characterized by being built from components sourced from different environments, which may also have been originally intended for a slightly different user group; by long-term life cycles during which requirements and operating environments are likely to change; and by a degree of sensitivity to the behavior of people both within and external to the development project. As a result, any useful Model Based Systems Engineering environment must be applicable across a discontinuous domain and must have the scalability and flexibility to evolve with the developing project. Scalability can be achieved with an architecture able to incorporate new versions of executable sub-models into the overall system model as understanding of the project increases with time, while flexibility requires the architecture to adapt to changes in conceptual scope over time. This paper describes the characteristics a supporting architecture must have to provide systems engineers with the flexible system of tools and models needed for Model Based Systems Engineering to be applied successfully to the design and development of complex system-of-systems projects.

Architecture Trade-Off Analysis and Multi-Objective Optimization Strategies

Lars Grunske (The University of Queensland)

Architecture trade-off analysis methods are appropriate techniques to evaluate design decisions and design alternatives with respect to conflicting quality requirements. However, the identification of good design alternatives is a time consuming task, which is currently performed manually. To automate this task, this presentation proposes to use evolutionary algorithms and multi-objective optimization strategies based on architecture refactorings to identify a sufficient set of design alternatives. This approach will reduce development costs and improve the quality of the final system, because an automated and systematic search will identify more and better design alternatives.
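
As an illustration of the multi-objective side of such a search (a generic sketch; the presentation's actual algorithms and objectives are not specified in the abstract), the selection step must retain the set of design alternatives that are not dominated on any objective:

    def dominates(a, b):
        """a dominates b if a is no worse in every objective and strictly
        better in at least one (all objectives minimised, e.g. cost, latency)."""
        return (all(x <= y for x, y in zip(a, b)) and
                any(x < y for x, y in zip(a, b)))

    def pareto_front(alternatives):
        return [a for a in alternatives
                if not any(dominates(b, a) for b in alternatives if b != a)]

    # (cost, response_time) for hypothetical architecture variants
    variants = [(10, 5.0), (12, 3.5), (8, 9.0), (12, 4.0), (15, 3.4), (9, 9.5)]
    print(pareto_front(variants))  # -> [(10, 5.0), (12, 3.5), (8, 9.0), (15, 3.4)]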

Designing complex interactions - a people centred perspective

Peter Johnson (University of Bath, United Kingdom)

Abstract not available at time of printing.

Evaluation of Conceptual Models for Agent Based Representation of Behaviour in a Simulation of the Capability Development Process

Kathy Darzanos (Defence and Systems Institute, The University of South Australia), Peter Campbell (Defence and Systems Institute, The University of South Australia) and Stephen Cook (Defence and Systems Institute, The University of South Australia)

The Defence Capability Development Manual (DCDM) 2006 describes in detail the Capability Development (CD) processes performed by the Australian Defence Organisation (ADO). It also recognizes that there is a constant need for adaptation and modification of any process-driven approach in order to respond to the realities of working within the Defence environment, where changes in schedule, funding, personnel, requirements and technology frequently occur during the life of a capability assessment. UniSA is working with DSTO to build a simulation tool which, it is hoped, will lead to a better understanding of the dynamics which occur during the course of any CD program. We are building an agent-based simulation using the open source toolkit Repast and have decided on an initial set of agents for implementation; this project is described in further detail in a paper to be presented at SimTect 2007. This paper presents an evaluation of several conceptual models for agent representation suitable for simulating segmented workflow processes within the Capability Development Executive (CDE). The three conceptual models reported on are the Belief-Desire-Intention (BDI) model, the Parallel-rooted, Ordered, Slip-stack, Hierarchical (POSH) model, and a directly heuristic agent implementation based on an approach similar to the Framework for Addressing Cooperative Extended Transactions (FACET).

Exploiting CAS as a 'Force Multiplier' - Its Application to Policy, Acquisition, Assessment and Operational Employment

Patrick Beautement (QinetiQ, United Kingdom)

All enterprises form networks of interactions and interdependencies (which extend well beyond their traditional 'boundaries') and so are, de facto, CAS federations. As the unexpected is always to be expected, enterprises must, from the start, have an appropriate level of adaptability available to them to be employed at 'run-time' (active operations). This ability to be appropriately agile can only arise from a balanced implementation of CAS engineering mechanisms and techniques at Design, Assembly and Run-time (DART). This paper explains a complex systems engineering model called the "DART Framework" and how it could be employed by the Policy, Doctrine, Acquisition and Experimentation communities to enable agility to be available for decisive use during military operations.

The paper explains that, to operate purposefully and effectively in the real world, enterprises must be able to adapt, and so must be agile enough to generate novelty. As novelty cannot be defined a priori, it must be generated at run-time. Enterprises use novelty to generate 'option space', and already employ various aspects of CAS to do this through dynamic self-organisation and self-regulation, maintaining and sustaining their viability over time. Currently, however, these capabilities are not purposefully factored into policy, engineering, acquisition and assessment; indeed, the benefits of dynamic 'emergent' behaviour are poorly understood, and are often crushed because they are seen, wrongly, as damaging. This paper suggests suitable techniques for delivering systems that are able to adapt dynamically and appropriately. To do this, the paper discusses: how to understand what kind of flexibility needs to be available at run-time; the demands and benefits of agility and its systemic organisational aspects; how to understand what needs to be specified, and which complex systems engineering mechanisms are appropriate at which stage of DART; how to balance the ability to change at run-time against cost and risk, and how to justify the investments; how to achieve rapid experimentation which can inform evolutionary design; and how to engineer the necessary federations.

Exploration of non-reductionist models of service ecosystems

Peter Bruza (Queensland University of Technology), Kirsty Kitto and Alistair Barros

The value-proposition of service ecosystems is enabling services to be procured and traded outside their traditional custodial and governance boundaries into new and unforeseen markets. The wider setting of service ecosystems implies that aspects of service provisioning can be outsourced to third parties for new business opportunities. It is easy to imagine that service ecosystems may have an important role to play in transforming media, finance and government. Service Ecosystems are exemplars of 'highly contextual systems'. A contextual system depends in some way upon the behaviour of factors generally considered external to it, e.g., the environment, its history, the process of measuring the system. It has been convincingly argued from complex systems theory that highly contextual systems are not modeled appropriately in the traditional (reductionist) way. For example, the interface to a service is often considered an aspect of the service, and these often require constant amendment even if the service is placed in a slightly different context. In order to mitigate this phenomenon, services are typically hard-wired into hierarchical structures. In the feral world of service ecosystems, hard-wired service solutions will not remain satisfactory for long. A non-reductionist point of departure offers an intriguing shift in view - a service interface needs to manifest dynamically at the point of interaction rather than being a fixed property of a service in advance. This echoes quantum theory where properties cannot be ascribed to a system until it undergoes an act of measurement (i.e. we look at it). Consequently, this talk will speculate about which aspects of quantum theory may be relevant in providing a foundation for a non-reductionist approach for modeling service ecosystems.

Exploring social complexity and concept evolution in war games using Dialogue Mapping

Cherylne Fleming (Defence Science and Technology Organisation)

Strategic war games are used at the start of the Defence Acquisition process, or more correctly in the initial phases of the process known as Capability Development. Strategic war gaming is a complex area in which technical and systemic issues are overlaid with complex social interactions. These events have multiple conflicting stakeholders, both as participants and in the senior committees which use the results to make changes to the Defence Capability Plan. As such, strategic war games are as much about the people and the process as about the documented outcomes. Previous authors have developed tools and techniques for assisting with the technical and system aspects of the war game (Wood, 1999), but continuing to explore the social complexity and concept evolution would valuably assist in developing our understanding of the decision making.

Over the past eighteen months an interdisciplinary technique built on Dialogue Mapping has been developed and applied to several strategic war games. This has shown that the Capability Development process is adaptive to changes in inputs and outputs, while evolutions within the process also occur. These changes can be seen at many levels, for example in timeframes, process adjustments, and concept development and evolution, to name a few. The context of these events is work on wicked problems, where the participants are stakeholders who learn and change strategies not just over a single annual cycle of DCU but over successive years and across related events (i.e. in parallel or serial studies).

This form of analysis identifies the interactions, communications, consensus and cohesion of the participants at the event, and provides evidence to support the validity and reliability of the war game outcomes. By developing an appreciation of the group dynamics and the decision making that occurred during the event, anecdotal evidence suggests that greater insight into the results is gained. It is proposed that this increased traceability of concept evolution will continue to assist when the results are challenged post-event.

Implementing Adaptive Campaigning Through Force Development

Tim Pickford (Force Development Group)

Abstract not available at time of printing.

Morphogenic Systems Engineering for Self-Configuring Networks

Shane Magrath (Defence Science and Technology Organisation)

We propose Morphogenic Systems Engineering as an approach for enabling military communications systems to be self-configuring. This long range research objective seeks to determine how communications systems can bootstrap themselves into existence from primitive configurations with minimal administrator intervention or supervision. The goal is to enable tactical communications networks to be fully autonomous in such ways that enhance rather than detract from force capability, operational tempo, or commander's intent.

Simulation engine for strategic planning of health services using Agent Based Modeling

Ashok Kanagarajah (ARC Centre for Complex Systems, The University of Queensland), Peter Lindsay (ARC Centre for Complex Systems, The University of Queensland) and David Parker (The University of Queensland Business School)

Understanding the drivers of the behavior of health care services, and the underlying dynamics of demand and supply, is of key importance to medical practitioners, service providers, strategists and regulatory bodies. A cross-disciplinary research project is being undertaken at The University of Queensland to develop an agent-based simulation modeling tool to study the behavior of health care services. Service providers usually ask questions like 'How do we reduce access block within a hospital?', that is, how do we ensure that patients arriving at an emergency department are not turned away due to lack of beds? To answer these types of questions, various best-practice techniques used in industry are being applied to health care services, but there have been mixed reports and doubts about their effectiveness within health care. At The University of Queensland, with the support of Queensland Health, we have been exploring simulation techniques using a powerful computational method, agent-based modeling (ABM), to study the impact of applying industrial best practice to health services. In this paper we outline how ABM (i) allows the dynamic nature of services to be modeled, (ii) treats patients and doctors as individuals, thereby retaining the richness of information at the micro level, and (iii) allows patient and service provider interactions and social networks to be explicitly modeled. We describe an agent-based service dynamics behavior model drawing on health service experience and literature, and present simulation results exploring and comparing different service dynamics using health-service-based data.

Towards A Complex Systems Model Of Australian Air Traffic Management

Ariel Liebman (ARC Centre for Complex Systems, The University of Queensland), Peter Lindsay (ARC Centre for Complex Systems, The University of Queensland), Colin Ramsay (ARC Centre for Complex Systems, The University of Queensland) and Martijn Mooij (The Key Centre for Human Factors, The University of Queensland)

The Australian Air Traffic Management (ATM) system will undergo a number of fundamental paradigm changes over the next 10-15 years. While the current system is the most integrated ATM system in the world, its paradigm will not scale with forecast traffic increases; this applies even more to other western ATM systems such as those of the USA and Europe. The key problem areas are delays across the system caused by air traffic control when flights experience perturbations away from their planned 4-D trajectories (three space dimensions plus time). The ACCS has been developing an ATM and trajectory simulation model which takes into account aircraft performance characteristics and air-traffic controller agents. The current simulator can emulate the behaviour of the system at a single-sector level and contains most of the elements required for a complex systems simulation of the current ATM paradigm. This talk describes the work done to date and presents some novel approaches to ATM, including parameter sweeps demonstrating the behaviour of the system as a function of key agent parameters such as operator scan interval (modelling limited human cognitive processing capacity). Additionally, we outline future plans for the models and the ATM research.

Complex Systems in the Earth Sciences

Complexity of earthquake occurrence and implications for earthquake prediction

Dion Weatherley (Earth Systems Science Computational Centre)

We review recent advances in understanding the physics of earthquakes from a complex systems perspective. Earthquakes are the product of highly nonlinear, multi-scale, multi-physics interactions giving rise to complex and seemingly unpredictable temporal sequences of earthquake occurrence. The concept of self-organised criticality has infused the discipline with new insights, giving rise to the Critical Point Hypothesis of earthquakes: large earthquakes are considered the product of the crust reaching a critical point, analogous to that of statistical physics. Self-organisation, driven by plate tectonics, results in systematic patterns of energy release in the lead-up to the critical point, offering the possibility of routine intermediate-term forecasting of earthquake occurrence. Cellular automaton models of seismicity have begun to unlock the key features of crustal fault systems that ensure a quasi-periodic approach to, and retreat from, a critical point culminating in a large earthquake. We also examine recent advances in earthquake forecasting methodologies, particularly the stress accumulation method and pattern informatics.
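
One standard cellular automaton of this kind is the Olami-Feder-Christensen model, named here as a representative example (the abstract does not specify which automaton is used). A minimal Python sketch, with lattice size and dissipation chosen arbitrarily:

    import random

    L_SIZE, ALPHA, STEPS = 20, 0.2, 3000  # lattice size, coupling, loading events
    rng = random.Random(3)
    stress = [[rng.random() for _ in range(L_SIZE)] for _ in range(L_SIZE)]

    def topple(i, j):
        """Relax site (i, j), passing a fraction ALPHA of its stress to each
        of its four neighbours; count all sites toppled in the avalanche."""
        s = stress[i][j]
        stress[i][j] = 0.0
        size = 1
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ni, nj = i + di, j + dj
            if 0 <= ni < L_SIZE and 0 <= nj < L_SIZE:
                stress[ni][nj] += ALPHA * s
                if stress[ni][nj] >= 1.0:
                    size += topple(ni, nj)
        return size

    sizes = []
    for _ in range(STEPS):
        # Uniform tectonic loading: raise all sites until the maximum hits 1.
        load = 1.0 - max(max(row) for row in stress)
        for row in stress:
            for j in range(L_SIZE):
                row[j] += load
        i, j = max(((i, j) for i in range(L_SIZE) for j in range(L_SIZE)),
                   key=lambda ij: stress[ij[0]][ij[1]])
        sizes.append(topple(i, j))
    print("events:", len(sizes), "largest avalanche:", max(sizes))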

Ensemble Prediction of Atmospheric Blocking Regime Transitions

Jorgen Frederiksen (CSIRO Marine and Atmospheric Research) and Terry O'Kane (Antarctic Climate and Ecosystems Cooperative Research Centre)

The skill of ensemble prediction during blocking regime transitions is examined within two multi-level general circulation atmospheric models. We also examine the roles of inhomogeneities and non-Gaussian terms by comparing the predictability of the transitions within ensembles of simpler barotropic models and within a recently developed closure model for inhomogeneous turbulence interacting with general mean flows, Rossby waves and topography. An ensemble prediction scheme based on fast-growing perturbations has been implemented for these four models. The methodology uses a breeding method, based on an implicit linearization of the models, in which perturbations likened to leading Lyapunov vectors are obtained and used to perturb the initial conditions. Detailed comparisons of the skill of ensemble mean forecasts with control forecasts have been carried out for initial conditions in October and November 1979. A particular focus has been the variability of forecast skill during the development, maturation and decay of the large-scale blocking dipoles that occurred. On average, the ensemble mean forecast performs better than the control forecast for forecast times longer than 3 or 4 days. Forecasts initiated twice daily exhibit considerable variability in forecast skill, which is shown to be related to the instability regimes of particular synoptic events. At a given forecast lead time, errors tend to be larger for forecasts validating when blocks are developing or decaying, and smaller for mature blocks. The spread of ensemble member forecasts has been studied and related to likely forecast skill.

Limitations to scaling up a biophysical domain within agent based models.

Geoffrey Carlin (CSIRO) and Freeman Cook (CSIRO)

Agent-based modelling environments offer a framework for creating agent-based models (ABMs) and have evolved from a desire to model human social behaviour and individual decision making (Bonabeau 2002). ABMs for investigating the effects of government policy on land use change are becoming increasingly popular (Heckbert et al. 2005) but require the development of not only social but also biophysical agents.

This paper will look at the integration of a surface and groundwater model representing the biophysical domain within the Single Entity Policy Impact Assessment (SEPIA) model (Smajgl et al. 2006), which is being used to model several tropical catchments in Queensland and is currently being developed within the Recursive Porous Agent Simulation Toolkit (Repast) (North et al. 2006). Most existing biophysical modelling is undertaken using non-ABM or dynamic models that are based on well-understood reductionist science developed for small-scale models, or on empirical lumped-parameter models at larger scales. Although we can develop these small-scale models into large-scale models by discretisation of the landscape, this has not been a resounding success, due to difficulties with parameterisation (Cook et al. 2005). We investigate these limitations to scaling up the biophysical domain by determining the proportional size of the catchment at which this approach is no longer valid due to non-linear processes. This will be achieved by investigating the results of hydrologic changes arising from land-use or management changes at a range of scales within the modelled area.

Non-equilibrium thermodynamics of coupled systems

Bruce Hobbs (CSIRO Exploration and Mining), Alison Ord (CSIRO) and Klaus Regenauer-Lieb

Many natural systems involve feedback relations between four fundamental non-equilibrium processes, namely (i) heat generation and transport, (ii) chemical reactions and transport, (iii) fluid flow and (iv) mechanical processes. There has been much progress in the past 50 years in non-linear dynamics where two or maybe three of these processes are coupled together, and it is well known that such coupling leads to patterning both spatially and temporally. However, such coupling is commonly 'weak' in the sense that energy fluxes arising from non-equilibrium dissipation of energy are not considered explicitly. This presentation explores the complete coupling of these four processes in a 'strong' manner. The overarching principle is supplied through the Second Law of Thermodynamics as expressed by the Clausius-Duhem inequality, where energy dissipation from all processes is considered. The chemical dissipation feedback is particularly complex since, in part, it involves the products of the chemical potentials of the various chemical components and the time rate of change of the concentrations of these components. The chemical potentials are functions of stress and temperature, whilst the rates of concentration change follow reaction-diffusion equations. This complex coupling produces wondrous effects alone, but coupling to the other processes enhances the complexity further. This approach is a powerful method of exploring many coupled systems in Nature. We illustrate it by exploring some specific problems of geological interest.
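
For reference, one standard local form of the Clausius-Duhem inequality is shown below, with $\boldsymbol{\sigma}$ the Cauchy stress, $\dot{\boldsymbol{\varepsilon}}$ the strain rate, $\rho$ the density, $\psi$ the Helmholtz free energy, $s$ the specific entropy, $T$ the temperature and $\mathbf{q}$ the heat flux. This is a textbook statement; the authors' formulation additionally carries the chemical potential terms described above.

```latex
% One standard local form of the Clausius-Duhem inequality:
% stress power, minus the rate of change of Helmholtz free energy and
% the entropy-temperature term, minus heat conduction down the
% temperature gradient, must be non-negative.
\boldsymbol{\sigma} : \dot{\boldsymbol{\varepsilon}}
  \;-\; \rho\,\bigl(\dot{\psi} + s\,\dot{T}\bigr)
  \;-\; \frac{\mathbf{q}\cdot\nabla T}{T} \;\ge\; 0
```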

Practical Aspects of Symbolisation and Subsequent Analysis of Weather Data

Jay Larson (The Australian National University Supercomputing Facility; and The Australian National University)

Climate variability and weather predictability remain areas of active research. There exists a wealth of historical data from numerous sources, including surface stations, upper-air soundings, remote sensing, and retrospective analyses produced by data assimilation systems. Information-theoretic measures such as the Shannon entropy H quantify the amount of 'surprise' in the data, and H is a high-order metric for variability. Symbolic dynamics is another method for assessing the potential predictability of these data. Observing instruments and data assimilation systems typically produce data spaced evenly in time, a characteristic compatible with symbol-based analyses. Both information-theoretic and symbolic analyses can be pursued after symbolisation of these raw (continuous) data. I will present an overview of generating partitions for certain types of weather data, and discuss how meteorological observing conventions offer a useful set of partitions that would be accepted by meteorologists and climatologists. I will discuss the size S of the symbol sets for each of these schemes, and comment on the relationship to dataset size N and the associated practical limit for applying these techniques. I will present information-based analyses of weather station data from the US Department of Energy-sponsored Atmospheric Radiation Measurement program, and of reanalyses generated by the National Centers for Environmental Prediction and the European Centre for Medium-Range Weather Forecasts. Other topics to be discussed include information content, time-lagged mutual information as a measure of predictability, and assessment of predictability based on rates of entropy convergence (following Crutchfield and Feldman, 2003).
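
As a minimal illustration of the symbolisation step, the sketch below bins an evenly spaced temperature series into a four-symbol alphabet and estimates the Shannon entropy H; the partition boundaries are invented, not the meteorological conventions the talk proposes:

```python
import math
from collections import Counter

# Symbolise a continuous series with a fixed partition, then estimate
# Shannon entropy H = -sum(p * log2 p). Boundary values are invented.

def symbolise(series, boundaries):
    def symbol(x):
        return sum(x > b for b in boundaries)   # index of the bin
    return [symbol(x) for x in series]

def shannon_entropy(symbols):
    counts = Counter(symbols)
    n = len(symbols)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

temps = [14.2, 15.1, 17.8, 21.3, 24.9, 23.0, 19.4, 16.0]   # evenly spaced
symbols = symbolise(temps, boundaries=[16.0, 20.0, 24.0])  # alphabet S = 4
print(symbols, f"H = {shannon_entropy(symbols):.3f} bits")
```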

Southern Hemisphere Climate Transitions and Storm Track Changes

Jorgen Frederiksen (CSIRO Marine and Atmospheric Research) and Carsten Frederiksen (Bureau of Meteorology Research Centre)

The inter-decadal changes in Southern Hemisphere winter cyclogenesis have been studied using a global two-level primitive equation instability model with reanalyzed observed July three-dimensional basic states for the periods 1949-1968 and 1975-1994. The early to mid-1970s were a time of quite dramatic reduction in the winter rainfall in the South West of Western Australia (SWWA). We find that the rainfall reduction is associated with a decrease in the vertical mean meridional temperature gradient and in the peak upper tropospheric jet-stream zonal winds near 30° south throughout most of the Southern Hemisphere. These changes are reflected in the properties of the leading Southern Hemisphere cyclogenesis modes: for 1975-94, both the fastest growing mode and, on average, the 10 leading Southern Hemisphere cyclogenesis modes that cross Australia have growth rates around 30% smaller than those of the corresponding modes for 1949-68. The sensitivity of our results to the strengths of physical parameterizations and to the choice of basic states based on different data sets is examined. Our results suggest that a primary cause of the rainfall reduction over SWWA in the period after 1975 is the reduction in the intensity of cyclogenesis and the southward deflection of some storms.

The emergence of patterned shear bands and fracture systems in granular materials - A numerical study.

Alison Ord (CSIRO), Bruce Hobbs (CSIRO Exploration and Mining) and Klaus Regenauer-Lieb

Our objective is to understand the microstructural rearrangements associated with the localization of deformation into shear zones, which remains a critical geological issue. Fluid flow paths are controlled by such rearrangements through the effects of volume change and associated changes in the stress field. Geochemical reactions, and hence mineralogical changes, which result from interactions between the fluid flow and the rock mass through which the fluid flows, will occur in different patterns controlled by these fluid flow paths.

In contrast to continuum systems where localisation or shear band development arises through a bifurcation in a predefined system of differential equations, shear bands emerge in numerical simulations of deforming granular systems with no prescribed mathematical relations other than very simple contact forces between particles together with the boundary conditions. Shear band emergence arises from the self-organisation of large numbers of particles with long-range geometrical interactions playing a dominant role. As such, both translation and rotation of particles are important with particle rotation playing a fundamental role. Granular media therefore deform more like materials with non-local constitutive relations than materials where only first order interactions are relevant. In large systems far from equilibrium a continuum approach would say that the dissipation of energy plays a fundamental role in defining the evolution of the system including whether the evolution is unstable or not. In this study, we explore such energy dissipation and flow in deforming granular systems, with or without cohesion between the particles.

Complexity in Energy, Water, and Urban Development

The Complex Dynamics of Urban Systems Project at CSIRO

Tim Baynes (CSIRO)

Urban systems are undoubtedly complex entities. Understanding a city as a whole, and helping guide its transition to a sustainable future, requires an understanding of the population's needs, the social and physical infrastructure that supplies those needs, and the complex interactions between urban subsystems. This presentation seeks to inform, and garner interest in, a CSIRO project specifically aimed at the dynamics between urban subsystems such as transport networks, housing infrastructure, water and energy supply networks, and social networks.

The intention is to use the discipline of designing and constructing a dynamic model as a vehicle to focus on the problems of describing and understanding the urban 'ecosystem'. The project will provide a forum to integrate research around complex phenomena within urban subsystems. It is hoped the outcomes will help us gain a deeper understanding of the collective dynamics between urban subsystems and deepen the intellectual foundation of whole-of-system indicators regarding vulnerability, resilience and sustainability.

Complex Behaviour among many Heating, Ventilation, and Air-Conditioning Systems

Jiaming Li, Geoffrey Poulton and Geoffrey James (CSIRO ICT Centre)

An agent-based distributed energy management and control system is being developed to promote large-scale adoption of distributed energy technology, which includes management of electricity demand and local generation. The system includes software suitable for deployment to energy management agents in customer homes and businesses, and multi-agent coordination algorithms which achieve helpful system outcomes by attempting to enforce a time-variable cap on the total power drawn from the grid by the managed distributed energy resources.

In order to control resources effectively and coordinate them to respond to near-future market and network contingencies, each resource must be modelled sufficiently well to predict its near-term electricity demand. We have formulated a suitable model for Heating, Ventilation and Air-Conditioning (HVAC) systems in buildings and we are validating it against measurements in a building equipped with a demonstration HVAC agent. In general this is a complicated system requiring a high-order model. However, we require only near-term predictions and the model parameters may change - for example, due to changing solar radiation, opening or closing doors and windows, and entrance or egress of people - on a similar time scale. Therefore, we represent each HVAC system using a first-order model with time-varying parameters.
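
A first-order thermal model with time-varying parameters, of the general kind described, might look like the following sketch (all coefficients are invented for illustration, not validated values):

```python
# First-order HVAC thermal model with time-varying parameters:
#   T[k+1] = a[k] * T[k] + (1 - a[k]) * (T_out[k] + g[k] * P[k])
# a[k] encodes a (possibly changing) thermal time constant; g[k] is the
# gain from HVAC power to the equilibrium temperature offset.

def predict(T0, a, g, T_out, P):
    T = [T0]
    for k in range(len(P)):
        T.append(a[k] * T[k] + (1 - a[k]) * (T_out[k] + g[k] * P[k]))
    return T

# A door opens at steps 3-4: the effective time constant drops (a falls).
a     = [0.90, 0.90, 0.90, 0.70, 0.70, 0.90]
g     = [-8.0] * 6                 # cooling: power lowers the equilibrium
T_out = [30.0] * 6                 # outdoor temperature, deg C
P     = [1.0] * 6                  # HVAC power, arbitrary units

print(["%.1f" % t for t in predict(24.0, a, g, T_out, P)])
```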

Previously we have demonstrated the coordination of large numbers of cool-rooms represented by models with constant parameters. The algorithm featured information exchange through indirect (or stigmergetic) communications between resource agents and a broker agent and consistently gave helpful emergent behaviours conforming to a desired maximum total electricity demand for a nominated period. We have now investigated the emergent behaviours arising when resource model parameters vary with time, and we report these results.

Complexity of Urbanization Patterns and Resource Use in Sea Change Communities across Australia: the interplay between pattern and structure

Kostas Alexandridis (CSIRO Sustainable Ecosystems) and Heinz Schandl (CSIRO Sustainable Ecosystems)

Sea Change is a persistent and widespread phenomenon across the vast majority of coastal communities in Australia. Urbanization patterns and their relationship with patterns of resource use are gaining recognition in both the scientific and planning communities. The magnitude, structure and degree of resource use in many sea change communities are closely linked to many patterns and magnitudes of change, including demographic, economic, urban development and environmental changes occurring simultaneously and across multiple spatial and temporal scales. Our ability to respond and/or anticipate future changes in coupled human-environmental systems rests upon our understanding of the systemic interactions, feedbacks and cross-scale linkages that emerge and co-evolve across multiple scales and domains. This approach highlights the benefits of combining a systematic study of the system characteristics of urban, sub-urban, peri-urban and ex-urban changes with a social and sociological understanding of resource dependencies and accounting, and provides a profiling of sea change communities in both of these dimensions.

Impacts of Vehicle-to-Grid (V2G) technologies on electricity market operations

Ariel Liebman (ARC Centre for Complex Systems, The University of Queensland) and Geoff Walker (School of Information Technology & Electrical Engineering, The University of Queensland)

The Vehicle-to-Grid (V2G) concept is based on the newly developed and marketed technologies of hybrid petrol-electric vehicles, most notably represented by the Toyota Prius, in combination with significant structural changes to the world's energy economy and the growing strain on electricity networks. The work described in this presentation focuses on the market and economic impacts of grid-connected vehicles. We investigate price reduction effects and transmission system expansion cost reductions. We model a large number of plug-in hybrid vehicle batteries by aggregating them into a virtual pumped-storage power station at the regional level of the Australian National Electricity Market (NEM). The virtual power station concept models a centralised control for dispatching (operating) the aggregated electricity supply/demand capabilities of a large number of vehicles and their batteries. The actual level of output could be controlled by human or automated agents to either charge from or discharge into the power grid. As noted above, the impacts of widespread deployment of this technology are likely to be economic, environmental and physical.

NEMSIM: Towards practical deployment of an agent-based, scenario exploration simulation tool for the National Electricity Market

George Grozev (CSIRO Sustainable Ecosystems), Marcus Thatcher (CSIRO Marine and Atmospheric Research), Per da Silva (CSIRO Sustainable Ecosystems), Geoff Lewis (CSIRO Sustainable Ecosystems) and Chi-hsiang Wang (CSIRO Sustainable Ecosystems)

This paper will present NEMSIM, an agent-based simulation tool for Australia's National Electricity Market (NEM), which is at an advanced stage of development within the CSIRO Energy Transformed Flagship Program. CSIRO's Complex Systems Science emerging science area provided funding support for the project, importantly at the critical initial stages. More recently, CSIRO signed an agreement with Core Collaborative Pty Ltd to commercialise NEMSIM. This collaborative project aims to create a new generation of simulation software to assist energy industry participants in addressing complex strategic, financial and operational decisions over different time horizons. The commercial version of the tool will be known as GENERSYS.

NEMSIM models the decision making (bidding, contracting and investment) of power companies in the highly dynamic NEM. It provides a set of demand models consistent with climate change scenarios and simulates the spot market based on bidding and dispatch over 30-minute intervals. It models all scheduled generating units in the NEM and the power flows across the main interconnections, and takes into account forced and maintenance outages of generating plants and main transmission links. It has a module to estimate investment in new generation capacity based on the real options approach. NEMSIM calculates greenhouse gas emissions due to electricity generation and provides a comprehensive graphical user interface with reporting and scenario evaluation capabilities.
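
For intuition, spot-market dispatch over a single half-hour interval can be sketched as simple merit-order clearing; this toy is ours, and NEMSIM's actual dispatch engine is certainly more detailed:

```python
# Toy merit-order clearing for one 30-minute interval: stack bids by
# price and dispatch until demand is met. All bid data are invented.

def clear(bids, demand_mw):
    dispatched, price = [], 0.0
    for gen, price_bid, cap_mw in sorted(bids, key=lambda b: b[1]):
        if demand_mw <= 0:
            break
        mw = min(cap_mw, demand_mw)
        dispatched.append((gen, mw))
        price = price_bid            # marginal bid sets the spot price
        demand_mw -= mw
    return dispatched, price

bids = [("coal_A", 12.0, 700), ("gas_B", 55.0, 300), ("hydro_C", 30.0, 250)]
schedule, spot = clear(bids, demand_mw=1000)
print(schedule, f"spot price = ${spot}/MWh")
```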

Agent-based modelling provides a constructive framework to study spatial, temporal and network effects in energy systems since the corresponding individual company agents and their infrastructure are located in market regions, evolve over time, and are constrained by the energy infrastructure (electricity grids, gas pipelines, etc.).

Optimal GENCO's Bidding Strategies under Price Uncertainty with Bilateral Contracts

Xia Yin (The University of Queensland), Zhaoyang Dong (The University of Queensland) and Tapan Kumar Saha (The University of Queensland)

In deregulated electricity markets, market players have to change their major objective from minimizing costs to maximizing profits. They therefore face the important task of constructing optimal bids for each trading interval. The electricity market is uncertain, and price volatility can cause significant market risks, so it is essential to have an effective method for building optimal bids that estimates and handles the price uncertainties and risks. This paper applies Generalized Autoregressive Conditional Heteroskedastic (GARCH) methodology to predict electricity prices and estimate the risks involved. The authors then propose a novel approach to designing optimal bidding strategies based on a generator's degree of risk taking. Particle Swarm Optimization (PSO) is employed as the optimizer to solve the self-scheduling problem. The bilateral contracts of a generator are taken into account in the proposed bidding model as optimization constraints. Case studies using a coal generator located in the Australian National Electricity Market are conducted to verify the effectiveness of the proposed method.
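
The flavour of the price-risk step can be sketched with a GARCH(1,1) variance recursion; the coefficients and the final bid-shading rule below are invented for illustration and are not the paper's fitted model or its PSO layer:

```python
# GARCH(1,1) one-step-ahead variance forecast for price returns:
#   sigma2[t] = omega + alpha * r[t-1]**2 + beta * sigma2[t-1]
# Coefficients are illustrative, not fitted values.

def garch_variance(returns, omega=0.02, alpha=0.10, beta=0.85):
    sigma2 = [omega / (1 - alpha - beta)]     # start at unconditional variance
    for r in returns:
        sigma2.append(omega + alpha * r**2 + beta * sigma2[-1])
    return sigma2

price_returns = [0.01, -0.03, 0.25, -0.20, 0.02]   # spiky electricity prices
forecast = garch_variance(price_returns)[-1]
print(f"next-interval variance forecast: {forecast:.4f}")

# A risk-averse generator might shade its offer by a multiple of the
# forecast volatility (a hypothetical rule, not the paper's method):
risk_aversion = 0.5
offer = 40.0 * (1 - risk_aversion * forecast ** 0.5)
print(f"risk-adjusted offer: ${offer:.2f}/MWh")
```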

The Intelligent Grid Project

Simon Dunstall (CSIRO Mathematical and Information Sciences), Rodolfo Garcia-Flores (CSIRO Mathematical and Information Sciences), Nectarios Kontoleon (CSIRO Mathematical and Information Sciences), Bill Lilley (CSIRO Energy Technology) and Rene Weiskircher (CSIRO Mathematical and Information Sciences)

The CSIRO Energy Transformed Flagship was established to address environmental and efficiency challenges within the energy sector. A key challenge identified by the Flagship is the reduction of the greenhouse emissions considered responsible for climate change. The Flagship recognises that Distributed Energy (DE), the use of small, energy-efficient generation technologies located close to the point of use, is a key component of Australia's ability to substantially reduce its greenhouse footprint. Through the Intelligent Grid (IG) Project, CSIRO is addressing the economic, environmental and social changes required to deliver a compelling, fact-based argument for DE to meet substantial CO2 emission reduction targets in Australia.

We propose a systems approach to analyse and simulate the benefits of DE in the Australian energy market. The IG programme of research is aimed at discovering, measuring and simulating the full value chain for DE solutions. It considers the social, environmental and economic choices made when tackling peak load, network constraints and base load growth. An overarching simulation framework provides the architectural guidance, data consistency and commonality for scenario analysis. The complex system integrates temporal scales ranging from hours to decades, while spatial scales range from tens of metres to regional levels. In determining the impact of DE technologies, social surveys and focused workgroups are being conducted to ensure community and industry acceptance. Environmental protection will be assured by conducting cumulative impact assessments on resources, land use, and air and water quality. Economic analyses will ensure that changes to the energy system result in acceptable financial impacts.

The interaction between water and energy supply and use

Tim Baynes (CSIRO), Steven Kenway (CSIRO), Graham Turner (CSIRO) and Jim West (CSIRO)

Australia is urgently seeking solutions to problems of water and energy security in the context of climate change and a carbon constrained future, but what are the unforeseen opportunities and pitfalls at the intersection of the water and energy supply and use? We present pertinent analysis of historical records and preliminary simulation results of the interaction between the water and energy sectors.

Current centralised electricity and gas supply in Australia requires access to more than 60 000 GL of water (actual consumption is 271 GL). As Australians re-assess the worth of water, we look at the potential effects on the water sector of alternative sources of electricity generation.

Conversely, alternative water supply options vary significantly in the energy they require, e.g. desalination compared with greywater re-use. We also explore the energy implications of broadscale changes to water supply and use. Recent simulations of the dynamics between the energy and water sectors demonstrate feedback phenomena at the scale of the state economy, but the magnitude of this effect has to be weighed against the energy and water needs of other economic sectors.

Lastly, we discuss how public acceptance, the location of people, and climate change impacts might influence change, as well as command-and-control style infrastructure decisions.

Tuning the cognitive complexity in participatory modelling

Nils Ferrand (Cemagref), Géraldine Abrami (Cemagref - Unite Mixte de Recherche Gestion de l'Eau, Acteurs et Usages), Nicolas Becu, Katherine Daniell, Natalie Jones, Pascal Perez (CIRAD; and The Australian National University) and Jean-Emmanuel Rougier

In natural resources management, participatory modelling is expected to provide more accurate and socially appropriate models, and to improve groups' capacity to collaborate and engage in collective action. However, the group modelling process itself has to accommodate many actors, issues and scales, and to combine lay and expert knowledge. Participants are supported by modellers in order to formulate their visions (mental models) and to exchange and combine them. The resulting models and knowledge sets are usually huge, and many strategies and tools to extract, manage and use this knowledge can be designed. But which one best fits the final objective of improving the management of the commons? Based on three experiences using similar cognitive modelling tools, we explore and compare various protocols.

We show in particular that the trade-off between given elements (triggers) and collected elements, the process for combining the models, and the final use of the model are key factors in choosing the method. Experimental conditions, and their evolution during the group modelling process, also determine the combination of tools used at each step.

Following these first conclusions, we propose and demonstrate a knowledge processing strategy that starts from individual interviews or cognitive mapping, includes group 'tips and tricks' for managing conflicting knowledge, and builds on software infrastructure to facilitate processing and exploitation of the knowledge. We finally discuss how such a process has to deal with a continuously evolving context such as natural resources management.

Water service delivery in Tarawa: creating an agent-based model for integrated analysis

Magnus Moglia (CSIRO Land and Water; and The Australian National University), Pascal Perez (CIRAD; and The Australian National University) and Stewart Burn (CSIRO Land and Water)

Decisions relating to water service delivery in Pacific atoll towns are generally made by a small number of professionals under severe constraints on staff and finances and with limited information. Decision making tends to be problematic due to a high level of uncertainty and demanding social acceptance issues. Additionally, a large range of interconnected and evolving environmental, technical and social factors needs to be considered. Further complicating the task is the need to cater for the preferences and requirements of a large range of stakeholders. This leaves decision makers within a sphere of evolving and ill-defined problems, struggling with issues of bounded rationality.

To set up a framework for strategic and adaptive decision making, agent-based modelling has been identified as a practical option for developing a system representation in support of stakeholder dialogue. Such a model allows for integrating socio-technical models, embedding heuristic rules, representing stakeholder knowledge, and capturing the complex, uncertain and evolving aspects of the system. This provides opportunities for scenario analysis and dialectic exploration, which can allow stakeholders to explore underlying assumptions and preferences beyond what is normally possible.

For these reasons, an ABM was developed using the Cormas platform. A UML-based ontology was prepared based on the researchers' understanding of the system, seen as a starting point for validation and interactions with stakeholders, in line with the cyclical Companion Modelling approach. The model integrates a range of issues relating to water safety, tariff structures, freshwater lenses and household water usage.

Computational Modelling for Biology and Chemistry

Identifying fundamental principles to effectively shape social institutions in natural resource management

Ryan McAllister (CSIRO) and Luis R. Izquierdo (University of Burgos)

Social institutions can be viewed as the abstract aggregate of the various social norms that prevail in a community. In general, our understanding of how social norms form and develop in response to multiple drivers is rather limited and somewhat subjective. Thus, our knowledge of the structure, functioning and evolution of social institutions is often imperfect and contested. However, even though social institutions are bottom-up processes that usually emerge without any central design, they are also subject to the influence of top-down drivers that may be controllable (e.g. building transport infrastructure networks). This means that there is often some scope to shape social institutions with a view to dealing with natural resource problems more effectively. This is generally a non-trivial task, since social institutions tend to emerge with no predefined purpose, and there may be a substantial mismatch between the scale of a social institution and the scale of the particular natural resource problem we want to tame.

This paper aims to identify the structural characteristics of social institutions that may be relevant to dealing with a specific natural resource problem. With limited information on the structure and functioning of social institutions, we use network theory and GIS to simulate how a certain pollutant flows through the landscape. The landscape is conceptualised as a network, where nodes represent properties and links denote the strength of influence between properties. It is assumed that an efficient social institution for dealing with the problem would resemble the ecological network of influence, linking those properties that significantly affect each other.
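
The landscape-as-network idea can be sketched with a small weighted graph; the toy below is hypothetical, and the paper's GIS-derived network and pollutant flow model are far richer:

```python
# Toy landscape network: nodes are properties, weighted edges are the
# strength of influence (e.g. sediment transfer). A candidate "efficient
# institution" links property pairs whose influence exceeds a threshold.
import networkx as nx

G = nx.DiGraph()
flows = [("farm_A", "farm_B", 0.8), ("farm_B", "river", 0.9),
         ("farm_C", "river", 0.2), ("farm_A", "farm_C", 0.1)]
for src, dst, w in flows:
    G.add_edge(src, dst, weight=w)

threshold = 0.5
institution = [(u, v) for u, v, d in G.edges(data=True)
               if d["weight"] > threshold]
print("property pairs an institution should link:", institution)
```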

We test our model in Australian rangelands, where social institutions are crucial because high resource uncertainty increases the importance of establishing mutual cooperation. These areas are generally sparsely populated, with low productivity and minimal formal governance, and the dominant natural resource processes are large scale and require collective responses. Paradigmatic examples are the impact of run-off sediments on ocean water quality, and the detrimental effect of long-range weed infestations on riparian zones.

Importance sampling strategies for forwards and backwards processes in population genetics

Martin O'Hely (ARC Centre of Excellence for Mathematics and Statistics of Complex Systems; and The University of Queensland), Mark Beaumont (University of Reading, United Kingdom), Lounes Chikhi (CNRS Toulouse), Robert Cope (ARC Centre of Excellence for Mathematics and Statistics of Complex Systems; and The University of Queensland), Jean-Marie Cornuet (INRA Montpellier) and Leesa Wockner (ARC Centre of Excellence for Mathematics and Statistics of Complex Systems; and The University of Queensland)

Suppose we want to determine some mean property of a random process. We could just simulate the process and take the mean over many replicate simulations. However, the quality of this estimate may be affected by the variance of the estimates of the property across the replicates. Importance sampling involves simulating a different process, but keeping track of deviations from the original process, in such a way that the average of the estimates across replicates is still an unbiased estimator of the mean and (hopefully) such that the variance of these estimates is smaller. We look at strategies for importance sampling in two population genetic problems: (i) estimating likelihoods of allelic configurations after a population admixture event, and (ii) finding the probability that a duplicate copy of a gene persists indefinitely in a population. Two important themes emerge from examining these strategies: first, that sometimes an apparently "smarter" technique works worse than simple-minded approaches; and second, that there are often massive gains in efficiency to be had from implementing importance sampling.
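
The basic identity behind importance sampling, E_p[f(X)] = E_q[f(X) p(X)/q(X)], can be illustrated with a rare-event toy example (unrelated to the genetic applications above, which are considerably more involved):

```python
# Estimate P(X > 4) for X ~ N(0,1). Naive Monte Carlo almost never sees
# the event; sampling from N(4,1) and reweighting by p(x)/q(x) does.
import math
import random

def phi(x, mu):   # normal density with unit variance, mean mu
    return math.exp(-0.5 * (x - mu) ** 2) / math.sqrt(2 * math.pi)

random.seed(1)
n = 100_000

naive = sum(random.gauss(0, 1) > 4 for _ in range(n)) / n

total = 0.0
for _ in range(n):
    x = random.gauss(4, 1)                 # proposal q = N(4,1)
    if x > 4:
        total += phi(x, 0) / phi(x, 4)     # likelihood ratio p/q
importance = total / n

print(f"naive: {naive:.2e}  importance: {importance:.2e}  "
      f"exact: {3.17e-5:.2e}")
```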

Interactively exploring distributed computational models of biology

James Watson (ARC Centre for Complex Systems and ARC Centre in Bioinformatics, The University of Queensland) and Janet Wiles (The University of Queensland)

Interaction with a running computational model is a powerful way to gain insight into its dynamics. Two key advantages that interactive simulations have over batch simulations are: (1) user exploration facilitates understanding of model behaviour under various conditions, and (2) the researcher can quickly home in on points of interest by dynamically prioritizing computations in those regions.

However, interaction with a computational model depends on the responsiveness of that model. This paper describes software that provides an interactive interface to a computationally intensive simulation. Computation is initiated by user actions, and distributed to idle machines. Results are displayed as they are returned. Also, an algorithm that aids prioritization of computation according to interesting regions of the results is presented.

This approach has proven useful in modeling genetic regulatory networks for two reasons. First, in this domain the simulation can easily be divided into distinct units of work. The computing requirements for each work unit are small, but simulations comprise many such work units. Second, some regions of parameter space are more interesting than others, so dynamically choosing where to spend computing resources is advantageous.
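
A dynamically prioritised work queue of the general kind described might be sketched as follows; this is our illustration, not the paper's software:

```python
# Sketch of dynamically prioritised work units: user interest in a
# parameter region bumps its priority; idle workers pop the most
# interesting unit next.
import heapq
import itertools

queue, counter = [], itertools.count()

def submit(params, priority=0.0):
    # heapq is a min-heap, so negate priority for "largest first".
    heapq.heappush(queue, (-priority, next(counter), params))

def on_user_focus(region, boost=10.0):
    for params in region:
        submit(params, priority=boost)

for a in range(5):                      # initial coarse parameter sweep
    submit({"alpha": a / 10}, priority=1.0)

on_user_focus([{"alpha": 0.23}, {"alpha": 0.27}])   # user zooms in

while queue:                            # workers drain in priority order
    _, _, params = heapq.heappop(queue)
    print("next work unit:", params)
```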

Modeling the Import of Nuclear Proteins

John Hawkins (ARC Centre for Complex Systems, The University of Queensland) and Mikael Boden (School of Information Technology & Electrical Engineering, The University of Queensland)

The importation of proteins into the nucleus is a crucial element in the dynamic life of the cell. It is complicated by the massive diversity of targeting signals and the existence of proteins that shuttle between the nucleus and cytoplasm. Unlike other localisation processes, which are focused on placing proteins in their place of primary function, nuclear import itself plays a functional role by mediating genome regulation.

We present an update of the Nucleo prediction service using a novel machine learning architecture: PALFE Networks. We use an array of neural networks for local feature extraction to identify the presence of nuclear localisation signals (NLSs). We then use the concatenated local feature vectors to train a global SVM-based classifier.

Furthermore, we present an extension of the model for predicting whether or not a protein is likely to fall into the category of nuclear-cytoplasm shuttling proteins. This allows us to begin estimating the proportion of the proteome that is involved in gene regulation, mediated by selective nuclear import.

We compare the model against our previous architecture, Nucleo V1.0, which used a composite spectrum kernel in combination with explicitly defined motifs. We test against a hold-out set of proteins used neither in the model development nor in any of the alternative services available.

The final predictor, Nucleo V2.0, operates with a realistic success rate of 0.89 and a Matthews correlation coefficient (MCC) of 0.77 for predicting nuclear import, as established on the independent test set.

Modelling population processes with random initial conditions

Philip Pollett (ARC Centre of Excellence for Mathematics and Statistics of Complex Systems, University of Queensland), Anthony Dooley (ARC Centre of Excellence for Mathematics and Statistics of Complex Systems; and The University of New South Wales) and Joshua Ross (University of Warwick, United Kingdom)

Often when modelling population processes the initial state is not known with certainty. Here we outline a general method for incorporating random initial conditions in population models where a deterministic model is sufficient to describe the dynamics of the population. For a large class of models we show that the overall variation is composed of variation due to random initial conditions and variation due to random dynamics, and thus we are able to quantify the variation not accounted for when random dynamics are ignored. We begin by reviewing some results of Tom Kurtz, which allow one to quantify variation in density-dependent population models.
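
The decomposition described is, in essence, the law of total variance applied to the population state X_t with random initial state X_0; the generic statement (not the paper's specific result) is:

```latex
% Law of total variance: overall variation splits into a
% random-dynamics term and a random-initial-conditions term.
\operatorname{Var}(X_t)
  = \underbrace{\mathbb{E}\left[\operatorname{Var}(X_t \mid X_0)\right]}_{\text{random dynamics}}
  + \underbrace{\operatorname{Var}\left(\mathbb{E}[X_t \mid X_0]\right)}_{\text{random initial conditions}}
```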

Monte Carlo without Markov chains for model polymers

Aleks Owczarek (The University of Melbourne)

I will give an overview of developments in flat-histogram stochastic enumeration techniques for studying the critical phenomena of model polymeric systems. Energy landscapes with multiple minima can be treated with these techniques.

Simplified models for vibrational energy transfer in proteins

Steven Lade (The Australian National University) and Yuri Kivshar (The Australian National University)

The study of energy transfer in proteins and other macromolecules is central to a deeper understanding of molecular and chemical dynamics. In this work, we develop a rather general approach for describing the resonant energy exchange between different vibrational modes of complex macromolecules by employing simplified models. We employ the Lagrangian approach for weakly interacting normal modes of a macromolecule, and derive a system of coupled equations for the time evolution of the oscillation envelopes. We use the parameters of an earlier study of the protein myoglobin [K. Moritsugu et al., Phys. Rev. Lett. 85, 3970 (2000)] to compare our predictions with the results of molecular dynamics simulations, and find excellent agreement. Our model permits some conclusions of a general nature, as well as warnings on the validity of such modelling. The energy transfer between vibrational modes is identified as analogous to multi-parametric processes in nonlinear optics.

Suddenly-stopped flow in a curved pipe

Richard Clarke (The University of Adelaide) and James Denier (The University of Adelaide)

Understanding the circulatory system has long driven theoretical interest in the flow of viscous fluid through a curved pipe, as it is known that diseases such as atherosclerosis are linked to regions of weak wall shear stress within the blood vessel. In these investigations, the heartbeat has traditionally been simulated using oscillatory or pulsatile (oscillatory about a non-zero mean) pressure gradients. Some recent numerical studies, however, have employed a more physiological intermittent pressure waveform, where a pulse-like systolic pressure profile is followed by a stationary diastolic profile. This type of waveform, together with curvature-induced centrifugal flows, is seen to result in some novel flow features, such as the collision of boundary-layer flows at the vessel wall, which results in flow separation and the propagation of a jet across the core of the vessel. We investigate the essential features of these phenomena by studying the limiting case where the diastolic period is established through a sudden shut-off of the flow. In this setting we are able to mathematically describe the structure of the flow in the small-time limit by matching diffusive boundary layers onto a core flow that is driven by an unknown pressure gradient (which must be determined as part of the solution). This analysis offers the opportunity to study the strength of the finite-time singularity in the boundary-layer equations linked to the separation event.

Defence and Security

A Blueprint for Reform: Towards a Coherent National Security Strategy

Charlie Edwards (Demos)

At the beginning of the twenty-first century, governments across the globe have struggled to keep up with the growth and complexity of the challenges facing them. The UK government is no exception and finds itself exposed to changes across a global system that often reverberate unpredictably throughout British society. This global interconnectedness makes it harder for governments to predict and intervene in social and economic problems. Today, cartoons shown in Danish newspapers create civil unrest on the streets of London, drugs from the poppy fields of Afghanistan lead to violence on Glasgow estates, and regional instability in the Middle East raises the price of petrol in the UK. While departments have begun to develop a more joined-up approach to this interconnected world, there has been no obvious impact on Britain's archaic security architecture and systems. Without a strategic framework for departments and agencies to operate within, Whitehall continues to suffer from duplication of resources, mixed messages from politicians and infighting between departments, all of which make the system more opaque, both for those who work in it and for citizens. To cut through this complexity and help the government respond to the plethora of challenges facing the UK, at home and abroad, the government should develop a national security strategy.

A Comparative Study of Networked Teaming

Victor Fok (Defence Science and Technology Organisation), Martin Wong (Defence Science and Technology Organisation) and Alex Ryan (Defence Science and Technology Organisation)

In warfare, the diversity of elements within a heterogeneous force gives rise to many combinations in which teams can be organised. Understanding how best to organise teams for engaging a particular adversary and environmental context remains largely an art of war. This paper explores the domain of networked teaming by examining a broad range of teaming approaches based on the nature of team composition and on team adaptation behaviour. Using an agent-based distillation, the teaming approaches were applied to a common heterogeneous force mix and played off against one another to evaluate their comparative effectiveness under a simple abstracted combat scenario. Preliminary results highlight the cyclic nature of the effectiveness of the various networked teaming approaches, in that no one approach is totally dominant in all contexts.

A Framework for Decision Superiority in Complex Organisations

Chris Murray (ThoughtWeb Pty Ltd)

This paper describes a theoretical framework by which complex organisations can self-organise effectively in fast-changing environments with unpredictable shocks. The framework is a mathematical model embodied as a software platform, with a coherent methodology enabling rapid configuration and deployment of applications providing improved situational awareness, synchronisation of activity, and improvement in the speed and quality of decision making. Each configuration involves a network of virtual databases, fuzzy logic connections, rules engines and personal software agents. The applications described in this paper have been put to practical use in emergency management, healthcare, defence, financial services and strategic marketing.

This paper describes how complex systems can be managed and controlled using contextual reasoning and intelligent push to extend the traditional model of leadership. Micro-level interactions build a vision that can be shaped and transformed into a network of connectivity and purpose drivers which guide micro- and macro-level behaviour. Through this approach, the macro-level system properties and behaviours emerging from the relatively simple micro-level interactions create alignment and adaptability. Limitations of tree structures and positive feedback loops are overcome.

The paper provides a glimpse of the future, with rapidly evolving cybernetic communities employing global information systems and avatars. Every person in these communities has an avatar equipped with contextual reasoning and intelligent push, enabling the human participants to extend the capability of their brains, reasoning more holistically and making superior decisions in complex, fast-changing environments with far-reaching consequences.

A multi-objective genetic algorithm based method for designing low-cost complex networks resilient to targeted attacks.

George Leu (National Defense Academy of Japan) and Akira Namatame (Department of Computer Science, National Defense Academy of Japan)

In recent years, our society has become more and more dependent on large-scale (global) infrastructure networks. In many cases, attacks on a few important nodes of such systems lead to irreparable local or, worse, global damage. Thus, designing resilient networks, rather than merely reducing the effects of unexpected attacks, becomes a must. Since the network most resilient to any kind of attack would be a fully connected graph, it is obvious that implementing such a network is utopian. This paper proposes an original multi-objective method for optimizing the structure of complex networks, taking implementation costs into account. A micro genetic algorithm is used to improve the network's resilience to targeted attacks on hub nodes while keeping the implementation costs as low as possible. The algorithm uses a 2D binary encoding of the network, based on the adjacency matrix, which provides very fast convergence due to the direct access of the genetic operators to the network structure.
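
The encoding and fitness ideas can be sketched as follows, assuming a symmetric adjacency-matrix genome and a resilience measure based on the largest connected component surviving a hub attack; all details are illustrative, not the paper's algorithm:

```python
# Sketch: a network genome is a symmetric 0/1 adjacency matrix; fitness
# trades off resilience to removal of the highest-degree (hub) node
# against link cost. Parameters and weights are invented.
import random

N = 8

def random_genome():
    m = [[0] * N for _ in range(N)]
    for i in range(N):
        for j in range(i + 1, N):
            m[i][j] = m[j][i] = random.randint(0, 1)
    return m

def largest_component_after_hub_attack(m):
    degrees = [sum(row) for row in m]
    hub = degrees.index(max(degrees))        # targeted attack on the hub
    alive = set(range(N)) - {hub}
    seen, best = set(), 0
    for start in alive:                      # flood-fill the components
        if start in seen:
            continue
        stack, comp = [start], set()
        while stack:
            v = stack.pop()
            if v in comp:
                continue
            comp.add(v)
            stack.extend(u for u in alive if m[v][u] and u not in comp)
        seen |= comp
        best = max(best, len(comp))
    return best

def fitness(m, cost_weight=0.05):
    links = sum(sum(row) for row in m) / 2   # implementation cost proxy
    return largest_component_after_hub_attack(m) - cost_weight * links

population = [random_genome() for _ in range(20)]
print("best initial fitness:", max(fitness(g) for g in population))
```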

Adaptive Campaigning

Wade Stothart (Future Land Warfare, Australian Defence Force)

The presentation will outline the recently released Australian Army concept, Adaptive Campaigning. Adaptive Campaigning outlines the Australian Land Force response to the future conflict environment as part of the military contribution to a Whole of Government approach to resolving conflict.

The purpose of Adaptive Campaigning is to influence and shape perceptions, allegiances and actions of a target population to allow peaceful political discourse and a return to normality. Given the complexities of the environment the key to the Land Force's success will be its ability to effectively orchestrate effort across five lines of operation: Joint Land Combat, Population Support, Indigenous Capacity Building, Public Information and Population Protection.

This paper presents conceptual and force modernisation direction to Army to ensure it remains postured to meet the demands of the future operating environment.

Agent-based models - finding the right tool for the right job.

Matthew Berryman (Defence Science and Technology Organisation), Anne-Marie Grisogono (Defence Science and Technology Organisation) and Alex Ryan (Defence Science and Technology Organisation)

Many complex systems are amenable to study through the use of agent-based models. Agent-based models provide useful insights into how individual interactions give rise to emergent properties. Of these, structures and causality are of great interest. This paper focuses on which agent-based modelling toolkits are best suited for studying: 1. processes that generate and/or preserve structure, including, but not limited to, adaptation and self-organisation; and 2. complex networked causality, and how we can influence complex systems to provide desired outcomes.

Of particular interest are defence systems, and battlefield-specific agent-based models are considered in addition to general-purpose agent-based modelling toolkits like Repast and MASON.

This paper summarises our findings on these agent-based modelling toolkits in the above contexts, and for general complex systems.

Autonomic MANET Management Through the Use of Self Organizing UAVs

Robert Hunjet (Defence Science and Technology Organisation)

Mobile ad-hoc networks (MANETs) have been proposed as a viable means to achieve Network Centric Warfare in the future tactical Defence environment. However, they bring with them inherent complexity. The nodes of a MANET are free to move, causing the topology of the network to change constantly. Increasing the distance between nodes, and transmission of the signal through obstacles such as trees, rocks and hills, will attenuate it, possibly to the point where it cannot be detected by the receiver node. Consequently, in a MANET the available bandwidth is constantly changing, as are the delay and jitter experienced on the network. Moreover, the most commonly used transport protocol, TCP, functions poorly under such conditions. We suggest that complexity science can provide self-managing control mechanisms for MANETs using biological self-organization techniques. We present an autonomic management model using unmanned aerial vehicles (UAVs) as part of a MANET. The UAVs utilise a self-organizing biomimetic design pattern, based on the nesting habits of bluegill sunfish, which optimises network topology. This model creates a mobile backbone throughout the tactical MANET which helps to keep the network functioning despite variations in bandwidth, jitter, delay and connectivity.

Dealing with battlefield complexity: A decision maker's perspective

Damien Armenis (Defence Science and Technology Organisation)

Warfare has become a highly complex and information-saturated environment. The modern battlefield is characterised by significantly more complex demands placed on the decision maker, including multiple information sources, time pressure, incomplete and conflicting information, rapidly changing and evolving scenarios, and high workload. Military complex systems are situated within these environments, placing a unique demand on the cognitive skills of modern-day commanders. Analytical approaches to decision making, which require laborious and time-consuming procedures, are ill-suited to these complex systems. New theories are now emerging which have analysed and modeled military decision making in complex environments, and have produced various implications for research, metrics and training. These relatively new research streams, such as naturalistic decision making, have allowed decision researchers to assess how commanders deal with battlefield decisions. Investigations are often performed at a cognitive level, modelling the decision strategies and cognitive processes employed by experienced decision makers, and building decision skills training programs for more novice military staff. Validation and evaluation of these training programs, however, has been less than thorough. This paper will first discuss the complex features of decision making on today's battlefield, then move on to outlining the relevant theory and its implications for investigating, training, and evaluating decision making in a complex military system.

Diversity and complexity in pictures

Michael Barnsley (The Australian National University)

It is shown that models for real-world pictures can be generated using orbits of segments of pictures under an Iterated Function System (IFS) semigroup. These models are relevant to image compression and image cognition. The richness of such models is illustrated, and it is shown that this richness can be measured using the concept of image diversity.

Effectiveness of closed-loop congestion controls for DDoS attacks

Takanori Komatsu (National Defence Academy of Japan) and Akira Namatame (Department of Computer Science, National Defense Academy of Japan)

Congestion has negative effects on several services (e.g. the Internet, transportation, telecommunications). We focus on congestion collapse in the Internet. The Internet is a packet-switched network, not a circuit-switched network, so congestion collapse can be mitigated using improved packet scheduling based on crowd control or active queue management. We have to evaluate which type of control mechanism solves this problem most effectively. However, the problem depends on dynamic conditions such as the underlying network topology, the network load, and the reactions of transport protocols to congestion. The aim of this paper is to evaluate the effectiveness of congestion control schemes, and network resistance to overflow problems, under such conditions. Adaptive flows adjust their rate, while unresponsive flows do not respond to congestion and keep sending packets. Random early detection (RED) exemplifies the class of active queue management algorithms: a router maintains a simple FIFO queue for all traffic flows and drops arriving packets randomly during congestion, with a drop probability that increases with the queue length. By keeping the output queue size small, RED can reduce the delay time for most traffic flows. However, RED cannot penalize misbehaving traffic flows. We evaluate congestion control schemes such as DropTail, RED and CHOKe using responsive and unresponsive flows in the presence of short- and long-lived background traffic. We use several network topologies to identify responsive and unresponsive flows that cause packet drops in other flows. We also simulate how various network topologies, including the tiers model, the transit-stub model and the scale-free model, resist the overflow state. The simulations show that CHOKe is the most successful in providing the bandwidth requested by legitimate users during congestion.
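
RED's drop decision, as summarised above, can be sketched in a few lines; the thresholds below are illustrative defaults, not the paper's simulation settings:

```python
# RED's drop probability rises linearly from 0 at min_th to max_p at
# max_th as the average queue grows; beyond max_th every packet drops.
import random

def red_drop(avg_queue, min_th=5, max_th=15, max_p=0.1):
    if avg_queue < min_th:
        return False                          # no early drops
    if avg_queue >= max_th:
        return True                           # forced drop
    p = max_p * (avg_queue - min_th) / (max_th - min_th)
    return random.random() < p

for q in (3, 8, 12, 20):
    drops = sum(red_drop(q) for _ in range(10_000))
    print(f"avg queue {q:2d}: ~{drops / 100:.1f}% of arrivals dropped")
```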

Examining 'Federation Learning': Assisting Development of Australia's National Security Systems

Rick Nunes-Vaz (Land Operations Division, Defence Science and Technology Organisation), Dawn Hayter (IGOR Human Science Consultancy, Urban Providore Pty Ltd, Adelaide, Australia), Dion Grieger (Land Operations Division, Defence Science and Technology Organisation), Gary Hanly (Land Operations Division, Defence Science and Technology Organisation), Chad Prior and Monique Kardos (Land Operations Division, Defence Science and Technology Organisation)

Australia's national security 'federation' is a loose coalition of agencies that must coordinate its activities to meet the needs of 'prevention', 'protection', 'response' and 'recovery' roles. While Australia does have an overarching framework to guide the way that the federation's agencies interact, the precise arrangements are necessarily fluid in order to meet the needs of particular contexts and problems. Fortunately, Australia has not been exposed to the same degree of security-related stress that has occurred in many other countries, but this means that neither our 'fitness' nor our adaptive mechanisms have been seriously tested. Like other nations, Australia uses a number of methods to assess and develop its abilities to meet national security objectives. Australia's national security Exercise Program together with information arising from events like the London bombings, for example, reveal issues that need to be examined and understood by the federation in terms of its need to change. In our role of assisting 'federation learning', we must start by understanding what learning mechanisms are in place and how the system might use and internalise relevant information to assess its fitness. In this paper we try to interpret and represent the national security federation as a complex adaptive system that must make effective use of information from non-Australian contexts or gained through simulation, and identify ways to enhance its fitness and adaptivity.

Human aspects of managing complex systems

Alex Wearing (The University of Melbourne)

The goal (at least implicitly) of many investigators working with man-machine systems has been to replace the human operator, with all of his/her limitations, by technology. In many situations, e.g. flying aircraft or using sensors, this goal has been achieved. However, in higher-level tasks, e.g. where senior officers or CEOs are typically involved, and particularly in situations where teams manage the system, e.g. boards of directors, the human operators are of fundamental importance to the control of the system, and likely to insist on remaining so.

To examine the role of the human in complex tasks we use data collected from both the field (using in-depth interviews as well as questionnaires and performance measures) and the laboratory (using simulations). We confirm the significance of a number of the characteristics of the human operator and the system. These include characteristics relating to cognitive capabilities, uncertainty and intent, and processes such as resource allocation (sometimes termed metacognition). These and others are important where human beings are involved in the management and control of complex systems. Of particular interest are those findings that bear on the question of why competent people make simple mistakes, and those findings with implications for education, especially of higher level decision makers.

Hurricane Katrina - A Whole of Government Case Study

Mark Clemente (The Boeing Company)

This paper will examine how the various responding organizations tried to adapt in a rapidly evolving situation. What was their 'fitness' for the tasks, and how did they define success? How did they adapt and learn to make better-informed, time-critical decisions? How flexible, resilient, responsive and agile were the different organizations? All complex adaptive systems learn from experience. Unfortunately, a lot of that learning goes on within communities of expertise and is never shared outside those communities. Co-adaptation and learning between communities is almost non-existent. When disparate organizations need to come together on a common challenge, the result is often a tower of Babel of misunderstanding and lost opportunities. It doesn't have to be that way.

Implementing an Adaptive Approach in Non-Kinetic Counterinsurgency Operations

Mick Ryan (Australian Army)

Counterinsurgency operations are a significant challenge for a variety of nations whose military cultures and force structures are still dominated by Cold War, conventional mindsets. John Kiszely recently wrote that adapting to counterinsurgency presents particular challenges to militaries, and that many of these challenges have at their root issues of organizational culture. As General David Petraeus has recently stated, the insurgencies in Iraq and Afghanistan were not the wars for which western militaries were best prepared at the start of the 21st century. Despite this, it is likely that insurgency will be the favoured approach of violent non-state actors well into the century because of the overwhelming superiority of conventional western military forces. Therefore, our success in these operations will be determined largely by the ability, and willingness, to adapt and succeed in the changed situation.

Challenged by counterinsurgency operations in southern Afghanistan, and the need for non-kinetic solutions to counter those seeking to terrorise the population into supporting the Taliban, the Australian Army developed the reconstruction task force concept. For eight months, the 1st Reconstruction Task Force conducted operations to rebuild the physical infrastructure of Uruzgan province, build the indigenous capacity for the conduct of engineer-related activities in their society and to support the Dutch Provincial Reconstruction Team (PRT).

This paper examines one attempt to realise an adaptive approach during the conduct of military operations. In particular, it examines how one Australian unit - the 1st Reconstruction Task Force - sought to implement an adaptive approach in the preparation for, and conduct of, its operations in southern Afghanistan. The paper starts with a description of the context - how reconstruction operations fit within counterinsurgency campaigns, and the environment in which the task force found itself. The next section describes how the task force sought to win the adaptation battle against the Taliban. The final part of the paper examines some insights for implementing an adaptive approach in military units.

This paper will offer suggestions on how to guide the transition from theory to practice in adapting military operations before and during execution. Done from the practitioner's view, it seeks an approach that is simple yet effective in assisting military organisations to adapt during the stress of operations. The key idea is that of winning the adaptation battle against a thinking, adaptive adversary.

Improving wargames using complex system practices

Anthonie van Lieburg (Netherlands Organisation for Applied Scientific Research), Peter Petiet (Netherlands Organisation for Applied Scientific Research) and Nanne le Grand (Netherlands Organisation for Applied Scientific Research)

This paper describes our ongoing efforts to improve the military wargames currently used by most military communities and by the Dutch defense. In a typical wargame scenario, military commanders and intelligence officers play the game of blue and red forces, drawing their courses of action in order to outplay one another. Most of these games, insofar as they do not require fully scripted scenarios, are based upon regular, symmetric, large-scale military operations. Their core models are usually based upon mutual attrition and require a lot of personnel. In this study we focus on two particular issues. First, the configuration of a typical wargame scenario is a complicated and time-consuming process. Second, most wargames lack active non-combatants such as civilians, who are of utmost importance for the shape and dynamics of today's battlefield. For both issues we explore the usefulness of complex (adaptive) systems knowledge and tools. Our aim is to use simple models of self-organization, both to simplify scenario configuration and to generate complex human behaviors. To do so, we study the use of various agent-based modeling approaches; in particular the well-known work of Axtell and Epstein on socio-cultural modeling, 'Sugarscape'. We believe that, although these kinds of models are a very coarse and simplified representation of reality, they are useful in generating behavioral effects that mimic real-life patterns. Incorporating these models into a wargame will confront military decision makers with the possibly unforeseen higher-order effects of their actions. Moreover, such a wargame would provide an interesting tool to support evolutionary approaches to current military challenges.
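
As an aside on mechanics: the Sugarscape family of models the authors refer to can be captured in a few dozen lines. The following is a minimal sketch of one movement-and-harvest step (the parameter values and data structures are our illustrative choices, not the authors' implementation):

    import random

    # Minimal Sugarscape-style step: agents look along the four axes up to
    # their vision range, move to the richest unoccupied cell if it beats
    # their current cell, harvest its sugar and pay a metabolic cost;
    # destitute agents die. (Sugar regrowth and other rules are omitted.)
    SIZE = 50
    sugar = [[random.randint(0, 4) for _ in range(SIZE)] for _ in range(SIZE)]
    agents = [{"x": random.randrange(SIZE), "y": random.randrange(SIZE),
               "wealth": 10, "vision": 3, "metabolism": 2} for _ in range(100)]

    def visible_cells(a):
        for d in range(1, a["vision"] + 1):
            for dx, dy in ((d, 0), (-d, 0), (0, d), (0, -d)):
                yield (a["x"] + dx) % SIZE, (a["y"] + dy) % SIZE

    def step():
        global agents
        occupied = {(a["x"], a["y"]) for a in agents}
        for a in agents:
            best = max((c for c in visible_cells(a) if c not in occupied),
                       key=lambda c: sugar[c[0]][c[1]], default=None)
            if best is not None and sugar[best[0]][best[1]] > sugar[a["x"]][a["y"]]:
                occupied.discard((a["x"], a["y"]))
                a["x"], a["y"] = best
                occupied.add(best)
            a["wealth"] += sugar[a["x"]][a["y"]] - a["metabolism"]
            sugar[a["x"]][a["y"]] = 0
        agents = [a for a in agents if a["wealth"] > 0]

Even this stripped-down rule set produces skewed wealth distributions and migration waves, which is what makes such models attractive as generators of higher-order effects in a wargame.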

Network-Centric Complex Warfare

Mervyn Cheah (Singapore Armed Forces Centre for Military Experimentation, Future Systems Directorate)

Networks are unpredictable when millions of users are communicating and sharing information. While the first generation of military operations was clearly conventional military warfare, this has not been so lately. The military has found itself involved in civilian operations, and sometimes even trying to exercise command over these culturally different organizations. The military has started to realize that its operations are becoming a complex problem. This is because civil-military operations can have undefined goals and boundaries, disunity of effort within the ad hoc organization, disparate communication networks, differing interpretations of words and even of the intent and statements of the higher authority, and the unpredictability of the citizens of the nation(s) concerned, just to name a few.

While militaries such as those of the US and other countries have started to work on the notion of Network-Centric Warfare (NCW), an equally important notion, and one that will be imperative to defence operations, is Network-Centric Complex Warfare and how we deal with it.

Network-Centric Complex Warfare, or NCCW, would be an important area of study for militaries and even the wider defence communities if militaries are to move to NCW. This is because the more we connect, the more we push decision making to the edge and the more we rely on networks, the more complex operations will become in terms of co-ordination, information sharing, trust, shared understanding and, more critically, decision making.

This presentation will cover some possible areas of work that look at NCCW. An important piece of work in this area is the Adaptive Planning and Intelligent C2 Assistant (APICA), on which the Australian DSTO and the SAF Centre for Military Experimentation (SCME) are collaborating.

Robots for Warfighting - Simplifying Complexity in Right & Wrong Ways

Patrick Hew (Defence Science and Technology Organisation)

This paper introduces a framework for thinking about autonomous (robotic) systems, and uses it to derive implications for future warfighting. It turns out that, to adequately assess the potential of future technology, the historical understanding of "situation awareness" must be expanded to include the simple way a robot perceives its environment (no matter how complex), and the potentially complex consequences of its actions (no matter how simple). Examples in the paper include the evolution of sophisticated air defence systems compared to the strategic threat environment, how modern strike weapons export problems from their physical domain to somebody else's information domain, and preserving the humanitarian intent of weapons-control conventions under emergent robotics technology.

Signatures of Game Dynamics for Intelligence and Information Operation

Hussein Abbass (Defence and Security Applications Research Centre)

In this talk, I will present one of our most recent research findings. The research question was: do footprints exist for game dynamics? Our focus has been on social dynamics, where players in a game interact according to some rules or strategies. The main finding of the research so far is that each type of game is characterised by a unique set of footprints. The significance of this is that we can recognise the type of game dynamics from a sequence of observations; hence, we can infer, for example, the intention of a group of people from a sequence of intelligence observations. I will show the use of these findings for command and control in defence and for social network mining in security.

Team Composition: Linking Individual and Team Characteristics to Team Decision-Making and Performance

Sebastian Schaefer (Institute of Technology of Intelligent Systems, Universität der Bundeswehr)

Command and Control (C2) in the 21st century is characterized by a transformation from traditional industrial-age C2 to networked information-age C2 concepts. While a requisite information infrastructure is widely recognized as an enabler of networked C2, the contribution of humans to C2 performance is still underestimated. The authors argue that knowledge of how and to what degree characteristics of individuals and teams affect networked C2 and collective action, both in teams and between coalition forces, is indispensable for an efficient implementation of information-age C2 concepts.

This paper presents results of an empirical study aimed at uncovering the effects of selected individual and team characteristics - measured by means of standard psychological tests - on team collaboration and effectiveness by means of simulation experiments. The study involved 130 teams, of four cadets and junior military officers of the German Bundeswehr each, tasked to locate targets distributed over a simplified terrain grid in a simulated operation. The results show whether and to what degree personality structures, both on the individual and team level, affect team collaboration measured in terms of shared situational awareness and task performance.

The Adaptive Stance

Anne-Marie Grisogono (Defence Science and Technology Organisation)

Adaptive Campaigning is based on a philosophy of Mission Command - a command approach which expects and makes use of the initiative, intelligence and adaptivity of those commanded. A lot of attention has been paid to how commanders should exercise mission command, and to a lesser degree, to when it should be exercised. This paper argues that it is equally important to consider how it should be received and implemented, and presents the concept of taking an Adaptive Stance as the essential complement that makes mission command work as intended. The essential elements of an Adaptive Stance will be described, and their implications for organisational policy, doctrine, training and the cultures of defence forces will be discussed. There are also implications for information collection and dissemination policies and the systems that enable and support them. Extension of the notion of an Adaptive Stance beyond the individual to teams and enterprises has further consequences for policy, doctrine and structure, and significant potential benefits for enterprise and C2 agility.

The Essential Thing: Enabling Effective Action in a Complex Security Environment

Roger Noble (Australian Command and Staff College)

This paper will examine the challenge of successfully conducting contemporary military operations within complex operating environments. It will draw directly on recent Australian operational experience, including operations in southern Iraq. It will initially examine the nature of the contemporary operating environment and highlight the implications for military forces seeking demanding mission outcomes inside a complex 'conflict eco-system'. It then proposes a comprehensive, systematic approach designed to enable effective action within the complex environment. This systemic response places the human at the centre and seeks to create a framework and culture that supports and enables independent, timely action. The paper will discuss command systems, control support, behavioural rules, intelligence and operational processes, approaches to planning, the place of technology, and the idea of a 'cerebral network' based on training, doctrine and culture.

The Importance of Complexity Theory to Modelling and Analysis, Using an NCW example

Michael Lauren (Defence Technology Agency, New Zealand)

This talk discusses how improved understanding of the behaviour of complex systems has allowed better characterisation of their statistical properties, and demonstrates the importance of this to analysis, using a Network Centric Warfare example.

The Importance of Contextually Sensitive Processes In Supporting Ad-hoc Collaborative Working In Complex Systems

Neil Carrigan (The University of Bath, United Kingdom)

One of the anticipated effects of moving to a Network Enabled Capability (NEC) environment is that ad-hoc networks of agents in the battle-space are likely to form spontaneously in order to create effect. These networks may span operational or even national domains, each with different cultures and working practices, adding to the complexity of the system within which the human (and autonomous) agent must operate. For example, in peacekeeping scenarios, military personnel may have to collaborate with civilian and non-governmental organisations, across a number of locations, with only limited communications capacity, as part of a rapidly forming, spontaneous networked capability. As a consequence, different operational units, and individuals within those units, will need to share information and collaborate closely to create effect at optimum tempo. However, when people begin to work in this way there is the potential for contextual factors (differences in working practices, policy and culture), both inter- and intra-organisational, to impact negatively on the effectiveness and efficiency of collaboration, both at the level of the individual and of the network. All of this can lead to collaboration 'breakdown', which requires time and effort from individuals to repair and results in a loss of system agility. This paper considers breakdowns that occur when people collaborate in complex systems and the additional challenges a NEC environment brings. Special attention is given to context and to how contextual factors influence the way in which collaborative work is done, over and above the explicit procedures and interactions that are necessary for people to collaborate. Understanding which of these factors are most important in militating against collaboration breakdown provides guidance to designers of collaborative support technologies regarding what background contextual information needs to be shared across the network.

The Rosetta-II Project: Measuring National Differences in Complex Cognition

Richard Warren (Air Force Research Laboratory)

Professionals in complex scientific, business, military and humanitarian domains must identify problems, engage in sensemaking, weigh options, and coordinate with others in order to make decisions and take action. During multinational interchanges, cultural differences in cognition can compromise the productivity, safety, and quality of work. A growing body of research challenges the universal nature of cognition and points to the need for understanding national differences in cognition. The Rosetta-II project is an unprecedented study in cultural cognition, taking place with over 1000 subjects in China, Japan, Korea, Malaysia, Singapore, India, and the US. Extending the work of Richard Nisbett, the study focuses on developing a set of perceptual and simple cognitive measures that can validly predict complex cognition related to sensemaking in a natural context. The study is also designed to delineate the relationships between these simple measures and complex cognitive functions, as well as to determine national differences in cognitive measures. In addition to the common measurement tools to be used for the research, the Korean and Japanese teams will develop measures for "Holism" and "Decision-making Behavior" respectively, and apply them in their studies.

Whole of Government Operations

Ed Smith (Boeing)

Asymmetric challenges from Bosnia and Kosovo to Iraq and Afghanistan have focused world attention on the human dimension of conflict and the complex nature of the post-9/11 security environment. With this focus has come a growing consensus that responses to such challenges cannot be confined to military action alone. The result has been a growing interest both in the socio-cognitive aspects of competition and conflict and in effects-based approaches to 'comprehensive', whole-of-government/nation/coalition action. Indeed, Ashby's Law of requisite variety suggests that the breadth and diversity of options that nation-states and coalitions can bring to bear are a potentially decisive strength in asymmetric conflict.

The problem, however, is the word 'potentially'. Nation-states and coalitions of states may have the diversity of options, but they have been notoriously bad at breaking through stovepipe organizational barriers to tap the resources available inside their governments, much less at identifying or tapping those to be found in an entire nation. Recent attempts to deal with on-going conflicts and natural disasters have only served to underline the problem.

At the root of these difficulties is the challenge of dealing with complexity. The effects-based, socio-cognitive nature of the problem and the need for a holistic response demand more than data and information. They mean acquiring, vetting, analyzing, understanding, and using contextualized complex knowledge and expertise in a plethora of subject areas across the stovepipes of multiple agencies, organizations, and governments. Yet the complex nature of the knowledge and expertise to be shared and coordinated means that 'sharing' is less a function of communications than of networking human interfaces able to translate and relate the knowledge of one stovepipe to the needs of another. Moreover, given the complex adaptive nature of the actors, there is no 'one stop' or 'one size fits all' answer; organizations must instead continually learn and adapt in all of these functions and areas. How then are governments and organizations to adapt their organization and social networking to deal with the challenge?

Social Networks and Epidemiology

A Hybrid Agent-Based and Network Modelling Framework of Contagion Diffusion in Social Networks

Paul Box (CSIRO Sustainable Ecosystems) and Yiheyis Maru (CSIRO Sustainable Ecosystems, Alice Springs)

In epidemiology most attention is focused on the infectious characteristics of a biological contagion, and on the risks and behaviours of individuals. Diffusion of a contagion through a population is a function of interaction between individuals, either through direct contact (as in sexually transmitted diseases) or through contact with a common vector. Interaction between individuals is highly influenced by social networks, which shape both the probability of interaction and the kinds of interaction that will occur (e.g., norms and rules for physical contact). Any innate behaviour of two individuals that contributes to the diffusion of a contagion between them may be tempered by their established relationship to each other, and by the wider social network to which they belong. Contagion diffusion is also largely dependent on environmental, demographic, economic, geographic, or other factors in the wider environment. Modelling this diffusion at the individual, network, or demographic/environmental scale requires assumptions about behaviours, relations, and environmental and societal structures at one scale that may be incompatible with assumptions at another scale, and modelling contagion diffusion at any one of these scales loses the important role of interaction between processes and structures at the different scales. We present a modelling framework that utilizes agent-based models of contagion diffusion within a social network structure and simultaneously integrates processes at all three scales. By representing populations of agents, networks, and coarser scale structures as swarms (collections of agents with common schedules of concurrent interaction), complex patterns of contagion diffusion through populations spontaneously emerge. By comparing the differences in behaviour between simulations with differing social networks, where individual behaviours and environmental and demographic properties are held constant, the relative importance of social network structure on contagion diffusion is demonstrated. By modelling all three scales simultaneously, the importance of interaction of processes between scales is demonstrated.
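
To make the network layer concrete, the sketch below shows the kind of tie-modulated transmission step such a framework builds on: the probability that a contact transmits the contagion is tempered by the strength of the established relationship (the names, structures and rates here are our illustrative assumptions, not the authors' framework):

    import random

    # Toy network contagion step: per-contact transmission probability is
    # scaled by the dyad's tie strength, standing in for the norms and
    # kinds of interaction a relationship permits.
    base_p = 0.1                                     # innate transmissibility
    ties = {(0, 1): 0.9, (1, 2): 0.4, (2, 3): 0.7}   # dyad -> tie strength
    neighbours = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
    infected = {0}

    def spread_once():
        new = set()
        for i in infected:
            for j in neighbours[i]:
                if j in infected or j in new:
                    continue
                w = ties.get((i, j), ties.get((j, i), 0.0))
                if random.random() < base_p * w:
                    new.add(j)
        infected.update(new)

In the full framework described above, the tie strengths and the network itself would in turn be driven by the demographic and environmental scales rather than held fixed.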

Assessing Uncertainty In The Structure Of Ecological Models Through A Qualitative Analysis Of System Feedback And Bayesian Belief

Jeff Dambacher (CSIRO Land and Water)

Ecological predictions and management strategies are sensitive to uncertainty in model structure and variability in model parameters. Systematic analysis of the effect of alternative model structures, however, is often beyond the resources typically available to ecologists, ecological risk practitioners, and natural resource managers. Many of these practitioners are also using Bayesian Belief Networks based on expert opinion to fill empirical information gaps. The practical application of this approach, however, is limited by the need to populate large conditional probability tables and the complexity associated with ecological feedback cycles. In this paper, we describe a modelling approach that helps solve these problems by embedding a qualitative analysis of sign directed graphs into the probabilistic framework of a Bayesian Belief Network. Our approach incorporates the effects of feedback on the model's response to a sustained change in one or more of its parameters, provides an efficient means to explore the effect of alternative model structures, mitigates the cognitive bias in expert opinion, and is amenable to stakeholder input. We demonstrate our approach by examining two published case studies: a host-parasitoid community centered on a non-native, agricultural pest of citrus cultivars and the response of experimental lake mesocosms to nutrient input. Observations drawn from these case studies are used to diagnose alternative model structures and to predict the system's response following management intervention.

Boolean networks as models of social behaviour

Tania Leishman (Monash University) and David Green (Monash University)

Many aspects of social organization (e.g. consensus, cooperation) emerge from mutual relationships and the influence of peers. Boolean network social (BNS) models represent social groups as networks in which the nodes represent individuals ("actors", "agents") and have binary states (e.g. AGREE/DISAGREE). The edges represent relationships between pairs of individuals and their potential interactions or influence. In BNS models, social behaviour emerges from complex networks of interactions between peers, rather than from complex individuals. This emergent behaviour differs in subtle but important ways from that of the corresponding Markov or dynamic system models. BNS models show that peer-peer interactions lead to irregular, abrupt changes rather than smooth transitions, and that they at first impede changes in state within the network but accelerate change once a critical point is passed. BNS models are sensitive both to the degrees of influence associated with social connections and to the network's topology. They also reveal how these network topologies can emerge under different regimes of peer-peer interactions that involve reinforcing or weakening links. Despite their simplicity, BNS models yield potentially important insights into many social issues; our studies have addressed the emergence of social consensus and cooperation, the influence of media, and the maintenance of law and order.
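
A minimal example of this class of model, with a weighted majority-influence update rule (our illustrative rule and parameters, not the specific scheme used in the paper):

    import random

    # Boolean network social model: binary opinions updated by the weighted
    # influence of network neighbours; a node flips only under sufficiently
    # strong peer pressure, which produces abrupt rather than smooth change.
    n = 100
    state = [random.randint(0, 1) for _ in range(n)]
    links = {i: [(random.randrange(n), random.random()) for _ in range(4)]
             for i in range(n)}           # node -> [(neighbour, weight), ...]

    def update():
        global state
        nxt = state[:]
        for i, nbrs in links.items():
            pull = sum(w if state[j] else -w for j, w in nbrs)
            if abs(pull) > 0.5:           # threshold for peer pressure to act
                nxt[i] = 1 if pull > 0 else 0
        state = nxt

    for t in range(50):
        update()
    print(sum(state), "of", n, "nodes in state 1")

Sweeping the threshold, or rewiring the links while tracking the consensus fraction, exhibits the critical-point behaviour described above.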

Complexity and the obesity epidemic

Matthew Beaty (CSIRO Sustainable Ecosystems), Cathy Baker (ACT Health), Cathy Banwell (National Centre for Epidemiology and Population Health, The Australian National University), Guy Barnett (CSIRO Sustainable Ecosystems), Helen Berry (National Centre for Epidemiology and Population Health, The Australian National University), Jane Dixon (National Centre for Epidemiology and Population Health, The Australian National University), Rob Dyball (The Australian National University), Sharon Friel (National Centre for Epidemiology and Population Health, The Australian National University), Amy Griffin (The University of New South Wales; and Australian Defence Force Academy) and Katrina Proust (The Australian National University)

Obesity can be conceptualised as an emergent property of contemporary social-ecological systems with complex multiscale dynamics. Complex systems science and related emerging research areas offer important perspectives and analytical approaches for moving beyond a focus on discrete parts of the obesity problem, narrow interventions, and isolated solutions. Of particular relevance is emerging theory on the dynamics of coupled social-ecological systems. This research focuses on the unique systems that arise when global, national and local level environmental, political, socio-cultural and socio-economic processes intersect and influence one another. Obesity may be considered an inevitable, if unintended, consequence of these complex processes. To date, research efforts and policy and practice responses have been fragmented, often focusing separately on (i) behavioural factors (e.g., individual diet and physical activity), (ii) social factors (e.g., societal processes that shape population diet and physical activity) and (iii) environmental factors (e.g., the influence of built form on food availability and physical activity). While these research areas have provided many important insights into the obesity crisis, obesity remains an unsolved and rapidly growing global public health problem. The evidence to date suggests that a new approach is needed, one that considers obesity as a complex systems problem. This means adding to linear conceptualisations of causal pathways a recognition of multiple components, or subsystems, interacting in non-linear fashion. A key research challenge is to describe the structure of the social-ecological system leading to obese populations and then to identify the levers or intervention points that may be effective in transitioning populations towards healthy weight.

Epidemic Spread Modelling: Alignment of Agent-based Simulation with a SIR Mathematical Model

Alex Skvortsov (HPP Division, Defence Science and Technology Organisation), Russell Connell (Defence Science and Technology Organisation), Peter Dawson (HPP Division, Defence Science and Technology Organisation) and Ralph Gailis (HPP Division, Defence Science and Technology Organisation)

Social contacts are an important channel for the propagation of disease through a population and should be considered in conjunction with traditional epidemic diffusion due to meteorological advection. Such channels should always be taken into account for a realistic estimation of the long-term impact of a disease outbreak (natural or malicious) and for choosing the best response options (e.g. an optimal immunisation strategy). This paper describes our recent experience in the development and validation of a simple agent-based model to simulate such an event. The software platform for our model was AUE (Advanced Urban Environment), a high-fidelity urban representation developed by DSTO. It consists of various components covering the objects included within an urban confine, together with software tools for their manipulation and for the extraction and utilisation of the statistical information associated with AUE objects. The social dynamics of the population were simulated with the CROWD tool (part of the AUE suite), which allowed us to instantiate software agents (i.e. individuals) based on the provided Census data and to associate agents into social groups (by age, household, professional network etc). With the AUE framework, we modelled the residents of a small township (about 3000 people) as they went about their lives (home, work, school, kinder, shopping etc), and we studied the spread of disease through social interactions in this context. A simplified scenario of disease transmission was adopted in which each agent had only three internal "epidemic" states (S - Susceptible, I - Infected, R - Recovered). These states could change as a result of social interaction, and the process of disease transmission could be characterised by only three parameters (the probabilities of transmission and recovery, and the recovery period). In order to validate our new agent-based model we aligned it with the well-known SIR epidemiological model for simplified scenarios (a system of coupled differential equations for the time evolution of each SIR group). For such scenarios we established quantitative criteria under which the new AUE model and the traditional SIR epidemiological model produce equivalent results. Careful alignment of the output of both models (including model parameters, scenarios and underlying assumptions) gave us the sense of validity needed to develop a realistic disease spread model in a complex multi-agent social context. We believe that our new agent-based model for disease outbreaks provides a cost-effective, ethical tool for reasoning about such events, for simulating typical "what-if" scenarios, and for evaluating various response options.
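
For reference, the SIR model used for the alignment is the standard system of coupled differential equations dS/dt = -bSI/N, dI/dt = bSI/N - gI, dR/dt = gI. A minimal Euler integration, against which ensemble-averaged agent counts can be compared, looks like this (parameter values are illustrative only, not those of the study):

    # Classical SIR trajectory for a township-sized population.
    N = 3000.0                   # population, matching the scenario above
    S, I, R = N - 1.0, 1.0, 0.0  # a single initial infection
    b, g, dt = 0.4, 0.1, 0.1     # transmission rate, recovery rate, time step
    history = []
    for _ in range(int(200 / dt)):
        new_inf = b * S * I / N * dt
        new_rec = g * I * dt
        S, I, R = S - new_inf, I + new_inf - new_rec, R + new_rec
        history.append((S, I, R))
    # Alignment: run the agent-based model many times with matched b and g,
    # average the S/I/R counts over runs, and compare against `history`.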

Local versus global processes on networks: Simple processes with complex dynamics

Peter Whigham (University of Otago)

Evolutionary dynamics for the Moran process have previously been examined within the context of fixation probability for introduced mutants, where it was demonstrated that certain topological (network) structures act as amplifiers of selection. This talk will revisit the assumptions of this spatial Moran process and show that proportional global fitness, introduced as part of the Moran process, is necessary for the amplification of selection to occur. We will show that under local proportional fitness selection the amplification property no longer holds. In addition, regular network structures are shown to have a fixation probability different from that of a fully-connected population when local selection is applied. Theoretical results from population genetics, which suggest fixation probabilities are independent of geography, are discussed in relation to these local network-based models and shown to rest on different assumptions, and therefore not to be in conflict with the results of this work. Local operators on network topologies are discussed to show how small changes in the assumptions regarding processes on a network can have significant effects on behaviour.
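
The baseline quantity at issue is the fixation probability of a single mutant of relative fitness r in a well-mixed population of size N, for which the Moran process gives the closed form rho = (1 - 1/r) / (1 - 1/r^N). A Monte Carlo check of that formula under global proportional selection (our own illustrative sketch):

    import random

    def moran_fixation(N=50, r=1.1, trials=2000):
        """Estimate the probability that one mutant (fitness r) among N-1
        residents (fitness 1) takes over, under the Moran process: the
        reproducer is chosen proportional to fitness, the individual to
        be replaced is chosen uniformly."""
        fixed = 0
        for _ in range(trials):
            m = 1                          # current number of mutants
            while 0 < m < N:
                birth_mut = random.random() < m * r / (m * r + (N - m))
                death_mut = random.random() < m / N
                m += int(birth_mut) - int(death_mut)
            fixed += (m == N)
        return fixed / trials

    # compare: moran_fixation() versus (1 - 1/1.1) / (1 - 1.1 ** -50)

Replacing the global fitness-proportional birth step with a purely local rule is exactly the modification whose consequences the talk examines.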

Missing data in social network analysis

Johan Koskinen (The University of Melbourne)

In statistical analysis of social networks we are typically dealing with observations that are not independently observed. This generally prevents us from using the standard missing-data principles available in the statistical literature. In this paper we present four different perspectives on missingness in social network analysis and outline statistical strategies for dealing with them. These strategies correspond roughly to four different kinds of data problem. The first arises when you lack information on the interaction for some pairs (dyads) of actors but the research setting is otherwise of a standard social network form; the information may be missing because of a lack of monitoring of individual dyads or non-response on the part of some egos. The second approach deals with the case when you have several partially overlapping, and possibly contradictory, reports on one and the same network. It is assumed that there exists a true but missing configuration to which these reports relate, and our interest is in examining the accuracy of the reports as well as estimating the structural features of the unobserved, true network. A third approach deals with network data where there might be missing actors, i.e. an unknown number of vertices are missing from the studied graph. Lastly, the fourth approach considers statistical inference for networks where the true identities of the vertices are not known, so that multiple observed actors may map to the same single true vertex. This is relevant, for example, when we try to merge several ego-nets into one large network representation. In the present paper we describe the first approach in more detail and provide some illustrative examples. The proposed inference scheme consists of making a few simple assumptions regarding what causes data to be missing, upon which we may implement a Markov chain Monte Carlo algorithm for exploring the joint posterior of the parameters in a model for the network and the missing information. The missing information is effectively treated as yet another parameter block to be estimated.
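
As a toy illustration of that first scheme, one can treat the missing dyads as an extra block of unknowns and alternate between updating the model parameter given complete data and re-imputing the missing ties. The sketch below does this for a deliberately simplified Bernoulli (density-only) network model with a Beta prior, standing in for the richer network models used in practice:

    import random

    # Gibbs sampler: density parameter p with a Beta(a0, b0) prior;
    # observed dyads are fixed, missing dyads are re-imputed each sweep.
    observed = {(0, 1): 1, (0, 2): 0, (1, 2): 1}   # dyad -> tie (toy data)
    missing = [(0, 3), (1, 3), (2, 3)]             # unobserved dyads
    a0, b0 = 1.0, 1.0

    def gibbs(iters=5000):
        imputed = {d: 0 for d in missing}
        samples = []
        for _ in range(iters):
            # p | complete data: Beta-Bernoulli conjugate update
            ones = sum(observed.values()) + sum(imputed.values())
            total = len(observed) + len(imputed)
            p = random.betavariate(a0 + ones, b0 + total - ones)
            # missing ties | p: independent Bernoulli draws in this toy model
            for d in missing:
                imputed[d] = 1 if random.random() < p else 0
            samples.append(p)
        return samples

With a structured model (e.g. an exponential random graph model) the conjugate draw is replaced by a Metropolis-Hastings step, but the alternation between the parameters and the missing ties is the same.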

Social influence models

Galina Daraganova (School of Behavioural Science, University of Melbourne), Philippa Pattison (School of Behavioural Science, University of Melbourne), Garry Robins (School of Behavioural Science, University of Melbourne) and Peng Wang (School of Behavioural Science, University of Melbourne)

In this paper we review a general class of models for location-dependent interactive social processes originally developed by Robins et al (2001). Here we present an extension of the social influence model based on non-Markovian neighbourhood assumptions. The model expresses interdependent actor attributes as a function of exogenous relational variables, other exogenous attribute variables, and spatial location. Model parameters reflect a variety of different influence effects, and we discuss the theoretical basis of each of these effects. The application of the model is discussed in the context of a study designed to examine the potential role of social networks and individual-level factors in understanding persistent patterns of spatial clustering in unemployment in Australia.
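
In schematic form (our notation, not necessarily the authors'), models of this kind are exponential-family distributions for the vector of actor attributes Y conditional on the network and covariates x:

    P(Y = y \mid X = x) = \frac{1}{c(\theta, x)} \exp\Big( \sum_{A} \theta_A \, z_A(y, x) \Big)

where each statistic z_A(y, x) counts a type of configuration (an actor's own covariates, attribute-tie-attribute patterns among network partners, attribute-by-spatial-proximity terms), theta_A is the corresponding influence effect, and c(theta, x) is the normalising constant. Non-Markovian neighbourhood assumptions enlarge the set of configurations that are permitted to carry non-zero parameters.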

Social network approaches to hepatitis C virus epidemiology: outcomes from consecutive studies in Melbourne, Australia

Campbell Aitken (Burnet Institute), Rhonda McCaw (Victorian Infectious Diseases Reference Laboratory), Scott Bowden (Victorian Infectious Diseases Reference Laboratory), Mandvi Bharadwaj (The University of Melbourne) and Margaret Hellard (Burnet Institute)

The Hepatitis C Virus (HCV) affects 170-200 million people worldwide; an estimated 200,000 Australians are chronically infected, and another 7,000-13,000 new infections occur each year, almost all in people who inject illicit drugs. We have been using social network methods to study HCV in injecting drug users (IDUs) in Melbourne since early 2000. Our first study, Networks I (N1), involved 199 IDUs (197 in one connected component) interviewed once only, and employed molecular epidemiology to assess the overlap between potential and actual transmission pathways. N1 produced several important insights into HCV epidemiology, including the discovery that a subset of long-term IDUs were performing repeated high-risk behaviours with infected partners but had no evidence of exposure to the virus, suggesting the existence of immune protection from HCV. That observation led to the inclusion of immunological expertise in Networks II (N2), a longitudinal study which incorporates 251 IDUs (over 70% of whom have been interviewed at least twice to date) in a relatively fragmented social network. Preliminary molecular analysis has shown how HCV is being spread across Melbourne, with genetically related infections detected in IDUs living 50 km apart. In addition, we have confirmed that network factors are indeed important in determining exposure to HCV, with the discovery that IDUs' HCV status is significantly and independently associated with the infection status and age of first injection of their network members. These and other findings from N1 and N2 will be presented and discussed.

State Estimation of Complex Social Networks

Daniel McMichael (CSIRO) and John Ward (CSIRO)

The dominant analytical tool for social networks is Social Network Analysis (SNA), which provides summaries of interaction strengths between components of a social system. SNA conflates temporal interactions to show measures of average or total interaction. We develop a socio-temporal model, the social interaction network (SIN), that allows the dynamics of individual decisions and social interactions to be modelled and provides a basis for state estimation, prediction and guidance for likely social system trajectories. The approach provides a framework for optimising policy prior to implementation. The techniques we propose represent social interactions as grammatical productions. The state of the social system is represented by a grammar tree comprising a set of such productions. State estimation becomes a process of parsing under the available domain and observation constraints. The state estimator can be trained using previously observed interaction data. The authors originally developed this approach for the domain of military battle management. The authors now examine the problem of policy instrument design for the encouragement of revegetation and the improvement in farm incomes for the coming era of carbon trading. Individual decisions are based on: (i) physical environmental factors, (ii) microeconomic factors, (iii) farmer attitudes, and (iv) modes of information and social interaction. We show how such a system can be modelled using computationally viable deontics described as the 'grammar of institutions' [6], contributing to ex ante performance analysis of policy initiatives.

References

[1] Daniel McMichael, Geoff Jarrad and Simon Williams, 'Situation Assessment with Generalized Grammar', Information Fusion Journal, 2006 (in press).

[2] Daniel McMichael, Simon Williams and Geoff Jarrad, 'Functional Combinatory Categorial Grammar', 2006 (submitted to the Journal of Artificial Intelligence Research).

[3] Daniel McMichael and Geoff Jarrad, 'Grammatical Methods for Situation and Threat Analysis', Proc. International Conference on Information Fusion (Fusion 2005), Philadelphia, PA, July 2005.

[4] Daniel McMichael, Geoff Jarrad, Simon Williams and Michael Kennett, 'Modelling, Simulation and Estimation of Situation Histories', Proc. International Conference on Information Fusion (Fusion 2004), Stockholm, Sweden, June 2004, pp. 928-935.

[5] Brian T. Pentland, 'Grammatical Models of Organizational Processes', Organization Science, Vol. 6, No. 5 (Sep.-Oct. 1995), pp. 541-556.

[6] Sue E. S. Crawford and Elinor Ostrom, 'A Grammar of Institutions', The American Political Science Review, Vol. 89, No. 3 (Sep. 1995), pp. 582-600.

Understanding social networks in free-ranging cattle

Dave Swain (CSIRO)

Understanding social networks has important implications for disease management. Developing a robust quantitative evaluation of a social network relies on observing all contacts between all individuals within a group. This paper presents social network data collected from small groups of cattle using transceiver contact loggers, which provided a continuous record of the time and duration of close-proximity contacts between all of the individual cows in the group. The data from the contact loggers have provided a unique insight into the social interactions of groups of cattle. Observing social hierarchies, regrouping and changes in social behaviour can help us understand not only disease movement in groups of cattle but also provide an indication of changes in the physiological state of individual animals. Further work aims to explore the genetic determinants of social behaviour. Transceiver contact loggers provide a valuable tool for determining social networks in free-ranging animals.

Social Science and Management

A way for leading in times of turbulent and consistent change

Roderick Cross (Dept Primary Industries & Fisheries (Qld)) and Gillian Ching (Department Primary Industries & Fisheries)

Disturbance, uncertainty and dialogue are unlikely ingredients for managing complexity and turbulent change. The Queensland Department of Primary Industries and Fisheries proposes a paper on its innovative approach to strategic engagement, which harnesses these elements to work with human living organisation systems to:

  • Challenge the assumptions that support the system
  • Tackle the intractable issues that work against organisational performance
  • Empower and ignite passion to seek new and creative solutions
  • Surface beliefs and thinking to strengthen understanding and guide decision making

The presentation will consider the approaches used in the design, implementation and follow-through of the processes, which invite full participation in the interpretation of complexity and sense making, the emergence of real issues, a shared sense of meaning, and the co-creation of solutions that deliver immediate results and lasting change.

Over a three-year period the approach has been applied to a range of complex social systems, including internal public service groups and external stakeholder groups spanning community, industry and enterprises. The process evolution has taught some surprising and disturbing lessons for anyone embarking on change management and organisational performance work.

A number of case studies will be provided to illustrate the benefits, outcomes and risks of the processes.

A summary of the key recommendations will close the talk.

Complexity of the Australian public's orientation towards low emission technologies

Simone Carr Cornish (CSIRO), Stephen Fraser (CSIRO), John Gardner (CSIRO), Peta Ashworth (CSIRO) and Anna Littleboy (CSIRO)

Corresponding author: Simone Carr Cornish <Simone.CarrCornish@csiro.au>

There are a number of low emission technologies that support the reduction of greenhouse gas emissions from electricity generation. However, society's likely acceptance of these technologies is characterised by complexity. This complexity has been investigated using a self-organising map (SOM) approach. The SOM approach is an unsupervised data-mining technique that uses ordered-vector quantization to resolve linear and non-linear relationships within complex data. A SOM approach was used to investigate societal attitudes and trends collected via a survey of the Australian population (n=2700). The survey used a social psychology framework to assess participants' perspectives, reflected by ratings of attitudes towards, knowledge of, and the relevance of specific low emission technologies. Initial SOM analyses identified a number of societal orientations towards low emission technologies, ranging from pro-environmental approaches to scepticism about the threat of climate change. The drivers underlying these orientations were knowledge of energy issues, support for action on climate change, support for nuclear energy, support for carbon capture and storage, and support for renewable energy. The groups identified by the SOM process were further explored to identify the demographic characteristics of each orientation. This study confirmed that the Australian public's orientation towards low emission technologies is complex and that a range of distinct orientations exists.
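
For readers unfamiliar with the technique, the core of a SOM is a small training loop: each grid cell holds a weight vector, and the best-matching cell and its neighbours are pulled toward each sample, so similar survey responses settle into nearby cells. A minimal numpy sketch (grid size, learning rates and data shape are our placeholders, not the study's configuration):

    import numpy as np

    def train_som(data, rows=10, cols=10, iters=5000, lr0=0.5, sigma0=3.0):
        """Fit a self-organising map to `data` (samples x features)."""
        rng = np.random.default_rng(0)
        w = rng.random((rows, cols, data.shape[1]))
        grid = np.stack(np.meshgrid(np.arange(rows), np.arange(cols),
                                    indexing="ij"), axis=-1)
        for t in range(iters):
            x = data[rng.integers(len(data))]
            d = np.linalg.norm(w - x, axis=-1)
            bmu = np.unravel_index(np.argmin(d), d.shape)  # best-matching unit
            frac = t / iters
            lr, sigma = lr0 * (1 - frac), sigma0 * (1 - frac) + 0.5
            dist2 = ((grid - np.array(bmu)) ** 2).sum(axis=-1)
            h = np.exp(-dist2 / (2 * sigma ** 2))[..., None]  # neighbourhood
            w += lr * h * (x - w)                             # pull toward x
        return w

    # e.g. weights = train_som(np.random.rand(2700, 12))  # 2700 respondents

Clusters of cells with similar weight vectors then correspond to the distinct 'orientations' described above.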

Governance by Social Network: A Multiscale Analysis of Communication Efficiency

Adam Dunn (Alcoa Research Centre for Stronger Communities, Curtin University of Technology) and Daniela Stehlik (Alcoa Research Centre for Stronger Communities, Curtin University of Technology)

A multiscale approach to modelling social networks provides an alternative method for analysis of social metrics including cohesiveness, betweenness and efficiency. The efficiency of a company, government department or community group is complicated by individuals taking on multiple roles within their employ and within their community. We propose a method of analysis that builds a hierarchy; using professional, community and social links as intralevel connections, and aggregation of groups by role as interlevel connections. When applied to members of the South Coast Regional Initiative Planning Team (SCRIPT), a Federal & State government funded natural resource management organisation, our research suggests that the efficiency of communication within the wider community of conservation-minded individuals may be adversely affected by multiple-role individuals. This suggestion is contrary to the intuitive result that would favour a positive influence of a higher quantity of communication. We develop measures of cohesion and efficiency for a hierarchical view of the social network across professional, community and social dimensions. It is a novel approach, using a natural resource management organisation as a case study to explore these issues further. This research forms part of a larger international research project into conservation and sustainability in the south coast of Western Australia currently underway at Curtin University of Technology.
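
Of the metrics mentioned, global efficiency is perhaps the least familiar: it is the average of the inverse shortest-path lengths over all pairs of nodes, so fragmented or stretched communication paths drag the score down. A plain-Python version over an adjacency list (our own minimal implementation, not the multiscale machinery developed in the paper):

    from collections import deque

    def global_efficiency(adj):
        """adj maps node -> iterable of neighbours. Returns the mean of
        1/d(i, j) over ordered pairs i != j; unreachable pairs count 0."""
        nodes = list(adj)
        total, pairs = 0.0, 0
        for s in nodes:
            dist = {s: 0}                 # BFS shortest paths from s
            q = deque([s])
            while q:
                u = q.popleft()
                for v in adj[u]:
                    if v not in dist:
                        dist[v] = dist[u] + 1
                        q.append(v)
            for t in nodes:
                if t != s:
                    pairs += 1
                    total += 1.0 / dist[t] if t in dist else 0.0
        return total / pairs if pairs else 0.0

In the multiscale setting, the same measure can be evaluated separately over professional, community and social link types, and again over the aggregated role hierarchy.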

Hop, Step and Jump! - The application of a Self Organising Map (SOM) approach to the measurement of change in organizations

W.J. Parry (ChangeTrack Research), Stephen J. Fraser (CSIRO Exploration and Mining), Bruce Hobbs (CSIRO Exploration and Mining) and C.W. Peake (ChangeTrack Research)

The classical way of analysing change within large organisations is to measure the attitudes or operations of segments of the company, take averages, and then attempt to assemble all the data into one analysis that hopefully reflects the status of the company at that time. Improvements are then attempted by influencing various parts of the organisation without looking at the feedback effects these perturbations will have on the dynamics of the whole organisation.

However, the evidence shows that this approach fails to achieve the desired change outcomes; it is the norm, not the exception, for change to go off track rather than to remain on track. The cost to the business is high, involving both financial loss and a negative impact on customer service and staff morale.

Recognising that a large organisation behaves as a complex system, and that measurement involves mapping and modelling multivariate and non-linear dimensions, CSIRO and ChangeTrack Research have worked together to develop a unique application of the SOM methodology to measure and manage change. Our approach involves the following:

  • Build very large data sets covering long time sequences, measuring multiple dimensions of change in Australian and international companies.
  • Develop graphical ways of visualising and analysing data, showing what needs to be done to the system in order to 'guide' it in another direction and then track those changes.
  • Optimise the tracking system that an organisation must follow in order to progress from where it is now to the behaviours that define maximum performance.

This approach breaks new ground in tracking organisational change and performance measurement. There are research challenges in linking 'soft and fuzzy' people and change measures to 'hard' organisational outcomes; however, this approach has already been used successfully to bring a major change project back on track.

Turbulence

A world tour of dynamical systems, stability, and chaos

Rowena Ball (The Australian National University) and Philip Holmes (Princeton University)

This journey begins with a visit to a large nineteenth century house in northern England, where we meet the residents. The route then takes us to a museum in Regensburg, Germany, thence to the 41st floor of an office tower in downtown Hong Kong, and finally to the port of Botany Bay, where we take in the complicated chiaroscuro of an oil refinery. During the tour, aspects of the mathematics of dynamical systems, stability, and chaos will be reviewed within a historical framework that draws together the two major threads of its early development, celestial mechanics and control theory, focussing on qualitative theory. From this perspective we show how concepts of stability enable us to classify dynamical equations and their solutions, and connect the key issues of nonlinearity, bifurcation, control, and uncertainty that are common to time-dependent problems in natural and engineered systems. Some new results on stability of complex networks and refined turbulence will be presented.

Bifurcation in Resistive Drift Wave Turbulence

Ryusuke Numata (The Australian National University), Rowena Ball (The Australian National University), Robert Dewar (The Australian National University) and Linda Stals (The Australian National University)

Fusion plasmas and other turbulent flows in two dimensional (2D) geometry can undergo a spontaneous transition to a turbulence suppressed regime. In plasmas such transitions dramatically enhance the confinement and are known as L-H transitions. From theoretical and experimental work, it is now widely believed that generation of stable coherent structures such as shear flows suppresses cross-field turbulent transport and leads to the confinement improvement.

We have analyzed the modified Hasegawa-Wakatani (MHW) model, which describes electrostatic resistive drift wave turbulence in 2D slab geometry, by numerical simulation. We have shown that the coherent zonal flow generated in a certain parameter range is lost if the strength of the driving due to the inhomogeneous background density is increased or the electron adiabaticity parameter is decreased. The system exhibits a sudden transition from the zonal-flow dominated state to the turbulence dominated state. This transition may be ascribed to an instability of the zonal flow.

In this study, we consider the Kelvin-Helmholtz (K-H) type instability of the generated zonal flow. Linearization of the MHW equations around the equilibrium containing the zonal flow V(x)=V0 sin(kx) yields the Rayleigh eigenvalue equation, modified by the effects of the electron adiabaticity and the background density profile. By analyzing this equation, we discuss the relation between the transition in the MHW system and the stability of the zonal flow against the K-H instability.
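
The unmodified version of that eigenvalue problem is straightforward to set up numerically. For perturbations proportional to exp(iq(y - ct)) about the flow V(x), the classical Rayleigh equation (V - c)(phi'' - q^2 phi) = V'' phi becomes a generalized matrix eigenproblem for the complex wave speed c; a finite-difference sketch (the adiabaticity and density-gradient corrections discussed above are deliberately omitted):

    import numpy as np
    from scipy.linalg import eig

    # Rayleigh stability of V(x) = V0 sin(k x):  A phi = c B phi
    n, V0, k, q = 200, 1.0, 1.0, 0.5
    x = np.linspace(0.0, 2.0 * np.pi, n)
    h = x[1] - x[0]
    V = V0 * np.sin(k * x)
    Vpp = -V0 * k ** 2 * np.sin(k * x)          # analytic V''(x)

    D2 = (np.diag(np.ones(n - 1), 1) + np.diag(np.ones(n - 1), -1)
          - 2.0 * np.eye(n)) / h ** 2           # d^2/dx^2 (Dirichlet ends)
    B = D2 - q ** 2 * np.eye(n)
    A = np.diag(V) @ B - np.diag(Vpp)
    c = eig(A, B, right=False)                  # complex wave speeds
    print("max growth rate:", q * c.imag.max()) # > 0 indicates K-H instability

Adding the adiabaticity and background-density terms modifies A and B but leaves the structure of the eigenproblem, and hence this solution strategy, unchanged.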

Eddy structure in the Roughness Sub-Layer

John Finnigan (CSIRO Centre for Complex Systems Science), Roger Shaw (University of California, United States of America), Ned Patton (National Center for Atmospheric Research, USA) and Ian Harman (CSIRO Centre for Complex Systems Science)

In the lowest part of turbulent boundary layers above deep roughness layers such as forests or cities, the turbulent statistics are distinctly different from those in the logarithmic or 'Inertial Sub-Layer' (ISL) above. In this 'Roughness Sub-Layer' (RSL) the eddies are more coherent and organised, and transfer both momentum and scalars more efficiently than in the ISL, with important consequences for modelling and measurement. Recent analysis of wind tunnel, field and numerically simulated RSL data has suggested the reason for these differences, but also poses the question of why ensemble-averaged eddy fields are well described by linear stability theory whereas individual eddies are not. This dichotomy raises the deeper question of why linear eigenmodes are attractors for this highly non-linear system.

On subgrid-scale parameterizations of the eddy viscosity, stochastic backscatter and the eddy-topographic force

Terry O'Kane (Antarctic Climate and Ecosystems Cooperative Research Centre) and Jorgen Frederiksen (CSIRO Marine and Atmospheric Research)

The interaction between retained and subgrid scale transient eddies, topography and the mean flow is a crucial determining factor in the generation of oceanic circulations. Inaccurate parameterizations of those dynamical processes that are not able to be resolved in our numerical ocean (and atmospheric) climate models, due to the small scales at which they occur, lead to systematic defects in large-scale general circulation models. Traditional approaches to ocean modeling treat fields resolved on the model grid by the classical dynamics of continua. Due to the inherently turbulent nature of oceanic flows any arbitrarily nearby solutions of the fundamental equations of fluid motion (i.e. the Navier-Stokes equation) will become uncorrelated as they evolve over time. Parameterizations of the actions of the many small scale fluctuations are typically treated by replacing the usual horizontal eddy viscosity parameterization (centered at rest) by an ad hoc eddy tendency (centered about a filtered form of the topography) in order to relax toward statistical mechanical equilibrium tuned to observations (Holloway, J. Phys. Oceanogr. 1992). New methods, based on non-equilibrium statistical mechanics and statistical dynamics, that allow "exact" statistics to be obtained for highly nonlinear and inhomogeneous barotropic flows will be presented.

Statistical Models of a Tracer Plume in Complex Urban Canopies

Alex Skvortsov (HPP Division, Defence Science and Technology Organisation), Ralph Gailis (HPP Division, Defence Science and Technology Organisation), Michael Borgas (CSIRO Marine and Atmospheric Research), Peter Dawson (HPP Division, Defence Science and Technology Organisation) and Michael Roberts (HPP Division, Defence Science and Technology Organisation)

This paper presents a new analytical model for the plume meander and fluctuation statistics in the turbulent boundary layer above complex canopies. A Probability Density Function (PDF) for plume meander is derived from a one-particle displacement PDF based on Large Deviation Theory. The PDF for vertical meander is found to be non-Gaussian, while the crosswind centroid meander always remains close to Gaussian.

A new universal scaling for in-plume fluctuations is proposed based on the self-similarity properties of near-surface mixing. For a power-law profile of the boundary layer, analytical results for the fluctuation statistics are obtained, including asymptotics downstream and across the plume.

The model explicitly takes into account parameters of the flow in the turbulent boundary layer and the effect of the underlying surface. The influence of the surface roughness and stability conditions is briefly discussed.

Model predictions are compared with some recent experimental data from a water channel, showing a close match.

General Track

A Preliminary Model for Studying the Interactions Between Nephrons

Robert Moss (The University of Melbourne)

The human kidney exhibits many of the properties of a complex system. It consists of approximately 800,000 to 1,000,000 nephrons, which are the basic filtration units of the kidney. The nephron is a twisted tubule that adjusts the solute levels of the body's blood plasma. Nephrons are surrounded by the renal fluid, which can be divided into several layers, and by a complicated network of capillaries that also exchange solutes with the surrounding renal fluid. The behaviour of individual nephrons can fluctuate widely and can even be chaotic. However, the overall behaviour of the kidney remains stable.

Our aim is to simulate the behaviour of a cluster of nephrons and to understand how this stability is influenced by the interactions between the nephron tubule segments. Further, we aim to create models that are capable of predicting kidney function and the effects of renal disease. Ultimately, we are interested in investigating the behaviour of the kidney at a level of abstraction that is relevant to clinicians.

We approach this problem assuming that the kidney is a complex network, and we model individual nephrons as dynamic networks. In this talk we present our nephron model, where each node models a tubule segment and difference equations model solute transport between tubule segments, the surrounding renal fluid, and peritubular capillaries. We will also describe the various challenges we face in analysing these models, as well as discussing the analysis approaches that we anticipate using.
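
To give a flavour of such a network of difference equations, the toy sketch below advects solute along a chain of tubule-segment nodes and exchanges a fraction with the surrounding fluid at each step (segment count and rates are hypothetical placeholders, not values from our model):

    # Toy tubule: one concentration per segment; each time step a fraction
    # of solute is carried downstream and a fraction is reabsorbed into the
    # surrounding renal fluid. All rates are illustrative, not physiological.
    n_seg = 10
    conc = [1.0] + [0.0] * (n_seg - 1)   # filtrate enters at segment 0
    fluid = [0.0] * n_seg                # surrounding renal fluid layer
    flow, reabsorb = 0.5, 0.1            # advected / reabsorbed per step

    def step(inlet=1.0):
        global conc
        new = conc[:]
        for i in range(n_seg):
            upstream = inlet if i == 0 else conc[i - 1]
            transport = reabsorb * conc[i]            # tubule -> renal fluid
            new[i] = (1 - flow) * conc[i] + flow * upstream - transport
            fluid[i] += transport
        conc = new

Coupling many such chains through shared fluid layers and capillary nodes is what turns the single-nephron model into the network-of-networks analysis problem described above.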

A Tool to Analyse the Long-term Viability of an Agricultural Region by Considering the Interactions of Socio-Economic and Ecological Factors

Xianfeng Su (CSIRO), Senthold Asseng (CSIRO), Freeman Cook (CSIRO), Peter Campbell (Defence and Systems Institute, The University of South Australia) and Geoff Carlin (CSIRO)

The long-term viability of an agricultural region is the result of the complex interactions of a large number of different factors across the bio-physical and socio-economic domains. These interactions are non-linear and at any time are driven by the context arising from the interactions of humans with their social and physical environments. Like any other system in which humans are involved, this is a complex adaptive system which can be very difficult to understand with approaches and tools that provide only static views of the state of the system. To study the issues affecting the viability of the Katanning region of Western Australia we have developed a multi-agent based human-landscape model to simulate and analyse the impact of land-use, climate and market changes on regional profitability, environmental sustainability and community resilience, considering aspects of farm management, technology adaptation and risk perception. The simulation is event-driven: farmer agents can communicate and adopt new behaviours in response to perceived changes in their socio-economic and bio-physical environments, selecting different management strategies and making decisions on the basis of farming rules. Each farmer agent has a farming plan at the beginning of each year, which is then modified by their perceptions of the environment they find themselves in. This presentation will show the results of several simulation runs with changes in land-use patterns, farm profitability and uptake of new technology. Planned improvements and extensions to the model will be addressed.

Advanced Data Analysis Methods to Detect and Predict the Non-technical Losses based on Customer Behaviour Changes for Power Utilities

Anisah Nizar (The University of Queensland) and Zhaoyang Dong (The University of Queensland)

The identification and prediction of non-technical losses (NTLs) are important tasks for many power utilities, especially those in developing countries such as Malaysia, Thailand and Indonesia. NTLs are mainly related to power theft, and they involve customer management processes that include a number of means of consciously defrauding the utility concerned. Currently, most solutions are more or less ad hoc and allow detection of theft only after a long period of observation. Against this background, this paper presents a framework of analysis intended to detect fraudulent behaviour in the power utility industry in a timely manner and with accurate results. The specific focus is on the distribution sector, and the analysis is based on customer consumption data.

This study utilizes historical data accumulated by the utility concerned to develop and test the proposed framework. Three years of 30-minute interval data are used to generate indicators relating to commercial customers. The framework applies a series of data-mining techniques, including classification, clustering, forecasting and feature selection, to alert the utility to any changes in consumer behaviour. It is found that such changes can be effectively monitored and detected by the proposed method. The system can trigger an alarm signal so that the utility can proceed, with the suspicious data in hand, to undertake an appropriate field-site investigation and rectify the problem. The present study can be extended to substation metering research into means of detecting fraud among customers who consistently record normal behaviour but who have actually been contributing to NTLs from the outset.
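
As a purely illustrative sketch of one ingredient of such a framework (not the authors' implementation), the fragment below flags customers whose recent mean consumption deviates sharply from their own historical baseline; the synthetic data, threshold and feature choice are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n_customers, reads_per_day = 100, 48             # 30-minute interval reads
history = rng.normal(50, 5, size=(n_customers, 3 * 365 * reads_per_day))

recent = history[:, -30 * reads_per_day:]        # last 30 days
baseline = history[:, :-30 * reads_per_day]

mu = baseline.mean(axis=1)
sigma = baseline.std(axis=1)
z = (recent.mean(axis=1) - mu) / sigma           # behaviour-change score

ALERT_THRESHOLD = 3.0                            # illustrative choice
suspects = np.where(np.abs(z) > ALERT_THRESHOLD)[0]
print("customers flagged for field investigation:", suspects)
```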

Australia 2007 to 2025 - A Complex Systems Approach

Bruce Hobbs (CSIRO Exploration and Mining) and Klaus Regenauer-Lieb

Several fundamental issues face Australia over the next decade or so. These comprise the structure and demography of the population, climate change, water and salinity, the energy mix, and diversification of the economy whilst maintaining our long-standing strengths. Traditionally these issues are treated independently of each other. However it is worth exploring whether all of them can be treated as one system. This paper explores some possibilities for treating the problems facing Australia as one system and investigates the positive feedback relations that are involved. The mix of water-salinity-energy-agriculture-health of inland towns is an important sub-system in the overall system. Fresh water is needed to maintain the system; underground water is plentiful but salty. Desalination requires energy, which is expensive. A solution is to use low enthalpy geothermal energy not to generate electricity but to desalinate underground water. The beauty of such an approach is that the low enthalpy geothermal energy and the salty underground water are commonly part of the same geological system and hence are in plentiful supply precisely where the problem exists. Inland desalination is too expensive using traditional energy sources, and geothermal energy production is uneconomic because of the low population density, but the two together solve several problems at once (including the salinity issue, supply of fresh water, and supply of energy to remote areas) and hence are economic. There are a number of other positive feedback loops within this sub-system that enhance the economics of the approach even further.

Complexity in Speciation: Effects of disasters on adaptive radiation in a Dual Phase Evolution model

Greg Paperin (Monash University), David Green (Monash University), Suzanne Sadedin and T. G. Leishman (Monash University)

Recent studies suggest that macro-evolutionary patterns such as punctuated equilibrium [5] may be generated by a process termed Dual Phase Evolution (DPE) ([1], [2], [3]). According to the DPE hypothesis, evolution in landscapes exhibits two phases - selection and variation. Disturbances such as mass extinctions can flip the landscape from selection to variation phases. Similar processes occur in a wide range of artificial, natural and social complex systems ([2], [3], [6]). Here, we show that mass extinctions induce DPE in a simulation model of adaptive radiation. The model is based on a previous model of adaptive radiation which did not incorporate dual phase evolution [4]. Results confirm that mass extinctions caused by external disturbances can trigger periods of rapid species turnover and adaptive radiation (variation phases), which are followed by long periods without innovation (selection phases). Our simulations also show that the spatial configuration of disasters leading to mass extinctions strongly influences whether and to what extent such disasters are capable of inducing evolutionary variation phases.

[1] D. G. Green, M. G. Kirley (2000): 'Adaptation, diversity and spatial patterns', International Journal of Knowledge-Based Intelligent Engineering Systems 4(3), 184-190.

[2] D. G. Green, D. Newth, M. Kirley (2000): 'Connectivity and catastrophe - towards a general theory of evolution'. In M. A. Bedau et al. (eds.) Artificial Life VII.

[3] T. G. Leishman, D. G. Green, S. Sadedin (2006): 'Dual phase evolution: a mechanism for self-organization in complex systems'. InterJournal Complex Systems, 1861.

[4] S. Gavrilets, A. Vose (2005): 'Dynamic patterns of adaptive radiation'. PNAS 102, 18040-18045.

[5] N. Eldredge, S. J. Gould (1972): 'Punctuated Equilibria: an Alternative to Phyletic Gradualism'. In T. M. Schopf (ed.) Models in Palaeobiology, pp. 82-115.

[6] D. G. Green, S. Sadedin (2005): 'Interactions matter - Complexity in landscapes and ecosystems'. Ecological Complexity 2, 117-130.

Decision processes used to describe turn-off of beef cattle from Australian grazing farms

Graham Donald (CSIRO Livestock Industries), David Miron (CSIRO Livestock Industries) and Irina Emelyanova (CSIRO Livestock Industries)

A number of quantitative multi-criteria decision modelling methodologies are used to describe the economic and climatic impacts on farm seasonal turn-off patterns. These patterns are unique to beef cattle classes and reflect the number available for sale. In the case of animal disease control, for instance, it is critical to have some understanding of animal movements from farm holdings. The decision of farmers to move beef cattle is complex and often involves many factors, such as historical, current and forecast climate, commodity prices, animal type, enterprise, the number of animals available and associated off-shore activities. These data provide the means to model the movements of beef cattle from farms. Algorithms have been developed to model and allocate beef cattle numbers to farm locations derived from training regions and data from the Australian Bureau of Statistics (AgStats). An overview of the Australian beef industry (report to DAFF, June 2006, AusVet Animal Health Services) describes in detail the cattle dynamics and genotype composition of the different sectors within the industry. The outputs show the seasonality of beef cattle available for movement off farm, and it is envisaged that the National Livestock Identification Scheme (NLIS) and National Livestock Reporting Service (NLRS) will provide a means for validation.

Democracy's event horizon

Roger Bradbury (Resource Management in Asia-Pacific Program)

Can the democracies, as complex adaptive systems, cope with today's fully connected world - the complex adaptive system that is the Anthropocene? It is an open question. It used to be the case, way back in the Holocene - say before 1950 - that they coped rather well. But they did so by the trick of pretending that their world was a bunch of separated lumps - this country, that economy, this forest, that ocean. In the new epoch, with the world now knitted up, the democratic response is becoming fuddled. Oil shocks and financial crises belie the invisible hand. Climate change, emerging diseases like SARS, Nipah and Ebola, pandemics of flu and terrorism, and ecosystem collapses are all seen as disasters in the old medieval sense of the word - singular, unexpected, unwelcome visitations - rather than the working through of fundamental Anthropocene dynamics. How did this come to be? Perhaps the answer lies in democracy's near event horizon - the local in time and space that is embedded deeply and pathologically in the heart of democracy. And what will come of it? Perhaps democracy, as a complex adaptive system, is a dinosaur in the new epoch, doomed to extinction.

Distributed task allocation with autonomous selected agents

Kenta Oomiya (Future University - Hakodate) and Keiji Suzuki (Future University - Hakodate)

Our study presents an approach based on autonomous selected agents for distributed task allocation in grid computing environments. In this paper, we present the basic framework of the proposed approach.

In our previous studies, autonomous selected agents were applied to the problem of managing limited common resources, and their effectiveness on such problems was shown. Each autonomous selected agent has several roles and selects a role according to the situation. In this paper, we treat distributed task allocation as a limited-common-resource management problem and apply autonomous selected agents to it. Applying the proposed agents may introduce robustness into distributed task allocation systems.

We use a simulation of a grid computing environment. In the simulation, each agent is treated as a computer with several processors. Our proposed task allocation system consists of autonomous selected agents with two roles: a global scheduler, which allocates tasks from users to local schedulers, and a local scheduler, which allocates tasks from global schedulers to its processors. The agents therefore allocate tasks step by step. In dynamic distributed task allocation, schedulers cannot know the performance of other schedulers, because that performance changes dynamically.

We examine the effectiveness of our proposed distributed task allocation system.

Emerging Network Structure in a Darwinised Data-Oriented Parser

Dave Cochran (University of St. Andrews)

Data-Oriented Parsing (DOP; Scha 1990, Bod 1992, 1998) is a method for statistical parsing whereby novel inputs may be analysed by directly exploiting the statistical regularities present in a parsed, labelled training corpus, without any abstract representations being generated. Rather, training corpus entries are stored as node-labelled trees, from which fragments (subtrees) may be extracted and recombined to produce multiple possible parses of a novel input, for which the most probable parse is approximated by means of a Monte Carlo sample. Darwinised DOP (DDOP) is a novel unsupervised DOP parser (see also Bod 2006) which, unlike previous DOP models, operates incrementally; that is, it receives its training data one sentence at a time instead of storing its entire training corpus from the beginning. It builds derivations from subtrees extracted from its previous exemplar base, backing off to a randomly generated subtree when that cannot be done. Each time it parses a new input, the output parse is added to the exemplar base; when subtrees from the exemplar base are used in constructing the output parse, the output parse contains new copies of them. Subtrees are thus Darwinian replicators, selected for generalisability. This paper reports on the first experiments with DDOP, and also on the growing network structure of DDOP exemplar bases in simulations; a treebank may be thought of as a single graph-theoretic structure if both the tree-internal edges and the edges representing the substitutability relation (equivalent to node labels) are admitted as network components.

References

Bod, R. (1992). 'A Computational Model of Language Performance: Data-Oriented Parsing'. Proceedings of COLING-92, Nantes, France.

Bod, R. (1998). Beyond Grammar: An Experience-Based Theory of Language. Stanford, CA: Center for the Study of Language and Information.

Bod, R. (2006). 'Unsupervised Parsing with U-DOP', paper given at CoNLL 2006, New York, NY.

Scha, R. (1990). 'Taaltheorie en Taaltechnologie: Competence en Performance', in Q. de Kort and G. Leerdam (eds.), Computertoepassingen in de Neerlandistiek, Almere: Landelijke Vereniging van Neerlandici (LVVN-jaarboek).

Evolution of node-clusters in space associated with processes determined by Voronoi maps: statistics from simulations

David Odell (ARC Centre of Excellence for Mathematics and Statistics of Complex Systems) and Konstantin Borovkov (University of Melbourne; and ARC Centre of Excellence for Mathematics and Statistics of Complex Systems)

The Voronoi map for a set of points (nodes) in a metric space is a subdivision of the underlying space into cells, associating with each node the set of all points whose distance from that node is minimal. As such it is a natural model for many naturally occurring phenomena in the physical, biological and social sciences. We have proposed a class of time-space models in which the nodes undergo birth-and-death type processes with parameters based on properties of the associated Voronoi cells. The dynamics of these models is determined by both local and global properties of the node-clusters. After proving ergodicity under certain broad conditions, we conducted a number of simulation-based statistical studies which demonstrate that a combination of statistics, including Theil's redundancy measure and Baddeley and Van Lieshout's J-function, provides an initial level of discrimination between the various kinds of clustering that are observed. We are also able to make a number of interesting general observations about the evolution of such node-clusters.

Most of this material will appear in our forthcoming papers. The research was undertaken under the auspices of MASCOS, an ARC-funded Centre of Excellence.
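
By way of illustration only, the following toy simulation captures the flavour of this model class: each node's death probability depends on the area of its Voronoi cell, estimated here by Monte Carlo (a uniform random point belongs, by definition, to the cell of its nearest node). All rates and parameters are invented, not taken from the papers.

```python
import numpy as np

rng = np.random.default_rng(1)
nodes = rng.uniform(0, 1, size=(20, 2))

for t in range(50):
    # Estimate each node's Voronoi cell area by nearest-node assignment.
    samples = rng.uniform(0, 1, size=(5000, 2))
    d = ((samples[:, None, :] - nodes[None, :, :]) ** 2).sum(axis=2)
    owner = d.argmin(axis=1)
    area = np.bincount(owner, minlength=len(nodes)) / len(samples)

    # Death is more likely in small (crowded) cells; one birth per step.
    death_prob = 0.5 * (1.0 - area / area.max())
    keep = rng.uniform(size=len(nodes)) > death_prob
    nodes = np.vstack([nodes[keep], rng.uniform(0, 1, size=(1, 2))])

print("final number of nodes:", len(nodes))
```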

Finite Time Ruin Probability with Heavy-Tailed Claims and Constant Interest Rate

Dingcheng Wang (The Australian National University)

The paper first investigates the asymptotic behaviour of the finite time ruin probability with constant interest rate and subexponentially tailed claims, extending a result recently established by Qihe Tang in the classical risk model to the renewal risk model. It then discusses, within a smaller class of claim-size distributions, the finite time ruin probability with claims arriving according to an arbitrary counting process.

Forecasting Commodities Markets using Agent Based Modelling Techniques

David Miron (CSIRO Livestock Industries), Graham Donald (CSIRO Livestock Industries) and Irina Emelyanova (CSIRO Livestock Industries)

Agent based modelling offers techniques for exploring the behaviours of nonlinear systems such as financial and commodities markets. In this talk, an agent based model is used to simulate the behaviour of livestock auctions in saleyards. The model uses simple intelligence for on-farm decision making processes, such as whether to buy or sell; the farm agent's intelligence is implemented using a fuzzy expert system. The collective decisions of farms create the supply and demand within the saleyard, where farm agents that need to buy bid for lots of cattle. The bidding process is affected by the supply and demand effects evident within the market. This process of auctioning creates time series for both prices and saleyard volumes, and these time series exhibit behaviours found in time series of financial markets. Further, the auction process is shown to exhibit behaviours that can be found in actual saleyards.

Harvesting Heterogeneous Renewable Resources: Uncoordinated, Selfish, Team-, and Community-Oriented Strategies

Markus Brede (CSIRO Marine and Atmospheric Research), Fabio Boschetti (CSIRO Marine and Atmospheric Research) and Burt de Vries (Utrecht University, Copernicus Institute)

Using the example of a fishing fleet harvesting in different fishing zones with different fishing capacities and growth rates, we investigate strategies for the exploitation of distributed renewable resources by a crowd of agents without centralized coordination. In agent based simulations we compare the performance of uncoordinated random harvesting, team playing, selfish individualistic and community oriented (Collective Intelligence, or COIN) behaviours operating with long and short planning horizons. In contrast to the selfish strategy, which aims at maximizing an individual's profit, the COIN strategy strives to optimize community performance. Not surprisingly, we find that community- and long-term oriented behaviours lessen the danger of overexploitation of the resource base. The outcome of an evolutionary dynamics, in which strategies spread through the agent population in proportion to relative economic performance, is strongly influenced by the harvesting pressure. In order of decreasing resource abundance, the population is dominated first (at the lowest harvesting pressure) by the uncoordinated random strategy, then by the cooperative COIN strategy, followed by the selfish strategy and finally, for an extremely overharvested resource, by the team strategy. We also find that increasing harvesting pressure increasingly favours short term and more individualistic strategies.

Heuristics, complexity and belief networks: a case study of an outback livelihood system

Thomas Measham (CSIRO Sustainable Ecosystems), Kostas Alexandridis (CSIRO Sustainable Ecosystems) and Samantha Stone-Jovicich (CSIRO Sustainable Ecosystems)

This presentation will give an overview of inductive approaches and how they relate to principles of complex systems such as emergence and local interaction. Drawing on the qualitative social sciences, a key feature is that phenomena are not 'tested' but rather heuristically inferred from a detailed analysis of empirical data. In some cases, inductive social science can provide a valuable tool for evaluating the ways people learn and respond to challenges involving complexity in their surrounding coupled social and natural systems. In other cases the insights from inductive approaches can reveal interactions between critical drivers which can be modelled using techniques such as Bayesian Belief and Decision Networks. An example of such an approach (BOLnet) is provided in this presentation, offering an alternative approach to a livelihood system and the associated decisions in an outback grazing system. The presentation will conclude with a discussion of future opportunities for inductive approaches in complex systems science.

Information Contagion and Financial Prices

Mark Bowden (ARC Centre for Complex Systems, The University of Queensland)

People enjoy discussing financial markets. There is also evidence that these discussions affect the investment decisions of individuals. While interaction seems to be an important aspect of investment behaviour, at the very least among small investors, the efficient market hypothesis is predominantly concerned with information efficiency, i.e. the speed at which new information is incorporated into the stock price. In this paper a simple artificial market is developed where "sentiment" investors trade with each other as well as with a "fundamentals" trader. The sentiment investor differs from the technical trader usually used in such models: they buy and sell based on their level of comfort in the market rather than hard and fast trading rules. In this way they better mimic the trading of the typical 'mum and dad' investor. It is shown that such a model can offer some explanation of the kurtosis and persistence characteristics of the volatility of returns generated in financial markets. It may also offer some insights into the momentum effect of generally rising or falling markets, commonly referred to as bull and bear markets. However, it is also shown that herd behaviour among sentiment investors cannot explain the formation of bubbles in markets if investors face liquidity constraints.

Is the fractal geometry of nature a coincidence?

Julianne Halley (CSIRO Molecular and Health Technologies) and Dave Winkler (CSIRO Molecular and Health Technologies)

We argue that critical-like dynamics self-organize relatively easily in non-equilibrium systems generally, and that in biological systems such dynamics serve as fractal templates upon which natural selection builds further elaborations. Such dynamics can be modified by natural selection in two fundamental ways, reflecting the selective advantage (if any) of critical-like fluctuations. First, fluctuations can become increasingly or decreasingly critical. Second, fluctuations can differentiate as participating units adopt systematic behavioural variations. Although these may seem like trivial interactions, we argue that they have profound consequences for biological systems because they might facilitate the evolution of conspicuous features of biological systems, including division of labour, compartmentalization, computation and complex collective behaviour. It follows from these ideas that the fractal geometry of nature is not coincidental.

Islets in an Ocean: Towards a Philosophy of Complexity

Tony Smith (Meme Media)

This paper reviews ISCE's February 2007 3rd International Workshop on Complexity and Philosophy (ISBN: 0-9791688-1-3). Aside from some common references to 20-year-old work of Ilya Prigogine and Stuart Kauffman, the 20 presentations largely talked past each other, albeit politely. While at least half the participants had strong hard science backgrounds, arguments from hard science were largely neglected in seemingly politically correct deferral to the social constructivist minority. The quest for a Philosophy of Complexity appears to have become a battle that, for now at least, is being lost by those willing to persist in trying to elucidate common characteristics of complex systems across the physical, biological and social domains, against the tide of those mining complexity theory for tactics that might be applied to organisation/knowledge management, arguably the derivatives traders of the new millennium. This paper recaps those numerous common characteristics, with particular reference to the still controversial status of the edge of chaos/border of order. It looks further at the particular obstacles to developing a broader appreciation of the philosophical implications of complexity theory, especially in the still formative minds of postgrads and postdocs. Do even our brightest youngsters have to have beaten Marcia Salner's suggested second personal crisis before their brains can differentiate complex systems from relativism, an ever more unlikely prospect since the sanctification of child protection? Could Anglo capitalist triumphalism survive the rise of a proper intellectual appreciation of complex dynamics and its broad philosophical implications?

Managing Linguistic Complexity

David Butt (Macquarie University)

Engineering models represent semantic behaviour in terms that are both artificially complex and implausibly simplistic. Languages are, in fact, more interesting for complexity theorists than the traditions of such modelling make apparent. Complexity theory has provided new ways of construing linguistic complexity, and of managing that complexity with respect to defined research goals.

A first clarification is to recognise that we come to linguistic behaviour, whether as speakers or analysts, through its characteristics as a realizational system: paradoxically, many things have to happen simultaneously on a number of levels for anything to happen at all. Complex patterns on at least four distinct levels have to align for meaning to be generated. Consequently, the number of possible dependent relations on any one level, and then across levels, appears to explode. For example, a 'streamlined' system of tense selections in English soon produces more than 70,000 possible combinations.

On the other hand, such profusion of 'meaning potential' (Halliday, 1973; 2005) is delimited in any given instance of speech by the restrictions on possible alignments on other levels, i.e. the choices of context, rhetorical structure, lexis/grammar, and phonology. When we map typical choices on a number of dimensions of language, our task takes the form of a 'phase space/phase portrait' comparison (Cohen & Stewart, 1994: 198ff), with each specific domain of linguistic activity - each register - setting the limits of particular enquiry.

We explore the management of complex semantic behaviour through the details of our projects in surgical safety and mental health.

Mapping model complexity

Fabio Boschetti (CSIRO Marine and Atmospheric Research), Nicky Grigg (CSIRO Land & Water) and David McDonald (CSIRO Marine and Atmospheric Research)

We propose to define the complexity of a numerical model as the statistical complexity of the output it produces. This measure allows for a direct comparison between data and model complexity. Since the complexity of the model output can vary as a function of the input parameters, the measure we propose is high-dimensional and we reconstruct it by searching the model parameter space. We then plot the search results on a 2D image, which we call a self-organised Complexity Map, in order to segment the model domain visually into areas of different dynamical behaviour.

Given a specific problem, a data set and the need to choose among a number of alternative models to study it, we suggest that models whose maximum complexity is lower than the data complexity should be disregarded because they are unable to reconstruct some of the structures contained in the data. Similar reasoning could be used to disregard model subdomains because of unnecessarily high complexity.

The approach is tested on an ecological model for which the Complexity Map can be compared to previously published analytical results. We show that the Complexity Map correctly detects the location of Hopf and fold bifurcations with no need to specify a priori which dynamical features should be analysed in order to detect changes in dynamical behaviour.

We suggest that model complexity so defined better captures the difficulty faced by a user in managing and understanding the behaviour of a numerical model than do measures based on a model's 'size' or architecture.
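
As a toy analogue of such a complexity sweep (not the authors' statistical complexity measure), the sketch below sweeps a parameter of the logistic map and scores the output with permutation entropy as a stand-in complexity proxy; the choice of model and measure are both assumptions for illustration.

```python
import math
from itertools import permutations

def permutation_entropy(series, order=3):
    """Normalised entropy of ordinal patterns; 0 = constant, 1 = random."""
    counts = {p: 0 for p in permutations(range(order))}
    for i in range(len(series) - order + 1):
        window = series[i:i + order]
        pattern = tuple(sorted(range(order), key=window.__getitem__))
        counts[pattern] += 1
    total = sum(counts.values())
    probs = [c / total for c in counts.values() if c > 0]
    return -sum(p * math.log(p) for p in probs) / math.log(math.factorial(order))

# Sweep the control parameter of the logistic map, the stand-in "model".
for r in (2.8, 3.2, 3.5, 3.9):
    x, series = 0.4, []
    for _ in range(1000):
        x = r * x * (1 - x)
        series.append(x)
    print("r = %.1f  complexity proxy = %.3f"
          % (r, permutation_entropy(series[200:])))
```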

Mimosa: using ontologies for modeling and simulation of complex systems

Jean-Pierre Muller (CIRAD)

Modeling is the activity shared by modeling and simulation on the one hand and knowledge representation on the other. At the same time, the objectives of these two domains are not the same, and both seem necessary when modeling highly structured and complex systems. To face the complexity of the systems we are trying to model and simulate today, the challenge addressed in this paper is to combine both approaches in a common framework. We first describe separately the recent advances made in the modeling and simulation community and in the knowledge representation community. We show that, despite the shared goal of describing a reality, or at least part of it, the two communities have arrived at very different, although related, concepts. This difference is due to their respective focus: the first community is centred on dynamics, the second on static descriptions. From an analysis of the differences and similarities, we propose an architecture which is being tested in a modeling and simulation platform: Mimosa. The outcome is a formal way to pave the path from conceptual to running models, which is sketched in this paper. The achievements and perspectives are discussed in the conclusion.

Mythological Archetypes as an Emergent Process

Victor MacGill

As our earliest human ancestors lived in increasingly large communities, more complex social organisation, including means of communication, was required to sustain them. Attractors forming as shared mental patterns in the mind of each tribal member aligned their mental processes, so that the tribe moved towards maximising its position on its fitness landscape.

Particularly after language developed, the tribes began to operate as effective complex adaptive systems enabling the emergence of even more adaptive traits. This included the formation of social roles, enhancing individual tribal member's skills and abilities.

Four fundamental roles found ubiquitously in traditional societies are the leader, wisdom holder, nurturer and protector. Each tribal member needed to hold a representation of these mental patterns in their mind so they knew what to expect of others and how to respond to them. As the attractors were reinforced generation after generation, they adapted and canalised to become archetypes. The archetypes developed in a fractal-like manner, inasmuch as the patterns held in the collective unconscious were repeated in each individual brain.

Carl Jung described these four archetypes as the king, warrior, magician and lover. These link respectively to the qualities of authority, courage, wisdom and compassion. The archetypes manifest themselves collectively in fairy stories and myths. They are deeply buried in our unconscious mind, subtly guiding our behaviour, and can be accessed through various techniques to enable the emergence of new levels of understanding of ourselves and our relationship to the world we live in.

Non-Hamiltonian and fractional dynamics in complex plasma

James Stokes (The University of Sydney), Alex Samarian (The University of Sydney) and Sergey Vladimirov (The University of Sydney)

The study of plasma systems containing ensembles of microparticles (dust) is a rapidly developing field of complex systems research with applications extending from solid-state physics to astrophysics [1]. The multi-component mixture of plasma and dust particles is a characteristically complex system with many strongly interwoven non-linear particle interactions [2]. One of the general features of complex plasma systems is the presence of reactionless dust particle interactions, which arise due to the dynamic interaction between the dust particles and the plasma in which they are immersed. These forces are position dependent and cannot be described by the gradient of a potential; the system is thus out of reach of the Hamiltonian description. Under stable conditions, however, where the energy flows into and out of the system are approximately equal, the energy of a non-Hamiltonian system may be approximately constant (or at least bounded). In this case, consideration of stability on the basis of the Hamiltonian can be valid. When the Hamiltonian approximation is not justified, it can be useful to redefine the Hamiltonian to incorporate the non-potential interaction forces. Non-Hamiltonian dynamics is of fundamental interest due to its ubiquity in complex systems, and fractional calculus has provided a powerful reformulation of Hamiltonian dynamics which encompasses some types of dissipative systems [3]. To our knowledge, the inclusion of non-reciprocal interaction forces has not been achieved.

References

[1] S. V. Vladimirov, K. Ostrikov, and A. A. Samarian, Physics and Applications of Complex Plasmas (Imperial College, London, 2005).

[2] J. D. E. Stokes, S. V. Vladimirov and A. A. Samarian, 'Nonlinear dynamics of a two particle complex plasma in vertical alignment', EPS Conf. Proc. (forthcoming) (2007).

[3] F. Riewe, Phys. Rev. E 55, 3581 (1997).

On Efficient Management of Complex Systems

Victor Korotkikh (Central Queensland University) and Galina Korotkikh (Central Queensland University)

Complex systems profoundly shape human activities today. However, it is still unknown how to deal with complex systems efficiently without confronting NP-hard problems. Existing concepts of complexity are very important on many occasions, but they do not explain how the performance of a system depends on its complexity, nor whether efficient management of complex systems may be possible at all.

For this reason we introduce a concept of complexity based on self-organization processes of prime integer relations, called the structural complexity. By computational experiments we investigate whether the performance of an optimisation algorithm for an NP-hard problem might behave as a concave function of the algorithm's structural complexity.

The computational experiments raise the possibility of a general optimality condition of complex systems: if the structural complexity of a system is in a certain relationship with the structural complexity of a problem, then the complex system shows the optimal performance for the problem. The optimality condition presents the structural complexity of a system as a key to its optimisation. From its perspective the optimisation of a system could be all about the control of the structural complexity of the system to make it consistent with the structural complexity of the problem.

Importantly, the experiments indicate that the performance of a complex system may indeed behave as a concave function of the structural complexity. Therefore, if the structural complexity could be controlled as a single entity, the optimisation of a complex system would potentially be reduced to a one-dimensional concave optimisation, irrespective of the number of variables involved in its description. This might open a way to efficient management of complex systems.

Perils of Power Laws

Russell Standish (Mathematics and Statistics, The University of New South Wales)

Power law distributions are often a signature of complex systems behaviour, and so quite an industry has grown up around reporting power law distributions and extracting their exponents. However, there are a number of traps lying in wait for the unwary, owing to the difficulty of obtaining statistical significance over many orders of magnitude. It is quite possible that much of the published literature on power laws has fallen afoul of one of these traps. This presentation covers a number of these problems and the indications that a data set might be affected by them.
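
One such trap can be shown in a few lines. The sketch below (illustrative only, not taken from the presentation) compares the standard continuous-case maximum-likelihood estimator of a power-law exponent with a naive least-squares fit to a log-log histogram, on synthetic data with a known exponent.

```python
# Raw counts in logarithmic bins are not densities, so the regression
# slope recovers roughly -(alpha - 1) rather than -alpha unless the
# counts are normalised by bin width.

import numpy as np

rng = np.random.default_rng(2)
alpha, xmin, n = 2.5, 1.0, 10000
x = xmin * (1 - rng.uniform(size=n)) ** (-1.0 / (alpha - 1))  # exact sampler

# Maximum-likelihood estimate for the continuous power law.
alpha_mle = 1 + n / np.log(x / xmin).sum()

# Naive regression on log-binned counts (no bin-width normalisation).
counts, edges = np.histogram(x, bins=np.logspace(0, np.log10(x.max()), 30))
centers = np.sqrt(edges[:-1] * edges[1:])
mask = counts > 0
slope = np.polyfit(np.log(centers[mask]), np.log(counts[mask]), 1)[0]

print("true exponent:     ", alpha)
print("MLE estimate:      ", round(alpha_mle, 3))
print("naive fit suggests:", round(-slope, 3))  # biased toward alpha - 1
```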

Resilient extraction of renewable resources

Cameron Fletcher (CSIRO Sustainable Ecosystems) and David Hilbert (CSIRO Sustainable Ecosystems)

The long term survival of managed renewable resource harvesting systems is limited by the dynamic capacity of the system to absorb unexpected environmental, economic or social conditions. This 'dynamic' perspective is drastically different from contemporary or sustainable management focussed on optimising the long term 'average' productivity or performance of a system. In fact, the Millennium Ecosystem Assessment (2005) attributed many diverse examples of ecosystem collapse around the world to a systematic pathology of human management that degrades system dynamic capacity, even while aiming to achieve 'sustainable' land use.

To effectively manage for dynamic capacity we need ways of understanding renewable resource harvesting systems attuned to the highly variable, nonlinear nature of these systems. We are developing such techniques based on a dynamical systems description of the interacting ecological, economic and social subcomponents. We employ resilience analysis, bifurcation analysis and eigenvalue analysis to estimate the capacity of these systems to survive both specific shocks and general environmental variability.

We show not only that it is possible to incorporate management for dynamic capacity into contemporary management practices, but also that it does not necessarily require a distinct change in long-term 'average' management goals such as profitability. In any renewable resource harvesting system there is a range of sustainable management strategies that yield maximum productivity but exhibit drastically differing dynamic capacities to cope with unexpected change. By understanding dynamic capacity and beginning to incorporate it into management, we can hope to slow, and recover from, the collapse of human-managed renewable resource harvesting systems.

Self-avoiding walk enumeration via the lace expansion

Nathan Clisby (ARC Centre of Excellence for Mathematics and Statistics of Complex Systems, The University of Melbourne), Richard Liang (University of California, Berkeley, United States of America) and Gordon Slade (University of British Columbia, Canada)

We introduce a new method for the enumeration of self-avoiding walks based on the lace expansion. We also introduce an algorithmic improvement, called the two-step method, for self-avoiding walk enumeration problems. We obtain significant extensions of existing series on the cubic and hypercubic lattices in all dimensions d >= 3: we enumerate 32-step self-avoiding polygons in d=3, 26-step self-avoiding polygons in d=4, 30-step self-avoiding walks in d=3, and 24-step self-avoiding walks and polygons in all dimensions d >= 4. We analyse these series to obtain estimates for the connective constant and various critical exponents and amplitudes in dimensions d >= 3. We also provide major extensions of 1/d expansions for the connective constant and for two critical amplitudes.
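
For readers unfamiliar with the objects being counted, here is a brute-force enumeration of short self-avoiding walks on the cubic lattice; it illustrates only the definition and bears no resemblance to the lace-expansion and two-step methods that make the long series above feasible.

```python
def count_saws(steps, pos=(0, 0, 0), visited=None):
    """Count self-avoiding walks of a given length on Z^3 from pos."""
    if visited is None:
        visited = {pos}
    if steps == 0:
        return 1
    total = 0
    for d in ((1,0,0), (-1,0,0), (0,1,0), (0,-1,0), (0,0,1), (0,0,-1)):
        nxt = (pos[0] + d[0], pos[1] + d[1], pos[2] + d[2])
        if nxt not in visited:
            total += count_saws(steps - 1, nxt, visited | {nxt})
    return total

# Known counts on Z^3: c_1 = 6, c_2 = 30, c_3 = 150, c_4 = 726.
print([count_saws(n) for n in range(1, 5)])
```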

Social Capital: a complex of dynamic relationships

Angela Wardell-Johnson (Centre for Rural and Regional Innovation, Qld; and The University of Queensland)

Landscape management negotiates across interlinked ecological and social systems, drawing on social resources in situations where issues of landscape sustainability challenge society's capacity and will to solve problems. Social capital results from a dynamic process that operates through social networks, based on norms of reciprocity that ensue through practices of exchange and depend on relations of trust.

While social capital is frequently referred to as the glue that holds society together (World Bank 1998, Stone and Hughes 2002, Productivity Commission 2003) this research makes use of a complex systems framework to identify the content of social capital. Social networks and the coupling forms (Marion 1999) through which individuals, communities and society are linked serve as evidence to expose a process that reflects innovation, adoption, consolidation and stagnation followed by cyclic renewal through innovation (Holling 2001).

This paper outlines the theoretical basis of social capital as a means of exploring the potential for re-framing it within a complex systems framework. An explorative research methodology integrated results from comparative agri-ecological landscapes using quantitative survey and qualitative interview methods. The results provided a re-configuration of social capital as a dynamic force, exposing the interaction between bridging and bonding social capital. It is this dynamic relationship that exposes the adaptive capacity which maintains resilience in socio-ecological communities. These results make explicit the dynamic characteristics of social capital that act as a key component of resilience in agricultural landscapes.

The Evolution of the World Trade Web as a Complex System

Tim Kastelle (ARC Centre for Complex Systems, The University of Queensland)

Recent studies have shown that complex networks with power-law degree distributions also have power-law distributions of betweenness centrality. Similarly, networks with Gaussian degree distributions have a bimodal distribution of betweenness centrality. While these previous studies have been based primarily on modelling, this paper examines the distribution of betweenness centrality in an evolving complex network, the World Trade Web, as it evolved from 1938 to 2003. It shows that networks with log-normal degree distributions have log-normal distributions of betweenness centrality as well. The paper then uses complex network analysis to look for evidence of economic convergence. The fundamental structure of the network remains essentially the same over this period, which does not support the contention that the economy is converging to a steady state. Rather, it appears to be best described as a complex system in a constant state of transformation. The measure of betweenness centrality is used to develop a model of economic development. This model is used to explain how the major structural measures of the network, such as degree distribution, average path length and clustering coefficient, remain stable while the connectivity of individual nodes remains relatively dynamic.
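
As a small hands-on illustration of the measure involved (using a synthetic scale-free graph, not the paper's trade data), betweenness centrality can be computed with a standard network library:

```python
import networkx as nx

# A scale-free stand-in for a trade network; the parameters are arbitrary.
G = nx.barabasi_albert_graph(500, 3, seed=4)
bc = nx.betweenness_centrality(G)
top = sorted(bc.items(), key=lambda kv: -kv[1])[:5]
print("most central nodes (node, betweenness):", top)
```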

The Optimal Design of Profitable Renewable Energy Systems: A Hamiltonian-Based Approach

Frank Horowitz (CSIRO Exploration and Mining) and Peter Hornby (CSIRO Exploration and Mining)

We develop and deploy an optimal control approach to the machine design of renewable energy systems. The problem is formulated via input/output equations of motion, with capacities of various subsystems available for purchase, and conservation laws explicitly enforced. We present some example designs and discuss variations available for future work.

In contrast to the "Design by Darwin" moniker given to genetic programming approaches to optimal design problems, we label our approach "Hatching with Hamilton".

Value as a Driving Factor in Complex Human Enterprises

Patrick Beautement

This paper discusses the benefits which could be realised from incorporating the notion of 'value' into models of complex human enterprise. The paper takes as its starting point the realities of acting in the world - that throughout commerce and government and in everyday life we are inevitably connected through complex networks of relationships and interactions at different levels of abstraction, always mediated by the environment with which we co-evolve. Capabilities are needed so that we are able to adapt to these inevitable uncertainties of the world without dislocation.

Enterprises are purposeful and are driven by their notions of what they 'value', and they act to defend or promote those values through change. Hence, for organisations and society at large, the key capabilities to have are: being open to change; being able to sense opportunities for change; and having the will, organisational agility and adaptability to take advantage of these opportunities and challenges. Concomitant agility in cyberspace is also essential. To achieve purposeful agility, social enterprises and cyberspace must be capable of co-supporting continuous dynamic adjustment by employing complex systems engineering mechanisms appropriate for dealing with the real world.

Often, though, we see the world in terms of process, bounded 'systems', efficiency and 'the bottom line' and our processes and information systems only see the world in terms of facts and 'optimal' behaviour. As a result, our tools and systems are less effective than they could be. In a sustainable, interconnected world, we need instead to look at a broader spectrum which involves the essential complex interplay between at least five types of capital: natural, human, social, manufactured and financial. These capitals can be thought of in terms of 'value sets'.

So, the questions then are: how do these value sets map to complex adaptive mechanisms and drivers and how can we represent the related values in cyberspace? It is essential to be able to do this with the (potentially conflicting) sets of values of different enterprises such that cyberspace (and the software entities inside it and the hybrid entities at its margins) can 'reason' in a useful manner to augment human endeavours. This talk will discuss these issues and suggest some approaches, mechanisms and models for further discussion and debate.

Variational principle for relaxed states of a plasma confined by a nonintegrable magnetic field

Robert Dewar (The Australian National University), Matthew Hole (The Australian National University), Stuart Hudson (Princeton Plasma Physics Laboratory) and Mathew McGann (The Australian National University)

Confinement of plasma by a strong toroidal magnetic field is the principle underlying most approaches to fusion power, the magnetic field preventing transport of matter and providing thermal insulation against the enormous temperature difference (10^8 deg C) between the middle of the plasma and the edge. The magnetic field is only fully effective in doing this if all magnetic field lines lie on nested toroidal surfaces; that is, in the language of Hamiltonian dynamical systems, if the field-line Hamiltonian is integrable, so the phase space is foliated by invariant tori. However, integrability breaks down when axial symmetry is broken, either by spontaneous instabilities or by currents external to the plasma, and confinement is degraded in the complex set of chaotic regions arising from this. We propose [1] a variational principle to model the effect of nonintegrability within the magnetohydrodynamic (MHD) approximation: minimization of the total plasma energy subject to the constancy of a finite number of topological (magnetic helicity), mass and entropy invariants, consistent with the assumed existence of residual invariant tori of KAM type. We also explore its implementation.

References

[1] "Eigenvalue problems for Beltrami fields arising in a 3-D toroidal MHD equilibrium problem", S.R. Hudson, M.J. Hole and R.L. Dewar, accepted for publication in Phys. Plasmas (2007)

Wetland ecosystems as manifestations of complex systems

Changhao Jin (Arthur Rylah Institute)

Wetlands are the most threatened ecosystems of our planet. Mathematical models are developed to study freshwater wetland ecosystems in the face of secondary salinisation and climate change. System behaviour and dynamics are investigated. Important questions related to ecosystem resilience and sensitivity, path dependence, stability and nonlinearity are addressed. The models are tested against empirical data.

Zonal flow generation by modulational instability

Robert Dewar (The Australian National University) and R Farzand Abdullatif (The Australian National University)

The term "zonal flow" has recently come to be much used in toroidal magnetic confinement plasma physics to refer to a mean poloidal flow with strong variation in minor radius. The sheared nature of this flow is thought to have the strongly beneficial effect of reducing radial transport by suppressing turbulence, thus improving the confinement of heat required to achieve fusion conditions. The use of the same phrase 'zonal flow' in the context of both geophysics and magnetic plasma confinement is no coincidence, as the existence of strong analogies between these fields has become well recognized. The envelope formalism provides a simple way of understanding the excitation of zonal flows by nonlinear interactions of plasma drift waves or Rossby waves, which can be described equivalently by the Hasegawa-Mima (HM) equation or the barotropic vorticity equation, respectively [1]. The extension of this work to modulational instability of ion temperature gradient modes will be discussed.

References

[1] "Zonal flow generation by modulational instability" R.L. Dewar and R.F. Abdullatif. In Proceedings of the CSIRO/COSNet Workshop on Turbulence and Coherent Structures, Canberra, Australia, 10-13 January 2006 (World Scientific, 2007, in press, eds. J.P. Denier and J.S. Frederiksen), pp 415-430. E-print arXiv:physics/0610016

Poster Session

Computational Models for Studying Signalling Control Mechanisms behind Legume Autoregulation of Nodulation

Liqi Han (ARC Centre of Excellence for Integrative Legume Research and ARC Centre for Complex Systems, School of Information Technology and Electrical Engineering, The University of Queensland), Peter M. Gresshoff (ARC Centre of Excellence for Integrative Legume Research) and Jim Hanan (ARC Centre of Excellence for Integrative Legume Research, ARC Centre for Complex Systems and Advanced Computational Modelling Centre, The University of Queensland)

Nitrogen fixation by legumes is the product of a symbiosis of legume plants and a group of soil bacteria known as rhizobia, in which the plant invests its resources in new organs called nodules to house the bacteria. A balance between nodulation and other growth processes is maintained by a regulatory process called autoregulation of nodulation (AON). It has been hypothesized that the interaction with rhizobia causes a root-produced signal, which is sent to the leaves and then detected by a leucine-rich repeat receptor kinase encoded by the NARK gene. The detection of this root-shoot signal further induces the production of a shoot-derived signal (SDI) transported to the root, which inhibits development of new nodules. We use multi-scale modelling of processes including intra- and inter-cellular signalling, long-distance signalling and phenotypic development regulated by internal control mechanisms to study this homeostatic system in soybean. A model of signal reception in the leaves was developed using context sensitive L-systems to study hypotheses on intra-cellular signal transduction involved in AON. At a whole plant scale, a structural framework has been developed capable of simulating soybean growth driven by hypothetical patterns discovered in the empirical data, integrated with developmental factors. Root-shoot and shoot-root signalling pathways were also incorporated into the structural model to allow the pattern-driven control of nodulation to be replaced with explicit models of the signals involved in autoregulation. This will enable the model to be used for prediction in more complex situations and for the testing of signalling hypotheses.

A computational model of C. elegans locomotion dynamics and motor control circuitry

Mark Wakabayashi (Thinking Systems, Queensland Brain Institute and School of Information Technology and Electrical Engineering, The University of Queensland), David Carrington (School of Information Technology & Electrical Engineering, The University of Queensland) and Janet Wiles (The University of Queensland)

The nematode Caenorhabditis elegans is a model organism for research into the biological basis of behaviour. Its 302 neurons have been extensively studied, and their connectivity has been comprehensively mapped. It nevertheless remains unclear exactly how the motor neurons operate to innervate the muscles in a way that supports its undulatory locomotion. One plausible hypothesis is that muscle activations are based on the locations of curves along the body, by incorporating feedback from regions of the motor neurons suspected to be sensitive to mechanical stretch. These suspected stretch receptors, however, extend in the same direction as the muscle waves are propagated, and thus appear unsuited to producing the muscle activation patterns required for undulatory locomotion. Due to the small size of C. elegans, electrophysiological recordings of the neuron activations during locomotion have not been possible. This project employed computational models to assess the plausibility of the stretch receptor model of C. elegans locomotion. A comprehensive model of the C. elegans body using a robust physics simulation was developed. This model provided a platform for the integration of a model of stretch receptor based motor control that was designed according to the known C. elegans neural circuitry. Simulation of the models' dynamics and interactions showed that stretch receptor based motor control is able to sustain forward locomotion by activating the muscles on the forward outer sides of curves in the body. Stretch receptors could therefore form the basis of C. elegans locomotion because of, and not in spite of, their orientation, thus resolving the apparent paradox. The stretch receptor hypothesis is shown to offer a consistent and computationally plausible model of the coordination of locomotion by the C. elegans neural circuitry.

Approximating extinction times and probabilities for absorbing birth-death processes

David Sirl (ARC Centre of Excellence for Mathematics and Statistics of Complex Systems, University of Queensland), Hanjun Zhang (ARC Centre of Excellence for Mathematics and Statistics of Complex Systems, The University of Queensland) and Philip Pollett (ARC Centre of Excellence for Mathematics and Statistics of Complex Systems, University of Queensland)

Birth-death processes are a class of Markov processes widely used in biological and chemical settings, for example in modelling population sizes of threatened species or the number of molecules of a particular reactant in a chemical system. Many such systems have an absorbing state, corresponding to population extinction or depletion of a reactant.

We describe a little-known method due to Mu-Fa Chen (2000, 2001) of approximating the decay parameter of an absorbing birth-death process; then introduce numerical methods based on Chen's results which can be used to obtain very good approximations to quantities including expected extinction times and the probability of extinction within a given timeframe.
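
For contrast with the Chen-based approach, the following sketch computes expected extinction times by the textbook first-step-analysis linear system on a truncated state space; the rates are an illustrative subcritical example, not drawn from the paper.

```python
import numpy as np

N = 200                             # truncation level, assumed large enough

def birth(i):
    return 0.9 * i                  # illustrative subcritical linear rates

def death(i):
    return 1.0 * i

# First-step analysis: (b_i + d_i) tau_i - b_i tau_{i+1} - d_i tau_{i-1} = 1,
# with tau_0 = 0 (absorption) and births suppressed at the truncation level.
A = np.zeros((N, N))
rhs = np.ones(N)
for i in range(1, N + 1):
    b = birth(i) if i < N else 0.0
    d = death(i)
    A[i - 1, i - 1] = b + d
    if i > 1:
        A[i - 1, i - 2] = -d
    if i < N:
        A[i - 1, i] = -b

tau = np.linalg.solve(A, rhs)
print("expected time to extinction from state 5:", round(float(tau[4]), 4))
```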

Computational techniques for modeling complex biological systems

James Watson (ARC Centre for Complex Systems and ARC Centre in Bioinformatics, The University of Queensland), Mark Wakabayashi (Thinking Systems, Queensland Brain Institute and School of Information Technology and Electrical Engineering, The University of Queensland), Jared Moore (The University of Queensland), Andres Sanin Montoya (The University of Queensland), Kai Willadsen, Nic Geard (ECS, University of Southampton), Daniel Bradley (ARC Centre for Complex Systems, School of Information Technology and Electrical Engineering, The University of Queensland) and Janet Wiles (The University of Queensland)

Computational modeling is an effective technique for investigating complex biological systems. This poster presents an overview of recent modeling work done by the BioComplexity group of the ARC Centre for Complex Systems.

We have found that software models of specific biological systems are generally developed by a small number of people, and are constantly redesigned or even discarded as understanding of the model changes. While the overall model is transient, components such as visualizations and analysis methods are constantly re-used both within and across research projects.

The ACCS has invested in the capture of both the concepts of techniques used (in the form of Complex Systems Patterns) and in reusable implementations of key techniques (in the form of a software library known as CoolKit). Here we present an overview of this work, and its application in biological research.

Constructing complexity based rules for C. elegans from recursive networks

Frank Burden (CSIRO Molecular and Health Technologies; and Scimetrics Ltd), Julianne Halley (CSIRO Molecular and Health Technologies) and Dave Winkler (CSIRO Molecular and Health Technologies)

The differentiation pathways of Caenorhabditis elegans embryogenesis have been studied as a surrogate for future work on human embryonic and haematopoietic stem cell regulatory networks. Previous work by Geard and Wiles on network models of the same model organism has been extended by the introduction of a regularizer and more robust convergence algorithms. The resultant weight matrix can be interpreted as a set of rules that guides the differentiation of the cells via a set of regulatory factors (internal genes or external entities). The activity of the regulatory factors shows patterns across the differentiation pathway that reflect the left- or right-hand split. Using these patterns, work is underway to identify the actual factors responsible for the differentiation and to interpret the associated weights. Some recent work on the embryonic and haematopoietic systems will be discussed.

Critical regions and phase changes in simulations of spiking neurons

Peter Stratton (Queensland Brain Institute) and Janet Wiles (The University of Queensland)

Our goal in this research is to find efficient ways to simulate neural systems at multiple spatial and temporal scales. Current artificial neural networks have proven very effective at modelling longer time scales required for cognitive (> 2 seconds) and engineering applications. However to model shorter timeframes with biological plausibility one needs to model the spiking characteristics of neurons. One of the challenges of spiking neuron simulations is the computational load of simulating individual spikes in networks large enough to be computationally significant. In addition, there is a wide range of spiking neuron behaviour which depends on complex chemical interactions and other biological details which may or may not be salient at a given spatial and temporal scale. Izhikevich (2003) recently proposed a spiking neuron model based on a set of parameterised equations which can replicate a broad range of neuron characteristics. The elegance of this model is that the entire range of known cortical neuron behaviours can be emulated with changes in just four parameters. In this study we have implemented the Izhikevich model and simulated a fully connected network of 1000 spiking neurons. We then applied complex systems tools to visualise slices through the state space, and present heatmaps that demonstrate critical regions and phase changes in network behaviour.
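
The model itself is compact enough to state in a few lines. The sketch below simulates a single Izhikevich neuron with the published 'regular spiking' parameter set; the study above couples 1000 such units, which is not reproduced here.

```python
# Euler integration with a 1 ms step is crude but sufficient to show the
# qualitative behaviour; Izhikevich's own code uses 0.5 ms half-steps.
a, b, c, d = 0.02, 0.2, -65.0, 8.0       # 'regular spiking' parameters
v, u = -65.0, 0.2 * -65.0                # membrane potential and recovery
I = 10.0                                 # constant input current
spike_times = []

for t in range(1000):                    # one step per millisecond
    if v >= 30.0:                        # spike detected: reset
        spike_times.append(t)
        v, u = c, u + d
    v += 0.04 * v * v + 5 * v + 140 - u + I
    u += a * (b * v - u)

print("spike times (ms):", spike_times)
```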

Do stylized facts of order book markets need strategic behaviour?

Dan Ladley (The University of Leeds, United Kingdom) and Klaus Schenk-Hoppe (University of Leeds, United Kingdom)

This paper studies the role of strategy and the market mechanism in market behaviour. To this end we analyse a zero-intelligence model of a dynamic limit-order market. Positive correlations in order types are found to be caused by the market mechanism rather than by strategic interaction. In contrast, the absolute probabilities of order submission point to room for strategic behaviour. Market architecture governs several stylised facts of limit-order markets. Price movements may be predicted in the short term by analysing the state of the order book.
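
The paper's precise specification is not in the abstract; the sketch below is a generic zero-intelligence limit-order market in the same spirit, with agents submitting random limit and market orders and no strategic behaviour (order rates, price range and unit sizes are all hypothetical).

    import random

    random.seed(1)
    bids, asks, trades = [], [], []      # resting limit orders (price only, unit size)

    for t in range(10_000):
        side = random.choice(("buy", "sell"))
        if random.random() < 0.5:        # limit order at a uniformly random price
            price = random.randint(90, 110)
            (bids if side == "buy" else asks).append(price)
        else:                            # market order: hit the best opposing quote
            book = asks if side == "buy" else bids
            if book:
                best = min(book) if side == "buy" else max(book)
                book.remove(best)
                trades.append(best)

    print(f"{len(trades)} trades; last price {trades[-1] if trades else None}")

Even this strategy-free mechanism reproduces some order-book regularities, which is the kind of baseline such papers use to separate mechanism effects from strategic ones.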

Electricity market planning and management

John Lu (The University of Queensland)

Transmission Expansion Planning (TEP) aims to implement a robust transmission system between generation and distribution systems, supporting a competitive, efficient and flexible electricity market while satisfying reliability requirements. Since the start of deregulation, market participants have been eager to maintain a competitive edge, and market players adjust their operational strategies as the electricity market changes. This clearly implies that market participants will strongly prefer a project with managerial flexibility over one without. Under the traditional Net Present Value (NPV) based financial evaluation framework for TEP, a single risk-adjusted discount rate may not fully represent the risk factors in the market, especially for long-term TEP. How to incorporate a project's managerial flexibility in responding to unpredictable market developments has therefore become the key question for long-term TEP. This paper presents an option-based planning framework that combines an expansion plan's foreseeable short-term benefit with the value of its potential options in the long-term future.
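
As a hedged numerical illustration of the valuation issue described (all cash flows, rates and probabilities below are hypothetical, not taken from the paper), a single risk-adjusted discount rate can flip a long-horizon project's NPV, while an option-based view separately values the flexibility to defer a stage.

    def npv(cashflows, rate):
        """Net present value of yearly cashflows discounted at a single rate."""
        return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

    # Hypothetical transmission project: large up-front cost, modest yearly benefit.
    base_case = [-100.0] + [12.0] * 15
    print(f"NPV at  8%: {npv(base_case, 0.08):6.1f}")   # slightly positive
    print(f"NPV at 12%: {npv(base_case, 0.12):6.1f}")   # same project, now rejected

    # Option view: a deferrable second stage is built only if demand materialises.
    expand = [-40.0] + [9.0] * 14
    option_value = 0.5 * npv(expand, 0.08) + 0.5 * 0.0  # 50/50 demand scenario
    print(f"Expected value added by the deferral option: {option_value:6.1f}")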

Identifying Emergent Social Behavioural Networks in Domesticated Livestock

Kym Patison (CSIRO), Dave Swain (CSIRO), Philippa Pattison (School of Behavioural Science, University of Melbourne) and Garry Robins (School of Behavioural Science, University of Melbourne)

Investigating the herding structure of a group of cattle as a network can help quantify elements of social behaviour, including associations between individuals, and thereby identify possible routes of information and disease transmission. To determine the social structure, we must track individuals in relation to the group. Measuring proximity between individuals is a physical means of determining sociability. Quantifying interactions by direct observation is error-prone, whereas transceiver contact loggers can continuously record the time and duration of close-proximity contacts between all cows within the group. From these quantified social interactions, a sociability index can be derived that measures an individual's preference for socialising and ranks individuals by social status. This poster presents potential directions for further work utilising the derived sociability index, namely the link between physiological drivers and behaviour. Studying the change in behaviour of an animal under varying environmental and physiological conditions can provide feedback on the production efficiency of those conditions. Quantitative modelling of the social interactions can help explore how the underlying social behaviour might influence resource use and disease dynamics.
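
The abstract does not define the sociability index; one minimal version, assuming contact-logger records of the form (animal A, animal B, contact duration), credits each contact to both animals, normalises totals, and ranks the herd.

    from collections import defaultdict

    # Hypothetical contact-logger records: (cow_a, cow_b, contact_seconds).
    contacts = [("c1", "c2", 420), ("c1", "c3", 60), ("c2", "c3", 300),
                ("c2", "c4", 45), ("c3", "c4", 600)]

    total = defaultdict(float)
    for a, b, seconds in contacts:       # contact time accrues to both animals
        total[a] += seconds
        total[b] += seconds

    herd_total = sum(total.values())
    index = {cow: t / herd_total for cow, t in total.items()}
    for rank, (cow, s) in enumerate(sorted(index.items(), key=lambda kv: -kv[1]), 1):
        print(f"{rank}. {cow}: sociability {s:.2f}")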

Modelling the Omics Network of Hepatocellular Carcinoma using NetMap®

David Fung (The University of Sydney) and John Galloway

Modularity is an important topological feature of biological networks such as proteome networks, gene co-expression networks, signal transduction pathways, and metabolic pathways. In essence, modules are sub-networks whose entities are more frequently connected to each other than to entities outside the sub-network. When visualized as a node-edge graph, modules appear as communities of high connectivity. Because it arises in a variety of biological contexts, modularity should be sought not only in physical networks but also in integrated networks of physical and ontological entities; the authors call the latter 'omics' networks. Visual graph mining can be used to compare 'omics' networks across clinical phenotypes, which may reveal the molecular pathology of hepatocellular carcinoma. In particular, the topological change of certain modules due to changes in their membership should reveal molecular relationships that may explain the progression of the disease. In this paper, the authors demonstrate the use of visual graph mining in modeling the 'omics' network of hepatocellular carcinoma.
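
NetMap® is a commercial visual graph-mining tool, so its interface is not reproduced here; as an open-source stand-in for the module concept defined above, networkx's greedy modularity method finds communities of high connectivity in a toy two-module graph.

    import networkx as nx
    from networkx.algorithms.community import greedy_modularity_communities

    # Toy 'omics'-style graph: two dense sub-networks joined by one weak link.
    G = nx.Graph()
    G.add_edges_from([("p1", "p2"), ("p1", "p3"), ("p2", "p3"),   # module 1
                      ("g1", "g2"), ("g1", "g3"), ("g2", "g3"),   # module 2
                      ("p3", "g1")])                              # inter-module edge

    for i, module in enumerate(greedy_modularity_communities(G), start=1):
        print(f"module {i}: {sorted(module)}")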

Network models for embryonic stem cell self-renewal and differentiation in mice and men

Julianne Halley (CSIRO Molecular and Health Technologies), Frank Burden (CSIRO Molecular and Health Technologies; and Scimetrics Ltd) and Dave Winkler (CSIRO Molecular and Health Technologies)

Embryogenesis is an extremely complex process governed by control mechanisms at several different levels of organization. To tease apart this remarkable complexity, we are building a range of network models that recapitulate the first few cell divisions in mouse and human embryonic stem cells. We chose network models to facilitate detection of an appropriate level of detail and the integration of multiple heterogeneous data sets, and we begin with the mouse model because of the relatively large amount of data available. We tie the models to reality using cell lineage diagrams and microarray studies that reveal noteworthy differences in gene expression between cell types. Our elaboration of the work by Geard and Wiles on Caenorhabditis elegans embryogenesis provided an initial template that could be adapted to mammalian embryogenesis. Special attention was paid to how factors external to cells integrate with intrinsic genetic circuitry.

Novel sparse Bayesian methods for stem cell microarray analysis and cancer diagnosis

Dave Winkler (CSIRO Molecular and Health Technologies), Frank Burden (CSIRO Molecular and Health Technologies; and Scimetrics Ltd) and Julianne Halley (CSIRO Molecular and Health Technologies)

Analysis of microarray data to probe gene regulatory processes in stem cells, and to classify and diagnose cancers, represents an interesting and important class of mathematical problems in which the number of variables greatly exceeds the number of observations (grossly underdetermined systems). Construction and annotation of complexity-based network models of gene regulation require the identification of relevant genes and the genes to which they are connected.

We have used two Bayesian approaches to analyze microarray data, to understand the relevance of gene expression levels in stem cells and to classify closely related small round blue-cell tumours (SRBCT). We validated our feature extraction and classification methods using a comprehensive cancer microarray data set from the literature, and a small set of microarrays from gene knockouts in embryonic stem cells. We used Bayesian regularized neural networks to carry out nonlinear classification and diagnosis of SRBCT. We also used a sparse Bayesian feature reduction method, based on an EM algorithm with tuneable sparsity, to identify a small set of the genes most relevant for diagnosis.

Both methods performed well, and their combination provided a means of modelling the data optimally while allowing interpretation of the models in terms of the most relevant genes.

We are currently applying the methods to identify relevant genes in the early stages of embryonic stem cell differentiation. We will deduce regulatory network models from the genes identified.
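
The authors' Bayesian regularised neural networks and EM-based sparse method are not reproduced here; as a rough stand-in, scikit-learn's ARDRegression (a sparse Bayesian linear method with automatic relevance determination) illustrates gene selection on a synthetic underdetermined problem of the kind described.

    import numpy as np
    from sklearn.linear_model import ARDRegression

    rng = np.random.default_rng(0)
    n_samples, n_genes = 40, 500       # far more variables than observations
    X = rng.normal(size=(n_samples, n_genes))
    planted = [3, 17, 42]              # only three genes actually drive the response
    y = X[:, planted] @ np.array([2.0, -1.5, 1.0]) + 0.1 * rng.normal(size=n_samples)

    model = ARDRegression().fit(X, y)  # ARD prior drives irrelevant weights to zero
    top3 = sorted(np.argsort(np.abs(model.coef_))[-3:].tolist())
    print("top-ranked genes:", top3, "| planted genes:", planted)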

Optimal Active Learning in Gaussian Process Regression: an Empirical Study

Flora Yu-Hui Yeh (ARC Centre for Complex Systems, School of Information Technology & Electrical Engineering, The University of Queensland) and Marcus Gallagher (ARC Centre for Complex Systems, School of Information Technology and Electrical Engineering, The University of Queensland)

Inductive learning and self-adaptation are fundamental properties of many types of natural and artificial complex systems, a consequence of the difficult learning tasks and/or data presented by the environments in which such systems exist. Active learning is a class of supervised learning techniques in which the learning algorithm actively selects the data points to be used in training. Active learning aims to use as few labeled training examples as possible to achieve the desired performance, which is most beneficial for problems where labeling data is expensive or difficult. This paper presents empirical active learning results for Gaussian process regression based on the optimal active learning technique proposed by Roy and McCallum. Optimal active learning selects points to minimise expected future error by making approximations over the full input distribution and true output distribution. Gaussian processes are a state-of-the-art probabilistic method in machine learning. They are well suited to efficient implementation of optimal active learning because they provide a mean and variance for any predicted output. Although Gaussian processes have been proposed for active learning, so far only preliminary experimental evaluations have been done. This paper investigates the practical usefulness of optimal active learning by applying the technique to a range of real-world datasets varying in size, dimensionality and domain. Its performance is evaluated and compared with random active learning and conventional supervised learning. Furthermore, several experimental factors of optimal active learning are explored, such as the importance of reference-set and test-set sizes.
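
Roy and McCallum's optimal criterion estimates expected future error over the input distribution; the sketch below substitutes the common, cheaper proxy of querying the pool point with the highest predictive variance, using scikit-learn's Gaussian process regressor (the target function, kernel and pool are hypothetical).

    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF

    rng = np.random.default_rng(0)
    f = lambda x: np.sin(3 * x).ravel()          # hidden target to be learned
    pool = rng.uniform(0, 3, size=(200, 1))      # unlabelled candidate inputs

    X, y = pool[:2], f(pool[:2])                 # two initial labelled examples
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5), optimizer=None)

    for _ in range(10):                          # active-learning loop
        gp.fit(X, y)
        _, std = gp.predict(pool, return_std=True)
        pick = int(np.argmax(std))               # query the most uncertain point
        X = np.vstack([X, pool[pick]])
        y = np.append(y, f(pool[pick:pick + 1]))

    print(f"labelled {len(X)} points; residual max std {std.max():.3f}")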

Reinforcement learning in complex computer game environments

Michelle McPartland (ARC Centre for Complex Systems, School of Information Technology & Electrical Engineering, The University of Queensland) and Marcus Gallagher (ARC Centre for Complex Systems, School of Information Technology and Electrical Engineering, The University of Queensland)

Computer games provide complex, rich and dynamic environments for researching complex systems techniques. Research in game artificial intelligence has shown successful initial results in applying reinforcement learning (RL) algorithms to control game agents. This study extends previous work by applying an RL algorithm to the complex domain of first person shooter games. Game agents will learn the task of high-level decision making through interaction with the environment.

The study will address the issues of applying RL to dynamic, continuous and fast-paced environments. TD(λ) will be used as the base RL algorithm, due to its success in dynamic, complex environments over exhaustive methods such as Q-learning. Deictic actions will also be used to abstract the state-space information, while still providing smooth state transitions for the agent.
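
The FPS environment cannot be reproduced here; as a minimal sketch of the TD(λ) mechanics referred to above, the tabular value-prediction example below runs a toy random walk and shows the eligibility-trace update (all parameter values are illustrative).

    import random

    random.seed(0)
    n_states, alpha, gamma, lam = 5, 0.1, 0.95, 0.8
    V = [0.0] * n_states                     # state-value estimates
    for episode in range(500):
        e = [0.0] * n_states                 # eligibility traces
        s = n_states // 2
        while 0 < s < n_states - 1:          # states 0 and 4 are terminal
            s2 = s + random.choice((-1, 1))  # random-walk behaviour policy
            r = 1.0 if s2 == n_states - 1 else 0.0
            nonterminal = 0 < s2 < n_states - 1
            delta = r + gamma * V[s2] * nonterminal - V[s]
            e[s] += 1.0                      # accumulating trace
            for i in range(n_states):        # spread the TD error along traces
                V[i] += alpha * delta * e[i]
                e[i] *= gamma * lam
            s = s2
    print([round(v, 2) for v in V])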

The experimental setup consists of five RL agents playing multiple games against five agents hard-coded with rules, in a purpose built, generic first person shooter environment. The aim of the experiment is to investigate the performance of RL agents compared to hard-coded rule based agents. A combination of quantitative and qualitative data collated from the experiments will be used to determine the skill level and behaviour set of the RL agents. It is hypothesised that strategic behaviours will emerge from the learning game agents, producing competitive game play.

Robots and the Evolution of Spatial Language

Ruth Schulz (School of Information Technology & Electrical Engineering, The University of Queensland), David Prasser (School of Information Technology & Electrical Engineering, The University of Queensland), Mark Wakabayashi (Thinking Systems, Queensland Brain Institute and School of Information Technology and Electrical Engineering, The University of Queensland) and Janet Wiles (The University of Queensland)

Language is a paradigm example of a complex system, in which simple primitives and interactions give rise to complex behaviours across multiple spatial and temporal scales. These primitives are the basis for the grounding of language terms in real world phenomena. In this project, we use mobile robots that map their environment to study spatial language and its grounding. The robot platform, RatSLAM, was inspired by the rodent hippocampus and has the ability to explore novel environments and create effective maps. The RatChat project added a language dimension, in which the robots evolve simple languages to describe regions in their spatial world. However, labelling regions in space provides only simple languages, while a complete spatial language also describes motion, the location of objects and landmarks, paths, directions, and distances. We are now addressing the challenge of designing robots that can learn spatial relationships, and hence will learn spatial language terms such as the English prepositions 'near', 'between', and 'behind'. The two studies planned involve languages for landmarks and the spatial relations described by English spatial prepositions. The first study is a location language game, where robots will play language games when they are within a set distance of each other. From these games, the robots will form a simple language for places in their world. In the second study, the robots will be provided with a set of conceptual primitives that can be combined to form a set of spatial relations. Using an interactive interface, the robots will be taught the meanings of spatial relations. The poster will describe the RatChat project and recent progress.
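
Implementation details of the location language game are not given in the abstract; the heavily simplified sketch below (random invented words, unconditional adoption by the hearer, and a fixed region list are all assumptions) shows how repeated pairwise games can converge on a shared place vocabulary.

    import random

    random.seed(2)
    REGIONS = ["nest", "corridor", "arena"]      # hypothetical places in the world

    class Robot:
        def __init__(self):
            self.lexicon = {}                    # region -> word

        def word_for(self, region):
            if region not in self.lexicon:       # coin a new random word
                self.lexicon[region] = "".join(random.choice("aeioubdgk")
                                               for _ in range(4))
            return self.lexicon[region]

    robots = [Robot() for _ in range(4)]
    for game in range(200):
        speaker, hearer = random.sample(robots, 2)         # robots 'within range'
        region = random.choice(REGIONS)
        hearer.lexicon[region] = speaker.word_for(region)  # hearer adopts the word

    shared = all(len({r.lexicon.get(reg) for r in robots}) == 1 for reg in REGIONS)
    print("shared place language:", shared)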

Searching Concept Spaces using Physical Navigation Strategies

Paul Stockwell (Thinking Systems, School of Information Technology and Electrical Engineering, The University of Queensland), Andrew Smith (The Institute for Social Science Research, The University of Queensland) and Janet Wiles (The University of Queensland)

This project asks how physical navigation strategies can be used to navigate a conceptual landscape derived from a body of literature. The question is significant to both pure and applied domains. Navigation is a fundamental ability in mammals, with specific brain regions specialised for it. The project is part of the Thinking Systems project, which is investigating the extent to which the brain regions that evolved for physical navigation have been adapted to navigating conceptual spaces. Understanding the relationship between physical and conceptual navigation will provide insight into the neural bases of concepts, language for space, and the way spatial metaphors are used for other (non-spatial) domains. On a practical level, this project has significance for the development of practical algorithms for navigating concept maps, in information systems and virtual reality, or using fuzzy inferencing to construct a 'narrative' chain similar to a conditional probability tree or decision tree. In the longer term, there is significance in developing mechanisms that may assist language acquisition in intelligent agents. The expected outcomes for this project include a) an appropriate data structure for storing a conceptual landscape; b) implementation of a tool for generating a conceptual landscape; c) design and implementation of physical-like navigation techniques; and d) metrics for indicating the effectiveness of any given navigation technique. It is expected that literature collections based on different topics will produce quite different landscape structures. Network performance metrics may be used to categorise the form that a landscape may take, thus quantifying the performance of a navigation technique against similar landscapes.

Simulkit: a software toolkit aiming towards a unified network-based view of complex systems

Daniel Bradley (ARC Centre for Complex Systems, School of Information Technology and Electrical Engineering, The University of Queensland), Ariel Liebman (ARC Centre for Complex Systems, The University of Queensland) and Leighton Brough (ARC Centre for Complex Systems, The University of Queensland)

There are currently few computational tools that can be shared between complex systems modelers working in different domains (air traffic control, genetic regulatory networks, commerce). This presentation describes the current status of the ACCS Simulkit project, which aims to produce a shared computational framework for complex systems modelers. What complex systems research in different domains has in common is the possible use of networks to describe interactions between components. This shared use of networks implies the ability to share visualizations and file formats that target the network level. It is hoped that extensions to the framework requested in one domain will prove applicable within other domains, eventually leading toward a unified theory of complex systems.

Theme Park Problem on Complex Network Models

Yasushi Yanagita (Future University-Hakodate) and Keiji Suzuki (Future University - Hakodate)

The theme park problem is an exercise for research on mass-user support applied to a social problem. It is proposed as a simplified version of large-scale dynamic scheduling.

Previous research on the theme park problem has shown that mass-user support is possible using scheduling algorithms that coordinate very large numbers of agents. However, such coordination algorithms depend on the environmental settings of the theme park model: their effectiveness is controlled by the model configuration, and their versatility has not yet been confirmed. In addition, experiments have so far been performed in only a single environment, so the effectiveness and versatility of the algorithms have not been demonstrated in environments where the service times, the number of segments, the number of edges, and the connection relations are varied.

In this paper, we propose theme park models that use ideas from complex networks, such as small-world and scale-free networks, to realise complicated environments closer to real problems. We compare the effectiveness and efficiency of the models and of strategies that equip multiple agents with traversal strategies capable of coordinating facility use instantly across various situations. We also show the potential of these traversal strategies for mass-user support through large-scale personal schedule adjustment in the proposed models.
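
The agents' traversal strategies are the paper's contribution and are not reproduced here; for illustration, the two network classes named above can be generated with networkx, treating each node as a park facility (sizes and parameters are hypothetical).

    import networkx as nx

    n = 100                                   # facilities in the model park
    small_world = nx.connected_watts_strogatz_graph(n, k=4, p=0.1, seed=0)
    scale_free = nx.barabasi_albert_graph(n, m=2, seed=0)

    for name, G in (("small-world", small_world), ("scale-free", scale_free)):
        print(f"{name}: mean path length {nx.average_shortest_path_length(G):.2f}, "
              f"clustering {nx.average_clustering(G):.2f}")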

Virtual Kiwifruit: Modelling Annual Growth Cycle

Mikolaj Cieslak (The University of Queensland), Alla N. Seleznyova (HortResearch, New Zealand) and Jim Hanan (ARC Centre of Excellence for Integrative Legume Research, ARC Centre for Complex Systems and Advanced Computational Modelling Centre, The University of Queensland)

The aim of our research is to develop hypotheses about the processes underlying the branching pattern of a kiwifruit vine.

Kiwifruit (Actinidia deliciosa) is a perennial vine of horticultural importance. The pattern of growth cessation of annual shoots creates three distinct shoot types (short, medium and long), and is influenced by genotype, environmental conditions, and rootstock. These shoot types are not evenly distributed along the parent shoot, but form branching zones.

We use L-systems to create a 3-D virtual plant representation of the annual growth cycle of a managed mature kiwifruit vine. At the beginning of each cycle, the structure consists of the main trunk, two leaders and a specified number of canes trained on a T-bar structure. We focus on the development of axillary bud outgrowth from these canes. Shoot growth is modelled by the appearance of metamers from an apex, with a metamer consisting of a node, an internode, a leaf, and an axillary bud. Growth cessation occurs when the shoot apex aborts, and is modelled as a stochastic process. We develop hypotheses defining this stochastic process that lead to the emergence of branching patterns. We compare virtual data collected from our architectural model of the kiwifruit vine with existing experimental data on branching patterns and shoot development.
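
The authors' L-system model is far richer than can be shown here; the plain-Python sketch below illustrates only the stochastic-apex-abortion idea, with a hypothetical per-step abortion probability standing in for the fitted process.

    import random

    random.seed(3)
    P_ABORT = 0.15        # hypothetical probability that the apex aborts each step

    def grow_shoot(steps=8):
        """Stochastic L-system: A -> M A (apex adds a metamer), or the apex aborts."""
        shoot = ["A"]                        # A = apex, M = metamer
        for _ in range(steps):
            out = []
            for symbol in shoot:
                if symbol == "A":
                    out.extend([] if random.random() < P_ABORT else ["M", "A"])
                else:
                    out.append(symbol)
            shoot = out
        return shoot.count("M")              # shoot length in metamers

    lengths = [grow_shoot() for _ in range(1000)]
    print(f"mean shoot length {sum(lengths) / len(lengths):.1f} metamers; "
          f"{sum(l <= 2 for l in lengths)} short shoots of 1000")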

Our architectural model forms a basis for further modelling of the vine's growth and of the interactions between plant architecture, resource allocation and environment. The model will be used to explore the complexity of the vine's architecture, and to predict its behaviour under the influence of various management practices and environmental parameters (e.g., temperature and light).

Software Demonstrations

A NetLogo simulation of dynamic configurability in combined arms teams

Martin Wong (Defence Science and Technology Organisation) and Victor Fok (Defence Science and Technology Organisation)

In this demonstration we will showcase a NetLogo agent-based distillation constructed to investigate the warfighting effectiveness of different unit compositions of abstracted combined arms teams. A simple combat scenario involving a blue combined arms team engaging a red combined arms team will be used for the demonstration. In the first instance, combined arms teams will be kept static, meaning that the members of any particular team cannot change once the team has formed. Next, we will present a case where members can be dynamically reallocated to other teams, either to replenish existing combat-attritioned teams or to form new warfighting structures optimized to engage the encountered enemy forces.

EcoLab 5

Russell Standish (Mathematics and Statistics, The University of New South Wales)

Since EcoLab 4 was announced at Complex Systems 2000, EcoLab has grown into a mature C++-based simulation platform capable of supporting agent-based modelling. It is broadly comparable with other well-known ABM platforms such as Repast, Mason and Swarm in terms of ease of use and available features. Its main justification is its support of C++ as a model implementation language, a language renowned for its rich feature set and execution efficiency. Secondarily, EcoLab supports distributed-memory parallel processing, allowing models to scale readily to parallel computing clusters.

This presentation will report on the unique features of EcoLab, and also on a comparative study of a simple agent based model implemented in EcoLab, Repast, Mason and Swarm. Finally, plans for the next major revision (version 5) will be announced, including better integration with Java-based packages such as Repast.

LiveGraph - a tool for data visualisation, analysis and logging in complex systems simulations

Greg Paperin (Monash University)

Simulation is an essential tool in complex systems research, and the hardest, most time-consuming part of simulation development is often the user interface, especially monitoring tools and visual displays.

Several visualisation frameworks are available; however, they usually require extensive data preparation from the model developer and offer powerful functions at the cost of a complex, slow-to-use interface. Such frameworks are targeted at post-simulation data analysis.

Our LiveGraph framework for exploratory data analysis combines several features that are (at least in combination) missing in other products:

  • A plotter that automatically updates graphs of simulation outputs in real time.
  • A concise point-and-click interface that lets users quickly select and compare data series, even in simulations outputting over 1000 series simultaneously.
  • Transformation of data series for visual comparison, and feature detection applied with a single click.
  • A Java-based framework that runs on any computer system yet is easily integrated with simulations written in any programming language.
  • Reading of files in a simple CSV-style format, with an API that handles data logging for simulations written in Java (a minimal logging sketch follows this list).
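
LiveGraph's exact file conventions are documented on its web site rather than in this abstract, so the sketch below shows only the general shape of data such a plotter consumes: one CSV row per simulation step with a header of series names, written from any language (the column names and toy dynamics are hypothetical).

    import csv

    # Write one row per simulation step; a live plotter can follow this file.
    with open("simulation_log.csv", "w", newline="") as fh:
        writer = csv.writer(fh)
        writer.writerow(["step", "population", "resource"])  # hypothetical series
        pop, res = 100.0, 500.0
        for step in range(1000):
            pop += 0.02 * pop * (1 - pop / res)              # toy logistic growth
            writer.writerow([step, f"{pop:.3f}", f"{res:.1f}"])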

These features make the LiveGraph system particularly useful while exploring the parameter space of a simulation model. LiveGraph is a generic tool that allows researchers to direct their effort to actual models, not visualisation.

The demonstration shows how LiveGraph was used with two very different complex systems simulation models.

For more information please visit http://www.live-graph.org.

Multi-tasking, specialisation and adaptability in a logistics problem

Matthew Berryman (Defence Science and Technology Organisation), Vanja Radenovic (Defence Science and Technology Organisation), David Batten (CSIRO), Anne-Marie Grisogono (Defence Science and Technology Organisation) and Alex Ryan (Defence Science and Technology Organisation)

In a logistics context of storage and distribution, we have built an agent-based model to generalize Ricardo's law of comparative advantage, i.e. to consider the 'trading' of tasks and functions in the design or evolution of a complex adaptive system, leading to an effective distribution of skills and capabilities among its agents. We explore tradeoffs between specialisation and multi-tasking (pleiotropy) in this dynamic framework. We consider not simply the adaptation of agents but their adaptability, i.e. their ability to track changes in the environment at different time scales. This is done using two-level adaptation (adapting the learning parameters of agents), and also by considering a 'system-of-systems designer' agent.
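
As a reminder of the arithmetic being generalized (the task names and productivities below are hypothetical, not the model's), comparative advantage means an agent can be absolutely worse at every task yet still raise total output by specialising where its opportunity cost is lowest.

    # Hours each agent needs per unit of each task (hypothetical productivities).
    hours = {"agent1": {"store": 1.0, "deliver": 2.0},
             "agent2": {"store": 3.0, "deliver": 4.0}}

    # Opportunity cost of one delivery, measured in units of storing forgone.
    for agent, h in hours.items():
        print(f"{agent}: 1 delivery costs {h['deliver'] / h['store']:.2f} stored units")

    # agent2's cost (1.33) is below agent1's (2.00), so although agent1 is
    # absolutely better at both tasks, total output rises when agent2
    # specialises in delivery and agent1 in storage.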

VLAB - An online virtual laboratory for complexity and artificial life

Alex Tee Neng Heng (Faculty of IT, Monash University) and David Green (Monash University)

If a picture is worth a thousand words, then a simulation model is worth a thousand pictures. Concepts such as 'complexity' are difficult because they are abstract, so the best way to understand their meaning is to run a simulation. For this reason, we developed VLAB, an online tool that makes complexity easier to grasp. Monash University's Complexity Virtual Laboratory (VLAB) is a web-based resource for research and education about complex systems. Its goals are to stimulate interest in complex systems and Alife and to provide demonstrations, both of key ideas and of recent research findings. VLAB includes numerous simulations (mostly Java applets), together with related tutorials, references and web links. The demonstrations include cellular automata, swarms, evolution, networks and non-linear dynamic systems. Applied examples include forest ecology, fire spread, epidemics, starfish outbreaks, the spread of computer viruses and cascading power failures. VLAB enhances the visibility and impact of research by tying simulations to research publications, online journals and books. Researchers can also enhance the impact of their own research through participation and contributions to the site. We aim to expand VLAB's coverage of complex systems in the future, and we encourage researchers to contribute their own simulations and tutorials. For more information, visit the VLAB web site at http://www.complexity.org/vlab/