Running
[Agent practical 9 of 9]


So, that's it for our model.


We now have a model that lays the foundation for a wide range of ABMs, and which also exemplifies a number of key ABM ideas.

In terms of the final practical, we could have gone on to validate our predictions in some way. We also haven't run any sensitivity tests, which we could do using our original batch script. We could, for example, test the sensitivity of the model to the variables we didn't calibrate, like the neighbourhood size.
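A sensitivity sweep of this kind is just a loop over parameter values, running the model once per value and recording a summary statistic. The sketch below is a minimal, self-contained illustration: `runModel` is a hypothetical stand-in for a full model run, and the names are illustrative, not the course's actual API.

```java
// Hypothetical sketch of a sensitivity sweep. runModel is a stand-in
// for a full model run; in practice you would launch the real model
// (e.g. via the batch script) and read back a summary statistic.
public class SensitivitySweep {

    // Stand-in for a model run: returns a dummy summary statistic
    // as a function of the parameter, purely for illustration.
    static double runModel(int neighbourhoodSize) {
        return 100.0 / neighbourhoodSize;   // placeholder metric
    }

    // Run the model once per parameter value and collect the outputs,
    // so we can see how sensitive the result is to the parameter.
    public static double[] sweep(int[] neighbourhoodSizes) {
        double[] results = new double[neighbourhoodSizes.length];
        for (int i = 0; i < neighbourhoodSizes.length; i++) {
            results[i] = runModel(neighbourhoodSizes[i]);
        }
        return results;
    }
}
```

Plotting the collected results against the parameter values then shows how strongly each uncalibrated variable drives the model's behaviour.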

There are also still a few things that need smoothing out in our model: it would be nice, for example, if users could set model parameters to non-default values through the GUI. In addition, there's still at least one bug: what happens if two agents get full and restart within each other's neighbourhoods? Maybe you can think of more.
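One way to start tackling that bug is to detect the situation first: before an agent restarts, check whether it would land inside another agent's neighbourhood. The sketch below is a minimal, hypothetical helper assuming square neighbourhoods measured by Chebyshev distance (the method name and coordinate scheme are illustrative, not from the course code).

```java
public class NeighbourhoodCheck {

    // Returns true if two agents at (x1,y1) and (x2,y2) lie within each
    // other's square neighbourhoods of the given radius. For square
    // neighbourhoods the Chebyshev distance (max of |dx| and |dy|) is
    // the right measure, and the relation is symmetric.
    public static boolean mutuallyInNeighbourhood(int x1, int y1,
                                                  int x2, int y2,
                                                  int radius) {
        int dx = Math.abs(x1 - x2);
        int dy = Math.abs(y1 - y2);
        return Math.max(dx, dy) <= radius;
    }
}
```

A restart routine could call a check like this and pick a different location, or defer the restart, when it returns true.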

In terms of parallelisation, we've parallelised our calibration. Our model runs fairly fast, so we haven't had to parallelise the model itself; however, with larger models and more sophisticated agent behaviour we might have to. You can find a practical that walks you through doing this with MPJ here: Parallelisation and Java. The model used there is much simpler than ours (no bad thing!), but the principles can be adapted. In both cases some kind of neighbourhood measure is the medium of communication, and there would be nothing to stop us using the density-surface solution suggested for the simple model to parallelise ours effectively.
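The core of the density-surface idea is that each process summarises its own agents as a coarse grid of counts, and processes exchange only these small grids rather than full agent lists. Here's a minimal sketch of building such a surface; the class and method names are hypothetical, and the exchange step (e.g. a reduction across processes in MPJ) is left as a comment rather than real MPJ calls.

```java
public class DensitySurface {

    // Build a coarse density grid from agent (x, y) coordinates.
    // In a parallel run, each process would build this for its own
    // agents, then the grids would be summed across processes (e.g.
    // with an MPI-style all-reduce) so every process sees the global
    // density without shipping individual agents around.
    public static int[][] density(int[][] agentXY, int width, int height,
                                  int cellSize) {
        int[][] grid = new int[height / cellSize][width / cellSize];
        for (int[] a : agentXY) {
            grid[a[1] / cellSize][a[0] / cellSize]++;
        }
        return grid;
    }
}
```

Agents interacting through neighbourhood densities can then read the summed grid instead of querying remote agents directly, which keeps inter-process communication small and infrequent.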

Finally, although we tend to use Grid or Cloud computing these days, Beowulf clusters are still a popular option. With this in mind, you might be interested in comparing running MPJ like this with running code on a Beowulf cluster, so we've saved our old Beowulf instructions for you. Although you can't use them, reading through them will give you a flavour of how this works on Linux, using a different MPI library that requires a C-based implementation to run under it: Beowulf instructions.


So, that's it for building our model -- it's been an intense and somewhat crazy ride. If you now open the code we've developed, it will probably just swim before your eyes -- there's a lot of it, and some of it in combination is probably quite bewildering. DON'T PANIC. The practicals were designed to:

  1. Teach you something about the elements involved in building an ABM.
  2. Give you chunks of ABM code so you don't have to do it yourself from scratch.
  3. Get you to practise some simple coding ideas along the way (loops, branches, making objects, etc.).
  4. Help you get familiar with some of the ways you think when writing code (instance variables, method separation, etc.)

If all you got out of it was a few concepts and the chance to practise some code, that's fine. The materials are designed for you to come back to. The best thing to do now is visit the course overview page (the right-hand post-it-like note on the homepage) and review the practicals and what we did in each.

The practicals were also designed to give you some code that can form the basis of a wide variety of applications. Here's the code with the ABM bits stripped out of it:

Analyst.java (was Model);
Storage.java (was Environment);
IO.java;
Process.java (a new class, to hold tools for image processing).

The only difference is that this version uses two Storage objects: one for the original data, and one for the results of processing. It works perfectly well as a starting point for lots of things. For example, we used it, in combination with the JPEG-reading code from the end of part 8, to generate the hillslope data we used to fill the environment.
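The two-Storage pattern is simple: read data into one Storage, run a processing pass, and write the output into a second Storage so the original is never overwritten. The sketch below is a cut-down, hypothetical stand-in (this Storage is not the course's full Storage.java), using value inversion as a placeholder processing step.

```java
// Minimal sketch of the two-Storage pattern: source data in, a fresh
// Storage of processed results out. The inner Storage class here is a
// stand-in, not the course's full Storage.java.
public class TwoStorageSketch {

    static class Storage {
        double[][] data;
        Storage(double[][] data) { this.data = data; }
    }

    // A placeholder processing pass: build a new results Storage whose
    // values are the source values inverted against 255, leaving the
    // source Storage untouched.
    public static Storage invert(Storage source) {
        int h = source.data.length, w = source.data[0].length;
        double[][] out = new double[h][w];
        for (int y = 0; y < h; y++)
            for (int x = 0; x < w; x++)
                out[y][x] = 255.0 - source.data[y][x];
        return new Storage(out);
    }
}
```

Swapping the inversion for any other per-cell or neighbourhood operation (thresholding, smoothing, slope calculation) gives the same source-in, results-out structure.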

So, that's it. Congrats! Now it's time to work on your own model!