Parallelisation and Python
Having chosen to divide up the Agents, we need to rewrite the model.py code so that fewer agents are made on each machine (an alternative is to make all the agents on one machine and portion them out at the start, but the agents here are homogeneous and independent, so this is unnecessary). Here's the load balancing code we'd add to "task" (so it runs on each node). Note that for any code inside "task" the variables are local, so each node has its own list of agents and a landscape object; node zero is going to add the landscapes together where needed, as we'll see in later sections:
# Setup
pipe_to_zero = None
node_number_of_agents = 0  # Node zero holds no agents, so its agent loops run zero times.
# Setup agents
if (node != 0):
    node_number_of_agents = int(number_of_agents/(number_of_nodes - 1))
    if (node == (number_of_nodes - 1)):
        node_number_of_agents = int(node_number_of_agents +
            (number_of_agents % (number_of_nodes - 1)))
    pipe_to_zero = pipes[node - 1]
    for i in range(node_number_of_agents):
        agents.append(Agent())
        agents[i].densitylimit = densitylimit
        agents[i].landscape = landscape
        # Agents get a reference to the
        # landscape to interrogate for densities.
        # They could also get a reference to the
        # agent list if agents need to talk,
        # but here they don't.
        # Allocate agents a start location.
        x = int(width/2)  # Set to middle for start
        y = int(height/2) # Set to middle for start
        agents[i].x = x
        agents[i].y = y
# Give the landscape a reference to the agent list so it
# can find out where they all are and calculate densities.
landscape.agents = agents
Thus we should end up with no Agents on node zero, with the other nodes getting an even spread, except for the last node, which gets the remainder of the Agents that can't be divided evenly.
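As a quick sanity check on that arithmetic, here's a throwaway sketch (the totals are made up purely for illustration, not part of the model) showing how the shares work out for, say, 103 agents across one coordinating node and three workers:
number_of_agents = 103  # made-up total, just to illustrate the split
number_of_nodes = 4     # node zero coordinates; nodes 1 to 3 hold agents

for node in range(1, number_of_nodes):
    share = int(number_of_agents/(number_of_nodes - 1))
    if (node == (number_of_nodes - 1)):
        share = share + (number_of_agents % (number_of_nodes - 1))
    print("node", node, "gets", share, "agents")

# Prints 34, 34 and 35: the last worker picks up the one leftover agent.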
We'll also need to adjust our agents[i].step() loops in model.py to loop through fewer agents (because node_number_of_agents is initialised to zero in the setup above, these loops simply do nothing on node zero):
for i in range(node_number_of_agents):
    agents[i].step()
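For context, in model.py this per-node loop sits inside whatever outer loop drives the model through time; assuming that loop uses a variable called num_of_iterations (adjust to whatever your model.py actually calls it), the adjusted section looks roughly like this:
for iteration in range(num_of_iterations):
    for i in range(node_number_of_agents):
        agents[i].step()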
Next we need to get all the worker nodes to send their densities back to node zero, so it can collate them.
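The detail of that comes in the following sections, but the broad pattern will be something like the sketch below. It assumes each landscape exposes its density grid as a 2D list called landscape.densities, and that node zero holds the receiving end of each worker's pipe in a list called pipes_from_workers; both names are placeholders, and the real attribute names and pipe bookkeeping will depend on how the Landscape class and the pipes were set up:
if (node != 0):
    # Workers: push this node's density grid down the pipe to node zero.
    pipe_to_zero.send(landscape.densities)    # placeholder attribute name
else:
    # Node zero: receive a grid from each worker and add it in, cell by cell.
    for pipe in pipes_from_workers:           # placeholder list of pipe ends
        worker_densities = pipe.recv()
        for y in range(height):
            for x in range(width):
                landscape.densities[y][x] += worker_densities[y][x]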