B. Algorithms

List of all algorithms in JCOP

Table 1 lists all algorithms bundled with JCOP, along with a brief description, the type of search and the required problem interfaces. Note that both graph search algorithms listed here extend GraphSearch, which reduces the work needed to create your own graph search algorithm to little more than providing a queue for it.

Name | Description | Search type | Required interfaces
BreadthFirstSearch | Graph search algorithm where newly expanded elements are placed at the end of the queue. Slow, but finds the optimal solution. | Graph search | StartingConfigurationProblem
DepthFirstSearch | Graph search algorithm where newly expanded elements are placed at the beginning of the queue. Requires less memory than BreadthFirstSearch, but does not guarantee the shortest path to a solution. | Graph search | StartingConfigurationProblem
GeneticAlgorithm | Creates a new population of solutions from the previous population by reproduction of pairs of configurations. | Global search | RandomConfigurationProblem, GlobalSearchProblem
GraphSearch | Abstract base for all graph search algorithms; at minimum, a queue must be implemented. | Graph search | StartingConfigurationProblem
SimulatedAnnealing | Accepts changes to a worse configuration with a probability that decreases over time; towards the end it accepts almost only better configurations. | Local search | RandomConfigurationProblem or StartingConfigurationProblem

Table 1: List of all algorithms in JCOP.

BreadthFirstSearch traverses all possible configurations; configurations expanded first are processed first. It is very slow (since it traverses all configurations) and memory consuming, but it always finds the optimal solution and the shortest path to it.

Figure 1: Animated BFS, source: commons.wikimedia.org (http://commons.wikimedia.org/wiki/File:Breadth-First-Search-Algorithm.gif).

DepthFirstSearch traverses all possible configurations; configurations expanded first are processed last. It is very slow (since it traverses all configurations), but it always finds the optimal solution (the shortest path to it, however, is not guaranteed).

Figure 2: Animated DFS, source: commons.wikimedia.org (http://commons.wikimedia.org/wiki/File:Depth-First-Search.gif).
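
As a generic illustration (not JCOP code), the two traversals below differ only in which end of the frontier receives newly expanded nodes; everything else is identical.

    import java.util.ArrayDeque;
    import java.util.ArrayList;
    import java.util.Deque;
    import java.util.HashSet;
    import java.util.List;
    import java.util.Map;
    import java.util.Set;

    // Generic illustration (not JCOP code): BFS and DFS differ only in where
    // newly expanded nodes are placed in the frontier.
    public class TraversalDemo {
        static List<String> traverse(Map<String, List<String>> graph, String start, boolean breadthFirst) {
            Deque<String> frontier = new ArrayDeque<>();
            Set<String> seen = new HashSet<>();
            List<String> order = new ArrayList<>();
            frontier.add(start);
            seen.add(start);
            while (!frontier.isEmpty()) {
                String current = frontier.pollFirst();
                order.add(current);
                for (String next : graph.getOrDefault(current, List.of())) {
                    if (seen.add(next)) {                  // expand only unseen nodes
                        if (breadthFirst) {
                            frontier.addLast(next);        // end of queue: processed later
                        } else {
                            frontier.addFirst(next);       // front of queue: processed next
                        }
                    }
                }
            }
            return order;
        }
    }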

GeneticAlgorithm

GeneticAlgorithm simulates evolution in nature. It creates a population of random configurations and then repeatedly selects pairs at random (better configurations have a higher chance of being picked) to reproduce into new pairs of configurations for the next generation. Every optimization step consists of creating a whole new population. Common settings for this algorithm are the population size and the mutation rate. Suggested improvements are implementing new Selection, Mutation and Reproduction classes.
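
The following sketch shows one generation step under the selection/reproduction/mutation scheme described above. The class and method names are hypothetical and do not come from JCOP; individuals are plain bit strings instead of JCOP configurations.

    import java.util.ArrayList;
    import java.util.Comparator;
    import java.util.List;
    import java.util.Random;
    import java.util.function.ToDoubleFunction;

    // Hypothetical sketch of one genetic-algorithm generation; not JCOP's implementation.
    public class GenerationStep {
        private final Random random = new Random();
        private final double mutationRate = 0.05; // common setting: mutation rate

        /** Builds the next generation from the current one; individuals are bit strings. */
        List<boolean[]> nextGeneration(List<boolean[]> population, ToDoubleFunction<boolean[]> fitness) {
            List<boolean[]> ranked = new ArrayList<>(population);
            ranked.sort(Comparator.comparingDouble(fitness::applyAsDouble).reversed()); // best first
            List<boolean[]> next = new ArrayList<>(population.size());
            while (next.size() < population.size()) {
                boolean[] parentA = select(ranked);
                boolean[] parentB = select(ranked);
                next.add(mutate(crossover(parentA, parentB)));
            }
            return next;
        }

        /** Better-ranked individuals are picked with higher probability. */
        private boolean[] select(List<boolean[]> ranked) {
            int index = (int) (ranked.size() * random.nextDouble() * random.nextDouble());
            return ranked.get(index);
        }

        /** Single-point crossover producing one child. */
        private boolean[] crossover(boolean[] a, boolean[] b) {
            int point = random.nextInt(a.length);
            boolean[] child = a.clone();
            System.arraycopy(b, point, child, point, a.length - point);
            return child;
        }

        /** Flip each bit with probability equal to the mutation rate. */
        private boolean[] mutate(boolean[] individual) {
            for (int i = 0; i < individual.length; i++) {
                if (random.nextDouble() < mutationRate) {
                    individual[i] = !individual[i];
                }
            }
            return individual;
        }
    }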

GraphSearch

The GraphSearch algorithm removes much of the repetitive work involved in creating graph search algorithms. These usually work almost the same way; the only difference is how they store their opened (and closed) states (configurations). One optimization step consists of taking one opened configuration, expanding it into all nearby configurations and putting those into the opened queue (which is not necessarily a FIFO queue; it may be any data structure).

How we interact with the queue is defined by the GraphSearchQueue interface. Its three most important methods are GraphSearchQueue#fetch() to get a new opened state to expand, GraphSearchQueue#add(Configuration) to add a new state to the queue, and GraphSearchQueue#testPresence(Configuration) to test whether a given configuration has ever been added to the queue (and hence must not be added again). How these methods are implemented determines whether the search becomes, for example, breadth first search, depth first search or some heuristic search.
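
For example, a FIFO queue roughly like the following would make GraphSearch behave as breadth first search. This is only a sketch that assumes the three methods above are the whole GraphSearchQueue contract; the actual interface and signatures may differ.

    import java.util.ArrayDeque;
    import java.util.Deque;
    import java.util.HashSet;
    import java.util.Set;

    // Sketch of a FIFO queue that would turn GraphSearch into breadth first search.
    // Assumes GraphSearchQueue declares only the three methods discussed above.
    public class FifoSearchQueue implements GraphSearchQueue {
        private final Deque<Configuration> opened = new ArrayDeque<>();
        private final Set<Configuration> seen = new HashSet<>();

        public Configuration fetch() {
            return opened.pollFirst(); // oldest configuration first => breadth first
        }

        public void add(Configuration configuration) {
            opened.addLast(configuration); // addFirst() here would give depth first
            seen.add(configuration);
        }

        public boolean testPresence(Configuration configuration) {
            return seen.contains(configuration); // never expand the same state twice
        }
    }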

SimulatedAnnealing

SimulatedAnnealing works with just one configuration. In every optimization step it tries to apply a random operation to the configuration. If this leads to a better solution, or if it passes the temperature test, the change is accepted; otherwise the current configuration is left unchanged.

Simulated annealing has a starting temperature which is lowered by a certain amount in every optimization step (both values can be changed). The higher the temperature, the higher the chance of accepting a worse configuration; how much worse the new configuration is also affects the chances.

In order for SimulatedAnnealing to work properly, it requires all problems to have their fitness scaled to the 0.0 - 1.0 (inclusive) interval. See BaseFitness for more details.
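
To make the acceptance rule concrete, here is an illustrative sketch; the class, method and field names are hypothetical rather than JCOP's API, the starting temperature and cooling step are arbitrary, and higher fitness (already scaled to 0.0 - 1.0) is assumed to be better.

    import java.util.Random;

    // Hypothetical sketch of the simulated-annealing acceptance test; not JCOP's implementation.
    public class AnnealingAcceptance {
        private final Random random = new Random();
        private double temperature = 10.0;       // starting temperature (configurable)
        private final double coolingStep = 0.01; // amount subtracted each step (configurable)

        /** Decide whether to keep a candidate whose fitness lies in [0.0, 1.0]. */
        boolean accept(double currentFitness, double candidateFitness) {
            if (candidateFitness >= currentFitness) {
                return true; // better (or equal) configurations are always accepted
            }
            // Worse configurations pass with a probability that shrinks as the
            // temperature drops and as the fitness gap grows.
            double gap = currentFitness - candidateFitness;
            return random.nextDouble() < Math.exp(-gap / temperature);
        }

        /** Cool down after every optimization step. */
        void cool() {
            temperature = Math.max(temperature - coolingStep, 1e-9);
        }
    }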