Most of the work on the reality gap
problem has assumed that only the
control policy of robots will be transferred. Lipson and Pollack, 18 however,
integrated an evolutionary robotics
simulation with rapid prototyping
technology to automate robot manufacture as well as robot design (Figure
1c, 1d). They first evolved the body
plans and control policies for robots
composed of linked assemblages of
linear actuators. Then, the 3D architectures of the best of these evolved
robots were printed out of plastic; motors, circuitry, and batteries were then
added by hand. Many of these automatically designed and manufactured
robots were able to successfully reproduce the locomotion patterns originally evolved in the simulator.
Combinatorics of evaluation. It was
identified early on that the time required to evaluate a single robot might
grow exponentially with the number
of parameters used to describe its task
environment. 22 For example, consider
a robot that must grasp m different
objects under n different lighting conditions. Each robot must be evaluated
for how well it grabs each object under
each lighting condition, requiring mn
evaluations per robot. If there are p parameters describing the task environment and each parameter has s different settings, then each robot must be evaluated s^p times.
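The cross product of task-environment settings can be made concrete with a short sketch. The parameter names and settings below are illustrative, not from the article; the point is only that the evaluation count is the product of the settings per parameter, and so grows exponentially with the number of parameters.

```python
from itertools import product

# Hypothetical task-environment description: each parameter maps to its
# possible settings. With p parameters of s settings each, the grid
# below has s**p cells -- and each robot must be evaluated in every cell.
env_params = {
    "object":   ["cube", "sphere", "cylinder"],  # m = 3 objects to grasp
    "lighting": ["dim", "bright"],               # n = 2 lighting conditions
}

# Full cross product of settings: one evaluation per combination, per robot.
environments = list(product(*env_params.values()))
print(len(environments))  # 3 objects x 2 lighting conditions = 6 evaluations
```

Adding even one more parameter with a handful of settings multiplies the evaluation count again, which is why exhaustive evaluation quickly becomes infeasible.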
This remains a serious, unresolved challenge in the field. One possible solution, however, is co-evolution. Consider a population
of robots and a second population of
task environments competing against
one another. The robots evolve to succeed when exposed to environments
drawn from the pool of evolving environments, and environments evolve
to foil the abilities of the evolving robots. This is not unlike prey evolving
to elude predators, while the predators
evolve to catch prey. This approach
could, in the future, yield robots that, although evaluated against only a subset of possible task environments, generalize to those they encounter once manufactured and deployed.
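The co-evolutionary dynamic described above can be sketched in a toy form. Everything here is illustrative, not from the article: a "robot" is a single parameter, an "environment" is a target value the robot must match, robots evolve to minimize the mismatch on environments drawn from the evolving pool, and environments evolve to maximize it.

```python
import random

random.seed(0)  # reproducible toy run

def mismatch(robot, env):
    """How badly a robot fails in an environment (lower is better for it)."""
    return abs(robot - env)

robots = [random.uniform(0, 1) for _ in range(10)]
envs = [random.uniform(0, 1) for _ in range(10)]

for generation in range(50):
    # Score each robot against a sample drawn from the evolving environment pool.
    sample = random.sample(envs, 3)
    robot_fit = {r: -sum(mismatch(r, e) for e in sample) for r in robots}
    # Environments are rewarded for foiling the current robots.
    env_fit = {e: sum(mismatch(r, e) for r in robots) for e in envs}

    # Truncation selection plus mutation: keep the best half of each
    # population, refill with noisy copies of the survivors.
    robots = sorted(robots, key=robot_fit.get, reverse=True)[:5]
    robots += [r + random.gauss(0, 0.05) for r in robots]
    envs = sorted(envs, key=env_fit.get, reverse=True)[:5]
    envs += [e + random.gauss(0, 0.05) for e in envs]
```

The arms-race structure, robots chasing a moving pool of environments, is the essential ingredient; a real system would replace the scalar "robot" with an evolved body plan and controller, and the scalar "environment" with evolved task parameters.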
Evolvability. Evolving all aspects of
a complex machine such as a robot is
a daunting, high-dimensional optimization problem. Biological evolution
faces the same challenge yet seems to
have addressed it by a process known
as the evolution of evolvability. A species with high evolvability is defined
as one that can more rapidly adapt to
changes in its environment than a similar species with lower evolvability.
One goal in evolutionary robotics
in particular, and the field of evolutionary computation in general, is to
create increasingly evolvable algorithms. Rather than independently
optimizing individual parameters of
a candidate solution, such algorithms
should rapidly discover useful aggregate patterns in candidate solutions
and subsequently elaborate them. It
has been shown, for example, that genomes that encode formal grammars
produce robots with regular structure, and that such genomes are more
evolvable than genomes that do not
produce regular structures. 12
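A grammar-based genome of the kind just described can be sketched minimally. This is illustrative, not the encoding from reference 12: the genome is a set of rewrite rules, repeated expansion yields a body-plan description with regular, repeated substructure, and a single mutation to one rule would change every copy of that substructure at once, which is one intuition behind why such encodings are more evolvable.

```python
# Hypothetical genome: a rewrite grammar over body-plan symbols.
genome = {
    "B": "S L L",  # a body segment carries two limbs
    "L": "J A",    # a limb is a joint plus an actuator
}

def express(symbol, rules, depth=3):
    """Recursively rewrite symbols into a flat body-plan description."""
    if depth == 0 or symbol not in rules:
        return [symbol]
    parts = []
    for s in rules[symbol].split():
        parts.extend(express(s, rules, depth - 1))
    return parts

body = express("B", genome)
print(body)  # -> ['S', 'J', 'A', 'J', 'A']: both limbs share one rule
```

Because both limbs are generated from the single rule for "L", the phenotype is regular by construction, in contrast to a direct encoding in which each joint and actuator would be specified, and mutated, independently.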
Similarly, when an evolutionary algorithm biased toward producing regular patterns was used to evolve artificial neural networks for robots, it was found, again, that such networks discover desired behaviors more rapidly than networks evolved by methods that do not generate such regularity. 6 Auerbach and Bongard 1
have expanded the reach of this evolutionary algorithm to shape robot body
plans as well.
Despite these recent advances, little
is known about how to design evolutionary algorithms that reorganize
genetic representations to maximize
evolvability and thus automatically
generate adaptive complex machines
in a reasonable amount of time.
Fitness function design. The original and continuing goal of evolutionary robotics is to make as few assumptions as possible about the final form of the robot or the kind of behavior that should be generated. However, designing a
fitness function that rapidly discovers desirable solutions without biasing it toward particular solutions is
notoriously difficult. For this reason
there have been efforts in the field to
eliminate the usage of a fitness function altogether. One recent example
is novelty search, which begins with
simple candidate solutions and gradually creates more complex solutions as
optimization proceeds. 17 The fitness of
any given solution is simply how much