What is the Niched Pareto Genetic Algorithm?

The niched Pareto genetic algorithm (NPGA) is a variation of the genetic algorithm (GA) created specifically for multi-objective optimization challenges.

In a typical GA, a population of candidate solutions (individuals or chromosomes) evolves over generations via selection, crossover, and mutation. To determine fitness, each individual is evaluated with a single objective function; this fitness function guides the evolutionary process by determining which solutions are selected for reproduction and which are discarded.
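
For contrast, here is a minimal sketch of single-objective fitness evaluation in a classic GA. The OneMax bit-counting objective, the bit-string encoding, and the tiny population size are illustrative assumptions, not part of NPGA itself:

import random

# Illustrative single-objective fitness: count the 1s in a bit string (OneMax)
def fitness(individual):
    return sum(individual)

# A small random population of bit-string individuals
population = [[random.randint(0, 1) for _ in range(10)] for _ in range(6)]

# Each individual gets a single scalar score, which drives selection
scores = [fitness(ind) for ind in population]
print(scores)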

However, in multi-objective optimization problems, several conflicting objectives must be optimized simultaneously. The goal is to identify solutions representing a trade-off between these objectives, known as Pareto-optimal solutions.
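
To make the idea of Pareto optimality concrete, here is a minimal sketch of a Pareto dominance check, assuming both objectives are minimized; the dominates helper name and the sample objective vectors are illustrative:

# A solution a dominates b if it is no worse in every objective
# and strictly better in at least one (minimization assumed)
def dominates(a, b):
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

print(dominates((1, 4), (2, 5)))  # True: (1, 4) is better in both objectives
print(dominates((1, 4), (2, 3)))  # False: a trade-off, so neither solution dominates

Solutions that no other candidate dominates make up the Pareto-optimal front.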

The NPGA improves on the classic GA by providing techniques that retain solution diversity while also promoting convergence toward the Pareto-optimal front. It accomplishes this through niching techniques, which preserve distinct solutions by penalizing similar ones.
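
One widely used niching technique is fitness sharing, sketched below. The niche radius sigma_share, the one-dimensional sample population, and the helper names (sharing, niche_count) are illustrative assumptions:

# Fitness sharing: solutions closer than sigma_share add to each other's niche count
def sharing(distance, sigma_share):
    return 1 - distance / sigma_share if distance < sigma_share else 0.0

def niche_count(i, population, sigma_share):
    return sum(sharing(abs(population[i] - other), sigma_share) for other in population)

population = [0.10, 0.15, 0.20, 2.0]
counts = [niche_count(i, population, sigma_share=0.5) for i in range(len(population))]
print(counts)  # Crowded solutions get larger niche counts and are penalized more

Dividing a solution's fitness by its niche count lowers the selection pressure in crowded regions, which is what preserves diversity.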

Steps of NPGA

The niched Pareto genetic algorithm typically involves the following steps:

  1. Initialization: Generate an initial population of candidate solutions.

  2. Evaluation: Evaluate the fitness of each solution based on multiple objective functions.

  3. Selection: Choose individuals for reproduction based on both their fitness and their diversity. In NPGA, this is done with Pareto domination tournaments, with niche counts breaking ties (see the sketch after this list).

  4. Crossover: Create offspring solutions by combining genetic information from selected parents.

  5. Mutation: Introduce random changes to the offspring solutions to maintain diversity.

  6. Niching: Apply a niching technique such as fitness sharing, which penalizes solutions in crowded regions of the objective space so that a diverse population of solutions is maintained.

  7. Termination: Repeat steps 3-6 for a certain number of generations or until a termination condition is met.
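
The selection step is what makes NPGA distinctive: two candidates compete in a Pareto domination tournament against a randomly drawn comparison set, and niche counts break ties so that less crowded solutions are preferred. Below is a minimal sketch of this mechanism, assuming minimized objectives; the helper names (dominates, pareto_tournament), the comparison-set size t_dom, and the sample data are illustrative assumptions:

import random

# Pareto dominance for minimization objectives
def dominates(a, b):
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

# NPGA-style Pareto domination tournament selection
def pareto_tournament(population, objectives, niche_counts, t_dom=10):
    # Pick two competing candidates and a random comparison set
    i, j = random.sample(range(len(population)), 2)
    comparison = random.sample(range(len(population)), min(t_dom, len(population)))
    i_dominated = any(dominates(objectives[k], objectives[i]) for k in comparison)
    j_dominated = any(dominates(objectives[k], objectives[j]) for k in comparison)
    if i_dominated and not j_dominated:
        return population[j]
    if j_dominated and not i_dominated:
        return population[i]
    # Tie (both or neither dominated): prefer the candidate in the less crowded niche
    return population[i] if niche_counts[i] <= niche_counts[j] else population[j]

# Example usage with made-up data based on the Schaffer objectives
population = [0.0, 0.5, 1.0, 1.5, 2.0, 3.0]
objectives = [(x ** 2, (x - 2) ** 2) for x in population]
niche_counts = [1.0] * len(population)
print(pareto_tournament(population, objectives, niche_counts, t_dom=3))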

Here’s an illustrative representation of the niched Pareto genetic algorithm:

Flowchart for NPGA

The niched Pareto genetic algorithm searches for solutions representing the Pareto-optimal front, giving decision-makers a choice of trade-off solutions in multi-objective optimization situations.

Example code

The code below implements a simplified, NPGA-inspired genetic algorithm for a multi-objective optimization problem: each generation keeps the non-dominated solutions and refills the population with their offspring, returning an approximation of the Pareto front as the final output. It uses the Schaffer function for fitness evaluation and refines solutions over generations with crossover and mutation operations.

import random

# Define the Schaffer function (two objectives to minimize)
def schaffer(x):
    return (x ** 2, (x - 2) ** 2)

# Define the fitness function for a solution
def fitness(solution):
    return schaffer(solution)

# Generate an initial population of scalar solutions
def generate_population(size, lower_bound, upper_bound):
    return [random.uniform(lower_bound, upper_bound) for _ in range(size)]

# Perform crossover between two solutions (arithmetic average)
def crossover(solution1, solution2):
    return (solution1 + solution2) / 2

# Perform mutation on a solution, keeping it within the bounds
def mutate(solution, mutation_rate, lower_bound, upper_bound):
    mutated = solution + random.uniform(-mutation_rate, mutation_rate)
    return max(lower_bound, min(upper_bound, mutated))

# NPGA
def npga(population_size, generations, mutation_rate, lower_bound, upper_bound):
    population = generate_population(population_size, lower_bound, upper_bound)
    for _ in range(generations):
        next_population = []
        fits = [fitness(s) for s in population]
        for i, solution in enumerate(population):
            fit = fits[i]
            # A solution is dominated if another solution is at least as good
            # in both objectives and strictly better in at least one
            dominated = any(
                f[0] <= fit[0] and f[1] <= fit[1] and f != fit
                for j, f in enumerate(fits) if j != i
            )
            # If the solution is not dominated, keep it for the next population
            if not dominated:
                next_population.append(solution)
        # Fill the rest of the population via crossover and mutation
        while len(next_population) < population_size:
            solution1 = random.choice(next_population)
            solution2 = random.choice(next_population)
            offspring = crossover(solution1, solution2)
            offspring = mutate(offspring, mutation_rate, lower_bound, upper_bound)
            next_population.append(offspring)
        # Replace the population with the next population
        population = next_population
    return population

# Define parameters
population_size = 50
generations = 100
mutation_rate = 0.1
lower_bound = -10
upper_bound = 10

# Run NPGA
result = npga(population_size, generations, mutation_rate, lower_bound, upper_bound)
print("Final solutions:")
for solution in result:
    print(fitness(solution))

Code explanation

Here’s an explanation of the code for better understanding.

  • Line 1: We import the random module, which is used to generate the initial population, pick crossover partners, and apply mutation.

  • Lines 3–5: We define the Schaffer function, a common test function for multi-objective optimization, which returns a tuple of two objective values to be minimized.

  • Lines 7–22: We define the fitness function, which evaluates a solution with the Schaffer function; generate_population, which creates an initial random population within the given bounds; crossover, which averages two parent solutions; and mutate, which perturbs a solution by a random amount within the mutation rate and clips the result back into the bounds.

  • Lines 24–50: We define the npga function, which initializes the population with generate_population and then, in every generation, keeps the non-dominated solutions and refills the population with offspring of those survivors produced by crossover and mutation.

  • Lines 52–63: We define the parameters (population size, number of generations, mutation rate, and the lower/upper bounds for solutions), run the NPGA with these parameters, and print the fitness values of the final solutions, i.e., the approximation of the Pareto front.

Test yourself

Here’s a short quiz to test your understanding.

Niched Pareto genetic algorithm (NPGA)

1. What is NPGA designed for?

   A) Single-objective optimization
   B) Multi-objective optimization
   C) Constraint optimization
   D) Dynamic optimization


Conclusion

In conclusion, the niched Pareto genetic algorithm is a powerful tool tailored to address multi-objective optimization challenges. Unlike traditional genetic algorithms that rely on a single objective function, NPGA embraces the complexity of multi-objective problems by seeking solutions representing a trade-off between conflicting goals, known as Pareto-optimal solutions. NPGA empowers decision-makers with a comprehensive set of trade-off solutions, enabling informed and balanced decision-making across a spectrum of objectives.

