Class Activations

java.lang.Object
net.bmahe.genetics4j.neat.Activations

public class Activations extends Object
Utility class providing common activation functions for NEAT (NeuroEvolution of Augmenting Topologies) neural networks.

Activations contains a collection of mathematical functions commonly used as activation functions in neural networks. These functions transform the weighted sum of inputs at each node into the node's output value, introducing non-linearity that enables neural networks to learn complex patterns.

Available activation functions:

  • Linear: Simple linear transformation with slope and bias parameters
  • Sigmoid: Logistic function providing smooth transition between 0 and 1
  • Hyperbolic tangent: Smooth function providing output between -1 and 1
  • Identity: Pass-through function for linear networks
  • NEAT paper: Sigmoid with steepness 4.9, as used in the original NEAT paper

Function variations:

  • Float versions: Optimized for float precision networks
  • Double versions: Higher precision for sensitive applications
  • Parameterized versions: Customizable function parameters
  • Pre-configured versions: Common parameter combinations

Common usage patterns:


 // Use standard sigmoid activation (pre-defined field)
 FeedForwardNetwork network = new FeedForwardNetwork(
     inputNodes, outputNodes, connections, Activations.sigmoid
 );
 
 // Custom sigmoid with different steepness
 Function<Double, Double> customSigmoid = Activations.sigmoid(2.0);
 FeedForwardNetwork steeperNetwork = new FeedForwardNetwork(
     inputNodes, outputNodes, connections, customSigmoid
 );
 
 // Hyperbolic tangent for outputs in the (-1, 1) range
 FeedForwardNetwork tanhNetwork = new FeedForwardNetwork(
     inputNodes, outputNodes, connections, Activations.tanh
 );
 
 // Identity (linear pass-through) for regression problems
 FeedForwardNetwork linearNetwork = new FeedForwardNetwork(
     inputNodes, outputNodes, connections, Activations.identity
 );
 
 // Float versions for memory efficiency
 Function<Float, Float> floatSigmoid = Activations.sigmoidFloat;
 

Activation function characteristics:

  • Sigmoid: Smooth, bounded [0,1], good for binary classification
  • Tanh: Smooth, bounded [-1,1], zero-centered, often preferred over sigmoid
  • Linear: Unbounded, preserves gradients, suitable for regression
  • Identity: No transformation, useful for pass-through connections
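These ranges can be illustrated with a small self-contained sketch. The lambdas below re-implement the formulas documented on this page with plain `java.util.function.Function` values; they are illustrative stand-ins, not the genetics4j implementations:

```java
import java.util.function.Function;

// Illustrative re-implementations of the documented formulas
// (not the genetics4j library code).
public class ActivationRanges {
    static final Function<Double, Double> sigmoid = x -> 1.0 / (1.0 + Math.exp(-x));
    static final Function<Double, Double> tanh = Math::tanh;
    static final Function<Double, Double> identity = x -> x;

    public static void main(String[] args) {
        // Sigmoid is bounded in (0, 1) and crosses 0.5 at the origin
        System.out.println(sigmoid.apply(0.0));   // 0.5
        // Tanh is bounded in (-1, 1) and zero-centered
        System.out.println(tanh.apply(0.0));      // 0.0
        // Identity passes values through unchanged
        System.out.println(identity.apply(3.5));  // 3.5
    }
}
```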

Performance considerations:

  • Float vs Double: Float versions use less memory and may be faster
  • Function references: Pre-defined functions avoid object creation
  • Mathematical operations: Optimized implementations for common cases
  • Branch prediction: Simple, branch-free functions are easy for CPUs to predict and pipeline

Integration with NEAT evolution:

  • Network evaluation: Applied to hidden and output nodes during forward propagation
  • Fitness computation: Affects network behavior and resulting fitness
  • Gradient flow: Function choice impacts learning and evolution dynamics
  • Problem matching: Different problems benefit from different activation functions
  • Field Details

    • sigmoidFloat

      public static Function<Float,Float> sigmoidFloat
      Standard sigmoid activation function for float values (steepness = 1.0).
    • sigmoid

      public static Function<Double,Double> sigmoid
      Standard sigmoid activation function for double values (steepness = 1.0).
    • identityFloat

      public static Function<Float,Float> identityFloat
      Identity activation function for float values (f(x) = x).
    • identity

      public static Function<Double,Double> identity
      Identity activation function for double values (f(x) = x).
    • tanhFloat

      public static Function<Float,Float> tanhFloat
      Hyperbolic tangent activation function for float values. Output range: (-1, 1).
    • tanh

      public static Function<Double,Double> tanh
      Hyperbolic tangent activation function for double values. Output range: (-1, 1).
    • neatPaperFloat

      public static Function<Float,Float> neatPaperFloat
      Sigmoid activation function with steepness 4.9 as used in the original NEAT paper (float version).
    • neatPaper

      public static Function<Double,Double> neatPaper
      Sigmoid activation function with steepness 4.9 as used in the original NEAT paper (double version).
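To make the effect of steepness 4.9 concrete, the following sketch applies the sigmoid formula documented below for sigmoid(a) with a = 4.9; it mirrors the documented behavior of neatPaper but is not the library's own code:

```java
import java.util.function.Function;

// Sketch of the NEAT-paper sigmoid (steepness 4.9), using the
// documented formula f(x) = 1 / (1 + exp(-a * x)); illustrative only.
public class NeatPaperSigmoid {
    static final Function<Double, Double> neatPaper =
            x -> 1.0 / (1.0 + Math.exp(-4.9 * x));

    public static void main(String[] args) {
        // Steepness 4.9 makes the transition far sharper than the
        // standard sigmoid: inputs near +-1 are already near saturation.
        System.out.println(neatPaper.apply(0.0)); // 0.5
        System.out.println(neatPaper.apply(1.0)); // ~0.9926
    }
}
```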
  • Constructor Details

    • Activations

      private Activations()
      Private constructor preventing instantiation of this utility class.
  • Method Details

    • linearFloat

      public static Function<Float,Float> linearFloat(float a, float b)
      Creates a linear activation function with specified slope and bias for float values.

      The linear function computes: f(x) = a * x + b

      Parameters:
      a - the slope parameter
      b - the bias parameter
      Returns:
      a linear activation function
    • linear

      public static Function<Double,Double> linear(double a, double b)
      Creates a linear activation function with specified slope and bias for double values.

      The linear function computes: f(x) = a * x + b

      Parameters:
      a - the slope parameter
      b - the bias parameter
      Returns:
      a linear activation function
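The documented formula f(x) = a * x + b can be sketched as a plain lambda factory (an illustrative stand-in, not the library source):

```java
import java.util.function.Function;

// Minimal sketch of a parameterized linear activation factory,
// mirroring the documented formula f(x) = a * x + b.
public class LinearActivation {
    static Function<Double, Double> linear(double a, double b) {
        return x -> a * x + b;
    }

    public static void main(String[] args) {
        Function<Double, Double> f = linear(2.0, 1.0); // slope 2, bias 1
        System.out.println(f.apply(3.0)); // 7.0
    }
}
```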
    • sigmoidFloat

      public static Function<Float,Float> sigmoidFloat(float a)
      Creates a sigmoid activation function with specified steepness for float values.

      The sigmoid function computes: f(x) = 1 / (1 + exp(-a * x))

      Output range: (0, 1)

      Parameters:
      a - the steepness parameter (higher values create steeper transitions)
      Returns:
      a sigmoid activation function
    • sigmoid

      public static Function<Double,Double> sigmoid(double a)
      Creates a sigmoid activation function with specified steepness for double values.

      The sigmoid function computes: f(x) = 1 / (1 + exp(-a * x))

      Output range: (0, 1)

      Parameters:
      a - the steepness parameter (higher values create steeper transitions)
      Returns:
      a sigmoid activation function
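The effect of the steepness parameter can be sketched directly from the documented formula; the factory below is an illustrative re-implementation, not the library's code:

```java
import java.util.function.Function;

// Sketch showing how the steepness parameter 'a' changes the sigmoid,
// using the documented formula f(x) = 1 / (1 + exp(-a * x)).
public class SigmoidSteepness {
    static Function<Double, Double> sigmoid(double a) {
        return x -> 1.0 / (1.0 + Math.exp(-a * x));
    }

    public static void main(String[] args) {
        Function<Double, Double> gentle = sigmoid(1.0);
        Function<Double, Double> steep = sigmoid(4.0);
        // At the same input, higher steepness pushes the output
        // further toward saturation.
        System.out.println(gentle.apply(1.0)); // ~0.731
        System.out.println(steep.apply(1.0));  // ~0.982
    }
}
```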