Objective:

Understand the Parameter Optimization framework in nTop. This guide explains what Parameter Optimization is, how it differs from other optimization types, and how to effectively set up a study.

What is happening?

Parameter Optimization is a powerful framework in nTop that helps you find the optimal design parameters for your project. Unlike Field or Topology Optimization, which modify the shape and material layout of a part, Parameter Optimization adjusts scalar inputs (like a wing’s chord length or wingspan) to achieve a specific goal. It automates the process of adjusting your parametric model’s variables to minimize or maximize an objective (such as drag) while satisfying a set of constraints (like minimum lift).
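The loop described above can be sketched in ordinary Python using SciPy. This is an illustrative analogy, not nTop's API: the `drag` and `lift` formulas are invented placeholders standing in for real solver evaluations, and the bounds and constraint values are made up.

```python
# Sketch (assumed example, not nTop's implementation): parameter
# optimization as constrained scalar minimization with SciPy.
from scipy.optimize import minimize

def drag(params):            # objective to minimize (placeholder formula)
    chord, span = params
    return 0.5 * chord**2 + 0.1 * span

def lift(params):            # constrained quantity (placeholder formula)
    chord, span = params
    return chord * span

result = minimize(
    drag,
    x0=[1.0, 8.0],                                   # initial guess
    bounds=[(0.5, 2.0), (4.0, 12.0)],                # lower/upper bounds
    constraints=[{"type": "ineq",                    # enforce lift >= 6.0
                  "fun": lambda p: lift(p) - 6.0}],
)
print(result.x)  # optimizer's best (chord, span)
```

The optimizer adjusts `chord` and `span` within their bounds to minimize drag while keeping lift above the threshold, which mirrors how nTop varies your model's scalar inputs against an objective and constraints.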

Frequently Asked Questions

The key distinction between Parameter Optimization and Topology/Field Optimization is what is being changed:
  • Topology/Field Optimization is generative: it modifies the shape and material layout of a part within a design space. Topology Optimization (TO) typically decides where material should be solid or void, while Field Optimization (FO) can vary material properties or lattice parameters at every point in the design space.
  • Parameter Optimization is not generative. It works on a pre-defined model you have already built. It adjusts the high-level scalar parameters of that model to find the best combination.
For example, TO/FO might create the optimal internal sparse structure of a wing. Parameter Optimization would take that wing design and answer, “What is the optimal chord length to maximize the lift-to-drag ratio while keeping total mass below 50 kg?”
You can define your inputs using three different parameter types. This table explains their roles:
| Parameter Type | What it does | Common uses |
| --- | --- | --- |
| Independent Parameter | Creates a design variable that can change within a specified lower and upper bound. | Defining the primary design variables that you want the optimizer to adjust, such as beam thickness, cell size, or fillet radius. |
| Dependent Parameter | Creates a parameter whose value is calculated by a function that takes other parameters as inputs. | Maintaining specific mathematical relationships between variables, like keeping a constant ratio between two geometric features. |
| Constant Parameter | Creates a fixed value that remains constant throughout the optimization run. | Defining static values that are used in your design but are not part of the optimization study, such as material properties or load values. |
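The three roles can be sketched in plain Python. The names and formulas below are illustrative assumptions, not nTop block names:

```python
# Sketch of the three parameter roles (illustrative, not nTop syntax).

# Constant parameter: a fixed value used by the design but never varied.
youngs_modulus = 210e9  # Pa, e.g. a material property

# Independent parameters: design variables with lower/upper bounds
# that the optimizer is free to adjust.
beam_thickness_bounds = (1.0, 5.0)   # mm
cell_size_bounds = (2.0, 10.0)       # mm

def dependent_fillet_radius(beam_thickness):
    # Dependent parameter: computed from other parameters, here to
    # hold a constant 1:4 ratio with beam thickness.
    return 0.25 * beam_thickness

print(dependent_fillet_radius(4.0))  # 1.0
```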
The Parameter Optimization block offers four algorithms. Choosing the right one depends on the complexity of your problem and your need for speed versus accuracy; for more detail, see How to Select an Algorithm for Parameter Optimization.
  • Grid: Tests every single combination of points on a grid you define. Because it is exhaustive, it becomes extremely slow as you add more parameters. It’s best for exploring a small, simple design space.
  • Global: A smart search that balances exploring new, untested regions (exploration) and optimizing promising areas (exploitation). It’s best for complex problems where you don’t have a good starting guess and want to find the true global optimum.
  • Local: Starts from an initial guess and quickly finds the nearest optimum by building a smooth approximation of the function. It’s very fast and efficient if you already have a good design and just want to refine it.
  • Smooth: Uses gradient information to find the solution. It’s the most efficient for smooth, continuous problems, but can struggle if your design space is noisy or has sharp changes.
The Smooth (LBFGS) algorithm is gradient-based, but it does not calculate a true analytical gradient from your workflow. Instead, it uses a finite difference approximation (first differences) to estimate the gradient. This means it works very well for objective functions that are smooth and continuous, but it may be less efficient or fail to converge if your objective function is noisy, discontinuous, or has sharp changes where the gradient is undefined. The algorithm also scales poorly as the number of parameters grows, because the finite difference computation requires many more function evaluations. The Grid, Global, and Local algorithms do not require gradient information.
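A forward finite-difference gradient estimate can be sketched in a few lines of Python. This is a generic illustration of the technique, not nTop's internals; note that estimating the gradient of k parameters costs k + 1 function evaluations, which is why the approach scales poorly:

```python
# Sketch of a forward finite-difference (first-difference) gradient.
def finite_difference_gradient(f, x, h=1e-6):
    f0 = f(x)                           # 1 evaluation at the base point
    grad = []
    for i in range(len(x)):
        xp = list(x)
        xp[i] += h                      # perturb one parameter at a time
        grad.append((f(xp) - f0) / h)   # slope estimate along axis i
    return grad                         # k + 1 evaluations of f in total

def f(x):
    # Toy smooth objective; true gradient at (2, 1) is [4, 3].
    return x[0] ** 2 + 3 * x[1]

print(finite_difference_gradient(f, [2.0, 1.0]))  # ≈ [4.0, 3.0]
```

On a noisy or discontinuous objective, the `(f(xp) - f0) / h` quotient picks up the noise or jumps instead of a meaningful slope, which is the failure mode described above.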