
Hello, and welcome to a detailed guide to meshing in nTop. We’ll jump straight into an example and then talk about some of the nuances within the inputs. So, we have a heat exchanger on screen. It’s about 3 and 1/2 inches in diameter and about 10 inches high. It’s a TPMS running through the center, and the objective will be to get a high-quality mesh that accurately represents the implicit geometry while also trying to reduce the overall element count as well as the solve time of the blocks.

With any large part, I always encourage people to start with a sample section first. So, in this case, I just have a cylinder running through the main heat exchanger, and I’m going to intersect that with the implicit I want to mesh. I’ll test my input parameters on this sample section first so we don’t have to wait for the entire object to mesh as we refine the inputs. In this case, we’re using the Mesh From Implicit Body block, and because I don’t have any edges to preserve sharpness on, I’ve left the block in its default state, or, in other words, I’ve not accessed the overload. I have a tolerance value of 0.35 mm, which in this case gives me a pretty dense mesh. There are some smaller features in the main hex that I want to capture.
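As an aside, the sample-section trick can be sketched in plain code. For distance-like implicit fields with the "negative inside" convention, a Boolean intersection is just a pointwise max of the two field values, which is conceptually what intersecting the sample cylinder with the part does. This is an illustrative sketch, not the nTop API; the "part" field here is a made-up stand-in sphere.

```python
import math

# Conceptual sketch (not nTop API): intersecting two implicit bodies.
# Convention: field value is negative inside the body, positive outside.

def cylinder(x, y, z, radius, height):
    """Implicit field of a z-aligned cylinder centered at the origin."""
    radial = math.hypot(x, y) - radius   # negative inside the wall
    axial = abs(z) - height / 2.0        # negative between the caps
    return max(radial, axial)

def intersect(f, g):
    """Boolean intersection of two implicit fields is the pointwise max."""
    return lambda x, y, z: max(f(x, y, z), g(x, y, z))

# Hypothetical "part" field: a sphere of radius 5 standing in for the
# full heat exchanger.
part = lambda x, y, z: (x * x + y * y + z * z) ** 0.5 - 5.0
sample = intersect(part, lambda x, y, z: cylinder(x, y, z, 1.0, 4.0))

print(sample(0.0, 0.0, 0.0))  # inside both bodies -> negative
print(sample(3.0, 0.0, 0.0))  # outside the sample cylinder -> positive
```

Meshing trials then run on the small `sample` field rather than the full part.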

So, once you have the appropriate Mesh From Implicit Body tolerance, we can move to the Remesh Surface block or, subsequently, the Simplify Mesh by Threshold. In this case, I went straight to Remesh Surface, and you’ll notice I follow the same logic as in the quick guide to meshing. The chord height and the minimum edge length are a function of the tolerance, at 2% and 33% respectively. Growth rate I dropped to 1.2. The rest of the inputs I’ve left as default, and overall this has given me a pretty good mesh that captures the geometric fidelity we want while reducing the file size. So, we’ve gone from about 3 million elements in the Mesh From Implicit Body down to about 900,000.
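The bookkeeping described here, deriving chord height and minimum edge length from the Mesh From Implicit Body tolerance, can be summarized in a few lines of plain Python. This is illustrative only; inside nTop the same wiring is done with variables and Multiply blocks. The 2% and 33% factors and the 1.2 growth rate are the values used in this video.

```python
# Derived Remesh Surface inputs, keyed off a single tolerance value, so
# changing the tolerance propagates through the whole meshing chain.

def meshing_inputs(tolerance_mm, chord_pct=0.02, min_edge_pct=0.33,
                   growth_rate=1.2):
    """Return the derived meshing inputs for a given tolerance (mm)."""
    return {
        "tolerance": tolerance_mm,
        "chord_height": tolerance_mm * chord_pct,       # 2% rule
        "min_edge_length": tolerance_mm * min_edge_pct, # 33% rule
        "growth_rate": growth_rate,
    }

inputs = meshing_inputs(0.35)  # the 0.35 mm tolerance from the example
print(inputs["chord_height"])
print(inputs["min_edge_length"])
```

Updating only the tolerance then updates every downstream input, which is the point of making these percentages rather than hard-coded values.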

Taking these same input parameters now, we can move straight to the full-size heat exchanger and have a lot more confidence in the results we’re going to get. Starting with the initial mesh, in this case, I am concerned with sharpening some of the edges, primarily down at the bottom where I want to define my boundary conditions. So, in this case, I just have a thin cylinder intersecting those bottom feet. Same tolerance input, and sharpen iterations set to one. It’s going to give me a very dense mesh, as we saw before. We can zoom in, and as we do so, the elements begin to render, and we’re at 52 million faces, so a lot denser than we ultimately need it to be.

In this example, I went straight from the implicit body to the Remesh Surface. Those same input parameters from before; if we hit I to isolate the remeshed part, again, all we’re doing is reducing the overall element count while preserving the fidelity, the curvature that we want to capture, whether it’s for analysis or manufacturing purposes. Alternatively, we could have also used the Simplify Mesh by Threshold. If I press S where my cursor is in the notebook, I can search for blocks. So, if I search for Simplify Mesh by Threshold, we’ll put manual run mode on so it doesn’t automatically start. We would use the same logic with the chord height, and this could either be a mesh that we send to slicing, or into the Remesh Surface block, or any downstream CAD tool that remeshes.

With this larger example out of the way, we’ll get into the nuances of each of the inputs. The file and part we’ll focus on is the heat sink example from the quick guide to meshing final part. As an implicit, it’s the green object on screen, and we want to preserve the edges that are intersected by the purple rectangles.

I’m not going to go through every single meshing block that I have in this notebook; there are quite a few. But I am going to touch on each of the inputs and some of the most influential parameters. What I did was go through and characterize each of them against one another, to try and reduce the total element count as well as the solve time. Looking at the plots, the blue bars in the charts are from January 2022, and all the red plots are from October 2023. So right away, you’ll see that there has been a meshing speed-up, and it’s only going to get better.

I’ll pop between the file and the meshing data that we see here to talk through some of the nuances. But ultimately, this is what I’m using to help decide and characterize which inputs to focus on. This file will be included so you can walk through it yourself or follow along.

The first block we’re going to look at is Mesh From Implicit Body, and the default way that the block comes in is this first one that we see here. Any block that has this down arrow has an overload, and that’s how we can access the different inputs related to certain blocks. The default version of this block works pretty well as far as discretizing an implicit body. There are only two inputs that you need to satisfy: the object you want to mesh and a tolerance value. This tolerance value is the allowable surface deviation that the mesh can take while meshing the implicit body.
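To make "allowable surface deviation" concrete, here is a minimal illustration (not nTop internals): for a distance-like implicit field `f`, the magnitude `|f(p)|` approximates how far a point `p` sits from the true surface, so a mesh vertex is within tolerance when that magnitude is at most the tolerance value.

```python
# Illustrative tolerance check against a distance-like implicit field.

def within_tolerance(f, point, tol):
    """True if the point lies within 'tol' of the zero level set of f."""
    return abs(f(*point)) <= tol

# Unit sphere: f < 0 inside, f > 0 outside, f == 0 on the surface.
sphere = lambda x, y, z: (x * x + y * y + z * z) ** 0.5 - 1.0

print(within_tolerance(sphere, (1.0002, 0.0, 0.0), 0.001))  # True
print(within_tolerance(sphere, (1.01, 0.0, 0.0), 0.001))    # False
```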

The first thing that this block does is voxelize, and then we stitch together a surface mesh from the voxelization. The other inputs, the Min Feature Size, Sharpen, and Simplify, are optional. If the Min Feature Size is defined, the block is going to ignore any implicit features that are below that value. The Sharpen and Simplify options I never use, honestly. In this case, if I click Sharpen, it’s going to go through the entire implicit body and try to sharpen edges everywhere. In most cases, we don’t need to do that. We only want edges to be sharp in specific regions, and in a lot of scenarios, when we go to print these structures, we’re not going to get sharp edges anyway, so there’s no need to spend the time meshing or increasing the risk of self-intersections while doing so.
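A rough sketch of that first voxelization stage (the actual stitching algorithm is internal to nTop; this 2D toy only shows the idea): sample the implicit on a grid and flag the cells whose corner values change sign, since those are the cells the surface passes through and where surface elements would be stitched. It also shows why a tighter tolerance, which implies a finer grid, inflates the element count.

```python
# Count grid cells straddling the zero level set of a 2D implicit field.

def surface_cells(f, lo, hi, n):
    """Number of n-by-n grid cells that the surface of f crosses."""
    step = (hi - lo) / n
    count = 0
    for i in range(n):
        for j in range(n):
            corners = [f(lo + (i + di) * step, lo + (j + dj) * step)
                       for di in (0, 1) for dj in (0, 1)]
            if min(corners) < 0.0 <= max(corners):  # sign change in cell
                count += 1
    return count

circle = lambda x, y: (x * x + y * y) ** 0.5 - 1.0  # unit circle

# Halving the voxel size roughly doubles the number of surface cells,
# which is why a tight tolerance inflates the element count.
print(surface_cells(circle, -1.5, 1.5, 20) <
      surface_cells(circle, -1.5, 1.5, 40))  # True
```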

You’ll also notice a Simplify option. Again, I never use this. It will forcibly decimate the mesh by around 90%, without regard to geometric fidelity. In most cases, when we’re trying to mesh an object with a complex lattice structure, the maximum number of elements is going to be on the struts or the lattice itself. That’s also where we want to preserve our fidelity, so that we print a part that is indicative of how we designed it. I’ll show you that when we select this option, we lose that fidelity.

So this first run has a Mesh From Implicit Body tolerance that is about 30% of my minimum feature size, and overall, it captures the curvature of the structure pretty well. But again, it gives us an overly dense mesh. I also have the Sharpen option selected, and in this case, it did a pretty good job of sharpening all of the edges without causing any self-intersections. The next run is the same block, except Sharpen has not been selected.

Hopping over to the charts and looking towards the bottom left, we can see a drastic time difference between the two blocks when we choose to sharpen all edges versus not sharpening. The next few blocks have the overload expanded, so in this case, what I want to do is specifically define the regions whose sharpness I want to preserve and the number of iterations I want the block to take while trying to preserve those edges. Looking back over at the charts, these next three are those cases. So right away, we can see that when we choose to locally sharpen, compared to sharpening all edges, we have a pretty large decrease in the overall solve time. The only difference between these is the number of iterations we take in solving the edges. In most cases, I only ever hover between a value of one and three. For reference, the Sharpen option in run one only has one iteration.

If you do want to try and sharpen all edges, I still encourage you to access the overload. You can leave the sharpen extents blank, in which case it’ll work to sharpen all edges. But now, you can control the number of iterations that you want to work with.

Real quick, let’s look at what Simplify does. So this run has the overload accessed, three iterations, and we’re discretely sharpening our edges. Looking at the mesh, we have a significantly coarsened mesh, but it’s pretty clear that we’re not preserving the fidelity that we might want. We’ve lost the curvature of the TPMS itself. So, when we go to print this, or maybe we want to do some type of analysis and define contact between the current surface and, let’s say, the fluid domain, we would have pretty significant interference between the two.

So again, I never use the Simplify option. Instead, I use some additional utility blocks, which we’ll move on to next. There are two main utility blocks: Simplify Mesh by Amount and Simplify Mesh by Threshold. Simplify Mesh by Amount is going to do exactly what it says: it’s going to reduce the element count by a specific percentage. In this case, I’ve defined 50%. In some cases, this block can do a pretty good job. But because it forcibly simplifies based on a percentage value, I’d rather use a block that simplifies based on the accuracy of my geometry, which is Simplify Mesh by Threshold.
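The difference between the two strategies can be seen on a toy 2D polyline (illustrative only, not nTop’s actual algorithms): a percentage-based simplify drops points uniformly regardless of shape, while a threshold-based simplify drops a point only when doing so moves the curve less than an allowable deviation, so a sharp corner survives threshold-based simplification.

```python
# Toy contrast: simplify-by-amount vs simplify-by-threshold on a polyline.

def point_to_segment(p, a, b):
    """Distance from point p to segment a-b."""
    ax, ay = b[0] - a[0], b[1] - a[1]
    px, py = p[0] - a[0], p[1] - a[1]
    denom = ax * ax + ay * ay
    t = max(0.0, min(1.0, (px * ax + py * ay) / denom)) if denom else 0.0
    dx, dy = px - t * ax, py - t * ay
    return (dx * dx + dy * dy) ** 0.5

def simplify_by_amount(points, fraction):
    """Drop points uniformly until 'fraction' of them are removed."""
    keep = max(2, round(len(points) * (1.0 - fraction)))
    stride = (len(points) - 1) / (keep - 1)
    return [points[round(i * stride)] for i in range(keep)]

def simplify_by_threshold(points, threshold):
    """Drop an interior point only if the curve deviates < threshold."""
    out = [points[0]]
    for i in range(1, len(points) - 1):
        if point_to_segment(points[i], out[-1], points[i + 1]) >= threshold:
            out.append(points[i])
    out.append(points[-1])
    return out

# An L-shaped polyline: flat run along x, sharp corner at (5, 0), run up y.
pts = [(x, 0.0) for x in range(6)] + [(5.0, float(y)) for y in range(1, 6)]
corner = (5, 0.0)
print(corner in simplify_by_threshold(pts, 0.01))  # True: corner preserved
print(corner in simplify_by_amount(pts, 0.5))      # False: corner dropped
```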

The threshold input is akin to chord height, or in other words, the allowable surface deviation that the mesh can take while coarsening the object. What I’ve discovered, and what I tend to do, is make the threshold input a function of my initial meshing tolerance. So, I’ll insert a Multiply block, use that value, and take about 2% of that input as the allowable surface deviation. That way, if I change my minimum feature size, all I really have to do is update my Mesh From Implicit Body tolerance input, and that’s going to propagate through the rest of my meshing inputs. This value seems to work pretty well. It maintains good geometric fidelity while coarsening the mesh enough for either sending it to a slicing tool or to external CAE software for remeshing and for validation and verification in those tools.

In some instances, this is a pretty good place to stop. However, if we want to run some type of analysis in nTop, or maybe our external tools don’t have the capability of meshing, let’s say, an STL, the next step we’ll have to take is a Remesh Surface, where we put a fair-quality mesh on our body. It says it in the name: we’re remeshing the surface, and therefore the total number of incoming elements is going to have a pretty large impact on the overall solve time of this block.

If we move to the meshing data and focus on this middle table and chart, the furthest left columns are the Mesh From Implicit Body blocks. The other columns are the solve time. The left axis is the solve time of the Remesh Surface block, and the green line is the total element count, which is on the right axis here. So, right away, we can see that the block takes significantly longer to run than the Mesh From Implicit Body. But based on the inputs that we choose, the solve time is going to vary distinctly as well.

Moving back to the file, we’ll talk about the various inputs. I’m not going to go into heavy detail on the specific values just yet. We’ll jump straight down to my initial approach and preferred methods. We’ll focus first on the Remesh Surface. So, in this case, our input mesh is run three of the Mesh From Implicit Body, which has the overload option selected. So, we’re defining certain regions that we want to preserve sharpness on, and we only have one iteration defined for the edges.

Looking at the Remesh Surface and the inputs, let’s just run down the list. You’ll notice that I have an edge length of 10 mm. When we compare that to the ruler at the bottom, that’s a pretty large input compared to the overall size of my part, my minimum feature size, the cell size of my lattice, etc. Essentially, what I’ve started to do is let my edge length be my global maximum for this block. My shape is defined as triangle. At the moment, we don’t support any quad-type elements in our analysis, so I want to encourage you to stay with triangular meshes. Span angle I leave at the default of 30. Growth rate comes in at a value of 2; I drop it down to 1.2, which is industry standard. Feature angle I also leave at 45, the default value.

The next two inputs are where I’ve spent a lot of time and focus trying to reduce solve time and element count. Essentially, what I’m trying to do is go through the pain of figuring this stuff out so you don’t have to. So, real quick, let’s look at the difference between a couple of Remesh Surface blocks. Let’s just take run 4-1 here. This Remesh Surface only has a growth rate defined and a pretty small edge length of 1 mm. The rest of the inputs I’ve left default. If we do a top-down view, we can see that the bolt hole in this case has lost its circular feature, and we’re also not preserving the fidelity or curvature of the TPMS structure as well as we’d like.

Now, looking at the span angle and feature angle inputs, we could decrease these to preserve both the circular nature of the bolt hole and the curvature of the TPMS. But in the studies I did, referencing the charts being included, these inputs had a pretty large impact on solve time and were pretty specific to the mesh criteria they were working on; each study focused only on span angle or only on feature angle. For example, coming back down to the Remesh Surface from run three, you’ll notice that the span angle and feature angle are at 45. If I isolate this mesh and we look at a top-down view, we’ve preserved the circular nature of the bolt hole much better, as well as the curved surfaces of the TPMS structure. With the span angle and feature angle again at their default inputs, we’re able to do this by defining a chord height. I’m going to go ahead and make this chord height a variable; I’m just going to call it 2% of Mesh From Implicit Body tolerance. The chord height, as I mentioned previously, is the allowable surface deviation that the mesh can take while remeshing. So, it’s going to read in the mesh from run three, look at each of the elements, find its centroid, essentially, and while it goes through and decimates and remeshes, it’s going to measure where that centroid changes from its original location and either remesh or keep that element as it was.
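That acceptance test can be sketched as follows (a simplified reading of the description above, not nTop’s actual remesher): compare an element’s centroid before and after a candidate coarsening step, and accept the step only if the centroid moved less than the chord height.

```python
# Chord-height acceptance check for a candidate coarsening step.

def centroid(tri):
    """Centroid of a triangle given as three (x, y, z) vertices."""
    return tuple(sum(v[i] for v in tri) / 3.0 for i in range(3))

def within_chord_height(original_tri, coarsened_tri, chord_height):
    """True if the centroid moved no more than the chord height."""
    c0, c1 = centroid(original_tri), centroid(coarsened_tri)
    shift = sum((a - b) ** 2 for a, b in zip(c0, c1)) ** 0.5
    return shift <= chord_height

tol = 0.35          # Mesh From Implicit Body tolerance (mm), from above
chord = 0.02 * tol  # the 2% rule -> 0.007 mm allowable deviation

tri = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
nudged = [(0.0, 0.0, 0.003), (1.0, 0.0, 0.003), (0.0, 1.0, 0.003)]
print(within_chord_height(tri, nudged, chord))  # True: shift is 0.003 mm
```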

The next input that I want to focus on is the minimum edge length, and this one was pretty crucial as far as the solve time of this block. So, again, I’m going to go ahead and make this a variable. I’ll call it minimum edge length, and for now, just 33% of Mesh From Implicit Body tolerance.

Coming back up to this original Remesh Surface block, if we open it up, the default way that this block comes in is with a min edge length of zero. In other words, we’re telling this block that the smallest element size it can create while remeshing is zero, which doesn’t really make a lot of sense, because we’ve already discretized in the first step, in the Mesh From Implicit Body. So there’s no real reason to go significantly smaller than the first place we discretized. What I tend to do now is start with a value that is about 33% of the Mesh From Implicit Body tolerance. Depending on the quality of the mesh that I get, or the number of elements, I’ll either increase or decrease this value.

So most of the time, this is what my Remesh Surface block will look like. The edge length will vary based on the overall size of my part and what I want my global max mesh size to be, but otherwise, this is what my Remesh Surface block will usually look like.

So real quick, let’s talk about the chord height and this 2% input value. What I discovered was that 2% seems to be a sweet spot. I’m just going to take a snippet of the screen. What I ended up looking at was, on the x-axis, chord height as a percentage of Mesh From Implicit Body tolerance. I went from 1% all the way up to 10%. On the right axis, I plotted element count, and on the left axis, I plotted time. For the most part, the element count was pretty consistent throughout each of the input parameters that I ran. What changed quite a bit, however, was the solve time.
At 1% and at about 10%, the block took the most time to solve, with a sweet spot from about 2% to 4% of the Mesh From Implicit Body tolerance. So that’s more or less the reason why I start at 2%. It gives me the most fidelity with the least amount of time.

Looking a little deeper at my initial approach and preferred methods, you’ll see there are a couple of additional blocks: a Simplify Mesh by Threshold, which we’ve talked about, and a Remesh Surface. As I mentioned, the time the Remesh Surface block takes to run is going to be directly correlated to the number of incoming elements. So if we look at the total elements coming in from this Mesh From Implicit Body, we have 1.9 million elements. The Simplify Mesh by Threshold has reduced that down to about 350,000 elements, and that’s what’s being fed into this Remesh Surface block.

With the same input parameters as above, we can go back to the charts, and scrolling over to the far right, we have my initial approach and preferred methods bar chart. The column on the left is the Mesh From Implicit Body, the overload with extents defined and one iteration. The next one is taking that same Mesh From Implicit Body and putting it straight into the Remesh Surface. And the next two plots are of the Simplify Mesh by Threshold and that being pushed into the Remesh Surface. In this example, it’s actually quicker to do a Simplify Mesh by Threshold and a Remesh Surface than going straight into the Remesh Surface block. So if I were to do multiple design iterations of this heat sink, I would choose to Simplify Mesh by Threshold first and then remesh. In some cases, I’ve seen it be close enough, or take a little bit longer, than just going straight into the Remesh Surface block.
So at some point there will be a trade-off between using a utility meshing step versus going straight to a Remesh Surface.

One last thing I want to point out quickly is this chart here at the bottom. The furthest right bar is looking at this bottom row, where we have a Simplify Mesh by Threshold being fed into a Remesh Surface. We’ve defined a large edge length, a minimum edge length of 33% of the Mesh From Implicit Body tolerance, and the same with the chord height. Of all of the Remesh Surface blocks that I ran, this ultimately was the fastest while also preserving the fidelity of the model.
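For anyone wanting to rerun the chord-height study on their own parts, the run matrix is straightforward to generate. The remesh itself happens in nTop; this sketch only enumerates the sweep from 1% to 10% of tolerance, matching the study described above.

```python
# Generate the chord-height run matrix for a sweep study.

tolerance = 0.35  # mm, the Mesh From Implicit Body tolerance from earlier

runs = [{"pct": pct, "chord_height": tolerance * pct / 100.0}
        for pct in range(1, 11)]

for run in runs:
    print(f"{run['pct']:>3}% -> chord height {run['chord_height']:.4f} mm")

# The study found a sweet spot around 2-4%, which is why 2% is the
# starting value used throughout this guide.
```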
So we’ll jump back to our notebook, and I’ll quickly talk about the last file that’s going to be included in this meshing course. I’ve created a Mesh Input Suggestions file. We’ll strive to keep this as updated as possible, with various revision dates. There are a handful of links here that you can use to explore different meshing techniques or that will take you to various articles. But the gist of it is that each of these sections goes into pretty good detail as to why I choose certain inputs and the nuances within. So it’s basically, in writing, what we verbalized throughout this video. There are a handful of additional sections, which go into some volume meshing techniques, exporting surface meshes for CFD, as well as some additional utility blocks such as Merge Mesh, Split Refine, and Collapse Vertices. So I encourage you to take a look at this, and if you have any questions, feel free to reach out. Otherwise, thank you for listening, and if you made it this far, happy meshing.

Description

This training takes you through advanced meshing techniques. We recommend this information as a guide on how to currently and successfully surface mesh in nTop for virtually any downstream operation.

Prerequisites: nTop Essentials, Guide to Meshing

What is covered:
  • Surface meshing large/complex geometry
  • Characterizing influential mesh input parameters
  • Exploring effects of mesh block order on solve time

Downloadable Files: