Boundary coarsening too aggressive #105
How is the boundary mesh computed? Can you report the total area of the mesh? It may be that pragmatic does not use the boundary IDs during coarsening, but instead falls back to comparing the areas/volumes of the old and new triangulations -- unless this has been fixed. At least I can see that coarsen.h still contains `std::abs(total_new_av-total_old_av)/std::max(total_new_av, total_old_av) > DBL_EPSILON`. Alternatively, the tolerance for the calculation of the boundary mesh should be decreased. |
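For illustration, here is a minimal Python sketch of the kind of global area/volume guard described above, with the hard-coded `DBL_EPSILON` replaced by a user-tunable relative tolerance. The function name and signature are hypothetical, not pragmatic's actual API:

```python
def volume_change_acceptable(total_old_av, total_new_av, rel_tol=1e-6):
    """Accept the coarsened mesh only if the relative change in total
    area/volume stays within rel_tol (instead of machine epsilon)."""
    denom = max(abs(total_new_av), abs(total_old_av))
    if denom == 0.0:
        return True  # both meshes are empty; nothing was lost
    return abs(total_new_av - total_old_av) / denom <= rel_tol
```

With `rel_tol` exposed as a parameter, a user could trade exact area conservation for more aggressive coarsening, which is essentially the experiment suggested later in this thread.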
The boundary mesh is computed using FEniCS, roughly lines 459-468 in adaptivity.py.in. For adapt: Donor mesh area : 3.27806152928198244e+01; for coarsen: @taupalosaurus had an idea on how to fix this. |
The default tolerance in adaptivity.py.in is 30 degrees; I think this explains it. Just use `tol=1` in `detect_colinearity`? |
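To make the tolerance's effect concrete, here is a hypothetical 2D sketch of an angle-based colinearity detector: segments along a closed boundary polyline share a tag until the turn angle between consecutive segments exceeds `tol_deg`. The name `detect_colinearity_sketch` and its signature are illustrative only, not the actual routine in adaptivity.py.in:

```python
import numpy as np

def detect_colinearity_sketch(vertices, tol_deg=30.0):
    """Assign a tag to each segment of a closed 2D boundary polyline;
    start a new tag wherever the turn angle exceeds tol_deg degrees."""
    n = len(vertices)
    tags = np.zeros(n, dtype=int)
    tag = 0
    for i in range(1, n):
        d_prev = vertices[i] - vertices[i - 1]        # incoming segment
        d_next = vertices[(i + 1) % n] - vertices[i]  # outgoing segment
        cosang = np.dot(d_prev, d_next) / (
            np.linalg.norm(d_prev) * np.linalg.norm(d_next))
        angle = np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))
        if angle > tol_deg:
            tag += 1  # sharp corner: new boundary patch
        tags[i] = tag
    return tags
```

Note the trade-off discussed in this thread: with `tol_deg=1`, almost every segment of a curved boundary exceeds the threshold and gets its own tag, which effectively freezes the boundary.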
The problem is not in the boundary tags: even if a large portion of the boundary is marked with the same ID, pragmatic should simply not collapse an element that introduces a large local volume change. If `tol=1` is used, then every element on the boundary is given a different tag, which is equivalent to not coarsening the boundary at all and gives rise to knife elements on the boundary. |
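The per-collapse check suggested above can be sketched geometrically: collapsing a boundary vertex onto the chord between its neighbours removes (or adds) the area of the triangle they span, and that loss can be bounded relative to the local patch area. This is a hypothetical illustration, not pragmatic's actual collapse criterion:

```python
def boundary_collapse_area_loss(v_prev, v, v_next):
    """Area of triangle (v_prev, v, v_next): the 2D area swept when
    boundary vertex v is collapsed onto the chord v_prev--v_next."""
    return 0.5 * abs((v_next[0] - v_prev[0]) * (v[1] - v_prev[1])
                     - (v[0] - v_prev[0]) * (v_next[1] - v_prev[1]))

def collapse_allowed(v_prev, v, v_next, patch_area, max_rel_change=0.01):
    """Reject the collapse if the swept area exceeds a fraction
    max_rel_change of the surrounding patch area."""
    return boundary_collapse_area_loss(v_prev, v, v_next) \
        <= max_rel_change * patch_area
```

A local guard like this would let pragmatic keep a single tag on a curved boundary while still refusing collapses that visibly flatten it.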
I think you are right. Ideally, one would want an exact/analytic description of the boundary to avoid these issues. |
Yes, the problem is they don't have an analytic description of their surface - so, as in issue #15, in the long run we need to try to stick to the local curvature of the surface. At a shorter term, I think playing on the volume tolerance might help - for now it's either no surface coarsening at all, or everything is accepted. I can put a parameter here to at least experiment. |
It would be a better time investment for them to get a surface representation, like level sets, splines, NURBS, etc. |
I am not sure this kind of data really lends itself to CAD: the shape is so irregular that you would end up with very high-degree polynomials, and surface reprojection would be costly, no? |
Segmentation helps with the irregularity, so a coarse grid with a representation in between (almost like spectral elements or NURBS elements). That is how we handle the fault. You could also do least-squares surfaces. Matt |
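The least-squares idea mentioned above can be sketched in one dimension lower: fit a polynomial to sampled boundary points and use it as a cheap local surrogate for a CAD/NURBS description when reprojecting coarsened vertices. Function names are illustrative assumptions, not an existing API:

```python
import numpy as np

def fit_least_squares_curve(points, degree=3):
    """Fit y = p(x) to sampled 2D boundary points by least squares."""
    points = np.asarray(points, dtype=float)
    coeffs = np.polyfit(points[:, 0], points[:, 1], degree)
    return np.poly1d(coeffs)

def project_to_curve(p, x):
    """Vertical projection of abscissa x onto the fitted curve
    (illustrative; a true projection would minimise distance)."""
    return np.array([x, p(x)])
```

A segmented boundary would get one low-degree fit per segment, avoiding the very high-degree global polynomials the previous comment worries about.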
@taupalosaurus No, it is not possible: the tags are set by the `detect_colinearity` routine in adaptivity.py.in. I made sure to avoid using a 0 tag (and I checked the tags: there are no 0 tags). |
If the mesh is coarsened significantly, pragmatic's adapt coarsens the boundary too aggressively, so that entire boundary elements disappear. See the attached pictures: original.png is the original mesh, adapt.png was obtained using adapt, and coarsen.png with coarsen (the behaviour is much worse with coarsen). I apologise, but I am not allowed to share the actual meshes. @taupalosaurus, we talked about a (maybe?) easy fix for this while at Oxford.
Original:
Adapt:
Coarsen: