Multi-domain mesh creation #1332
Comments
OK, after a bit more reading and testing, I think I have it working! There were two key things I needed to change (one in the mesh and one in the partition options):
Final solution:
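In outline, the options end up looking like this (a sketch rather than my verbatim code, assuming the C++ `conduit::blueprint::mpi::mesh::partition` API; the values follow the 12-domain 16x12 setup, and `mesh` is the local multi-domain node):

```cpp
// Sketch: gather all 12 logical 4x4 domains of the 16x12 grid into a
// single output domain. Each domain gets an explicit selection keyed by
// its domain_id; "start"/"end" are local element (cell) indices, and
// "end" is inclusive (see below). The 3-component i/j/k layout is an
// assumption; k stays 0 for this 2D example.
#include <conduit_blueprint_mpi.hpp> // umbrella header for the MPI blueprint API

conduit::Node options;
options["target"] = 1;
for(int dom = 0; dom < 12; dom++)
{
    conduit::Node &sel = options["selections"].append();
    sel["type"]      = "logical";
    sel["domain_id"] = dom;
    const conduit::uint64 start[3] = {0, 0, 0};
    const conduit::uint64 end[3]   = {3, 3, 0}; // inclusive!
    sel["start"].set(start, 3);
    sel["end"].set(end, 3);
}

conduit::Node repart;
conduit::blueprint::mpi::mesh::partition(mesh, options, repart, MPI_COMM_WORLD);
```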
With that, the output was as expected (each process had filled its local data with floating-point values equal to its process id).
I'm glad you got this working. Do you have suggestions on how we can improve the documentation?
The tricky part was realizing that the data belonging to each domain_id has to be selected manually via an array of "selections", rather than just specifying the desired region and letting Conduit determine which domain owns that data. There were no examples in the documentation with multiple selections, so it took some trial and error. Having the code that matches the M:N redistribution figure (where target is 10, 4, and 2) might be helpful.
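For what it's worth, the M:N case appears to be just a different "target" with the same kind of selections; e.g., a sketch (values illustrative, same assumed API as above) going from 12 domains down to 4:

```cpp
// Same selections as in the sketch above; only the number of output
// domains changes.
options["target"] = 4; // 12 domains in, 4 domains out (M:N)
conduit::blueprint::mpi::mesh::partition(mesh, options, repart, MPI_COMM_WORLD);
```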
Well, now I'm running into a different issue with the example above: each process now also holds ghost cells for its neighbors, so the actual local data size on each process is larger than its owned 4x4 block (the overall grid is still 16x12). Accordingly, I updated the "start" and "end" in each selection to account for the desired data sometimes starting one cell to the right or down, and I also updated the "origin/{x,y}" of each domain's coordset. Any ideas why the uniform domain cannot be maintained?
Wait, never mind... I just realized that "end" is inclusive. It didn't matter without the ghost cells, since the selection would get cropped to the data size, but now I was grabbing the ghost cells to the right and below, because I had assumed "end" was exclusive.
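To make the inclusive-"end" convention concrete, a small illustrative fragment (reusing a selection node `sel` as in the sketch above): for an interior domain whose owned 4x4 block sits inside a one-cell ghost layer (6x6 local cells), the owned cells are indices 1 through 4 in each direction.

```cpp
// Select only the owned 4x4 cells of a 6x6 local block (one ghost cell
// on each side). Since "end" is inclusive, the owned range is {1..4};
// an exclusive-style {1..5} would pull in the ghost layer on the
// right/bottom.
const conduit::uint64 start[3] = {1, 1, 0};
const conduit::uint64 end[3]   = {4, 4, 0};
sel["start"].set(start, 3);
sel["end"].set(end, 3);
```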
Conduit Blueprint currently has no notion of ghost cells or nodes, but that support will likely be added in the future.
We should enhance the documentation for partitioning and provide more and better examples.
Hello!
@cyrush can correct me if I'm wrong, but I don't believe MPI is enabled for the Python interface for Blueprint. I'm not sure why that's the case. We should add it.
Your read of the situation is correct; we can add that support.
Hello.
I have data that is distributed amongst N processes and I want to create a blueprint mesh for it. I thought I was doing it correctly, but when I call the partition function, I am getting unexpected results. I'm not sure if I am creating the mesh wrong or calling the partition function wrong.
Any assistance would be appreciated!
Example (12 processes, each owning a 4x4 subregion of an overall 16x12 grid):
Code:
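Something like the following (a simplified sketch rather than the exact code; the field/topology names are illustrative):

```cpp
// Each of the 12 ranks builds one 4x4 (cells) uniform domain of the
// overall 16x12 grid and fills an element field with its rank id.
#include <mpi.h>
#include <conduit.hpp>
#include <conduit_blueprint.hpp>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);
    int rank = 0;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    const int di = rank % 4; // domain column (4 across)
    const int dj = rank / 4; // domain row    (3 down)

    conduit::Node mesh;
    mesh["state/domain_id"] = rank;

    // uniform coordset: 5x5 nodes bound a 4x4 block of cells
    mesh["coordsets/coords/type"]     = "uniform";
    mesh["coordsets/coords/dims/i"]   = 5;
    mesh["coordsets/coords/dims/j"]   = 5;
    mesh["coordsets/coords/origin/x"] = 4.0 * di;
    mesh["coordsets/coords/origin/y"] = 4.0 * dj;

    mesh["topologies/topo/type"]     = "uniform";
    mesh["topologies/topo/coordset"] = "coords";

    // element-centered field holding the owning rank id
    mesh["fields/rank/association"] = "element";
    mesh["fields/rank/topology"]    = "topo";
    mesh["fields/rank/values"].set(conduit::DataType::float64(16));
    conduit::float64 *vals = mesh["fields/rank/values"].value();
    for(int i = 0; i < 16; i++)
    {
        vals[i] = static_cast<conduit::float64>(rank);
    }

    // sanity-check the domain against the blueprint
    conduit::Node info;
    if(!conduit::blueprint::mesh::verify(mesh, info))
    {
        info.print();
    }

    // ... repartition call goes here (see below) ...

    MPI_Finalize();
    return 0;
}
```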
I then want to repartition the mesh to access the whole thing on process 0. So I tried the following:
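Roughly (again a sketch, assuming the MPI-aware partition function takes the mesh, an options node, an output node, and a communicator):

```cpp
// Attempted repartition: request a single output domain, with no
// explicit selections.
conduit::Node options, repart;
options["target"] = 1;
conduit::blueprint::mpi::mesh::partition(mesh, options, repart, MPI_COMM_WORLD);
```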
However, the resulting output mesh still only has size 4x4 and only contains the data from process 0.
As a side note, I am setting "target" to 1 (specifying 1 process), but how do I specify which process (i.e. what if I want it on process 3 instead of process 1)?