Hi all
I am trying to capture an object on a table from all sides.
I put the object on the table, take photos, feed them into 3DF Zephyr, and get the desired result (a dense point cloud).
Then I turn the object upside down, take photos, and feed them into 3DF Zephyr (in a new workspace), and again get the desired result (a dense point cloud).
But now, how can I merge the two scans into one? I assume something like this:
1) Cut away all unwanted data (e.g. the table) in the dense point clouds of the respective captures.
2) Import one of the sparse point clouds into the other workspace. How is this done? Is it also possible to import all the photos and keep their relations to the points of the dense point cloud?
3) Register the point clouds.
Finally, I'd like to export a textured mesh in the highest quality, baking the texture atlas from all photos (from both imports).
Can somebody point me to tutorials on how exactly this should be done? Many thanks!
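For what it's worth, here is my rough understanding of what step 3 does under the hood: registering the two clouds means estimating a rigid transform (rotation plus translation) that maps one onto the other. A minimal sketch with numpy, assuming you already have corresponding point pairs (in Zephyr you would pick these via control points; the Kabsch/SVD method below is a generic illustration, not Zephyr's actual implementation):

```python
import numpy as np

def rigid_transform(src, dst):
    """Estimate rotation R and translation t so that src @ R.T + t ~= dst.

    src, dst: (N, 3) arrays of corresponding 3D points.
    Uses the Kabsch algorithm: centroids, cross-covariance, SVD.
    """
    src_c = src.mean(axis=0)
    dst_c = dst.mean(axis=0)
    # Cross-covariance of the centered point sets
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:  # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = dst_c - R @ src_c
    return R, t

# Quick sanity check with a synthetic rotation + translation
rng = np.random.default_rng(0)
src = rng.random((50, 3))
angle = 0.5
R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                   [np.sin(angle),  np.cos(angle), 0.0],
                   [0.0,            0.0,           1.0]])
t_true = np.array([1.0, 2.0, 3.0])
dst = src @ R_true.T + t_true
R, t = rigid_transform(src, dst)
print(np.allclose(src @ R.T + t, dst, atol=1e-8))
```

In practice the software refines such an initial alignment with ICP over the full dense clouds, which is why cutting away the table first (step 1) matters: leftover table points would pull the alignment toward the wrong surface.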
J
PS: I am using version 5.0 Dragonfruit beta.