Help to Optimize Large Data-set

  • Reuben (3Dfollower)
    #15
    So I am still using the bundle adjustment with 3 control points.

    I have 12 GCPs (GPS-measured on paper targets), but these are not spaced evenly throughout the 4 sparse cloud areas that I have made. I am not sure what the difference between a bundle adjustment and a rigid adjustment is. My understanding is that I can only do a rigid adjustment if I have GCPs with GPS coordinates?

    My workflow going forward is to:
    1) Merge all 4 areas (sparse clouds) with control points and do a bundle adjustment.
    2) Scale the entire sparse cloud with my 12 GCP GPS coordinates.
    3) Generate the dense point cloud.
    4) Generate the mesh.
    5) Generate the textured mesh => export.

    Is this the best way to go forward? Or should I be doing something different with bundle adjustment?

    Comment

    • Andrea Alessi (3Dflow Staff)
      #16
      You can merge using any control points; they don't have to be GCPs with GPS coordinates. But there must be some kind of overlap and correspondence, so that you pick "control point 1" in the images of project 1 and pick the same "control point 1" in project 2.
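
      On the rigid-vs-BA question above: a rigid (similarity) merge only needs matching control points, not GPS. It solves for a single scale, rotation and translation in closed form, whereas BA re-optimizes every camera. Here is a minimal numpy sketch of the standard Umeyama fit; the function name and toy data are purely illustrative, not Zephyr's internals:

      Code:
      import numpy as np

      def similarity_transform(src, dst):
          """Closed-form Umeyama fit of dst ~ s * R @ src + t.
          src, dst: (N, 3) arrays of the same control points measured
          in the two projects (N >= 3, not all collinear)."""
          mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
          sc, dc = src - mu_s, dst - mu_d
          cov = dc.T @ sc / len(src)                 # 3x3 cross-covariance
          U, D, Vt = np.linalg.svd(cov)
          S = np.eye(3)
          if np.linalg.det(U) * np.linalg.det(Vt) < 0:
              S[2, 2] = -1.0                         # guard against reflection
          R = U @ S @ Vt
          var_s = (sc ** 2).sum() / len(src)         # variance of source points
          s = np.trace(np.diag(D) @ S) / var_s       # isotropic scale
          t = mu_d - s * R @ mu_s
          return s, R, t

      # toy check: recover a known scale and translation from 4 points
      rng = np.random.default_rng(0)
      src = rng.random((4, 3))
      dst = 2.0 * src + np.array([1.0, -2.0, 0.5])   # identity rotation
      print(similarity_transform(src, dst))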

      The bundle adjustment will minimize the error by moving the camera positions. As you can imagine, if you have a ton of images it can take a while to minimize the error over all the camera positions and rotations.
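
      For intuition, here is a toy version of that minimization for a single pinhole camera, using scipy (again purely illustrative, not how Zephyr is implemented):

      Code:
      import numpy as np
      from scipy.optimize import least_squares
      from scipy.spatial.transform import Rotation

      # Toy scene: known 3D points, one camera pose to refine.
      pts3d = np.array([[0., 0., 5.], [1., 0., 6.], [0., 1., 5.5],
                        [1., 1., 6.5], [0.5, 0.5, 5.0]])
      f = 800.0                                    # focal length, pixels

      def project(pose, pts):
          """Pinhole projection; pose = rotation vector + translation."""
          R = Rotation.from_rotvec(pose[:3]).as_matrix()
          cam = pts @ R.T + pose[3:]
          return f * cam[:, :2] / cam[:, 2:3]

      true_pose = np.array([0.01, -0.02, 0.005, 0.1, -0.05, 0.2])
      obs = project(true_pose, pts3d)              # "measured" image points

      # Move the camera until the summed squared reprojection error
      # (predicted minus observed pixel positions) is minimal.
      fit = least_squares(lambda p: (project(p, pts3d) - obs).ravel(),
                          x0=np.zeros(6))
      print(fit.x.round(4))                        # close to true_pose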

      You can absolutely proceed that way; I would also try without bundle adjustment. That should take only a few seconds, so if that result doesn't look good, you can always proceed with BA.

      • Reuben (3Dfollower)
        #17
        Hi Andrea, the bundle adjustment merge for all areas was taking too long, so I did the control point merge without the bundle adjustment. It all looks good, but when I save the sparse point cloud as a zep I get this error:

        [10:53:58] SysInfo | Cpu: 11.53 %; Disk (C/T): 107236/307196 MB (199960 MB free, 34.91 % used); Memory (C/T): 46428/124927 MB (77005 MB free, 37.16 % used)
        [10:54:07] StereoPointCloud::saveDenseData error: vector<T> too long
        [10:54:12] WorkspaceAsyncSaver warning: could not points descriptors
        Last edited by Reuben; 2017-12-06, 12:16 PM.

        • Andrea Alessi (3Dflow Staff)
          #18
          Hi Reuben,

          Did Zephyr save the zep file? That error means there was not enough contiguous memory to allocate the fully densified point clouds from the merge.

          The warning is telling you that the seed points were saved, but not the densified points, so I assume your dense clouds were already *very* dense. Seed points should be saved though, so if you have the zep file saved, you should be able to proceed regardless.

          Is this the case?

          • Reuben commented:
            Hi Andrea, I haven't generated any dense clouds yet. When I save the zep, a popup says "cannot save workspace choose different file location". I also cannot open the zep file it creates; it says it has become corrupted.
        • Andrea Alessi (3Dflow Staff)
          #19
          Hi Reuben,

          I double-checked, and while that function is called only by dense clouds, we noticed we had (wrongly) pasted the same error message into the descriptor function as well. Please disregard my previous message about dense clouds; we have at least fixed the error message in our internal build now.

          The good news, though, is that the file should have been saved correctly regardless. Is there a chance you could share your zep file with us? It shouldn't have become corrupted if Zephyr finished saving it, and if it did, we may be able to salvage it (hopefully).

          Sorry about this issue

          • Reuben commented:
            No problem. I will DM you the Dropbox link soon.
        • Reuben (3Dfollower)
          #20
          Hi again!

          I finished the model and it all looks great. I just have a few questions:

          1) What do the BA Mean Square Error (px), BA Reference Variance (px), and Mean Reprojection Error indicate? What are good values?

          2) Is there a way to calculate the resolution of the Dense Point Cloud? Something like points per square meter? Or 1 point = x cm²?

          3) Same for the mesh: can a resolution be calculated, such as the average/min/max triangle area?

          4) When I used the Confidence Tool on the Dense Point Cloud I originally had 2,605,021 points, but the tool only finds 167,526 points with confidence between 0 and 203. It seems the tool considers the other 2,437,495 points to be below 0; why is that?

          Thank you!
          Last edited by Reuben; 2017-12-19, 03:00 PM.

          • Roberto (3Dflow)
            #21
            Glad you made it through the big project!

            1. B.A. stands for Bundle Adjustment, i.e. the final optimization algorithm of the Structure from Motion pipeline. The mean square error is the mean reprojection error of the points of the sparse point cloud. The reference variance is a formula that also takes into account the number of unknowns in the minimization. Both values are expressed in pixels. It's difficult to give you an answer on good values: around 1 px is usually good, but please beware that the reprojection error can be "self-referential". You would need to check accuracy against control points / metric measurements to be really sure about your results.
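
            In textbook least-squares terms (Zephyr's exact conventions may differ), the two statistics are computed roughly like this:

            Code:
            import numpy as np

            # residuals: per-point reprojection errors in pixels, i.e. the
            # distance between each detected image point and the reprojection
            # of its triangulated 3D point (toy values below).
            residuals = np.array([0.4, 0.9, 1.1, 0.6, 0.8])

            n_obs = residuals.size    # number of image observations
            n_unknowns = 3            # parameters estimated (toy value)

            rms = np.sqrt((residuals ** 2).mean())   # RMS reprojection error
            # reference variance: squared residuals over degrees of freedom
            ref_var = (residuals ** 2).sum() / (n_obs - n_unknowns)
            print(rms, ref_var)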

            2 / 3. Unfortunately these values are not extracted in Zephyr. Please note that in photogrammetry, density is not a good reference for quality / accuracy (as opposed to laser scanning, for example). You can simply get a denser point cloud / mesh by tuning the parameters properly:
            See the "Understanding density" recipe in the 3DF Zephyr tutorial series, which covers point density, densification and decimation.
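
            That said, if you export the dense cloud and the mesh, you can estimate both numbers yourself. A rough numpy/scipy sketch (file loading omitted; the radius and toy data are placeholders):

            Code:
            import numpy as np
            from scipy.spatial import cKDTree

            # points: (N, 3) dense-cloud coordinates in meters, e.g. parsed
            # from an exported PLY/XYZ file (random toy data here).
            points = np.random.default_rng(1).random((10000, 3)) * [10, 10, 0.1]

            # Local density: neighbours within radius r, divided by the disc
            # area (a reasonable proxy for roughly planar surveys).
            r = 0.5
            tree = cKDTree(points)
            counts = np.array([len(i) - 1 for i in tree.query_ball_point(points, r)])
            print("median pts/m^2:", np.median(counts / (np.pi * r ** 2)))

            # Mesh resolution: triangle areas from vertices V and faces F.
            V = points                                  # reusing toy points
            F = np.array([[0, 1, 2], [3, 4, 5]])        # toy face indices
            e1, e2 = V[F[:, 1]] - V[F[:, 0]], V[F[:, 2]] - V[F[:, 0]]
            areas = 0.5 * np.linalg.norm(np.cross(e1, e2), axis=1)
            print("mean triangle area, m^2:", areas.mean())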


            4. Those points are assigned a confidence of 0. This means that Zephyr is not completely sure about their triangulation. 167k 'confident' points is quite low, but you can try to increase the discretization level during point cloud generation to get more points.
