
Issues with ASTM E2807 Standard

This is a list of the issues found in the ASTM E2807 standard while implementing the libE57 library.



  • Missing IndexLimits structure for gridded point clouds
  • When the standard defines the IndexBounds structure, it states that the bounds are tight to the given data. In Table 28, the standard defines rowMaximum as "The maximum rowIndex value of any point represented by this IndexBounds object." The problem is that this does not meet the real-world need to know the index limits of structured point clouds. In order to account for the invalid points that fell outside the IndexBounds min/max bounds, we need to know what the index limits were when the data was scanned.

    For example, given a PTX file (size 100,100) of a scan in which rows 92 to 100 were all invalid, IndexBounds.rowMaximum would be set to 91 because that was the last row containing valid data. Converting this data back to PTX (size 91,100) would NOT reproduce the identical file: the invalid points on rows 92 to 100 would be missing.

    Work Around

    The best way to solve this problem is to set the IndexBounds to the grid size of the structured point cloud like this:

    // FoundationAPI

    StructureNode indexBounds = StructureNode(imf_);
    indexBounds.set("rowMinimum",    IntegerNode(imf_, 0));
    indexBounds.set("rowMaximum",    IntegerNode(imf_, nSizeRows - 1));
    indexBounds.set("columnMinimum", IntegerNode(imf_, 0));
    indexBounds.set("columnMaximum", IntegerNode(imf_, nSizeColumns - 1));
    indexBounds.set("returnMinimum", IntegerNode(imf_, 0));
    indexBounds.set("returnMaximum", IntegerNode(imf_, nSizeReturns - 1));
    scan.set("indexBounds", indexBounds);

    Solution

    The real solution is to add an SP1 extension that defines a new IndexLimits structure holding the grid size of the structured point cloud. Then IndexBounds can hold the bounds that exist in the real data, and IndexLimits the grid size that was scanned.

    // FoundationAPI

    StructureNode indexLimits = StructureNode(imf_);
    indexLimits.set("sp1:rowMinimum",    IntegerNode(imf_, 0));
    indexLimits.set("sp1:rowMaximum",    IntegerNode(imf_, nSizeRows - 1));
    indexLimits.set("sp1:columnMinimum", IntegerNode(imf_, 0));
    indexLimits.set("sp1:columnMaximum", IntegerNode(imf_, nSizeColumns - 1));
    indexLimits.set("sp1:returnMinimum", IntegerNode(imf_, 0));
    indexLimits.set("sp1:returnMaximum", IntegerNode(imf_, nSizeReturns - 1));
    scan.set("sp1:indexLimits", indexLimits);

    Then the reader would have to look at both structures in order to get the point cloud size.

    // FoundationAPI

    int64_t rowMinimum = 0,    rowMaximum = 0;
    int64_t columnMinimum = 0, columnMaximum = 0;
    int64_t returnMinimum = 0, returnMaximum = 0;
    if (scan.isDefined("sp1:indexLimits")) {
        StructureNode xbox(scan.get("sp1:indexLimits"));
        if (xbox.isDefined("sp1:rowMaximum")) {
            rowMinimum = IntegerNode(xbox.get("sp1:rowMinimum")).value();
            rowMaximum = IntegerNode(xbox.get("sp1:rowMaximum")).value();
        }
        if (xbox.isDefined("sp1:columnMaximum")) {
            columnMinimum = IntegerNode(xbox.get("sp1:columnMinimum")).value();
            columnMaximum = IntegerNode(xbox.get("sp1:columnMaximum")).value();
        }
        if (xbox.isDefined("sp1:returnMaximum")) {
            returnMinimum = IntegerNode(xbox.get("sp1:returnMinimum")).value();
            returnMaximum = IntegerNode(xbox.get("sp1:returnMaximum")).value();
        }
    } else if (scan.isDefined("indexBounds")) {
        StructureNode ibox(scan.get("indexBounds"));
        if (ibox.isDefined("rowMaximum")) {
            rowMinimum = IntegerNode(ibox.get("rowMinimum")).value();
            rowMaximum = IntegerNode(ibox.get("rowMaximum")).value();
        }
        if (ibox.isDefined("columnMaximum")) {
            columnMinimum = IntegerNode(ibox.get("columnMinimum")).value();
            columnMaximum = IntegerNode(ibox.get("columnMaximum")).value();
        }
        if (ibox.isDefined("returnMaximum")) {
            returnMinimum = IntegerNode(ibox.get("returnMinimum")).value();
            returnMaximum = IntegerNode(ibox.get("returnMaximum")).value();
        }
    }
    long nSizeRows = (long)(rowMaximum - rowMinimum + 1);
    long nSizeColumns = (long)(columnMaximum - columnMinimum + 1);
    long nSizeReturns = (long)(returnMaximum - returnMinimum + 1);
  • Cartesian and spherical bounds only allow type double.
  • The standard only allows double types for the Cartesian and spherical bounds coordinates. However, point data may be any type, including scaled integer. This mismatch causes round-off errors when comparing the bounds with the point data.

    Work Around

    The best way to solve this problem is to pass the bounds through the same calculations as the point data when that data is stored as scaled integers.

    // FoundationAPI

    // Use the same scale factor as the point data.
    #define DATA_SCALE_FACTOR (.000001)

    // Minimum/maximum values calculated from the point data.
    double MaxX, MinX, MaxY, MinY, MaxZ, MinZ;

    // Do this only if the point coordinate data is a scaled integer.
    StructureNode bbox = StructureNode(imf_);
    bbox.set("xMinimum", FloatNode(imf_, floor(MinX / DATA_SCALE_FACTOR - 0.9999999) * DATA_SCALE_FACTOR));
    bbox.set("xMaximum", FloatNode(imf_, ceil( MaxX / DATA_SCALE_FACTOR + 0.9999999) * DATA_SCALE_FACTOR));
    bbox.set("yMinimum", FloatNode(imf_, floor(MinY / DATA_SCALE_FACTOR - 0.9999999) * DATA_SCALE_FACTOR));
    bbox.set("yMaximum", FloatNode(imf_, ceil( MaxY / DATA_SCALE_FACTOR + 0.9999999) * DATA_SCALE_FACTOR));
    bbox.set("zMinimum", FloatNode(imf_, floor(MinZ / DATA_SCALE_FACTOR - 0.9999999) * DATA_SCALE_FACTOR));
    bbox.set("zMaximum", FloatNode(imf_, ceil( MaxZ / DATA_SCALE_FACTOR + 0.9999999) * DATA_SCALE_FACTOR));
    scan.set("cartesianBounds", bbox);

    Solution

    The real solution is to allow the bounds to be the same type as the point data; the SP1 extension will document this. The reader then needs to check the type of the bounds and convert accordingly.

    // FoundationAPI

    double MaxX, MinX, MaxY, MinY, MaxZ, MinZ;
    if (scan.isDefined("cartesianBounds")) {
        StructureNode bbox(scan.get("cartesianBounds"));
        if (bbox.get("xMinimum").type() == E57_SCALED_INTEGER) {
            MinX = (double) ScaledIntegerNode(bbox.get("xMinimum")).scaledValue();
            MaxX = (double) ScaledIntegerNode(bbox.get("xMaximum")).scaledValue();
            MinY = (double) ScaledIntegerNode(bbox.get("yMinimum")).scaledValue();
            MaxY = (double) ScaledIntegerNode(bbox.get("yMaximum")).scaledValue();
            MinZ = (double) ScaledIntegerNode(bbox.get("zMinimum")).scaledValue();
            MaxZ = (double) ScaledIntegerNode(bbox.get("zMaximum")).scaledValue();
        } else if (bbox.get("xMinimum").type() == E57_INTEGER) {
            MinX = (double) IntegerNode(bbox.get("xMinimum")).value();
            MaxX = (double) IntegerNode(bbox.get("xMaximum")).value();
            MinY = (double) IntegerNode(bbox.get("yMinimum")).value();
            MaxY = (double) IntegerNode(bbox.get("yMaximum")).value();
            MinZ = (double) IntegerNode(bbox.get("zMinimum")).value();
            MaxZ = (double) IntegerNode(bbox.get("zMaximum")).value();
        } else if (bbox.get("xMinimum").type() == E57_FLOAT) {
            MinX = FloatNode(bbox.get("xMinimum")).value();
            MaxX = FloatNode(bbox.get("xMaximum")).value();
            MinY = FloatNode(bbox.get("yMinimum")).value();
            MaxY = FloatNode(bbox.get("yMaximum")).value();
            MinZ = FloatNode(bbox.get("zMinimum")).value();
            MaxZ = FloatNode(bbox.get("zMaximum")).value();
        }
    }
  • The timestamp data doesn't have enough precision.
  • The E57 standard defines the timestamp data relative to the GPS acquisitionStart field. The problem with using the full GPS time is that it is only accurate to about 1 ms. Common practice is to use only the GPS week time, without the GPS week number mixed in, which gives a better resolution of down to 0.1 us.

    Solution

    Use the new TIME -- E57_LEICA_Time_Bounds extension, which gives the time:timeMinimum and time:timeMaximum values of the timestamp data in GPS week time at the same accuracy as the point cloud timestamp data. Then add time:timeMinimum to each point's timestamp to recover the time data with full precision.

    		double T1 = point.TimeStamp + time:timeMinimum;  // T1 in GPS week time with 0.1 us resolution

    		// Or

    		double gps_week_number = ((int)floor(acquisitionStart)) / 604800;
    		double T0 = point.TimeStamp + time:timeMinimum + gps_week_number * 604800;  // full GPS time, with loss of resolution

    Note - the point's timestamp data still starts at 0, because time:timeMinimum has been subtracted from the GPS week time data.

    		point.TimeStamp = T1 - time:timeMinimum;	// store T1 relative to time:timeMinimum
    

    This is why the older readers without the extension will still work.

    		double T0 = point.TimeStamp + acquisitionStart;  // full GPS time
  • Multiple scans with same image issue.
  • The standard defines an Image2D::associatedData3DGuid field identifying the Data3D object that was being acquired when the Image2D picture was taken. This allows a one-to-one association between a scan and an image, and a one-to-many association between a single scan and many images. However, the case of many scans sharing a single image is not addressed.

    Work Around

    No workaround is available.

    Solution

    The real solution is to use the RLMS -- E57_RIEGL_2011 extension, which defines a new rlms:scanposGuid field for both the Image2D and Data3D structures, so that scans and images can be grouped by the scan position (pose) where they were taken.

This site is © Copyright 2010 E57.04 3D Imaging System File Format Committee, All Rights Reserved