In this tutorial you will learn how to design a data processing script for lidar point clouds and how to automate it with Python.
Introduction
In the following, you will learn how to use different tools for solving a typical lidar processing task and how to quickly convert this into Python code.
Even if you are only interested in writing Python scripts, it is worth taking this tour through the graphical user interface of LIS Pro 3D.
Make sure that you have installed everything as described in the installation tutorial.
Make sure that you have finished the previous tutorial! In this tutorial we will need the prepared folders and files.
Step 1: Find your prepared data and folders
Find your prepared data and folders from the previous tutorial and check that they are complete.
Step 1.1: Make sure you have the whole folder structure created
Within the processing folder you should find these folders:
- export (the folder for your final output)
- shapes (the folder for shapefiles describing the extent of your available data)
- spc_class (the folder for the classified output pointclouds)
Step 1.2: Check the LAZ, LAX and LASVF files
Step 1.3: Check the tiling schemes
Step 1.4: Open LIS Pro 3D Graphical User Interface
Find it in your installed programs. It is recommended to use a desktop shortcut.
Step 2: Open new project
If you have any data open in LIS Pro 3D, start a new, empty project!
Step 2.1: Open the tiles_500.shp
Use the upper-left menu in order to load the tiles_500.shp shapes layer.
Step 3: Get a Subset from your Point Cloud Catalog
We can use tiles_500.shp in order to query an area of interest from our available datasets. The typical scenario is the extraction of a subset.
In our case, the tile size is larger than the “physical” tiling. Thus, the two available *.laz files will be merged into a single point cloud.
You can use the tool both to query a small subset and to merge multiple tiles or scan positions into one single point cloud (depending on your query polygon or query extent).
Open the LIS Pro 3D > Virtual > Get Subset from Virtual LAS/LAZ Tool
Provide the tiles.lasvf file that we have created in the previous tutorial
Provide the tiles_500.shp for the spatial query!
Use the settings shown here:
Use the Save button before executing the tool in order to save the parameters in an *.sprm file. Provide a reasonable name. If you have an *.sprm file for a specific tool, you can simply load your settings with the Load button, so you don't have to remember the settings. Do this for all the tools you run!
- click Execute!
Step 4: Inspect the extracted point cloud data
Find the extracted data set in the PointCloud section of the Data Tab.
In the Description Tab you can see that 1361643 points have been extracted from the point cloud catalog.
Open the las_subset_tiles dataset in a map by double-clicking on it!
You can see that both “physical” point cloud tiles are merged together here.
We have already inspected the single *.laz-files in this tutorial.
View the dataset also in the LIS Pro 3D > Point Cloud Editor > Point Cloud Editor.
Step 5: Remove Noise Points
Open the tool LIS Pro 3D > Filtering > Remove Isolated Points
Provide the las_subset_tiles dataset.
Use the following settings:
After execution we can see that the point cloud has an additional attribute field called noise, indicating valid and invalid points for the next processing steps.
The Messages Window indicates that only 7 points have been labelled as noise.
Keep in mind that in a typical raw data scenario a much higher number of noise points must be expected (including low and high outliers). These outliers have to be labelled before the following processing steps.
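The basic idea behind this filter can be sketched in plain Python. The following is a naive O(n²) illustration of the concept only, not the tool's actual implementation or exact parameter semantics: points with fewer neighbours than a threshold within the search radius are labelled as noise (class 7), all others as undefined (class 1).

```python
import math

def label_isolated_points(points, search_radius=3.0, max_points=2):
    """Label each (x, y, z) point: 7 = noise, 1 = valid/undefined.

    Naive O(n^2) sketch of the idea behind 'Remove Isolated Points';
    the real tool uses spatial indexing and its own parameter semantics.
    """
    labels = []
    for i, p in enumerate(points):
        # count neighbours within the search radius (excluding the point itself)
        neighbours = sum(1 for j, q in enumerate(points)
                         if i != j and math.dist(p, q) <= search_radius)
        # fewer neighbours than the threshold -> isolated -> noise class 7
        labels.append(7 if neighbours < max_points else 1)
    return labels
```

For example, in a set of three nearby points and one far-away point, only the far-away point gets the noise label 7.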
Step 6: Ground Classification
Open the tool LIS Pro 3D > Classification > Ground Classification
Provide the las_subset_tiles dataset.
Provide the noise attribute from the previous step.
Exclude Point Classes: 7 (computed in the previous step)
Use the following settings:
- click Execute
After Execution, the map view will be updated and we can see the point cloud classified into ground and non-ground points.
View the updated dataset also in the LIS Pro 3D > Point Cloud Editor > Point Cloud Editor. Provide the grd_class as Classification and use the shortcut 5 in order to show the classification.
Step 7: Segmentation of planar objects for building roof detection
Open the tool LIS Pro 3D > Segmentation > Segmentation by Plane Growing
Provide the las_subset_tiles dataset.
Provide the grd_class attribute from the previous step as Filter Attribute.
Use the following settings:
- click Execute
After Execution, the map view has been updated and we can see the segmented planar objects in the analysed scene. The ground points have been ignored in this segmentation.
View the updated dataset also in the LIS Pro 3D > Point Cloud Editor > Point Cloud Editor. Provide the segmentid as Random and use the shortcut Q in order to show the segment IDs.
Step 8: Building and vegetation classification
Open the tool LIS Pro 3D > Classification > Enhanced Point Cloud Classification
Provide the las_subset_tiles dataset.
Provide the segmentid attribute from the previous step.
Provide the grd_class attribute from ground classification.
Provide the dz attribute from ground classification.
Use the following settings:
- click Execute
After execution, the map view has been updated.
View the updated dataset also in the LIS Pro 3D > Point Cloud Editor > Point Cloud Editor. Provide the classid as Classification and use the shortcut 5 in order to show the classification.
This classification still has some errors: vegetation points on the facades of the buildings and vegetation points on small roof structures like chimneys. To overcome this, we run two additional cleaning steps.
Step 9: Clean building facades
Open the tool LIS Pro 3D > Classification > Clean Building Facades
Provide the las_subset_tiles dataset.
Provide the classid attribute from the previous step as Attribute to Filter.
Set the Attribute Suffix for the output attribute to facades.
Use the following settings:
- click Execute
After execution a new attribute classid_facades has been added to the attributes table of the point cloud.
Step 10: Clean building roofs
Open the tool LIS Pro 3D > Classification > Clean Building Roofs
Provide the las_subset_tiles dataset.
Provide the classid_facades attribute from the previous step as Attribute to Filter.
Set the Attribute Suffix for the output attribute to roofs.
Use the following settings:
- click Execute
After execution a new attribute classid_facades_roofs has been added to the attributes table of the point cloud.
Comparing the initial classid with the classid_facades_roofs shows that vegetation points on facades and roofs have been relabeled to the undefined class (1):
classid (not cleaned)
classid_facades_roofs (cleaned)
Step 11: Power line classification
The example dataset includes powerlines that have to be separated from the building and vegetation classes.
Open the tool LIS Pro 3D > Classification > Power Line Classification
Provide the las_subset_tiles dataset.
Provide the classid_facades_roofs attribute from the previous step as Classification.
Set the Attribute Suffix for the output attribute to pl.
Use the following settings:
- click Execute
After execution the map view has updated again, showing the new classid_pl attribute. This classification has the power lines in a separate class.
Step 12: Clustering of power line points for filtering
In order to clean the result, the classified power line points have to be clustered into different strings.
Open the tool LIS Pro 3D > Segmentation > Density-based spatial Clustering
Provide the las_subset_tiles dataset.
Provide the classid_pl attribute from the previous step as Constraint.
Set the Attribute Range for the constraint attribute to 14;14.
Set the Attribute Suffix for the output attribute to pl.
Use the following settings:
- click Execute
After execution the map view has updated again, showing the new cluster_pl attribute. This segmentation puts the power line strings into separate clusters.
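The idea of clustering points into separate strings can be illustrated with a naive plain-Python sketch: a simple flood fill that groups points whose mutual distance stays below a threshold. This is only an illustration of the concept; the actual tool implements a proper density-based algorithm with its own parameters.

```python
import math

def cluster_by_distance(points, eps=1.0):
    """Group points into clusters of mutually reachable neighbours.

    Naive flood-fill sketch of the density-based clustering idea;
    not the tool's actual algorithm.
    """
    labels = [-1] * len(points)   # -1 = not yet assigned
    cluster = 0
    for seed in range(len(points)):
        if labels[seed] != -1:
            continue
        stack = [seed]
        labels[seed] = cluster
        while stack:
            i = stack.pop()
            for j in range(len(points)):
                # attach every unassigned point within eps of the cluster
                if labels[j] == -1 and math.dist(points[i], points[j]) <= eps:
                    labels[j] = cluster
                    stack.append(j)
        cluster += 1
    return labels
```

Two well-separated pairs of points end up in two different clusters.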
Step 13: Calculate power line cluster lengths for filtering
Open the tool LIS Pro 3D > Analysis > Segment Features
Provide the las_subset_tiles dataset.
Provide the cluster_pl attribute from the previous step as Constraint.
Check the Length 2d.
Keep all other choices unchecked.
Set the Maximum Edge Length to 1.5.
- click Execute
After execution the point cloud has a new length2d attribute, indicating the length of the detected cable string.
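Conceptually, such a 2D length can be thought of as the summed 2D distance along the string, where edges longer than the Maximum Edge Length are not counted. The following is a simplified plain-Python sketch of this idea, not the tool's exact algorithm:

```python
import math

def length_2d(points, max_edge_length=1.5):
    """Sum of consecutive 2D point distances, ignoring jumps longer
    than max_edge_length; simplified illustration of a 'length2d'
    measure over an ordered point string."""
    total = 0.0
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        d = math.hypot(x1 - x0, y1 - y0)
        if d <= max_edge_length:   # skip edges that exceed the limit
            total += d
    return total
```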
Step 14: Filter power line clusters by minimum length
Open the tool LIS Pro 3D > Filtering > Attribute Filter
Provide the las_subset_tiles dataset.
Provide the length2d attribute from the previous step as Attribute 1.
Set the Filter Type to lower limit (min).
Set the Minimum of the filter to 5.
Set the Attribute Name to pl_filtered.
- click Execute
Step 15: Relabel power line classification
Now we can use the filtered line segment information to relabel the classification of power line points whose segment length was too small.
Open the tool LIS Pro 3D > Arithmetic > Point Cloud Attribute Calculator
Provide the las_subset_tiles dataset.
Use the following formula:
ifelse(eq([classid_pl],14),ifelse(eq([pl_filtered],1),14,1),[classid_pl])
Set the Output Field Name to classid_pl_filtered.
Set the Field Data Type to unsigned 1 byte integer.
- click Execute
After execution the map view has updated again, showing the new classid_pl_filtered attribute.
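The logic of the attribute-calculator formula can be mirrored in plain Python, which makes it easy to verify: class 14 points keep their class only if they passed the length filter, otherwise they fall back to the undefined class 1; all other classes stay unchanged.

```python
def relabel_powerline(classid_pl, pl_filtered):
    """Mirror of ifelse(eq([classid_pl],14),
                        ifelse(eq([pl_filtered],1),14,1),
                        [classid_pl])."""
    if classid_pl == 14:                       # candidate power line point
        return 14 if pl_filtered == 1 else 1   # keep 14 only if the filter passed
    return classid_pl                          # other classes stay unchanged
```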
Step 16: Set color scheme to “Classified”
Select the point cloud in the Data tab and in the Settings tab set Colors > Type to Classified.
Select the classid_pl_filtered attribute as the calculation attribute.
After this you can see the current classification with the correct color scheme again.
Step 17: Pylon classification
As a last step, we can also classify the pylons of the previously classified power lines.
Open the tool LIS Pro 3D > Classification > Pylon Classification
Provide the las_subset_tiles dataset.
Provide the classid_pl_filtered attribute as Classification.
Provide the dz attribute as DZ.
Use the following settings:
- click Execute
After execution the map view has updated again, showing the new classid_pylon attribute.
First Conclusion
During this tour, we have performed a full classification of an ALS point cloud including 13 Main Steps:
query point cloud (with overlap) from virtual point cloud dataset
filter noise points
ground classification
segmentation of planar objects for building roof detection
building and vegetation classification
clean building facades
clean building roofs
power line classification
clustering of power line points for filtering
calculate power line cluster lengths for filtering
filter power line clusters by minimum length
relabel power line classification
pylon classification
In the following, we want to automate these steps in a Python script.
In a typical lidar acquisition scenario the datasets have much higher point counts and cover larger areas. In order to deal with this, the dataset is chunked down into smaller tiles for the processing and these tiles are then processed with overlap.
This approach will also be addressed in the following section, using a tile shape with multiple tiles.
Step 18: Open new project
If you have any data open in LIS Pro 3D, start a new, empty project!
Step 19: Load Tile Shapes and Inspect
In the previous tutorial we have created a shapes folder with three tile-shape shapefiles in it.
Drag and drop the Tiles_100.shp and the Tiles_500.shp into the GUI and show them in a map.
Tiles_500.shp is the shapefile that we used in the 13-step workflow above. With this shapefile, the whole dataset is processed as a single block.
For the automation with Python we want to use the Tiles_100.shp shapefile, chunking the whole dataset down into 10 tiles. Thereby we will run the complete processing pipeline for each of the small subregions separately. We will start by processing these tiles one after the other and will proceed by parallelizing this procedure.
Chunking the dataset down into 10 tiles is not necessary for this specific example (because the dataset is small), but it simulates a typical lidar processing task with 100, 1000 or 10000 tiles. You will be able to use the same processing scripts for terabytes of data.
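To illustrate the idea of such a tiling scheme, here is a minimal plain-Python sketch that generates the lower-left corners of a regular tiling over an extent. The extent values in the usage example are made up for illustration; the actual Tiles_100.shp was created with the tiling tools in the previous tutorial.

```python
def make_tiling(xmin, ymin, xmax, ymax, tile_size):
    """Return the lower-left corners of a regular tiling covering the
    extent; simple sketch of the tiling-scheme idea."""
    tiles = []
    y = ymin
    while y < ymax:
        x = xmin
        while x < xmax:
            tiles.append((x, y))   # each tile is referenced by its lower-left corner
            x += tile_size
        y += tile_size
    return tiles
```

A 500 m x 200 m extent with 100 m tiles, for example, yields 10 tiles.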
Step 20: Open Miniforge Prompt
Search for the Miniforge Prompt and open it.
This will look like this:
Step 21: Go to your project directory
Copy the path to your project folder (CTRL - C)
In the Miniforge Prompt, type cd followed by a space and right-click after the space in order to paste the path you have copied. Press Enter!
This changes your current directory to the project directory.
Step 22: Start your own conda environment
Initially you are in the (base) environment of conda. Type conda activate environmentName in order to use your own environment with all the installed modules (see installation tutorial)!
In this case the environment is called myenv. On the left side you can see that you are now in the (myenv) environment.
Note that after this switch, the current working directory is still the same.
Step 23: Go to the scripts folder
Type cd scripts in order to go to the scripts folder.
Step 24: Start Visual Studio Code
Type code + space + . in order to start Visual Studio Code from the scripts directory (the current directory)
Visual Studio Code will open and you see the two scripts that we have developed in the previous tutorial.
Step 25: Create a new script
Use the New File Button in order to create a new script called 02_data_processing.py. This will create a new empty script.
Step 26: Create basic code of data_processing.py
Copy the following code into your 02_data_processing.py script:
import os
import project_config as cfg
from PySAGA import saga_api, lis_cmd

def Task(TaskNum=0):
    return True

# -----------------------------------------------------------------------------
# -----------------------------------------------------------------------------
# -----------------------------------------------------------------------------
# the main function
#
if __name__ == '__main__':
    # -----------------------------------------------------------------------------
    # start the logging
    starttime = lis_cmd.Start_Logging(False)

    # -----------------------------------------------------------------------------
    # load the shapefile with the tiling scheme used for processing
    tileshape = saga_api.SG_Get_Data_Manager().Add_Shapes(cfg.TILESHAPE_100)

    if not tileshape.is_Valid():
        lis_cmd.Error_Exit('Loading tileshape ' + cfg.TILESHAPE_100 + ' failed', starttime, True)

    lis_cmd.Message_Print('Number of tiles to process: {:d} \n\n'.format(tileshape.Get_Count()))

    # -----------------------------------------------------------------------------
    # iterate over the tiles and process each tile by calling the task function
    # with the processing pipeline; have a look at the "parallel tiles" version of
    # this script on how to process each tile on a separate CPU core in parallel
    for iTile in range(tileshape.Get_Count()):
        result = Task(iTile)

        if not result:
            lis_cmd.Error_Exit('Processing task failed', starttime)

    # -----------------------------------------------------------------------------
    # free memory and stop logging
    saga_api.SG_Get_Data_Manager().Delete()
    lis_cmd.Stop_Logging(starttime)
This script performs the following steps:
import the relevant modules
define a task function
start the logging (starttime)
load the tile shape tiles_100.shp using the Python variable (TILESHAPE_100) that we have defined in the configuration script project_config.py (see previous tutorial)
run the Task() function for each given tile of tiles_100.shp
stop the logging
The tileshape is the shapefile we have inspected in the GUI, containing 10 tiles with an extent of 100 x 100 m each.
Step 27: Extract the extent of a processing tile
Add the following code to your Task() function:
def Task(TaskNum=0):
    tileshape = saga_api.SG_Get_Data_Manager().Add_Shapes(cfg.TILESHAPE_100)

    if not tileshape.is_Valid():
        return lis_cmd.Error_Return('Loading tileshape ' + cfg.TILESHAPE_100 + ' failed')

    # -----------------------------------------------------------------------------
    # get tile to process
    tile = tileshape.Get_Shape(TaskNum)

    xmin = tile.Get_Extent().Get_XMin()
    ymin = tile.Get_Extent().Get_YMin()
    xmax = tile.Get_Extent().Get_XMax()
    ymax = tile.Get_Extent().Get_YMax()

    # tiles are referenced by their lower-left corner coordinate
    lis_cmd.Message_Print('\nProcessing tile {:.0f}_{:.0f} \n'.format(xmin, ymin))

    tilename = str(int(xmin)) + '_' + str(int(ymin))

    return True
This code loads the tiles_100.shp file and uses the variable TaskNum to access a single tile from the dataset.
For the selected tile (one of the 10 given tiles), we read the extent coordinates and store them in the Python variables xmin, ymin, xmax, ymax.
Additionally, the xmin and ymin values are converted into strings and concatenated (tilename) so that they can be used later for the tile names of the individual tiled point clouds.
In the main function, a for-loop iterates through the 10 given tiles, providing the tile index to the Task() function. Thus, every command that we put into Task() will be repeated for every tile of tiles_100.shp until all tiles have been processed. So the extent coordinates are extracted for every given tile. In the next steps we will put our whole processing workflow into this Task() function in order to repeat it for every tile.
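Stripped of the SAGA/LIS API, the two patterns above (tile naming by the lower-left corner and the sequential task loop) can be sketched in plain Python; the coordinates in the usage example are made up, and process_all is a hypothetical stand-in for the main loop:

```python
def tilename_from_extent(xmin, ymin):
    # tiles are referenced by their lower-left corner coordinate
    return str(int(xmin)) + '_' + str(int(ymin))

def process_all(num_tiles, task):
    """Sequential driver mirroring the main loop: one task call per
    tile index, aborting on the first failure."""
    for i in range(num_tiles):
        if not task(i):
            return False
    return True
```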
The updated code should look like this:
import os
import project_config as cfg
from PySAGA import saga_api, lis_cmd

def Task(TaskNum=0):
    tileshape = saga_api.SG_Get_Data_Manager().Add_Shapes(cfg.TILESHAPE_100)

    if not tileshape.is_Valid():
        return lis_cmd.Error_Return('Loading tileshape ' + cfg.TILESHAPE_100 + ' failed')

    # -----------------------------------------------------------------------------
    # get tile to process
    tile = tileshape.Get_Shape(TaskNum)

    xmin = tile.Get_Extent().Get_XMin()
    ymin = tile.Get_Extent().Get_YMin()
    xmax = tile.Get_Extent().Get_XMax()
    ymax = tile.Get_Extent().Get_YMax()

    # tiles are referenced by their lower-left corner coordinate
    lis_cmd.Message_Print('\nProcessing tile {:.0f}_{:.0f} \n'.format(xmin, ymin))

    tilename = str(int(xmin)) + '_' + str(int(ymin))

    return True

# -----------------------------------------------------------------------------
# -----------------------------------------------------------------------------
# -----------------------------------------------------------------------------
# the main function
#
if __name__ == '__main__':
    # -----------------------------------------------------------------------------
    # start the logging
    starttime = lis_cmd.Start_Logging(False)

    # -----------------------------------------------------------------------------
    # load the shapefile with the tiling scheme used for processing
    tileshape = saga_api.SG_Get_Data_Manager().Add_Shapes(cfg.TILESHAPE_100)

    if not tileshape.is_Valid():
        lis_cmd.Error_Exit('Loading tileshape ' + cfg.TILESHAPE_100 + ' failed', starttime, True)

    lis_cmd.Message_Print('Number of tiles to process: {:d} \n\n'.format(tileshape.Get_Count()))

    # -----------------------------------------------------------------------------
    # iterate over the tiles and process each tile by calling the task function
    # with the processing pipeline; have a look at the "parallel tiles" version of
    # this script on how to process each tile on a separate CPU core in parallel
    for iTile in range(tileshape.Get_Count()):
        result = Task(iTile)

        if not result:
            lis_cmd.Error_Exit('Processing task failed', starttime)

    # -----------------------------------------------------------------------------
    # free memory and stop logging
    saga_api.SG_Get_Data_Manager().Delete()
    lis_cmd.Stop_Logging(starttime)
Now we want to add the following steps to the script:
- query point cloud (with overlap) from virtual point cloud dataset
- LIS Pro 3D > Virtual > Get Subset from Virtual LAS/LAZ Tool
- filter noise points
- LIS Pro 3D > Filtering > Remove Isolated Points
- ground classification
- LIS Pro 3D > Classification > Ground Classification
- segmentation of planar objects for building roof detection
- LIS Pro 3D > Segmentation > Segmentation by Plane Growing
- building and vegetation classification
- LIS Pro 3D > Classification > Enhanced Point Cloud Classification
- clean building facades
- LIS Pro 3D > Classification > Clean Building Facades
- clean building roofs
- LIS Pro 3D > Classification > Clean Building Roofs
- power line classification
- LIS Pro 3D > Classification > Power Line Classification
- clustering of power line points for filtering
- LIS Pro 3D > Segmentation > Density-based spatial Clustering
- calculate power line cluster lengths for filtering
- LIS Pro 3D > Analysis > Segment Features
- filter power line clusters by minimum length
- LIS Pro 3D > Filtering > Attribute Filter
- relabel power line classification
- LIS Pro 3D > Arithmetic > Point Cloud Attribute Calculator
- pylon classification
- LIS Pro 3D > Classification > Pylon Classification
Step 28: Prepare some parameters in the project_config.py
Add the following definitions to your project_config.py script.
# get subset from point cloud
TILE_OVERLAP = 30.0

# outlier removal
SEARCH_RADIUS_OUTLIER = 3.0

# ground filtering
GRD_INITIAL_CELLSIZE = 50.0
GRD_FINAL_CELLSIZE = 0.5
GRD_MAX_ANGLE = 35.0
GRD_MAX_DISTANCE = 0.8

# building classification
SEARCH_RADIUS = 1.0

# powerline classification
SEARCH_RADIUS_PL = 5.0
MIN_WIRE_LENGTH = 5.0

# pylon classification
MIN_PYLON_HEIGHT = 20.0
After this, your project_config.py script should look like this:
import os

# -----------------------------------------------------------------------------
# processing directories
#
BASEDIR = 'C:' + os.sep + 'LiDAR_Data' + os.sep + 'AUTOMATION_GUIDE'

LASINDIR = BASEDIR + os.sep + 'data' + os.sep + 'las'

PROCESSDIR = BASEDIR + os.sep + 'processing'
SHAPESDIR = PROCESSDIR + os.sep + 'shapes'
SPCCLASSDIR = PROCESSDIR + os.sep + 'spc_class'
EXPORTDIR = PROCESSDIR + os.sep + 'export'

# -----------------------------------------------------------------------------
# -----------------------------------------------------------------------------
# create (non-existing) dirs
if not os.path.exists(PROCESSDIR): os.makedirs(PROCESSDIR)
if not os.path.exists(SHAPESDIR): os.makedirs(SHAPESDIR)
if not os.path.exists(SPCCLASSDIR): os.makedirs(SPCCLASSDIR)
if not os.path.exists(EXPORTDIR): os.makedirs(EXPORTDIR)

# -----------------------------------------------------------------------------
# -----------------------------------------------------------------------------
# global settings: files, tool parameters, ...
#
TILESHAPE_100 = SHAPESDIR + os.sep + 'Tiles_100.shp'
TILESHAPE_300 = SHAPESDIR + os.sep + 'Tiles_300.shp'
TILESHAPE_500 = SHAPESDIR + os.sep + 'Tiles_500.shp'
LASVF = LASINDIR + os.sep + 'tiles.lasvf'

TILE_SIZE_PROCESSING = 100.0
TILE_SIZE_DELIVERY = 300.0

# get subset from point cloud
TILE_OVERLAP = 30.0

# outlier removal
SEARCH_RADIUS_OUTLIER = 3.0

# ground filtering
GRD_INITIAL_CELLSIZE = 50.0
GRD_FINAL_CELLSIZE = 0.5
GRD_MAX_ANGLE = 35.0
GRD_MAX_DISTANCE = 0.8

# building classification
SEARCH_RADIUS = 1.0

# powerline classification
SEARCH_RADIUS_PL = 5.0
MIN_WIRE_LENGTH = 5.0

# pylon classification
MIN_PYLON_HEIGHT = 20.0
Step 29: Query current tile extent
From our point cloud catalog, we need to extract the subset which is intersecting with the extent of a single tile.
We will get the code for this from the graphical user interface via the tool's Copy to Clipboard > Python Wrapper Call (all settings) feature.
Step 29.1: Open the tool
Open LIS Pro 3D > Virtual > Get Subset from Virtual LAS/LAZ Tool (in the GUI)
If you have not closed the GUI after executing the steps above, the GUI remembers your last settings and you can simply copy to clipboard!
If it has not remembered your settings, try to load the *.sprm file for this tool in order to recover the settings. If you have not saved your settings to an *.sprm file, simply type in the settings again and click Apply.
Step 29.2: Copy to clipboard
After you have ensured that the GUI has remembered your settings or the settings have been loaded,
right-click onto the tool and select Copy to Clipboard and choose Python!
Step 29.3: Paste the code into your script
The code should look like this:
from PySAGA.tools import lis_virtual

aoi_shp = saga_api.SG_Get_Data_Manager().Add_Shapes('C:/LiDAR_Data/AUTOMATION_GUIDE/processing/shapes/Tiles_500.shp') # input data object
pc_out = [] # Python list, will become filled after successful execution with output data objects of type 'saga_api.CSG_PointCloud'

lis_virtual.Get_Subset_from_Virtual_LASLAZ(AOI_SHP=aoi_shp, PC_OUT=pc_out, FILENAME='C:/LiDAR_Data/AUTOMATION_GUIDE/data/las/tiles.lasvf', FILEPATH='', COPY_ATTR=True, CONSTRAIN_QUERY=False, FILE_COMPRESSION=True, ERROR_IF_EMPTY=True, THIN_OUT=True, SPACING=0.010000, FIELD_TILENAME='<not set>', AOI_XRANGE='0; 0', AOI_YRANGE='0; 0', AOI_ADD_OVERLAP=True, OVERLAP=30.000000, SKIP_EMPTY_AOIS=False, FILENAME_TILE_INFO='', METHOD_CLIP='lower and left boundary is inside', ONE_PC_PER_POLYGON=False)
Step 29.4: Make modifications to your template code
29.4.1
The tool call (coming from the GUI) requires a shapefile for defining the area of interest that we want to subset (aoi_shp). We don’t need it here, because we want to use the extent variables xmin,ymin,xmax,ymax instead.
Delete this line:
aoi_shp = saga_api.SG_Get_Data_Manager().Add_Shapes('C:/LiDAR_Data/AUTOMATION_GUIDE/processing/shapes/Tiles_500.shp') # input data object
and the AOI_SHP parameter in the function call
AOI_SHP=aoi_shp
Instead, add the AOI_XRANGE and AOI_YRANGE parameter to the function call using the extent variables xmin,ymin,xmax,ymax
AOI_XRANGE=str(xmin)+';'+str(xmax), AOI_YRANGE=str(ymin)+';'+str(ymax)
29.4.2
Put the import of the tool library lis_virtual (from PySAGA.tools import lis_virtual) at the top of the script!
29.4.3
The tool outputs a Python list containing saga_api.CSG_PointCloud objects. Again, we have to initialize the output as an empty list before passing it to the tool.
Rename the Python list pc_out to pc_out_list in both the variable definition and the function call!
29.4.5
Replace this filepath
FILENAME='C:/LiDAR_Data/AUTOMATION_GUIDE/data/las/tiles.lasvf'
by the LASVF Variable, defined in project_config.py.
FILENAME=cfg.LASVF
This allows us to edit the file paths and other parameters centrally in project_config.py, without having to touch the processing script again.
29.4.6
Replace the overlap value
OVERLAP=30
by the TILE_OVERLAP Variable, defined in project_config.py.
OVERLAP=cfg.TILE_OVERLAP
For each tile (red square) we use an overlap (red dotted square) in order to avoid edge effects. At the end of the processing pipeline, the overlap will be removed again.
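Conceptually, adding the overlap simply expands the tile extent on every side. A small sketch of this idea (assuming symmetric expansion, as the figure suggests):

```python
def expand_extent(xmin, ymin, xmax, ymax, overlap):
    """Expand a tile extent by the overlap on every side (the red dotted
    square around the red tile square); sketch of the AOI_ADD_OVERLAP idea."""
    return (xmin - overlap, ymin - overlap, xmax + overlap, ymax + overlap)
```

A 100 m x 100 m tile with a 30 m overlap is queried as a 160 m x 160 m window.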
29.4.7
After this, the code for the tool call will look like this:
pc_out_list = []

lis_virtual.Get_Subset_from_Virtual_LASLAZ(PC_OUT=pc_out_list, FILENAME=cfg.LASVF, COPY_ATTR=True,
    CONSTRAIN_QUERY=False, FILE_COMPRESSION=True, ERROR_IF_EMPTY=True,
    THIN_OUT=True, SPACING=0.01, AOI_XRANGE=str(xmin)+';'+str(xmax),
    AOI_YRANGE=str(ymin)+';'+str(ymax), AOI_ADD_OVERLAP=True,
    OVERLAP=cfg.TILE_OVERLAP, SKIP_EMPTY_AOIS=True,
    METHOD_CLIP='lower and left boundary is inside')
29.4.8
As a user we still have no idea whether the tool call within our script executed successfully or whether there was an error. This is of particular importance if we have scripts processing thousands of files.
To get more control, we put the tool call into an if-statement and check whether it was really successful. If not, we print an error and return from the Task() function.
Don't forget the colon at the end of the if-statement and the indentation of the Error_Return code block!
Change the tool call to this:
if not lis_virtual.Get_Subset_from_Virtual_LASLAZ(PC_OUT=pc_out_list, FILENAME=cfg.LASVF, COPY_ATTR=True,
    CONSTRAIN_QUERY=False, FILE_COMPRESSION=True, ERROR_IF_EMPTY=True,
    THIN_OUT=True, SPACING=0.01, AOI_XRANGE=str(xmin)+';'+str(xmax),
    AOI_YRANGE=str(ymin)+';'+str(ymax), AOI_ADD_OVERLAP=True,
    OVERLAP=cfg.TILE_OVERLAP, SKIP_EMPTY_AOIS=True,
    METHOD_CLIP='lower and left boundary is inside'):
    return lis_cmd.Error_Return('Failed to execute tool on tile ' + tilename)
29.4.9
The code stores the extracted point cloud within a point cloud list pc_out_list.
Let's check whether the list contains at least one entry and get the point cloud from this list using the Python variable pc_buff.
if len(pc_out_list) < 1:
    return lis_cmd.Error_Return('Tile ' + tilename + ' contains no point cloud data, skipping!')

pc_buff = saga_api.SG_Get_Data_Manager().Add(pc_out_list[0]).asPointCloud()
If the list has no entry, we return from the Task() function in order to skip the respective tile.
If the extracted point cloud pc_buff does not have enough points for a useful analysis, we also return from the Task() function.
if pc_buff.Get_Count() < 5:
    return lis_cmd.Error_Return('Tile ' + tilename + ' has less than 5 points, skipping ...')
29.4.10
To complete this code by freeing memory resources again, we put the following code at the end of the code block:
saga_api.SG_Get_Data_Manager().Delete(pc_buff)
This deletes the pc_buff object before the next iteration starts, in order to conserve memory.
Step 29.5: Check updated code
After all changes, your code should look like this:
import os
import project_config as cfg
from PySAGA import saga_api, lis_cmd
from PySAGA.tools import lis_virtual

def Task(TaskNum=0):
    tileshape = saga_api.SG_Get_Data_Manager().Add_Shapes(cfg.TILESHAPE_100)

    if not tileshape.is_Valid():
        return lis_cmd.Error_Return('Loading tileshape ' + cfg.TILESHAPE_100 + ' failed')

    # -----------------------------------------------------------------------------
    # get tile to process
    tile = tileshape.Get_Shape(TaskNum)

    xmin = tile.Get_Extent().Get_XMin()
    ymin = tile.Get_Extent().Get_YMin()
    xmax = tile.Get_Extent().Get_XMax()
    ymax = tile.Get_Extent().Get_YMax()

    # tiles are referenced by their lower-left corner coordinate
    lis_cmd.Message_Print('\nProcessing tile {:.0f}_{:.0f} \n'.format(xmin, ymin))

    tilename = str(int(xmin)) + '_' + str(int(ymin))

    pc_out_list = []
    if not lis_virtual.Get_Subset_from_Virtual_LASLAZ(PC_OUT=pc_out_list, FILENAME=cfg.LASVF, COPY_ATTR=True,
        CONSTRAIN_QUERY=False, FILE_COMPRESSION=True, ERROR_IF_EMPTY=True,
        THIN_OUT=True, SPACING=0.01, AOI_XRANGE=str(xmin)+';'+str(xmax),
        AOI_YRANGE=str(ymin)+';'+str(ymax), AOI_ADD_OVERLAP=True,
        OVERLAP=cfg.TILE_OVERLAP, SKIP_EMPTY_AOIS=True,
        METHOD_CLIP='lower and left boundary is inside'):
        return lis_cmd.Error_Return('Failed to execute tool on tile ' + tilename)

    if len(pc_out_list) < 1:
        return lis_cmd.Error_Return('Tile ' + tilename + ' contains no point cloud data, skipping!')

    # add the point cloud datasets to the data manager for easier memory management
    pc_buff = saga_api.SG_Get_Data_Manager().Add(pc_out_list[0]).asPointCloud()

    if pc_buff.Get_Count() < 5:
        return lis_cmd.Error_Return('Tile ' + tilename + ' has less than 5 points, skipping ...')

    saga_api.SG_Get_Data_Manager().Delete()

    return True

# -----------------------------------------------------------------------------
# -----------------------------------------------------------------------------
# -----------------------------------------------------------------------------
# the main function
#
if __name__ == '__main__':
    # -----------------------------------------------------------------------------
    # start the logging
    starttime = lis_cmd.Start_Logging(False)

    # -----------------------------------------------------------------------------
    # load the shapefile with the tiling scheme used for processing
    tileshape = saga_api.SG_Get_Data_Manager().Add_Shapes(cfg.TILESHAPE_100)

    if not tileshape.is_Valid():
        lis_cmd.Error_Exit('Loading tileshape ' + cfg.TILESHAPE_100 + ' failed', starttime, True)

    lis_cmd.Message_Print('Number of tiles to process: {:d} \n\n'.format(tileshape.Get_Count()))

    # -----------------------------------------------------------------------------
    # iterate over the tiles and process each tile by calling the task function
    # with the processing pipeline; have a look at the "parallel tiles" version of
    # this script on how to process each tile on a separate CPU core in parallel
    for iTile in range(tileshape.Get_Count()):
        result = Task(iTile)

        if not result:
            lis_cmd.Error_Exit('Processing task failed', starttime)

    # -----------------------------------------------------------------------------
    # free memory and stop logging
    saga_api.SG_Get_Data_Manager().Delete()
    lis_cmd.Stop_Logging(starttime)
Step 30: Filter noise points
Find the tool: LIS Pro 3D > Filtering > Remove Isolated Points
Restore the following settings:
After Copy to Clipboard, the code should look like this:
from PySAGA.tools import lis_filtering
pc_in = saga_api.SG_Get_Data_Manager().Add_PointCloud('') # input data object

lis_filtering.Remove_Isolated_Points_PC(PC_IN=pc_in, COPY_ATTR=True, SEARCH_RADIUS=3.000000, MAX_POINTS=2, METHOD='label points', ATTR_NAME='noise', CLASSID=7, CLASSID_UNDEF=1)
Step 30.1: Make modifications to the template code
The same modifications will be necessary for adapting all other tool calls!
Step 30.1.1:
First of all, move the imports to the top of the script again!
Step 30.1.2:
The template code creates an input data object to which the outlier removal will be applied. We can delete the creation of pc_in, because we want to use our previously extracted pc_buff for this tool call instead.
In the graphical user interface we also updated the same point cloud in every processing step, adding new attribute columns with each computation. That is why the template has no pc_out: the tool updates pc_in in place, and the same will happen in the script.
Step 30.1.3:
Replace

SEARCH_RADIUS=3.000000

with the variable defined in project_config.py:

SEARCH_RADIUS=cfg.SEARCH_RADIUS_OUTLIER
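For reference, the corresponding entry in project_config.py might look like the sketch below. The variable name SEARCH_RADIUS_OUTLIER is the one used throughout this tutorial; the value shown is only an example (it matches the tool default from the template) and should be adjusted to your data:

```python
# project_config.py (excerpt) -- example value, adjust to your data;
# search radius in map units (e.g. meters) for the isolated points filter
SEARCH_RADIUS_OUTLIER = 3.0
```

Keeping such parameters in project_config.py means you tune the pipeline in one place instead of editing every tool call.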
Step 30.1.4:
Wrap the whole function call in an if-statement and return an error if the tool call fails!
The modified tool call will look like this:
if not lis_filtering.Remove_Isolated_Points_PC(PC_IN=pc_buff, COPY_ATTR=True, SEARCH_RADIUS=cfg.SEARCH_RADIUS_OUTLIER,
        MAX_POINTS=2, METHOD='label points', ATTR_NAME='noise', CLASSID=7,
        CLASSID_UNDEF=1):
    return lis_cmd.Error_Return('Failed to execute tool on tile ' + tilename)
Step 30.2: Updated Code
import os
import project_config as cfg
from PySAGA import saga_api, lis_cmd
from PySAGA.tools import lis_virtual
from PySAGA.tools import lis_filtering
def Task(TaskNum=0):

    tileshape = saga_api.SG_Get_Data_Manager().Add_Shapes(cfg.TILESHAPE_100)

    if not tileshape.is_Valid():
        return lis_cmd.Error_Return('Loading tileshape ' + cfg.TILESHAPE_100 + ' failed')

    # -----------------------------------------------------------------------------
    # get tile to process
    tile = tileshape.Get_Shape(TaskNum)

    xmin = tile.Get_Extent().Get_XMin()
    ymin = tile.Get_Extent().Get_YMin()
    xmax = tile.Get_Extent().Get_XMax()
    ymax = tile.Get_Extent().Get_YMax()

    # tiles are referenced by their lower-left corner coordinate
    lis_cmd.Message_Print('\nProcessing tile {:.0f}_{:.0f} \n'.format(xmin, ymin))

    tilename = str(int(xmin)) + '_' + str(int(ymin))

    pc_out_list = []

    if not lis_virtual.Get_Subset_from_Virtual_LASLAZ(PC_OUT=pc_out_list, FILENAME=cfg.LASVF, COPY_ATTR=True,
            CONSTRAIN_QUERY=False, FILE_COMPRESSION=True, ERROR_IF_EMPTY=True,
            THIN_OUT=True, SPACING=0.01, AOI_XRANGE=str(xmin)+';'+str(xmax),
            AOI_YRANGE=str(ymin)+';'+str(ymax), AOI_ADD_OVERLAP=True,
            OVERLAP=cfg.TILE_OVERLAP, SKIP_EMPTY_AOIS=True,
            METHOD_CLIP='lower and left boundary is inside'):
        return lis_cmd.Error_Return('Failed to execute tool on tile ' + tilename)

    if len(pc_out_list) < 1:
        return lis_cmd.Error_Return('Tile ' + tilename + ' contains no point cloud data, skipping!')

    # add the point cloud datasets to the data manager for easier memory management
    pc_buff = saga_api.SG_Get_Data_Manager().Add(pc_out_list[0]).asPointCloud()

    if pc_buff.Get_Count() < 5:
        return lis_cmd.Error_Return('Tile ' + tilename + ' has less than 5 points, skipping ...')

    if not lis_filtering.Remove_Isolated_Points(PC_IN=pc_buff, COPY_ATTR=True, SEARCH_RADIUS=cfg.SEARCH_RADIUS_OUTLIER,
            MAX_POINTS=2, METHOD='label points', ATTR_NAME='noise', CLASSID=7,
            CLASSID_UNDEF=1):
        return lis_cmd.Error_Return('Failed to execute tool on tile ' + tilename)

    saga_api.SG_Get_Data_Manager().Delete()
    return True
# -----------------------------------------------------------------------------
# -----------------------------------------------------------------------------
# -----------------------------------------------------------------------------
# the main function
#
if __name__ == '__main__':

    # -----------------------------------------------------------------------------
    # start the logging
    starttime = lis_cmd.Start_Logging(False)

    # -----------------------------------------------------------------------------
    # load the shapefile with the tiling scheme used for processing
    tileshape = saga_api.SG_Get_Data_Manager().Add_Shapes(cfg.TILESHAPE_100)

    if not tileshape.is_Valid():
        lis_cmd.Error_Exit('Loading tileshape ' + cfg.TILESHAPE_100 + ' failed', starttime, True)

    lis_cmd.Message_Print('Number of tiles to process: {:d} \n\n'.format(tileshape.Get_Count()))

    # -----------------------------------------------------------------------------
    # iterate over the tiles and process each tile by calling the task function
    # with the processing pipeline; have a look at the "parallel tiles" version of
    # this script on how to process each tile on a separate CPU core in parallel
    for iTile in range(tileshape.Get_Count()):

        result = Task(iTile)

        if not result:
            lis_cmd.Error_Exit('Processing task failed', starttime)

    # -----------------------------------------------------------------------------
    # free memory and stop logging
    saga_api.SG_Get_Data_Manager().Delete()
    lis_cmd.Stop_Logging(starttime)
Step 31: Repeat Copy-To-Clipboard and Modifications
Repeat Copy-To-Clipboard and the modifications for all 13 processing steps.
After this, the final code will look like this (or similar):
import os
import project_config as cfg
from PySAGA import saga_api, lis_cmd
from PySAGA.tools import lis_virtual
from PySAGA.tools import lis_filtering
from PySAGA.tools import lis_classification
from PySAGA.tools import lis_segmentation
from PySAGA.tools import lis_analysis
from PySAGA.tools import lis_arithmetic
from PySAGA.tools import lis_tools
# -----------------------------------------------------------------------------
# the processing pipeline in a single function; the function is called for each
# tile to be processed
# note: this would not be absolutely necessary here, but makes it easier to
# parallelize a script later on
def Task(TaskNum=0):

    # -----------------------------------------------------------------------------
    # load the shapefile with the tiling scheme used for processing
    tileshape = saga_api.SG_Get_Data_Manager().Add_Shapes(cfg.TILESHAPE_100)

    if not tileshape.is_Valid():
        return lis_cmd.Error_Return('Loading tileshape ' + cfg.TILESHAPE_100 + ' failed')

    # -----------------------------------------------------------------------------
    # get tile to process
    tile = tileshape.Get_Shape(TaskNum)

    xmin = tile.Get_Extent().Get_XMin()
    ymin = tile.Get_Extent().Get_YMin()
    xmax = tile.Get_Extent().Get_XMax()
    ymax = tile.Get_Extent().Get_YMax()

    # tiles are referenced by their lower-left corner coordinate
    lis_cmd.Message_Print('\nProcessing tile {:.0f}_{:.0f} \n'.format(xmin, ymin))

    tilename = str(int(xmin)) + '_' + str(int(ymin))

    # -----------------------------------------------------------------------------
    # start processing pipeline
    # -----------------------------------------------------------------------------
    # query point cloud (with overlap) from virtual point cloud dataset
    pc_out_list = []

    if not lis_virtual.Get_Subset_from_Virtual_LASLAZ(PC_OUT=pc_out_list, FILENAME=cfg.LASVF, COPY_ATTR=True,
            CONSTRAIN_QUERY=False, FILE_COMPRESSION=True, ERROR_IF_EMPTY=True,
            THIN_OUT=True, SPACING=0.01, AOI_XRANGE=str(xmin)+';'+str(xmax),
            AOI_YRANGE=str(ymin)+';'+str(ymax), AOI_ADD_OVERLAP=True,
            OVERLAP=cfg.TILE_OVERLAP, SKIP_EMPTY_AOIS=True,
            METHOD_CLIP='lower and left boundary is inside'):
        return lis_cmd.Error_Return('Failed to execute tool on tile ' + tilename)

    if len(pc_out_list) < 1:
        return lis_cmd.Error_Return('Tile ' + tilename + ' contains no point cloud data, skipping!')

    # add the point cloud datasets to the data manager for easier memory management
    pc_buff = saga_api.SG_Get_Data_Manager().Add(pc_out_list[0]).asPointCloud()

    if pc_buff.Get_Count() < 5:
        return lis_cmd.Error_Return('Tile ' + tilename + ' has less than 5 points, skipping ...')

    # -----------------------------------------------------------------------------
    # filter noise points
    if not lis_filtering.Remove_Isolated_Points(PC_IN=pc_buff, COPY_ATTR=True, SEARCH_RADIUS=cfg.SEARCH_RADIUS_OUTLIER,
            MAX_POINTS=2, METHOD='label points', ATTR_NAME='noise', CLASSID=7,
            CLASSID_UNDEF=1):
        return lis_cmd.Error_Return('Failed to execute tool on tile ' + tilename)

    # -----------------------------------------------------------------------------
    # ground classification
    if not lis_classification.Ground_Classification(PC_IN=pc_buff, FIELD_CLASS='noise', COPY_ATTR=True,
            INIT_CELLSIZE=cfg.GRD_INITIAL_CELLSIZE, FINAL_CELLSIZE=cfg.GRD_FINAL_CELLSIZE,
            FILTER_SEEDS=False, TERRAIN_ANGLE=88, MIN_EDGE_LENGTH=cfg.GRD_FINAL_CELLSIZE,
            MAX_ANGLE=cfg.GRD_MAX_ANGLE, MAX_DISTANCE=cfg.GRD_MAX_DISTANCE,
            Z_TOLERANCE=0.05, ATTACH_DZ=True, EXCLUDE_CLASSES='7'):
        return lis_cmd.Error_Return('Failed to execute tool on tile ' + tilename)

    # -----------------------------------------------------------------------------
    # segmentation of planar objects for building roof detection
    if not lis_segmentation.Segmentation_by_Plane_Growing(PC_IN=pc_buff, FILTER_FIELD='grdclass', COPY_ATTR=True, LOW_RANGE='0;1',
            HIGH_RANGE='3;64', METHOD_NN='radius', RADIUS_NN=cfg.SEARCH_RADIUS,
            MIN_K=3, ROBUST=True, ROBUST_THRES=0.1, ROBUST_PERCENTAGE=50.0,
            METHOD_ROBUST_SEARCH_POINT='search point must be included in best fit plane',
            GROWING_RADIUS=cfg.SEARCH_RADIUS / 2.0, GROWING_PLANE_DIST=0.1,
            SEG_ITERATIONS=1, REGION_RADIUS=cfg.SEARCH_RADIUS, REGION_MINSIZE=20,
            REGION_MAXSIZE=100000, REGION_MAXANGLE=10, REGION_MAXOFFSET=0.2,
            REGION_RECALC_NORMAL=100000):
        return lis_cmd.Error_Return('Failed to execute tool on tile ' + tilename)

    # -----------------------------------------------------------------------------
    # building and vegetation classification
    if not lis_classification.Enhanced_Point_Cloud_Classification(PC_IN=pc_buff, FIELD_SEGMENTID='segmentid', FIELD_CLASSID_GROUND='grdclass',
            FIELD_DZ='dz', COPY_ATTR=True, EXCLUDE_CLASSES='7', METHOD_BUILDING='point cloud features',
            MIN_DZ_BUILDING=2.0, GROWING_RADIUS_BUILDING=cfg.SEARCH_RADIUS,
            FEATURE_RADIUS_BUILDING=cfg.SEARCH_RADIUS, MIN_NEIGHBOR_RATIO_BUILDING=60.0,
            SLOPE_ADAPTIVE_RATIO=False, MIN_POINTS_BUILDING=200, RADIUS_VEGETATION=cfg.SEARCH_RADIUS * 2.0,
            MIN_POINTS_VEGETATION=5, MIN_DZ_LOW_VEGETATION=0, MIN_DZ_MEDIUM_VEGETATION=0.3,
            MIN_DZ_HIGH_VEGETATION=2, OUTPUT_CALIBRATION=False):
        return lis_cmd.Error_Return('Failed to execute tool on tile ' + tilename)

    # -----------------------------------------------------------------------------
    # clean building facades
    if not lis_classification.Clean_Building_Facades(PC_IN=pc_buff, FIELD_FILTERATTR='classid', COPY_ATTR=True, ATTR_SUFFIX='facades',
            RADIUS_NN=cfg.SEARCH_RADIUS, BUILDING_CLASS='6;6', LOW_ATTR_RANGE='3;5',
            HIGH_ATTR_RANGE='7;64', TARGET_CLASS=1):
        return lis_cmd.Error_Return('Failed to execute tool on tile ' + tilename)

    # -----------------------------------------------------------------------------
    # clean building roofs
    if not lis_classification.Clean_Building_Roofs(PC_IN=pc_buff, FIELD_FILTERATTR='classid_facades', COPY_ATTR=True, ATTR_SUFFIX='roofs',
            DIMENSION='3D search', RADIUS_NN=2.5, USE_INNER_RADIUS=False, THRES_MAJORITY=25,
            TARGET_CLASS=1, FILTER_FACADE=False):
        return lis_cmd.Error_Return('Failed to execute tool on tile ' + tilename)

    # -----------------------------------------------------------------------------
    # power line classification
    if not lis_classification.Power_Line_Classification(PC_IN=pc_buff, FIELD_CLASS='classid_facades_roofs', FIELD_THRES='dz',
            COPY_ATTR=True, ATTR_SUFFIX='pl', EXCLUDE_CLASSES='7', THRES_RANGE='3;1000',
            SEARCH_RADIUS=cfg.SEARCH_RADIUS_PL, CLASSIFY_POLES=False):
        return lis_cmd.Error_Return('Failed to execute tool on tile ' + tilename)

    # -----------------------------------------------------------------------------
    # clustering of power line points for filtering
    if not lis_segmentation.Densitybased_Spatial_Clustering(PC_IN=pc_buff, FIELD_ATTR_1='X', FIELD_ATTR_2='Y', FIELD_ATTR_3='Z',
            FIELD_CONSTRAINT='classid_pl', COPY_ATTR=True, ATTR_SUFFIX='pl',
            CHOICE_DIM='3-dimensional', SEARCH_RADIUS=cfg.SEARCH_RADIUS * 2, MIN_POINTS=1,
            RANGE_CONSTRAINT='14;14', TOL_CONSTRAINT='-0.5;0.5'):
        return lis_cmd.Error_Return('Failed to execute tool on tile ' + tilename)

    # -----------------------------------------------------------------------------
    # calculate power line cluster lengths for filtering
    if not lis_analysis.Segment_Features(PC_IN=pc_buff, FIELD_SEGMENTID='cluster_pl', COPY_ATTR=True, LENGTH_2D=True):
        return lis_cmd.Error_Return('Failed to execute tool on tile ' + tilename)

    # -----------------------------------------------------------------------------
    # filter power line clusters by minimum length
    if not lis_filtering.Attribute_Filter(PC_IN=pc_buff, FIELD_ATTR_1='length2d', COPY_ATTR=True, METHOD='binary flag',
            TYPE_1='lower limit (min)', MIN_1=cfg.MIN_WIRE_LENGTH, ATTR_NAME='pl_filtered'):
        return lis_cmd.Error_Return('Failed to execute tool on tile ' + tilename)

    # -----------------------------------------------------------------------------
    # relabel power line classification
    if not lis_arithmetic.Point_Cloud_Attribute_Calculator(PC_IN=pc_buff,
            FORMULA='ifelse(eq([classid_pl],14),ifelse(eq([pl_filtered],1),14,1),[classid_pl])',
            NAME='classid_pl_filtered', TYPE='1 byte unsigned integer',
            INCLUDE_NODATA=False):
        return lis_cmd.Error_Return('Failed to execute tool on tile ' + tilename)

    # -----------------------------------------------------------------------------
    # pylon classification
    if not lis_classification.Pylon_Classification(PC_IN=pc_buff, FIELD_CLASS='classid_pl_filtered', FIELD_DZ='dz',
            COPY_ATTR=True, ATTR_SUFFIX='pylon', MIN_PYLON_HEIGHT=cfg.MIN_PYLON_HEIGHT,
            MAX_PYLON_DIAMETER=25.0, RADIUS_NN=3.0, MAX_DIFFERENCE=1.0, MIN_POINTS=200,
            SEED_CLASS=14, EXCLUDE_CLASSES='7', OUTPUT_CALIBRATION=False):
        return lis_cmd.Error_Return('Failed to execute tool on tile ' + tilename)

    # -----------------------------------------------------------------------------
    # free memory resources and return success
    saga_api.SG_Get_Data_Manager().Delete()
    return True
# -----------------------------------------------------------------------------
# -----------------------------------------------------------------------------
# -----------------------------------------------------------------------------
# the main function
#
if __name__ == '__main__':

    # -----------------------------------------------------------------------------
    # start the logging
    starttime = lis_cmd.Start_Logging(False)

    # -----------------------------------------------------------------------------
    # load the shapefile with the tiling scheme used for processing
    tileshape = saga_api.SG_Get_Data_Manager().Add_Shapes(cfg.TILESHAPE_100)

    if not tileshape.is_Valid():
        lis_cmd.Error_Exit('Loading tileshape ' + cfg.TILESHAPE_100 + ' failed', starttime, True)

    lis_cmd.Message_Print('Number of tiles to process: {:d} \n\n'.format(tileshape.Get_Count()))

    # -----------------------------------------------------------------------------
    # iterate over the tiles and process each tile by calling the task function
    # with the processing pipeline; have a look at the "parallel tiles" version of
    # this script on how to process each tile on a separate CPU core in parallel
    for iTile in range(tileshape.Get_Count()):

        result = Task(iTile)

        if not result:
            lis_cmd.Error_Exit('Processing task failed', starttime)

    # -----------------------------------------------------------------------------
    # free memory and stop logging
    saga_api.SG_Get_Data_Manager().Delete()
    lis_cmd.Stop_Logging(starttime)
Note that we have not yet saved anything to file. At the moment, we perform the whole processing workflow by updating pc_buff, and after the processing has finished we delete pc_buff. Thus, we have to add a save command.
Note that pc_buff has been extracted with an overlap (red dotted square). In the end, we do not want to save duplicate points; we only want to save the tile itself (red square) without the overlap. Thus, we need to get rid of this overlap before saving!
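Why the clip method matters can be sketched in plain Python. This is only an illustration of the "lower and left boundary is inside" rule, not the LIS Pro 3D implementation: with half-open tile intervals, a point lying exactly on a shared tile boundary belongs to exactly one tile, so no duplicates survive the clipping.

```python
# illustration: assign points to tiles using half-open intervals, i.e. the
# lower and left boundary belongs to the tile, the upper and right boundary
# does not -- mirroring METHOD_CLIP='lower and left boundary is inside'
def point_in_tile(x, y, xmin, ymin, tile_size=100.0):
    return xmin <= x < xmin + tile_size and ymin <= y < ymin + tile_size

# a point exactly on the shared boundary x = 100 ...
p = (100.0, 50.0)

# ... is outside the left tile [0, 100) ...
left = point_in_tile(*p, xmin=0.0, ymin=0.0)     # False

# ... but inside the right tile [100, 200), so it is kept exactly once
right = point_in_tile(*p, xmin=100.0, ymin=0.0)  # True
```

If both tiles used closed intervals instead, the boundary point would be saved twice.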
Step 32: Clip Overlap and Save
LIS Pro 3D provides the tool LIS Pro 3D > Tools > Clip Point Cloud with Extent. We will use this tool to clip with our initial tile extent (xmin, ymin, xmax, ymax, i.e. without overlap).
Add the following code to your script:
# -----------------------------------------------------------------------------
# remove tile overlap
pc_out = saga_api.SG_Get_Data_Manager().Add_PointCloud()

if not lis_tools.Clip_Point_Cloud_with_Extent(PC_IN=pc_buff, PC_OUT=pc_out,
        X_EXTENT=str(xmin)+';'+str(xmax), Y_EXTENT=str(ymin)+';'+str(ymax),
        METHOD_CLIP='lower and left boundary is inside'):
    return lis_cmd.Error_Return('Failed to execute tool on tile ' + tilename)

# -----------------------------------------------------------------------------
# save classified tile (without overlap)
if(pc_out.Get_Count() > 10):
    pc_out.Save(cfg.SPCCLASSDIR + os.sep + 'tile_' + tilename + '.sg-pts-z')
As we are clipping pc_buff, we define an output point cloud and (finally) save it to the SPCCLASSDIR output folder, in contrast to the other tool calls in this pipeline, which update pc_buff in place.
Each processed tile is saved with a unique tilename, based on its extent.
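The naming scheme can be tried in isolation. The coordinates below are made up for the example and are not from the tutorial data:

```python
# tiles are named after their lower-left corner coordinate, as in the script
xmin, ymin = 584500.0, 5214500.0   # example coordinates only

tilename = str(int(xmin)) + '_' + str(int(ymin))
filename = 'tile_' + tilename + '.sg-pts-z'
# -> 'tile_584500_5214500.sg-pts-z'
```

Because every tile has a unique lower-left corner, no two output files can collide.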
Step 33: Final Script
import os
import project_config as cfg
from PySAGA import saga_api, lis_cmd
from PySAGA.tools import lis_virtual
from PySAGA.tools import lis_filtering
from PySAGA.tools import lis_classification
from PySAGA.tools import lis_segmentation
from PySAGA.tools import lis_analysis
from PySAGA.tools import lis_arithmetic
from PySAGA.tools import lis_tools
# -----------------------------------------------------------------------------
# the processing pipeline in a single function; the function is called for each
# tile to be processed
# note: this would not be absolutely necessary here, but makes it easier to
# parallelize a script later on
def Task(TaskNum=0):

    # -----------------------------------------------------------------------------
    # load the shapefile with the tiling scheme used for processing
    tileshape = saga_api.SG_Get_Data_Manager().Add_Shapes(cfg.TILESHAPE_100)

    if not tileshape.is_Valid():
        return lis_cmd.Error_Return('Loading tileshape ' + cfg.TILESHAPE_100 + ' failed')

    # -----------------------------------------------------------------------------
    # get tile to process
    tile = tileshape.Get_Shape(TaskNum)

    xmin = tile.Get_Extent().Get_XMin()
    ymin = tile.Get_Extent().Get_YMin()
    xmax = tile.Get_Extent().Get_XMax()
    ymax = tile.Get_Extent().Get_YMax()

    # tiles are referenced by their lower-left corner coordinate
    lis_cmd.Message_Print('\nProcessing tile {:.0f}_{:.0f} \n'.format(xmin, ymin))

    tilename = str(int(xmin)) + '_' + str(int(ymin))

    # -----------------------------------------------------------------------------
    # start processing pipeline
    # -----------------------------------------------------------------------------
    # query point cloud (with overlap) from virtual point cloud dataset
    pc_out_list = []

    if not lis_virtual.Get_Subset_from_Virtual_LASLAZ(PC_OUT=pc_out_list, FILENAME=cfg.LASVF, COPY_ATTR=True,
            CONSTRAIN_QUERY=False, FILE_COMPRESSION=True, ERROR_IF_EMPTY=True,
            THIN_OUT=True, SPACING=0.01, AOI_XRANGE=str(xmin)+';'+str(xmax),
            AOI_YRANGE=str(ymin)+';'+str(ymax), AOI_ADD_OVERLAP=True,
            OVERLAP=cfg.TILE_OVERLAP, SKIP_EMPTY_AOIS=True,
            METHOD_CLIP='lower and left boundary is inside'):
        return lis_cmd.Error_Return('Failed to execute tool on tile ' + tilename)

    if len(pc_out_list) < 1:
        return lis_cmd.Error_Return('Tile ' + tilename + ' contains no point cloud data, skipping!')

    # add the point cloud datasets to the data manager for easier memory management
    pc_buff = saga_api.SG_Get_Data_Manager().Add(pc_out_list[0]).asPointCloud()

    if pc_buff.Get_Count() < 5:
        return lis_cmd.Error_Return('Tile ' + tilename + ' has less than 5 points, skipping ...')

    # -----------------------------------------------------------------------------
    # filter noise points
    if not lis_filtering.Remove_Isolated_Points(PC_IN=pc_buff, COPY_ATTR=True, SEARCH_RADIUS=cfg.SEARCH_RADIUS_OUTLIER,
            MAX_POINTS=2, METHOD='label points', ATTR_NAME='noise', CLASSID=7,
            CLASSID_UNDEF=1):
        return lis_cmd.Error_Return('Failed to execute tool on tile ' + tilename)

    # -----------------------------------------------------------------------------
    # ground classification
    if not lis_classification.Ground_Classification(PC_IN=pc_buff, FIELD_CLASS='noise', COPY_ATTR=True,
            INIT_CELLSIZE=cfg.GRD_INITIAL_CELLSIZE, FINAL_CELLSIZE=cfg.GRD_FINAL_CELLSIZE,
            FILTER_SEEDS=False, TERRAIN_ANGLE=88, MIN_EDGE_LENGTH=cfg.GRD_FINAL_CELLSIZE,
            MAX_ANGLE=cfg.GRD_MAX_ANGLE, MAX_DISTANCE=cfg.GRD_MAX_DISTANCE,
            Z_TOLERANCE=0.05, ATTACH_DZ=True, EXCLUDE_CLASSES='7'):
        return lis_cmd.Error_Return('Failed to execute tool on tile ' + tilename)

    # -----------------------------------------------------------------------------
    # segmentation of planar objects for building roof detection
    if not lis_segmentation.Segmentation_by_Plane_Growing(PC_IN=pc_buff, FILTER_FIELD='grdclass', COPY_ATTR=True, LOW_RANGE='0;1',
            HIGH_RANGE='3;64', METHOD_NN='radius', RADIUS_NN=cfg.SEARCH_RADIUS,
            MIN_K=3, ROBUST=True, ROBUST_THRES=0.1, ROBUST_PERCENTAGE=50.0,
            METHOD_ROBUST_SEARCH_POINT='search point must be included in best fit plane',
            GROWING_RADIUS=cfg.SEARCH_RADIUS / 2.0, GROWING_PLANE_DIST=0.1,
            SEG_ITERATIONS=1, REGION_RADIUS=cfg.SEARCH_RADIUS, REGION_MINSIZE=20,
            REGION_MAXSIZE=100000, REGION_MAXANGLE=10, REGION_MAXOFFSET=0.2,
            REGION_RECALC_NORMAL=100000):
        return lis_cmd.Error_Return('Failed to execute tool on tile ' + tilename)

    # -----------------------------------------------------------------------------
    # building and vegetation classification
    if not lis_classification.Enhanced_Point_Cloud_Classification(PC_IN=pc_buff, FIELD_SEGMENTID='segmentid', FIELD_CLASSID_GROUND='grdclass',
            FIELD_DZ='dz', COPY_ATTR=True, EXCLUDE_CLASSES='7', METHOD_BUILDING='point cloud features',
            MIN_DZ_BUILDING=2.0, GROWING_RADIUS_BUILDING=cfg.SEARCH_RADIUS,
            FEATURE_RADIUS_BUILDING=cfg.SEARCH_RADIUS, MIN_NEIGHBOR_RATIO_BUILDING=60.0,
            SLOPE_ADAPTIVE_RATIO=False, MIN_POINTS_BUILDING=200, RADIUS_VEGETATION=cfg.SEARCH_RADIUS * 2.0,
            MIN_POINTS_VEGETATION=5, MIN_DZ_LOW_VEGETATION=0, MIN_DZ_MEDIUM_VEGETATION=0.3,
            MIN_DZ_HIGH_VEGETATION=2, OUTPUT_CALIBRATION=False):
        return lis_cmd.Error_Return('Failed to execute tool on tile ' + tilename)

    # -----------------------------------------------------------------------------
    # clean building facades
    if not lis_classification.Clean_Building_Facades(PC_IN=pc_buff, FIELD_FILTERATTR='classid', COPY_ATTR=True, ATTR_SUFFIX='facades',
            RADIUS_NN=cfg.SEARCH_RADIUS, BUILDING_CLASS='6;6', LOW_ATTR_RANGE='3;5',
            HIGH_ATTR_RANGE='7;64', TARGET_CLASS=1):
        return lis_cmd.Error_Return('Failed to execute tool on tile ' + tilename)

    # -----------------------------------------------------------------------------
    # clean building roofs
    if not lis_classification.Clean_Building_Roofs(PC_IN=pc_buff, FIELD_FILTERATTR='classid_facades', COPY_ATTR=True, ATTR_SUFFIX='roofs',
            DIMENSION='3D search', RADIUS_NN=2.5, USE_INNER_RADIUS=False, THRES_MAJORITY=25,
            TARGET_CLASS=1, FILTER_FACADE=False):
        return lis_cmd.Error_Return('Failed to execute tool on tile ' + tilename)

    # -----------------------------------------------------------------------------
    # power line classification
    if not lis_classification.Power_Line_Classification(PC_IN=pc_buff, FIELD_CLASS='classid_facades_roofs', FIELD_THRES='dz',
            COPY_ATTR=True, ATTR_SUFFIX='pl', EXCLUDE_CLASSES='7', THRES_RANGE='3;1000',
            SEARCH_RADIUS=cfg.SEARCH_RADIUS_PL, CLASSIFY_POLES=False):
        return lis_cmd.Error_Return('Failed to execute tool on tile ' + tilename)

    # -----------------------------------------------------------------------------
    # clustering of power line points for filtering
    if not lis_segmentation.Densitybased_Spatial_Clustering(PC_IN=pc_buff, FIELD_ATTR_1='X', FIELD_ATTR_2='Y', FIELD_ATTR_3='Z',
            FIELD_CONSTRAINT='classid_pl', COPY_ATTR=True, ATTR_SUFFIX='pl',
            CHOICE_DIM='3-dimensional', SEARCH_RADIUS=cfg.SEARCH_RADIUS * 2, MIN_POINTS=1,
            RANGE_CONSTRAINT='14;14', TOL_CONSTRAINT='-0.5;0.5'):
        return lis_cmd.Error_Return('Failed to execute tool on tile ' + tilename)

    # -----------------------------------------------------------------------------
    # calculate power line cluster lengths for filtering
    if not lis_analysis.Segment_Features(PC_IN=pc_buff, FIELD_SEGMENTID='cluster_pl', COPY_ATTR=True, LENGTH_2D=True):
        return lis_cmd.Error_Return('Failed to execute tool on tile ' + tilename)

    # -----------------------------------------------------------------------------
    # filter power line clusters by minimum length
    if not lis_filtering.Attribute_Filter(PC_IN=pc_buff, FIELD_ATTR_1='length2d', COPY_ATTR=True, METHOD='binary flag',
            TYPE_1='lower limit (min)', MIN_1=cfg.MIN_WIRE_LENGTH, ATTR_NAME='pl_filtered'):
        return lis_cmd.Error_Return('Failed to execute tool on tile ' + tilename)

    # -----------------------------------------------------------------------------
    # relabel power line classification
    if not lis_arithmetic.Point_Cloud_Attribute_Calculator(PC_IN=pc_buff,
            FORMULA='ifelse(eq([classid_pl],14),ifelse(eq([pl_filtered],1),14,1),[classid_pl])',
            NAME='classid_pl_filtered', TYPE='1 byte unsigned integer',
            INCLUDE_NODATA=False):
        return lis_cmd.Error_Return('Failed to execute tool on tile ' + tilename)

    # -----------------------------------------------------------------------------
    # pylon classification
    if not lis_classification.Pylon_Classification(PC_IN=pc_buff, FIELD_CLASS='classid_pl_filtered', FIELD_DZ='dz',
            COPY_ATTR=True, ATTR_SUFFIX='pylon', MIN_PYLON_HEIGHT=cfg.MIN_PYLON_HEIGHT,
            MAX_PYLON_DIAMETER=25.0, RADIUS_NN=3.0, MAX_DIFFERENCE=1.0, MIN_POINTS=200,
            SEED_CLASS=14, EXCLUDE_CLASSES='7', OUTPUT_CALIBRATION=False):
        return lis_cmd.Error_Return('Failed to execute tool on tile ' + tilename)

    # -----------------------------------------------------------------------------
    # remove tile overlap
    pc_out = saga_api.SG_Get_Data_Manager().Add_PointCloud()

    if not lis_tools.Clip_Point_Cloud_with_Extent(PC_IN=pc_buff, PC_OUT=pc_out,
            X_EXTENT=str(xmin)+';'+str(xmax), Y_EXTENT=str(ymin)+';'+str(ymax),
            METHOD_CLIP='lower and left boundary is inside'):
        return lis_cmd.Error_Return('Failed to execute tool on tile ' + tilename)

    # -----------------------------------------------------------------------------
    # save classified tile (without overlap)
    if(pc_out.Get_Count() > 10):
        pc_out.Save(cfg.SPCCLASSDIR + os.sep + 'tile_' + tilename + '.sg-pts-z')

    # -----------------------------------------------------------------------------
    # free memory resources and return success
    saga_api.SG_Get_Data_Manager().Delete()
    return True
# -----------------------------------------------------------------------------
# -----------------------------------------------------------------------------
# -----------------------------------------------------------------------------
# the main function
#
if __name__ == '__main__':

    # -----------------------------------------------------------------------------
    # start the logging
    starttime = lis_cmd.Start_Logging(False)

    # -----------------------------------------------------------------------------
    # load the shapefile with the tiling scheme used for processing
    tileshape = saga_api.SG_Get_Data_Manager().Add_Shapes(cfg.TILESHAPE_100)

    if not tileshape.is_Valid():
        lis_cmd.Error_Exit('Loading tileshape ' + cfg.TILESHAPE_100 + ' failed', starttime, True)

    lis_cmd.Message_Print('Number of tiles to process: {:d} \n\n'.format(tileshape.Get_Count()))

    # -----------------------------------------------------------------------------
    # iterate over the tiles and process each tile by calling the task function
    # with the processing pipeline; have a look at the "parallel tiles" version of
    # this script on how to process each tile on a separate CPU core in parallel
    for iTile in range(tileshape.Get_Count()):

        result = Task(iTile)

        if not result:
            lis_cmd.Error_Exit('Processing task failed', starttime)

    # -----------------------------------------------------------------------------
    # free memory and stop logging
    saga_api.SG_Get_Data_Manager().Delete()
    lis_cmd.Stop_Logging(starttime)
Step 34: Run and View Results
Now we are done with the implementation of the script. We can run it by selecting the script and simply clicking on the play/run button in the upper-right corner of Visual Studio Code.
In the TERMINAL Window at the bottom of Visual Studio Code we can monitor the whole processing and see which tools are currently running. In the end we see if everything has succeeded.
After successful processing you will find the output files in your output folder (processing/spc_class). For each processed tile you find one output file.
If you load all these files into the GUI and add them to the same map, it will look like this.
The different color stretches per tile result from the individual height distribution of each tile. If we change the coloring for every tile to Classified and use the classid_pylon attribute, you will see that everything fits together.
Step 35: Create Virtual point cloud of resulting tiles
In order to have seamless access to the data again, we extend our script to create a virtual point cloud of the resulting tiles.
To do so, we first have to define the path where our virtual point cloud file (*.spcvf) will be located. We use the same location as our output files and add this definition to our project_config.py:
SPCVF_CLASS = SPCCLASSDIR + os.sep + 'tiles.spcvf'
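The definition simply appends the file name to the output folder. A quick standalone check, with SPCCLASSDIR set to a placeholder value here since the real value lives in your project_config.py:

```python
import os

# placeholder for the real SPCCLASSDIR from project_config.py, which
# points to your processing/spc_class folder
SPCCLASSDIR = os.path.join('processing', 'spc_class')

SPCVF_CLASS = SPCCLASSDIR + os.sep + 'tiles.spcvf'
# e.g. 'processing/spc_class/tiles.spcvf' on Linux/macOS
```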
After that, we add the following code near the end of our script: after the FOR-loop and before "stop logging":
# -----------------------------------------------------------------------------
# create virtual point cloud dataset of classified tiles
inputfiles = lis_cmd.Create_File_List(cfg.SPCCLASSDIR, '.sg-pts;.sg-pts-z', cfg.SPCCLASSDIR + os.sep + 'spcvf_input_file_list.txt')

if not lis_virtual.Create_Virtual_Point_Cloud_Dataset(INPUT_FILE_LIST=inputfiles, FILENAME=cfg.SPCVF_CLASS,
        METHOD_PATHS=1, USE_HEADER=True):
    lis_cmd.Error_Exit('Failed to execute tool', starttime, True)
This code block is similar to what we have already implemented in the previous tutorial. We create a file list of all 10 available output files using the function Create_File_List() and create a virtual point cloud from it.
As we have *.sg-pts-z files as output, we do not use the Create Virtual LAS/LAZ Dataset tool but the Create Virtual Point Cloud Dataset tool here!
Step 36: Updated Script
import os
import project_config as cfg
from PySAGA import saga_api, lis_cmd
from PySAGA.tools import lis_virtual
from PySAGA.tools import lis_filtering
from PySAGA.tools import lis_classification
from PySAGA.tools import lis_segmentation
from PySAGA.tools import lis_analysis
from PySAGA.tools import lis_arithmetic
from PySAGA.tools import lis_tools
# -----------------------------------------------------------------------------
# the processing pipeline in a single function; the function is called for each
# tile to be processed
# note: this would not be absolutely necessary here, but makes it easier to
# parallelize a script later on
def Task(TaskNum=0):
    # -----------------------------------------------------------------------------
    # load the shapefile with the tiling scheme used for processing
    tileshape = saga_api.SG_Get_Data_Manager().Add_Shapes(cfg.TILESHAPE_100)

    if not tileshape.is_Valid():
        return lis_cmd.Error_Return('Loading tileshape ' + cfg.TILESHAPE_100 + ' failed')

    # -----------------------------------------------------------------------------
    # get tile to process
    tile = tileshape.Get_Shape(TaskNum)

    xmin = tile.Get_Extent().Get_XMin()
    ymin = tile.Get_Extent().Get_YMin()
    xmax = tile.Get_Extent().Get_XMax()
    ymax = tile.Get_Extent().Get_YMax()

    # tiles are referenced by their lower-left corner coordinate
    lis_cmd.Message_Print('\nProcessing tile {:.0f}_{:.0f} \n'.format(xmin, ymin))

    tilename = str(int(xmin)) + '_' + str(int(ymin))

    # -----------------------------------------------------------------------------
    # start processing pipeline
    # -----------------------------------------------------------------------------
    # query point cloud (with overlap) from virtual point cloud dataset
    pc_out_list = []

    if not lis_virtual.Get_Subset_from_Virtual_LASLAZ(PC_OUT=pc_out_list, FILENAME=cfg.LASVF, COPY_ATTR=True,
                                                      CONSTRAIN_QUERY=False, FILE_COMPRESSION=True, ERROR_IF_EMPTY=True,
                                                      THIN_OUT=True, SPACING=0.01, AOI_XRANGE=str(xmin)+';'+str(xmax),
                                                      AOI_YRANGE=str(ymin)+';'+str(ymax), AOI_ADD_OVERLAP=True,
                                                      OVERLAP=cfg.TILE_OVERLAP, SKIP_EMPTY_AOIS=True,
                                                      METHOD_CLIP='lower and left boundary is inside'):
        return lis_cmd.Error_Return('Failed to execute tool on tile ' + tilename)

    if len(pc_out_list) < 1:
        return lis_cmd.Error_Return('Tile ' + tilename + ' contains no point cloud data, skipping!')

    # add the point cloud datasets to the data manager for easier memory management
    pc_buff = saga_api.SG_Get_Data_Manager().Add(pc_out_list[0]).asPointCloud()

    if pc_buff.Get_Count() < 5:
        return lis_cmd.Error_Return('Tile ' + tilename + ' has less than 5 points, skipping ...')

    # -----------------------------------------------------------------------------
    # filter noise points
    if not lis_filtering.Remove_Isolated_Points(PC_IN=pc_buff, COPY_ATTR=True, SEARCH_RADIUS=cfg.SEARCH_RADIUS_OUTLIER,
                                                MAX_POINTS=2, METHOD='label points', ATTR_NAME='noise', CLASSID=7,
                                                CLASSID_UNDEF=1):
        return lis_cmd.Error_Return('Failed to execute tool on tile ' + tilename)

    # -----------------------------------------------------------------------------
    # ground classification
    if not lis_classification.Ground_Classification(PC_IN=pc_buff, FIELD_CLASS='noise', COPY_ATTR=True,
                                                    INIT_CELLSIZE=cfg.GRD_INITIAL_CELLSIZE, FINAL_CELLSIZE=cfg.GRD_FINAL_CELLSIZE,
                                                    FILTER_SEEDS=False, TERRAIN_ANGLE=88, MIN_EDGE_LENGTH=cfg.GRD_FINAL_CELLSIZE,
                                                    MAX_ANGLE=cfg.GRD_MAX_ANGLE, MAX_DISTANCE=cfg.GRD_MAX_DISTANCE,
                                                    Z_TOLERANCE=0.05, ATTACH_DZ=True, EXCLUDE_CLASSES='7'):
        return lis_cmd.Error_Return('Failed to execute tool on tile ' + tilename)

    # -----------------------------------------------------------------------------
    # segmentation of planar objects for building roof detection
    if not lis_segmentation.Segmentation_by_Plane_Growing(PC_IN=pc_buff, FILTER_FIELD='grdclass', COPY_ATTR=True, LOW_RANGE='0;1',
                                                          HIGH_RANGE='3;64', METHOD_NN='radius', RADIUS_NN=cfg.SEARCH_RADIUS,
                                                          MIN_K=3, ROBUST=True, ROBUST_THRES=0.1, ROBUST_PERCENTAGE=50.0,
                                                          METHOD_ROBUST_SEARCH_POINT='search point must be included in best fit plane',
                                                          GROWING_RADIUS=cfg.SEARCH_RADIUS / 2.0, GROWING_PLANE_DIST=0.1,
                                                          SEG_ITERATIONS=1, REGION_RADIUS=cfg.SEARCH_RADIUS, REGION_MINSIZE=20,
                                                          REGION_MAXSIZE=100000, REGION_MAXANGLE=10, REGION_MAXOFFSET=0.2,
                                                          REGION_RECALC_NORMAL=100000):
        return lis_cmd.Error_Return('Failed to execute tool on tile ' + tilename)

    # -----------------------------------------------------------------------------
    # building and vegetation classification
    if not lis_classification.Enhanced_Point_Cloud_Classification(PC_IN=pc_buff, FIELD_SEGMENTID='segmentid', FIELD_CLASSID_GROUND='grdclass',
                                                                  FIELD_DZ='dz', COPY_ATTR=True, EXCLUDE_CLASSES='7', METHOD_BUILDING='point cloud features',
                                                                  MIN_DZ_BUILDING=2.0, GROWING_RADIUS_BUILDING=cfg.SEARCH_RADIUS,
                                                                  FEATURE_RADIUS_BUILDING=cfg.SEARCH_RADIUS, MIN_NEIGHBOR_RATIO_BUILDING=60.0,
                                                                  SLOPE_ADAPTIVE_RATIO=False, MIN_POINTS_BUILDING=200, RADIUS_VEGETATION=cfg.SEARCH_RADIUS * 2.0,
                                                                  MIN_POINTS_VEGETATION=5, MIN_DZ_LOW_VEGETATION=0, MIN_DZ_MEDIUM_VEGETATION=0.3,
                                                                  MIN_DZ_HIGH_VEGETATION=2, OUTPUT_CALIBRATION=False):
        return lis_cmd.Error_Return('Failed to execute tool on tile ' + tilename)

    # -----------------------------------------------------------------------------
    # clean building facades
    if not lis_classification.Clean_Building_Facades(PC_IN=pc_buff, FIELD_FILTERATTR='classid', COPY_ATTR=True, ATTR_SUFFIX='facades',
                                                     RADIUS_NN=cfg.SEARCH_RADIUS, BUILDING_CLASS='6;6', LOW_ATTR_RANGE='3;5',
                                                     HIGH_ATTR_RANGE='7;64', TARGET_CLASS=1):
        return lis_cmd.Error_Return('Failed to execute tool on tile ' + tilename)

    # -----------------------------------------------------------------------------
    # clean building roofs
    if not lis_classification.Clean_Building_Roofs(PC_IN=pc_buff, FIELD_FILTERATTR='classid_facades', COPY_ATTR=True, ATTR_SUFFIX='roofs',
                                                   DIMENSION='3D search', RADIUS_NN=2.5, USE_INNER_RADIUS=False, THRES_MAJORITY=25,
                                                   TARGET_CLASS=1, FILTER_FACADE=False):
        return lis_cmd.Error_Return('Failed to execute tool on tile ' + tilename)

    # -----------------------------------------------------------------------------
    # power line classification
    if not lis_classification.Power_Line_Classification(PC_IN=pc_buff, FIELD_CLASS='classid_facades_roofs', FIELD_THRES='dz',
                                                        COPY_ATTR=True, ATTR_SUFFIX='pl', EXCLUDE_CLASSES='7', THRES_RANGE='3;1000',
                                                        SEARCH_RADIUS=cfg.SEARCH_RADIUS_PL, CLASSIFY_POLES=False):
        return lis_cmd.Error_Return('Failed to execute tool on tile ' + tilename)

    # -----------------------------------------------------------------------------
    # clustering of power line points for filtering
    if not lis_segmentation.Densitybased_Spatial_Clustering(PC_IN=pc_buff, FIELD_ATTR_1='X', FIELD_ATTR_2='Y', FIELD_ATTR_3='Z',
                                                            FIELD_CONSTRAINT='classid_pl', COPY_ATTR=True, ATTR_SUFFIX='pl',
                                                            CHOICE_DIM='3-dimensional', SEARCH_RADIUS=cfg.SEARCH_RADIUS * 2, MIN_POINTS=1,
                                                            RANGE_CONSTRAINT='14;14', TOL_CONSTRAINT='-0.5;0.5'):
        return lis_cmd.Error_Return('Failed to execute tool on tile ' + tilename)

    # -----------------------------------------------------------------------------
    # calculate power line cluster lengths for filtering
    if not lis_analysis.Segment_Features(PC_IN=pc_buff, FIELD_SEGMENTID='cluster_pl', COPY_ATTR=True, LENGTH_2D=True):
        return lis_cmd.Error_Return('Failed to execute tool on tile ' + tilename)

    # -----------------------------------------------------------------------------
    # filter power line clusters by minimum length
    if not lis_filtering.Attribute_Filter(PC_IN=pc_buff, FIELD_ATTR_1='length2d', COPY_ATTR=True, METHOD='binary flag',
                                          TYPE_1='lower limit (min)', MIN_1=cfg.MIN_WIRE_LENGTH, ATTR_NAME='pl_filtered'):
        return lis_cmd.Error_Return('Failed to execute tool on tile ' + tilename)

    # -----------------------------------------------------------------------------
    # relabel power line classification
    if not lis_arithmetic.Point_Cloud_Attribute_Calculator(PC_IN=pc_buff,
                                                           FORMULA='ifelse(eq([classid_pl],14),ifelse(eq([pl_filtered],1),14,1),[classid_pl])',
                                                           NAME='classid_pl_filtered', TYPE='1 byte unsigned integer',
                                                           INCLUDE_NODATA=False):
        return lis_cmd.Error_Return('Failed to execute tool on tile ' + tilename)

    # -----------------------------------------------------------------------------
    # pylon classification
    if not lis_classification.Pylon_Classification(PC_IN=pc_buff, FIELD_CLASS='classid_pl_filtered', FIELD_DZ='dz',
                                                   COPY_ATTR=True, ATTR_SUFFIX='pylon', MIN_PYLON_HEIGHT=cfg.MIN_PYLON_HEIGHT,
                                                   MAX_PYLON_DIAMETER=25.0, RADIUS_NN=3.0, MAX_DIFFERENCE=1.0, MIN_POINTS=200,
                                                   SEED_CLASS=14, EXCLUDE_CLASSES='7', OUTPUT_CALIBRATION=False):
        return lis_cmd.Error_Return('Failed to execute tool on tile ' + tilename)

    # -----------------------------------------------------------------------------
    # remove tile overlap
    pc_out = saga_api.SG_Get_Data_Manager().Add_PointCloud()

    if not lis_tools.Clip_Point_Cloud_with_Extent(PC_IN=pc_buff, PC_OUT=pc_out,
                                                  X_EXTENT=str(xmin)+';'+str(xmax), Y_EXTENT=str(ymin)+';'+str(ymax),
                                                  METHOD_CLIP='lower and left boundary is inside'):
        return lis_cmd.Error_Return('Failed to execute tool on tile ' + tilename)

    # -----------------------------------------------------------------------------
    # save classified tile (without overlap)
    pc_out.Save(cfg.SPCCLASSDIR + os.sep + 'tile_' + tilename + '.sg-pts-z')

    # -----------------------------------------------------------------------------
    # free memory resources and return success
    saga_api.SG_Get_Data_Manager().Delete()
    return True
# -----------------------------------------------------------------------------
# -----------------------------------------------------------------------------
# -----------------------------------------------------------------------------
# the main function
#
if __name__ == '__main__':
    # -----------------------------------------------------------------------------
    # start the logging
    starttime = lis_cmd.Start_Logging(False)

    # -----------------------------------------------------------------------------
    # load the shapefile with the tiling scheme used for processing
    tileshape = saga_api.SG_Get_Data_Manager().Add_Shapes(cfg.TILESHAPE_100)

    if not tileshape.is_Valid():
        lis_cmd.Error_Exit('Loading tileshape ' + cfg.TILESHAPE_100 + ' failed', starttime, True)

    lis_cmd.Message_Print('Number of tiles to process: {:d} \n\n'.format(tileshape.Get_Count()))

    # -----------------------------------------------------------------------------
    # iterate over the tiles and process each tile by calling the task function
    # with the processing pipeline; have a look at the "parallel tiles" version of
    # this script on how to process each tile on a separate CPU core in parallel
    for iTile in range(tileshape.Get_Count()):
        result = Task(iTile)

        if not result:
            lis_cmd.Error_Exit('Processing task failed', starttime)

    # -----------------------------------------------------------------------------
    # processing completed, now ...
    # create virtual point cloud dataset of classified tiles for subsequent processing steps (here LAS/LAZ export in script #3)
    inputfiles = lis_cmd.Create_File_List(cfg.SPCCLASSDIR, '.sg-pts;.sg-pts-z', cfg.SPCCLASSDIR + os.sep + 'spcvf_input_file_list.txt')

    if not lis_virtual.Create_Virtual_Point_Cloud_Dataset(INPUT_FILE_LIST=inputfiles, FILENAME=cfg.SPCVF_CLASS,
                                                          METHOD_PATHS=1, USE_HEADER=True):
        lis_cmd.Error_Exit('Failed to execute tool', starttime, True)

    # -----------------------------------------------------------------------------
    # free memory and stop logging
    saga_api.SG_Get_Data_Manager().Delete()
    lis_cmd.Stop_Logging(starttime)
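The tile naming convention the script uses when saving results (the lower-left corner coordinates joined by an underscore, prefixed with 'tile_') can be checked in isolation. The coordinates below are made-up example values:

```python
def make_tilename(xmin, ymin):
    # tiles are referenced by their lower-left corner coordinate
    return str(int(xmin)) + '_' + str(int(ymin))

# a tile with lower-left corner x=512500, y=5403000 is saved as
# 'tile_512500_5403000.sg-pts-z'
filename = 'tile_' + make_tilename(512500.0, 5403000.0) + '.sg-pts-z'
```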
Check your indentation! The tool call that creates the virtual point cloud must be outside the FOR-loop.
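If you are unsure why the indentation matters here, this minimal Python sketch shows the difference: the indented statement runs once per loop iteration, while the unindented one runs exactly once, after the loop has finished.

```python
log = []

for i in range(3):
    log.append('processed tile %d' % i)    # indented: inside the loop

log.append('created virtual point cloud')  # unindented: after the loop
```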
Step 38: View Virtual Point Cloud Result in the GUI
After successful execution, the output folder also contains the *.spcvf file, which we can import in the GUI.
In the GUI, use the tool LIS Pro 3D > Virtual > Get Subset From Virtual Pointcloud [Interactive] and provide the *.spcvf file.
Click Execute!
This is an interactive tool: it expects you to drag a box in the map view with the action tool (the black arrow in the top menu bar) in order to define your query.
Hold the left mouse button and drag a box that covers all available datasets! After you release the button, the query is processed.
A new file appears in the Data tab and you can load it to the map.
You can see that you have one single point cloud again.
Step 39: Conclusion and Outlook
We designed a 13-step processing workflow in the GUI of LIS Pro 3D and adapted it for automation with Python. The way the Python scripts are implemented now allows us to apply the workflow to larger datasets with 1,000 or even 10,000 tiles.
In the next tutorial we will have a look at how we can make our scripting architecture more efficient and target-oriented:
- We will see how to run our two main scripts 01_data_preparation.py and 02_data_processing.py together.
- We will see how to write all the messages printed to the TERMINAL during processing into std.log and err.log files. This will make it easier to find error sources and to evaluate result completeness after processing has finished.
- We will see how to parallelize the whole processing pipeline.
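As a preview, the design choice of wrapping the pipeline in Task(TaskNum) pays off for parallelization: the per-tile calls can simply be handed to a worker pool. The sketch below uses a thread pool and a placeholder Task only to illustrate the pattern; the actual parallel-tiles script will run each tile on a separate CPU core.

```python
from concurrent.futures import ThreadPoolExecutor

def Task(tile_index):
    # placeholder for the per-tile pipeline of this tutorial
    return 'tile %d done' % tile_index

num_tiles = 4  # in the real script this would be tileshape.Get_Count()
with ThreadPoolExecutor(max_workers=2) as pool:
    results = list(pool.map(Task, range(num_tiles)))  # map preserves tile order
```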