Hi sat-bundleadjust team,
I ran this software on a small raster (52 MB, 6685x8237 pixels) and it completed successfully, but I noticed that peak memory usage during processing reached 32 GB.
I then ran it on a much bigger raster (600 MB, 40000x51200 pixels) and got the following error:
Error during the allocation.
The source code shows this error comes from this line in the SIFT code. My questions are:
- I am new to this project, but it looks like the pipeline reads the whole stereo pair into memory and runs SIFT on the full images, which consumes a large amount of memory. Is that correct?
- If so, is there a way to run this pipeline in a window-based manner? That is, split a raster into subtiles, run SIFT (collect and match keypoints) on each subtile, then gather all valid keypoints and compute the adjusted RPCs. I believe the memory footprint would be much smaller than reading the whole raster into memory.
- My environment has 16 CPUs and 64 GB of memory.
- Here is my config.json. I used the default settings the README indicates, with no other customization:
```json
{
    "geotiff_dir": "/home/ubuntu/tmp/images",
    "rpc_dir": "/home/ubuntu/tmp/rpcs",
    "rpc_src": "geotiff",
    "output_dir": "/home/ubuntu/tmp/output"
}
```
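To illustrate the tiled approach I have in mind, here is a minimal sketch. It is not part of sat-bundleadjust; `detect_fn` is a hypothetical stand-in for running SIFT on one tile, and the tile/overlap parameters are illustrative. The point is that only one subtile is resident in memory at a time, and per-tile keypoint coordinates are shifted back into the full-image frame before collection:

```python
import numpy as np

def iter_tiles(height, width, tile=1024, overlap=64):
    """Yield (row0, col0, window) covering the raster, with overlap so
    keypoints near tile borders are not lost."""
    step = tile - overlap
    for r in range(0, height, step):
        for c in range(0, width, step):
            r1 = min(r + tile, height)
            c1 = min(c + tile, width)
            yield r, c, (slice(r, r1), slice(c, c1))

def detect_tiled(image, detect_fn, tile=1024, overlap=64):
    """Run a per-tile keypoint detector and return an (N, 2) array of
    (row, col) keypoints in full-image coordinates.

    detect_fn(tile_array) -> iterable of (row, col) pairs, a stand-in
    for SIFT detection on a single subtile.
    """
    all_kps = []
    for r0, c0, win in iter_tiles(*image.shape[:2], tile, overlap):
        kps = detect_fn(image[win])  # coordinates local to the tile
        if len(kps):
            # shift local tile coordinates to the global image frame
            all_kps.append(np.asarray(kps, dtype=float) + [r0, c0])
    return np.vstack(all_kps) if all_kps else np.empty((0, 2))
```

In the real pipeline, `image[win]` would instead be a windowed read from the GeoTIFF (e.g. via rasterio's windowed reading), so the full raster is never loaded at once.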
Thank you.