I have about 300 hazard layers that I want to batch-run for a risk analysis. All my input files are shapefiles. My model at the moment looks like this.
It depends a little on whether you want a single result or 300 separate results. If it’s the latter, you can use the riskscape model batch command (tip: use --help to inspect what you can do with the CLI).
I think in your case you’ll want to try something like:
riskscape model batch Taranaki-exposure --vary-parameter input.hazards.layer --vary-input all.csv
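For what it’s worth, the CSV you pass to --vary-input just lists the values to substitute for the varied parameter. I’m assuming here a header row naming the parameter and then one row per hazard shapefile (the file names below are made-up placeholders, so swap in your own):

input.hazards.layer
hazards/hazard-001.shp
hazards/hazard-002.shp
hazards/hazard-003.shp
...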
I would prefer to have a single output of one .csv and one .shp file containing the risk analysis results for each hazard layer file, i.e. one .csv and one .shp file with 300 rows of data.
I also tried the code provided, but I am still getting the same error.
To start with, let’s make it work with a single hazard layer and go from there. Change the hazard layer bookmark (or add a new one) to be a single shapefile and make sure that’s working.
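In case it helps, a bookmark for a single shapefile in your project file (e.g. project.ini) would look roughly like the sketch below; the bookmark name and file path are placeholders I’ve made up, so adjust them to match your data:

[bookmark single-hazard]
location = data/hazards/hazard-001.shp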
Once you’ve done that, if I’ve understood correctly, you want to run the model once against your 300 hazard layers to produce a single output that incorporates losses for the exposure layer against all the hazard layers and lists the results in a single CSV and shapefile. Is this a probabilistic model? If not, can you give me an idea of what you’re trying to achieve? I.e. do you want to compare the results of 300 different hazard scenarios? Is this a Monte Carlo simulation of 300 different hazard events? Are you hoping to get AAL statistics from the model, or just basic statistics like means or percentiles?
Regardless, if you want a single output that covers running the model across all 300 hazard layers, then the approach you need to take will be similar to how we do probabilistic modelling.
We’ve not built this type of modelling into the wizard (yet), so the process is a bit more involved and goes roughly like this:
Get the wizard to produce a single-event pipeline for you to work on as a starting point
I’ve been talking to one of the other developers here and it looks like we don’t support loading shapefiles in the manner described in the example. We’ve got it on our TODO list, but we’re not sure when it’s going to be released.
In the meantime, you could try converting the shapefiles to CSV, but that’s probably not going to work if the geometry within them is polygons. Your best bet for now is probably to use batch mode to run the model against each hazard layer individually and then collate the results yourself. If you’re using Linux or Mac to do the modelling, we can help you with some command-line magic to concatenate the resulting CSVs from batch mode.
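For example, assuming each batch run writes a CSV with the same header row into one output directory, something along these lines would stitch them into a single file (the file names are placeholders, so adjust them to match what batch mode actually produces for you):

# take the header row from the first results file
head -n 1 output/results-001.csv > combined.csv
# append the data rows (everything after the header) from every results file
for f in output/results-*.csv; do
  tail -n +2 "$f" >> combined.csv
done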