classical_psha_based_risk demo is broken
| Affects | Status | Importance | Assigned to | Milestone |
|---|---|---|---|---|
| OpenQuake (deprecated) | Fix Released | Critical | Lars Butler | 0.5.1 |
Bug Description
The root cause of the breakage appears to be that the hazard curve's
'statistic_type' is NULL, but the risk code in question
(ClassicalPSHAB…) expects a curve whose statistic_type is 'mean'.
Maybe the mean hazard curve is not being written to the database?
This is from openquake/
hc = models.
I checked the geohash and it seems to be correct.
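For reference, the lookup being discussed presumably resembles the sketch below. This is a hedged reconstruction, not the actual code: the import path and the HazardCurve/HazardCurveData model and field names are assumptions inferred from the hzrdr.hazard_curve and hzrdr.hazard_curve_data tables queried further down.

    # Hypothetical sketch of the mean-curve lookup (model/field names are
    # assumptions based on the hzrdr tables; the real openquake code differs).
    from django.core.exceptions import ObjectDoesNotExist

    from openquake.db import models  # import path assumed


    def get_mean_curve_data(output_id, location_wkt):
        """Return the mean hazard curve PoEs for one site, or None if absent."""
        try:
            hc = models.HazardCurve.objects.get(
                output=output_id, statistic_type='mean')
            return models.HazardCurveData.objects.get(
                hazard_curve=hc, location=location_wkt)
        except ObjectDoesNotExist:
            # If no curve was ever written with statistic_type='mean', this
            # branch is hit and the risk calculation has nothing to work with.
            return None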
    openquake=# SELECT * FROM hzrdr.hazard_curve WHERE output_id=509
     id  | output_id | end_branch_label | statistic_type | quantile
    -----+-----------+------------------+----------------+----------
     215 |       509 | 0                |                |

    openquake=# SELECT * FROM hzrdr.hazard_
      id  | hazard_curve_id |      poes       | location
    ------+-----------------+-----------------+----------
     9369 |             215 | {1,0.9999999999 |
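A quick way to confirm the mean curve really is absent (a hypothetical follow-up query, not part of the session above):

    -- Hypothetical check: if the bug is as described, this returns no rows,
    -- because no curve was ever stored with statistic_type = 'mean'.
    SELECT id, output_id, statistic_type
      FROM hzrdr.hazard_curve
     WHERE statistic_type = 'mean';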
Changed in openquake:
status: New → Confirmed
importance: Undecided → High
tags: added: broken-test defect

Changed in openquake:
status: Confirmed → In Progress
importance: High → Critical
milestone: none → 0.5.0
assignee: nobody → Lars Butler (lars-butler)
tags: added: enduser-visible faq

Changed in openquake:
milestone: 0.5.0 → 0.5.1

Changed in openquake:
status: Fix Committed → Fix Released
Here's the problem:
A few weeks ago, mean hazard curves were always being computed (even if the COMPUTE_MEAN_HAZARD_CURVE parameter was set to false). We fixed that bug, so the redundant computations no longer happen.
That fix is what caused this bug to surface.
When the Classical Risk calculator computes loss ratio curves for a given asset and location, a hazard curve is required for the calculation. The hazard curve used should be the mean of the hazard curves computed for the site of interest over the N logic tree samples. The loss ratio curve calculation function therefore queries the database for a mean hazard curve at the given site.
The classical_psha_based_risk demo specifies the COMPUTE_MEAN_HAZARD_CURVE parameter as false. So, when the database is queried for the mean hazard curve (at any site), the curve doesn't exist and the calculation blows up.
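For illustration, the relevant part of the demo's job configuration presumably looks roughly like this. This is an abridged, hypothetical excerpt: the section names and CALCULATION_MODE line are assumptions about the layout; only COMPUTE_MEAN_HAZARD_CURVE itself is taken from the demo.

    ; hypothetical excerpt of the demo's config (layout assumed)
    [general]
    CALCULATION_MODE = Classical

    [HAZARD]
    ; the setting that triggers the failure: no mean curve is written,
    ; so the risk side finds no curve with statistic_type = 'mean'
    COMPUTE_MEAN_HAZARD_CURVE = false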
Here's what we need to do to fix it:
1) For Classical Risk calculations, COMPUTE_MEAN_HAZARD_CURVE (a Hazard parameter) should always be 'true' (according to Vitor). If it is specified in the config, it should be ignored and the software should provide the default (see the first sketch after this list).
2) We need appropriate database and job config constraints to enforce this rule. The rule is: if the calculation mode is "classical" and the job type is Hazard + Risk, then COMPUTE_MEAN_HAZARD_CURVE must be 'true' (see the SQL sketch after this list).
3) In the database, we are recording the 'job_type' in the oq_params table. This is really 'calc_mode' and should be renamed as such.
4) Once the existing 'job_type' is renamed to 'calc_mode', a new 'job_type' parameter should be introduced to record the type of the job, which at present is 'hazard' and/or 'risk'. This field should be a VARCHAR[] whose value is a non-empty subset of ['hazard', 'risk'] (one or both).
5) Make any changes necessary to job config validation. I can't think of anything, but we may have to change something.
6) Finally, write a QA test that exercises this demo and validates the computed risk and hazard artifacts (curves, map, etc.).
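For item 1, a minimal sketch of what "ignore the config value and force the default" could look like. The function, its arguments, and the parameter-dict layout are hypothetical; the real job-config handling code will differ.

    # Hypothetical sketch for item 1: for classical hazard+risk jobs, force
    # COMPUTE_MEAN_HAZARD_CURVE to 'true' regardless of what the config says.
    def enforce_mean_curve_param(params, sections):
        """params: dict of config parameters; sections: e.g. ['HAZARD', 'RISK']."""
        classical = params.get('CALCULATION_MODE', '').lower() == 'classical'
        if classical and 'RISK' in sections:
            # The classical risk calculator cannot run without the mean
            # hazard curve, so any user-supplied value is overridden here.
            params['COMPUTE_MEAN_HAZARD_CURVE'] = 'true'
        return params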
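For items 2-4, a rough sketch of the corresponding schema change. The uiapi schema name and the compute_mean_hazard_curve column are assumptions; the actual migration has to match the real oq_params definition and handle existing rows.

    -- Hypothetical migration sketch, items 2-4.
    -- Item 3: the existing 'job_type' is really the calculation mode.
    ALTER TABLE uiapi.oq_params RENAME COLUMN job_type TO calc_mode;
    -- Item 4: new job_type column, a non-empty subset of {'hazard', 'risk'}.
    ALTER TABLE uiapi.oq_params ADD COLUMN job_type VARCHAR[];
    ALTER TABLE uiapi.oq_params ADD CONSTRAINT job_type_is_valid
        CHECK (coalesce(array_length(job_type, 1), 0) > 0
               AND job_type <@ ARRAY['hazard', 'risk']::VARCHAR[]);
    -- Item 2: classical hazard+risk jobs must compute the mean hazard curve.
    ALTER TABLE uiapi.oq_params ADD CONSTRAINT classical_risk_needs_mean_curve
        CHECK (NOT (calc_mode = 'classical'
                    AND 'risk' = ANY (job_type)
                    AND compute_mean_hazard_curve = FALSE));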