Project Goals

This is a list, in no particular order, of desired targets for the project. Please note that this is more than can be achieved in the near term, so priorities will be set according to funding and developer availability.

For those interested in specific code-development targets, technical enhancements, bug fixes, etc., please see the developer resources page.

  1. Officially release all applications: Right now all applications are at a beta stage.
  2. Release source code and establish email lists to enhance collaboration and strengthen the developer community.
  3. Collaborate with CyberShake: Expand existing tools for processing and analyzing CyberShake results (CyberShake involves full PSHA calculations based on 3D waveform modeling rather than using empirical attenuation relationships).
  4. Collaborate with CSEP: SCEC's Center for the Study of Earthquake Predictability (CSEP) is an infrastructure for testing earthquake forecasts and predictions. OpenSHA involvement is not in terms of helping develop the CSEP computational infrastructure, but rather to ensure interoperability (e.g., a forecast model can be plugged into our system for PSHA or their system for testing).
  5. Collaborate with risk-analysis efforts:
    • Help define a generic/standard seismic-hazard-to-loss interface that can be used for both loss forecasting (e.g., for the California Earthquake Authority) and real-time assessments (e.g., the USGS PAGER system).
    • Contribute to the USGS ResRisk project being led by Nico Luco (a prototype application, developed jointly, is available upon request).
    • Continue developing an interface for implementing full PSHA calculations in HAZUS (being led by Hope Seligson with funding from SCEC).
    • Contribute to the OpenRisk/Risk-Agora effort recently initiated by Keith Porter (he's interested in using OpenSHA as a foundation/starting-point; some prototype applications have already been developed that are available upon request).
    • Contribute to the Multi-Hazards demonstration project being led by Lucy Jones.
    • Develop and implement a relatively simple loss metric for evaluating the influence of various possible PSHA logic-tree branches (a tree-trimming tool for identifying which branches actually matter; this is desperately needed by the ongoing WGCEP).
    • Implement loss calculations for time-dependent earthquake forecasts that include clustering/interaction effects (which current loss models are not designed to handle). This is of interest because the ongoing WGCEP plans to include clustering effects in their statewide model ASAP, and Nico Luco and Matt Gerstenberger have a SCEC-funded project to explore the vulnerability of buildings to aftershocks when those buildings have been weakened by the main shock.
  6. Continue the WGCEP development of a Uniform California Earthquake Rupture Forecast (UCERF): A goal of the ongoing WGCEP has been to build a living, adaptive, and extensible model that can be updated as we collect more data, make scientific progress, or following a large, significant earthquake. This living model has been constructed using OpenSHA components, and includes interoperability with various database elements (real-time access over the internet). Future goals with respect to this project include:
    • Core maintenance of model/infrastructure (including the addition of new model components as they become available).
    • Move/integrate the Oracle fault databases into the USGS National Seismic Hazard Mapping program in Golden.
    • Continue developing GUI-based tools for data entry and 3-D visualization.
    • Relax segmentation and include multi-fault ruptures.
    • Include clustering/triggering effects (e.g., aftershock statistics, Coulomb stress changes).
    • Develop tools to further explore logic-tree branches in terms of implied losses.
    • Make the model capable of generating synthetic catalogs of events (especially needed for loss calculations when clustering effects are included).
  7. Implement STEP (the USGS 24-hour aftershock hazard map): We have worked with Matt Gerstenberger to port his Matlab-based code to Java while simultaneously defining and implementing generic components that can be used by others (e.g., for the clustering component of the WGCEP model). This is nearly complete, but finding the time to finish it off in the midst of competing efforts is difficult.
  8. Implement NSHMP's countrywide models: This has been achieved in the context of our collaboration with the Global Earthquake Model (GEM) development, although verification remains a work in progress.
  9. Assist others in implementing non-US Models: We often get inquiries from individuals in other countries who are interested in implementing their models in OpenSHA (e.g., Mark Stirling from New Zealand, Laura Paruzza and Bruno Pace from Italy, John Douglas from France, Ulrich Wegler from Germany, Tamaz Chelidze from Georgia, and others). Implementing other models currently requires Java programming, which we understand constitutes a considerable impediment to most potential users. Therefore, what we need is to define standard input files that can be used without modifying existing codes (although this will necessarily limit the number of bells and whistles). Progress on this has also been achieved via our collaboration with the Global Earthquake Model (GEM) development.
  10. Create user manuals and tutorials for all applications: Some presently exist, but more are needed. This activity is particularly time consuming, and not a top priority given our need to focus on what will sustain funding for the effort (people are more willing to pay us to solve their specific problems than to write user manuals that help others solve theirs).
  11. Create user manuals and tutorials for code development: These would allow easier entry for those interested in understanding or contributing to the Java code.
  12. Develop educational tools: These would facilitate learning PSHA both in and out of the classroom. Jon Stewart, Jack Baker, and others have expressed interest in this.
  13. Polish PEER PSHA verification test cases: We have formally implemented the PEER test cases, but need to polish the web/GUI-based applications that allow anyone to configure and run these test cases.
  14. Implement Vector-Valued PSHA: Both Hong Kie Thio and Paolo Bazzurro have received SCEC funding for this (separately).
  15. Implement Attenuation Relationships for other IMTs: Several people have expressed interest in those for Nonlinear Spectral Acceleration, and Jack Baker has expressed interest in having Spectral Acceleration results averaged over a range of periods.
  16. Implement other site-response modules: There is interest in applying alternative, more sophisticated site-response modules to existing attenuation relationships. We have already implemented some in collaboration with Jon Stewart, Christine Goulet, and Paolo Bazzurro, and there is desire to have more.
  17. Continue collaborating with the Global Earthquake Model (GEM) initiative: We worked with the GEM1 pilot project to implement a global set of models, and OpenSHA has since been officially adopted as the hazard platform for GEM going forward.
  18. Improve access to high-performance computing: Calculating full PSHA maps is very computationally demanding. We currently use either the Condor GRID at USC (e.g., see Field et al., 2005c) or the TeraGrid (“the world's largest, most comprehensive distributed cyberinfrastructure for open scientific research”). However, the availability of these or other high-performance computing facilities to general OpenSHA users remains in question.
  19. Improve visualization tools: We currently use GMT for making maps and SCEC VDO for 3-D visualization (the latter developed as part of the SCEC UseIT intern program). However, we'd also like to explore the usefulness of GIS- and Google-Earth-based tools.
  20. Support physics-based earthquake simulations: Work with Jim Dieterich, John Rundle, Steve Ward, and others to support their development and analysis of physics-based earthquake simulations (some of us believe these models represent the future of earthquake forecasting). In addition to providing them with input data/models (e.g., faults and slip rates) we would like to develop generic data-mining tools for analyzing simulation results (e.g., to explore implied recurrence intervals and aperiodicities).
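
Several goals above (the PEER verification cases, attenuation-relationship support, the CyberShake comparison) revolve around the core PSHA calculation: summing, over all ruptures, each rupture's annual rate times the conditional probability that it exceeds a given ground-motion level. The following is a minimal illustrative sketch in Java (OpenSHA's implementation language); the `Rupture` record and the lognormal ground-motion assumption are stand-ins for illustration, not OpenSHA's actual API:

```java
import java.util.List;

public class HazardCurveSketch {

    /** Illustrative rupture: an annual occurrence rate plus the median and
     *  log-standard-deviation of the ground motion it produces at the site. */
    record Rupture(double annualRate, double medianGM, double sigmaLnGM) {}

    /** P(IM > x | rupture), assuming a lognormal ground-motion distribution. */
    static double probExceed(Rupture r, double x) {
        double z = (Math.log(x) - Math.log(r.medianGM())) / r.sigmaLnGM();
        return 0.5 * erfc(z / Math.sqrt(2.0));  // complementary normal CDF
    }

    /** Annual exceedance rate at level x: sum over ruptures of
     *  rate_i * P(IM > x | rupture_i). */
    static double exceedRate(List<Rupture> ruptures, double x) {
        double rate = 0.0;
        for (Rupture r : ruptures) rate += r.annualRate() * probExceed(r, x);
        return rate;
    }

    /** Abramowitz & Stegun 7.1.26 approximation of erfc, adequate for a sketch. */
    static double erfc(double x) {
        double ax = Math.abs(x);
        double t = 1.0 / (1.0 + 0.3275911 * ax);
        double y = t * (0.254829592 + t * (-0.284496736
                + t * (1.421413741 + t * (-1.453152027 + t * 1.061405429))));
        double e = y * Math.exp(-ax * ax);
        return x >= 0 ? e : 2.0 - e;
    }

    public static void main(String[] args) {
        List<Rupture> src = List.of(
                new Rupture(0.01, 0.20, 0.6),   // rare, strong shaking
                new Rupture(0.10, 0.05, 0.6));  // frequent, weak shaking
        for (double x : new double[]{0.05, 0.1, 0.2, 0.4})
            System.out.printf("annual rate of PGA > %.2f g: %.5f%n",
                    x, exceedRate(src, x));
    }
}
```

Repeating the calculation over a range of levels x yields a hazard curve; real implementations replace the lognormal stand-in with an empirical attenuation relationship (or, as in CyberShake, with simulated waveforms).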
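
The "tree-trimming" loss metric under goal 5 can be illustrated with the standard expected-annualized-loss (EAL) calculation: integrate the mean loss ratio against the drop in annual exceedance rate across each intensity bin, once per logic-tree branch, and flag branch pairs whose EALs are nearly identical as trimming candidates. A hedged sketch; the discrete curves and vulnerability values below are made up for illustration:

```java
public class BranchLossSketch {

    /** Expected annualized loss from a discretized hazard curve (annual
     *  exceedance rates at increasing IM levels) and a mean loss-ratio
     *  (vulnerability) curve sampled at the same levels. Each bin contributes
     *  the rate of events falling in it times the midpoint loss ratio. */
    static double expectedAnnualLoss(double[] imLevels, double[] exceedRates,
                                     double[] lossRatios) {
        double eal = 0.0;
        for (int i = 0; i < imLevels.length - 1; i++) {
            double dRate = exceedRates[i] - exceedRates[i + 1]; // events in this bin
            double midLoss = 0.5 * (lossRatios[i] + lossRatios[i + 1]);
            eal += dRate * midLoss;
        }
        return eal;
    }

    public static void main(String[] args) {
        double[] im   = {0.1, 0.2, 0.4, 0.8};        // PGA levels (g), illustrative
        double[] vuln = {0.0, 0.05, 0.25, 0.70};     // mean damage ratio at each level
        // Two hypothetical logic-tree branches with slightly different hazard:
        double[] branchA = {0.050, 0.020, 0.005, 0.001};
        double[] branchB = {0.048, 0.019, 0.005, 0.001};
        double ealA = expectedAnnualLoss(im, branchA, vuln);
        double ealB = expectedAnnualLoss(im, branchB, vuln);
        System.out.printf("EAL A = %.5f, EAL B = %.5f, diff = %.1f%%%n",
                ealA, ealB, 100 * Math.abs(ealA - ealB) / ealA);
        // A small difference relative to other branches marks this pair
        // as a candidate for trimming from the logic tree.
    }
}
```

The point of such a metric is not precise loss estimation but a common scalar for ranking branches; branches whose EALs differ by less than some tolerance contribute little to the spread of results and can be pruned.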
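
The "standard input files" of goal 9 could look something like the hypothetical plain-text source description below. This format is purely illustrative, invented here for the sake of discussion rather than an adopted OpenSHA or GEM format; the idea is simply that a user lists fault geometry, slip rate, and magnitude-frequency parameters, and the existing Java codes parse the file without any programming on the user's part:

```
# Hypothetical earthquake-source input file (illustrative only)
source.name      = Example Fault
source.type      = fault
fault.trace      = (172.65, -43.55) (172.90, -43.40)   # lon, lat vertices
fault.dip        = 60.0        # degrees
fault.upperDepth = 0.0         # km
fault.lowerDepth = 12.0        # km
fault.slipRate   = 5.0         # mm/yr
mfd.type         = GutenbergRichter
mfd.bValue       = 1.0
mfd.minMag       = 5.0
mfd.maxMag       = 7.2
```

A format along these lines trades flexibility for accessibility, which is exactly the "bells and whistles" limitation noted above.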