Question | Response |
---|---|
How can I use a 3D velocity model in NonLinLoc? |
The tools provided in NLL-Vel2Grid are mainly for 1D and simple 2D-2.5D models. If you want to use a fully 3D model, I recommend that you transform your model, using your own software, into slowness*length values in the format used by NLL-Grid2Time (in fact, this is the format used by the previously existing Podvin-Lecomte FD travel-time program). The NLL grid format is explained at http://alomax.net/nlloc → Formats → 3D Grid Files. The model file for Grid2Time must consist of cubic cells (dx = dy = dz) and the cell values must be SLOW_LEN = slowness*length (sec) units, where length = dx. The cell values should correspond to the slowness at the center of each cell, e.g. the value at grid point x, y, z should be based on the slowness in the velocity model at x + dx/2, y + dy/2, z + dz/2. |
What types of grids does the NLL package use? |
You have to use SLOW_LEN for the velocity grid if you want to create travel-time files with Grid2Time. You can also use VELOCITY or VELOCITY_METERS if you want to plot the velocity model with Grid2GMT. The files output by NLLoc with a grid search (LOCSEARCH GRID) are PROB_DENSITY or GRID_MISFIT. |
How do I set the travel-time errors in LOCGAU and LOCGAU2? |
It is best to set the travel-time errors using a-priori information (e.g. a layered model should have large travel-time errors for a region with complex geologic/tectonic structure) and based on analysis of the station residuals (the travel-time errors should be of the same order as the phase residuals, for phases that are definitely associated with an event and for which the pick uncertainties are smaller than the residuals). When the locations look good, and the P and S models are stable, then the model errors should be at least of the order of the typical residuals and of the location RMS. But again, it is best to set the error values from independent information, if available. For example, for teleseismic P, many studies show errors of 1+ sec when spherically symmetric models are used. The errors are larger at regional distances and around plate boundaries. |
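In the control file, these model errors are set with the LOCGAU statement (SigmaTime in seconds, CorrLen in km); the values below are illustrative only and should be adjusted to your residual analysis:

```
# LOCGAU SigmaTime (sec) CorrLen (km)
# illustrative: local network whose typical residuals are ~0.2 sec
LOCGAU 0.2 0.0
```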
Is it possible to weight observations as a function of the distance from the event? |
Not directly, but there is LOCGAU2 (travel-time dependent gaussian model error parameters) if you are using EDT. LOCGAU2 gives a travel-time dependent error, which is similar to a distance-dependent error: readings from more distant stations get greater error. It is meant to mimic the distance weighting found in many linearized location packages. Setting SigmaTfraction depends on the scale of the study and the assumed accuracy of the travel-time model. For teleseismic location the best value may be 0.01-0.02 (1-2% error), though this is not completely adequate because close teleseismic readings (~10-30 deg) have more error than far readings (~50-90 deg). For local-regional work, I would expect values of 0.01 to 0.05 (1-5% errors), though 5% means there is low confidence in the model, and/or the model is thought to be too simplified. The error is SigmaTfraction*travel_time (thus a function of the hypocenter location, and for this reason not used with LOCMETH GAU_ANALYTIC, to keep LOCMETH GAU_ANALYTIC exactly as specified in Tarantola and Valette, 1982). |
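A hedged LOCGAU2 example (the minimum and maximum clip values are illustrative choices, not recommendations):

```
# LOCGAU2 SigmaTfraction SigmaTmin (sec) SigmaTmax (sec)
# illustrative local-regional setting: 2% of travel time, clipped to 0.05-2.0 sec
LOCGAU2 0.02 0.05 2.0
```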
How are the phase weights scaled and what do they represent? |
The weights in the output phase listing are normalized so that their mean is 1; thus more important phases have weight > 1. For GAU_ANALYTIC the weights are fixed by the station observations and travel-time errors (thus prior and posterior weights). For EDT the listed weights are posterior weights based on the contribution of the observation to the maximum likelihood EDT solution; these weights will in general differ from the EDT prior weights, which can be calculated but are not listed in the phase output. This difference is necessary because EDT effectively removes outliers, whose posterior weights (which may be near zero) are more informative than their prior weights. |
How does NLLoc calculate origin times? |
With LOCMETH GAU_ANALYTIC, the origin time is calculated exactly as specified in Tarantola and Valette, 1982 - it is a weighted average of the observed arrival times minus the predicted travel times for ALL picks, where the weights are the prior picking and travel-time errors. So the origin time does enter in the inversion calculations in a pre-determined, analytic sense, but there is no effective search over origin time. With LOCMETH EDT, the origin time is the weighted average of the estimated origin times (observed arrival times minus the predicted travel times) for picks that contribute to the optimal location stack, where the weights are the contribution of each pick to the location stack. So the origin time is not explicitly pre-determined, but instead determined after location for the optimal x,y,z point (though the origin time is indirectly considered during the EDT search). The important difference is that EDT is much more able to remove the influence of outlier picks than GAU_ANALYTIC. With LOCMETH EDT_OT_WT* methods, the origin time is determined from a search for the maximum likelihood point in a histogram of the estimated origin times (observed arrival times minus the predicted travel times) +/- time_error, where time_error includes the picking and travel-time uncertainties, and the finite size of the grid or oct-tree cell of the optimal location. |
How does NLLoc calculate RMS error? (QUALITY → RMS) |
The RMS error is the best estimate of an origin time error in NLLoc. However, only the LOCMETH EDT_OT_WT* methods in NLLoc perform an explicit search over origin time, and this is only done for the optimal x, y, z hypocenter. With LOCMETH GAU_ANALYTIC, the RMS error is the weighted rms of the arrival residuals. With LOCMETH EDT and EDT_OT_WT* methods, the RMS error is the variance of the histogram of estimated origin times (see "How does NLLoc calculate origin times?"). |
How do I tell NLLoc to ignore S phases in location? |
The solution I use is to map S phases to an invalid phase name, such as $, so that no travel-time grids are found for them and they are ignored in the location. |
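In the control file, this mapping can be done with a LOCPHASEID statement; the S phase-code variants listed below are examples and should match the phase codes actually present in your observation files:

```
# map S phase codes to the invalid phase name $
# (no travel-time grids exist for $, so these picks are ignored)
LOCPHASEID $ S s Sg SG Sn
```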
What are the output grid and PDF scatter sample values? |
For grid search, the output grid values are normalized (with respect to their integral over the search volume) probability densities, but it may still be easier (when also working with oct-tree) and safer to assume that they are not normalized. Oct-tree output and scatter-sample values are non-normalized likelihoods; they are not normalized with respect to their integral over the search volume. This normalization is done internally in NLL for generating the scatter sample, but the output pdf values are never modified with respect to this integral. When SAVE_NLLOC_OCTREE is used, the saved values are normalized likelihoods; the normalization with respect to the integral over the search volume is done when the oct-tree structure results are saved to file. |
What is the PDF scatter sample, and what is its relation to the confidence ellipsoid? |
The PDF scatter sample is generated directly from the accepted Metropolis samples for LOCSEARCH MET. For LOCSEARCH GRID and OCT the PDF sample is generated from the location search-grid PDF solution (type PROB_DENSITY); this is done by drawing a number of samples within each grid cell in proportion to the cell volume multiplied by the pdf value in the cell. The density of points in the PDF scatter sample around a point x,y,z is directly proportional to the PDF value at x,y,z. There is no cutoff confidence level in the PDF scatter sample, so sample points can appear throughout the location search-grid. But the finite number of sample points results in few or no samples occurring where the PDF values are low; this may include most of the search-grid when the location is well constrained. Thus visualization of the PDF scatter sample shows the higher-probability regions of the PDF as a higher density of points (e.g., dense, red regions in SeismicityViewer). The confidence ellipsoid is generated from the covariance matrix of the PDF scatter sample. It is an ellipsoidal, "Gaussian" or normal-statistic approximation to the PDF, truncated at the 68% confidence level (i.e., if the PDF were perfectly ellipsoidal, then there would be 68% probability that the hypocenter is inside the ellipsoid). See http://alomax.net/nlloc → NLLoc → Inversion Approach. The confidence ellipsoid forms a compact, approximate representation of the spatial error of location. The confidence ellipsoid is centred on the expectation hypocenter, but the maximum likelihood hypocenter can be outside of the ellipsoid; this is common with complicated or irregular (non-ellipsoidal) pdf shapes. However, the maximum likelihood hypocenter will always fall within the highest-density part of the PDF. The expectation hypocenter, covariance and 68% confidence ellipsoid are reported in the STATISTICS line of the NLLoc Hypocenter-Phase file.
Axes 1, 2 and 3 are the semi-minor (shortest), semi-intermediate, and semi-major (longest) axes, respectively, of the confidence ellipsoid. Len1, Len2, Len3 are the semi-axis lengths (half-lengths) in kilometres. The 68% horizontal (epicentral) confidence ellipse (semi-axis lengths and orientation of the semi-major axis) is reported in the QML_OriginUncertainty line of the NLLoc Hypocenter-Phase file. The semi-axes of the confidence ellipse will always be shorter than the projection of the corresponding axes of the confidence ellipsoid on a horizontal plane. Note that all of the NLLoc Hypocenter-Phase file output other than the STATISTICS and QML_OriginUncertainty lines refers to the maximum likelihood hypocenter. References: Press, W.H., Teukolsky, S.A., Vetterling, W.T. and Flannery, B.P., 1992, Numerical Recipes in C: The Art of Scientific Computing (also Numerical Recipes in FORTRAN: The Art of Scientific Computing), Cambridge University Press, Cambridge. Hypoellipse: http://pubs.usgs.gov/of/1999/ofr-99-0023/ → Chapter 3 - Error Estimates |
How can I obtain the horizontal and 3D standard errors for the PDF scatter sample? |
The 2D horizontal errors along the major and minor axes of the horizontal confidence ellipse are available in the NLLoc Hypocenter-Phase file: see the QML_OriginUncertainty line. More generally, the 3D x, y, z errors can be calculated from the covariances in the NLLoc Hypocenter-Phase file STATISTICS line: double errx = sqrt(DELTA_CHI_SQR_68_3 * cov.xx); where DELTA_CHI_SQR_68_3 is 3.53 (the value for 68% confidence with 3 degrees of freedom; see Numerical Recipes, 2nd ed., sec. 15.6, table). And the 2D x, y errors can be calculated from the covariances in the NLL STATISTICS line: double errx = sqrt(DELTA_CHI_SQR_68_2 * cov.xx); where DELTA_CHI_SQR_68_2 is 2.30 (the value for 68% confidence with 2 degrees of freedom; see Numerical Recipes, 2nd ed., sec. 15.6, table). |
How do I set the spatial position in VGGRID for 1D (laterally homogeneous) models? |
The 2D travel-time grids used for 1D, laterally homogeneous (e.g. layered) models represent the travel times from each station, with the station at the position 0,0,z_station in the grid; thus the velocity grid (which serves as a template for the travel-time grid) must have orig_grid_x/y = 0. The position in geographic space is controlled by the station location, the grid specified in LOCGRID, and the TRANS control statements; in contrast, for 3D grids, the geographic spatial position is also controlled by the orig_grid_x/y/z values in the VGGRID statement. |
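A hedged VGGRID example for a 1D layered model; the grid dimensions, spacing and depth range below are illustrative, the point being the zero x/y origins required for 2D grids:

```
# VGGRID xNum yNum zNum xOrig yOrig zOrig dx dy dz gridType
# 2D template grid for a layered model: orig_grid_x/y must be 0.0
VGGRID 2 301 61 0.0 0.0 -2.0 1.0 1.0 1.0 SLOW_LEN
```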
Can NonLinLoc be modified or used for commercial purposes? |
All
of the NLLoc software is under the GNU GPL license/copyright. Thus
it can be modified or used for commercial purposes, but cannot
itself be sold. So you can, for example, use NLLoc to perform an
analysis for which you are paid, or be paid for your services to
modify NLLoc (the resulting software, except for entirely new,
independent modules, cannot be sold and must be freely available
to all), but you cannot sell software that includes NLLoc or a
version of NLLoc that you have modified. |
How can I locate with NLLoc using a fixed hypocentral depth? |
To perform a fixed-depth location: 1) in LOCGRID, use a very small range in depth with 2 depth nodes, and 2) use LOCSEARCH OCT with init_num_cells_z = 1. For example: Regional (rectangular), depth fixed 5-6 km: LOCGRID 400 350 2 -250.0 -150.0 5.0 1.0 1.0 1.0 PROB_DENSITY SAVE. Global, depth fixed 30-31 km: LOCGRID 361 181 1 -180.0 -90.0 30.0 1.0 1.0 1.0 PROB_DENSITY SAVE. The depth range cannot currently be fixed to a single depth, as this would effectively be a 2D search while NLLoc always performs 3D searches. |
How can I build a 2D vel model with Vel2Grid using polygons? |
See: Building a 2D vel model with Vel2Grid using polygons. Thanks to Alberto M. López Venegas. |
In LOCSEARCH OCT how do I choose the numbers of initial cells (x,y,z)? Does the number in one direction depend on the numbers in the other directions? |
For setting: LOCSEARCH OCT initNumCells_x initNumCells_y initNumCells_z minNodeSize maxNumNodes numScatter useStationsDensity stopOnMinNodeSize there are a few rules of thumb to follow: The total number initNumCells_x * initNumCells_y * initNumCells_z should be << maxNumNodes. This ensures that a sufficient number of nodes to accurately define the pdf are processed using oct-tree division search after the initial, fixed grid-search defined by the number of initial cells. The dx, dy and dz implied by initNumCells_x/y/z (depending on the extent of the LOCGRID) should be of similar sizes to avoid anisotropy in the search. However, in some cases (e.g. large-scale, regional or teleseismic location; locations with very poor depth constraint; models with thin layers with strong velocity contrasts, ...) it may be desirable to introduce anisotropy in the depth z direction (dz smaller or larger than dx and dy). As the complexity of the model or the expected complexity of the location pdf's increases, the number of initial cells should be increased; consequently their size will decrease. |
How can I install NonLinLoc on Windows using Cygwin? |
NLL compiled on Windows XP using Cygwin. The issues encountered related to getting the path and environment variables straight and loading the necessary libraries (GMT and netCDF). See this resource: http://seis.bris.ac.uk/~glxaw/Cygwin4seismology.htm (Cygwin for Geophysics; covers installation of Cygwin and use of SAC, TauP and GMT, and should be enough to get started with NLL). Thanks to James R. Humphrey. |
Back to the NonLinLoc site Home page.
Anthony Lomax