Description: |
As new methods of interpreting 3D seismic data, particularly prestack and derived-attribute data, grow in popularity, the management of ever-larger data volumes becomes critical. Compared with acquisition and processing, however, the interpretation use of seismic data requires faster, non-sequential, random access to large data volumes. In addition, quantitative interpretation leads to an increasing need for full 32-bit amplitude resolution, rather than the 8- or 16-bit representations used in most interpretation systems to date. Seismic data compression can be a significant tool for managing these on-line datasets, but the implementations previously used (e.g., Donoho et al., 1995) are not well suited to providing rapid random access in arbitrary directions (CDP, crossline, timeslice). If compression ratios of twenty or greater can be provided routinely, with full random access to all parts of the dataset, then both larger cubes and more cubes can be handled within the finite memory and disk systems available on interpretation systems. Such on-line data accessibility will lead to higher interpreter productivity and greater value from existing seismic surveys. We address here several problems that arise when using the lossy wavelet-transform data-compression algorithms currently available. We demonstrate that wavelet compression introduces less noise than the currently accepted truncation compression. We also show how compressing the small blocks of data needed for random access leads to artifacts in the data, and we provide a procedure for eliminating these artifacts.
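The noise comparison the abstract claims can be illustrated with a simple experiment. The following is a minimal sketch, not the authors' implementation: it contrasts the error introduced by truncating 32-bit amplitudes to 8 bits with the error from uniformly quantizing wavelet-transform coefficients. NumPy and PyWavelets are assumed to be available, and the synthetic trace, the `db4` wavelet, the decomposition level, and the quantization step are illustrative choices, not values from the paper.

```python
import numpy as np
import pywt

# Synthetic 32-bit seismic trace: a decaying oscillation plus weak noise.
# (Illustrative stand-in; the paper works with real 3D survey data.)
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 1024)
trace = (np.sin(40.0 * np.pi * t) * np.exp(-3.0 * t)
         + 0.05 * rng.standard_normal(t.size)).astype(np.float32)

# Truncation compression: rescale amplitudes into signed 8-bit integers,
# then expand back to floats for comparison.
scale = float(np.abs(trace).max()) / 127.0
trunc = np.round(trace / scale).astype(np.int8).astype(np.float32) * scale

# Wavelet compression: decompose, uniformly quantize the detail
# coefficients, and reconstruct.  The quantization step here is an
# assumption chosen to be comparable to the 8-bit rounding above.
coeffs = pywt.wavedec(trace, "db4", level=4)
step = scale
quant = [coeffs[0]] + [np.round(c / step) * step for c in coeffs[1:]]
recon = pywt.waverec(quant, "db4")[: trace.size].astype(np.float32)


def rms(err):
    """Root-mean-square of an error vector."""
    return float(np.sqrt(np.mean(err.astype(np.float64) ** 2)))


print(f"truncation RMS error: {rms(trace - trunc):.6f}")
print(f"wavelet    RMS error: {rms(trace - recon):.6f}")
```

In a scheme like this, the compression ratio comes from entropy-coding the mostly-zero quantized coefficients rather than from the quantization itself; the RMS figures only illustrate the kind of noise comparison the abstract describes, not the paper's actual results. The block-boundary artifacts and their removal, the abstract's other contribution, arise when such a transform is applied independently to the small blocks that random access requires, and are not reproduced in this sketch.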