Popis: |
Given sufficiently extensive data, deep-learning models can effectively predict the behavior of unconventional reservoirs. However, current approaches to building such models do not directly reveal the causal effects of flow behavior, the underlying physics, or well-specific correlations, especially when the models are trained on data from multiple wells across a large field. Field observations indicate that wells within a single reservoir do not exhibit uniform production behavior. This makes pre-filtering the data to build local models that capture region-specific correlations more pertinent than a single global model, which would only provide predictions averaged over distinct correlations. In this work, we investigate an attention-based network architecture that expedites the clustering process through the training of a single global model. We apply attention-based (transformer) neural networks to the input data before mapping it to the target variable, extracting attention scores between well properties and production performance. We leverage the interpretability of these attention-based models to improve the prediction performance of data-centric models trained on the clustered datasets. We show that the resulting local models are more accurate because they learn correlations specific to their region and data. Specifically, by utilizing the attention mechanism, we can separate and curate data subsets for training local models, improving prediction performance by reducing the variability each model must capture across the field.