  • Unusual temperature effects in ATES-simulations

    Dear FEFLOW community, I have run into several issues when running FEFLOW ATES simulations.

    Setup of the first study case: I have built a model containing a reservoir that consists of two lithologies with varying petrophysical properties. The well screens cover 80% of this reservoir (ambient temperature: 13 °C). The model contains 5 wells (with the ideal cell size around each well): 3 wells form the cold side (injection at ~40 °C, varying flow and production rates, ~200 m spacing between these wells) and 2 wells form the warm side (injection at ~90 °C, also varying flow and production rates, ~250 m spacing; distance to the cold wells: ~350 m). In the first case ("model 1") there is a hydraulic gradient from north to south; in the second case ("model 2") the water level is constant (hydraulic-head boundary condition on all sides of the model). In all cases I use the SAMG equation-system solver. Screenshots of the settings are attached as figures 1-3.

    Problem 1: In model 1 I observe that during an extraction phase, one of the warm wells and one of the cold wells (always the same pair within a single run) show a completely different behavior and temperature range than the others (see figure 4). The effect seems somewhat arbitrary, because it affects different wells from run to run, so I don't think the distance between the wells is the reason. Additionally, when I include more wells in the model, it sometimes happens that only one cold well is affected while none of the warm wells are. At this point I am very confused: I cannot identify any pattern, and I am certain that all boreholes are parameterized identically. The only thing I vary between runs is the number of wells (and hence the flow rate at each well), nothing else.

    Problem 2: In model 2 I don't observe the effect described above, but a different issue occurs. The temperature at the wells remains absolutely constant during the extraction phases, on both the warm side (figure 5) and the cold side (figure 6); essentially I get a step-like diagram. In addition, the temperatures, particularly on the cold side, are extremely low, as if the injection of warm water had no effect on the subsurface. The extraction temperature on the warm side also fluctuates unusually strongly from one cycle to the next. As with model 1, this model is parameterized in exactly the same way, except for the different hydraulic-head boundary. In this model I set up one cold and one warm well spaced approximately 1500 m apart. The reservoir has an ambient temperature of around 55 °C, with warm water (90 °C) and cold water (40 °C) being injected at varying flow rates.

    The main issue arises when comparing the scenario with a hydraulic gradient to the one with a constant water level: the simulation with a hydraulic gradient shows significantly higher heat recovery efficiencies (see the definition sketched at the end of this post). This difference is primarily due to differences in the extraction temperatures:
    - With a hydraulic gradient: the extraction temperature on the warm side starts higher and decreases linearly. On the cold side, the extraction temperature starts below the injection temperature of 40 °C and also increases linearly (see figure 7).
    - Without a hydraulic gradient: the extraction temperature on the warm side decreases more gradually, reaching lower values over time. On the cold side, the extraction temperature initially exceeds the injected temperature, but the rise slows down as the extraction cycle progresses (see figure 8).

    These differences significantly impact the efficiency of heat recovery under the different conditions. Has anyone observed similar effects and found an explanation or solution for them? I would be very thankful for any help or tip!
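    For reference, the heat recovery efficiency referred to above is assumed to be the usual ratio of extracted to injected thermal energy over a storage cycle, with temperatures taken relative to the ambient reservoir temperature (here T_amb = 13 °C, ρ_w c_w is the volumetric heat capacity of water, Q the well flow rate):

$$
\eta \;=\; \frac{E_\text{out}}{E_\text{in}}
\;=\; \frac{\int_{\text{production}} \rho_w c_w \,\lvert Q(t)\rvert \,\bigl(T_\text{prod}(t) - T_\text{amb}\bigr)\,\mathrm{d}t}
           {\int_{\text{injection}} \rho_w c_w \, Q(t) \,\bigl(T_\text{inj} - T_\text{amb}\bigr)\,\mathrm{d}t}
$$

    Since the denominator is essentially fixed by the injection schedule, any systematic shift of the extraction temperature relative to T_amb translates directly into the efficiency differences described above.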
  • Strange temperature effects in simple ATES-simulation

    Dear FEFLOW community, I'm quite new to working with FEFLOW and am therefore struggling with some issues. I have built a fairly simple structured 3D model with several layers, including an aquifer that is confined by low-permeability layers. This aquifer is intended to serve as an ATES aquifer with several injection and production wells. All wells are implemented with the ideal cell size, the corresponding flow rates, and a temperature BC at the bottom node of each well. The first simulations work quite well, but I am observing three effects I cannot explain:

    1. After the first year (one injection and one production cycle), I can extract more energy/heat than I injected, although I injected water at 90 °C into a reservoir with an ambient temperature of 13 °C. Has anyone observed a similar effect and can explain the reason? (See also the energy-balance sketch below.)

    2. After each switching point (from injection to the idle phase, and from the idle phase to production), the temperature briefly drops by ~20 °C. Why? What is the problem here?

    3. If I group the wells and want to assign a temperature difference via the "Open Loop" property of this well group, how do I need to prepare the temperature-difference time series, and how do I use this option? Which temperature difference is meant here? (I don't really understand the explanation in the FEFLOW documentation.)

    Can anyone maybe help me with these three issues? Many thanks, Simon
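    Regarding the first issue, below is a minimal sketch of how injected and extracted thermal energy can be balanced from exported well time series (plain Python, not FEFLOW functionality; the file name, column names, and constants are placeholder assumptions). Note that the apparent out/in ratio depends strongly on the chosen reference temperature.

```python
# Minimal sketch (not the FEFLOW API): balance injected vs. extracted thermal
# energy for one well over one storage cycle, from an exported time series.
# File/column names and constants are placeholder assumptions.
import numpy as np
import pandas as pd

RHO_C_WATER = 4.19e6   # volumetric heat capacity of water [J/(m^3 K)], approximate
T_REF = 13.0           # reference temperature [deg C]; the ambient reservoir temperature here

# Assumed CSV layout: time [s], flow_rate [m^3/s] (positive = injection,
# negative = extraction), temperature [deg C] at the well.
ts = pd.read_csv("well_history.csv")

dt = np.gradient(ts["time"].to_numpy())        # time-step sizes [s]
q = ts["flow_rate"].to_numpy()
temp = ts["temperature"].to_numpy()

power = RHO_C_WATER * q * (temp - T_REF)       # thermal power relative to T_REF [W]

energy_in = np.sum(np.where(q > 0.0, power, 0.0) * dt)    # injected thermal energy [J]
energy_out = -np.sum(np.where(q < 0.0, power, 0.0) * dt)  # extracted thermal energy [J]

print(f"injected:     {energy_in / 3.6e9:.1f} MWh_th")
print(f"extracted:    {energy_out / 3.6e9:.1f} MWh_th")
print(f"out/in ratio: {energy_out / energy_in:.2f}")
```

    With this bookkeeping, a ratio above 1 after a single cycle would indicate that either the reference temperature, the sign convention of the rates, or the integration of the time series differs between the injection and production phases.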