Hi, I've just noticed a very strange default parameterization in the ASMs:

[list]
[li][b]ASM1Temp[/b]: theta_K_X = 1.116 (greater than 1), therefore K_X increases with temperature.[/li]
[li][b]ASM2dModTemp[/b]: theta_K_X = 0.896 (smaller than 1), therefore K_X decreases with temperature.[/li]
[/list]

Is this correct? It seems pretty weird, at least at first glance.

(I know ASM1 and ASM2d have [i]different[/i] default parameterizations, but their core reactions and parameters are more or less equivalent. A parameter evolving in completely opposite directions is, at the very least, strange.)
The parameterization of theta_K_X is, in effect, based on the originally published K_X values at different temperatures, namely:
[list][li][b]ASM1[/b]: K_X = 0.03 gCOD/gCOD at 20°C, K_X = 0.01 gCOD/gCOD at 10°C, hence theta_K_X > 1[/li]
[li][b]ASM2[/b]: K_X = 0.10 gCOD/gCOD at 20°C, K_X = 0.30 gCOD/gCOD at 10°C, hence theta_K_X < 1[/li][/list]
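Assuming the usual exponential (Arrhenius-type) temperature correction K(T) = K(20°C) · theta^(T − 20), the two default theta values can be back-calculated from the published pairs above. A minimal sketch (the helper name is mine, not from any ASM implementation):

```python
def back_calc_theta(k_ref, t_ref, k_other, t_other):
    """Solve k_other = k_ref * theta**(t_other - t_ref) for theta,
    assuming the exponential correction K(T) = K(T_ref) * theta**(T - T_ref)."""
    return (k_other / k_ref) ** (1.0 / (t_other - t_ref))

# ASM1: K_X = 0.03 gCOD/gCOD at 20 degC, 0.01 gCOD/gCOD at 10 degC
theta_asm1 = back_calc_theta(0.03, 20.0, 0.01, 10.0)

# ASM2: K_X = 0.10 gCOD/gCOD at 20 degC, 0.30 gCOD/gCOD at 10 degC
theta_asm2 = back_calc_theta(0.10, 20.0, 0.30, 10.0)

print(round(theta_asm1, 3))  # -> 1.116 (greater than 1, as in ASM1Temp)
print(round(theta_asm2, 3))  # -> 0.896 (smaller than 1, as in ASM2dModTemp)
```

So both defaults are internally consistent with their respective source publications; the opposite trends come from the published K_X pairs themselves, not from a sign error in the software.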
In the case of ASM2, theta_K_X < 1 means the half-saturation coefficient decreases with increasing temperature, i.e. the affinity (and hence the overall hydrolysis rate) increases with temperature.

The difficulty in defining unique trends for the temperature dependence of half-saturation coefficients was already acknowledged by the authors of ASM1, who wrote "[i]Because half-saturation coefficients are not rate coefficients, but are parameters which influence the shape of a μ-S ... curve it is more difficult to generalize about the effects of temperature on them[/i]" (Henze et al., 1987).
Therefore, you may also wish to assume a negligible temperature effect on K_X and set theta_K_X = 1.