Error in calculating min & max value for texture scaling dialog window

In this model all the surfaces should be in the ~10E-6 mbar range, but I'm getting a texture scaling minimum value in the 10E-9 range. I've checked all the surfaces, and the pressures are as they should be. I cannot figure out where it's getting such a low minimum value from. The maximum value is also affected: if you search for the max value on the textured facet, it doesn't match the max value in the texture scaling window. I've also played around with the mesh size, which seems to affect it somewhat. I've attached the model.


Tesst Setup.zip (5.4 MB)

Hello Alan,

This simulation is time-dependent:
image

The min/max search takes all moments into account (so that colors remain comparable across moments, which is essential when making an animation like this):
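To illustrate the point above, here is a minimal sketch (not Molflow's actual code; the function and data names are invented for illustration) of an autoscale that searches across all moments instead of just the displayed one:

```python
# Hypothetical sketch: the texture color scale takes its min/max over ALL
# moments, so early low-pressure moments drive the minimum down even when
# the currently displayed moment is near steady state.

def global_texture_range(pressures_per_moment):
    """pressures_per_moment: one list of per-texture-cell pressures per moment.
    Returns (min, max) over every cell of every moment."""
    lo = min(min(cells) for cells in pressures_per_moment)
    hi = max(max(cells) for cells in pressures_per_moment)
    return lo, hi

moments = [
    [1e-9, 2e-9],    # first moment: pressure still building up
    [5e-7, 8e-7],    # intermediate moment
    [9e-6, 1.2e-5],  # late moment: near steady state
]
print(global_texture_range(moments))  # → (1e-09, 1.2e-05)
```

This is why the scale minimum can sit orders of magnitude below anything visible on the facet at the final moment.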
timedep_clic

In your file, the low pressures occur at the first moment (as the pressure is still building up after the initial pulse).
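The buildup can be sketched with a standard single-volume vacuum model (an illustration only, not Molflow's solver; the pumping speed and volume values below are made up): a constant gas load Q switched on at t=0 into a volume V pumped at speed S gives dP/dt = (Q - S·P)/V, which approaches the steady-state pressure Q/S exponentially.

```python
import math

def pressure(t, Q=5e-4, S=50.0, V=10.0):
    """Pressure in a single pumped volume after a constant load starts at t=0.
    Q in mbar·l/s, S in l/s, V in l; returns P in mbar.
    Analytic solution of dP/dt = (Q - S*P)/V with P(0) = 0."""
    return (Q / S) * (1.0 - math.exp(-S * t / V))

# Steady state is Q/S = 1e-5 mbar, but the first moments sit far below it:
print(pressure(0.001))  # very low: this is where the tiny scale minimum comes from
print(pressure(100.0))  # ≈ 1e-5 mbar, the steady-state level
```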

Thanks, I didn't notice, since the source was set at a fixed rate and wasn't referring to a time-dependent gas source by name. Still, it doesn't make sense to assume the simulation is time-dependent. I guess you assume it turns on at that fixed rate at time t=0? Wouldn't it make more sense to assume you are just running steady state in this case? Whatever you set in the interface should make explicitly clear what it's going to do, rather than relying on some undocumented default behavior.

Cheers,
Alan

Hello Alan,

You’re right that your source is constant, and it’s not a problem: I confirm that you can create as many time-dependent parameters as you want, and the simulation will still remain steady-state.

However, you have explicitly made the simulation time-dependent by…

  • defining 150 moments
  • and you have disabled constant flow calculation

image

There is no way to assume in this case that “you are just running steady state”.

And yes, your gas source that’s defined like this…

image

…is assumed to desorb constantly from t=0 at 5E-4 mbar·l/s. If you had enabled steady-state calculation, then selecting “const.flow” would ignore particle hit times and give you the steady-state result, albeit with a performance and memory penalty and the autoscale issue.
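The distinction can be sketched as follows (an illustration with invented numbers, not Molflow source code): “const.flow” pools every particle hit regardless of its timestamp, while time-dependent accumulation only counts hits whose times fall inside each moment's window.

```python
# Illustrative sketch of const.flow vs per-moment hit accumulation.
hits = [0.0003, 0.002, 0.4, 0.9, 1.5, 2.1]  # particle hit times in seconds
moment_windows = [(0.0, 0.001), (0.5, 1.0), (2.0, 2.5)]  # (start, end) per moment

# const.flow: every hit counts, its timestamp is ignored
const_flow_count = len(hits)

# time-dependent: each moment only sees hits inside its own window
per_moment_counts = [
    sum(1 for t in hits if start <= t < end)
    for start, end in moment_windows
]

print(const_flow_count)   # 6
print(per_moment_counts)  # [1, 1, 1]
```

With far fewer counts per moment, the per-moment statistics are noisier and the stored textures multiply with the number of moments, hence the performance and memory penalty mentioned above.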

There is no way to start earlier than t=0 (you kind of suggested starting at -inf), as I would need to pre-populate the volume with gas particles to evaluate the system state at t=0.0000001 s. Moreover, we use constant desorption combined with time-dependent mode very often, for example to simulate leak propagation.