The ./setup
script copies the source code for the necessary modules (specified in the Config
file and by the setup flags) into the object/
directory. After setup, you also need to compile the F90 code:
./setup Sphere -3d -auto
cd object/
make -j8
cd ..
You can find a list of setup flags here. Note that whatever you pass to setup overrides your settings in Config, so make sure you understand what you are loading. One important choice is which solver to use (+usm, +8wave,
etc.); the default hydro solver is PPM.
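As an illustration of overriding the solver at setup time, a sketch reusing the Sphere setup from above (the +usm shortcut selects the unsplit staggered-mesh MHD solver; check your FLASH version's setup shortcuts before relying on it):

```shell
# Re-run setup requesting the unsplit staggered-mesh (USM) solver instead of
# the default PPM hydro solver; this overrides the corresponding Config settings.
./setup Sphere -3d -auto +usm

# Recompile from scratch in the object directory.
cd object/
make -j8
cd ..
```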
The simulation setup is compiled into the executable ./flash4
inside object/. You can run ./flash4
on its own if you are only running on a single processor. However, if you have compiled the simulation with MPI, you need to submit it through the batch scheduling system that your supercomputer center uses.
Here is an example SLURM job submission script, scripts/job.pbs
:
#!/bin/bash -l
#SBATCH -p debug
#SBATCH -n 512
#SBATCH -o %j.out
#SBATCH -e %j.err
#SBATCH --qos=normal
#SBATCH -t 00:30:00
#SBATCH -J flash
#SBATCH -L SCRATCH
cd /global/project/projectdirs/astro250/dlee/FLASH4.3/object
srun -n 512 ./flash4
To submit this, run:
sbatch job.pbs
The boundary conditions for our problem are outflow in the x and y directions, but periodic in the z direction. The periodic BC in the z direction is motivated by the observation that dense cores tend to reside in filamentary strands of high-density gas.
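In FLASH, boundary conditions are set through runtime parameters in flash.par. A minimal sketch of the relevant block for this problem (the parameter names follow the standard FLASH convention for the left/right faces of each axis; verify them against your version's user guide):

```
xl_boundary_type = "outflow"
xr_boundary_type = "outflow"
yl_boundary_type = "outflow"
yr_boundary_type = "outflow"
zl_boundary_type = "periodic"
zr_boundary_type = "periodic"
```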