srun: job 2101950 queued and waiting for resources
srun: job 2101950 has been allocated resources
* Info: Selecting the 'perf-low-ppn' engine for node inti6000
* Info: "ref-cycles" not supported on inti6000: fallback to "cpu-clock"
fread: Success
[R 0P 0N 0] WARNING Failed to load set at /ccc/home/cont001/ocre/oserete/.mpc/sets/2583
:-) GROMACS - gmx mdrun, 2023.1 (-:
Executable: /ccc/work/cont001/ocre/oserete/gromacs-2023.1-install-MPCfix-icc/bin/gmx_mpi
Data prefix: /ccc/work/cont001/ocre/oserete/gromacs-2023.1-install-MPCfix-icc
Working dir: /ccc/work/cont001/ocre/oserete/GROMACS_DATA
Command line:
gmx_mpi mdrun -s ion_channel.tpr -nsteps 10000 -deffnm gcc
Reading file ion_channel.tpr, VERSION 2020.3 (single precision)
Note: file tpx version 119, software tpx version 129
Overriding nsteps with value passed on the command line: 10000 steps, 25 ps
Changing nstlist from 10 to 50, rlist from 1 to 1.095
Update groups can not be used for this system because there are three or more consecutively coupled constraints
Using 1 MPI process
Using 128 OpenMP threads
NOTE: Cannot set thread affinities on the current platform.
starting mdrun 'Protein'
10000 steps, 25.0 ps.
Writing final coordinates.
                  Core t (s)   Wall t (s)        (%)
       Time:    10517.953       82.176    12799.4
                 (ns/day)    (hour/ns)
Performance:       26.288        0.913
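The figures in the timing summary are mutually consistent and can be cross-checked by hand: the (%) column is core time over wall time (about 12800%, i.e. 128 OpenMP threads kept busy), and ns/day follows from the 25 ps simulated span divided by the wall time. A minimal sketch, with the values copied from the log (the small residuals against the printed figures come from rounding in the displayed times):

```python
# Sanity-check the mdrun timing summary printed above.
core_t_s = 10517.953          # total CPU time summed over all threads
wall_t_s = 82.176             # elapsed wall-clock time
simulated_ns = 25.0 / 1000.0  # 10000 steps at 2.5 fs/step = 25 ps

# (%) column: core time relative to wall time; ~12800% matches 128 threads.
cpu_percent = 100.0 * core_t_s / wall_t_s

# Throughput: simulated nanoseconds per day of wall-clock time.
ns_per_day = simulated_ns * 86400.0 / wall_t_s
hours_per_ns = 24.0 / ns_per_day

print(f"{cpu_percent:.1f} %  {ns_per_day:.3f} ns/day  {hours_per_ns:.3f} hour/ns")
```

Both derived values land within display rounding of the printed 12799.4 % and 26.288 ns/day.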
GROMACS reminds you: "At school I had a teacher that didn't like me and I didn't like him. At the end of the year he decided to fail me. The ironic thing is that the topic was chemistry. I have the distinction of being the only Chemistry Laureate who failed the topic in high school." (Tomas Lindahl)
* Info: Process launched (host inti6000, process 3715338)
* Info: Process finished (host inti6000, process 3715338)
Your experiment path is /ccc/work/cont001/ocre/oserete/GROMACS_DATA/OV1_WP_128_v3/tools/lprof_npsu_run_0
To display your profiling results:
############################################################################################################################################
# LEVEL | REPORT | COMMAND #
############################################################################################################################################
# Functions | Cluster-wide | maqao lprof -df xp=/ccc/work/cont001/ocre/oserete/GROMACS_DATA/OV1_WP_128_v3/tools/lprof_npsu_run_0 #
# Functions | Per-node | maqao lprof -df -dn xp=/ccc/work/cont001/ocre/oserete/GROMACS_DATA/OV1_WP_128_v3/tools/lprof_npsu_run_0 #
# Functions | Per-process | maqao lprof -df -dp xp=/ccc/work/cont001/ocre/oserete/GROMACS_DATA/OV1_WP_128_v3/tools/lprof_npsu_run_0 #
# Functions | Per-thread | maqao lprof -df -dt xp=/ccc/work/cont001/ocre/oserete/GROMACS_DATA/OV1_WP_128_v3/tools/lprof_npsu_run_0 #
# Loops | Cluster-wide | maqao lprof -dl xp=/ccc/work/cont001/ocre/oserete/GROMACS_DATA/OV1_WP_128_v3/tools/lprof_npsu_run_0 #
# Loops | Per-node | maqao lprof -dl -dn xp=/ccc/work/cont001/ocre/oserete/GROMACS_DATA/OV1_WP_128_v3/tools/lprof_npsu_run_0 #
# Loops | Per-process | maqao lprof -dl -dp xp=/ccc/work/cont001/ocre/oserete/GROMACS_DATA/OV1_WP_128_v3/tools/lprof_npsu_run_0 #
# Loops | Per-thread | maqao lprof -dl -dt xp=/ccc/work/cont001/ocre/oserete/GROMACS_DATA/OV1_WP_128_v3/tools/lprof_npsu_run_0 #
############################################################################################################################################