The parameters for the analysis are read from an input file. An example
input file is
"/home/dlr/development/dynamic_aperture/da_test.in":
&input
lat_file = 'BMAD_LAT:hibetainj_20040628_v01.lat_and_layout_symp_map'
n_turn = 1000 !number of turns
n_xy_pts = 37 !number of angles. Here steps in theta are pi/37
point_range = 1 37 !start at theta = pi/37 and go to pi
n_energy_pts = 3 !for each R,theta, use 3 different energies
x_init = 0.01 !initial horizontal displacement
y_init = 0.001 !initial vertical displacement
energy = 0.0 0.004 0.006 !initial energy offsets
accuracy = 0.00001 !accuracy with which maximum stable amplitude is determined
aperture_multiplier = 1. !multiple of aperture in aperture file
qx_ini = 0.55 !horizontal tune set point before turning on pretzel. Sometimes
              ! turning on the pretzel pulls the tune and it is necessary to
              ! tune away from the half integer first. qx_ini and qy_ini do not
              ! affect the set point for the scan but may be useful in getting there.
qy_ini = 0.6 !vertical tune set point before turning on pretzel.
qx = 0.54 !horizontal tune
qy = 0.615 !vertical tune
qz = 0.1 !synchrotron tune
qp_x = 1. !horizontal chromaticity - requires RAW_XQUNEING_2 group
qp_y = 1. !vertical chromaticity - requires RAW_XQUNEING_1 group
particle = 1 !species (1=positrons, -1=electrons)
i_train = 1 !bunch train of particle being tracked
j_car = 1 !car in train
n_trains_tot = 9 !number of trains in strong beam
n_cars = 5 !number of cars in each train
current = 1 !current in each strong beam bunch (mA)
rec_taylor = .true. !restore taylor maps for split wigglers (default = .true.)
/
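For reference, the angle grid implied by n_xy_pts and point_range can be worked out with a small shell sketch. This is an illustration of the arithmetic described in the comments above (theta = k*pi/n_xy_pts), not part of the program:

```shell
#!/bin/sh
# Illustration only: the theta values scanned for n_xy_pts = 37,
# point_range = 1 37, i.e. theta = k*pi/37 for k = 1..37.
n_xy_pts=37
angles=$(awk -v n="$n_xy_pts" 'BEGIN {
    pi = 3.14159265358979
    for (k = 1; k <= n; k++)
        printf "point %2d: theta = %.4f rad\n", k, k * pi / n
}')
echo "$angles"
```

The first point lands at theta = pi/37 (about 0.0849 rad) and the last at theta = pi, matching the comment on point_range.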
- The data is written to several files named
  "INPUT_FILENAME.dat" and "INPUT_FILENAMEn.dat", where n = 1, ..., n_energy_pts.
  For the example input file above (da_test.in, with n_energy_pts = 3), we would get
  - da_test.dat
  - da_test1.dat
  - da_test2.dat
  - da_test3.dat
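The naming rule can be sketched in shell. Here "myrun" is a hypothetical base name chosen for illustration, not one of the files above:

```shell
#!/bin/sh
# Sketch of the output-file naming rule: one summary file plus one
# numbered file per energy point. "myrun" is a hypothetical base name.
base=myrun
n_energy_pts=3
names="${base}.dat"
i=1
while [ "$i" -le "$n_energy_pts" ]; do
    names="$names ${base}${i}.dat"
    i=$((i + 1))
done
echo "$names"    # myrun.dat myrun1.dat myrun2.dat myrun3.dat
```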
- The data can be plotted using the "PHYSICA" command file "DA". (This only works on Unix.)
- To plot with PHYSICA, type "physica"
- At the prompt "PHYSICA:" type "@da_portrait" for a HEP lattice
  (use "da_cesrdr" for a CESR TF lattice).
  (You will need to copy the file "da_portrait.pcm" or "da_cesrdr" from the directory
  /home/dlr/physica_macro/)
- The da data file prefix in the example above is "da_test"
- To make a hard copy type "hardcopy", then "s" to write a postscript file
- Submitting multiple jobs to the APC cluster
- Create a directory da_jobs/ and subdirectory da_jobs/job_log/
- Create a subdirectory that will hold the input and output files for your calculation
da_jobs/da_test/
- Move to the directory da_jobs/ and create a soft link associating current_job/ with da_test/
  with the command "ln -s da_test current_job"
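The directory and soft-link setup above can be sketched as follows. The sketch builds the layout inside a scratch directory created with mktemp, so it is safe to run anywhere:

```shell
#!/bin/sh
# Sketch of the da_jobs/ directory layout and the current_job soft link,
# built in a throwaway scratch directory.
root=$(mktemp -d)
mkdir -p "$root/da_jobs/job_log"
mkdir -p "$root/da_jobs/da_test"
cd "$root/da_jobs"
ln -s da_test current_job        # current_job/ now points at da_test/
ls -ld current_job
```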
- Create an input file "da.in" in da_jobs/da_test/
- Move to da_jobs/da_test/ and execute the program
- /home/dlr/da_jobs/scripts/bin/input_linux ------ if you are on a linux machine, or
- /home/dlr/da_jobs/scripts/bin/input_osf ----- if you are on a unix machine
- Specify the number of separate input files to create. If n_xy_pts = 37
then you might want to create 37 input files. Each file will correspond to one point.
Then specify the input file, in this case "da.in"
- This will create 37 files, so you will now submit 37 separate jobs onto however many
processors you have available.
- OSF - The script /home/dlr/da_jobs/scripts/subapc_da will submit the jobs to the
21 apc nodes.
- LINUX - The script /home/dlr/da_jobs/scripts/subapc_da_linux will submit the jobs to
lnx313-316.
- Log output is sent to da_jobs/job_log/
- Output files go to da_jobs/da_test/
- On completion the output data can be combined.
- Go to da_jobs/da_test/ and execute the program
- /home/dlr/da_jobs/scripts/bin/output_linux ------ if you are on a linux machine, or
- /home/dlr/da_jobs/scripts/bin/output_osf ----- if you are on a unix machine
- When prompted: the number of files is 37, the number of energies is n_energy_pts (usually 3), and the basename is "da"
- The data will be combined into 3 files: da_combined1.dat, da_combined2.dat, and da_combined3.dat
- The combined data can be plotted with the same PHYSICA macro. The data file prefix is "da_combined"
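The actual combining is done by output_linux/output_osf; its effect can be illustrated with a plain concatenation. The per-point file names used here (da_<point>_<energy>.dat) are a guess for illustration only, not the program's real naming scheme:

```shell
#!/bin/sh
# Rough illustration of the combine step: 37 per-point files for each of
# 3 energies are merged into one file per energy, points in order.
# The da_<point>_<energy>.dat names are hypothetical.
work=$(mktemp -d); cd "$work"
n_files=37; n_energy_pts=3
# make dummy per-point data files
p=1
while [ "$p" -le "$n_files" ]; do
    e=1
    while [ "$e" -le "$n_energy_pts" ]; do
        echo "point $p energy $e" > "da_${p}_${e}.dat"
        e=$((e + 1))
    done
    p=$((p + 1))
done
# combine: one file per energy, all points concatenated in order
e=1
while [ "$e" -le "$n_energy_pts" ]; do
    : > "da_combined${e}.dat"
    p=1
    while [ "$p" -le "$n_files" ]; do
        cat "da_${p}_${e}.dat" >> "da_combined${e}.dat"
        p=$((p + 1))
    done
    e=$((e + 1))
done
wc -l da_combined*.dat
```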