CDF Note No. 384

A BEGINNER'S GUIDE TO ANALYSIS_CONTROL AND BUILD_JOB

Version 7.21
18 May 1995

Marjorie Shapiro, Elizabeth Sexton-Kennedy, and David R. Quarrie
Fermilab MS318, P.O. Box 500, Batavia IL 60510
Tel: (312) 840-2117/4974
DECnet Node Name : B0
BITNET Node Name : FNALB0
User Names : MDSHAPIRO, SEXTON

ABSTRACT

The ANALYSIS_CONTROL and BUILD_JOB packages together provide a unified framework for Event Reconstruction, Dst Analysis and Online Calibration and Monitoring. The goals of these products are to provide a simple and straightforward means of combining any number of independent subprograms, called modules, into a single executable image and to provide a flexible system for specifying (either interactively or in batch mode) how these modules are run. ANALYSIS_CONTROL is the overall driver program. BUILD_JOB provides an automatic means of creating an executable image from a number of independent modules without the user writing any additional code.

ACKNOWLEDGEMENTS

These packages are the result of conversations with a large number of CDF physicists. In particular, we would like to thank the members of the CDF Reconstruction Working Group, Steve Kaye and Dave Ritson for useful discussions. The ideas on which we have based our work come largely from the PEP-4 Analysis Package. In particular, we wish to acknowledge the work done by Bernard Gabioud for that system. Please note that David R. Quarrie is no longer associated with CDF.

CONTENTS

1 INTRODUCTION
2 WRITING AN ANALYSIS MODULE
  2.1 Types of Modules
  2.2 Module Entrypoints
3 CREATING AN EXECUTABLE IMAGE
  3.1 Using BUILD_JOB
  3.2 Linking ANALYSIS_CONTROL
4 RUNNING ANALYSIS_CONTROL
5 SOME ADVANCED FEATURES
  5.1 Histogram Manipulations
  5.2 Special Notes For HBOOK4 Users
  5.3 Event Selection And Filtering
  5.4 Multiple Analysis Paths and Multiple Output Streams
  5.5 Input Filtering Selection
  5.6 Multiple Parameter Sets
  5.7 Calibration Consumers
  5.8 Adding Banks to the Output Stream
  5.9 Using $ Banks in ANALYSIS_CONTROL
  5.10 Using BLREADs in ANALYSIS_CONTROL
  5.11 Using VMS Sharable Images
6 ANALYSIS_CONTROL INTERFACE TO THE CDF OFFLINE
APPENDIX A ANALYSIS_CONTROL COMMANDS
  A.1 BEGIN_ANALYSIS
  A.2 CALIBRATE
  A.3 CALL
  A.4 CLEAR
  A.5 COMMENT
  A.6 CONTINUE_ANALYSIS
  A.7 DELETE
  A.8 EXIT
  A.9 FILTER
  A.10 HELP
  A.11 HISTOGRAMS
  A.12 INPUT
  A.13 OUTPUT
  A.14 OUTPUT FILE
  A.15 SET
  A.16 SHOW
  A.17 SPAWN
  A.18 STOP
  A.19 TALK_TO
  A.20 TAPE
  A.21 USE_MODULES
APPENDIX B MODULE-DRIVER COMMUNICATION
1 INTRODUCTION

The CDF collaboration consists of several hundred physicists from about 20 institutions. The goals of these physicists, as well as the computer resources available to them, span a considerable range. In spite of this diversity, it is extremely desirable to ensure that all CDF analysis is done within a common framework. This approach allows code to migrate from one application to another (algorithms developed by users can easily be incorporated in the production analysis package, or can easily be borrowed by other users). It also allows physicists to shift from one project to another with a minimum of overhead. In addition, the existence of a common analysis framework encourages the development of high quality, reliable analysis tools, since such tools can be used by the whole collaboration rather than by a limited subset.

In order to satisfy the needs of the CDF community, we therefore have developed a general control structure for driving CDF analysis programs. This structure can be used for Online Calibration and Monitoring, as well as for Standard Reconstruction and Dst Analysis. The ANALYSIS_CONTROL and BUILD_JOB packages together provide a mechanism for painlessly combining several independent subprograms, called modules, into a single executable image. In addition, the system supports a high level of run-time flexibility, allowing the user to modify the default running conditions without having to edit analysis modules he did not write and without having to relink his program. Among the features available to the user within his executable program are:

(a) The ability to specify that only a subset of the linked modules should be run and the ability to override the default order in which these modules are run.

(b) The ability to manipulate the histograms associated with a given analysis module. Each group of histograms can be independently booked, deleted, cleared, displayed and output to disk.

(c) The ability to enter a "parameter override" menu for any modules that supply such menus. This facility provides a mechanism for changing cuts without relinking.

(d) The ability to stop event analysis in the middle of an analysis path on the basis of information provided by an event selection or filter module. The package allows multiple analysis paths to be executed for each event, and each path can have an independent set of filters. If an event passes the filter criteria for more than one path, the analysis modules common to the two paths will be executed only once. (An override mechanism is provided for the applications programmer who wishes to ensure that his module will be run multiple times.)

(e) The ability to specify multiple output streams. For each output stream, the user can specify what banks and record types will be retained. The user can also specify that only those events that pass a set of filter requirements or selected paths will be output.
(f) The ability to specify the source of the input data. Options are currently available for reading events from disk or tape, for reading events from the online event buffer and for accessing the online data acquisition system over DECnet or TCP/IP. The package allows the user to include other "input modules" in his link. For example, a Monte Carlo generator could be included as an input module.

(g) A mechanism for handling modules that can be run with more than one set of default parameters (for example, different physics analyses might use the same electron finding code, but might change the values of the cuts used in that code). For such modules, the user can specify which parameter set he wishes to use and can require that the module be run several times for each event using different parameter sets.

In order to make it easy to add new user code, the package includes a job building procedure. This BUILD_JOB program is used to generate both the code necessary to incorporate a new module into the system and the code that specifies what analysis modules will be linked. Thus, in most cases a user can add his own subprograms to the analysis system without writing any additional code.

2 WRITING AN ANALYSIS MODULE

A module is a collection of subroutines that perform a specific analysis task. A module may be as simple as a single subroutine that is called for every event in order to dump the contents of a specific bank, or it can be a complicated set of code that requires specified tasks to be performed at program initialisation and termination, at the start and end of each run and for each event. A module can also contain (if the author wishes) user settable parameters or switches. Dividing an analysis job into independent modules allows the user to manipulate these modules independently from within his executable image. An author who does not wish to use this feature of the package can, of course, define his complete analysis job to be a single module.

2.1 Types of Modules

ANALYSIS_CONTROL and BUILD_JOB recognize three types of modules: Input, Output and Normal.

Input modules are those sub-programs that perform the input of data to the program. Examples of input modules are the routines to read events from disk or tape, the routines to read events from the online global event buffer, and the routines used to create Monte Carlo events. ANALYSIS_CONTROL requires that at any time one and only one input module be active. More than one such module can be linked concurrently, however, and the user can switch between input modules inside the executable program.

Output modules are those that perform the output of data from a program. Most of the requirements of CDF may be met by a single output module, but the possibility of more specialised ones has been retained.

Normal modules are typical analysis sub-programs. The majority of modules in the system will fall into this category. This note is intended to provide a guide to authors of normal modules. People interested in writing input or output modules should consult Appendix B and Liz Sexton-Kennedy.

2.2 Module Entrypoints

ANALYSIS_CONTROL allows user supplied code to be called under nine different circumstances. A module's author can supply "entrypoints" to be called under any subset of these circumstances. These entrypoints can be separate subroutines, ENTRY statements within a single subroutine, or a combination of the two.
The allowed entrypoints are:

(a) Initialisation - The initialisation entrypoints for all linked modules are called once at the beginning of the program. These initialisation routines can be used to set up default values or to initialise local arrays or control structures.

(b) Run Initialisation - The run initialisation entrypoints for all active modules are called whenever a begin run or begin file record is encountered with a different run number than the previous run, or whenever the run number changes. In addition, whenever the command BEGIN is used to start processing events, this entrypoint is called. The run initialisation entrypoint can be used to zero run-oriented statistics, read run dependent databases or perform any other initialisation that needs to be redone whenever event processing begins.

(c) Event Analysis - The event analysis entrypoints for all active modules are called once for each event (excluding non-data logical records such as begin run records, comment records, etc.). This entrypoint is where most analysis is done. (Note: for online calibration, both event and calibration event records are sent to the event analysis entrypoint.)

(d) End Run - The end run entrypoints for all active modules are called when an end run record is encountered or when the run number changes. This entrypoint is typically used to output run dependent information (as printout, as a file, or as a database entry). (Note: for calibration users, the end run entrypoint is called whenever the specified number of events has been processed, so as to ensure that a calibration database file is written. See Section 5.7 for details.)

(e) Program Termination - The termination entrypoints for all linked modules are called once at the end of the analysis program. Uses of this entrypoint include outputting final statistics and closing appropriate files or output devices.

(f) Other Record - The "other record" entrypoints for all active modules are called for all logical records that are not Event, Begin-Run or End-Run. This entrypoint can be used to access such special records as comment and calibration. It is the author's responsibility to determine what logical record type is present. This can be done by unpacking the LRID bank, by using the offline common JOBSTA (which is defined in C$INC:JOBSTA.INC) or by using the function

    STATUS = ANGRTY(RECTYP)
or
    STATUS = ANALYSIS_GET_REC_TYPE(RECTYP)

where RECTYP is the record type. (A list of possible record types can be found in C$INC:RECTYP.CIN.) STATUS is an integer error return that signals ANSUCC if the package is initialised properly. (The other possible error returns can all be found in A_C$LIBRARY:ANERROR.INC.)

(g) Talk_to - The Talk_to entrypoint provides a means of accessing a user supplied subroutine on demand. This entrypoint is called when ANALYSIS_CONTROL detects the command (either interactively or from a command file):

    TALK_TO <module-name>

One purpose of this entrypoint is to allow a mechanism for setting adjustable parameters within each module. Authors providing a Talk_to entrypoint for this purpose must supply the code that asks the appropriate questions (or displays the appropriate menu). Authors should use the UIPACK user interface routines for such IO (see CDF-372). In addition to protecting the program from invalid responses, the use of UIPACK means that the responses to questions asked by the Talk_to entrypoint can be driven from the same command file as the ANALYSIS_CONTROL dialog. Another use of the Talk_to entrypoint is to drop the user into non-event oriented modules. For example, we have a version of the CDF event display program that can be linked to ANALYSIS_CONTROL. In order to enter it, the user types

    TALK_TO DISPLAY

(h) Histogram Booking - ANALYSIS_CONTROL provides mechanisms for booking, clearing and deleting histograms on demand. In order to use these features of the package, the author must specify to ANALYSIS_CONTROL what subroutine is used to book his module's histograms and, IF he is using HBOOK3 or YHIST, what range of histogram ids the module accesses. For HBOOK4, each user's ids are isolated by being in different HBOOK4 subdirectories. A more detailed discussion of histogramming can be found in Section 5.

(i) Abnormal Termination - This entrypoint will be called for all linked modules if the analysis job terminates abnormally.

When writing a module, the author is free to choose the names of his subroutines. However, authors of code that will be incorporated into the standard reconstruction package should follow the naming conventions specified in CDF-321.
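For concreteness, the following sketch shows the shape of a small normal module. It is a sketch only: the subroutine names match the hypothetical dictionary example of Section 3.1, and the common blocks, cut value and statistics counter are invented for illustration.

      SUBROUTINE MYINIT
C     Program initialisation entrypoint: set default cut values
C     once per job.
      REAL ETCUT
      COMMON /MYPAR/ ETCUT
      ETCUT = 10.0
      RETURN
      END

      SUBROUTINE MYRINI
C     Run initialisation entrypoint: zero run-oriented statistics.
      INTEGER NSEEN
      COMMON /MYSTA/ NSEEN
      NSEEN = 0
      RETURN
      END

      SUBROUTINE MYEVT
C     Event analysis entrypoint: called once for each event.
      INTEGER NSEEN
      COMMON /MYSTA/ NSEEN
      NSEEN = NSEEN + 1
C     ... unpack banks and apply the cuts here ...
      RETURN
      END

      SUBROUTINE MYRFIN
C     End run entrypoint: report run-dependent information.
      INTEGER NSEEN
      COMMON /MYSTA/ NSEEN
      WRITE (6,*) ' MYANA: events seen this run =', NSEEN
      RETURN
      END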
3 CREATING AN EXECUTABLE IMAGE

Once the user has written his code, he must interface it to ANALYSIS_CONTROL. This interface has two parts. First, the author must tell the analysis driver that his module exists. Second, he must provide a list specifying what modules (both standard and user-supplied) he wishes to include in his executable image. While the subroutines to interface a module to ANALYSIS_CONTROL can in fact be written by the code author, an interactive program exists that will automatically write this code. This program, BUILD_JOB, is described in CDF-386 and is the preferred method for code integration.

3.1 Using BUILD_JOB

The BUILD_JOB program is used to create several subroutines needed by ANALYSIS_CONTROL.

1. The routine ANSTUP initialises the primary YBOS array, as well as the YBOS arrays used for the calibration, histogram and user interface packages. (It also can optionally initialise blank common, which is used by HBOOK3, or the PAWC common, which is used by HBOOK4.) The fact that BUILD_JOB is used to write this routine means the user can painlessly change the size of these YBOS arrays, thus tailoring them to meet his analysis needs.

2. The routines XXDECL are the "module declaration routines." Each routine defines a single module to ANALYSIS_CONTROL. These routines tell the driver what subroutines to call for each of the allowed entrypoints described in Section 2.2.

3. The routine ANDECL specifies what modules the user wishes to include in his program. This routine calls the appropriate XXDECL routines and thus forces the correct code to be linked.

The rest of this section describes how BUILD_JOB gets the information it needs to create these routines. In order to activate BUILD_JOB, the user should execute the command:

    @BUILD_JOB$COMMANDS:SETUP

and then type the command BUILD. (Note: BUILD_JOB and ANALYSIS_CONTROL are automatically set up if the user executes the CDF offline setup.) Once inside the program, the user has many options.
For example, he can override the choice of utilities to be initialised, or change the default size of the YBOS arrays associated with these utilities. By default, BUILD_JOB initialises YBOS, UIPACK, HBOOK4 and the Calibration and RunConditions Databases. A given utility can be deactivated by typing

    SET UTILITY/<utility-name>=OFF

or can be activated (for read access) by typing

    SET UTILITY/<utility-name>=ON

In addition, a given database will be initialised for write access if the user types

    SET UTILITY/<database-name>=WRIT

The storage size allocated to any active utility can be changed using the SET SIZE command. A user who is happy with the default utilities and sizes need not execute any of the above commands.

In order to tell BUILD_JOB what modules exist, the user specifies the name of a dictionary (or dictionaries) that describe the modules. A dictionary is read whenever the user gives the command:

    DICTIONARY Filename

where Filename is the name of the dictionary file and where the default file extension (if no other is specified) is .UIC. Note that on UNIX systems a file extension must be used. We now have a number of standard dictionaries. These dictionaries reside in the C$DIC directory of the CDF offline package. To determine what dictionaries are currently available on your VAX, look in this area.

In order to include or define his own module to BUILD_JOB, the user must either DEFINE the module interactively or provide his own DICTIONARY file. An example interactive BUILD_JOB session is provided in Appendix C of CDF-386. We will concentrate here on the case where the module is defined using a dictionary file. The following is an example of a dictionary file. Those wishing to use this file as a template for their own dictionary file can copy it from A_C$COMMANDS:EXAMPLE_DICT.UIC.

    !
    ! Example dictionary definition file
    !
    ! The module is named MYANA, and it uses histograms
    ! with ids in the range 1:100.
    !
    ! BUILD_JOB should write a subroutine defining the
    ! module and call this routine MYDECL.
    ! The run initialisation routine is called MYRINI.
    ! The event analysis is done in subroutine MYEVT.
    ! The run termination subroutine is MYRFIN.
    ! The histograms are booked in the routine MYBOO.
    ! The program initialisation is done by MYINIT.
    ! The "TALK_TO" routine for the module is MYTALK.
    ! The program termination routine is MYFINI.
    ! The abnormal termination routine is MYABTM.
    ! The routine to be called for unusual record
    ! types is MYOTHR.
    !
    ! Define the module:
    !
    Define MYANA/Descrip="My analysis module"/Hist=1:100
    !
    ! Specify the subroutines in the module:
    !
    ! Note: The line Declare="MYDECL" tells BUILD_JOB
    ! that the declaration subroutine it writes should
    ! be called MYDECL
    !
    Declare="MYDECL",
    Run_Init="MYRINI",
    Event="MYEVT",
    Run_Fin="MYRFIN",
    Book="MYBOO",
    Init="MYINIT",
    Talk="MYTALK",
    Finish="MYFINI",
    Abnorm="MYABTM",
    Other="MYOTHR"
    !
    ! Tell BUILD_JOB that the definition is done
    !
    OK
    !
    ! End of file

Several features of the above command file should be noted:

1. Lines starting with ! are comments and are ignored by the BUILD_JOB program.

2. A user does not need to include in the dictionary file any reference to entrypoints he does not use. For example, if the module does not have a program initialisation entrypoint, the line

    Init="MYINIT",

should be omitted from the file. (Note: there is one exception to this rule: BUILD_JOB expects to find an event entrypoint. To tell the program that no such entrypoint exists, the user must explicitly type Event=" ".)

3. A number of module definitions can be put in a single file. The command OK tells BUILD_JOB that the definition of a given module is complete. This command must be given after each definition.

4. Warning: By default, BUILD_JOB only recognises six character entrypoint names. Users who wish to use longer subroutine names MUST execute the command

    SET NAMES/ENTRYPOINT=<length>

BEFORE reading their dictionary file.

Once BUILD_JOB has a list of modules and their characteristics, the user can specify the subset of modules he actually wishes to link. This is done using the LINK command. For example,

    LINK <module1> <module2>

tells BUILD_JOB to create the code needed to make an executable image that contains only the two modules specified. (Note: the standard input and output modules used to read events from a file and to write events to a file are always linked by ANALYSIS_CONTROL. The user does not need to specify them in his BUILD_JOB session.)

When the user has finished specifying his analysis needs, the code necessary for his job is written to a file using the WRITE command, which will prompt the user for a filename. Alternatively, the file is written when the user types EXIT. The default name for this file is BUILD_JOB.CDF. (Users wishing to end a session without producing a file should type QUIT.)
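A minimal BUILD_JOB session using the commands described above might then look like the following sketch (the dictionary name MYDICT and the module name MYANA are the hypothetical ones from the example file):

    BUILD
    SET SIZE/HBK4=500000
    DICTIONARY MYDICT
    LINK MYANA
    EXIT

This enlarges the PAWC store (optional; see Section 5.2), reads the module definitions from MYDICT.UIC, requests that only MYANA be linked and, on EXIT, writes the interface code to BUILD_JOB.CDF.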
3.2 Linking ANALYSIS_CONTROL

Once a user has generated his interface code using BUILD_JOB, he is ready to link. On the VAX, the following steps must be performed:

(a) The user must compile the Fortran file written by BUILD_JOB. (This file is in fact a ".CDF" rather than a ".FOR" file and must be compiled using the EXPAND/XFORT facility.)

(b) The user must create an options file containing a list of all non-standard code he wishes to use. This options file contains the code for his own module, and also must contain the file created by BUILD_JOB. This file can have whatever name the user desires, but should have the extension .OPT. (An options file is a list of object files or libraries that the user wishes to link. On VAXs the names of object files should be separated by commas and a minus sign should appear at the end of each line apart from the last. If libraries are included in an options file, the qualifier /LIB should follow the library name. On UNIX machines the only requirement is that the entries be separated by spaces.) A sketch of an options file follows these steps.

(c) The user must set up the appropriate logical names by executing the following command procedure on the VAX:

    $ @A_C$Commands:Setup

This command file need be executed only once per login session, regardless of how many times the user links. (Note: these logicals also must be set up before the user can run his program. It is therefore advisable to include this setup command in the user's login. The ANALYSIS_CONTROL logicals will automatically be set up if the user executes the offline setup command.)

(d) Standard linking can be done by typing the command LINK_ANA. This symbol activates a command file that will ask the appropriate questions (including the choice of histogram package) and will then perform the link. (The analysis driver requires that one of the histogram packages be linked; it is not possible to concurrently link both.)
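The following is a sketch of what such a VMS options file might contain. All file names are hypothetical, and the BUILD_JOB output is assumed to have been compiled into BUILD_JOB.OBJ:

    ! MYJOB.OPT - objects and libraries for my analysis job
    MYDISK:[ME]MYANA.OBJ,-
    MYDISK:[ME]BUILD_JOB.OBJ,-
    MYDISK:[ME]MYUTIL.OLB/LIB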
On UNIX the only available histogram package is HBOOK4, and currently there is only one optional library, CPS (CPS is for farm executables and is not used by general users). On VMS there are many optional libraries: DI3000, PAWlib, HBK4, HBK3, YHISt, DAQ, MULTinet, MENU, FATmen, and the option of linking to sharable images (SHR) instead of object libraries. Users of the DI3000 links should be warned, however, that the DI3LOAD command procedure does not work if the user puts blanks or tabs within his options file (e.g. between filenames, before a continuation character or at the beginning of a line).

On VMS, if the symbol MAPFILE points to a file, LINK_ANA will create a linker map file. For example:

    $ MAPFILE :== MYSCR:MYJOB.MAP
    $ LINK_ANA ...

On UNIX, if the environment variable MAPFILE points to a file, link_ana will create the closest thing UNIX has to a map file. For example:

    % setenv MAPFILE $MYSCR/myjob.map
    % link_ana ...

4 RUNNING ANALYSIS_CONTROL

ANALYSIS_CONTROL allows the user many options from within his executable image. The package is controlled using the UIPACK command line interpreter and therefore works equally well in interactive mode (taking its commands from the terminal or from a command file) and in batch. Some noteworthy features of this interface are:

(a) Commands may be truncated as long as they remain unambiguous.

(b) Commands may be continued across several lines by typing a minus sign ("-") as the last character in the line.

(c) A response of the form

    ?

will provide the user with a list of all valid responses, and a response of the form

    <command>?

will provide the user with help text for that command.

(d) A response of the form

    @Filename

will cause further responses to be read from the file "Filename" (with default file extension .UIC). Once the commands in this file have been exhausted, further response is taken from the default input (the terminal in interactive mode, the batch command file in batch).

(e) A response of the form

    {Filename

will cause all further responses to be stored in a file "Filename" (with default file type .UIC). These commands will also be executed. The file can be closed by typing the response

    }

A complete list of the commands available at run time is given in Appendix A. Among the most important commands are:

(a) HELP - This command allows the user to get further information about ANALYSIS_CONTROL or about the linked modules.

(b) BEGIN - This command tells the program to start processing events. If input is coming from a file, the command causes the file to be opened. An optional qualifier /NEVENT=n can be used to specify how many events should be processed. Note: for online calibration users, the command BEGIN is replaced by the command CALIBRATE. The difference between these commands is described in Appendix A.

(c) CONTINUE - This command causes the program to continue processing events in an already opened file. If no file is currently open, an error condition is reported. The /NEVENT=n qualifier can be used here as well. Note: this command is not accessible to online calibration users (see Appendix A for details).

(d) USE_MODULES - This command can be used to specify that only a subset of the linked analysis modules should be run. (The default mode is to run all modules.) In addition, the command allows the user to change the order in which the modules are executed.

(e) INPUT - The user can use this command to change between available input modules (e.g., reading from a file and reading from the online event buffer). It is also used to specify the input filename (the default name is CDFINP).

(f) OUTPUT - This command controls whether event output to disk or tape is enabled. (The default is output disabled.) The OUTPUT command can also be used to change the output destination (the default destination is CDFOUT).

(g) SET - This command is used to set a number of driver parameters. For example, the user can specify a list of runs to be analysed and (optionally) a list of events to be analysed for these runs. He also can use the SET command to enable asynchronous input.

(h) SHOW - This command is used to give status information. It can be used to give a list of modules, information about input and output, and information about what modules are active.

(i) TALK_TO - This command is used to access the Talk_to entrypoint for the specified module.

(j) HISTOGRAM - This command is used to book, delete, and clear the histograms associated with a specified module or modules. The driver does not book histograms by default; users wishing histograms enabled must do so using the HISTOGRAM ON command. (Note: this is not true of calibration users, see Section 5.7 for details; also, HBOOK4 users must first use the HISTOGRAM OPEN command.) More information on histograms is available in Section 5.1.

(k) TAPE - (VAX and UNIX only) Allows the user to mount and dismount tapes from within the analysis job.

(l) EXIT - This command causes the program to terminate. (In batch mode, the program will automatically terminate when the list of commands to be executed has been exhausted.)

In ANALYSIS_CONTROL, reasonable default options have been set, so that a user performing a simple task does not need to specify much. For example, if a user enters his program and types the following:

    BEGIN
    EXIT

his job will run all linked modules in the default order. Input will come from the first input module he has specified in BUILD_JOB. If he has not specified an input module, the default will be to read all of the events in the file with logical name CDFINP. If, however, the user wants to run only the module MYANL and wants to analyze only the first ten events in the file MYFILE.YBS, he must type:

    USE MYANL
    INPUT FILE MYFILE.YBS
    BEGIN/NEVENT=10

At this point, the user could EXIT the program. Instead, he can continue reading the same file (with the eleventh event) by typing CONTINUE. Alternatively, before doing any more event processing, the user can change the default conditions. For example, he could demand that the events be written to an output file (using the OUTPUT command), or specify that he wishes to begin reading the same file again (using the BEGIN command), or respecify which analysis modules he wants run (using the USE_MODULES command).
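The same commands can of course be collected into a command file and executed with the @Filename response (or run in batch). The following sketch uses hypothetical file names; the HISTOGRAM commands are explained in Sections 5.1 and 5.2:

    ! MYRUN.UIC - process 1000 events from MYFILE.YBS
    USE_MODULES MYANL
    INPUT FILE MYFILE.YBS
    HISTOGRAM OPEN MYHIST.DAT
    HISTOGRAM ON
    BEGIN/NEVENT=1000
    HISTOGRAM WRITE
    EXIT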
5 SOME ADVANCED FEATURES

5.1 Histogram Manipulations

At present, three histogram packages are in use within CDF. The most commonly used is HBOOK4, the latest CERN histogramming package. Output files produced in A_C can be viewed and manipulated with PAW (another popular CERN product). HBOOK4 is mainly a batch oriented package; however, the user may choose to link in the HPLOT package and interactively view histograms through the module's talk-to entrypoint (as is done for some online consumers).
Another package, YHIST, was developed primarily for use in an interactive graphics environment (it is used by other consumers) and offers compatibility with the YBOS data management package in use throughout CDF. The third package, HBOOK3, is available for backwards compatibility. Since the use of these packages is likely to continue in the future, it is necessary to allow for the possibility of both HBOOK3/4 and YHIST histograms within the analysis driver. While this approach allows the greatest possible freedom, it also involves a certain degree of danger:

(a) Programs may be forced to pay the overhead of linking to both packages and of providing data storage for both YHIST and HBOOK histograms if different modules within the program reference different packages.

(b) Functionality provided by one package may not be available within the other, leading to confusion and frustration. An example of this might be a user attempting to display HBOOK histograms interactively, such functionality not being available in the proposed solution.

In order to minimize these dangers, the Data Reduction Working Group has recommended that HBOOK4 be identified as the supported histogram package for the "standard" environment. Thus modules to be incorporated into the standard set to be used widely throughout CDF should be available with HBOOK4 format subprogram calls. As a result of this recommendation, we have decided to provide the following possibilities within the analysis driver:

(a) The user can specify from within BUILD_JOB whether he wants HBOOK4, YHIST or HBOOK3 to be initialised. (The default is to initialise HBOOK4.) BUILD_JOB will also allow the user to specify the size of the storage space associated with each histogram package.

(b) The user is provided a means of enabling, disabling or clearing histograms from inside the executable image. Both HBOOK3/4 and YHIST versions are available, but ANALYSIS_CONTROL will not support the interactive handling of both histogram packages within the same executable image.

(c) A mechanism for outputting (on demand) the histograms for a given module is provided in ANALYSIS_CONTROL. HBOOK4 files are used as temporary storage buffers if N-tuples are booked; for this reason you must open the file before you book the N-tuples and begin analysis. You can then write to them at any time. For YHIST, histograms are output as a volume and folder (which can later be examined using the YHIST display program). The HBOOK3 version uses the HSTORE command.

In order to use the histogram manipulation features in ANALYSIS_CONTROL, the module's author must do the following things:

(a) When specifying his histogram booking entrypoint in BUILD_JOB, he must also specify the range of histogram ids he wishes to reserve if he is using YHIST or HBOOK3.

(b) Since ANALYSIS_CONTROL allows the user to deactivate or delete histograms from within the analysis job, the user's code should check whether the module's histograms are active before it tries to fill them (a sketch of such a check appears at the end of this section). This check is done by making the following function call:

    HISTAT = ANGHAC()

HISTAT will have a value of .TRUE. if the histograms for the module are both booked and enabled and a value of .FALSE. otherwise. A similar logical function ANGHBO tells whether the histograms for the module have been booked.

(c) In general, the author should let the booking of histograms be done by the analysis driver. If, however, histograms are an integral part of the analysis done by the module (for example, if histograms are used to store information essential for the operation of the module), the author can call his booking routine within his own code. If he does so, he should inform the package that this has been done by making the following call:

    STATUS = ANPHST(HISTAT)
or
    STATUS = ANALYSIS_PUT_HIST_STATUS(HISTAT)

where HISTAT is .TRUE. if the module has just booked its own histograms and .FALSE. if the module has just deleted its histograms.

Warning: ANALYSIS_CONTROL does not book histograms by default. Users wishing histograms enabled must do so using the HISTOGRAM ON command. (Note: this is not true of calibration users. See Section 5.7 for details.)
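As an illustration of item (b) above, an event entrypoint might guard its filling calls as in the following sketch. The histogram id and the quantity ET are hypothetical (the id is assumed to lie in the range declared to BUILD_JOB), and HF1 is the standard HBOOK filling routine:

      SUBROUTINE MYEVT
C     Fill histograms only when they are booked and enabled.
      LOGICAL ANGHAC
      REAL ET
      ET = 0.0
C     ... compute ET from the event banks ...
      IF (ANGHAC()) THEN
         CALL HF1(10, ET, 1.0)
      ENDIF
      RETURN
      END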
5.2 Special Notes For HBOOK4 Users

Support is currently implemented for all HBOOK4 file handling. It is possible to have A_C open more than one histogram file. Each file has an associated histogram stream number, analogous to an output file stream. For the purposes of this note I will restrict the discussion to using only one file.

Since HBOOK4 is the default package in BUILD_JOB, the only thing you must consider at this stage is how big you should dimension the PAWC common block. N-tuple users will want to make it at least 500,000 longwords; this is done with the command:

    SET SIZE/HBK4=500000

BUILD_JOB will then generate the code needed for the HBOOK4 interface.

Special care must be taken when booking N-tuples. First you need to know the top level directory of the HBOOK4 file. You get this information from A_C with the following function:

    STATUS = ANGTH4(STREAM,TOPDIR,FILNAM,OPNOPT)

where STREAM is the stream id for which the information should be returned. The following are RETURNED arguments: TOPDIR is the top directory string, FILNAM is the filename string that the user typed in with the "HIST OPEN..." command, and OPNOPT is a CHARACTER*2 variable indicating the open option, as described as CHOPT in the documentation for HROPEN. An example set of calls to book an N-tuple is:

    NBUFF = 10000
    STATUS = ANGTH4(1,TOPDIR,FILNAM,OPNOPT)
    CALL HBOOKN(NID,TITLE,NVAR,TOPDIR,NBUFF,CHAR)

Note that stream 1 is specified. This will always work if the user uses the defaults at the ANA>> prompt.

The other thing to be careful with is the relative size of NVAR and NBUFF. NBUFF is the size of the buffer in memory that must be used up before the buffer is written to the HBOOK4 disk file. If it is too large, the N-tuple will take up too much of the PAWC common block. If it is too small, writes to disk will happen too frequently. Since HBOOK4/ZEBRA allows these buffers to be read back in, there is some overhead in writing to disk: pointers are stored in the PAWC common for each write. This is why it is possible to write too frequently. Some users have managed to run out of space in the PAWC common by filling it up with pointers. When this occurs, the error is fatal and not very informative.
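Putting these pieces together, a booking routine for a module with a single N-tuple might look like the sketch below. The character variable lengths and the tag names are assumptions; only ANGTH4 and the standard HBOOK4 routine HBOOKN are taken from the discussion above:

      SUBROUTINE MYBOO
C     Histogram booking entrypoint for a module with one N-tuple.
      INTEGER STATUS, NBUFF, ANGTH4
      CHARACTER*80 TOPDIR, FILNAM
      CHARACTER*2 OPNOPT
      CHARACTER*8 TAGS(3)
      DATA TAGS / 'ET', 'ETA', 'PHI' /
C     Ask A_C for the top directory of histogram stream 1.
      STATUS = ANGTH4(1, TOPDIR, FILNAM, OPNOPT)
C     Book N-tuple 100 with 3 variables in that directory.
      NBUFF = 10000
      CALL HBOOKN(100, 'MY NTUPLE', 3, TOPDIR, NBUFF, TAGS)
      RETURN
      END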
A_C uses the HBOOK4 directory structure in order to avoid histogram id collisions between two different user modules. Each A_C module's histograms get booked and filled into its own subdirectory, whose name is the module name. What this means to the user is that someone else's module cannot accidentally fill entries into your histograms; however, when you want to read the HBOOK4 output you will have to change directory to the correct directory to see your histograms. In PAW you can use the command "cd <module-name>". In a Fortran program that reads the output histogram file you can call

    CALL HCDIR('//<top-directory>/<module-name>',' ')

In the ANALYSIS_CONTROL program you must give the following commands at the ANA>> prompt (or in your .UIC file). This is a minimal set:

    HISTOGRAM OPEN
    HISTOGRAM ON
    BEGIN ...
    HISTOGRAM WRITE

N-tuple users should be warned that there is an upper limit to the allowed size of HBOOK4 files. In the way RZ files are implemented in HBOOK4, a maximum number of direct access file records is allocated when the file is opened. The default value is 4000 and can be up to 65000. The record length is the number of longwords per record, so the total number of longwords in the file is the product of these two parameters. The VAX has a maximum record length of 4095 longwords, or 16380 bytes, so there will be an absolute limit no matter what you do. (However, there is a rumor that CERN is planning on changing the way they implement HBOOK4 RZ files in order to allow even larger N-tuples than what is currently available.) One word of caution: if you max out both of these parameters you will probably need a bigger PAWC, so I would suggest doing this only if you are actually having problems. In order to change these values from the defaults, you can use two qualifiers on the "HISTOGRAM OPEN" command:

    /MAX_NREC=<n> and /RECLENGTH=<n> (in longwords)

For a full description of how these parameters affect the allowed size of your N-tuple, see the note written by John Marrafino, PM0082-S, "Writing Large Ntuple Files via HBOOK", in the FNAL Computing Division library. For more information about the "HISTOGRAM OPEN" command see Appendix A.
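To read such a file back in a standalone Fortran program, something along the following lines should work. This is a sketch only: the file name MYHIST.DAT, the top directory name MYTOP, the module name MYANA and the PAWC size are all assumptions, and HLIMIT, HROPEN, HCDIR, HRIN and HINDEX are the standard HBOOK4 routines:

      PROGRAM RDHIST
C     Read one module's histograms back from an A_C HBOOK4 file.
      INTEGER NWPAWC
      PARAMETER (NWPAWC=100000)
      REAL HMEM(NWPAWC)
      COMMON /PAWC/ HMEM
      INTEGER ISTAT, LREC
      CALL HLIMIT(NWPAWC)
C     LREC=0 asks HROPEN to pick up the record length from the file.
      LREC = 0
      CALL HROPEN(1, 'MYTOP', 'MYHIST.DAT', ' ', LREC, ISTAT)
C     Move to the subdirectory named after the module ...
      CALL HCDIR('//MYTOP/MYANA', ' ')
C     ... read all of its histograms into memory and list them.
      CALL HRIN(0, 9999, 0)
      CALL HINDEX
      END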
5.3 Event Selection And Filtering

One function provided by ANALYSIS_CONTROL is the ability to redefine, without relinking, the effect of a filter decision on the data path. The package allows for the possibility of event filtering at many stages within the job. Each module can in fact make a decision about the event. This decision consists of a logical variable, which tells whether the event is "good" or "bad." In addition, the module can provide up to 128 integer words (4096 bits) that categorize the event. These integer words are expected to be a bit pattern, analogous to the online trigger mask, and shall hereafter be referred to as the module's "Tag Bits," since they are meant to tag the reason(s) why the event was accepted or rejected by the module. These Tag Bits will be included in the data stream for the event and therefore will appear on the output file.

Using the FILTER command, the user can specify inside his program whether he wishes to ignore the filter decisions made by an individual module, or change the type of decision made using the /SPECIFY qualifier. There are two types of decisions: SELECT and VETO. SELECT is the default and means that processing will continue if the filter declares this to be a "good" event. A VETO decision means that processing will continue if the filter declares this to be a "bad" event; this feature is useful for users who want to study filter efficiencies and create a sample of events that fail a given filter. In the absence of any FILTER commands whatsoever, the default condition is to keep and continue to analyze all events regardless of any filter decision. By specifying

    FILTER MYANL1 ON

the user tells ANALYSIS_CONTROL to stop processing events if the module MYANL1 says the event is "bad." If, however, the sequence of commands

    FILTER/SPECIFY MYANL1 VETO
    FILTER MYANL1 ON

was given, ANALYSIS_CONTROL will stop processing events if the module MYANL1 says the event is "good."

5.4 Multiple Analysis Paths and Multiple Output Streams

When event reconstruction is done on the main stream CDF data tapes, it is likely that only a subset of all events will be completely analyzed. One possible production scenario is to incorporate filtering routines provided by the various physics analysis groups into a single executable image and to reconstruct all events that pass one or more of the filters. (The reason for doing this within a single program is that there is likely to be significant overlap between the physics filters and it is inefficient to reconstruct the same events more than once.) This possibility puts two requirements on the analysis driver. First, since several physics analyses are being done in parallel, it makes sense to write several output files: one containing the events selected by each physics group. Second, since the different analyses will share some (but not necessarily all) reconstruction code, the package must be able to handle multiple analysis paths concurrently. If a module is needed by two analysis paths, this module should be run only once per event, even if that event passes both paths' event filters.

ANALYSIS_CONTROL provides the capability for multiple analysis paths and multiple output streams. Additional paths can be specified with the USE_MODULES and FILTER commands. For example, the series of commands

    USE_MODULES/PATH=1 MYANL1 STANDARD1 MYANL2
    USE_MODULES/PATH=2 YOURANL1 STANDARD1 YOURANL2 STANDARD2
    FILTER/PATH=1 MYANL1 ON
    FILTER/PATH=2 YOURANL2 ON

sets up two analysis paths. The first path runs module MYANL1, and then runs STANDARD1 and MYANL2 only if MYANL1 said the event was good. The second path runs YOURANL1 and then runs STANDARD1 (unless it has already been run in path 1) and YOURANL2. STANDARD2 is only executed if YOURANL2 also said the event was good. If the user wishes to specify that the events that make it through the first path are written to one file (called OUTPUT1.YBS) and the events that make it through either the second or third path (assuming a third path has also been defined) are written to another (called OUTPUT2.YBS), he can give the following commands:

    OUTPUT/STREAM=1 FILE OUTPUT1.YBS
    OUTPUT/STREAM=1 SELECT EVENT/PATH=1
    OUTPUT/STREAM=2 FILE OUTPUT2.YBS
    OUTPUT/STREAM=2 SELECT EVENT/PATH=(2,3)

If the user wishes to specify that events that pass filter YOURANL2 are written to stream one, and events that pass the logical AND of a set of filters, MYANL1 and YOURANL2, are written to stream two, he can give the following commands:

    OUTPUT/STREAM=1 FILE OUTPUT1.YBS
    OUTPUT/STREAM=1 SELECT EVENT/FILTER=YOURANL2
    OUTPUT/STREAM=2 FILE OUTPUT2.YBS
    OUTPUT/STREAM=2 SELECT EVENT/FILTER=(MYANL1,YOURANL2)
5.5 Input Filtering Selection

This feature is intended to facilitate secondary analysis when the primary analysis was done using filters. Input selection is determined by taking the logical OR of the currently defined requirements. A requirement is a set of filters and their descriptions which are logically ANDed together to make up the requirement decision. A decision about an individual filter is made by comparing the specified filter and its associated description to the corresponding filter information in the appropriate TAGB bank in each event. (If an Analysis Control filter was used in creating the input file, Analysis Control will have made a TAGB bank for it.) Most filters are simple; a name and parameter set number completely specifies them (i.e. the author of the filter has used ANPTRG). Others, most notably TRGSEL, require a TAGB mask to be specified (i.e. the author of the filter has used ANFTAG).

Care must be taken in specifying the filter names and parameter sets, since there is no way for Analysis Control to check that the name you typed in was a valid filter used in making the input file. This means that the person using this feature must know how the input file was produced. One way is to know the standard production filter names and parameter sets; another is to dump the TAGB banks from an event on the file, since the names and parameter sets are invariant within a file (the parameter set number is the first integer after the filter name). Once input filtering has been enabled by specifying one or more requirements, the presence of a nonexistent filter will cause that requirement to fail for every event. If this is the only requirement specified, none of the input file events will be processed. The basic command for setting up a requirement is:

    INPUT SELECT EVENT/FILTER=MYFILTER

A complete description of this and related commands is in Appendix A.

5.6 Multiple Parameter Sets

It is possible that some of the code to define physics objects will have more than one set of default parameters. For example, the missing transverse energy group and the electron group might use the same algorithms to define electrons, but because of different background processes might need different values of some cuts. To handle this possibility, ANALYSIS_CONTROL provides a means of running the same module with more than one parameter set. A module that allows multiple parameter sets must declare this fact to the analysis driver. This can be done from BUILD_JOB using the /PARAMETER_SET[=n] qualifier on the DEFINE command (where n is the number of parameter sets the module allows). The user can then pick the parameter set he wishes to run as part of the USE_MODULES command. For example:

    USE_MODULES MYANL1/PARAMETER_SET=n

specifies that module MYANL1 should be run using the nth parameter set. The program will signal an error if MYANL1 has not declared that it recognizes at least n parameter sets. Note that the parameter set qualifier can take the parameter set name as well; in fact, anywhere you can specify a parameter set number you can specify a parameter name instead. When checking whether a module has already been run for a given event, the package tests independently for each parameter set.

It is the responsibility of the module's author to poll ANALYSIS_CONTROL to determine which parameter set should be used. This polling is done using the function

    PARSET = ANGPAR()
or
    PARSET = ANALYSIS_GET_PARAMETER_SET()

where PARSET is the parameter set the module should use.
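A module's event entrypoint might use this as in the sketch below; the cut values and their association with parameter sets 1 and 2 are hypothetical:

      SUBROUTINE MYEVT
C     Pick up the cut value for the active parameter set.
      INTEGER PARSET, ANGPAR
      REAL ETCUT(2)
C     Hypothetical defaults for parameter sets 1 and 2.
      DATA ETCUT / 10.0, 15.0 /
      PARSET = ANGPAR()
C     ... apply ETCUT(PARSET) in the event selection ...
      RETURN
      END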
5.7 Calibration Consumers

The ANALYSIS_CONTROL framework is used for calibration consumer processes as well as offline analysis. In this environment, however, there are additional requirements placed on the system:

1. Only a subset of the online consumer processes are automatically fired up by RUN_CONTROL. In the case of calibration, this means the consumer framework must make sure that the calibration data is taken only once the correct run has started.

2. Some calibration tasks can act on several different input banks (corresponding to different detector components). It is useful to have a general way of specifying that only a subset of the specified banks should actually be examined. (The desire to handle this problem in a unified manner is driven by our belief that at some future date the online calibration tasks will be driven by RUN_CONTROL. Uniformity in implementation at this stage will make such future enhancements much easier.)

3. Calibration consumers are likely to write calibration database files in their end run entrypoint routines. A mechanism to disable the writing of such files is desirable for program debugging. The calibration user also needs the ability to enable and disable the code that compares the current calibration data with a reference file to create a bad channel list, and the code that displays the results of the calibration to the terminal.

4. In addition, these calibration files should be written whenever the specified number of events has been processed. Since the writing of database files will be done from the endrun entrypoint of the calibration module, it is necessary that this entrypoint be called at the end of event processing, even if no endrun record is encountered.

Because of these requirements, a special calibration command exists in ANALYSIS_CONTROL. This command replaces BEGIN and CONTINUE and has as a required parameter the range of runs for which analysis should be performed. In addition, it allows the user to specify whether database files should be written (using the optional qualifier /[NO]TMP), to specify an input banklist (using the optional qualifier /BANKLIST=), to turn on or off the checking against a reference file (/[NO]CHECKREF) and to enable or disable display activities (/[NO]VIEW). The program performs the appropriate endrun activities whenever the requested number of events has been analysed. Note: switching between analysis and calibration versions of the software does not require relinking the code.

In order to access the TMP, BANKLIST, REF and VIEW information filled by the CALIBRATE command, functions have been provided:

    STATUS = ANGTMP(TMPON)
or
    STATUS = ANALYSIS_GET_WRITE_TMP(TMPON)

returns the logical variable TMPON, which is .TRUE. if a calibration database temporary file should be written and .FALSE. if no file should be written. Similarly, ANGVEW(VIEW) and ANGCHR(CHKREF) perform the same function for the view and checkref flags respectively. If the code author provides a CHARACTER*4 array BNKLST(NDIM), then the function

    STATUS = ANGCBK(NDIM,NUSED,BNKLST)
or
    STATUS = ANALYSIS_GET_CALIB_BANKS(NDIM,NUSED,BNKLST)

returns in that array a list of the banks to be calibrated. NUSED is the number of banks returned. If the dimension of BNKLST is not large enough, then the routine will return the status ANLSTR (list truncated).
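An end run entrypoint of a calibration module might use these functions as in the following sketch. The subroutine name and the bank-by-bank database writing are hypothetical; only ANGTMP and ANGCBK are taken from the text above:

      SUBROUTINE MYCFIN
C     End run entrypoint of a calibration module.
      LOGICAL TMPON
      INTEGER STATUS, NUSED, I
      INTEGER ANGTMP, ANGCBK
      CHARACTER*4 BNKLST(20)
C     Honour the /[NO]TMP qualifier of the CALIBRATE command.
      STATUS = ANGTMP(TMPON)
      IF (.NOT.TMPON) RETURN
C     Find out which banks the user asked to calibrate.
      STATUS = ANGCBK(20, NUSED, BNKLST)
      DO 10 I = 1, NUSED
C        ... write a calibration database entry for BNKLST(I) ...
   10 CONTINUE
      RETURN
      END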
5.8 Adding Banks to the Output Stream

If a user's module creates a YBOS bank that should be written to the output file, then the user must, within his code, copy that bank to the "A" bankset using the YBOS call:

    STATUS = BLIST(IW,'XXXX','A')

where XXXX is the four character bankname.

5.9 Using $ Banks in ANALYSIS_CONTROL

$ banks are YBOS banks whose names begin with "$". Any banks named this way are treated as special banks by both YBOS and ANALYSIS_CONTROL. ANALYSIS_CONTROL does not delete $ banks between event reads; they stay in memory for the duration of the job. In this way they can be used to store information that is constant throughout the job. For example, users may wish to store parameters given in a talk-to entry in a $ bank such that the number of the bank is equal to the parameter set number. In this way the event entry of the same module may inquire from A_C which parameter set the user wanted the module run with, directly locate the corresponding YBOS bank, and use the values stored there for making the appropriate cuts or running the appropriate code. Another use is to accumulate statistics for a job in a $ bank and then output the information to a database or a log file at the end of run or end of job.

It is strongly suggested that you NOT USE $ BANKS AS OUTPUT BANKS. ANALYSIS_CONTROL will not write $ banks on event records; they can exist only on BEGIN, END, and OTHER type records. If you must have them, it is suggested you copy them to a new name just before you exit your module. The only reason to write a bank to an output file is if the next process in the analysis chain is going to read and use the information. However, because of the way YBOS treats $ banks, the above two applications are at odds. If the $ bank is used to hold talk-to parameter set values, then the correct thing for YBOS to do, if it has a $ bank already in memory and one with the same name on a file, is to ignore the one in the file. This is the default in this situation. In the second application you wish to read in some history information about the previous job, so you would need the $ bank from the file. To do this you would use the "INPUT OVERWRITE$" command. However, you cannot run in both modes in the same job, so if you have a job that uses $ banks for both purposes there is no solution. This is why it is better to rename them before you output them.

5.10 Using BLREADs in ANALYSIS_CONTROL

The purpose of ANALYSIS_CONTROL input modules is to somehow generate or read event data into the primary YBOS array. Therefore it is dangerous for user modules to BLREAD data into this array. However, the module may need to read in, say, some private constants database information stored in YBOS format. For this purpose it is recommended that the user module use a secondary YBOS array. Here is an example of how to do so:

C ----------------------------------------------------------------
C
C Declare the YBOS read buffer. It is best to put this in an
C include file if you use it in more than one routine.
C
      INTEGER MYIW
      DIMENSION MYIW(200000)
      INTEGER*2 MYIW2(400000)
      REAL RMYIW
      DIMENSION RMYIW(200000)
      EQUIVALENCE (MYIW(1),RMYIW(1),MYIW2(1))
      COMMON /MYIW/ MYIW
C
C ----------------------------------------------------------------
C ...Declare some more stuff...
C
C Initialise the secondary YBOS array
C ===================================
      STATUS = BOSAR(MYIW,200000,'/MYIW/',256)
C
C Now you can read into it
C ========================
      STATUS = UIGTLU(LUN)
      STATUS = BLREAD(LUN,MYIW,'S',NWORDS)
C
C ...You can then do BLOCATs and anything else you normally do,
C except instead of using IW use MYIW, and instead of using
C BCS.INC use your own include file.
5.11 Using VMS Sharable Images

Many of the utilities that ANALYSIS_CONTROL links to support VMS sharable image versions of their libraries. The advantages of using them are:

1. They reduce user linking time, since symbols in the images are already resolved.

2. The size of the user's resulting EXE is much smaller, so they save on disk space.

3. Without relinking, it is possible to change the size of any YBOS array, or to use a debug version of the utility instead of a nondebug version, by simply redefining a logical name.

4. If a bug fix is made to a utility, the user does not have to relink his job; he will get the change as soon as he reruns it.

5. If many of the jobs on the system are using the same utility (e.g. YBOS), there should be an increase in performance. The program should run faster, since VMS will not have to make as many disk accesses.

For those who would like to link to sharable images of the utilities instead of the libraries, here is the procedure. Make sure that you are happy with the size of the YBOS arrays that are set up in your BUILD_JOB procedure. BUILD_JOB creates a commands file that must be executed before linking and running with sharable images; this file defines all of the logicals you need to link or run your job. Then, when you run LINK_ANA, add the SHR key to the optional library list. For example:

    $ @MYMOD_BUILD_JOB_OUTPUT
    $ LINK_ANA MYMOD MYMOD N "SHR,PAW,HBK4,..."

If you want to take advantage of item 3 above, you just redefine $library to be $debuglib just before you run the job; however, there are some extra debugger commands you will need to use in order to debug a problem inside the image. For this reason it is probably best that you bring the problem to the appropriate consultant if you believe there is a bug that requires you to look at utility code.

6 ANALYSIS_CONTROL INTERFACE TO THE CDF OFFLINE

There are a limited number of places where ANALYSIS_CONTROL and its standard I/O modules perform actions for the CDF Offline package. Here is the list:

1. If the user has specified to BUILD_JOB that offline initialization should be done, with the command "SET UTIL/OFFLINE=ON", A_C will call the offline initialization routine ANOFFI. This routine initializes the JOBSTA common and other miscellaneous offline commons. It reads the geometry database (pointed to by the logical or environment variable CDFGEOM, or defaulted), which holds the detector geometry constants. It also reads the CDF particle properties database (pointed to by the logical or environment variable CDFPARP).

2. Most A_C input modules fill the JOBSTA common for each event with a call to the ANFCOM routine. This routine unpacks information from the LRID and EVCL banks, as well as setting the magnetic field variable BMAGNT. For Monte-Carlo data it also sets the center of mass variable ROOTS from the GENH bank or a default value. When it is enabled, this routine also fills the TAGC trigger bank.

3. A_C can also optionally extend the TPID bank with the name of the current input file. If all processes in the analysis chain use this, there will be a complete history of the origin of the event.

4. Most of the A_C output modules will fill in missing information in the LRID bank. For real data, most of the information is filled by the online data logger (which is an A_C program). However, the offline version number and A_C version number are filled every time the bank is written out. In this way you can always tell which version of the code last looked at or modified the event.

APPENDIX A

ANALYSIS_CONTROL COMMANDS

A.1 BEGIN_ANALYSIS

Syntax

    BEGIN_ANALYSIS[/Qualifiers]

Action

Begin processing events. The command causes the program to begin reading events using the currently active input module. If input is coming from the standard READ_FILE module, this means that the currently defined input file will be opened and read.
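For example (the values here are arbitrary), the following command would skip to event 1001, process 500 events and report progress every 60 CPU seconds, using qualifiers described below:

    BEGIN_ANALYSIS/FIRST_EVENT=1001/NEVENT=500/CPU_REPORT=60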
APPENDIX A

ANALYSIS_CONTROL COMMANDS

A.1 BEGIN_ANALYSIS

Syntax

    BEGIN_ANALYSIS[/Qualifiers]

Action

Begin processing events. The command causes the program to begin reading events using the currently active input module. If input is coming from the standard READ_FILE module, this means that the currently defined input file will be opened and read.

Optional Qualifiers

/[NO]CLEAR

Using the NOCLEAR qualifier will instruct Analysis_Control to call the begin run entrypoints for all enabled modules only once, at the very beginning of the job. This is intended for people who want to keep statistics over many runs/files. Note that /CLEAR is the default, so the /NOCLEAR option must be used every time a BEGIN or CONTINUE command is entered, even within the same job.

/COMMENT=

This qualifier causes the specified text string to be displayed following the BEGIN Command. The text string should be enclosed in quotes if it contains embedded spaces, commas, etc. Its main purpose is for the Online Data Logger, which is driven automatically via RUN_CONTROL rather than via an operator at the keyboard.

Default: No comment will be displayed.

/CPU_REPORT=

This qualifier causes the current Run and Event Number to be reported at regular intervals (determined by the elapsed CPU time) during analysis. The reporting interval should be specified in seconds.

Default: In the absence of this qualifier, the reporting frequency will be that determined by a prior SET REPORT/CPU_TIME= Command. The overall default is for no reporting to occur.

/DELAY=

This qualifier causes a delay between every event. Its main use is for the Online Data Logger, where it allows that program to sample events at a low rate. The delay should be specified in seconds.

Default: In the absence of this qualifier, the delay between events will be that determined by a prior SET DELAY Command. The overall default is for no delay to be present.

/ELAPSED_REPORT=

This qualifier causes the current Run and Event Number to be reported at regular intervals (determined by the elapsed clock time) during analysis. The reporting interval should be specified in seconds.

Default: In the absence of this qualifier, the reporting frequency will be that determined by a prior SET REPORT/ELAPSED_TIME= Command. The overall default is for no reporting to occur.

/FIRST_EVENT=

This qualifier may be used to specify the event number of the first event to be processed. Analysis_Control will bypass all preceding events but pass this event and all subsequent events to the analysis modules.

Default: First event encountered.

/GOOD_EVENTS=

This qualifier is meant to be used in conjunction with event filters. It allows the user to specify that events should be processed until the given number have successfully passed all active filters. If no filters are active, this qualifier is identical to the /NEVENT qualifier.

Default: None

/NEVENT=

Number of events to be processed.

Default: All events

/[NO]PROCESSED

This qualifier is used in conjunction with the reporting qualifiers /REPORT_EVENTS, /CPU_REPORT and /ELAPSED_REPORT. It modifies the reporting text to include (or exclude) the number of events that have been processed, as well as the current run number and event number.

Default: In the absence of this qualifier, the reporting format will be that determined by a prior SET REPORT/[NO]PROCESSED Command. The overall default is for the number of processed events not to be reported.

/REPORT_EVENTS=

This qualifier causes the current Run and Event Number to be reported at regular intervals (determined by the qualifier value) during analysis.

Default: In the absence of this qualifier, the reporting frequency will be that determined by a prior SET REPORT/EVENTS= Command. The overall default is for no reporting to occur.

/[NO]RUN_REPORT

This qualifier causes the current Run and Event Number to be reported during analysis every time the event just processed had a different run number than the last event. NORUN_REPORT disables this feature.

Default: In the absence of this qualifier, the reporting frequency will be that determined by a prior SET REPORT Command. The overall default is for no reporting to occur.

/SKIP_EVENT=

This qualifier allows the user to specify that a number of event records be skipped before any events are analyzed.

Required Parameters

None

Note: For Calibration users, the command BEGIN has been replaced by the command CALIBRATE.
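As an illustration (all of the numbers below are arbitrary), the following command skips the first 50 event records, processes the next 1000 events, and reports the current run and event number every 100 events:

    BEGIN_ANALYSIS/SKIP_EVENT=50/NEVENT=1000/REPORT_EVENTS=100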
A.2 CALIBRATE

Syntax

    CALIBRATE[/Qualifiers]

Action

This command replaces the BEGIN and CONTINUE commands for calibration users. It tells the program to begin processing events, where the range of runs tells which calibration runs should be analyzed. This range can be entered either as a single number or as two numbers separated by a colon. For example, CALIBRATE 10 would process only run number 10, and CALIBRATE 1:10 would process all runs from 1 to 10.

Note: all qualifiers must be attached to the CALIBRATE verb. The expression CALIBRATE 1:10/NEV=2 is NOT legal!

Optional Qualifiers

/[NO]ACC

Specifies whether accepted calibration database files should be written at the end of data processing (in the modules' end run entrypoints).

Default: ACC

/BANKLIST=

Specify a list of which input banks should be analyzed. The four-character bank names should be concatenated without spaces.

Default: Analyze the default required banklist specified in the module's declaration routine.

/[NO]CHECKREF

Specifies whether the calibration task should check a reference calibration file and create a bad channel entry for those channels that have changed relative to the reference.

Default: CHECKREF

/[NO]CLEAR

Specifies whether the histograms associated with the calibration should be cleared before starting the calibration.

Default: CLEAR

/COMMENT=

This qualifier causes the specified text string to be displayed following the Command. The text string should be enclosed in quotes if it contains embedded spaces, commas, etc. Its main purpose is for the Online Data Logger, which is driven automatically via RUN_CONTROL rather than via an operator at the keyboard.

Default: In the absence of this qualifier, no comment will be displayed.

/CPU_REPORT=

This qualifier causes the current Run and Event Number to be reported at regular intervals (determined by the elapsed CPU time) during analysis. The reporting interval should be specified in seconds.

Default: In the absence of this qualifier, the reporting frequency will be that determined by a prior SET REPORT/CPU_TIME= Command. The overall default is for no reporting to occur.

/DELAY=

This qualifier causes a delay between every event. Its main use is for the Online Data Logger, where it allows that program to sample events at a low rate. The delay should be specified in seconds.

Default: In the absence of this qualifier, the delay between events will be that determined by a prior SET DELAY Command. The overall default is for no delay to be present.

/ELAPSED_REPORT=

This qualifier causes the current Run and Event Number to be reported at regular intervals (determined by the elapsed clock time) during analysis. The reporting interval should be specified in seconds.

Default: In the absence of this qualifier, the reporting frequency will be that determined by a prior SET REPORT/ELAPSED_TIME= Command. The overall default is for no reporting to occur.

/NEVENT=

Number of events to be processed.

Default: All events

/[NO]PROCESSED

This qualifier is used in conjunction with the reporting qualifiers /REPORT_EVENTS, /CPU_REPORT and /ELAPSED_REPORT. It modifies the reporting text to include (or exclude) the number of events that have been processed, as well as the current run number and event number.

Default: In the absence of this qualifier, the reporting format will be that determined by a prior SET REPORT/[NO]PROCESSED Command. The overall default is for the number of processed events not to be reported.

/REPORT_EVENTS=

This qualifier causes the current Run and Event Number to be reported at regular intervals (determined by the qualifier value) during analysis.

Default: In the absence of this qualifier, the reporting frequency will be that determined by a prior SET REPORT/EVENTS= Command. The overall default is for no reporting to occur.

/[NO]RUN_REPORT

This qualifier causes the current Run and Event Number to be reported during analysis every time the event just processed had a different run number than the last event. NORUN_REPORT disables this feature.

Default: In the absence of this qualifier, the reporting frequency will be that determined by a prior SET REPORT Command. The overall default is for no reporting to occur.

/[NO]SUM

Specifies whether summary calibration database files should be written at the end of data processing (in the modules' end run entrypoints).

Default: SUM

/[NO]TMP

Specifies whether temporary calibration database files should be written at the end of data processing (in the modules' end run entrypoints).

Default: TMP

/[NO]VIEW

Specifies whether the results of the calibration should be "viewed" (ie. that results should be printed to or plotted on the terminal).

Default: NOVIEW

Required Parameters

Run Number: Run number (or range of run numbers specified as m:n)
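For example (the run range below is arbitrary), the following command processes calibration runs 100 through 110, suppresses the temporary database files, and displays the results on the terminal; note that all qualifiers are attached to the verb, as required:

    CALIBRATE/NOTMP/VIEW 100:110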
A.3 CALL

Syntax

    CALL <Module Name> <Entrypoint>

Action

Call the specified entrypoint for the specified module. This command is intended to allow the user to call the begin or end run entrypoint for a specified module without reading any events.

Optional Qualifiers

None

Required Parameters

Module Name: Module Name (from the linked Module list)

Entrypoint: Module Entrypoint. The possible entrypoint names are: BEGIN_RUN, END_RUN, TALK_TO, EVENT

A.4 CLEAR

Syntax

    CLEAR[/Qualifiers]

Action

Clear the given counter or timer.

Required Parameters

Item to be cleared. It is possible to clear the following:

TIMING

Clears the CPU timing statistic for the specified module if the module qualifier is used; otherwise it clears the CPU timing statistic for all modules in the default module group (ie. it won't clear input or output modules unless they are explicitly cleared with the /MODULE=... qualifier). The syntax of this command is:

    CLEAR TIMING/MODULE=<module name>

FILTER_STATS

Clears the events-seen and events-passed statistics for the specified filter if the filter qualifier is used; otherwise it clears these statistics for all filter modules. The syntax is:

    CLEAR FILTER_STATS/FILTER=<module name>/PARAMETER_SET=<n>

The parameter_set qualifier is optional. If it is not used, parameter set 1 of the specified filter will be cleared.

MODULE_COUNTER

Clears the module counter statistic (which is the number of times the module was executed) for the specified module if the module qualifier is used; otherwise it clears the module counter statistic for all modules in the default module group (ie. it won't clear input or output modules unless they are explicitly cleared with the /MODULE=... qualifier). The syntax is:

    CLEAR MODULE_COUNTER/MODULE=<module name>
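For example, using the filter module FRED that appears elsewhere in this appendix (the module name and parameter set number are purely illustrative), the events-seen and events-passed statistics for its second parameter set would be cleared with:

    CLEAR FILTER_STATS/FILTER=FRED/PARAMETER_SET=2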
A.5 COMMENT

Syntax

    COMMENT "Comment Text"

Action

Echoes the specified Comment Text on the current output device. The command just echoes the specified text string (which should be enclosed in quotes if it contains spaces or commas) on the current output device. It is designed to allow for annotation from within Indirect Command Files.

Optional Qualifiers

None

Required Parameters

Comment Text: Text of Comment (should be enclosed in quotes)

A.6 CONTINUE_ANALYSIS

Syntax

    CONTINUE_ANALYSIS[/Qualifiers]

Action

Continue processing events. This command is only meaningful when the user has partially processed an input file (for example, the user had specified BEGIN/NEVENT=5 on a file with more than 5 events). This command will continue processing events in the file starting with the next event. If no file is open, it will signal an error. This command acts identically to a BEGIN Command if no prior BEGIN Command was specified.

Optional Qualifiers

/[NO]CLEAR

Using the NOCLEAR qualifier will instruct Analysis_Control to call the begin run entrypoints for all enabled modules only once, at the very beginning of the job. This is intended for people who want to keep statistics over many runs/files. Note that /CLEAR is the default, so the /NOCLEAR option must be used every time a BEGIN or CONTINUE command is entered, even within the same job.

/COMMENT=

This qualifier causes the specified text string to be displayed following the Command. The text string should be enclosed in quotes if it contains embedded spaces, commas, etc. Its main purpose is for the Online Data Logger, which is driven automatically via RUN_CONTROL rather than via an operator at the keyboard.

Default: In the absence of this qualifier, no comment will be displayed.

/CPU_REPORT=

This qualifier causes the current Run and Event Number to be reported at regular intervals (determined by the elapsed CPU time) during analysis. The reporting interval should be specified in seconds.

Default: In the absence of this qualifier, the reporting frequency will be that determined by a prior SET REPORT/CPU_TIME= Command. The overall default is for no reporting to occur.

/DELAY=

This qualifier causes a delay between every event. Its main use is for the Online Data Logger, where it allows that program to sample events at a low rate. The delay should be specified in seconds.

Default: In the absence of this qualifier, the delay between events will be that determined by a prior SET DELAY Command. The overall default is for no delay to be present.

/ELAPSED_REPORT=

This qualifier causes the current Run and Event Number to be reported at regular intervals (determined by the elapsed clock time) during analysis. The reporting interval should be specified in seconds.

Default: In the absence of this qualifier, the reporting frequency will be that determined by a prior SET REPORT/ELAPSED_TIME= Command. The overall default is for no reporting to occur.

/FIRST_EVENT=

This qualifier may be used to specify the event number of the first event to be processed. Analysis_Control will bypass all preceding events but pass this event and all subsequent events to the analysis modules.

Default: Any event

/GOOD_EVENTS=

This qualifier is meant to be used in conjunction with event filters. It allows the user to specify that events should be processed until the given number have successfully passed all active filters. If no filters are active, this qualifier is identical to the /NEVENT qualifier.

Default: None

/NEVENT=

Number of events to be processed.

Default: All events

/[NO]PROCESSED

This qualifier is used in conjunction with the reporting qualifiers /REPORT_EVENTS, /CPU_REPORT and /ELAPSED_REPORT. It modifies the reporting text to include (or exclude) the number of events that have been processed, as well as the current run number and event number.

Default: In the absence of this qualifier, the reporting format will be that determined by a prior SET REPORT/[NO]PROCESSED Command. The overall default is for the number of processed events not to be reported.

/REPORT_EVENTS=

This qualifier causes the current Run and Event Number to be reported at regular intervals (determined by the qualifier value) during analysis.

Default: In the absence of this qualifier, the reporting frequency will be that determined by a prior SET REPORT/EVENTS= Command. The overall default is for no reporting to occur.

/[NO]RUN_REPORT

This qualifier causes the current Run and Event Number to be reported during analysis every time the event just processed had a different run number than the last event. NORUN_REPORT disables this feature.

/SKIP_EVENT=

This qualifier allows the user to specify that a number of event records be skipped before any events are analyzed.

Required Parameters

None
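For example, after a BEGIN/NEVENT=5, the following command processes the next 5 events in the same file while keeping the statistics accumulated so far (remember that /NOCLEAR must be repeated on every BEGIN or CONTINUE command):

    CONTINUE_ANALYSIS/NEVENT=5/NOCLEAR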
A.7 DELETE

Syntax

    DELETE

Action

Delete the appropriate control banks from ANALYSIS_CONTROL.

Optional Qualifiers

None

Required Parameters

Item: Item to be deleted. Possible items are:

EVENT_LIST[/Qualifiers]

Delete the list specifying what events should be analyzed (see the SET command for an explanation of event lists).

    /RUN_LIST=
    If this qualifier is used, only the run list for the given event list is deleted.

NOTHING

Return to the menu without taking any action.

RUN_LIST

Delete the list specifying what runs should be analyzed (see the SET command for an explanation of run lists).

A.8 EXIT

Syntax

    EXIT[/Qualifiers]

Action

Exit from the program. The action of the EXIT Command may be modified by using the /DISABLE and /ENABLE Qualifiers. If the EXIT Command has been disabled, then typing EXIT will cause an error to be reported and the command to be ignored. The correct sequence of Commands in this instance is EXIT/ENABLE EXIT, which re-enables the EXIT Command and then executes it.

Optional Qualifiers

/DISABLE

Disables the EXIT Command until it is re-enabled via the EXIT/ENABLE Command.

/ENABLE

Enables the EXIT Command following an EXIT/DISABLE Command.

/EOJ_RECORD

Using this qualifier will cause an EOJ record to be written to every output stream before closing it. This record will contain an appropriate LRID bank and any banks created by an Analysis_Control Program Termination entrypoint, for all linked modules. This feature is intended to allow job summary banks to be written to all output files.

Required Parameters

None
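For example, to leave the program and append a job-summary EOJ record to every output stream before the streams are closed:

    EXIT/EOJ_RECORD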
A.9 FILTER

Syntax

    FILTER[/Qualifiers]

Action

Controls whether event filters are active. If the FILTER command is not invoked, the default condition is for all filters to be OFF. The <state> parameter can be either ON or OFF. If the ON command is given, Analysis_Control will terminate event processing after the specified module unless that module has set the global accept flag to .TRUE. using the function ANPTRG. If the ON command is given AND a VETO decision has been specified with a previous FILTER/SPECIFY command, Analysis_Control will terminate event processing after the given module unless that module has set the global accept flag to .FALSE. using the function ANPTRG.

Note: Analysis_Control will signal an error unless the specified module (and parameter set) are active (ie. in the analysis path currently set up).

Required Parameters

Module Name: Name of Filter Module

State: Filter State (may be ON or OFF)

Optional Qualifiers

/PATH=

This qualifier is used to activate and deactivate filters on secondary analysis paths. For example, the command FILTER/PATH=2 FRED ON tells Analysis_Control to stop the path 2 analysis unless the filter FRED says this is a "good" event. (Note: Analysis_Control will signal an error unless the specified path has already been defined and unless the specified module is in that path.)

/PARAMETER_SET=

This qualifier allows the user to activate the non-default parameter sets of a module. The qualifier can be placed after the verb FILTER, or after the module name. If it is placed after the verb, the parameter set must be specified by number. If it is placed after the module name, either the parameter set name or number can be specified. Thus, if we have in the path a filter module named FRED with two parameter sets, the second of which has the name MIN_BIAS associated with it, the following three syntaxes are allowed:

    FILTER/PARAM=2 FRED ON
    FILTER FRED/PARAM=2 ON
    FILTER FRED/PARAM=MIN_BIAS ON

/[NO]STATISTICS

This qualifier allows the user to enable or disable the tallying of event statistics for the specified filter module. The qualifier can be placed after the verb FILTER, or after the module name.

Default: By default statistics tallying is enabled.

/SPECIFY

This qualifier is used to modify what type of filter the specified filter is. The qualifier can be placed after the verb FILTER, or after the module name. A filter may be either a select or a veto type filter. If it is a select filter, an event will pass if the global accept flag is .TRUE.. If it is a veto filter, an event will pass if the global accept flag is .FALSE.. If this qualifier is used, a <state> parameter cannot be used; in this case the third parameter on the command line should be either VETO or SELECT. Thus the syntax used with this qualifier is:

    FILTER/SPECIFY FRED VETO
    -or-
    FILTER/SPECIFY FRED SELECT

Default: By default a given filter is assumed to be a select filter.
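For example (using the module FRED from the text above), the following sequence declares FRED to be a veto filter and then activates it on analysis path 2; path 2 processing will then be terminated for any event in which FRED sets the global accept flag to .TRUE.:

    FILTER/SPECIFY FRED VETO
    FILTER/PATH=2 FRED ON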
A.10 HELP

Syntax

    HELP[/Qualifiers]

Action

Provides Help text for all relevant commands. ANALYSIS_CONTROL contains a set of nested help files (similar in format to VAX help). To access these files the user must type:

    HELP <topic>

The program will then type the appropriate help file and prompt for the next help topic. To exit from help, just type <RETURN> to each help prompt.

Note: If you do not know what command you wish to find out about, just type HELP. A list of commands will then be displayed.

Analysis_Control also allows the user access to any help files provided by the modules that have been linked. Typing HELP/MODULE will place the user in the appropriate help file. If no such file has been provided, the module name and descriptive text will be displayed.

A.11 HISTOGRAMS

Syntax

    HISTOGRAM[/Qualifiers] <Action>[/Qualifiers]

Action

Control histogram manipulations.

Optional Qualifiers

/CREATE_CHAPTER

Creates a YHIST chapter containing all the histograms for this module. (Note: the HBOOK version of Analysis_Control ignores this qualifier.)

/MODULE=

Name of the module containing the histograms to be manipulated.

Default: All modules with histograms

Required Parameters

Action: Action to be performed. Valid actions are:

CLOSE[/STREAM=]

This command is only useful for HBOOK4. If no stream is specified, stream one is closed. This command will close the file, which may be useful if you want to look at the file with another process.

DELETE

Delete the histograms for the specified modules. Note: this command is not yet available for HBOOK4.

DIRECTORY[/MEMORY][/STREAM=][/MODULE=]

This command will give a directory of the currently written histograms for all modules, or for the module specified with the /MODULE qualifier. For HBOOK4, if no stream is specified, the histograms written to stream 1 will be reported by default; if you have not yet issued the "HIST WRITE" command there will be nothing listed. If you would like to see a listing of the directories and contents in memory, you can use the /MEMORY qualifier; this will work even before the "HIST WRITE" command.

FETCH

Displays the available folders and volumes in the current YHIST directory file, prompting for the folder, then the volume to be used. The specified volume is then placed in the histogram array, overwriting the existing contents. Note: this command is not available for HBOOK4; however, the HISTOGRAM OPEN command allows you to open an already existing file in update mode. You can then modify it.

OFF[/MODULE=]

Sets a flag specifying that the histograms should be deactivated.

ON[/STREAM=][/MODULE=]

Sets a flag specifying that the histograms should be booked, or reactivated if they have already been booked. The module qualifier is important if one of your linked modules calls YHIST booking entries; in this case you can force booking of histograms for only the module that uses HBOOK4. The stream qualifier tells A_C which file you would like the N-tuples booked into.

OPEN[/STREAM=][/READ_ONLY][/TOP_DIRECTORY=][/UPDATE][/FILE_NAME=][/MAX_NREC=][/RECLENGTH=]

This command is only useful for HBOOK4. If no stream is specified, stream one is opened. If READ_ONLY is specified, the file will be opened for read access. If UPDATE is specified, the file will be opened in update mode. If both READ_ONLY and UPDATE are specified, the last one on the command line will be used. If neither is specified, a new file will be created. (Caution: on a UNIX machine, if the file already exists it must be opened in either READ_ONLY or UPDATE mode.) If a TOP_DIRECTORY is not specified, the value defaults to 'ACHBK4'. If a FILE_NAME is not specified, the value defaults to 'AC_HBK4.DAT'. If MAX_NREC is specified, it will be used instead of the default value of 4000. If RECLENGTH is specified, it will be used instead of the default value of 1024. See Section 5.2 for details on the last two qualifiers.

READ

Reads in a set of preexisting histograms from a file. The user is prompted for the filename. (Note: This command works for both the HBOOK3 and YHIST versions of Analysis_Control, but not yet for HBOOK4. It, however, bypasses the YHIST Volume and Folder directory mechanism.)

STORE

Stores the contents of the YHIST histogram array in a volume on disk and records this information in the current YHIST directory file. The user is prompted first for the YHIST volume to be written and then for the YHIST folder. (Note: For the HBOOK3 version of ANALYSIS_CONTROL, this command performs the identical operation to WRITE. Histograms are stored (using HSTORE) on a file with the filename specified by the Volume parameter. The folder parameter is ignored.) This command is not used for HBOOK4 applications.

WRITE[/ID=][/MODULE=][/STREAM=]

This command will write out the requested IDs, or the IDs belonging to the requested module. If no qualifiers are used, it writes all currently booked histograms to a file. For HBOOK3 and YHIST the user is prompted for the filename. For HBOOK4 the file associated with the specified stream number is written to. If no stream is specified, the histograms will be written to whatever file is associated with stream one.

ZERO

Clear the histogram contents. Note: this command is not yet available for HBOOK4.
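As an illustration for HBOOK4 users (the stream number and file name are arbitrary), a minimal session that creates a histogram file, writes all currently booked histograms to it, and closes it might look like:

    HISTOGRAM OPEN/STREAM=1/FILE_NAME=MYHIST.DAT
    HISTOGRAM WRITE/STREAM=1
    HISTOGRAM CLOSE/STREAM=1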
A.12 INPUT

Syntax

    INPUT

Action

Control the input of data.

Optional Qualifiers

None

Required Parameters

Action: The action required. Possible actions are:

FILE_NAME

Specify the input file to be read (relevant only if the READ_FILE input module is in use). The syntax of this command is:

    INPUT FILE_NAME <file name>

This file can be either a disk or tape file, and will be opened immediately upon receipt of this command. If another input file is already open, it will be closed before the new file is opened. This command may also be used to set up a queue of input files, to be used in the order they are typed in:

    INPUT FILE "<file1>, <file2>, ..."

These files will be processed as if they had been concatenated together. For VAXs and UNIX machines, wildcards may be used in specifying input files for either the queue or a single file specification. A "*" can be used anywhere it is also valid in DCL, including tape drives. So if you don't know the names of the files on a certain tape you can specify the files using:

    INPUT FILE DRIVEn:*

and Analysis_Control will sequentially use all files on the tape. If wildcards are used, A_C will echo the name of each file as it opens it. This will allow you to correlate a particular error message with a particular file. (Of course, if the error is FATAL, ANTERM will also tell you the name of the input file the error occurred in.)
For users who have the Cern FATMEN (File and Tape Management System) package installed on their machines (eg. FNALD) there is a command you can give that will tell A_C that the input file name given is a FATMEN file specification and that it should reference the FATMEN catalog to find the file. The command is:

    INPUT FILE/FATMEN -
    "//FNAL/CDF/PROD/EXP/OTH/TEST/E35904AA"

For more information see the FATMEN$DOC area or CDF1677.MEM.

MODULE

Specify what input module to use. The syntax of this command is:

    INPUT MODULE <module name>

If an input module is already active and has an input file open, the file will be closed implicitly by this command. Note: A list of valid input module names may be displayed using the SHOW INPUT Command.

YBOS_MODE

Specify what YBOS input mode to use. The input mode must be selected before the input file is opened, therefore it must be specified before the INPUT FILE command. The syntax of the command is:

    INPUT YBOS_MODE FAST
    -or-
    INPUT YBOS_MODE NORMAL

Normal is the default. If fast mode is specified, Analysis_Control will use the optimized version of the YBOS input routines. The optimized version will typically result in a 30% reduction in processing time. These routines cannot, however, be used for network file access; for this application NORMAL YBOS mode should be used. Note: The SHOW INPUT command has been modified to show the input mode selected.

NODE_NAME

Specify the remote node where the data acquisition system is being run. This command is only relevant if the NETWORK_READ input module is in use. This input module accesses events in realtime from the CDF data acquisition system running on the specified remote node. The syntax for this Command is:

    INPUT NODE_NAME <node name>

Note: The network link will only be opened on receipt of a subsequent BEGIN Command. It is possible that a DECnet timeout occurs whilst the link is being established, in which case the BEGIN Command should be repeated.

DROP

Specify a bank or list of banks to be dropped from the primary YBOS array before any analysis is performed. The syntax of this command is:

    INPUT DROP <bank list>

where <bank list> is either an individual bank name or a list of bank names. Note: A second call to the DROP command will add to the list of dropped banks. This list can be deleted using the command:

    INPUT/CANCEL DROP

RENAME

Specify that a bank in the primary YBOS array be renamed before any analysis is performed. The syntax of this command is:

    INPUT RENAME XXXX YYYY

where XXXX is the name of the bank on the input file and YYYY is the name after renaming. Note: A second call to the RENAME command will add to the list of renamed banks. This list can be deleted using the command:

    INPUT/CANCEL RENAME

COPY

Specify that a bank in the primary YBOS array be copied before any analysis is performed. The syntax of this command is:

    INPUT COPY XXXX YYYY

where XXXX is the name of the bank on the input file and YYYY is the name after copying. Note: A second call to the COPY command will add to the list of copied banks. This list can be deleted using the command:

    INPUT/CANCEL COPY

Note: The copied bank will not be considered part of the input list, so if you want it to be output you must tell A_C to do so with an "OUTPUT SELECT KEPTBANKS YYYY" command.

OVERWRITE$

Changes the default $ bank handling. If this command is used, $ banks from the input file will overwrite $ banks in the current YBOS array if both have the same name and number.
By default, $ banks in the YBOS array supersede any on the input file.

SELECT

This command is used to allow the user to specify what events will be passed to processing after being read in. This feature is intended to facilitate secondary analysis when the primary analysis was done using filters. The syntax of the command is:

    INPUT SELECT[/Qualifiers] EVENTS[/Qualifiers]-
    /FILTER=(<filter name>[/Qualifiers],...)

Note that the FILTER qualifier is a required qualifier for setting up this feature. If this feature is used, the SHOW INPUT command will give a description of all the defined requirements, along with a report of how many events have been read in versus how many events have been passed to processing. The following qualifiers may be used with the select option:

/FILTER=(<filter name>[/Qualifiers],<filter name>[/Qualifiers],...)

Defines the filter names and associated descriptions that make up the current requirement. For simple requirements all that is needed is a filter name and parameter set (if not equal to one). Otherwise, one or more of the following optional qualifiers must be used:

    /BITS="n1 n2 n3:n4 n5 ..." will set up a mask to be compared to the mask in the TAGB bank. Note that n3:n4 specifies an inclusive range of bits to be checked. If this qualifier is not used, then only the global accept bit is checked.

    /EXCLUSIVE specifies that an exclusive check of the mask set up by the BITS qualifier should be made, ie. all bits specified must also be set in the event's appropriate TAGB mask. The default is a non-exclusive check, which means that the filter will pass if any of the bits specified are set in the event's appropriate TAGB mask.

The FILTER qualifier must be the last qualifier used on the EVENT verb.

/ADD

Allows the current specification to be added onto the requirement defined in the previous INPUT SELECT EVENT command. This is useful when the desired requirement specification is so long that it won't fit into the allowed UIPACK command line length (which is 512 characters). This qualifier may be attached to either the SELECT or EVENT verb.

/REQUIREMENT=

Allows the current specification to be assigned to the given requirement number. If this qualifier is not used, the current specification will be assigned a number one greater than the largest assigned so far. If the given requirement number has already been used, the current specification will be added onto the already existing one. This qualifier may be attached to either the SELECT or EVENT verb.

RESET

This command is the complement of the SELECT command; it allows Analysis_Control to be reset to its default setup. The available options are:

DROP

Instructs Analysis_Control not to remove any banks from the input events.

RENAME

Instructs Analysis_Control not to rename any banks in the input events.

OVERWRITE$

Sets Analysis_Control back to its default, where $ banks in the current YBOS array do not get overwritten by the corresponding $ banks on the input file.

EVENTS

Drops or resets one or more input requirements. The syntax is:

    INPUT RESET[/Qualifier] EVENTS[/Qualifier]

The only valid qualifier is /REQUIREMENT=. This qualifier may be attached to either the RESET or EVENTS verb. If no qualifier is used, all requirements are dropped.
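As an illustration (the file names are arbitrary), the following sequence selects the optimized YBOS input routines and then queues two input files to be processed as if they had been concatenated; note that the YBOS_MODE command must come before the INPUT FILE command:

    INPUT YBOS_MODE FAST
    INPUT FILE "FILE1.YBS, FILE2.YBS"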
A.13 OUTPUT

Syntax

    OUTPUT[/Qualifiers]

Action

Control the output of data.

Optional Qualifiers

/STREAM=

Number of the output stream.

Default: Stream 1

Required Parameters

Action: Action on the Output Stream. Possible actions are:

BUFFERS

In the VAX environment, this command specifies the number of I/O buffers used in asynchronous YBOS I/O; the default is two. Asynchronous YBOS I/O is only used if the command OUTPUT YBOS_MODE FAST has been issued before the output file was opened. The syntax of this command is:

    OUTPUT BUFFERS <n>

The allowed range of buffers is two to eight.

CLOSE[/Qualifiers]

Close the current output file. Possible qualifiers are:

    /[NO]REPORT
    Determines whether a message is issued reporting that the output file has been closed successfully. The default is /NOREPORT.

FILE

Specify that an output file should be opened with a given filename. The full syntax for this command is described in the next section.

HISTORY

Directs Analysis_Control to write a text file containing a log of all the files written during the job for the given output stream. The filename will be the same as the first output file name written, with a .HIS extension. For an example of this file see DATA_DISK:[ONLINE_DATA.HISTORY.PRM]R*.HIS, which is written by the online DATA_LOGGER. Possible qualifiers are:

    /DIRECTORY=
    Allows the user to specify the directory that the history text file will be written to.

FORMAT

Allows the user to specify the data format (eg. byte order) of the output file in your job. For speed it is suggested that you write out the file in the format of the machine on which you will read it most. The A_C command is:

    OUTPUT FORMAT (UNIX/VAX)

VAX is the default. When reading files, YBOS will determine which format the file was written in and take whatever actions are necessary. If the file was written in a format native to the machine it will do nothing; otherwise it will take the time to convert the data.

MODULE

Specify an alternate output module. The syntax of this command is:

    OUTPUT MODULE <module name>

Note: Any already opened output files will be closed on receipt of this Command.

YBOS_MODE

Specify what YBOS output mode to use. The output mode must be selected before the output file is opened, therefore it must be specified before the OUTPUT FILE command. The syntax of the command is:

    OUTPUT YBOS_MODE FAST
    -or-
    OUTPUT YBOS_MODE NORMAL

Normal is the default. If fast mode is specified, Analysis_Control will use the optimized version of the YBOS output routines. The optimized version will typically result in a 30% reduction in processing time. These routines cannot, however, be used for network file access; for this application NORMAL YBOS mode should be used. Note: The SHOW OUTPUT command has been modified to show the output mode selected.

OFF

Disable the writing of the output stream. (Note: This command doesn't close the file.)

ON

Enable writing of the output stream. For stream 1, the output will be written by default to the file CDFOUT. For stream n, where n is greater than 1, the output is written to CDFOUn by default. These defaults may be overridden using the OUTPUT FILE Command.

RECORD

If necessary, this command will open the currently set up file name and output the current record to it.

RESET

This command is the complement of the SELECT command; it allows Analysis_Control to be reset to its default setup. These defaults are recovered using the command OUTPUT RESET