INTRODUCTION

In any High Energy Physics experiment, detector simulation is of great importance, not only for the physics analysis, but also as a hardware and software development tool.

Traditionally, HEP simulation programs are either pure physics simulations, complete detailed detector simulations, or parametrisations of full-simulation results. In the first case, one generates the initial physics process according to a theory, e.g. decays of the $Z^0$ according to the Standard Model. Since no information on the detector response is included, such programs cannot be used to design a real-life analysis procedure, other than very crudely. In the second case (which usually also includes the first, as a sub-program), as much detail as possible of the detection process is included, e.g. ionisation in gases, electronics response, interactions in the material of the detector, and data formatting according to the actual detector read-out system. As a consequence, such programs are very demanding in terms of computer resources, both CPU cycles and data storage. In fact, the LEP experiments at CERN typically produced simulated $Z^0$ decays at approximately the same rate as they recorded real ones. While such complete simulations are absolutely essential for the physics analysis, their sheer weight makes them hard to use for the individual physicist, and producing events in the amounts needed for statistical significance must be done on the grid, to obtain the needed resources in terms of CPU cycles, mass-storage capacity and, last but not least, man-power. The third case, parametrisation of full simulation, gives more realistic results than the first, but has caveats. Firstly, it clearly needs an existing full simulation of the detector in question, and considerable data-sets produced with this simulation; the parametrisation cannot be better than the statistics of the full simulation allows. Secondly, it inevitably lacks detail of the detector performance, in particular how a given aspect correlates with another (e.g. c-tagging vs. b-tagging).

Hence, there is a need for software at an intermediate level of detail. Such a system should make it possible to produce significant amounts of simulated events at the home institutes, or even on a PC at home. The events thus produced will of course not have the detail of the full detector simulation, but will be far more realistic than the output of a bare physics simulation or of a parametrised approach. On the other hand, being fast, it can simulate much larger data-sets than a full simulation could, and in cases where the full-simulation results are statistics-limited it can actually give a more accurate estimate than the full simulation. In addition, the existence of such a program makes it much easier to test 'wild ideas', because running such a test for a short time on a single PC does not interfere with the work of colleagues. This is in stark contrast with the case of using a large-scale detector simulation, where days of CPU time on a high-end system might be needed, Terabytes of data might be written, and the assistance of operators would be required. Furthermore, with such a fast detector-simulation program, theoretical physicists with no close contact with colleagues in an experiment can test the experimental implications of a theory without having to learn unreasonably much about detector performance. It should be a true fast detector simulation, i.e. it should not depend on the existence of a full simulation of the same detector, and can thus be used for hypothetical, future detectors.

Simulation à Grande Vitesse (SGV) is such a program. It is a detector-simulation program designed to yield a fast answer to the question of the experimental visibility of a theoretically predicted process. This is accomplished by implementing a simple description of the detector, giving low overhead for the initial setting-up of the program, and by fast simulation code for the detector response, making it feasible to perform high-statistics simulation even with limited resources. By applying SGV to the DELPHI detector at LEP, or to the ILD concept at the ILC, it has been shown that the time to simulate the detector is of the same order of magnitude as the time to generate the bare event. It has also been shown that, despite the simplicity of the procedure, in many respects the results are quite comparable to those of the full-blown detector-simulation program.

SGV simulates colliding-beam detectors in a solenoidal magnetic field, such as the detectors at LEP, those at the LHC at CERN, or those at a future linear electron-positron collider, among others.

The detector is described as cylinders with a common axis, parallel to the magnetic field, and as planes perpendicular to this axis. The cylinders are described by their radius and their minimum and maximum extent along the common axis; the planes by their position along the axis and their minimum and maximum radius. In addition, the material, the thickness in radiation lengths, and the type and precision of measurements are given. Each cylinder or plane can be divided into repeating sectors of measuring and non-measuring parts, so that e.g. blind boundaries between detector sectors, or overlapping detectors, can be simulated.
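The sketch below collects these ingredients in one place, to make the description concrete. It is purely illustrative: SGV is written in Fortran and has its own detector-description format, so all names and fields here are assumptions.

\begin{verbatim}
from dataclasses import dataclass

@dataclass
class Cylinder:
    radius: float        # cylinder radius [cm]
    z_min: float         # minimum extent along the common axis [cm]
    z_max: float         # maximum extent along the common axis [cm]
    thickness_x0: float  # material thickness in radiation lengths
    measures: bool       # does this surface measure the track position?
    sigma_rphi: float    # point resolution in r-phi [cm], if measuring
    sigma_z: float       # point resolution in z [cm], if measuring

@dataclass
class Plane:
    z: float             # position along the common axis [cm]
    r_min: float         # minimum radius [cm]
    r_max: float         # maximum radius [cm]
    thickness_x0: float  # material thickness in radiation lengths
    measures: bool
    sigma_rphi: float
    sigma_z: float
\end{verbatim}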

Cylindrical and plane calorimeters can be specified in a similar fashion. The geometry is given in the same way, while the energy resolution and the precision of the shower-axis measurement are given by parameters.
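The energy resolution is conventionally expressed in the form
\[
\frac{\sigma_E}{E} = \frac{a}{\sqrt{E}} \oplus b \oplus \frac{c}{E},
\]
where $a$ is the stochastic term, $b$ the constant term, $c$ the noise term, and $\oplus$ denotes addition in quadrature. The exact parameter set used by SGV may differ, but this is the standard parametrisation that such a description builds on.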

For each charged particle that is either stable or decays weakly, generated by the bare physics simulation (which can be selected at will by the user), SGV calculates which of the tracking-detector surfaces the track helix intersects. From the list of these surfaces, the program analytically calculates the precision with which the parameters of the track can be measured. This calculation includes the multiple scattering in the traversed surfaces and the measurement precision at each surface that measures the track position.
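The dominant material effect in this calculation is multiple Coulomb scattering. A standard estimate of the RMS projected scattering angle per traversed surface is the Highland formula, as given by the Particle Data Group,
\[
\theta_0 = \frac{13.6\,{\rm MeV}}{\beta c p}\, z\, \sqrt{x/X_0}\,\left[1 + 0.038\,\ln(x/X_0)\right],
\]
which the sketch below implements. This is an illustration of the kind of input entering the analytic error calculation, not SGV's actual (Fortran) code.

\begin{verbatim}
import math

def highland_theta0(p, beta, x_over_x0, charge=1.0):
    """RMS projected multiple-scattering angle [rad] for a particle of
    momentum p [GeV/c] and velocity beta crossing x_over_x0 (> 0)
    radiation lengths of material (classic PDG parametrisation)."""
    return (0.0136 / (beta * p)) * abs(charge) * math.sqrt(x_over_x0) * (
        1.0 + 0.038 * math.log(x_over_x0))
\end{verbatim}

Combining these scattering terms with the point resolutions of the measuring surfaces, e.g. in a least-squares fit along the helix, yields the full covariance matrix of the track parameters.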

Each particle (neutral or charged) is also followed to its intersection with the calorimeters. The program determines which calorimeter the particle hits first (ignoring electromagnetic calorimeters if the particle is a hadron), and uses the corresponding parameters to simulate the measured energy and the direction of the shower in the detector. Optionally, the program can also merge showers that are so close together that they would be hard to distinguish in a real detector.
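The sketch below illustrates the two steps: Gaussian smearing of the true energy with the width given by a parametrisation of the above form, and a simple opening-angle criterion for merging. Both the names and the criterion are hypothetical, not SGV's actual ones.

\begin{verbatim}
import math
import random

def smear_energy(e_true, a, b):
    """Smear the true energy (GeV, > 0) with sigma/E = a/sqrt(E) (+) b,
    the two terms added in quadrature."""
    sigma = e_true * math.hypot(a / math.sqrt(e_true), b)
    return max(0.0, random.gauss(e_true, sigma))

def merge_showers(opening_angle, two_shower_angle):
    """Illustrative criterion: merge two showers if their axes are closer
    than the detector's effective two-shower separation angle."""
    return opening_angle < two_shower_angle
\end{verbatim}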

Optionally, the most important electromagnetic interactions in the detector material can be generated, i.e. bremsstrahlung from electrons and the production of ${\rm e}^{+}{\rm e}^{-}$ pairs from photons.
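For example, the probability for a photon to pair-convert in a layer of thickness $x$ radiation lengths follows from the pair-production mean free path of $9X_0/7$. A minimal sampling sketch (the energy sharing within the pair and the bremsstrahlung spectrum, which are also needed, are omitted here):

\begin{verbatim}
import math
import random

def photon_converts(x_over_x0):
    """Sample whether a photon converts to an e+e- pair in a layer of
    the given thickness (in radiation lengths); the pair-production
    mean free path is 9/7 of a radiation length."""
    return random.random() < 1.0 - math.exp(-7.0 * x_over_x0 / 9.0)
\end{verbatim}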

The user can supply routines to simulate inefficiencies and particle identification.
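For instance, such a routine could drop tracks with a momentum-dependent probability. The sketch below shows the idea only; the name, signature and efficiency numbers are all hypothetical (SGV's actual hooks are Fortran routines):

\begin{verbatim}
import random

def track_is_found(p):
    """Hypothetical inefficiency hook: lose low-momentum tracks
    more often."""
    efficiency = 0.99 if p > 1.0 else 0.90  # assumed values
    return random.random() < efficiency
\end{verbatim}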

In the design of SGV, we kept in mind that the analysis code developed by the user should be easy to transport to another environment, e.g. the analysis program of the experiment. This was accomplished by sealing off the analysis part of SGV as a separate module, whose internal workings the rest of SGV need not know, nor vice versa. Hence, one can develop an analysis 'at home', write a small interface routine that reads the experimental data and formats it according to the input specifications of this analysis module, and then use exactly the same analysis code on real data, or on data simulated by the full detector simulation.
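Schematically, the separation looks as follows (a sketch in Python with illustrative names; SGV itself is Fortran): the analysis depends only on the agreed event format, and only thin reader routines know where the events come from.

\begin{verbatim}
def analyse(event):
    """User analysis: sees only the agreed event format,
    never SGV internals."""
    charged = [p for p in event["particles"] if p["charge"] != 0]
    print(len(charged), "charged particles")

def event_from_sgv(sgv_record):
    """Provided by SGV: formats a simulated event into the
    agreed structure."""
    ...

def event_from_experiment(raw_record):
    """Small user-written reader: formats real or fully-simulated
    data into the same structure, so analyse() runs unchanged
    on both."""
    ...
\end{verbatim}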

We also considered the fact that, unlike most other user communities, particle physicists are quite fluent in programming. Hence, it would be a large waste of human resources to write a program aimed at such a community as a non-transparent 'black box'. Therefore, SGV was written to be accessible for modification at all levels, by keeping the structure clean and by extensive documentation all the way down to the individual routines.


