Monte Carlo Computational Software and Methods in Radiation Dosimetry

Rapid development and ongoing demands in radiation dosimetry have drawn the attention of many software developers and physicists, who have created powerful tools that make experiments more exact, less expensive, more focused, and wider in scope. Many software toolkits, packages, and programs have been produced in recent years, most of them available as open source, open access, or closed source. This study presents the Monte Carlo software developed over the years and its implementation in radiation treatment, radiation dosimetry, nuclear detector design for diagnostic imaging, radiation shielding design, and radiation protection. Ten software toolkits are introduced, and a table of their main characteristics is presented so that someone entering the field of computational physics with Monte Carlo can decide which software suits their experimental needs. These tools allow us to design anything from an X-ray tube to an entire costly LINAC system with readily changeable features, and from basic x-ray and pair detectors to whole PET, SPECT, and CT systems that can be evaluated, validated, and configured to test new ideas. Dose calculation in patients ranges from quick dosimetry estimates with various sources, isotopes, and materials to actual radiation therapies such as brachytherapy and proton therapy. We can also model treatment planning systems with a variety of characteristics and develop highly exact approaches that real patients will find useful. Shielding is an important topic, not only to protect people from radiation in places like nuclear power plants, nuclear medical imaging suites, and CT and X-ray examination rooms, but also to prepare and safeguard humanity for interstellar travel and space station missions.
This article reviews the computational software available for these applications to date, with an emphasis on radiation dosimetry and its relevance today.


Introduction
The Monte Carlo technique uses random numbers to simulate a given situation. In most Monte Carlo applications the physical process is reproduced directly: all that is required is that the system and its physical processes can be described by known probability density functions. Once these probability density functions are precisely specified, they can be simulated by random sampling.
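As a minimal sketch of such random sampling, the exponential free-path density p(s) = μ·exp(−μs) of a photon in a uniform medium can be sampled with the inverse-transform method; the attenuation coefficient below is an illustrative value, not taken from any data table.

```python
import math
import random

def sample_free_path(mu, rng):
    """Inverse-transform sampling of the exponential free-path PDF
    p(s) = mu * exp(-mu * s):  s = -ln(xi) / mu,  xi ~ U(0, 1].
    (1 - rng.random() avoids ln(0), since random() lies in [0, 1).)"""
    return -math.log(1.0 - rng.random()) / mu

rng = random.Random(42)
mu = 0.2   # illustrative linear attenuation coefficient, 1/cm
paths = [sample_free_path(mu, rng) for _ in range(100_000)]
mean_path = sum(paths) / len(paths)   # approaches the mean free path 1/mu = 5 cm
```

Averaging many sampled paths recovers the mean free path 1/μ, which is the simplest check that the sampled distribution matches the intended PDF.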
Simulation studies, in general, provide a number of benefits over experimental research. It is quite simple to alter various parameters in a given model and analyse the impact of those alterations on the system's efficiency. To gain in rate of convergence, one must lose in reliability: the enhanced rate of convergence is paid for by tolerating some degree of uncertainty in the findings. The reason is that stochastic algorithms achieve their convergence rate only with a given probability; the error of such algorithms is a probabilistic convergence error.
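This trade-off can be made concrete with a toy estimate: a Monte Carlo result carries a statistical standard error that shrinks only as 1/√N, so ten times better precision costs a hundred times more histories. The slab thickness and attenuation coefficient below are illustrative assumptions.

```python
import math
import random

def transmission_estimate(mu, thickness, n, rng):
    """MC estimate of the fraction of photons that cross a uniform slab
    without interacting (analytically exp(-mu * thickness))."""
    hits = sum(1 for _ in range(n)
               if -math.log(1.0 - rng.random()) / mu > thickness)
    p = hits / n
    stderr = math.sqrt(p * (1.0 - p) / n)   # binomial standard error
    return p, stderr

rng = random.Random(7)
mu, t = 0.5, 4.0   # illustrative values; exact answer is exp(-2) ~ 0.135
p_small, err_small = transmission_estimate(mu, t, 1_000, rng)
p_large, err_large = transmission_estimate(mu, t, 100_000, rng)
# 100x more histories -> roughly 10x smaller standard error
```

With 100 times as many histories the standard error drops by roughly a factor of ten, illustrating why variance reduction and fast hardware matter so much in practice.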
Mimicking nature's behaviour, the interaction laws are applied randomly and repeatedly until the numerical outcomes converge usefully to approximate means, moments, and their variances. Monte Carlo represents an attempt to model the physical environment by direct emulation of a system's fundamental characteristics. As a result, the computational strategy is straightforward: it solves a macroscale problem by simulating the microscopic interactions, and this is where the method's benefit lies. Almost all interactions are microscopic in nature. The geometry of the world we live in, so important in the construction of macroscopic solutions, is only used to describe the local environment of objects interacting at a given location at a given time.
To understand nature, the scientific method relies on observation (measurements) and hypothesis (theories). The connection between these two is facilitated by a multitude of statistical, numerical, and simulation techniques, all of which are exploited by the Monte Carlo process.

Medical Radiation Dosimetry and Radiation Treatment using Monte Carlo
In the case of cancer radiation therapy, Monte Carlo is better than other analytical approaches because it is best adapted to tackle the complicated problem of particle transport and energy deposition inside a heterogeneous medium such as a mammalian body, with its soft tissue, skeleton, and internal organs.
Radiation transport algorithms have had a huge effect on various aspects of radiation dosimetry over the last few decades. Several quantities are often difficult (or impossible) to determine correctly in experimental dose estimation without computational models. Radiation dosimetry detectors are typically made up of many parts, each made of a substance that differs significantly from the medium in which the dose is to be determined. This condition creates a well-known problem that can be described using perturbation factors and changes in energy response. Growth in Monte Carlo techniques has improved the precision of absorbed-dose determination over the decades of radiation dosimetry protocols.
In modern radiotherapy, Monte Carlo simulations are a very useful tool. These codes have immense capability for predicting and interpreting the behaviour of all kinds of particles in radiotherapy, and they have also proved useful for keV x-ray devices, brachytherapy sources, and patient dose measurements. Furthermore, computational modelling is increasingly utilized in diagnostic imaging and treatment verification, though it is still far from mass clinical implementation. Simulation time and data storage, once prohibitive for complex Monte Carlo modelling, are now less of a concern given the advent of ultrafast GPU processors. Monte Carlo treatment planning is expected to become the workhorse of patient planning in the next few years; speed and accuracy will be critical, and consistent commissioning and model-generation guidance will be needed. Computational techniques are widely acknowledged as a very important means of modelling and studying radiation transport in radiotherapy applications. The development of a simulated model of the radiation source is common and significant, especially in external beam radiotherapy, where computational processes can play a crucial role. Source configuration and optimization, radiation detector response analysis, and treatment planning are some of the applications. Since the vast majority of radiotherapy is done using MeV photon beams from a linear accelerator (LINAC), research activity reflects this.
Electron beams, lower energy photon beams originating from an x-ray tube, and hadron beams are used in external beam radiotherapy to a lesser degree. Monte Carlo (MC) simulation, as in other branches of radiation therapy, has been an important dosimetry technique in contemporary brachytherapy, with central positions in clinical practice and research.
Calculating the dose rate around a discrete radiation source is a well-known use of MC methods in brachytherapy. Modern sources commonly contain either low energy radionuclides with mean energies of less than 0.05 MeV or higher energy radionuclides with mean energies ranging from 0.3 to 1 MeV. Miniature x-ray sources with a Bremsstrahlung spectrum up to about 50 keV are also available. The source geometries, as well as the clinical implementations, are very diverse. Radioactive content and radiopaque markers are encapsulated in permanently implantable seeds in 'LDR' brachytherapy, while an iridium source is welded to the tip of a drive wire in 'HDR' brachytherapy with a mobile afterloader. Even though they emit low energy photons, miniature x-ray sources with tungsten anodes are classified as HDR.
Anisotropic dose distributions result from photon absorption and scattering in the surrounding medium, together with radiation interactions within the source structure itself. To achieve clinically appropriate dosimetric precision, these effects on the dose distribution must be adequately modelled, and they are difficult to capture with analytical methods such as the Sievert integral [1].
Because of the difficulties in calculating single source dosage distributions experimentally due to the sharp dose gradients, low photon energies, and high dose rates associated with brachytherapy, numerical dosimetry techniques such as MC simulations have been an important method in the field [2].
In addition, correctly accounting for the radiobiological effects of low energy brachytherapy necessitates the calculation of photon and electron spectra at different distances from sources, which is best accomplished using MC modelling. Future MC approaches for brachytherapy applications can be classified into two groups, similar to the preceding two sections: (a) single source dosimetry for clinical development, and (b) patient-oriented applications, such as patient care preparation and treatment planning.
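For single-source dosimetry, MC results are commonly condensed into the AAPM TG-43 formalism; below is a sketch of the 1D (point-source) dose-rate equation, with a toy radial dose function and illustrative parameters rather than consensus data for any real source.

```python
def tg43_dose_rate_1d(sk, dose_rate_const, r, g_of_r, phi_an=1.0, r0=1.0):
    """1D TG-43 point-source approximation:
    dose rate = S_K * Lambda * (r0 / r)^2 * g(r) * phi_an(r)."""
    return sk * dose_rate_const * (r0 / r) ** 2 * g_of_r(r) * phi_an

# Illustrative (NOT consensus) radial dose function and parameters
g = lambda r: max(0.0, 1.0 - 0.05 * (r - 1.0))   # toy linear fall-off
sk = 10.0    # air-kerma strength in U (illustrative)
lam = 1.0    # dose-rate constant in cGy/(h*U) (illustrative)

d1 = tg43_dose_rate_1d(sk, lam, 1.0, g)   # at the reference distance r0 = 1 cm
d2 = tg43_dose_rate_1d(sk, lam, 2.0, g)   # dominated by inverse-square fall-off
```

In practice the dose-rate constant, radial dose function, and anisotropy function for each source model are themselves tabulated from MC calculations, which is exactly the role the text assigns to single-source MC dosimetry.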
Proton and ion beam treatments have recently gained popularity, and Monte Carlo simulations of protons are used at various stages of dosimetry work. In terms of electromagnetic interactions, proton and ion transport simulations are similar to electron transport simulations, except that both scattering angles and energy straggling are greatly reduced. Multiple scattering and straggling theories developed for electron transport work well for protons and ions, and major Monte Carlo codes for proton and ion transport, such as Geant4 and MCNPX, use similar models.
A large percentage of proton treatment clinics now employ the passive scattering approach. In the future, magnetic beam scanning is projected to be employed to treat a growing number of proton therapy patients. Both strategies can produce a spread-out Bragg peak (SOBP) delivering a uniform dose to the target for each field, with the latter also allowing intensity-modulated proton therapy.
An SOBP guarantees coverage across the thickness of the target volume in the patient, providing a uniform dose to the target for a given treatment field (i.e., beam angle). It is made by superimposing pristine Bragg curves of various beam energies reaching the patient.
This can be accomplished by spinning an absorber with steps of different water-equivalent thicknesses, as in passive scattering, or by discretely adjusting the beam energy and intensity for each energy layer, as in beam scanning. In some scanning systems a range shifter in the treatment head is used to change the beam energy. To obtain a laterally flat dose profile, the beam must either be scanned magnetically in the plane perpendicular to the beam axis, or be broadened using scatterers and absorbers made of different material combinations as in passive scattering. Apertures that confine the field's lateral dimension, and range compensators moulded to match the dose distribution to the distal shape of the target volume, are used in passive scattering to further shape the field.
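The weighting of energy layers described above can be sketched numerically. The depth-dose model below is a toy Gaussian peak without the entrance plateau of a real Bragg curve, and the layer weights are obtained by a simple least-squares fit; all numbers are illustrative.

```python
import numpy as np

def pristine_peak(z, peak_range, sigma=0.4):
    """Toy pristine Bragg curve: a Gaussian peak at the beam's range.
    Real curves also have an entrance plateau, omitted here."""
    return np.exp(-0.5 * ((z - peak_range) / sigma) ** 2)

z = np.linspace(0.0, 20.0, 401)            # depth in cm
ranges = np.arange(10.0, 15.21, 0.4)       # energy layers -> peak positions
peaks = np.stack([pristine_peak(z, r) for r in ranges], axis=1)

# Least-squares weights so the summed dose is flat (= 1) across the target
target = (z >= 10.0) & (z <= 15.0)
w, *_ = np.linalg.lstsq(peaks[target], np.ones(target.sum()), rcond=None)
sobp = peaks @ w

# Flatness over the interior of the spread-out Bragg peak
interior = (z >= 10.5) & (z <= 14.5)
max_dev = float(np.max(np.abs(sobp[interior] - 1.0)))
```

The fitted weights play the role of the modulator-wheel step thicknesses or the per-layer beam intensities: each energy layer contributes one shifted peak, and their weighted sum forms a flat SOBP across the target depth.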
Patient-specific hardware is not really required for beam scanning, where, depending on the width of the generated pencil beams, the scanning pattern can be used to enlarge the beam's lateral coverage. FLUKA [3][4], Geant4 [5][6], and MCNPX [7][8] are popular Monte Carlo codes used in proton therapy.
These codes vary not only in precision (the differences are probably minor), but also in user experience, and the degree of control over tracking parameters differs between the codes as well.
Aside from the previously mentioned software, other options exist, such as packages that act as interfaces to the underlying MC libraries. GATE [9][10], GAMOS [11], and TOPAS [12] are examples of these for Geant4. Furthermore, the user can use computed tomography (CT) data to model a patient's geometry.
In the near future, medical radiation dosimetry and radiation therapy strategy will play an important part in our lives. Cancer kills a very significant number of people, and many techniques are employed to combat it. As a teaching tool, Monte Carlo offers users the opportunity to test and assess the effect on patients of highly costly instruments, including proton therapy accelerators and treatment planning facilities, and to run different radiation experiments such as testing various isotopes. More and more individuals are gaining access to such capabilities as computers, GPUs, and cloud technologies become user-friendly and inexpensive. Many of the software packages discussed here are also accessible in cloud containers on Amazon, Azure, or other providers, since multi-threading and clustering are common ways to address computing costs. It remains challenging to simulate particles travelling through layered structures of various materials such as bone, veins, soft tissue, and fat. Computer programs rest on our grasp of natural processes and their statistics, so the relevant probability functions must be precisely established. The problem is multi-dimensional, since it combines studies from many fields. The tools we have now also require training, since they are quite complicated to run, build, and experiment with; some of them are open source, which is a wonderful thing, but on the other hand they are not easy for a new user to use or explore.

Detector design for nuclear medicine and diagnostic imaging
The creation of novel detectors for scanning purposes is a lengthy and expensive process in which the 3D shape of the detector, the physics involved in the detecting elements, and the expected performance must be thoroughly inspected. Other considerations, such as cost and production limits, must also be taken into account. A prototype is normally created at some point to test the validity of the detector idea and to do some preliminary performance evaluation. Monte Carlo simulations can help with novel detector development by improving understanding of the detector response, testing design alternatives without having to build the relevant components, and identifying the most promising arrangement without having to conduct costly tests.
In radiography and CT scanning, and in the nuclear medical imaging modalities SPECT and PET, Monte Carlo simulations are now a standard technique. The principles and application of such computations in SPECT and PET, as well as the various codes available to execute them, have been extensively studied in recent papers [13][14].
Since the year 2000, 15 codes have been used: eight created by the authors themselves and seven published as open source or available as open access or for a licence fee (EGS4, MCNP, SimSET, SIMIND, GEANT, Penelope, and GATE). The overall number of codes used in Monte Carlo simulations of detectors for nuclear medical imaging has thus remained constant, which means that a common software for SPECT and PET that most people can agree on and use does not exist. SimSET and SIMIND have been the most commonly used codes since 1995.
Codes like GEANT and Penelope, which were not developed specifically for SPECT and PET simulations but for a variety of other purposes, have begun to address the needs of such simulations. This observation, along with the fact that users were willing to trust a code created by others (as evidenced by the widespread usage of SimSET and SIMIND), were the driving forces behind the creation of GATE, which utilizes GEANT4.
Figure 2. Visualization of a phantom and a cylindrical PET system in GATE during a simulation in our lab.
GATE was implemented to address the constraints of the pre-existing codes of that period. A GATE architecture designed for PET and SPECT simulations became available thanks to the inclusion of a very flexible, object-oriented software toolkit, which assures stability and support. The creators wanted the software to be simple enough that users could easily build any kind of nuclear medical imaging scanner, even prototypes. Apart from GATE, none of the current codes satisfy all of these requirements.
The need for computational simulations in SPECT and PET imaging is growing, as they provide useful opportunities to better understand the strengths and limitations of these modalities, as well as to identify crucial aspects of the imaging and detection mechanisms that have been limiting their potential.
For nuclear medicine detectors and diagnostic imaging, a number of instruments may be simulated and assessed today, such as SPECT, PET, CT, and CBCT systems. The degree of abstraction and implementation in present code structures still has considerable room to improve, and hybrid models, novel detection techniques, and materials with new electronics are not yet clearly specified. There are also huge improvements to be made, not only in our approaches but also algorithmically. In the near future, radiation detection will play an important role in our ability to diagnose and explore within our bodies, to discover healthy solutions, and to bring us more understanding.

Radiation shielding design and dose estimation for radiation protection purposes
Radiation shielding is an important aspect of disciplines such as radiation physics, nuclear engineering, reactor physics, and cosmic radiation. Shielding is an essential feature of radiation protection because, used as a form of radiation control, it provides a more reliable method of limiting personnel dose rates than the distance and time factors. Monte Carlo computational software plays a significant role because it requires only computing power to simulate realistic experiments, test various sources and materials, and cover more ground more efficiently and less expensively, whereas physical experiments require tremendous cost and effort to build platforms, test new materials, and measure exposure and protection factors.
Nuclear shielding is used to protect human health and the physical ecosystem from radiation exposure. Dose estimation and exposure assessment from various sources are required, and special dosimetry devices have been created for such calculations. Dosimetric research aids in the construction of shields for radiation sources, imaging procedures, low-energy accelerators, and nuclear reactors.
Radiation shielding is the placement of an absorbing material between a person and a radiation source in order to reduce the harmful impacts of nuclear radiation [15]. Shielding materials are specified by various safety regulating organizations throughout the world.
Ionizing radiation consists of high-energy particles that transfer their energy to surrounding cells, causing damage to living tissues [16]. Shielding materials have been developed to protect different life forms from dangerous radioactivity [17]. Radiation shielding compounds with heavy nuclei and high density, such as metals and cement structures, have been used to guard against radiation damage [18]. For handling hazardous wastes, nuclear plants, personal devices, and electronic parts, and for preventing personnel from being exposed to radiation, safety authorities provide radiation control solutions and shielding materials [19][20].
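The basic design quantity behind such shields is narrow-beam exponential attenuation; the sketch below uses an illustrative linear attenuation coefficient, not a tabulated value for any real material.

```python
import math

def transmitted_fraction(mu, x):
    """Narrow-beam exponential attenuation: I/I0 = exp(-mu * x)."""
    return math.exp(-mu * x)

def half_value_layer(mu):
    """Thickness that halves the narrow-beam intensity: HVL = ln(2)/mu."""
    return math.log(2.0) / mu

mu = 2.0                       # illustrative linear attenuation coefficient, 1/cm
hvl = half_value_layer(mu)     # ~0.347 cm for this mu
# Ten half-value layers reduce the intensity by a factor of 2^10 = 1024
frac = transmitted_fraction(mu, 10.0 * hvl)
```

Stacking n half-value layers gives a transmission of 2^(-n), which is the quick rule of thumb used when sizing a shield before a full Monte Carlo calculation that includes scatter build-up.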
Shielded containers (shipping casks) are used to move radioactive materials from nuclear plant sites to manufacturing facilities during radiochemical processing [21]. Personal protective equipment (PPE) is also provided by these regulatory commissions to deter nuclear exposure [22].
The International Space Station (ISS) currently hosts crewed space exploration, which will eventually extend to the lunar surface and beyond. Crew members are exposed to energetic charged particles in space radiation, which can have adverse radiation effects.
Since the ISS is in low Earth orbit (LEO), the geomagnetic field protects crews from galactic cosmic rays (GCRs) and solar energetic particles (SEPs). Beyond low Earth orbit this geomagnetic shielding is virtually non-existent, and the radiation environment is aggressive. Owing to the long flight durations to reach stations and facilities orbiting our planet [23], and potential flights near other planets and into deep space, astronauts are going to receive substantially larger doses of radiation.
Human health risk during space operations has been identified by NASA [24]. Assessments of the contribution of HZE particles to dosimetry in the space environment have been undertaken in multiple operations, from the Apollo period [25], through spacecraft [26][27] and space station [28] missions, to the ISS [29][30][31]; HZE particles add significantly to the dose because of their high LET, leading to important biochemical consequences. Crews on lunar orbital stations would be subjected to substantially greater doses of ionizing radiation than crews on LEO flights. To reduce the amount of radiation in missions outside the Earth's atmosphere, many efforts have been made over the years with different approaches, such as passive and active shielding [32].
Shielding and radiation protection are going to play a significant role in our journey to explore the universe in the years to come, as well as in our approach to nuclear energy, imaging, and treatment facilities. Numerous very expensive expeditions have provided us with a picture of what to expect outside Earth's atmosphere in terms of radiation, and computational software has made it possible to test new materials in these conditions. However, new materials from materials science, with denser nuclei and higher atomic numbers, cannot be simulated until they have been observed experimentally: we first need to measure the probability functions for the numerous physical events of particles passing through them before computational methods can be used. Monte Carlo is thus closely tied to the statistics of each event observed for each material and compound.

EGSnrc
EGSnrc is a Monte Carlo software toolkit for electron and photon transport through matter. It simulates the propagation of photons, electrons, and positrons within homogeneous materials over an energy range of 1 keV up to 10 GeV.
Historically, EGS was a code originally created for high-energy physics shielding and detector simulations; the story of how it came to be used in medical physics has never been published in writing. Rogers' modesty most likely affected his presentation of it in his essay [33].
EGS3 [34] was released by SLAC in 1978, and Rogers used it in many significant publications [35][36][37][38]. Particularly significant was the publication that patched the EGS3 algorithms to replicate a technique used in ETRAN, making electron-transport calculations more accurate by shortening the steps between simulated interactions.
Electron transport step-size artefacts were entirely mysterious at the time. As Larsen [39] so eloquently explains, shortening the steps is known to solve the problem, but at the expense of increased computational time.
These step-size artefacts drew Nelson's interest, and he invited Rogers to collaborate with Hirayama, a research scientist at KEK, Tsukuba, Japan, on the following version of EGS, EGS4 [40].
In the spring of 1984, Nahum, who was involved in modelling ionization chamber response, came to Rogers' lab to work on this project. Nahum also had a scholarly background in electron Monte Carlo, having developed what would later prove to be a much superior electron transport algorithm based on an analysis of Lewis [41] moments [42].
The problem was solved by reducing the step size [43][44], but the quest for a solution to step-size inconsistencies began, leading to the creation of the PRESTA algorithm [45][46]. Following the publication of PRESTA, a number of minor but significant flaws were revealed [47][48]. Improvements were made over time [49][50][51][52], ultimately leading to the EGS implementation revisions known as EGSnrc [53] and EGS5 [54].
Several facets of electron transport have seen major advancements in recent years. Kawrakow and Bielajew [55][56] established advances in multiple scattering theory, which overcome the majority of the flaws in the Moliere theory [57] utilized in EGS4.
Essentially, Kawrakow and Bielajew developed a new electron transport and tracking algorithm, known as PRESTA-II, which represented a significant breakthrough in particle transport physics. Beyond these advancements, Kawrakow made further changes to EGS' electron transport method, allowing it to calculate ion chamber response to within 0.1 percent [58][59].
EGSnrc incorporates a number of new functions, many of which were previously developed as modifications to EGS4 [60][61]. The EGSnrc method differed from the previous versions in part because its authors followed a straightforward pattern when making fundamental improvements to the code. The KEK group, on the other hand, has introduced multiple alternatives that are not yet in EGSnrc. It is worth recalling that the NRC and SLAC have signed a written agreement acknowledging that both have rights to EGSx and EGSnrc.

FLUKA
FLUKA (FLUktuierende KAskade) is an MC software package, distributed under a restrictive license, for the transport and interactions of particles and nuclei in matter [62][63].
FLUKA is a multipurpose program that can be used to calculate particle transport and interactions with matter. It can accurately model the different reactions and distributions of around 60 distinct particles in matter. Thanks to an updated version of the Combinatorial Geometry (CG) tool, it can accommodate even the most complex geometries. Optical photons and polarized photons (e.g., synchrotron radiation) may also be transported.
Online monitoring and time evolution of emitted radiation from unstable residual nuclei are possible. The user does not need to program for the majority of applications; for users with special requirements, an API in Fortran 77 is available.
The FLUKA CG has been developed to track charged particles correctly (even in the presence of magnetic or electric fields). The software has been entirely restructured, and dynamic dimensioning has been introduced.
FLUKA has a plethora of applications in particle physics, high-energy physics, material science, testing different shielding substances, building nuclear cameras and optoelectronics, cosmic radiation research [64], dose calculation and scoring [65], nuclear and radiological imaging, and radiobiology, to mention just some. Hadron therapy is a relatively new field of its application [66][67].

SimSet
The SimSET (Simulation System for Emission Tomography) kit employs a variety of MC techniques to model the electronics, physical processes, and instruments used in emission imaging procedures. SimSET has been a key reference for numerous nuclear imaging detector research groups around the world since its initial publication in 1993.
SimSET is now being developed at the University of Washington Imaging Research Laboratory, with new features and utilities being added all the time; the course of development is influenced by both the developers' own needs and those of its users. SimSET is a free program. The Photon History Generator, or PHG, is the software's core module. In a standard simulation, the PHG generates decay events and tracks the emitted photons through a phantom. SimSET is used mostly for nuclear medical imaging applications.

PENELOPE
PENELOPE was built as a software system for simulating photons, electrons, and positrons travelling through materials of any kind, covering a spectrum of energies from 50 eV to 1 GeV. The physical processes and interaction models used throughout the tool's code are built on the most up-to-date knowledge available: analytical models, semi-empirical formulas, and evaluated databases are combined in the modelling of this software.
It is worth noting that, while PENELOPE has the ability to simulate particles from 50 eV, the cross sections it provides for energies below 1 keV are subject to large uncertainties; this is also true of other codes that claim to simulate transport down to such low energies, as this is below the work function of most elements.
The user steers the simulation through a main program that follows particle histories through the material structure and keeps track of their paths. PENELOPE includes PENGEOM, a flexible geometry tool that allows automatic particle tracking in complex geometries. The OECD/NEA Data Bank distributes the PENELOPE software.

MCNP
The MCNP (Monte Carlo N-Particle) software package was developed at Los Alamos National Laboratory [68] and is commonly utilized by various scientific groups for various materials. MCNP can simulate neutrons, photons, and electrons, among other particles, travelling through organic and inorganic substances. All interactions in a given cross-section evaluation are taken into account, and MCNP has an advanced geometry description.
The MC method for calculating the transport of radiation was first developed at LANL in 1946 by some of the most famous scientists of the time. Since the dawn of modern computing, MC methods for particle transport and tracking have been driving the development of computational advancements.
Coding was completed in December 1947, and the first calculations were performed on the ENIAC in April/May 1948. In 1948, Enrico Fermi created the FERMIAC, a mechanical instrument that used the Monte Carlo method to track neutron motions through highly enriched materials. These early techniques were consolidated into a sequence of special-purpose MC tools in the 1950s and 1960s, including MCS, MCN, MCP, and MCG. The initial general-purpose MC radiation transport software, MCNP, was created in 1977 by combining these earlier codes, and the first widely released version of the MCNP code, version 3, was published in 1983.

SIMIND
The MC computational software SIMIND models a standard clinical SPECT camera that can be conveniently adjusted for virtually any form of SPECT imaging experimentation. Professor Michael Ljungberg is the creator of SIMIND. The software is written in FORTRAN-90, with fully functional cross-platform versions.
The benefit of using MC software is that it allows a more thorough analysis of radiation transport, and it can be used to investigate the causes of image degradation in nuclear medicine scintillation camera systems.
The detector, shielding, and phantom can all be made of different materials. It is possible to simulate phantom and source shapes that are cylindrical, circular, rectangular, and more complex. The physics capabilities include photoelectric, incoherent, coherent, and pair-production phenomena, which can be scored and calculated in a simulation. Various detection variables can be measured, for example the energy of a pulse-height distribution, and pulse pile-up due to the finite decay time of the scintillation light emission. To approximate the system's energy resolution, the deposited energy is blurred with an energy-dependent stochastic function. It is also possible to simulate the position of events within the detector's field of view.
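The energy-resolution model described above amounts to Gaussian blurring of the deposited energy, with a FWHM that varies with energy; the 1/√E scaling and the 10% resolution figure below are illustrative assumptions for a scintillation camera, not SIMIND defaults.

```python
import math
import random

def blur_energy(e_dep, frac_fwhm_ref, e_ref, rng):
    """Gaussian energy blurring with FWHM scaling as 1/sqrt(E),
    a common model for scintillation cameras (illustrative)."""
    fwhm = frac_fwhm_ref * e_ref * math.sqrt(e_ref / e_dep)  # FWHM in keV
    sigma = fwhm / 2.355                                     # FWHM -> std dev
    return rng.gauss(e_dep, sigma)

rng = random.Random(1)
# Assumed: 10% FWHM at the 140 keV Tc-99m photopeak
samples = [blur_energy(140.0, 0.10, 140.0, rng) for _ in range(50_000)]
mean_e = sum(samples) / len(samples)
std_e = math.sqrt(sum((s - mean_e) ** 2 for s in samples) / len(samples))
```

Histogramming such blurred energies reproduces the broadened photopeak of a pulse-height distribution, from which an energy window (e.g., ±10%) can be applied as in a real acquisition.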

GEANT4
GEANT (GEometry ANd Tracking) is a software toolkit that encapsulates modern design and state-of-the-art development techniques, using Monte Carlo modelling methods to describe the movement of elementary particles through matter. The base of Geant4 is a rich set of physics models that handle particle-matter interactions across a large energy range. In short, the toolkit encapsulates knowledge and modelling methods contributed from many sources around the world.
The toolkit is developed in C++ and uses sophisticated software engineering techniques and object-oriented programming concepts. CERN physicists designed it to satisfy the requirements of the next era of investigations in particle physics. The project grew into a broad multinational collaboration of physicist-developers and software engineers from a variety of institutes and universities across Europe, Japan, Canada, and the United States, engaged in a variety of high-energy physics experiments.
Geant4 aims to be the first simulation program with the versatility and usability needed to fulfil the requirements of the next generation of particle physics experiments, as well as nuclear, accelerator, space, medical physics, and DNA-level biological studies. It is worth mentioning that the project is open source, so that everyone involved can contribute and understand in depth the techniques and mathematical models used.
The RD44 project was a groundbreaking attempt to rewrite a large CERN computational toolkit in a modern object-oriented (OO) architecture using C++. With the delivery of the first production release in December 1998, the R&D phase was completed. The partnership was renamed Geant4 and continued after a formal Memorandum of Understanding was signed.
In terms of code size and scope, as well as the number of participants, Geant4 may well be the largest and most complex effort of its kind outside the commercial sphere.

GATE
In emission tomography, MC simulation is used for developing modern clinical imaging equipment, evaluating new image reconstruction methods and scatter correction techniques, and optimizing scanning protocols. Although dedicated codes focused only on PET and SPECT have been developed, these have numerous disadvantages and shortcomings when it comes to verification, precision, and maintenance.
GATE is a user-friendly modelling software package that combines imaging, radiotherapy, and dosimetry in one environment [69]. GATE's features have been extended to radiation therapy since its 6.0 version, which introduced new software devoted to radiation therapy simulations.
GATE uses the GEANT4 toolkit classes to provide a scalable, flexible, easy-to-use scripting environment for computational experimentation in nuclear medicine. In particular, the software allows realistic experimental conditions to be taken into account at the simulation level by modelling the behaviour of the electronics and the mechanical parts of the detector.
GATE was created as part of the OpenGATE Collaboration to supply the academic community with a flexible, multi-purpose, open-source, GEANT4-based emission tomography computational tool. GATE was initially created for PET and SPECT experiments [70][71], but new tools have made it possible to use it for radiotherapy simulations, including linear accelerator simulations [72][73][74].
GATE integrates the well-validated physics models, complex geometry description, and accurate visualization and multi-dimensional modelling features of the GEANT4 toolkit with functionality specific to emission tomography. It consists of several hundred C++ classes. A base layer of abstract classes, close to the GEANT4 base classes, contains mechanisms for managing scoring, particle history, complex geometrical descriptions of objects, and radioactive sources. User classes derived from this abstract layer may be implemented through a simple object layer. Using GATE does not require object-oriented programming, since the framework layer supports all necessary features: experiments are instead described with a scripted macro language, which internally calls the Geant4 command interpreter. GATE uses the ROOT data analysis framework [75] to conduct data analysis.
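To give a flavour of the macro approach, the fragment below shows what a GATE-style setup script looks like. The command paths are paraphrased from typical GATE macros and the exact command set varies between GATE versions, so this should be read as an illustration of the scripting style rather than a verbatim, runnable macro:

```
# Illustrative GATE-style macro (command names paraphrased, not verbatim)
/gate/geometry/setMaterialDatabase   GateMaterials.db
/gate/world/geometry/setXLength      100. cm
/gate/world/daughters/name           crystal
/gate/world/daughters/insert         box
/gate/crystal/setMaterial            NaI
/gate/physics/addPhysicsList         emstandard
/gate/application/setTimeStart       0. s
/gate/application/setTimeStop        60. s
```

Each line is dispatched to the Geant4 command interpreter, which is how GATE lets users describe geometry, physics, and acquisition timing without writing any C++.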

GAMOS
GAMOS is a MC simulation platform built on the Geant4 toolkit, designed to be more user-friendly and easier to extend than plain GEANT4.
Without the need for object-oriented coding, its robust scripting language makes it simple to implement the basic specifications of a program capable of reproducing medical physics experiments.
The plugin technology, together with a thoughtful modular architecture, extensive documentation, and a series of examples and tutorials describing in depth how to extend the platform in various directions, helps users fully leverage GEANT4's functionality by writing new user code or by reusing existing GEANT4 code and combining it seamlessly with existing GAMOS modules. The aim of GAMOS is to provide an easy-to-use framework that lets inexperienced users build their experiments without writing C++, requiring only a basic understanding of Geant4, while remaining flexible enough not to preclude the use of any Geant4 functionality, so that new features can be added and integrated without undue effort.

TOPAS
TOPAS (TOol for PArticle Simulations) bundles and extends the Geant4 libraries into a more sophisticated Monte Carlo simulation covering most available radiotherapy systems, so that medical physicists find it easier to use.
TOPAS can effectively model photon and particle therapy systems, build a patient geometry from DICOM CT images, score dose, calculate fluence and other quantities, save and replay a phase space, and display sophisticated visuals; on top of that, the tool is entirely four-dimensional (4D), to accommodate variations in beam delivery and patient geometry over the course of therapy.
TOPAS users customize prebuilt modules to model a large range of radiotherapy systems without needing deep knowledge of Geant4 or of any programming language.
A dedicated TOPAS parameter control system governs all facets of the simulation, including all 4D behaviours. The tool was built to be adaptable yet convenient, efficient, and consistent. The software design places strong emphasis on protecting users from faults, using several strategies to make it harder for users to make errors.
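The parameter-file style can be illustrated with a short fragment. TOPAS parameters carry a type prefix (e.g. "s:" string, "d:" dimensioned value, "i:" integer); the parameter names below are paraphrased for illustration and should not be taken as a verbatim, runnable TOPAS input:

```
# Illustrative TOPAS-style parameter file (names paraphrased, not verbatim)
s:Ge/World/Material     = "G4_AIR"
d:Ge/World/HLX          = 1. m
s:Ge/Phantom/Type       = "TsBox"
s:Ge/Phantom/Parent     = "World"
s:Ge/Phantom/Material   = "G4_WATER"
d:Ge/Phantom/HLZ        = 10. cm
s:Sc/Dose/Quantity      = "DoseToMedium"
s:Sc/Dose/Component     = "Phantom"
```

Because every aspect of the simulation is declared in one typed parameter file, TOPAS can validate inputs up front, which is one of the strategies it uses to protect users from errors.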
Though proton therapy was the most common early use of TOPAS, it is now applicable to all radiation treatment domains, as well as some medical imaging applications. TOPAS is currently being extended to include radiation biology and scientific education. Studies of radiation exposure of electronic devices, particle and atomic physics, and astrophysics are also possible applications in the near future.

Software Summary
After reviewing each piece of software, we have summarized the most important information for someone who wants to start experimenting with Monte Carlo and needs an overview of the obstacles, capabilities, and popular applications of each tool. The table has seven columns and ten rows; each row provides information for one piece of software. It was designed to give an entry point to someone who wants to use Monte Carlo software but does not know where to start, and to list key characteristics so that the tools can be compared and a decision made about which one addresses a given experimental need. The first column is the name of the software. The second is the particles each code can model, an important characteristic because not all software can simulate all fundamental particles. The third is the energy range each code covers, which also plays a significant part in the decision, since some experiments require a higher energy spectrum than others. The fourth is the programming language in which the software was written; this matters both for users who want to build something extra on the existing code and for understanding the design limitations the software carries. The fifth column indicates whether the software is open source. The sixth gives a feeling for the computing skills required: some tools need only command-line knowledge and provide graphical user interfaces, so no programming is required; others are harder to use without graphical interfaces; and others require code to be written to describe an experiment's geometry, physics, and detection. The seventh column lists the most common uses of each code, so that a tool can be chosen based on the experiment to be simulated.

Conclusion
In this work we have presented the best-known software packages for simulation-based experimentation, combining high-energy physics, algorithms, and statistics, and we have discussed the history of this software and its methodologies. We believe that a major role for Monte Carlo in the near future lies not only in the analysis of algorithms and new modelling of existing problems, but also in building, supporting, and maintaining software tools. Open-source availability plays an essential role for such complex packages: for them to survive, evolve, and be maintained over the years, their code must be open and available to the public; Linux and Android are prominent examples. Another challenge is the complexity a newcomer has to deal with, since the field combines computer science, mathematics, statistics, and optimization techniques with physics. The two projects available on GitHub, GATE and EGSnrc, are in our opinion leading examples of how such technology can move forward: the GitHub platform supports open-source development and provides online tools so that many people can contribute to the code and advance the project, resolving issues, installation details, bugs, features, patches, versioning, benchmarking, and running.
Building Monte Carlo, a methodology that encapsulates complex mathematical methods for simulating natural events, into software packages, toolkits, and frameworks has been a very difficult task over the years. This work is only a glimpse of a few examples of Monte Carlo in subjects such as clinical diagnostic imaging (both radiographic and nuclear), hadron and photon therapy, treatment planning systems, radiation dosimetry, radiation shielding, cosmic radiation shielding, and materials science, mostly focused on natural events; the Monte Carlo methodology is a field that is growing in all directions of the scientific world. We believe that in the future it will be found in many commercial applications, covering not only high-energy particle events but also events of everyday life, guiding people towards better decisions.