Linear Collider Forum

Home » Analysis and Reconstruction » Analysis Tools » Simple analysis example of stdhep files ?
Re: Simple analysis example of stdhep files ? [message #1285 is a reply to message #1270] Thu, 08 November 2007 09:35
Messages: 138
Registered: January 2004

kutschke wrote on Wed, 07 November 2007 14:29

Hi Graham,

Some of the ILC event files advertised to be in stdhep format are not. They are written in an unofficial extension of stdhep that will cause most stdhep readers to dump core (or, sometimes, to produce incorrect results). So you must either avoid the bad files or choose your stdhep reader wisely.

Any file that contains more than NMXHEP particles in a single event will trigger the problem. This happens often in beamstrahlung event files. I have not seen the problem in other files, but I can imagine situations in which it might occur. Currently NMXHEP=4000 in the official release.
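To make the failure mode concrete: stdhep stores each event in arrays of fixed dimension NMXHEP, so an event with more particles overruns them. Here is a minimal sketch (hypothetical class and method names, not actual stdhep code) of the capacity check a safe reader has to make:

```java
// Hypothetical sketch of the fixed-capacity problem; not actual stdhep code.
public class StdhepCapacity {
    // Limit in the official stdhep release, per the discussion above.
    public static final int NMXHEP = 4000;

    /**
     * A safe reader must make this check before copying an event into
     * fixed-size arrays; skipping it is what corrupts memory.
     */
    public static boolean fitsFixedArrays(int nhep) {
        return nhep >= 0 && nhep <= NMXHEP;
    }

    public static void main(String[] args) {
        System.out.println(fitsFixedArrays(4000)); // true: exactly at the limit
        System.out.println(fitsFixedArrays(4001)); // false: overflows a standard reader
    }
}
```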

As you say, the only non-standard feature of the stdhep files in use in the ILC community is that they may contain more than 4000 particles per event. I have never seen this be an issue except in background overlay events, and I would be surprised if it affects anyone doing physics benchmark studies.


What readers to use:

The files can be read safely by org.lcsim.

I guess, but do not know for sure, that the code that converts stdhep to LCIO is also safe. Norman, can you confirm this?

I do not know about Marlin, but I would guess it's safe. Frank, do you know?

The files cannot be read by the 4th concept code.

The files cannot be read by the standard stdhep libraries owned by the stdhep maintainers.

All of the org.lcsim code and LCIO code uses dynamic allocation for events and can handle any number of particles.
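For contrast, dynamic allocation sidesteps the fixed limit entirely. A minimal sketch of the idea (illustrative only, not the actual org.lcsim or LCIO event classes):

```java
import java.util.ArrayList;
import java.util.List;

// Minimal sketch: a dynamically sized event has no fixed particle limit.
// Illustrative only; not the actual org.lcsim or LCIO data model.
public class DynamicEvent {
    // e.g. momentum four-vectors; the list grows as particles are added
    private final List<double[]> particles = new ArrayList<>();

    public void addParticle(double px, double py, double pz, double e) {
        particles.add(new double[] {px, py, pz, e});
    }

    public int size() {
        return particles.size();
    }

    public static void main(String[] args) {
        DynamicEvent event = new DynamicEvent();
        for (int i = 0; i < 5000; i++) {        // well past NMXHEP=4000
            event.addParticle(0.0, 0.0, 1.0, 1.0);
        }
        System.out.println(event.size()); // 5000: no overflow, no core dump
    }
}
```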

To be pedantic, the standard stdhep libraries can read files generated by SLIC or org.lcsim, so long as they do not actually contain more than 4000 particles per event.

If it is true that the standard libraries dump core when reading an event with >4000 particles, then it is easy to at least fix them so that they die with a friendly message, since the number of particles is clearly given in the header of each event.
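A sketch of that fix, assuming the per-event particle count can be read from the record header before the arrays are filled (hypothetical names; the real library is C/Fortran and would differ in detail):

```java
// Hypothetical sketch: fail with a clear message instead of dumping core.
public class FriendlyStdhepCheck {
    public static final int NMXHEP = 4000;

    /** Throws a descriptive exception when an event exceeds the fixed limit. */
    public static void checkEventSize(int nhep) {
        if (nhep > NMXHEP) {
            throw new IllegalStateException(
                "stdhep event has " + nhep + " particles, but this reader's "
                + "fixed arrays hold at most NMXHEP=" + NMXHEP
                + "; refusing to read rather than corrupt memory.");
        }
    }

    public static void main(String[] args) {
        checkEventSize(1200);        // fine: within the limit
        try {
            checkEventSize(5000);    // e.g. an oversized beamstrahlung event
        } catch (IllegalStateException e) {
            System.out.println("refused: " + e.getMessage());
        }
    }
}
```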


I am aware of a hack that will work around the problem in a confined environment but it is not truly robust for use with the full body of legacy code. Any robust solution will require changes to the legacy code: the question is where to require the changes.

Note that the Java code at least supports a "compatibility" option that will cause it to refuse to write events with >4000 particles, although it is not turned on by default.
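The compatibility option described above could look something like the following sketch (hypothetical names; the real org.lcsim implementation may differ):

```java
// Hypothetical sketch of a "compatibility mode" writer guard; the real
// org.lcsim option may be implemented differently.
public class CompatibilityWriter {
    public static final int NMXHEP = 4000;
    private final boolean compatibilityMode;

    public CompatibilityWriter(boolean compatibilityMode) {
        this.compatibilityMode = compatibilityMode;
    }

    /** Returns true if an event of this size may be written under the current mode. */
    public boolean canWrite(int nParticles) {
        // In compatibility mode, refuse events that legacy readers cannot handle.
        return !compatibilityMode || nParticles <= NMXHEP;
    }

    public static void main(String[] args) {
        CompatibilityWriter strict = new CompatibilityWriter(true);
        CompatibilityWriter lax = new CompatibilityWriter(false); // the default, per the post
        System.out.println(strict.canWrite(4001)); // false: refused for legacy safety
        System.out.println(lax.canWrite(4001));    // true: written, but unsafe for legacy readers
    }
}
```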

I am happy to incorporate any suggestions for handling compatibility with legacy code better. It would be nice to add a "max number of particles" field to the file header, but that is difficult to implement since the value is generally not known until the file has been completely written.
