Pages 70-89


From page 70...
... will probably run for three or four more years -- that is a little bit uncertain -- what we do and how we do it in the context of the statistical analysis of high-energy physics data. I am going to leave to you the question at the end, as to whether we are actually analyzing a massive data stream or not.
From page 71...
... you can think of as forces, but we tend to actually think of as particles that communicate with each other by the exchange of other particles. Depending on how you count, there are basically three types of stuff, the fermions, and there are basically four forces, and we omit the fact that we don't actually understand gravity.
From page 72...
... This is a cartoon. I get a little bit defensive about that.
From page 73...
... particle physics has 88 distinguishable particles in it. So, no matter what we do, we seem to keep coming back to approximately 100.
From page 74...
... Well, that is bad. The theory is getting more and more complicated, but we are still explaining all of the data, and there is one part where experimental evidence is completely lacking for whether the theory is really working the way it is supposed to.
From page 75...
... The typical reaction, though, is a little bit hidden from us. If I want to study the box that I showed you earlier, I can arrange for it to happen, but I cannot see the direct result.
From page 76...
... The gravitational constant, the mass of the electron, the tax rate -- well, maybe not the tax rate -- none of these is intrinsically complex. The Standard Model connects the types of quarks via a complex "CKM mixing matrix," conventionally written

$$V_{\mathrm{CKM}} = \begin{pmatrix} V_{ud} & V_{us} & V_{ub} \\ V_{cd} & V_{cs} & V_{cb} \\ V_{td} & V_{ts} & V_{tb} \end{pmatrix},$$

with rows labeled by the up-type quarks $u$, $c$, $t$ and columns by the down-type quarks $d$, $s$, $b$.
From page 77...
... before my time. You build up a large number of measurements and you try to over-constrain this.
From page 78...
... So, the hardware detects that and drops this data stream down, both in space and time -- fewer bytes per crossing, because you don't read out the parts of the detector that have nothing in them, and fewer per second. Then there is software that takes that 60 megabytes per second down to about 100 events per second that can't be distinguished from interesting events.
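To make the two-stage reduction concrete, here is a minimal sketch of such a pipeline. The event fields, thresholds, and counts are invented for illustration; this is not the experiment's actual trigger code.

```python
# Hypothetical two-stage event filter: a cheap "hardware" test thins the
# raw stream, then a more expensive "software" test keeps only events
# that look interesting. All names and cuts are illustrative.
import random

def hardware_trigger(event):
    # Coarse, fast criterion: only crossings with enough detector activity.
    return event["hits"] > 10

def software_trigger(event):
    # Slower, reconstruction-level criterion applied to the survivors.
    return event["energy"] > 5.0

raw_stream = ({"hits": random.randint(0, 50),
               "energy": random.uniform(0.0, 10.0)}
              for _ in range(100_000))

kept = [e for e in raw_stream
        if hardware_trigger(e) and software_trigger(e)]
print(f"kept {len(kept)} of 100,000 simulated crossings")
```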
From page 79...
... The simulation knows about all the material in the detector, it knows about all the Standard Model physics, and it knows how to throw dice to generate random numbers. It is generating events where we know what actually happened, because the computer writes that down -- very similar to your simulations.
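The point about recorded truth can be shown in a few lines. A hedged sketch, with made-up particle content and smearing:

```python
# Toy Monte Carlo: the generator decides what "actually happened" and
# records it, then applies a crude detector smearing to produce what
# would be measured. All values are illustrative only.
import random

def generate_event():
    truth = random.choice(["signal", "background"])  # the recorded truth
    true_mass = 5.28 if truth == "signal" else random.uniform(5.0, 5.5)
    measured = random.gauss(true_mass, 0.01)         # detector resolution
    return {"truth": truth, "measured_mass": measured}

events = [generate_event() for _ in range(10_000)]
```

Because the truth label rides along with each simulated event, one can later check how a selection behaves on events known to contain the signal.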
From page 80...
... The events you are looking for will have a finite number of final-state particles. The electric charge has to add up to the right thing, due to charge conservation, etc.
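As a toy illustration of such a consistency requirement (the event layout here is invented):

```python
# Keep only candidate events whose final-state charges sum to the
# expected total, as charge conservation demands.
def passes_charge_check(particles, expected_total=0):
    return sum(p["charge"] for p in particles) == expected_total

candidate = [{"charge": +1}, {"charge": -1}, {"charge": 0}]
print(passes_charge_check(candidate))  # True: the charges sum to zero
```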
From page 81...
... A typical efficiency in terms of that kind of mistake for us is 20 percent. We can discuss whether it is 10 percent or 25 percent, but it is routine to throw away the majority of events that actually contain what you are looking for because of measurement errors and uncertainty.
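To put a number on what a 20 percent efficiency costs (my gloss, not arithmetic from the talk): if a selection keeps a fraction $\varepsilon$ of the events that really contain the signal, then

$$N_{\text{produced}} = \frac{N_{\text{kept}}}{\varepsilon}, \qquad \varepsilon = 0.2 \;\Rightarrow\; N_{\text{produced}} = 5\,N_{\text{kept}},$$

so four out of every five events containing the sought physics are discarded.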
From page 82...
... Maybe that fit will have directly in it the Standard Model parameters that we are looking for. Maybe there will be parameters that describe a resolution.
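A toy version of such a fit, assuming a single Gaussian peak: the physics parameter (peak position) and the resolution (peak width) are extracted together by maximum likelihood. This is a sketch under those assumptions, not the analysis code in question:

```python
# Fit a peak position and a resolution simultaneously by minimizing the
# negative log-likelihood of a Gaussian model on toy data.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(0)
data = rng.normal(loc=5.28, scale=0.01, size=2000)  # toy "measured masses"

def nll(params):
    mu, sigma = params          # physics parameter, resolution parameter
    if sigma <= 0:
        return np.inf
    return -np.sum(norm.logpdf(data, loc=mu, scale=sigma))

result = minimize(nll, x0=[5.3, 0.02], method="Nelder-Mead")
mu_hat, sigma_hat = result.x
print(f"peak = {mu_hat:.4f}, resolution = {sigma_hat:.4f}")
```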
From page 83...
... [figure: "Reconstructed ..." distribution; the remainder of the label is not recoverable from the scan]
From page 84...
... That means that they will write code that is just complicated enough that it will complete before they come back with a cup of coffee. Now, my question to you -- we don't have to answer this now -- is that I am not sure that we are doing statistical analysis of massive data streams at this point.
From page 85...
... If something is marginal, because maybe something a little bit more complicated than the first-order thing I am looking for has happened, we throw it away. You can easily do back-of-the-envelope estimations that we are talking about factors of three in running time for experiments that cost $60 million by doing this.
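One way to read that back-of-the-envelope claim (my reconstruction, not the speaker's worked numbers): the running time needed to collect a fixed number of accepted signal events scales inversely with the selection efficiency, so

$$T \propto \frac{1}{\varepsilon}, \qquad \frac{T(\varepsilon=0.2)}{T(\varepsilon=0.6)} = 3,$$

i.e., recovering the marginal events could in principle buy back a factor of about three in running time.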
From page 86...
... the statistics of combining data with non-Gaussian errors. The way that you interpret this is that each little color band or hatched band is a different measurement from a different experiment, and this little egg-shaped ...
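For contrast with the non-Gaussian case being discussed, the standard Gaussian baseline for combining independent measurements is the inverse-variance weighted average. A minimal sketch with invented numbers:

```python
# Inverse-variance weighted average: the textbook Gaussian combination.
measurements = [(0.222, 0.010), (0.219, 0.015), (0.226, 0.008)]  # (value, sigma)

weights = [1.0 / s**2 for _, s in measurements]
combined = sum(w * v for (v, _), w in zip(measurements, weights)) / sum(weights)
combined_sigma = (1.0 / sum(weights)) ** 0.5
print(f"{combined:.4f} +/- {combined_sigma:.4f}")
```

When the individual errors are non-Gaussian, this simple average is no longer optimal, which is exactly the complication the figure on this page illustrates.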
From page 87...
... Anomalies can be due to mistakes. The Standard Model predicts that electric charge is 100 percent conserved, all the time.
From page 88...
... MR. JACOBSEN: Neural networks?
From page 89...
... algorithms. AUDIENCE: What are the number of -- [off microphone]

