
Pages 264-279


From page 264...
... This is, you know, partly for me, but I hope that some people may not know as much about the nuts and bolts of it, which are relatively important in understanding how intrusions are done. I would like to talk a little bit about our project, do a little demo.
From page 265...
... like to talk about streaming algorithms and maybe some graphics proposals that we have. [Slide: Scope of the Problem]
From page 266...
... You can also have Class C networks, in which field four identifies the host and the first three fields identify the particular network. So, Class C networks are relatively small networks; they only have 256 hosts in the network.
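As an aside not taken from the talk, the network/host split of a classful Class C address can be read off directly from the dotted-quad form; a minimal sketch, assuming classful (pre-CIDR) addressing:

```python
# Classful IPv4 addressing sketch (illustration only; modern networks use CIDR).
# For a Class C address the first three fields name the network and the
# fourth names the host, giving 2**8 = 256 host values per network.
def class_c_parts(addr):
    fields = [int(f) for f in addr.split(".")]
    assert 192 <= fields[0] <= 223, "Class C addresses have a first field of 192-223"
    network, host = fields[:3], fields[3]
    return network, host

print(class_c_parts("192.168.1.42"))  # ([192, 168, 1], 42)
```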
From page 267...
... [Slide: TCP/IP Addressing]
From page 268...
... If you are getting Internet traffic from e-mail or, for example, from Web pages, there is maybe lots more of this going on, but this is sort of the typical prototype.
From page 269...
... things you can see. If you sort of multiply all that junk out, you get 3.4 times 10^38, which roughly means that every 30 atoms can have their own IPv6 address.
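That figure follows from the 128-bit length of an IPv6 address; as a quick check (not part of the talk), 2^128 comes out to about 3.4 × 10^38:

```python
# IPv6 addresses are 128 bits long, so the address space holds 2**128 values.
n = 2 ** 128
print(n)           # 340282366920938463463374607431768211456
print(f"{n:.1e}")  # 3.4e+38, i.e., roughly 3.4 x 10^38
```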
From page 270...
... The idea is that the program -- something like tcpdump -- captures the header information of all data flowing through a given point. Ours is implemented in such a way that all traffic in and out of George Mason University is routed through -- not routed through, but also sent to this machine.
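For readers unfamiliar with this kind of header capture, here is a minimal sketch using the scapy library; it is not the George Mason system described in the talk, and the interface name and filter are assumptions.

```python
# Minimal sketch of TCP/IP header capture, in the spirit of tcpdump.
# Not the system from the talk; "eth0" is a placeholder interface name.
from scapy.all import sniff, IP, TCP

def log_header(pkt):
    # Record only header fields, not payloads.
    if IP in pkt and TCP in pkt:
        print(pkt[IP].src, pkt[TCP].sport, "->", pkt[IP].dst, pkt[TCP].dport,
              "len", pkt[IP].len)

# Requires root privileges to open the interface in capture mode.
sniff(iface="eth0", filter="tcp", prn=log_header, store=False)
```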
From page 271...
... Within George Mason University, we have very little traffic between about 3:00 and 5:00 in the morning, comparatively little, very heavy traffic up to about 10 o'clock, when people are sorting out, I guess, their e-mail. Traffic tapers off a little bit.
From page 272...
... He wants to find out who is naughty or nice, but he discards the data after a year, so he is clearly a streaming data analyst.
From page 273...
... Steve Marin was asking me why I do this instead of using somebody else's database. Let me show you a little bit of what the data looks like.
From page 274...
... So, I wanted to say a word on recursive algorithms, because recursive algorithms are the algorithms that you basically need to deal with streaming data.
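The canonical example of such a recursive formula is the running mean, which folds each new observation into the current estimate without storing the stream. A minimal sketch, illustrative only and not the speaker's code:

```python
# Recursive (online) update of a sample mean over a data stream:
#   mean_n = mean_{n-1} + (x_n - mean_{n-1}) / n
# Only the current mean and count are kept; past observations are discarded.
def streaming_mean(stream):
    mean, n = 0.0, 0
    for x in stream:
        n += 1
        mean += (x - mean) / n
    return mean

print(streaming_mean([2.0, 4.0, 6.0, 8.0]))  # 5.0
```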
From page 275...
... So, if you are interested in streaming data and collecting density information, you can do that this way. I thought Daryl Pregibon brought up an interesting idea yesterday, which is the idea of exponentially weighted averages.
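To make the connection concrete (an illustration, not the method from the talk): an exponentially weighted average replaces the 1/n weight of the running mean with a fixed weight theta, so older data are gradually forgotten, and the same recursion can be applied to a kernel density estimate evaluated on a grid. The bandwidth h, the weight theta, and the Gaussian kernel below are arbitrary choices.

```python
import numpy as np

# Exponentially weighted running average: old observations are down-weighted
# by (1 - theta) at every step, so the estimate can track a drifting stream.
def ewma(stream, theta=0.05):
    est = None
    for x in stream:
        est = x if est is None else (1 - theta) * est + theta * x
    return est

# The same recursion applied to a kernel density estimate on a fixed grid:
#   f_n(grid) = (1 - theta) * f_{n-1}(grid) + theta * K_h(grid - x_n)
def ewma_density(stream, grid, h=0.5, theta=0.05):
    grid = np.asarray(grid, dtype=float)
    f = np.zeros_like(grid)
    for x in stream:
        kernel = np.exp(-0.5 * ((grid - x) / h) ** 2) / (h * np.sqrt(2.0 * np.pi))
        f = (1 - theta) * f + theta * kernel
    return f
```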
From page 276...
... I just wanted to give you a couple of references.
From page 279...
... AUDIENCE: Ed, it doesn't strike me -- looking at these packets, it doesn't strike me as a visualization problem. So, I am sort of curious why you framed it that way.

