Thread View: comp.misc
3 messages
Started by pokey@well.UUCP
Wed, 09 Nov 1988 09:55
Worm CPU usage.
Author: pokey@well.UUCP
Date: Wed, 09 Nov 1988 09:55
24 lines
1216 bytes
I've been thinking about how many CPU cycles the worm used in its short
but sweet rampage.  People have been tossing around the number 6000 for
systems infected.  (Where did this number come from?)  Let's say these
machines averaged 2 MIPS each, and wildly guess that the average length
of infection was a day.  That comes out to (kachunk kachunk) one
quadrillion instructions.  This may well be the most CPU power expended
so far on a single problem.

Now, if we could harness that kind of power for useful work...  Imagine
a net-wide distributed ray-tracing service.  If you want something traced,
you send the NFF to a central clearinghouse.  All over the net, people
start ray tracing servers when they leave for the day.  These servers
connect up to the clearinghouse and fetch the scene description, and a
list of pixels to trace.  By morning, hundreds of scenes have been traced
and returned to their senders.  (Close your mouth, you're drooling.)

By the way, I spent Sunday doing the PostScript artwork for a worm
t-shirt.  Anyone interested in ordering one?
---
Jef

             Jef Poskanzer   jef@rtsg.ee.lbl.gov   ...well!pokey
"Why should we subsidize intellectual curiosity?" -- Ronald Reagan
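Jef's back-of-the-envelope figure checks out: 6000 hosts at 2 MIPS for a
day is on the order of 10^15 instructions.  A quick sketch of the
arithmetic (the host count, the 2 MIPS rating, and the one-day duration
are all guesses from the post, not measured values):

# Back-of-the-envelope estimate from the post above: ~6000 infected
# hosts, ~2 MIPS each, roughly one day of infection apiece.
hosts = 6_000
mips = 2            # million instructions per second per host
seconds = 86_400    # one day

total_instructions = hosts * mips * 1_000_000 * seconds
print(f"{total_instructions:.2e} instructions")
# ~1.04e+15, i.e. about one quadrillion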
Re: Worm CPU usage.
Author: desnoyer@Apple.C
Date: Thu, 10 Nov 1988 01:01
10 lines
319 bytes
In article <7600@well.UUCP> pokey@well.UUCP (Jef Poskanzer) writes:
> [estimate of CPU cycles used by virus]
>
>Now, if we could harness that kind of power for useful work...

Already been done.  DEC just factored a huge number this way in the
last month or two.  Made the papers and everything.

                                Peter Desnoyers
Re: Worm CPU usage.
Author: rds95@leah.Alban
Date: Thu, 10 Nov 1988 12:53
17 lines
962 bytes
In article <7600@well.UUCP>, pokey@well.UUCP (Jef Poskanzer) writes:
> Now, if we could harness that kind of power for useful work...  Imagine
> a net-wide distributed ray-tracing service.  If you want something traced,
> you send the NFF to a central clearinghouse.  All over the net, people
> start ray tracing servers when they leave for the day.  These servers
> connect up to the clearinghouse and fetch the scene description, and a
> list of pixels to trace.  By morning, hundreds of scenes have been traced
> and returned to their senders.  (Close your mouth, you're drooling.)

I guess most of us [well, at least the groovier of us, anyway...] have
thought about this.  Do we all need RPC?  Why aren't we doing it now?
They do it at CMU, and apparently at DEC and elsewhere, so why not
'distributed ray-tracing (or luminance modelling, or acid deposition
modelling, or...) for the rest of us'?  I'd contribute my meager
resources each night... honest,

rob
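The clearinghouse scheme quoted above is simple enough to sketch.  Below
is a minimal, hypothetical version in Python (chosen for brevity, not
period accuracy): the clearinghouse hands out the scene description plus
a block of pixels, and a volunteer worker loops until the queue is empty.
All class and function names here are invented for illustration; a real
deployment would put the clearinghouse behind RPC or sockets, and
trace_pixel would be an actual ray tracer.

# Sketch of the distributed ray-tracing idea: a clearinghouse parcels
# out pixel blocks for one scene, and idle machines work through them.
# The Clearinghouse class is an in-process stand-in for a network service.

class Clearinghouse:
    def __init__(self, scene, width, height, chunk=1024):
        self.scene = scene
        self.pixels = [(x, y) for y in range(height) for x in range(width)]
        self.chunk = chunk
        self.next = 0
        self.results = {}

    def fetch_work(self):
        """Hand out the scene description and the next block of pixels."""
        block = self.pixels[self.next:self.next + self.chunk]
        self.next += len(block)
        return self.scene, block

    def submit(self, rendered):
        """Collect traced pixels from a worker."""
        self.results.update(rendered)

def trace_pixel(scene, x, y):
    # Placeholder: a real worker would shoot a ray through (x, y).
    return (x ^ y) & 0xFF

def run_worker(hub):
    """Overnight worker loop: pull pixel blocks until none remain."""
    while True:
        scene, block = hub.fetch_work()
        if not block:
            break
        hub.submit({(x, y): trace_pixel(scene, x, y) for x, y in block})

hub = Clearinghouse(scene="teapot.nff", width=64, height=64)
run_worker(hub)
print(len(hub.results), "pixels traced")  # 4096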