The Gay Secretary
Let's remember what the "internet" is. It's nothing more than several suites of protocols for transferring data from one computer to another, or from one network to another; and it's any set of (say it with me now) inter-networked computers running those protocols. Thanks to the boys at DARPA, the internet we know and love (based largely around the TCP/IP protocol suite) is designed to be decentralised, with realtime compensation for changes in the number and positioning of functioning routers and nodes.
If I have a computer publishing a web page using HTTP and another computer across the room reading it, that's the internet. If I maintain a file server that exterior clients can access using VPN, that's the internet. If I work out a means to shift packets using AM radio, and such packets can cross from one network to another, that's the internet.
In essence, the internet is like a road system. Sure, an interested party can put up roadblocks at strategic locations, and they can try to lock down or tear up any routes they don't like. But others can build new roads and new road networks as well, and it's impossible to control the whole thing at once, as long as the equipment and the people who know how to build and use it are scattered throughout the populace.
Now, possibly, the World Wide Web as we know it might be able to be co-opted. But the www is not the whole internet. I recall the salad days of Usenet - a decentralized method of collecting public information. In fact, I recall the atomic bomb board. Either a legitimate means for nuclear engineering students to share information or an elaborate joke, the atomic bomb board (Usenet designations varied) held actual real-world information on nuclear weapons design. Not the kind of thing that the Department of Energy, the Secret Service, or the FBI liked to see.
However, they couldn't get rid of it. Not just because of the perfidy of computer-savvy students, but because of the nature of Usenet. Usenet consisted of directories of files stored at various nodes. Periodically, each node hosting a Usenet directory would enquire whether any of the nodes it was connected to lacked any of the files in its host directory. Any missing files (or the whole directory) would be transferred to the other nodes.
So for instance, the University of Illinois would ask the University of Wisconsin if there were any Usenet files the UofI was lacking, or the UofW was lacking, and they'd copy and transfer files. Simple enough. So if the Secret Service tried to take down all of the atomic bomb board files at the University of Illinois, the University of Wisconsin would just copy its files over to UofI, and service would be restored. If the University of Wisconsin suffered the same treatment, then the University of Illinois would fill in the gap. If, somehow, there was a co-ordinated attack on both of them simultaneously, then it was likely that the University of Minnesota would update both UofW and UofI after the fact. The atomic bomb board became almost impossible to remove.
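The propagation model above is easy to sketch in code. This is a toy simulation, not real NNTP: the node names and the `sync_round` helper are made up for illustration, and real Usenet exchanged articles by newsgroup rather than whole directories. But the self-healing property works the same way.

```python
# Toy model of Usenet-style peer sync: each node periodically offers its
# peers any files they are missing, so deleted files get restored.

class Node:
    def __init__(self, name, files=None):
        self.name = name
        self.files = set(files or [])
        self.peers = []

def link(a, b):
    # Peering is bidirectional: each side polls the other.
    a.peers.append(b)
    b.peers.append(a)

def sync_round(nodes):
    # Every node copies its files to any peer that lacks them.
    for node in nodes:
        for peer in node.peers:
            peer.files |= node.files

# Three universities, all carrying the same board (hypothetical file names).
uiuc = Node("UofI", {"abomb-1", "abomb-2"})
uw   = Node("UofW", {"abomb-1", "abomb-2"})
umn  = Node("UofM", {"abomb-1", "abomb-2"})
link(uiuc, uw)
link(uw, umn)

# The files are wiped at Illinois...
uiuc.files.clear()

# ...but one sync round restores them from Wisconsin.
sync_round([uiuc, uw, umn])
print(sorted(uiuc.files))  # the board is back
```

To suppress the files permanently, an attacker would have to hit every node in the same window between sync rounds, which is exactly why the co-ordinated-attack scenario described above kept failing.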
Now, interestingly, that was the model of the late 1980s and early 1990s. The hacker/cracker credo was "All information wants to be free". Using a redundant, decentralised, self-correcting model, the information was in a sense free.
But something changed. E-commerce. The World Wide Web came into general use, and with it attempts to sell things online. The need to control access to information (paid sites, credit transfers, personal financial information) led to information no longer being free. Gradually, the client-server structure became commonplace, as it is easier to control information that way. Webmail has edged aside traditional email, chat clients have overtaken IRC, web forums have replaced newsgroups and mailing lists, and websites have replaced Gopher sites.
But we remember. We remember Usenet. We remember SMTP and POP3. We remember Gopher. We remember Telnet and the BBS. In the 21st century, the information we need is channeled, dammed, and piped. It's controlled because it's easier that way, because money can be made that way, and free information doesn't drive an economy well. But all that can change. If anyone tries to "control the internet," we remember how to set that information free.