I'm doing an initial read of ICANN's SSAC report on Verisign's Sitefinder.
Findings (2), (4), and (5) of the report rest on the assertion that Verisign violated "the well-defined boundary between architectural layers", "accepted codes of conduct", and "established practices". I find the claim that such "codes of conduct" and "established practices" exist to be unjustified and dangerous. These assertions represent a major new claim of power in restraint of innovation and commercial practice. And there are neither clear limits on nor objective definitions of these "codes of conduct" and "practices"; they amount to the imposition of neo-religious principles cloaked in technological garb.
The report's condemnation of Sitefinder as violating a "well-defined boundary between architectural layers" falls equally hard on practices that the report does not reject - Network Address Translators (NATs), firewalls, and stateful and policy-based packet forwarding.
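For readers who did not follow the controversy: Sitefinder worked by placing a wildcard address record into the .com and .net zones, so that a lookup of any unregistered name returned Verisign's server address instead of the "no such domain" (NXDOMAIN) error. The toy model below sketches that effect; the names and addresses in it are illustrative assumptions, not Verisign's actual zone data.

```python
# Toy model of a DNS zone, before and after adding a wildcard record.
# It illustrates how a wildcard swallows what would otherwise be NXDOMAIN.
NXDOMAIN = None  # stand-in for the "name does not exist" response

def make_zone(records, wildcard=None):
    """records: dict mapping a registered name to its IPv4 address."""
    def resolve(name):
        if name in records:
            return records[name]   # a real, registered name
        if wildcard is not None:
            return wildcard        # wildcard matches every other name
        return NXDOMAIN            # the pre-wildcard behavior
    return resolve

# Before the wildcard: unregistered names fail with NXDOMAIN.
plain = make_zone({"example.com": "93.184.216.34"})
assert plain("no-such-name-xyz.com") is NXDOMAIN

# After the wildcard: every unregistered name resolves to one server,
# so software above DNS can no longer tell that the name does not exist.
with_wildcard = make_zone({"example.com": "93.184.216.34"},
                          wildcard="192.0.2.1")  # illustrative address
assert with_wildcard("no-such-name-xyz.com") == "192.0.2.1"
```

The last assertion is where the layering argument bites: mail servers, spam filters, and browsers had been treating NXDOMAIN as a signal, and the wildcard removed that signal for two entire top-level domains.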
It is interesting to contrast the report's assertions regarding layering with those found in section 3 of RFC 3439, a section entitled "Layering Considered Harmful".
While I personally agree that the internet has been created and has flourished on the basis of common beliefs and practices held among the technical literati, I do not believe that such beliefs and practices are automatically held by all, or that it is proper to condemn those who choose other paths. The internet itself was born out of a rejection of the technical orthodoxy of the old telephone company (telco) world. Had the principle of adherence to established technical practice, as asserted by the SSAC report, been accepted as binding among those who invented the internet during the 1970s, the internet would probably never have been created, and communications today would probably strongly resemble ISDN.
Section 2.3 of the SSAC report is a nice discussion of some of the practices used by some members of the internet community. But these are merely that - voluntary practices of a small, relatively homogeneous group. Yet even within that group there are quite strong differences of opinion over rather fundamental issues, such as how packet routing ought to work (MPLS vs. hot-potato routing) or whether internationalized naming ought to be performed within DNS or layered upon DNS.
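To make the internationalized-naming example concrete: the "layered upon DNS" camp encodes non-ASCII names into ordinary ASCII labels before they ever reach the DNS, leaving the protocol itself untouched. That is the approach standardized as IDNA (RFC 3490), and Python's standard library happens to ship the codec, so it can be shown in two lines:

```python
# IDNA encodes a non-ASCII label into an ASCII "xn--" form at the
# application layer; the DNS itself only ever sees ordinary ASCII.
name = "münchen"
ascii_label = name.encode("idna")
print(ascii_label)  # b'xn--mnchen-3ya'

# Decoding recovers the original Unicode label, so the round trip is
# lossless even though the DNS never carried a non-ASCII byte.
assert ascii_label.decode("idna") == "münchen"
```

The alternative - teaching the DNS protocol itself to carry non-ASCII labels - was seriously proposed and debated, which is precisely the kind of unresolved disagreement among practitioners that the text above points to.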
If we techies can't agree among ourselves on these major technical matters, then on what basis should a company engaged in competitive enterprise be constrained?
Different people and different organizations have divergent views on what constitutes the common good, what constitutes acceptable and desirable goals, and what constitutes legitimate and ethical constraints.
The fact that certain vague principles have resulted in useful technology is not sufficient reason to imbue the practitioners of those principles with control over the use of that technology, except for limited times under systems such as patent law. Indeed, those who invent a thing are often too close to the thing to comprehend how it may best be used. (See my note Techies wanna do policy)
In consequence, while I agree with the underlying notions of the value of the end-to-end principle and the value of cooperative practices as outlined in the SSAC report, I do not yet see that the report establishes a foundation upon which one can constrain Verisign from deploying Sitefinder.
Once again, I believe that the proper framework for analysing these kinds of situations is my First Law of the Internet.
I believe that had the SSAC realized that decisions about Sitefinder require a balancing of equities, and had it adopted something like the First Law of the Internet as a framework, then the report could have been more than a regurgitation of vague beliefs. It could instead have provided a good case study of when troublesome practices on the internet (and I do believe Sitefinder is quite a body of trouble) should be restrained, and when they should be allowed to go forth even if there is some ancillary harm.

Posted by karl at July 9, 2004 5:53 PM