Followed by the dissenting view that my taxonomy prompted:
I think that "overarching" is generally a bad idea. We should focus on underpinning design constraints. We should not treasure a set of principles that drive many different network activities. We should rather look for the minimal underpinnings that allow all sorts of different principles to be applied.
And now, for the total blasphemy, taking the name of NSF in vain: We should not seek a trustworthy network. Rather, we should seek a network that we don't have to trust, but one that allows us to build trust through the network when appropriate. The right infrastructure for building trust almost certainly does not involve support at the packet-forwarding/IP level. But, it probably does require a very small amount of global agreement on the distribution of public keys. This very small amount of global agreement almost certainly does not include the mechanisms by which those keys become trusted---those should be various and added on as the need justifies the expense.
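To make that layering concrete, here is a minimal sketch of the idea, not anything proposed in the discussion itself: the network below carries only opaque bytes, the only globally agreed piece is a way to map a name to a public key (an in-memory dictionary stands in for whatever that agreement turns out to be), and the question of whether a key deserves trust is left to local policy added on top. The names `publish` and `verify_message`, and the use of Python's `cryptography` package, are illustrative assumptions.

```python
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# Hypothetical stand-in for the "very small amount of global agreement":
# an agreed way to look up a public key by name. How a key came to be
# trusted is a separate, local policy question layered on top of this.
key_directory = {}

def publish(name, private_key):
    key_directory[name] = private_key.public_key()

def verify_message(name, message, signature):
    """End-to-end trust check; the network below carried only opaque bytes."""
    public_key = key_directory.get(name)
    if public_key is None:
        return False   # no basis for trust; local policy decides what happens next
    try:
        public_key.verify(signature, message)
        return True
    except InvalidSignature:
        return False

# Sender, somewhere on an untrusted network.
sender_key = Ed25519PrivateKey.generate()
publish("alice@example.net", sender_key)
msg = b"configuration update v7"
sig = sender_key.sign(msg)

# Receiver: trust is established above the packet-forwarding layer.
print(verify_message("alice@example.net", msg, sig))          # True
print(verify_message("alice@example.net", b"tampered", sig))  # False
```

Nothing at the IP level changes in this picture; only the key-lookup convention has to be shared globally, and even it says nothing about why a key should be believed.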
I am rather astounded that trust got in as a fundamental starting point. It appears to me quite analogous to reliable delivery, which was so wisely omitted from IP. IP is valuable in spite of not providing reliable delivery. It gives us a platform upon which to build reliable delivery when it is worth the cost.
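By way of the same analogy, a toy sketch of reliability built above an unreliable datagram service, under the assumption of a simulated lossy channel (the `lossy_channel` function and the stop-and-wait scheme are illustrative, not any particular protocol): the delivery guarantee is constructed, and paid for, entirely above the best-effort layer.

```python
import random

random.seed(7)

def lossy_channel(packet, deliver, loss_rate=0.3):
    """Stand-in for IP: best-effort, may silently drop the packet."""
    if random.random() > loss_rate:
        return deliver(packet)
    return None

class Receiver:
    def __init__(self):
        self.delivered = {}

    def on_data(self, packet):
        seq, data = packet
        self.delivered[seq] = data   # a retransmitted duplicate rewrites the same bytes
        return ("ACK", seq)          # acknowledgement, itself sent best-effort

def reliable_send(seq, data, receiver, max_tries=50):
    """Stop-and-wait retransmission: reliability added above IP, where it is worth the cost."""
    for _ in range(max_tries):
        ack = lossy_channel((seq, data), receiver.on_data)   # data may be lost
        if ack is None:
            continue
        if lossy_channel(ack, lambda a: a) == ("ACK", seq):  # ACK may be lost coming back
            return True
    return False

rx = Receiver()
chunks = [b"relia", b"ble d", b"elive", b"ry!"]
ok = all(reliable_send(i, c, rx) for i, c in enumerate(chunks))
print(ok, b"".join(rx.delivered[i] for i in sorted(rx.delivered)))
```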
I think you are right that we should carefully examine which of the use cases require a redesign of the network architecture and which could be accomplished by writing better applications at the end-hosts. But I disagree that trustworthiness is among the things that should be an add-on to an untrusted network. I think Internet attacks such as spam, phishing, and DoS are the biggest impediment to the Internet, and we have seen that the best industrial and research efforts to combat them have failed to solve the problem. So I think this makes a good case that trustworthiness could be the killer motivating use case for a new architecture.