Beyond net neutrality

Good friends Vinton Cerf and Bob Kahn invented the Internet Protocol together, but they stand on opposite sides of the raging global debate on network neutrality. Net neutrality is the principle that all Internet Service Providers (ISPs) and governments should treat data equally. It is simple, and it seems absolutely fair and reasonable. But its simplicity is precisely why it is under severe strain from the complex policy challenges of an evolving set of technologies. The fundamental bedrock of network neutrality is the assumption that the Internet is a public good. In many ways it really is. No one owns it, yet all its moving parts -- from server farms to code -- have some form of ownership and intellectual property rights associated with them. It is modulated and moderated by global and national regulatory mechanisms, yet it is still not a tightly controlled resource like, say, uranium. Common standards mean it is not bound by geography, though there are geographical variations in platforms, technologies and, of course, content. In many parts of the world it has become a critical resource of modern living, much like piped water and metered electricity. In fact, several countries are seriously considering making access to the Internet a core and fundamental right. Net neutrality is also an important engine of the open internet, which is based on equal treatment of data and interoperable web standards. The twin principles of net neutrality and the open internet have served the digital world quite well until now. Tim Berners-Lee, Lawrence Lessig, Steve Wozniak and Barack Obama are supporters of net neutrality, as are Internet giants like Yahoo, eBay and Amazon. Yet for all its support and even-handedness, net neutrality today is a fiercely contested domain.

The fight for control of the cyberworld, as it were, emerges from three sets of complexities that are a unique combination of technology, human logic and policy. Each one challenges the fundamental assumption of non-discrimination of data underlying net neutrality. The first complexity can be traced back to 1888, when Almon Brown Strowger, fed up with operators who farmed off calls for profit, invented the first automatic telephone exchange. Strowger, an undertaker, had a strong personal motivation too: he was losing business because the local telephone operator was deliberately redirecting calls meant for him to a competitor. The automatic telephone exchange left subscribers free to choose whom and when to call, irrespective of whether they were using AT&T or any competing service. By removing the human layer -- the operators -- exchange neutrality became the driving principle of modern telephony. The concept of net neutrality is directly derived from this principle. A one-size-fits-all principle works wonderfully when the service offered is based on a single format, say voice or data. The complexity arises when the technology behind the service morphs to allow new formats to be offered. An additional layer of complexity is added when some of these formats keep evolving, breaching the boundaries of data, voice, text, audio and video and obliterating the conventional understanding of singlecasting, narrowcasting and broadcasting. As long as the cyberworld was primarily confined to a few formats revolving around text and pictures, with relatively standard distribution and consumption frameworks, the principle justified itself. But the moment the cyberworld turned into a multichannel communication, transaction, financial and entertainment platform, with its own production, distribution and consumption mechanisms and devices, data could no longer be defined in singular terms. This has led to several policy questions with long-standing implications for the future. Can a high-definition video stream be equated with a webpage? Can a secure online banking system integrated with a stock trading engine be treated on par with a picture-sharing website? How should one look at data that emanates from social media platforms? What about networked devices? Net neutrality and the open internet, at least conceptually, are finding it tough to answer these questions.
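
To make the contrast behind these questions concrete, consider a minimal Python sketch. The formats and their attributes below are illustrative assumptions, not drawn from any standard; the point is simply that the things we lump together as 'data' place very different demands on the network.

    from dataclasses import dataclass

    @dataclass
    class TrafficClass:
        name: str
        latency_sensitive: bool    # does delay visibly degrade the service?
        sustained_bandwidth: bool  # does it need a steady high-rate stream?

    # Hypothetical examples of the formats discussed above.
    formats = [
        TrafficClass("static webpage",  latency_sensitive=False, sustained_bandwidth=False),
        TrafficClass("HD video stream", latency_sensitive=True,  sustained_bandwidth=True),
        TrafficClass("online banking",  latency_sensitive=True,  sustained_bandwidth=False),
        TrafficClass("photo sharing",   latency_sensitive=False, sustained_bandwidth=False),
    ]

    for f in formats:
        # "Equal treatment" reads very differently once the demands diverge.
        print(f"{f.name}: latency-sensitive={f.latency_sensitive}, "
              f"needs sustained bandwidth={f.sustained_bandwidth}")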

The second complexity arises from the way the Internet is embedding itself into the contemporary built environment, ranging from cafés, lounges and railway stations to parks, animals and human beings. The Internet is no longer confined to specific devices. The Internet of Things, as this networked reality is increasingly being referred to, is producing a singularity that is redefining the concept of the dumb pipe and the end-to-end principle, both of which are interlinked with net neutrality. The concept of a dumb pipe is predicated on the understanding that only the end points of a network are imbued with intelligence. The end-to-end principle, a network design principle, posits that communication protocol operations should take place at the end points of a network. In short, the resources being delivered should be controlled only at the points of delivery and consumption. The Internet of Things demolishes this assumption conclusively. Every single component of the dumb pipe is now turning smart, integrated as it is with algorithms and code that determine everything from packet transmission to individualised customisation. Within this emerging ecosystem a content distribution network (CDN), for example, already allows companies with resources to position their content so that it reaches consumption devices earlier than any competing content. This can still be explained as a legitimate business practice for competitive advantage. But there are other unanswered questions. What about the data transmitted by a networked pacemaker fitted in a heart patient? Should such data have priority over a video being downloaded from a website? What about financial transactions? Is it time to look at a staggered hierarchy of priority as far as data is concerned? In fact, should data itself be defined in a more nuanced and tiered manner?
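
The idea of a staggered hierarchy can be sketched in a few lines of Python. The tiers and their rankings below are hypothetical assumptions for illustration only: a strict priority queue in which pacemaker telemetry always jumps ahead of a video download, with first-come-first-served ordering inside each tier.

    import heapq
    import itertools

    # Hypothetical tiers: lower number = higher priority.
    PRIORITY = {"medical telemetry": 0, "financial transaction": 1, "video download": 2}

    counter = itertools.count()  # arrival-order tie-breaker within a tier
    queue = []

    def enqueue(kind, payload):
        heapq.heappush(queue, (PRIORITY[kind], next(counter), kind, payload))

    def transmit_next():
        _, _, kind, payload = heapq.heappop(queue)
        return kind, payload

    enqueue("video download", "movie-chunk-0042")
    enqueue("medical telemetry", "irregular-heartbeat-alert")
    print(transmit_next())  # the pacemaker alert jumps the queue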

The third complexity comes from the different forms of control and intelligence that have emerged in recent years around data itself. One component of this control comes from the mechanisms of storage: technological advancements in storage and retrieval, the geographical location of such hardware, software and services, and differential access to them have all contributed to diluting the principle of network neutrality. Cloud services, for example, not only provide storage for data but also offer software solutions to select clients for 'faster delivery'. Data is also used extensively to mix and match profiles, create personalisations and track user behaviour. Today, net neutrality is hard pressed to account for the burgeoning intelligence embedded in different forms of data. In this new and emerging ecosystem of smart data there are no end points of a network. Such data are often independent of the pipe and have multiple ways of being transmitted, distributed and consumed. Intelligent data has fundamentally transformed the business of the Internet, slowly changing the levers of power and transferring control from the owners of hardware and undersea cables to the owners of proprietary algorithms, code and software. The much debated American SOPA (Stop Online Piracy Act) and PIPA (Protect IP Act) have to be seen within this larger context of intelligent data. Both Acts seek to fill a growing policy vacuum around the need to clearly quantify and understand the emerging networked society of algorithmic intelligence. Some pressing questions are: Does net neutrality also apply to code and algorithms? How does it take into account the increasing, and completely autonomous, personalisation of content and interfaces? What about the monitoring of user behaviour?
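
A toy example shows how this intelligence lives entirely in code at the application layer, beyond anything a neutral pipe can see. In the Python sketch below (the catalogue and user profiles are invented for illustration), the same content reaches two users in different orders, decided purely by an algorithm.

    # The same catalogue, reordered per user by click history.
    CATALOGUE = ["cricket highlights", "stock tickers", "cooking videos"]

    def personalise(catalogue, profile):
        # Rank items by how often this user engaged with each topic before.
        return sorted(catalogue, key=lambda item: -profile.get(item, 0))

    alice = {"stock tickers": 9, "cricket highlights": 2}
    bob = {"cooking videos": 7}
    print(personalise(CATALOGUE, alice))  # stock tickers first for Alice
    print(personalise(CATALOGUE, bob))    # cooking videos first for Bob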

What is the way forward? The answer is critical for India, which to date does not have an official policy on net neutrality. Two things are reasonably clear. Net neutrality as it exists today cannot continue for long. But neither can the alternative conceptualisation of a closed Internet advocated by several powerful ISPs and global hardware and server companies. A combination of solutions advocated by Tim Berners-Lee and Tim Wu, the man credited with coining the phrase 'network neutrality', might pave the road to a new and nuanced policy framework. The first would be to treat data in a differential manner, but not its transmission, distribution and consumption. There is an obvious difference between a webpage and the data transmitted by a networked pacemaker. The second would be to create a nuanced and tiered hierarchy of data, but to temper that hierarchical structure by following a principle of absolute network neutrality within each tier. The third would be to regulate and legislate the specific intersection points where different applications overlap. This would reorient the concept of net neutrality from the current definition of neutral transmission to one of equal treatment among similar applications. Without such a nuanced policy framework it is going to be collectively impossible for us to engage with and regulate the emergent Internet of Things and its networked artificial intelligence.
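
The second proposal, a tiered hierarchy that stays strictly neutral within each tier, can be made concrete with a short Python sketch. The tier names are hypothetical; what matters is the structure: packets are ranked across tiers, but within a tier they are served purely in arrival order, regardless of who sent them.

    from collections import deque

    # Hypothetical tiers, ordered by urgency.
    TIERS = ["critical telemetry", "interactive", "bulk"]
    queues = {tier: deque() for tier in TIERS}

    def enqueue(tier, sender, payload):
        queues[tier].append((sender, payload))  # arrival order only

    def transmit_next():
        for tier in TIERS:                      # hierarchy across tiers...
            if queues[tier]:
                return tier, queues[tier].popleft()  # ...neutrality within
        return None

    enqueue("bulk", "BigCDN", "video-chunk")
    enqueue("bulk", "SmallSite", "video-chunk")
    enqueue("critical telemetry", "hospital", "pacemaker-alert")
    print(transmit_next())  # the pacemaker alert goes first
    print(transmit_next())  # then BigCDN and SmallSite strictly by arrival

Within the 'bulk' tier, BigCDN's resources buy it no advantage over SmallSite: equal treatment among similar applications, which is exactly what the reoriented definition demands.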

R. Swaminathan
09 July 2014
The writer is a Visiting Fellow at Observer Research Foundation and a Fellow at the National Internet Exchange of India.

 
