The Internet Is Growing: Why It Is a Problem and What Can Be Done About It

Brendyn O'Dell-Alexander
Rochester Institute of Technology

Abstract

The Internet has become an integral part of modern society. This is apparent in the substantial growth over the last decade in the number of unique Internet users worldwide and the average time each spends online. Alongside these trends, a shift is occurring toward consuming larger and more varied media, due in part to the emergence of popular services such as YouTube, iTunes, and Xbox Live.


These changes are stressing components of the Internet designed to solve different problems than those faced today. As a result, assumptions about the longevity of core Internet technologies such as the Internet Protocol (IP) and the Transmission Control Protocol (TCP) are faltering. To address these concerns, solutions such as IPv6, Content Delivery Networks, and faster physical mediums will need to be adopted in the near future to help ensure that the Internet remains a valuable resource for generations to come.

1. Introduction

The Internet is an incredible tool. First created for military purposes (Clark, 1988), the Internet has since penetrated almost every aspect of our lives. When Tim Berners-Lee unveiled the complex system of cables and protocols known as the World Wide Web in 1990 (Cailliau, 1995), few could have predicted the popularity it has seen. As of June 2008, nearly 1.5 billion people around the world have access to this resource--a number that, while large, equals just 21.9% of the total population (Miniwatts Marketing Group, n.d.a). And, with countries like China and India on the rise, each with only a small percentage of its more than one billion citizens currently connected, it is likely that use of the Internet will continue to climb.

Also increasing is the prevalence of rich content. In its original conception, the Internet was meant to ferry small amounts of information from host to host, and its underlying technologies were built and configured for this purpose. In the so-called "Internet 2.0" world, however, where online videos, music, and gaming reign supreme, users have developed an insatiable appetite for the large amounts of data that services are able to provide. This increase in demand has evoked an increase in supply, with large corporations pouring billions of dollars into infrastructure aimed at serving more content. Together, growth in the total number of Internet users and the amount of data they consume has created challenging issues for the current Internet architecture.

While most users encounter their fair share of headaches while surfing the Web, they are typically shielded from its underlying failures. Web browsers and the layers of technology beneath them mask many of the problems encountered when a user makes a request. These problems arise from environmental factors that impact the transmission of information and are frequently accounted for automatically. That said, anyone who has tried to view an extremely popular website has seen the flaws beneath the surface of the Internet's architecture. Under high volumes of use, the Internet becomes a first-come-first-served environment with its pitfalls prominently on display.

Demands during these moments of popularity are analogous to the consistent demand the Internet could see as developing countries increase their connectivity and our desire for rich content continues to swell. In other words, what is now anomalous in those rare moments of extreme popularity could soon become commonplace. Fortunately, computer scientists and networking engineers have been working to address this looming crisis. If issues inherent to current solutions can be overcome, this potential catastrophe can be avoided even as massive user growth and rich data consumption continue.

This paper analyzes growth in three key areas over the last decade--the number of unique Internet users, the size of the data they consume, and the amount of time they spend online--and maps that growth to the components of the Internet architecture that are impacted most. Key issues with each of these components are then outlined, followed by examples of popular solutions capable of addressing these concerns and prolonging the life of the Internet.

2. Internet Growth by the Numbers

Since tracking of worldwide Internet usage began, the unique number of people surfing the Web has ballooned, going from 16 million in 1995 to approximately 1.46 billion as of June 2008 (Miniwatts Marketing Group, n.d.a)--a 9,125% increase in only 13 years. This explosive growth is due to the traction the Internet has quickly gained with businesses, educational institutions, and other establishments.

One possible way to observe Internet growth is through a set of three metrics: the number of unique users, the type of media consumed, and the total time spent online. The first represents the total number of people around the world with the ability to connect to the Internet and consume its services. The second encompasses the types of media, such as plain HTML, pictures, or videos, that people typically interact with while surfing the Web. And the final metric quantifies how much time users spend on the Internet once they have logged on. Together, these values demonstrate how Internet growth over the last decade has been multidimensional, with separate forces aligning to create compounded upward trends.

2.1. Population-based Unique User Growth

The number of unique Internet users has grown extremely fast over the last decade. Much of this growth is attributable to a subset of countries, namely China, India, and a handful of other Asian nations. Between 2000 and 2008, China alone experienced an increase of 1,024.4%, and it now has 253 million of its citizens online (Miniwatts Marketing Group, n.d.b). This number represents only about a fifth of its total population. India, during the same period, saw its unique user count jump by 1,100% (Miniwatts Marketing Group, n.d.b), with a mere 5.2% of its total population, or 60 million people, connected. Growth trends like these are common across Asia, with other countries, though smaller in size, exhibiting equally large or larger growth spurts. Myanmar, Vietnam, Pakistan, Uzbekistan, and Afghanistan saw their online populations grow by 3,900%, 9,979.8%, 12,969.5%, 23,166.7%, and an incredible 57,900%, respectively (Miniwatts Marketing Group, n.d.b). Considering that these nations represent only a sample of those with mountainous gains, and that each has a population of at least 28 million with no more than 24% online, this rapid and large growth is likely to continue.

In other parts of the world, growth is occurring at a lesser but steady rate. Since 2000, North America has seen its pool of online users grow from 108 million to 248 million (Miniwatts Marketing Group, n.d.a), an increase of 129.6% as of 2008. Africa (1,031.2%), Europe (266%), the Middle East (1,176.8%), Latin America (669.3%), and Oceania (165.1%) have all seen sizable increases as well (Miniwatts Marketing Group, n.d.c). This enormous growth means that a greater number of people are consuming the same limited pool of Internet resources, pushing the Internet toward its present maximum capacity.

2.2. Market-based Unique User Growth


Another sense of "unique users" is the total number of distinct people who access a particular web site or service. This metric is used to gauge the popularity of a given site and the services it provides. Three market segments contributing substantially to the growth of Internet use are online media, gaming, and social networking. Combined, these industries have hundreds of millions of users and account for a large amount of Internet traffic.

2.2.1 Online Media

Online media is perhaps the most prominent industry to have emerged on the Web in the last 10 years. The stars within this field are streaming video and digital music, each having seen enormous expansion. On the streaming video stage, YouTube, a Google web application that allows users to upload and share videos, is the main actor. Throughout July 2006, YouTube had approximately 16 million unique users in the United States and streamed 649 million videos to that group (comScore, Inc., 2006). Only two years later, in July 2008, the number of unique users had more than quadrupled, with 71.6 million people watching videos via the service. The quantity of videos streamed increased by an even larger amount, jumping roughly sevenfold to approximately 5 billion (comScore, Inc., 2008a). Though the year-over-year percentage growth in unique users since 2006 has slowed, falling from 347.5% between 2006 and 2007 to 28.6% between 2007 and 2008, it is estimated that nearly 200,000 videos are uploaded daily (YouTube Statistics, 2008), a number indicative of continued demand.

Bolstering that indication, online attendance during the 2004 Athens and 2008 Beijing Olympic games presents evidence that streaming video is fast becoming a staple of the online experience. In 2004, NBC served 10.8 million video streams to a total of 25.2 million unique viewers during the 17-day period in Athens (NBC Universal, Inc., 2008). Four years later in Beijing, the number of video streams grew by 599% to 75.5 million, while the number of unique users more than doubled to 51.9 million (NBC Universal, Inc., 2008). These numbers equate to an impressive 9.9 million hours, or 1,126 years, of video watched (NBC Universal, Inc., 2008).

Digital music has also seen tremendous growth over the last few years, with iTunes, Apple's online music store, at the center of the market. iTunes debuted in January 2001 and saw its unique user count climb to 13.9 million in only four years. More impressively, between November 2005 and January 2007, the service grew by approximately 4 million users per year (Website Optimization, LLC, 2007). Only a year later, in December 2007, the number of users jumped to 35.7 million, representing 157% growth in only three years (Website Optimization, LLC, 2008). Considering Apple's increasing share in the phone, mp3 player, and laptop markets, as well as new entrants into the digital music realm such as Microsoft and Amazon, this growth will presumably continue in the future.

2.2.2 Online Gaming

The second major contributor to the rise in the number of unique Internet users is online gaming. Game console providers such as Microsoft and Sony have invested heavily in Internet connectivity with their Xbox Live and PlayStation Network services. Microsoft alone, since unveiling its Xbox Live features in 2002, has seen large success. Three years after its release, the service had 1.4 million subscribers (Microsoft Corporation, 2005). Two years later, when the number of subscribers was measured again, it had climbed to 6 million (Microsoft Corporation, 2007). Eleven months after that, it reached 12 million, with an estimated new user registering every 5 seconds (Microsoft Corporation, 2008). Calculated out, that rate implies that nearly 6.3 million additional users will register in 2008, bringing the Xbox Live total to approximately 18.3 million subscribers. Similarly, though the Sony PlayStation Network has existed for only one-and-a-half years, it now has 2.8 million subscribers, with 100,000 new users joining weekly (Carless, 2008). This implies that by the end of 2008, its network will have almost 8 million unique users. With just 50% of PlayStation owners currently consuming its online service, plenty of room exists for the PlayStation Network to grow (Carless, 2008).

In parallel to services consumed via gaming consoles, massively multiplayer online role-playing games (MMORPGs) have attracted millions of users to their immersive, virtual worlds. For example, World of Warcraft, which began in 2004 and is now the most popular MMORPG, has over 10 million subscribers (Woodcock, 2008). Since its mainstream breakthrough in 2002, the overall MMORPG market has seen its numbers increase by large increments, going from 3.5 million users in 2002 to 6 million in 2004, 13 million in 2006, and an estimated 16 million in 2008 (Woodcock, n.d.). Gaming has been a common source of entertainment for decades, and by incorporating sociability into what was once isolated play, game makers are broadening the appeal of their games and their market size.
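As a quick sanity check of the Xbox Live extrapolation above, the following Python sketch reproduces the 6.3 million figure implied by the registration rate cited by Microsoft (2008); it is illustrative only and not part of the original analysis.

```python
# One new Xbox Live registration every 5 seconds (Microsoft Corporation, 2008),
# sustained for a full year, implies roughly 6.3 million new subscribers.
SECONDS_PER_YEAR = 365 * 24 * 60 * 60      # 31,536,000 seconds
SECONDS_PER_REGISTRATION = 5

new_subscribers = SECONDS_PER_YEAR / SECONDS_PER_REGISTRATION
print(f"Implied new registrations per year: {new_subscribers:,.0f}")
# ~6,307,200 -- consistent with the "nearly 6.3 million" estimate, which,
# added to the 12 million existing subscribers, gives roughly 18.3 million.
```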

2.2.3 Online Social Networking

Social networking sites--most prominently Facebook and MySpace--are the third cause of the large growth in online unique users. For the past few years, these two online behemoths have grappled for the title of largest virtual community, adding features at a record pace to attract new users. MySpace, as of June 2008, had 117 million unique visitors per month (comScore, Inc., 2008b). Facebook, with rapid growth between 2007 and 2008, saw 132 million unique visitors per month on average (comScore, Inc., 2008b). In addition, other members of the virtual community business, such as HI5.com, Friendster.com, and Google-owned Orkut, have a combined 127 million unique visitors a month, with each experiencing more than 40% growth year-over-year (comScore, Inc., 2008b). Because these sites primarily target young people and college students, two user bases which are constantly replenished, their growth will no doubt continue.

2.3. Business Response to Growth

What hints most at continued growth may not be the increase in unique users itself, but the business reaction to it. As the Internet's popularity has risen, so has investment in advertising to its users. Google, for instance, a company whose profits are based mainly on online advertising, purchased ad firm DoubleClick for $3.1 billion in 2007 (Story & Helft, 2007) to shore up its Internet capabilities. Not to be outdone, Microsoft, which competes aggressively with Google in the online ad market, bought aQuantive in the same year for $6 billion (Isidore, 2007). Though these are impressive investments, the capability to advertise well is only the first step toward increasing profits. Much as in traditional advertising, space is needed for the product placements of companies who pay Google and Microsoft to be their advertising firm. This has created a race among the software giants to expand their suites of web services. Because each new service acts as a digital billboard by providing additional space for advertisements, Google, Microsoft, and others have an incentive to create quality services which attract users who will, in turn, buy the advertised products.

One core asset in this ongoing technology arms race is the number of data centers each company operates. Data centers are used to serve content to the hundreds of millions of people around the world who request it. As such, both the number and geographical location of data centers are important measures of success. To improve both, massive capital investments are required. Google, for instance, has invested approximately $6.3 billion in data center infrastructure since 2006 (Miller, 2008). In the first quarter of its 2008 fiscal year, Google poured $842 million into its data center strategy, its largest single-quarter purchase of equipment to date (Miller, 2008). Microsoft plans to invest a large sum of money as well, predicting it will spend between $1.2 and $1.5 billion a year to grow its data center capacity (Associated Press, 2008).

For Google, whose business model depends on Internet software and online advertising, this spending will likely continue unabated. And it is equally likely that Microsoft will further invest in this lucrative space. At a yearly analyst meeting in Washington, Steve Ballmer, Microsoft's CEO, said, "Everything you read, everything you watch, everything you want to communicate, all of those experiences are going to happen over the Internet" (Associated Press, 2008). With its growing line-up of Live-branded Windows, Office, and Xbox offerings, Microsoft is joining many others in realigning its business model to offer online service complements to many of its traditional software applications.

2.4. Change in Size of Media Consumed

One by-product of the transition from traditional software to software-as-a-service is user demand for a rich experience. Online, users want the same high-quality, immersive content they are accustomed to consuming on their personal computers. To retain users and gain a competitive edge, online service providers are meeting this demand. One indicator that can be used to detect a change in content trends is the amount of data flowing through the Internet, measured in bandwidth consumed. Traditional files are smaller than their richer counterparts, so an increase in demand for rich media should surface as an increase in bandwidth consumed. Indeed, between 2007 and 2008, international bandwidth usage grew by 53%, down only slightly from the 61% increase seen between 2006 and 2007 (Kim, 2008). Traditional Web surfing entails navigating to a web site and browsing through its pages. A sampling of the top 1,000 web sites on the Internet in 2008 found that the average size of a page on those sites was approximately 310,000 bytes. In comparison, the size of an average YouTube video is around 10 million bytes--over 30 times larger than a typical web page (Website Optimization, 2008b). Similarly, the average size of an mp3 file is roughly 4.9 million bytes, or about 15 times larger than the typical web page (comScore, Inc., 2007). The increase in the number of unique users, combined with greater consumption of richer, larger media, has greatly taxed the current Internet architecture.
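The size ratios above follow directly from the cited figures; the short Python sketch below (illustrative only, with byte counts taken from the sources cited) makes the comparison explicit.

```python
# Average object sizes cited in Section 2.4, in bytes.
AVG_WEB_PAGE = 310_000        # average page among the top 1,000 sites (2008)
AVG_YOUTUBE_VIDEO = 10_000_000
AVG_MP3 = 4_900_000

print(f"YouTube video vs. web page: {AVG_YOUTUBE_VIDEO / AVG_WEB_PAGE:.1f}x")  # ~32x
print(f"MP3 file vs. web page:      {AVG_MP3 / AVG_WEB_PAGE:.1f}x")            # ~15.8x
```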

2.5. Increase in Time Spent Online

Many of these new types of services are also more immersive, encouraging users to spend longer periods of time online. According to the 2008 "Annual Internet Survey" conducted by the Center for the Digital Future at the University of Southern California, the average American spent 15.3 hours online per week, an increase of an hour over 2006 and the largest amount reported since the survey began (Center for the Digital Future, 2008). Globally, the average time spent on a PC per month increased by 22 minutes between August and September 2008, climbing from 32 hours and 59 minutes to 33 hours and 21 minutes (The Nielsen Company, n.d.). Whether the sites they visited promoted longer browsing sessions or the data they consumed required persistent connectivity, the fact is that users stayed connected longer.

As an increasing number of Internet users have interacted with richer content for longer periods of time, this activity has placed compounded stress on the underlying cornerstones of the Internet architecture. To understand the problem further, a working knowledge of the Internet's technical fundamentals is necessary. A look at its core components and their histories makes clear that the Internet was designed to solve a different set of problems and is not particularly suited to the ones faced today.

3. Fundamentals of the Internet Architecture

The Internet as we know it bears only a faint resemblance to what it was at its start in the early seventies. Throughout its history, the Internet has faced challenges which arose as a result of architectural decisions made to satisfy early requirements. To alleviate these issues, small changes, amounting to temporary bandages, were applied to the Internet's underlying components. Because of the rapid growth of the Internet over the last decade, the problems are now outgrowing those stopgaps and are demanding new attention to permanent fixes.

3.1. Original Goals

The concept of the Internet first arose within the United States military under the Advanced Research Projects Agency (ARPA) of the Department of Defense. The original goal of the Internet was to connect two disparate military networks to allow for resource sharing (Clark, 1988). Adoption of this goal carried with it other objectives regarding how that interconnectivity should work: neither network should have to be rebuilt, and other networks should be able to interconnect as well; management of the networks should remain decentralized; and multiple hosts must be able to use the same physical communication paths at the same time (Clark, 1988, pp. 1-2). That no centralized Internet governance exists today, yet millions of people originating from different networks are able to connect and use the Internet simultaneously, demonstrates that these goals were successfully met. Of the goals listed, the most important to the military were decentralized management, as the networks were to be used in combat situations, and the ability to connect wholly different networks together with no reconfiguration required. The technologies subsequently developed to satisfy these goals were the Internet Protocol (IP) and the Transmission Control Protocol (TCP). These two protocols, often referred to in tandem as TCP/IP, have been central to the success of the Internet while also being the components that have experienced the majority of its growing pains.

3.2. The Internet Protocol

The Internet Protocol was developed to meet the goal of network interoperability. The basic requirement was that the Internet must "be able to incorporate and utilize a wide variety of network topologies, including military and commercial facilities" (Clark, 1988, p. 4). This was a futuristic goal, as the concept of a computer network was then still in its infancy. Today, however, given the array of private and public networks, this goal has proved prescient. Achieving interoperability also meant the protocol had to be basic and impose as few requirements on the underlying network as possible. One requirement that did make it into the standard was that each host on the network have a way to be uniquely identified, to make direct host-to-host communication possible. This identifier is known as an IP address. An IP address is assigned to each host when it first connects to the Internet. Once the IP address is granted, it is embedded in every packet of information that originates from the host.

The format of an IP address today consists of four period-separated blocks, each comprised of eight binary bits. Each bit, from right to left, represents a power of two, starting with 2^0 (i.e. 2^0, 2^1, 2^2, ..., 2^7). When a bit is equal to 1, its power-of-two value is added to the total block value. For example, the eight-bit binary number 11111111 is equal to 255, or the sum 2^0 + 2^1 + ... + 2^7. This is also the maximum decimal value for each block in the IPv4 standard; thus, no IP address can have a value greater than 255.255.255.255. Because each binary bit can only be 1 or 0, raising the number of possible values per bit (2) to the total number of bits in an IPv4 address (8 bits per block x 4 blocks = 32 bits) yields the total number of possible IPv4 addresses: 2^32, or approximately 4 billion. This number seemed substantially large at the time the Internet was invented, as "the idea of a vast global Internet connecting hundreds of millions of individuals was, at best, a science-fiction fantasy" (Golding, 2006, p. 22). Nevertheless, that fantasy has become reality, and there now exists a serious issue with depletion of this limited resource.
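The arithmetic above can be made concrete with a short Python sketch (illustrative only; the helper function is hypothetical and not part of any standard addressing API).

```python
# Each dotted-quad block is an 8-bit value (0-255); four blocks form a
# 32-bit address, so there are 2^32 possible IPv4 addresses.
def ipv4_to_int(address: str) -> int:
    """Convert a dotted-quad IPv4 address to its 32-bit integer value."""
    blocks = [int(block) for block in address.split(".")]
    assert len(blocks) == 4 and all(0 <= b <= 255 for b in blocks)
    value = 0
    for block in blocks:
        value = (value << 8) | block      # shift in each 8-bit block
    return value

print(0b11111111)                         # 255 = 2^0 + 2^1 + ... + 2^7
print(ipv4_to_int("255.255.255.255"))     # 4,294,967,295, the largest address
print(2 ** 32)                            # 4,294,967,296 possible addresses (~4 billion)
```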

3.2.1 Internet Protocol Address Allocation

Knowing that IP addresses are a scarce commodity raises the question: how are they given out? Allocation of IP addresses is done in a hierarchical fashion, though that hierarchy has changed over time. Originally, entities could purchase Class A, B, or C address space based on whether they wanted 4 million, 64,000, or 254 addresses, respectively (Young, 2008). As is perhaps obvious, these classes did not accurately represent the variation in size of the requestors, leading to wasteful distribution of a finite resource. With Internet usage erupting in the early nineties, this wastefulness led many Internet architects to realize not only that "the impossible would happen--they would run out of address space", but that complete depletion "would happen in only a few years" (Golding, 2006, p. 22). To avoid the impending crisis, the allocation process was changed to what is called Classless Inter-Domain Routing, or CIDR. This new system abandoned the inflexible classes and allowed IP addresses to be granted in amounts based on powers of two. In this model, "blocks of 64 addresses could be issued as easily as those containing 4 million addresses" (Golding, 2006, p. 23).

Even with this change, however, depletion still loomed in the near future. To counteract this scarcity, other technologies appeared, such as Network Address Translation (NAT), a band-aid that reserved a fixed range of IP addresses for intranet reuse, along with new governing bodies known as Regional Internet Registries (RIRs), which were tasked with ensuring that address consumption was carefully managed (Golding, 2006). Regardless, even these measures have not withstood the demand the Internet has seen. In May 2007, the American Registry for Internet Numbers (ARIN) issued a press release which stated that the "available IPv4 resource pool has now been reduced to the point that ARIN is compelled to advise... that migration...is necessary for any applications that require ongoing availability...of contiguous IP number resources" (Plzak, 2007). In other words, the number of remaining addresses had dwindled to the point where requests for new ones would soon be declined.
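The power-of-two nature of CIDR allocation can be illustrated with a small Python sketch (the helper name is hypothetical and only meant to show the arithmetic): a block with an n-bit prefix contains 2^(32 - n) addresses.

```python
# Number of IPv4 addresses in a CIDR block with the given prefix length.
def cidr_block_size(prefix_length: int) -> int:
    return 2 ** (32 - prefix_length)

print(cidr_block_size(26))   # 64 -- the small block mentioned in the quote above
print(cidr_block_size(10))   # 4,194,304 -- on the order of the "4 million" block
print(cidr_block_size(0))    # 4,294,967,296 -- the entire IPv4 address space
```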

3.3. The Transmission Control Protocol

The other fundamental building block of the Internet is the Transmission Control Protocol. When two hosts establish communication, the information that flows between them is transmitted in small fragments known as packets. In alignment with the goal of assuming little about the underlying network, the IP specification had no built-in mechanism to determine important aspects such as how big packets could be, when they should be sent, or how to guarantee their receipt. Responsibility for defining these parameters was intentionally delegated to higher-level protocols such as TCP. At the core of TCP is a simple flow control built on byte streams. When data is ushered through the protocol, it is broken into packets based on the amount of contiguous data allowed. Prior to 1988, TCP would send the information to the host requesting it and wait for acknowledgment of delivery. If that never came, the protocol would resend the data as many times as it took to ensure that it was successfully delivered. This functionality seemed harmless and even beneficial at first, but proved crippling when traffic on a network increased (Wischik, 2005).
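A minimal simulation of the resend-until-acknowledged behavior described above is sketched below in Python; it is an illustration of the idea, not the actual TCP state machine, and the loss probability is an arbitrary assumption.

```python
import random

def send_until_acked(loss_probability: float = 0.3) -> int:
    """Keep resending one packet until delivery is acknowledged.

    Returns the number of transmission attempts that were needed."""
    attempts = 0
    while True:
        attempts += 1
        delivered = random.random() > loss_probability  # simulated lossy network
        if delivered:
            return attempts    # acknowledgment received; stop resending

print(send_until_acked())      # 1 on a quiet network, several under heavy loss
```

Every extra attempt adds more packets to an already busy network, which is exactly the feedback loop examined next.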

3.3.1 Early Issues With Traffic Flooding

Internet traffic is similar to highway traffic. Like roads, each connection between two points on the Internet has a physical limit on the number of elements that can pass through it at one time. Much like a traffic jam, when that limit is reached, the aggregate speed of all elements passing through the medium decreases. This happened as a result of the unchecked resend mechanism built into TCP. When the connections on a network were under heavy use and the number of failed packet transmissions increased, TCP saw each failure as a sign to try again. Essentially, the initial response to a flood of traffic was to flood the network even more. What was instituted in 1988 to curb this unintended by-product was a slow, gradual increase in the number of packets sent, followed by a rapid decrease when a failure was detected (Wischik, 2005). If all was going well, consistent and high transfer rates were possible. But when something went wrong, the worst possible case was assumed and transmission rates were reduced significantly. This host-level mitigation was necessary because no central regulator existed to reroute data around traffic jams--though the fix succeeded in producing that effect by "using the collected decentralized intelligence of all the computers connected to the Internet" (Wischik, 2005).
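The 1988 adjustment described above is now commonly known as additive-increase/multiplicative-decrease; the following Python sketch (illustrative values only, not the real TCP implementation) shows the basic shape of the behavior.

```python
# Grow the sending window slowly while transfers succeed; halve it when a
# failure is detected.
def adjust_window(window: float, loss_detected: bool) -> float:
    if loss_detected:
        return max(1.0, window / 2.0)   # rapid decrease after a failure
    return window + 1.0                 # slow, gradual increase otherwise

window = 1.0
for loss in [False, False, False, False, True, False, False, True]:
    window = adjust_window(window, loss)
    print(f"loss={loss!s:<5}  window={window}")
# The window climbs to 5.0, is cut to 2.5 at the first loss, climbs again,
# and is cut once more: consistent, high rates when all goes well, and a
# sharp backoff when something goes wrong.
```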


Despite the success of this update to TCP, cracks in its foundation are beginning to show as Internet data trends change. As an example, consider high-definition video. To improve display quality, more data is packed into the same amount of playback time than in standard video, so high-definition video files contain many times more bytes. This size difference matters little if a user is viewing the video in a DVD player, for instance, where the disc holding the data and the device the data are sent to are in close proximity, connected by cables capable of transferring information rapidly. A short distance and a high-bandwidth medium are what make fast playback possible. Over the Internet, these characteristics are neither guaranteed nor easy to maintain. Because of environmental factors such as radiation and the basic nature of electronic signals, packets sent long distances over limited-bandwidth connections have a higher probability of being lost. Because TCP underpins video streaming on the Internet, and packet failures and the resulting speed decreases occur frequently, it is difficult to reach and maintain the speed necessary for fluid video playback. Ironically, it seems the mechanism widely credited with keeping the Internet alive may now be stunting its growth.
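A common back-of-the-envelope model for this effect, not taken from the paper, is the Mathis et al. approximation, which bounds steady-state TCP throughput by roughly MSS / (RTT x sqrt(p)), where MSS is the packet payload size, RTT the round-trip time, and p the packet loss rate. The Python sketch below uses assumed, illustrative values to show how distance (higher RTT) and loss together squeeze the achievable rate.

```python
import math

def tcp_throughput_bps(mss_bytes: int, rtt_seconds: float, loss_rate: float) -> float:
    """Rough upper bound on steady-state TCP throughput (Mathis approximation)."""
    return (mss_bytes * 8) / (rtt_seconds * math.sqrt(loss_rate))

# Nearby server: 20 ms round trip, 0.01% packet loss.
print(f"{tcp_throughput_bps(1460, 0.020, 0.0001) / 1e6:.1f} Mbps")   # ~58 Mbps
# Distant server: 200 ms round trip, 1% packet loss.
print(f"{tcp_throughput_bps(1460, 0.200, 0.01) / 1e6:.2f} Mbps")     # ~0.58 Mbps
```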

4. The Next-Generation Internet

Though issues with the Internet Protocol and the Transmission Control Protocol are not disastrous, they introduce significant roadblocks to the Internet's continued success. Fortunately for the 1.46 billion people who enjoy or depend on the services provided by the Internet, many efforts are under way within the computer science and networking fields to address these concerns. Some of the solutions are temporary fixes aimed at extending the amount of time available to develop new remedies, while others are sweeping changes that will require the adoption of entirely new technologies. The first set of fixes targets the weaknesses in the IPv4 standard, the most prominent suggestions being more efficient use of remaining addresses and deployment of the IPv6 standard to take advantage of its larger address space. In addition, attention is being paid to TCP. Bringing data and the machines that serve it closer to users is one option. Others include improving the efficiency of common transmission mediums and installing new infrastructure capable of faster speeds and more concurrent traffic.

4.1. Internet Protocol Version 6

Leading the charge toward mitigating IPv4 address space depletion is the next version of the Internet Protocol, known as IPv6. Despite numerous improvements over IPv4, the most notable enhancement in IPv6 for this discussion is the number of addresses allowed. Each IPv6 address takes the form of eight colon-separated blocks, with an individual block comprised of a four-digit hexadecimal number. A single hexadecimal digit equals four binary bits, making each block sixteen bits long. Using the same math demonstrated in the IPv4 calculation, this results in an enormous 2^128, or roughly 3.4 x 10^38, possible addresses. This number dwarfs the 4 billion addresses possible with IPv4 and is well suited to support a world where not only humans are connecting to the Internet at a record pace, but devices such as household appliances, cell phones, and cars are as well.

There are, however, quite a few challenges to the adoption of the IPv6 standard. The biggest is an attitude among some in the computing industry that the IPv4 crisis is overblown. They argue that the IPv4 address space is not as close to depletion as is generally perceived, and that a majority of those calling for its replacement are doing so because they have a vested interest in the success of IPv6 (Golding, 2006). Their argument rests on the claim that the lack of available addresses is due not to their current use, but rather to hoarding. By allowing RIRs to sell IP addresses while prohibiting buyers from reselling them, so the argument goes, inefficiencies now exist in the IP address market. In fact, the "clearest evidence of wasted IP address space is the large amount of space that has been allocated to users but is not present in the global routing table--over a third of allocated address space at last count" (Golding, 2006, p. 7). That implies that there are nearly 1.3 billion unused addresses which require only relaxed rules to become available, the solution being a market-based incentive for the hoarders to sell their IP resources to those in need. While this market-based approach may work, it is a short-term solution. Even with the release of those unused, hoarded IP addresses, the lifespan of IPv4 is estimated to extend no further than 2026--and that is only if today's Internet consumption rates remain the same (Golding, 2006, p. 8). The case for finding an alternative to IPv4 can still be made, and it points directly to IPv6.
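The address-space comparison made earlier in this section can be checked with a few lines of Python (illustrative only).

```python
# 8 blocks x 16 bits = 128 bits per IPv6 address, versus 32 bits for IPv4.
ipv4_addresses = 2 ** 32
ipv6_addresses = 2 ** 128

print(f"IPv4:  {ipv4_addresses:.3e}")                      # ~4.295e+09
print(f"IPv6:  {ipv6_addresses:.3e}")                      # ~3.403e+38
print(f"Ratio: {ipv6_addresses // ipv4_addresses:.3e}")    # ~7.923e+28 times larger
```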

4.2. Transmission Improvements

Another subset of modifications addresses the shortcomings of the Transmission Control Protocol. As mentioned, long distances tend to increase the probability of failure when sending packets. The outcome is that a TCP-based connection cannot maintain the high transfer speeds necessary for rich Internet experiences. One idea for alleviating this is to bring data closer to users via Content Delivery Networks (CDNs). CDNs are geographically distributed networks of servers that allow static content such as videos and images to be uploaded to a central server and dispersed to nodes all over the world. The benefit is that a user's request for content can go to the server geographically closest to them, decreasing the distance the information has to travel and increasing both the speed at which it is received and the probability of successful delivery.

A second option is to improve the transmission mediums themselves, either by implementing cutting-edge virtual technologies which greatly increase maximum speeds or by leveraging new physical mediums whose speeds are naturally higher. Virtual solutions involve taking further advantage of the existing copper lines installed by telephone companies and used ubiquitously today. A leading potential technology in this arena is Dynamic Spectrum Management, or DSM. Most copper lines today base transfer rates statically on the worst-case scenario to guarantee a consistent speed for users, an approach that often leaves the medium under-utilized (Cioffi, n.d.). DSM, however, observes conditions in real time and adjusts performance accordingly. This could lead to "DSL [Digital Subscriber Lines] connections that top out at 100Mbps or more" (Anderson, 2006), a vast improvement over the approximately 10Mbps maximum speed of DSL lines today. Faster transit speeds mean a decreased probability of failure and a more robust experience.

An alternate approach is to replace current mediums with newer ones whose speeds are inherently faster. At the forefront of this is the switch from copper cables to fiber. With fiber, data are transmitted by sending light pulses through glass or plastic tubes as opposed to electrical signals sent over copper wires. The result is transfer speeds measured in gigabits per second, which greatly outpace the megabit-per-second speeds of traditional cable lines (Scomptec, Inc., 2005). Add to this benefit lower maintenance costs and the ability for fiber to span greater distances than copper without the need for signal re-amplification (Scomptec, Inc., 2005), and the appeal of changing becomes apparent. Transitioning to fiber has been difficult in the past due to higher implementation costs and issues with the fragility of the medium. That has changed recently, though, as "cost cuts for cabling and components are being driven by improved production techniques, as well as the use of less expensive connector materials" (Scomptec, Inc., 2005). The pros of using fiber may soon outnumber the cons, leading to the implementation of a better medium likely to improve the Internet experience.

While none of these individual fixes is a panacea for Internet growth, together they offer a fundamental shift toward the technologies necessary to handle the explosive expansion the Internet has seen over the past decade and will continue to see in the coming years.
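To put the speed differences discussed above in perspective, the following Python sketch (assumed, nominal link speeds; protocol overhead ignored) estimates how long the roughly 10-million-byte video from Section 2.4 would take to transfer over each medium.

```python
def transfer_seconds(size_bytes: int, link_mbps: float) -> float:
    """Idealized transfer time: payload bits divided by the nominal link rate."""
    return (size_bytes * 8) / (link_mbps * 1_000_000)

VIDEO_BYTES = 10_000_000   # average YouTube video size cited in Section 2.4
for label, mbps in [("DSL today (~10 Mbps)", 10),
                    ("DSM-assisted DSL (~100 Mbps)", 100),
                    ("fiber (~1 Gbps)", 1000)]:
    print(f"{label}: {transfer_seconds(VIDEO_BYTES, mbps):.2f} s")
# ~8.00 s, ~0.80 s, and ~0.08 s, respectively.
```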

5. Summary

A wide range of innovative services have been built upon the Internet and have attracted hundreds of millions of people. Consequently, those people have come to rely on those products, making the Internet an indispensable cornerstone of our progress over the last decade. And, though only a fraction of the global population is currently able to enjoy its benefits, the tracks are being laid for billions more to connect. But this exciting expansion of the Internet has proved both a blessing and a burden. Accompanying the benefits of growth have been issues with the Internet's underlying technologies, and with them questions about their ability to support increased demand. New protocols such as IPv6, and solutions to the speed and distance problems of TCP, such as improved algorithms and new physical mediums like fiber, are paving the way for a new generation of Internet use. There remains a faction of people, however, who believe the potential problems are not impending enough to warrant immediate investment. This attitude, while partly correct, should be tempered by the benefits of proactively addressing the issues, especially given the broad negative impact a failure of the Internet could have. The Internet has become a resource relied on by so many because of its immense potential, potential which must be preserved for generations to come.

References

Anderson, N. (2006, October 10). Copper wire as fast as fiber? Retrieved from Ars Technica: http://arstechnica.com/news.ars/post/20061010-7952.html
Associated Press. (2008, July 24). Microsoft plans big investments in search. Retrieved from MSNBC: http://www.msnbc.msn.com/id/25840677/
Beijnum, I. v. (2008, November 13). Google: more Macs mean higher IPv6 usage in US. Retrieved from Ars Technica: http://arstechnica.com/news.ars/post/20081113-google-more-macs-mean-higher-ipv6-usagein-us.html
Cailliau, R. (1995). A Little History of the World Wide Web. Retrieved from World Wide Web Consortium: http://www.w3.org/History.html
Carless, S. (2008, March 10). Best Of GDC: 'Making Games For PlayStation Network - The Facts'. Retrieved from GameSetWatch: http://www.gamesetwatch.com/2008/03/best_of_gdc_making_games_for_p.php
Center for the Digital Future. (2008). Annual Internet Survey by the Center for the Digital Future Finds Shifting Trends Among Adults About the Benefits and Consequences of Children Going Online. Los Angeles: Annenberg School for Communication, University of Southern California.
Cioffi, J. (n.d.). Dynamic Spectrum Management Project. Retrieved November 14, 2008, from Dynamic Spectrum Management Project: http://isl.stanford.edu/~cioffi/dsm/
Clark, D. D. (1988). The Design Philosophy of the DARPA Internet Protocols. Proc. SIGCOMM.
comScore, Inc. (2006, October 11). comScore Data Confirms Reports of 100 Million Worldwide Daily Video Streams from YouTube.com in July 2006. Retrieved from comScore, Inc. - Measuring the Digital World: http://www.comscore.com/press/release.asp?press=1023
comScore, Inc. (2007, August 1). New comScore Tech Metrix Service Tracks Computer Hardware Configurations and Software Usage. Retrieved from comScore, Inc. - Measuring the Digital World: http://www.comscore.com/press/release.asp?press=1557
comScore, Inc. (2008a, September 10). YouTube Draws 5 Billion U.S. Online Video Views in July 2008. Retrieved from comScore, Inc. - Measuring the Digital World: http://www.comscore.com/press/release.asp?press=2444
comScore, Inc. (2008b, August 12). Social Networking Explodes Worldwide as Sites Increase their Focus on Cultural Relevance. Retrieved from comScore, Inc. - Measuring the Digital World: http://www.comscore.com/press/release.asp?press=2396
Golding, D. (2006). IPv6: Unmasked. Midvale: Burton Group.
Isidore, C. (2007, May 18). Microsoft buys aQuantive for $6 billion. Retrieved from CNNMoney.com: http://money.cnn.com/2007/05/18/technology/microsoft_aquantive/
Kim, G. (2008, September 3). Global Internet Bandwidth Usage Up 53 Percent. Retrieved from TMCnet.com: http://ipcommunications.tmcnet.com/topics/ip-communications/articles/38657-global-internetbandwidth-usage-up-53-percent.htm
Microsoft Corporation. (2005, January 20). Xbox Live Sets New Online Gaming Benchmark: Xbox Popularity Rises With Steady Market Share Growth in 2004. Retrieved from Microsoft PressPass: http://www.microsoft.com/presspass/press/2005/jan05/0120XboxLive04BenchmarkPR.mspx
Microsoft Corporation. (2007, March 5). Xbox LIVE Reaches Six Million Members. Retrieved from Xbox.com: http://www.xbox.com/en-US/community/news/2007/0305-xboxlivereaches6million.htm
Microsoft Corporation. (2008, July 14). New Xbox Experiences Reinvent Home Entertainment, and Everyone Is Invited. Los Angeles, California, United States of America.
Miller, R. (2008, April 17). Google's Biggest Data Center Investment Yet. Retrieved from Data Center Knowledge: http://www.datacenterknowledge.com/archives/2008/04/17/googles-biggest-data-center-investmentyet/
Miller, R. (2008, October 16). Google Capex Eases to $452M in 3Q. Retrieved from Data Center Knowledge: http://www.datacenterknowledge.com/archives/2008/10/16/google-capex-eases-to-452m-in-3q/
Miniwatts Marketing Group. (n.d.a). Internet Growth Statistics. Retrieved October 15, 2008, from Internet World Stats: http://www.internetworldstats.com/emarketing.htm
Miniwatts Marketing Group. (n.d.b). Internet Usage in Asia. Retrieved October 15, 2008, from Internet World Stats: http://www.internetworldstats.com/stats3.htm#asia
Miniwatts Marketing Group. (n.d.c). Internet Usage Statistics: The Internet Big Picture. Retrieved October 15, 2008, from Internet World Stats: http://www.internetworldstats.com/stats.htm
NBC Universal, Inc. (2008, August 26). Beijing Olympics Set Record as Most-viewed Event in U.S. TV History with 214 Million Viewers. Retrieved from NBC Universal Media Village: http://www.nbcumv.com/broadcast/release_detail.nbc/sports-20080826000000beijingolympicsset.html
Plzak, R. A. (2007, May 21). ARIN Board Advises Internet Community on Migration to IPv6. Retrieved from American Registry for Internet Numbers (ARIN): http://www.arin.net/announcements/20070521.html
Scomptec, Inc. (2005). The Cabling Cost Curve Turns Toward Fiber. Retrieved from Scomptec: http://www.scomptec.co.id/fiber.htm
Story, L., & Helft, M. (2007, April 14). Google Buys Doubleclick for $3.1 Billion. Retrieved from New York Times: http://www.nytimes.com/2007/04/14/technology/14DoubleClick.html?_r=2&ref=technology
The Nielsen Company. (n.d.). Global Index Chart. Retrieved November 1, 2008, from Nielsen Online: http://www.nielsenonline.com/resources.jsp?section=pr_netv&nav=1
Website Optimization. (2008b, April 28). Average Web Page Size Triples Since 2003. Retrieved from Web Site Optimization: http://www.websiteoptimization.com/speed/tweak/average-webpage/
Website Optimization, LLC. (2007, February 26). iTunes Popularity to Surpass RealPlayer in 2007 - European Broadband Growth Slows - US Broadband Penetration Grows to 79.1% Among Active Internet Users - February 2007 Bandwidth Report. Retrieved from Web Site Optimization: http://www.websiteoptimization.com/bw/0702/
Website Optimization, LLC. (2008, January 18). iTunes Player Hits a High Note, Passes RealPlayer - US Broadband Penetration Increases to 86.79% Among Active Internet Users - January 2008 Bandwidth Report. Retrieved from Web Site Optimization: http://www.websiteoptimization.com/bw/0801/
Wischik, D. (2005, November 2). Internet growth requires new transmission protocol. Retrieved from CERN Courier: http://cerncourier.com/cws/article/cern/29466
Woodcock, B. S. (2008, April 8). An Analysis of MMOG Subscription Growth: Version 23.0. Retrieved October 19, 2008, from MMOGCHART.COM: http://www.mmogchart.com/analysis-and-conclusions/
Woodcock, B. S. (n.d.). Total MMOG Active Subscriptions - Absolute Contribution. Retrieved October 19, 2008, from MMOGCHART.COM: http://www.mmogchart.com/Chart5.html
Young, J. (2008). Network Protocols. Midvale: Burton Group.
YouTube Statistics. (2008, August 13). Retrieved November 1, 2008, from Digital Ethnography: http://ksudigg.wetpaint.com/page/YouTube+Statistics
