Table of Contents
What’s at Stake
National Security and Maintaining World Technology Leadership
Political, Social and Economic Factors
Prosperity in a Digital Economy
Electronic Entertainment with High-Definition Content
Converged and Bundled Services
The Economic Value of Driving these Benefits Earlier
Domestic Jobs vs. Outsourcing
Globalization and Offshore Outsourcing
Close Proximity and Instant Messaging
Politics and the Need for Executive Leadership
How much bandwidth is needed and when?
Broadband Subscriptions still trail behind Dialup
Focus on Application Benefits and Eliminate Baby Steps
Last-mile Onramps Connect Underused PC Capacity
Glut of Fiber Capacity
Competition in the Last Mile
Intellectual Property Rights
Pushing on Market Accelerators
Enabling New Business Models
Leading by Example with E-Government
Society cannot fully benefit from the digital economy until convenient and affordable broadband connections are available to all Americans. Multimedia applications for enhanced distance learning, e-commerce, teleworking, telemedicine, home networking, and electronic entertainment with high-definition programming will not reach consumers without these high-speed connections. So, reviving interest in the information superhighway and its last-mile access is needed to encourage the development and purchase of new technology and services, and to benefit both consumers and the economy.
That’s why I support the industry leaders who believe the United States needs a national broadband policy, one on par with the other G7 (Group of Seven) countries that already have such policies and are moving ahead in the new digital economy. And I will argue that America needs an initiative equal in importance to the space program, which was kicked off in a 1962 speech by John F. Kennedy at Rice University, where he said:
“We choose to go to the moon. We choose to go to the moon in this decade and do the other things, not because they are easy, but because they are hard, because that goal will serve to organize and measure the best of our energies and skills, because that challenge is one that we are willing to accept, one we are unwilling to postpone, and one which we intend to win.”
A national broadband policy is needed now because the Telecommunications Act of 1996 left the task of building an information superhighway system to private companies, and that has mostly not happened. The Act was supposed to encourage open competition in all aspects of the telecommunications industry, from the networks themselves to the services delivered across them. But no real network competition resulted, and the Baby Bells suppressed their DSL (digital subscriber line) technology for over 15 years to protect their high-margin T-1 services.
Broadband still costs too much, there’s no “killer application” to drive demand, and it’s too difficult to set up. Most people find that the benefits just don’t justify the cost and effort, at least not yet.
Broadband service costs too much since there’s not enough competition. We lack compelling new apps and content since today’s networks are too slow. And setup can require professional installation, with the added cost of a truck roll. While all of these obstacles are slowly being addressed, more formidable obstacles include incumbent competitors, lobbyists, and the political process.
After fifteen years, we might brag that nearly 80M U.S. households have access to broadband, but just 19M subscribe (about 16% of us: 12M cable and 7M DSL), and analysts expect subscriptions to barely reach 40M by year-end 2004. That’s far less than we should hope for. So what happened, and why the delay?
It seems that the primary mistake of the Telecom Act was viewing the network as a commodity instead of as the necessary infrastructure for delivering commodity services. The impact of that viewpoint and the resulting delays has been severe.
America is already starting to lose its competitive edge in world technology markets, and we have seen a dramatic downturn in the technology and telecommunications sectors. TIA President Matthew Flanigan describes the impact in his November 2002 letter to the Chairman of the FCC, asking that the commission promote widespread broadband deployment and remove regulatory hurdles that get in the way.
“The dramatic downturn in the telecommunications sector has led to more than 500,000 job losses, $1 trillion in corporate debt and nearly $2 trillion in market valuation losses in the telecommunications industry alone since 2000. These developments have precipitated an unprecedented slashing of research and development budgets that seriously threatens the future of industry innovation, our global leadership in technology, and in some very important respects, the very security of the United States.”
What’s at Stake
This section is about what’s holding up progress in building the last-mile onramps to the Information Superhighway (mostly incumbent carriers with too little competition, and politics), and the benefits of extending real broadband (100Mbps+) to all Americans. So what has been the cost of delay, and what is the opportunity if we jump-start the vision?
The South Koreans are doing this with government support, but it’s easier there with so many high-rise apartments, close contact with peers, and rapid fashion uptake; Koreans replace their cell phones every 3 months vs. every 30 for Americans. Also included is a discussion of offshore outsourcing, which has contributed to North America losing 10% of its technology jobs and has driven down average salaries.
National Security and Maintaining World Technology Leadership
The United States currently leads the world in the total number of Internet users, at just over 150 million, but the growth rate in new users is slower than in other countries, and some of those countries have significantly surpassed the U.S. in the percentage of households (penetration rates) subscribing to broadband Internet services.
Several studies predict that the world population of Internet users will exceed one billion by 2005. That’s twice the number of 2001 users, and most of them will be concentrated in Asia and Western Europe, no longer in the U.S.
We are falling further behind other world competitors in terms of broadband subscribers per capita, and we remain the only G-7 country (the so-called Group of Seven industrial nations) without a national broadband policy. If we are to maintain our technology leadership, then it seems obvious that the original thinking behind the Information Superhighway must be revived, all market inhibitors must be removed, and new incentives (market accelerators) must be created.
Political, Social and Economic Factors
South Korea, with over 10 million broadband subscribers (over 60% of households vs. just 19% in the USA), is considered the most advanced broadband market in the world and has become a model for other countries. The South Korean government in 1995 decided to become a leading knowledge-based economy and set an objective of connecting a 100+ Mbps “highway” to each home by 2004. To do that, it created a broadband strategy and offered funding for a high-capacity backbone network. That way, competing telecom firms wouldn’t depend on using infrastructure from the incumbent, Korea Telecom. The government also provided low-interest loans to companies investing in new infrastructure and included other incentives to drive broadband into rural areas. But social and economic factors have also played a part in Korea’s broadband success.
It’s relatively easy to install fiber cabling in high-density housing where most Koreans live, and since most people commute by bike, train, or on foot, each individual crosses paths with thousands of others each day. As a result, there’s a very fast fashion uptake, and Koreans replace their cellular phones every 3 months. That compares to every 30 months for Americans, who may only see a dozen or so other people each day. But that doesn’t mean we should all just sell our SUVs and single-family homes and move into city apartments.
Reflecting back on America’s previous investments in infrastructure for highways, power distribution, telephones, etc., I worry that South Korea and others have taken a lead over the U.S. in deploying information superhighways, and in doing so could surpass us economically. But we were not the only country to invest in these earlier infrastructures, so if that’s not what made our country great, what was it? And will it stay that way?
One of the great advantages our democracy has is its rich mix of different nationalities, cultures, and religions, and that variety has led to more creative solutions to problems. By comparison, nations with just one ideology, religion, or way of thinking tend to have a more limited set of experiences to draw upon when making decisions to exploit opportunities and address problems.
The diversity of our nation can be an advantage, but it also means that many people want to influence political outcomes, and that can slow the decision process.
Prosperity in a Digital Economy
In the Information Age, nations that invest in telecommunications will have a distinct advantage. They’ll have the means to educate their people, to develop their products, and to sell and deliver their services. Countries that don’t invest will find themselves falling behind, going outside for needed help, and paying a premium for it. This section covers some of the areas improved by a high-speed information superhighway, including commerce, education, health care, and entertainment.
The U.S. Department of Commerce reports that online retail sales in 2002 reached $45.6 billion, a 22.7% increase over 2001 but still only a fraction of the Business-to-Business e-commerce market. A March 2003 report from eMarketer puts that larger B2B market at $1.41 trillion in 2003 and $2.37 trillion by 2004. Half of that amount ($721B in 2003 and $1.01T in 2004) is expected to come from the U.S.
As impressive as these forecasts are, however, they were scaled back from earlier projections from companies like Forrester Research, which in 2000 predicted worldwide e-commerce revenues of $6.8 trillion by 2004 and nearly $13 trillion by 2006, including both B2B and B2C. It’s no wonder that investors poured so much money into any company beginning with e- or ending with dot-com. But while it seems that these earlier forecasts relied heavily on having an Information Superhighway in place, the broadband “onramps” to the superhighway were never completed.
If we make the gross assumption that not having onramps is the only factor explaining the difference between the Forrester and eMarketer forecasts (it’s not), then we can estimate the potential economic loss. By comparing the forecasts, we see a difference of $3.43 trillion for 2004 and over $89 trillion when looking at the next five years! While this estimate is rudimentary at best, and the U.S. B2C portion is a small fraction of the total, the magnitude of the potential loss (or opportunity) should at least get your attention. Note that we’re not just risking millions of dollars or billions, but trillions, and that’s why having a broadband policy is so important.
USA e-Commerce Market Share is Declining
It doesn’t much matter what set of market forecasts you look at. The U.S. share of the e-commerce pie is about 73% today, but falling as other countries invest for competitive advantage in the Information Age. An important reason for the decline is that the U.S. remains the only G-7 country without a national broadband policy, according to a report by the Organization for Economic Cooperation and Development (OECD).
A recent OECD study examined why some countries prosper more in the Information Age than others and concluded that factors include more than just information and communication technologies (ICT). They found that policies that engage a mix of ICT, human capital, innovation, and entrepreneurship, as well as fundamental policies for controlling public finances and inflation while instilling competition, seem to yield the best results over the longer term. This implies that a U.S. broadband policy is just one contributor to future economic growth, although an important one.
Much can be said about enhanced Distance Learning and how video-rich media will improve the lifelong learning process, but first let me stress that broadband is just part of the solution; learning can also benefit from more modest computers with no network, especially with very young children.
I introduced my son to the PC when he was just 2 years old (he’s now 19) and we bought a program called Early Games. To focus his attention, I made a cardboard overlay to cover the keyboard and show only the number keys. The program’s first level taught the differences between numbers: that a 3 was like an 8 but open on the side, that a 6 was like an upside-down 9, and that a 1 looked something like a 7. It then got into counting, with simple rewards when he pressed 3 to match three boxes or 8 to match eight boxes. Slowly, it led him into addition, subtraction, multiplication, division, and even fractions, and all of this was done on the original IBM PC (4.77 MHz) with no network or hard drive.
Most experts agree that children learn more, and more quickly, when entertained. For my 2-year-old son, the chirping bird that hopped across the screen as a reward was good enough, and it was fun to watch as he also tried pressing the wrong keys on purpose, just to see what happened. He’d hold down a key until the keyboard buffer filled and the PC screamed at him, and he’d laugh out loud. As kids get older, however, they need multimedia, hands-on experience, or multi-player competition to keep them engaged and compete with TV and video games.
Today’s Smart Toys
Now kids have smart toys with embedded microprocessors and networking, like Microsoft’s Interactive Barney. Barney talks to kids in spoken phrases, sings songs, and plays games, and to keep them engaged, as they grow older, Barney communicates through a wireless network to expand its capabilities.
Networking does more for education than simply improving the logistics of shipping content and making sure that students always have access to interesting courseware. It also ties many students together electronically with inspiring teachers and fosters competition, mentoring, and real exploration.
During the late 1980s, IBM was an early leader in multimedia edutainment, and I was especially impressed with its Illustrated Books and Manuscripts series, which captured the knowledge and perspectives of different literary experts and used laser discs to store hours of video content. The program Ulysses encouraged students to play these different ideas against each other to gain a deeper understanding of this epic poem.
That same multimedia technology can also be used to capture someone’s passion for (and understanding of) Civil War history, for example, and then to clone and amplify that value by extending it beyond a typical classroom to reach thousands of classrooms, either on recordable media like CDs or across a network.
When I first saw an early IBM video of MIT students competing in robot wars, I was blown away by the excitement as the robots fought each other to collect ping-pong balls. The Robot Wars concept has since trickled down to high schools, middle schools, and hobbyists; and the contests are broadcast on TV. Eventually, global networks with real-time language translation will let students speaking different languages compete with each other or cooperate on projects.
Meaningful Career Development
Networked multimedia also lets students experiment with things that would be too dangerous or too expensive otherwise. Imagine high school and college students controlling a virtual nuclear reaction, or maneuvering a robot arm on a space shuttle, or doing open-heart surgery, as they can with IBM’s Emergency Room.
Networked multimedia makes it easier to explore potential careers and find ones that fit your natural talents, rather than just sliding into whatever opportunity pops up. That’s the real benefit, in my opinion: helping to create a society where people get up inspired and eager to go to work, and where they share values, knowledge, and perspectives with their peers around the world.
Broadband networks and multimedia education also improve the lifelong learning process, which is especially important now, since employees no longer work for one company for their entire career, and few companies provide employee development education. Today’s workforce faces the need to continually update their skills and education as they move from one employer to another.
In-Stat/MDR put the number of “remote and mobile workers” at 63.9 million in 2001, increasing to 85 million by 2005. That number includes mobile workers, frequent business travelers, people who work from home part-time, those who toggle between two corporate work sites, and even “non-office” workers who work in production facilities and warehouses. Interestingly, the number does not include full-time teleworkers since they work in only one place, the home. And it does not include people who run a home-based business. So let’s just agree that there are many people working from home, and more will do so in the future.
For years, advocates have argued that teleworking reduces traffic, saves time and gas, helps the environment, makes it easier for companies to find and employ qualified workers, and generally improves the quality of life for employees. So, why don’t we do more of it?
For one, telework usually requires broadband connections equal in speed to office networks, connections that can support applications like video conferencing. But there are also other factors, including management fears and political barriers. While teleworking does offer many opportunities, it also poses a threat to building owners and cities unwilling to change, not to mention the car manufacturers, oil industry, and others that fear they may be affected.
The Impact on Buildings & Cities
The merger of computers and telecommunications systems has profoundly altered the physical design of office buildings and the type of activities occurring inside. The newest buildings feature advanced networking infrastructure built into hollow walls, hung ceilings, and raised floors, to connect data and video transmission equipment. And when older office buildings are unable to meet today’s networking needs, they become obsolete and generate demand for new buildings, which are often located outside of the city.
So, just as electricity enabled the elevators and air conditioning needed to draw workers together into tall office buildings in big cities, high-speed telecom networks are enabling a move away from cities. Broadband networks also let workers move out of offices in tall buildings and into ones at home, at their customer, or in their car. And this trend has attracted as many opponents as champions.
The Impact on Employers
Effective use of teleworking takes planning, investment in networks and equipment, changes in management style, and compassion for employees. For business travelers, the hotel room that was once a place to rest is now a place to resume work, and while that improves productivity, it doesn’t go over well with all of us. Network connections now transform hotels and airports into work centers during layovers or while waiting for flights, and cellular phones mean that managers expect to reach us anytime.
IBM was one of the first to recognize that mobile employees don’t really need full-time desks and offices. So years ago, the company pushed them out into the field to spend more time with customers or to work from home. The move reduced real estate costs considerably and also helped to increase sales due to an increase in customer “face time.” When IBM reps do visit the branch office, they now call ahead to temporarily reserve a desk, network port, and phone (a process called “hoteling”) where they can use shared facilities such as printers and conference rooms.
Policy makers worry that healthcare costs are already rising, and as baby boomers retire it will get much worse and put a severe strain on the economy. This is not the time to champion universal health coverage, however, since the cost of that coverage will go up dramatically. It’s a time to instead promote telemedicine, as a way of lowering costs.
Applications such as remote health monitoring already help the elderly or infirm remain productive and stay home longer, instead of crowding them into assisted living facilities. The addition of high-quality video can greatly enhance these applications, but that requires more bandwidth than we have today, and it’s one more reason to have a national broadband policy focused on the Information Superhighway and its application benefits.
Electronic Entertainment with High-Definition Content
Where’s the Value? In the Network or the Application?
A DVD movie consumes about 4 GB of space on the disc and costs just $5 to rent at the video store. Given that large file size, consider the movie’s value compared to an emergency alert from a security sensor or health monitor. Then ask, “How should we pay for broadband services: by the amount of bandwidth consumed (4,000,000,000 bytes vs. less than 100 bytes), or by the relative application value?”
Now think of how your answer might change if fiber optics made bandwidth a cheap and plentiful commodity instead of a fairly scarce resource. That’s the world of university students in college dorms. With no network constraints, they were the first to download digital music and videos. But as long as the rest of us are stuck with relatively slow DSL and cable modems, or dialup (heaven forbid), we’ll be stuck renting movies and wishing we could download them.
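To put those network constraints in perspective, here is a rough back-of-the-envelope sketch of how long that 4 GB movie takes to download at a few illustrative access speeds (the speeds are line rates chosen for illustration, not figures from the text; real-world throughput would be lower):

```python
# Best-case download times for a ~4 GB movie at illustrative access speeds.
movie_bytes = 4 * 10**9  # ~4 GB, as cited for a DVD movie

speeds_bps = {
    "56k dialup": 56_000,
    "1.5 Mbps DSL/cable": 1_500_000,
    "100 Mbps fiber": 100_000_000,
}

for name, bps in speeds_bps.items():
    seconds = movie_bytes * 8 / bps  # bytes -> bits, then divide by line rate
    print(f"{name:>20}: {seconds / 3600:8.2f} hours")
```

The gap is stark: roughly a week of continuous downloading over dialup, about six hours over a 1.5 Mbps connection, and around five minutes over 100 Mbps fiber. That last figure is why the dorm-room generation treats movie downloads as routine.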
TV Broadband vs. On-demand
Broadcast television is a great business model when millions of people want to watch the same program at once, such as the Super Bowl or the President’s State of the Union address. It doesn’t make sense, however, for specialized content like instructional videos, or where individuals want specific content at a given time with the ability to pause and rewind. That’s a better model for video-on-demand (VoD). Between these extremes are other models like narrowcast and pay-per-view.
With Internet-based VoD, there should be less demand for the hundreds of programs that are pushed to each wall outlet. That’s because you can only watch one program at a time (or two with picture-in-picture, or maybe nine with a video marquee).
While households benefit from the ability to access hundreds of channels so people in different rooms can watch different programs, the individual TV does not, so I see a strong future in distributing video from a master set-top box over wireless networks. That master doesn’t even have to be in the house but could be a remote service.
As Moore’s Law continues to enable new compression and encryption technologies, we can send more and more digital content over the same networks and delay adding bandwidth, but not forever.
MPEG4 video compression is a lossy technology that uses powerful processors to offer near-DVD quality video at just 750 Kbps instead of 4-6 Mbps for MPEG2. It adjusts picture quality based on available bandwidth, CPU capacity, and display resolution; and it’s being added to mobile phones on one end of the device hierarchy and DVD recorders on the other, where it can store more content on the disk.
MPEG4 lets more applications benefit from video content and more service providers offer it. Even today’s DSL and cable networks can send compressed near-DVD quality movies, and with some network enhancements, they could also send high-definition programming.
At CES, Toshiba showed a proprietary new version of MPEG4 that supports high-definition at 1080p resolution with only 2-4 Mbps of bandwidth (depending on the amount of motion), instead of over 20 Mbps for MPEG2. This implies that carriers could eventually send HD content over enhanced cable and VDSL networks without extending fiber to each home. As interesting as these developments are, they don’t offer enough performance improvement to move from a broadcast model to a VoD model with multiple HD video streams for active TV sets in each home.
Today’s narrowcast model sends video content to locations closer to the consumer, such as to head-end video servers, where it can then be accessed on-demand. And DVRs like TiVo extend this model all the way to the household and provide VoD benefits with existing broadcast content, even allowing the delay of transmitting more-personalized content to off-peak hours.
In our house, for example, we read the online program guide and decide what to record and watch later. That way we can bypass the commercials and watch a 30-minute program in just 15-20 minutes. What we don’t yet do is download recordings from Internet services like KaZaA. A nasty byproduct of this ability to skip commercials is that programs are filled with more of them, and in some cases more time is taken up by the ads than by the content. I don’t think the consuming public will stand for this trend much longer, and that leads to the next topic.
Broadcasters trying to protect their Old-World markets don’t like the fact that DVRs let consumers skip the ads, so they’re crying “foul,” appealing to Congress to legislate solutions, and appealing to Hollywood to insert the ads within the content itself. A better way to avoid losing ad revenue lies in mass personalization, but broadcasters haven’t embraced the DVR as a technology to tell them about individual preferences. With that knowledge, they could present only the ads that match individual interests. Personalized ads could even be viewed as added benefits instead of disruptions.
Converged and Bundled Services
Since broadband networks can carry any digital content, they enable the convergence of voice, data, photos, music, and video, as well as the concept of service bundling. This convergence will eventually result in lower subscription costs and improved services with new capabilities, and all carriers are moving in this direction. Those that don’t will see their individual services become low-cost commodities, and they’ll be driven out of business. Many will go out of business anyway, since most consumers will eventually have only one provider that bundles all of their telecommunication services.
I wish I could get bundled services now, since I currently pay over $250 per month for all of them individually. OUCH!!!
$135 goes to Sprint for mobile phone service with 3 phones, a large pool of shared minutes, free long distance, and other attractive features;
$135 goes to Time Warner for cable TV, digital cable, DVR rental, and RoadRunner cable modem service; and
$100 goes to SBC for two local phone lines with Caller ID and some other features.
When phone solicitors call to sell their services, I tell them I’d gladly pay $175 if their bundle includes all of the services I want, but so far none of them offer that.
The Economic Value of Driving these Benefits Earlier
Lawrence Vanston says service bundling will occur by 2015, when 88% of households in North America will have high-speed Internet access, with many of them watching streaming video and about two-thirds owning a high-definition TV. I think it will occur much sooner since new competitors offering service bundles will make commodities of individual voice, TV, and Internet services. Most people will get these core services from the same company, and incumbent carriers will either join the convergence trend or become extinct.
While most analysts think it could take 15 years or more to fully benefit from the digital economy described in this section, it could happen much quicker than that with a little push, and I hope you’ll agree that the push is justified.
Earlier, we compared different e-commerce forecasts from Forrester Research and eMarketer, noting that consumers still don’t have affordable broadband access and the impact this has had on both the telecom industry and e-commerce forecasts. This exercise highlighted $3.43 trillion in lost opportunity for 2004 and much more across the next five years!
Another way to look at the opportunity is to pull future projections in a year earlier, meaning that 2004 benefits come in 2003, 2005 benefits come in 2004, and so forth. Wow! The value of a national broadband policy sure adds up fast!
And this rudimentary exercise only considered e-commerce. It didn’t factor in enhanced distance education, telecommuting, telemedicine, and other segments of the new digital economy or their impact on national security. With so much at stake, we can’t afford delays.
Domestic Jobs vs. Outsourcing
Studies have long shown that the information technology (IT) sector is critically important to the health of the U.S. economy. Since 1995, IT capital investments have provided 22% of gross domestic product (GDP) growth, accounted for at least half of the increases in productivity, and were responsible for a significant decrease in inflation. IT also generated jobs, with 1.2 million new jobs added from 1994 to 1998, at salaries averaging 85% more than other jobs.
America’s IT jobs were the first to benefit from growth in telecommunications and the first hit by the telecom and .com bust. Since IT is now at the center of an increasingly service-oriented economy, these jobs also get cut when work is outsourced offshore.
A New York Times article by David Leonhardt quotes recent U.S. Department of Labor reports, saying the economy has lost more than 2 million jobs (a drop of 1.5%) since the recession began in March 2001, despite the resumption of economic growth. An unusually large number of people have been unemployed for months, with almost 1.9M still looking for jobs after six months or more; that’s triple the number from two years ago. These high levels of unemployment strain our economy and the people still working, since they must contribute, through taxes, to the survival of the unemployed.
Globalization and Offshore Outsourcing
Even more disturbing was a Business Week article that described a new round of globalization and how the driving forces of (1) digitization, (2) the Internet, and (3) high-speed data networks are enabling telework and outsourcing and sending upscale jobs offshore. Outsourced jobs have now expanded to include accountants, architects, graphic artists, financial analysts, telemarketers, help desk specialists, engineers, programmers, chip designers, product developers, and basic R&D.
Chip designers with a master’s degree and five years of experience can make $7,000 a month in the U.S., or $1,000 a month in India, where the average salary is just $500 per year. That disparity also contributes to falling U.S. salaries. A senior engineer who made $130K a year in 2000, for example, now makes just $100K, and an entry-level help desk specialist who made $55K in 2000 now makes just $35K. So rather than keeping up with inflation, these examples show nominal declines of roughly 23% and 36%.
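Checking the arithmetic on those two salary examples (figures from the text), the declines work out to about 23% and 36%:

```python
# Percentage salary declines for the two examples cited in the text
# (2000 salary vs. current salary, nominal dollars).
examples = {
    "senior engineer": (130_000, 100_000),
    "help desk specialist": (55_000, 35_000),
}

for role, (salary_2000, salary_now) in examples.items():
    decline = (salary_2000 - salary_now) / salary_2000 * 100
    print(f"{role}: {decline:.0f}% decline")
```

And these are nominal figures; adjusting for the inflation that accumulated since 2000 would make the real decline in purchasing power larger still.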
The Business Week article also shows that China, India, the Philippines, and Mexico have MUCH stronger growth in the number of engineering and natural-science college graduates. And if this is not disturbing enough, outsourcing experts say the job migration has only just begun. Scared yet?
Close Proximity and Instant Messaging
Many jobs don’t make sense to send offshore, and it seems that local engineers are still needed during critical early product development. That’s so they can meet with clients and see problems first hand, and so they can quickly exchange materials that would otherwise have to go through U.S. customs. Also, by working with local manufacturers during the development phase, companies don’t have to worry as much about intellectual property theft.
Instant Messaging (IM) is another reason for close proximity. My son, for example, is a college freshman who uses both email and IM but uses email for his Drum Corps friends around the country and IM for his local buddies. It seems that IM works best when everyone is in the same time zone and on similar work schedules. So while IM is evolving into a great tool for communications and idea sharing, this need for close proximity may limit its role with offshore outsourcing.
Politics and the Need for Executive Leadership
America has had a love affair with its cars, but the taxpayer cost of building the interstate highways they ride on is about $30 million per mile. Compare that with the much lower cost of laying fiber-optic cable – just half a million dollars per mile in a new trench and less than $50 thousand if fed through existing conduit. Furthermore, experts say a high-speed information superhighway system could eliminate half of the travel on physical roads.
We continually use the term Information Superhighway to describe what government officials sometimes call a National Information Infrastructure (NII). That’s because the high-speed network is viewed as “infrastructure” – much like the interstate highway system, electric grid, and rail system that preceded it. Using that analogy, we think nothing of getting into a car and driving to work, flipping a light switch, turning on a faucet, or picking up a phone to call someone; but these services all depend on infrastructure and agreed-upon standards that took decades to unfold.
The information superhighway is evolving much like traditional highways that were paved as traffic and demand increased. But just as government played a role in building America’s roads, it should also play a role with information roadways.
It may be helpful to look at who paid to build and maintain our highway system, to see if there are parallels that should be followed. Individual homeowners and businesses privately funded many roads. Cities and states funded others. And the Interstate Highway System comes under the U.S. Department of Transportation. It stands to reason that information networks should be viewed in the same way, with part of the overall expense funded by federal, state, and local governments, according to a national objective.
What if that funding never appears? We’ll “eventually” get widespread broadband because of the market dynamics. Fiber will continue to reach closer to homes and small businesses, and the dark fiber will get lit up. This revolution can’t be stopped, but it can be slowed (or accelerated).
It could take less than a decade with a national broadband policy and funding, or 15-20 years without. The longer the delay, the further we will fall behind competing nations that have broadband policies and invest in it. As described in the previous section, there’s a lot at stake, including higher education, technology development, commerce, economic leadership, and even national security.
We can’t afford to wait and let this revolution take its own course while other nations actively pursue it, because prosperity in the digital economy is worth trillions of dollars.
TechNet and the Computer Systems Policy Project (CSPP) are among the many influential organizations that have petitioned the FCC, demanding a national broadband policy – one that opens the “last mile bottleneck” for all Americans, with a goal of at least 100 million homes and small businesses having affordable access to 100+ Mbps broadband capacity by the end of the decade.
The FCC responded with a ruling this past February that gave the RBOCs (Regional Bell Operating Companies) both a carrot and a stick to encourage deployment of new VDSL and fiber-to-the-premises (FTTP) networks. The carrot removes past regulations that would have required RBOCs to share their new data networks with competitors at cost, which should make it easier for them to invest in new capacity. The stick keeps those regulations in place for existing networks, with state regulators determining the lease rates.
Competing carriers (including cable companies offering phone services) are put on the defensive and now need to think about making (or increasing) their own network investments.
The Commission didn’t go as far as FCC Chairman Michael Powell would have liked, however, and he complained that allowing states to regulate rates could cause confusion, inconsistency, an increase in court challenges, and slower broadband deployment than desired. His inability to get majority approval of a bolder broadband policy highlights the intense lobbying and enormous political pressures that FCC commissioners face, and it suggests that effective leadership must come from higher up the chain of command â€“ such as from the President himself.
How much bandwidth is needed and when?
I support a goal of having at least 100 million homes and small businesses with affordable access to 100+ Mbps broadband capacity by the end of the decade, but I don’t expect the market will get there on its own. That’s because of the chicken vs. egg syndrome, where consumers are unwilling to pay for more capacity than their primarily text-based applications need. Rich multimedia with high quality audio and video will greatly enhance e-commerce, distance learning, and the many applications mentioned in the first section, but this development effort is waiting on faster networks, which is waiting on consumer demand, which is waiting on compelling reasons to buy more. Chickens. Eggs.
With a 56Kbps modem, a 1-page email with basic fonts (20K) can be displayed in less than 0.4 seconds, and a relatively large Excel attachment (500KB) takes less than 9 seconds to transmit. Even web browsing over dialup connections is fast if you turn off the graphics, so most users are happy.
Simpler applications require even less bandwidth. Basic security monitoring needs just 19 bytes (7 digit customer number + 2 digit zone + 10 digit date/time stamp), and then only when an alert happens or for periodic system tests. Health monitoring equipment has similar needs. And both examples offer high value, including protecting the lives of loved ones.
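The 19-byte alert described above can be sketched as a fixed-width ASCII record. The field widths (7-digit customer number, 2-digit zone, 10-digit date/time stamp) come from the article; the function name and the MMDDYYHHMM timestamp convention are illustrative assumptions, not an actual alarm-industry protocol:

```python
# Sketch of the 19-byte security alert: 7-digit customer number,
# 2-digit zone, 10-digit date/time stamp (format assumed here).

def pack_alert(customer: int, zone: int, timestamp: str) -> bytes:
    """Pack an alert into a fixed-width 19-byte ASCII record."""
    if not (0 <= customer <= 9_999_999 and 0 <= zone <= 99):
        raise ValueError("field out of range")
    if len(timestamp) != 10 or not timestamp.isdigit():
        raise ValueError("timestamp must be 10 digits")
    return f"{customer:07d}{zone:02d}{timestamp}".encode("ascii")

alert = pack_alert(1234567, 5, "0415031230")  # April 15, 2003, 12:30
print(len(alert))  # 19 bytes, sent only on an alert or periodic test
```

Even at dialup speeds, a 19-byte message moves in a few milliseconds, which is why these high-value monitoring applications put no pressure on last-mile capacity.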
That’s why I wanted you to consider the consumer value of a 19-byte security alert, compared with a movie rental. A DVD movie, which costs $5 to rent at Blockbuster, takes up about 4 GB of space on the disc and would take 3-4 days to download over a 56 Kbps connection. Even with digital compression, video-on-demand and HD video streaming consume so much bandwidth that they’re not feasible when network capacity is limited (as with dialup, DSL, and cable service).
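The DVD figure is easy to reproduce with a naive transfer-time calculation (size in bits divided by line rate). The raw number comes out closer to a week; modem compression of roughly 2:1, which this sketch ignores, brings it down toward the 3-4 days quoted:

```python
# Naive transfer time, ignoring modem compression and protocol overhead.
def transfer_seconds(size_bytes: float, rate_bps: float) -> float:
    return size_bytes * 8 / rate_bps

dvd_bytes = 4e9   # ~4 GB DVD image
dialup = 56_000   # 56 Kbps modem, best case

days = transfer_seconds(dvd_bytes, dialup) / 86_400
print(f"{days:.1f} days")  # about 6.6 days uncompressed; a few days with 2:1 compression
```

The same function shows why the 19-byte alert is trivial by comparison: 19 bytes at 56 Kbps takes under 3 milliseconds.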
To add rich video content, we need much faster networks, and that’s the reason for selecting 100+ Mbps as the objective – to support several high-definition video streams simultaneously.
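As a sanity check on the 100 Mbps target: a broadcast-quality MPEG2 high-definition stream runs about 19.4 Mbps (the ATSC broadcast rate), so the arithmetic below shows how many simultaneous streams each class of connection could carry. The link speeds are the ones discussed in this article:

```python
# How many MPEG2 HD streams fit in a given downstream rate?
HD_STREAM_MBPS = 19.4  # approximate ATSC MPEG2 HD broadcast rate

def hd_streams(link_mbps: float) -> int:
    return int(link_mbps // HD_STREAM_MBPS)

# 1 Mbps DSL/cable, 24 Mbps long-loop VDSL, 52 Mbps short-loop VDSL, the 100 Mbps goal
for link in (1, 24, 52, 100):
    print(link, "Mbps ->", hd_streams(link), "HD stream(s)")
```

A 100 Mbps connection carries five such streams with headroom left for other traffic, while today’s ~1 Mbps services carry none.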
Broadband Subscriptions still trail behind Dialup
A recent CEA “Internet Connections” survey found that consumers buy only as much bandwidth as they need for email, instant messaging, and Web surfing, and the FCC confirms that most Internet users still use dialup services. Parks Associates says users are fairly happy with their dialup service at $15-25 per month and not likely to switch to broadband at $40-50 per month. DSL and cable services are still too expensive and too slow, at about 1 Mbps, to support enough compelling new applications to justify the extra cost.
Even though both digital cameras and music benefit from broadband connections, most users aren’t convinced that this justifies the monthly cost. These markets would grow much more quickly if broadband were cheaper and more readily available. That’s because it takes two minutes to send just one photo attachment to grandma over a dialup connection (1.3M pixel images consume about 675KB each) and 4 hours to send 100 vacation pictures to Kodak for printing, while it takes just 11 minutes to send the vacation pictures over a 1Mbps broadband connection.
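The photo numbers above can be reproduced with the same back-of-envelope arithmetic. The gap between these computed times and the quoted ones reflects dialup’s slower upstream rate (often 33.6 Kbps or less) and protocol overhead, which this sketch ignores:

```python
# Idealized upload times for photo attachments; real dialup uploads are slower.
def transfer_minutes(size_bytes: float, rate_bps: float) -> float:
    return size_bytes * 8 / rate_bps / 60

photo = 675_000  # ~675 KB for a 1.3-megapixel JPEG

print(f"{transfer_minutes(photo, 56_000):.1f} min")            # one photo over a 56 Kbps modem
print(f"{transfer_minutes(100 * photo, 56_000) / 60:.1f} h")   # 100 vacation photos over dialup
print(f"{transfer_minutes(100 * photo, 1_000_000):.0f} min")   # 100 photos at 1 Mbps
```

Either way, the order-of-magnitude difference between dialup and even 1 Mbps broadband is what turns an overnight chore into a coffee break.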
Focus on the Application Benefits and Eliminate the Baby Steps
Consumers don’t want to buy or rent network capacity; they just want the applications and content that come across the network. So let’s examine the different applications that each successively faster network supports, in step fashion. The objective of this exercise is to encourage the elimination of network baby steps that add extra time and cost without adding much value.
When focus shifts to application benefits, the Internet infrastructure becomes just the required means of achieving Economic Value. Think of Internet infrastructure as three different physical pipes to distribute Economic Value, represented by spherical objects of different diameters:
1. The diameter of the first economic pipe supports the delivery of marbles and makes it possible to produce and market marbles â€“ as well as BBs or anything smaller than marbles. This pipe represents a 56Kbps modem that supports text, data, and small digital files.
2. The slightly larger diameter of the second economic pipe makes it possible to also sell and deliver golf balls, but it’s too small for tennis balls, baseballs, or anything larger. It adds some value over the first pipe, but not much; and if you only want to buy marbles, you don’t need the added capacity. This second pipe represents the present DSL and cable networks, which support larger files, photographs, compressed music, and gaming.
3. The much larger diameter of the third economic pipe supports basketballs, bowling balls, golf balls, and almost any type of round object you can think of, making it possible to produce, sell, and deliver them all, even if you can hardly imagine ever wanting to do that. It represents the Information Superhighway, with support for enormous files and high quality video and audio content, flowing in either direction with no constraints. And it adds considerable value and utility for today’s applications and tomorrow’s.
Note that compression may allow some beach balls to flow through golf-ball-size pipes, but only one at a time, and even then with great difficulty. That’s like trying to push video down a DSL or cable pipe. Bowling balls, on the other hand, can’t be compressed, so why try?
Last-mile Onramps Connect Underused PC Capacity to Long-haul Network
It’s safe to say that we already have an Information Superhighway with a glut of capacity, but we lack the high-speed onramps in the “last mile”. Without these onramps, we’ve not seen the advanced applications that need long-haul network capacity, and it has become a cheap commodity. Ideally, each home and business would connect to the highway with optical fiber, but the telcos and cable companies keep trying to squeeze more life from existing copper and coax cabling and have only been taking baby steps that waste time and money.
As we look at the network value chain, with the PC at one end and the long-haul network at the other, both ends have a glut of capacity. Moore’s Law has brought us more computing power than mainstream PC applications need, and even though PC makers offer 3 GHz models, most consumers need no more than a 3-year-old 200 MHz PC, even for photo editing. So Moore’s Law is no longer driving the PC industry and is rapidly being supplanted by Metcalfe’s Law, which says the value of a network grows with the square of the number of devices connected to it.
As you’d expect, Intel and Microsoft are trying to protect their legacy markets by adding new functions that consume more MIPS (a measure of processing power). Video requires more power, but standalone video editing applications are cumbersome and far from mainstream. So with no need for today’s high-end PCs, consumers benefit more from buying several value PCs and networking them together.
The capacity glut at each end of the chain explains why companies on the ends have watched their products become cheap commodities while companies in the middle see growth. This observation suggests that there’s money to be made in the middle, by bridging the gap and balancing the value chain.
Once we have broadband performance of 100+ Mbps, which approaches the speed of a PC’s internal bus, data from remote disks will fly around nearly as fast as data from local drives. In a way, this will add to the abundance of computing power, since it lets services make use of the many idle PCs, so they can all work together on complex programs.
Glut of Fiber Capacity
The glut of long-haul network capacity, which caused the dot-com bubble to burst and the telecom industry to suffer, was the result of building too much capacity in response to marketing hype without also building the last-mile onramps. And the onramp problem was primarily caused by a lack of real competition. To be fair, several technical innovations also contributed to this situation, as discussed below.
Too many companies chased after the market potential, and too much fiber was deployed – along railroad tracks, in the middle of long-distance electrical power lines, and through all sorts of other rights-of-way. Through the individual efforts of AT&T, Qwest, IXC, Williams Communications, and others, over 100,000 miles of fiber was installed. We now have an abundance of fiber in the Internet backbone, and most of it is still dark.
The Qwest story is interesting. The company had access rights along railroad lines and designed special rail cars to dig trenches and lay fiber inside conduits at the rate of a mile per day. And while each mile of conduit may hold 100 fibers, only one fiber has been lit up so far.
At the same time, optical engineers developed prisms and filters that combine and separate more and more colors of light on a single strand, a technology called dense wavelength division multiplexing (DWDM). With this technology, each fiber can carry much more data.
And semiconductor advancements led to lasers that pulse more quickly, for speeds up to 40 gigabits per second – per color, per fiber. As a result, we now have products that can transmit 3.2 terabits per second on a single fiber, or enough capacity to carry the world’s phone traffic. Soon, Bell Labs plans to push 200 Tbps, or enough to send the entire Library of Congress every second, all on a single fiber. It’s mind-boggling.
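The three multipliers (fibers per cable, colors per fiber, bits per color) compound, which is how the glut arrived so fast. The 3.2 Tbps figure above corresponds to roughly 80 wavelengths at 40 Gbps each; the 80-wavelength and 100-fiber counts below are illustrative assumptions consistent with the numbers in this section:

```python
# DWDM link capacity = wavelengths per fiber x bit rate per wavelength.
GBPS = 1e9

def fiber_capacity_bps(wavelengths: int, gbps_per_wavelength: float) -> float:
    return wavelengths * gbps_per_wavelength * GBPS

single_fiber = fiber_capacity_bps(80, 40)  # ~3.2 Tbps on one lit fiber
cable = 100 * single_fiber                 # a 100-fiber conduit, mostly dark today
print(f"{single_fiber / 1e12:.1f} Tbps per fiber, {cable / 1e12:.0f} Tbps per cable")
```

Light a second wavelength, or a second fiber, and capacity jumps again without digging a single new trench, which is exactly the multiplying effect described next.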
Multiplying Effect on the Bandwidth Glut
Investments in these technologies (more fibers, more colors, and faster pulsing) were each driven by early expectations for the Information Age, and they had a multiplying effect on each other that resulted in a glut of bandwidth capacity. But don’t blame the problem on false expectations when it was really the lack of competition and last-mile onramps. The Information Age vision is still sound and still inspires us onward.
Competition in the Last Mile
The FCC and Congress, through the Telecommunications Act of 1996 and related rulings, made it clear that they favor markets that are open to vigorous competition, as a way of encouraging innovation; but there’s a certain amount of wasted overlap in this approach, and incumbents naturally try to discourage competition and protect their turf. Federal and state regulations have so far failed to create a competitive environment, but that’s starting to change.
As mentioned earlier, the February FCC ruling removed some regulatory barriers that were keeping the RBOCs from investing in new high-speed networks (networks that cannibalize existing service plans), especially when they had to share those networks with competitors at cost. While they must still lease existing networks to competitors, they won’t have to share new ones. These moves will hopefully jumpstart a stalled market for last-mile access, and both the RBOCs and their competitors now have reason to invest in performance improvements.
The FCC was divided on this ruling due to intense lobbying and the enormous political pressures that FCC commissioners face. I believe that shifting the focus from network services to application benefits will help them resist these pressures. But the inability of FCC Chairman Michael Powell to cause that shift suggests that visionary leadership and arbitration must come from higher up, and I hope to see presidential candidates focus on this issue.
The competing last-mile solutions all depend on optical fiber in some manner. By extending fiber closer to the customer, legacy wiring gets shorter… and therefore faster. A generic term for systems that push fiber deeper into the network is Deep Fiber. It includes DSL, HFC (hybrid fiber-coax), and FTTP, as well as future wireless systems that compete in the last mile with radio signals. Each of these technologies is illustrated in the following chart and described below.
The selection of each technology is logically based on the target market, with some best suited for dense urban environments, and others better used for rural surroundings, mobile use, or specialized applications.
By extending fiber to the premises, carriers can bypass the baby steps designed to extend the life of legacy wiring. FTTP is the fastest route to a TRUE Information Superhighway and an obvious preference for new home construction, where there’s no legacy to protect. Retrofitting FTTP is more difficult, however, due to the added cost and disruption of digging trenches through gardens.
At the same time that regulatory changes are making it easier to install fiber networks, new passive optical network (PON) couplers in the end nodes eliminate the need for electric power and active electronics to operate them. A PON coupler extracts the signals destined for each customer and inserts that customer’s upstream signals back onto the fiber on the return path. By 2010, the speed of FTTP networks could exceed 600 Mbps, and for households where fiber is part of a home network, the term FTTP could mean fiber-to-the-pillow.
VDSL (Very high-speed digital subscriber line)
The incumbent phone carriers will likely continue taking baby steps, using legacy copper phone lines to connect homes and small businesses. These lines get shorter as fiber extends deeper into the network, and this can improve the speed of DSL services to 52 Mbps. But note that 52 Mbps is still less than our 100 Mbps objective, and it’s far less than the speed of competitive all-fiber systems. VDSL end nodes include VDSLAMs (VDSL access multiplexers) that contain the active electronics and convert light on the fiber to electricity on the copper, and vice versa. To get maximum performance, the end nodes need to be within 1,000 feet of the customer; without taking fiber that close, VDSL will only support 24 Mbps. At that speed, it could have a hard time supporting even a single stream of MPEG2 high-definition video. So VDSL remains a weak compromise, and I would argue that the required use of active electronics can make it more expensive in the long run.
Advanced Hybrid Fiber-Coax
Cable companies connect customers to end nodes using coaxial cabling that already carries a hundred TV channels. By extending fiber closer to customers, the cable gains additional bandwidth for HDTV programming, as well as 24-100 Mbps for Internet access. These advanced HFC networks come closer to the bandwidth objectives described here, but they are still viewed as a short-term compromise since cable networks are shared like Ethernet, and that means consumers won’t get sustainable 100 Mbps performance.
Future Wireless (and Powerline) Systems
New access network alternatives based on wireless and powerline technologies will also get a boost from the new FCC rules, but wireless is best used to supplement high-speed fiber networks when mobility is the key decision factor. Wireless is especially interesting since this market segment is so hot with new technologies and business models, including this non-exhaustive list:
· FreeNets set up in neighborhoods and parks using cheap WiFi components and unlicensed spectrum;
· Fixed-point networks that use directional antennas and require line-of-sight;
· Cellular-style networks with electronically steerable antennas that aim a narrow beam of microwave energy at each customer in turn, thus extending both range and performance and improving security;
· Mesh networks, where each client acts as a signal repeater for the next and eliminates the need to build as many antenna towers;
· Enhancements to existing cellular and satellite networks; and
· Use of the high-voltage electric power grid for long-distance backbone coverage.
With the ability to bypass incumbents that control the wiring infrastructure, new wireless services will increase the competition for last-mile access, and this will help drive down prices. Inexpensive wireless products need no line-of-sight and no professional installation (no truck roll), and their performance can scale somewhat. While it’s possible that we’ll see 100 Mbps wireless access networks in five years, today wireless is limited to much less.
Andy Dorman of Network Magazine described the many benefits of wireless and surveyed the various wireless technologies (MMDS, LMDS, 802.16, and 802.11/Wi-Fi), concluding that the use of unlicensed bands and Wi-Fi standards seems the likely wireless winner. That’s due to Wi-Fi’s low cost, its close affinity with wired Ethernet, and its ubiquity in wireless LANs. He also noted the remaining issues of security, QoS, and coverage, and described companies working on them. The coverage issue is being addressed with the wireless switch concept, where bandwidth isn’t shared but steerable antennas logically spin to focus their energy on each client in turn, thus improving range, security, and performance.
What would happen if we were to deploy FTTP in mass, and everyone started using the Internet for video at once? (Not likely.)
Turnpike Effect is a queuing theory term that describes what happens once highway construction ends and more lanes (more capacity) finally open up. It refers to construction projects that were supposed to relieve traffic congestion but never did so, and where new lanes fill up as soon as they are opened.
The turnpike effect happens when highway planners fail to allow for pent-up demand from people forced to find alternate routes to avoid congestion. During construction, bad traffic gets worse as even more people are pushed onto alternate routes, causing even more pent-up demand. Once the new lanes open, those drivers come back, and the expanded highway seems as crowded as ever.
I describe this effect because it could happen with an information highway too, if we only build onramps without a plan for the whole network. Fear of the turnpike effect may contribute to delays in building the last-mile onramps that enable video applications.
Significant obstacles, including powerful special-interest lobbyists and intellectual property rights, will be discussed here.
Special interest groups present different views of how to pave the Information Superhighway, especially the threatened incumbents that use any means to subvert progress that’s not in their best interest. Their tactics include hiring powerful lobbyists to influence policy. Policy makers must understand this, find ways to filter through the various demands, and not succumb to their special interest pressure.
The digital economy brings many benefits but also threatens many incumbent stakeholders.
· Telephone carriers – The voice quality of Internet telephony has greatly improved, and it no longer requires a PC at each end. VoIP decimated long distance profit margins by digitizing standard voice calls and passing them over the unregulated data networks.
· Recording industry and radio broadcasters – Likewise, Internet radio drew the ire of broadcasters and the recording industry, which lobbied for new rules that require Internet radio broadcasters to pay per-song royalties.
· TV broadcasters – And just around the corner, Internet television will challenge the TV broadcasting business models.
· Building landlords – Network infrastructure is critical to business in the digital economy, so older buildings that can’t accommodate it are obsolete, and companies will move out in favor of newer buildings.
· Cities – Just as companies move from old buildings in inner cities to new ones in suburbs, telework lets employees move home, and this threatens to change the whole landscape of cities. Many will resist this trend instead of embracing it.
· Oil industry – Telework also eliminates much of the daily commute, as well as millions of gallons of gas. Car manufacturers also worry about this trend.
· Universities – Even as brick-and-mortar universities struggle to keep up with increasing demand for education, they face new competition from startups specializing in distance learning.
· College professors – Just as the recording industry views the Internet as a tool for piracy, professors worry about protecting their course materials.
How might the Internet affect the future of cities?
The list of affected stakeholders goes on and on, and on. And they each have a lot to lose from the Information Superhighway and digital economy. But they also have much to gain. Let’s first discuss cities as an important stakeholder.
In one view, leading companies will locate themselves in information centers â€“ cities with modern buildings and high-speed fiber infrastructure to support their information needs. That’s like the manufacturers who once settled next to transportation centers (sea ports, rivers, railroads, highways, and airports).
In the opposing view, the deployment of fast networks into growing suburbs and rural environments will encourage companies to move away from the urban centers and allow them to hire workers that may live anywhere â€“ across towns, nations or continents.
We already saw how electricity enabled the development of elevators and air conditioning, and the construction of tall buildings in large cities. With office workers so close together, they could share ideas, meet customers face-to-face, and act on common goals. But telecommunications can make it easy to locate companies anywhere they can get bandwidth, with the same benefits.
Property owners and city planners naturally feel threatened. Futurist George Gilder describes cities as “leftover baggage from the industrial era.” Nicholas Negroponte states, “The post-information age will remove the limitations of geography.” And Frances Cairncross foretells a migration to suburban communities (and a corresponding drop in crime) as homes re-emerge as the center of economic activity.
Besides the threat to professors, distance learning can have unforeseen consequences. Some people worry that it could help other countries catch up economically and eventually pass the U.S., especially if they have better networking infrastructure. It could benefit underdeveloped countries the most, but it still needs funding from capitalistic ones. Distance learning helps displaced U.S. workers upgrade their skills, but some worry that companies like General Electric, Intel, and Microsoft will invest in it to further develop their offshore outsourcing programs and thus shed more U.S. workers.
Intellectual Property Rights
The music industry ignored their digital opportunities and then felt threatened when Napster defined a market for music downloads. But rather than jumping on the electronic bandwagon and building a business with copyright protection and digital rights management, the record labels used legal means to subvert progress and delay the inevitable.
Napster was shut down, and the strong growth in new broadband subscribers slowed greatly and never recovered. Debates over intellectual property rights continue to keep faster growth at bay. The Recording Industry Association of America (RIAA) even went so far as to go after its customers, sending as many as 1 million emails per week to users whose computers held songs enabled for swapping. The RIAA has already filed suit against two college students, alleging that they shared more than 1 million recordings and demanding damages of $150,000 per song.
I am outraged by such heavy-handed tactics but happy to hear that Napster assets were just acquired by a leading company for legitimate purposes. And I’m especially encouraged by two recent announcements in April 2003.
Grokster and StreamCast Networks
A Los Angeles federal judge dismissed a lawsuit against Grokster and StreamCast Networks, saying they can’t be held liable for illegal file trading over their networks. Justifying this decision was the fact that the networks are also used for non-infringing activities, such as distributing movie trailers, shareware, free songs, and other non-copyrighted material. Besides, network operators have no direct knowledge of when illegal trading is taking place on their systems.
The five major record labels recently gave legitimate support to the iTunes Music Store, an Apple-branded online music store that lets users browse a library of 200,000 songs and download single tracks for 99 cents apiece or $9.99 for a complete album. Users can share their music among 3 Macs, download it to iPods, and burn CDs. Each digital copy includes a watermark containing the user’s registration number, so record labels have an audit trail if abuses are discovered. I’m happy that the music industry is finally coming to grips with reality, and I greatly support the use of watermarks, since consumers have genuine needs to copy music to different media for use on their other devices.
Pushing on Market Accelerators
Who should pay to build the Information Superhighway and its high-speed onramps? For guidance, let’s look at the interstate highway system as government-supported infrastructure. The Feds never agreed to extend the roads to each home; that responsibility was left to state and local governments, neighborhood developers, and even individuals. There were interstate highways, farm-to-market roads, toll roads, and private driveways, and each had a different source of funding.
If we apply the same logic to extending fiber networks and broadband Internet access, the responsibility for municipal and last-mile networks should fall on cities, neighborhood developers, and apartment owners, while consumers own the home networks.
When carriers install the networks and collect service revenues from them, cities can define rules that the carriers must live by. Some cities and neighborhoods have installed their own fiber networks (or contracted that task to others) to attract affluent homebuyers and improve the tax base. At least one developer in Austin even runs fiber inside homes. Likewise, universities install high-speed networks in dorm rooms, and apartment owners are forced to follow suit for young renters who have come to expect the same.
While the installation of fiber networks is driven by profit incentive, the installation of many wireless “free nets” is not. Individuals and groups often i