April 28, 2004
Members will hear testimony on suggested revisions of telecommunications law and alternative regulatory frameworks that policymakers should consider in any future reform of telecommunications policy. Senator McCain will preside. Following is a tentative witness list (not necessarily in order of appearance):
The Honorable John McCain
· Today we continue our series of hearings reviewing telecommunications policy. I will be brief so that we can hear from our witnesses. Yesterday, we looked back at the Telecommunications Act of 1996 to identify the successes and failures of that law. Today, we look ahead to consider potential reforms to our telecommunications policy given advances in technology.
· This examination is important because numerous members have discussed reforming the Act. It is imperative that any new legislation provide a more streamlined statutory framework for telecommunications policy in the 21st century – one in which technological innovation can flourish, competition can thrive, and the need for regulation is either eliminated or greatly reduced.
· I thank the witnesses for being here today, and I look forward to your testimony.
Mr. Charles Ferguson
Mr. Adam Thierer
Good morning, my name is Adam Thierer and I serve as Director of Telecommunications Studies at the Cato Institute. Thank you, Mr. Chairman, for your invitation to testify here this morning as the Committee begins the important business of thinking about what the next Telecom Act should look like. As someone who worked closely with members of this Committee a decade ago when we started getting serious about telecom reform, I think it’s safe to say that we all share a sense of frustration and disappointment that we were not able to advance the ball a little further last time around.

If I had to summarize what went wrong with the Telecom Act of 1996, I would use the following paradox: Congress wanted market competition but did not trust the free market enough to tell regulators to step aside and allow markets to function on their own. Consequently, the FCC, the Department of Justice, state and local regulatory commissions, and the courts have spent the last ten years treating this industry as a regulatory plaything to be endlessly toyed with. Today there is virtually no element of telecommunications that is not subject to some sort of meddling by some or all of these regulatory officials.

While it’s fair to say that it was probably wishful thinking to believe we could have undone a century’s worth of command-and-control regulatory policies in a few short years, one would have at least hoped that we would not still be stuck debating the same issues today that dominated the agenda over a decade ago. Indeed, if Rip Van Winkle had fallen asleep in 1994 and woken up in 2004, he wouldn’t think he’d missed a beat if telecom regulation were any guide. But despite the ongoing regulatory quagmire, the good news is that we have witnessed amazing strides in terms of technological progress, and we can confidently say that this marketplace has never witnessed such competitive forces at work.
Whether it’s the wireless revolution that is allowing millions to cut the cord entirely, or the Internet and broadband revolution that is opening up a whole new world of opportunities that did not exist prior to 1996, by almost any measure consumers are better off and have more choices now than ever before. Still, much remains to be done to clear out the regulatory deadwood that continues to hold back further innovation and competition. While there are dozens of important regulatory reform objectives I could outline, in my limited time here today it makes more sense to briefly discuss the three most important overarching themes or priorities that should frame our current thinking about how to reform telecommunications policy. These priorities are:

(1) Rationalizing Regulatory Classifications
(2) Dealing with Jurisdictional Matters
(3) Getting Agency Power and Size Under Control

Regulatory Classifications

With respect to regulatory classifications, a general consensus exists today that Congress will need to formally close the book on the archaic regulatory classifications of the past, which pigeonhole technologies and providers into distinct vertical policy “silos.” That is, we still have Title II for common carriers, Title III for wireless, Title VI for cable, and so on, even though rapid technological change and convergence have largely wiped out such distinctions and pitted these formerly distinct sectors against one another in heated competition for consumer allegiance. Thus, although the communications and broadband marketplace is becoming one giant fruit salad of services and providers, regulators are still separating out the apples, oranges, and bananas and regulating them differently. This must end. One way to do this is to replace the vertical silos model with a “horizontal layers” model that more closely resembles the way the new marketplace operates.
We can divide the new industry into at least four distinct layers: (1) Content; (2) Applications; (3) Code; and (4) Infrastructure, and, if we must regulate, regulate each accordingly. But I would caution Congress against formally enshrining a network layers model as a new regulatory regime. While this model provides a useful analytical tool to help us rethink and eliminate the outmoded policy paradigms of the past, we would not want these new layers to become the equivalent of rigid regulatory quarantines or firewalls on industry innovation or vertical integration.

A second and better way to tear down the old regulatory paradigms and achieve regulatory parity would be to borrow a page from trade law and adopt the equivalent of a “most favored nation” (MFN) principle for communications. In a nutshell, this policy would state: “Any communications carrier seeking to offer a new service or entering a new line of business should be regulated no more stringently than its least regulated competitor.” This would allow us to achieve regulatory simplicity and parity not by “regulating up” to put everyone on equally difficult footing but rather by “deregulating down.” Given the confusion over the Brand X court case and the ongoing FCC investigation into a Title I “information services” classification for broadband, this most-favored-nation approach might help us bring some resolution to this difficult issue.

Jurisdictional Matters

Next we come to jurisdictional matters, which could very well end up being the most controversial issue this Committee will take up if you choose to re-open the Telecom Act. Here I am speaking of the heated debate among federal, state, and local regulators for control over the future of communications policy. As I noted in my 1998 book The Delicate Balance: Federalism, Interstate Commerce and Economic Freedom in the Information Age, decentralization of political power almost always has a positive effect in terms of expanding human liberty.
But as our Founders wisely realized when penning the Constitution, there are some important exceptions to that general rule. Let me be perfectly blunt on this point: telecommunications regulation is one of those cases where state and local experimentation doesn’t work so well. After all, at the very heart of telecommunications lies the notion of transcending boundaries and making geography and distance irrelevant. If ever there was a good case to be made for an activity being considered interstate commerce, this is it. And yet America’s telecom market remains riddled with a patchwork of policies that actually thwart that goal and seek to divide the indivisible and place boundaries on the boundless.

This must end. And the only way it will end is by Congress taking the same difficult step it had to take when deregulating airlines, trucking, railroads, and banking: pre-emption. We must get serious about the “national policy framework” mentioned in the preamble of the Telecom Act by comprehensively pre-empting state and local regulation in this sector. The rise of wireless and Internet-based forms of communications makes this an absolute necessity. If you feel compelled to leave some authority to state regulators, why not devolve to them any universal service responsibilities that continue to be deemed necessary? This is one area where experimentation can work, if the states devise targeted assistance mechanisms. But they should not be allowed to impose regulatory restraints or levies on interstate communications to do so.

Agency Power

My third and final “big picture” reform involves what may have been the most glaring omission from the Telecom Act of 1996: the almost complete failure to contain or cut back the size and power of the FCC. Again, we would do well to remember the lessons of the past.
When Congress deregulated airlines, trucking, and railroads, lawmakers wisely realized that comprehensive and lasting reform was possible only if the agencies that oversaw those sectors were also reformed or even eliminated. In the telecom world, by contrast, the FCC grew bigger and more powerful in the wake of reform: spending rose 37 percent, the number of pages in the FCC Record tripled, and there were 73 percent more telecom lawyers after the Act than before. It is safe to say that you cannot deregulate an industry by granting regulators more power over that industry. This too must end.

The next cut at a Telecom Act must do more than just hand the FCC vague forbearance language with the suggestion that the agency take steps to voluntarily regulate less. We can’t expect the regulators to deregulate themselves. We need clear sunsets on existing FCC powers, especially the infrastructure-sharing provisions of the last Act. And then we need to impose sunsets on any new transitional powers we grant the agency in the next Telecom Act. And we need funding cuts too. If we fail to do so, we’ll likely be sitting here in 10 years having this same conversation all over again.

Conclusion: Ending the “Chicken Little Complex”

In conclusion, it is my hope that Congress rejects the many doomsayers and naysayers in the telecom sector who claim the sky will fall without incessant regulatory oversight and intervention. A “Chicken Little complex” seems to run rampant throughout this sector even though it is less warranted than ever before. We have a chance to make more than just a clean break with the past; we have the chance now to close the book on a regulatory past that has done little to truly benefit consumers. Regulators have been given over 100 years to conduct a grand experiment with the telecom sector. Why not give markets a chance for once? Thank you, and I’m happy to take any questions you may have.
The Honorable Reed E. Hundt
Mr. Raymond Gifford
Mr. George Gilder
Mr. Chairman and Senator Hollings, thank you for the opportunity to appear before your committee today. Your selected topic is crucial to the well-being of the U.S. and global economies, and I appreciate your deep interest in the subject. Overthrowing matter and media with the new worldwide web of glass and light and air should be a happy and defining event in the history of man. Global information networks offer unprecedented potential opportunities for economic growth, cultural revival, and individual freedom and empowerment. Yet the United States has in large part blocked the path of the technologies and companies needed to consummate this vast new infrastructure of chips, fiber optics, antennae, digital storage, and software. Although American companies invented almost all the technologies crucial to the Internet, we have fallen behind many other nations in the deployment of these technologies.

The U.S. now ranks eleventh internationally in residential “broadband” access. Using the FCC’s silly 200-kilobit-per-second definition, some now say that 25 percent of American homes have broadband. But by the standards of Asia—where most citizens enjoy access speeds 10 times faster than our fastest links—U.S. residences have no broadband at all. U.S. businesses have far less broadband than South Korean residences. South Korea, for instance, has 40 times the per capita bandwidth of the U.S. Japan is close behind Korea, and countries from China to Italy are removing obstacles to the deployment of vDSL, fiber-to-the-home, and broadband wireless networks.

Asian broadband also proves there was no Internet “bubble.” Today, Korea runs between three and five times as large a share of its economy over the net as we do. Riding the bus to work, Koreans watch television news and exchange video mail over their mobile phones. They enjoy full-motion video education and entertainment in their homes.
Many of the dot-coms that failed in America due to the lack of robust broadband links are thriving in Korea. Consider that by this time next year Verizon Wireless’s 38 million customers will enjoy faster Internet access via their mobile phones than through their Verizon DSL connections to their homes. Only the most severe disincentives to invest could have yielded such a result, which defies the laws of physics.

The American Internet “bubble” was actually a crisis of policy. The Telecom Act of 1996 was meant to “deregulate” America’s telecom infrastructure and technologies, the most dynamic sectors in the entire world economy. But after the usual lobbying and horse-trading, the Act turned into a million-word re-regulation of the industry. Regulatory actions by the FCC and the 51 state utility commissions greatly exacerbated the bad parts of the Act and distorted many of the good parts. As I predicted the day after it was enacted, the result was a carnival of lawyers, micro-mismanagement by bureaucrats, price controls, the socialization of infrastructure, the screeching halt of innovation and investment in the “last-mile” local loop—and the Great Telecom and Technology Crash of 2000-2003.

In the last year or so, the FCC has partially reversed some of its most egregious errors. Some are still being adjudicated in the courts. But U.S. telecom remains a highly regulated, highly taxed sector of our economy. The mistakes of the last 10 years have greatly harmed the U.S. economy, and continued gridlock and inaction threaten to shift American leadership in technology to Asia, which has embraced the Internet with open arms. Today, just as the telecom and technology sectors exit a three-year depression, we are in danger of repeating the very worst mistakes of the 1996 Telecom Act, but this time on an even grander scale. In today’s testimony I will address and refute one particular proposal that is being offered as the basis for the new telecom legislation.
In doing so I hope also to offer an alternative vision.

The new “big idea” in telecom regulation comes from a host of learned and experienced telecom thinkers: the likes of former FCC authority Kevin Werbach, Stanford law professor and technology author Lawrence Lessig, industry analyst Roxanne Googin, and IPioneer Vint Cerf, to name just a few. The idea is mandated “open access” to the logical layers of the network, and it is embodied in a new legislative proposal by MCI, “A Horizontal Leap Forward: Formulating a New Public Policy Framework Based on the Network Layers Model.” A horizontal layers approach would supposedly be a radical shift from the “vertical silos” approach now used, where telephony, cable, and wireless, for example, are regulated based on historical industry definitions, not generic functional categories. The common denominator of Internet Protocol (IP)—supposedly the basis for all future communications networks—is said to necessitate the new layered regulatory approach.

Barely recovering from the FCC’s TELRIC and UNE-P “open access” mandates that chopped up and assigned ownership rights to the physical infrastructure—the hardware—of the Net, we now face the prospect of rigid reassignment of content, applications, services, and protocols, too. Whatever it is called, it represents more micromanagement of a dynamic industry in the midst of major technological transitions. The new proposal feeds on fear—fears that cable TV companies or the Bells might seek to leverage their broadband networks by wrapping content into their conduits, or that Microsoft might keep “tying” new applications into Windows, or that Google might monopolize information on the Net (yes, there is already an organized effort to turn Google into a public utility). MCI’s layering proposal defines rigid boundaries between content (voice, text, video), applications (e-mail, browsers, VoIP), protocols (TCP/IP, HTTP, FTP), and infrastructure (wires, switches, spectrum, PCs, handsets).
In a paper entitled “Codifying the Network Layers Model,” MCI proposes to “quarantine” major providers of one of the layers within that layer, and to prohibit them from vertically integrating into another layer unless they offer wholesale open access to all competitors. Lessig, MCI, and company worry that the “end-to-end” nature of the Internet—where any terminal attached to the net can be reached from any other terminal—will be threatened if these new layering rules are not adopted.

Layering proponents, however, make a fundamental error. They ignore the ever-changing trade-offs between integration and modularization that are among the most profound and strategic decisions any company in any industry makes. They disavow Harvard Business School professor Clayton Christensen’s theorems that dictate when modularization, or “layering,” is advisable, and when integration is far more likely to yield success. For example, the separation of content and conduit—the notion that bandwidth providers should focus on delivering robust, high-speed connections while allowing hundreds of millions of professionals and amateurs to supply the content—is often a sound strategy. We have supported it from the beginning. But leading-edge undershoot products (ones that are not yet good enough for the demands of the marketplace), like video-conferencing, often require integration.

Metaphors from the Telecosm help explain the fluid nature of these layers that MCI wants to preserve in concrete. Consider Corvis, our favorite optical equipment company and national fiber optic bandwidth provider. It blows apart the MCI approach on several fronts. First is CEO David Huber’s architecture of an all-optical network, devoid of electronic regenerators and protocol readers, which unites content and conduit by using colors of light both to bear the message and to determine the path of the circuit.
It radically collapses the top layers of the OSI (Open Systems Interconnection) stack used in the SONET voice and data networks of the past, not so much redefining the interfaces as transcending them. A “switchless” web of always-on fixed lambdas (wavelengths of light) can function as both the physical and logical layers of the Net because the intelligence is embedded in the path. There will be some controlling devices at the edge of the network, and IP will still be widely used, but the heyday of IP packet-switched networks may well be over. Typically, government enshrines the past in the name of progress. In uniting Corvis, a cutting-edge equipment provider, with Broadwing, an infrastructure builder and service provider, Huber is also betting that IP networks are not inherently modular, where equipment from a thousand providers can easily be cobbled together to deliver high-bandwidth, low-latency services, but that networks are still in fact in an era of undershoot, where an integrated provider can deliver a superior product at a much lower cost.

Our favorite digital chip company, EZchip, also explodes the idea that the layers of the Net can always be defined and “quarantined.” Where until now data flowing through the seven layers and numerous sub-layers were parsed and modified by a gaggle of hundreds of chips connected by thousands of wires and glue logic galore, EZchip puts all seven layers of the OSI stack onto one chip, performing all the essential functions of an Internet router on a single sliver of silicon. The “layers” are once again transcended when EZchip’s software tools allow programmers to tell the chip what to do without even referring to the rigid layers, channelizations, protocols, and interfaces used in the previous software environment. Is this fair?
Should EZchip be allowed to invade someone else’s turf, perhaps that of Cypress’s high-end content addressable memories (CAMs) or Broadcom’s Silicon Spice communications processors or the sacred code of the OSI idol? Or to blow apart someone’s whole field, as EZchip could one day do to the many providers of communications ASICs (application-specific integrated circuits), or to Internet router king Cisco itself?

It might be said that the “layering” proposals now in circulation are yet another (if more clever) attempt by competitors to target the Bell telephone and cable TV companies. Indeed, MCI’s own paper implies the cable companies (bundling network, ISP, and content) and the Bells (bundling network, ISP, and voice) are already stomping all over the layers, creating a muddy (and hopefully one day illegal!) mishmash of vertical integration. What a coincidence that the activities of its rivals violate MCI’s framework and cry out for cleansing and re-ordering (read: structural separation, consent decrees, price controls, divestiture) by new teams of FCC horizontalawyers and IPolice.

But if the proposals are meant as anything more than political lobbygagging of rivals, if the proponents really mean their model legislation as a principled, generic set of rules, then we must consider the logical consequences of such new laws. If applied dispassionately, how would such general rules affect the rest of the Internet, communications, and technology industries? Should Google be able to leverage search into Gmail, or to supply content using its proprietary algorithms and its physical network of 100,000 servers? Shouldn’t any rival search provider be able to feed off of Google’s advanced infrastructure? After all, wouldn’t it be impossible to recreate Google’s massive web of global intelligence? Doesn’t Google’s superior infrastructure exhibit “market power”?
Might Google actually evolve into a general provider of web-based information management services, rivaling the PC-based Microsoft, or should Google be “quarantined” as a search provider? Or maybe we should structurally separate Google into three companies: an infrastructure provider (its 100,000 networked servers plus algorithmic IP), a content/advertising company, and an information services company (Gmail plus future knowledge management applications). Surely FCC bureaucrats can make these easy distinctions and explain the resulting penalties to weary entrepreneurs who have just spent 10 years of their lives building a new service that people really like.

Should Sony be able to demand that its PlayStation gamers get access to Microsoft’s Xbox Live online video game network? Should Amazon be able to aggregate and make searchable the text of hundreds of thousands of books? Should Sprint PCS or Verizon Wireless be allowed to develop specialized content delivery platforms or applications that take advantage of their superior wireless data networks? Sprint was the first to build its own photo-sharing platform, and it is apparently the most user-friendly wireless photo-sharing system. Can we let such infrastructure-leveraging stand?

What if Equinix (the data center company that almost defines the integration of the physical, protocol, application, and content layers of the Net) succeeds in becoming the overwhelming meeting place (peering point) for the world’s network, e-commerce, and content providers? Network economics suggest the concentration of all the largest Internet players in Equinix facilities is possible, or even likely. If Equinix achieves such “market power,” are we to assume that other “virtual data centers,” like the CLECs before them, could force Equinix to “open up” its hosting facilities so that the new virtual competitors can offer services over infrastructure they did not build?
Why should anyone build risky and expensive new infrastructure if it can be readily used by competitors? What about Microsoft integrating easy-to-use voice-over-IP software into its next operating system? Should Microsoft rival RealNetworks be barred from aggregating music and video for download with its RealPlayer multimedia suite? All of these are, to one degree or another, inter-layer integrated products and services.

Proponents of “layering,” or “Net neutrality,” or a free Internet “commons,” assume there is one network, that it is sufficient and timeless, and that no new networks are possible or needed. They want innovation on the edge, in the form of software apps and Wi-Fi attachments. Innovation in the core is either assumed or ignored. The logical conclusion, however, is that since the “best network”—the free commons—cannot make any money, there will be no network. And just how much innovation at the edge will there be if there is no innovation—no bandwidth—in the core?

MCI’s “horizontal leap” asks authorities to pursue vigilantly those who would exploit “network choke points” or take advantage of “network effects.” In industries where “entities seek to obtain market power” (i.e., seek to make money in a business enterprise), policymakers need to ensure four things: “open architecture, open access, universal access, and flexible access.” When imposed by regulators or courts in a national capital, these four euphemisms boil down to one hard reality: socialization and micromanagement of the “architectures” and “access” networks built by others.

The ability to tie and merge and break apart and outsource products, services, and technologies is the very stuff of business. As is the ability to pursue an unguaranteed return on one’s risky investment. As is the decision of how to price these products and services. Some services will be bundled. Some will be free, loss leaders to leverage the purchase at another point of sale. But the entire system cannot be free.
Everybody else’s product or service, except one’s own, cannot be a commodity, barred from bundling or profit. The companies that enable this broadband world will be able to charge for it during the years that they provide the optimal service. Their initial margins will be high. When communications becomes a commodity, as it eventually will, the margins will drop. This is not a catastrophe. No one has a right to high margins for a commodity service.

But the Telecosm is still an arena of innovators, such as Corvis, EZchip, Qualcomm, Verizon Wireless, Essex, AFCI, Agilent, and hundreds of others, who will enjoy large monopoly rents until their inventions are standardized and commoditized and the leading edge moves elsewhere. The telecom industry is nowhere near some mythical paradox of perfection or cul-de-sac bargain basement of commoditization. It is still engaged in a thrilling adventure of putting together worldwide webs of glass and light that reach from your doorstep or teleputer to every other person and machine on the planet. It is long distance and it is local, it is packeted and circuited, it is multithreaded and aggregated, it is broadband and narrowcast, all at once. These crystal palaces of light and air will be hard to build, and the world will reward the pioneers who manage to build them.

The real threat to monopolize and paralyze the Internet is not the communications industry and its suppliers, but the premature modularizers and commoditizers, the proponents of the dream of some final government solution for the uncertainties of all life and commerce.