In case you've never seen it, this site is worth a visit. The MINTS project's objective is to collect and study public data about the growth rates of Internet traffic. Notably, they conclude that traffic growth, while substantial, is far below the hype ... typically 50-60% per year.
02 December 2008
To assess compliance and risk, a stratified random sample of 390 service area providers (represented by unique Study Area Codes, or "SACs") was drawn and compliance attestation examinations/audits were completed. Audit data were provided for 384 auditees. Under IPIA standards, a program is "at risk" if the erroneous payment rate exceeds 2.5% and the amount of erroneous payments exceeds $10 million. The estimated erroneous payment rate for this HC audit cycle was 23.3% and the margin of error was 2.3% at the 90% level of confidence. The statistical estimate of erroneous HC payments during FY 2006 is $970.3 million. The rate of overpayment out of total disbursements was 22.8% with a margin of error of 2.3% at the 90% confidence level. As a consequence, statistical results from this sample indicate that the HCF USF program is "at risk" as defined by the IPIA.
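As a rough sanity check on the quoted figures, the margin of error for a proportion can be approximated with the usual simple-random-sample formula. This is my own back-of-the-envelope sketch, not from the report; the audit used a stratified, dollar-weighted design, which is why its reported 2.3% margin is tighter than this naive estimate.

```python
import math

# Naive margin of error for a proportion at the 90% confidence level.
# The audit's stratified design yields a tighter bound than this
# simple-random-sample approximation.
def margin_of_error(p, n, z=1.645):
    """z = 1.645 is the two-sided normal critical value for 90% confidence."""
    return z * math.sqrt(p * (1 - p) / n)

p = 0.233   # estimated erroneous payment rate
n = 384     # auditees for which data were provided
moe = margin_of_error(p, n)
print(f"naive 90% margin of error: {moe:.1%}")  # ~3.5%
```

The naive estimate (about 3.5%) exceeds the reported 2.3%, which is consistent with the audit's more efficient stratified sampling.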
Elsewhere in the report (p. 21), this was elaborated further:
The rate of improper over payment is 22.8%, and the proportion of improper over payments out of total improper payments is 98.2%.
So, basically, when there was an improper payment, it was almost always an overpayment (are you surprised?). We (users of the telephone network) paid almost 23% more than we needed to. Put another way, almost one quarter of the collected funds (approximately $4.4 billion in FY07) were overpayments!
The US Inspector General, in their report to Congress, noted:
The results from Round 1 and the preliminary results from Round 2 have not lessened our concern about the possibilities for fraud, waste, and abuse in the Commission's USF programs as administered by USAC.
Thus, this is an ongoing problem that hasn't (apparently) gotten any better since the last audit!
24 November 2008
04 November 2008
Given that only a small percentage of households receive over-the-air TV, shouldn't we really be discussing doing away with that technology, and encouraging broadcasters to change their business model to being pure programming providers, instead of bundling programming with delivery techniques?
The reason this is important to discuss now is that many of the white spaces devices are optimized for the particulars of the TV spectrum. If this becomes unlicensed use, then the existing channelization becomes effectively locked in, because it is difficult to coordinate disparate unlicensed users to transition to a new channelization regime. Is this in the best long-term interest?
03 November 2008
I found this item over at Ars Technica interesting. To me, this is an interesting case of the technology hype cycle ... lots of initial promise followed by a (sometimes temporary) decline until the applications are well understood. In this case, it seems that fragmentation of the sponsorship base was a key contributor to this setback.
As of this writing, the link has been temporarily restored, but you can expect that this will make waves again. It can be quite difficult for a carrier to maintain the traffic levels needed for peering, especially with a large ISP like Sprint.
28 October 2008
My guess is the former is true. It will be hard for them to compete with nationwide, facilities-based carriers with established brands. Remote control of entertainment is a relatively new idea (and it may not be a bad one), for which future returns are quite uncertain. Furthermore, how difficult would it be for someone to develop an app that runs on Windows Mobile, Symbian, etc. that could do essentially the same thing?
Olga Kharif over at BusinessWeek is also skeptical, though for different reasons.
10 October 2008
For the past several months, there has been much discussion about carriers such as Comcast imposing bandwidth caps on their users. The stated motivation for this is to rein in the highest bandwidth users, and perhaps to set the stage for price discrimination, where users would pay more for higher caps. To this end, there are a few items that I would like to bring to your attention.
This item over at BusinessWeek takes a look at the rationale for bandwidth caps. Despite the claims of congestion, the article does not find evidence of the feared growth in traffic.
Secondly, this item over at GigaOm points to a recent white paper on the subject, and then to some possible unintended consequences of this approach. By the way, here is a critique of the cited paper over at the Tech Liberation Front; Tim Lee is a fan of metered usage.
David Clark's paper on the incremental cost of IP service at the recent Telecommunications Policy Research Conference was quite interesting, so this item over at Telegeography caught my eye. According to their research, transit prices have been declining over the past year and vary significantly across the world. The figure below captures some of this data.
24 September 2008
IBM has announced a new policy with regard to standards participation. There are a couple of interesting things about this. Notably, the policy was developed by IBM employees using a wiki. Quoting the policy:
The tenets of IBM's new policy are to:
- Begin or end participation in standards bodies based on the quality and openness of their processes, membership rules, and intellectual property policies.
- Encourage emerging and developed economies to both adopt open global standards and to participate in the creation of those standards.
- Advance governance rules within standards bodies that ensure technology decisions, votes, and dispute resolutions are made fairly by independent participants, protected from undue influence.
- Collaborate with standards bodies and developer communities to ensure that open software interoperability standards are freely available and implementable.
- Help drive the creation of clear, simple and consistent intellectual property policies for standards organizations, thereby enabling standards developers and implementers to make informed technical and business decisions.
To me, it will be interesting to see how quickly IBM moves to adhere to this policy. Suppose there were a consortium that did not meet one of the tenets (e.g., the first one) but operated in an area of vital business interest to IBM. I wonder how they would respond?
Given this data, is it any wonder that carriers would try to hike rates? After all, text volume seems remarkably inelastic given prior price increases. Economics suggests that they would raise prices in this case!
19 September 2008
I found this article over at BusinessWeek interesting (even if it didn't explicitly discuss telecom). I think this article supports R.B. Horwitz's contention that regulation is often introduced at the behest of the regulated industry to stabilize markets. Pure capitalism results in large uncertainties and almost demands periodic business failures as a form of market discipline. Regulation works to ease those pressures and, if done well, provides a predictable marketplace (which is generally supportive of investment).
So, if this article is correct, the recent market turmoil may usher in an era of increased regulation. Will telecom be included in this trend?
17 September 2008
15 September 2008
I found this report interesting (free registration required). It reports data as seen by Akamai, which serves many of the Internet's web sites. While the data are interesting, there are some notable countries missing ... Russia, for example ... so, while of interest, it may be of limited value.
11 September 2008
In this previous post, I had discussed rising text message rates in the US. According to this article in Forbes, the issue has gotten the attention of Sen. Herb Kohl (D-WI) as well. Here is the announcement from his website. While specific action is unlikely in this session given the upcoming election, Forbes notes a possible connection with the EU's action on mobile phone price regulation.
While Kohl blames this on industry consolidation, I believe there are other interpretations that should be considered. For one, the price increase is for a la carte messages, not for packages or bundles of messages. Second, carriers often offer cheap or free messages between users on their own networks as a way to increase switching costs. Should they be prevented from doing this?
08 September 2008
This report, published by the UK Broadband Stakeholders Group, provides upper and lower estimates of the cost of deploying fiber-based broadband throughout the UK: from GBP 5.1 billion to GBP 28.8 billion (US$9B to US$50B), depending on the architecture used. The lower estimate is for fiber to the cabinet, while the upper one is for fiber to the home. The report itself (available here) is a good example of a carefully done cost study.
Cloudonomics Law #1: Utility services cost less even though they cost more. An on-demand service provider typically charges a utility premium — a higher cost per unit time for a resource than if it were owned, financed or leased. However, although utilities cost more when they are used, they cost nothing when they are not. Consequently, customers save money by replacing fixed infrastructure with clouds when workloads are spiky, specifically when the peak-to-average ratio is greater than the utility premium.
Cloudonomics Law #2: On-demand trumps forecasting. The ability to rapidly provision capacity means that any unexpected demand can be serviced, and the revenue associated with it captured. The ability to rapidly de-provision capacity means that companies don’t need to pay good money for non-productive assets. Forecasting is often wrong, especially for black swans, so the ability to react instantaneously means higher revenues, and lower costs.
Cloudonomics Law #3: The peak of the sum is never greater than the sum of the peaks. Enterprises deploy capacity to handle their peak demands – a tax firm worries about April 15th, a retailer about Black Friday, an online sports broadcaster about Super Sunday. Under this strategy, the total capacity deployed is the sum of these individual peaks. However, since clouds can reallocate resources across many enterprises with different peak periods, a cloud needs to deploy less capacity.
Cloudonomics Law #4: Aggregate demand is smoother than individual. Aggregating demand from multiple customers tends to smooth out variation. Specifically, the “coefficient of variation” of a sum of random variables is always less than or equal to that of any of the individual variables. Therefore, clouds get higher utilization, enabling better economics.
Cloudonomics Law #5: Average unit costs are reduced by distributing fixed costs over more units of output. While large enterprises benefit from economies of scale, larger cloud service providers can benefit from even greater economies of scale, such as volume purchasing, network bandwidth, operations, administration and maintenance tooling.
Cloudonomics Law #6: Superiority in numbers is the most important factor in the result of a combat (Clausewitz). The classic military strategist Carl von Clausewitz argued that, above all, numerical superiority was key to winning battles. In the cloud theater, battles are waged between botnets and DDoS defenses. A botnet of 100,000 servers, each with a megabit per second of uplink bandwidth, can launch 100 gigabits per second of attack bandwidth. An enterprise IT shop would be overwhelmed by such an attack, whereas a large cloud service provider — especially one that is also an integrated network service provider — has the scale to repel it.
Cloudonomics Law #7: Space-time is a continuum (Einstein/Minkowski). A real-time enterprise derives competitive advantage from responding to changing business conditions and opportunities faster than the competition. Often, decision-making depends on computing, e.g., business intelligence, risk analysis, portfolio optimization and so forth. Assuming that the compute job is amenable to parallel processing, such computing tasks can often trade off space and time, for example a batch job may run on one server for a thousand hours, or a thousand servers for one hour, and a query on Google is fast because its processing is divided among numerous CPUs. Since an ideal cloud provides effectively unbounded on-demand scalability, for the same cost, a business can accelerate its decision-making.
Cloudonomics Law #8: Dispersion is the inverse square of latency. Reduced latency — the delay between making a request and getting a response — is increasingly essential to delivering a range of services, among them rich Internet applications, online gaming, remote virtualized desktops, and interactive collaboration such as video conferencing. However, to cut latency in half requires not twice as many nodes, but four times. For example, growing from one service node to dozens can cut global latency (e.g., New York to Hong Kong) from 150 milliseconds to below 20. However, shaving the next 15 milliseconds requires a thousand more nodes. There is thus a natural sweet spot for dispersion aimed at latency reduction, that of a few dozen nodes — more than an enterprise would want to deploy, especially given the lower utilization described above.
Cloudonomics Law #9: Don’t put all your eggs in one basket. The reliability of a system with n redundant components, each with reliability r, is 1 - (1 - r)^n. So if the reliability of a single data center is 99 percent, two data centers provide four nines (99.99 percent) and three data centers provide six nines (99.9999 percent). While no finite quantity of data centers will ever provide 100 percent reliability, we can come very close to an extremely high reliability architecture with only a few data centers. If a cloud provider wants to provide high availability services globally for latency-sensitive applications, there must be a few data centers in each region.
Cloudonomics Law #10: An object at rest tends to stay at rest (Newton). A data center is a very, very large object. While theoretically, any company can site data centers in globally optimal locations that are located on a core network backbone with cheap access to power, cooling and acreage, few do. Instead, they remain in locations for reasons such as where the company or an acquired unit was founded, or where they got a good deal on distressed but conditioned space. A cloud service provider can locate greenfield sites optimally.
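Two of these laws are easy to check numerically. The sketch below is my own illustration, not from the original piece: it evaluates Law #9's redundancy formula and simulates Law #4's claim that aggregated demand has a lower coefficient of variation than any individual workload.

```python
import math
import random

# Law #9: reliability of n redundant components, each with reliability r.
def system_reliability(r, n):
    return 1 - (1 - r) ** n

for n in (1, 2, 3):
    print(n, round(system_reliability(0.99, n), 6))
# 1 -> 0.99 (two nines), 2 -> 0.9999 (four nines), 3 -> 0.999999 (six nines)

# Law #4: coefficient of variation (stddev / mean) shrinks under aggregation.
def cv(xs):
    mean = sum(xs) / len(xs)
    var = sum((x - mean) ** 2 for x in xs) / len(xs)
    return math.sqrt(var) / mean

random.seed(1)
# 20 hypothetical customers, each with 1,000 periods of variable demand
workloads = [[random.uniform(10, 100) for _ in range(1000)] for _ in range(20)]
aggregate = [sum(period) for period in zip(*workloads)]
print(f"individual CV: {cv(workloads[0]):.3f}, aggregate CV: {cv(aggregate):.3f}")
```

The aggregate's coefficient of variation comes out far smaller than any single workload's, which is the statistical basis for the higher utilization claimed in Law #4.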
05 September 2008
In an earlier post, I had discussed Australia's plans for constructing an (apparently) subsidized broadband network. According to this site, proposals for this network will be due in November 2008. Do you think this is a reasonable approach, or should broadband be privately provided through market incentives?
Those of you who like to keep score might find this item interesting. It would be interesting to take a comparative look at pricing for these services as well. In any case, I guess it pretty much renders moot the discussions about the different airtime charging schemes (caller pays vs. mobile pays) and multiple standards!
28 August 2008
25 August 2008
Microsoft's Silverlight technology and rival Adobe's Flash format are currently locked in a race over who delivers the world's online video, but the ultimate prize may be who powers the next generation of Web software.
Using Silverlight, the NBC site offers a glimpse of what is possible with future Web applications because viewers are able to watch up to four videos at once or follow the action with an online commentary that runs alongside the video.
By building up Silverlight's user base, the world's largest software maker is looking to win over developers who see Web platforms such as Silverlight and Flash as a new way to deliver powerful Web-linked programs incorporating rich graphics.
Microsoft, which said nearly half the visitors to NBC's site did not have Silverlight, plans to expand its reach to close the gap on Flash, which is already running on most of the world's Web-connected computers and powers over 80 percent of the video on the Internet.
Taking advantage of Flash, Silverlight and other more simple Web-coding technologies such as AJAX, a new breed of interactive Web software -- known as rich Internet applications (RIAs) -- has emerged.
Like other Web applications, RIAs are cheaper to deploy and maintain than traditional software, but they differ from more simple Web programs by employing rich graphics, running faster and creating a seamless experience that does not require the application to constantly reload or refresh.
Gartner analyst Ray Valdes said 90 percent of the top global 1,000 companies have yet to deploy any sort of RIA, while 90 percent of the top 100 consumer Web sites have already done so using the nonproprietary and more simple AJAX format.
That opportunity has Microsoft eyeing current leader Adobe for business that extends beyond Silverlight and into the sale of design tools along with server and database software to enable these new applications.
Microsoft is approaching Silverlight from the opposite direction. It plans to take advantage of its legions of outside developers experienced in writing for its ubiquitous Windows operating system.
The next version of Silverlight, being tested now and due later this year, will support Microsoft's .NET framework -- tools used by developers to create desktop applications that work on Windows.
This item over at Ars Technica is interesting. With the IPv4 address space nearing exhaustion, there has been increasing interest in and attention to converting to IPv6. This article shows that this transition is barely under way:
[The researchers] measured traffic flowing through almost 2,400 routers, amounting to no less than 4.5 terabits per second. The results: about 12 megabits per second of Teredo traffic, which was about 10 percent of the protocol 41 traffic, for a total of some 117Mbps.
The report cites studies that estimate the amount of native IPv6 traffic as 10 to 75 percent, so the total amount of IPv6 traffic would be 130 to 470Mbps. The measured IPv6 traffic constitutes 0.0026 percent of the total traffic.
... the Amsterdam Internet Exchange, which is one of the three big public interconnects between ISPs in Europe, does count native IPv6 traffic, which amounts to some 450Mbps out of a total of 370Gbps, or about 0.12 percent.
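The quoted shares are easy to reproduce from the article's raw numbers; a quick back-of-the-envelope check:

```python
# Back-of-the-envelope check of the article's IPv6 shares.
total_bps = 4.5e12         # 4.5 Tbps observed across ~2,400 routers
tunneled_ipv6_bps = 117e6  # Teredo plus protocol-41 traffic, ~117 Mbps

share = tunneled_ipv6_bps / total_bps
print(f"tunneled IPv6 share of total traffic: {share:.4%}")  # 0.0026%

# AMS-IX: 450 Mbps of native IPv6 out of 370 Gbps total
ams_share = 450e6 / 370e9
print(f"AMS-IX native IPv6 share: {ams_share:.2%}")  # ~0.12%
```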
22 August 2008
So, an important question is, given the relatively poor teledensity of Africa, why would these large investments be made to provide connectivity there?
16 July 2008
The high-cost program’s structure has resulted in the inconsistent distribution of support and availability of services across rural America. The program provides support to carriers in all states. However, small carriers receive more support than large carriers. As a result, carriers serving similar rural areas can receive different levels of support. Currently, the high-cost program provides support for the provision of basic telephone service, which is widely available and subscribed to in the nation. But, the program also indirectly supports broadband service, including high-speed Internet, in some rural areas, particularly those areas served by small carriers. The program provides support to both incumbents and competitors; as a result, it creates an incentive for competition to exist where it might not otherwise occur.
There is a clearly established purpose for the high-cost program, but FCC has not established performance goals or measures. GAO was unable to identify performance goals or measures for the program. While FCC has begun preliminary efforts to address these shortcomings, the efforts do not align with practices that GAO has identified as useful for developing successful performance goals and measures. For example, FCC has not created performance goals and measures for intermediate and multiyear periods. In the absence of performance goals and measures, the Congress and FCC are limited in their ability to make informed decisions about the future of the high-cost program.
This report is highly critical of the FCC and its management of these monies. There is no statement on the FCC Website (that I have been able to find, anyway).
There were some useful facts buried in the article that I'd like to pull out, though:
... India is trying to increase household PC penetration, which is currently at just 2 PCs for every 100 households, says the technology trade group NASSCOM, and broadband connectivity, an abysmal 4 million connections, vs. China's 3.2 million new connections every quarter, according to BNP Paribas. Even Vietnam, with a population of just 84 million, is signing up 120,000 new broadband users per month, according to IDC.
15 July 2008
02 July 2008
Is it because the cost of text messages has increased? Hardly. A more reasonable conclusion is that carriers are charging more because they can. Apparently, text message demand is fairly inelastic. Is this a case of implicit collusion among the larger carriers? Possibly, though not necessarily. Even though Sprint led the price increases this time, I would imagine that one strategy a carrier could pursue to regain market share would be to decrease the price and become the preferred carrier of the (demographically younger) text messaging fans.
01 July 2008
Very interesting. I will have to search out similar data in the US. Anecdotal evidence is that payphones are disappearing in the US, which would make it difficult to use them! Do you have any data about payphone use elsewhere in the world?
27 June 2008
In my cursory review, there was one datum -- households with mobile-only access -- that I wanted to compare with the US. The result? According to the EU report, the number of wireless-only households grew to 24% from 22% the year before (figure from p. 10 of the report).
In this item, I had reported a similar study of US households by the CDC. It shows that 14.5% of US households are in a similar position, up from 12.6% of households in the previous year. So, the trends are in the same direction on both sides of the Atlantic, though the percentages are different. Here is how it breaks out by country:
20 June 2008
- This report on convergence is a nice summary of the implications of NGNs, with ideas for regulators to consider.
- This statistical profile to support discussions at the meeting.
In addition, this report was released as an outcome of the ministerial meeting, as well as the Seoul Declaration.
13 June 2008
The FCC held a full hearing to consider a wireless industry proposal that would essentially federalize the regulation of ETFs. Big Wireless would give consumers a break on these fees, which sometimes go as high as $200. In exchange, the FCC would send a note to the judges overseeing various state-level trials recommending that they shut them down, potentially saving wireless giants billions of dollars.
The economic motivation in this case is quite clear!
11 June 2008
European wireless carriers have sharply raised prices for making and receiving calls outside the European Union to compensate for regulator-imposed lower tariffs within the EU, a market research firm said.
Informa Telecoms and Media said on Friday roaming charges had risen as much as 163 percent since the EU capped rates last year at 0.49 euros ($0.76) a minute for making calls abroad in the 27-nation bloc and 0.24 euros for receiving them.
The average price of a call home by an Italian subscriber in Russia was 3.67 euros a minute in 2006, but this has risen 25 percent to 4.58 euros, Informa said.
For customers in most European countries, the cost of roaming in Africa, China, India, Japan, the Middle East, Russia and the United States has risen, the Informa data showed.
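The Italian-subscriber example is internally consistent; a one-line check:

```python
# Verify the quoted ~25% increase for an Italian subscriber roaming in Russia.
price_2006 = 3.67  # euros per minute
price_now = 4.58
increase = price_now / price_2006 - 1
print(f"increase: {increase:.0%}")  # 25%
```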
14 May 2008
The quote, though, is also worthwhile:
In the last 6 months of 2007, nearly one out of every six households (15.8%) did not have a landline telephone, but did have at least one wireless telephone. Approximately 14.5% of all adults -- more than 32 million adults -- lived in households with only wireless telephones; 14.4% of all children -- more than 10 million children -- lived in households with only wireless telephones.
The percentage of adults living in wireless-only households has been steadily increasing. During the last 6 months of 2007, more than one out of every seven adults lived in wireless-only households. One year before that (that is, during the last 6 months of 2006), fewer than one out of every eight adults lived in wireless-only households. And 2 years before that (that is, during the last 6 months of 2004), only 1 out of every 18 adults lived in wireless-only households.
The percentage of adults and the percentage of children living without any telephone service have remained relatively unchanged over the past 3 years. Approximately 2.2% of households had no telephone service (neither wireless nor landline). Approximately 4 million adults (1.9%) and 1.5 million children (2.1%) lived in these households.
09 April 2008
26 March 2008
The CDMA market is under long-term pressure in Brazil and is being swapped out for 3GSM in Australia. In Venezuela, Movilnet has announced plans to overlay GSM on its CDMA network to reach low-end subscribers. In India, Reliance is migrating its CDMA network to GSM in urban areas. Neither Verizon nor Sprint has opted for long-term CDMA migration strategies. Sprint is planning a commercial launch of mobile WiMAX in April, and last November Verizon announced it will implement long-term evolution (LTE), a 3GSM migratory technology.
This, combined with the improvements in GSM technology, has removed many of the advantages that CDMA once had. This graph (from the article) shows CDMA growth slowing. Since LTE and other technologies don't enter the technology mix until after 2011, the graph doesn't yet show a decline.
21 March 2008
20 March 2008
I have culled this graphic from Table 3 of this report. Plotting it on a semi-log scale highlights the fastest growing technologies in addition to the ones in greatest use. You can see that, while cable modems are the dominant technology, followed closely by ADSL, the most rapid growth is in fiber and wireless. This should surprise no one, but it is interesting to see it graphically.
19 March 2008
26 February 2008
Google and the five telecoms companies said in joint statement that the 10,000 km (6,200 mile) undersea fiber optic cable, connecting the United States to Japan, will cost $300 million.
Google's partners in the consortium, dubbed Unity, comprise Bharti Airtel, Global Transit, KDDI Corp, Pacnet, and Singapore Telecommunications.
The cable will provide much-needed capacity to sustain unprecedented growth in data and Internet traffic between Asia and the United States.
The consortium said it has picked NEC Corporation and Tyco Telecommunications to construct and install the system, which is expected to be ready for service in the first quarter of 2010.
The cost per kilometer of this cable is approximately US$30,000.
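That figure follows directly from the consortium's numbers:

```python
# Cost per kilometer of the Unity cable, from the figures in the article.
cable_cost_usd = 300e6  # $300 million total
length_km = 10_000      # 10,000 km trans-Pacific route
print(cable_cost_usd / length_km)  # 30000.0 USD per km
```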
What is also interesting to me is that it continues to mark Google's engagement in carriage. Do you view this as a long term contract to assure supply (like one might do in a futures market)? This, by the way, is what Google implied in this statement. Or, do you view this as a part of a continuing effort at Google to build an integrated information utility?
19 February 2008
This figure, taken from this presentation, is quite interesting. I think you'll gain an appreciation for "network management" practices of ISPs regarding P2P, especially when their networks were optimized for web browsing.
28 January 2008
While it is hard to imagine that these services experienced large cost increases (they are software, after all), I would like to point out that the services indicated in the article are hardly essential, mainstream services. Given that, is there a need to regulate? How many "elderly and low income" families actually use these services?
This article over at CNET shows just how quickly market tipping can occur in a standards rivalry. As I blogged earlier, Warner's announcement to back Blu-Ray was (apparently) the defining moment. In response, Toshiba has cut the prices of stand-alone HD-DVD players; this weekend's advertisement at Best Buy was offering a "buy one, get one free" special on HD-DVD movies. This is either an inventory clearance effort or it is an attempt to change the market dynamics. I suspect the former is true.
17 January 2008
This article "exposes" some interesting phenomena in my mind. Quoting the article:
When the big four cellular companies decided to hike the price of sending a text message, they all managed to settle on precisely the same increase. Sprint Nextel raised its price from 10 cents to 15 cents per message in 2006. AT&T quickly followed suit, as did Verizon Wireless and finally, in June 2007, T-Mobile. And now Sprint has raised its price again, to 20 cents.
Today those copycat price hikes are producing banner results for the carriers. In the most recent quarter, the Big Four's customers coughed up anywhere from 29% to 64% more for data services (that is, everything but regular phone calls) than they had the previous year. The four carriers collectively produced $17 billion in operating income on $104 billion in revenue in the first nine months of last year. The carriers say that consumers can buy a monthly package that lowers the cost of a text message. But without the huge surge in payments for data, revenue per user would have fallen at every company.
Their ability to hike prices on text messages certainly can't be explained by the companies' costs. On modern cellular networks the few hundred bits of information that make up a text message take up such a minuscule amount of capacity that they can be carried for a fraction of a cent.
The first paragraph basically points to implicit collusion in price setting. As any economist will tell you, this is one of the principal problems in oligopoly. In this case, it was evident even with four industry participants, which suggests either that demand for text messaging is quite inelastic or that this "a la carte" price doesn't reflect consumer experience. Since avid SMS-ers purchase packages, the actual price they pay is far less, so I believe the latter explanation dominates.
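To see how far the effective price can diverge from the a la carte price, consider a hypothetical bundle. The package price and message count below are illustrative assumptions, not actual carrier rates:

```python
# Hypothetical messaging bundle -- figures are illustrative assumptions.
package_price_usd = 5.00   # per month
included_messages = 500

a_la_carte_usd = 0.20      # Sprint's per-message rate cited in the article
effective_usd = package_price_usd / included_messages
print(f"effective per-message price: ${effective_usd:.2f}")         # $0.01
print(f"a la carte markup: {a_la_carte_usd / effective_usd:.0f}x")  # 20x
```

Under these assumptions, the a la carte rate is twenty times what a bundled user pays, so the headline price increase says little about what heavy users actually experience.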
The end of the second paragraph makes little sense ... data services and SMS are quite different and appeal to different markets. It is clear, though, that non-voice services are the future for profits in this sector.
There was obviously something going on in January and April that stimulated complaints!
15 January 2008
In spite of a recent industry downturn, international backbone traffic has maintained an almost unstoppable growth, around 300 percent in aggregate over the last 10 years, say specialists at TeleGeography at PTC ’08. This is ’good news’ for carriers, says Tim Stronge, VP of Research at the consultancy. But in looking at the figures around the world, he suggests that the road ahead might be a little more bumpy. “We have seen a disturbing trend, recently,” he warns.
This “disturbing trend” comes after years of traffic growth of at least 15 percent on a CAGR basis. Stronge says last year, however, there was a “fairly significant dip down to about 10 percent”. One factor was a major slowdown in traditional TDM traffic, perhaps to only 6 percent growth, a figure not seen since World War II. Meanwhile, VoIP traffic has now reached the level of TDM—traditionally a much larger business segment—because of the much higher VoIP growth rates.
VoIP is not just settlement rate bypass, says Stronge emphatically. “[It] is becoming a mainstream technology...the USA is No. 2 VoIP recipient in the world after Mexico, even though settlement rates are very low [for TDM traffic]; China is No. 4, UK is No. 3.” [snip]
"Why is traffic finally slowing?" ... Drivers have been international trade and economic health between countries. Another has been international migration, where migrants call home frequently ... The probable key is ongoing price reductions and their impacts. [snip]
However, voice is only one part of an increasingly large picture, one dominated almost completely by Internet traffic. Voice, according to TeleGeography, accounts for less than 1 percent of traffic on international backbone networks. Private networks account for around 14 percent, and the Internet (principally web usage and peer-to-peer activity) around 75 percent. In aggregate, Internet demand is consuming available capacity on key transoceanic cable systems, pushing cable operators toward new cable projects and upgrades of existing systems. On both the key Atlantic and Pacific routes capacity is being consumed, says TeleGeography. Internet growth on these backbones may well be in the 60-70 percent per annum range, far above the 40 percent per annum increase in corresponding supply.
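The growth figures in the quote above are worth checking with a little arithmetic. A minimal sketch follows; the starting demand and capacity values are arbitrary illustrative assumptions (not TeleGeography data), and only the growth rates come from the article:

```python
# Back-of-the-envelope check of the growth arithmetic quoted above.
# Only the growth rates (15% CAGR, 65% demand growth, 40% supply growth)
# come from the article; starting values are illustrative assumptions.

def aggregate_growth(cagr: float, years: int) -> float:
    """Total (cumulative) growth implied by a compound annual growth rate."""
    return (1 + cagr) ** years - 1

# A 15% CAGR sustained for 10 years compounds to roughly 300% aggregate
# growth, consistent with the "300 percent in aggregate over the last
# 10 years" figure in the quote.
print(f"15% CAGR over 10 years: {aggregate_growth(0.15, 10):.0%} aggregate")

# If Internet demand grows ~65%/yr while supply grows 40%/yr, the gap
# compounds.  Assume (arbitrarily) demand starts at half of capacity:
demand, supply = 50.0, 100.0   # arbitrary units
years = 0
while demand < supply:
    demand *= 1.65
    supply *= 1.40
    years += 1
print(f"Demand overtakes supply after ~{years} years")
```

Under these assumptions demand catches up with supply in about five years, which illustrates why a 60-70 percent demand growth rate against 40 percent supply growth pushes operators toward new builds and upgrades so quickly.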
11 January 2008
A Google win could also spark a hiring spree in the mobile sector. The company has zero experience building and operating a network, and would need to bring on folks who do.
Investments in wireless technologies could rise as well. "You'll see a flood of investment capital come back to the wireless market," Ellison says. "The mobile operators are a barrier to innovation. They move really, really slowly."
Still, analysts say there are good reasons for Google not to win. The costs involved in acquiring the spectrum, and building and operating a wireless network, are sky-high. The starting bid for the spectrum is $4.6 billion, and analysts predict it will sell for more than that. All told, the entire cost of a new network could reach $15 billion--and take a few years to build.
Plus, profit margins for wireless service are extremely low. "As a Google shareholder, you'd have to question a wireless network, which doesn't have the same margins as [online] advertising," says Forrester Research analyst Charles Golvin.
Mathias speculated that Google can lose and still win. For instance, the company might drive up the spectrum price and then drop out of the auction. Another company could win it and possibly take on enormous debt, which could give Google great bargaining power later on. "This is a game; make no mistake," Mathias says. "There are all kinds [of ways] of getting what you want without winning anything and spending money."
Continuing on the theme I began developing late last year, this announcement suggests that cable TV operators are beginning to dis-integrate their networks, separating services from platform. So, will CATV move to an "open" (wholesale) platform on which multiple "retail" services will be provided from multiple providers? Is this another instance of "functional/structural separation"?
- The big squeeze, operators cosying up and a push on data
- Device convergence -- the rise of the multimedia handset
- Mobile 2.0 -- user-generated content, social networking, location-based services
- Femtocells -- a cost-cutting, data services driver -- or not?
- Disruption -- not just the Google factor
Read the article for the gory details.
Do you think these will be the five predominant trends for the coming year?
08 January 2008
You might find this graph, derived from data reported on FCC Form 499-A, interesting. I think it shows what we have all been experiencing ... declining wireline and increasing wireless. The data are in $US millions and the y-axis is logarithmic. I do not believe these figures are in constant dollars.
Note the decline in payphone, wireline, and toll revenues and the increase in revenues for companies classified as wireline competitors (my words, not the FCC's) and, of course, wireless providers. These data show wireless providers' revenues exceeding wireline ... I think that is because the data are aggregated.
According to this article over at Forbes, the content industry is more interested in having a standard than in which one prevails. Their concern is the market for digital media.
I think the speculation that this may be a hollow victory is more interesting ... quoting the Forbes article:
But Sony can't afford to spend too long drinking the champagne. The real news isn't that HD-DVD's future looks grim. It's that if Blu-Ray's backers can finish off HD-DVD quickly, Blu-Ray might have one.
With Apple, Amazon.com, NetFlix and Microsoft pushing downloadable movies and cable and phone companies peddling a plethora of on-demand, high-definition content, the day is coming when the stacks of plain vanilla DVDs that clutter many home entertainment centers will go the way of the CD collection.
JVC even introduced a flat-screen television at the International Consumer Electronics Show that allows users to simply pop in one of Apple's iPods to watch video content--threatening to turn the slim media players into an alternative to digital video discs. And Panasonic is building iPod docks into its home theater systems alongside an integrated Blu-Ray player.
Another worry, according to Robin Harris, an analyst with the Data Mobility Group, is that Blu-Ray adoption will be slow because few people will notice the difference between formats, since many players can neatly "up-convert" DVDs for high-definition sets. As a result, few will opt to replace their entire DVD libraries, as many did with the earlier generation of videotapes. "So is this going to be a pyrrhic victory for Sony? I think that there's a fair chance that it will be," Harris says.
07 January 2008
Facing pressure from regulators, the cable TV industry plans to make good on a promise to standardize its technology and open the door to televisions and other gadgets that don't need cable boxes to receive video-on-demand programs and other interactive services.
An industry initiative, to be renamed "tru2way" after a decade in the works, is expected to allow electronics manufacturers to make TVs and other gear that will work regardless of cable provider. By making devices compatible, the standard also could encourage the development of new services and features that rely on two-way communication over the cable network.
Our business model has changed completely, from a closed, proprietary model to an open architecture that will work across cable companies - not just across Comcast. That was a Herculean job to accomplish.
Suppose we took Roberts' view at face value. Is it reasonable to imagine the industry model evolving from a vertically integrated "customer experience" to a "platform-based" one? That is, is it reasonable to imagine that the initial innovation in an industry would require a high degree of control, so that specific investments in physical infrastructure could be coupled with specific investments in "software"?
In fact, we have seen this initially in telephony:
- The introduction of automatic switching had to be closely coupled with end user devices.
- The transition to a "common battery" for handsets (from locally powered devices) also had to be closely coordinated. It is interesting that we seem to be gradually transitioning back to the locally powered paradigm, but that's another story.
04 January 2008
I thought you might be interested in an update of this item, posted earlier. As Forbes reported today, the Maine PUC approved Fairpoint's acquisition of Verizon's lines in its state. Recall that Vermont did not approve this.
Questions about appropriate regulation or inappropriate micromanagement aside, I think this case provides an interesting window into the machinations necessary for commercial transactions such as this in telecom. These kinds of processes are not required in other industries and add to the "friction" of these markets, which leads to economic inefficiency. Quoting the article:
The deal also requires approval from Vermont and New Hampshire regulators. On Dec. 21, Vermont's Public Service Board rejected the deal but invited the company to submit a revised application. New Hampshire's PUC staff recommended against the initial proposal, but it is also willing to consider a revised deal.
The deal is subject to review by the Federal Communications Commission.
Maine's PUC attached several conditions to its approval. The commission suggested reducing FairPoint's debt to Verizon by $100 million by scaling back fees FairPoint will pay to lease equipment from Verizon Communications.
Verizon general counsel Don Boecke dismissed that approach as "pretty much a nonstarter," but left the door open to an alternative. FairPoint Chief Executive Officer Gene Johnson then told the PUC a number of options that would have the same financial impact were available.
Some of those include reducing dividends, selling noncore assets, selling more stock and suspending dividends if necessary, he said.
Other conditions ordered by the PUC included having FairPoint develop and implement a policy protecting customers' privacy, and not having the other two states' regulators materially change FairPoint's financial condition.
Johnson said he did not see the latter condition as a problem, saying he believes the agreement approved in Maine will serve as a "roadmap" in New Hampshire and Vermont.
Update (2008-1-15): The FCC approved this transaction (see the order for details):
In accordance with the terms of sections 214(a) and 310(d), we must determine whether the Applicants have demonstrated that the proposed transactions would serve the public interest, convenience, and necessity. Based on the record before us, we find that the transaction meets this standard. We conclude that it is unlikely the merger will result in any anticompetitive effects or other public interest harms. Specifically, the Applicants do not compete in any of the relevant local exchanges. Moreover, after consummation of the transaction, the Applicants will compete for large business and long distance customers. The transaction also is likely to produce public interest benefits, including the accelerated deployment of broadband throughout the region.
Note that the FCC's analysis is motivated by a different set of concerns than the states' ... the latter group is interested (at least on paper) in the viability of Fairpoint after the transaction. This difference can be seen in Commissioner Copps's dissent.
03 January 2008
I have gotten interested in the potential reorganization of the mobile telephone industry ... not necessarily because of Google's potential market entry, but because of actual events (see this and this). This article from the NY Times adds additional evidence.
The question is, are we on the cusp of a sea change in the mobile industry? If the industry does split into wholesale/retail components, what are the implications for telecom policy? Would this make the interest in "structural" or "functional" separation more intense?