31 December 2009

Report on WiFi Hotspot Usage

This report from In-Stat is interesting but not surprising: WiFi hotspot usage is increasingly shifting from laptops to handheld devices:

As a percentage of total connects, handhelds increased from 20% in 2008 to 35% in 2009. By 2011 handhelds are anticipated to account for half of hotspot connects ...

I wonder if we can anticipate a model where handheld devices automatically probe WiFi hotspots for data service and use them either in conjunction with or instead of carrier-based services. Using WiFi in conjunction with carrier services would require inverse multiplexing capabilities in both the phone and the service provider's network, which is more complex and requires more coordination than an either/or type of service. The capability to augment carrier-based services could ease the pressure on spectrum that some carriers are reporting due to the increasing popularity of smart phones.
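A minimal sketch of the either/or selection logic (the function names are hypothetical placeholders, not any platform's actual API); true inverse multiplexing would instead split a single flow across both links and recombine it at a cooperating point in the provider's network:

    import random

    MIN_WIFI_KBPS = 200  # assumed threshold for "good enough" hotspot service

    def probe_wifi_throughput():
        # Hypothetical measurement: associate with a nearby hotspot and run
        # a short throughput test; None means no hotspot is in range.
        return random.choice([None, 80, 500])

    def choose_link():
        kbps = probe_wifi_throughput()
        if kbps is not None and kbps >= MIN_WIFI_KBPS:
            return "wifi"      # offload the data session to the hotspot
        return "cellular"      # fall back to carrier-based service

    print(choose_link())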

30 December 2009

Prices at different app stores

This item over at GigaOm is interesting. The Apple App Store currently has the cheapest average price if you consider paid apps only, while Android has the lowest overall price. Given the open nature of Android apps, I would expect this to change as the number of Android apps grows and competition increases.


Challenges with phasing out landlines

This commentary over at GigaOm is insightful and worth reading. In it, Stacey Higginbotham reflects on this AT&T filing at the FCC. Some excerpts from the filing:
Over 99% of Americans live in areas with cellular phone service, and approximately 86% of Americans subscribe to a wireless service. Many of these individuals see no reason to purchase landline service as well. Indeed, the most recent data show that more than 22% of households have “cut the cord” entirely.

Demand for VoIP service – from both cable companies and over-the-top providers such as Vonage, Skype, and many others – is also booming. At least 18 million households currently use a VoIP service, and it is estimated that by 2010, cable companies alone will be providing VoIP to more than 24 million customers; by 2011, there may be up to 45 million total VoIP subscribers.

Today, less than 20% of Americans rely exclusively on POTS for voice service. Approximately 25% of households have abandoned POTS altogether, and another 700,000 lines are being cut every month. From 2000 to 2008, the number of residential switched access lines has fallen by almost half, from 139 million to 75 million. Non-primary residential lines have fallen by 62% over the same period; with the rise of broadband, few customers still need a second phone line for dial-up Internet service. Total interstate and intrastate switched access minutes have fallen by a staggering 42% from 2000 through 2008. Indeed, perhaps the clearest sign of the transformation away from POTS and towards a broadband future is that there are probably now more broadband connections than telephone lines in the United States.

The public policy problem, of course, is what to do about the 20% that still rely exclusively on POTS. Further, what about those people who use the copper PSTN plant for DSL access (whether from a telco or a reseller)?

It seems that the hard problem in public policy is dealing not with the majority but with the marginal cases.

23 December 2009

Singapore-Japan cable system

This article over at Telecommunications Online announced a new cable project, so I collected it here with my other similar announcements. Quoting the article:

Several Asian telcos and Google are to invest US$400 million in a new subsea cable linking South East Asia to Japan, aimed at providing the highest capacity connection to date. The South East Asia Japan Cable System (SJC) has a design capacity of 17 terabits per second with provision to be ramped up to 23 Tbps. As an example of its capacity, the bandwidth allows the SJC submarine cable system to handle 30 million high definition videos simultaneously.

The 8,300 km cable will link Singapore to Japan with branches to Indonesia, Philippines and Hong Kong initially. From Japan, it will be plugged into the recently commissioned trans-Pacific Unity cable to the USA, besides a branching link to Guam which is becoming an alternative cable junction linking Asia to USA.

Comprising 6 fiber pairs, the SJC cable is scheduled to be completed by the second quarter of 2012, and is believed to be modeled after the Unity business concept providing autonomy in operation for partners to the initiative.

The decision to proceed with this landmark project, no doubt, has been prompted by growing demand not only from Internet traffic but also from the surge in telco TV, games and enterprise data.

An earlier report by TeleGeography shows that international Internet traffic has not been affected by the recent economic meltdown. In fact, international traffic growth was up 79 percent in 2009, from 61 percent in 2008.

By end 2009, fixed broadband in the Asia Pacific is expected to grow 17.3 percent to 182 million subscribers clocking billings of US$44.8 billion, according to Frost & Sullivan. The next generation networks in progress in South Korea, Malaysia, Singapore and Australia will further fuel growth in fixed broadband, not counting the phenomenal surge in mobile broadband.


So this cable will cost a bit over $48,000 per kilometer. This is the first time I have seen the capacity of the cable described in terms of the number of HD television streams that can be supported; previously, it was always the number of simultaneous telephone calls.
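A quick back-of-the-envelope check on the article's figures (the numbers are theirs; the arithmetic is mine):

    investment_usd = 400e6   # announced investment
    length_km = 8300         # cable length
    capacity_tbps = 17       # initial design capacity
    hd_streams = 30e6        # simultaneous HD streams claimed

    print(investment_usd / length_km)                       # ~48,193 USD per km
    print(capacity_tbps * 1e12 / hd_streams / 1e6, "Mb/s")  # ~0.57 Mb/s implied per HD stream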

22 December 2009

Test of 3G networks in the US

Gizmodo recently ran a test of 3G networks in the US. The results are here. AT&T had the fastest average upload and download speeds in the cities where the test was conducted. This is a bit of a surprise given the heat that the AT&T network has been taking recently from iPhone users. Gizmodo wonders whether it is the iPhone or some interaction between the iPhone and the AT&T network, and I can see why. These test results don't corroborate the belief that AT&T's network is (only) at fault.

21 December 2009

Follow-on to Mobile OS item

This item over at Ars Technica caught my attention, especially in light of my last post on this blog. Given the importance of a vibrant ecosystem that includes apps, this software from Palm substantially lowers the barriers to entry for app developers for the WebOS. Using this environment, apps can be constructed by relatively unsophisticated (i.e., without particular programming skills) end users and, if done well, even be monetized. I presume the most successful apps would be professionally re-programmed and optimized. Wouldn't this have strongly positive effects for the ecosystem?

18 December 2009

Mobile operating systems

There are two articles related to Mobile OS that caught my attention. This one, in Ars Technica, shows that, for the first time, the total number of iPhone users has passed the total number of Windows Mobile users. This one, over at CNN Money, shows the growth of Android-based devices. Clearly the flat Android line in the first article will change as 1.5+ and 2.x-based devices enter the marketplace and start gaining in installed base.

This is significant in a number of ways. First, installed base is a key factor in the development of the "ecosystem" of a device. Apple, which has locked down the hardware as well as the OS on the iPhone, enjoys an unrivaled hardware ecosystem from third parties. By tightly controlling the application environment, they maintain a strong grip on the app store, which enables them to monetize this very profitable aspect of the iPhone ecosystem. Because of the size and momentum of the installed base, they will attract more applications developers.

Android does not have the hardware control and has a more open approach to the app store which does not lend itself to monetization in the same way. However, the more open approach may attract more innovative app developers. There is clearly momentum in the Android OS camp, even if the installed base is lacking at present.

Carriers have an interest in minimizing the number of mobile OSs they support. Adding a new OS is costly because the features and services of the OSs must be tested with the network and supported by the carrier after the device is released. Thus, OSs with little installed base or momentum are at risk of losing their sponsors (the carriers) which leads to a self-reinforcing downward spiral in their ecosystem. Thus, the stakes for Microsoft in its Windows Mobile 7 (due in 2010) could not be higher; for that matter, the same is true for Palm's WebOS. Failure to attract a user base means failure to build an ecosystem.

This creates a high hurdle, but it is not insurmountable. At one time, the Palm OS was the standard with a huge catalog of applications; it is no longer so dominant. Thus, dominance is not permanent even if it can be a huge advantage in the marketplace.

05 December 2009

Google and DNS

I found this commentary on Google's entry into the DNS domain the most worthwhile (from this article in the NY Times):

“You have to remember they are also the largest advertising and redirection company on the Internet. To think that Google’s DNS service is for the benefit of the Internet would be naïve,” Mr. Ulevitch wrote. “They know there is value in controlling more of your Internet experience and I would expect them to explore that fully.”

In an interview, Mr. Ulevitch said OpenDNS was steering customers who mistyped addresses to Yahoo search results and ads. “I have no doubt they see that as a competitive threat,” he said. He also stressed that there is valuable information in the DNS layer — like where servers are located, what Web sites people are looking at and how frequently they are looking at them. “I have no doubt they will be monetizing this by increasing intelligence on people’s surfing habits,” he said.

Ulevitch runs the "Open DNS", which has provided this service for some time. He differentiates Google DNS from Open DNS here; you should bear in mind that Google is offering him competition. Finally, I think it is important to remember, as Ulevitch does here, that Google is a profit-making organization.

03 December 2009

Importance of framing

No, I'm not talking about Time Division Multiplexing. This NOI from the FCC came to my attention (via this item in Ars Technica). I have addressed the importance of the "framing" of a topic in regulatory reform before (see this item), and I believe that this may be a similar example. Here, it seems as though the FCC is framing the debate about spectrum reform as a matter of broadband policy, which, of course, is related if wireless access is a key component of broadband policy. Maybe Tom Hazlett's ideas are gaining traction after all?

02 December 2009

Telegraph in the US

Ars Technica posted this article, which is a partial history of the telegraph in the US. The history of the telegraph involves considerably more than what is written and, taken as a whole, provides a graphic illustration of "bare knuckles" capitalism. Fortunes were won (e.g., Ezra Cornell, who later founded Cornell University) and lost (see my book Shaping American Telecommunications for more). What the Ars article does not mention is that while Gould was nipping at WU's heels, the telephone business was just getting off the ground. WU ended up making the business decision to focus on its core telegraph business and sold its telephone business to National Bell. It was a fateful decision indeed.

Mesh networks

Here is a nice article on mesh networking technology from Ars Technica. It includes a nice discussion of the historical context and a very little bit about the economics. What I found most interesting was the following, taken from an interview with NYU's Jinyang Li:

Li pointed out that current mesh protocols don't give nodes incentives to forward one another's packets. This isn't a major problem in academic, corporate or government settings where all the nodes can be assumed to have the same owner. But in open, heterogenous networks, each user will be tempted to "cheat" by modifying his own device to take advantage of other nodes' forwarding services without reciprocating. "It's still an open question whether we'll be able to solve this incentive problem," Li told Ars.

A related concern is security. Users should worry about the security implications of entrusting their packets to random strangers. Things would get even more dicey if routing protocols were enhanced to add monetary incentive schemes such as micropayments; in that case, security flaws in mesh networks could allow unscrupulous mesh participants to generate bogus traffic in order to siphon money away from their neighbors.


These are indeed serious problems that have to be solved before this approach to networking has a prayer of commercial success in anything but niche markets.
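To make the incentive problem concrete, here is a toy sketch of one commonly proposed countermeasure: a reciprocity (credit) check before forwarding a neighbor's packet. The scheme and names are mine for illustration and do not describe any deployed mesh protocol:

    # Toy reciprocity check: relay a neighbor's packet only while that
    # neighbor's "debt" to us stays within an allowance. Illustrative only.
    forwarded_for = {}   # bytes we have relayed on behalf of each neighbor
    forwarded_by = {}    # bytes each neighbor has relayed for us

    MAX_DEFICIT = 1_000_000  # assumed allowance (bytes) before we stop helping

    def should_forward(neighbor, packet_bytes):
        deficit = forwarded_for.get(neighbor, 0) - forwarded_by.get(neighbor, 0)
        return deficit + packet_bytes <= MAX_DEFICIT

    def record_forward(neighbor, packet_bytes):
        forwarded_for[neighbor] = forwarded_for.get(neighbor, 0) + packet_bytes

Even this toy version exposes the difficulty Li describes: the counters depend on honest reporting, so a cheater can simply lie about them, which is why the incentive problem is entangled with the security problem noted above.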

01 December 2009

Telecom cost estimates

This is my latest installment in my continuing effort to collect information on the cost of telecom system deployment. NATOA filed these comments with the FCC that contain some case studies on fiber deployment to "anchor tenants" in communities. Perhaps you will find this interesting or useful.

30 November 2009

Unintended consequences of price regulation: Traffic pumping

This order reveals the practice known as "traffic pumping". I believe that it is a good example of arbitrage that results from price regulation (footnotes deleted):

2. Qwest is an interexchange carrier, serving customers throughout the United States. Farmers is the incumbent local exchange carrier in Wayland, Iowa, serving approximately 800 access lines for local residents. Farmers provides local exchange and exchange access services. Qwest purchases tariffed access service from Farmers, which enables Qwest’s long distance customers to terminate calls to customers located in Farmers’ exchange.

3. In 2005 and 2006, Farmers entered into a number of commercial arrangements with conference calling companies for the purpose of increasing its interstate switched access traffic and revenues. Under the agreements, conference calling companies sent their traffic to numbers located in Farmers’ exchange and, in return, Farmers paid the companies money or other consideration. The agreements resulted in a substantial increase in the number of calls bound for Farmers’ exchange. As a result, the amounts of Farmers’ monthly bills to Qwest for terminating access charges rose precipitously.

Maybe Sony will help you build a supercomputer too!

I have known about using the Sony PlayStation 3 platform as a supercomputer, but I still found this item over at Ars Technica interesting because it lays out the case better.

It leads me to wonder how long the business model for console gaming can last. Basically, game purchasers end up subsidizing unintended uses of the PS3 platform. This will produce one or more of the following consequences in the long run:
  • Fewer gamers adopt the PS3 platform because the game prices are higher than on competing platforms;
  • The profits of Sony and of game developers are lower;
  • Sony does what Apple does and tries to "capture" the PS3 users, creating a future for people who can "jailbreak" PS3s;
  • Console manufacturers create "crippled" consoles that can't be effectively used in this way. For example, is anyone using clusters of Xbox 360s as a supercomputing platform?

23 November 2009

iPhone Apps and economic networks

I have blogged around this subject before, so when this article in BusinessWeek showed up, my interest was piqued. When economists began looking at standards in IT in the 1980s, researchers like Joe Farrell, Garth Saloner, Carl Shapiro and Michael Katz began noting the importance of networks of tie-ins to the lock-in often associated with standards. They noted in particular the importance of software to the success of PCs vs. Macs (though this view was disputed by Margolis and others).

The BusinessWeek article is significant because it shows a weakening of the tie-ins to the iPhone, which would, according to the economic theories of standards, suggest that the durability of this platform in the face of other existing (and emerging platforms) is not particularly strong. Note that this is not unrelated to the "opaqueness" of the approval process by the Apple App Store and the relatively poor profitability of iPhone apps.

Economics in the wireless industry

This item over at the Tech Liberation blog is fascinating. It makes clear the deeper battles going on related to "wireless network neutrality". I am becoming more convinced than ever that the network neutrality discussion is largely a question about political economy -- that is, who gets which revenue and who pays which costs. But is allocating more spectrum to mobile, as the article goes on to suggest, the answer?

20 November 2009

Regulation and cloud computing

This item over at the Tech Liberation blog got me thinking about regulation and cloud computing. Given the underlying economies of scale and scope, it seems to me to be almost certain that we'll see concentration in the market for cloud services. Thus, is it too soon to be thinking about marketplace rules for cloud computing? After all, we haven't seen serious market abuses (that I know of). If so, is the FCC the right agency to do this? As the article points out, there is no real statutory authority to do so. On the other hand, cloud computing is a deep integration between processing and communication to provide a service, so it is easy to see how they might be tempted to move in this direction. Where would you draw the line between communications and processing in cloud computing?

19 November 2009

Google, Android, maps and more

I have been interested in Google's approach to business and, like others, have been a bit wary about it as well (See this search of this blog for more).  Thus, this item resonated with me, especially in combination with this item from the Washington Post (please be sure to read this follow-on from Tech Crunch in the spirit of completeness). This article over in BusinessWeek is another piece of the puzzle, I think, considering that gaming is a major profit-making business.

It seems clear that Google is building a platform and ecosystem. It is a for-profit corporation, so the question is not if but how this platform will be monetized, how it will compete with other platforms (Apple-centric and Microsoft-centric), and how switching costs will be imposed to deter defections from the platform. Google's interest in certain intellectual property around Android provides some evidence of this. Will monetization come from advertising revenues? Or will we see a series of user fees that will emerge after users are invested in the platform?

The questions matter because our government has historically gotten interested in platform competition and how it relates to control. These questions will be at the core of future anti-monopoly policy, I believe.

16 November 2009

iPhone Apps and the war for the web

There are a couple of items that have come to my attention recently that suggest change is on the way for app stores (perhaps the iPhone app store in particular):  The first comes by way of Seeking Alpha.  In it, the author cites this item from Gizmodo, which argues that, in the not too distant future, apps will be very cheap or free.  This has been a significant profit center for Apple, but may not be in the future.  The second item, from GigaOm, discusses the opaque approval process for iPhone apps, which can be frustrating for developers.

Taken in tandem, the second item suggests higher than necessary transaction costs for developers (since the approval process is uncertain) and the first item suggests lower prices (tending to $0) because of competition.  High transaction costs for no profit suggest that developers will seek to monetize their investment in other ways, through advertising or through other kinds of tie-ins (where price is not $0) that are outside of the control of the app store.

But as O'Reilly points out, there are limits to this too.  Apple is not shy about blocking apps that try to escape their business model.  The O'Reilly article is interesting in that it argues for the shape of competition to come and makes the case for the tendency toward market concentration that we see in the Internet.

10 November 2009

More on Clearwire

I found this item over at GigaOm a worthwhile followup to my earlier article.  The article lays out the dilemma faced by Clearwire and its backers.  On the one hand is the short term market opportunity and on the other is the current competition from WiFi and the future competition from LTE.

Economics of Wireless ISPs

In the FCC's Electronic Comment Filing System, you will find this contribution from the Wireless Internet Service Providers Association (WISPA). What is notable about this comment is that it contains useful traffic engineering and cost data. For example:

  • "Middle Mile" capacity needs range from 50-260kbps per user and should be 5% of the aggregate bandwidth supplied to last mile business customers
  • Second mile capacity needs range from 100-300kbps per user.  This is higher because it is averaged over fewer customers.
  • Rural ISPs pay more than $200 per megabit for OCn connectiveity
  • Capital expense for a Part 15 microwave is up to $10K and for a Part 101 (licensed), this cost goes up to $15-$20K.
  • In the second mile, a DS3 router interface costs $8K and a fiber interface module $200.  
  • One rural ISP reported paying $4K/mo for a burstable DS3 to support 250 rural customers.  This translates to $12/mo per customer for middle mile transport and limits how cheap service can be.
  • For Internet service, WISPs pay from $2 to $300 per megabit per month to Tier 1 providers, depending on where they are.
There is a lot of info embedded in this paper, though it is presented anecdotally rather than in tabular form.  In any case, if you're interested in the economic foundations of wireless internet provision, you will find this an interesting read.
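As a worked example of what these figures imply per customer (the transit price and per-user average below are my assumptions drawn from the ranges above):

    ds3_per_month = 4000.0   # burstable DS3 reported by one rural WISP
    customers = 250

    print(ds3_per_month / customers)  # $16/mo of middle mile transport per customer

    transit_per_mbps = 200.0  # assumed rural transit price within the $2-$300 range
    avg_kbps_per_user = 100   # assumed per-user average, low end of 100-300 kbps
    print(transit_per_mbps * avg_kbps_per_user / 1000)  # ~$20/mo of transit per customer

At these assumptions, transport and transit alone approach $36 per customer per month before any access network, equipment or support costs, which shows why cheap rural service is so hard to deliver.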

Separately, this Wiki, which describes a research project over at KU, is worthwhile reading.

09 November 2009

Clearwire and the future of WiMAX

This article over at Forbes is interesting:

In a note to clients, Stifel Nicolaus analyst Christopher King said the latest round of financing will probably not cover all of Clearwire's cash needs over the next few years. He expects the company to lose several billion more dollars.

But he added that "Clearwire needs to build out its nationwide network as quickly as possible, as both Verizon and AT&T will begin to quickly catch up with what we believe will likely be a superior 4G network."

Clearwire's second-quarter loss, though narrower than a year earlier, came to $73.4 million. The company reports third-quarter results after the closing bell Tuesday.


So, Sprint and Comcast are investing an additional US$1.5 billion. Google decided against upping their ante in the WiMAX provider. So, is this a near term play? Will Clearwire switch to LTE when they can or will they focus on fixed wireless broadband?

06 November 2009

Internet censorship as restraint of trade?

I found this item interesting, though I have yet to read the full report. It reminds me of how telecom reform in the EU was developed and pushed under the guise of competition policy rather than communications policy in the early 1990s. In similar fashion, the US FCC introduced competition into telecommunications not as a common carrier matter but as a frequency allocation matter (the "Above-890" decision in the late 1950s). It clearly matters how you frame the question!

"Censorship is the most important non-tariff barrier to the provision of online services, and a case might clarify the circumstances in which different forms of censorship are WTO-consistent," said the study by Brian Hindley and Hosuk Lee-Makiyama.

05 November 2009

Intel and Dell

I found this article, which describes the background behind Andrew Cuomo's (NY Attorney General) suit against Intel, interesting. Intel apparently made payments to Dell in exchange for their continued (exclusive) use of Intel processors in their machines. These payments were substantial:
In 2006, Dell received $1.9 billion. During two quarters that year, Intel payments even exceeded Dell earnings. In the quarter that ended in April, payments were $805 million, compared with $776 million in net income. For the quarter that ended in July 2006, Intel's payment amounted to 116% of net income.
I find this article interesting because it illustrates the "sponsorship" of a "standard". Economists who have studied standards (Farrell, Saloner, Katz, Shapiro, David and others) have cited sponsorship as a rational strategy for firms wishing to establish (or defend) a de facto standard. In its interactions with Dell, Intel seems to view its processors as a standard platform for PCs (as opposed to AMD's). Thus, this episode reveals the costs of (and limits to) sponsorship.

02 November 2009

Business models for 4G mobile systems

This article over at Forbes essentially points out that there are two different business models implied by the upcoming technology: one that provides for higher bit rates and new services and another that minimizes costs.
The Big Four are scrambling to offset any drop in calling revenue by shifting their focus to new wireless opportunities. They are just beginning to spend tens of billions of dollars deploying new "fourth generation" cellular technology to greatly expand their data-moving capacity and make all sorts of new wireless devices possible, from e-books to dog collars that let you track Fido's whereabouts. Linquist [of MetroPCS] just signed contracts to buy the same 4G technology for a very different reason: He plans to use it to radically improve his ability to carry phone calls--and do it much more cheaply.

[SNIP]

The new gear is so powerful that he will be able to simultaneously increase the quality of cell phone calls while cutting the cost of providing each minute, from just under a penny today to closer to a tenth of a cent. Linquist charges 2.1 cents a minute, just under half of the industry's average revenue. He'll continue cutting, confident his singular focus on running the cheapest voice network will keep his costs well below those of the rest of the industry.

A decade ago there were three phone businesses: local, long distance and cellular. The first two have already collapsed, done in by advancing Internet and cellular technology and the cutthroat competition they unleashed. Americans paid $110 billion annually for long-distance phone calls nine years ago. It's now down to $55 billion and still shrinking. Local phone companies took in $126 billion at its peak eight years ago; that sum has fallen to $86 billion and is dropping fast.

Another interesting quote (it is always nice when data is included):

MetroPCS has pioneered the use of new technology that lets it pack more bits into high-traffic areas. On a recent afternoon in East Los Angeles, dozens of parents waited to pick up their kids from school, many chatting away on their phones. While most calls linked invisibly to large cell phone towers, the MetroPCS subscribers unknowingly connected to a tiny antenna no bigger than a car radio antenna that dangled from a nearby telephone pole. From there the calls traveled via fiber-optic cables that snaked alongside old phone wires back to a dingy building and then to the main network. These clever minicells are only one way the company keeps chopping the cost of providing each customer with unlimited calls. Providing unlimited calls now costs the company $16.82 per customer per month, down from $18.23 a year ago.
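A rough margin calculation from the numbers in these excerpts (the figures are Forbes's; the arithmetic is mine):

    price_per_min = 0.021   # what MetroPCS charges per minute
    cost_today = 0.01       # "just under a penny" per minute
    cost_after_4g = 0.001   # "closer to a tenth of a cent" with the new gear

    print(price_per_min - cost_today)     # ~1.1 cents/min gross margin today
    print(price_per_min - cost_after_4g)  # ~2.0 cents/min after the 4G upgrade

    # Per-customer cost of unlimited calling, year over year:
    print(1 - 16.82 / 18.23)              # ~7.7% cost reduction in one year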

Economics of SIP

I found this article over at GigaOm interesting. The article argues that Skype's "Global Index" technology is much cheaper than SIP, which is a significant factor in its (i.e., Skype's) success. I only wish they had provided real numbers. I wonder if anyone has done a careful cost analysis of VoIP signalling?

23 October 2009

Google and Verizon

This joint statement is interesting less for what it says than for who is saying it (see also this item).

We can frame the net neut discussion as a faceoff between application providers and network operators about who gets to pay (and how much) -- i.e., we frame it as a problem in political economy. In that world, Google is the 800 kg gorilla amongst the apps providers and Verizon is the 800 kg gorilla (at least for a significant fraction of the US population) amongst the operators. An agreement between these two is likely to set the tone for the rest of the community (the joint statement is far from an agreement).

This is eerily reminiscent of the bargaining that took place between the Associated Press and Western Union in the 1860s. At the time, AP was WU's biggest customer, so AP had the traffic to build their own telegraph network; WU had the local resources (in telegraph operators) to build their own news agency. The agreement between the two was basically an agreement to not compete with each other in their core markets.

Is this shaping up to be a similar business arrangement? Are there parallels worth considering or is the WU/AP analogy worthless?

22 October 2009

Nokia sues Apple

This item at Gigaom (and elsewhere on the Internet) is interesting. So I wonder if this is sour grapes on Nokia's part or arrogance on Apple's? Either would be distinctly plausible, since, apparently, talks have been going on for a couple of years.

Update (2009/10/23): This article provides a little more detail, but still not that much.

Internet retailing in the EU

While this is a little bit off-topic for this blog, I found this EU report interesting. In particular, there is potentially useful data in Annex 1. From the executive summary:
While e-commerce is taking off at national level, it is still relatively uncommon for consumers to use the internet to purchase goods or services in another Member State. The gap between domestic and cross-border e-commerce is widening as a result of cross-border barriers to online trade. From 2006 to 2008, the share of all EU consumers that have bought at least one item over the internet increased from 27% to 33% while cross-border e-commerce remained stable (6% to 7%). One third of EU citizens indicate that they would consider buying a product or a service from another Member State via the internet because it is cheaper or better.

21 October 2009

First "white spaces" deployment

In this article, Ars Technica reports on what they claim is the first use of "white spaces" in the television band. As I stated in my iConference paper, I believe that the white spaces ruling will be of greatest benefit to rural communities.

Quoting the article:
The nation's first wireless broadband network operating in unused TV channel "white spaces" is now live in an unlikely spot—Claudville, Virginia.

Claudville is a small place—only 20,000 people live in the entire county, and only 900 in Claudville proper—and its Blue Ridge Mountain terrain has made Internet access hard to come by. Combine that with a countywide per capita income of $15,574 and it's not hard to see why the big ISPs haven't rushed to Claudville.

In this case, the white spaces provided "backhaul" for WLANs that were deployed in town. I don't know whose equipment was used, though it appears to be a database-driven radio rather than a sensor-driven cognitive radio.

20 October 2009

NANOG 47

There are some interesting presentations up at the NANOG 47 website. In particular, this paper is an interesting global look at network traffic, and this one seems to be a nice presentation on LTE.

Update: Here is an article that discusses the first presentation.

12 October 2009

End of handset contracts?

I found this item over at CNet interesting. Apparently, AT&T's profitability point in the iPhone contract occurs rather late in the two year contract cycle. Does this mean that the carriers will shift their strategy away from captive contracts for handsets? If it isn't particularly profitable for them to do this, then presumably they would stop (absent other benefits).

09 October 2009

Interesting article re: WiMAX

I found this item over at GigaOm extremely interesting. It appears that Clearwire has contractual commitments to what it recognizes is a less efficient technology than LTE (at least for mobile). Interesting.

I wonder what will happen in 2011! It is not out of the question for Clearwire to do a technology transformation; this has been done before when it has been strategically beneficial to do so.

Mobile as the new mass medium?

I found this item over at Seeking Alpha interesting. For business, public policy, etc., determining the most effective way to reach people is clearly critical. Over time, this has moved from leaflets to newspapers to broadcasting (in fact, the Gutenberg press was arguably the first mass communication technology). This article suggests that we may be at the cusp of another (disruptive?) transformation in mass communication media. If this is correct, then we can only expect the battle over mobile operating systems, carrier "openness", applications, advertising relationships, etc. to intensify.

05 October 2009

The future of WiMAX

I find it worthwhile looking in the financial literature when it comes to new technologies since it is often a more worthwhile predictor of where things are going than the technology press. Thus, I found this item interesting. The article states:
The much-hyped next-generation (4G) technology of WiMAX is quickly losing ground to the alternative technology of Long-Term Evolution (LTE). Large telecom infrastructure equipment makers are gradually shifting from WiMAX to LTE, and as a result the WiMAX field is getting less crowded day by day.
Later, the article notes that there is a role for fixed WiMAX even as the fortunes for mobile WiMAX seem to be fading. If this is true, it seems as though perhaps Sprint erred in choosing mobile WiMAX as its 4G technology to gain first mover advantage!

30 September 2009

For techno-economic study geeks only

If you like to do cost studies, then adjusting for inflation is critical. This website has the inflation adjustment factors from 1774 to 2016 (obviously future ones are projected). So have at it and start modelling!

The Iridium saga continues

Iridium has been an interesting case study in the telecom industry. This article points out that there is still a business model for its LEO satellite-based system. Without going into details, Iridium was designed to be a system that would enable people to be in touch anywhere. It was developed prior to the widespread deployment of terrestrial mobile systems (e.g., GSM) and when voice was the only application of interest. By the time it was deployed and open for business, this had all changed: terrestrial mobile systems satisfied many of the needs for which Iridium was developed and offered diverse handsets, data services and indoor capabilities, which were not available from Iridium. Its fortunes plummeted and it was purchased on the cheap in bankruptcy. Today, its 347,000 subscribers generate annual revenue of $1.2 billion (an annual revenue per user of about $3,460), so motivated users (mostly government) clearly exist. I wonder if the new satellites will have data capability ...
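Working through the revenue figures in the article (a quick sketch):

    subscribers = 347_000
    annual_revenue = 1.2e9

    arpu_per_year = annual_revenue / subscribers
    print(arpu_per_year)       # ~$3,458 per subscriber per year
    print(arpu_per_year / 12)  # ~$288/mo, many times typical terrestrial mobile ARPU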

Update (7 Oct 2009): BusinessWeek posted this article on Iridium which covers much of the same ground as the previously cited article, but with more analyst insight.

29 September 2009

How open is open source Android?

I have found the Android project to be interesting to follow. There have clearly been some successes in using open source for product development (Linux, obviously, but also Apache). The success of open source seems to be a bit uneven; for example, I am not aware of a huge development community around Solaris, which was open-sourced by Sun. So when Google launched Android as an open source development project, I was most interested, especially given Google's market clout and user following.

So this item really caught me by surprise. I especially found this paragraph interesting (and a bit counter-stereotypical):
Google, however, appears to be significantly less permissive on this front than Microsoft. The company's legal department objects to the Cyanogen mod on the basis of its inclusion of Google's proprietary software. They sent Kondik a cease and desist order compelling him to remove the mod from his Web site. The Android enthusiast community has responded fiercely, condemning Google for taking a heavy-handed approach. Even Google's own Android team appears to be frustrated with the legal department's zeal.
I fully expect that some bright and committed programmers will find a work-around. The whole reason for the mod, of course, is that Google's ROM doesn't work as well. Isn't that how open source is supposed to go? Has Google discovered the hazard of this approach? I wonder if they'll encounter similar problems when Chrome OS goes open source?

25 September 2009

Microsoft study places value on white spaces

I found this article interesting.

The study, by consultant Richard Thanki of Perspective Associates, suggests that by augmenting current unlicensed wireless networks, such as Wi-Fi hot spots, the white spaces could generate between $3.9 billion and $7.3 billion in value annually over 15 years. That would be the result of increased use of consumer electronics and other factors, according to the study.

In his study, Thanki writes that white spaces spectrum offers a broader range than a typical Wi-Fi connection. A single Wi-Fi access point enhanced by the white spaces could "fully cover a large building and the neighboring grounds and areas," he writes.

In addition, use of the white spaces could lower the cost of providing Internet access in rural areas, Thanki writes.

While chips used to power white spaces devices would initially cost roughly $10 more than existing technologies in 2012, the difference should steadily decline at a rate of about 30% annually from that point on, according to the study.


I would love to get a copy of that study ...
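As a quick check on the chip-cost claim, here is how an assumed $10 premium shrinks at roughly 30% per year:

    premium = 10.0  # assumed extra cost of a white-spaces chip in 2012 (USD)
    for year in range(2012, 2018):
        print(year, round(premium, 2))
        premium *= 0.7  # ~30% annual decline, per the study

On that trajectory the premium falls below $2.50 within four years, at which point white-spaces radios would be close to price parity with ordinary Wi-Fi silicon.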




Economic value of unlicensed spectrum

Here is a new item for my reading list. This isn't the first analysis of its kind (search for Bill Lehr's paper, for example) and it undoubtedly won't be the last. I haven't read the paper in detail yet, but a quick scan indicates that it suggests more spectrum be allocated to unlicensed use (whether "dedicated" unlicensed or through white space devices). In our study of secondary use, we determined that adding additional unlicensed spectrum tends to favor applications and systems that have small geographic reach, since the contention from other unlicensed users is lower. Most of these studies do not differentiate application types; I'll be interested to see if this one does.

24 September 2009

Facts about broadband provision

Today's WSJ had an op-ed by Holman Jenkins. In it he claimed the following:

Two-thirds of Comcast's new broadband subscribers signed up in a recent quarter were defectors from DSL. "Churn" is the biggest challenge to broadband profitability, especially as competition drives down margins. According to Arbor Networks, the cost of fielding a single call to customer service can wipe out three years' profitability for a customer's broadband account.

18 September 2009

Open platforms and walled gardens

These paragraphs (from this article) caught my attention:

Despite AT&T’s success and optimism for the future, Rinne also offered a warning. She pointed to the checkered history of AOL, which stood on top of the Internet world in the late 20th century but saw its subscriber base collapse after the advent of home broadband. The Internet industry witnessed a tipping point of its own, which foreordained the broadband revolution and the collapse of the walled garden. As AT&T rides the cusp of the mobile Internet revolution, it has to be cautious it doesn’t fall into the same traps as AOL, Rinne said. And that means being open to new and numerous applications and business models. Rinne pointed to AT&T’s growing portfolio of embedded emerging devices, smartphones and connected PCs as an example of how AT&T has discarded the notion of the walled garden — and how it’s clearly enjoying the spoils.

AT&T may have discarded the walled garden itself, but ironically the key driver to its mobile Internet success hasn’t. Apple’s (NASDAQ:AAPL) iPhone is probably the ultimate culmination of the walled garden approach in mobile — one implemented elegantly and artfully but a walled garden nonetheless, where a single entity controls the platform and access to applications. Rather than reject the walled garden, consumers are flocking to it, frolicking happily within its confines. The iPhone isn’t the only example. The success of the Amazon (NASDAQ:AMZN) Kindle was built behind high topiary walls. Every book, newspaper and magazine downloaded to the Kindle comes from the selections available offered at the Amazon store, which offers no access to hundreds of thousands of titles available across the Web from such sites as Google Books.

Rather than tearing down those walls, operators and application developers are erecting new ones. Android, Nokia (NYSE:NOK), Palm (NASDAQ:PALM) and Research In Motion (NASDAQ:RIMM) are launching centralized apps stores. What’s different is the number and variety of app stores out there. The industry is abuzz with this idea that an open and expansive mobile Internet is in our future. But today, consumers don’t want expanse; they want alcoves — nests of innovative content. Maybe as tastes evolve consumers will gravitate toward truly open devices and they’ll develop the sophistication to hunt down the content, applications and services on their own. But for now they seem content to hop from one walled garden to another, and the winners in such a world will likely be the companies that can lay out the best garden plans.



To me, it stands in contrast to this item by Tim Lee at TLF:

The typical life cycle of a technology goes something like this. Technologies are usually originated in the laboratory, where a small number of geeks invent it and explore its capacities. Once the time comes for commercialization, often proprietary versions are the first out of the chute because it’s easier to monetize closed platforms, and therefore to raise the capital necessary to deploy them quickly. So in the early years of any new technology’s growth, it often looks like the proprietary technology has a decisive advantage (think AOL in 1993). Then, as the technology begins to mature, the disadvantages of closed technologies become apparent. They can only grow and evolve as fast as their owners can manage them, and as their owners get larger and more bureaucratic (think AOL in 1998), these platforms begin to stagnate. Meanwhile, the open alternatives, which are not held back by centralized management, continue growing rapidly, equalling and then quickly surpassing the closed competitors. Finally, the open platform’s lead gets so large that the closed platform, facing a choice between interoperability or irrelevance, is forced to open itself up to the competition (think AOL in 2003).

And once an open platform has become firmly established, proprietary firms stop trying to dislodge it. Instead, they try to build new proprietary technologies atop the underlying open architecture. Mac OS X, for example, is a thin layer of proprietary software atop a big stack of open technologies. Similarly, Facebook is a thin layer of proprietary code atop a lot of open Internet technologies. But that means that even as a company is trying to establish the dominance of their new, proprietary platform, they’re reinforcing the primacy of the underlying open architecture. Which means that that open architecture remains available to be built on further by anyone who cares to do so. And that, in turn, ensures that the process I described in the previous paragraph can begin again at another layer of the software stack.

What happens, though, is that every time this process begins in a new sector of the economy—online access in the mid-1990s, e-commerce in the late 1990s, broadband access in the early 2000s, social networking today—folks on the left-hand side of the political spectrum begin wringing their hands about how this time is going to be different. Sure, open platforms have had a good run so far, but now the big, bad corporations are going to take over. And because the specific names and technologies are different each time, it’s always possible to come up with a theory about the specific developments that will bring about the triumph of the walled gardens. Their warnings invariably turn out to be overblown, but by the time it becomes clear that the last round of predictions were wrong—Lessig’s 1999 predictions about e-commerce destroying the Internet, say—there’s a new threat to obsess over.

This, incidentally, is why it annoys me so much when libertarians denigrate the value of open platforms. The triumph of open architectures over the last couple of decades has been a vindication of the Hayekian idea of spontaneous order. AOL tried to build a closed, centrally-planned network, and it got clobbered by the Internet for precisely the same reasons that the US economy outperformed the Soviet economy: central planners lack sufficient information to satisfy the needs of millions of users with diverse needs. The answer to the Lessigs and Zittrains of the world isn’t that open systems are bad. It’s that precisely because open systems are good, they’re unlikely to be dislodged by closed systems in the marketplace. Even when the structure of the market is far from ideal, as it is, for example, in the broadband duopoly, the open architectures have turned out to be far more robust than anyone expected.



So, the "walled gardens" in the first article are indeed built upon a system of open standards -- WiFi, GSM/HSDPA, etc. Will we devolve to a battle of walled gardens? Or, is there some other business model that is waiting in the wings to clobber Apple, Amazon, etc?

AT&T and LTE

There were some interesting bits in this article that I think are worth highlighting:

The upgrade to 7.2 Mb/s [HSPA] along with the addition of new HSPA data carriers and removing choke points in the backhaul network will ease many of those problems. But the advent of LTE in 2011 will provide the ultimate antidote. Not only will AT&T be able to deliver far more capacity over the new network, it will be able to deliver it much more efficiently and cheaply. Rinne estimated that the cost of delivering a megabit per second of capacity over LTE was just 3% of the cost of delivering that same megabit on an EDGE network, compared to 14% of EDGE’s cost on the HSPA network.

Rinne does not define what "cost" is here. I suspect it is spectrum use rather than capital and operating cost, but it is hard to know for sure.
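Taking the quoted percentages at face value (and keeping the undefined "cost" caveat in mind), the implied ratios look like this:

    edge = 1.0          # cost of delivering 1 Mb/s on EDGE, normalized
    hspa = 0.14 * edge  # same megabit at 14% of EDGE's cost
    lte = 0.03 * edge   # same megabit at 3% of EDGE's cost

    print(hspa / lte)   # ~4.7x: LTE's claimed per-megabit advantage over HSPA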

09 September 2009

Open source and marketing

This article over at Ars Technica is interesting. Apparently, ipoque, a manufacturer of "Deep Packet Inspection" (DPI) equipment, has released the code for parts of its key inspection engines to reassure the public that the code does not include the ability to store personal information associated with users, which has been a concern voiced about these technologies.

The interesting thing about this article for me was the tradeoff made by ipoque between the value of their intellectual property and the value of the PR they would gain, followed, they must have presumed, by increased sales.

As the article in Ars points out, privacy isn't the only concern; others are concerned about DPI from the perspective of bandwidth caps for certain applications, which would be enabled by this technology.

The principle of "good enough" and telecom networks

This article at the PFF website caught my attention. In the article, Adam Thierer, in reflecting on the recent Gmail outage, applies the ideas of this article from Wired to telecom. The telecom network has been engineered (at high cost) to 99.999% (i.e., "five nines") reliability; the question is whether this quality level is anywhere close to what is demanded by the market.
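For reference, each added "nine" cuts the allowable downtime tenfold; a quick sketch of the annual downtime budgets:

    MINUTES_PER_YEAR = 365.25 * 24 * 60

    for nines in range(2, 6):
        availability = 1 - 10 ** -nines  # 99%, 99.9%, 99.99%, 99.999%
        downtime_min = MINUTES_PER_YEAR * 10 ** -nines
        print(f"{availability:.5f} -> {downtime_min:.1f} minutes/year")

Engineering out those last few minutes per year is what makes "five nines" so expensive, and it is exactly the cost that the "good enough" argument questions.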

In some sense, we have a test case in that we are willing to consume different feature sets in telecom at different prices. Wireline telephony is the most reliable with the highest voice quality at price $x; mobile telephony is less reliable, has lower voice quality (with mobility) and is offered at price $y; and VoIP probably has less reliability and lower quality than either, at a lower price ($z). As a note, I don't think it is fair to say that $z=0, because we do pay for internet access and the computer that runs the VoIP software.

So is there only a marginal demand for quality, which might partially explain why wireline access lines are on the decline? Or is it strictly due to the substitution of mobile for wireline access?

26 August 2009

Ofcom consultation on BT's NGN

This report is very interesting. In it, the UK regulator Ofcom responds to changes in BT's investment plans for its network upgrade. In the process, the consultation does a good job of laying out the technology and regulatory implications not just for consumers but also for competitors. It is a bit technical but well worth the read if you're interested in NGNs.

24 August 2009

Apple, AT&T and network capacity

I have written about this before (see this), so this article in BusinessWeek is an interesting follow-up. The article highlights several things that I find worth noting:

  • The relationship between product development and service development. In the wireless market, handsets are provided by companies that do not provide services. Especially in the case of the iPhone, Apple made a considerable investment in developing a user experience and singular branding that required investment from AT&T to implement.
  • The differing investment life cycles of products, software and infrastructure services. Apple has gone through at least three generations of its hardware and more than three generations of its software (if for no other reason than to break jailbroken phones). In the same time period, AT&T has still been implementing its original network technology (HSDPA/HSPA) throughout its network. The cost and complexity of delivering infrastructure support is far higher than for the product that users directly see and interact with.

Google voice and the iPhone

The dust-up around the Google Voice iPhone application has been fun to watch (see this for an update). There are many things going on here, including (possibly) AT&T's involvement (which both it and Apple deny). I found this item over at TechCrunch an interesting analysis of Apple's response to the FCC's inquiry.

I have to agree that it seems largely about Apple's control over users. AT&T would get either the data traffic or the voice traffic and, unless there is a serious difference in the profitability of one service over another, they should be largely indifferent.

Welcome to life in Apple's walled garden.

Update: The plot thickens ... according to this item, AT&T and Apple did have an agreement regarding VoIP, but it did not cover third party apps.

21 August 2009

Termination charges in mobile systems

This post discusses a paper by economist Sandy Levin comparing Wireless Party Pays and Calling Party Pays (CPP). The paper rightly concludes that CPP systems require ongoing regulation because there is no competition for termination charges, so they can get quite high. High termination charges have been the subject of EU regulation as well as complaints by the US Trade Representative.

Future of WiMAX

I am not quite as pessimistic about WiMAX as this item is. Instead, I think it will serve a role as a wireline replacement technology rather than as a competitor for LTE.

USDA report on rural broadband

This report from the US Department of Agriculture seems as though it will be worth reading. I suspect that the timing of the report is no accident, coming as it does as the FCC is in the midst of developing a broadband policy for the US. From the report summary:

Analysis suggests that rural economies benefit generally from broadband availability. In comparing counties that had broadband access relatively early (by 2000) with similarly situated counties that had little or no broadband access as of 2000, employment growth was higher and nonfarm private earnings greater in counties with a longer history of broadband availability.

By 2007, most households (82 percent) with in-home Internet access had a broadband connection. A marked difference exists, however, between urban and rural broadband use—only 70 percent of rural households with in-home Internet access had a broadband connection in 2007, compared with 84 percent of urban households. The rural-urban difference in in-home broadband adoption among households with similar income levels reflects the more limited availability of broadband in rural settings.

Areas with low population size, locations that have experienced persistent population loss and an aging population, or places where population is widely dispersed over demanding terrain generally have difficulty attracting broadband service providers. These characteristics can make the fixed cost of providing broadband access too high, or limit potential demand, thus depressing the profitability of providing service. Clusters of lower service exist in sparsely populated areas, such as the Dakotas, eastern Montana, northern Minnesota, and eastern Oregon. Other low-service areas, such as the Missouri-Iowa border and Appalachia, have aging and declining numbers of residents. Nonetheless, rural areas in some States (such as Nebraska, Kansas, and Vermont) have higher-than-expected broadband service, given their population characteristics, suggesting that policy, economic, and social factors can overcome common barriers to broadband expansion.

20 August 2009

Availability in current cloud computing services

This article reports research performed in Australia looking at the availability of cloud services provided by Amazon, Google and Microsoft. Here is how the experiment was set up:

The team of researchers, led by the University of New South Wales (UNSW) and in collaboration with researchers at NICTA (National ICT Australia) and the Smart Services Cooperative Research Centre (CRC), have spent seven months stress testing Amazon's EC2, Google's AppLogic and Microsoft's Azure cloud computing services.

The analysis simulated 2000 concurrent users connecting to services from each of the three providers, with researchers measuring response times and other performance metrics.



Here are some things they found:

Response times on the service also varied by a factor of twenty depending on the time of day the services were accessed, she said.

The response times collated in Sydney were tested against measurement instruments loaded onto the cloud platform to isolate whether delays were attributable to the service itself or the latency involved with accessing US-based data centres from Australia.


and
"None of the platforms have the kind of monitoring required to have a reasonable conversation about performance," she said. "They provide some level of monitoring, but what little there is caters for developers, not business users. And while Amazon provides a dashboard of how much it is costing you so far, for example, there is nothing in terms of forecasts about what it will cost you in the future."

19 August 2009

Network economics and universal service

I found this item over at CircleID interesting. In it, the author discusses the penalty of non-inclusion (see the graph below) and uses it to make an argument in favor of universal service.



This brought to mind some of Eli Noam's seminal work (this article, for example), in which Noam outlines the incentives people have to leave a universal network (see the graph below, from the paper). We are thus left with conflicting incentives, which serve to underscore the difficulties in achieving universal service.

Linux development report

I came across this report by way of this article in Ars Technica. It brought to mind the paper I wrote with a student a decade or so ago, in which we looked at who was developing 10BaseT standards and who was profiting from them (i.e., free ridership); it was published in the now-defunct ACM StandardView.

There is a lot of interesting data in this report if you're interested in studying FLOSS (e.g., Linux). Apropos of the free-ridership article is this graph, which I extracted from Table 10 of the report.
It is interesting that the largest single class of contributors is those claiming no commercial affiliation, hence people who don't directly profit from their effort. The graph does seem to exhibit a "long tail" character ...
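
As a toy illustration of what "long tail" means here, one can sort contributor classes by share and watch how slowly the cumulative total approaches 100%. The shares below are made up for the illustration; they are not the figures from Table 10.

    # Hypothetical contribution shares by affiliation class (percent).
    # NOT the report's Table 10 figures; just an illustration of a long-tailed ranking.
    shares = [18.2, 12.3, 7.6, 6.1, 4.8, 3.9, 3.0, 2.4, 2.0, 1.7] + [0.8] * 50

    total = sum(shares)
    cumulative = 0.0
    for rank, share in enumerate(sorted(shares, reverse=True), start=1):
        cumulative += share
        if rank in (1, 5, 10, 30, len(shares)):
            print(f"top {rank:>2} classes: {100 * cumulative / total:5.1f}% of changes")

A ranking is long-tailed when, as here, the head classes account for a large share but the last few percent are spread across dozens of small contributors.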

18 August 2009

LTE testing in the US

If you're interested in technology migration in the wireless industry, you might find this item over at Ars Technica interesting. The article reports on this news release:

Verizon Wireless today completed its first successful Long Term Evolution (LTE) fourth generation (4G) data call in Boston based on the 3GPP Release 8 standard; the company also announced today that it had earlier completed the first LTE 4G data call based on the 3GPP Release 8 standard in Seattle. The successful data calls involved streaming video, file uploads and downloads, and Web browsing. Significantly, Verizon Wireless has successfully made data calls using Voice over Internet Protocol (VoIP) to enable voice transmissions over the LTE 4G network.

----------SNIP-------------

Boston and Seattle each now have 10 LTE 4G cell sites up and running on the 700 MHz spectrum. These LTE 4G markets were selected by network planners due to their geographic configuration of suburban and urban areas as well as the areas’ high-technology population. The trials will help Verizon Wireless and its LTE 4G network partners understand issues that include how to best prepare cell sites and how to add the new technology to the network.


Surely part of Verizon's interest in LTE is that it provides a bridge to the GSM ecosystem, which it now lacks.

Regarding the competition with WiMAX in the race to 4G, Ars observed:

The announcement also made one of LTE's advantages over WiMax clear: a number of traditional wireless telecom powers were backing it. The tests' description read a bit like a who's who of the cellular world. Network equipment came from Starent Networks and Nokia Siemens Networks, Alcatel-Lucent and Ericsson provided the base station hardware, and devices were provided by LG and Samsung.

But a key factor may ultimately wind up being bank balances. Verizon has continued to grow its earnings throughout the financial crisis, and wireless services account for nearly 90 percent of its income; it can't afford to appear as an also-ran, and has the money to make sure that it doesn't. Clearwire benefits from the deep pockets of its backers, most notably Intel, and has nearly $2.5 billion in the bank, according to its recent earnings release. But, at its current rate of operating losses, that cash will last it less than three years.


In other words, the outcome may have little or nothing to do with the technical merits of one technology versus the other, but rather with the ability to sustain the technological conversion. This reveals one of the essential features of telecom: large capital investments are required before revenue can be earned, giving incumbents a powerful advantage.
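
The runway arithmetic behind that "less than three years" claim is simple. With cash $K \approx \$2.5$B and an annual burn of $B$, the runway is

$$ T = \frac{K}{B}, $$

so $T < 3$ years implies $B > K/3 \approx \$0.83$B per year in operating losses (my inference from the quoted figures, not a number from the article).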

Here is a related article from GigaOM.

17 August 2009

Economics of content on the web

I only follow this topic in a casual way, but I found this article to be interesting, especially given the challenges being faced by traditional news organizations. Quoting the article:
The vast majority of the value gets captured by aggregators linking and scraping rather than by the news organizations that get linked and scraped. We did a study of traffic on several sites that aggregate purely a menu of news stories. In all cases, there was at least twice as much traffic on the home page as there were clicks going to the stories that were on it. In other words, a very large share of the people who were visiting the site were merely browsing to read headlines rather than using the aggregation page to decide what they wanted to read in detail. Obviously, this has major ramifications for content creators’ ability to grow ad revenue, as the main benefit of added traffic is the potential for higher CPMs.

So, as always, the big question is how to get the incentives right so that people can be compensated for creating valuable content.

14 August 2009

Broadband carriers and government funding

This article is interesting. According to the article:
As the Aug. 20 deadline nears to apply for $4.7 billion in broadband grants, AT&T, Verizon and Comcast are unlikely to go for the stimulus money, sources close to the companies said.

Their reasons are varied. All three say they are flush with cash, enough to upgrade and expand their broadband networks on their own. Some say taking money could draw unwanted scrutiny of business practices and compensation, as seen with automakers and banks that have taken government bailouts. And privately, some companies are griping about conditions attached to the money, including a net-neutrality rule that they say would prevent them from managing traffic on their networks in the way they want.

While it is quite possible that some of the rules, such as "network neutrality," may affect them anyway, it is clear that the carriers felt that the costs of participating in this program outweighed the benefits. A significant part of their concern is related to uncertainty about the consequences of an irreversible commitment. Thus, it seems an apt subject for a real options analysis.

Doing such an analysis rigorously would be challenging since the uncertainty is not easily quantifiable. But clearly carriers have concluded that the high probability of a modest upside does not outweigh the uncertain probability of a potentially large downside.
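
A back-of-the-envelope version of that trade-off shows how even a small probability of a large downside can flip the expected value negative. All of the figures below are illustrative assumptions, not data from any carrier.

    # Toy expected-value calculation for accepting the grant money.
    # Illustrative numbers only; not a rigorous real options valuation.
    upside = 100e6      # assumed net benefit if the grant works out (USD)
    downside = 2e9      # assumed cost if the attached conditions prove onerous (USD)
    p_upside = 0.9      # assumed probability of realizing the upside

    for p_downside in (0.001, 0.01, 0.05, 0.10):
        ev = p_upside * upside - p_downside * downside
        print(f"P(downside) = {p_downside:5.1%}  ->  EV = {ev / 1e6:+8.1f} $M")

With these numbers, a downside probability of just 5% is enough to make the expected value negative. A full real options treatment would also price the value of waiting, but even this sketch shows why an irreversible commitment with a fat-tailed downside is unattractive.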

10 July 2009

Sprint and Ericsson

This item, which reports that Sprint is essentially outsourcing its network operations to Ericsson, is interesting. Given its challenges in the wireless industry, Sprint is innovating in business models. First we saw the 4G deal with Clearwire; now we see this deal. So, Sprint has kept some of its strategy (though it has limited control in the 4G space due to the Clearwire deal). It has kept its capital investment and spectrum. And it keeps control over the brand and the customer interface. I also assume that it keeps some degree of control over network engineering, though that begins to butt up against operations in some cases.

Also interesting to me is that they laud Ericsson's expertise, which it undoubtedly has in GSM networks. How does that translate to Sprint's CDMA/WiMAX combo?

This will be interesting to watch. It could portend a shift in the industry.

23 June 2009

Transatlantic telecom capacity

You might find this item from GigaOm interesting. If this projection is correct, we will almost certainly be paying more for transatlantic capacity.

18 June 2009

iPhone and network capacity

This article, which describes how the iPhone is straining AT&T's wireless network, is interesting. It further illustrates the close relationship between carriers, content, and network devices that I blogged about here. As the article shows, reducing handset prices increases network usage (and revenues) and also increases the use of social networking applications. It seems likely that there is some revenue sharing going on to facilitate this ...

10 June 2009

Wireless carriers and applications

This article from GigaOm is interesting. Basically, the article explains that carriers are benefiting from consumer interest in social networking, which has migrated to wireless platforms. Interestingly, this was not an application carriers had in mind when they made investments in 3G infrastructure. How will carriers justify their upcoming investments in 4G?

The nature of telecommunications is that large investments must be made before revenues can be realized. The telecom network has to be largely built out before it becomes valuable to consumers. As a result, investments have traditionally been conservative and tied to applications. The telephone network was tied to voice communications and associated services, etc. A significant exception was the Internet, which was built with no particular application in mind, but then it was also built with government funding. It wasn't privatized until a commercially interesting application set emerged (email and the web).

The phenomenon that this article describes indicates that unexpected applications (social networking) are a significant new revenue source for carriers. Will this change their investment model? The other thing that is addressed, though not explicitly, is that these applications require collective action from independent entities whose interests are partly common and partly at odds. Application providers need broadband and want it to be cheap. Social networking works better with smart phones, which are more costly than "regular" ones. So manufacturers need joint marketing agreements with carriers, yet they also want users to be able to control applications and configuration.

If high-revenue end-user applications can no longer be predicted or managed by telecom carriers, how will their investments be justified? It seems that carriers need a strategy and a set of tools for managing investment risk. Physical commodities have futures markets for this purpose.

One of the reasons why I am interested in markets for capacity (most recently spectrum) is that derivatives (such as futures) are possible, which allow for industry restructuring. In the absence of explicit risk management, carriers will integrate applications with carriage, which leads to "network neutrality" concerns.
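
To see why futures help, consider a carrier planning to sell quantity $q$ of capacity at an uncertain future spot price $S$. Adding a short futures position at price $K$ gives revenue

$$ qS + q(K - S) = qK, $$

locking in $qK$ regardless of where the spot price lands. That is the kind of explicit risk transfer that capacity and spectrum markets could enable.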

05 June 2009

Asia-Pacific Gateway fiber cable

Following my interest in collecting information on undersea fiber cables, here is the latest installment, the Asia-Pacific Gateway. According to the article,

Targeted to be ready for service in 2011, the 8,000 kilometer network with a minimum design capacity of four terabits per-second ... the proposed fiber optic network will offer alternative communication routes and nodes. In the event of accidents or subsea earthquakes, it will minimize the impact of a breakdown in services provided by existing regional fiber optic network ...

The APG network will connect Japan, Korea, mainland China, Taiwan, Philippines, Hong Kong, Vietnam, Thailand, Malaysia and Singapore. The signatories to the agreement for the development of the gateway are China Telecom, China Unicom, Chunghwa Telecom, NTT Communication (Japan), Vietnam Post and Telecommunications (VNPT), Korea Telecom (KT), Philippines Long Distance Telephone (PLDT) and Telekom Malaysia. The telcos will jointly finance and own the APG network.



Phone line shrinkage at AT&T

I have blogged before about the decrease in access lines (see this, for example). While it is not surprising given the increase in wireless-only households, this article over at GigaOm shows that the decline for AT&T has been in the range of 6% per year, not 3%.
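
To put a 6% annual decline in perspective: if lines decay as $L(t) = L_0 (0.94)^t$, the installed base halves in

$$ t_{1/2} = \frac{\ln(1/2)}{\ln(0.94)} \approx 11.2 \text{ years}, $$

versus roughly 22.8 years at a 3% decline. That difference matters enormously for depreciation schedules.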

As the article correctly points out, this is one of the reasons that the large ILECs have been aggressive in rolling out their broadband infrastructures. Since consumers are increasingly opting for wireless for voice, the only way the ILECs can continue to receive a share of consumers' communications expenditures is to build out broadband, which enables them to compete with the cablecos for television and Internet access expenditures.

If they don't, they have to depreciate their infrastructure at a faster rate than consumers are leaving it, or else investors (the company's owners) will be left holding the bag. Of course, this is an end-game they would play only if they decided to cede the marketplace to other access providers. There is no sign that the ILECs are interested in that strategy!

03 June 2009

Net Neutrality in the UK

I found this article over at Ars Technica interesting and worth reading. It describes an ongoing "discussion" between the carrier BT and the content provider the BBC. BT is using tier-based categories to implement price discrimination. While this is normal in telecom (and arguably necessary, since these industries have marginal costs below average costs; see the sketch below), the challenge is that its consequence (bandwidth limitation, in this case) affects the business of content providers (in this case, the BBC).
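
The parenthetical point about costs is worth making explicit. With a large fixed cost $F$ and a constant marginal cost $c$ per unit of traffic carried, average cost is

$$ AC(q) = c + \frac{F}{q} > c = MC $$

for any finite volume $q$, so pricing at marginal cost can never recover $F$. Some form of price discrimination, such as tiers, is one standard way to close the gap.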

What is interesting about this is that there is a problem of joint action by parties with competing interests. BT has to make substantial investments to enable access to broadband content, so it is hard to argue against its right to earn a reasonable return on that investment. At the same time, BT needs content providers (like the BBC) to develop compelling programming so that users will find it worthwhile to pay for the investment. The BBC wants its programming available to as many viewers as possible (to maximize its returns) at the highest possible quality level.

So, BT wants broadband prices on the higher side to maximize its benefits, whereas the BBC wants them on the lower side to maximize its own. It appears that BT has set its lowest tier such that viewers can see BBC content at its basic quality level. Higher quality levels (which require more bandwidth) require a higher monthly fee.

While I am not an economist, this seems to be a bargaining problem of the kind discussed by Oliver Williamson (and Ronald Coase, for that matter). Thus, we see either an allocation of rights and costs that is socially efficient (Coase, in the absence of transaction costs), vertical integration (Williamson, in the presence of high transaction costs), or government intervention (to reduce transaction costs).

04 March 2009

We should have seen this coming ...

It seems that many FCC decisions are challenged in appeals court, so this one, challenging the "white spaces" decision, shouldn't come as a surprise. This could turn into an entertaining "food fight," though, pitting the powerful National Association of Broadcasters (NAB) against the well-connected White Spaces Coalition.

On a related note, you might find this editorial by Tom Hazlett interesting. Hazlett writes:

The new Obama Administration got Congress to fast-track an 11th-hour delay, pushing back the mandatory analog switch-off until June 12, 2009.

But many TV stations wanted to cut their analog broadcasts anyway. In truth, they believe that over-the-air transmissions are a waste, not worth the electricity it takes to send them. Hence, some 420 TV stations pulled the plug last week, joining another 200 analog stations that had already signed off.


Later in the article, he writes:
One hundred million households now pay $600 or so per year to avoid [over the air broadcasting], subscribing to cable or satellite. Well over 90 per cent of TV viewing takes place in households opting out of broadcast delivery. And for a very small additional investment – no more than $3bn – every last rabbit-eared home in America could join them.

Yet, the US is subsidizing off-air receivers; $1.5bn has been allotted for digital set-top converters (two $40 vouchers per family), and the Obama “stimulus” pumps in $650m more. This is not merely money down the drain. In extending life-support to DTV signals that hog hugely valuable frequencies, consumers lose hundreds of billions worth of wireless service. The bandwidth available to iPhones, Blackberrys and GPhones and other emerging technologies would double were TV air waves to accommodate mobile apps as requested in 1985.

Why don’t all broadcasters – digital and analog – just unplug? First, their licenses mandate that they broadcast TV signals, and they cannot legally sell the air waves used for more valuable services. And second, because their signals continue to reach one key target audience: Congress. Those appreciative entertainment fans reward stations with “must carry” rights forcing cable and satellite operators to provide their subscribers all local channels. The off-air transmission is a side show; gaining free cable carriage the main event.


If you agree with Hazlett's viewpoint, the NAB's challenge to the white spaces decision is consistent with a "do whatever it takes" strategy to preserve the existing business model, regardless of whether it is viable in the long term.

02 March 2009

New spectrum license fee in the US?

Spectrum license fees are one way to encourage spectrum users to use spectrum efficiently. The trick is to set the fees properly: if they are set too high, spectrum will be underutilized relative to its social benefit; if they are set too low, they do not accomplish efficient utilization. On page 126 of his recent budget proposal, the President proposes increasing revenues from spectrum license fees from US$50M to US$550M over four years, a 1000% increase.
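
As a quick check, those endpoints are indeed consistent with the stated percentage:

$$ \frac{550 - 50}{50} = 10 = 1000\%. $$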

There is little but speculation about which spectrum will be taxed, since the Office of the President is apparently saying little about this. The PFF has remarked that, if applied to mobile carriers (which have already paid for their licenses through auctions), the tax is quite regressive. According to this item, the fee may apply just to broadcasters. If so, it will help motivate broadcasters to refocus on being programming producers (and possibly aggregators), since it will cost them ever more to transmit their product to an ever-diminishing audience. Michael Marcus just posted this related item.

24 February 2009

A couple of telecom-related URLs

I need to break the "radio silence" ... I've been pretty busy with administrative projects and travel, so blogging has taken a back seat.

I have been working on dynamic spectrum access (DSA) for a number of years now. A paper on one of my projects, written with Arnon Tonmukayakul, is coming out in Netnomics shortly. In addition, I presented a paper on DSA and the FCC White Spaces decision at the recent iConference. Hopefully, the papers will be made available soon.

So, given this, I found this site to be of interest. You can use it to find the TV white spaces at a particular address.

If you go back through this blog, you'll find that I have been interested in (OK, critical of) comparative studies of broadband penetration. So I found this site interesting: it looks at broadband from the point of view of connectivity rather than penetration. From this perspective, the rankings (for what they're worth) look quite different.

06 January 2009

What I did over Christmas break


[Photo: DSC01086, originally uploaded by smpl5]
While this is not telecom-related, I thought you might enjoy the picture, taken on Christmas Day during a blizzard in Park City. We had over 2 feet of fresh "Utah Lite" that day, so it was skiing deep powder in poor visibility and high winds. The ice beard testifies to the conditions.