30 September 2009

For techno-economic study geeks only

If you like to do cost studies, then adjusting for inflation is critical. This website has inflation adjustment factors from 1774 to 2016 (obviously, the factors for future years are projections). So have at it and start modelling!
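
As a quick illustration of how such factors get used, converting a historical cost into current dollars is just a ratio of index values. A minimal Python sketch (the index values below are illustrative placeholders, not figures from that site):

```python
# Hypothetical price-index values -- illustrative placeholders only;
# for a real study, pull the actual factors from an inflation table.
price_index = {1990: 130.7, 2009: 214.5}

def adjust_for_inflation(cost, from_year, to_year, index=price_index):
    """Express a cost incurred in from_year in to_year dollars."""
    return cost * index[to_year] / index[from_year]

# A $1,000 cost from 1990, expressed in 2009 dollars:
print(round(adjust_for_inflation(1000, 1990, 2009)))  # -> 1641
```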

The Iridium saga continues

Iridium has been an interesting case study in the telecom industry. This article points out that there is still a business model for its LEO satellite-based system. Without going into details, Iridium was designed to be a system that would enable people to be in touch anywhere. It was developed prior to the widespread deployment of terrestrial mobile systems (e.g., GSM) and when voice was the only application of interest. By the time it was deployed and open for business, this had all changed: terrestrial mobile systems satisfied many of the needs for which Iridium was developed, and offered diverse handsets, data services and indoor coverage that were not available from Iridium. Its fortunes plummeted, and it was purchased on the cheap in bankruptcy. Today, its 347,000 subscribers generate annual revenue of $1.2 billion (an annual revenue per user of about $3,460), so motivated users (mostly government) clearly exist. I wonder if the new satellites will have data capability ...
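
As a back-of-the-envelope check of that revenue-per-user figure:

```python
revenue = 1.2e9     # annual revenue, dollars (as reported)
subscribers = 347000
print(f"ARPU: ${revenue / subscribers:,.0f}/year")  # -> ARPU: $3,458/year
```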

Update (7 Oct 2009): BusinessWeek posted this article on Iridium which covers much of the same ground as the previously cited article, but with more analyst insight.

29 September 2009

How open is open source Android?

I have found the Android project to be interesting to follow. There have clearly been some successes in using open source for product development (Linux, obviously, but also Apache). The success of open source seems to be a bit uneven, however; for example, I am not aware of a huge development community around Solaris, which was open-sourced by Sun. So when Google launched Android as an open source development project, I was most interested, especially given Google's market clout and user following.

So this item really caught me by surprise. I especially found this paragraph interesting (and a bit counter-stereotypical):
Google, however, appears to be significantly less permissive on this front than Microsoft. The company's legal department objects to the Cyanogen mod on the basis of its inclusion of Google's proprietary software. They sent Kondik a cease and desist order compelling him to remove the mod from his Web site. The Android enthusiast community has responded fiercely, condemning Google for taking a heavy-handed approach. Even Google's own Android team appears to be frustrated with the legal department's zeal.
I fully expect that some bright and committed programmers will find a work-around. The whole reason for the mod, of course, is that Google's own ROM doesn't work as well. Isn't that how open source is supposed to go? Has Google discovered the hazard of this approach? I wonder whether they'll encounter similar problems when Chrome OS goes open source.

25 September 2009

Microsoft study places value on white spaces

I found this article interesting.

The study, by consultant Richard Thanki of Perspective Associates, suggests that by augmenting current unlicensed wireless networks, such as Wi-Fi hot spots, the white spaces could generate between $3.9 billion and $7.3 billion in value annually over 15 years. That would be the result of increased use of consumer electronics and other factors, according to the study.

In his study, Thanki writes that white spaces spectrum offers a broader range than a typical Wi-Fi connection. A single Wi-Fi access point enhanced by the white spaces could "fully cover a large building and the neighboring grounds and areas," he writes.

In addition, use of the white spaces could lower the cost of providing Internet access in rural areas, Thanki writes.

While chips used to power white spaces devices would initially cost roughly $10 more than existing technologies in 2012, the difference should steadily decline at a rate of about 30% annually from that point on, according to the study.
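
If I read that correctly, the premium decays geometrically: $10 x 0.7^(n - 2012) in year n. A quick sketch of that trajectory (my interpretation of the claim, not the study's actual model):

```python
# White-spaces chip cost premium: $10 in 2012, declining ~30%/year.
# Geometric decay is my reading of the claim, not the study's model.
initial_premium, decline = 10.0, 0.30

for year in range(2012, 2017):
    premium = initial_premium * (1 - decline) ** (year - 2012)
    print(f"{year}: ${premium:.2f}")
# 2012: $10.00, 2013: $7.00, 2014: $4.90, 2015: $3.43, 2016: $2.40
```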


I would love to get a copy of that study ...




Economic value of unlicensed spectrum

Here's a new item for my reading list. This isn't the first analysis of its kind (search for Bill Lehr's paper, for example) and it undoubtedly won't be the last. I haven't read the paper in detail yet, but a quick scan indicates that it suggests more spectrum be allocated to unlicensed use (whether "dedicated" unlicensed or through white space devices). In our study of secondary use, we determined that adding unlicensed spectrum tends to favor applications and systems with small geographic reach, since the contention from other unlicensed users is lower. Most of these studies do not differentiate application types; I'll be interested to see if this one does.

24 September 2009

Facts about broadband provision

Today's WSJ had an op-ed by Holman Jenkins. In it, he claimed the following:

Two-thirds of Comcast's new broadband subscribers signed up in a recent quarter were defectors from DSL. "Churn" is the biggest challenge to broadband profitability, especially as competition drives down margins. According to Arbor Networks, the cost of fielding a single call to customer service can wipe out three years' profitability for a customer's broadband account.

18 September 2009

Open platforms and walled gardens

These paragraphs (from this article) caught my attention:

Despite AT&T’s success and optimism for the future, Rinne also offered a warning. She pointed to the checkered history of AOL, which stood on top of the Internet world in the late 20th century but saw its subscriber base collapse after the advent of home broadband. The Internet industry witnessed a tipping point of its own, which foreordained the broadband revolution and the collapse of the walled garden. As AT&T rides the cusp of the mobile Internet revolution, it has to be cautious it doesn’t fall into the same traps as AOL, Rinne said. And that means being open to new and numerous applications and business models. Rinne pointed to AT&T’s growing portfolio of embedded emerging devices, smartphones and connected PCs as an example of how AT&T has discarded the notion of the walled garden — and how it’s clearly enjoying the spoils.

AT&T may have discarded the walled garden itself, but ironically the key driver to its mobile Internet success hasn’t. Apple’s (NASDAQ:AAPL) iPhone is probably the ultimate culmination of the walled garden approach in mobile — one implemented elegantly and artfully but a walled garden nonetheless, where a single entity controls the platform and access to applications. Rather than reject the walled garden, consumers are flocking to it, frolicking happily within its confines. The iPhone isn’t the only example. The success of the Amazon (NASDAQ:AMZN) Kindle was built behind high topiary walls. Every book, newspaper and magazine downloaded to the Kindle comes from the selections offered at the Amazon store, which offers no access to hundreds of thousands of titles available across the Web from such sites as Google Books.

Rather than tearing down those walls, operators and application developers are erecting new ones. Android, Nokia (NYSE:NOK), Palm (NASDAQ:PALM) and Research In Motion (NASDAQ:RIMM) are launching centralized app stores. What’s different is the number and variety of app stores out there. The industry is abuzz with this idea that an open and expansive mobile Internet is in our future. But today, consumers don’t want expanse; they want alcoves — nests of innovative content. Maybe as tastes evolve consumers will gravitate toward truly open devices and they’ll develop the sophistication to hunt down the content, applications and services on their own. But for now they seem content to hop from one walled garden to another, and the winners in such a world will likely be the companies that can lay out the best garden plans.



To me, it stands in contrast to this item by Tim Lee at TLF:

The typical life cycle of a technology goes something like this. Technologies usually originate in the laboratory, where a small number of geeks invent them and explore their capacities. Once the time comes for commercialization, often proprietary versions are the first out of the chute because it’s easier to monetize closed platforms, and therefore to raise the capital necessary to deploy them quickly. So in the early years of any new technology’s growth, it often looks like the proprietary technology has a decisive advantage (think AOL in 1993). Then, as the technology begins to mature, the disadvantages of closed technologies become apparent. They can only grow and evolve as fast as their owners can manage them, and as their owners get larger and more bureaucratic (think AOL in 1998), these platforms begin to stagnate. Meanwhile, the open alternatives, which are not held back by centralized management, continue growing rapidly, equalling and then quickly surpassing the closed competitors. Finally, the open platform’s lead gets so large that the closed platform, facing a choice between interoperability and irrelevance, is forced to open itself up to the competition (think AOL in 2003).

And once an open platform has become firmly established, proprietary firms stop trying to dislodge it. Instead, they try to build new proprietary technologies atop the underlying open architecture. Mac OS X, for example, is a thin layer of proprietary software atop a big stack of open technologies. Similarly, Facebook is a thin layer of proprietary code atop a lot of open Internet technologies. But that means that even as a company is trying to establish the dominance of their new, proprietary platform, they’re reinforcing the primacy of the underlying open architecture. Which means that that open architecture remains available to be built on further by anyone who cares to do so. And that, in turn, ensures that the process I described in the previous paragraph can begin again at another layer of the software stack.

What happens, though, is that every time this process begins in a new sector of the economy—online access in the mid-1990s, e-commerce in the late 1990s, broadband access in the early 2000s, social networking today—folks on the left-hand side of the political spectrum begin wringing their hands about how this time is going to be different. Sure, open platforms have had a good run so far, but now the big, bad corporations are going to take over. And because the specific names and technologies are different each time, it’s always possible to come up with a theory about the specific developments that will bring about the triumph of the walled gardens. Their warnings invariably turn out to be overblown, but by the time it becomes clear that the last round of predictions were wrong—Lessig’s 1999 predictions about e-commerce destroying the Internet, say—there’s a new threat to obsess over.

This, incidentally, is why it annoys me so much when libertarians denigrate the value of open platforms. The triumph of open architectures over the last couple of decades has been a vindication of the Hayekian idea of spontaneous order. AOL tried to build a closed, centrally-planned network, and it got clobbered by the Internet for precisely the same reasons that the US economy outperformed the Soviet economy: central planners lack sufficient information to satisfy the needs of millions of users with diverse needs. The answer to the Lessigs and Zittrains of the world isn't that open systems are bad. It's that precisely because open systems are good, they're unlikely to be dislodged by closed systems in the marketplace. Even when the structure of the market is far from ideal, as it is, for example, in the broadband duopoly, the open architectures have turned out to be far more robust than anyone expected.



So, the "walled gardens" in the first article are indeed built upon a system of open standards -- WiFi, GSM/HSDPA, etc. Will we devolve to a battle of walled gardens? Or, is there some other business model that is waiting in the wings to clobber Apple, Amazon, etc?

AT&T and LTE

There were some interesting bits in this article that I think are worth highlighting:

The upgrade to 7.2 Mb/s [HSPA] along with the addition of new HSPA data carriers and removing choke points in the backhaul network will ease many of those problems. But the advent of LTE in 2011 will provide the ultimate antidote. Not only will AT&T be able to deliver far more capacity over the new network, it will be able to deliver it much more efficiently and cheaply. Rinne estimated that the cost of delivering a megabit per second of capacity over LTE was just 3% of the cost of delivering that same megabit on an EDGE network, compared to 14% of EDGE’s cost on the HSPA network.

Rinne does not define what "cost" is here. I suspect it is spectrum use rather than capital and operating cost, but it is hard to know for sure.
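
Whatever the metric, the quoted percentages also imply a large gap between LTE and HSPA; normalizing to EDGE:

```python
# Relative cost of delivering 1 Mb/s, normalized to EDGE = 1.0
# (percentages as quoted; "cost" is whatever Rinne's metric is).
cost_vs_edge = {"EDGE": 1.00, "HSPA": 0.14, "LTE": 0.03}

ratio = cost_vs_edge["LTE"] / cost_vs_edge["HSPA"]
print(f"LTE delivers a Mb/s at {ratio:.0%} of the HSPA cost")  # -> 21%
```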

09 September 2009

Open source and marketing

This article over at Ars Technica is interesting. Apparently, ipoque, a manufacturer of "Deep Packet Inspection" (DPI) equipment, has released the code for parts of its key inspection engines to reassure the public that the code does not include the ability to store personal information associated with users, a concern that has been voiced about these technologies.

The interesting thing about this article for me was the tradeoff ipoque made between the value of its intellectual property and the value of the PR it would gain (followed, it must have presumed, by increased sales).

As the article in Ars points out, privacy isn't the only concern; others are concerned about DPI from the perspective of bandwidth caps for certain applications, which would be enabled by this technology.

The principle of "good enough" and telecom networks

This article at the PFF website caught my attention. In the article, Adam Thierer, in reflecting on the recent Gmail outage, applies the ideas of this article from Wired to telecom. The telecom network has been engineered (at high cost) to 99.999% (i.e., "five nines") reliability; the question is whether this quality level is anywhere close to what is demanded by the market.
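
For perspective, five nines translates into a very small annual downtime budget; a quick back-of-the-envelope calculation:

```python
# Annual downtime allowed for a given availability level.
MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600

for nines in (3, 4, 5):
    unavailability = 10 ** -nines
    downtime = unavailability * MINUTES_PER_YEAR
    print(f"{1 - unavailability:.3%} -> {downtime:,.1f} minutes/year")
# five nines (99.999%) allows only ~5.3 minutes of downtime per year
```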

In some sense, we already have a test case, in that we are willing to consume different feature sets in telecom at different prices: wireline telephony is the most reliable and has the highest voice quality, at price $x; mobile telephony is less reliable and has lower voice quality (but adds mobility), at price $y; and VoIP probably has lower reliability and quality than either, at a lower price ($z). As a note, I don't think it is fair to say that $z = 0, because we do pay for Internet access and for the computer that runs the VoIP software.

So is there only a marginal demand for quality, which might partially explain why wireline access lines are on the decline? Or is it strictly due to the substitution of mobile for wireline access?