Showing posts with label ATT. Show all posts

28 September 2010

AT&T Long distance network, Sept 1898

This item, an 1898 map of AT&T's long distance network, is interesting. It dates from the days before vacuum tubes, so there were no active repeaters in the network. Since Pupin did not receive his patent on "loading coils" until 1900, it is likely that there were relatively few of these in the network when the map was drawn. In addition, local exchange competition, which began in 1893, would have been in full swing, and one of AT&T's strategies was to leverage the network effects of its growing long distance network. Indeed, such a map would have been valuable in marketing the benefits of AT&T's affiliates in competitive local markets.

(Shameless plug: learn more about this in my book!)

22 June 2010

Investment and demand in fiber to the home

There are a couple of threads that, juxtaposed, are interesting. First, Australia moved ahead with its ambitious fiber-based Internet project, and second, this report gives one pause as to whether it would make sense in the US.

So is the US foolish in not making larger investments in infrastructure like our trading partners, or is it wise?

The answer is not simple, and certainly not obvious, because of the enormous delay involved in rolling out infrastructure. AT&T witnessed this when it signed on as the exclusive carrier of Apple's ground-breaking iPhone. Indeed, it is arguably still trying to make the investments required by the runaway success of the device.

Would the same thing happen with some as-yet-unknown device (call it Device X), except this time with end-user bandwidth? We don't know if Device X exists, or, if it does, what its impact on existing networks would be. The iPhone has shown us that the potential exists, but how can carriers (and their shareholders) be convinced to make large investments based on revenue streams that are, at best, highly uncertain? Should the government take these risks?

19 May 2010

Google's Nexus One and unsubsidized handsets

I found this item over at the Technology Liberation Front interesting and worth reading. The article points out that the primary cause of Google's failed experiment is Americans' unwillingness to give up subsidies. This may well be largely true, but it fails to capture other aspects of the American telecom environment and this particular experiment that are also contributing factors.

My last two handsets have been unsubsidized and unlocked (by choice), and yes, one of them is a Nexus One. My carrier (AT&T) does not give me a discount for an unsubsidized handset, so I am paying the monthly price assuming I had a subsidized handset. In a sense, then, I am paying twice. I chose this route not for lower cost (obviously) but for flexibility. I am now free to use SIM cards of my choosing and I have a month-to-month contract, which I also like.
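For concreteness, here is a minimal sketch of the "paying twice" arithmetic. The handset prices are approximate launch figures, and the monthly rate is hypothetical; only the structure of the comparison matters.

```python
# The "paying twice" arithmetic: the monthly rate is the same whether or not
# the handset was subsidized. Handset prices are approximate launch figures;
# the monthly rate is hypothetical.
handset_full, handset_subsidized = 530, 180  # approx. Nexus One, unlocked vs. on contract
monthly_rate, months = 80, 24                # no discount for bringing your own phone

premium = handset_full - handset_subsidized  # extra paid up front for flexibility
total = handset_full + monthly_rate * months
print(f"Flexibility premium: ${premium} on a ${total} two-year outlay")
```

The premium buys only the flexibility described above: unlocked SIM slot and a month-to-month contract.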

The other aspect of the Google experiment is that consumers did not have the opportunity to "try before buy". The brick-and-mortar retail experience does allow this while the virtual one does not. For a complex product like a smart phone, user experience is key (a lesson that Steve Jobs taught us). User experience cannot be assessed via a web page, no matter how well designed it is.

27 April 2010

Apple, Google, AT&T and Verizon

This article over at Seeking Alpha is interesting. It describes the "small numbers bargaining problem" in telecom. Quoting the article:

Verizon has already shown resistance to putting the Apple iPhone on its platform for fear that it will use tremendous amounts of data without sharing any of the third party application profits with the carrier. Now VZ is beginning to play games with Google saying that they won’t pick up the Nexus One, planned to be released spring of 2010. Google loses access to the carrier’s more than 90 million users, and seems to have stumbled for the time being in becoming a major player in the mobile handset market.

AT&T has not picked up the phone either. But AT&T has more problems than just Google, they are trapped in a mutually hated relationship with Apple now, where neither party can get rid of the other. It really is the marriage from hell. Apple doesn’t have another carrier, and AT&T can’t throw Apple off for fear that its massive amount of iPhone users will defect from its completely inferior network. So AT&T is trapped having to provide Apple with more and more bandwidth, towers and other infrastructure, as the public and media scream at AT&T to get their network up to Verizon’s standards, not to even mention how well Sprint (S) works. AT&T is for sure frustrated that they have to make these capital expenditures and see no increased profit from them, it’s like bailing out the water from a ship with a 20 foot hole in the bottom. You’re just spending energy trying to stay afloat.

Something has to give here, this can’t go on forever, and I think we are soon to see a resolution to the issue of the telecom giants paying for the network on which Apple and Google make tremendous amounts of money. Maybe the telecoms chose the nuclear option and just stop building their networks holding Apple’s feet to the fire. Steve Jobs can’t revolutionize the content distribution market without a network to do it on, and believe me, the stuff that he wants to do is going to take a lot more bandwidth than is available today. Do we really think the carriers are going to pay for that to happen? Maybe Apple will buy a carrier, or perhaps even build its own network with a next generation technology they have been developing.


I wonder to what extent the independent LTE network proposed by Harbinger (see this) will be the event that shakes things up. In many ways, the situation described above seems to be an analog of the drama being played out over network neutrality: in both cases, infrastructure investments are required for applications that are not easily monetized by the infrastructure owners.

21 April 2010

AT&T and the Internet of Things

This item over at GigaOm is interesting. To me, beyond the numbers, the most interesting observation was this:
The irony here is that M2M connectivity in many ways represents the dumb pipe future that AT&T is so worried about — it’s not providing anything to its partners but the bits. On the call, AT&T executives explained that the number of bits sent via the network are high-margin bits and the machine-to-machine clients have very low churn.
Customer acquisition costs in wireless are quite high, so low churn is a rational cost management strategy. The observation is right that M2M clients would not turn over very fast, if at all.
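A back-of-the-envelope sketch makes the point about churn and acquisition costs concrete. All figures below are hypothetical, for illustration only.

```python
# Why low churn matters when customer acquisition costs (CAC) are high.
# All figures are hypothetical, for illustration only.

def expected_lifetime_value(monthly_margin, monthly_churn, cac):
    """With a constant churn rate, expected customer lifetime is 1/churn months."""
    return monthly_margin / monthly_churn - cac

# A consumer smartphone customer: decent margin, but churns often.
consumer = expected_lifetime_value(monthly_margin=25.0, monthly_churn=0.02, cac=350.0)

# An M2M client: thin margin per device, but rarely churns and is cheap to acquire.
m2m = expected_lifetime_value(monthly_margin=2.0, monthly_churn=0.002, cac=20.0)

print(f"Consumer lifetime value: ${consumer:.0f}")  # -> $900
print(f"M2M lifetime value:      ${m2m:.0f}")       # -> $980
```

On these (invented) numbers, a low-margin M2M device is worth as much over its lifetime as a high-margin consumer account, purely because of churn and acquisition cost.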

15 March 2010

iPhone usage and AT&T's network

This item over at GigaOm is interesting:
iPhone users tend to use their devices in the evening and on the weekends, reports Localytics, a Cambridge, MA-based start-up offering mobile analytics services. According to a study conducted by the company, the mobile app usage in the US peaks at around 9 pm EST on week days. Over the weekend, the usage is at its peak during afternoon and nights.

Since these times correspond (historically) with periods of lower voice traffic, why are there so many complaints about AT&T's network? Could it be that their network is only part of the problem (see this earlier post)?

30 December 2009

Challenges with phasing out landlines

This commentary over at GigaOm is insightful and worth reading. In it, Stacey Higginbotham reflects on this AT&T filing at the FCC. Some excerpts from the filing:
Over 99% of Americans live in areas with cellular phone service, and approximately 86% of Americans subscribe to a wireless service. Many of these individuals see no reason to purchase landline service as well. Indeed, the most recent data show that more than 22% of households have “cut the cord” entirely.

Demand for VoIP service – from both cable companies and over-the-top providers such as Vonage, Skype, and many others – is also booming. At least 18 million households currently use a VoIP service, and it is estimated that by 2010, cable companies alone will be providing VoIP to more than 24 million customers; by 2011, there may be up to 45 million total VoIP subscribers.

Today, less than 20% of Americans rely exclusively on POTS for voice service. Approximately 25% of households have abandoned POTS altogether, and another 700,000 lines are being cut every month. From 2000 to 2008, the number of residential switched access lines has fallen by almost half, from 139 million to 75 million. Non-primary residential lines have fallen by 62% over the same period; with the rise of broadband, few customers still need a second phone line for dial-up Internet service. Total interstate and intrastate switched access minutes have fallen by a staggering 42% from 2000 through 2008. Indeed, perhaps the clearest sign of the transformation away from POTS and towards a broadband future is that there are probably now more broadband connections than telephone lines in the United States

The public policy problem, of course, is what to do about the roughly 20% who still rely exclusively on POTS and would otherwise be left without connections. Further, what about those who use the copper PSTN plant for DSL access (whether from a telco or a reseller)?

It seems that the hard problem in public policy is dealing not with the majority but with the marginal cases.

22 December 2009

Test of 3G networks in the US

Gizmodo recently ran a test of 3G networks in the US. The results are here. AT&T had the fastest average upload and download speeds in the cities where the test was conducted. This is a bit of a surprise given the heat that the AT&T network has been taking recently from iPhone users. Gizmodo wonders whether it is the iPhone or some interaction between the iPhone and the AT&T network, and I can see why. These test results don't corroborate the belief that AT&T's network is (only) at fault.

12 October 2009

End of handset contracts?

I found this item over at CNet interesting. Apparently, AT&T's profitability point in the iPhone contract occurs rather late in the two-year contract cycle. Does this mean that carriers will shift their strategy away from captive contracts for handsets? If it isn't particularly profitable for them to do this, then presumably they would stop (absent other benefits).
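A toy calculation shows where the break-even point can fall. The subsidy and margin figures are hypothetical; only the shape of the math matters.

```python
# Back-of-the-envelope break-even for a subsidized handset contract.
# The subsidy and margin figures are hypothetical.

def breakeven_month(subsidy, monthly_margin, contract_months=24):
    """First month in which cumulative margin covers the upfront subsidy, else None."""
    for month in range(1, contract_months + 1):
        if month * monthly_margin >= subsidy:
            return month
    return None  # never recovered within the contract

# A $450 subsidy recovered at $25/month of margin breaks even in month 18,
# late in a 24-month contract.
print(breakeven_month(subsidy=450, monthly_margin=25))  # -> 18
```

A larger subsidy or thinner margin pushes break-even past the end of the contract entirely, which is the situation the CNet piece suggests AT&T is flirting with.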

18 September 2009

Open platforms and walled gardens

These paragraphs (from this article) caught my attention:

Despite AT&T’s success and optimism for the future, Rinne also offered a warning. She pointed to the checkered history of AOL, which stood on top of the Internet world in the late 20th century but saw its subscriber base collapse after the advent of home broadband. The Internet industry witnessed a tipping point of its own, which foreordained the broadband revolution and the collapse of the walled garden. As AT&T rides the cusp of the mobile Internet revolution, it has to be cautious it doesn’t fall into the same traps as AOL, Rinne said. And that means being open to new and numerous applications and business models. Rinne pointed to AT&T’s growing portfolio of embedded emerging devices, smartphones and connected PCs as an example of how AT&T has discarded the notion of the walled garden — and how it’s clearly enjoying the spoils.

AT&T may have discarded the walled garden itself, but ironically the key driver to its mobile Internet success hasn’t. Apple’s (NASDAQ:AAPL) iPhone is probably the ultimate culmination of the walled garden approach in mobile — one implemented elegantly and artfully but a walled garden nonetheless, where a single entity controls the platform and access to applications. Rather than reject the walled garden, consumers are flocking to it, frolicking happily within its confines. The iPhone isn’t the only example. The success of the Amazon (NASDAQ:AMZN) Kindle was built behind high topiary walls. Every book, newspaper and magazine downloaded to the Kindle comes from the selections offered at the Amazon store, which offers no access to hundreds of thousands of titles available across the Web from such sites as Google Books.

Rather than tearing down those walls, operators and application developers are erecting new ones. Android, Nokia (NYSE:NOK), Palm (NASDAQ:PALM) and Research In Motion (NASDAQ:RIMM) are launching centralized app stores. What’s different is the number and variety of app stores out there. The industry is abuzz with this idea that an open and expansive mobile Internet is in our future. But today, consumers don’t want expanse; they want alcoves — nests of innovative content. Maybe as tastes evolve consumers will gravitate toward truly open devices and they’ll develop the sophistication to hunt down the content, applications and services on their own. But for now they seem content to hop from one walled garden to another, and the winners in such a world will likely be the companies that can lay out the best garden plans.



To me, it stands in contrast to this item by Tim Lee at TLF:

The typical life cycle of a technology goes something like this. Technologies are usually originated in the laboratory, where a small number of geeks invent it and explore its capacities. Once the time comes for commercialization, often proprietary versions are the first out of the chute because it’s easier to monetize closed platforms, and therefore to raise the capital necessary to deploy them quickly. So in the early years of any new technology’s growth, it often looks like the proprietary technology has a decisive advantage (think AOL in 1993). Then, as the technology begins to mature, the disadvantages of closed technologies become apparent. They can only grow and evolve as fast as their owners can manage them, and as their owners get larger and more bureaucratic (think AOL in 1998), these platforms begin to stagnate. Meanwhile, the open alternatives, which are not held back by centralized management, continue growing rapidly, equalling and then quickly surpassing the closed competitors. Finally, the open platform’s lead gets so large that the closed platform, facing a choice between interoperability or irrelevance, is forced to open itself up to the competition (think AOL in 2003).

And once an open platform has become firmly established, proprietary firms stop trying to dislodge it. Instead, they try to build new proprietary technologies atop the underlying open architecture. Mac OS X, for example, is a thin layer of proprietary software atop a big stack of open technologies. Similarly, Facebook is a thin layer of proprietary code atop a lot of open Internet technologies. But that means that even as a company is trying to establish the dominance of their new, proprietary platform, they’re reinforcing the primacy of the underlying open architecture. Which means that that open architecture remains available to be built on further by anyone who cares to do so. And that, in turn, ensures that the process I described in the previous paragraph can begin again at another layer of the software stack.

What happens, though, is that every time this process begins in a new sector of the economy—online access in the mid-1990s, e-commerce in the late 1990s, broadband access in the early 2000s, social networking today—folks on the left-hand side of the political spectrum begin wringing their hands about how this time is going to be different. Sure, open platforms have had a good run so far, but now the big, bad corporations are going to take over. And because the specific names and technologies are different each time, it’s always possible to come up with a theory about the specific developments that will bring about the triumph of the walled gardens. Their warnings invariably turn out to be overblown, but by the time it becomes clear that the last round of predictions were wrong—Lessig’s 1999 predictions about e-commerce destroying the Internet, say—there’s a new threat to obsess over.

This, incidentally, is why it annoys me so much when libertarians denigrate the value of open platforms. The triumph of open architectures over the last couple of decades has been a vindication of the Hayekian idea of spontaneous order. AOL tried to build a closed, centrally-planned network, and it got clobbered by the Internet for precisely the same reasons that the US economy outperformed the Soviet economy: central planners lack sufficient information to satisfy the needs of millions of users with diverse needs. The answer to the Lessigs and Zittrains of the world isn’t that open systems are bad. It’s that precisely because open systems are good, they’re unlikely to be dislodged by closed systems in the marketplace. Even when the structure of the market is far from ideal, as it is, for example, in the broadband duopoly, the open architectures have turned out to be far more robust than anyone expected.



So, the "walled gardens" in the first article are indeed built upon a system of open standards -- WiFi, GSM/HSDPA, etc. Will we devolve to a battle of walled gardens? Or, is there some other business model that is waiting in the wings to clobber Apple, Amazon, etc?

AT&T and LTE

There were some interesting bits in this article that I think are worth highlighting:

The upgrade to 7.2 Mb/s [HSPA] along with the addition of new HSPA data carriers and removing choke points in the backhaul network will ease many of those problems. But the advent of LTE in 2011 will provide the ultimate antidote. Not only will AT&T be able to deliver far more capacity over the new network, it will be able to deliver it much more efficiently and cheaply. Rinne estimated that the cost of delivering a megabit per second of capacity over LTE was just 3% the cost of delivering that same megabit on an EDGE network, compared to the 14% of the EDGE’s cost on the HSPA network.

Rinne does not define what "cost" is here. I suspect it is spectrum use rather than capital and operating cost, but it is hard to know for sure.
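Whatever "cost" denotes, the quoted percentages can at least be put on a common scale. Taking them at face value, LTE is far cheaper than the current HSPA network too, not just EDGE:

```python
# Relative cost of delivering one Mb/s, normalized to EDGE = 1.0, using the
# percentages attributed to Rinne (whatever "cost" denotes here).
cost = {"EDGE": 1.00, "HSPA": 0.14, "LTE": 0.03}

lte_vs_hspa = cost["LTE"] / cost["HSPA"]
print(f"LTE delivers a Mb/s at about {lte_vs_hspa:.0%} of the HSPA cost")  # about 21%
```

So on these figures LTE is roughly a five-fold improvement over HSPA per Mb/s, not merely an improvement over EDGE.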

24 August 2009

Apple, AT&T and network capacity

I have written about this before (see this), so this article in BusinessWeek is an interesting follow-up. It highlights several things worth noting:

  • The relationship between product development and service development. In the wireless market, handsets are provided by companies that do not provide services. Especially in the case of the iPhone, Apple made a considerable investment in developing a user experience and singular branding that required investment from AT&T to implement.
  • The differing investment life cycles of products, software and infrastructure services. Apple has gone through at least three generations of its hardware and more than three generations of its software (if for nothing else than to break jailbroken phones). In the same time period, AT&T has still been implementing its original network technology (HSDPA/HSPA) throughout its network. The cost and complexity of delivering infrastructure support is far higher than that of the product that users directly see and interact with.

Google voice and the iPhone

The dust-up around the Google Voice iPhone application has been fun to watch (see this for an update). There are many things going on here, including (possibly) AT&T's involvement (which both it and Apple deny). I found this item over at TechCrunch an interesting analysis of Apple's response to the FCC's inquiry.

I have to agree that it seems largely about Apple's control over users. AT&T would get either the data traffic or the voice traffic and, unless there is a serious difference in the profitability of one service over another, they should be largely indifferent.

Welcome to life in Apple's walled garden.

Update: The plot thickens ... according to this item, AT&T and Apple did have an agreement regarding VoIP, but it did not cover third party apps.

14 August 2009

Broadband carriers and government funding

This article is interesting. According to the article:
As the Aug. 20 deadline nears to apply for $4.7 billion in broadband grants, AT&T, Verizon and Comcast are unlikely to go for the stimulus money, sources close to the companies said.

Their reasons are varied. All three say they are flush with cash, enough to upgrade and expand their broadband networks on their own. Some say taking money could draw unwanted scrutiny of business practices and compensation, as seen with automakers and banks that have taken government bailouts. And privately, some companies are griping about conditions attached to the money, including a net-neutrality rule that they say would prevent them from managing traffic on their networks in the way they want.

While it is quite possible that some of the rules, such as "network neutrality", may affect them anyway, it is clear that the carriers felt that the cost of participating in this program outweighed the benefits. A significant part of their concern is related to uncertainty about the consequences of an irreversible commitment. Thus, it seems an apt subject for a real options analysis.

Doing such an analysis rigorously would be challenging since the uncertainty is not easily quantifiable. But clearly carriers have concluded that the high probability of a modest upside does not outweigh the uncertain probability of a potentially large downside.
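A toy Monte Carlo sketch can illustrate that asymmetry. All probabilities and dollar figures below are invented for illustration; the point is the shape of the payoff, not the numbers.

```python
import random

# A toy Monte Carlo sketch of the carriers' decision, in a real-options spirit.
# All probabilities and dollar figures are invented for illustration.

def simulate_take_grant(trials=100_000, seed=42):
    """Expected payoff of taking the grant: a modest, near-certain upside
    against an uncertain, potentially large regulatory downside."""
    rng = random.Random(seed)
    grant_benefit = 100.0  # hypothetical upside, $M
    total = 0.0
    for _ in range(trials):
        payoff = grant_benefit
        # With some probability the attached conditions bite, and the
        # resulting cost is itself highly uncertain.
        if rng.random() < 0.25:
            payoff -= rng.uniform(100.0, 2000.0)
        total += payoff
    return total / trials

print(f"Expected value of taking the grant: {simulate_take_grant():.0f} ($M)")
```

With a bounded upside and a long-tailed, irreversible downside, the expected value easily turns negative, and the option to wait for the uncertainty to resolve becomes valuable in its own right.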

18 June 2009

iPhone and network capacity

This article, which describes how the iPhone is putting strains on AT&T's wireless network, is interesting. It further illustrates the close relationship between carriers, content and network devices that I blogged about here. As the article shows, reducing handset prices increases network usage (and revenues) and also increases the use of social networking applications. It seems likely that there is some revenue sharing going on to facilitate this ...

05 June 2009

Phone line shrinkage at AT&T

I have blogged before about the decrease in access lines (see this, for example). While it is not surprising given the increase in wireless only households, this article over at GigaOm shows that the decline has been in the 6% per year range for AT&T, not the 3% range.
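The difference between a 3% and a 6% annual decline compounds quickly, as a quick sketch shows:

```python
import math

# Compounded erosion of the access-line base at the two decline rates.
for rate in (0.03, 0.06):
    remaining = (1 - rate) ** 5
    half_life = math.log(0.5) / math.log(1 - rate)
    print(f"{rate:.0%}/yr decline: {remaining:.0%} of lines left after 5 years; "
          f"half the base gone in about {half_life:.0f} years")
```

At 6% per year, half the line base disappears in roughly 11 years; at 3%, it takes about 23. That halved time horizon is what makes the strategic question urgent for the ILECs.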

As the article correctly points out, this is one of the reasons that the large ILECs have been aggressive in rolling out their broadband infrastructures. Since consumers are increasingly opting for wireless for voice, the only way that the ILECs have to continue receiving a share of the consumer's communications expenditures is to build out broadband, which enables them to compete with cablecos for television and internet access expenditures.

If they don't, they have to depreciate their infrastructure at a faster rate than consumers are leaving it, or else investors (the company's owners) will be left holding the bag. Of course, this is an end-game that they would only play if they decided to cede the marketplace to other access providers. There is no sign that the ILECs are interested in that strategy!

28 January 2008

Rate flexibility and market power

This article over at USA Today points to one of the tensions in deregulation. Carriers, who gain pricing flexibility, will use it to maximize their competitive position and profits. In this case, it appears that the incumbents with market power are using nominal prices to motivate their customers to adopt bundles of services (such as voice, long distance, mobile and video). These bundles tend to lock consumers in to their services since it becomes much more costly to purchase these services individually. This is the same basic issue that I brought up earlier related to text message charges.

While it is hard to imagine that these services experienced large cost increases (they are software, after all), I would like to point out that the services indicated in the article are hardly essential, mainstream services. Given that, is there a need to regulate? How many "elderly and low income" families actually use these services?

17 January 2008

Text Message: 15 Cents

This article "exposes" some interesting phenomena in my mind. Quoting the article:

When the big four cellular companies decided to hike the price of sending a text message, they all managed to settle on precisely the same increase. Sprint Nextel raised its price from 10 cents to 15 cents per message in 2006. AT&T quickly followed suit, as did Verizon Wireless and finally, in June 2007, T-Mobile. And now Sprint has raised its price again, to 20 cents.

Today those copycat price hikes are producing banner results for the carriers. In the most recent quarter, the Big Four's customers coughed up anywhere from 29% to 64% more for data services (that is, everything but regular phone calls) than they had the previous year. The four carriers collectively produced $17 billion in operating income on $104 billion in revenue in the first nine months of last year. The carriers say that consumers can buy a monthly package that lowers the cost of a text message. But without the huge surge in payments for data, revenue per user would have fallen at every company.

Their ability to hike prices on text messages certainly can't be explained by the companies' costs. On modern cellular networks the few hundred bits of information that make up a text message take up such a minuscule amount of capacity that they can be carried for a fraction of a cent.

The first paragraph basically points to implicit collusion in price setting. As any economist will tell you, this is one of the principal problems in oligopoly. In this case, it was evident even with four industry participants, which suggests either that demand for text messaging is quite price-inelastic or that this "à la carte" price doesn't reflect the typical consumer's experience. Since avid SMS-ers purchase packages, the actual price they pay is far less, so I believe the latter explanation dominates.
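The gap between the à la carte and bundled prices is easy to quantify. The bundle figures below are invented for illustration; the 140-byte message size is the standard SMS payload.

```python
# The gap between the a la carte price and a hypothetical bundle, plus what
# the a la carte rate implies per megabyte. Bundle figures are invented.
a_la_carte = 0.15                          # $ per message
bundle_price, bundle_messages = 5.00, 250  # hypothetical $5/month, 250-message plan

effective = bundle_price / bundle_messages  # bundled price per message
per_mb = a_la_carte / 140 * 1_048_576       # a 140-byte SMS payload, priced per megabyte

print(f"Bundle: {effective * 100:.0f} cents/message vs. {a_la_carte * 100:.0f} a la carte")
print(f"Implied a la carte data price: ${per_mb:,.0f}/MB")  # -> well over $1,000/MB
```

An implied price of over $1,000 per megabyte makes the article's point about costs vividly: no plausible network cost explains the à la carte rate, so the bundle price is the one most heavy users actually face.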

The end of the second paragraph makes little sense ... data services and SMS are quite different and appeal to different markets. It is clear, though, that non-voice services are the future for profits in this sector.

11 January 2008

ISPs and content filtering

I found this item interesting ... apparently some ISPs are seeing a role for themselves in managing pirated content. Interesting idea, but is it really feasible? Can ISPs really check for illicit content on a large scale on their networks?

11 December 2007

AT&T network upgrades

Stories like this one in Forbes don't get a lot of press attention, but I think that they are worth tracking anyway. Since Forbes doesn't do permalinks, here are some key excerpts from the article:


AT&T Inc. said on Monday it has switched on its high-speed backbone network, which is designed to ferry data traffic across the U.S. four times faster.

AT&T has begun placing traffic on its so-called "ultra-long haul" network, which boasts a capacity of 40 gigabits per second, meaning consumers will be able to download large files quicker and more easily stream online videos to their computers. Carriers have been upgrading the backbone network - the underlying pipes needed to move data across extremely long distances - to meet the increasing demand in bandwidth-intensive programs and videos.

[...snip...]

The company, which is deploying routing equipment supplied by Cisco Systems Inc., has upgraded 50,000 miles of its network and plans to connect 25 major metropolitan areas in the next several months. ... In addition to a faster connection for consumers, the upgrades will help ease the capacity requirements for the company's U-Verse Internet-based TV system.

[...snip...]

While the network is the first in the U.S., Verizon Communications Inc. said that this month it would begin building a 2,000-mile backbone network connecting major cities in Europe.

Both companies plan to push the 40-gigabit standard in the U.S. and eventually upgrade to 100 Gbps.


The article doesn't mention it, but I think it is safe to assume that the "40 Gigabit" standard is, in fact, OC-768 (this article in Network World confirms this). NW also reports that this is AT&T's MPLS network. I'm not sure what the "100 Gbps" is ... OC-1536 comes in at approximately 80Gbps. Wikipedia reports that the OC-3072 standard is a "work in progress".
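For reference, SONET line rates are simple multiples of the OC-1 base rate, which makes the "40 Gigabit" identification easy to check:

```python
# SONET OC-n line rates are multiples of the 51.84 Mb/s OC-1 base rate.
OC1_MBPS = 51.84

rates_gbps = {n: n * OC1_MBPS / 1000 for n in (768, 1536, 3072)}
for n, gbps in rates_gbps.items():
    print(f"OC-{n}: {gbps:.1f} Gb/s")
# OC-768 is the "40 gigabit" rate; note that no OC level lands at 100 Gb/s.
```

Since no OC level falls at 100 Gb/s, the "100 Gbps" figure cannot be a straightforward SONET upgrade, which lends weight to the Ethernet hypothesis below.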

Could the 100Gbps bit rate be referring to 100 Gbps Ethernet (as this article in Wikipedia suggests)? That would be quite a departure ... and would suggest an explicit strategy to integrate local and long distance network standards. Ethernet has truly come a long way (pun intended)!

In light of the Comcast "network management" discussion, this is an interesting development. Do you think AT&T would be credible if they employed similar techniques on this new network?