Anyone who’s used popular peer-to-peer (P2P) applications such as BitTorrent, Gnutella or LimeWire has probably been plagued by the network slowdowns that make sharing large media files a time-consuming endeavor.
However, a consortium of technologists at the Distributed Computing Industry Association, or DCIA, has found a way to alleviate the Internet congestion created by P2P applications. In fact, the new technology will soon exit the testing phase. There’s just one problem: Broadband providers such as Comcast, Verizon and AT&T may be shy about actually implementing it.
P2P Causing Traffic Snares
Most broadband providers will tell anyone who will listen that users of P2P applications use a disproportionate amount of network resources as they send and receive massive video files and other rich media content over the Internet.
“Something like 5 percent of our customers can be using between 50 percent and 66 percent of the capacity of Comcast’s network at any one time while using P2P applications,” Charlie Douglas, a Comcast spokesperson, told the E-Commerce Times.
The same holds true for New York-based Verizon Communications. “No matter what the capacity of a network, there are points where people have to share capacity,” Paul Brigner, executive director of Internet and technology policy, told the E-Commerce Times. “Depending on the applications being used, some users take up a disproportionate share of the network.”
One might then ask, why not simply build a faster network with more capacity instead of restricting users?
“[The congestion problems] wouldn’t be solved by building a bigger pipe,” Douglas said. “In Japan, where they have all-fiber networks and 100 megabits per second speeds, they are still experiencing heavy congestion. You can’t build your way out of it.”
The FCC Weighs In
The broadband providers’ initial response to these network traffic snares was to temporarily limit the network capacity available to P2P users when severe congestion occurred.
However, that didn’t sit well with privacy advocates who felt that broadband providers had no business monitoring what paying subscribers were doing on their networks.
The FCC was asked to look into the matter and eventually ruled that Internet service providers could not single out and delay P2P file-sharing traffic.
That forced Philadelphia-based Comcast and other ISPs to look for another solution.
‘Application Agnostic’
“The new technique we’re migrating to is application agnostic,” Comcast’s Douglas said. “We’ll be using new technologies that will identify when an area of the network is reaching congestion, usually at the 70 percent utilization rate.”
When that happens, Comcast will target the heaviest consumers of network resources, without monitoring the specific type of application being used, and temporarily reduce their connection speed to about 70 percent of its provisioned rate for a period of around 15 minutes.
“So, a 6 megabits per second customer will be brought down to 4.2 megabits per second for about 15 minutes, or until the period of congestion passes,” Douglas said. “We haven’t seen a need for any period longer than 15 minutes yet.”
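To make the arithmetic concrete, here is a minimal Python sketch of the kind of application-agnostic policy Douglas describes. Only the 70 percent figures and the roughly 15-minute window come from his remarks; the names, data structures and the "top 5 percent" cutoff are hypothetical illustrations, not Comcast's actual system.

```python
# A minimal, hypothetical sketch of the policy Douglas describes.
# Only the 70 percent thresholds and the roughly 15-minute window
# come from his remarks; everything else is illustrative.

CONGESTION_THRESHOLD = 0.70  # segment utilization that triggers management
THROTTLE_FACTOR = 0.70       # fraction of provisioned speed while throttled
THROTTLE_MINUTES = 15        # approximate duration Douglas cites

def manage_segment(utilization, users):
    """Return throttle actions for the heaviest users on a congested segment.

    `users` is a hypothetical list of dicts such as
    {"id": "cust-1", "provisioned_mbps": 6.0, "recent_usage_mb": 900.0}.
    """
    if utilization < CONGESTION_THRESHOLD:
        return []  # no congestion, no action taken

    # Rank by recent consumption only -- no inspection of which
    # application generated the traffic, hence "application agnostic."
    heaviest = sorted(users, key=lambda u: u["recent_usage_mb"], reverse=True)
    top_slice = heaviest[:max(1, len(users) // 20)]  # e.g. the top ~5 percent

    actions = []
    for user in top_slice:
        capped = round(user["provisioned_mbps"] * THROTTLE_FACTOR, 2)
        actions.append((user["id"], capped, THROTTLE_MINUTES))
    return actions

# Douglas's example: a 6 Mbps customer drops to 6 * 0.70 = 4.2 Mbps.
print(manage_segment(0.85, [
    {"id": "cust-1", "provisioned_mbps": 6.0, "recent_usage_mb": 900.0},
    {"id": "cust-2", "provisioned_mbps": 6.0, "recent_usage_mb": 40.0},
]))
```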
A similar technology is in use at Verizon.
“Our network management is application agnostic,” Verizon’s Brigner said. “We don’t do what they call ‘deep-packet inspection.’ We don’t look for specific types of applications or activities.”
Is “application agnostic” really “application agnostic”?
The answer is not clear-cut.
By limiting capacity to those users who tie up the most network resources, broadband providers are de facto targeting P2P users, because it’s an established fact that they use a disproportionate amount of available bandwidth, Robert Levitan, CEO of Pando Networks, told the E-Commerce Times.
“There’s a reason why [ISPs] use terms like ‘application agnostic,’” Levitan said. “They were investigated by the FCC for blocking BitTorrent users.”
An Alternative Solution
New York-based Pando develops commercially deployable P2P delivery technologies. One of its customers is the NBC television network.
“We help companies like NBC deliver large video files,” Levitan said.
Pando and a consortium of technologists at the DCIA, with the help of companies such as Verizon, AT&T, Cisco, BitTorrent, LimeWire and VeriSign, have formed the P4P Working Group to find a solution to the congestion problem created by P2P traffic.
“We’ve tested a new set of protocols on the networks at Verizon and Comcast that help reduce the stress on their networks and increase delivery speeds,” Levitan said. “It’s been proven. The results show that these new protocols can route P2P traffic in a much more efficient way.”
In mid-March, Verizon announced test results that showed that large files can be moved over the Internet more quickly and efficiently using a new P2P file transfer system.
The new P2P protocol guides the selection of file sources and network pathways rather than letting the selection happen randomly, or using criteria that don’t maximize efficiency, according to the company.
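As a rough illustration of the difference, the Python sketch below contrasts random source selection with a P4P-style selection that prefers sources that are close in the network. The `network_distance` scores are hypothetical stand-ins for the topology hints an ISP would supply; the actual P4P proposal exchanges richer topology information between ISPs and P2P networks.

```python
import random

# Hypothetical peer records; `network_distance` stands in for the
# topology hints an ISP could supply (lower = shorter, cheaper path).
peers = [
    {"addr": "203.0.113.10", "network_distance": 1},  # same metro area
    {"addr": "198.51.100.7", "network_distance": 5},  # cross-country link
    {"addr": "192.0.2.44",   "network_distance": 9},  # overseas transit
]

def pick_peers_randomly(candidates, k):
    """Classic P2P behavior: sources chosen with no regard to topology."""
    return random.sample(candidates, k)

def pick_peers_p4p_style(candidates, k):
    """P4P-style behavior: prefer sources that are close in the network."""
    return sorted(candidates, key=lambda p: p["network_distance"])[:k]

print(pick_peers_randomly(peers, 2))   # may choose distant, costly paths
print(pick_peers_p4p_style(peers, 2))  # favors nearby sources first
```

Keeping traffic local in this way is what would let the same file move faster while putting less load on long-haul backbone links, which is consistent with the speed and efficiency gains the test results describe.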
Comcast performed similar tests using P4P protocols last summer, Levitan said, and the results were even better than those demonstrated on Verizon’s broadband network.
“We know this stuff works,” he commented.
Fear Stifles Innovation
It may work, but that doesn’t necessarily mean it will be adopted. The catch: P4P protocols identify specific applications as heavy users of network resources, the very thing that raised the ire of privacy groups and prompted the FCC’s investigation.
“If you do that, you’re not application agnostic anymore,” Levitan said.
It remains to be seen whether the network operators will decide to implement P4P protocols in the future, but Levitan isn’t sanguine about the prospects just yet.
“[Broadband providers] are now too fearful of upsetting the FCC or special-interest groups to innovate and solve this problem,” he said.