By: Tim Young
A few months ago, U.S. cable giant Comcast announced that it would suspend the 250GB-a-month data cap it had in place for its residential customers, and would instead experiment with other
ways of dealing with excessive data use. Specifically, the MSO said it would try out various forms of tiered data service, under which the floor would be raised for all consumers (300GB instead of 250GB) and additional service tiers or usage charges would be added on top of that threshold.
This is the latest step in Comcast's bandwidth-pricing vision quest. The cableco instituted the 250GB cap back in 2008 after its much-publicized run-in with regulators over charges of bandwidth throttling. As a measure of how much user behavior has changed in the few years since, it's worth noting that back then even detractors of the cap called the allowance "relatively high." Now 300GB is the bottom floor, with additional tiers and charges stretching above it. Granted, most residential users never came close to hitting the 250GB mark, and certainly aren't likely to top 300GB, but the bandwidth-hungry, over-the-top content that has become commonplace in the meantime makes it clear that data capacity is still a major issue for service providers.
In fact, it's worse than that. It's not just that a few users are in danger of exceeding the cap; it's that a large share of users may be heading that way. "It's video, video, video," said Jonathon Gordon, director of marketing for Allot Communications, a bandwidth management vendor. "Video is skyrocketing. Fixed or mobile, it's the same issues." Cisco predicts that internet video consumption will more than quadruple by 2016, at which point more than 1.2 million minutes' worth of video will be coursing through the internet every second.
This glut of video traffic was not entirely unforeseen, but it isn't quite the climate in which Comcast instituted its original cap. Back then, carriers were being bowled over by massive data spikes from a different, more concentrated source. "Five years ago it would have been P2P, with a small percentage of users eating massive amounts of bandwidth," said Gordon. "Now instead of 10 percent of your users using 80 percent of your bandwidth, you have 100 percent of your users using 500 percent of your bandwidth."