
It's probably not quite as insane as it sounds until the rest of the internet catches up. I had 650 Mbps at Georgia Tech and realized most websites can deliver nowhere near that speed.


No single action will demand that kind of speed, that's for sure. But think of scenarios where multiple users are downloading and uploading at once. For example, you and your family are watching TV via an IPTV provider (multiple TV sets in the house, each streaming HD video), while a BitTorrent server runs somewhere in the basement alongside numerous mobile devices (tablets, phones). The demand can break 50 Mbps for one technically advanced family.
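A rough back-of-the-envelope sketch of that household demand. All the per-device bitrates below are assumptions for illustration (HD IPTV streams vary widely in practice), not measured figures:

```python
# Estimated concurrent bandwidth demand for a busy household.
# Per-device bitrates are assumed values, chosen only for illustration.
streams_mbps = {
    "hd_iptv_set_1": 8,        # one HD IPTV stream (~8 Mbps assumed)
    "hd_iptv_set_2": 8,
    "hd_iptv_set_3": 8,
    "bittorrent_seedbox": 20,  # basement seedbox with a capped rate
    "tablets_and_phones": 10,  # several mobile devices browsing/streaming
}

total = sum(streams_mbps.values())
print(f"Estimated concurrent demand: {total} Mbps")  # 54 Mbps here
```

With these assumptions the total already passes 50 Mbps, and a couple more streams or a higher-bitrate codec pushes it well beyond.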

Now, if you think of businesses, the demand for data throughput might be 10x greater or more, depending on the type of business and the number of users (where "users" can also mean servers).


Yeah, my brief experience with a gigabit connection in school was pretty underwhelming. I could break 50 MB/sec down with BitTorrent, but a lot of websites couldn't even hit 1 MB/sec, and even CDNs rarely broke 10 MB/sec.
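Worth keeping the units straight here: download tools report MB/s (megabytes) while links are marketed in Mbps (megabits), a factor of 8 apart. A minimal sketch of the conversion:

```python
# MB/s (what download managers show) vs. Mbps (what links are sold in):
# 1 byte = 8 bits, so multiply MB/s by 8 to get Mbps.
def mbytes_to_mbits(mb_per_s: float) -> float:
    return mb_per_s * 8

# A gigabit link tops out at 1000 / 8 = 125 MB/s before protocol overhead,
# so 50 MB/s over BitTorrent is roughly 40% of the raw pipe.
print(mbytes_to_mbits(50))  # 400.0 (Mbps)
print(1000 / 8)             # 125.0 (MB/s theoretical ceiling)
```

So those 50 MB/sec torrent downloads were about 400 Mbps, still well short of saturating a gigabit link.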


BitTorrent should be utilized more in building the next-generation high-speed Interweb. Deliver gigs of multimedia in microseconds.


Or they limit the bandwidth per IP because there is really no need for that kind of insane bandwidth.


We'll find a need, have no worries about that. It's human nature, we always find ways to use 110% of the resources available.


Something tells me gatech and google have different links to the internet....


Universities and research institutes are also connected to private peering links such as ESNet[1] or Internet2[2]. This is actually a huge benefit because these links are typically uncongested and you can get much more reliable transmission on them (and hence consistent latency and faster transfer speeds). Other countries/continents also have their own private academic networks that peer with the US ones.

I think I read somewhere that they are upgrading the ESNet backbone to be 100Gbit/s and GEANT's[3] to be 1000 Gbit/s, so I wouldn't immediately write off University/academic networking as automatically inferior to Google's.

Honestly, it's hard to tell without knowing the details of how much capacity Google is provisioning in Kansas City (and how much contention there is). How much backbone fiber is laid to Kansas/Missouri? It doesn't seem to be where existing big-scale connectivity is concentrated (unlike, say, the DC Metro area, where a ton of datacenters are located, or London/Paris/Amsterdam).

[1] http://www.es.net/
[2] http://www.internet2.edu/
[3] http://www.geant.net/pages/home.aspx


I'm not sure if it's true or not, but when I went to school there, the OIT people bragged that GT had the second fastest internet connection to the Pentagon.



