Earlier in the year, we referenced New Order and looked at how “everything’s gone green.” Remember that? Wasn’t that fun? Data center managers are still constantly seeking ways to increase their energy efficiency. Like the Holland Tunnel, or just about any major road in the Garden State (no, not that silly movie), the internet has become an ever-congested, high-traffic route: the Information Superhighway, as it was once so endearingly called. In response to heightened demand, data storage capacity has increased exponentially and continues to grow (even as you read this; ever watched Gmail’s storage counter tick upward?), but equally necessary is continued progress on energy solutions. Just as cars are becoming more fuel efficient (hybrid, electric, and so on), network hardware is condensing its technology to save physical space in the data center and to cut power costs and consumption. The last big innovation in this particular area would have to be the introduction of Power over Ethernet (PoE) technology.
Network equipment such as routers and switches requires a great deal of power to run, and more of it is running concurrently than ever before. In 1993, global internet traffic totaled a couple hundred terabytes (TB) for the entire year. By 2011, almost two decades later, the internet was carrying at least that much data per *second*. The bulk of a data center’s operating expenses (more than 50%) goes to cooling, air conditioning and fan tray operation, to keep these units from running hot. Inefficiency has plagued data center managers for decades as internet traffic has continued to surge, and, with the ubiquity of web connectivity via mobile devices and public WiFi hotspots, it climbs ever higher. PoE allows a device’s data and power to run through the same Ethernet cabling, and in recent years it has begun to offer solutions to the energy conundrum.
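To make the power side of this concrete, here is a minimal sketch of how a PoE switch might budget power across its ports. The per-class wattage figures follow the IEEE 802.3af/at specifications (up to 15.4 W per 802.3af port at the power-sourcing equipment, 30 W for 802.3at “PoE+”), but the function names and data layout are purely illustrative, not any vendor’s API.

```python
# PSE-side maximum power draw per IEEE 802.3 device class, in watts.
CLASS_POWER_W = {
    0: 15.4,  # 802.3af default/unclassified
    1: 4.0,
    2: 7.0,
    3: 15.4,
    4: 30.0,  # 802.3at (PoE+)
}

def allocate_ports(device_classes, budget_w):
    """Grant power to devices in order until the switch's budget runs out.

    Returns one boolean per device: was the port powered?
    """
    granted = []
    remaining = budget_w
    for cls in device_classes:
        need = CLASS_POWER_W[cls]
        if need <= remaining:
            remaining -= need
            granted.append(True)
        else:
            granted.append(False)  # budget exhausted: leave port unpowered
    return granted

# Example: a 60 W budget covers two Class 3 phones and a Class 2 camera,
# but not a Class 4 access point on top of them.
print(allocate_ports([3, 3, 2, 4], budget_w=60.0))
# → [True, True, True, False]
```

The point of the sketch is that a PoE switch works against a fixed power budget, which is exactly why idling unused ports (discussed below) translates directly into energy savings.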
PoE technology may not have been perfect when it first debuted, but, like most practical innovations, it took a few years to really shine. Today, PoE is not simply a novelty or an alternative to traditional AC (alternating current) power. Since its introduction, it has allowed for versatile structuring and configuration within the data center. And because power and data share one infrastructure, it has the potential to be a very efficient solution, with no separate installations or repairs for AC wiring versus network cable. On top of this, PoE offers remote shut-off and power-preservation features that a simple AC power brick cannot. When a device would otherwise run all night with no one using it, PoE can idle it or power it down conditionally. Perhaps the office’s IP phones should be off over the weekend or a holiday break; PoE is versatile enough that you can make that distinction from a central management station. The best part is that PoE still has potential, a few cards left in its deck, and it will continue to show off its abilities in the years to come, as we save precious resources and precious, precious dollars.
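The weekend-and-holiday idea above boils down to a simple schedule check. Here is a minimal sketch of the decision logic a central management station might apply before telling a switch to power a port up or down; the holiday list and business hours are hypothetical, and a real deployment would push the resulting on/off decision to the switch itself (via SNMP or a vendor CLI, for instance).

```python
from datetime import date, datetime

# Hypothetical office holidays; a real station would load these from config.
OFFICE_HOLIDAYS = {date(2012, 12, 25), date(2013, 1, 1)}

def port_should_be_powered(now: datetime) -> bool:
    """Power the port (e.g. an IP phone's) only during business hours
    on business days."""
    if now.date() in OFFICE_HOLIDAYS:
        return False
    if now.weekday() >= 5:  # 5 = Saturday, 6 = Sunday
        return False
    return 7 <= now.hour < 19  # on between 7:00 and 19:00

print(port_should_be_powered(datetime(2012, 12, 25, 10, 0)))  # holiday → False
print(port_should_be_powered(datetime(2012, 12, 28, 10, 0)))  # Friday workday → True
```

Because the phone draws its power from the switch port rather than a wall brick, flipping that one boolean centrally is all it takes to dark an entire floor of handsets for the weekend.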