In the healthcare sector, hospitals and clinics are now using deep learning technology that enables computers to read large volumes of X-rays, MRIs, and CT scans more rapidly than a radiologist can. Researchers at Google have built a deep learning model that detects lung cancer as well as or better than human radiologists. Meanwhile, across the financial services sector, deep learning is being used to enhance fraud detection. By analyzing a customer's past spending patterns, the technology can identify anomalous activity and alert banks in real time.
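The fraud-detection idea can be sketched in miniature. The snippet below is purely illustrative (hypothetical amounts and threshold, not any bank's actual system): it flags a transaction whose amount deviates sharply from a customer's spending history using a simple z-score, whereas production systems apply trained deep learning models over many features.

```python
# Illustrative sketch of anomaly detection on spending amounts.
# Real fraud systems use far richer features and learned models.
from statistics import mean, stdev

def is_anomalous(history, amount, threshold=3.0):
    """Flag a transaction whose amount deviates sharply from past spending."""
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return amount != mu
    return abs(amount - mu) / sigma > threshold

history = [42.0, 55.0, 38.0, 61.0, 47.0, 52.0]  # hypothetical past purchases
print(is_anomalous(history, 49.0))    # a typical purchase
print(is_anomalous(history, 1800.0))  # an unusually large purchase
```

In practice the "history" would be a feature vector per transaction (merchant, location, time of day), but the core idea of scoring new activity against learned normal behavior is the same.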
Advertising and marketing, agriculture, automotive and transportation, cybersecurity, drug testing, oil and gas exploration and production, and retail are among the industries that will benefit from advances in deep learning.
At eStruxture, we are seeing substantial demand for capacity to support AI-powered applications being implemented by enterprises across a variety of verticals. AI labs and other organizations are also flocking to multi-tenant data centers to deploy AI platforms whose power and connectivity requirements are considerably greater than anything we have previously seen.
AI developers and organizations already using AI-based applications do not fit easily within traditional Tier III data center models. The applications that drive much of today's AI development and deep learning work have different requirements from a power, cooling, and redundancy perspective. Where a standard customer might need three to five kilowatts in a rack, an AI provider could require 10, 15, 20, or even 30 kilowatts in a rack. Moreover, because of the size of the compute workloads and the power density of ML and deep learning environments, servers can run approximately 30 percent hotter, making the power and thermal demands of the equipment harder to manage.
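To see why those density figures matter, consider that virtually every watt a rack draws must be removed as heat. The sketch below uses the standard sensible-heat airflow relation (CFM ≈ 3.16 × watts / ΔT in °F) with an assumed 20 °F air temperature rise; the figures are illustrative, not a design specification.

```python
# Back-of-the-envelope cooling airflow for the rack densities above.
# Assumes a 20 F air temperature rise across the rack (illustrative).
def required_airflow_cfm(rack_kw, delta_t_f=20.0):
    """Airflow needed to carry away a rack's heat at a given air temp rise."""
    return 3.16 * rack_kw * 1000 / delta_t_f

for kw in (5, 15, 30):
    print(f"{kw:2d} kW rack -> ~{required_airflow_cfm(kw):,.0f} CFM")
```

A 30 kW AI rack needs roughly six times the airflow of a 5 kW enterprise rack, which is why high-density deployments strain facilities designed around traditional loads.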
Given these power and cooling challenges, and for a host of other advantages, AI providers seeking the optimal environment for hosting their applications would do well to consider colocating in Montreal. The city's long winters and temperate summers make maintaining proper server temperatures easier and more cost-effective. In data centers, approximately 40 percent of total energy consumption goes toward cooling IT equipment, making cooling one of the largest contributors to the electricity bill of a large facility. By taking advantage of the naturally cold climate to cool their infrastructure, data centers and colocation facilities in Montreal use less power than their counterparts to the south, reducing operating expenses.
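The arithmetic behind that savings claim is straightforward. The toy model below takes the ~40 percent cooling share cited above and an assumed (hypothetical) 50 percent cut in cooling energy from free cooling to show the effect on the total bill.

```python
# Toy model: if cooling is ~40% of facility energy, cutting cooling
# energy by some fraction reduces the total bill proportionally.
# The 50% cooling reduction is an assumed illustrative figure.
def total_energy_savings(cooling_share=0.40, cooling_reduction=0.50):
    """Fraction of total facility energy saved by cutting cooling energy."""
    return cooling_share * cooling_reduction

print(f"{total_energy_savings():.0%} of total facility energy saved")
# e.g. halving cooling energy saves 20% of the total bill
```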
Enterprises and cloud service providers are also attracted to colocating in Montreal because of the city’s favorable power rates: Montreal offers the lowest-cost power for data center operations of any major North American market, and nearly 100 percent of the energy generated by the local utility, Hydro-Quebec, comes from hydroelectricity. A clean form of energy, hydropower produces greenhouse gas emissions roughly 50 times lower than those of natural gas, five times lower than those of solar power, and about equal to those of wind power. Montreal is already home to a thriving tech sector that includes AI, digital media, and managed IT services providers. These companies depend on high availability, high bandwidth capacity, and ultra-secure facilities to get their offerings to market quickly and efficiently.
While there’s much that data center operators can do to meet the needs of customers involved in AI, the emerging technology also has the potential to transform the management of these mission-critical facilities themselves. For example, one program that used an AI system to manage power usage in a cloud provider’s data centers achieved a 40 percent reduction in the electricity needed for cooling across its facilities. In another application of deep learning, an enterprise facility determined its most efficient cooling strategies by analyzing data from sensors placed among the server racks, including inlet temperatures and cooling pump speeds.
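At its core, that kind of sensor-driven optimization means learning the relationship between facility conditions and cooling energy. The minimal sketch below fits a one-variable linear model on synthetic data (hypothetical readings, a single input); real systems learn nonlinear models over many sensor streams such as pump speeds, humidity, and outside air temperature.

```python
# Minimal sketch: relate rack inlet temperature to cooling power with
# ordinary least squares. Synthetic data, illustrative only.
def ols_fit(xs, ys):
    """Ordinary least squares for y = a + b*x; returns (intercept, slope)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

inlet_c = [22.0, 24.0, 26.0, 28.0, 30.0]          # sensor: inlet temps (C)
cooling_kw = [100.0, 110.0, 120.0, 130.0, 140.0]  # sensor: cooling power
a, b = ols_fit(inlet_c, cooling_kw)
print(f"Predicted cooling power at 25 C: {a + b * 25.0:.0f} kW")
```

With a model like this in hand, an operator can ask "what happens to cooling power if we change this setpoint?" before touching the live plant, which is the basic loop the deep learning deployments described above automate at scale.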
AI and deep learning have also been found to be useful in server optimization and load balancing. In fact, Gartner predicts that by 2020, 30 percent of data centers that fail to apply AI and ML effectively in support of enterprise business will cease to be operationally and economically viable.
All this is to say: AI is here to stay. Regardless of the dire, fictionalized prognostications or misperceptions you might associate with the technology, the reality is that AI is now affecting the way we do business, interact, and provide care for one another. If managed judiciously today and tomorrow, AI will remain a force for good that far outweighs any possibilities of harm.