
October 28, 2010

Data center design focuses on five fundamentals

  • Project teams are re-examining traditional approaches and devising new methods for data center development.
  • By LEONARD RUFF and BILL FETTERLEY
    Callison


    In today’s increasingly data-intensive world, data centers are in high demand. The industry, known as “mission critical,” is experiencing tremendous growth and is expected to grow 60 percent globally within the next year.

    Since the dot-com bust in 2001, data center providers have grown and matured, learning to take advantage of the enormous growth in information demand, and have developed business processes to manage both the data and the facilities that support it. Modern business depends on data: whether it’s financial records, retail sales, resource management, computational modeling, news or analysis, information is vital. Companies have realized this information is critical to doing business and recognize the importance of secure, reliable and flexible solutions that ensure valuable data is available anywhere, anytime, no matter what.

    With data availability and security at stake, the role of the data center is more critical than ever. This is reflected in rising data center expenditures and in the new standards being established for the industry. From 2006 through 2009 the market grew nearly 35 percent annually.

    Five fundamental drivers have challenged the mission critical industry: Moore’s Law, convergence, cloud computing, modular design and deployment, and sustainability. These factors are forcing data center providers and designers to re-examine traditional approaches and develop new methodologies.

    Photo courtesy of Callison: Sabey Corp. developed Intergate.Columbia, a data center in Wenatchee designed by Callison.

    Moore’s Law

    First postulated in Gordon Moore’s 1965 paper, Moore’s Law predicted that the number of transistors on a processor would double approximately every year (a rate Moore later revised to roughly every two years). Remarkably, the prediction has held true. The obvious impact is that as chip density increases, so do power consumption and heat dissipation. The less obvious impact is that growing processing capability, coupled with a corresponding decrease in size, allows users to do more with less and enables functions and operations not previously possible.
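    To make the compounding concrete, the short sketch below (our illustration, not from the article) projects transistor counts under a simple doubling model; the 1971 baseline and the two-year doubling period are stated assumptions.

    ```python
    # Moore's Law as compound doubling: N(t) = N0 * 2**(t / T),
    # where T is the doubling period in years.
    # Illustrative baseline: ~2,300 transistors on the Intel 4004 (1971),
    # doubling every two years per Moore's 1975 revision.

    N0 = 2300   # transistors on the Intel 4004 (1971)
    T = 2.0     # assumed doubling period, years

    for year in (1971, 1981, 1991, 2001, 2011):
        elapsed = year - 1971
        transistors = N0 * 2 ** (elapsed / T)
        print(f"{year}: ~{transistors:,.0f} transistors")
    ```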

    These factors have increased the amount of power needed in data centers. In the mid- to late-1980s, it was fairly common for a mainframe data center to support 30 to 40 watts per square foot of processing area. Today, the generally accepted minimum is at least 150 watts per square foot. Callison has even designed data centers that support more than 500 watts per square foot.
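    As a rough illustration of what those densities mean at facility scale, the sketch below converts watts per square foot into total IT load and cooling demand; the 10,000-square-foot hall is a hypothetical example of ours, while the density figures come from the article.

    ```python
    # Facility-scale load implied by a given power density.
    # The hall area is hypothetical; the W/sq ft figures are from the text.

    HALL_AREA_SQFT = 10_000   # hypothetical raised-floor area

    for watts_per_sqft in (40, 150, 500):   # 1980s mainframe, today's minimum, high density
        it_load_kw = watts_per_sqft * HALL_AREA_SQFT / 1000
        # One ton of cooling removes ~3.517 kW of heat, and nearly all
        # IT power ends up as heat that the HVAC system must reject.
        cooling_tons = it_load_kw / 3.517
        print(f"{watts_per_sqft:>3} W/sq ft -> {it_load_kw:,.0f} kW IT load, "
              f"~{cooling_tons:,.0f} tons of cooling")
    ```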

    Convergence

    Advancements in technology have created a zone where everyday business and life converge: from the exponential growth in smartphone use and advancements in medical imaging, to the worlds of online retailing and social networking. As technology changes the fabric of one area, it drives up the demand for data in other areas, with a corresponding increase in demand for data centers.

    Continuously supporting these demands also drives the need for multiple data centers in different locations. Latency and survivability concerns require geographically diverse facilities, ensuring the delivery of services even if a facility in one area is inoperable due to an event such as a storm or earthquake.
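    The latency half of that constraint can be quantified directly: signals in optical fiber travel at roughly two-thirds the speed of light in a vacuum, so distance alone sets a floor on response time. The sketch below is our illustration, and the route distances are approximate examples.

    ```python
    # Lower bound on round-trip time imposed by geography alone.
    # Light in optical fiber propagates at roughly 2/3 of c; real
    # networks add switching delay and indirect routing on top.

    C_VACUUM_KM_S = 299_792   # speed of light in a vacuum, km/s
    FIBER_FACTOR = 2 / 3      # typical propagation speed in fiber

    def min_rtt_ms(distance_km: float) -> float:
        """Best-case round-trip time in milliseconds over a direct fiber path."""
        one_way_s = distance_km / (C_VACUUM_KM_S * FIBER_FACTOR)
        return 2 * one_way_s * 1000

    # Approximate great-circle distances, for illustration only:
    for route, km in [("Seattle-Chicago", 2800), ("Seattle-New York", 3900)]:
        print(f"{route}: >= {min_rtt_ms(km):.0f} ms round trip")
    ```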

    Cloud computing

    There are many visions of cloud computing, but generally speaking, the idea is that information can be accessed, processed and stored in a remote, safe and secure location away from the user’s office or home. Cloud computing provides businesses and consumers immediate access to all the functions they need, without their having to support an enormous and expensive computing environment. The major technology companies are all vying for dominance in this arena, driving up the demand for more data centers.

    Over the next few years, large providers will build cloud computing infrastructure across the globe to deliver base-level computing services. This will make computing functions like data storage and backup a global utility or commodity. Cloud computing will give emerging economies a way to access the necessary information and capability without having to invest heavily in computing infrastructure.

    Modular design and deployment

    There are many forms of modular data centers, from discrete pre-engineered infrastructure components to large-scale “data center in a box” deployments. All of these strategies allow data center users to fit their computation needs more carefully to capital and schedule requirements by building out capability in a just-in-time fashion.

    Traditional data center design and construction focused on delivering a large infrastructure, built out to allow for future growth. The downside of this approach is that it required large capital investments and created stranded capacity: much of the infrastructure sat unused until future deployments, bought and paid for but not generating revenue.

    Modular design and delivery methods allow for a more incremental deployment of data center infrastructure and more discrete levels of capital investment. By carefully matching computation demand to capacity, the data center designer can develop a strategic plan that allows an operator to build in discrete modules, closely tracking actual demand and financial targets.
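    A simple capacity model makes the trade-off visible. Every figure in the sketch below (the demand curve, module size and unit cost) is a hypothetical assumption of ours, not data from any project:

    ```python
    # Illustrative comparison of monolithic vs. modular capacity build-out.
    # All numbers here are hypothetical.

    demand_mw = [1.5, 3, 4.5, 7, 9]   # projected IT demand per year, MW
    FULL_BUILD_MW = 10                # monolithic day-one capacity
    MODULE_MW = 2                     # capacity added per module
    COST_PER_MW = 10.0                # capital cost, $M per MW

    # Monolithic: the full 10 MW is built (and paid for) on day one.
    monolithic_idle = sum(FULL_BUILD_MW - d for d in demand_mw)

    # Modular: add modules only as demand approaches installed capacity.
    built = 0.0
    modular_idle = 0.0
    for d in demand_mw:
        while built < d:
            built += MODULE_MW        # deploy another module
        modular_idle += built - d

    print(f"Stranded capacity, monolithic: {monolithic_idle:.1f} MW-years")
    print(f"Stranded capacity, modular:    {modular_idle:.1f} MW-years")
    print(f"Day-one capital: ${FULL_BUILD_MW * COST_PER_MW:.0f}M (monolithic) "
          f"vs ${MODULE_MW * COST_PER_MW:.0f}M (modular)")
    ```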

    Sustainability

    For many years, the data center world has been under the radar in terms of power usage, but as the demand for facilities continues to rise, so does awareness of their energy consumption. In 2007, an EPA report to Congress estimated that the data center market consumed approximately 61 billion kilowatt-hours of electricity, about 1.5 percent of total U.S. electricity consumption, and it projected consumption to reach 100 billion kilowatt-hours by 2011.

    The data center design industry faces the significant challenge of making these facilities more energy efficient while allowing for growth and maintaining high levels of reliability. One particular area of focus has been the HVAC system. Callison has been working with a number of industry-leading engineering firms to apply new concepts in energy-efficient design. For example, the Callison-designed Redmond Ridge Data Center achieved a 40 percent efficiency improvement over traditional design approaches through the use of evaporative cooling systems.
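    To put a gain of that size in perspective, the back-of-envelope sketch below estimates the annual energy saved; the baseline cooling load is our hypothetical assumption, and only the 40 percent figure comes from the article.

    ```python
    # Back-of-envelope energy savings from a more efficient cooling plant.
    # The baseline load is hypothetical; the 40% gain is the reported figure.

    BASELINE_COOLING_KW = 600     # hypothetical cooling power, traditional HVAC
    EFFICIENCY_GAIN = 0.40        # reported improvement

    improved_cooling_kw = BASELINE_COOLING_KW * (1 - EFFICIENCY_GAIN)
    saved_kw = BASELINE_COOLING_KW - improved_cooling_kw
    saved_kwh_per_year = saved_kw * 24 * 365   # data centers run around the clock

    print(f"Cooling power: {BASELINE_COOLING_KW} kW -> {improved_cooling_kw:.0f} kW")
    print(f"Annual savings: {saved_kwh_per_year:,.0f} kWh")
    ```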

    The future

    As demand continues to grow, data center operators will be challenged to find ways to provide high levels of service in a cost-efficient manner. Data center architects bring a holistic approach to the design process, coordinating the diverse disciplines to create a unified and cohesive design that maximizes building efficiency and minimizes the time and cost required to design and construct the facility.


    Leonard Ruff, AIA, has a comprehensive knowledge of analyzing, programming and planning mission critical data facilities. Bill Fetterley, AIA and LEED AP, has more than 30 years of experience dealing with technically complex buildings and construction issues. Both men are directors at Callison.

