Archive for February, 2013

Data Center Site Selection Based on Economic Modeling | 2013-02-05 | Mission Critical Magazine #cre #ccim #sior #datacenter

February 28, 2013


Data Center Site Selection Based on Economic Modeling | 2013-02-05 | Mission Critical Magazine.

Data Center Site Selection Based on Economic Modeling

The path to a high-performance data center starts at the beginning

By Debra Vieira

As the demand for data center capacity grows around the world, so too grows the need among data center owners, developers, and operators to find better ways to improve the economic performance as well as the energy efficiency of their facilities. According to a “Green Data Centers” report by Pike Research,1 the global market for green data centers is expected to more than double in size in the next four years. Underscoring that trend are announcements in recent months of large green data center investments by such bellwether companies as Apple, Microsoft, and IBM. As the industry has grown, so too has the concern that the data center industry may face regulatory pressure related to energy consumption. Some utilities are already expressing concerns about having enough power in the future to feed the number of data centers expected in their regions. Owners are justifiably seeking to discover new and better ways to improve their sustainable performance with reliable analytical methods and tools.


This article describes an economic modeling approach being used to help data center owners more wisely choose locations for new data center developments and to forecast costs along with long-term economic returns on investment associated with those prospective sites.

The data center site selection process has traditionally been influenced by such qualitative factors as personal preference, economic development incentives, or economic projections. The analytical approach described in this article avoids such pitfalls by applying a more comprehensive and quantitative means of evaluating potential data center sites.

The methodology developed applies a multi-variable approach measuring the impact of site location on a data center’s cost expressed in net present value and environmental performance. It calculates the impact site choices have on a project’s schedule and ability to be sustainable, evaluates options to incorporate on-site energy production, and factors in potential public and private incentives related to sites and other key decision-making drivers. The approach also evaluates such data center sustainability site factors as carbon usage effectiveness (CUE), water usage effectiveness (WUE), and power usage effectiveness (PUE). This approach has enabled reductions of up to 70% in some of these categories. The methodology is also designed to perform in multi-national site comparison scenarios.

Any advanced planning methodology of this kind must use validated data rather than hypothetical inclinations. Accordingly, the intent of models such as our “Opportunity Mapping” macro-analysis methodology is to replace subjective site selection decision-making with an objective data-driven process. The Opportunity Mapping methodology uses a geographic information system (GIS) approach of layering and weighing numerous site criteria with our Data Center Site Analysis Model, which applies a data-driven methodology to analyze prospective sites and forecast owner costs and economic returns on investment associated with specific prospective sites.
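As a toy illustration of the layering-and-weighting idea behind such a GIS approach, the sketch below scores candidate sites as a weighted average of criterion scores. The criteria, weights, and scores are invented for the example, not drawn from the Opportunity Mapping model itself.

```python
# Hypothetical weighted-criteria site scoring; all numbers are invented.

def site_score(scores, weights):
    """Weighted average of 0-10 criterion scores."""
    total_w = sum(weights[c] for c in scores)
    return sum(scores[c] * weights[c] for c in scores) / total_w

weights = {"energy_cost": 0.30, "network": 0.20, "hazards": 0.25, "incentives": 0.25}

site_a = {"energy_cost": 8, "network": 6, "hazards": 9, "incentives": 7}
site_b = {"energy_cost": 6, "network": 9, "hazards": 5, "incentives": 8}

ranked = sorted(["A", "B"],
                key=lambda s: site_score({"A": site_a, "B": site_b}[s], weights),
                reverse=True)
```

A real GIS stack layers dozens of such criteria and often applies hard exclusions (e.g., flood plains) before any weighting.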


Experience has shown that owners who rely on qualitative preference rather than quantitative site selection criteria run a greater risk of being disappointed in the end result. Craig Harrison, the developer of the Niobrara Energy Park in Weld County, Colorado, has been a vocal proponent of using data-driven validation for green data center development decisions. He underwrote a comprehensive economic comparison of already established data center markets in 11 U.S. states. His analysis objectively compares a range of factors including economic development incentive packages, energy costs, and taxes. The analytical model uses a 200,000-sq-ft data center entailing $240 million in construction along with $500 million in hardware at each compared location.
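The net-present-value framing behind such a comparison can be sketched as below: identical capital costs at each site, with site-specific annual operating costs discounted over the planning horizon. The discount rate, horizon, and operating figures are assumptions for illustration, not numbers from Harrison's study.

```python
# Hedged NPV cost comparison across candidate sites; rates and opex are invented.

def npv_cost(capex, annual_opex, rate, years):
    """Present value of up-front capital plus discounted annual operating costs."""
    return capex + sum(annual_opex / (1 + rate) ** t for t in range(1, years + 1))

CAPEX = 240e6 + 500e6  # construction + hardware, per the model described above

# Hypothetical annual operating costs (energy, labor, taxes) for two sites
site_x = npv_cost(CAPEX, 32e6, 0.07, 15)
site_y = npv_cost(CAPEX, 27e6, 0.07, 15)
cheaper = "Y" if site_y < site_x else "X"
```

Because capex is identical here, the ranking reduces to discounted operating costs; in practice incentives and construction-cost differences shift the capex term as well.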

Data applied in these models include construction costs, operating labor costs, renewable energy resources, incentives and special enterprise zones, workforce availability and qualifications, taxes, utility rates, logistics and transportation, network connectivity, transportation infrastructure, environmental and regulatory restrictions, weather, geology, land and building costs, environmental factors, proximity to raw materials, and markets and special environmental considerations.

Externally derived data are used to develop site-specific conclusions related to such factors as climate, utilities, natural hazards, regulatory issues, and economic development incentives. Examples of specific sources for such data include local utilities, the National Renewable Energy Laboratory (NREL), U.S. Geological Survey (USGS), U.S. Nuclear Regulatory Commission, Federal Energy Regulatory Commission (FERC), National Oceanic and Atmospheric Administration (NOAA), Energy Information Administration, and various regional offices of economic development and trade.

Data-driven decision-making is imperative for owners seeking to reduce their facilities’ resource consumption and energy-related costs. The goal of this economic modeling methodology is to provide a clear choice of a site for data center development. An example of using this Opportunity Mapping economic method was the data-driven analysis done for Colorado’s Niobrara Energy Park (NEP).


Natural disasters can be a significant threat to any data center. The NEP site evaluation included reviewing seismic activity based on data from the USGS National Seismic Hazard Map. To determine the frequency and potential impact to the site of severe wind, tornado, hail damage, and other historic weather events, data from NOAA’s Annual Severe Weather Report Summary were analyzed. Wildfire hazards as well as fuel sources (e.g., grasses) for wild fires were evaluated. Flood hazards based on 100- and 500-yr flood plains were reviewed and overlaid on the site to assist in master planning to develop locations of buildings and critical infrastructure.

Manmade disasters can also be a significant threat. Data were gathered and reviewed to understand NEP’s proximity to both active and inactive nuclear power plants and predicted fallout zones based on prevailing winds. Beyond disasters, the site was evaluated using topological maps to determine suitable building locations and flexibility for accommodation of future development, as well as natural security features. NEP was determined to be located in a low seismic hazard area, with minimal impacts by severe weather and wildfires and located outside of flood plains and manmade disaster areas.

The Opportunity Mapping methodology also evaluates site assets including electricity, network, natural gas, water, and wastewater. Site proximity to inexpensive electrical utility sources is vital for operations of any data center. It was determined that NEP was in close proximity to three large transmission-level electrical lines. The existing substation has available capacity, which can also be expanded. Additionally, opportunity exists to build a new large capacity substation that connects to the transmission lines. Locations of primary and secondary fiber networks were defined. Networks were evaluated for capacity, redundancy, and ease of network connectivity.

This evaluation determined that NEP’s location near one of the largest gas hubs in the nation provided access to some of the lowest spot gas prices in the country. Given the proximity to the hub and the resulting low fuel prices, the site lends itself to the construction of a natural gas-fired power plant to produce electricity. Water sources were evaluated for flow and quality to determine their potential for data center cooling. Wastewater discharge options are considered based on anticipated volumes and profiles for various data center cooling options. These discharge options were then reviewed against local discharge permit requirements to determine allowable disposal methods.


Data centers are significant consumers of energy. Efforts to reduce energy consumption by utilizing free-cooling opportunities based on local climate conditions can provide significant operational savings. A site’s climate is evaluated using historic psychrometric data and comparing the results to the American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) Class 1 recommended or allowable data center operational criteria, to determine potential energy savings through use of economizer cooling. As an illustration of this strategy, the NEP site was determined suitable for a cost-effective method of using external air to assist in cooling the data center.
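A crude first cut of that climate evaluation is a bin count: the fraction of hours in a typical year when outside air is cold enough to cool the data hall directly. A real psychrometric analysis would also check humidity against the ASHRAE envelope; this simplified dry-bulb-only version, with an assumed setpoint, is illustrative only.

```python
# Simplified airside-economizer screening; setpoint and sample data are assumptions.

def free_cooling_fraction(hourly_drybulb_c, setpoint_c=18.0):
    """Fraction of hours in which outside air alone can cool the data hall."""
    usable = sum(1 for t in hourly_drybulb_c if t <= setpoint_c)
    return usable / len(hourly_drybulb_c)

# Four sample hours of dry-bulb readings (degrees C); a real study uses 8,760
sample = [10.0, 20.0, 15.0, 25.0]
```

Sites with a high free-cooling fraction can run compressors far fewer hours per year, which feeds directly into the PUE estimate for the site.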

An even greater savings can be realized by using renewable energy sources to offset consumption from the local utility. Weather and wind data were evaluated for the NEP site. Analysis indicated that the potential for reliable wind-electric generation was substantial and that the NEP location was ideal for this renewable resource. Solar energy opportunities were studied to determine the potential capacities of photovoltaic farms as well as their optimal locations on the property. Application of fuel cell technology is also a possibility at the NEP site due to the nearby natural gas hub and inexpensive fuel. Storage of renewable energy is a challenge; therefore, sites are also evaluated for large-scale energy storage opportunities such as compressed air and thermal storage.

Due to the significant availability of renewable resources, the NEP site is a prime location for using microgrid technologies. This technology forecasts and manages energy generation and consumption. Capacities from renewable resources, natural gas, and traditional electric utilities are forecast against the critical load. A dispatch planning program determines the most cost-effective mix of power generation constrained around maximizing renewable resources and a schedule is created for the following day. The dispatch program then executes the schedule on the following day, adjusting in real time to compensate for deviations from forecasted load or for unplanned events.
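The day-ahead dispatch planning described above can be sketched as a merit-order allocation: for each forecast hour, serve the critical load from sources in priority order, renewables first. The source names, capacities, and costs below are invented, and a real dispatch program would solve a constrained optimization rather than this greedy loop.

```python
# Toy day-ahead dispatch schedule; sources and figures are hypothetical.

def plan_dispatch(hourly_load_kw, sources):
    """sources: (name, capacity_kw, cost_per_kwh) tuples in merit order,
    renewables first. Returns one {source: kW} mix per hour."""
    schedule = []
    for demand in hourly_load_kw:
        remaining, mix = demand, {}
        for name, capacity_kw, _cost in sources:
            take = min(capacity_kw, remaining)
            if take > 0:
                mix[name] = take
                remaining -= take
        schedule.append(mix)
    return schedule

sources = [("wind", 500, 0.00), ("solar", 200, 0.00),
           ("gas", 1000, 0.06), ("grid", 2000, 0.09)]
schedule = plan_dispatch([600, 1200], sources)
```

The real-time layer the article mentions would then adjust each hour's mix as actual load and renewable output deviate from the forecast.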


The Opportunity Mapping methodology researches business incentives from utility providers; grants from county, state and federal agencies; exemptions from sales and use taxes; and property tax abatement. Additionally, financial assistance programs, low-cost financing, development bonds, and tax credits are reviewed for applicability. All of these programs help define a community’s business environment and its willingness to accept and support new business from data center opportunities.

Less obvious site assets than those identified above include proximity to a skilled workforce and to transportation such as highways, airports, and railroads. The methodology related to these determining factors identifies nearby concentrations of workers in cities and towns as well as the availability of education for a regional workforce. Universities, colleges, and other high-technology companies are located and evaluated against the requirements of the data center. Highway access to the site as well as other regional travel infrastructure (e.g., rail and air) is considered to indicate ease of site access, and site proximity to local communities is weighed as a criterion related to a workforce’s inclination to be employed at the facility.

Site analysis is not complete until regulatory and permitting requirements have been analyzed. Storm water and wastewater discharge as well as air emissions must be in compliance with local, state, and federal regulations. Site locations are compared to air pollution attainment and nonattainment maps to determine impacts by stringent and lengthy air permitting processes. This method reviews the requirements, financial impacts of compliance, and potential schedule impacts.

Upon the validation of site assets and associated capacities, and with a full understanding of financial impacts and regulatory requirements, master planning and data center concepts can be developed that utilize assets to the fullest extent. These concepts can be further refined to define project costs, data center IT load, projected cooling requirements, and associated PUE, WUE, and CUE metrics for which solid, data-driven comparisons and decisions can be made for site selection.
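The three sustainability metrics cited above follow The Green Grid's standard ratio definitions, each dividing a facility-level total by the IT energy consumed. The sample figures in the test values are hypothetical.

```python
# Standard data center sustainability ratios (The Green Grid definitions).

def pue(total_facility_kwh, it_kwh):
    """Power usage effectiveness: dimensionless; 1.0 is the ideal."""
    return total_facility_kwh / it_kwh

def wue(site_water_liters, it_kwh):
    """Water usage effectiveness: liters of water per kWh of IT energy."""
    return site_water_liters / it_kwh

def cue(total_co2_kg, it_kwh):
    """Carbon usage effectiveness: kg CO2 emitted per kWh of IT energy."""
    return total_co2_kg / it_kwh
```

Because all three share the same denominator, site choices that cut total facility energy, water draw, or grid carbon intensity improve the metrics directly.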


Another example of Opportunity Mapping was the data-driven analysis done for the Lefdal Mine Data Center located in Måløy, Norway. This project went through the same extensive evaluation process as NEP. The Lefdal mine is an inactive olivine mine. Olivine is a magnesium iron silicate mineral, typically olive-green in color, that is known as peridot when found in gem quality. Industrial-quality olivine is used as an abrasive or as refractory sand (it resists high temperatures). The olivine was extracted using a room-and-pillar technique, which leaves pillars of untouched material to support the roof while the extracted areas become the rooms.

The results of the evaluation process indicated that the Lefdal mine is an excellent location for a data center complex. There is no underground gas from the ore and the mine offers natural protection from electromagnetic pulse (EMP) waves, making these large rooms an ideal location to house high-value data centers as well as supporting mechanical and electrical infrastructure. Other site assets include a large, cold fjord that can be used for cooling. The fjord is ice-free and fed by four different glaciers, thereby maintaining a constant temperature. Conceptual cooling utilizes a seawater circulation system with a heat exchanger and another water circulation system within the complex. Norway’s national fiber backbone is located near the mine and is in close proximity to wind and hydroelectric renewable power sources. Wind power from nearby wind farms has already proven itself a viable source and an early projection of hydroelectric power capacity is in the range of 6 terawatts. The Lefdal site is located near three major airports and is near a large harbor and shipping port. Road access conforms to European standards for deliveries of construction and production materials.

Understanding the Lefdal site’s assets and capacities allowed for master planning and data center concepts to be developed, which utilized the cold fjord and significant adjacent renewable energy resources. These concepts helped to define project costs, critical IT load, and cooling requirements along with other data center metrics allowing for objective site selection.


While this Opportunity Mapping methodology is excellent for large data center developments such as NEP and Lefdal, the same methodology can be used for smaller assessments. For example, evaluation of the potential for combined heat and power (CHP) of which both power and heat are produced from a single fuel source for data center power and cooling. The heat recovered from the production of power can be utilized in an absorption chiller to produce chilled water for data center cooling. Technologies used for power production can include reciprocating engines, gas turbines, microturbines, and fuel cells. By developing financial models, which incorporate location and climates, utility rates, local incentives, and projected electrical demands, impacts to capital and operating expenses can be quantified. This allows for technology comparisons as well as site comparisons using objective financial and performance metrics.
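A minimal version of such a CHP financial model compares annual operating cost under two options: buying grid power and running electric chillers, versus generating on site and driving an absorption chiller with recovered heat. Every price, efficiency, and load figure below is an assumption for illustration, and the model ignores capital cost, maintenance, and standby charges.

```python
# Simplified CHP-versus-grid annual operating cost comparison; all inputs invented.

def grid_annual_cost(it_elec_kwh, cooling_kwh, grid_price, chiller_cop=4.0):
    """Buy all electricity from the utility; cooling via electric chillers."""
    return (it_elec_kwh + cooling_kwh / chiller_cop) * grid_price

def chp_annual_cost(it_elec_kwh, gas_price, elec_eff=0.35):
    """Generate on site; recovered heat drives the absorption chiller, so
    cooling carries no extra fuel cost in this simplified model."""
    return (it_elec_kwh / elec_eff) * gas_price

grid = grid_annual_cost(10e6, 4e6, grid_price=0.09)  # $/kWh grid tariff
chp = chp_annual_cost(10e6, gas_price=0.03)          # $/kWh gas, assumed cheap
```

With these assumed numbers the CHP option wins on operating cost; a full model would add capex, absorption-chiller capacity limits, and sensitivity to gas price swings before comparing sites.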

By considering the economic modeling methodology presented, data center owners can compare technologies, evaluate locations for site selections, evaluate environmental performance, estimate financial impacts and assistance programs, and wisely forecast long-term investments. Objective, quantifiable, and reliable net present value costs of a data center can be developed, thereby avoiding the pitfalls of qualitative evaluation.


1.“Green Data Centers IT Equipment, Power and Cooling Infrastructure, and Monitoring and Management Tools: Global Industry Drivers, Market Analysis and Forecasts.” Pike Research. April 23, 2012.

Categories: DataCenter

C&W -Canadian Industrial Marketbeat- Q4 #CRE #CCIM #SIOR #INDUSTRIAL #REALESTATE

February 27, 2013

Guy Masse, CCIM, SIOR
Senior Vice President
Cushman & Wakefield

2001 University, suite 1950
Montreal, Quebec H3A 2A6
Tel.: 514 841 3830

Preview of Google’s New Built-from-Scratch Googleplex | #cre #ccim #sior

February 27, 2013

Exclusive Preview: Google’s New Built-from-Scratch Googleplex | Vanity Fair.

By Paul Goldberger


Google occupies some of the most famous offices in the world—think cafés everywhere you look, treadmills with laptops attached to them, pool tables and bowling alleys, green buildings, and vegetable gardens—but not one of the places in which the company’s 35,000+ employees work has been built by the company. The core of the “Googleplex,” as the headquarters in Mountain View, California, is generally known, consists of a suburban office park once occupied by Silicon Graphics that Google remodeled to suit its needs; in New York, Google occupies (and owns) the enormous former Port Authority headquarters in Chelsea. The company has been similarly opportunistic around the world, taking over existing real estate and, well, Google-izing it. “We’ve been the world’s best hermit crabs: we’ve found other people’s shells, and we’ve improved them,” David Radcliffe, a civil engineer who oversees the company’s real estate, said to me. Under Radcliffe, the company’s home base in Mountain View has expanded to roughly 65 buildings.

For the last year or two, Google has been toying with taking the plunge and building something from scratch. In 2011 it went so far as to hire the German architect Christophe Ingenhoven to design a brand new, super-green structure on a site next to the Googleplex, but that was a false start: the company abandoned the project a year later, when it decided to build in another part of Mountain View, closer to San Francisco Bay, and went looking for another architect. Now Google has partnered with the Seattle-based firm NBBJ, a somewhat more conventional choice. And the renderings of the new project, which Google has made available to Vanity Fair, show something that looks, at first glance, like an updated version of one of the many suburban office parks that Google has made a practice of taking over and re-doing for its own needs.

The more you look at the complex, however, the more intriguing it is. The new campus, which the company is calling Bay View, consists of nine roughly similar structures, most of which will be four stories high, and all of which are shaped like rectangles that have been bent in the middle. The bent rectangles are arranged to form large and small courtyards, and several of the buildings have green roofs. All of the structures are connected by bridges, one of which will bring people directly to one of the green roofs that has been done up with an outdoor café and gathering space. And cars, the bane of almost every suburban office complex, including the Googleplex, are hidden away.

What is really striking about this project, however, isn’t what the architecture will look like, about which renderings can show only so much anyway. It’s the way in which Google decided what it wanted and how it conveyed this to its architects. Google is, as just about everyone in the world now knows, the most voracious accumulator of data on the planet. When it decided to build a building, it did what it did best, which was to gather data. Google studied, and tried to quantify, everything about how its employees work, about what kind of spaces they wanted, about how much it mattered for certain groups to be near certain other groups, and so forth.

The layout of bent rectangles, then, emerged out of the company’s insistence on a floor plan that would maximize what Radcliffe called “casual collisions of the work force.” No employee in the 1.1-million-square-foot complex will be more than a two-and-a-half-minute walk from any other, according to Radcliffe. “You can’t schedule innovation,” he said. “We want to create opportunities for people to have ideas and be able to turn to others right there and say, ‘What do you think of this?’”

What may be most significant is that the company’s research led to a design that isn’t substantially different from the existing Google buildings, just more so. The older buildings have a mix of private, quiet work spaces (though no private offices) and social and communal work spaces; so will the new one. The older buildings are full of cafés; the new complex will be, too. Radcliffe said that “the cafés were validated” in Google’s studies, as if anyone were surprised. The existing buildings have a relaxed and casual, even whimsical, quality to their interiors, as if to say that pleasure is a part of efficiency; I’m not sure how Google quantifies this except by seeing how many workers like it, but here, too, the plan is to continue on the same track, even if the new buildings aren’t likely to feel quite as improvised. And as the existing buildings have been retrofitted to conserve energy, the new ones will be even greener. And so on.

A lot of this seems like a statement of the obvious, but then again, lots of data is. And architecture, which is so often form-driven, doesn’t necessarily suffer from a bit more attention to factors other than shapes. “We started not with an architectural vision but with a vision of the work experience,” Radcliffe said. “And so we designed this from the inside out.”


Data Centre Risk Index | #datacenter #cre #ccim #sior #realestate

February 20, 2013

Data Centre Risk Index launched

via Building Services Design – M&E Engineering Consultants.

International consultancies Cushman & Wakefield and hurleypalmerflatt have issued a new study evaluating the risks to global data centre facilities and international investment in business critical IT infrastructure. The firms have carried out a joint study evaluating risk in 20 leading and emerging markets and across key regional centres. The study identifies a weakness in industry decision-making processes and argues that companies should be evaluating their risk across a greater number of criteria and locations and mitigating or managing certain risks before investing in data centres.

According to the Data Centre Risk Index, published today by Cushman & Wakefield and hurleypalmerflatt, the demand for data storage capacity, accelerated by recent technological advances, means that more and more companies are investing in data centres overseas and potentially increasing their exposure to risk.

Data centres support business critical IT systems, and any downtime can cost millions in lost revenue and even threaten the viability of a business. The Index highlights the impact on business continuity resulting from extreme acts of nature – as witnessed in Japan, New Zealand, Iceland, the USA, and Australia – and political instability following the unrest in North Africa and the Middle East.

The Index ranks countries according to the risks likely to affect the successful operation of a data centre and also identifies factors such as high energy costs, poor international bandwidth, and protectionist legislation as major risks.

The U.S. ranks first in the index, with the lowest risk for locating a data centre, reflecting the low cost of energy and its favourable business environment. It is followed by Canada in second position and Germany in third.
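One plausible way 0-to-100 index scores like those published could be produced is to combine weighted risk criteria into a raw composite and rescale so the lowest-risk country reads 100. The raw figures below are reverse-engineered guesses for illustration, not the index's actual methodology.

```python
# Hypothetical rescaling of composite risk scores to a 100-point index.

def normalize_to_100(raw_scores):
    """Rescale composite scores so the best (lowest-risk) country reads 100."""
    best = max(raw_scores.values())
    return {country: round(100 * s / best) for country, s in raw_scores.items()}

raw = {"United States": 8.20, "Canada": 7.46, "Germany": 7.05}
index = normalize_to_100(raw)
```

Normalizing against the best performer makes the scores relative rankings, so a country's score can change year to year even if its own risk profile does not.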

Data Centre Risk Index 2011

Rank   Index Score   Country
1      100           United States
2      91            Canada
3      86            Germany
4      85            Hong Kong
5      82            United Kingdom
6      81            Sweden
7      80            Qatar
8      78            South Africa
9      76            France
10     73            Australia
11     71            Singapore
12     70            Brazil
13     67            Netherlands
14     64            Spain
15     62            Russia
16     61            Poland
17     60            Ireland
18     56            China
19     54            Japan
20     51            India

Stephen Whatling, Global Services Director at hurleypalmerflatt, said: “Despite their status as engines of global growth, China and India score poorly as a result of strict foreign ownership regulations and other barriers to investment.
“Brazil is a key emerging market, currently enjoying substantial growth and attention from foreign investors.  With improvements in international bandwidth and infrastructure and tax reforms for non-domiciled companies, Brazil could emerge as a Latin American technology powerhouse.”
Keith Inglis, Partner at Cushman & Wakefield, said: “Sweden, Qatar and South Africa are untapped markets and attractive locations, although requiring further investment in infrastructure.
“Meanwhile high corporation tax, energy and labour costs in the United Kingdom mean there is a risk that owners and operators could begin to look overseas to reduce overheads.”

Categories: DataCenter

2012 Migration Patterns – #CRE #CCIM #SIOR

February 13, 2013

C&W MarketBeat Series 4Q12 #cre #ccim #sior #realestate

February 11, 2013


C&W MarketBeat Series 4Q12.

The latest Q4 #global marketbeat reports are out, for more information regarding markets around the globe, click on to: …

LED lighting gaining traction in commercial retrofits | #cre #ccim #sior #sustainability

February 7, 2013

LED lighting gaining traction in commercial retrofits

Published February 06, 2013


Once thought to be too costly for commercial buildings, LED lighting is increasingly popping up in warehouses and commercial facilities as part of energy retrofit projects.

Atlas Box, a Massachusetts-based manufacturer of protective packaging for electronics and heavy equipment, has embarked on a plan to reduce energy consumption by 55 percent in two facilities by installing LED lighting systems. The two-year retrofit project for installing lighting controls and other energy efficiency measures got underway thanks to a no upfront cost financing program set up by the local utility, National Grid.

Energy, of course, remains a significant cost for buildings and facility managers, and “anything we can do to manage and control energy costs gives us a competitive advantage,” said Frank Tavares, global process engineer with Atlas Box and project leader.

Atlas worked with Groom Energy to perform building energy assessments to map savings opportunities. Tavares said he had assumed the warehouse’s machinery and power-cutting equipment used the most energy in the three-year-old facility. “But we were wrong, lights use more and gave us the opportunity to target something right away” to save energy.

For Atlas’s interior warehouse lighting, Groom Energy installed a Digital Lumens system enabling occupancy-based lighting and the ability to track and manage the system through software.

Having individual control of each fixture, as well as remote monitoring and Web-based dashboard tools, were critical features in selling Tavares on the project.

Indeed, wireless controls with underlying analytics and data management to help building managers track and monitor energy savings helps justify the investment in new LED technologies, writes Casey Talon with IDC Energy Insights.

In addition to Digital Lumens, there are a number of new players in the market for advanced lighting with Internet-enabled controls, including Enlighted, Adura Technologies (which was recently acquired by Acuity Brands), Redwood Systems, and Daintree Networks.

But securing the initial capital expense, which ran upwards of $500,000, for retrofitting a seemingly new, three-year-old facility was a tough sell, said Tavares. The company received a number of incentives from the local utility National Grid, including a finance program that covers 70 percent of energy efficiency upgrade costs with an option to finance the remaining 30 percent with no interest over the next two years.
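The arithmetic of those financing terms is worth spelling out: with the utility covering 70 percent and the 30 percent balance spread interest-free over two years, the customer's monthly outlay is small relative to the project. The $500,000 figure is an approximation taken from the article; the rest is straightforward calculation.

```python
# Back-of-the-envelope financing schedule under the described terms.

project_cost = 500_000                          # approximate, per the article
utility_share = 0.70 * project_cost             # covered by National Grid incentive
customer_share = project_cost - utility_share   # 30% financed at 0% interest
monthly_payment = customer_share / 24           # spread over two years
```

Whether the lighting savings alone cover that monthly payment depends on the facility's rates and hours, which is exactly what the energy assessment is meant to establish.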

“We are committed to promoting and offering programs that will help our customers save money while reducing their carbon footprint,” said Marcy Reed, president of National Grid Massachusetts, in a statement about the project. “We’re proud to partner with [Atlas Box] and help defray the costs through program incentives that will make the project a reality.”

Energy efficiency retrofits can produce big savings for commercial building owners with little upfront costs.

FirstFuel, a Boston-based energy analytics startup, released a report today revealing that 51 percent of all energy efficiency opportunities could be achieved through low- and no-cost operational improvements. The firm calculated the savings across the entire U.S. commercial building market as a $17 billion opportunity for operational improvements. An infographic showing the energy savings can be found here.

Categories: Sustainability, Workspace