<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>Mohamad Hnayno, Author at OVHcloud Blog</title>
	<atom:link href="https://blog.ovhcloud.com/author/mohamad-hnayno/feed/" rel="self" type="application/rss+xml" />
	<link>https://blog.ovhcloud.com/author/ali-chehade/?mab_v3=24781</link>
	<description>Innovation for Freedom</description>
	<lastBuildDate>Fri, 12 Dec 2025 14:18:19 +0000</lastBuildDate>
	<language>en-GB</language>
	<sy:updatePeriod>
	hourly	</sy:updatePeriod>
	<sy:updateFrequency>
	1	</sy:updateFrequency>
	<generator>https://wordpress.org/?v=6.9.4</generator>

<image>
	<url>https://blog.ovhcloud.com/wp-content/uploads/2019/07/cropped-cropped-nouveau-logo-ovh-rebranding-32x32.gif</url>
	<title>Mohamad Hnayno, Author at OVHcloud Blog</title>
	<link>https://blog.ovhcloud.com/author/ali-chehade/?mab_v3=24781</link>
	<width>32</width>
	<height>32</height>
</image> 
	<item>
		<title>Understanding OVHcloud&#8217;s data centres: how Delta T helps with water efficiency</title>
		<link>https://blog.ovhcloud.com/understanding-ovhclouds-data-centres-how-delta-t-helps-with-water-efficiency/</link>
		
		<dc:creator><![CDATA[Mohamad Hnayno,&#160;Julien Jay&#160;and&#160;Ali Chehade]]></dc:creator>
		<pubDate>Tue, 21 Mar 2023 09:40:00 +0000</pubDate>
				<category><![CDATA[OVHcloud Engineering]]></category>
		<guid isPermaLink="false">https://blog.ovhcloud.com/?p=24781</guid>

					<description><![CDATA[In our latest installment we introduced the merits of our Hybrid Immersion Liquid Cooling Technology. This new approach to cooling essentially enables new scenarios within the data centre all the while significantly reducing power consumption by up to 20% for cooling systems dedicated to servers. Amongst other benefits, it helps for operation in harsh climatic [&#8230;]]]></description>
										<content:encoded><![CDATA[
<p>In our latest installment we introduced the merits of our <a href="https://blog.ovhcloud.com/new-hybrid-immersion-liquid-cooling-developments-at-ovhcloud/" data-wpel-link="internal">Hybrid Immersion Liquid Cooling Technology</a>. This new approach to cooling enables new scenarios within the data centre while significantly reducing power consumption, by up to 20% for cooling systems dedicated to servers. Amongst other benefits, it allows operation in harsh climatic zones. While OVHcloud’s R&amp;D is hard at work on the next generation of cooling, it also keeps improving the basic building blocks of cooling in the data centre. Today we’ll discuss in more detail one aspect of OVHcloud-designed data centres that is crucial to reducing water consumption and overall cooling costs compared to the mechanical cooling systems used in conventional data centres: Delta T.</p>



<figure class="wp-block-image aligncenter size-large is-resized"><img fetchpriority="high" decoding="async" src="https://blog.ovhcloud.com/wp-content/uploads/2023/03/IMG_1344-1024x538.jpg" alt="Understanding OVHcloud's data centres: how Delta T helps with water efficiency" class="wp-image-24909" width="512" height="269" srcset="https://blog.ovhcloud.com/wp-content/uploads/2023/03/IMG_1344-1024x538.jpg 1024w, https://blog.ovhcloud.com/wp-content/uploads/2023/03/IMG_1344-300x158.jpg 300w, https://blog.ovhcloud.com/wp-content/uploads/2023/03/IMG_1344-768x404.jpg 768w, https://blog.ovhcloud.com/wp-content/uploads/2023/03/IMG_1344.jpg 1199w" sizes="(max-width: 512px) 100vw, 512px" /></figure>



<p>Ever since 2003, OVHcloud has developed watercooling at scale for its servers in a free cooling approach that avoids power-hungry air conditioning, reducing both power costs and water usage. Over the years, our technology has been refined from generation to generation, and all this work has shown us that one crucial but often overlooked parameter is the Delta T: the difference in temperature between the water entering and leaving the data centre. From this single parameter comes greater efficiency, particularly when it comes to Water Usage Effectiveness (WUE). We continue to refine it, with the most recent milestone being Delta T=20K. Ready for the deep dive?</p>



<h3 class="wp-block-heading">Overview of OVHcloud cooling DC principles</h3>



<p>A typical OVHcloud data centre benefits from an indirect free cooling technique, also known as IFC, with the deployment at scale of a liquid cooling system. This improves global data centre power efficiency, with up to 50% energy savings compared to conventional mechanical cooling. With a cooling strategy that combines liquid cooling and air cooling, OVHcloud’s data centres aim to be highly efficient.</p>



<p>Our proprietary watercooling system recovers the waste heat generated by our servers through carefully crafted waterblocks in direct contact with the chips. Breaking the system down, there are four cooling bricks in the data centre:</p>



<ul class="wp-block-list">
<li>Dry Cooler (heat rejection system),</li>



<li>Pumping Substation (PSS),</li>



<li>Rack Rear Door Heat Exchanger (RDHX) and Cooling Distribution Unit (CDU),</li>



<li>Server waterblock.</li>
</ul>



<p>A typical water circuit in an OVHcloud data centre works like this: cold water is pumped from a pumping substation (PSS) to the rack, where it first reaches the Rear Door Heat Exchangers (RDHX). Each rack comes with 3 RDHXs, each door having its own set of fans providing air cooling capacity. Water then flows through the Cooling Distribution Units (CDU): placed inside the rack, CDUs comprise pumps and plate heat exchangers (PHEX). They distribute liquid to the waterblocks, effectively acting as a watercooling system, and the heat from the IT equipment is transported back to the dry coolers located outside the data centre for heat rejection.</p>



<p>Playing an important role in the cooling infrastructure, the dry coolers use outside air to cool down the circulating liquid pumped from the data centre, without any refrigeration. Depending on the climatic conditions, an evaporative cooling (EC) system, the most efficient approach to date, cools the outside air before it reaches the fins of the dry cooler.</p>



<h3 class="wp-block-heading">Kelvin, how delta T is measured and why it’s a key parameter</h3>



<p>In the research field, the norm is to express temperature differences in Kelvin, while absolute temperatures are measured in °C, °F or K. In this document, temperature is expressed in °C, while temperature difference is expressed in K. When it comes to our data centres, Delta T is the difference in water temperature between the inlet and outlet of a rack (in other words, the in/out water temperature of the data centre): the greater this difference, the more efficient the data centre. Why? An increase in the temperature difference across the racks (dT) reduces the water flow rate required per rack, allowing the use of lower-capacity pumps. Thus, an increase in the temperature difference permits a higher heat load per pumping unit and dry cooler for an equivalent PSS hydraulic capacity and heat rejection system.</p>
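<p>The relationship above follows directly from the heat balance Q = ṁ·c<sub>p</sub>·ΔT for a water loop. A minimal sketch, using illustrative values rather than OVHcloud's actual rack figures:</p>

```python
# Heat balance for a water loop: Q = m_dot * c_p * dT.
# Illustrative sketch only; the 20 kW rack load below is a hypothetical value.
C_P_WATER = 4186.0   # J/(kg.K), specific heat capacity of water
RHO_WATER = 1000.0   # kg/m^3, density of water

def flow_rate_l_per_min(heat_load_w: float, delta_t_k: float) -> float:
    """Water flow rate needed to carry heat_load_w at a temperature rise of delta_t_k."""
    mass_flow_kg_s = heat_load_w / (C_P_WATER * delta_t_k)
    return mass_flow_kg_s / RHO_WATER * 1000.0 * 60.0  # kg/s -> l/min

# Quadrupling Delta T from 5K to 20K divides the required flow by four:
print(round(flow_rate_l_per_min(20_000, 5), 1))   # 57.3 l/min
print(round(flow_rate_l_per_min(20_000, 20), 1))  # 14.3 l/min
```

<p>This is why a higher Delta T allows smaller pumps for the same heat load.</p>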



<h3 class="wp-block-heading">From Delta T=5K to Delta T=20K</h3>



<p>As we have seen, a greater temperature difference brings more efficiency. It also means that the performance of the dry cooler is significantly improved: water enters at higher temperature levels, while fan and evaporative cooling water consumption per kWh of IT is reduced because a lower air flow rate is required. That latter consequence allows the ventilation capacity to be optimised.</p>



<p>Since 2019, we have upgraded our most recent data centres to a new generation of cooling infrastructure capable of operating at Delta T=20K instead of Delta T=5K. We rely on fewer, more efficient dry coolers per data centre, with smaller pumps and reduced piping. The result is a potentially higher power density (kW/m2) for the data centre, a lower overall carbon footprint, better PPUE and improved WUE, as well as reduced CAPEX and OPEX.</p>



<ul class="wp-block-list">
<li>Delta T=5K explained</li>
</ul>



<p>In a traditional OVHcloud data centre with a Delta T=5K setting, the data centre inlet water temperature is maintained at 25°C, providing an inlet air temperature of 30°C and a waterblock inlet water temperature of 35°C. As shown in the figure below, two circuits in parallel are connected to the rack.</p>



<figure class="wp-block-image aligncenter size-large is-resized"><img decoding="async" src="https://blog.ovhcloud.com/wp-content/uploads/2023/03/IMG_1340-1024x605.png" alt="Delta T=5K" class="wp-image-24905" width="512" height="303" srcset="https://blog.ovhcloud.com/wp-content/uploads/2023/03/IMG_1340-1024x605.png 1024w, https://blog.ovhcloud.com/wp-content/uploads/2023/03/IMG_1340-300x177.png 300w, https://blog.ovhcloud.com/wp-content/uploads/2023/03/IMG_1340-768x454.png 768w, https://blog.ovhcloud.com/wp-content/uploads/2023/03/IMG_1340.png 1488w" sizes="(max-width: 512px) 100vw, 512px" /></figure>



<p>Through constant development, we improved the cooling design of our rear door heat exchanger (RDHX) to increase the inlet water temperature and reduce the pinch between cold air and cold water: this way, the data centre inlet water temperature is now set at 27°C, against 25°C previously.</p>



<ul class="wp-block-list">
<li>Delta T=20K explained</li>
</ul>



<p>Building on the notable improvements discussed above, we were also able to set up a serial circuit with the PHEX and RDHX at the rack level in order to achieve a complete Delta T=20K environment, as shown in the figure.</p>



<figure class="wp-block-image aligncenter size-large is-resized"><img decoding="async" src="https://blog.ovhcloud.com/wp-content/uploads/2023/03/IMG_1345-1024x681.png" alt="Delta T=20K explained" class="wp-image-24926" width="512" height="341" srcset="https://blog.ovhcloud.com/wp-content/uploads/2023/03/IMG_1345-1024x681.png 1024w, https://blog.ovhcloud.com/wp-content/uploads/2023/03/IMG_1345-300x199.png 300w, https://blog.ovhcloud.com/wp-content/uploads/2023/03/IMG_1345-768x511.png 768w, https://blog.ovhcloud.com/wp-content/uploads/2023/03/IMG_1345.png 1322w" sizes="(max-width: 512px) 100vw, 512px" /></figure>



<p>With water temperature maintained at 27°C, the inlet air temperature of the data centre remains at 30°C and the waterblock inlet water temperature reaches 45°C. The data centre water outlet temperature is set at 47°C. As a side note, our robust design gives us confidence in our ability to operate with a higher temperature profile, still in a Delta T=20K setting, such as 32-52°C.</p>



<figure class="wp-block-video"><video height="1080" style="aspect-ratio: 1920 / 1080;" width="1920" autoplay loop muted src="https://blog.ovhcloud.com/wp-content/uploads/2023/03/animation-Delta-T.mp4"></video></figure>



<h3 class="wp-block-heading">What are the benefits of Delta T=20K vs Delta T=5K? And what are the advantages when applied to different data centre locations?</h3>



<p>To measure the gains of OVHcloud’s IFC system operating with a Delta T=20K, we conducted an extensive series of benchmarks in various locations, with the same Cooling Module configured first at Delta T=5K and then at Delta T=20K. This allowed us to highlight significant gains in terms of water consumption.</p>



<p>We adopted the following methodology: we collected and analysed the weather data for the past 40 years, including temperature and relative humidity, with a record every 30 minutes. For the Roubaix site, for example, the warmest year happened to be 2019. The figure below shows the temperature frequency in hours, extracted from the climatic data recorded at Roubaix in 2019. The key hours are between 5 and 20°C.</p>



<figure class="wp-block-image aligncenter size-large is-resized"><img loading="lazy" decoding="async" src="https://blog.ovhcloud.com/wp-content/uploads/2023/03/4-1024x591.png" alt="" class="wp-image-24787" width="512" height="296" srcset="https://blog.ovhcloud.com/wp-content/uploads/2023/03/4-1024x591.png 1024w, https://blog.ovhcloud.com/wp-content/uploads/2023/03/4-300x173.png 300w, https://blog.ovhcloud.com/wp-content/uploads/2023/03/4-768x443.png 768w, https://blog.ovhcloud.com/wp-content/uploads/2023/03/4-1536x886.png 1536w, https://blog.ovhcloud.com/wp-content/uploads/2023/03/4-2048x1181.png 2048w" sizes="auto, (max-width: 512px) 100vw, 512px" /></figure>
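<p>The temperature-frequency binning behind this figure can be sketched as follows. The function name and the 1°C bin width are our own illustrative choices; only the half-hourly record format comes from the methodology described above:</p>

```python
import math
from collections import Counter

def temperature_frequency_hours(records_c):
    """Bin half-hourly temperature records into hours per 1-degree-C bin.

    records_c: iterable of temperatures (degrees C), one per 30-minute record.
    Two records falling in the same bin therefore count as one hour.
    """
    bins = Counter(math.floor(t) for t in records_c)
    return {t: n * 0.5 for t, n in sorted(bins.items())}

# Toy example with four half-hourly records:
print(temperature_frequency_hours([5.2, 5.7, 12.1, 19.9]))
# {5: 1.0, 12: 0.5, 19: 0.5}
```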



<p>Based on our working principle, we can deduce the WUE in every country where we have a data centre. For reference, in this <a href="https://www.sciencedirect.com/science/article/abs/pii/S1359431122006779?via%3Dihub" data-wpel-link="external" target="_blank" rel="nofollow external noopener noreferrer">scientific paper</a> we estimated the theoretical WUE for some of our locations that best represent a variety of climatic conditions: Roubaix (France), Mumbai (India), Singapore and Vint Hill (USA). Below is the estimated WUE for Roubaix:</p>



<figure class="wp-block-image aligncenter size-full is-resized"><img loading="lazy" decoding="async" src="https://blog.ovhcloud.com/wp-content/uploads/2023/03/EstimatedWUERoubaixDT20.png" alt="" class="wp-image-24844" width="502" height="351" srcset="https://blog.ovhcloud.com/wp-content/uploads/2023/03/EstimatedWUERoubaixDT20.png 772w, https://blog.ovhcloud.com/wp-content/uploads/2023/03/EstimatedWUERoubaixDT20-300x210.png 300w, https://blog.ovhcloud.com/wp-content/uploads/2023/03/EstimatedWUERoubaixDT20-768x537.png 768w" sizes="auto, (max-width: 502px) 100vw, 502px" /></figure>



<p>This theoretical result is impressive. Yet it’s important to have a better sense of what WUE represents: we estimate that in an IFC with a Delta T=20K setting, 381 m<sup>3</sup> of water is enough to cool a 600 kW DC, versus 964 m<sup>3</sup> in an IFC with a Delta T=5K setting.</p>



<p>The table below is a summary of the most glaring changes between Delta T=5K and Delta T=20K:</p>



<figure class="wp-block-table">
  <table class="ovhcloud_table">
    <thead>
      <tr>
        <th>&nbsp;</th>
        <th>dT= 5K</th>
        <th>dT= 20K</th>
      </tr>
    </thead>
    <tbody>
      <tr>
        <td>Temperature profile</td>
        <td>25°C &#8211; 30°C @ 5K</td>
        <td>27°C &#8211; 47°C @ 20K</td>
      </tr>
      <tr>
        <td>Water flow rate per server</td>
        <td>Up to 0.25 l/min</td>
        <td>Up to 0.5 l/min</td>
      </tr>
      <tr>
        <td>Dry cooler consumption for 600 kW of IT workload</td>
        <td>Up to 48 kW</td>
        <td>Up to 24 kW</td>
      </tr>
      <tr>
        <td>Pumping Substation(s) (PSS) for 600 kW of IT workload (pump consumption)</td>
        <td>Up to 17 kW</td>
        <td>Up to 3 kW</td>
      </tr>
      <tr>
        <td>Rack internal pumping (CDU)</td>
        <td>0</td>
        <td>Up to 7 kW for 600 kW of IT workload</td>
      </tr>
      <tr>
        <td>PSS footprint</td>
        <td>125 kg/pump</td>
        <td>50 kg/pump</td>
      </tr>
      <tr>
        <td>Average annual WUE (l/kWh)</td>
        <td>0.16</td>
        <td>0.09</td>
      </tr>
    </tbody>
  </table>
</figure>
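<p>As a back-of-the-envelope reading of the table, summing the upper-bound cooling consumption figures gives the total cooling power overhead in each setting for 600 kW of IT load:</p>

```python
# Upper-bound cooling consumption figures taken from the table above (kW),
# for 600 kW of IT workload.
IT_LOAD_KW = 600.0

cooling_kw = {
    "dT=5K":  {"dry coolers": 48.0, "PSS pumps": 17.0, "CDU pumps": 0.0},
    "dT=20K": {"dry coolers": 24.0, "PSS pumps": 3.0,  "CDU pumps": 7.0},
}

for profile, parts in cooling_kw.items():
    total = sum(parts.values())
    print(f"{profile}: {total:.0f} kW cooling, {100 * total / IT_LOAD_KW:.1f}% of IT load")
# dT=5K: 65 kW cooling, 10.8% of IT load
# dT=20K: 34 kW cooling, 5.7% of IT load
```

<p>Even before counting the water savings, the cooling electrical overhead is roughly halved.</p>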



<h3 class="wp-block-heading">Versus the competition?</h3>



<p>To date, two families of liquid cooling techniques are employed within data centres: close-coupled cooling and direct liquid cooling (see table). Watercooling solutions take advantage of the heat-carrying capacity of water, which is 3.3 times higher than that of air, for a dramatic reduction in thermal resistance. With CPUs and GPUs being the most power-hungry parts of a server, liquid cooling via direct-to-chip contact returns warmer water from the IT equipment than close-coupled cooling systems do. Logically, the coolant temperature directly impacts energy savings and cooling efficiency, contributing to a significant decrease of 14 to 20% in total DC site energy consumption<sup>1</sup>. More information on the above can be found <a href="https://www.sciencedirect.com/science/article/abs/pii/S1359431122006779?via%3Dihub" data-wpel-link="external" target="_blank" rel="nofollow external noopener noreferrer">here</a>.</p>



<figure class="wp-block-table">
<table class="ovhcloud_table">
  <thead>
    <tr>
      <th colspan="2">Liquid cooling solutions</th>
      <th>Description</th>
    </tr>
  </thead>
  <tbody>

    <tr>
      <td rowspan="2">Close coupled cooling with a typical inlet water temperature = 5 to 23 °C</td>
      <td>In-row Cooling</td>
      <td>Row-based air conditioning units are installed inside the rack rows</td>
    </tr>
    <tr>
      <td>In-rack Cooling</td>
      <td>The cooling system is combined with the server rack, and both are sealed completely.</td>
    </tr>
    <tr>
      <td rowspan="2">Direct-to-chip cooling with a typical inlet water temperature 27–45 °C</td>
      <td>Rear Door Heat Exchanger</td>
      <td>Based on a combination of fans and a finned heat exchanger installed on the rear side of racks.</td>
    </tr>
    <tr>
      <td>Cold plates (waterblocks)</td>
      <td>A cold plate with micro-channels in direct contact with components like CPUs and GPUs. Here, the water flowing across absorbs chip energy.</td>
    </tr>
  </tbody>
</table></figure>



<h3 class="wp-block-heading">What’s next?</h3>



<p>As we celebrate the 20th anniversary of our watercooling technology this year, we continue to innovate. We have been working on designing a new breed of data centre employing climate-friendly cooling technologies. In a global context of surging energy prices and the water shortages of summer 2022, it becomes quite obvious why our previous technological bets pay off. With best-in-class PUE and WUE for our data centres, we continue to work tirelessly on reducing our environmental footprint, illustrating our deep commitment to <a href="https://corporate.ovhcloud.com/en-ie/sustainability/" data-wpel-link="external" target="_blank" rel="nofollow external noopener noreferrer">sustainability</a>. Hopefully by now you have come to realise how important the Delta T parameter is to the data centre. From the infrastructure point of view to tangible water efficiency, Delta T=20K contributes to adapting to various geographies and, ultimately, to water conservation.</p>



<p>The water consumption of data centres has a tremendous local impact, going as far as to affect river flow, groundwater levels and the overall water resources of a whole territory. Reducing WUE is therefore critical for any modern data centre to lower its impact on local communities.</p>



<p>The impressive water savings illustrated by our WUE also contribute to a streamlined PUE, in compliance with the ISO/IEC 30134 standard. Last but not least, the demonstrated advantages of Delta T=20K will prove a solid basis for the next generation of improvements our teams develop.</p>



<p>1: Performance analysis of new liquid cooling topology and its impact on data centres &#8211;&nbsp; <a href="https://www.sciencedirect.com/science/article/abs/pii/S1359431122006779?via%3Dihub" data-wpel-link="external" target="_blank" rel="nofollow external noopener noreferrer">https://www.sciencedirect.com/science/article/abs/pii/S1359431122006779?via%3Dihub</a></p>
]]></content:encoded>
					
		
		<enclosure url="https://blog.ovhcloud.com/wp-content/uploads/2023/03/animation-Delta-T.mp4" length="11425209" type="video/mp4" />

			</item>
		<item>
		<title>New Hybrid Immersion Liquid Cooling developments at OVHcloud</title>
		<link>https://blog.ovhcloud.com/new-hybrid-immersion-liquid-cooling-developments-at-ovhcloud/</link>
		
		<dc:creator><![CDATA[Mohamad Hnayno,&#160;Julien Jay&#160;and&#160;Ali Chehade]]></dc:creator>
		<pubDate>Thu, 13 Oct 2022 07:31:17 +0000</pubDate>
				<category><![CDATA[OVHcloud Engineering]]></category>
		<category><![CDATA[Datacenters & network]]></category>
		<category><![CDATA[Immersion cooling]]></category>
		<category><![CDATA[Liquid cooling]]></category>
		<category><![CDATA[OVHcloud]]></category>
		<category><![CDATA[R&D]]></category>
		<category><![CDATA[Watercooling]]></category>
		<guid isPermaLink="false">https://blog.ovhcloud.com/?p=23587</guid>

					<description><![CDATA[If you think about it, datacentre cooling infrastructure hasn’t changed much over the past few decades. Whilst compute power and efficiency have remarkably improved over the years, many datacentres still rely on racks whose cooling depends on a variety of systems comprised of fans. Heat, especially fatal heat, remains a top concern to keep servers [&#8230;]]]></description>
										<content:encoded><![CDATA[
<p>If you think about it, datacentre cooling infrastructure hasn’t changed much over the past few decades. Whilst compute power and efficiency have improved remarkably over the years, many datacentres still rely on racks cooled by a variety of fan-based systems. Heat, especially waste heat, remains a top concern in keeping servers up and running. Whilst some datacentres have opted for power-hungry solutions such as air conditioning (direct expansion), others take a free cooling approach.</p>



<figure class="wp-block-image aligncenter size-large is-resized"><img loading="lazy" decoding="async" src="https://blog.ovhcloud.com/wp-content/uploads/2022/10/immersion-cooling-head-1024x576.jpg" alt="New Hybrid Immersion Liquid Cooling developments at OVHcloud" class="wp-image-23645" width="512" height="288" srcset="https://blog.ovhcloud.com/wp-content/uploads/2022/10/immersion-cooling-head-1024x576.jpg 1024w, https://blog.ovhcloud.com/wp-content/uploads/2022/10/immersion-cooling-head-300x169.jpg 300w, https://blog.ovhcloud.com/wp-content/uploads/2022/10/immersion-cooling-head-768x432.jpg 768w, https://blog.ovhcloud.com/wp-content/uploads/2022/10/immersion-cooling-head.jpg 1230w" sizes="auto, (max-width: 512px) 100vw, 512px" /></figure>



<p>Here at OVHcloud, we pioneered free cooling, and have used water cooling at scale to cool down our servers since 2003. This unique approach helped us to set high standards in PUE/WUE indexes for our global datacentres, all the while lowering our overall carbon impact.</p>



<p>With the evolution of IT equipment power, OVHcloud R&amp;D has worked on a new generation of datacentre cooling solutions. One of the most promising technologies today is immersion cooling, thanks to the direct contact of the fluid with hardware components. It allows us to build datacentres all over the world while maintaining low PUE/WUE indexes, even in harsh climatic zones.<br><br>In this blog post, we will explain how we developed our own breed of immersion cooling to achieve several goals: improving datacentre footprint, reducing power consumption, adapting to new operating conditions and improving overall reliability.</p>



<h3 class="wp-block-heading">OVHcloud Hybrid Immersion Liquid Cooling</h3>



<figure class="wp-block-image alignright size-large is-resized"><img loading="lazy" decoding="async" src="https://blog.ovhcloud.com/wp-content/uploads/2022/10/immersion-cooling-01-869x1024.jpg" alt="Immersion Cooling 01" class="wp-image-23594" width="217" height="256" title="" srcset="https://blog.ovhcloud.com/wp-content/uploads/2022/10/immersion-cooling-01-869x1024.jpg 869w, https://blog.ovhcloud.com/wp-content/uploads/2022/10/immersion-cooling-01-255x300.jpg 255w, https://blog.ovhcloud.com/wp-content/uploads/2022/10/immersion-cooling-01-768x905.jpg 768w, https://blog.ovhcloud.com/wp-content/uploads/2022/10/immersion-cooling-01-1304x1536.jpg 1304w, https://blog.ovhcloud.com/wp-content/uploads/2022/10/immersion-cooling-01-1739x2048.jpg 1739w" sizes="auto, (max-width: 217px) 100vw, 217px" /></figure>



<p>Immersion cooling is the practice of submerging electronic devices in a liquid that is thermally conductive but electrically insulating.</p>



<p>The typical immersion cooling solution usually employs pumps, heat sink structures, heat exchangers, condensers, sealed evaporative equipment, etc., which either consume large amounts of energy to operate, require sealed casings, or occupy relatively large surface areas that limit the number of servers that can be deployed.</p>



<p>OVHcloud has developed its own Hybrid Immersion Liquid Cooling technique. It combines a direct-to-chip water cooling system with a passive, natural, single-phase immersion cooling system, using two fluids:</p>



<figure class="wp-block-image alignright size-large is-resized"><img loading="lazy" decoding="async" src="https://blog.ovhcloud.com/wp-content/uploads/2022/10/immersion-cooling-03-1-1024x640.jpg" alt="Immersion Cooling 03" class="wp-image-23600" width="256" height="160" srcset="https://blog.ovhcloud.com/wp-content/uploads/2022/10/immersion-cooling-03-1-1024x640.jpg 1024w, https://blog.ovhcloud.com/wp-content/uploads/2022/10/immersion-cooling-03-1-300x188.jpg 300w, https://blog.ovhcloud.com/wp-content/uploads/2022/10/immersion-cooling-03-1-768x480.jpg 768w, https://blog.ovhcloud.com/wp-content/uploads/2022/10/immersion-cooling-03-1-1536x960.jpg 1536w, https://blog.ovhcloud.com/wp-content/uploads/2022/10/immersion-cooling-03-1.jpg 2000w" sizes="auto, (max-width: 256px) 100vw, 256px" /></figure>



<ul class="wp-block-list">
<li>Water: cooling a heat sink through waterblocks placed on CPUs and GPUs, the same solution used in all OVHcloud servers, plus a proprietary serpentine convection coil connected to a pumping substation (PSS) and a dry cooler to evacuate heat outside the DC.</li>
<li>Immersion cooling (IC) fluid: contained in a tank, it cools all the IT equipment in the server rather than just the CPUs and GPUs. This fluid essentially replaces the air circulating in OVHcloud servers, enhancing the efficiency of any component not covered by OVHcloud water cooling systems.</li>
</ul>



<p>It comes with a new three-floor server rack design, in a library format, which we can populate with up to 48 servers (1U) or 24 servers (2U), in a book format. Each server is submerged in its own tank for independent cooling, allowing large-scale deployments. In addition, each server benefits from dedicated monitoring of all environmental factors, ensuring safe and secure operation at the server level.</p>



<h3 class="wp-block-heading">Benefits in terms of power consumption and efficiency</h3>



<figure class="wp-block-image alignright size-large is-resized"><img loading="lazy" decoding="async" src="https://blog.ovhcloud.com/wp-content/uploads/2022/10/immersion-cooling-04-1024x641.jpg" alt="Immersion Cooling 04" class="wp-image-23602" width="256" height="160" srcset="https://blog.ovhcloud.com/wp-content/uploads/2022/10/immersion-cooling-04-1024x641.jpg 1024w, https://blog.ovhcloud.com/wp-content/uploads/2022/10/immersion-cooling-04-300x188.jpg 300w, https://blog.ovhcloud.com/wp-content/uploads/2022/10/immersion-cooling-04-768x480.jpg 768w, https://blog.ovhcloud.com/wp-content/uploads/2022/10/immersion-cooling-04-1536x961.jpg 1536w, https://blog.ovhcloud.com/wp-content/uploads/2022/10/immersion-cooling-04.jpg 2000w" sizes="auto, (max-width: 256px) 100vw, 256px" /></figure>



<p>OVHcloud Hybrid Immersion Liquid Cooling has several advantages that make it an efficient solution:</p>



<ul class="wp-block-list">
<li>The new passive rack design means there are neither pumps nor fans, resulting in zero cooling electrical consumption at the rack level</li>
<li>High-power racks can operate with datacentre inlet temperatures of up to 45 °C, allowing for different cooling loads under different climatic conditions</li>
<li>It stays energy efficient using the free cooling available on site</li>
<li>Electrical consumption and CAPEX are further reduced by eliminating the evaporative cooling system (no pumps) used on dry coolers, for datacentres located in areas where ambient air temperatures remain below 43 °C</li>
<li>A difference of 20K between supplied and recovered DC water temperature (DT) can be achieved through highly efficient CPU and GPU cold plates and a large thermal contact surface between the serpentine and the dielectric fluid. The patent-pending serpentine design occupies a very small area within the server chassis, maximising space for components</li>
<li>Global datacentre cooling infrastructure power consumption is reduced by at least 20.7% compared to the OVHcloud water cooling system</li>
<li>The energy consumption of a server per year is reduced by at least 20% compared to air-cooled servers, and by 7% versus water-cooled servers</li>
<li>Reduced power consumption means lower operating expenses (OPEX)</li>
</ul>
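<p>To put the per-server figure in perspective, here is some illustrative arithmetic; the 400 W average server draw is a hypothetical value, not an OVHcloud specification:</p>

```python
# Hypothetical 400 W server running year-round; savings percentages
# taken from the list above (at least 20% vs air, 7% vs water cooling).
POWER_W = 400.0
HOURS_PER_YEAR = 8760.0

annual_kwh = POWER_W * HOURS_PER_YEAR / 1000.0       # 3504.0 kWh/year
saving_vs_air_kwh = round(annual_kwh * 0.20, 1)      # vs air-cooled servers
saving_vs_water_kwh = round(annual_kwh * 0.07, 2)    # vs water-cooled servers
print(annual_kwh, saving_vs_air_kwh, saving_vs_water_kwh)  # 3504.0 700.8 245.28
```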



<p>Finally, in the context of ever-increasing CPU/GPU TDP, the solution supports higher computing power density.</p>



<h3 class="wp-block-heading">A new generation of datacentre</h3>



<figure class="wp-block-image alignright size-large is-resized"><img loading="lazy" decoding="async" src="https://blog.ovhcloud.com/wp-content/uploads/2022/10/immersion-cooling-09-861x1024.jpg" alt="Immersion Cooling 09" class="wp-image-23605" width="215" height="256" srcset="https://blog.ovhcloud.com/wp-content/uploads/2022/10/immersion-cooling-09-861x1024.jpg 861w, https://blog.ovhcloud.com/wp-content/uploads/2022/10/immersion-cooling-09-252x300.jpg 252w, https://blog.ovhcloud.com/wp-content/uploads/2022/10/immersion-cooling-09-768x913.jpg 768w, https://blog.ovhcloud.com/wp-content/uploads/2022/10/immersion-cooling-09-1291x1536.jpg 1291w, https://blog.ovhcloud.com/wp-content/uploads/2022/10/immersion-cooling-09.jpg 1500w" sizes="auto, (max-width: 215px) 100vw, 215px" /></figure>



<p>Our new Hybrid Immersion Liquid Cooling approach stands out with several key advantages for datacentre operators:</p>



<ul class="wp-block-list">
<li>An actual footprint of up to 37U/m<sup>2</sup> can be guaranteed with no extra pumping or condenser systems. The design can scale to two or three times this density if immersion cooling racks are installed in stacked maritime containers</li>
<li>With no sealing required, nor sophisticated heat exchangers and pumping circuits, capital expenditure (CAPEX) remains low</li>
<li>With a datacentre outlet temperature of 65 °C, waste heat is better captured, and affordable heat recovery systems can be envisioned</li>
<li>There is more flexibility and freedom in design at OVHcloud, as the new rack design requires access to the front side only, unlike classic racks requiring access from both the front and rear</li>
</ul>



<h3 class="wp-block-heading">Redefining datacentre indexes</h3>



<figure class="wp-block-image alignright size-large is-resized"><img loading="lazy" decoding="async" src="https://blog.ovhcloud.com/wp-content/uploads/2022/10/immersion-cooling-10-788x1024.jpg" alt="Immersion Cooling 10" class="wp-image-23606" width="197" height="256" srcset="https://blog.ovhcloud.com/wp-content/uploads/2022/10/immersion-cooling-10-788x1024.jpg 788w, https://blog.ovhcloud.com/wp-content/uploads/2022/10/immersion-cooling-10-231x300.jpg 231w, https://blog.ovhcloud.com/wp-content/uploads/2022/10/immersion-cooling-10-768x998.jpg 768w, https://blog.ovhcloud.com/wp-content/uploads/2022/10/immersion-cooling-10-1182x1536.jpg 1182w, https://blog.ovhcloud.com/wp-content/uploads/2022/10/immersion-cooling-10-1577x2048.jpg 1577w, https://blog.ovhcloud.com/wp-content/uploads/2022/10/immersion-cooling-10.jpg 1732w" sizes="auto, (max-width: 197px) 100vw, 197px" /></figure>



<p>With such a game-changing technique, the usual datacentre performance indicators evolve dramatically.</p>



<p>An enhanced PPUE<sup>[1]</sup> of 1.004: PPUE, or partial power usage effectiveness, measures a clearly defined portion of the overall PUE of a datacentre. The infrastructure cooling PPUE is defined as follows:</p>



<figure class="wp-block-image size-full is-resized"><img loading="lazy" decoding="async" src="https://blog.ovhcloud.com/wp-content/uploads/2022/10/Eq-2.png" alt="" class="wp-image-23638" width="168" height="55" srcset="https://blog.ovhcloud.com/wp-content/uploads/2022/10/Eq-2.png 671w, https://blog.ovhcloud.com/wp-content/uploads/2022/10/Eq-2-300x99.png 300w" sizes="auto, (max-width: 168px) 100vw, 168px" /></figure>
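<p>A quick sketch of that ratio: within the cooling boundary, PPUE is the total energy (IT plus cooling) divided by the IT energy alone. The 1.004 figure is from the article; the absolute energy values below are invented purely to illustrate the arithmetic:</p>

```python
def partial_pue(it_energy_kwh: float, cooling_energy_kwh: float) -> float:
    """Partial PUE restricted to the cooling boundary:
    (IT energy + cooling energy) / IT energy."""
    return (it_energy_kwh + cooling_energy_kwh) / it_energy_kwh

# Hypothetical yearly figures: 1 GWh of IT load with only 4 MWh
# spent on cooling yields the 1.004 PPUE quoted above.
print(partial_pue(1_000_000, 4_000))  # -> 1.004
```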



<p>A WUE of 0 for datacentres located in areas where ambient air temperatures stay below 43 °C: WUE, or water usage effectiveness, is a sustainability metric measuring the amount of water used by datacentres to cool IT equipment. Annual site water usage includes water used for humidification and water evaporated on-site for energy production or cooling of the datacentre and its support systems. It is defined as follows:</p>



<figure class="wp-block-image size-full is-resized"><img loading="lazy" decoding="async" src="https://blog.ovhcloud.com/wp-content/uploads/2022/10/H2O.png" alt="" class="wp-image-23632" width="253" height="53" srcset="https://blog.ovhcloud.com/wp-content/uploads/2022/10/H2O.png 1010w, https://blog.ovhcloud.com/wp-content/uploads/2022/10/H2O-300x62.png 300w, https://blog.ovhcloud.com/wp-content/uploads/2022/10/H2O-768x160.png 768w" sizes="auto, (max-width: 253px) 100vw, 253px" /></figure>
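<p>The same metric as a short sketch: WUE is annual site water usage divided by IT equipment energy. The numbers below are hypothetical, simply contrasting an evaporative-cooled site with a dry-cooler-only site that evaporates no water on-site, as described above:</p>

```python
def wue(annual_site_water_l: float, it_energy_kwh: float) -> float:
    """Water usage effectiveness: annual site water usage (litres)
    per kWh of IT equipment energy."""
    return annual_site_water_l / it_energy_kwh

# Hypothetical comparison for 1 GWh of yearly IT energy:
evaporative = wue(1_800_000, 1_000_000)  # a site evaporating 1.8 ML/year
dry_cooler = wue(0, 1_000_000)           # dry coolers only: no water used

print(f"evaporative: {evaporative} L/kWh, dry cooler: {dry_cooler} L/kWh")
```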



<h3 class="wp-block-heading">Environmental considerations</h3>



<p>The Hybrid Immersion Liquid Cooling technique relies on a non-volatile dielectric hydrocarbon fluid. OVHcloud R&amp;D tested and qualified the system with a variety of fluids available on the market. While each fluid has different properties, we paid special attention to their compliance with the following criteria:</p>



<ul class="wp-block-list"><li>Non-corrosive</li><li>Ultra-low vaporization</li><li>Non-toxic</li><li>Non-allergenic</li><li>Biodegradable within 30 days</li><li>Very high flash point</li><li>Dielectric strength up to 42 kV</li><li>GWP<sup>[2]</sup> = 0</li><li>ODP<sup>[3]</sup> = 0</li></ul>



<h3 class="wp-block-heading">Operational changes within the datacentre</h3>



<p>The new three-floor, library-format rack design has a footprint optimized for a high density of servers and, consequently, of compute power per square metre (U/m<sup>2</sup>). It can be fitted with manual or automated guidance to extract servers, reducing the maintenance impact of a single server on the overall rack.</p>



<p>The new Hybrid Immersion Liquid Cooling technique also improves reliability, reducing the failure rate by up to 60% by eliminating the risk of dust circulation. It contributes to a silent datacentre, since all fans are removed at rack and server levels. Finally, several components are no longer needed:</p>



<ul class="wp-block-list"><li>No heat exchangers attached to the racks (three heat exchangers are eliminated per rack)</li><li>The 36 fans usually installed per rack are eliminated</li><li>All small fans inside the servers are eliminated</li><li>Cooling modules comprising pumps and plate heat exchangers are eliminated</li></ul>



<figure class="wp-block-image aligncenter size-full"><img loading="lazy" decoding="async" width="640" height="360" src="https://blog.ovhcloud.com/wp-content/uploads/2022/10/Immersive-cooling.gif" alt="" class="wp-image-23607"/></figure>



<h3 class="wp-block-heading">What’s next?</h3>



<p>The new OVHcloud Hybrid Immersion Liquid Cooling paves the way for unapologetically more compute power in modern datacentres. Its unique patent-pending design is sustainable and, thanks to higher operating temperatures, preserves the opportunity to recover waste heat. Moreover, lower power and water consumption, as well as an improved footprint, can be achieved, with the added benefit of adapting to regions with harsh climatic conditions. We are incredibly excited by the very first use cases of Hybrid Immersion Liquid Cooling in the fields of banking, medical and scientific research, not forgetting gaming workloads. We can’t wait to share more with you in the future.</p>



<p style="font-size:12px">[1]: Partial PUE is used instead of PUE. The global PUE, which also accounts for power lost in the energy distribution system, is expected to improve in a similar way [more details about PPUE and PUE <a href="https://doi.org/10.1016/j.applthermaleng.2022.118733" target="_blank" rel="noreferrer noopener nofollow external" data-wpel-link="external">here</a>]. In line with its transparency commitments, OVHcloud will communicate PUE values once significant long-run data has been gathered, on a per-location basis.<br>[2]: GWP stands for Global Warming Potential. Unlike most gas/liquid fluids used in cooling technologies, this fluid does not contribute to global warming.<br>[3]: ODP stands for Ozone Depletion Potential. Unlike most gas/liquid fluids used in cooling technologies, this fluid does not contribute to ozone depletion.<br></p>



<img loading="lazy" decoding="async" src="//blog.ovhcloud.com/wp-content/plugins/matomo/app/matomo.php?idsite=1&amp;rec=1&amp;url=https%3A%2F%2Fblog.ovhcloud.com%2Fnew-hybrid-immersion-liquid-cooling-developments-at-ovhcloud%2F&amp;action_name=New%20Hybrid%20Immersion%20Liquid%20Cooling%20developments%20at%20OVHcloud&amp;urlref=https%3A%2F%2Fblog.ovhcloud.com%2Ffeed%2F" style="border:0;width:0;height:0" width="0" height="0" alt="" />]]></content:encoded>
					
		
		
			</item>
	</channel>
</rss>
