<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>Julien Jay, Author at OVHcloud Blog</title>
	<atom:link href="https://blog.ovhcloud.com/author/julien-jay/feed/" rel="self" type="application/rss+xml" />
	<link>https://blog.ovhcloud.com/author/ovh/?mab_v3=26361</link>
	<description>Innovation for Freedom</description>
	<lastBuildDate>Fri, 30 Jan 2026 16:24:43 +0000</lastBuildDate>
	<language>en-GB</language>
	<sy:updatePeriod>
	hourly	</sy:updatePeriod>
	<sy:updateFrequency>
	1	</sy:updateFrequency>
	<generator>https://wordpress.org/?v=6.9.4</generator>

<image>
	<url>https://blog.ovhcloud.com/wp-content/uploads/2019/07/cropped-cropped-nouveau-logo-ovh-rebranding-32x32.gif</url>
	<title>Julien Jay, Author at OVHcloud Blog</title>
	<link>https://blog.ovhcloud.com/author/ovh/?mab_v3=26361</link>
	<width>32</width>
	<height>32</height>
</image> 
	<item>
		<title>Industrial Excellence meets Artificial Intelligence: Behind the Scenes with Smart Datacenter</title>
		<link>https://blog.ovhcloud.com/industrial-excellence-meets-artificial-intelligence-behind-the-scenes-with-smart-datacenter/</link>
		
		<dc:creator><![CDATA[Ali Chehade,&#160;Julien Jay&#160;and&#160;Christian Sharp]]></dc:creator>
		<pubDate>Fri, 12 Dec 2025 14:35:42 +0000</pubDate>
				<category><![CDATA[OVHcloud Engineering]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[cooling]]></category>
		<guid isPermaLink="false">https://blog.ovhcloud.com/?p=30107</guid>

					<description><![CDATA[At OVHcloud, we are constantly looking for ways to improve our operations and reduce our impact on the environment. This has been a defining part of the company since 1999 and is a key part of our organisational DNA and our commercial model. We are very proud to present the new Smart Datacenter cooling system, [&#8230;]]]></description>
										<content:encoded><![CDATA[



<p>At OVHcloud, we are constantly looking for ways to improve our operations and reduce our impact on the environment. This has been a defining part of the company since 1999 and is a key part of our organisational DNA and our commercial model.</p>



<p>We are very proud to present the new Smart Datacenter cooling system, which significantly improves energy and water efficiency while reducing carbon impact across the entire cooling chain, from manufacturing and transport to daily operations.</p>



<p>The system is a new way of building and deploying datacenter infrastructure, changing how we manage and monitor water supply and demand, using a combination of industrial design, IoT sensors and AI innovation, specifically in our smart racks, advanced cooling distribution units (CDUs) and intelligent dry coolers.</p>



<p>Smart Datacenter delivers a reduction in power consumption of up to 50% across the entire cooling loop, from server water blocks to dry coolers, and consumes 30% less water compared to OVHcloud’s earliest design, driving major sustainability benefits. The system also uses complex mathematical models capturing detailed rack-level and environmental data to optimize cooling performance in real time. Furthermore, all operational data is fed into a centralized data lake, enabling cutting-edge artificial intelligence to predict, adapt, and enhance system efficiency and reliability.</p>



<h2 class="wp-block-heading">Let’s get into the detail.</h2>



<p>The system has three main components:</p>



<ol start="1" class="wp-block-list">
<li><strong>Smart Racks: </strong>These are designed with an innovative hydraulic “pull” architecture, where each rack autonomously draws exactly the water flow, pressure, and temperature it needs, dynamically adapting to server load and performance.</li>



<li><strong>Advanced Cooling Distribution Unit (CDU): </strong>This is a compact, next-generation primary loop unit that autonomously balances flow and pressure across all racks without manual intervention or any electrical communication. It uses only hydraulic signals (pressure, flow and temperature of water) to “understand” rack demands and continuously optimizes operation for lowest power consumption and extended pump lifespan.</li>



<li><strong>Intelligent Dry Cooler: </strong>This is operated seamlessly by the CDU, eliminating the need for separate control systems (“brains”) on both the dry cooler and the CDU. This unified control architecture ensures optimized, coordinated performance across the entire cooling infrastructure.</li>
</ol>
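<p>To illustrate the pull principle, the CDU's side of the loop can be pictured as a simple pressure-tracking iteration: it reads only hydraulic measurements and nudges pump speed, with no telemetry from the racks. The sketch below is hypothetical; the gain, setpoint and limits are invented for illustration and are not OVHcloud's actual control logic.</p>

```python
# Hypothetical sketch of the hydraulic "pull" principle: the CDU reads only
# hydraulic signals and nudges pump speed so the loop differential pressure
# tracks a setpoint. When racks open their valves (drawing more water), the
# pressure sags and the pump speeds up; when demand falls, pressure rises
# and the pump slows down, saving energy and extending pump life.
# All numbers are illustrative, not OVHcloud's real control parameters.

def pump_speed_step(speed_pct, dp_measured_kpa, dp_setpoint_kpa,
                    gain=0.5, min_pct=10.0, max_pct=100.0):
    """One proportional control step on pump speed (percent)."""
    error = dp_setpoint_kpa - dp_measured_kpa
    new_speed = speed_pct + gain * error
    return max(min_pct, min(max_pct, new_speed))

# Racks open their valves -> pressure sags below the setpoint -> speed up.
print(pump_speed_step(60.0, 140.0, 150.0))  # -> 65.0
# Demand drops -> pressure overshoots -> slow down.
print(pump_speed_step(60.0, 170.0, 150.0))  # -> 50.0
```

<p>The point of the sketch is that the controller's only inputs are quantities measurable in the water itself, which is what makes the communication-free design possible.</p>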



<p>OVHcloud’s new Single-Circuit System (SCS) replaces the previous Dual-Circuit System cooling architecture (DCS), which consisted of a primary facility loop and a secondary in-rack loop separated by an in-rack Coolant Distribution Unit (CDU), installed inline directly after the rear door heat exchangers (RDHX), as shown in Figure 1. The CDU housed multiple pumps, several plate heat exchangers (PHEX), and a network of valves and sensors.</p>



<figure class="wp-block-video aligncenter"><video height="1080" style="aspect-ratio: 1920 / 1080;" width="1920" controls src="https://blog.ovhcloud.com/wp-content/uploads/2025/12/OVH-cooling-loop.mp4"></video></figure>



<p>Figure 1. Dual-Circuit System cooling architecture (DCS) vs Single-Circuit system (SCS).</p>



<p>That previous design maintained turbulent flow through water blocks (WBs) using the in-rack CDU to regulate flow and temperature differences, ensuring performance despite OVHcloud’s ΔT of 20 K on the primary loop (far higher than the typical market value around 5 K).</p>



<p>Removing the in-rack CDU — replaced by a Pressure Independent Control Valve (PICV), a flow meter, and two temperature sensors on each rack — simplifies the system to a single closed loop, in which the flow rate through servers is dictated directly by the primary loop, adapting dynamically to rack load density. On the rack side, the system delivers the exact flow the rack requires by analyzing the water's behavior and performing iterative, predictive thermal optimization that takes into account the IT components and the supplied water temperature and flow. Eliminating the in-rack CDU's approach temperature difference yields lower inlet water temperatures at the server level, and reduces electrical consumption, CAPEX, carbon footprint, and rack footprint.</p>
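<p>The flow a rack needs follows from the energy balance Q = ṁ · cp · ΔT. A toy calculation (figures invented, not OVHcloud rack data) also shows why a 20 K ΔT needs roughly a quarter of the flow of a typical 5 K design:</p>

```python
# Energy balance Q[kW] = m_dot[kg/s] * cp[kJ/(kg*K)] * dT[K] for the
# water loop. The 40 kW rack load below is illustrative, not an actual
# OVHcloud figure.
CP_WATER = 4.186  # specific heat of water, kJ/(kg*K)

def flow_lpm(load_kw, delta_t_k):
    """Flow needed to carry load_kw at a given water delta-T."""
    m_dot_kg_s = load_kw / (CP_WATER * delta_t_k)
    return m_dot_kg_s * 60.0  # 1 kg of water is about 1 L

high_dt = flow_lpm(40.0, 20.0)  # 20 K primary loop, ~28.7 L/min
low_dt = flow_lpm(40.0, 5.0)    # typical ~5 K market design
print(round(low_dt / high_dt, 2))  # -> 4.0: four times less flow at 20 K
```

<p>Lower flow for the same heat load is what allows smaller pumps, smaller pipes and lower pumping energy across the loop.</p>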



<p>To prevent laminar flow and maintain heat transfer efficiency at low flow rates, OVHcloud introduced a passive hydraulic innovation by arranging servers into clusters connected in series with servers inside each cluster connected in parallel, rather than all servers in parallel. This ensures higher water flow through individual servers even when the rack density is low. While this increases system pressure drops depending on cluster configuration, it results in better thermal performance and all servers receive water at temperatures equal to or lower than in the previous DCS design.</p>
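<p>The series/parallel arrangement can be checked with simple flow accounting: with all N servers in parallel, each sees the total flow divided by N; with clusters connected in series, the full loop flow passes through every cluster in turn, splitting only across the servers inside that cluster. A sketch with invented numbers:</p>

```python
# Flow accounting for the cluster arrangement (all figures illustrative).
# All-parallel: the total flow Q splits across every server at once.
# Clusters in series: the whole flow Q traverses each cluster in turn,
# splitting only across the c servers inside that cluster.

def per_server_flow_all_parallel(q_total_lpm, n_servers):
    return q_total_lpm / n_servers

def per_server_flow_clustered(q_total_lpm, servers_per_cluster):
    return q_total_lpm / servers_per_cluster

q = 12.0  # low rack-density total flow, L/min
n = 24    # servers in the rack
c = 4     # servers per cluster (six clusters in series)
print(per_server_flow_all_parallel(q, n))  # -> 0.5 L/min per server
print(per_server_flow_clustered(q, c))     # -> 3.0 L/min per server
```

<p>The six-fold higher per-server flow in this toy example is what keeps the water turbulent inside the water blocks at low rack density, at the cost of a higher total pressure drop across the series path.</p>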



<p>The racks operate on a novel hydraulic “pull” principle: each rack draws exactly the hydraulic power it requires, rather than being pushed by the system. The CDU then dynamically adapts the overall hydraulic performance of the primary loop, balancing flow and pressure in real time to match the actual demand of the entire data center.</p>



<p>A key breakthrough is the CDU’s communication-free operation: it requires no cables, radio waves, or other electronic communication with racks. Instead, it analyzes hydraulic signals — pressure, flow, and temperature fluctuations within the water itself — to understand each rack’s cooling needs and adapt accordingly. This eliminates complex telemetry infrastructure, reduces operational risks, and enhances system reliability. To ensure water quality and system longevity, water supplied to the data center is filtered at 25 microns, and multiple sophisticated high-precision sensors continuously monitor water quality in real time.</p>



<p>The CDU is 50% smaller than the previous generation and manages the entire thermal path — from chip-level water blocks, through the racks and CDU, to the dry coolers.</p>



<p>The newly designed dry cooler is also 50% smaller than the previous model and has one of the smallest footprints in the industry. Thanks to years of thermal studies on heat exchangers by the OVHcloud R&amp;D team, it has 50% fewer fans, resulting in very low energy consumption and reduced noise. Its compact size also means we can transport more units in the same truck! This design achieves a 30% reduction in water consumption compared to OVHcloud’s earliest dry cooler design. A key innovation in the dry cooler is its advanced adiabatic cooling pad system, which cools incoming hot air before it passes through the heat exchangers. This high-precision water injection system, the first of its kind, adjusts water application based on multiple sensors and extensive iterative calculations, taking into account data center load, ambient temperature, and humidity levels.</p>



<p>Unlike traditional adiabatic systems, the pad system does not use a conventional recirculation loop. Instead, water is injected onto the pads only when needed, via a simple setup consisting of a solenoid valve and a flow meter, eliminating complex hydraulics such as pumps, filters, storage tanks, level sensors, and conductivity sensors. The system maintains water quality and physical and chemical properties through careful design, drastically simplifying operation and reducing maintenance needs.</p>
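<p>The injection decision itself reduces to opening or closing one solenoid valve. The real system runs iterative calculations over many sensors; the hypothetical rule below only illustrates the kinds of inputs involved, and every threshold is invented:</p>

```python
# Hypothetical gating rule for the adiabatic pad solenoid valve.
# The real system uses iterative calculations over many sensors; this
# sketch only illustrates the inputs involved (thresholds are invented).

def should_inject_water(water_return_c, target_c,
                        ambient_c, rel_humidity_pct):
    """Open the solenoid only when dry cooling alone cannot hold the
    target temperature and evaporative pre-cooling can still help."""
    dry_cooling_short = water_return_c > target_c
    hot_enough = ambient_c > 25.0             # pads only pay off in hot air
    air_can_absorb = rel_humidity_pct < 80.0  # near-saturated air barely evaporates
    return dry_cooling_short and hot_enough and air_can_absorb

print(should_inject_water(34.0, 30.0, 32.0, 45.0))  # -> True
print(should_inject_water(34.0, 30.0, 32.0, 92.0))  # -> False (humid air)
```

<p>Gating on humidity as well as temperature is what keeps Water Usage Effectiveness low: water is only spent when evaporation will actually remove heat.</p>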



<p>The CDU continuously analyzes data from up to 36 sensors distributed across the CDU itself and the associated dry cooler. It also collects operational data from solenoid valves, pumps, and dry cooler fans across the infrastructure loop. All components are monitored and managed by the system’s central intelligence—the CDU’s control panel—providing a comprehensive understanding of the entire system’s behavior, from the data center interior to the external ambient environment, ensuring real-time performance oversight and precise thermal regulation.</p>



<p>Through this iterative and precise control of water injection, the system optimizes cooling performance and Water Usage Effectiveness (WUE), ensuring minimal water consumption without sacrificing thermal effectiveness.</p>



<h2 class="wp-block-heading"><strong>Advanced System Analytics, Learning &amp; AI Integration</strong></h2>



<p>The entire system is designed to continuously analyze the thermal, hydraulic, and aerodynamic behaviors of the various fluids along the cooling path. It uses daily operational data to learn and adapt its performance dynamically, optimizing cooling efficiency and reliability over time.</p>



<p>The CDU’s brain—the control panel—aggregates data from 36 sensors distributed across the CDU and dry cooler, as well as operational data from solenoid valves, pumps, and dry cooler fans within the infrastructure loop. It also collects critical rack-level information, including flow rates, temperatures, and IPMI data that reflect IT equipment behavior and performance. All this operational data is pushed to a centralized data lake for parallel analysis, which forms the foundation for the next step: integrating cutting-edge artificial intelligence (AI). This AI will leverage the continuously gathered data and learning processes to enhance predictive capabilities, optimize future operating points, and enable fully autonomous decision-making.</p>



<p>This combination of real-time learning and AI-powered analytics will provide advanced diagnostics, predictive maintenance, and proactive management — maximizing uptime, reducing costs, and driving ever-greater sustainability.</p>



<h2 class="wp-block-heading"><strong>Iterative Control System Innovation</strong></h2>



<p>The iterative control system manages all aspects in real time, hands-free, continuously learning from sensor data and operational feedback. It applies algorithms to the pump speed on the CDU, the fans on the dry cooler and the solenoid valve controlling water injection on the adiabatic pads.</p>



<p>On the rack side, the system uses a PICV, flow meter, and two temperature sensors to deliver the exact hydraulic flow needed by each rack, considering IT load and incoming water conditions, iteratively optimizing thermal performance and energy efficiency.</p>



<p>On the CDU, the system analyzes combined hydraulic signals from all racks alongside ambient data center conditions, dynamically balancing flow and pressure across the entire data center infrastructure without human intervention.</p>



<p>Furthermore, OVHcloud’s cooling system integrates intelligent communication between cooling line-ups to enhance failure detection and simplify maintenance. This is achieved through embedded freeze-guard and resilience-switch mechanisms that ensure continuous operation and system resilience. The freeze-guard system is designed to protect the dry coolers in sub-zero ambient conditions by keeping water circulating through their heat exchangers. If the overall loop flow drops below a predefined threshold, the system automatically opens a normally closed bypass valve to maintain circulation—preventing freezing despite the use of pure water (without glycol) as the cooling medium. The resilience-switch system maintains redundancy by hydraulically linking multiple cooling lines. In the event of failure or overload on one line, normally open solenoid valves isolate the affected line, while bypass valves on neighboring lines open to redistribute water flow and maintain cooling performance. This dynamic and autonomous valve management ensures uninterrupted service and rapid fault response.</p>
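<p>The freeze-guard decision described above is a simple conjunction of two conditions. A hypothetical sketch, with an invented flow threshold:</p>

```python
# Hypothetical freeze-guard logic: keep water moving through the dry
# cooler heat exchanger when the loop runs pure water (no glycol) in
# sub-zero air. The 20 L/min threshold is invented for illustration.

def freeze_guard_bypass_open(loop_flow_lpm, ambient_c,
                             min_flow_lpm=20.0):
    """Open the normally closed bypass valve when circulation is too
    low to prevent freezing in sub-zero ambient conditions."""
    return ambient_c < 0.0 and loop_flow_lpm < min_flow_lpm

print(freeze_guard_bypass_open(5.0, -4.0))   # -> True: protect the exchanger
print(freeze_guard_bypass_open(35.0, -4.0))  # -> False: flow is sufficient
```

<p>Using a normally closed valve here means the bypass fails safe into its usual state, and only opens when stagnation plus sub-zero air genuinely threatens the glycol-free loop.</p>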



<p>Drawing inspiration from autonomous control methodologies in leading-edge industries, the system predicts future behavior based on iterative calculations, dynamically adapting pump speed, fan speed and solenoid valve openings to converge rapidly on optimal operating points. It also adjusts performance based on external constraints such as noise limits, water availability, or energy costs — for example, consuming more energy to save water in water-stressed regions or balancing noise restrictions in urban deployments.</p>
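<p>That trade-off between energy, water and noise can be sketched as a weighted choice among candidate operating points. Everything below is illustrative: the candidate points, weights and noise cap are invented, not OVHcloud's optimizer.</p>

```python
# Hypothetical trade-off between fan energy and pad water use: in a
# water-stressed region, water is weighted heavily, so the optimizer
# prefers faster fans; near housing, a noise cap bounds the choice.
# Candidate points and weights are invented for illustration.

def best_operating_point(candidates, energy_weight, water_weight,
                         max_noise_db):
    """Each candidate: (fan_speed_pct, energy_kw, water_lph, noise_db).
    Return the feasible candidate with the lowest weighted cost."""
    feasible = [c for c in candidates if c[3] <= max_noise_db]
    return min(feasible,
               key=lambda c: energy_weight * c[1] + water_weight * c[2])

points = [
    (40, 3.0, 120.0, 55.0),  # slow fans, heavy pad water use, quiet
    (70, 6.0, 30.0, 62.0),   # fast fans, little water
    (90, 9.0, 5.0, 70.0),    # max fans, almost no water, loud
]
# Water-stressed site: spend energy to save water.
print(best_operating_point(points, 1.0, 0.5, 75.0)[0])  # -> 90
# Urban deployment with a 60 dB noise cap: fans stay slow.
print(best_operating_point(points, 1.0, 0.5, 60.0)[0])  # -> 40
```

<p>Swapping the weights and constraints, rather than the control logic, is what lets the same system favour water savings at one site and noise limits at another.</p>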



<p>This unique, self-optimizing end-to-end control system maximizes energy efficiency, sustainability, and operational simplicity, extending pump life cycles and ensuring the most environmentally responsible data center cooling solution available today.</p>



<p>This vertically integrated, autonomous system — including smart racks, the advanced CDU, and the intelligent dry cooler — represents a world-first in end-to-end, intelligent, sustainable, communication-free, and data-driven data center cooling.</p>



<h2 class="wp-block-heading"><strong>Why is this important?</strong></h2>



<p>This innovation is critical because it marks a decisive step toward radically more sustainable, efficient, and autonomous data center cooling — addressing the growing demands of digital infrastructure while reducing its environmental footprint.</p>



<p>By using fewer, smaller components, we are saving power, cutting transport costs and reducing carbon impact. Using fewer fans on the dry cooler means up to 50% lower energy consumption on the cooling cycle – and the new pad system means 30% lower water consumption in the cooling system. The system is fully autonomous, avoiding human error. A temperature gradient of 20 K on the primary loop – four times higher than the industry average – means that flow rates can be lower and water efficiency higher. The system doesn’t rely on Wi-Fi or cabling, and the predictive control constantly adapts to external conditions or situational goals, feeding into a data lake that helps continuously optimize performance.</p>



<p>Today’s world is built on technology, and datacenters are a key part of that technology, but there is a pressing need to ensure we can maintain human progress without incurring a significant carbon footprint. Power and water efficiency is a key part of this equation in the datacenter industry, and our innovation in the Smart Datacenter continues our trajectory of supporting today’s needs without compromising the world of tomorrow.</p>



<figure class="wp-block-image size-large"><img fetchpriority="high" decoding="async" width="1024" height="575" src="https://blog.ovhcloud.com/wp-content/uploads/2025/12/image-6-1024x575.png" alt="" class="wp-image-30116" srcset="https://blog.ovhcloud.com/wp-content/uploads/2025/12/image-6-1024x575.png 1024w, https://blog.ovhcloud.com/wp-content/uploads/2025/12/image-6-300x169.png 300w, https://blog.ovhcloud.com/wp-content/uploads/2025/12/image-6-768x432.png 768w, https://blog.ovhcloud.com/wp-content/uploads/2025/12/image-6.png 1502w" sizes="(max-width: 1024px) 100vw, 1024px" /></figure>



]]></content:encoded>
					
		
		<enclosure url="https://blog.ovhcloud.com/wp-content/uploads/2025/12/OVH-cooling-loop.mp4" length="4050958" type="video/mp4" />

			</item>
		<item>
		<title>Why Data Sovereignty and Trusted Cloud is critical for the future of your business</title>
		<link>https://blog.ovhcloud.com/data-sovereignty-trusted-cloud-critical-business/</link>
		
		<dc:creator><![CDATA[OVHcloud,&#160;Julien Jay&#160;and&#160;William Dubreuil]]></dc:creator>
		<pubDate>Wed, 06 Mar 2024 11:10:10 +0000</pubDate>
				<category><![CDATA[Accelerating with OVHcloud]]></category>
		<category><![CDATA[Anssi]]></category>
		<category><![CDATA[Data Sovereignty]]></category>
		<category><![CDATA[OVHcloud]]></category>
		<category><![CDATA[Security]]></category>
		<category><![CDATA[SNC]]></category>
		<category><![CDATA[trusted cloud]]></category>
		<guid isPermaLink="false">https://blog.ovhcloud.com/?p=26361</guid>

					<description><![CDATA[As a global player and the leading European cloud provider, we have witnessed first-hand the challenges and opportunities that the digital transformation brings to our continent. In this post, we would like to share our vision of how data sovereignty and trusted cloud help customers and partners. But also all private and public organizations to [&#8230;]]]></description>
										<content:encoded><![CDATA[
<p>As a global player and the leading European cloud provider, we have witnessed first-hand the challenges and opportunities that the digital transformation brings to our continent. In this post, we would like to share our vision of <strong>how data sovereignty and trusted cloud help customers, partners, and indeed all private and public organizations to innovate, compete, and thrive in a globalized and fast-changing world</strong>.</p>



<figure class="wp-block-image size-full is-resized"><img decoding="async" width="830" height="553" src="https://blog.ovhcloud.com/wp-content/uploads/2024/03/Cloud-Security.webp" alt="data sovereignty trusted cloud security" class="wp-image-26363" style="aspect-ratio:4/3;object-fit:cover;width:450px" title="Data sovereignty Trusted Cloud" srcset="https://blog.ovhcloud.com/wp-content/uploads/2024/03/Cloud-Security.webp 830w, https://blog.ovhcloud.com/wp-content/uploads/2024/03/Cloud-Security-300x200.webp 300w, https://blog.ovhcloud.com/wp-content/uploads/2024/03/Cloud-Security-768x512.webp 768w" sizes="(max-width: 830px) 100vw, 830px" /></figure>



<p>Recognizing that data is now key to many industries, OVHcloud supports its clients in their data journey without compromising data sovereignty, trust, performance or security. When building its open and trusted Cloud, OVHcloud ensured that customers, partners and cloud users alike can keep their freedom of choice in a secure space. By providing users with clear, transparent, and accessible information on the regulations applying to their data, OVHcloud offers a trustworthy environment: a place where data, including the most sensitive, is protected with the highest standards and transparent certifications. This unique proposition, backed by a resilient infrastructure, ensures that your data stays your data, immune to extra-territorial laws. We are confident that data sovereignty and trusted cloud can make a difference.</p>



<p><strong>What is Trusted Cloud and how does it support data sovereignty?</strong></p>



<p>Trusted Cloud provides the highest levels of security, privacy, and transparency for its users. It gives industries the tools and the confidence to leverage the cloud for their digital transformation, while preserving their values, interests, and competitiveness.</p>



<p>In other words, Trusted Cloud provides data protection and sovereignty, encompassing both cybersecurity and legal dimensions:</p>



<ul class="wp-block-list">
<li>It ensures that users’ data is encrypted and backed up. Users keep track of their data usage and have full control and ownership over their data and the insights derived from it. Trusted cloud complies with the strictest regulations and certifications (SNC, C5…) and offers contractual guarantees to its users.</li>



<li>It gives its users the visibility and the control over the operations of the cloud provider, such as where the data is stored, who can access it, and how it is processed.&nbsp;</li>
</ul>



<p>OVHcloud Trusted Cloud solutions contribute to the development of a European cloud market that is diverse, dynamic, and resilient. For this reason, our solutions comply with various certifications, including the highest known today for protecting the interests of organizations, industries, and governments: SecNumCloud.</p>



<p><strong>What is the SecNumCloud qualification all about?</strong></p>



<p>SecNumCloud is issued by the French National Cybersecurity Agency (ANSSI). Its goal is to guarantee that the cloud service meets the strictest requirements in terms of <strong>data protection</strong>, <strong>and data sovereignty</strong>. The SecNumCloud standard covers requirements relating to the service provider, its employees and the way in which services are provided. It guarantees the robustness of the solution in the face of the most common cyber-attacks, as well as the rigor and formalization of the service provider&#8217;s processes and methods. The qualification process is a long-term guarantee, ensuring a relationship of trust between service providers and ANSSI.</p>



<p>The requirements of SecNumCloud’s most recent qualification (SecNumCloud 3.2) guarantee the protection of cloud services from non-European law thanks to a combination of three types of measures:</p>



<ul class="wp-block-list">
<li><strong>technical:</strong> standards for testing the watertightness of information systems;</li>

<li><strong>operational:</strong> only the service provider can intervene on the resources supporting the service;</li>

<li><strong>legal:</strong> exclusive application of European law.</li>
</ul>



<p>What is at stake with SecNumCloud is the use of this qualification by both public administrations and private organizations. Industry sectors such as finance and banking, health and cybersecurity anchor their expansion in their ability to protect cloud data and cloud services.</p>



<p>OVHcloud has recently received the SecNumCloud 3.2 qualification for three distinct datacenters located in France: Gravelines, Roubaix and Strasbourg. With an unparalleled commitment to SecNumCloud, OVHcloud firmly believes that the qualification can secure the future of business for industries and administrations, for which the cloud is now a centerpiece and a driver of disruption, by providing them with a trusted cloud solution that enables them to store, process, and share their sensitive and strategic data.</p>



<p>OVHcloud can support European public and private organizations:</p>



<p>&#8211; by offering them a range of SecNumCloud-qualified cloud services (such as Hosted Private Cloud, a dedicated, scalable, and hybrid cloud solution);<br>&#8211; by hosting its SecNumCloud services in its own data centers in France; <br>&#8211; by ensuring that no data is accessible, transferred, or processed outside the European Union.</p>



<p><strong>Extended certifications to meet local regulations</strong></p>



<p>While SecNumCloud is the highest standard of security and data protection available today, our Cloud is multilocal and complies with a number of stringent international certifications, such as:</p>



<ol start="1" class="wp-block-list">
<li><strong>ISO/IEC 27001 (International):</strong> widely adopted internationally, including in European countries, this standard focuses on information security management systems.</li>

<li><strong>C5 (Cloud Computing Compliance Controls Catalog) certification (Germany):</strong> a recognized standard for cloud service providers that emphasizes information security and data protection, designed to ensure compliance with German and European data protection requirements. C5 incorporates elements from ISO 27001 but adds specific criteria tailored to cloud services.</li>

<li><strong>ENS (Esquema Nacional de Seguridad, or National Security Scheme) certification (Spain):</strong> it establishes a common framework of basic principles, requirements, and security measures for the adequate protection of information processed and services provided by electronic means. OVHcloud has obtained the highest level of this certification: ENS High.</li>
</ol>



<p><strong>How OVHcloud is paving the way for trusted cloud in Europe</strong></p>



<p>Trusted cloud is pivotal to our customers, ensuring they can continue innovating through a secure, reliable, open, and sustainable cloud.</p>



<p>At a European level, OVHcloud meets the most demanding certifications in the countries/markets in which it operates.</p>



<p>Our Group supports the establishment of a future European Cloud Certification Scheme (EUCS), which would provide users with harmonized EU-wide certification and maintain the same high standards of transparency and data protection, including against extraterritorial legislation for the most sensitive data.</p>



<p>If you want to learn more about trusted cloud and how OVHcloud can help you achieve your digital ambitions, please visit <a href="https://www.ovhcloud.com/en-ie/about-us/sovereign-cloud/" data-wpel-link="external" target="_blank" rel="nofollow external noopener noreferrer">our website.</a></p>



<p>We are convinced that trusted cloud is the future of cloud computing, and that it is a strategic asset for the digital sovereignty and competitiveness of industry ecosystems. We invite you to join us on this journey and to discover how OVHcloud can help you achieve your digital ambitions with trust and confidence.</p>



<p><a href="https://corporate.ovhcloud.com/en/trusted-cloud/" data-wpel-link="external" target="_blank" rel="nofollow external noopener noreferrer">More on SecNumCloud, Trusted Cloud and Sovereignty</a></p>



]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Collaborating on open-source projects: an example with FreeBSD and OpenZFS</title>
		<link>https://blog.ovhcloud.com/collaborating-on-open-source-projects-an-example-with-freebsd-and-openzfs/</link>
		
		<dc:creator><![CDATA[Philippe Audeoud,&#160;Antonin Goude&#160;and&#160;Julien Jay]]></dc:creator>
		<pubDate>Thu, 20 Apr 2023 06:00:00 +0000</pubDate>
				<category><![CDATA[Accelerating with OVHcloud]]></category>
		<guid isPermaLink="false">https://blog.ovhcloud.com/?p=25077</guid>

					<description><![CDATA[OVHcloud develops a reversible, and open Cloud where interoperability is key thanks to open-source. This approach participates in offering a transparent Cloud with no vendor lock-in aligning with our belief of being stronger together and thus going further. As an actor of the open-source community with several software releases benefiting from an open-source license, we [&#8230;]]]></description>
										<content:encoded><![CDATA[
<p><em>OVHcloud develops a reversible and open Cloud where interoperability is key, thanks to open source. This approach offers a transparent Cloud with no vendor lock-in, aligning with our belief that being stronger together lets us go further. As an actor of the open-source community with several software releases benefiting from an open-source license, we publish code through </em><em><a href="https://github.com/ovh" data-wpel-link="external" target="_blank" rel="nofollow external noopener noreferrer">GitHub</a></em><em> while listening to feedback on our various channels.</em></p>



<p><em>Throughout the year, we also help the community in other ways, such as grants giving access to our infrastructure for testing or evaluation purposes. But that’s not all: deploying a number of open-source solutions at an unprecedented scale, we sometimes find ourselves in a unique position to help the community through development and patching efforts. This recently happened within our storage team, and we thought our experience would be worth sharing.</em></p>



<figure class="wp-block-image aligncenter size-large is-resized"><img loading="lazy" decoding="async" src="https://blog.ovhcloud.com/wp-content/uploads/2023/04/IMG_1490-1024x538.jpg" alt="Collaborating on open-source projects: an example with FreeBSD and OpenZFS" class="wp-image-25080" width="512" height="269" srcset="https://blog.ovhcloud.com/wp-content/uploads/2023/04/IMG_1490-1024x538.jpg 1024w, https://blog.ovhcloud.com/wp-content/uploads/2023/04/IMG_1490-300x158.jpg 300w, https://blog.ovhcloud.com/wp-content/uploads/2023/04/IMG_1490-768x404.jpg 768w, https://blog.ovhcloud.com/wp-content/uploads/2023/04/IMG_1490.jpg 1199w" sizes="auto, (max-width: 512px) 100vw, 512px" /></figure>



<h3 class="wp-block-heading" id="Collaboratingonopensourceprojects:anexamplewithFreeBSDandOpenZFS-OpenZFSandFreeBSDatOVHcloud">OpenZFS and FreeBSD at OVHcloud</h3>



<figure class="wp-block-image alignright size-full is-resized"><img loading="lazy" decoding="async" src="https://blog.ovhcloud.com/wp-content/uploads/2023/04/IMG_1491.png" alt="FreeBSD" class="wp-image-25081" width="195" height="62" srcset="https://blog.ovhcloud.com/wp-content/uploads/2023/04/IMG_1491.png 390w, https://blog.ovhcloud.com/wp-content/uploads/2023/04/IMG_1491-300x95.png 300w" sizes="auto, (max-width: 195px) 100vw, 195px" /></figure>



<p>Within the Storage Product Unit, our mission is to deliver innovative file storage services based on different hardware and software stacks. We take the time to test and validate new technology stacks so we can deliver high-performance, highly available storage, with the utmost care for data protection, all while keeping costs reasonable. Across our complete portfolio of storage solutions, we use different software foundations for different storage access modes, including OpenZFS for file storage.</p>



<figure class="wp-block-image alignright size-full is-resized"><img loading="lazy" decoding="async" src="https://blog.ovhcloud.com/wp-content/uploads/2023/04/IMG_1492.png" alt="OpenZFS" class="wp-image-25082" width="198" height="58" srcset="https://blog.ovhcloud.com/wp-content/uploads/2023/04/IMG_1492.png 396w, https://blog.ovhcloud.com/wp-content/uploads/2023/04/IMG_1492-300x87.png 300w" sizes="auto, (max-width: 198px) 100vw, 198px" /></figure>



<p>Based on our comprehensive testing, we chose FreeBSD for some of our offers, such as <a href="https://www.ovhcloud.com/en-gb/storage-solutions/nas-ha/" data-wpel-link="external" target="_blank" rel="nofollow external noopener noreferrer">NAS-HA</a> or Datastore NFS. One of the reasons is that FreeBSD is managed as a complete operating system, with OpenZFS a first-class citizen that is natively integrated. It benefits from many years of experience across many teams, ensuring quality and security.</p>



<p>FreeBSD&#8217;s <a href="https://www.freebsd.org/releng/" data-wpel-link="external" target="_blank" rel="nofollow external noopener noreferrer">release management</a> goes through multiple steps from idea inception to public release:</p>



<ul class="wp-block-list">
<li>Technical reviews by peers,</li>



<li>A current branch for early testing,</li>



<li>A stable branch for testing by a wider user base,</li>



<li>Release candidates,</li>



<li>The final release.</li>
</ul>



<p>At the same time, patches for software are released to fix vulnerabilities and bugs.</p>



<p>The <a href="https://cgit.freebsd.org/ports/" data-wpel-link="external" target="_blank" rel="nofollow external noopener noreferrer">ports collection</a> is well-designed and simple. While FreeBSD has binary packages, handled by the <a href="https://www.freebsd.org/cgi/man.cgi?query=pkg" data-wpel-link="external" target="_blank" rel="nofollow external noopener noreferrer">pkg</a> package manager, it can also compile software from source, allowing users to select the desired compilation options.</p>



<p>FreeBSD also provides tools like <a href="https://github.com/freebsd/poudriere" data-wpel-link="external" target="_blank" rel="nofollow external noopener noreferrer">poudriere</a>, a utility for creating and testing FreeBSD packages. It makes it easy for users to set up their own binary package repository in which packages are built with their chosen options.</p>



<p>FreeBSD has over five hundred system variables that can be read and set using the <a href="https://www.freebsd.org/cgi/man.cgi?query=sysctl" data-wpel-link="external" target="_blank" rel="nofollow external noopener noreferrer">sysctl</a> utility. These system variables are used to apply changes to a running FreeBSD system, including many advanced options of the TCP/IP stack and virtual memory system that can improve performance.</p>
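<p>As a minimal sketch of how this looks in practice, <code>sysctl</code> prints <code>name: value</code> pairs that are easy to consume programmatically (the variable names below are real FreeBSD sysctls, but the values shown are purely illustrative):</p>

```python
def parse_sysctl(output):
    """Parse sysctl(8) output lines of the form `name: value` into a dict."""
    values = {}
    for line in output.splitlines():
        name, sep, value = line.partition(": ")
        if sep:  # skip lines that don't match the `name: value` shape
            values[name.strip()] = value.strip()
    return values

# Illustrative sample of what `sysctl kern.maxvnodes vfs.zfs.arc_max` might print:
sample = "kern.maxvnodes: 500000\nvfs.zfs.arc_max: 0"
print(parse_sysctl(sample)["kern.maxvnodes"])  # -> 500000
```

<p>On a live system the same dictionary could be filled by capturing the output of <code>sysctl -a</code>; the sample string above simply stands in for that output.</p>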



<p>Our goal today is not to come up with an exhaustive list of technical reasons to use FreeBSD; that would probably fill one too many full blog posts. Keep in mind <a href="https://en.wikipedia.org/wiki/FreeBSD_jail" data-wpel-link="external" target="_blank" rel="nofollow external noopener noreferrer">jails</a>, <a href="https://docs.freebsd.org/en/books/handbook/firewalls/" data-wpel-link="external" target="_blank" rel="nofollow external noopener noreferrer">pf</a>, Linux binary compatibility and so on&#8230;</p>



<p>As daily FreeBSD users, we are convinced that ZFS is a high-performance file system offering replication, compression, encryption, and snapshots. If you want more details, our very own Frédéric Zind said it all during his tech talk at <a href="https://verytechtrip.com/" data-wpel-link="external" target="_blank" rel="nofollow external noopener noreferrer">Very Tech Trip 2023</a>: <a href="https://github.com/fzindovh/talk-zfs/blob/vtt/pdf/PITCHME.pdf" data-wpel-link="external" target="_blank" rel="nofollow external noopener noreferrer">🇬🇧</a> / <a href="https://github.com/fzindovh/talk-zfs/blob/vtt/pdf/PITCHME-fr.pdf" data-wpel-link="external" target="_blank" rel="nofollow external noopener noreferrer">🇫🇷</a></p>



<p>To illustrate what we are doing with OpenZFS and FreeBSD, let&#8217;s take NAS-HA as an example. This product is a file storage service (an active/passive cluster, illustrated below) built upon two nodes and a ZFS-based filesystem shared over NFS and/or CIFS. NAS-HA is a good example of how we integrated both FreeBSD and OpenZFS to build an open and secure storage service dedicated to versatile workloads (centralised storage for private or public cloud instances, bare metal servers…). With NAS-HA we are talking about several thousand servers holding petabytes of data.</p>



<figure class="wp-block-image aligncenter size-full"><img loading="lazy" decoding="async" width="468" height="312" src="https://blog.ovhcloud.com/wp-content/uploads/2023/04/clip_image001.png" alt="Collaborating on open-source projects: an example with FreeBSD and OpenZFS" class="wp-image-25078" srcset="https://blog.ovhcloud.com/wp-content/uploads/2023/04/clip_image001.png 468w, https://blog.ovhcloud.com/wp-content/uploads/2023/04/clip_image001-300x200.png 300w" sizes="auto, (max-width: 468px) 100vw, 468px" /></figure>



<h3 class="wp-block-heading" id="Collaboratingonopensourceprojects:anexamplewithFreeBSDandOpenZFS-Open-sourcecontribution">Open-source contribution</h3>



<p>To demonstrate how OVHcloud participates in open-source communities, we are sharing what we encountered in 2022 and how we fixed it.</p>



<p>Through our constant monitoring, our operations team observed a condition where some filers could no longer allocate memory due to a lack of free vnodes. As a reboot was necessary to unhang the filers, the operations and engineering teams investigated the root cause. They found several leads but unfortunately couldn&#8217;t reproduce the incident. To improve the situation, the following was done:</p>



<ul class="wp-block-list">
<li>Doubling the base memory reservation for the OS (via vfs.zfs.arc_max). Since our FreeBSD servers boot into an in-memory OS without swap, we have to be careful that everything has sufficient resources and does not fight ZFS for memory,</li>



<li>Creating a buffer memory zone (vfs.zfs.arc.sys_free),</li>



<li>Changing the speed of memory release, which had no effect,</li>



<li>Disabling periodic checks, because the default checks were consuming vnodes (a vnode is a structure representing a filesystem entity such as a file, directory or device node) and were unneeded for our use cases.</li>
</ul>



<p>Even after applying these changes, some crashes still happened. We then noticed that the incident seemed to be related to a snapshot that was not held during a restoration: a snapshot rotation occurred while the snapshot was in use for restoration, and it caused a deadlock. At that point, we still had no way to reproduce it.</p>



<p>What we found:</p>



<ul class="wp-block-list">
<li>A huge spike in vnode usage,</li>



<li>Spikes that seemed correlated with the rotation and restoration of our backups (available through the .zfs directory on our NFS shares).</li>
</ul>



<p>What we did to mitigate:</p>



<ul class="wp-block-list">
<li>Limit interactions with the .zfs directory to our essential operations,</li>



<li>Manually set the vnode limit to 10 million (via sysctl).</li>
</ul>
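<p>For illustration, the tunables mentioned in the two lists above map onto FreeBSD sysctl configuration along these lines (a sketch only: the ARC-related sizes are hypothetical placeholders, not our production values; only the 10-million vnode limit comes from the steps described above):</p>

```shell
# /etc/sysctl.conf -- illustrative values, not OVHcloud's production settings
vfs.zfs.arc_max=17179869184       # cap the ZFS ARC (16 GiB here) so the in-memory OS keeps headroom
vfs.zfs.arc.sys_free=4294967296   # keep a 4 GiB buffer of memory free for the system
kern.maxvnodes=10000000           # raise the vnode limit to 10 million
```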



<p>As a result, parallel rsync copies finished successfully. No more problems were observed during our quality-assurance period (5 days), so we reached out to the communities concerned.</p>



<p>However, random servers were still experiencing incidents, and the community started to report similar behaviours. In parallel, we noticed that if an NFS client browsed the .zfs directory exposed by our servers, all ZFS operations on snapshots would afterwards end up in a deadlock state. This happened only on OpenZFS 2.x (FreeBSD 13.x), and only a reboot would unfreeze the server. Problem reports were escalated to&nbsp;OpenZFS (<a href="https://github.com/openzfs/zfs/issues/13974" data-wpel-link="external" target="_blank" rel="nofollow external noopener noreferrer">here</a>) and FreeBSD (<a href="https://bugs.freebsd.org/bugzilla/show_bug.cgi?id=266236" data-wpel-link="external" target="_blank" rel="nofollow external noopener noreferrer">here</a>). FreeBSD users confirmed our findings with a procedure to reproduce the incident.</p>



<p>Thanks to OVHcloud&#8217;s FreeBSD developers, we were quickly pointed to Mark, a FreeBSD developer involved in ZFS matters, and we let him know what we had done, found, and tested.</p>



<p>Leveraging the community&#8217;s feedback, Mark suggested patches and our engineering team tested each one of them. Communication was smooth, and the root cause was finally identified: a race condition around snapshot directory mounts. Technically speaking, the issue was described the following way: <em>&#8220;zfsctl_snapdir_fid() sets fid_gen to 0 or 1 depending on whether the snapshot directory is mounted. On FreeBSD it fails, making snapshot dirs inaccessible via NFS.&#8221;</em> (<a href="https://github.com/openzfs/zfs/commit/ed566bf1cd0bdbf85e8c63c1c119e3d2ef5db1f6#diff-af782b869bef9bb226900dc88e23b83d0c075bdd39d8589ed45eb27cba78dc46" data-wpel-link="external" target="_blank" rel="nofollow external noopener noreferrer">See the commit on OpenZFS&#8217;s GitHub</a>.) The patch was committed and merged into FreeBSD&#8217;s and OpenZFS&#8217;s repositories.</p>



<p>Why didn&#8217;t we catch this bug during our preproduction tests? Because of mathematics: with fewer servers, the probability of concurrent access to the .zfs directory is lower. This test has since been added to our test book.</p>
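<p>To make the mathematics concrete, here is a toy model (the per-server probability is a made-up illustration, not a measured figure) assuming each server independently hits the race with some small probability over a given period:</p>

```python
def p_at_least_one(p_per_server, n_servers):
    """Chance that at least one of n independent servers triggers the bug."""
    return 1.0 - (1.0 - p_per_server) ** n_servers

# With a hypothetical 0.1% chance per server, a small preprod fleet almost
# never sees the bug, while a production-scale fleet almost surely does:
print(round(p_at_least_one(0.001, 20), 3))     # ~0.02 across 20 preprod servers
print(round(p_at_least_one(0.001, 10000), 3))  # ~1.0 across 10,000 production servers
```

<p>The same reasoning explains why adding the reproduction scenario to the test book matters: a deterministic test removes the dependence on fleet size altogether.</p>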



<h3 class="wp-block-heading" id="Collaboratingonopensourceprojects:anexamplewithFreeBSDandOpenZFS-Maintakeaways">Main takeaways</h3>



<p>As a cloud actor, we are committed to open source, and many people at OVHcloud spend an incredible amount of time reproducing bugs, reading code, testing, and ultimately submitting patches. Some of us even develop open-source projects (<a href="https://ovh.github.io/cds/" data-wpel-link="external" target="_blank" rel="nofollow external noopener noreferrer">CDS</a> and <a href="https://ovh.github.io/the-bastion/" data-wpel-link="external" target="_blank" rel="nofollow external noopener noreferrer">bastion</a>, for example, are shining examples), sometimes during their free time.</p>



<p>At the Storage Product Unit, we are first and foremost geeks who love FreeBSD and OpenZFS. And managing more than 10,000 servers daily is quite the experience.</p>



<p>Most importantly, we want to give back to open-source communities such as the Linux Foundation (through the Ceph Foundation), the Cloud Native Computing Foundation, the Python community, the PostgreSQL community, FreeBSD, OpenZFS and more. If you want to learn more, you can check our <a href="https://corporate.ovhcloud.com/en-ie/ecosystem/open-innovation/" data-wpel-link="external" target="_blank" rel="nofollow external noopener noreferrer">open innovation program</a> as well as the <a href="https://github.com/ovh/" data-wpel-link="external" target="_blank" rel="nofollow external noopener noreferrer">OVHcloud GitHub repo</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Understanding OVHcloud&#8217;s data centres: how Delta T helps with water efficiency</title>
		<link>https://blog.ovhcloud.com/understanding-ovhclouds-data-centres-how-delta-t-helps-with-water-efficiency/</link>
		
		<dc:creator><![CDATA[Mohamad Hnayno,&#160;Julien Jay&#160;and&#160;Ali Chehade]]></dc:creator>
		<pubDate>Tue, 21 Mar 2023 09:40:00 +0000</pubDate>
				<category><![CDATA[OVHcloud Engineering]]></category>
		<guid isPermaLink="false">https://blog.ovhcloud.com/?p=24781</guid>

					<description><![CDATA[In our latest installment we introduced the merits of our Hybrid Immersion Liquid Cooling Technology. This new approach to cooling essentially enables new scenarios within the data centre all the while significantly reducing power consumption by up to 20% for cooling systems dedicated to servers. Amongst other benefits, it helps for operation in harsh climatic [&#8230;]]]></description>
										<content:encoded><![CDATA[
<p>In our latest installment we introduced the merits of our <a href="https://blog.ovhcloud.com/new-hybrid-immersion-liquid-cooling-developments-at-ovhcloud/" data-wpel-link="internal">Hybrid Immersion Liquid Cooling Technology</a>. This new approach to cooling enables new scenarios within the data centre while significantly reducing the power consumed by server cooling systems, by up to 20%. Amongst other benefits, it helps with operation in harsh climatic zones. While OVHcloud&#8217;s R&amp;D is hard at work on the next generation of cooling, it also keeps improving the basic building blocks of data centre cooling. Today we&#8217;ll discuss in more detail one singular aspect of OVHcloud-designed data centres that is crucial to reducing water consumption and overall cooling costs compared to the mechanical cooling systems used in other data centres: Delta T.</p>



<figure class="wp-block-image aligncenter size-large is-resized"><img loading="lazy" decoding="async" src="https://blog.ovhcloud.com/wp-content/uploads/2023/03/IMG_1344-1024x538.jpg" alt="Understanding OVHcloud's data centres: how Delta T helps with water efficiency" class="wp-image-24909" width="512" height="269" srcset="https://blog.ovhcloud.com/wp-content/uploads/2023/03/IMG_1344-1024x538.jpg 1024w, https://blog.ovhcloud.com/wp-content/uploads/2023/03/IMG_1344-300x158.jpg 300w, https://blog.ovhcloud.com/wp-content/uploads/2023/03/IMG_1344-768x404.jpg 768w, https://blog.ovhcloud.com/wp-content/uploads/2023/03/IMG_1344.jpg 1199w" sizes="auto, (max-width: 512px) 100vw, 512px" /></figure>



<p>Ever since 2003, OVHcloud has developed watercooling at scale for its servers, using a free-cooling approach that avoids power-hungry air conditioning, thereby reducing power costs and water usage. Over the years, our technology has been refined from generation to generation. All this work allowed us to see that one crucial but often overlooked parameter is Delta T, which measures the difference in temperature between the water coming into and out of the data centre. From this single parameter comes greater efficiency, particularly when it comes to Water Usage Effectiveness (WUE). It is a parameter we continue to refine, the most recent milestone being Delta T=20K. Ready for the deep dive?</p>



<h3 class="wp-block-heading">Overview of OVHcloud cooling DC principles</h3>



<p>A typical OVHcloud data centre benefits from an indirect free cooling (IFC) technique, with a liquid cooling system deployed at scale. It helps improve global data centre power efficiency, with up to 50% energy savings compared to conventional mechanical cooling. With such a cooling strategy, leveraging both liquid cooling and air cooling, OVHcloud&#8217;s data centres aim to be highly efficient.</p>



<p>Our proprietary watercooling system recovers the waste heat generated by our servers through carefully crafted waterblocks in direct contact with the chips. If we break the system down, there are four cooling bricks in the data centre:</p>



<ul class="wp-block-list">
<li>Dry Cooler (heat rejection system),</li>



<li>Pumping Substation (PSS),</li>



<li>Rack Rear Door Heat Exchanger (RDHX) and Cooling Distribution Unit (CDU),</li>



<li>Server waterblock.</li>
</ul>



<p>A typical water circuit in an OVHcloud data centre goes like this: cold water is pumped from a pumping substation (PSS) to the rack, where it reaches the Rear Door Heat Exchangers (RDHX). Each rack comes with 3 RDHXs, each door having its own set of fans supporting the air cooling capacity. Water then flows through the Cooling Distribution Units (CDUs): placed inside the rack, CDUs comprise pumps and plate heat exchangers (PHEXs). They distribute liquid to the waterblocks, effectively acting as a watercooling system, and the heat from the IT equipment is then transported back to the dry coolers located outside the data centre for heat rejection.</p>



<p>Playing an important role in the cooling infrastructure, the dry coolers use external air to cool down the circulating liquid pumped from the data centre, without any refrigeration. Depending on the climatic conditions, an evaporative cooling (EC) system, the most efficient approach to date, is used to cool the outside air before it reaches the fins of the dry cooler.</p>



<h3 class="wp-block-heading">Kelvin, how delta T is measured and why it’s a key parameter</h3>



<p>In the research field, the usual norm is to express temperature differences in Kelvin, while absolute temperatures are given in °C, °F or K. In this document, temperatures are expressed in °C, while temperature differences are expressed in K. When it comes to our data centres, Delta T is the difference in water temperature between the inlet and outlet of a rack (in other words, the in/out water temperature of the data centre): the greater this difference, the more efficient the data centre. Why? An increase in the temperature difference across the racks (dT) reduces the water flow rate that must be supplied per rack, allowing the use of lower-capacity pumps. Thus, an increase in the temperature difference permits a higher heat load per pumping unit and dry cooler for an equivalent PSS hydraulic capacity and heat rejection system.</p>
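<p>This relationship is plain sensible-heat physics: for a fixed heat load Q, the required water mass flow is Q / (c<sub>p</sub> · ΔT), so quadrupling Delta T divides the flow by four. A back-of-the-envelope sketch, using a 600 kW IT load as an illustrative figure:</p>

```python
CP_WATER = 4186.0  # specific heat of water, J/(kg*K)

def required_flow_kg_s(heat_load_w, delta_t_k):
    """Water mass flow (kg/s) needed to carry a heat load at a given Delta T."""
    return heat_load_w / (CP_WATER * delta_t_k)

load_w = 600_000  # an illustrative 600 kW IT load
print(round(required_flow_kg_s(load_w, 5), 1))   # ~28.7 kg/s at Delta T = 5 K
print(round(required_flow_kg_s(load_w, 20), 1))  # ~7.2 kg/s at Delta T = 20 K
```

<p>The fourfold reduction in flow is exactly what allows smaller pumps and reduced piping for the same heat rejection capacity.</p>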



<h3 class="wp-block-heading">From Delta T=5K to Delta T=20K</h3>



<p>As we have seen, a greater temperature difference brings more efficiency. It also significantly improves the performance of the dry cooler: water enters at higher temperature levels, while fan and evaporative-cooling water consumption per kWh of IT is reduced because a lower air flow rate is required. That latter consequence allows the ventilation capacity to be optimised.</p>



<p>Since 2019, we have upgraded our most recent data centres to a new generation of cooling infrastructure capable of operating at Delta T=20K instead of Delta T=5K. We rely on fewer, yet more efficient, dry coolers per data centre, with smaller pumps and reduced piping. The result is a potentially higher density (kW/m2) for the data centre, a lower overall carbon footprint, better PPUE and improved WUE, as well as reduced CAPEX and OPEX.</p>



<ul class="wp-block-list">
<li>Delta T=5K explained</li>
</ul>



<p>In a traditional OVHcloud data centre with a Delta T=5K setting, the water temperature is maintained at 25°C, providing an inlet air temperature of 30°C and an inlet water temperature of 35°C at the waterblock. As shown in the figure below, two circuits in parallel are connected to the rack.</p>



<figure class="wp-block-image aligncenter size-large is-resized"><img loading="lazy" decoding="async" src="https://blog.ovhcloud.com/wp-content/uploads/2023/03/IMG_1340-1024x605.png" alt="Delta T=5K" class="wp-image-24905" width="512" height="303" srcset="https://blog.ovhcloud.com/wp-content/uploads/2023/03/IMG_1340-1024x605.png 1024w, https://blog.ovhcloud.com/wp-content/uploads/2023/03/IMG_1340-300x177.png 300w, https://blog.ovhcloud.com/wp-content/uploads/2023/03/IMG_1340-768x454.png 768w, https://blog.ovhcloud.com/wp-content/uploads/2023/03/IMG_1340.png 1488w" sizes="auto, (max-width: 512px) 100vw, 512px" /></figure>



<p>Through constant development, we improved the cooling design of our rear door heat exchanger (RDHX) to increase the inlet water temperature and reduce the pinch between cold air and cold water: this way, the data centre inlet water temperature is now set at 27°C, against 25°C previously.</p>



<ul class="wp-block-list">
<li>Delta T=20K explained</li>
</ul>



<p>Building on the notable improvements discussed above, we were also able to set up serial circuitry with the PHEX and RDHX at rack level in order to achieve a complete Delta T=20K environment, as shown in the figure.</p>



<figure class="wp-block-image aligncenter size-large is-resized"><img loading="lazy" decoding="async" src="https://blog.ovhcloud.com/wp-content/uploads/2023/03/IMG_1345-1024x681.png" alt="Delta T=20K explained" class="wp-image-24926" width="512" height="341" srcset="https://blog.ovhcloud.com/wp-content/uploads/2023/03/IMG_1345-1024x681.png 1024w, https://blog.ovhcloud.com/wp-content/uploads/2023/03/IMG_1345-300x199.png 300w, https://blog.ovhcloud.com/wp-content/uploads/2023/03/IMG_1345-768x511.png 768w, https://blog.ovhcloud.com/wp-content/uploads/2023/03/IMG_1345.png 1322w" sizes="auto, (max-width: 512px) 100vw, 512px" /></figure>



<p>With the water temperature maintained at 27°C, the inlet air temperature of the data centre remains at 30°C and the inlet water temperature at the waterblock is 45°C. The data centre water outlet temperature is set at 47°C. As a side note, our robust design gives us confidence in our ability to operate with a higher temperature profile, still in a Delta T=20K setting, such as 32-52°C.</p>



<figure class="wp-block-video"><video height="1080" style="aspect-ratio: 1920 / 1080;" width="1920" autoplay loop muted src="https://blog.ovhcloud.com/wp-content/uploads/2023/03/animation-Delta-T.mp4"></video></figure>



<h3 class="wp-block-heading">What are the benefits of Delta T=20K VS Delta T=5K? And what are the advantages when applied to different data centre locations?</h3>



<p>To measure the gains of OVHcloud&#8217;s IFC system operating at Delta T=20K, we conducted an extensive series of benchmarks in various locations, with the same cooling module configured first for Delta T=5K and then for Delta T=20K, expecting to highlight significant gains in water consumption.</p>



<p>We opted for the following methodology: we collected and analysed weather data for the past 40 years, including temperature and relative humidity, with a record every 30 minutes. For the Roubaix site, for example, the warmest year happened to be 2019. The figure below shows the temperature frequency in hours, extracted from climatic data recorded at Roubaix in 2019. The key hours are between 5 and 20°C.</p>



<figure class="wp-block-image aligncenter size-large is-resized"><img loading="lazy" decoding="async" src="https://blog.ovhcloud.com/wp-content/uploads/2023/03/4-1024x591.png" alt="" class="wp-image-24787" width="512" height="296" srcset="https://blog.ovhcloud.com/wp-content/uploads/2023/03/4-1024x591.png 1024w, https://blog.ovhcloud.com/wp-content/uploads/2023/03/4-300x173.png 300w, https://blog.ovhcloud.com/wp-content/uploads/2023/03/4-768x443.png 768w, https://blog.ovhcloud.com/wp-content/uploads/2023/03/4-1536x886.png 1536w, https://blog.ovhcloud.com/wp-content/uploads/2023/03/4-2048x1181.png 2048w" sizes="auto, (max-width: 512px) 100vw, 512px" /></figure>



<p>Based on our working principle, we can deduce the WUE for every country where we have a data centre. For reference, in this <a href="https://www.sciencedirect.com/science/article/abs/pii/S1359431122006779?via%3Dihub" data-wpel-link="external" target="_blank" rel="nofollow external noopener noreferrer">scientific paper</a> we estimated the theoretical WUE for some of our locations offering the best possible representation of various climatic conditions: Roubaix (France), Mumbai (India), Singapore and Vint Hill (USA). For accuracy purposes, we are sharing below the estimated WUE for Roubaix:</p>



<figure class="wp-block-image aligncenter size-full is-resized"><img loading="lazy" decoding="async" src="https://blog.ovhcloud.com/wp-content/uploads/2023/03/EstimatedWUERoubaixDT20.png" alt="" class="wp-image-24844" width="502" height="351" srcset="https://blog.ovhcloud.com/wp-content/uploads/2023/03/EstimatedWUERoubaixDT20.png 772w, https://blog.ovhcloud.com/wp-content/uploads/2023/03/EstimatedWUERoubaixDT20-300x210.png 300w, https://blog.ovhcloud.com/wp-content/uploads/2023/03/EstimatedWUERoubaixDT20-768x537.png 768w" sizes="auto, (max-width: 502px) 100vw, 502px" /></figure>



<p>This theoretical result is impressive. Yet it&#8217;s important to have a better sense of what WUE represents: we estimate that in an IFC with a Delta T=20K setting, 381 m<sup>3</sup> of water is enough to cool a 600 kW data centre, versus 964 m<sup>3</sup> in an IFC with a Delta T=5K setting.</p>
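<p>As a rough cross-check of these volumes (a simplification that assumes the 600 kW IT load runs continuously for a full year; real WUE also depends on climate and load profile):</p>

```python
HOURS_PER_YEAR = 8760

def wue_l_per_kwh(water_m3, it_load_kw, hours=HOURS_PER_YEAR):
    """Water Usage Effectiveness: litres of water per kWh of IT energy."""
    return (water_m3 * 1000.0) / (it_load_kw * hours)

print(round(wue_l_per_kwh(381, 600), 2))  # ~0.07 l/kWh with Delta T = 20K
print(round(wue_l_per_kwh(964, 600), 2))  # ~0.18 l/kWh with Delta T = 5K
```

<p>These back-of-the-envelope figures land in the same order of magnitude as the average annual WUE values summarised in the table below (0.09 and 0.16 l/kWh).</p>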



<p>The table below is a summary of the most glaring changes between Delta T=5K and Delta T=20K:</p>



<figure class="wp-block-table">
  <table class="ovhcloud_table">
    <thead>
      <tr>
        <th>&nbsp;</th>
        <th>dT= 5K</th>
        <th>dT= 20K</th>
      </tr>
    </thead>
    <tbody>
      <tr><td>Temperature profile</td><td>25°C &#8211; 30°C @ 5K</td><td>&nbsp;27°C – 47°C @ 20K</td></tr><tr><td>Water flow rate per server</td><td>Up to 0.25 l/min</td><td>Up to 0.5 l/min</td></tr><tr><td>Dry coolers consumption for 600kW of IT workload</td><td>Up to 48kW</td><td>Up to 24kW</td></tr><tr><td>Pumping Substation(s) (PSS) for 600 kW of IT workload (pump consumption)</td><td>Up to 17kW</td><td>Up to 3kW</td></tr><tr><td>Rack internal pumping (CDU)</td><td>0</td><td>Up to 7kW for 600kW of IT workload</td></tr><tr><td>PSS footprint</td><td>125 kg/pump</td><td>50 kg /pump</td></tr><tr><td>Average annual WUE (l/kWh)</td><td>0.16</td><td>0.09</td></tr></tbody></table></figure>



<h3 class="wp-block-heading">Versus the competition?</h3>



<p>To date, two different families of liquid cooling techniques are employed within data centres: close-coupled cooling and direct liquid cooling (see table). Watercooling solutions take advantage of the heat-carrying capacity of water, which is 3.3 times higher than that of air, for a dramatic reduction in thermal resistance. With CPUs and GPUs being the most power-hungry parts of a server, liquid cooling via direct-to-chip contact returns warmer water from the IT equipment than close-coupled cooling systems do. Logically, the coolant temperature directly impacts energy savings and cooling efficiency, contributing to a significant decrease in total DC site energy consumption of 14 to 20%<sup>1</sup>. More information on the above can be found <a href="https://www.sciencedirect.com/science/article/abs/pii/S1359431122006779?via%3Dihub" data-wpel-link="external" target="_blank" rel="nofollow external noopener noreferrer">here</a>.</p>



<figure class="wp-block-table">
<table class="ovhcloud_table">
  <thead>
    <tr>
      <th colspan="2">Liquid cooling solutions</th>
      <th>Description</th>
    </tr>
  </thead>
  <tbody>

    <tr>
      <td rowspan="2">Close coupled cooling with a typical inlet water temperature = 5 to 23 °C</td>
      <td>In-row Cooling</td>
      <td>Row-based air conditioning units are installed inside the rack rows</td>
    </tr>
    <tr>
      <td>In-rack Cooling</td>
      <td>The cooling system is combined with the server rack, and both are sealed completely.</td>
    </tr>
    <tr>
      <td rowspan="2">Direct-to-chip cooling with a typical inlet water temperature 27–45 °C</td>
      <td>Rear Door Heat Exchanger</td>
      <td>Based on a combination of fans and a finned heat exchanger installed on the rear side of racks.</td>
    </tr>
    <tr>
      <td>Cold plates (waterblocks)</td>
      <td>A cold plate with micro-channels in direct contact with components like CPUs and GPUs. Here, the water flowing across absorbs chip energy.</td>
    </tr>
  </tbody>
</table></figure>



<h3 class="wp-block-heading">What’s next?</h3>



<p>As we celebrate the 20th anniversary of our watercooling technology this year, we continue to innovate. We have been working on designing a new breed of data centre employing climate-friendly cooling technologies. In a global context of surging energy prices and the several water shortages of summer 2022, it becomes quite obvious why our earlier technological bets are paying off. With best-in-class PUE and WUE for our data centres, we continue to work tirelessly on improving our footprint within the ecosystem, illustrating our deep commitment to <a href="https://corporate.ovhcloud.com/en-ie/sustainability/" data-wpel-link="external" target="_blank" rel="nofollow external noopener noreferrer">sustainability</a>. Hopefully by now you have come to realise how important the Delta T parameter is to the data centre. From the infrastructure point of view to tangible water efficiency, Delta T=20K contributes to adapting to various geographies and, ultimately, to water conservation.</p>



<p>The water consumption of data centres indeed has a tremendous local impact, going as far as affecting river flow and groundwater levels, and influencing the overall water resources of a whole territory. Reducing WUE thus proves critical for any modern data centre wishing to lower its impact on local communities.</p>



<p>The impressive water savings illustrated by our WUE also contribute to a streamlined PUE, in compliance with the ISO/IEC 30134 standard. Last but not least, the demonstrated advantages of Delta T=20K will prove a solid basis for the next generation of improvements our teams develop.</p>



<p>1: Performance analysis of new liquid cooling topology and its impact on data centres &#8211;&nbsp; <a href="https://www.sciencedirect.com/science/article/abs/pii/S1359431122006779?via%3Dihub" data-wpel-link="external" target="_blank" rel="nofollow external noopener noreferrer">https://www.sciencedirect.com/science/article/abs/pii/S1359431122006779?via%3Dihub</a></p>
<img loading="lazy" decoding="async" src="//blog.ovhcloud.com/wp-content/plugins/matomo/app/matomo.php?idsite=1&amp;rec=1&amp;url=https%3A%2F%2Fblog.ovhcloud.com%2Funderstanding-ovhclouds-data-centres-how-delta-t-helps-with-water-efficiency%2F&amp;action_name=Understanding%20OVHcloud%26%238217%3Bs%20data%20centres%3A%20how%20Delta%20T%20helps%20with%20water%20efficiency&amp;urlref=https%3A%2F%2Fblog.ovhcloud.com%2Ffeed%2F" style="border:0;width:0;height:0" width="0" height="0" alt="" />]]></content:encoded>
					
		
		<enclosure url="https://blog.ovhcloud.com/wp-content/uploads/2023/03/animation-Delta-T.mp4" length="11425209" type="video/mp4" />

			</item>
		<item>
		<title>New Hybrid Immersion Liquid Cooling developments at OVHcloud</title>
		<link>https://blog.ovhcloud.com/new-hybrid-immersion-liquid-cooling-developments-at-ovhcloud/</link>
		
		<dc:creator><![CDATA[Mohamad Hnayno,&#160;Julien Jay&#160;and&#160;Ali Chehade]]></dc:creator>
		<pubDate>Thu, 13 Oct 2022 07:31:17 +0000</pubDate>
				<category><![CDATA[OVHcloud Engineering]]></category>
		<category><![CDATA[Datacenters & network]]></category>
		<category><![CDATA[Immersion cooling]]></category>
		<category><![CDATA[Liquid cooling]]></category>
		<category><![CDATA[OVHcloud]]></category>
		<category><![CDATA[R&D]]></category>
		<category><![CDATA[Watercooling]]></category>
		<guid isPermaLink="false">https://blog.ovhcloud.com/?p=23587</guid>

					<description><![CDATA[If you think about it, datacentre cooling infrastructure hasn’t changed much over the past few decades. Whilst compute power and efficiency have remarkably improved over the years, many datacentres still rely on racks whose cooling depends on a variety of systems comprised of fans. Heat, especially fatal heat, remains a top concern to keep servers [&#8230;]<img src="//blog.ovhcloud.com/wp-content/plugins/matomo/app/matomo.php?idsite=1&amp;rec=1&amp;url=https%3A%2F%2Fblog.ovhcloud.com%2Fnew-hybrid-immersion-liquid-cooling-developments-at-ovhcloud%2F&amp;action_name=New%20Hybrid%20Immersion%20Liquid%20Cooling%20developments%20at%20OVHcloud&amp;urlref=https%3A%2F%2Fblog.ovhcloud.com%2Ffeed%2F" style="border:0;width:0;height:0" width="0" height="0" alt="" />]]></description>
										<content:encoded><![CDATA[
<p>If you think about it, datacentre cooling infrastructure hasn’t changed much over the past few decades. Whilst compute power and efficiency have improved remarkably over the years, many datacentres still rely on racks whose cooling depends on a variety of fan-based systems. Heat, especially fatal heat, remains a top concern in keeping servers up and running. Whilst some datacentres have opted for power-hungry solutions such as air conditioning (direct expansion), others have adopted a free cooling approach.</p>



<figure class="wp-block-image aligncenter size-large is-resized"><img loading="lazy" decoding="async" src="https://blog.ovhcloud.com/wp-content/uploads/2022/10/immersion-cooling-head-1024x576.jpg" alt="New Hybrid Immersion Liquid Cooling developments at OVHcloud" class="wp-image-23645" width="512" height="288" srcset="https://blog.ovhcloud.com/wp-content/uploads/2022/10/immersion-cooling-head-1024x576.jpg 1024w, https://blog.ovhcloud.com/wp-content/uploads/2022/10/immersion-cooling-head-300x169.jpg 300w, https://blog.ovhcloud.com/wp-content/uploads/2022/10/immersion-cooling-head-768x432.jpg 768w, https://blog.ovhcloud.com/wp-content/uploads/2022/10/immersion-cooling-head.jpg 1230w" sizes="auto, (max-width: 512px) 100vw, 512px" /></figure>



<p>Here at OVHcloud, we pioneered free cooling, and have used water cooling at scale to cool down our servers since 2003. This unique approach helped us to set high standards in PUE/WUE indexes for our global datacentres, all the while lowering our overall carbon impact.</p>



<p>With the evolution of IT equipment power, OVHcloud R&amp;D worked on a new generation of datacentre cooling solutions. One of the most promising technologies today is immersion cooling, thanks to the direct contact of the fluid with hardware components. It allows us to build datacentres all over the world while maintaining low PUE/WUE indexes, even in harsh climatic zones.<br><br>In this blog post, we will explain how we developed our own breed of immersion cooling to achieve several goals: improving datacentre footprint, reducing power consumption, adapting to new operating conditions and improving overall reliability.</p>



<h3 class="wp-block-heading">OVHcloud Hybrid Immersion Liquid Cooling</h3>



<figure class="wp-block-image alignright size-large is-resized"><img loading="lazy" decoding="async" src="https://blog.ovhcloud.com/wp-content/uploads/2022/10/immersion-cooling-01-869x1024.jpg" alt="Immersion Cooling 01" class="wp-image-23594" width="217" height="256" title="" srcset="https://blog.ovhcloud.com/wp-content/uploads/2022/10/immersion-cooling-01-869x1024.jpg 869w, https://blog.ovhcloud.com/wp-content/uploads/2022/10/immersion-cooling-01-255x300.jpg 255w, https://blog.ovhcloud.com/wp-content/uploads/2022/10/immersion-cooling-01-768x905.jpg 768w, https://blog.ovhcloud.com/wp-content/uploads/2022/10/immersion-cooling-01-1304x1536.jpg 1304w, https://blog.ovhcloud.com/wp-content/uploads/2022/10/immersion-cooling-01-1739x2048.jpg 1739w" sizes="auto, (max-width: 217px) 100vw, 217px" /></figure>



<p>Immersion cooling is the practice of submerging electronic devices in a liquid that is thermally, but not electrically, conductive.</p>



<p>Typical immersion cooling solutions usually employ pumps, heat sink structures, heat exchangers, condensers, sealed evaporative equipment, etc., which either consume large amounts of energy to operate, require sealed casings, or occupy relatively large surface areas that limit the number of servers that can be deployed.</p>



<p>OVHcloud developed its own Hybrid Immersion Liquid Cooling technique. It combines a direct-to-chip water cooling system with a passive, natural, single-phase immersion cooling system, using two fluids:</p>



<figure class="wp-block-image alignright size-large is-resized"><img loading="lazy" decoding="async" src="https://blog.ovhcloud.com/wp-content/uploads/2022/10/immersion-cooling-03-1-1024x640.jpg" alt="Immersion Cooling 03" class="wp-image-23600" width="256" height="160" srcset="https://blog.ovhcloud.com/wp-content/uploads/2022/10/immersion-cooling-03-1-1024x640.jpg 1024w, https://blog.ovhcloud.com/wp-content/uploads/2022/10/immersion-cooling-03-1-300x188.jpg 300w, https://blog.ovhcloud.com/wp-content/uploads/2022/10/immersion-cooling-03-1-768x480.jpg 768w, https://blog.ovhcloud.com/wp-content/uploads/2022/10/immersion-cooling-03-1-1536x960.jpg 1536w, https://blog.ovhcloud.com/wp-content/uploads/2022/10/immersion-cooling-03-1.jpg 2000w" sizes="auto, (max-width: 256px) 100vw, 256px" /></figure>



<ul class="wp-block-list"><li>Water: cools CPUs and GPUs through water blocks, using the same solution deployed in all OVHcloud servers, together with a proprietary serpentine convection coil connected to a pumping substation (PSS) and a dry cooler that evacuates heat outside the DC.<br></li><li>Immersion Cooling (IC): the fluid is contained in a tank and cools all of the server’s IT equipment rather than just the CPUs and GPUs; it essentially replaces the air circulating in OVHcloud servers, improving the cooling of every component not covered by the OVHcloud water cooling system.</li></ul>



<p>It comes with a new three-floor server rack design in a library format, which we can populate with up to 48 1U servers or 24 2U servers arranged like books. Each server is submerged in its own tank for independent cooling, allowing large-scale deployments. In addition, each server benefits from dedicated monitoring of all environmental factors, ensuring safe and secure operation at the server level.</p>



<h3 class="wp-block-heading">Benefits in terms of power consumption and efficiency</h3>



<figure class="wp-block-image alignright size-large is-resized"><img loading="lazy" decoding="async" src="https://blog.ovhcloud.com/wp-content/uploads/2022/10/immersion-cooling-04-1024x641.jpg" alt="Immersion Cooling 04" class="wp-image-23602" width="256" height="160" srcset="https://blog.ovhcloud.com/wp-content/uploads/2022/10/immersion-cooling-04-1024x641.jpg 1024w, https://blog.ovhcloud.com/wp-content/uploads/2022/10/immersion-cooling-04-300x188.jpg 300w, https://blog.ovhcloud.com/wp-content/uploads/2022/10/immersion-cooling-04-768x480.jpg 768w, https://blog.ovhcloud.com/wp-content/uploads/2022/10/immersion-cooling-04-1536x961.jpg 1536w, https://blog.ovhcloud.com/wp-content/uploads/2022/10/immersion-cooling-04.jpg 2000w" sizes="auto, (max-width: 256px) 100vw, 256px" /></figure>



<p>OVHcloud Hybrid Immersion Liquid Cooling has several advantages that make it an efficient solution:</p>



<ul class="wp-block-list"><li>The new passive rack design means there are neither pumps nor fans, resulting in zero cooling electrical consumption at the rack level</li><li>High power racks can operate with datacentre inlet temperatures of up to 45 °C, allowing for different cooling loads under different climatic conditions</li><li>The system stays energy efficient by using the free cooling available on site</li><li>Electrical consumption and CAPEX are further reduced by eliminating the evaporative cooling system (no pumps) used on dry coolers, for data centres located in areas where ambient air temperatures stay below 43 °C</li><li>A difference between supplied and recovered DC water temperatures (Delta T) of 20K can be achieved thanks to highly efficient CPU and GPU cold plates and a large thermal contact surface between the serpentine and the dielectric fluid. The patent-pending serpentine design occupies a very small area within the server chassis, maximizing space for components</li><li>Global datacentre cooling infrastructure power consumption is reduced by at least 20.7% compared to the OVHcloud water cooling system</li><li>The energy consumption of a server per year is reduced by at least 20% compared to air-cooled servers, and by 7% versus water-cooled servers</li><li>Reduced power consumption means lower operating expenses (OPEX)</li></ul>



<p>Finally, in the context of ever-increasing CPU/GPU TDP, the solution supports higher computing power density.</p>



<h3 class="wp-block-heading">A new generation of datacentre</h3>



<figure class="wp-block-image alignright size-large is-resized"><img loading="lazy" decoding="async" src="https://blog.ovhcloud.com/wp-content/uploads/2022/10/immersion-cooling-09-861x1024.jpg" alt="Immersion Cooling 09" class="wp-image-23605" width="215" height="256" srcset="https://blog.ovhcloud.com/wp-content/uploads/2022/10/immersion-cooling-09-861x1024.jpg 861w, https://blog.ovhcloud.com/wp-content/uploads/2022/10/immersion-cooling-09-252x300.jpg 252w, https://blog.ovhcloud.com/wp-content/uploads/2022/10/immersion-cooling-09-768x913.jpg 768w, https://blog.ovhcloud.com/wp-content/uploads/2022/10/immersion-cooling-09-1291x1536.jpg 1291w, https://blog.ovhcloud.com/wp-content/uploads/2022/10/immersion-cooling-09.jpg 1500w" sizes="auto, (max-width: 215px) 100vw, 215px" /></figure>



<p>Our new Hybrid Immersion Liquid Cooling approach stands out with several key advantages for datacentre operators:</p>



<ul class="wp-block-list"><li>An actual footprint of up to 37U/m<sup>2</sup> can be guaranteed with no extra pumping or condenser systems. The design can scale to two or three times this density if immersion cooling racks are installed in stacked maritime containers</li><li>With no sealing, sophisticated heat exchangers or pumping circuits required, capital expenditure (CAPEX) remains low</li><li>With a datacentre outlet temperature of 65 °C, fatal heat is better captured, and affordable heat recovery systems can be envisioned</li><li>More flexibility and freedom in design at OVHcloud: the new rack design only requires access to the front side, unlike classic racks, which require access from both the front and rear.</li></ul>



<h3 class="wp-block-heading">Redefining datacentre indexes</h3>



<figure class="wp-block-image alignright size-large is-resized"><img loading="lazy" decoding="async" src="https://blog.ovhcloud.com/wp-content/uploads/2022/10/immersion-cooling-10-788x1024.jpg" alt="Immersion Cooling 10" class="wp-image-23606" width="197" height="256" srcset="https://blog.ovhcloud.com/wp-content/uploads/2022/10/immersion-cooling-10-788x1024.jpg 788w, https://blog.ovhcloud.com/wp-content/uploads/2022/10/immersion-cooling-10-231x300.jpg 231w, https://blog.ovhcloud.com/wp-content/uploads/2022/10/immersion-cooling-10-768x998.jpg 768w, https://blog.ovhcloud.com/wp-content/uploads/2022/10/immersion-cooling-10-1182x1536.jpg 1182w, https://blog.ovhcloud.com/wp-content/uploads/2022/10/immersion-cooling-10-1577x2048.jpg 1577w, https://blog.ovhcloud.com/wp-content/uploads/2022/10/immersion-cooling-10.jpg 1732w" sizes="auto, (max-width: 197px) 100vw, 197px" /></figure>



<p>With such a game-changing technique, the usual datacentre performance indicators improve dramatically.</p>



<p>PPUE<sup>[1]</sup> enhanced to 1.004: PPUE, or partial power usage effectiveness, measures the portion of a datacentre’s overall PUE within a clearly defined boundary. The infrastructure cooling PPUE is defined as follows:</p>



<figure class="wp-block-image size-full is-resized"><img loading="lazy" decoding="async" src="https://blog.ovhcloud.com/wp-content/uploads/2022/10/Eq-2.png" alt="" class="wp-image-23638" width="168" height="55" srcset="https://blog.ovhcloud.com/wp-content/uploads/2022/10/Eq-2.png 671w, https://blog.ovhcloud.com/wp-content/uploads/2022/10/Eq-2-300x99.png 300w" sizes="auto, (max-width: 168px) 100vw, 168px" /></figure>
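<p>As a quick sanity check of what a PPUE of 1.004 implies, the formula above can be applied to illustrative numbers: the cooling boundary consumes only 0.4% as much energy as the IT equipment it serves.</p>

```python
def cooling_ppue(it_energy_kwh: float, cooling_energy_kwh: float) -> float:
    """Partial PUE over the cooling boundary: (IT + cooling) / IT energy."""
    return (it_energy_kwh + cooling_energy_kwh) / it_energy_kwh

# Illustrative figures: 1 GWh of IT load with 4 MWh of cooling overhead
# yields the PPUE of 1.004 quoted above.
print(cooling_ppue(1_000_000.0, 4_000.0))
```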



<p>WUE reduced to 0 for datacentres located in areas where ambient air temperatures stay below 43 °C: WUE, or water usage effectiveness, is a sustainability metric measuring the amount of water a datacentre uses to cool its IT equipment. Annual site water usage includes water used for humidification and water evaporated on site for energy production or for cooling the datacentre and its support systems. It is defined as follows:</p>



<figure class="wp-block-image size-full is-resized"><img loading="lazy" decoding="async" src="https://blog.ovhcloud.com/wp-content/uploads/2022/10/H2O.png" alt="" class="wp-image-23632" width="253" height="53" srcset="https://blog.ovhcloud.com/wp-content/uploads/2022/10/H2O.png 1010w, https://blog.ovhcloud.com/wp-content/uploads/2022/10/H2O-300x62.png 300w, https://blog.ovhcloud.com/wp-content/uploads/2022/10/H2O-768x160.png 768w" sizes="auto, (max-width: 253px) 100vw, 253px" /></figure>
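<p>The same kind of back-of-the-envelope check works for WUE, which is expressed in litres of water per kWh of IT energy. A site that evaporates no water on site, as described above for ambient temperatures below 43 °C, scores a WUE of 0 regardless of IT load (numbers below are illustrative):</p>

```python
def wue(annual_water_litres: float, it_energy_kwh: float) -> float:
    """Water usage effectiveness: litres of site water per kWh of IT energy."""
    return annual_water_litres / it_energy_kwh

# An evaporative-cooled site (illustrative numbers)...
print(wue(200_000.0, 1_000_000.0))  # L/kWh

# ...versus a site that evaporates no water at all.
print(wue(0.0, 1_000_000.0))        # 0.0 L/kWh
```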



<h3 class="wp-block-heading">Environmental considerations</h3>



<p>The Hybrid Immersion Liquid Cooling technique relies on a non-volatile dielectric hydrocarbon fluid. OVHcloud R&amp;D tested and qualified the system with a variety of market fluids. While each fluid has different properties, we paid special attention to their compliance with the following considerations:</p>



<ul class="wp-block-list"><li>Non-corrosive</li><li>Ultra-low vaporization</li><li>Non-toxic</li><li>Non-allergenic</li><li>Biodegradable within 30 days</li><li>Very high flash point</li><li>Dielectric strength up to 42 kV</li><li>GWP<sup>[2]</sup>=0</li><li>ODP<sup>[3]</sup>=0</li></ul>



<h3 class="wp-block-heading">Operational changes within the datacentre</h3>



<p>The new three-floor library-format rack design is footprint-optimized for a high density of servers and, consequently, compute power per square metre (U/m<sup>2</sup>). It can be fitted with manual or automated guidance to extract servers, reducing the maintenance impact of a single server on the overall rack.</p>



<p>The new Hybrid Immersion Liquid Cooling technique also improves reliability, reducing the failure rate by up to 60% by eliminating the risk of dust circulation. It contributes to a silent datacentre, as all fans at rack and server level are removed. Finally, several components are no longer needed:</p>



<ul class="wp-block-list"><li>No heat exchangers attached to the racks (3 heat exchangers eliminated per rack)</li><li>The 36 fans usually installed per rack are eliminated</li><li>All small fans inside the servers are eliminated</li><li>Cooling modules comprised of pumps and plate heat exchangers are eliminated</li></ul>



<figure class="wp-block-image aligncenter size-full"><img loading="lazy" decoding="async" width="640" height="360" src="https://blog.ovhcloud.com/wp-content/uploads/2022/10/Immersive-cooling.gif" alt="" class="wp-image-23607"/></figure>



<h3 class="wp-block-heading">What’s next?</h3>



<p>The new OVHcloud Hybrid Immersion Liquid Cooling paves the way for significantly more compute power in modern datacentres. Its unique patent-pending design is sustainable and preserves the opportunity to valorize fatal heat, thanks to higher operating temperatures. Moreover, lower power and water consumption, as well as a smaller footprint, can be achieved, with the added benefit of adapting to regions with harsh climatic conditions. We are incredibly excited by the very first use cases of Hybrid Immersion Liquid Cooling in the fields of banking, medical and scientific research, not forgetting gaming workloads. We can’t wait to share more with you in the future.</p>



<p style="font-size:12px">[1]: Partial PUE is used instead of PUE. The global PUE, which also accounts for power lost in the energy distribution system, is expected to improve in a similar way [more details about PPUE and PUE <a href="https://doi.org/10.1016/j.applthermaleng.2022.118733" target="_blank" rel="noreferrer noopener nofollow external" data-wpel-link="external">here</a>]. In line with its transparency commitments, OVHcloud will communicate PUE values once significant long-run data has been gathered, on a per-location basis.<br>[2]: GWP stands for Global Warming Potential. The fluid does not contribute to global warming, unlike most gas/liquid fluids used in cooling technologies.<br>[3]: ODP stands for Ozone Depletion Potential. The fluid does not contribute to ozone depletion, unlike most gas/liquid fluids used in cooling technologies.<br></p>



<img loading="lazy" decoding="async" src="//blog.ovhcloud.com/wp-content/plugins/matomo/app/matomo.php?idsite=1&amp;rec=1&amp;url=https%3A%2F%2Fblog.ovhcloud.com%2Fnew-hybrid-immersion-liquid-cooling-developments-at-ovhcloud%2F&amp;action_name=New%20Hybrid%20Immersion%20Liquid%20Cooling%20developments%20at%20OVHcloud&amp;urlref=https%3A%2F%2Fblog.ovhcloud.com%2Ffeed%2F" style="border:0;width:0;height:0" width="0" height="0" alt="" />]]></content:encoded>
					
		
		
			</item>
	</channel>
</rss>
