Together, your Internet even better

Articles tagged with: centre de données

Data center: what is it?

on Friday, 22 May 2020 Posted in Archives Rezopole


Ever more numerous and ever larger, data centers ("centres de données" in French) have become a crucial issue for the development of the Internet. A data center is a physical site where various pieces of computer equipment, such as computers and servers, are grouped together.

 

The main function of a data center is to store information that is useful for the proper functioning of a company. Depending on its size, the power of its storage systems, and other characteristics, it can store billions of pieces of data on customers, individuals, etc.

 

A large share of the world's data centers, today supplied by players such as Cisco, Jerlaure and Sigma, are dedicated to hosting the servers used for browsing the Internet. To protect the data stored in these particular locations, all of them are subject to a high level of security.

 

In order for a data center to function optimally, certain conditions must be met, such as excellent air conditioning, air quality control, emergency and backup power supply, 24-hour surveillance, etc.


 Read the article

 

Source : Journal du Net


A page turns for the French sovereign cloud

on Wednesday, 07 August 2019 Posted in Archives Rezopole, Archives GrenoblIX, Archives LyonIX


Cloudwatt, Orange's online data hosting service, will be switched off on February 1st. Launched in 2012, it was one of the two heads of the French-style sovereign cloud, co-financed at a loss with public money. With Bercy once again calling for the creation of secure data centers to host sensitive government and corporate data, this failure could serve as a lesson.

 

At the origin of this project, called Andromeda, France wanted to invest 150 million euros in a shared server service that could cut costs for ministries and companies. But the French technology groups called upon to build it failed to reach an agreement, and the envelope was therefore split.

On the one hand, Cloudwatt was created by Orange and Thales, which added 150 million euros to the amount provided by the State. On the other, Numergy was launched by SFR and Bull with the same investment. Neither managed to find customers: two years after launch, Cloudwatt claimed only 2 million euros in revenue. Even though Numergy was doing better, with 6 million billed, these are crumbs compared with Amazon, Microsoft and IBM.

A few months later, Bercy stopped the spending, stating that it had paid out only half of the promised sums. Orange and SFR then bought back the State's shares, along with those of Thales and Bull respectively. Numergy and Cloudwatt, by then reduced to mere brands, have since been folded into the two telecom operators' offerings for large companies.

 

Today, the dominance of American players in the online IT market continues to raise concerns about the integrity of critical data. A recent report by MP Raphaël Gauvain criticizes the Cloud Act, an extraterritorial American law reminiscent of the Patriot Act and of surveillance programs.

The government is therefore expected to sign a strategic sector contract this fall to develop a "cloud of trust" ecosystem. Being French will not be enough to be deemed trustworthy, and some American or Chinese technologies used in French data centers will be difficult to defend.

"This time, we will not consider the nationality of the players but their ability to guarantee data integrity with regard to our laws, and strategic autonomy over our essential infrastructures and data," notes Jean-Noël de Galzain, president of Hexatrust, on the sector's strategic committee. In addition, the State is expected to commit to playing its role as a buyer.


 Read the article

 

Source : Les Echos


Heat wave: why French DCs are holding up

on Thursday, 01 August 2019 Posted in Archives Rezopole, Archives GrenoblIX, Archives LyonIX


Heat waves are not taken lightly by data center operators. In France, "we have gone from 40 degrees to 46 degrees in a few years. We have caught up with the specifications used in Spain," says Marie Chabanon, Technical Director of DATA4 Group.

 

To guard against any heat stroke, data centers' tolerance of high temperatures has been raised. "The great fear is the domino effect [...] If all or part of the cooling infrastructure has problems, it affects the rest of the equipment. And a refrigeration unit stopping is the worst thing that can happen to us, along with a complete power outage," adds Fabrice Coquio, Interxion's Managing Director. This risk is also tied to the quality of RTE's and Enedis' electricity distribution. "We must anticipate the risk of a power loss or incident," explains Marie Chabanon.

 

But data center operators have a trump card against this domino effect. "Data center electrical systems are built to run at 100% load. In practice, however, this is never the case. The consequence is that in the event of an extra load, such as higher cooling demand, we have unallocated power that we can draw on," explains Fabien Gautier of Equinix. This is called capacity redundancy.

 

All the more so since the densification of computing power per unit of floor space in recent years, driven by the spread of virtualization, has led to more consumption and more heat. "With 14 or 15 kVA racks, we create hot spots, which are more sensitive to heat waves," explains Fabien Gautier. Planning the layout of the IT architecture deployed in the rooms is therefore essential. "Our work is therefore the layout planning of the rooms. If they were filled on the fly, that can be a problem," he adds.

This involves, among other things, load balancing. "Our data centers are designed with redundancies and a 50% load rate. The backup machines can be used to provide additional power" in the event of a heat wave, says Marie Chabanon. Nevertheless, this must be anticipated. "We must ensure that backup systems are ready to operate, through maintenance and control actions on backup equipment."
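
A quick numerical sketch of the headroom arithmetic described above: a site designed for a 50% load rate keeps half of its installed capacity unallocated, and it is this reserve that absorbs a surge in cooling demand during a heat wave. The function and all figures below are hypothetical illustrations, not values from the article.

```python
# Minimal sketch (hypothetical figures): capacity redundancy as headroom.
def cooling_headroom_ok(installed_kw: float,
                        design_load_ratio: float,
                        current_it_load_kw: float,
                        heatwave_surge_kw: float) -> bool:
    """Return True if unallocated capacity covers the extra cooling demand."""
    allocated_kw = installed_kw * design_load_ratio    # e.g. a 50% load rate
    headroom_kw = installed_kw - allocated_kw          # the unallocated power
    assert current_it_load_kw <= allocated_kw, "site exceeds its design load"
    return heatwave_surge_kw <= headroom_kw

# A 10 MW site run at a 50% load rate keeps 5 MW in reserve,
# comfortably absorbing a hypothetical 1.5 MW cooling surge.
print(cooling_headroom_ok(10_000, 0.5, 4_200, 1_500))  # True
```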

 

Protecting data centers against heat also requires remedial systems. "We installed spray systems to wet the rooftop equipment with water that is not too cold," says Fabrice Coquio.

And to be ready for any eventuality in the early evening, the schedules of the technicians present on site have been adjusted. Customers are also warned so that they remain vigilant.

 

Recent advances in hardware robustness and data center design have made it possible to raise temperatures in the server rooms. "The idea is that the lower the PUE (Power Usage Effectiveness), the better the facility performs. Ten years ago, we built data centers where it was difficult to achieve a PUE of 1.6. Today we are at 1.2 and getting closer to 1, which represents a 20% saving, achieved by adjusting temperatures and exploiting the energy performance of new equipment," says Marie Chabanon. As a result, cooling now focuses on the machines themselves, using forced air. There is no longer any need to refrigerate entire rooms.
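
To make the PUE arithmetic concrete: PUE is total facility energy divided by the energy consumed by the IT equipment alone, so for a fixed IT load the facility's total consumption scales linearly with PUE. A minimal sketch using the figures quoted above (the helper function is ours, not from the article):

```python
# PUE = total facility energy / IT equipment energy.
def savings_from_pue(old_pue: float, new_pue: float) -> float:
    """Relative drop in total facility energy for the same IT load."""
    return 1 - new_pue / old_pue

print(round(savings_from_pue(1.6, 1.2), 3))  # 0.25 -> 25% less total energy
print(round(savings_from_pue(1.2, 1.0), 3))  # ~0.167, the order of the ~20% cited
```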

"We are seeing indoor temperature design evolve in line with the recommendations of ASHRAE (the American Society of Heating, Refrigerating and Air-Conditioning Engineers). The idea is to operate comfortably across much wider temperature ranges. We have gone from 20-22 degrees to 18-27 degrees," she adds. Since 2011, these standards have been raised: they allow air to be blown at 26 degrees at the front face of indoor equipment. "The humidity level was also modified [...] In 2008, it was between 40 and 60%. It is now up to 70%," says Fabrice Coquio.

 

This limits cooling costs without compromising the resilience of the installations, a critical point in hot weather.


 Read the article

 

Source : ZDNet


The Data Center Continuum

on Tuesday, 25 June 2019 Posted in Archives Rezopole, Archives GrenoblIX, Archives LyonIX


The visionary trend of the 2010s was to concentrate data center floor space massively in hyperscale DCs, ideally located in areas close to the Arctic Circle. At the time, only the question of systemic risk seemed capable of slowing this development.

 

But today the reality is no longer the same: a continuum model, which can be summarized in six levels, has replaced this vision of hyper-concentrated floor space.

  • Hyperscale Data Centers remain attractive for mass storage and non-transactional processing. Their objective is the lowest production cost, achieved by pooling large floor space where land and energy are cheap.
  • Hub Data Centers are mainly located, in Europe, in Frankfurt, London, Amsterdam and Paris. These areas concentrate large data centers and benefit from fast interconnection between them. They attract a disproportionate share of operators because interconnection outweighs the potential of the local market.
  • Regional Data Centers, located in all other major cities, address local economic potential, hosting cloud providers for companies or hosting providers that act as the first level of access to the hub DCs.
  • "5G" Data Centers will be located as close as possible to urban areas in order to meet the latency needs of consumer uses.
  • Micro-Data Centers will provide low latency where use is highly concentrated (a stadium, a factory).
  • Pico-Data Centers will address individual use, providing minimal latency and, above all, management of private data.
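
As a reading aid, the six levels can be restated as data, which makes the continuum's trade-off visible: each step down exchanges scale and production cost for proximity to the end use. The tier names and roles below are paraphrased from the list above; no latency figures are given because the article provides none.

```python
# The six-level continuum, paraphrased from the article's list.
DATA_CENTER_CONTINUUM = [
    ("hyperscale", "mass storage, non-transactional processing; cheap land and energy"),
    ("hub",        "Frankfurt/London/Amsterdam/Paris; interconnection comes first"),
    ("regional",   "other major cities; local market, first-level access to hubs"),
    ("5G",         "edge of urban areas; latency for consumer uses"),
    ("micro",      "a stadium or a factory; low latency for concentrated use"),
    ("pico",       "the individual; minimal latency, private-data management"),
]

for level, (name, role) in enumerate(DATA_CENTER_CONTINUUM, start=1):
    print(f"level {level}: {name:10s} {role}")
```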

 

Despite their different sizes, the first three levels follow the same design principles, except that hyperscale data centers often have a single user. They can therefore adopt more restrictive design choices than colocation facilities.

The last three levels belong to the Edge universe and aim to position the DC space as close as possible to usage. However, these levels have different design principles.

Micro and pico-data centers will be installed in an industrial fashion; the main issues will instead concern the physical protection and the maintenance and operation of these infrastructures.

"5G" data centers change the game. They have all the characteristics of a "small" DC but must be deployed in complex environments. Being located in urban areas, they are subject to numerous safety and standards-compliance constraints. The greatest complexity, however, lies in the lack of space to deploy the technical systems.


 Read the article

 

Source : Global Security Mag


Designing tomorrow's DCs today

on Wednesday, 27 February 2019 Posted in Archives Rezopole, Archives GrenoblIX, Archives LyonIX


How can IT infrastructure be built so that it remains sustainable and viable for the next 20 years? What are the important elements to consider during the design phase?

Although building data centers may seem easy, this is a rapidly evolving industry. Today's rooms are becoming denser, servers consume ever more energy and are getting heavier, modularity concepts shake up the market every month, and product ranges evolve rapidly to better meet users' needs...

This is why adaptability and modularity must be built into the solutions from the design phase: for example, choosing modular cooling and electrical solutions, making it possible to increase power and load during maintenance, and oversizing major equipment.
It can also be very useful to adopt new Agile working methods, since it is essential to stay flexible and adapt to changes that can durably affect the project.
Modularity is also essential during the design phase, especially if you choose an atypical location for your data center. However, legal or regulatory constraints may run counter to this modularity. They should therefore be addressed as early as possible, as they often involve lead times that cannot be shortened...


 Read the article

 

Source : Le Monde Informatique


Development of French DCs

on Wednesday, 20 February 2019 Posted in Archives Rezopole, Archives GrenoblIX, Archives LyonIX


Interviewed by LeMagIT, Olivier Micheli notes that French data centers are finally attracting international cloud players and that they are expanding geographically in order to reduce latency.

Olivier Micheli, who is both CEO of Data4 Group and President of the France Datacenter association, estimates that there are 200 large data centers in France, covering up to 10,000 m² each. The capital region hosts the largest number of them, because Paris is a European interconnection node.
There are also between 3,000 and 5,000 private computer rooms of varying size and power across the country.
Beyond companies' desire to keep control of their equipment, ever-lower latency is increasingly important to local economic activity and to the development of smart cities.
According to Olivier Micheli, the market is moving towards data centers whose size is proportional to the scale of the nearby economic activity.

After a slow period between 2012 and 2015, the French data center market has caught up. France is now in fourth place in Europe, tied with Ireland. There are several reasons for this: the opportunity for international companies to reach 67 million people from locally hosted IT resources, the geostrategic importance of Marseille and also the government's efforts to create favourable conditions for the development of these datacenters.
This finally allows France to align itself with the United Kingdom, Germany and the Netherlands.

Seventy percent of these data centers' customers are public cloud players such as Amazon AWS, but also software publishers such as Salesforce. User companies, for their part, expect a great deal of support.

The first challenge for data centers is, according to Olivier Micheli, connectivity: companies now want an off-premises computer room from which their data can be redistributed to users and Internet players.
The second is that of intelligent buildings and of reaching 100% renewable energy, for example by using free cooling.


 Read the article

 

Source : LeMagIT


Data centers: a hot topic for our data

on Thursday, 06 September 2018 Posted in Archives Rezopole, Archives GrenoblIX, Archives LyonIX


Located south of Paris, the Data4 campus hosts the data of major CAC 40 companies. Among the nine data centers spread over 111 hectares is DC05. This new-generation data center, in service since the end of 2017, has a single facade, clad with blocks that draw in outside air. "Once filtered, this air at ambient temperature is used to cool the core of the building, where the temperature of some components can easily reach 60°C. This free cooling system replaces the chilled-water reserves and the large mechanical cooling units used in older data centers," explains Jérôme Totel, site manager.

Free cooling is in vogue and increasingly widespread, mainly for economic reasons. Data centers consume nearly 10% of the world's electricity, "and nearly half of this energy is used to run the cooling systems," explains Guilhem Cottet, General Delegate of France Datacenter. Today, the temperature range of the air sent into server rooms is much higher than it was ten years ago: international guidelines now recommend between 20 and 27°C.

However, "cooling a data center is a real science," says Jean-Michel Rodriguez, Chief Technology Officer at IBM. Ambient temperatures and humidity in northern European countries naturally ensure good cooling all year round, making free cooling easier to use. This is not the case in more Mediterranean climates, where the system can only operate for part of the year. Hosting companies there often rely on more modest installations, such as simple air conditioning coupled with "cold aisles". "Each year brings its share of new products. We are in a state of constant optimization," confirms François Salomon, free cooling specialist at Schneider.
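
A minimal sketch of the climate effect just described: free cooling is usable only during the hours when outside air is at or below the maximum supply temperature the operator allows, so its availability is essentially a climate statistic. The 24°C threshold and both temperature profiles are hypothetical illustrations, not figures from the article.

```python
def free_cooling_fraction(hourly_temps_c: list[float],
                          max_supply_c: float = 24.0) -> float:
    """Fraction of the period cool enough to use outside air directly."""
    usable = sum(1 for t in hourly_temps_c if t <= max_supply_c)
    return usable / len(hourly_temps_c)

# Toy monthly-mean profiles: a Nordic-like climate vs a Mediterranean-like one.
nordic = [4, 8, 12, 15, 18, 20, 22, 19, 14, 9, 5, 3]
mediterranean = [10, 12, 15, 18, 23, 28, 31, 31, 26, 21, 15, 11]
print(free_cooling_fraction(nordic))         # 1.0 -> free cooling all year
print(free_cooling_fraction(mediterranean))  # ~0.67 -> only part of the year
```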

 

This is the case, for example, with Facebook, which last June unveiled a system combining free cooling and water, or Lenovo, which relies on liquid cooling. 2CRSI, a Strasbourg-based company, offers a shared cooling system. Many researchers are also working on algorithms that can reduce server consumption. Although the energy-efficiency indicator for data centers has improved significantly in recent years, some experts are sounding the alarm. "Computing and energy needs will increase exponentially. To develop artificial intelligence, optimizing cooling will not be enough. So much so that we will have to rethink the design of our data centers," warns Pierre Perrot of CryoConcept.

 

This observation has led engineers to develop radically different technologies, such as immersion. Asperitas now immerses its servers in oil. "It is a non-conductive and non-corrosive liquid that absorbs up to 1,500 times more heat than air," says Pierre Batsch, head of the company's development in France. There is thus no need for air conditioning, false ceilings, cold aisles, and so on. While Alibaba has already announced a similar project for some of its servers, other players, such as cryptocurrency-mining farms and the financial world, are also interested in this solution.
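
The "1,500 times more heat than air" figure can be sanity-checked with a back-of-the-envelope comparison of volumetric heat capacity (density times specific heat), which governs how much heat a given volume of fluid soaks up per degree. The property values below are typical textbook figures, not data from the article.

```python
def volumetric_heat_capacity(density_kg_m3: float, cp_kj_kg_k: float) -> float:
    """kJ needed to warm one cubic metre of the fluid by 1 kelvin."""
    return density_kg_m3 * cp_kj_kg_k

oil = volumetric_heat_capacity(870.0, 1.9)    # typical mineral oil
air = volumetric_heat_capacity(1.2, 1.005)    # air at room conditions
print(round(oil / air))  # ~1370: the same order as the quoted 1,500x
```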

Microsoft has decided to immerse part of its infrastructure in the ocean. "Water is 800 times denser than air. When you touch it, you are in contact with many molecules, which means you can transfer heat faster and more efficiently," explains Ben Cutler, project manager at Microsoft. But the initiative has drawn some criticism: "Will the heat released have an impact on the marine ecosystem? Wouldn't it be better to reuse it?" asks one expert. Such remarks will not discourage Microsoft, which has not finished making waves.


Read the article

 

Source : L'Express
