
Cloud Case: Facebook, eBay, Google.

Facebook video:

 

  • One engineer for every one million users.

  • Facebook has a massive number of users, and the challenge is how to handle that quantity of users and how to store their data.

  • Data centers in Prineville, Oregon, the size of three football fields, housing massive memory banks and servers.

  • How does it work? A user's request travels over the open internet to the data center, where the servers compile all the profile information and send it back over the open internet.

  • Some people believe the internet exists only out in space, in the cloud, but they are wrong: the entire internet is supported by physical servers. It is a physical internet built with miles of fiber.

  • The data center has enough energy to operate 24/7, and it also uses generators in case something goes wrong, in order to avoid a potential disaster and keep the system running in an emergency.

  • The massive quantity of servers creates heat, and the operators have to control it with constant cooling; otherwise the servers quickly overheat.

  • In Oregon, a massive seven-room rooftop system controls the heat with natural air conditioning, taking in cool Oregon air and mixing it with warm air to regulate the temperature.

  • More fans will soon be added. Around 600 million fans log in every day, roughly twice the population of the U.S.

  • And it will continue increasing: they add thousands of servers every day.

  • "Petabytes" is the most common word used there.

  • Each server rack contains 500 terabytes.

  • Servers need maintenance every day, performed by technicians.

 

 

GOOGLE:

 

  • They have been working to reduce the energy used by their servers by half.

  • The ICT sector includes mobile phones, computers, and cell phone towers.

  • The ICT sector accounts for about 2 percent of global greenhouse gas emissions, and data centers are responsible for 15 percent of that.

  • Results of best practices: millions of dollars in energy savings.
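Combining the two percentages above gives the data-center share of global emissions. A quick sketch of the arithmetic (the 2% and 15% figures are the ones quoted in these notes):

```python
# Data centers emit 15% of the ICT sector's share,
# and the ICT sector emits ~2% of global greenhouse gases.
ict_share_of_global = 0.02
datacenter_share_of_ict = 0.15

# So data centers account for about 0.3% of global emissions.
datacenter_share_of_global = ict_share_of_global * datacenter_share_of_ict
print(f"{datacenter_share_of_global:.1%}")  # 0.3%
```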

Best practices:

  1. Manage the efficiency of the data center by having instrumentation in place to measure PUE, or power usage effectiveness: the ratio of total facility energy to IT equipment energy within your data center.

  2. To control this, the most important thing is to eliminate the mixing of cold air flow and hot air flow.

  3. Model the air flows inside the server racks and around the servers to improve the air flow in the data center.
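Best practice 1 is just a ratio. A minimal sketch of the PUE calculation (the function name and the sample energy figures are illustrative, not from the video):

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power Usage Effectiveness: total facility energy divided by
    IT equipment energy. An ideal data center approaches 1.0."""
    if it_equipment_kwh <= 0:
        raise ValueError("IT equipment energy must be positive")
    return total_facility_kwh / it_equipment_kwh

# Example: the facility draws 1500 kWh while IT gear consumes 1200 kWh,
# so 300 kWh goes to overhead (cooling, power conversion, lighting).
print(pue(1500, 1200))  # 1.25
```

The closer the result is to 1.0, the less energy is being spent on overhead rather than on computing.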

 

Notes on the reading:

 

Questions:

 

  1. Why does Facebook’s data center specialist argue that “The Internet is not a cloud?”

Because all information about profiles, messages, new fans, new profiles, and all data related to Facebook is supported by physical servers. The actions of users on the open internet travel in milliseconds directly to Facebook's physical servers, which are located in Oregon.


2. What are some of the techniques Facebook uses to cool its data centers?

Facebook cools its data center with a giant rooftop natural air-conditioning system that takes in cold outside air, filters it, mixes it with warm air to balance the temperature, and then directs it to cool the servers.

 

3. Describe the five methods recommended by Google for reducing power consumption.

  1. PUE:

- Measure the PUE (keeping it as close to 1 as possible), or power usage effectiveness: the ratio of total facility energy to IT equipment energy within your data center.

- Measure PUE as often as you can, every second if possible; the more often you measure, the more meaningful the results. Without constant measurement, the results will not be realistic and won't be an actual measure of how the data center is operating.

 

2.  Air Flow

 

- Eliminating the mixing of cold air flow and hot air flow in the racks and between the servers.

- Modeling the air flows inside the racks and between the servers, looking to improve the air flow.

- Using computational fluid dynamics, which delivers alternatives for improving the air flow in the data center.

- Using thermal modeling to identify hot spots. It helps determine which alternatives or actions can improve the air flow.

- After determining the best actions to take, they spent $25,000 on materials and parts, and those $25,000 saved $65,000 in energy.
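The $25,000-for-$65,000 trade above is a simple payback calculation. A sketch using the figures from the notes (the variable names are illustrative):

```python
spent = 25_000  # dollars spent on materials and parts
saved = 65_000  # dollars saved in energy

net_benefit = saved - spent        # dollars gained after costs
return_per_dollar = saved / spent  # dollars saved per dollar spent

print(net_benefit, round(return_per_dollar, 1))  # 40000 2.6
```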

 

3. Adjust the thermostat:

  • The temperature in Google data centers is kept between 72 and 80 degrees Fahrenheit, saving thousands of dollars per year.


4. Utilize free cooling:


  • Free cooling means using ambient temperatures outside the data center to provide cooling without operating heavy, energy-consuming equipment like chillers.

 

5. Optimizing power distribution:

  • Pulling power in from the electrical grid and then converting it to the voltages needed by all the components of the data center.

  • There are many conversion stages, but reducing those stages saves money and improves efficiency, as does making each conversion more efficient.

  • One of the big losses is the UPS, or uninterruptible power supply.

  • Going from three conversion stages to one, using a direct connection that brings AC and DC right into the server components.
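The point about conversion stages compounds multiplicatively: every stage loses a fraction of the power passing through it. A sketch under assumed per-stage efficiencies (the 95% figure is illustrative, not from the video):

```python
def delivered_fraction(stage_efficiencies):
    """Fraction of grid power that survives a chain of
    power-conversion stages (each efficiency between 0 and 1)."""
    fraction = 1.0
    for eff in stage_efficiencies:
        fraction *= eff
    return fraction

# Three 95%-efficient stages lose about 14% of the power overall;
# a single 95%-efficient stage loses only 5%.
three_stage = delivered_fraction([0.95, 0.95, 0.95])
one_stage = delivered_fraction([0.95])
print(round(three_stage, 3), one_stage)  # 0.857 0.95
```

This is why collapsing the chain down to one conversion stage, as the notes describe, saves energy even when each individual stage is already fairly efficient.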

 

4. Based on the Google video, how much of the world’s global greenhouse gases are the result of computing?

- It is around 2 percent, of which data centers are responsible for 15 percent.


5.  What are some of the benefits of using Dell’s Triton water cooling technology?

 

Triton cooling technology helps companies remove heat from servers in a different way: more expensive, but more efficient.

 

In the case of eBay, the company brought in Triton because it wanted to reduce the heat. Triton cooling works by converting from air cooling to liquid cooling with water, which is more expensive, but also more effective at removing heat.


© 2023 by Apelsin Group Proudly created with Wix.com
