A visit to Facebook's data center, where its app is tested on over 2,000 phones at the same time

The photos below were taken at Facebook's data center in Prineville, Oregon. Besides storing data and running servers for Facebook's services, this data center is also used to test the company's apps on many different mobile devices. Facebook builds its own racks, shelving, and automation systems to make sure its apps run well across a wide range of hardware. The site has three large buildings, one of which is still under construction. Let's take the tour.

Most data centers are anonymous: you can't tell which company they belong to, due to concerns about security and data theft. Not Facebook's. One look at the sign and you know exactly whose facility this is.

[Photo: Facebook data center location]

One of the reasons Facebook chose to build this data center in Oregon's high desert is to take advantage of the strong winds to cool its servers. In the photo above, the white tanks hold water used to cool the air when it gets too hot. Facebook draws its electricity primarily from the local grid, supplemented by some solar panels.

[Photo: Facebook and the local community]

In a small town like Prineville, Facebook is one of the largest employers. Facebook has also helped improve the local electricity, water, and other infrastructure. Pictured is a banner of thank-you notes from a nearby school for Facebook's donation.

[Photo: disc erase room]

There is one special room that even many data center employees cannot enter: the disc erase room. This is where Facebook wipes user data from drives before reusing them or sending them off for destruction.

[Photo: mobile testing sled]

Facebook also uses this data center to test its apps on many different devices. In the beginning, Facebook engineers tested each device one at a time at their desks. To speed things up, Facebook built a metal tray, a "sled," that held multiple phones at once, but the metal could interfere with the Wi-Fi signal.

[Photo: mobile testing gondola]

So the company switched to plastic shelving, with each "gondola" holding about 100 phones. But the cables were still a tangled mess.

[Photo: mobile testing slatwall]

Next, Facebook tried a slatwall for mounting the phones. Each wall held about 240 devices at once. But to test more than 2,000 smartphones, Facebook would have needed nine rooms like this at its headquarters, so the company decided to move the operation to this data center.

[Photo: mobile testing rack]

Finally, the company designed a custom rack called the Isolation Chamber. These racks are shielded from one another so that the Wi-Fi traffic of one rack doesn't interfere with its neighbors; Facebook uses insulation, copper wiring, and power-line filters to build them. Each rack holds 32 phones, plus a computer that automates the whole cycle: installing the app, running the tests, and uninstalling the app. For iPhones, Facebook uses 8 Mac Minis per rack; for Android, 4 Open Compute Project servers. Everything can be controlled remotely, so Facebook employees around the world can use the lab.
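Facebook's actual test harness is internal, but for the Android side the install, test, uninstall cycle described above can be pictured with standard adb commands. Here is a minimal sketch, assuming adb is on the PATH; the APK name, package name, and test runner are hypothetical stand-ins, not Facebook's:

```python
import subprocess

APK = "facebook-test-build.apk"   # hypothetical test build
PACKAGE = "com.example.fbapp"     # hypothetical package name

def attached_devices():
    """Return the serial numbers of every device adb can see."""
    out = subprocess.run(["adb", "devices"],
                         capture_output=True, text=True, check=True).stdout
    return [line.split("\t")[0]
            for line in out.splitlines()[1:]
            if line.endswith("\tdevice")]

def install_test_uninstall(serial):
    """One automation cycle for a single phone on the rack."""
    adb = ["adb", "-s", serial]
    subprocess.run(adb + ["install", "-r", APK], check=True)
    subprocess.run(adb + ["shell", "am", "instrument", "-w",
                          f"{PACKAGE}.test/androidx.test.runner.AndroidJUnitRunner"],
                   check=True)
    subprocess.run(adb + ["uninstall", PACKAGE], check=True)

for serial in attached_devices():
    install_test_uninstall(serial)
```

In the real lab this loop would run in parallel across all 32 phones in a rack and report results back to engineers remotely; the sequential loop above just shows the shape of one cycle.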

[Photo: iPhone 5c test rack]

On top of the rack there is a camera so Facebook engineers can monitor it from afar. This rack holds iPhone 5c units; you can see the 8 Mac Minis at the bottom.

[Photo: mobile testing device lab]

There are 60 racks like this in the data center, which works out to 60 × 32 = 1,920 phones used to test builds before they are uploaded to the app stores. Facebook is working to raise that to 64 phones per rack, and it plans to open up the design for others to use. One ongoing challenge is how to fit ever-bigger phones.

[Photo: Open Compute server rack]

On to another area. Of course, no data center visit would be complete without the endless rows of servers. These racks hold servers built to Open Compute Project specifications, an open design that anyone can manufacture and deploy. The servers are cooled by airflow, and the waste heat is drawn out of the building through a special hot aisle. That corridor is, of course, very hot.

[Photo: machine learning rack]

Besides the Open Compute Project servers, this data center also houses specialized servers for machine learning workloads. Inside each of these boxes, 8 NVIDIA Tesla M40 GPUs run at the same time. Because they use off-the-shelf GPUs, the boxes are larger than the standard Open Compute servers.

[Photo: Big Sur server close-up]

A close-up of the Big Sur server for a better look.
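The article doesn't say what software runs on these boxes, but as a rough illustration of how a training job can fan one batch across the 8 GPUs in a single Big Sur unit, here is a minimal PyTorch DataParallel sketch. The model and sizes are invented for the example, and it assumes a machine with CUDA GPUs:

```python
import torch
import torch.nn as nn

# On a Big Sur box this should report 8 GPUs.
print("visible GPUs:", torch.cuda.device_count())

# A toy model; DataParallel splits each input batch across all visible GPUs.
model = nn.Sequential(nn.Linear(512, 256), nn.ReLU(), nn.Linear(256, 10))
model = nn.DataParallel(model.cuda())

batch = torch.randn(256, 512).cuda()  # with 8 GPUs, each one sees 32 rows
out = model(batch)
print(out.shape)  # torch.Size([256, 10])
```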

[Photo: air intake room]

On the second floor is the data center's main cooling system. The grilles on the right draw in outside air, which is then pushed through the filters on the left to remove dust and other particles.

[Photo: water filtration system]

Once filtered, the air passes into an evaporative cooling system. If the outdoor temperature is too cold, Facebook mixes in a little warm exhaust air to reach the right operating temperature for the servers.
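The article doesn't describe the control logic, but that mixing step amounts to a simple linear blend of cold outdoor air and warm exhaust air. A toy sketch with made-up setpoints (not Facebook's real numbers):

```python
def exhaust_mix_fraction(outdoor_c, target_c=24.0, exhaust_c=35.0):
    """
    Return the fraction of warm exhaust air to recirculate so the
    supply air hits the target temperature, using linear mixing:
    target = f * exhaust + (1 - f) * outdoor, solved for f.
    """
    if outdoor_c >= target_c:
        return 0.0  # outdoor air is warm enough; rely on evaporative cooling
    f = (target_c - outdoor_c) / (exhaust_c - outdoor_c)
    return min(max(f, 0.0), 1.0)

print(exhaust_mix_fraction(5.0))   # cold day: recirculate ~63% exhaust air
print(exhaust_mix_fraction(30.0))  # hot day: 0.0, use outdoor air only
```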

[Photo: suction fans]

Finally, fans draw the temperature-adjusted air down to each server rack.

[Photo: exhaust air leaving the building]

Once its job is done, the air that has cooled the servers is pushed out of the building by large fans like these.

[Photo: cold storage]

This is the "cold storage" area. When Facebook notices that you rarely look at a photo or video, it moves the file into cold storage rather than deleting it. Cold storage servers run slower than the "hot storage" that holds your frequently used data, which saves money. When you pull up an old photo, it still loads, just a little more slowly.
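Facebook's exact policy isn't public, but the tiering decision described above boils down to an access-recency rule. A toy sketch with an invented 90-day threshold:

```python
import time

HOT, COLD = "hot", "cold"
COLD_AFTER_DAYS = 90  # illustrative threshold, not Facebook's actual policy

def tier_for(last_viewed_ts, now=None):
    """Pick a storage tier for a photo based on when it was last viewed."""
    now = now or time.time()
    idle_days = (now - last_viewed_ts) / 86400
    return COLD if idle_days > COLD_AFTER_DAYS else HOT

# A photo untouched for a year migrates to cold storage; a recent one stays hot.
print(tier_for(time.time() - 365 * 86400))  # cold
print(tier_for(time.time() - 3 * 86400))    # hot
```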

[Photo: cold storage rack]

A rack like this can hold 2 petabytes of data, that is, more than 2,000 TB, enough for hundreds of millions of typical photos.
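A quick back-of-envelope check, assuming an average photo size of about 2.5 MB (my assumption, not a figure from the article):

```python
PETABYTE = 10 ** 15
avg_photo_bytes = 2.5e6               # assumed average photo size
photos = 2 * PETABYTE / avg_photo_bytes
print(f"{photos:,.0f}")               # 800,000,000 -> hundreds of millions
```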

ITZone via digimarkvn
