
Visit Facebook's data center, where the company tests its app on over 2,000 phones at the same time

The photos below were taken at Facebook's data center in Prineville, Oregon. Besides storing data and running servers for Facebook's services, this data center is also used to test applications on many different mobile devices. Facebook built its own testing area, its own shelf design, and its own automation system to make sure its app runs well on a wide range of devices. The data center consists of three large buildings, one of which is still under construction. Let's take a look around.

Most companies keep their data centers anonymous, so you can't tell whose facility it is, out of concern for security or data theft. Not Facebook: one look at the sign and you immediately know who it belongs to.

One of the reasons Facebook chose to build this data center in the high desert of Oregon is to take advantage of the strong winds to cool the servers. In the picture above, the white water tanks are used to cool the air when it gets too hot. Facebook draws its electricity primarily from the local grid, but it also has some solar panels to supplement it.

In a small town like Prineville, Facebook is one of the biggest employers. Facebook has also helped improve the local electricity, water, and other infrastructure. Pictured is a banner of handwritten thank-you notes from a nearby school for Facebook's donation.

There is a special room that even many data center employees cannot enter: the data-wiping room. This is where Facebook erases user data before drives are reused or sent off for destruction.

Facebook also uses this data center as a place to test its app on many different devices. In the beginning, Facebook engineers tested each device one by one at their desks. To speed things up, Facebook built a metal tray that held multiple phones at once, but the metal could interfere with the Wi-Fi signal.

So the company switched to plastic shelving. Each shelf held about 100 phones, but the cables were still a tangled mess.

Next, Facebook tried mounting the phones on a wall. One such wall held about 240 devices at a time. But to test more than 2,000 smartphones, Facebook would have needed nine rooms like this at its headquarters, so the company decided to move the operation to this data center.

Finally, the company designed a dedicated enclosure called the Isolation Chamber. Each chamber is shielded from the others so that its Wi-Fi connections don't interfere with neighboring chambers; Facebook uses insulation, copper shielding, and power-line filters to build them. Each chamber holds 32 phones, plus computers that automate the whole workflow: installing the app, running the tests, and uninstalling it again. For iPhones, Facebook uses 8 Mac Minis per chamber, while for Android it uses 4 Open Compute Project servers. Everything can be controlled remotely, so Facebook engineers around the world can use it.
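The article doesn't show Facebook's actual tooling, but the install-test-uninstall cycle it describes can be sketched roughly like this for the Android side, assuming a host machine drives each phone over adb. The device serials, APK filename, and test runner here are illustrative assumptions, not details from the article:

```python
import subprocess

# Hypothetical device serials and build; a real setup would read these
# from the chamber's inventory rather than hard-coding them.
DEVICES = ["PHONE01", "PHONE02"]
APK = "facebook-test-build.apk"
PACKAGE = "com.facebook.katana"

def adb(serial, *args):
    """Run an adb command against one phone and return its output."""
    result = subprocess.run(["adb", "-s", serial, *args],
                            capture_output=True, text=True, check=True)
    return result.stdout

for serial in DEVICES:
    # 1. Install the build under test (-r reinstalls over an old copy).
    adb(serial, "install", "-r", APK)
    # 2. Run the instrumentation tests (runner name is an assumption).
    adb(serial, "shell", "am", "instrument", "-w",
        f"{PACKAGE}.test/androidx.test.runner.AndroidJUnitRunner")
    # 3. Uninstall so the phone is clean for the next build.
    adb(serial, "uninstall", PACKAGE)
```

In practice each chamber's phones would be driven in parallel and the results collected centrally, which is what makes the remote, fully automated setup worthwhile.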

On top of each chamber there's a camera so Facebook engineers can monitor it remotely. This one holds iPhone 5c units; you can see the 8 Mac Minis below.

There are 60 of these chambers in the data center, for a total of 1,920 phones used to test the app before it is pushed to the app stores. Facebook is working to increase the number of phones per chamber to 64, and plans to open up the design for anyone to use. One remaining challenge is how to fit ever-larger phones.

Moving on to another area. Of course, no data center visit would be complete without the endless rows of server racks. These racks hold servers built to Open Compute Project specifications, an open design that anyone can manufacture and deploy. The servers are cooled by airflow, and the hot exhaust is drawn out of the building through a dedicated hot aisle. This corridor, naturally, is very hot.

Besides the Open Compute Project servers, this data center also has specialized servers for machine learning tasks. Inside each box like this are 8 NVIDIA Tesla M40 GPUs running in parallel. Because it uses off-the-shelf GPUs, it is larger than the standard Open Compute servers.

A close-up of the Big Sur server for a better look.

On the second floor is the data center's main cooling system. The grilles on the right draw in outside air, which is then pushed through the filters on the left to remove dust and other particles.

Once filtered, the air passes through an evaporative cooling system. If the outdoor air is too cold, Facebook mixes in some warm exhaust air to reach the right temperature for the servers.

Finally, fans push the temperature-adjusted air down to the server racks.

After it has done its job cooling the servers, the air is pushed out of the building by big fans like these.

This is the "cold storage" area. When Facebook detects that you rarely view a photo or video, it moves it to cold storage rather than deleting it. Cold storage servers are slower than the ones holding the data you access regularly (hot storage), which keeps costs down. When you load an old photo, it still shows up, just a little more slowly.
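The article doesn't describe Facebook's actual policy, but the idea of demoting rarely-viewed items can be sketched like this. The 90-day threshold and the tier names are assumptions purely for illustration:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical threshold: items not viewed for 90 days move to cold storage.
COLD_AFTER = timedelta(days=90)

def choose_tier(last_viewed: datetime, now: datetime) -> str:
    """Decide which storage tier a photo or video belongs on."""
    return "cold" if now - last_viewed > COLD_AFTER else "hot"

now = datetime.now(timezone.utc)
# A photo last opened a year ago would be demoted to cold storage.
print(choose_tier(now - timedelta(days=365), now))  # -> cold
```

The trade-off is exactly the one the article describes: cold-tier hardware is cheaper and slower, so old photos still load, just with a small extra delay.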

A rack like this can hold 2 petabytes of data, i.e., more than 2,000 TB, or more than ten million photos.

ITZone via digimarkvn
