Nearly 50 years have passed since someone at ŠKODA pressed a button to start the automaker’s first computer. That was in 1969, when the first server was installed in the basement of a prefabricated building. ŠKODA’s current Data Centre boasts incomparably greater computing capacity, and these days the company could not produce, sell or repair a single vehicle without its help. There is scarcely a company department that does not use its services.

Petr Rešl
Head of IT for Production & Logistics

“The Data Centre is made up of specialized server rooms whose role is to ensure that all information systems and technologies operate smoothly, stably and unaffected by ambient influences,” says Petr Rešl, Head of IT for Production & Logistics. “The servers perform calculations for Technical Development, manage employees’ email communications, and store data on all vehicles manufactured by the brand, down to such details as the tightening torques for key screws.”

The Centre is laid out like a huge construction kit in which individual servers and memories are concentrated in sections that are integrated to constitute a single supercomputer. One special room is used as a data storage space, another as a back-up energy source. The automaker is growing fast, and so is the Data Centre. Besides cars, digital services and apps will have an important place in ŠKODA’s future product portfolio. In recognition of this, the Data Centre is already being expanded with an additional 900 m² of IT space, due to enter operation this autumn, and plans for further expansion are already in place.

Take a look around the data centre, and delve into its depths!

Each of the two data rooms has 96 standard racks (cabinets that house the servers), and each room has two independent power sources. Together they currently contain nearly 5,000 servers.

A fully automated robotic library is used for backing up all centrally stored data.

The rooms are cooled with air brought in through a double floor. Several interior cooling units are deployed around the data rooms, each with a cooling power of 150 kW, comparable to the capacity needed to cool about 50 offices. Key fire-protection measures include a built-in extinguishing system that, in case of alarm, releases an environmentally friendly fire-extinguishing gas.
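The comparison between one cooling unit and 50 offices can be reproduced with simple arithmetic. This sketch assumes a per-office cooling load of about 3 kW, a figure chosen to match the article's comparison rather than an official ŠKODA number:

```python
# Rough sanity check of the cooling comparison quoted above.
# The 3 kW per-office load is an assumption, not a figure from the article.

unit_cooling_power_kw = 150   # cooling power of one interior unit (from the article)
office_cooling_load_kw = 3    # assumed cooling demand of a single office

offices_equivalent = unit_cooling_power_kw / office_cooling_load_kw
print(f"One cooling unit covers roughly {offices_equivalent:.0f} offices")
```

Under that assumption, the calculation lands on the article's figure of about 50 offices per unit.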

The supercomputer that performs calculations for Technical Development, for instance, links numerous high-performance computers into a single unit, with the individual parts communicating over high-speed networks.

In addition to the servers and memories, more than 120 kilometres of electric cables tie the data centre together and supply it with power.

The roof of the Data Centre is fitted with three hybrid coolers that discharge heat from the cooling machine room. The refrigerant is an antifreeze glycol mixture that remains functional even at low winter temperatures. At high summer temperatures, the Centre uses a sprinkler system that generates water mist around the hybrid coolers.

The Data Centre’s daily power consumption is about 31 MWh, an amount of energy that would keep an ordinary family house going for 2,000 days. The Centre has its own back-up system in case of a power outage. The ground-floor premises are equipped with diesel generators for cooling purposes plus a dynamic uninterruptible power supply system (DUPS) as a back-up for the server rooms. Diesel tanks, from which these back-up sources are fed, ensure uninterrupted operation for about 24 hours. Each back-up source consumes about 400 litres of diesel per hour.
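The figures above can be cross-checked with back-of-the-envelope arithmetic. The implied household consumption is derived purely from the article's own numbers, not from an official source:

```python
# Sanity check of the energy and fuel figures quoted in the article.

daily_consumption_mwh = 31     # Data Centre consumption per day (from the article)
house_days_equivalent = 2000   # article: the same energy runs a family house this long

# Household daily consumption implied by the comparison (kWh per day)
house_kwh_per_day = daily_consumption_mwh * 1000 / house_days_equivalent
print(f"Implied household consumption: {house_kwh_per_day} kWh/day")

diesel_l_per_hour = 400        # consumption of one back-up source (from the article)
runtime_hours = 24             # guaranteed uninterrupted operation (from the article)

# Fuel one back-up source burns over the guaranteed runtime (litres)
fuel_needed_l = diesel_l_per_hour * runtime_hours
print(f"Diesel burned by one back-up source over 24 h: {fuel_needed_l} l")
```

The implied household figure of 15.5 kWh per day is a plausible average for a family house, and the diesel tanks for each back-up source must hold at least 9,600 litres to cover the stated 24 hours.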