Siemens PLM Software
Plano, TX

The use of lithium-ion (Li-ion) batteries has made the electric vehicle (EV) a reality, and widespread electric mobility appears to be a not-so-distant prospect. However, there has been more than one incident of Li-ion batteries in electric vehicles catching fire or exploding due to either faulty thermal management systems or abuse. All this underscores the vital importance of finding new methods for effective, accurate design of the thermal management systems (TMS) that control temperature and optimize the performance of Li-ion batteries. This need is critical for the global automotive industry, where large Li-ion battery packs are used in EVs and hybrid electric vehicles (HEVs) — both enjoying growing consumer demand year after year. Battery suppliers need to create packs that are compact, efficient, and economical without compromising safety.

Figure 1. Geometry of the pack and the thermal management system.

A properly designed TMS is key to meeting these goals, as both high performance and long battery pack life can be achieved together only when battery temperature is maintained within a narrow thermal range near room temperature, which is the optimal thermal operating condition for this technology. The challenge is to design an efficient, effective TMS that maintains battery temperature within this range across widely varying operating conditions and demands on the battery pack.

Designing a safe, efficient TMS has proven difficult. Temperature variation in a single battery cell can significantly affect performance of the entire pack. Another risk is thermal runaway, which occurs when excessive internal heat escalates until the battery shuts down or explodes. Protection from, and avoidance of, thermal runaway conditions is another critical function of the TMS.

Samsung R&D Institute — in collaboration with Samsung Advanced Institute of Technology, Korea — developed a novel, liquid-coolant-based TMS for large Li-ion battery packs to address these challenges. They constructed a coupled 3D electrochemical/thermal model of the proposed battery pack, then used the model to evaluate the effects of varying operating conditions such as coolant flow rate and discharge current on the battery pack's temperature. The simulation revealed that the factor with the greatest impact on the pack's thermal performance was contact resistance.

From this numerical solution, a simple temperature correlation was devised for predicting the temperature of every individual cell given the temperature of just one cell, and was validated through experiment. Such coefficients have great potential to help engineers reduce the number and complexity of thermal sensors required in the large Li-ion battery packs used in EVs.

Considering the three-dimensional nature of the flow around the cells in a battery pack, and the spatial variation in heat generation, simulation of battery packs using computational fluid dynamics (CFD) methodology has emerged as an effective design and optimization tool for thermal management problems. For the large battery packs operating at high discharge rates typically used in EVs and HEVs, CFD studies have shown that liquid cooling is more effective than air cooling, enabling the design of more compact and efficient batteries. Although TMSs based on liquid cooling are already widely used in Li-ion battery-powered vehicles, these TMSs still need to be made safer. Coolants commonly used in these systems, such as Dexcool, are often flammable, and thus must never be allowed to come near the electrical connections in the battery cell, even in cases of leakage. This last requirement poses another TMS design challenge due to the penalty it incurs in heat transfer efficiency.

Pack Geometry and Experimental Setup

The Li-ion battery pack utilized commercially available 18650-format Li-NCA/C cells. Conduction elements made of highly conductive metal transferred heat from the cylindrical cells to the coolant channel, and finally to the coolant liquid (in this case, water). A test pack of 30 cells was fabricated, with six cells in series and five cells in parallel, as shown in Figure 1. Aluminum conduction elements wrapped around the cells kept the coolant and cells apart, even in case of leakage, thereby enhancing safety without sacrificing heat transfer efficiency. The coolant flowed through a circular channel 9 millimeters in diameter inside a rectangular pipe.

The 3D CFD Model

A complete characterization of heat generation could be obtained only by constructing a 3D CFD-based electrochemical model of the battery that could be validated against experimental results, then used to simulate and evaluate performance of the TMS under various operating conditions. This project used STAR-CCM+® software from Siemens PLM Software to simulate flow and conjugate heat transfer, while electrochemical input data was obtained from Battery Design Studio™ software. This combination was used to simulate performance of the complete battery pack. Although simulating this complete electrochemical model can be computationally expensive, it yields a complete, comprehensive picture of thermal interactions under various operating conditions.

Figure 2. Locations of temperature measurements (T1 and T2) in the pack.

The 3D TMS model successfully computed the performance of the representative battery pack. It was found that the average temperature difference between the hottest and coldest cells was only 0.5 K. Observing a clear pattern in the temperature rise, the innovators realized that a properly defined temperature coefficient could be used to predict the temperature of other cells based on the temperature of just one cell. This reduces the total number of thermal sensors needed throughout the battery, greatly simplifying the design of the TMS. By extension, a model of a single battery module with a single temperature sensor can be used to effectively predict the temperature of every individual cell, resulting in a smaller CFD model and quicker results. Locations of model temperature measurements are shown in Figure 2.

Figure 3. Comparison of measured and predicted temperature at point T2 at 0.9C discharge.

Only the first parallel-connected branch (consisting of five cells) and the first series-connected branch (consisting of six cells) were used, in two separate computational models, to incorporate the complete electrochemical-thermal model. Because the heat generation and transfer mechanisms were the same for all branches, the temperature distribution in the first parallel branch could be used to predict the temperature distribution in all other branches. Therefore, temperature information from the series-connected cells could be combined with the temperature of the first parallel-connected branch to predict the temperature of the entire pack. Using temperature coefficients from these results and the input temperature of the first cell (T1 in Figure 2), the temperature of the last cell (T2 in Figure 2) was predicted with high accuracy, as shown in Figure 3.
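The coefficient-based prediction scheme described above can be illustrated with a short sketch. The function name and all coefficient values below are hypothetical placeholders for illustration; the article does not publish the actual coefficients, which in practice would be calibrated from the validated CFD solution.

```python
# Sketch of the temperature-coefficient idea: predict every cell's
# temperature from a single measured reference cell. The coefficients
# here are illustrative placeholders, not values from the study.

def predict_cell_temperatures(t_ref, coefficients, t_coolant_in):
    """Estimate each cell's temperature from one reference reading.

    Each coefficient k_i relates cell i's temperature *rise* above the
    coolant inlet temperature to the reference cell's rise:
        T_i = T_in + k_i * (T_ref - T_in)
    """
    rise_ref = t_ref - t_coolant_in
    return [t_coolant_in + k * rise_ref for k in coefficients]

# Example: five parallel-connected cells; cell 1 is the reference
# (coefficient 1.00). Coefficient values are made up for illustration.
coeffs = [1.00, 1.03, 1.05, 1.08, 1.10]
temps = predict_cell_temperatures(t_ref=300.5, coefficients=coeffs,
                                  t_coolant_in=298.0)
print(temps[0])   # reference cell reproduces its own reading: 300.5
print(temps[-1])  # hottest cell, predicted from the single sensor
```

A battery management system built this way needs only one thermal sensor per module: the remaining cell temperatures follow from the pre-computed coefficients.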

Coolant Flow Rate is Critical

In EVs, the power for operating the TMS is drawn from the battery itself, so reducing the TMS energy requirement reduces its drain on the battery; optimizing coolant flow rate is therefore essential. The STAR-CCM+ model revealed that more heat is stored in the battery pack at lower coolant flow velocities, indicating that at lower flow velocities less heat is transferred into the coolant. Notably, when the flow velocity was halved, the temperature rise increased by less than a factor of two.
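The trade-off can be seen in a simple lumped, steady-state energy balance. All numbers below are assumed for illustration and are not values from the study:

```python
# Lumped steady-state energy balance illustrating why lower coolant
# flow raises pack temperature: the coolant warms more along the
# channel, so the pack sits above a hotter mean coolant temperature.

CP_WATER = 4186.0   # J/(kg*K), specific heat of water

def pack_temperature(q_gen, m_dot, t_in, hA):
    """Steady-state pack temperature for heat load q_gen (W).

    The coolant warms from t_in by q_gen / (m_dot * cp); the pack must
    sit q_gen / hA above the mean coolant temperature to reject q_gen.
    """
    dt_coolant = q_gen / (m_dot * CP_WATER)   # coolant temperature rise
    t_coolant_mean = t_in + dt_coolant / 2.0  # mean coolant temperature
    return t_coolant_mean + q_gen / hA        # resulting pack temperature

# Same 60 W heat load at two flow rates (kg/s); hA = 12 W/K assumed.
t_fast = pack_temperature(q_gen=60.0, m_dot=0.020, t_in=298.0, hA=12.0)
t_slow = pack_temperature(q_gen=60.0, m_dot=0.010, t_in=298.0, hA=12.0)
print(round(t_fast, 2), round(t_slow, 2))  # slower flow runs hotter
```

In this toy model, halving the flow rate less than doubles the temperature rise above the inlet, because the convective term q_gen/hA is unaffected by flow rate here; in the real pack the convective coefficient also varies with velocity.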

In most battery packs, maximum temperature variation is limited to 3 K along the direction of the flow stream. The experimental model easily met the 3 K limit, and could effectively cool the pack even at very low flow velocities. It was found that temperature rise in the battery pack using the experimental TMS is on the same order as graphene-augmented, phase change material (PCM)-based thermal management systems reported in research literature. Although such PCM-based TMSs are also compact, this new TMS does not require novel materials such as graphene, and can therefore be produced at lower cost.

Overcoming Contact Resistance

Figure 4. Temperature rise in the first set of parallel cells in the pack as a function of contact resistance at the solid-solid interfaces, at 0.9C discharge rate and 0.2 m/s flow velocity.

Contact resistance at the solid-solid interfaces has proven a major source of problems in large battery pack designs. In this TMS model, thermal contact resistance at the conduction element-channel and cell-conduction element interfaces was found to be the largest hindrance, producing the temperature discontinuities shown in Figure 4. However, applying a thermal interface material at these solid-solid interfaces reduced the contact resistance and greatly improved thermal performance.
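The effect of contact resistance can be pictured as a series thermal-resistance network from cell to coolant. The sketch below uses assumed, illustrative resistance values (not measurements from the study) to show how dry solid-solid contacts create temperature jumps that a thermal interface material (TIM) removes:

```python
# Series thermal-resistance sketch of the cell -> conduction element ->
# coolant channel heat path. Interface (contact) resistances produce
# temperature discontinuities at each solid-solid junction; a thermal
# interface material (TIM) lowers them. All values are assumed.

def junction_temperatures(q, t_coolant, resistances):
    """Walk from the coolant back toward the cell, adding q*R drops.

    Returns node temperatures, coolant first, cell surface last.
    """
    temps = [t_coolant]
    for r in reversed(resistances):
        temps.append(temps[-1] + q * r)
    return temps

# Path resistances in K/W: cell->element contact, element conduction,
# element->channel contact, channel wall conduction (assumed values).
bare     = [0.25, 0.05, 0.25, 0.02]   # dry solid-solid contacts
with_tim = [0.05, 0.05, 0.05, 0.02]   # contacts filled with TIM

q = 10.0  # W per cell, assumed heat load
t_cell_bare = junction_temperatures(q, 298.0, bare)[-1]
t_cell_tim  = junction_temperatures(q, 298.0, with_tim)[-1]
print(t_cell_bare - t_cell_tim)  # TIM removes ~4 K of interface jump
```

Because the contact terms dominate the bare path, reducing them with TIM cuts most of the cell-to-coolant temperature difference, consistent with the improvement reported above.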

Because large Li-ion battery packs in EVs and HEVs must operate at high discharge rates, heat generation becomes a critical factor to address. Even at high discharge rates and low flow velocities, the experimental TMS kept the maximum temperature rise within the industry-accepted limit of 7 K, yielding excellent thermal performance. The TMS was found to cool the battery pack effectively even at low coolant flow rates.


The CFD-based TMS functional model created with STAR-CCM+ and Battery Design Studio achieved close agreement between simulations and experimental measurements, validating the model against experiment to better than 90 percent accuracy. To offset the high computational cost, representative sub-packs exploiting the symmetry of the full pack were successfully simulated together with the TMS.

Methodologies determined through this research can be implemented in onboard battery management systems to reduce the number of sensors, reduce temperature non-uniformity, and simplify control systems. The ability of this novel compact TMS to work effectively and safely under stringent conditions makes it a suitable candidate for large Li-ion battery packs used in EVs.

This article was written by Suman Basu, Next Generation Research, Samsung R&D Institute India, Bangalore. For more information on the Siemens PLM Software products used in this application, visit here.