
70 apps in 2 years: How Renault tackled database migration

July 27, 2022
Cyril Picchiottino

Quality & Customer Satisfaction IS VP

Editor’s note: Renault, the French automaker, embarked on a wholesale migration of its information systems—moving 70 applications to Google Cloud. Here’s how they migrated from Oracle databases to Cloud SQL for PostgreSQL.


The Renault Group, known for its iconic French cars, has grown to include four complementary brands and sold nearly 3 million vehicles in 2020. Following our company-wide strategic plan, “Renaulution,” we’ve shifted our focus over the past year from a car company integrating tech to a tech company integrating cars, one that develops software for our business. For the information systems group, that meant modernizing our entire portfolio and migrating 70 in-house applications (our quality and customer information systems) to Google Cloud. It was an ambitious project, but it has paid off. In two years we migrated our Quality and Customer Satisfaction information systems applications, optimized our code, and cut costs thanks to managed database services. Compared to our on-premises infrastructure, running on Google Cloud services and open-source technologies costs roughly one dollar per user per year, which is significantly cheaper.

An ambitious journey to Google Cloud

We began our cloud journey in 2016 with digital projects that introduced new ways of working and new technologies, including agility at scale, data capabilities, and a CI/CD toolchain. Google Cloud stood out as the clear choice for its data capabilities. Not only are we using BigQuery and Dataflow to improve scaling and costs, but we are also now using fully managed database services like Cloud SQL for PostgreSQL. Data is a key asset for a modern car maker: it connects us to our users, helps us understand how our vehicles are used, and informs the decisions we make about our products and services. After we migrated our data lake to Google Cloud, it was a natural next step to move our front-end applications there too, so they would be easier to maintain and we could benefit from faster response times. This project was no small undertaking. Across those 70 in-house applications (e.g. vehicle quality evaluation, statistical process control in plants, product issue management, survey analysis), our information systems landscape spanned a range of technologies, including Oracle, MySQL, Java, IBM MQ, and CFT, with some applications created 20 years ago.

Champions spearhead each migration

Before we started the migration, we did a global analysis of the landscape to understand each application and its complexity. We then planned a progressive approach, focusing first on the smallest applications, such as those with a limited number of screens or simple SQL queries, and saving the largest for last. Initially we used some automatic tools for the migration, but we learned very quickly that nothing can replace the development team’s institutional knowledge. They served as our migration champions.


The apps go marching one by one

When we migrated our first few Oracle databases to Cloud SQL for PostgreSQL, we tracked our learnings in an internal wiki of common SQL patterns, which helped us speed up the process. For some applications, we simplified the architecture and took the opportunity to analyze and optimize SQL queries during the rework. We also used monitoring tools like Dynatrace and JavaMelody to verify that we were improving the user experience.
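
To give a flavor of the kind of entry such a wiki might hold (the table and queries below are illustrative, not Renault’s actual code), a typical Oracle-to-PostgreSQL rewrite replaces Oracle-specific functions and pagination with their PostgreSQL equivalents:

    -- Oracle original: NVL, SYSDATE and ROWNUM are Oracle-specific
    -- (quality_checks is a hypothetical table, used only for illustration)
    SELECT vehicle_id, NVL(defect_count, 0) AS defects
    FROM quality_checks
    WHERE check_date > SYSDATE - 30
      AND ROWNUM <= 10;

    -- PostgreSQL rewrite: COALESCE, CURRENT_TIMESTAMP and LIMIT
    SELECT vehicle_id, COALESCE(defect_count, 0) AS defects
    FROM quality_checks
    WHERE check_date > CURRENT_TIMESTAMP - INTERVAL '30 days'
    LIMIT 10;

Collecting a handful of recurring rewrites like these (NVL, SYSDATE, ROWNUM, Oracle outer-join syntax) in one place is what lets each migration go faster than the one before.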

The approach we developed was very successful: although the database migration was initially seen as insurmountable, the entire project was completed in two years.

With on-premises applications, it was hard for our developers to separate code performance from infrastructure limitations. So as part of our migration to Google Cloud, we optimized our applications with the help of monitoring services. With these insights, our team has more control over resources, which has reduced our maintenance and operations workload and resulted in faster, more stable applications. Migrating to Cloud SQL has also made it much easier to change our infrastructure as needed, adding more power when necessary or even reducing our infrastructure size.

A new regime on Cloud SQL

Now that we’re running on Cloud SQL, we’ve improved performance even on large databases with many connected users. Thanks to built-in tools in the Google Cloud environment, we can now easily understand performance issues and quickly solve them. For example, we were able to reduce the duration of a heavy batch process by a factor of three, from nine hours to three. And because we don’t have to wait for a new server to be installed, our team can move faster. Beyond speed, we’ve also been able to cut costs. We optimized our code based on insights from monitoring tools, which not only made the application more responsive for the user but also reduced our costs because we’re no longer overprovisioned.
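
In practice, tracking down a slow statement like that batch job combines Cloud SQL’s built-in observability with standard PostgreSQL tooling. A minimal sketch of the workflow, using a hypothetical reporting query and index name rather than Renault’s actual workload:

    -- Inspect the real execution plan of a statement flagged as slow
    EXPLAIN (ANALYZE, BUFFERS)
    SELECT survey_id, AVG(score)
    FROM survey_answers   -- hypothetical table, for illustration only
    WHERE submitted_at >= DATE '2022-01-01'
    GROUP BY survey_id;

    -- If the plan shows a sequential scan over a large table,
    -- an index on the filter column is often the fix
    CREATE INDEX idx_survey_answers_submitted_at
      ON survey_answers (submitted_at);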

Learn more about the Renault Group and try out Cloud SQL today.
