Monday, July 20, 2015

Inserting large rows in MySQL and MariaDB

As the maximum storage size for a LONGBLOB in MySQL is 4GB and the maximum max_allowed_packet size is 1GB, I was wondering how it is possible to use the full capacity of a LONGBLOB.

So I started testing this. I wrote a Python script with MySQL Connector/Python and used MySQL Sandbox to bring up an instance of MySQL 5.6.25.

One of the first settings I had to change was max_allowed_packet, which was expected. I set it to 1GB.

The next setting was less expected: innodb_log_file_size. The server enforces that a transaction has to fit in 10% of the combined InnoDB log file size, so I had to configure two log files of 5GB each to be able to insert one record of (almost) 1GB.

So that worked for a row of a bit less than 1GB; it cannot be a full 1GB because there is some overhead in the packet and the total has to fit within max_allowed_packet.
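
My script for this step used Python with MySQL Connector/Python; the sketch below shows the same single-packet idea with the C prepared statement API instead, so it lines up with the chunked version further down. It is only an illustration, not the script from the repository: the table t1 (with a LONGBLOB column named data), the sandbox connection details and the payload size are assumptions.

    #include <mysql.h>
    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    /* Single-packet insert of an almost-1GB row.
       Assumed server settings: max_allowed_packet=1G and a combined InnoDB
       log file size of 10GB, so the ~1GB transaction fits in 10% of it. */
    int main(void) {
        MYSQL *con = mysql_init(NULL);
        if (!mysql_real_connect(con, "127.0.0.1", "msandbox", "msandbox",
                                "test", 5625, NULL, 0)) {
            fprintf(stderr, "connect: %s\n", mysql_error(con));
            return 1;
        }

        /* A bit less than 1GB: the blob plus the statement and packet
           overhead has to stay below max_allowed_packet. */
        unsigned long len = 1000000000UL;
        char *blob = malloc(len);
        if (blob == NULL) return 1;
        memset(blob, 'a', len);

        MYSQL_STMT *stmt = mysql_stmt_init(con);
        const char *sql = "INSERT INTO t1 (data) VALUES (?)";
        mysql_stmt_prepare(stmt, sql, strlen(sql));

        MYSQL_BIND param;
        memset(&param, 0, sizeof(param));
        param.buffer_type = MYSQL_TYPE_LONG_BLOB;
        param.buffer = blob;
        param.buffer_length = len;
        mysql_stmt_bind_param(stmt, &param);

        if (mysql_stmt_execute(stmt))
            fprintf(stderr, "execute: %s\n", mysql_stmt_error(stmt));

        mysql_stmt_close(stmt);
        mysql_close(con);
        free(blob);
        return 0;
    }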

For the next step (>1GB) I switched from Python to C so I could use mysql_stmt_send_long_data(), which allows you to upload a parameter in multiple chunks.

I expected that to work, but it didn't. This is because in MySQL 5.6 and up max_long_data_size was replaced by max_allowed_packet, and while max_long_data_size could be set to 4GB, max_allowed_packet can only be set to a maximum of 1GB.

So I switched from MySQL 5.6 to MariaDB 10.1, because MariaDB still has max_long_data_size. That worked: now I could upload rows of up to (almost) 4GB, as in the sketch below.
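
A minimal sketch of that chunked approach, assuming the same hypothetical t1 table, a MariaDB server with max_long_data_size=4G, and made-up chunk and total sizes (48 chunks of 64MB, about 3GB). Each chunk is sent with mysql_stmt_send_long_data() and has to fit in max_allowed_packet on its own; the server appends the chunks to the parameter and the full row is only written when the statement is executed:

    #include <mysql.h>
    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    int main(void) {
        MYSQL *con = mysql_init(NULL);
        if (!mysql_real_connect(con, "127.0.0.1", "msandbox", "msandbox",
                                "test", 3306, NULL, 0)) {
            fprintf(stderr, "connect: %s\n", mysql_error(con));
            return 1;
        }

        MYSQL_STMT *stmt = mysql_stmt_init(con);
        const char *sql = "INSERT INTO t1 (data) VALUES (?)";
        mysql_stmt_prepare(stmt, sql, strlen(sql));

        /* Bind one LONGBLOB parameter; its data is streamed below instead
           of being passed through the bind buffer. */
        MYSQL_BIND param;
        memset(&param, 0, sizeof(param));
        param.buffer_type = MYSQL_TYPE_LONG_BLOB;
        mysql_stmt_bind_param(stmt, &param);

        /* Stream 48 chunks of 64MB (about 3GB in total) to parameter 0.
           Each call appends its chunk on the server side. */
        unsigned long chunk_len = 64UL * 1024 * 1024;
        char *chunk = malloc(chunk_len);
        if (chunk == NULL) return 1;
        memset(chunk, 'a', chunk_len);
        for (int i = 0; i < 48; i++) {
            if (mysql_stmt_send_long_data(stmt, 0, chunk, chunk_len)) {
                fprintf(stderr, "send_long_data: %s\n", mysql_stmt_error(stmt));
                return 1;
            }
        }

        if (mysql_stmt_execute(stmt))
            fprintf(stderr, "execute: %s\n", mysql_stmt_error(stmt));

        mysql_stmt_close(stmt);
        mysql_close(con);
        free(chunk);
        return 0;
    }

The chunk size itself is arbitrary as long as each packet stays under max_allowed_packet; what limits the total is max_long_data_size (or, on MySQL 5.6 and up, max_allowed_packet).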

I also noticed that InnoDB will complain in the error log about large tuples:
InnoDB: Warning: tuple size very big: 1100000025

So you can insert CD ISO images into your database. For small DVD images this could also work, but only if your connector uses the COM_STMT_SEND_LONG_DATA command.

But it's best to avoid this and keep the size of rows smaller.

The scripts I used for my tests (and references to the bugs I found):
https://github.com/dveeden/mysql_supersize