Importing Excel or CSV Files into Laravel
Utilize the Laravel-excel package to import Excel or CSV files and write data to the database.
When the admin backend isn’t fully built yet, or when users have large amounts of data they’d rather not enter manually, supporting bulk import via an upload form becomes essential.
This is where the Laravel-excel package comes into play.
Additionally, it’s worth discussing the challenges encountered when importing a large volume of data at once.
Currently, the most widely adopted package among developers is Laravel-excel.
It supports most of the file formats commonly used for imports and covers the essential functionality.
The official documentation is well-written and provides clear examples within the code snippets.
Installation
Install the package with the command below; Laravel’s package auto-discovery registers its ServiceProvider and Facade automatically.
The latest version at the time of writing is 3.1 (see Documentation 3.1).
composer require maatwebsite/excel
If you need a custom config, run the following command to generate config/excel.php:
php artisan vendor:publish --provider="Maatwebsite\Excel\ExcelServiceProvider" --tag=config
Defining File Reading Formats
Because Laravel’s built-in collection type is incredibly useful, we want to convert the imported spreadsheet data into a collection.
First, create an object definition for the data to be loaded.
Let’s assume we’re importing CSV files for product specifications, so we’ll create a file named app/Imports/ProductImport.php
<?php

namespace App\Imports;

use Illuminate\Support\Collection;
use Maatwebsite\Excel\Concerns\ToCollection;

class ProductImport implements ToCollection
{
    public $collection;

    public function collection(Collection $collection)
    {
        // Map each raw row (0-indexed columns) to named keys.
        $this->collection = $collection->transform(function ($row) {
            return collect([
                'name'          => $row[1],
                'main_image'    => $row[2],
                'author_name'   => $row[3],
                'material'      => $row[4],
                'specification' => $row[5],
                'description'   => $row[6],
            ]);
        });
    }
}
As you can see, we implement the ToCollection concern and define how each row read from the file maps into the collection.
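For reference, a hypothetical CSV layout that matches these indices might look like this, where the first column (index 0) is assumed to hold a serial number or ID that the import skips:

id,name,main_image,author_name,material,specification,description
1,Handmade Vase,vase.jpg,A. Lin,ceramic,20cm,Hand-thrown ceramic vase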
Reading CSV Files
Within the controller, use the Excel facade (registered earlier via auto-discovery) to read the uploaded file.
First, create a ProductImport instance. Then call the facade’s import method to load the file. Once loaded, call the collection’s shift() method to remove the header row.
// Requires: use Maatwebsite\Excel\Facades\Excel; and use App\Imports\ProductImport;
public function uploadCsv(Request $request)
{
    $import = new ProductImport();
    Excel::import($import, $request->file('file'));

    $productCollection = $import->collection;
    $productCollection->shift(); // drop the header row
}
This way, you can load the uploaded CSV file into a collection for further operations.
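From here the rows can be persisted however you like. As a minimal sketch, assuming a Product Eloquent model whose columns match the keys defined in ProductImport, the collection can be converted back to plain arrays and bulk-inserted:

// Convert each row (a Collection) back to a plain array for bulk insert.
$rows = $productCollection->map(function ($row) {
    return $row->all();
})->all();

Product::insert($rows); // Product is a hypothetical model name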
Upload Testing
For testing with Postman, select form-data in the Body tab.
In the key field, choose the File type from the dropdown, and name the key file so it matches $request->file('file').
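If you prefer the command line, the same multipart upload can be sent with curl (the URL and file name here are placeholders):

curl -F "file=@products.csv" http://localhost/api/upload-csv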
Common Issues
413 Payload Too Large
This error originates from Nginx and implies that the uploaded file is too large, causing the server to reject it.
Reference: 413 Payload Too Large
The solution below uses a typical PHP development environment as an example.
Adjusting Nginx Config
Nginx caps the request body with the client_max_body_size directive, which defaults to just 1M, so larger uploads are rejected unless you raise it.
Open nginx.conf and add this within the server block:
client_max_body_size 10M;
This lets request bodies of up to 10M pass through Nginx on to PHP for further processing.
Adjusting php.ini
php.ini holds various PHP configurations.
We’re interested in upload_max_filesize, which defaults to 2M; raise it to match Nginx:
upload_max_filesize = 10M
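A related setting worth checking: post_max_size (default 8M) caps the entire POST body and must be at least as large as upload_max_filesize for the new limit to take effect:

post_max_size = 10M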
When using Docker, after creating a custom php.ini file, mount it to the designated location within the container:
- ./php/config/php.ini:/usr/local/etc/php/conf.d/php.ini
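For context, that volume entry lives under the PHP service in docker-compose.yml; a minimal excerpt, with an illustrative service name and image tag:

services:
  php:
    image: php:8.2-fpm
    volumes:
      - ./php/config/php.ini:/usr/local/etc/php/conf.d/php.ini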
Once these adjustments are made, restart or reload the settings for the containers that were modified.
Enter the container to confirm the configurations are applied: use phpinfo() for PHP and nginx -T for Nginx.
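As a quick sketch, assuming Compose services named php and nginx, both checks can also be run without an interactive shell:

docker compose exec php php -i | grep upload_max_filesize
docker compose exec nginx nginx -T | grep client_max_body_size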
Error: 1390 too many placeholders
After the file uploads successfully and processing begins, everything seems to run as expected during data checks.
However, when building the SQL statement with the query builder, an error pops up:
1390 Prepared statement contains too many placeholders
This MySQL error arises because a prepared statement supports at most 65,535 (2^16 - 1) placeholders.
If the data you try to insert at once needs more placeholders than that (rows × columns), statement assembly fails with this error; for instance, a single insert of 10,000 rows with 7 columns each would need 70,000 placeholders.
Now that we understand the cause, the solution is evident: avoid inserting too much data at once… pretty straightforward.
The solution involves breaking down a large dataset, like an array containing ten thousand product entries, into multiple arrays of 500 entries each.
Then, execute the insertion or modification in chunks of 500 entries to circumvent the 'too many placeholders' issue.
In PHP, we’ll use array_chunk to split:
$productChunks = array_chunk($products, 500);
After execution, you’ll obtain multiple arrays, each containing 500 entries from the original dataset.
Then, iterate through each set of 500 entries, resulting in 20 iterations of 500 entries each, instead of a single loop handling 10,000 entries.
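Putting it together, a minimal sketch of the chunked write, reusing the $rows array from the controller example above and the hypothetical Product model:

// Each chunk of 500 rows stays well under the 65,535-placeholder cap.
foreach (array_chunk($rows, 500) as $chunk) {
    Product::insert($chunk);
}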
This example illustrates how to use the Laravel-excel package to read CSV files and write the data to a database. Exporting files will be covered in a future post with a practical case study.