Small note, mostly for myself, as a reminder that when doing large imports of data, you may not need each record to be processed as an Eloquent model.
```php
<?php

$processedData = ['name' => 'foobar', 'age' => 39];

User::create($processedData);
```
The code above will generally run faster written as:
```php
<?php

$processedData = ['name' => 'foobar', 'age' => 39];

DB::table('users')->insert($processedData);
```
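One related detail: the query builder's `insert()` also accepts an array of rows, so many records can go into a single statement. A minimal sketch of that idea, assuming a Laravel app context; `$rows` and the 500-row batch size are my own illustrative choices, not anything from the import above:

```php
<?php

// Sketch: insert in chunks so each INSERT statement stays a reasonable size.
// $rows is assumed to hold all the prepared associative arrays;
// array_chunk() and the batch size of 500 are illustrative choices.
foreach (array_chunk($rows, 500) as $chunk) {
    DB::table('users')->insert($chunk);
}
```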
Obviously there are even faster ways: preparing your data in some text format, and using your database's CLI tools to do a bulk/mass import.
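As a concrete sketch of that idea, here's roughly what a CLI bulk load looks like with SQLite (the file names, table name, and sample row are made up for illustration; MySQL's `LOAD DATA INFILE` and Postgres's `\copy` serve the same purpose):

```shell
# Hypothetical sketch: bulk-load a prepared CSV straight into the DB via its CLI.
printf 'foobar,39\n' > users.csv   # stand-in for the prepared dump

sqlite3 import.db <<'SQL'
CREATE TABLE IF NOT EXISTS users (name TEXT, age INTEGER);
.mode csv
.import users.csv users
SQL
```

This skips the application layer entirely, which is why it tends to beat even the query-builder approach for big one-off imports.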
Most recently, I’ve been given a large dump of data which I need to pull into a different application (built with Laravel).
The initial import process I wrote used the first style, and with the amount of data in the dump, it took around 12-14 minutes locally. Using the second style, it’s closer to 6 minutes.
On the public testing server, the process took around 9 minutes, and now takes around 4.
This approach isn’t suitable for every case, but it might be worth trying out if you’re hitting speed snags. In my case, this is re-importing sample data after a build/push, and waiting 9-10 minutes is … annoying. Waiting 4 minutes is less annoying.