Laravel 9 job error: "has been attempted too many times or run too long. The job may have previously timed out."
1. A queued job in Laravel 9 failed with the error "has been attempted too many times or run too long. The job may have previously timed out." See Figure 1.
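For context only (this is not the fix applied below): this error is governed by the job's attempt and timeout settings together with the queue connection's retry_after value. A minimal sketch of the relevant knobs, using a hypothetical ExportOrderShippingLogJob class:

<?php

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;

// Hypothetical job class for illustration only.
class ExportOrderShippingLogJob implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    // Maximum number of attempts before the job is marked as failed.
    public $tries = 1;

    // Seconds the job may run before the worker kills it. The queue
    // connection's retry_after (config/queue.php) should be larger than
    // this, otherwise the job is re-dispatched while still running and
    // the "attempted too many times" error appears.
    public $timeout = 600;

    public function handle(): void
    {
        // Export logic goes here.
    }
}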
2. The table holds 8,761,297 records in total. Using the chunk method, the whole table was scanned in chunks of 10 records, logging the current time at the start of the closure (a sketch of this measurement follows the log output). The results were as follows:
[2024-04-09 09:43:57] local.INFO: exportExcelStart {"datetime":"2024-04-09 09:43:57"}
[2024-04-09 09:43:58] local.INFO: exportExcelChunk {"datetime":"2024-04-09 09:43:58"}
[2024-04-09 09:44:00] local.INFO: exportExcelChunk {"datetime":"2024-04-09 09:44:00"}
[2024-04-09 09:44:01] local.INFO: exportExcelChunk {"datetime":"2024-04-09 09:44:01"}
[2024-04-09 09:44:03] local.INFO: exportExcelChunk {"datetime":"2024-04-09 09:44:03"}
[2024-04-09 09:44:05] local.INFO: exportExcelChunk {"datetime":"2024-04-09 09:44:05"}
[2024-04-09 09:44:07] local.INFO: exportExcelChunk {"datetime":"2024-04-09 09:44:07"}
[2024-04-09 09:44:09] local.INFO: exportExcelChunk {"datetime":"2024-04-09 09:44:09"}
[2024-04-09 09:44:11] local.INFO: exportExcelChunk {"datetime":"2024-04-09 09:44:11"}
[2024-04-09 09:44:13] local.INFO: exportExcelChunk {"datetime":"2024-04-09 09:44:13"}
[2024-04-09 09:44:14] local.INFO: exportExcelChunk {"datetime":"2024-04-09 09:44:14"}
[2024-04-09 09:44:16] local.INFO: exportExcelChunk {"datetime":"2024-04-09 09:44:16"}
[2024-04-09 09:44:18] local.INFO: exportExcelChunk {"datetime":"2024-04-09 09:44:18"}
[2024-04-09 09:44:20] local.INFO: exportExcelChunk {"datetime":"2024-04-09 09:44:20"}
[2024-04-09 09:44:22] local.INFO: exportExcelChunk {"datetime":"2024-04-09 09:44:22"}
[2024-04-09 09:44:23] local.INFO: exportExcelChunk {"datetime":"2024-04-09 09:44:23"}
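A minimal sketch of the measurement described above (inside the job's handle method), assuming the same $orderShippingLogQueryBuilder query builder that appears in the final code in step 5:

use Illuminate\Support\Facades\Log;

Log::info('exportExcelStart', [date('Y-m-d H:i:s')]);

// Scan the table in chunks of 10 records and log a timestamp at the start
// of each chunk, so the average per-chunk cost can be read off the log.
$orderShippingLogQueryBuilder->chunk(10, function ($orderShippingLogs) {
    Log::info('exportExcelChunk', [date('Y-m-d H:i:s')]);

    // ... per-chunk export work ...
});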
3. Conclusion: with a chunk size of 10, each chunk takes 1.8 seconds on average to process, i.e. 18 seconds per 100 records. The chunk size was therefore raised from 10 to 1,000, again logging the time at the start of the closure. The results are below: each chunk of 1,000 records takes 4.8 seconds on average, which works out to 0.48 seconds per 100 records.
[2024-04-09 09:53:48] local.INFO: exportExcelStart {"datetime":"2024-04-09 09:53:48"}
[2024-04-09 09:53:53] local.INFO: exportExcelChunk1000 {"datetime":"2024-04-09 09:53:53"}
[2024-04-09 09:53:58] local.INFO: exportExcelChunk1000 {"datetime":"2024-04-09 09:53:58"}
[2024-04-09 09:54:02] local.INFO: exportExcelChunk1000 {"datetime":"2024-04-09 09:54:02"}
[2024-04-09 09:54:07] local.INFO: exportExcelChunk1000 {"datetime":"2024-04-09 09:54:07"}
[2024-04-09 09:54:11] local.INFO: exportExcelChunk1000 {"datetime":"2024-04-09 09:54:11"}
[2024-04-09 09:54:15] local.INFO: exportExcelChunk1000 {"datetime":"2024-04-09 09:54:15"}
[2024-04-09 09:54:20] local.INFO: exportExcelChunk1000 {"datetime":"2024-04-09 09:54:20"}
[2024-04-09 09:54:24] local.INFO: exportExcelChunk1000 {"datetime":"2024-04-09 09:54:24"}
[2024-04-09 09:54:28] local.INFO: exportExcelChunk1000 {"datetime":"2024-04-09 09:54:28"}
[2024-04-09 09:54:32] local.INFO: exportExcelChunk1000 {"datetime":"2024-04-09 09:54:32"}
[2024-04-09 09:54:37] local.INFO: exportExcelChunk1000 {"datetime":"2024-04-09 09:54:37"}
[2024-04-09 09:54:42] local.INFO: exportExcelChunk1000 {"datetime":"2024-04-09 09:54:42"}
[2024-04-09 09:54:48] local.INFO: exportExcelChunk1000 {"datetime":"2024-04-09 09:54:48"}
[2024-04-09 09:54:57] local.INFO: exportExcelChunk1000 {"datetime":"2024-04-09 09:54:57"}
[2024-04-09 09:55:04] local.INFO: exportExcelChunk1000 {"datetime":"2024-04-09 09:55:04"}
[2024-04-09 09:55:09] local.INFO: exportExcelChunk1000 {"datetime":"2024-04-09 09:55:09"}
[2024-04-09 09:55:14] local.INFO: exportExcelChunk1000 {"datetime":"2024-04-09 09:55:14"}
[2024-04-09 09:55:18] local.INFO: exportExcelChunk1000 {"datetime":"2024-04-09 09:55:18"}
[2024-04-09 09:55:22] local.INFO: exportExcelChunk1000 {"datetime":"2024-04-09 09:55:22"}
[2024-04-09 09:55:27] local.INFO: exportExcelChunk1000 {"datetime":"2024-04-09 09:55:27"}
[2024-04-09 09:55:33] local.INFO: exportExcelChunk1000 {"datetime":"2024-04-09 09:55:33"}
4. A chunk size of 10,000 was tried next: each chunk of 10,000 records takes 41 seconds on average, i.e. 0.41 seconds per 100 records (a quick arithmetic check of these figures follows the log output below). So 1,000 looks like the right value: each chunk holds far less in memory, and the total runtime is barely longer than with 10,000.
[2024-04-09 10:07:17] local.INFO: exportExcelStart {"datetime":"2024-04-09 10:07:17"}
[2024-04-09 10:08:04] local.INFO: exportExcelChunk1000 {"datetime":"2024-04-09 10:08:04"}
[2024-04-09 10:09:00] local.INFO: exportExcelChunk1000 {"datetime":"2024-04-09 10:09:00"}
[2024-04-09 10:09:55] local.INFO: exportExcelChunk1000 {"datetime":"2024-04-09 10:09:55"}
[2024-04-09 10:10:49] local.INFO: exportExcelChunk1000 {"datetime":"2024-04-09 10:10:49"}
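The arithmetic check mentioned above, converting each measured per-chunk average into a cost per 100 records (plain arithmetic, nothing project-specific assumed):

// Seconds per 100 records = seconds per chunk / chunk size * 100
$perHundredRecords = fn (float $secondsPerChunk, int $chunkSize) => $secondsPerChunk / $chunkSize * 100;

echo $perHundredRecords(1.8, 10);     // 18   seconds per 100 records (chunk size 10)
echo $perHundredRecords(4.8, 1000);   // 0.48 seconds per 100 records (chunk size 1,000)
echo $perHundredRecords(41, 10000);   // 0.41 seconds per 100 records (chunk size 10,000)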
5. In the performance environment, each chunk of 1,000 records takes 0.6 seconds to process, i.e. 0.06 seconds per 100 records. It was finally decided to cap the export at a maximum of 100,000 rows. The code is as follows:
$count = 0;

$orderShippingLogQueryBuilder->chunk(1000, function ($orderShippingLogs) use ($params, $excel, $columns, &$count) {
    // Cap the export at a maximum of 100,000 rows.
    if ($count >= 100000) {
        return false;
    }

    Log::info(
        'exportExcelChunk1000',
        [
            date('Y-m-d H:i:s', time())
        ]
    );

    $data = [];
    // ... build $data from $orderShippingLogs and write it to $excel
    //     using $params and $columns (omitted in the original) ...

    $count = $count + 1000;
});
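Returning false from the chunk closure tells Laravel to stop issuing further chunk queries, so once $count reaches the cap the remaining rows of the 8.7-million-row table are never fetched.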
6. Checking Laravel Telescope with the cap set to 3,000 in the local environment confirms that 3 SQL queries were executed, as expected. See Figure 2.
7. Finally, in the performance environment, exporting a 100,000-row Excel file took 107 seconds in total, which meets expectations.
exportExcelStart ["2024-04-10 03:56:32"]
exportExcelChunk1000 ["2024-04-10 03:56:32"]
exportExcelChunk1000 ["2024-04-10 03:56:33"]
exportExcelChunk1000 ["2024-04-10 03:56:33"]
exportExcelChunk1000 ["2024-04-10 03:56:34"]
exportExcelChunk1000 ["2024-04-10 03:56:35"]
exportExcelChunk1000 ["2024-04-10 03:56:36"]
exportExcelChunk1000 ["2024-04-10 03:58:13"]
exportExcelEnd ["2024-04-10 03:58:19"]