From e88c36a8d3fb0f0b3f7a4ab1b5a267797d550e9d Mon Sep 17 00:00:00 2001
From: Samuel
Date: Tue, 17 Jun 2025 17:40:55 +0100
Subject: [PATCH] document steps on running out of memory

---
 docs/exports.md | 5 ++++-
 1 file changed, 4 insertions(+), 1 deletion(-)

diff --git a/docs/exports.md b/docs/exports.md
index 1588defc2..cbf0e3b02 100644
--- a/docs/exports.md
+++ b/docs/exports.md
@@ -141,6 +141,9 @@
 
 Full exports can only be run via a **rake task**.
 
-If the collection size is very large, full exports may fail due to memory issues. In such cases, it is better to batch exports into chunks of ~60,000 records and run several partial exports over multiple days. The `values_updated_at` field can help with this.
+If the collection size is very large, full exports may fail due to memory issues. In such cases:
+- Delete the incomplete export files from the S3 bucket.
+- Batch exports into chunks of ~60,000 records and run several partial exports over multiple days. The `values_updated_at` field can help with this.
+- Rerun the task with more memory allocated (see 'Running Rake Tasks').
 
 The simplest approach is to mark a batch of logs for export each day and allow scheduled morning exports to handle them.
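
Note: as a minimal sketch of the daily batching step the patch describes, the console snippet below marks roughly 60,000 of the oldest unexported records using `values_updated_at`. The `TimelineEvent` model, the `marked_for_export` flag, and the `export:partial` task name are illustrative assumptions, not names from this codebase; only `values_updated_at` comes from the documentation being changed.

```ruby
# Hypothetical Rails console sketch: model, column and task names are
# assumptions for illustration only.
batch_size = 60_000

# Pick the oldest unexported records so successive daily batches work
# forwards through values_updated_at without overlapping.
ids = TimelineEvent
        .where(marked_for_export: false)
        .order(:values_updated_at)
        .limit(batch_size)
        .ids

TimelineEvent.where(id: ids).update_all(marked_for_export: true)

# Either let the scheduled morning export pick the batch up, or trigger a
# partial export by hand, e.g.:
#   bundle exec rake export:partial
```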