From 50720a16cc9a2a9353472a1f7fbd204b1e23a40f Mon Sep 17 00:00:00 2001
From: Kat <54268893+kosiakkatrina@users.noreply.github.com>
Date: Thu, 30 Jan 2025 08:10:43 +0000
Subject: [PATCH] lint

---
 docs/bulk_upload.md | 4 ++++
 1 file changed, 4 insertions(+)

diff --git a/docs/bulk_upload.md b/docs/bulk_upload.md
index e8a50304f..7f829d737 100644
--- a/docs/bulk_upload.md
+++ b/docs/bulk_upload.md
@@ -11,6 +11,7 @@ Bulk upload functionality allows users to upload multiple logs using a csv file.
 Bulk upload file can be uploaded for a specific log type (sales or lettings) for a specific year. During crossover period we ask which collection year the file is for, otherwise we assume the Bulk Upload is for the current year.
 
 When a bulk upload file is successfully uploaded on the service, it:
+
 - Saves a BulkUpload record in the database
 - Uploads the file to S3
 - Schedules `ProcessBulkUploadJob`
@@ -18,6 +19,7 @@ When a bulk upload file is successfully uploaded on the service, it:
 ### Bulk upload service
 
 There are several outcomes to a bulk upload:
+
 - Successful upload
 - Partial upload: upload has errors but partial logs can be created, need a user approval
 - Errors in bulk upload: errors on important fields, or in the template. Logs can't be created and email with errors is sent to the user
@@ -33,11 +35,13 @@ Most of BU processing logic is in `BulkUpload::Processor`. It chooses the correc
 Main differences between different collection years would be in `CsvParsers` and `RowParsers`.
 
 #### Row parser
+
 - Maps any values from a csv row into values saved internally
 - Maps any validations into errors for bulk uploads by associating them with relevant fields
 - Adds any additional validations that might only make sense in BU (for example, validation that might not relevant in single log submission due to routing)
 
 ### Csv parser
+
 - Holds template specific information
 - Header information
 - Row and field information
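
The row parser responsibilities described in the patched doc (mapping csv cells to internal attribute names, and attaching validation errors to the originating csv field) can be sketched as follows. This is an illustrative sketch only: the class name, field names, and validation are hypothetical, not taken from the real year-specific `RowParsers`.

```ruby
require "csv"

# Hypothetical sketch of a bulk-upload row parser. The field mapping and
# the single validation below are made up for illustration; real parsers
# define many fields and map ActiveModel validations onto csv columns.
class SketchRowParser
  # Maps template column names to internally saved attribute names
  FIELD_MAPPING = {
    "field_1" => :owning_organisation,
    "field_2" => :tenancy_start_date,
  }.freeze

  attr_reader :attributes, :errors

  def initialize(row)
    # Map any values from a csv row into internally named attributes
    @attributes = FIELD_MAPPING.each_with_object({}) do |(csv_field, attr), mapped|
      mapped[attr] = row[csv_field]
    end
    @errors = {}
  end

  # Associate validation failures with the relevant csv field so an error
  # report can point the user at the right column in their file
  def valid?
    if attributes[:tenancy_start_date].to_s.empty?
      @errors["field_2"] = "tenancy start date is required"
    end
    @errors.empty?
  end
end

row = CSV::Row.new(%w[field_1 field_2], ["ORG123", ""])
parser = SketchRowParser.new(row)
parser.valid?           # false: field_2 is blank
parser.errors["field_2"] # the error keyed by csv field, not attribute name
```

Keying errors by csv field name rather than internal attribute name is what lets the service render per-column error reports against the user's original file.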