* Empty
* CLDC-3345 Rename created_by to assigned_to (#2372)
* Rename created_by to assigned_to
* Replace created_by with assigned_to
* Update created_by to assigned_to in exports, remove blank assigned to
* CLDC-3345 Add and set created_by fields (#2373)
* Add created_by
* Update existing created_by values
* Set created_by on single log
* Set created_by on BU
* Add created_by to exports
* feat: update since last merge
---------
Co-authored-by: natdeanlewissoftwire <nat.dean-lewis@softwire.com>
* Fix papertrail create version
---------
Co-authored-by: natdeanlewissoftwire <nat.dean-lewis@softwire.com>
* Copy 23 BU files to 24
* Update field mapping
* Add duplicate log error to charges
* Check the sum of charges for duplicates
* update stub
* Only add charges to duplicate hash if they exist
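A minimal sketch of the duplicate check described above, assuming hypothetical field and method names: the sum of the charge fields is only folded into the duplicate key when any charges are actually present.

```ruby
# Hypothetical names; illustrates adding the charges sum to the duplicate
# key only when charge fields exist on the log.
def duplicate_check_fields(log)
  charges = [log.scharge, log.pscharge, log.supcharg].compact
  key = [log.owning_organisation_id, log.startdate, log.tenancycode, log.postcode_full]
  key << charges.sum unless charges.empty?
  key
end
```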
* Rebase change
* Update columns
* Add tcharge back to error mapping
* Add prepare your file
* Add csv parser 24/25 cases
* Fix date check
* Rebase fix
* Sends the correct email if there are only soft validations
* Add soft errors valid page
* Add soft errors confirm page
* Confirm the soft validations
* Reuse the how to fix template for check soft validations email
* Update email link
* Move soft validation confirmation to processor
* Correctly set the log status, remove redundant confirm_soft_validations
* Redirect successful upload to logs index and display a success banner
* Implement the soft validations only journey for sales logs
* Display the soft validation errors on the soft-errors-valid page
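A rough sketch of the soft-validations-only journey, with assumed processor and mailer method names: hard errors still trigger the "fix your file" email, while a file with only soft validation errors creates pending logs and asks the user to confirm them.

```ruby
# Method and mailer names are assumptions, not the app's exact API.
def call
  if validator.invalid?
    send_correct_and_upload_again_mail      # hard errors: the file must be fixed
  elsif validator.soft_validation_errors_only?
    create_logs                             # logs are created in a pending state
    send_check_soft_validations_mail        # reuses the "how to fix" template
  else
    create_logs
    send_success_mail
  end
end
```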
* Fix page alignment
* Fix tests by mapping housingneeds in csv helper
* Add the sales soft validations to unpend_and_confirm_soft_validations
* Change naming
* Update method names
* refactor
* undo typo
* Only set the existing soft validations for correct types
* Fix path name
* Add missing error mappings for location fields
* Add missing tests and cancel button
* Change some wording
* Typos
# Context
- Partially related to https://digital.dclg.gov.uk/jira/browse/CLDC-2316
- Comprehending either the sales or the lettings bulk upload CSV on its own is hard enough; comprehending both simultaneously is rather challenging
# Changes
- Split out the test helper class by log type, i.e. lettings/sales
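A sketch of the split, with assumed class names: a shared base holds the common CSV plumbing and each log type gets its own helper, so neither has to understand the other's layout.

```ruby
# Class names are assumptions; the point is one test helper per log type.
class BulkUploadLogToCsv
  def initialize(log:, line_ending: "\n", col_offset: 1)
    @log = log
    @line_ending = line_ending
    @col_offset = col_offset
  end
end

class BulkUploadLettingsLogToCsv < BulkUploadLogToCsv
  def to_row
    # lettings-specific field ordering
  end
end

class BulkUploadSalesLogToCsv < BulkUploadLogToCsv
  def to_row
    # sales-specific field ordering
  end
end
```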
* feat: add validation for number of field labels
* feat: check field labels are numbers
* feat: remove validations for number of columns
* refactor: rename non_blank_fields_count to valid_field_numbers_count
* test: add functions to generate custom field labels/values
* test: that extra invalid field labels don't cause issues
* test: that removing a valid field label reduces count by 1
* test: that adding a valid field label increases count by 1
* refactor: rename validate_fields_count and wrong_field_count
* feat: add validation that max col count not exceeded when no headers
* fix: convert numbers to strings in default_field_numbers
* feat: add leniency to max cols count (in case of 1 extra col)
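A simplified sketch of these validations, with assumed names and counts: field labels must be numeric, and the column count may exceed the expected number of fields by at most one spare column.

```ruby
# Names and counts are assumptions used for illustration only.
EXPECTED_FIELD_COUNT = 134
MAX_PERMITTED_COLUMNS = EXPECTED_FIELD_COUNT + 1 # one column of leniency

def valid_field_numbers_count
  field_labels.count { |label| label.to_s.match?(/\A\d+\z/) }
end

def validate_fields_count
  errors.add(:base, :wrong_field_numbers_count) if valid_field_numbers_count != EXPECTED_FIELD_COUNT
  errors.add(:base, :over_max_column_count) if column_count > MAX_PERMITTED_COLUMNS
end
```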
* test: explicitly set year of bulk upload to be 2022
* test: add/update tests in validator_spec
* chore: lint
* test: remove tests from csv_parser_spec that were moved to validator_spec
* feat: update 2022 csv_parser to work with new validations
* refactor: define number of valid 2022 fields in one place
* refactor: remove redundant headers definition
* feat: update 2022 csv parser to have col flexibility like 2023
* test: for validator 2022 as well as 2023
* test: simplify 2022/2023 logic and improve layout
* chore: lint
* test: ensure context descriptions start with "when"
* refactor: check fields/columns count within csv parser, not validator
* test: update 2022 csv parser tests to work like 2023
* chore: lint
* add first page for bulk upload resume journey
* bulk upload resume handles upload again
* add confirm page to bulk upload resume journey
* replace placeholder count with correct value
* apply recommendation for bulk upload resume choice
* add how to fix bulk upload mailer
* integrate new bulk upload approve journey
* add missing bulk upload error mappings
* remove test
* prevent approve being called multiple times
* bulk upload creates invisible logs ahead of time
* work invisible logs into bulk upload flow
* sort errors so deterministic
* remove unused ensure
* remove expected_log_count and processed
- these fields are no longer used or needed
* introduce pending status
* swap visible for pending logs
* only show visible lettings logs
* hard code status filters
* remove unused model methods
* only show visible sales logs
* form controller ignores hidden logs
* locations and schemes only affect visible logs
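A sketch of the pending/visible split, with illustrative status values: pending logs exist in the database from the moment the file is processed, but index pages, schemes, and locations all query through a scope that hides them until the upload is confirmed.

```ruby
# Status values are illustrative only.
class LettingsLog < ApplicationRecord
  enum status: { not_started: 0, in_progress: 1, completed: 2, pending: 3 }

  scope :visible, -> { where.not(status: "pending") }
end

# Callers filter through the scope rather than the raw table:
LettingsLog.visible.where(bulk_upload_id: bulk_upload.id)
```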
* refactor questions from validator to row parser
* able to switch between bulk upload parsers
- depending on what year we are processing
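A minimal sketch of the switch, assuming parser classes namespaced by collection year as described above.

```ruby
# Class names are assumptions based on the year namespacing.
def row_parser_class
  case bulk_upload.year
  when 2022 then BulkUpload::Lettings::Year2022::RowParser
  when 2023 then BulkUpload::Lettings::Year2023::RowParser
  else raise "Unsupported bulk upload year: #{bulk_upload.year}"
  end
end
```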
* spec tooling to support bulk upload multi year
* row parser now has year in namespacing
* add static data for 2023 row parser
* add placeholder to log to csv for specs
* bulk upload arbitrary 23/24 column ordering works
* bulk upload supports 23/24 without headers
* bulk upload 23/24 supports BOM + invalid chars
* dupe tests
* port 23/24 attributes_for_log
* port 23/24 bulk upload validations
* force crossover period
* tweak max permitted columns
* able to return column for given field
* work out column for field for errors
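A sketch of mapping a field back to its spreadsheet column so an error can point at the offending cell; the index-to-letter conversion is standard, the surrounding names are assumptions.

```ruby
# field_numbers is assumed to be the ordered list of field labels in the row.
def column_for_field(field)
  index = field_numbers.index(field)
  return nil if index.nil?

  letters = +""
  loop do
    letters.prepend(("A".ord + index % 26).chr)
    index = index / 26 - 1
    break if index.negative?
  end
  letters # e.g. 0 -> "A", 26 -> "AA"
end
```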
* add field_4 as 23/24 setup field
* remove duplicate method
* map schemes and locations correctly
* handle arbitrary number of header rows
* add missing fields to bulk upload support
* limit setup errors if errors already present
* bulk upload better handles age validations
- limit the number of errors on a field: if the field already has an existing error, do not add further errors to it
- shim in extra validation for ages, which takes precedence over the existing log-level validations, which are lacking
* setup errors always added
- no longer observes whether the grouping already has errors
- as we have fine-grained control in this class over how errors should be added
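A simplified sketch of these error-handling rules, with assumed names: a non-setup field keeps at most one error, setup errors are always attached, and the bulk-upload age check runs ahead of the log-level validations.

```ruby
# Names are assumptions; simplified to show the precedence rules only.
def add_error(field, message, category: :other)
  return if category != :setup && errors[field].present? # one error per field

  errors.add(field, message, category: category)
end

def validate_ages
  (1..8).each do |i|
    field = :"field_age#{i}" # hypothetical field names
    age = public_send(field)
    next if age.blank? || (age.to_s.match?(/\A\d+\z/) && age.to_i.between?(1, 120))

    add_error(field, "Age must be a number between 1 and 120")
  end
end
```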
* fix missing pass through for renewal
* create logs iff the log itself is valid
* do not create logs if a setup section is not complete
* bulk upload with 60% errors will not create logs
* extract magic number to constant
* add bulk upload absolute threshold error rate
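A sketch of the threshold logic, with assumed constant values: logs are only created when the row-level error rate stays under 60% and the absolute number of errors stays under a hard cap.

```ruby
# Constant values are assumptions used for illustration.
ERROR_THRESHOLD = 0.6
ABSOLUTE_ERROR_THRESHOLD = 16

def create_logs?
  return false if row_errors_count > rows_count * ERROR_THRESHOLD
  return false if row_errors_count > ABSOLUTE_ERROR_THRESHOLD

  true
end
```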
* refactor with extract method
* refactor with extract method
* can filter logs by bulk upload
* hide log creation button when viewing bulk upload
- this affects the logs index page when it is filtered to logs from a specific bulk upload
* add info banner to bulk upload logs
* placeholder for bulk upload logs header
* when resuming bulk upload set filters
* fill placeholder with remaining logs to fix
* add interstitial to resume if logs resolved
* after resolving bulk upload logs show interstitial
* fix linting error
* extract view variable to helper method
* fix bulk upload age data types
* bulk upload handles both with and without headers
- headers are from the spreadsheet template
- otherwise assume cell A1 is start of the dataset
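A rough sketch of the header handling; the heuristic here (skip leading rows whose first cell is not numeric) is an assumption, the real detection may differ.

```ruby
# Heuristic assumed for illustration: data rows start with a numeric field.
def row_offset
  rows.find_index { |row| row.first.to_s.match?(/\A\d+\z/) } || 0
end

def body_rows
  rows.drop(row_offset) # when there are no headers, A1 is the start of the dataset
end
```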
* add class to create logs from bulk upload
* create logs when processing bulk uploads
* remove bulk_upload_id from csv output
* create bulk upload logs only if all valid
- this will be changed later to allow for partial logs
- and only to create logs when a threshold has been met
* add method to blank invalid non setup fields
* bulk upload log creation blanks invalid fields
* fix incorrect logic for bulk upload renewal
* fix linting
* bulk upload log creation fail logs to sentry
* fix bulk upload line ending parsing
* extract bulk uploading csv parsing to class
* use csv parser in log creator
* change the line ending handling mechanism
- we now replace all Windows line endings with Unix line endings
- this normalises things, making them simpler to handle
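A minimal sketch of that normalisation: Windows (CRLF) and bare CR line endings are rewritten to Unix (LF) before the content reaches the CSV parser.

```ruby
def normalised_line_endings(content)
  content.gsub("\r\n", "\n").gsub("\r", "\n")
end
```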