* Sends a correct email if there are only soft validations
* Add soft errors valid page
* Add soft errors confirm page
* Confirm the soft validations
* Reuse the "how to fix" template for the check soft validations email
* Update email link
* Move soft validation confirmation to processor
* Correctly set the log status, remove redundant confirm_soft_validations
* Redirect successful upload to logs index and display a success banner
* Implement the soft validations only journey for sales logs
* Display the soft validation errors on the soft-errors-valid page
* Fix page alignment
* Fix tests by mapping housingneeds in csv helper
* Add the sales soft validations to unpend_and_confirm_soft_validations
* Change naming
* Update method names
* refactor
* undo typo
* Only set the existing soft validations for correct types
* Fix path name
* Add missing error mappings for location fields
* Add missing tests and cancel button
* Change some wording
* Typos
# Context
- https://digital.dclg.gov.uk/jira/browse/CLDC-2344
- The reported bug and these changes do not match 100%
- The reported bug should already have been resolved by an earlier change, as the error message that was reported has already been removed from the system
- These changes remove the check from sales, plus a few minor tidy-ups
# Changes
- Remove dead code around error threshold which is no longer used or needed
- Inverted a negative predicate into a positive one, `incorrect_field_count` => `correct_field_count` (see the sketch after this list)
- Remove `validate_min_columns` from sales bulk upload
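A minimal sketch of the inversion, using a hypothetical class and an assumed expected field count; only the two method names come from this change.

```ruby
# Hypothetical sketch of the rename: the negative predicate becomes a
# positive one, so call sites read `if correct_field_count` rather than
# `unless incorrect_field_count`.
class RowParser
  EXPECTED_FIELD_COUNT = 134 # assumed value for illustration

  def initialize(fields)
    @fields = fields
  end

  # After the change: positive predicate.
  def correct_field_count
    @fields.size == EXPECTED_FIELD_COUNT
  end

  # Before the change (removed): negative predicate.
  # def incorrect_field_count
  #   @fields.size != EXPECTED_FIELD_COUNT
  # end
end
```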
* feat: wip blank fields and dependent fields on upload to see if valid and can upload with missing info - this is not the exact AC on the ticket yet
* feat: update seed
* feat: wip commit
* feat: add postcodenk error so can clear on validation
* feat: add postcode validation back
* feat: move la vals to shared and move and add tests
* feat: add correct pluralisation to warning message
* feat: add blank compound invalid fields methods
* feat: update validations
* feat: update pluralisation
* refactor: lint
* feat: clear errors associated with blanked values so log status is set correctly on creation
* feat: validate instead
* feat: avoid duplicated errors
* feat: don't auto-refuse income, different to imports
* feat: validate after every blank method
* feat: delete la validator spec
* tests: update
* refactor: erblinting
* refactor: cleanup
* refactor: move pluralizer to helper
* feat: copy update
* feat: rename
* feat: refactor to avoid redundant re-validations and test
* refactor: rubocop
* test: update
* test: update
* test: update
* update sidekiq
* feat: clear errors
* feat: run clearing twice in case first clear creates different errors
* feat: remove moved file
* feat: undo validation file tweaks as shared/specific overlap could do with a more general refactor
* feat: update tests
# Context
- https://digital.dclg.gov.uk/jira/browse/CLDC-1888
- This is a continuation of https://github.com/communitiesuk/submit-social-housing-lettings-and-sales-data/pull/1277
- When bulk uploading we want to check users are not uploading data that already exists, to prevent them submitting duplicates
# Changes
- This feature is behind a feature toggle. It has been disabled on staging for testing purposes but is available in all other environments
- If a log already exists based on certain fields, add errors to the associated fields
- We discount any hidden logs and only check "active" logs
- Added memoization to `#valid?` as an optimisation
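A minimal sketch of that memoisation, with a hypothetical validator class; the `defined?` guard is used so a `false` result is cached as well.

```ruby
# Hypothetical sketch: memoise #valid? so repeated calls reuse the first
# result instead of re-running the validations.
class Validator
  def initialize(rows)
    @rows = rows
  end

  def valid?
    # `defined?` guard caches a boolean correctly (`||=` would re-run on false).
    return @valid if defined?(@valid)

    @valid = @rows.all? { |row| !row.nil? && !row.empty? }
  end
end
```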
* feat: add validation for number of field labels
* feat: check field labels are numbers
* feat: remove validations for number of columns
* refactor: rename non_blank_fields_count to valid_field_numbers_count
* test: add functions to generate custom field labels/values
* test: that extra invalid field labels don't cause issues
* test: that removing a valid field label reduces count by 1
* test: that adding a valid field label increases count by 1
* refactor: rename validate_fields_count and wrong_field_count
* feat: add validation that max col count not exceeded when no headers
* fix: convert numbers to strings in default_field_numbers
* feat: add leniency to max cols count (in case of 1 extra col)
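A hedged sketch of the headerless column-count check with one column of leniency; the constant name and its value are assumptions for illustration.

```ruby
# Hypothetical sketch: with no header row we cannot match columns by name,
# so reject files with more columns than expected, allowing one extra
# column of leniency (e.g. a stray trailing delimiter).
MAX_PERMITTED_COLUMNS = 142 # assumed value for illustration

def too_many_columns?(row)
  row.size > MAX_PERMITTED_COLUMNS + 1
end
```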
* test: explicitly set year of bulk upload to be 2022
* test: add/update tests in validator_spec
* chore: lint
* test: remove tests from csv_parser_spec that were moved to validator_spec
* feat: update 2022 csv_parser to work with new validations
* refactor: define number of valid 2022 fields in one place
* refactor: remove redundant headers definition
* feat: update 2022 csv parser to have col flexibility like 2023
* test: for validator 2022 as well as 2023
* test: simplify 2022/2023 logic and improve layout
* chore: lint
* test: ensure context descriptions start with "when"
* refactor: check fields/columns count within csv parser, not validator
* test: update 2022 csv parser tests to work like 2023
* chore: lint
* add first page for bulk upload resume journey
* bulk upload resume handles upload again
* add confirm page to bulk upload resume journey
* replace placeholder count with correct value
* apply recommendation for bulk upload resume choice
* add how to fix bulk upload mailer
* integrate new bulk upload approve journey
* add missing bulk upload error mappings
* remove test
* prevent approve being called multiple times
* bulk upload creates invisible logs ahead of time
* work invisible logs into bulk upload flow
* sort errors so deterministic
* remove unused ensure
* remove expected_log_count and processed
- these fields are no longer used or needed
* introduce pending status
* swap visible for pending logs
* only show visible lettings logs
* hard code status filters
* remove unused model methods
* only show visible sales logs
* form controller ignores hidden logs
* locations and schemes only affect visible logs
* refactor questions from validator to row parser
* able to switch between bulk upload parsers
- depending on what year we are processing
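A minimal sketch of switching parsers by collection year; the constant names echo the 2022/2023 naming in these commits but are assumptions, not the app's actual classes.

```ruby
# Hypothetical sketch: choose the bulk upload row parser for the year of
# the collection being processed. Class names are assumptions.
module BulkUpload
  module Lettings
    module Year2022; class RowParser; end; end
    module Year2023; class RowParser; end; end
  end
end

def row_parser_class(year)
  case year
  when 2022 then BulkUpload::Lettings::Year2022::RowParser
  when 2023 then BulkUpload::Lettings::Year2023::RowParser
  else raise ArgumentError, "no bulk upload row parser for year #{year}"
  end
end
```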
* spec tooling to support bulk upload multi year
* row parser now has year in namespacing
* add static data for 2023 row parser
* add placeholder to log to csv for specs
* bulk upload arbitrary 23/24 column ordering works
* bulk upload supports 23/24 without headers
* bulk upload 23/24 supports BOM + invalid chars
* dupe tests
* port 23/24 attributes_for_log
* port 23/24 bulk upload validations
* force crossover period
* tweak max permitted columns
* able to return column for given field
* work out column for field for errors
* add field_4 as 23/24 setup field
* remove duplicate method
* map schemes and locations correctly
* handle arbitrary number of header rows
* add missing fields to bulk upload support
* bulk upload setup errors only for missing data
* bulk upload setup error for scheme
* bulk upload setup error for location
* ensure missing startdate is a setup error
* add placeholder tests for bulk upload mailer
* bulk upload fix setup errors email
- this plumbs in the condition so that if any setup sections are incomplete we send that particular email and prevent the remaining flow
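A hedged sketch of that gate, using hypothetical method and class names: when any setup errors are present, send the dedicated email and stop before any logs are created.

```ruby
# Hypothetical sketch only: names are illustrative, not the app's API.
class BulkUploadProcessor
  def initialize(bulk_upload)
    @bulk_upload = bulk_upload
  end

  def process
    if setup_errors_present?
      send_fix_setup_errors_email # ask the user to fix setup fields first
      return                      # prevent the remaining flow, no logs created
    end

    create_logs_and_send_results_email
  end

  private

  def setup_errors_present?
    @bulk_upload.bulk_upload_errors.any? { |error| error.category == "setup" }
  end

  def send_fix_setup_errors_email
    # stand-in for the mailer call
  end

  def create_logs_and_send_results_email
    # stand-in for the normal happy-path processing
  end
end
```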
* tag bulk upload setup errors for downstream use
* add category to bulk upload errors
* persist bulk upload error category
* populate bulk upload mailer with errors
* remove duplicate bulk validations
- validation was being performed at both CSV level and log level, causing a duplicate validation to appear
* bulk upload validation errors now store the message
- previously this stored just the error type, and we had no mechanism to pipe error types back to user-readable error messages
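A hedged sketch of the difference: the persisted error record now carries the human-readable message rather than only the error type; the attribute names are assumptions.

```ruby
# Hypothetical sketch: persist the user-readable message on each error
# record, not just a machine error type such as :blank, so results pages
# can display the message directly.
def persist_errors(bulk_upload, log, row_number)
  log.errors.each do |error|
    bulk_upload.bulk_upload_errors.create!(
      field: error.attribute.to_s,  # e.g. "field_7"
      error: error.message,         # e.g. "You must answer property postcode"
      row: row_number,
    )
  end
end
```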
* create logs iff the log itself is valid
* do not create logs if a setup section not complete
* bulk upload with 60% errors will not create logs
* extract magic number to constant
* add bulk upload absolute threshold error rate
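A hedged sketch of the threshold rule described in the commits above: logs are not created when the share of rows with errors reaches 60%, or when an assumed absolute error limit is exceeded.

```ruby
# Hypothetical sketch: decide whether any logs should be created from the
# upload, based on a 60% error-rate threshold plus an absolute cap.
ERROR_THRESHOLD = 0.6
ABSOLUTE_ERROR_THRESHOLD = 16 # assumed value for illustration

def create_logs?(rows_with_errors:, total_rows:)
  return false if total_rows.zero?
  return false if rows_with_errors > ABSOLUTE_ERROR_THRESHOLD

  rows_with_errors.fdiv(total_rows) < ERROR_THRESHOLD
end
```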
* refactor with extract method
* add class to create logs from bulk upload
* create logs when processing bulk uploads
* remove bulk_upload_id from csv output
* create bulk upload logs only if all valid
- this will be changed later to allow for partial logs
- and only to create logs when a threshold has been met
* add method to blank invalid non setup fields
* bulk upload log creation blanks invalid fields
* fix incorrect logic for bulk upload renewal
* fix linting
* bulk upload log creation fail logs to sentry
* fix bulk upload line ending parsing
* extract bulk uploading csv parsing to class
* use csv parser in log creator
* change handle line endings mechanism
- we now replace all Windows line endings with Unix line endings
- this normalises things, making parsing simpler
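A minimal sketch of that normalisation, assuming a plain file read: Windows CRLF endings are replaced with Unix LF before the content reaches the CSV parser.

```ruby
# Hypothetical sketch: normalise Windows (CRLF) line endings to Unix (LF)
# so both formats parse identically downstream.
def normalised_csv_content(path)
  File.read(path).gsub("\r\n", "\n")
end
```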
* able to view lettings bulk upload errors
* fix linting
* call service correctly in test
* add bulk upload sales questions mapping
* appease linter
* bulk upload error shows correct question
- depending on log type it will show the relevant question for the field concerned
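A hedged sketch of that lookup: pick the question mapping for the error's log type, then resolve the field to its question text; the example mappings and field names are invented for illustration.

```ruby
# Hypothetical sketch: show the question relevant to the error's log type.
LETTINGS_QUESTIONS = { "field_12" => "Example lettings question" }.freeze
SALES_QUESTIONS    = { "field_12" => "Example sales question" }.freeze

def question_for(error)
  mapping = error.log_type == "sales" ? SALES_QUESTIONS : LETTINGS_QUESTIONS
  mapping.fetch(error.field, "Unknown question")
end
```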
* improve namespacing of classes
* add job to process bulk uploads
* move validation from parser to model
* add validations for field_1
* add validation for field_4
* pending tests for field_4
* convert field_mapping to array of hashes
* validate nulls based on form question
* actually load forms when toggling between forms
* validate null for startdate
* row parser has access to bulk upload
* csv upload validates first form section
* add postcode validation
* Refactor error mappings for row parser
* Add unittype question
* Fix null error setting and add builtype
* add wchair to bulk upload
* Add beds to bulk upload
* Add joint to bulk upload
* Add startertenancy to the bulk upload
* Add tenancy for bulk upload
* Add declaration to the bulk upload
* Add age1 and age1_known to bulk upload
* add ages to bulk upload
* add sex1 to bulk upload
* add ethnic_group and ethnic to bulk upload
* add national to bulk upload
* add ecstat1 to bulk upload
* add military related fields to bulk upload
* add preg_occ to bulk upload
* add housingneeds to bulk upload
* add illness to bulk upload
* add layear to bulk upload
* add waityear to bulk upload
* add reason to bulk upload
* add prevten to bulk upload
* add homeless to bulk upload
* add previous postcode to bulk upload
* add reasonable preferences to bulk upload
* add allocations system to bulk upload
* add referral to bulk upload
* add net_income_known to bulk upload
* add hb to bulk upload
* add benefits to bulk upload
* add rent fields to bulk upload
* add hhmemb to bulk upload
* use 2022 csv fixtures for bulk upload
* fix renewal mapping for bulk upload
* placeholder test for bulk upload validation
* fix bulk upload mapping for homeless field
* fix leftreg mapping for bulk upload
* fix user associations in bulk upload tests
* add gender fields for bulk upload
* add ecstatN fields to bulk upload
* add #relatN fields to bulk upload
* extract old_visible_id in factory to trait
* map net_income_known correctly for bulk upload
* fix income bugs for bulk upload
* add unitletas to bulk upload
* add #rsnvac to bulk upload
* add #sheltered to bulk upload
* add illness fields to bulk upload
* add #irproduct_other to bulk upload
* infer renewal from rsnvac for bulk upload
* add #tenancyother to bulk upload
* add #tenancylength to bulk upload
* bulk upload earnings accepts pennies but rounds
* add #reasonother to bulk upload
* fix mapping of #ppcodenk for bulk upload
* add #household_charge to bulk upload
* add #chcharge to bulk upload
* add #tcharge to bulk upload
* add #supcharg to bulk upload
* add pscharge to bulk upload
* add #scharge to bulk upload
* use case statement for bulk upload allocation
* add offered to bulk upload
* add propcode to bulk upload
* add major repair fields to bulk upload
* add #voiddate to bulk upload
* support YY year format for bulk upload
* test postcode strips whitespace for bulk upload
* add #la to bulk upload
* add previous la to bulk upload
* fix failing test
* remove duplicate line from rebase
* add first time social housing to bulk upload
* make methods private
* fix field_4 validation for bulk upload
- the null check was inverted by mistake
Co-authored-by: Kat <katrina@kosiak.co.uk>
* lockdown bulk upload routes
* able to view lettings bulk upload errors
* add error count to bulk upload results
* coverage for bulk upload filename on results
* group bulk upload errors by row on results
* able to view bulk upload sales results
* scope lettings and sales bulk upload results
* fix linting
* call service correctly in test
* add bulk upload sales questions mapping
* appease linter
* bulk upload error shows correct question
- depending on log type it will show the relevant question for the field concerned
* use local disk for bulk upload for dev env
- this saves the need to connect to S3 when playing with bulk upload in the dev environment
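A hedged sketch of the environment switch, with hypothetical service class names: development reads and writes bulk upload files on local disk, while other environments keep using S3.

```ruby
# Hypothetical sketch: use local disk storage in development so bulk upload
# can be exercised without S3 credentials; keep S3 elsewhere. Class names
# are assumptions, not the app's actual services.
def storage_service
  if Rails.env.development?
    LocalDiskStorageService.new(File.join("tmp", "storage"))
  else
    S3StorageService.new(bucket: ENV["BULK_UPLOAD_BUCKET"])
  end
end
```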
* improve namespacing of classes
* add job to process bulk uploads
* use local disk storage for dev file upload
* fix test
* use inline active job queue_adapter for dev
* use test active job queue adapter for test env
* remove rubocop violation
* delete bulk upload from disk after processing
* populate errors with cell, row + metadata
* update error message with something meaningful
* shim in sales validator
* able to parse sales bulk uploads
* change migration to add purchase_code to errors
* bulk upload error component renders purchaser code
- when a sales log
* populate purchaser_code for bulk upload errors
- when log type is sales
* remove superfluous private method