* feat: match email regardless of casing in bulk upload
* feat: allow null emails
* feat: set encoding by bom for file IO before parsing
* Revert "feat: set encoding by bom for file IO before parsing"
This reverts commit 03d2d17e7b.
* CLDC-2494: WIP
* CLDC-2494: wip
* CLDC-2494: page work in progress
* cleanup
* Add a path for duplicate logs
* Display all duplicate logs
* Move a test
* Display duplicate check answers for logs
* Add buttons to delete duplicates
* Add a route for sales logs duplicates
* Update duplicated page to work for sales logs
* Update styling
* lint
* Add auth
* Rebase updates
* Remove propcode from deduplication checks
* Update fields to work with supported housing
* Trigger duplicate log check on buyer 1 age not known
* compare correct charges
* Update displayed questions
* BU test
* Put redirect to duplicate logs path behind a feature flag
* More BU tests
---------
Co-authored-by: Kat <katrina@kosiak.co.uk>
* Validate that correct template for the year is used for sales (with headers)
* Check the correct template is being used without headers
* Check correct template for lettings
* Update csv parser on sales
* Remove redundant methods
* Extract form years
* Reverse year check method
# Context
- https://digital.dclg.gov.uk/jira/browse/CLDC-2344
- The reported bug and these changes do not match 100%
- The reported bug should have already been resolved by an earlier change, as the error message that was reported has already been removed from the system
- These changes remove the check from sales, plus a few minor tidy-ups
# Changes
- Remove dead code around error threshold which is no longer used or needed
- Inverted a negative predicate to a positive one, `incorrect_field_count` => `correct_field_count`
- Remove `validate_min_columns` from sales bulk upload
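As an illustration, the inverted predicate might look something like this (a hedged sketch; the counting logic shown here is assumed, not the parser's actual implementation):

```ruby
# Hypothetical sketch: a positive predicate reads more naturally at call sites
# than negating `incorrect_field_count`.
def correct_field_count
  fields.count == expected_field_count
end

# Callers then guard with the positive form:
#   return unless correct_field_count
```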
# Context
- https://digital.dclg.gov.uk/jira/browse/CLDC-2316
- Implement bulk upload sales for new collection year 2023
- This is a first-pass implementation; it will probably have some bugs, which we can address over time
# Changes
- Add CSV parser for sales 2023 to handle CSV structure
- Tweak collection window validation so the error is now contextual to the year selected for upload
- Handle arbitrary ordering of CSV columns
- Fix ordering of errors in the report; they are now ordered by cell
- Added an `Upload your file again` link styled as a button on the error report to match the lettings experience
- Update tooling to convert logs to 2023 csv rows with support for random column ordering
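To illustrate the arbitrary column ordering, a minimal sketch (assuming headers are present; the field labels here are invented):

```ruby
require "csv"

# Hypothetical sketch: parsing with headers lets each cell be looked up by its
# field label rather than its position, so the column order no longer matters.
file_contents = "field_2,field_1\nB,A\n"
table = CSV.parse(file_contents, headers: true)

table.each do |row|
  row["field_1"] # => "A", wherever that column appears in the file
  row["field_2"] # => "B"
end
```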
# Known issues
- There seem to be some issues with how UPRN is handled if the UPRN cannot be validated.
- For the above, I think there is a dependency on https://github.com/communitiesuk/submit-social-housing-lettings-and-sales-data/pull/1570, which should clear any errored fields so users can continue to create logs and fix them within the service
# Context
- https://digital.dclg.gov.uk/jira/browse/CLDC-2287
- bulk upload errors on setup fields are not being categorised as setup errors
# Changes
- ensure errors on setup fields are categorised as `setup` errors
- ordering of validations tweaked, so `validate_nulls` is further down the execution order. This is so any existing validations run first and `validate_nulls` only adds errors if there aren't any existing errors on a field
- removed old 16/60% tests, which are no longer required due to the introduction of the `how to fix` journey
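A rough sketch of the reordered `validate_nulls` behaviour described above (the field list and error API details are assumed):

```ruby
# Hypothetical sketch: because validate_nulls now runs after the other
# validations, it only adds a blank-field error when nothing else has
# already flagged the field.
def validate_nulls
  required_field_names.each do |field|
    next if errors[field].present?      # an earlier validation already added an error
    next if public_send(field).present? # the field has a value, nothing to report

    errors.add(field, :blank)
  end
end
```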
* Add affected_question_ids to pregnancy check
* Update is_referrer_interruption_screen? check and naming
* Use interruption_screen_question_ids to set soft validation errors on relevant fields
* Add soft validations to sales bulk upload
* Add soft validations to lettings logs 23/24 bulk upload
* Add errors for optional soft validations
* Only add soft validations once
* Import helper methods
* Update test based on new validation messages
* Rebase fix
# Context
- https://digital.dclg.gov.uk/jira/browse/CLDC-1888
- This is a continuation of https://github.com/communitiesuk/submit-social-housing-lettings-and-sales-data/pull/1277
- When bulk uploading we want to check users are not uploading data that already exists, to prevent them submitting duplicates
# Changes
- This feature is behind a feature toggle. It has been disabled on staging for testing purposes but is available in all other environments
- If a log already exists based on certain fields, add errors to the associated fields
- We discount any hidden logs and only check "active" logs
- Added memoization to `#valid?` as an optimisation
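The memoization mentioned above could look roughly like this (a sketch, not the actual row parser code):

```ruby
# Hypothetical sketch: cache the result of the first validation run so that
# repeated #valid? calls during a bulk upload don't repeat the work.
def valid?
  return @valid if defined?(@valid)

  @valid = super
end
```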
* feat: add validation for number of field labels
* feat: check field labels are numbers
* feat: remove validations for number of columns
* refactor: rename non_blank_fields_count to valid_field_numbers_count
* test: add functions to generate custom field labels/values
* test: that extra invalid field labels don't cause issues
* test: that removing a valid field label reduces count by 1
* test: that adding a valid field label increases count by 1
* refactor: rename validate_fields_count and wrong_field_count
* feat: add validation that max col count not exceeded when no headers
* fix: convert numbers to strings in default_field_numbers
* feat: add leniency to max cols count (in case of 1 extra col)
* test: explicitly set year of bulk upload to be 2022
* test: add/update tests in validator_spec
* chore: lint
* test: remove tests from csv_parser_spec that were moved to validator_spec
* feat: update 2022 csv_parser to work with new validations
* refactor: define number of valid 2022 fields in one place
* refactor: remove redundant headers definition
* feat: update 2022 csv parser to have col flexibility like 2023
* test: for validator 2022 as well as 2023
* test: simplify 2022/2023 logic and improve layout
* chore: lint
* test: ensure context descriptions start with "when"
* refactor: check fields/columns count within csv parser, not validator
* test: update 2022 csv parser tests to work like 2023
* chore: lint
* feat: add validation to check UPRN exists if address doesn't
* test: add tests for UPRN/address fields
* test: make UPRN<=12 chars test more specific
* feat: show qu header on BU error template if check_answer_label missing
* test: that validate_nulls gets header if check_answer_label is missing
* fix: drop "known" from "You must answer UPRN known"
* fix: update uprn validation logic
* test: make UPRN being missing explicit
* test: improve test descriptions
* test: that errors added to address fields too when uprn & address fields all null
* refactor: standardise check for presence of needs type
* test: attempt to stub setup question to have nil check_answer_label
* Revert "test: attempt to stub setup question to have nil check_answer_label"
This reverts commit f5b3f6179b.
* test: put test within context block for clarity
* test: fix typo in test description
* test: ensure one other non-blank field exists in row
* feat: output "this question" if qu has no header or check_answer_label
* chore: lint
* fix: add .presence to question.header and remove & before .downcase
* fix: add :after_log context to uprn validation
# Context
- https://digital.dclg.gov.uk/jira/browse/CLDC-2235
- Support invalid options for bulk upload. One use case is when a user uploading for a new collection year supplies a value that was only valid in the previous collection year
# Changes
- this validation works before `log.valid?` clears any fields
- as a result there is the potential to get 2 errors on a field when blanking and an invalid option occur together
- bulk upload validations are now split so that they run either before or after `log.valid?`. This is because `log.valid?` heavily mutates the `log` object, so we want to validate both before and after the data mutates, depending on what needs to be checked
- errors must be duplicated and merged as calling `valid?` clears any existing errors on the object
- all validations are assigned a specific context, otherwise they would be added to the default context and would also run whenever a context is given
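The before/after split maps naturally onto ActiveModel validation contexts, roughly like this (a sketch with invented field and method names; the real row parser classes will differ):

```ruby
require "active_model"

# Hypothetical sketch: every validation is given an explicit context so it only
# runs in its intended phase and never in the default context.
class RowParser
  include ActiveModel::Model

  attr_accessor :field_1

  validate :validate_valid_option, on: :before_log # before log.valid? mutates the data
  validate :validate_nulls,        on: :after_log  # after log.valid? has run

  def run_validations
    valid?(:before_log)
    before_errors = errors.dup                          # valid? clears existing errors
    # ... build the log and call log.valid? here ...
    valid?(:after_log)
    before_errors.each { |error| errors.import(error) } # merge both phases
  end

  private

  def validate_valid_option
    errors.add(:field_1, "is not a valid option") if field_1 == "99"
  end

  def validate_nulls
    errors.add(:field_1, "cannot be blank") if field_1.blank?
  end
end
```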
# Context
- https://digital.dclg.gov.uk/jira/browse/CLDC-2223
- Change how `renewal` works for bulk upload in 2023 compared to 2022
# Changes
- `renewal` is mandatory and must be filled in, otherwise there is an error
- it is no longer inferred from another field; we just use the value supplied by the user
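In other words, the 2023 behaviour is a straightforward required-field check rather than an inference (field name invented for illustration):

```ruby
# Hypothetical sketch: the renewal answer must come from the user; it is no
# longer derived from another field.
def validate_renewal_answered
  errors.add(:field_renewal, "You must answer whether this is a renewal") if field_renewal.blank?
end
```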
# Context
- https://digital.dclg.gov.uk/jira/browse/CLDC-2267
- Consider errors on owning and managing org fields as setup errors
- Otherwise the user is not shown the summary report to highlight these as blocking issues to resolve
# Changes
- if there are problems with the owning or managing org fields, these are now considered setup errors
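A sketch of the categorisation (constant and field names assumed for illustration):

```ruby
# Hypothetical sketch: errors on the owning/managing organisation fields are
# treated as setup errors so they appear on the blocking-issues summary report.
SETUP_FIELDS = %i[field_owning_org field_managing_org].freeze

def error_category(field)
  SETUP_FIELDS.include?(field) ? :setup : :not_setup
end
```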
* add first page for bulk upload resume journey
* bulk upload resume handles upload again
* add confirm page to bulk upload resume journey
* replace placeholder count with correct value
* apply recommendation for bulk upload resume choice
* add how to fix bulk upload mailer
* integrate new bulk upload approve journey
* add missing bulk upload error mappings
* remove test
* prevent approve being called multiple times
* bulk upload creates invisible logs ahead of time
* work invisible logs into bulk upload flow
* sort errors so deterministic
* remove unused ensure
* remove expected_log_count and processed
- these fields are no longer used or needed
* introduce pending status
* swap visible for pending logs
* only show visible lettings logs
* hard code status filters
* remove unused model methods
* only show visible sales logs
* form controller ignores hidden logs
* locations and schemes only affect visible logs