* Empty commit
* feat: base lettings behaviour
* feat: formatting improvements
* Empty commit
* Empty commit
* Empty commit
* refactor: lint
* feat: more page styling
* feat: update tests
* refactor: lint
* feat: add requests spec
* refactor: simplify
* feat: add tests
* feat: copy change to reflect that the count in the header is not necessarily as large as the errors shown (it excludes duplicate errors on fields, and not_answered errors)
* feat: copy changes to sales as well
* feat: remove LegacyBulkUpload MVC
* feat: fix typo
* feat: update tests
* feat: mark additional not_answered sales q
* feat: update routing and only display warning text when errors will be cleared
* feat: update copy and error rows
* refactor: lint
* feat: don't show soft validations in deletion report
* feat: update routing so deletion report is not shown once logs uploaded
* feat: update tests
* feat: make unique count track repeated errors across rows
* refactor: lint
* feat: match email regardless of casing in bulk upload (see the sketch after this commit list)
* feat: allow null emails
* feat: set encoding by bom for file IO before parsing
* Revert "feat: set encoding by bom for file IO before parsing"
This reverts commit 03d2d17e7b.
* CLDC-2494: WIP
* CLDC-2494: wip
* CLDC-2494: page work in progress
* cleanup
* Add a path for duplicate logs
* Display all duplicate logs
* Move a test
* Display duplicate check answers for logs
* Add buttons to delete duplicates
* Add a route for sales logs duplicates
* Update duplicated page to work for sales logs
* Update styling
* lint
* Add auth
* Rebase updates
* Remove propcode from deduplication checks
* Update fields to work with supported housing
* Trigger duplicate log check on buyer 1 age not known
* compare correct charges
* Update displayed questions
* BU test
* Put redirect to duplicate logs path behind a feature flag
* More BU tests
---------
Co-authored-by: Kat <katrina@kosiak.co.uk>
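For illustration, a minimal sketch of the case-insensitive email matching mentioned in the commit list above; the `User` model, its `email` column, and the method name are assumptions rather than the repository's actual code:

```ruby
# Hypothetical sketch: look up a user by email while ignoring case.
# Nil/blank emails are allowed per the commits above, so guard first.
def find_user_by_email(email)
  return nil if email.blank?

  User.find_by("LOWER(email) = ?", email.strip.downcase)
end
```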
# Context
- https://digital.dclg.gov.uk/jira/browse/CLDC-2316
- Implement bulk upload sales for new collection year 2023
- This is a first-pass implementation and will probably have some bugs in it, which we can address over time
# Changes
- Add CSV parser for sales 2023 to handle CSV structure
- Tweak collection window validation so the error is now contextual to the year selected for upload
- Handle arbitrary ordering of CSV columns (see the sketch after this list)
- Fix ordering of errors in the report; they are now ordered by cell
- Add `Upload your file again` link styled as a button on the error report to match the lettings experience
- Update tooling to convert logs to 2023 csv rows with support for random column ordering
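To illustrate the arbitrary column ordering point above, a rough sketch (not the actual parser) that maps headers to positions once and then reads each row by column name; the file name and `field_*` labels are placeholders:

```ruby
require "csv"

# Hypothetical sketch: build a header-to-index map so values can be read by
# column name regardless of the order in which columns appear in the upload.
rows = CSV.read("sales_2023_upload.csv")
headers = rows.first.map { |h| h.to_s.strip.downcase }
column_index = headers.each_with_index.to_h

rows.drop(1).each do |row|
  purchase_price = row[column_index["field_1"]] # placeholder column labels
  sale_date      = row[column_index["field_2"]]
  # ...map the remaining columns by name in the same way
end
```

With a cell reference generated per value, ordering the error report by cell is then a straightforward sort over those references.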
# Known issues
- There seem to be some issues with how UPRN is handled if the UPRN cannot be validated.
- For the above, I think there is a dependency on https://github.com/communitiesuk/submit-social-housing-lettings-and-sales-data/pull/1570, which should clear any errored fields so users can continue to create logs and fix them within the service
# Context
- https://digital.dclg.gov.uk/jira/browse/CLDC-2287
- bulk upload errors on setup fields are not being categorised as setup errors
# Changes
- ensure errors on setup fields are categorised as `setup` errors
- ordering of validations tweaked so `validate_nulls` is further down the execution order; this means any existing validations run first and `validate_nulls` only adds errors if there aren't already errors on a field (see the sketch after this list)
- removed old 16/60% tests that are no longer required due to the introduction of the `how to fix` journey
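A simplified sketch of the `validate_nulls` ordering described above, assuming ActiveModel-style validations; `required_fields` and the message text are illustrative:

```ruby
# Hypothetical sketch: because this runs after the other validations, a blank
# field only gets a generic "not answered" error when nothing more specific
# has already been attached to it.
def validate_nulls
  required_fields.each do |field|
    next if public_send(field).present?
    next if errors[field].any? # an earlier validation already flagged this field

    errors.add(field, "You must answer #{field}")
  end
end
```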
* Add affected_question_ids to pregnancy check
* Update is_referrer_interruption_screen? check and naming
* Use interruption_screen_question_ids to set soft validation errors on relevant fields (see the sketch after this list)
* Add soft validations to sales bulk upload
* Add soft validations to lettings logs 23/24 bulk upload
* Add errors for optional soft validations
* Only add soft validations once
* Import helper methods
* Update test based on new validation messages
* Rebase fix
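A loose sketch of how the soft validation commits above fit together; `soft_validations`, `trigger_method`, `field_mapping`, and the `category:` option are illustrative assumptions rather than the repository's actual API:

```ruby
# Hypothetical sketch: when a log would show an interruption screen in the UI,
# add an error to every bulk upload column mapped from that screen's question
# ids, and only add each soft validation once.
def add_soft_validation_errors
  @added_soft_validations ||= []

  soft_validations.each do |validation|
    next unless log.public_send(validation.trigger_method) # e.g. the pregnancy check fired
    next if @added_soft_validations.include?(validation.id)

    validation.interruption_screen_question_ids.each do |question_id|
      field = field_mapping.fetch(question_id) # e.g. an age question => :field_46
      errors.add(field, validation.message, category: :soft_validation)
    end
    @added_soft_validations << validation.id
  end
end
```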
# Context
- https://digital.dclg.gov.uk/jira/browse/CLDC-1888
- This is a continuation of https://github.com/communitiesuk/submit-social-housing-lettings-and-sales-data/pull/1277
- When bulk uploading we want to check users are not uploading data that already exists, to prevent them submitting duplicates
# Changes
- This feature is behind a feature toggle. It has been disabled on staging for testing purposes but is available in all other environments
- If a log already exists based on certain fields, add errors to the associated fields (sketched after this list)
- We discount any hidden logs and only check "active" logs
- Added memoization to `#valid?` as an optimisation
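A rough sketch of the duplicate check and the `#valid?` memoisation; `LettingsLog`, the `visible` scope, and the matching fields are assumptions standing in for the real implementation:

```ruby
# Hypothetical sketch: treat a row as a duplicate when a visible ("active")
# log already exists with the same identifying fields, then add an error to
# each field involved in the match. `visible` stands in for however hidden
# logs are excluded in the real code.
DUPLICATE_FIELDS = %i[owning_organisation_id startdate tenancycode postcode_full age1 sex1].freeze

def add_duplicate_errors(log)
  duplicates = LettingsLog.visible
                          .where(log.attributes.slice(*DUPLICATE_FIELDS.map(&:to_s)))
                          .where.not(id: log.id)
  return unless duplicates.exists?

  DUPLICATE_FIELDS.each { |field| errors.add(field, "This is a duplicate log") }
end

# Memoise #valid? so repeated calls don't rerun every validation; a plain
# `@valid ||= super` would rerun whenever the result was false.
def valid?
  return @valid unless @valid.nil?

  @valid = super
end
```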
* feat: add validation to check UPRN exists if address doesn't (sketched after this commit list)
* test: add tests for UPRN/address fields
* test: make UPRN<=12 chars test more specific
* feat: show qu header on BU error template if check_answer_label missing
* test: that validate_nulls gets header if check_answer_label is missing
* fix: drop "known" from "You must answer UPRN known"
* fix: update uprn validation logic
* test: make UPRN being missing explicit
* test: improve test descriptions
* test: that errors added to address fields too when uprn & address fields all null
* refactor: standardise check for presence of needs type
* test: attempt to stub setup question to have nil check_answer_label
* Revert "test: attempt to stub setup question to have nil check_answer_label"
This reverts commit f5b3f6179b.
* test: put test within context block for clarity
* test: fix typo in test description
* test: ensure one other non-blank field exists in row
* feat: output "this question" if qu has no header or check_answer_label
* chore: lint
* fix: add .presence to question.header and remove & before .downcase
* fix: add :after_log context to uprn validation
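A condensed sketch of the UPRN/address rule those commits describe; the `field_*` numbers, method names, and error copy are illustrative:

```ruby
ADDRESS_FIELDS = %i[field_19 field_20 field_21 field_22].freeze # placeholder column names

# Hypothetical sketch: if neither a UPRN nor any part of the address is
# supplied, add an error to the UPRN column and to every address column.
# Per the last commit above, this runs in the :after_log context, e.g.
#   validate :validate_uprn_or_address_present, on: :after_log
def validate_uprn_or_address_present
  return if field_18.present?                                   # field_18 standing in for UPRN
  return if ADDRESS_FIELDS.any? { |f| public_send(f).present? } # some address data supplied

  errors.add(:field_18, "You must answer UPRN")
  ADDRESS_FIELDS.each { |f| errors.add(f, "You must answer the address") }
end

# Error copy fallback when a question has no check_answer_label: prefer the
# label, then the question header, then "this question".
def question_label(question)
  question.check_answer_label.presence || question.header.presence || "this question"
end
```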
# Context
- https://digital.dclg.gov.uk/jira/browse/CLDC-2235
- Support invalid options for bulk upload. One use case is when a user uploading for a new collection year supplies a value that is only valid in the previous collection year
# Changes
- this validation works before `log.valid?` clears any fields
- as a result there is the potential to get 2 errors on a field when blanking and an invalid option occur together
- bulk upload validations are now split so that they run either before or after `log.valid?`; this is because `log.valid?` heavily mutates the `log` object, so we want to validate both before and after the data mutates, depending on what needs to be checked (see the sketch below)
- errors must be duplicated and merged as calling `valid?` clears any existing errors on the object
- all validations are assigned a specific context, otherwise they are added to the default context and will also run whenever a context is given
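A schematic sketch of the two-pass flow described above; the `:after_log` context name appears in earlier commits, the rest of the names are illustrative:

```ruby
# Hypothetical sketch: run the checks that depend on the raw CSV values first,
# let log.valid? mutate/clear fields, then run the checks that need the
# processed log. Errors from the first pass are captured and merged back in,
# because each call to valid? wipes whatever is already on the errors object.
def validate_row(row)
  row.valid?(:before_log)                 # e.g. option not valid for this collection year
  early_errors = row.errors.map { |e| [e.attribute, e.message] }

  row.log.valid?                          # heavily mutates the log (clears bad fields)

  row.valid?(:after_log)                  # checks that rely on the processed log
  early_errors.each { |attribute, message| row.errors.add(attribute, message) }
end
```

Giving every validation an explicit `on:` context keeps it out of the default context, which is what stops it from also running when a specific context is passed.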