* Save total logs count for bulk upload
* Update content of fix choice and confirm pages
* update bulk upload banner
* Rebase changes
* Extract fix choice summary into method
* Add comma
* some minor refactoring
remove methods from child class that replicate methods on the parent class
tidy up check for nil
remove gubbins and inline method body given it is only used once
* update import services for lettings and sales to import creation method
write tests to cover this
* create sales log field import service and associated spec file, with methods and tests for importing the creation method of logs that have already been imported
* update lettings log field import service and related spec to allow importing creation method of logs
* use the methods dynamically created by Active Record in all relevant places, removing obsolete methods in the process.
various tests tweaked to support this change.
rake task from another ticket folded into this ticket to prevent merge conflicts
* rename method for ruby conventions
* update PR for altered spec
upload id now considered a better indicator of bulk upload status; import service amended accordingly
tests updated in line with this
* update field import services in line with import services to use upload id rather than upload method as the source of truth for how a log was created
* slight refactor to reduce nesting and dodge linter complaints
* minor amendment to log creator spec in bulk upload to use enum dynamic methods
* form handler to return all questions from lettings forms for all years with ordering interleaved
functionality and tests
* refactor lettings log csv service and all associated tests
remove methods on log models when we can call them directly on associated models
update job to call the service directly with the collection of logs
minor modifications to the sales log csv service
update many test files to test the appropriate logic in the appropriate place
* tidying
final amendments to tests
remove commented code
rename variable
* change the position of the rent value check field in the headers
* CLDC-2492 add creation method field to logs (#1738)
* create migrations to add creation method fields to both log types
* add enum definition to logs for creation method
* update csv export services to retrieve creation method values directly from the log, remove methods previously used from the log model
* run migrations to update schema
* ensure that logs created via bulk upload have this set correctly when created
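A minimal sketch of how the creation method enum above might look on the log models; the enum values shown here are assumptions for illustration, not taken from the actual migration.

```ruby
# Hypothetical sketch only: the enum values are assumptions.
class LettingsLog < ApplicationRecord
  # Generates helpers such as `single_log?`, `bulk_upload?` and `bulk_upload!`,
  # which the CSV export services and specs can call directly.
  enum creation_method: { single_log: 1, bulk_upload: 2 }
end

# A log built by the bulk upload flow can then record how it was created:
log = LettingsLog.new
log.creation_method = "bulk_upload"
log.bulk_upload? # => true
```

Reading the value directly from the log is what lets the export services drop the helper methods mentioned above.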
* Rename end_date to new_logs_end_date
* Display change buttons in CYA if the collection is still open for editing
* Allow navigating to question if the collection is still open for editing
* Allow logs to be edited if the date is before edit end date
* Update sales validation to allow editing existing logs
* update tests
* Update edit_end_date
* Update some test wording
* Update new logs end date
* tests
* feat: setup to replicate failures to fix
* feat: wip test fixes
* feat: remainder of current state test fixes
* feat: set form date to past to trigger errors to fix
* feat: revert, don't want this in final diff
* feat: fix row_parser tests
* feat: sales log importer and validator fixes
* feat: remainder of test fixes
* refactor: lint
* Validate that correct template for the year is used for sales (with headers)
* Check the correct template is being used without headers
* Check correct template for lettings
* Update csv parser on sales
* Remove redundant methods
* Extract form years
* Reverse year check method
# Context
- https://digital.dclg.gov.uk/jira/browse/CLDC-2305
- This is rework to add missing validations for sales setup fields for bulk upload
# Changes
- There is code that clears log fields if they are not a valid option. In order to generate errors for these, we check certain fields for content prior to `log.valid?` being called (see the sketch after this list). Unfortunately there does not appear to be an easy way to access the valid options, so these values have been hard-coded
- Privacy notice must be accepted, otherwise it is considered a setup error
- The `type` question can actually be one of three questions with the same identifier. These are excluded as part of `validate_valid_radio_option` and instead validated with a different mechanism, the rationale being that `log.valid?` will clear data before we get there
- Remove tests associated with the upload threshold, which are no longer required
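A minimal sketch of the pre-`log.valid?` check described above; the field names and hard-coded option lists are assumptions, not the real template values.

```ruby
# Hypothetical sketch: field names and option lists are illustrative only.
def validate_valid_radio_option
  hard_coded_options = {
    field_8: %w[1 2 3],
    field_12: %w[1 2],
  }

  hard_coded_options.each do |field, options|
    value = public_send(field)
    next if value.blank? || options.include?(value.to_s)

    # Add the error here, before `log.valid?` gets a chance to clear the value.
    errors.add(field, "Enter a valid value for #{field}")
  end
end
```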
* Do not allow 32 as an option for 22/23 sale type
* Correctly block log creation
* refactor test
* lint
* Add error to the correct field
* Add error as part of validating radio options
* Update row parser null check
* add a bespoke validation to the row parser
when buyer 1 is uploaded with a working situation of child, this should be validated with a custom message (see the sketch below).
a test for this case was also created
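A self-contained sketch of what such a bespoke row parser validation could look like; the field number, the numeric code for "child" and the message wording are all assumptions.

```ruby
require "active_model"

# Hypothetical sketch: field_35 and the code 9 for "child under 16" are assumed.
class SalesRowParserSketch
  include ActiveModel::Model
  include ActiveModel::Attributes

  attribute :field_35, :integer # buyer 1 working situation

  validate :validate_buyer_1_economic_status

  private

  def validate_buyer_1_economic_status
    return unless field_35 == 9 # assumed code for "child under 16"

    errors.add(:field_35, "Buyer 1 cannot have a working situation of child under 16")
  end
end

parser = SalesRowParserSketch.new(field_35: 9)
parser.valid? # => false, with the custom message attached to field_35
```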
* replicate the work from the last commit for the 2023 row parser and associated test file
* lint correction: remove blank lines
* Send the correct email if there are only soft validations
* Add soft errors valid page
* Add soft errors confirm page
* Confirm the soft validations
* Reuse the how to fix template for check soft validations email
* Update email link
* Move soft validation confirmation to processor
* Correctly set the log status, remove redundant confirm_soft_validations
* Redirect successful upload to logs index and display a success banner
* Implement the soft validations only journey for sales logs
* Display the soft validation errors on the soft-errors-valid page
* Fix page alignment
* Fix tests by mapping housingneeds in csv helper
* Add the sales soft validations to unpend_and_confirm_soft_validations
* Change naming
* Update method names
* refactor
* undo typo
* Only set the existing soft validations for correct types
* Fix path name
* Add missing error mappings for location fields
* Add missing tests and cancel button
* Change some wording
* Typos
# Context
- https://digital.dclg.gov.uk/jira/browse/CLDC-2344
- So the reported bug and these changes do not match 100%
- The reported bug should have already been resolved by an earlier change, as the error message that was reported has already been removed from the system
- These changes remove the check from sales, plus a few minor tidy-ups
# Changes
- Remove dead code around the error threshold, which is no longer used or needed
- Inverted negative predicate to positive one, `incorrect_field_count` => `correct_field_count`
- Remove `validate_min_columns` from sales bulk upload
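A tiny sketch of the inverted predicate, assuming the row is an array of cells and an illustrative expected column count.

```ruby
# Hypothetical sketch: the expected column count is an assumption.
EXPECTED_FIELD_COUNT = 134

def correct_field_count(row)
  row.size == EXPECTED_FIELD_COUNT
end

# Callers now read positively:
# process(row) if correct_field_count(row)
```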
* feat: wip blank fields and dependent fields on upload to see if valid and can upload with missing info - this is not the exact AC on the ticket yet
* feat: update seed
* feat: wip commit
* feat: add postcodenk error so it can be cleared on validation
* feat: add postcode validation back
* feat: move la vals to shared and move and add tests
* feat: add correct pluralisation to warning message
* feat: add blank compound invalid fields methods
* feat: update validations
* feat: update pluralisation
* refactor: lint
* feat: clear errors associated with blanked values so log status is set correctly on creation
* feat: validate instead
* feat: avoid duplicated errors
* feat: don't auto-refuse income, different to imports
* feat: validate after every blank method
* feat: delete la validator spec
* tests: update
* refactor: erblinting
* refactor: cleanup
* refactor: move pluralizer to helper
* feat: copy update
* feat: rename
* feat: refactor to avoid redundant re-validations and test
* refactor: rubocop
* test: update
* test: update
* test: update
* update sidekiq
* feat: clear errors
* feat: run clearing twice in case first clear creates different errors
* feat: remove moved file
* feat: undo validation file tweaks as shared/specific overlap could do with a more general refactor
* feat: update tests
# Context
- https://digital.dclg.gov.uk/jira/browse/CLDC-2316
- Implement bulk upload sales for new collection year 2023
- This is a first pass implementation and will probably have some bugs in it, which we can address over time
# Changes
- Add CSV parser for sales 2023 to handle CSV structure
- Tweak collection window validation so error now contextual to year selected for upload
- Handle arbitrary ordering of CSV columns
- Fix ordering of errors in the report so they are now ordered by cell
- Added `Upload your file again` link styled as button on error report to match lettings experience
- Update tooling to convert logs to 2023 csv rows with support for random column ordering
# Known issues
- There seem to be some issues with how UPRN is handled if the UPRN cannot be validated.
- For the above I think there is a dependency on https://github.com/communitiesuk/submit-social-housing-lettings-and-sales-data/pull/1570 which should clear any errored fields so users can continue to create logs and fix them within the service
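A rough sketch of the arbitrary column ordering approach mentioned in the changes above, assuming headers are named `field_n`; the header detection and file handling here are simplified assumptions.

```ruby
require "csv"

# Hypothetical sketch: build a header -> index lookup and read cells by
# header name so the column order in the uploaded file no longer matters.
rows = CSV.parse(File.read("bulk-upload.csv"))

header_index = rows.index { |row| row.any? { |cell| cell.to_s.match?(/\Afield_\d+\z/) } }
column_for = rows[header_index].each_with_index.to_h

rows[(header_index + 1)..].each do |row|
  field_4 = row[column_for["field_4"]] # e.g. sale completion year
  # ...remaining fields are looked up the same way
end
```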
# Context
- https://digital.dclg.gov.uk/jira/browse/CLDC-2287
- bulk upload errors on setup fields are not being categorised as setup errors
# Changes
- ensure errors on setup field are categorised as `setup` errors
- ordering of validations tweaked so `validate_nulls` is further down the execution order. This is so any existing validations run first and `validate_nulls` only adds errors if there aren't any existing errors on a field (see the sketch below)
- removed old 16/60% tests, which are no longer required due to the introduction of the `how to fix` journey
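A self-contained sketch of the validation ordering described above, with illustrative field names: `validate_nulls` is registered last and skips any field that already has an error.

```ruby
require "active_model"
require "active_support/core_ext/object/blank"

class OrderingSketch
  include ActiveModel::Model

  attr_accessor :field_1, :field_4

  validate :validate_field_4
  validate :validate_nulls # deliberately registered after the other validations

  private

  def validate_field_4
    return if field_4.blank?

    errors.add(:field_4, :invalid) unless %w[1 2 3].include?(field_4.to_s)
  end

  def validate_nulls
    %i[field_1 field_4].each do |field|
      next if errors.include?(field) # an existing error on the field wins

      errors.add(field, :blank) if public_send(field).blank?
    end
  end
end
```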
* Add affected_question_ids to pregnancy check
* Update is_referrer_interruption_screen? check and naming
* Use interruption_screen_question_ids to set soft validation errors on relevant fields
* Add soft validations to sales bulk upload
* Add soft validations to lettings logs 23/24 bulk upload
* Add errors for optional soft validations
* Only add soft validations once
* Import helper methods
* Update test based on new validation messages
* Rebase fix
# Context
- Partially related to https://digital.dclg.gov.uk/jira/browse/CLDC-2316
- Comprehending a sales or lettings bulk upload CSV is hard enough on its own; comprehending both simultaneously is rather challenging
# Changes
- Split out test helper class by log type, i.e. lettings/sales
* refactor questions from validator to row parser
* able to switch between bulk upload parsers
- depending on what year we are processing
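A minimal sketch of the year-based switch, assuming the year-namespaced parser constants described in the following commits; the constant names and years are illustrative.

```ruby
# Hypothetical sketch: constant names and years are assumptions.
def row_parser_class
  case bulk_upload.year
  when 2022 then BulkUpload::Lettings::Year2022::RowParser
  when 2023 then BulkUpload::Lettings::Year2023::RowParser
  else raise ArgumentError, "Unsupported bulk upload year: #{bulk_upload.year}"
  end
end

# Each CSV row is then wrapped with whichever parser matches the upload's year.
```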
* spec tooling to support bulk upload multi year
* row parser now has year in namespacing
* add static data for 2023 row parser
* add placeholder to log to csv for specs
* bulk upload arbitrary 23/24 column ordering works
* bulk upload supports 23/24 without headers
* bulk upload 23/24 supports BOM + invalid chars
* dupe tests
* port 23/24 attributes_for_log
* port 23/24 bulk upload validations
* force crossover period
* tweak max permitted columns
* able to return column for given field
* work out column for field for errors
* add field_4 as 23/24 setup field
* remove duplicate method
* map schemes and locations correctly
* handle arbitrary number of header rows
* add missing fields to bulk upload support
* lockdown bulk upload routes
* able to view lettings bulk upload errors
* add error count to bulk upload results
* coverage for bulk upload filename on results
* group bulk upload errors by row on results
* able to view bulk upload sales results
* scope lettings and sales bulk upload results
* fix linting
* call service correctly in test
* add bulk upload sales questions mapping
* appease linter
* bulk upload error shows correct question
- depending on the log type it will show the relevant question for the field concerned
* use local disk for bulk upload for dev env
- this saves the need to connect to S3 to play with bulk upload in the dev environment
* improve namespacing of classes
* add job to process bulk uploads
* use local disk storage for dev file upload
* fix test
* use inline active job queue_adapter for dev
* use test active job queue adapter for test env
* remove rubocop violation
* delete bulk upload from disk after processing
* populate errors with cell, row + metadata
* update error message with something meaningful
* shim in sales validator
* able to parse sales bulk uploads
* change migration to add purchase_code to errors
* bulk upload error component renders purchaser code
- when a sales log
* populate purchaser_code for bulk upload errors
- when log type is sales
* remove superfluous private method