* Only process sales logs in sales import job
* Only process lettings logs in lettings import job
* Add sales log import task to the full import
* change review app import bucket name
* Update mortgageused and jointmore
* Update jointmore and ownership scheme
* Fix ownershipsch so that it doesn't override to nil
* Set default relat2, update default household count
* Get ownership from type if not given
* Improve logging
* Remove mortgageused method
* Look at Q16aProplensec2 column if Q16aProplen2 is empty
* Set default income used and pregblank
* Remove fields calculated internally
* Comment out sales import from full import task
* typo and change bucket name
- This validation is new to this service. The old CORE did not do it.
- It was decided to move this to a soft validation in CLDC-2074
- As a temporary fix to allow us to migrate values that are currently incompatible with our validation, we will relax this constraint.
This covers the following errors:
- Where the income is 0, set earnings and income to blank and set incref to refused
- Removing tenancylength and tenancy values where tenancylength is invalid
- Removing prevten and age1 where incompatible
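A minimal sketch of how these fixups might be applied during import; the helper predicates and the "refused" code are assumptions rather than the service's actual implementation:

```ruby
# Illustrative sketch only, not the real import code.
def apply_import_fixups(log)
  # Where the income is 0, blank earnings/income and mark income as refused
  if log.earnings&.zero?
    log.earnings = nil
    log.incref = 1 # assumed code for "refused"
  end

  # Remove tenancylength and tenancy where tenancylength is invalid
  if tenancylength_invalid?(log) # hypothetical predicate
    log.tenancylength = nil
    log.tenancy = nil
  end

  # Remove prevten and age1 where the combination is incompatible
  if prevten_incompatible?(log) # hypothetical predicate
    log.prevten = nil
    log.age1 = nil
  end
end
```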
- We discovered earlier today (thanks Kat) that we had accidentally left out ‘what are the monthly leasehold charges?’ in the ‘outright sales’ section of the 22/23 sales form on old CORE – the very last question on the form.
- We've decided not to add the question back in for 22/23. However, we will add it back in for the upcoming 23/24 forms.
* Create test fixtures
* Add old id column
* Add sales logs importer
* Save and update completed discounted ownership log without postcode
* Update fixtures to be incomplete
* Import a completed shared ownership example
* Test with a non homebuy shared ownership example
* Add privacynotice
* Update hholdcount and confirmed fields
* Add buyer still serving mapping
* Add totadult/totchild and outright sale examples
* Update log fixtures
* Extract shared methods into logs import service and lint
* Add sales logs import rake task
* Update noint, xml examples and some mappings
* Add tests for checking that all required questions are answered
* Update tests, clean up import
* Map mortgage lender and mortgage lender other
* Infer Mscharge known as no for outright sale
* refactor setting default values
* when armedforcesspouse is not answered, set it as don't know
* Refactor tests: change log id names
* set savings to not known if not given
* Refactor tests: change nesting
* Backfill default household characteristics for completed log
* Add more default mapping
* Typo
* Improve logging and refactor tests
* Adjust test data to fit with the mappings that are known so far
* Rename fixture files
- We previously pushed logs into archives categorised by the quarter that they were created for.
- CDS requested that instead we push everything into a larger bucket separated by FY.
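A sketch of the financial-year grouping described above; the prefix format and helper name are illustrative, not the service's actual naming scheme:

```ruby
require "date"

# Hypothetical helper: archive by financial year (April to March)
# rather than by quarter.
def archive_prefix_for(date)
  fy_start = date.month >= 4 ? date.year : date.year - 1
  format("core_%d_%02d", fy_start, (fy_start + 1) % 100)
end

archive_prefix_for(Date.new(2023, 2, 1)) # => "core_2022_23"
```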
* bulk upload considers housing needs fields
* bulk upload only permits one housing needs type
* add bulk upload validation
- no disabled needs cannot be selected in conjunction with a disabled need
* add bulk upload validation
- don't know disabled needs cannot be selected in conjunction with a disabled need
* add bulk upload validation
- no and don't know disabled access needs cannot be selected together
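The three checks above could be expressed roughly as follows; the class and field names are placeholders, not the real bulk upload columns:

```ruby
require "active_model"
require "active_support/core_ext/object/blank"

# Placeholder names; sketch of the intent behind the three housing needs checks.
class HousingNeedsRow
  include ActiveModel::Model

  attr_accessor :field_no_needs, :field_dont_know_needs, :field_wheelchair_need, :field_other_need

  validate :validate_housing_needs

private

  def validate_housing_needs
    specific_need = [field_wheelchair_need, field_other_need].any?(&:present?)

    errors.add(:field_no_needs, "cannot be selected alongside a specific need") if field_no_needs.present? && specific_need
    errors.add(:field_dont_know_needs, "cannot be selected alongside a specific need") if field_dont_know_needs.present? && specific_need
    errors.add(:field_no_needs, "cannot be selected together with don't know") if field_no_needs.present? && field_dont_know_needs.present?
  end
end
```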
* bulk upload validate start date for given window
* use ActiveModel::Errors API correctly
- as opposed to calling methods manually
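For example, recording problems through the documented API rather than mutating the errors collection directly (the attribute and message here are illustrative):

```ruby
# Use the ActiveModel::Errors API to record a problem on an attribute...
errors.add(:field_4, "Start date must be within the current collection window")

# ...rather than poking at the underlying structures by hand, e.g.
#   errors[:field_4] << "Start date must be within the current collection window"
# which newer Rails versions deprecate.
```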
* refactor by combining lines
* bulk upload handles years with single digits
* remove duplicate bulk validations
- validation was being performed both at CSV level and log level causing
a duplicate validation to appear
* bulk upload validation errors now store the message
- previously this stored just the error type, which we have no mechanism to map back to user-readable error messages
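A sketch of persisting the full message on the stored error record; the model, column names and arguments are assumptions:

```ruby
# Assumed model and column names, for illustration only.
def store_error(bulk_upload, row:, cell:, field:, error:)
  bulk_upload.bulk_upload_errors.create!(
    row: row,
    cell: cell,
    field: field,
    error: error.message, # store the user-readable message, not just the type
  )
end
```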
* refactor bulk upload with extract method
* add bulk upload mailer test
* bulk upload mails users to fix logs on website
* bulk upload email change resume link
* refactor local variable name
* create logs only if the log itself is valid
* do not create logs if a setup section not complete
* bulk upload with 60% errors will not create logs
* extract magic number to constant
* add bulk upload absolute threshold error rate
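The thresholding described in the entries above might look roughly like this; the constant names and the absolute cap are illustrative (only the 60% figure comes from the entries themselves):

```ruby
# Illustrative constants only; real names and values may differ.
ERROR_RATE_THRESHOLD = 0.6    # do not create logs when 60%+ of rows have errors
ABSOLUTE_ERROR_THRESHOLD = 16 # hypothetical absolute cap on error count

def create_logs?(invalid_row_count, total_row_count)
  return false if total_row_count.zero?
  return false if invalid_row_count >= ABSOLUTE_ERROR_THRESHOLD

  (invalid_row_count.to_f / total_row_count) < ERROR_RATE_THRESHOLD
end
```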
* refactor with extract method
* fix bulk upload age data types
* bulk upload handles both with and without headers
- headers are from the spreadsheet template
- otherwise assume cell A1 is start of the dataset
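A sketch of how the parser might handle both cases; the header-detection heuristic is an assumption:

```ruby
# Sketch: if the first row still contains the template headers, skip it;
# otherwise treat cell A1 as the start of the data set.
def body_rows(rows)
  looks_like_template_header?(rows.first) ? rows.drop(1) : rows
end

# Hypothetical predicate: real detection would inspect known header text.
def looks_like_template_header?(row)
  row.to_a.any? { |cell| cell.to_s.downcase.include?("field") }
end
```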
* add class to create logs from bulk upload
* create logs when processing bulk uploads
* remove bulk_upload_id from csv output
* create bulk upload logs only if all valid
- this will be changed later to allow for partial logs
- and only to create logs when a threshold has been met
* add method to blank invalid non setup fields
* bulk upload log creation blanks invalid fields
* fix incorrect logic for bulk upload renewal
* fix linting
* bulk upload log creation fail logs to sentry
* fix bulk upload line ending parsing
* extract bulk uploading csv parsing to class
* use csv parser in log creator
* change handle line endings mechanism
- we now replace all Windows line endings with Unix line endings
- this normalises the input, making parsing simpler
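The normalisation amounts to a single substitution before the CSV is parsed (method name is illustrative):

```ruby
# Replace Windows (CRLF) line endings with Unix (LF) ones so the parser
# only ever sees one line-ending style.
def normalise_line_endings(content)
  content.gsub("\r\n", "\n")
end
```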
* able to view lettings bulk upload errors
* fix linting
* call service correctly in test
* add bulk upload sales questions mapping
* appease linter
* bulk upload error shows correct question
- depending on the log type, it shows the relevant question for the field concerned
* improve namespacing of classes
* add job to process bulk uploads
* move validation from parser to model
* add validations for field_1
* add validation for field_4
* pending tests for field_4
* convert field_mapping to array of hashes
* validate nulls based on form question
* actually load forms when toggling between forms
* validate null for startdate
* row parser has access to bulk upload
* csv upload validates first form section
* add postcode validation
* Refactor error mappings for row parser
* Add unittype question
* Fix null error setting and add builtype
* add wchair to bulk upload
* Add beds to bulk upload
* Add joint to bulk upload
* Add startertenancy to the bulk upload
* Add tenancy for bulk upload
* Add declaration to the bulk upload
* Add age1 and age1_known to bulk upload
* add ages to bulk upload
* add sex1 to bulk upload
* add ethnic_group and ethnic to bulk upload
* add national to bulk upload
* add ecstat1 to bulk upload
* add military related fields to bulk upload
* add preg_occ to bulk upload
* add housingneeds to bulk upload
* add illness to bulk upload
* add layear to bulk upload
* add waityear to bulk upload
* add reason to bulk upload
* add prevten to bulk upload
* add homeless to bulk upload
* add previous postcode to bulk upload
* add reasonable preferences to bulk upload
* add allocations system to bulk upload
* add referral to bulk upload
* add net_income_known to bulk upload
* add hb to bulk upload
* add benefits to bulk upload
* add rent fields to bulk upload
* add hhmemb to bulk upload
* use 2022 csv fixtures for bulk upload
* fix renewal mapping for bulk upload
* placeholder test for bulk upload validation
* fix bulk upload mapping for homeless field
* fix leftreg mapping for bulk upload
* fix user associations in bulk upload tests
* add gender fields for bulk upload
* add ecstatN fields to bulk upload
* add #relatN fields to bulk upload
* extract old_visible_id in factory to trait
* map net_income_known correctly for bulk upload
* fix income bugs for bulk upload
* add unitletas to bulk upload
* add #rsnvac to bulk upload
* add #sheltered to bulk upload
* add illness fields to bulk upload
* add #irproduct_other to bulk upload
* infer renewal from rsnvac for bulk upload
* add #tenancyother to bulk upload
* add #tenancylength to bulk upload
* bulk upload earnings accepts pennies but rounds
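For example, a value entered in pounds and pence could be accepted and rounded to the nearest pound; the field name is a placeholder:

```ruby
# Sketch only: accept pence but store whole pounds.
def earnings
  return if field_earnings.blank? # placeholder field name

  field_earnings.to_f.round
end
```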
* add #reasonother to bulk upload
* fix mapping of #ppcodenk for bulk upload
* add #household_charge to bulk upload
* add #chcharge to bulk upload
* add #tcharge to bulk upload
* add #supcharg to bulk upload
* add pscharge to bulk upload
* add #scharge to bulk upload
* use case statement for bulk upload allocation
* add offered to bulk upload
* add propcode to bulk upload
* add major repair fields to bulk upload
* add #voiddate to bulk upload
* support YY year format for bulk upload
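A sketch of accepting two-digit years, assuming all dates fall in the 2000s:

```ruby
# Interpret "23" as 2023; four-digit years pass through unchanged.
def full_year(year_field)
  year = year_field.to_i
  year < 100 ? 2000 + year : year
end

full_year("23")   # => 2023
full_year("2023") # => 2023
```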
* test postcode strips whitespace for bulk upload
* add #la to bulk upload
* add previous la to bulk upload
* fix failing test
* remove duplicate line from rebase
* add first time social housing to bulk upload
* make methods private
* fix field_4 validation for bulk upload
- the null check was inverted by mistake
Co-authored-by: Kat <katrina@kosiak.co.uk>
* lockdown bulk upload routes
* able to view lettings bulk upload errors
* add error count to bulk upload results
* coverage for bulk upload filename on results
* group bulk upload errors by row on results
* able to view bulk upload sales results
* scope lettings and sales bulk upload results
* fix linting
* call service correctly in test
* add bulk upload sales questions mapping
* appease linter
* bulk upload error shows correct question
- depending on the log type, it shows the relevant question for the field concerned
* use local disk for bulk upload for dev env
- this removes the need to connect to S3 when working with bulk upload in the dev environment
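One way to select the backend per environment; the service class names are placeholders, not the app's actual classes:

```ruby
# Placeholder class names: local disk in development, S3 elsewhere.
def storage_service
  if Rails.env.development?
    LocalDiskStorageService.new
  else
    S3StorageService.new
  end
end
```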
* improve namespacing of classes
* add job to process bulk uploads
* use local disk storage for dev file upload
* fix test
* use inline active job queue_adapter for dev
* use test active job queue adapter for test env
* remove rubocop violation
* delete bulk upload from disk after processing
* populate errors with cell, row + metadata
* update error message with something meaningful
* shim in sales validator
* able to parse sales bulk uploads
* change migration to add purchase_code to errors
* bulk upload error component renders purchaser code
- when a sales log
* populate purchaser_code for bulk upload errors
- when log type is sales
* remove superfluous private method
* Validate that the user belongs to either the managing or owning organisation
* do not reset created_by and remove user_organisation_chosen?
* Do not default managing organisation to owning organisation
* validate user belongs to organisation
* update tests to specify created_by
* clear created_by for support users
* refactor
* typo
* rebase lint
* Add `updated_by` to logs to track who was the last person to update a log
- Reset `created_by` automatically when a form is updated and the owner does not belong to the managing or owning organisation
* move reset_invalidated_dependent_fields!, update schema file and fix tests
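A sketch of the `created_by` reset described above; the callback and association names are assumptions, and `updated_by` would typically just be set to the current user when the log is saved from a controller:

```ruby
# Assumed callback: clear created_by when the current owner no longer
# belongs to either organisation on the log.
before_update :reset_created_by_if_no_longer_affiliated

def reset_created_by_if_no_longer_affiliated
  return if created_by.blank?
  return if [owning_organisation, managing_organisation].include?(created_by.organisation)

  self.created_by = nil
end
```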
Co-authored-by: James Rose <james@jbpr.net>
* Mark log as impacted by deactivation when location is deactivated
* Display affected logs in the table
* Route affected logs to tenancy start date question
* Update routes to get the tenancy start date page from form
* update next_page_redirect_path
* rename column
* Fix tests
* Add next_unresolved_page_id to pages
* Update unresolved status when the log is corrected
* Mark logs as unresolved when scheme gets deactivated
* display correct content when there are no unresolved logs
* mark logs resolved after the user leaves check answers page
* Display link in success banner and reset banner when the link is clicked
* display inset hint text for unresolved log questions
* Display unresolved logs banner
* update banner message
* Persist the link in the banner
* update inset text
* Update success banner text
* Add unresolved and created_by scopes
* rename method
* add unresolved_log_redirect_page_id to form, fix typo and route
* Add UnresolvedLogHelper and extract flash notice message
* pluralize and return early
* remove flash[:notice] = nil
* to keep it consistent for sales log
* Extract unresolved path and fix a link
* extract resolve method and fix attribute name
* update path
* typo
We import XML files that define logs from the previous CORE service. For an unknown reason, some of these files don't include the required namespace declarations to parse `user` values. Instead of changing the source, this change introduces a new method that forces the namespace declaration for user fields.
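A sketch of the approach using Nokogiri, where the prefix binding is supplied alongside the query; the namespace URI shown is a placeholder, not necessarily the one old CORE uses:

```ruby
require "nokogiri"

# Placeholder URI bound to the "user" prefix for the lookup.
USER_NAMESPACE = { "user" => "dclg:user" }.freeze

def user_field_value(xml_document, field)
  xml_document.at_xpath("//user:#{field}", USER_NAMESPACE)&.text
end
```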
These fields are set using IDs from the previous CORE service. There was an incorrect assumption that these fields were integers so we were casting them as such. We have discovered that these fields are strings (e.g., `027`) so this change adjusts our types.
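The leading-zero problem in miniature:

```ruby
# Casting an old CORE id such as "027" to an integer loses information:
"027".to_i # => 27
# Keeping it as a string preserves the identifier exactly:
"027".to_s # => "027"
```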