* Add in past end dates for testing
* Temporarily disable test marked to delete at year end
* Remove one-off reinfer_local_authority task
* Update validator_spec
* Set 2024 date in form_handler_spec to be during crossover period as needed
* Use bulk_upload.year_combo for comparison in request tests to avoid year dependency
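  A minimal sketch of the pattern, assuming `year_combo` builds a label such as "2024/2025" from the upload's collection year (the real format string and page text may differ):

  ```ruby
  # Hypothetical request spec snippet: derive the expected text from the
  # bulk upload itself instead of hard-coding "2024/2025", so the assertion
  # keeps passing when the factory's default year rolls over.
  expected_heading = "Bulk upload for lettings (#{bulk_upload.year_combo})"
  expect(response.body).to include(expected_heading)
  ```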
* Update BU log creator specs for 2024
* Use year combo function in bulk upload mailer tests
* Refactor lettings validator_spec
* More fixes
* More work on bu validator specs - mostly sales
* Remove pre 2023 test
* More use of bulk_upload.year_combo in request tests
* Fix lint
* Tweak bulk upload error row component tests for year changes
* Further fixes
* Fix 2023 lettings row parser spec
* Sales log to csv fix
* Refactor BU processor tests
* Fix field number row identifier
* Fix linting
* More years in request spec
* fix
* Don't use db unnecessarily in financial validations spec
* Fix sale date changing 2024 -> 2023 test
* Add tests for bulk_upload.year_combo
* Update bu factory year specification
* Refactoring
* Linting
* Don't use helper in factory
* Remove new 2023 specific test
* Remove dummy end dates
* Empty
* CLDC-3345 Rename created_by to assigned_to (#2372)
* Rename created_by to assigned_to
* Replace created_by with assigned_to
* Update created_by to assigned_to in exports, remove blank assigned to
* CLDC-3345 Add and set created_by fields (#2373)
* Add created_by
* Update existing created_by values
* Set created_by on single log
* Set created_by on BU
* Add created_by to exports
* feat: update since last merge
---------
Co-authored-by: natdeanlewissoftwire <nat.dean-lewis@softwire.com>
* Fix papertrail create version
---------
Co-authored-by: natdeanlewissoftwire <nat.dean-lewis@softwire.com>
* Copy 23 BU files to 24
* Update field mapping
* Add duplicate log error to charges
* Check the sum of charges for duplicates
* update stub
* Only add charges to duplicate hash if they exist
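  Roughly, the duplicate check can be pictured as below; the attribute names and hash shape here are assumptions, not the parser's real API:

  ```ruby
  # Build the set of values used to decide whether two rows describe the
  # same log; include the charge total only when at least one charge is
  # present, so charge-less logs are not all flagged as duplicates.
  def duplicate_check_fields(log)
    fields = {
      owning_organisation_id: log.owning_organisation_id,
      startdate: log.startdate,
      tenancycode: log.tenancycode,
    }
    charges = [log.brent, log.scharge, log.pscharge, log.supcharg].compact
    fields[:tcharge] = charges.sum if charges.any?
    fields
  end
  ```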
* Rebase change
* Update columns
* Add tcharge back to error mapping
* Add prepare your file
* Add csv parser 24/25 cases
* Fix date check
* Rebase fix
* Copy 23 BU files to 24
* Renumber bulk upload fields for 2024
* Add prepare your file
* Update max columns
* Update fields in first_record_start_date
* Update managing org
* Rebase changes
* Sends a correct email if there are only soft validations
* Add soft errors valid page
* Add soft errors confirm page
* Confirm the soft validations
* Reuse the how to fix template for check soft validations email
* Update email link
* Move soft validation confirmation to processor
* Correctly set the log status, remove redundant confirm_soft_validations
* Redirect successful upload to logs index and display a success banner
* Implement the soft validations only journey for sales logs
* Display the soft validation errors on the soft-errors-valid page
* Fix page alignment
* Fix tests by mapping housingneeds in csv helper
* Add the sales soft validations to unpend_and_confirm_soft_validations
* Change naming
* Update method names
* refactor
* undo typo
* Only set the existing soft validations for correct types
* Fix path name
* Add missing error mappings for location fields
* Add missing tests and cancel button
* Change some wording
* Typos
# Context
- https://digital.dclg.gov.uk/jira/browse/CLDC-2316
- Implement bulk upload sales for new collection year 2023
- This is a first-pass implementation; it will probably have some bugs, which we can address over time
# Changes
- Add CSV parser for sales 2023 to handle CSV structure
- Tweak collection window validation so the error is now contextual to the year selected for upload
- Handle arbitrary ordering of CSV columns
- Fix ordering of errors in report and now ordered by cell
- Added `Upload your file again` link styled as a button on the error report to match the lettings experience
- Update tooling to convert logs to 2023 csv rows with support for random column ordering
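The column-ordering support can be pictured with a small sketch: key each cell by its header label rather than its position (the file name and field label below are illustrative only).

```ruby
require "csv"

# Read the template headers, then look cells up by label so the physical
# column order in the uploaded file no longer matters.
rows = CSV.read("sales-2023.csv", headers: true)
first_log = rows.first
value = first_log["field_7"] # hypothetical field label, resolved by name not position
```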
# Known issues
- There seem to be some issues with how UPRN is handled if the UPRN cannot be validated.
- For the above I think there is a dependency on https://github.com/communitiesuk/submit-social-housing-lettings-and-sales-data/pull/1570 which should clear any errored fields so users can continue to create logs and fix them within the service
# Context
- Partially related to https://digital.dclg.gov.uk/jira/browse/CLDC-2316
- Comprehending the sales or lettings bulk upload CSV on its own is hard enough; comprehending both simultaneously is rather challenging
# Changes
- Split out test helper class by log type, i.e. lettings/sales
* feat: add validation for number of field labels
* feat: check field labels are numbers
* feat: remove validations for number of columns
* refactor: rename non_blank_fields_count to valid_field_numbers_count
* test: add functions to generate custom field labels/values
* test: that extra invalid field labels don't cause issues
* test: that removing a valid field label reduces count by 1
* test: that adding a valid field label increases count by 1
* refactor: rename validate_fields_count and wrong_field_count
* feat: add validation that max col count not exceeded when no headers
* fix: convert numbers to strings in default_field_numbers
* feat: add leniency to max cols count (in case of 1 extra col)
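  A sketch of the counting and leniency rules described above, with assumed names and an assumed column cap; the real constants live in the per-year CSV parsers.

  ```ruby
  MAX_PERMITTED_COLUMNS = 134 # assumed figure; the real cap depends on the collection year

  # A header cell only counts as a valid field label if it looks like "field_<n>".
  def valid_field_numbers_count(header_row)
    header_row.count { |cell| cell.to_s.match?(/\Afield_\d+\z/) }
  end

  # Without headers we can only count columns, allowing one stray extra column.
  def too_many_columns?(row)
    row.size > MAX_PERMITTED_COLUMNS + 1
  end
  ```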
* test: explicitly set year of bulk upload to be 2022
* test: add/update tests in validator_spec
* chore: lint
* test: remove tests from csv_parser_spec that were moved to validator_spec
* feat: update 2022 csv_parser to work with new validations
* refactor: define number of valid 2022 fields in one place
* refactor: remove redundant headers definition
* feat: update 2022 csv parser to have col flexibility like 2023
* test: for validator 2022 as well as 2023
* test: simplify 2022/2023 logic and improve layout
* chore: lint
* test: ensure context descriptions start with "when"
* refactor: check fields/columns count within csv parser, not validator
* test: update 2022 csv parser tests to work like 2023
* chore: lint
* add first page for bulk upload resume journey
* bulk upload resume handles upload again
* add confirm page to bulk upload resume journey
* replace placeholder count with correct value
* apply recommendation for bulk upload resume choice
* add how to fix bulk upload mailer
* integrate new bulk upload approve journey
* add missing bulk upload error mappings
* remove test
* prevent approve being called multiple times
* bulk upload creates invisible logs ahead of time
* work invisible logs into bulk upload flow
* sort errors so deterministic
* remove unused ensure
* remove expected_log_count and processed
- these fields are no longer used or needed
* introduce pending status
* swap visible for pending logs
* only show visible lettings logs
* hard code status filters
* remove unused model methods
* only show visible sales logs
* form controller ignores hidden logs
* locations and schemes only affect visible logs
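  The visibility mechanics, sketched minimally and assuming a plain status column; the app's actual enum and scope names may differ:

  ```ruby
  class LettingsLog < ApplicationRecord
    # Logs pre-created by a bulk upload sit in "pending" until the upload is
    # confirmed; index pages and lookups read the visible scope instead of all.
    scope :visible, -> { where.not(status: "pending") }
  end

  LettingsLog.visible.count # index pages and related queries go through the scope
  ```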
* refactor questions from validator to row parser
* able to switch between bulk upload parsers
- depending on what year we are processing
* spec tooling to support bulk upload multi year
* row parser now has year in namespacing
* add static data for 2023 row parser
* add placeholder to log to csv for specs
* bulk upload arbitrary 23/24 column ordering works
* bulk upload supports 23/24 without headers
* bulk upload 23/24 supports BOM + invalid chars
* dupe tests
* port 23/24 attributes_for_log
* port 23/24 bulk upload validations
* force crossover period
* tweak max permitted columns
* able to return column for given field
* work out column for field for errors
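  Working out which spreadsheet cell an error should point at boils down to index-to-letter arithmetic; a sketch with an assumed helper name:

  ```ruby
  # Turn a zero-based column index into a spreadsheet-style letter
  # ("A".."Z", "AA"...), so an error report can name the offending cell.
  def column_letter(index)
    letters = ""
    loop do
      letters = ((index % 26) + "A".ord).chr + letters
      index = index / 26 - 1
      break if index.negative?
    end
    letters
  end

  column_letter(0)  # => "A"
  column_letter(27) # => "AB"
  ```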
* add field_4 as 23/24 setup field
* remove duplicate method
* map schemes and locations correctly
* handle arbitrary number of header rows
* add missing fields to bulk upload support
* limit setup errors if errors already present
* bulk upload better handles age validations
- limit the number of errors on a field: if the field already has an
existing error it does not add further errors to the field
- shim in extra validation for ages which takes precedence over existing
log level validations which are lacking
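  A rough sketch of both behaviours, with assumed method names and an illustrative age range rather than the service's real bounds:

  ```ruby
  # Only attach a message if the cell has no error yet, so one bad value
  # does not produce a pile of messages for the same column.
  def add_error_once(field, message)
    return if errors[field].present?

    errors.add(field, message)
  end

  # Run the bulk-upload age check first so its clearer message takes
  # precedence over the log-level validation.
  def validate_age(field, value)
    return if value.to_s.match?(/\A\d+\z/) && (0..120).cover?(value.to_i)

    add_error_once(field, "Age must be a whole number within the allowed range")
  end
  ```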
* setup errors always added
- no longer observes whether the grouping already has errors
- as we have fine-grained control in this class over how errors should be added
* fix missing pass through for renewal
* create logs iff the log itself is valid
* do not create logs if a setup section not complete
* bulk upload with 60% errors will not create logs
* extract magic number to constant
* add bulk upload absolute threshold error rate
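  Sketch of the thresholding described above; the 60% figure comes from the commit message, while the absolute cap and method names are assumptions:

  ```ruby
  ERROR_RATE_THRESHOLD = 0.6
  ABSOLUTE_ERROR_THRESHOLD = 16 # assumed cap; the real value may differ

  def create_logs?(row_count, error_count)
    return false if row_count.zero?
    return false if error_count >= ABSOLUTE_ERROR_THRESHOLD

    (error_count.to_f / row_count) < ERROR_RATE_THRESHOLD
  end
  ```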
* refactor with extract method
* refactor with extract method
* can filter logs by bulk upload
* hide log creation button when viewing bulk upload
- this affects the logs index page when filtering logs from a specific bulk upload
* add info banner to bulk upload logs
* placeholder for bulk upload logs header
* when resuming bulk upload set filters
* fill placeholder with remaining logs to fix
* add interstitial to resume if logs resolved
* after resolving bulk upload logs show interstitial
* fix linting error
* extract view variable to helper method
* fix bulk upload age data types
* bulk upload handles both with and without headers
- headers are from the spreadsheet template
- otherwise assume cell A1 is start of the dataset
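  Header handling can be pictured as below, assuming the template's labels start with "field_"; the parser's real heuristic (and its support for multiple header rows) is more involved.

  ```ruby
  def with_headers?(rows)
    rows.first&.first.to_s.start_with?("field_")
  end

  def data_rows(rows)
    # With no header row, cell A1 is already the first piece of data.
    with_headers?(rows) ? rows.drop(1) : rows
  end
  ```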
* add class to create logs from bulk upload
* create logs when processing bulk uploads
* remove bulk_upload_id from csv output
* create bulk upload logs only if all valid
- this will be changed later to allow for partial logs
- and only to create logs when a threshold has been met
* add method to blank invalid non setup fields
* bulk upload log creation blanks invalid fields
* fix incorrect logic for bulk upload renewal
* fix linting
* bulk upload log creation fail logs to sentry
* fix bulk upload line ending parsing
* extract bulk uploading csv parsing to class
* use csv parser in log creator
* change handle line endings mechanism
- we now replace all Windows line endings with Unix line endings
- this normalises the input, making parsing simpler
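  The normalisation itself is essentially a one-line rewrite before the content reaches the CSV parser (variable names illustrative):

  ```ruby
  normalised = file_contents.gsub("\r\n", "\n") # CRLF -> LF so both styles parse identically
  ```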
* Add organisation model and user association
* Add user admin panel
* email
* Update seeds
* Update spec
* Case logs belong to organisations
* Org case log association
* Add user case log association
* User case log helper methods
* Org case log relation helpers
* Case log index page only shows your organisation's case logs
* No access to tasklist page for logs that aren't associated with your org
* No access to form pages for case logs that aren't owned or managed by your org
* Check answers access
* Submit form access
* Refactor out not found methods
* Allow user admin update without password
* Update feature specs
* Rubocop
* Update case log specs
* Test admin user update without password
* Spec grammar
* Spec case logs admin table
* Spec admin user admin index
* Dashboard controller panel specs
* Spec panel contents
* Add create specs
* Don't assign non-db IDs if we don't need them
* Fix specs for new section
* Fix up fields
Devise is a commonly used gem for user authentication and management. Using
rails generators and Devise allows us to get a lot of boilerplate code for
user authentication and management and means we don't have to reinvent the
wheel. Styling will need to be done for the necessary pages and there are
likely to be bits of generated code that can be deleted. This will act as
a starting point to be built up from using TDD.
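For context, the typical Devise wiring looks roughly like this; the modules enabled here are illustrative and were settled later during styling and TDD.

```ruby
# Gemfile
gem "devise"

# app/models/user.rb, after `rails generate devise:install` and
# `rails generate devise User`:
class User < ApplicationRecord
  devise :database_authenticatable, :recoverable, :rememberable, :validatable
end
```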