# Context
- https://digital.dclg.gov.uk/jira/browse/CLDC-2316
- Implement bulk upload of sales for the new collection year 2023
- This is a first-pass implementation that will probably have some bugs, which we can address over time
# Changes
- Add CSV parser for sales 2023 to handle CSV structure
- Tweak collection window validation so the error is now contextual to the year selected for upload
- Handle arbitrary ordering of CSV columns
- Fix ordering of errors in report; they are now ordered by cell
- Add `Upload your file again` link styled as a button on the error report to match the lettings experience
- Update tooling to convert logs to 2023 CSV rows, with support for random column ordering
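
As an illustration of the column-ordering change, here is a minimal sketch of how a header-driven mapping could work; the class name, field keys, and header text below are placeholders rather than the real implementation.

```ruby
# Illustrative sketch: map field numbers to cells by header text, so the
# uploader can put columns in any order.
class ColumnMapping
  FIELD_HEADERS = {
    field_1: "purchaser code",
    field_2: "sale completion date",
  }.freeze

  def initialize(header_row)
    # normalised header text => column index in the uploaded file
    @positions = header_row.each_with_index.to_h { |header, index| [header.to_s.strip.downcase, index] }
  end

  # Returns the cell value for a field, wherever its column ended up.
  def cell_for(field, row)
    index = @positions[FIELD_HEADERS.fetch(field)]
    index ? row[index] : nil
  end
end
```

With something like this in place, the row parser can keep its fixed `field_n` attributes while accepting any column order in the uploaded file.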
# Known issues
- There seem to be some issues with how UPRN is handled if the UPRN cannot be validated.
- For the above, I think there is a dependency on https://github.com/communitiesuk/submit-social-housing-lettings-and-sales-data/pull/1570, which should clear any errored fields so users can continue to create logs and fix them within the service
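
For reference, the behaviour that PR is expected to add might look roughly like the sketch below; the method and attribute names are placeholders and the real change may differ.

```ruby
# Rough sketch only: blank out any non-setup fields that failed validation so
# the log can still be created and corrected within the service.
def clear_errored_fields!(log, setup_fields:)
  log.validate
  log.errors.map(&:attribute).uniq.each do |attribute|
    next if setup_fields.include?(attribute)

    log[attribute] = nil
  end
  log.errors.clear
end
```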
# Commits
* add first page for bulk upload resume journey
* bulk upload resume handles upload again
* add confirm page to bulk upload resume journey
* replace placeholder count with correct value
* apply recommendation for bulk upload resume choice
* add how to fix bulk upload mailer
* integrate new bulk upload approve journey
* add missing bulk upload error mappings
* remove test
* prevent approve being called multiple times
* bulk upload creates invisible logs ahead of time
* work invisible logs into bulk upload flow
* sort errors so deterministic
* remove unused ensure
* remove expected_log_count and processed
- these fields are no longer used or needed
* introduce pending status
* swap visible for pending logs
* only show visible lettings logs
* hard code status filters
* remove unused model methods
* only show visible sales logs
* form controller ignores hidden logs
* locations and schemes only affect visible logs
* refactor questions from validator to row parser
* able to switch between bulk upload parsers
- depending on what year we are processing
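
A minimal sketch of what year-based switching might look like; the namespaced constants below are assumptions based on the "row parser now has year in namespacing" commit, not the actual class names.

```ruby
# Illustrative: choose the row parser namespace that matches the collection
# year the upload belongs to.
def row_parser_class(bulk_upload)
  case bulk_upload.year
  when 2022 then BulkUpload::Lettings::Year2022::RowParser
  when 2023 then BulkUpload::Lettings::Year2023::RowParser
  else raise ArgumentError, "no row parser for year #{bulk_upload.year}"
  end
end
```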
* spec tooling to support bulk upload multi year
* row parser now has year in namespacing
* add static data for 2023 row parser
* add placeholder to log to csv for specs
* bulk upload arbitrary 23/24 column ordering works
* bulk upload supports 23/24 without headers
* bulk upload 23/24 supports BOM + invalid chars
* dupe tests
* port 23/24 attributes_for_log
* port 23/24 bulk upload validations
* force crossover period
* tweak max permitted columns
* able to return column for given field
* work out column for field for errors
* add field_4 as 23/24 setup field
* remove duplicate method
* map schemes and locations correctly
* handle arbitrary number of header rows
* add missing fields to bulk upload support
* bulk upload validate start date for given window
* use ActiveModel::Errors api correctly
- as opposed to calling methods manually
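
Roughly, this is the difference between building error structures by hand and going through the `errors.add` API; the field name and message below are placeholders.

```ruby
# Illustrative: register failures through ActiveModel::Errors rather than
# mutating internal error structures manually.
class ExampleRowParser
  include ActiveModel::Model

  attr_accessor :field_1

  validate :field_1_present

private

  def field_1_present
    errors.add(:field_1, "You must answer field 1") if field_1.blank?
  end
end
```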
* refactor by combining lines
* bulk upload handles years with single digits
* add class to create logs from bulk upload
* create logs when processing bulk uploads
* remove bulk_upload_id from csv output
* create bulk upload logs only if all valid
- this will be changed later to allow for partial logs
- and only to create logs when a threshold has been met
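
Sketched out, the current all-or-nothing behaviour might look like this (placeholder names, before the planned threshold logic lands):

```ruby
# Illustrative: persist logs only when every row in the upload parsed cleanly.
def create_logs!
  return if row_parsers.any? { |parser| parser.errors.any? }

  row_parsers.each { |parser| parser.log.save! }
end
```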
* add method to blank invalid non setup fields
* bulk upload log creation blanks invalid fields
* fix incorrect logic for bulk upload renewal
* fix linting
* bulk upload log creation fail logs to sentry
* fix bulk upload line ending parsing
* extract bulk uploading csv parsing to class
* use csv parser in log creator
* change handle line endings mechanism
- we now replace all Windows line endings with Unix line endings
- this normalises the input, making parsing simpler
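
In practice the normalisation can be as small as the sketch below, assuming `raw_body` is the file body read from storage.

```ruby
# Illustrative: rewrite Windows (CRLF) line endings as Unix (LF) ones so the
# CSV parser only ever deals with a single line-ending style.
def normalised_body(raw_body)
  raw_body.gsub("\r\n", "\n")
end
```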
* lockdown bulk upload routes
* able to view lettings bulk upload errors
* add error count to bulk upload results
* coverage for bulk upload filename on results
* group bulk upload errors by row on results
* able to view bulk upload sales results
* scope lettings and sales bulk upload results
* fix linting
* call service correctly in test
* add bulk upload sales questions mapping
* appease linter
* bulk upload error shows correct question
- depending on log type it will show the relevant question for the field concerned
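
A hedged sketch of how that lookup might work; the mapping constants are placeholders standing in for the lettings and sales question mappings added earlier.

```ruby
# Illustrative: pick the question wording from the mapping that matches the
# upload's log type, so sales errors are not described with lettings questions.
def question_for(bulk_upload, field)
  mapping = bulk_upload.sales? ? SALES_QUESTIONS : LETTINGS_QUESTIONS
  mapping.fetch(field, "unknown field")
end
```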
* use local disk for bulk upload for dev env
- this saves the need to connect to S3 to play with bulk upload in the dev environment
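
One way this could be wired up is sketched below; the storage class names and the bucket env var are assumptions, not the actual configuration.

```ruby
# Illustrative: use local disk in development so bulk upload can be exercised
# without S3 credentials; use the real bucket everywhere else.
def storage_service
  if Rails.env.development?
    Storage::LocalDiskService.new
  else
    Storage::S3Service.new(ENV["BULK_UPLOAD_BUCKET"])
  end
end
```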
* improve namespacing of classes
* add job to process bulk uploads
* use local disk storage for dev file upload
* fix test
* use inline active job queue_adapter for dev
* use test active job queue adapter for test env
* remove rubocop violation
* delete bulk upload from disk after processing
* populate errors with cell, row + metadata
* update error message with something meaningful
* shim in sales validator
* able to parse sales bulk uploads
* change migration to add purchase_code to errors
* bulk upload error component renders purchaser code
- when a sales log
* populate purchaser_code for bulk upload errors
- when log type is sales
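
Sketched roughly, populating the identifier per log type might look like this; the attribute and method names are placeholders.

```ruby
# Illustrative: attach the purchaser code to each error row only when the
# upload is for sales logs.
def error_attributes(row_parser, field)
  attributes = { field: field, row: row_parser.row_number, cell: row_parser.cell_for(field) }
  attributes[:purchaser_code] = row_parser.purchaser_code if bulk_upload.sales?
  attributes
end
```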
* remove superfluous private method
* add start of bulk upload logs journey
* split upload controller into 2
* add year page to bulk upload journey
* bulk upload years now dynamic
* add placeholder for upload your file page
* handle bulk upload when not in crossover
* add bulk upload csv user journey
* bulk upload now persists and sends to S3
* swap ruby-filemagic for file command
* match csv or text file for validation
* in_crossover_period? now uses overlap of forms
- also moved from Form to FormHandler
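
As a sketch, the overlap-based check could be as simple as counting how many forms are open on a given date; the `forms` collection and date attributes here are assumptions.

```ruby
# Illustrative: we are in a crossover period when more than one form's
# collection window contains the given date.
def in_crossover_period?(now: Time.zone.now)
  forms.count { |form| (form.start_date..form.end_date).cover?(now) } > 1
end
```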
* add production env var for CSV bucket
* stub kernel call methods
* remove duplicate env var
* hardcode env var for review apps
* move feature toggle to FeatureToggle
* crossover period checks now specific to the form
* fix typo in bulk upload journey
* Remove an unused ethnic_other column
* remove sale_completion_date from lettings logs
* Add collection start year
* Remove sale_or_letting column
* Rename rent_type to rent_type_detail in the export
* Format dates
* refactor
* Fix test
This renames case_log to lettings_log, as everything we've written so far has
been geared towards lettings of social housing, so it makes sense for the name
to describe this. It is also a precursor to adding support for sales logs
(whatever shape that takes)
Co-authored-by: James Rose <james@jbpr.net>
This commit fixes the tests that failed as a result of the introduction
of the created_by column in the case log table. The tests were failing because
it is now necessary to have a created_by for a case log.
* Scope auth for org pages
* Scope user methods
* Refactor form UI controller
* More tests for access
* Consistently return not found for scoped auth
* Add ADR
* Authenticate bulk uploads
* Authenticate soft validations controller
* Scope bulk upload to user's org for now
* Pass through current user
* Update docs/adr/adr-012-controller-http-return-statuses.md
Co-authored-by: Dushan <47317567+dushan-madetech@users.noreply.github.com>
* Update spec/requests/case_log_controller_spec.rb
Co-authored-by: Dushan <47317567+dushan-madetech@users.noreply.github.com>
Co-authored-by: Dushan <47317567+dushan-madetech@users.noreply.github.com>
* Add organisation model and user association
* Add user admin panel
* email
* Update seeds
* Update spec
* Case logs belong to organisations
* Org case log association
* Add user case log association
* User case log helper methods
* Org case log relation helpers
* Case log index page only shows your organisations case logs
* No access to tasklist page for logs that aren't associated with your org
* No access to form pages for case logs that aren't owned or managed by your org
* Check answers access
* Submit form access
* Refactor out not found methods
* Allow user admin update without password
* Update feature specs
* Rubocop
* Update case log specs
* Test admin user update without password
* Spec grammar
* Spec case logs admin table
* Spec admin user admin index
* Dashboard controller panel specs
* Spec panel contents
* Add create specs
* Don't assign non db ids if we don't need them
* Fix specs for new section
* Fix up fields