Overview...
Stage Number | Stage | Category | Summary | Environment | Iterations | Estimated Person Time | Estimated Computer Time |
---|---|---|---|---|---|---|---|
1 | Import and parse reference dataset (Optional) | Parsing | This optional step cross-checks each source address for a match in a reference dataset. If a source address is found in the reference dataset, it moves on to the next step; if not, it is set aside in an exclusion set for later review. | Python 3, PostgreSQL / pgAdmin | Once per Bulk Loader process | 1 hour | 10 minutes |
2 | Import, parse and filter source dataset | Parsing | Import the dataset destined for the EAS, then parse and filter the set. | Python 3, PostgreSQL / pgAdmin | Once per Bulk Loader process | 90 minutes | 15 minutes |
3 | Geocode and filter | Geocoding | Geocode the set and filter it further based on the geocoder score and status. | ArcMap | Once per Bulk Loader process | 1 hour | 5 minutes |
4 | Export full set (single batch) or subset (multiple batches) | Geocoding | For large datasets, create one of many subsets that will be run through the Bulk Loader in multiple batches. | ArcMap | One or more batches for each Bulk Loader process | 30 minutes per batch | 5 minutes per batch |
5 | Bulk Load batch (full set or subset) | Bulk Loading | Run the full set, or each subset batch, through the Bulk Loader. | EAS <environment> (+), PostgreSQL / pgAdmin | One or more batches for each Bulk Loader process | 1 hour per batch | 5 minutes per batch |
6 | Extract results | Bulk Loading | Extract and archive the list of addresses added to the EAS, the unique EAS 'change request id' associated with the batch, and the addresses rejected by the Bulk Loader in the batch. | PostgreSQL / pgAdmin | One or more batches for each Bulk Loader process | 1 hour per batch | 5 minutes per batch |
7 | Cleanup and Restoration | Bulk Loading | Clean up the database, restore services and, in the event of a failure, restore from backup. | PostgreSQL / pgAdmin | One or more batches for each Bulk Loader process | 1 hour per batch | 5 minutes per batch |
...
Stage 1 - Import and parse reference dataset (Optional)
This optional stage is run once per Bulk Loader process. It can be skipped if the reference dataset is already available or if the optional 'filter by reference' step (Step 2.5) is skipped.
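The cross-check itself can be done in PostgreSQL once both datasets are imported. The following is a minimal sketch of the 'filter by reference' logic (Step 2.5), assuming hypothetical staging tables bulkloader.source_addresses and bulkloader.reference_addresses, each with a normalized full_address column; the actual table and column names depend on how the datasets are parsed in Stages 1 and 2.

```sql
-- Sketch only: table and column names are assumptions, not the actual schema.

-- Source addresses with a match in the reference dataset move on to the next step.
CREATE TABLE bulkloader.source_matched AS
SELECT s.*
FROM bulkloader.source_addresses s
WHERE EXISTS (
    SELECT 1
    FROM bulkloader.reference_addresses r
    WHERE r.full_address = s.full_address
);

-- Source addresses with no match are set aside in an exclusion set for later review.
CREATE TABLE bulkloader.source_excluded AS
SELECT s.*
FROM bulkloader.source_addresses s
WHERE NOT EXISTS (
    SELECT 1
    FROM bulkloader.reference_addresses r
    WHERE r.full_address = s.full_address
);
```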
...
Stage 3 - Geocode and filter
- Step 3.1 - Geocode source dataset
...
Stage 4 - Export shapefile - full set (single batch) or subset (multiple batches)
Note - A note about batches
Stages 4, 5 and 6 can be run one time with the full results from Stage 3, or in multiple batches of subsets. A major consideration in deciding whether to run the full set at once or in batches is the number of records being Bulk Loaded. The size of each Bulk Loader operation affects the following aspects of the EAS:
- The disk space consumed by the database server
- The EAS user interface section that lists addresses loaded in a given Bulk Loader operation
- The weekly email attachment listing new addresses added to the EAS
For medium-to-large datasets (input sets with over 1,000 records), it is recommended that the Bulk Loading process be run in batches over several days or weeks. Reminder: the process must first be run on a development server to assess the implications of the operation. The remaining steps document a single batch iteration; repeat them for each batch in a multi-batch process.
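If the geocoded records are also staged in PostgreSQL, batch membership can be assigned ahead of the Stage 4 exports. The following is a sketch only: the table bulkloader.geocoded_addresses, its id column, and the batch size of 500 are illustrative assumptions, and in practice the subsets themselves are exported from ArcMap in Stage 4.

```sql
-- Sketch only: assign each geocoded record to a numbered batch of up to 500 rows.
-- The table bulkloader.geocoded_addresses and its id column are assumptions.
ALTER TABLE bulkloader.geocoded_addresses ADD COLUMN IF NOT EXISTS batch_number integer;

WITH numbered AS (
    SELECT id, row_number() OVER (ORDER BY id) AS rn
    FROM bulkloader.geocoded_addresses
)
UPDATE bulkloader.geocoded_addresses g
SET batch_number = ((n.rn - 1) / 500) + 1
FROM numbered n
WHERE g.id = n.id;

-- Each batch can then be selected for a Stage 4 export, for example batch 2:
-- SELECT * FROM bulkloader.geocoded_addresses WHERE batch_number = 2;
```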
...
Stage 5 - Run the Bulk Loader
For a complete set of steps and background about the Bulk Loader, see also Running the Bulk Loader, a page dedicated to its input, operation and results.
...
- Step 5.1 - Disable front-end access to EAS
Disable the web service on <environment>_WEB (SF_DEV_WEB, SF_QA_WEB, SF_PROD_WEB):
```bash
cd /var/www/html
sudo ./set_eas_mode.sh MAINT
```
...
- Step 6.1 - Archive exceptions
...
- Get all the base records added to the EAS during the Bulk Loader operation. Query the public.address_base table on the new change_request_id value.
```sql
-- address_base
SELECT activate_change_request_id, address_id, public.address_base.*
FROM public.address_base, public.addresses
WHERE public.address_base.address_base_id = public.addresses.address_base_id
  AND public.addresses.address_base_flg = TRUE
  AND public.addresses.activate_change_request_id = <change_request_id>;
```
- Save the file in the network folder dedicated to artifacts for the Bulk Loader iteration.
- For example, R:\Tec\..\Eas\_Task\path\to\archive\bulkloader_YYYYMMDD\bulkloader\batch_002\address_base.csv
- Extract a sample base address from the output
- Pick a random record from the results and note the value in its address_base_id field (see the SQL sketch after the Artifacts list for one way to do this).
- Construct a URL from this value like this: http://eas.sfgov.org/?address=NNNNNN, where NNNNNN is the value from the address_base_id field.
- Make note of this URL for use in Stage 7 when testing the EAS after services are restored.
- Artifacts
- address_base.csv - All the base records added to the EAS during the Bulk Loader operation.
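To pick the random record mentioned above directly in pgAdmin, a query along the following lines can be used. It is a sketch that reuses the join from the archive query in this step; ORDER BY random() is standard PostgreSQL, and <change_request_id> is the same placeholder as above.

```sql
-- Sketch: pick one random base record from this batch for the Stage 7 spot check.
SELECT public.address_base.address_base_id
FROM public.address_base, public.addresses
WHERE public.address_base.address_base_id = public.addresses.address_base_id
  AND public.addresses.address_base_flg = TRUE
  AND public.addresses.activate_change_request_id = <change_request_id>
ORDER BY random()
LIMIT 1;
```

The returned address_base_id is the NNNNNN value used to build the spot-check URL.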
...
- SKIP Turn on production-to-replication service
- SKIP Turn on downstream database propagation service(s)
- Step 7.5 - Enable front-end access to EAS
Restore the web service on <environment>_WEB (SF_DEV_WEB, SF_QA_WEB, SF_PROD_WEB) to re-enable front-end access to the EAS:
```bash
cd /var/www/html
sudo ./set_eas_mode.sh LIVE
```
- Review the sample addresses gathered in Step 6.3 and Step 6.4
- Notify relevant recipients that the Bulk Loader process is complete
- Step 7.6 - Archive artifacts
Notes
(+) Substitute EAS <environment> with one of the relevant environments: SF_DEV, SF_QA, SF_PROD, SD_PROD.
...