Table of Contents
...
Stage Number | Stage | Category | Summary | Environment | Iterations | Estimated Person Time | Estimated Computer Time
---|---|---|---|---|---|---|---
1 | Import and parse reference dataset (optional) | Parsing | This optional step cross-checks each source address for a match in a reference dataset. If an address is found in the reference dataset, it moves on to the next step; if not, it is set aside in an exclusion set for later review. | | Once per Bulk Loader process | 1 hour | 10 minutes
2 | Import, parse and filter source dataset | Parsing | Import the dataset destined for the EAS. Parse and filter the set. | | Once per Bulk Loader process | 90 minutes | 15 minutes
3 | Geocode and filter | Geocoding | Geocode the set and filter further based on the geocoder score and status. | ArcMap | Once per Bulk Loader process | 1 hour | 5 minutes
4 | Export full set (single batch) or subset (multiple batches) | Geocoding | For large datasets, create one of many subsets that will be run through the Bulk Loader in multiple batches. | ArcMap | One or more batches for each Bulk Loader process | 30 minutes per batch | 5 minutes per batch
5 | Bulk Load batch (full set or subset) | Bulk Loading | Run the entire batch or each subset batch through the Bulk Loader. | | One or more batches for each Bulk Loader process | 1 hour per batch | 5 minutes per batch
6 | Extract results | Bulk Loading | Extract and archive the list of addresses that were added to the EAS, the unique EAS 'change request id' associated with this batch, and the addresses that were rejected by the Bulk Loader in this batch. | PostgreSQL / pgAdmin | One or more batches for each Bulk Loader process | 1 hour per batch | 5 minutes per batch
7 | Cleanup and restoration | Bulk Loading | Clean up the database, restore services and, in the event of a failure, restore from backup. | PostgreSQL / pgAdmin | One or more batches for each Bulk Loader process | 1 hour per batch | 5 minutes per batch
...
This optional stage is run once per Bulk Loader process. This stage can be skipped if the reference dataset is already available or if the optional 'filter by reference' step (Step 2.5) is skipped.
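The 'filter by reference' cross-check can be sketched at the command line. This is an illustrative sketch only, assuming both datasets have been exported to plain-text files with one normalized address per line; the file names are hypothetical and not part of the documented process:

```shell
# Hypothetical sketch of the reference cross-check. Sample inputs are created
# inline here; in practice they would come from the imported datasets.
printf '100 MAIN ST\n200 OAK AVE\n' > reference_addresses.txt
printf '100 MAIN ST\n300 PINE RD\n' > source_addresses.txt

# comm requires sorted input.
sort -u reference_addresses.txt > reference_sorted.txt
sort -u source_addresses.txt > source_sorted.txt

# Addresses found in the reference set move on to the next step;
# source-only addresses go to the exclusion set for later review.
comm -12 source_sorted.txt reference_sorted.txt > addresses_matched.txt
comm -23 source_sorted.txt reference_sorted.txt > addresses_excluded.txt
```

Here `comm -12` keeps lines common to both sorted files, while `comm -23` keeps lines unique to the source file.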
...
Stage 3 - Geocode and filter
- Step 3.1 - Geocode source dataset
- Input - addresses_to_geocode.csv
- Output - addresses_geocoded.shp
Detailed Substeps
- Create a folder for storing all input and output artifacts for this iteration of the Bulk Loader process. For example:
  R:\Tec\..\Eas\_Task\2018_2019\path\to\archive\bulkloader_process_YYYYMMDD
- Create a new ArcMap map document (ArcMap 10.6.1). For example:
  bulkloader_YYYYMMDD.mxd
- Add streets (optional). See StClines_20190129.shp in R:\Tec\...\Eas\_Task\2018_2019\20181128_248_DocumentBulkLoader\Data
- Create a personal geodatabase. In the Catalog window, right-click Home under Folder Connections and select New → Personal Geodatabase.
- Import the CSV into the personal geodatabase. Right-click the new personal geodatabase and select Import → Table (single). Browse to addresses_to_geocode.csv and specify the output table addresses_to_geocode. Click OK and time the operation with a stopwatch; wait up to 5 minutes for the table of contents (TOC) to update.
- Geocode with the Virtual Address Geocoder:
  - Right-click the table addresses_to_geocode in the ArcMap table of contents and select Geocode Addresses.
  - In the 'Choose an address geocoder to use' dialog, select 'Add'.
  - Browse to and select R:\311\...\StClines_20150729_VirtualAddressLocator (TODO - Replace with new path).
  - Click 'OK'.
  - Under 'Address Input Fields', select 'Multiple Fields':
    - Street or intersection: address
    - ZIP Code: zip
  - Under Output, click the folder icon. In the popup, change 'Save as type' to 'Shapefile'. Save the shapefile in the path dedicated to artifacts for this Bulk Loader process. For example:
    R:\Tec\..\Eas\_Task\2018_2019\path\to\archive\bulkloader_YYYYMMDD\geocoder\addresses_geocoded.shp
  - Time with a stopwatch and note the execution time.
- Artifacts (**)
bulkloader_process_YYYYMMDD.mxd
addresses_geocoded.shp
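The filtering half of this stage (by geocoder score and status) is performed in ArcMap, but the idea can be sketched against a CSV export of the geocoder output. The column layout, the status code 'M', and the score threshold of 90 below are assumptions for illustration, not documented EAS values:

```shell
# Hypothetical sketch of filtering geocoder output by status and score.
# Sample data stands in for a CSV export of addresses_geocoded.shp.
cat > addresses_geocoded.csv <<'EOF'
address,status,score
100 MAIN ST,M,95
200 OAK AVE,U,0
300 PINE RD,M,72
EOF

# Keep the header row plus matched ('M') records with score >= 90.
awk -F',' 'NR == 1 || ($2 == "M" && $3 + 0 >= 90)' \
  addresses_geocoded.csv > addresses_filtered.csv
```

The `$3 + 0` forces a numeric comparison on the score column so that string comparison quirks do not drop valid rows.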
...
Stage 4 - Export full set (single batch) or subset (multiple batches)
Note
Stages 4, 5 and 6 can be run one time with the results from Stage 3, or they can be run in multiple batches of subsets. A major consideration in deciding whether to run the full set at once or in batches is the number of records being Bulk Loaded, because the size of each Bulk Loader operation affects several aspects of the EAS.
For medium-to-large datasets (input sets with over 1,000 records) it is recommended to run the process on a development server first and assess the implications of the operation. Where appropriate, perform the Bulk Loading process in batches over several days or weeks. The remaining steps document one example iteration of a multi-batch process.
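As a sketch of the batching idea, a filtered address CSV can be split into fixed-size batch files at the command line. The batch size of 1,000 records and the file names are assumptions for illustration; in the documented process the subsets are exported from ArcMap:

```shell
# Hypothetical sketch: split a large filtered set into Bulk Loader batches.
# Generate a sample input of 2,500 records plus a header row.
seq 1 2500 | sed 's/^/ADDR-/' | { echo 'address'; cat; } > addresses_filtered.csv

# Split the data rows (not the header) into files of 1,000 records each.
tail -n +2 addresses_filtered.csv | split -l 1000 - batch_

# Prepend the header row to every batch file.
for f in batch_??; do
  { head -n 1 addresses_filtered.csv; cat "$f"; } > "$f.csv"
  rm "$f"
done
```

With 2,500 records this produces three batch files (two of 1,000 records and one of 500), each carrying its own header row so every batch is a valid standalone CSV.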
...
Stage 5 - Run the Bulk Loader
For a complete set of steps and background about the Bulk Loader, see also Running the Bulk Loader, a page dedicated to its input, operation and results.
...
Stage 6 - Extract results
- Step 6.1 - Archive exceptions
...
- If the Bulk Loader process was run on the production server, then restore services:
Turn on production-to-replication service
- TODO: add steps
Turn on downstream database propagation service(s)
Resume downstream replication to internal business system database (SF PROD WEB).
start xmit:

```text
sudo /var/www/html/eas/bin/xmit_change_notifications.bsh start
```
Enable front-end access to EAS
Place the Web servers into live mode (SF PROD WEB, DR PROD WEB).
```bash
cd /var/www/html
sudo ./set_eas_mode.sh LIVE
```
...