
Table of Contents
Overview

...

| Stage Number | Stage | Category | Summary | Environment | Iterations | Estimated Person Time | Estimated Computer Time |
| --- | --- | --- | --- | --- | --- | --- | --- |
| 1 | Import and parse reference dataset (Optional) | Parsing | This optional step cross-checks each source address against a reference dataset. Addresses found in the reference dataset move on to the next step; addresses not found are set aside in an exclusion set for later review. | Python 3, PostgreSQL / pgAdmin | Once per Bulk Loader process | 1 hour | 10 minutes |
| 2 | Import, parse and filter source dataset | Parsing | Import the dataset destined for the EAS. Parse and filter the set. | Python 3, PostgreSQL / pgAdmin | Once per Bulk Loader process | 90 minutes | 15 minutes |
| 3 | Geocode and filter | Geocoding | Geocode the set and filter further based on the geocoder score and status. | ArcMap | Once per Bulk Loader process | 1 hour | 5 minutes |
| 4 | Export full set (single batch) or subset (multiple batches) | Geocoding | For large datasets, create one of many subsets that will be run through the Bulk Loader in multiple batches. | ArcMap | One or more batches for each Bulk Loader process | 30 minutes per batch | 5 minutes per batch |
| 5 | Bulk Load batch (full set or subset) | Bulk Loading | Run the entire batch or each subset batch through the Bulk Loader. | EAS <environment>, PostgreSQL / pgAdmin | One or more batches for each Bulk Loader process | 1 hour per batch | 5 minutes per batch |
| 6 | Extract results | Bulk Loading | Extract and archive the list of addresses added to the EAS, the unique EAS 'change request id' associated with this batch, and the addresses rejected by the Bulk Loader in this batch. | PostgreSQL / pgAdmin | One or more batches for each Bulk Loader process | 1 hour per batch | 5 minutes per batch |
| 7 | Cleanup and Restoration | Bulk Loading | Clean up the database, restore services and, in the event of a failure, restore from backup. | PostgreSQL / pgAdmin | One or more batches for each Bulk Loader process | 1 hour per batch | 5 minutes per batch |

...

Anchor
stage1
Stage 1 Import and parse reference dataset (Optional)

This optional stage is run once per Bulk Loader process. It can be skipped if the reference dataset is already available or if the optional 'filter by reference' step (Step 2.5) will not be run.
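
A minimal way to picture the cross-check is a set-membership test on normalized address strings. The sketch below is an illustrative Python 3 example only; the file names and the 'address' column are assumptions, not part of the actual Bulk Loader scripts.

Code Block
languagepy
titlereference_crosscheck.py (illustrative sketch)
import csv

def normalize(addr):
    # Crude normalization so trivially different spellings still compare equal.
    return " ".join(addr.upper().split())

# Load the reference dataset (assumed to have an 'address' column).
with open("reference.csv", newline="") as f:
    reference = {normalize(row["address"]) for row in csv.DictReader(f)}

matched, excluded = [], []
with open("source.csv", newline="") as f:
    for row in csv.DictReader(f):
        (matched if normalize(row["address"]) in reference else excluded).append(row)

# 'matched' rows continue to the next step; 'excluded' rows form the exclusion set for later review.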

...

Anchor
stage3
Stage 3 Geocode and filter

  • Step 3.1 - Geocode source dataset

...

Anchor
stage4
Stage 4 Export shapefile - full set (single batch) or subset (multiple batches)

Note
titleA note about batches

Stages 4, 5 and 6 can be run one time with the results from Stage 3, or they can be run in multiple batches of subsets.

A major consideration in deciding whether to run the full set at once or in batches is the number of records being Bulk Loaded.

The size of each Bulk Loader operation affects the following aspects of the EAS:

  • The disk space consumed by the database server
  • The EAS user interface section that lists addresses loaded in a given Bulk Loader operation
  • The weekly email attachment listing new addresses added to the EAS

For medium-to-large datasets (input sets with over 1,000 records), it is recommended that the Bulk Loading process be run in batches over several days or weeks (a batching sketch follows this note).

Reminder! It is required that the process first be run on a development server to assess the implications of the operation.

The remaining steps document a single batch iteration. In a multi-batch process, repeat these steps for each batch.
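
As a rough illustration of the batch-size arithmetic, the hypothetical Python 3 snippet below splits a geocoded export into files of at most 1,000 records each. The actual subsetting in this process is done in ArcMap (Stage 4); the file names and threshold here are assumptions for illustration only.

Code Block
languagepy
titlesplit_into_batches.py (illustrative sketch)
import csv

BATCH_SIZE = 1000  # assumed threshold, per the recommendation above

with open("geocoded_filtered.csv", newline="") as f:
    rows = list(csv.DictReader(f))
    fieldnames = rows[0].keys() if rows else []

# Write batch_001.csv, batch_002.csv, ... each holding at most BATCH_SIZE records.
for start in range(0, len(rows), BATCH_SIZE):
    batch_number = start // BATCH_SIZE + 1
    with open(f"batch_{batch_number:03d}.csv", "w", newline="") as out:
        writer = csv.DictWriter(out, fieldnames=fieldnames)
        writer.writeheader()
        writer.writerows(rows[start:start + BATCH_SIZE])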

...

Anchor
stage5
Stage 5 Run the Bulk Loader

(info) For a complete set of steps and background about the Bulk Loader, see also Running the Bulk Loader, a page dedicated to its input, operation and results.

...

  1. Disable web service on <environment>_WEB (SF DEV WEB, SF QA WEB, SF PROD WEB)

    Code Block
    languagebash
    linenumberstrue
    # Put the EAS web application into maintenance mode so no edits occur during the load
    cd /var/www/html
    sudo ./set_eas_mode.sh MAINT


    1. Browse to the web site to confirm the service has stopped. (Expect to see a message that the EAS is currently out of service.)
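
    If a scripted check is preferred over browsing manually, a short Python 3 probe such as the one below can confirm the maintenance page is being served. The hostname placeholder and the exact out-of-service wording are assumptions to adapt to the environment being checked.

    Code Block
    languagepy
    titlecheck_maint_mode.py (illustrative sketch)
    import urllib.request

    # Placeholder hostname; substitute the <environment>_WEB server being checked.
    URL = "http://<environment>_WEB/"

    html = urllib.request.urlopen(URL, timeout=10).read().decode("utf-8", errors="replace")

    # The banner text is an assumption; adjust it to the message your EAS build displays.
    if "out of service" in html.lower():
        print("EAS is in maintenance mode.")
    else:
        print("WARNING: maintenance banner not found; the service may still be live.")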
  • Step 5.2 - Non-production-specific preparation

...

Anchor
stage6
Stage 6 Extract results

  • Step 6.1 - Archive exceptions

...

  1. Get all the base records added to the EAS during the Bulk Loader operation.
    1. Query the public.address_base table on the new change_request_id value.

      Code Block
      languagesql
      firstline1
      titleaddress_base
      linenumberstrue
      -- Replace <change_request_id> with the change request id recorded for this batch.
      SELECT activate_change_request_id, address_id, public.address_base.*
      FROM public.address_base, public.addresses
      WHERE public.address_base.address_base_id = public.addresses.address_base_id
      AND public.addresses.address_base_flg = TRUE
      AND public.addresses.activate_change_request_id = <change_request_id>;


    2. Save the file in the network folder dedicated to artifacts for the Bulk Loader iteration (one way to script the CSV export is sketched after this list).
      • For example, R:\Tec\..\Eas\_Task\path\to\archive\bulkloader_YYYYMMDD\bulkloader\batch_002\address_base.csv
  2. Extract a sample base address from the output
    1. Pick a random record from the results and note the value in the address_base_id field.
    2. Construct a URL from this value like this: http://eas.sfgov.org/?address=NNNNNN
      • Where NNNNNN is the value from the address_base_id field.
    3. Make note of this URL for use in Step 7 when testing EAS after services are restored.
  3. Artifacts
    1. address_base.csv - All the base records added to the EAS during the Bulk Loader operation.
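
One way to script the CSV export (instead of saving the query results from pgAdmin) is sketched below in Python 3 with psycopg2. The connection parameters, output path and the psycopg2 dependency are assumptions; exporting the query grid from pgAdmin produces the same artifact.

Code Block
languagepy
titleexport_address_base_csv.py (illustrative sketch)
import csv

import psycopg2  # assumed to be available; pgAdmin's CSV export works just as well

CHANGE_REQUEST_ID = 123456  # hypothetical value; use the id recorded for this batch

# Placeholder connection details; point these at the EAS database for this environment.
conn = psycopg2.connect(host="localhost", dbname="eas", user="postgres")
with conn, conn.cursor() as cur:
    cur.execute(
        """
        SELECT activate_change_request_id, address_id, public.address_base.*
        FROM public.address_base, public.addresses
        WHERE public.address_base.address_base_id = public.addresses.address_base_id
          AND public.addresses.address_base_flg = TRUE
          AND public.addresses.activate_change_request_id = %s
        """,
        (CHANGE_REQUEST_ID,),
    )
    with open("address_base.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow([desc[0] for desc in cur.description])  # header row
        writer.writerows(cur.fetchall())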

...