
Table of Contents
Overview

...

Stage 1 - Import and parse reference dataset (Optional)
  Category: Parsing
  Summary: This optional step in the Bulk Loader process cross-checks each address for a match in a reference dataset. If a source address is found in the reference dataset, it proceeds to the next step; if not, it is set aside in an exclusion set for later review.
  Environment: Python 3; PostgreSQL / pgAdmin
  Iterations: Once per Bulk Loader process
  Estimated Person Time: 1 hour
  Estimated Computer Time: 10 minutes

Stage 2 - Import, parse and filter source dataset
  Category: Parsing
  Summary: Import the dataset destined for the EAS, then parse and filter the set.
  Environment: Python 3; PostgreSQL / pgAdmin
  Iterations: Once per Bulk Loader process
  Estimated Person Time: 90 minutes
  Estimated Computer Time: 15 minutes

Stage 3 - Geocode and filter
  Category: Geocoding
  Summary: Geocode the set and filter it further based on the geocoder score and status.
  Environment: ArcMap
  Iterations: Once per Bulk Loader process
  Estimated Person Time: 1 hour
  Estimated Computer Time: 5 minutes

Stage 4 - Export full set (single batch) or subset (multiple batches)
  Category: Geocoding
  Summary: For large datasets, create one of many subsets that will be run through the Bulk Loader in multiple batches.
  Environment: ArcMap
  Iterations: One or more batches per Bulk Loader process
  Estimated Person Time: 30 minutes per batch
  Estimated Computer Time: 5 minutes per batch

Stage 5 - Bulk Load batch (full set or subset)
  Category: Bulk Loading
  Summary: Run the entire batch or each subset batch through the Bulk Loader.
  Environment: EAS <environment>; PostgreSQL / pgAdmin
  Iterations: One or more batches per Bulk Loader process
  Estimated Person Time: 1 hour per batch
  Estimated Computer Time: 5 minutes per batch

Stage 6 - Extract results
  Category: Bulk Loading
  Summary: Extract and archive the list of addresses added to the EAS, the unique EAS 'change request id' associated with this batch, and the addresses rejected by the Bulk Loader in this batch.
  Environment: PostgreSQL / pgAdmin
  Iterations: One or more batches per Bulk Loader process
  Estimated Person Time: 1 hour per batch
  Estimated Computer Time: 5 minutes per batch

Stage 7 - Cleanup and Restoration
  Category: Bulk Loading
  Summary: Clean up the database, restore services and, in the event of a failure, restore from backup.
  Environment: PostgreSQL / pgAdmin
  Iterations: One or more batches per Bulk Loader process
  Estimated Person Time: 1 hour per batch
  Estimated Computer Time: 5 minutes per batch

...

Anchor
stage1
stage1
Stage 1 Import and parse reference dataset (Optional)

This optional stage is run once per Bulk Loader process. This stage can be skipped if the reference dataset is already available or if the optional 'filter by reference' step (Step 2.5) is skipped.
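The cross-check against the reference dataset can be sketched as below. This is a minimal illustration, assuming already-normalized address strings; the actual matching logic, normalization rules, and table names used by the Bulk Loader scripts may differ.

```python
def cross_check(source_addresses, reference_set):
    """Split source addresses into those found in the reference
    dataset and an exclusion set for later review."""
    matched, excluded = [], []
    for addr in source_addresses:
        if addr in reference_set:
            matched.append(addr)
        else:
            excluded.append(addr)
    return matched, excluded

# Hypothetical example data; real runs read from PostgreSQL tables.
reference = {"1 MAIN ST", "10 MARKET ST"}
source = ["1 MAIN ST", "99 FAKE AVE", "10 MARKET ST"]
keep, review = cross_check(source, reference)
```

Addresses in `keep` continue to the next step; addresses in `review` form the exclusion set.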

...

Anchor
stage3
stage3
Stage 3 Geocode and filter

  • Step 3.1 - Geocode source dataset

...

Anchor
stage4
stage4
Stage 4 Export full set (single batch) or subset (multiple batches)

Note
titleA note about batches

Stages 4, 5 and 6 can be run one time with the results from Stage 3, or they can be run in multiple batches of subsets.

A major consideration in deciding whether to run the full set at once or in batches is the number of records being Bulk Loaded.

The size of each Bulk Loader operation affects the following aspects of the EAS:

  • The disk space consumed by the database server
  • The EAS user interface section that lists addresses loaded in a given Bulk Loader operation
  • The weekly email attachment listing new addresses added to the EAS

For medium-to-large datasets (input sets with over 1,000 records), it is recommended that the process first be run on a development server to assess the implications of the operation. Where appropriate, perform the Bulk Loading process in batches over several days or weeks.

The remaining stages document a single example batch of a multi-batch process.
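The batch split described in the note can be sketched as below. This is a hypothetical helper for illustration only; the actual subset export is performed in ArcMap in Stage 4.

```python
def split_into_batches(records, batch_size):
    """Yield successive batches of at most batch_size records,
    e.g. for running Stages 4-6 over several days or weeks."""
    for start in range(0, len(records), batch_size):
        yield records[start:start + batch_size]

# A 2,500-record dataset split into 1,000-record batches
# yields three batches (1000, 1000, 500).
batches = list(split_into_batches(list(range(2500)), 1000))
```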

...

Anchor
stage5
stage5
Stage 5 Run the Bulk Loader

(info) For a complete set of steps and background about the Bulk Loader, see also Running the Bulk Loader, a page dedicated to its input, operation and results.

...

  1. Restore Database
    1. Restore the database from the latest daily production backup.


  • Step 5.3 - Backup Database

  1. Make a backup of the EAS database.

  • Step 5.4 - Database Preparation

  1. Connect to the database, <environment>_DB, and clear any leftover records from previous Bulk Loader batches.

    Code Block
    languagesql
    titletruncate
    linenumberstrue
    TRUNCATE bulkloader.address_extract, bulkloader.blocks_nearest;


    Code Block
    languagesql
    titlevacuum
    linenumberstrue
    VACUUM FULL ANALYZE bulkloader.address_extract;


    Code Block
    languagesql
    titlevacuum
    linenumberstrue
    VACUUM FULL ANALYZE bulkloader.blocks_nearest;


  2. Make note of EAS record counts before the Bulk Loading operation.


    Code Block
    languagesql
    firstline1
    titleRecord Counts
    linenumberstrue
    SELECT schemaname,relname,n_live_tup FROM pg_stat_user_tables ORDER BY schemaname,relname,n_live_tup


  3. Make note of the database partition size on the file system.

    Code Block
    languagebash
    firstline1
    titledisk usage
    linenumberstrue
    df /data


  • Step 5.5 - Transfer Shapefiles

  1. Transfer the bulkload.shp shapefile from Stage 4 to an EAS automation machine, <environment>_AUTO.
  2. Substitute <environment> with one of the relevant environments: SF_DEV, SF_QA, SF_PROD, SD_PROD.
  3. Copy the shapefile to the folder C:\apps\eas_automation\app_data\data\bulkload_shapefile.

...

  • Step 5.6 - Run Bulk Loader

  1. Open a command prompt and change folders:
    • cd C:\apps\eas_automation\automation\src

  2. Run the step to stage the address records:

    • python job.py --job stage_bulkload_shapefile --env SF_DEV --action EXECUTE --v

  3. Run the step to bulk load the address records:

    • python job.py --job bulkload --env SF_DEV --action EXECUTE --v

  4. (info) To calculate how long the Bulk Loader ran, compare the timestamps in the output, or time the operation with a stopwatch or clock.
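The elapsed time mentioned in the (info) note can be computed from the output timestamps. This is a minimal sketch that assumes `YYYY-MM-DD HH:MM:SS` timestamps; the actual job.py log format may differ.

```python
from datetime import datetime

def elapsed_seconds(start_stamp, end_stamp, fmt="%Y-%m-%d %H:%M:%S"):
    """Return the wall-clock seconds between two log timestamps."""
    start = datetime.strptime(start_stamp, fmt)
    end = datetime.strptime(end_stamp, fmt)
    return (end - start).total_seconds()

# Hypothetical timestamps taken from the first and last output lines.
seconds = elapsed_seconds("2020-01-01 10:00:00", "2020-01-01 10:04:30")
```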


    Anchor
    analysis
    analysis

  • Step 5.7 - Analysis

  1. Make note of EAS record counts after the Bulk Load operation.

    Code Block
    languagesql
    firstline1
    titleRecord Counts
    linenumberstrue
    SELECT schemaname,relname,n_live_tup FROM pg_stat_user_tables ORDER BY schemaname,relname,n_live_tup


    • A comparison of 'before' and 'after' record counts will indicate the number of new base addresses added to the table `public.address_base` and the number of new addresses and units added to the table `public.addresses`.

    • (info) See the dedicated Bulk Loader page, Running the Bulk Loader, for more analysis options.

  2. Make note of the database partition size on the file system. Compare it with the partition size recorded before loading to get the total disk space consumed by the Bulk Loader run.

    Code Block
    languagebash
    firstline1
    titledisk usage
    linenumberstrue
    df /data


  3. Query and make note of totals in the bulkloader.address_extract table. The results here will be used to cross check the results in the next stage.
    1. Count/view new base addresses added to the EAS.


      Code Block
      languagesql
      firstline1
      titleCount/view new base addresses
      linenumberstrue
      SELECT COUNT(*) FROM bulkloader.address_extract WHERE NOT (street_segment_id IS NULL)
      
      SELECT * FROM bulkloader.address_extract WHERE NOT (street_segment_id IS NULL)


    2. Count/view unit addresses (some already existed; some are new).

      Code Block
      languagesql
      firstline1
      titleCount/view unit addresses
      linenumberstrue
      SELECT COUNT(*) FROM bulkloader.address_extract WHERE NOT (address_id IS NULL)
      
      SELECT * FROM bulkloader.address_extract WHERE NOT (address_id IS NULL)
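The 'before' and 'after' record-count comparison in the analysis step above can be sketched as below. This is a hypothetical helper operating on `(schema, table) -> n_live_tup` snapshots taken from the pg_stat_user_tables query; it is not part of the actual Bulk Loader scripts.

```python
def count_deltas(before, after):
    """Given {(schema, table): n_live_tup} snapshots taken before and
    after the Bulk Load, return the per-table row-count increase."""
    return {
        key: after.get(key, 0) - before.get(key, 0)
        for key in after
        if after.get(key, 0) != before.get(key, 0)
    }

# Hypothetical snapshot values for the two tables of interest.
before = {("public", "address_base"): 100, ("public", "addresses"): 250}
after = {("public", "address_base"): 120, ("public", "addresses"): 300}
deltas = count_deltas(before, after)
```

The deltas give the number of new base addresses added to `public.address_base` and the number of new addresses and units added to `public.addresses`.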


Anchor
stage6
stage6
Stage 6 Extract results

  • Step 6.1 - Archive exceptions

...

  1. If the Bulk Loader process was run on the production server, restore services:
    1. Turn on production-to-replication service
      • TODO: add steps
    2. Turn on downstream database propagation service(s)
      • Resume downstream replication to internal business system database (SF PROD WEB).

        Code Block
        languagetext
        firstline1
        titlestart xmit
        sudo /var/www/html/eas/bin/xmit_change_notifications.bsh start


    3. Enable front-end access to EAS
      • Place the Web servers into live mode (SF PROD WEB, DR PROD WEB).

        Code Block
        languagebash
        linenumberstrue
        cd /var/www/html
        sudo ./set_eas_mode.sh LIVE


...