
Table of Contents
Overview


  • The Bulk Loader is a process used to add many new addresses to the EAS at one time.
  • The Bulk Loader process is made up of several stages, outlined below in Summary and Details.

...

Stage 1 - Import and parse reference dataset (Optional)
  Category: Parsing
  Summary: This optional step cross-checks each address for a match in a reference dataset. If a source address is found in the reference dataset, the address moves on to the next step. If not found, the address is set aside in an exclusion set for later review.
  Environment: Python 3; PostgreSQL / pgAdmin
  Iterations: Once per Bulk Loader process
  Estimated Person Time: 1 hour
  Estimated Computer Time: 10 minutes

Stage 2 - Import, parse and filter source dataset
  Category: Parsing
  Summary: Import the dataset destined for the EAS. Parse and filter the set.
  Environment: Python 3; PostgreSQL / pgAdmin
  Iterations: Once per Bulk Loader process
  Estimated Person Time: 90 minutes
  Estimated Computer Time: 15 minutes

Stage 3 - Geocode and filter
  Category: Geocoding
  Summary: Geocode the set and filter further based on the geocoder score and status.
  Environment: ArcMap
  Iterations: Once per Bulk Loader process
  Estimated Person Time: 1 hour
  Estimated Computer Time: 5 minutes

Stage 4 - Export full set (single batch) or subset (multiple batches)
  Category: Geocoding
  Summary: For large datasets, create one of many subsets that will be run through the Bulk Loader in multiple batches.
  Environment: ArcMap
  Iterations: One or more batches per Bulk Loader process
  Estimated Person Time: 30 minutes per batch
  Estimated Computer Time: 5 minutes per batch

Stage 5 - Bulk Load batch (full set or subset)
  Category: Bulk Loading
  Summary: Run the entire batch or each subset batch through the Bulk Loader.
  Environment: EAS <environment>(+); PostgreSQL / pgAdmin
  Iterations: One or more batches per Bulk Loader process
  Estimated Person Time: 1 hour per batch
  Estimated Computer Time: 5 minutes per batch

Stage 6 - Extract results
  Category: Bulk Loading
  Summary: Extract and archive the list of addresses added to the EAS, the unique EAS 'change request id' associated with the batch, and the addresses rejected by the Bulk Loader in the batch.
  Environment: PostgreSQL / pgAdmin
  Iterations: One or more batches per Bulk Loader process
  Estimated Person Time: 1 hour per batch
  Estimated Computer Time: 5 minutes per batch

Stage 7 - Cleanup and Restoration
  Category: Bulk Loading
  Summary: Clean up the database, restore services and, in the event of a failure, restore from backup.
  Environment: PostgreSQL / pgAdmin
  Iterations: One or more batches per Bulk Loader process
  Estimated Person Time: 1 hour per batch
  Estimated Computer Time: 5 minutes per batch

...

(star)Running on DEV or QA first is a requirement

Warning
titleRequired! Run addresses through DEV or QA first

Never load any new addresses into production until a successful trial run is performed on the same addresses in a non-production environment, such as development or QA.


(warning)Important considerations when running the Bulk Loader

Downstream implications

...

Anchor
stage1
stage1
Stage 1 Import and parse reference dataset (Optional)

This optional stage is run once per Bulk Loader process. It can be skipped if the reference dataset is already available or if the optional 'filter by reference' step (Step 2.5) is itself skipped.

...

Anchor
stage3
stage3
Stage 3 Geocode and filter

  •  

    Step 3.1 - Geocode source dataset

...

(warning) The total number of records of the two output shapefiles should be the same as the number of records in the input shapefile.


Anchor
stage4
stage4
Stage 4 Export shapefile - full set (single batch) or subset (multiple batches)

Note
titleA note about batches

Stages 4, 5 and 6 can be run one time with the results from Stage 3, or they can be run in multiple batches of subsets.

A major consideration in deciding whether to run the full set at once or in batches is the number of records being Bulk Loaded.

The size of each Bulk Loader operation affects the following aspects of the EAS:

  • The disk space consumed by the database server
  • The EAS user interface section that lists addresses loaded in a given Bulk Loader operation
  • The weekly email attachment listing new addresses added to the EAS

For medium-to-large datasets (input sets with over 1,000 records) it is recommended that the Bulk Loading process be run in batches over several days or weeks.

Reminder! It is required that the process first be run on a development server to assess the implications of the operation. Where appropriate, perform the Bulk Loading process in batches over several days or weeks.

The remaining steps document a single example batch iteration. In a multi-batch process, repeat these steps for each batch.
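The batch windows used in Stage 4 can be generated mechanically. A minimal sketch in Python, assuming the sequential "counter_" field and 50,000-record batch size used in the ArcMap selection step (the 120,000-record total is an illustrative value):

```python
# Sketch: generate the per-batch WHERE clauses used when exporting subsets.
# Assumes a sequential "counter_" field, as in the ArcMap selection step.
def batch_where_clauses(total_records, batch_size=50000):
    clauses = []
    for start in range(0, total_records, batch_size):
        end = min(start + batch_size, total_records)
        clauses.append('"counter_" > %d AND "counter_" <= %d' % (start, end))
    return clauses

for clause in batch_where_clauses(120000):
    print(clause)
```

The last batch simply picks up the remainder, which is why a final batch may hold fewer than 50,000 records.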

  •  

    Step 4.1 - Export shapefile for Bulk Loading (entire set or subset batch)

...

      1. Right-click the layer geocoder_score_100 in the ArcMap Table of Contents and select Open Attributes Table.
      2. Click the first icon in the menu bar (top-left) and select Select By Attributes.
      3. Enter the following WHERE clause to select the current batch of 50,000 records:

        Code Block
        languagetext
        linenumberstrue
        "counter_" > 50000 AND "counter_" <= 100000


      4. Click Apply, wait for the operation to complete and then close the Attributes window.
        • (warning) The batch may contain fewer than 50,000 records due to filtering by geocoding results in the previous step.
    1. Export for Bulk Loader
      1. In the Table of Contents right-click the layer geocoder_score_100 and select Data → Export Data.

      2. Click the browse icon.

      3. In the file browser select the Save as type dropdown and select Shapefile.

      4. Save the shapefile to the artifacts folder dedicated to this iteration of the Bulk Loader Process.

 e.g. R:\Tec\..\Eas\_Task\2018_2019\path\to\archive\bulkloader_process_YYYYMMDD\bulkloader\batch_NNN\bulkload.shp

  1. Artifacts
    1. bulkload.shp - Shapefile for loading into the Bulk Loader in the next stage.

Anchor
stage5
stage5
Stage 5 Run the Bulk Loader

(info) For a complete set of steps and background about the Bulk Loader, see also Running the Bulk Loader, a page dedicated to its input, operation and results.

Warning
titleRequired! Run addresses through DEV or QA first

Never load any new addresses into production until a successful trial run is performed on the same addresses in a non-production environment, such as development or QA.


Place EAS web application into maintenance mode

...

  •  

    Step 5.1 -

    Production-specific preparation

...

  1. Make a backup of the EAS database

Halt Services

Warning
titleReason for halting services

These steps are being performed to facilitate immediate roll-back of the EAS database if the Bulk Load Process ends in failure.

  • Disable front-end access to EAS

  1. Notify relevant recipients that the Bulk Loader Process is starting
  2. Disable web service on <environment>_WEB (SF DEV WEB, SF QA WEB, SF PROD WEB)

    Code Block
    languagebash
    title
    Place EAS web application into maintenance mode
    linenumberstrue
    cd /var/www/html
    sudo ./set_eas_mode.sh MAINT


    3. Browse to the web site, http://eas.sfgov.org/, to confirm the web service has stopped. (Expect to see a message that EAS is currently out of service.)
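The browse-and-confirm check lends itself to scripting. A hedged sketch in Python; the exact wording of the out-of-service message is an assumption, so adjust the marker text to match the real maintenance page:

```python
import urllib.request

def page_shows_maintenance(body, marker="out of service"):
    """Report whether the page body contains the maintenance marker text."""
    return marker.lower() in body.lower()

def eas_in_maintenance(url="http://eas.sfgov.org/", marker="out of service"):
    """Fetch the EAS front page and check for the maintenance message."""
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            body = resp.read().decode("utf-8", errors="replace")
    except OSError:
        # An unreachable site also counts as not serving normal traffic.
        return True
    return page_shows_maintenance(body, marker)
```

This is a convenience only; visually confirming the page is still the documented check.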


  •  

    Step 5.2 - Non-production-specific preparation

  1. Restore Database

    1. Restore database from latest daily production backup


  •  

    Step 5.3 - Production-specific preparation

  1. Halt Services

    Warning
    titleReason for halting services

    These steps are being performed to facilitate immediate roll-back of the EAS database if the Bulk Load Process ends in failure


    1. SKIP Turn off the replication server

      1. Disable database replication by shutting down the database service on the replication server (DR PROD DB).


        Code Block
        languagebash
        titleStop PostgreSQL
        linenumberstrue
        #sudo -u postgres -i
        #/usr/pgsql-9.0/bin/pg_ctl -D /data/9.0/data stop


    2. Turn off downstream database propagation service(s)

      1. Suspend downstream replication to internal business system database (SF PROD WEB).

        Code Block
        languagebash
        titlestop xmit
        linenumberstrue
        sudo /var/www/html/eas/bin/xmit_change_notifications.bsh stop

...

sudo -u postgres -i
/home/dba/scripts/dbbackup.sh > /var/tmp/dbbackup.log # this step takes about 2 minutes
ls -l /var/tmp # ensure the log file is 0 bytes
ls -la /mnt/backup/pg/daily/easproddb.sfgov.org-* # the timestamp on the last file listed should match timestamp of backup
exit # logout of user postgres when done
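The two manual checks in the backup commands above (a zero-byte log and a fresh timestamp on the newest backup file) can be automated. A sketch, assuming the paths from the commands above and treating a backup written within the last hour as fresh:

```python
import glob
import os
import time

def verify_backup(log_path="/var/tmp/dbbackup.log",
                  backup_glob="/mnt/backup/pg/daily/easproddb.sfgov.org-*",
                  max_age_seconds=3600):
    """Return True if the backup log is empty and the newest backup file is recent."""
    if os.path.getsize(log_path) != 0:  # a non-empty log indicates errors
        return False
    backups = glob.glob(backup_glob)
    if not backups:
        return False
    newest = max(backups, key=os.path.getmtime)
    # The newest file should have been written within the freshness window.
    return time.time() - os.path.getmtime(newest) <= max_age_seconds
```

The one-hour freshness window is an assumption; widen it if the backup job takes longer.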


  •  

    Step 5.3 - Database Preparation 

  1. Connect to the database, <environment>_DB, and clear any leftover records from previous Bulk Loader batches.

    Code Block
    languagesql
    titleTRUNCATE
    linenumberstrue
    TRUNCATE bulkloader.address_extract, bulkloader.blocks_nearest;


    Code Block
    languagesql
    titleVACUUM
    linenumberstrue
    VACUUM FULL ANALYZE bulkloader.address_extract;


    Code Block
    languagesql
    titleVACUUM
    linenumberstrue
    VACUUM FULL ANALYZE bulkloader.blocks_nearest;


  2. Make note of EAS record counts before the Bulk Loading operation.

    Code Block
    languagesql
    firstline1
    titleRecord Counts
    linenumberstrue
    SELECT schemaname,relname,n_live_tup FROM pg_stat_user_tables ORDER BY schemaname,relname,n_live_tup
    • A comparison of 'before' and 'after' record counts will indicate the number of new base addresses added to the table `public.address_base` and the number of new addresses and units added to the table `public.addresses`.

    1. Save artifact as record_counts_before.csv
    2. Also save results in Excel spreadsheet artifact as BulkLoader_Process_YYYYMMDD.xlsx
    3. (info) See dedicated Bulk Loader page, Running the Bulk Loader, for more analysis options.
  3. Make note of the database partition size on the file system at the current point in time.

    Code Block
    languagebash
    firstline1
    titledisk usage
    linenumberstrue
    date; df /data # 1st of 3
    Query and make note of totals in the bulkloader.address_extract table (populated once the Bulk Loader has run). The results will be used to cross check the results in the next stage.

    Count/view new base addresses added to the EAS.

    Code Block
    languagesql
    firstline1
    titleCount/view new base addresses
    linenumberstrue
    SELECT COUNT(*) FROM bulkloader.address_extract WHERE NOT (street_segment_id IS NULL)
    
    SELECT * FROM bulkloader.address_extract WHERE NOT (street_segment_id IS NULL)

    Count/view unit addresses (some were already there, some are new)

    Code Block
    languagesql
    firstline1
    titleCount/view unit addresses
    linenumberstrue
    SELECT COUNT(*) FROM bulkloader.address_extract WHERE NOT (address_id IS NULL)
    
    SELECT * FROM bulkloader.address_extract WHERE NOT (address_id IS NULL)

...



  •  

    Step 5.4 - Transfer Shapefiles 

  1. Transfer the bulkload.shp shapefile from Stage 4 to an EAS automation machine, <environment>_AUTO.
  2. Substitute <environment> with one of the relevant environments: SF_DEV, SF_QA, SF_PROD, SD_PROD.
  3. Copy the shapefile to the folder C:\apps\eas_automation\app_data\data\bulkload_shapefile.


  •  

    Step 5.5 - Run Bulk Loader

  1. Open a command prompt and change folders:

    Code Block
    languagebash
    linenumberstrue
    cd C:\apps\eas_automation\automation\src


  2. Run the step to stage the address records:

    Code Block
    languagebash
    linenumberstrue
    python job.py --job stage_bulkload_shapefile --env <environment> --action EXECUTE --v
    python job.py --job stage_bulkload_shapefile --env SF_DEV --action EXECUTE --v
    python job.py --job stage_bulkload_shapefile --env SF_QA --action EXECUTE --v
    python job.py --job stage_bulkload_shapefile --env SF_PROD --action EXECUTE --v


  3. Run the step to bulk load the address records


    Code Block
    languagebash
    linenumberstrue
    python job.py --job bulkload --env <environment> --action EXECUTE --v
    python job.py --job bulkload --env SF_DEV --action EXECUTE --v
    python job.py --job bulkload --env SF_QA --action EXECUTE --v
    python job.py --job bulkload --env SF_PROD --action EXECUTE --v
    
    


  4. (info) To calculate the time it took to run the Bulk Loader look at the timestamps in the output or use a stopwatch or clock to time the operation.


  5. Save Bulk Loader command line output artifact as bulk_loader_CLI_output.txt


  •  

    Step 5.6 - Analysis
    Anchor
    analysis
    analysis

  1. Make note of the database partition size on the file system at this point. Compare with size of partition prior to loading to get the total disk space used as a result of running the Bulk Loader.

    Code Block
    languagebash
    firstline1
    titledisk usage
    linenumberstrue
    date; df /data # 2nd of 3


  2. Make note of EAS record counts after the Bulk Load operation.

    Code Block
    languagesql
    firstline1
    titleRecord Counts
    linenumberstrue
    SELECT schemaname,relname,n_live_tup FROM pg_stat_user_tables ORDER BY schemaname,relname,n_live_tup


    • Save artifact as record_counts_after.csv
    • Also save results in Excel spreadsheet artifact as BulkLoader_Process_YYYYMMDD.xlsx
    • In the spreadsheet, calculate the difference between the 'before' and 'after' record counts. The results will indicate the number of new base addresses added to the table `public.address_base` and the number of new addresses and units added to the table `public.addresses`.

    • (info) See dedicated Bulk Loader page, Running the Bulk Loader, for more analysis options.
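The spreadsheet 'diff' calculation can be cross-checked in code. A sketch, assuming the before/after record-count artifacts were saved as CSVs with the columns returned by the query above (schemaname, relname, n_live_tup):

```python
import csv

def record_count_diff(before_csv, after_csv):
    """Return {(schema, table): after - before} for tables whose counts changed."""
    def load(path):
        with open(path, newline="") as f:
            return {(r["schemaname"], r["relname"]): int(r["n_live_tup"])
                    for r in csv.DictReader(f)}
    before, after = load(before_csv), load(after_csv)
    return {key: after.get(key, 0) - before.get(key, 0)
            for key in set(before) | set(after)
            if after.get(key, 0) != before.get(key, 0)}
```

For example, the entry for ('public', 'address_base') gives the number of new base addresses added by the batch.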

Anchor
stage6
stage6
Stage 6 Extract results

  •  

    Step 6.1 - Archive exceptions

Info
titleInfo about the 'address_extract' table

The Bulk Loader operation in Stage 5 populated an EAS table named 'bulkloader.address_extract' with every address it attempted to load.

If any errors occurred on a given address during the load, the Bulk Loader populated the 'exception_text' field with a description of the error.

  1. Archive the entire address_extract table.
    1. Use a query tool such as pgAdmin to query and save the table as a CSV file.

      Code Block
      languagesql
      firstline1
      titleaddress_extract
      linenumberstrue
      SELECT * FROM bulkloader.address_extract;


    2. Save the file in the network folder dedicated to artifacts for the Bulk Loader iteration.
      1. Save artifact as address_extract.csv
  2. Archive the addresses that raised exceptions during the Bulk Loader process
    1. Query subtotals

      Code Block
      languagesql
      firstline1
      titleexception_text_counts
      linenumberstrue
      SELECT exception_text, Count(*) FROM bulkloader.address_extract GROUP BY exception_text ORDER BY exception_text;
      1. Save artifact as exception_text_counts.csv

    2. Query all exception text records

      Code Block
      languagesql
      firstline1
      titleexception_text
      linenumberstrue
      SELECT * FROM bulkloader.address_extract WHERE NOT(exception_text IS NULL) ORDER BY exception_text, id;
      1. Save artifact as exception_text.csv
  3. Artifacts
    1. address_extract.csv - Results of every address submitted to the Bulk Loader.
    2. exception_text_counts.csv - Counts of the records that were not loaded due to the error indicated in the 'exception_text' field.
    3. exception_text.csv - Subset of just the records that were not loaded due to the error indicated in the 'exception_text' field.
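The exception subtotals can be recomputed from the archived extract as a sanity check on the two exception artifacts. A sketch, assuming address_extract.csv preserves the exception_text column (empty for rows that loaded cleanly):

```python
import csv
from collections import Counter

def exception_counts(address_extract_csv):
    """Tally exception_text values for rows that failed to load."""
    with open(address_extract_csv, newline="") as f:
        return Counter(row["exception_text"]
                       for row in csv.DictReader(f)
                       if row["exception_text"])
```

The tally should match exception_text_counts.csv, and its total should match the row count of exception_text.csv.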


  •  

    Step 6.2 - Archive unique EAS change_request_id associated with the Bulk Load

  1. Get the unique EAS change_request_id created by the Bulk Load operation. The value of <change_request_id> will be used in the next steps to count addresses added to the EAS.
    1. Query the 'public.change_requests' table for the new 'change_request_id' value.

      Code Block
      languagesql
      firstline1
      titlechange_request_id
      linenumberstrue
      SELECT change_request_id FROM public.change_requests 
      WHERE requestor_comment LIKE 'bulk load change request' 
      ORDER BY change_request_id DESC 
      LIMIT 1;


    2. Save artifact as change_request_id.csv
  2. Artifacts
    1. change_request_id.csv - The unique EAS change_request_id created by the Bulk Load operation.


  •  

    Anchor
    step6.3
    step6.3
    Step 6.3 - Archive new EAS addresses records

  1. Get all the address records (including units) added to the EAS during the Bulk Loader operation.
    1. Query the public.addresses table on the new change_request_id value.

      Code Block
      languagesql
      firstline1
      titleaddresses
      linenumberstrue
      SELECT * FROM public.addresses
      WHERE activate_change_request_id = <change_request_id>;


    2. Save artifact as addresses.csv
  2. Extract sample unit address from the output
    1. Pick a random record from the results where unit_num is not NULL. Gather the value in the address_base_id field.
    2. Construct a URL from this value like this: http://eas.sfgov.org/?address=NNNNNN
      • Where NNNNNN is the value from the address_base_id field.
    3. Make note of this URL for use in Step 7 when testing EAS after services are restored.
  3.  Artifacts
    1. addresses.csv - All the address records (including units) added to the EAS during the Bulk Loader operation.
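The sample-address URL above can be built directly from the archived rows. A sketch; the base URL and field names come from the step above, while the row layout (a list of dicts, as read from addresses.csv) is an assumption:

```python
import random

def sample_address_url(rows, require_unit=True):
    """Pick a random archived address row and build its EAS review URL.

    rows: dicts with at least 'address_base_id' and 'unit_num' keys.
    """
    candidates = [r for r in rows if not require_unit or r.get("unit_num")]
    if not candidates:
        return None
    row = random.choice(candidates)
    return "http://eas.sfgov.org/?address=%s" % row["address_base_id"]
```

Pass require_unit=False when sampling base addresses in Step 6.4, where any record qualifies.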


  •  

    Anchor
    step6.4
    step6.4
    Step 6.4 - Archive new EAS address_base records

  1. Get all the base records added to the EAS during the Bulk Loader operation.
    1. Query the public.address_base table on the new change_request_id value.

      Code Block
      languagesql
      firstline1
      titlepublic.address_base
      linenumberstrue
      SELECT activate_change_request_id, address_id, public.address_base.*
      FROM public.address_base, public.addresses
      WHERE public.address_base.address_base_id = public.addresses.address_base_id
      AND public.addresses.address_base_flg = TRUE
      AND public.addresses.activate_change_request_id = <change_request_id>;


    2. Save artifact as address_base.csv
  2. Extract sample base address from the output
    1. Pick a random record from the results. Gather the value in the address_base_id field.
    2. Construct a URL from this value like this: http://eas.sfgov.org/?address=NNNNNN
      • Where NNNNNN is the value from the address_base_id field.
    3. Make note of this URL for use in Step 7 when testing EAS after services are restored.
  3. Artifacts
    1. address_base.csv - All the base records added to the EAS during the Bulk Loader operation.

...




  •  

    Step 6.5 - Cross check results

Compare the results of Stage 5 with the results from Stage 6.

  1. The number of base addresses found in the Stage 5 Analysis should be identical to the number of base addresses found in Step 6.4.

  2. The number of addresses found in the Stage 5 Analysis should be less than or equal to the number of addresses listed in Step 6.3. (The Bulk Loader does not provide enough information in the bulkloader.address_extract table to determine the exact number of new addresses added, but there is enough information to determine an upper limit.)
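The cross-check rules can be expressed as assertions over the counts gathered earlier. A sketch with illustrative parameter names (the counts come from the Stage 5 Analysis and the Stage 6 artifacts):

```python
def cross_check(stage5_base_count, stage5_address_count,
                step63_address_count, step64_base_count):
    """Apply the Stage 6 cross-checks; raises AssertionError on a mismatch."""
    # New base addresses must match exactly.
    assert stage5_base_count == step64_base_count, "base address counts differ"
    # address_extract only yields a bound for new addresses, not an exact count.
    assert stage5_address_count <= step63_address_count, "address bound violated"
    return True
```

A failed assertion is a signal to stop and reconcile the artifacts before proceeding to Stage 7.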

Anchor
stage7
stage7
Stage 7 Cleanup and Restoration


...

Step 7.1 - Database Cleanup

...

Connect to the database, <environment>_DB, and clear temporary records from the latest Bulk Loader batch.

Code Block
languagesql
firstline1
titleTRUNCATE
linenumberstrue
TRUNCATE bulkloader.address_extract, bulkloader.blocks_nearest;
Code Block
languagesql
firstline1
titleVACUUM
linenumberstrue
VACUUM FULL ANALYZE bulkloader.address_extract;
Code Block
languagesql
firstline1
titleVACUUM
linenumberstrue
VACUUM FULL ANALYZE bulkloader.blocks_nearest;

...


...

Code Block
languagebash
titledisk usage
linenumberstrue
date; df /data # 3rd of 3
# Optional step: archive output to 'df.txt' artifact
exit


  •  

    Step 7.2 - Clean automation machine

  1. Return to automation machine and remove shapefile from 'bulkload_shapefile' folder.
  2. Logout of automation machine. 


  •  

    Step 7.3 - On Failure Restore Database

  1. If the Bulk Loader failed and corrupted any data then restore from the database backup.
    1. Follow these steps to restore from backup.


  •  

    Step 7.4 - Restore Services (Production Only)

  1. SKIP Turn on production-to-replication service
    • Re-enable database replication by restarting the database service on the replication server (DR PROD DB).


      Code Block
      languagebash
      titleStop PostgreSQL
      linenumberstrue
      #sudo -u postgres -i
      #/usr/pgsql-9.0/bin/pg_ctl -D /data/9.0/data start


  2. SKIP Turn on downstream database propagation service(s)
    • Resume downstream replication to internal business system database (SF PROD WEB).

      Code Block
      languagebash
      firstline1
      titlestart xmit
      #sudo /var/www/html/eas/bin/xmit_change_notifications.bsh start


  •  

    Anchor
    step7.5
    step7.5
    Step 7.5
     - Enable front-end access to EAS

  1. Enable web service on <environment>_WEB (SF DEV WEB, SF QA WEB, SF PROD WEB)

    Code Block
    languagebash
    linenumberstrue
    cd /var/www/html
    sudo ./set_eas_mode.sh LIVE
    exit


  2. Browse to the website, http://eas.sfgov.org/, and review the sample addresses gathered in Step 6.3 and Step 6.4

  3. Notify relevant recipients that the Bulk Loader Process is complete


  •  

    Step 7.6 - Archive artifacts

  1. List of artifacts
    1. address_base.csv
    2. address_extract.csv
    3. addresses.csv
    4. bulk_loader_CLI_output.txt
    5. change_request_id.csv
    6. df.txt
    7. exception_text.csv
    8. exception_text_counts.csv
  2. Contents of progress and summary artifact, BulkLoader_Process_YYYYMMDD.xlsx


    1. Progress - This sheet contains a table of relevant totals for each batch

      1. Batch number

      2. Batch date

      3. Input record counts

      4. New base record counts

      5. New unit record counts

      6. Sample addresses


    2. Email Jobs - This sheet contains a table of details related to the weekly 'Address Notification Report' automated email job

      1. Batch range

      2. Record count in batch range

      3. Email Timestamp

      4. Total record counts in email

      5. Subtotal of records generated as a result of the Bulk Loader

      6. Size of email attachment


    3. Batch N - This sheet tracks the before and after record counts for all tables in the EAS database. There is a sheet for each batch loaded. Within each sheet is a section for the 'before' records, a section for the 'after' record counts, and a 'diff' column showing the change in record counts.


END OF STEPS

Notes

Anchor
env
env
(+)  Substitute EAS <environment> with one of the relevant environments: SF_DEV, SF_QA, SF_PROD, SD_PROD

...