
The AVS addresses are imported into EAS via a one-time ETL.
Once EAS goes live, this ETL will be turned off.
Here I describe, in general terms, the processing that occurs during the ETL.

You can see most of the code at these URLs

but the execution path is not trivial.

In any case, here I walk through the process mostly in English.
I will try to call out the places where I have to generalize.

We start by running some blanket validations and standardizing some values.
This is done in this db proc:

and includes the following checks:

'invalid street number suffix'
The domain values are here

'street name does not exist'
The domain values are specified by DPW.
We do not use fuzzy string matching.

'street suffix does not exist in street dataset'
The domain values are specified by DPW.
We do not use fuzzy string matching.

'street - street suffix combination does not exist'
The domain values are specified by DPW.
We do not use fuzzy string matching.
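To make the exact-match rule concrete, here is a sketch of how these domain checks can be written. The staging table name avsa follows the alias used elsewhere on this page, but dpw_streets, the column names, and the exception_text bookkeeping are all my assumptions rather than the real schema; the same pattern covers the street name and street suffix checks above.

    -- Sketch only: flag rows whose street / suffix combination has no
    -- exact match in the DPW street dataset.  No fuzzy matching is used.
    UPDATE avsa
    SET exception_text = 'street - street suffix combination does not exist'
    WHERE exception_text IS NULL
      AND NOT EXISTS (
          SELECT 1
          FROM dpw_streets s
          WHERE s.street_name   = avsa.street_name
            AND s.street_suffix = avsa.street_suffix
      );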

'referenced parcel has no geometry'
The parcel data comes from DPW.

'no matching block - lot'
The parcel data comes from DPW.

'block lot values are inconsistent'
We check for consistency across the columns block, lot, and block-lot.
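As a sketch, with assumed column names block_num, lot_num, and blk_lot:

    -- Sketch only: flag rows where block || lot does not equal the
    -- combined block-lot column.
    UPDATE avsa
    SET exception_text = 'block lot values are inconsistent'
    WHERE exception_text IS NULL
      AND coalesce(block_num, '') || coalesce(lot_num, '')
          <> coalesce(blk_lot, '');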

'length of concatenated unit num exceeds 10'
We concatenate avsa.unit and avsa.unit_sfx using http://code.google.com/p/eas/source/browse/trunk/etl/sql/avs_load/f_concatenate_unit_ddl.sql
The result must fit into a char(10) column.
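The check itself can be sketched as below; the inline concatenation is my assumption about what f_concatenate_unit does, so see its source at the URL above for the real logic.

    -- Sketch only: the concatenated unit value must fit into char(10).
    UPDATE avsa
    SET exception_text = 'length of concatenated unit num exceeds 10'
    WHERE exception_text IS NULL
      AND length(coalesce(trim(unit), '') || coalesce(trim(unit_sfx), '')) > 10;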

'invalid end date value'
Some of the end date values cannot be cast into the date type.
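A common way to detect such values in PostgreSQL is a helper that traps the cast error. This helper is my own sketch, not the ETL's actual function, and it assumes the staging end date is stored as text.

    -- Sketch only: return NULL instead of raising when the cast fails.
    CREATE OR REPLACE FUNCTION safe_to_date(v text) RETURNS date AS $$
    BEGIN
        RETURN v::date;
    EXCEPTION WHEN OTHERS THEN
        RETURN NULL;
    END;
    $$ LANGUAGE plpgsql;

    UPDATE avsa
    SET exception_text = 'invalid end date value'
    WHERE exception_text IS NULL
      AND end_date IS NOT NULL
      AND safe_to_date(end_date) IS NULL;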

We also standardize dates, "unit number" values, and street number suffixes.
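The standardization is plain normalization before matching; a sketch follows, with the caveat that the real rules live in the db proc and these column names are assumptions.

    -- Sketch only: trim and upper-case values so that the exact-match
    -- comparisons above behave predictably.
    UPDATE avsa
    SET unit           = nullif(upper(trim(unit)), ''),
        street_num_sfx = nullif(upper(trim(street_num_sfx)), '');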

At this point we have "excepted" addresses that we know we cannot process.
We take each remaining address and try to load it into EAS.
This proceeds from the base address, to the unit address, to the unit address - parcel link.
This is detailed here:

The main load proc calls into these procs in this order

The most interesting work is done by f_process_address_base.
I will describe this now and will do a good bit of generalizing.
We try to use an existing address and create a new one if necessary.
If we create a new one, we have to find the best matching street segment.
This is easily the most complicated process.
The segment must be within 500 feet of the specified parcel, and the street name and street suffix must match the source data.
Again, we do not use fuzzy string matching (such as Levenshtein distance).
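Assuming PostGIS and a spatial reference system whose linear unit is feet, the segment search can be sketched as follows; the tables street_segments and parcels and all column names are assumptions.

    -- Sketch only: find the closest segment within 500 feet of the
    -- referenced parcel whose name and suffix exactly match the source.
    SELECT seg.street_segment_id
    FROM street_segments seg
    JOIN parcels p ON p.blk_lot = '0001001'          -- the referenced parcel
    WHERE seg.street_name   = 'MARKET'               -- exact match, no fuzzy matching
      AND seg.street_suffix = 'ST'
      AND ST_DWithin(seg.geometry, p.geometry, 500)  -- 500 feet in a feet-based SRS
    ORDER BY ST_Distance(seg.geometry, p.geometry)   -- closest segment wins
    LIMIT 1;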

If all goes well, we insert the base address.
There are various constraints that protect data integrity, specifically to prevent duplicate active base addresses.
You can see the trigger-enforced constraints here

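In spirit the rule resembles a partial unique index like the sketch below, although the actual enforcement is done in triggers; one reason to prefer a trigger is that a unique index treats NULLs as distinct, so rows with a NULL street number suffix would slip past it. Table and column names here are assumptions.

    -- Sketch only: at most one active base address per
    -- number / suffix / segment combination.
    CREATE UNIQUE INDEX one_active_base_address
    ON address_base (street_number, street_number_sfx, street_segment_id)
    WHERE retire_date IS NULL;   -- active rows only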
If we make it past finding or creating the base address, we insert the unit address, if one is specified.
Again, we do not allow multiple active duplicates.

Finally, we insert an "address - parcel link".
Here again we do not allow duplicates.
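A duplicate-safe insert can be sketched like this; the link table name and the literal ids are placeholders for the ids resolved earlier in the load.

    -- Sketch only: insert the address - parcel link only if it is not
    -- already present (123 / 456 stand in for the resolved ids).
    INSERT INTO address_x_parcel (address_id, parcel_id)
    SELECT 123, 456
    WHERE NOT EXISTS (
        SELECT 1
        FROM address_x_parcel
        WHERE address_id = 123
          AND parcel_id  = 456
    );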

At the end of all this processing we compile results at the summary and detail levels and provide a QA report on the data.
An example is attached to this page.
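The summary level amounts to a count per outcome; here is a sketch against the assumed staging table.

    -- Sketch only: one row per exception type, plus one row for the
    -- addresses that loaded cleanly.
    SELECT coalesce(exception_text, 'loaded') AS outcome,
           count(*) AS address_count
    FROM avsa
    GROUP BY outcome
    ORDER BY address_count DESC;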
