data (class)

class data(log, dbConn, settings=False)[source]
Bases: marshallEngine.feeders.data.data
Import the ATLAS transient data into the marshall database
Key Arguments
log – logger
dbConn – the marshall database connection
settings – the settings dictionary
Usage
To set up your logger, settings and database connections, please use the fundamentals package (see tutorial here). To initiate a data object, use the following:

from marshallEngine.feeders.atlas.data import data
ingester = data(
    log=log,
    settings=settings,
    dbConn=dbConn
).ingest(withinLastDays=withInLastDay)
Methods

clean_up()
    A few tasks to finish off the ingest
get_csv_data(url[, user, pwd])
    Collect the CSV data from a URL, with the option to supply basic auth credentials
ingest(withinLastDays)
    Ingest the data into the marshall feeder survey table
insert_into_transientBucket([importUnmatched, updateTransientSummaries])
    Insert objects/detections from the feeder survey table into the transientbucket
clean_up()[source]
A few tasks to finish off the ingest
Key Arguments:
    None
Return:
    None
Usage:
    usage code
Todo
- add usage info
- create a sublime snippet for usage
- write a command-line tool for this method
- update package tutorial with command-line tool info if needed
get_csv_data(url, user=False, pwd=False)[source]
Collect the CSV data from a URL, with the option to supply basic auth credentials
Key Arguments
url – the url to the csv file
user – basic auth username
pwd – basic auth password
Return
csvData – a list of dictionaries from the csv file
Usage
To get the CSV data for a survey from a given URL in the marshall settings file, run something similar to:

from marshallEngine.feeders.panstarrs.data import data
ingester = data(
    log=log,
    settings=settings,
    dbConn=dbConn
)
csvDicts = ingester.get_csv_data(
    url=settings["panstarrs urls"]["3pi"]["summary csv"],
    user=settings["credentials"]["ps1-3pi"]["username"],
    pwd=settings["credentials"]["ps1-3pi"]["password"]
)
Note that the data will also be accessible afterwards via the ingester.csvDicts attribute.
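The returned csvData is a list of plain dictionaries, one per CSV row. As an illustrative sketch (not the marshall implementation, and with made-up sample data), the parsing step can be reproduced with Python's standard csv module:

```python
import csv
import io

def parse_csv_to_dicts(csv_text):
    """Parse CSV text into a list of plain dictionaries, mirroring
    the shape of the csvData value returned by get_csv_data."""
    return [dict(row) for row in csv.DictReader(io.StringIO(csv_text))]

# A made-up two-row CSV payload for illustration only
sample = "name,ra,dec\nATLAS21abc,215.1,-12.3\nATLAS21abd,10.4,5.6\n"
csvDicts = parse_csv_to_dicts(sample)
# Each row becomes a dict keyed by the header fields,
# e.g. csvDicts[0]["name"] is "ATLAS21abc"
```

Fetching the URL itself (with the optional basic auth credentials) is handled internally by the method; only the resulting row-dictionaries are exposed to the caller.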
ingest(withinLastDays)[source]
Ingest the data into the marshall feeder survey table
Key Arguments
withinLastDays – only ingest data from within this last number of days. Default: 50
insert_into_transientBucket(importUnmatched=True, updateTransientSummaries=True)[source]
Insert objects/detections from the feeder survey table into the transientbucket
Key Arguments
importUnmatched – import unmatched (new) transients into the marshall (not wanted in some circumstances)
updateTransientSummaries – update the transient summaries and lightcurves? Can be True or False, or alternatively a specific transientBucketId
This method aims to reduce crossmatching and load on the database by:

1. automatically assigning the transientbucket id to feeder survey detections whose object name is already found in the transientbucket (no spatial crossmatch required), then copying the matched feeder survey rows to the transientbucket.
2. crossmatching the remaining unique, unmatched sources in the feeder survey against sources in the transientbucket, adding the associated transientBucketIds to the matched feeder survey sources, then copying those rows to the transientbucket.
3. assigning a new transientbucketid to any feeder survey source not matched in steps 1 & 2, and copying these unmatched rows to the transientbucket as new transient detections.
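The three-step matching strategy above can be sketched in plain Python. This is illustrative only; the actual method works in SQL against the feeder survey and transientbucket tables, and the function name, row shape and spatial_match callable here are all assumptions:

```python
def assign_transientbucket_ids(feeder_rows, bucket_by_name, spatial_match, next_id):
    """Assign a transientBucketId to every feeder survey row (sketch).

    feeder_rows    -- list of dicts, each with at least a 'name' key
    bucket_by_name -- dict mapping known object names to transientBucketIds
    spatial_match  -- callable(row) returning a transientBucketId or None
    next_id        -- first id to hand out to brand-new transients
    """
    for row in feeder_rows:
        name = row["name"]
        # Step 1: match on object name alone (no spatial crossmatch needed)
        if name in bucket_by_name:
            row["transientBucketId"] = bucket_by_name[name]
            continue
        # Step 2: spatially crossmatch the remaining unmatched sources
        matched = spatial_match(row)
        if matched is not None:
            row["transientBucketId"] = matched
            bucket_by_name[name] = matched
            continue
        # Step 3: anything still unmatched becomes a new transient
        row["transientBucketId"] = next_id
        bucket_by_name[name] = next_id
        next_id += 1
    return feeder_rows
```

The cheap name lookup in step 1 is what saves database load: a spatial crossmatch is only attempted for sources that the name index cannot resolve.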
Return
None
Usage
ingester.insert_into_transientBucket()
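Since updateTransientSummaries accepts either a boolean or a single transientBucketId, a caller can refresh everything, nothing, or one specific transient. A minimal sketch of how such a bool-or-id flag can be interpreted (resolve_summary_targets is a hypothetical helper, not part of marshallEngine):

```python
def resolve_summary_targets(updateTransientSummaries, matched_ids):
    """Hypothetical helper showing one way to dispatch on a
    bool-or-id flag like updateTransientSummaries."""
    if updateTransientSummaries is True:
        return matched_ids          # refresh every transient touched by the ingest
    if updateTransientSummaries is False:
        return []                   # skip summary/lightcurve updates entirely
    # otherwise treat the value as a single transientBucketId
    return [updateTransientSummaries]
```

The identity checks (is True / is False) matter here: in Python, True == 1, so an equality test would wrongly treat transientBucketId 1 as the boolean flag.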