
Metadata Ingestion

Requires Python 3.6+.

This module hosts an extensible Python-based metadata ingestion system for DataHub. It supports sending metadata to DataHub either over Kafka or through the REST API, and can be used through our CLI tool, with an orchestrator like Airflow, or as a library.

Getting Started#

Prerequisites#

Before running any metadata ingestion job, you should make sure that DataHub backend services are all running. If you are trying this out locally, the easiest way to do that is through quickstart Docker images.

Install from PyPI#

The folks over at Acryl Data maintain a PyPI package for DataHub metadata ingestion.

# Requires Python 3.6+
python3 -m pip install --upgrade pip wheel setuptools
python3 -m pip install --upgrade acryl-datahub
datahub version
# If you see "command not found", try running this instead: python3 -m datahub version

If you run into an error, try checking the common setup issues.

Installing Plugins#

We use a plugin architecture so that you can install only the dependencies you actually need. Click the plugin name to learn more about the specific source recipe and any FAQs!

Sources:

| Plugin Name | Install Command | Provides |
| --- | --- | --- |
| file | included by default | File source and sink |
| athena | pip install 'acryl-datahub[athena]' | AWS Athena source |
| bigquery | pip install 'acryl-datahub[bigquery]' | BigQuery source |
| bigquery-usage | pip install 'acryl-datahub[bigquery-usage]' | BigQuery usage statistics source |
| datahub-business-glossary | no additional dependencies | Business Glossary File source |
| dbt | no additional dependencies | dbt source |
| druid | pip install 'acryl-datahub[druid]' | Druid source |
| feast | pip install 'acryl-datahub[feast]' | Feast source |
| glue | pip install 'acryl-datahub[glue]' | AWS Glue source |
| hive | pip install 'acryl-datahub[hive]' | Hive source |
| kafka | pip install 'acryl-datahub[kafka]' | Kafka source |
| kafka-connect | pip install 'acryl-datahub[kafka-connect]' | Kafka connect source |
| ldap | pip install 'acryl-datahub[ldap]' (extra requirements) | LDAP source |
| looker | pip install 'acryl-datahub[looker]' | Looker source |
| lookml | pip install 'acryl-datahub[lookml]' | LookML source, requires Python 3.7+ |
| mongodb | pip install 'acryl-datahub[mongodb]' | MongoDB source |
| mssql | pip install 'acryl-datahub[mssql]' | SQL Server source |
| mysql | pip install 'acryl-datahub[mysql]' | MySQL source |
| oracle | pip install 'acryl-datahub[oracle]' | Oracle source |
| postgres | pip install 'acryl-datahub[postgres]' | Postgres source |
| redash | pip install 'acryl-datahub[redash]' | Redash source |
| redshift | pip install 'acryl-datahub[redshift]' | Redshift source |
| sagemaker | pip install 'acryl-datahub[sagemaker]' | AWS SageMaker source |
| snowflake | pip install 'acryl-datahub[snowflake]' | Snowflake source |
| snowflake-usage | pip install 'acryl-datahub[snowflake-usage]' | Snowflake usage statistics source |
| sql-profiles | pip install 'acryl-datahub[sql-profiles]' | Data profiles for SQL-based systems |
| sqlalchemy | pip install 'acryl-datahub[sqlalchemy]' | Generic SQLAlchemy source |
| superset | pip install 'acryl-datahub[superset]' | Superset source |

Sinks:

| Plugin Name | Install Command | Provides |
| --- | --- | --- |
| file | included by default | File source and sink |
| console | included by default | Console sink |
| datahub-rest | pip install 'acryl-datahub[datahub-rest]' | DataHub sink over REST API |
| datahub-kafka | pip install 'acryl-datahub[datahub-kafka]' | DataHub sink over Kafka |

These plugins can be mixed and matched as desired. For example:

pip install 'acryl-datahub[bigquery,datahub-rest]'

You can check the active plugins:

datahub check plugins

Basic Usage#

pip install 'acryl-datahub[datahub-rest]'  # install the required plugin
datahub ingest -c ./examples/recipes/example_to_datahub_rest.yml

Install using Docker#


If you don't want to install locally, you can alternatively run metadata ingestion within a Docker container. We have prebuilt images available on Docker Hub, with all plugins installed and enabled automatically.

Limitation: the datahub_docker.sh convenience script assumes that the recipe and any input/output files are accessible in the current working directory or its subdirectories. Files outside the current working directory will not be found, and you'll need to invoke the Docker image directly.

# Assumes the DataHub repo is cloned locally.
./metadata-ingestion/scripts/datahub_docker.sh ingest -c ./examples/recipes/example_to_datahub_rest.yml

Install from source#

If you'd like to install from source, see the developer guide.

Recipes#

A recipe is a configuration file that tells our ingestion scripts where to pull data from (source) and where to put it (sink). Here's a simple example that pulls metadata from MSSQL and puts it into DataHub.

# A sample recipe that pulls metadata from MSSQL and puts it into DataHub
# using the REST API.
source:
  type: mssql
  config:
    username: sa
    password: ${MSSQL_PASSWORD}
    database: DemoData

transformers:
  - type: "fully-qualified-class-name-of-transformer"
    config:
      some_property: "some.value"

sink:
  type: "datahub-rest"
  config:
    server: "http://localhost:8080"

We automatically expand environment variables in the config, similar to variable substitution in GNU bash or in docker-compose files. For details, see https://docs.docker.com/compose/compose-file/compose-file-v2/#variable-substitution.

Running a recipe is quite easy.

datahub ingest -c ./examples/recipes/mssql_to_datahub.yml

A number of recipes are included in the examples/recipes directory. For full info and context on each source and sink, see the pages described in the table of plugins.
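
Since the framework can also be used as a library, the same source/sink configuration can be driven from your own Python code instead of a recipe file. The sketch below assumes the ingestion pipeline class lives at datahub.ingestion.run.pipeline.Pipeline and accepts a recipe-shaped dictionary; verify the exact path and behavior against your installed version.

# A minimal sketch: run the equivalent of the MSSQL recipe programmatically.
# The Pipeline import path is an assumption; check your installed acryl-datahub version.
import os

from datahub.ingestion.run.pipeline import Pipeline

pipeline = Pipeline.create(
    {
        "source": {
            "type": "mssql",
            "config": {
                "username": "sa",
                "password": os.environ["MSSQL_PASSWORD"],
                "database": "DemoData",
            },
        },
        "sink": {
            "type": "datahub-rest",
            "config": {"server": "http://localhost:8080"},
        },
    }
)
pipeline.run()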

Transformations#

If you'd like to modify data before it reaches the ingestion sinks (for instance, to add additional owners or tags), you can write your own transformer module and integrate it with DataHub.

Check out the transformers guide for more info!
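
As a rough illustration of what a transformer module can look like, here is a minimal sketch. The module paths and method signatures used here (datahub.ingestion.api.transform.Transformer, create, transform) are assumptions based on the transformer interface; consult the transformers guide for the exact API in your version.

# A rough sketch only; class and module paths are assumptions, so check the
# transformers guide for the exact interface in your version.
from typing import Iterable

from datahub.ingestion.api.common import PipelineContext, RecordEnvelope
from datahub.ingestion.api.transform import Transformer


class NoOpTransformer(Transformer):
    """Hypothetical transformer that passes every record through unchanged."""

    @classmethod
    def create(cls, config_dict: dict, ctx: PipelineContext) -> "NoOpTransformer":
        return cls()

    def transform(
        self, record_envelopes: Iterable[RecordEnvelope]
    ) -> Iterable[RecordEnvelope]:
        for envelope in record_envelopes:
            # Inspect or modify envelope.record (e.g. add owners or tags) before yielding it on.
            yield envelope

The fully-qualified class name of a transformer like this is what goes in the transformers section of a recipe, as in the sample recipe above.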

Using as a library#

In some cases, you might want to construct the MetadataChangeEvents yourself but still use this framework to emit that metadata to DataHub. In this case, take a look at the emitter interfaces, which can easily be imported and called from your own code.
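
For example, here is a minimal sketch of building a MetadataChangeEvent by hand and pushing it through the REST emitter. The import paths and class names (DatahubRestEmitter, the generated schema_classes) follow the acryl-datahub package layout but may differ slightly across versions, so treat them as assumptions.

# A minimal sketch: build an MCE by hand and emit it over REST.
# Import paths are assumptions; verify them for your installed version.
from datahub.emitter.rest_emitter import DatahubRestEmitter
from datahub.metadata.schema_classes import (
    DatasetPropertiesClass,
    DatasetSnapshotClass,
    MetadataChangeEventClass,
)

emitter = DatahubRestEmitter("http://localhost:8080")

mce = MetadataChangeEventClass(
    proposedSnapshot=DatasetSnapshotClass(
        urn="urn:li:dataset:(urn:li:dataPlatform:hive,fct_users_created,PROD)",
        aspects=[
            DatasetPropertiesClass(description="Table of user creation events."),
        ],
    )
)

emitter.emit_mce(mce)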

Lineage with Airflow#

There are a couple of ways to get lineage information from Airflow into DataHub.

note

If you're simply looking to run ingestion on a schedule, take a look at the sample ingestion DAGs included in the repository.

Using DataHub's Airflow lineage backend (recommended)#

caution

The Airflow lineage backend is only supported in Airflow 1.10.15+ and 2.0.2+.

Running on Docker locally#

If you are looking to run Airflow and DataHub using Docker locally, follow the guide here. Otherwise, follow the instructions below.

Setting up Airflow to use DataHub as Lineage Backend#

  1. You need to install the required dependency in your Airflow environment. See https://registry.astronomer.io/providers/datahub/modules/datahublineagebackend
  pip install 'acryl-datahub[airflow]'
  2. You must configure an Airflow hook for DataHub. We support both a DataHub REST hook and a Kafka-based hook, but you only need one.

    # For REST-based:
    airflow connections add --conn-type 'datahub_rest' 'datahub_rest_default' --conn-host 'http://localhost:8080'

    # For Kafka-based (standard Kafka sink config can be passed via extras):
    airflow connections add --conn-type 'datahub_kafka' 'datahub_kafka_default' --conn-host 'broker:9092' --conn-extra '{}'
  3. Add the following lines to your airflow.cfg file.

    [lineage]
    backend = datahub_provider.lineage.datahub.DatahubLineageBackend
    datahub_kwargs = {
        "datahub_conn_id": "datahub_rest_default",
        "capture_ownership_info": true,
        "capture_tags_info": true,
        "graceful_exceptions": true }
    # The above indentation is important!

    Configuration options:

    • datahub_conn_id (required): Usually datahub_rest_default or datahub_kafka_default, depending on what you named the connection in step 1.
    • capture_ownership_info (defaults to true): If true, the owners field of the DAG will be captured as a DataHub corpuser.
    • capture_tags_info (defaults to true): If true, the tags field of the DAG will be captured as DataHub tags.
    • graceful_exceptions (defaults to true): If set to true, most runtime errors in the lineage backend will be suppressed and will not cause the overall task to fail. Note that configuration issues will still throw exceptions.
  4. Configure inlets and outlets for your Airflow operators (a minimal sketch follows this list). For reference, look at the sample DAG in lineage_backend_demo.py, or at lineage_backend_taskflow_demo.py if you're using the TaskFlow API.

  5. [optional] Learn more about Airflow lineage, including shorthand notation and some automation.
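
Here is the minimal sketch referenced in step 4, declaring inlets and outlets with the Dataset entity from datahub_provider. The import path, the Dataset signature, and the dataset names shown are assumptions for illustration; compare against lineage_backend_demo.py for the exact, supported usage.

# A minimal sketch of declaring lineage on an operator; dataset names are illustrative.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

from datahub_provider.entities import Dataset  # assumed import path

with DAG(
    dag_id="datahub_lineage_sketch",
    start_date=datetime(2021, 1, 1),
    schedule_interval=None,
    catchup=False,
) as dag:
    transform = BashOperator(
        task_id="transform",
        bash_command="echo 'transforming data'",
        # The lineage backend picks these up and reports them to DataHub.
        inlets=[Dataset("snowflake", "mydb.schema.tableA")],
        outlets=[Dataset("snowflake", "mydb.schema.tableB")],
    )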

Emitting lineage via a separate operator#

Take a look at the sample lineage emission DAG included in the repository's Airflow examples.

In order to use this approach, you must first configure the DataHub hook. Like in ingestion, we support a DataHub REST hook and a Kafka-based hook. See step 2 above for details.
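
As a rough sketch, a dedicated lineage emission task can look like the following. The DatahubEmitterOperator and the mce_builder helpers are assumed from the datahub_provider / acryl-datahub packages, and the URNs are illustrative; verify names and parameters against the sample DAG.

# A rough sketch of emitting lineage from its own task; operator and builder
# names are assumptions, so check the sample DAG for the exact parameters.
from datetime import datetime

import datahub.emitter.mce_builder as builder
from airflow import DAG
from datahub_provider.operators.datahub import DatahubEmitterOperator

with DAG(
    dag_id="datahub_lineage_emission_sketch",
    start_date=datetime(2021, 1, 1),
    schedule_interval=None,
    catchup=False,
) as dag:
    emit_lineage_task = DatahubEmitterOperator(
        task_id="emit_lineage",
        datahub_conn_id="datahub_rest_default",  # or datahub_kafka_default
        mces=[
            builder.make_lineage_mce(
                upstream_urns=[builder.make_dataset_urn("mssql", "DemoData.dbo.orders")],
                downstream_urn=builder.make_dataset_urn("mssql", "DemoData.dbo.orders_summary"),
            )
        ],
    )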

Developing#

See the guides on developing, adding a source and using transformers.