- Apache Airflow, Prefect, and Temporal - For building and running data-intensive workflows.
- Apache Spark/Ray/Dask - For map-reduce-style parallel processing.
## Quick Start

Let’s create a simple workflow to summarize a website on demand! It demonstrates how to build and serve a workflow as a remote Python API.

### Define the Graph
We will write two functions, `scrape_website` and `summarize_text`. We then create a Graph, `website-summarizer`, that executes the scrape function and runs the summarizer on the scraper’s output.
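Here is a minimal sketch of that graph, assuming the `indexify_function` decorator and `Graph` API from the `indexify` Python SDK; the function bodies are illustrative placeholders:

```python
from urllib.request import urlopen

from indexify import Graph, indexify_function

@indexify_function()
def scrape_website(url: str) -> str:
    # Fetch the page as text; a real workflow might use a proper scraper.
    return urlopen(url).read().decode("utf-8", errors="replace")

@indexify_function()
def summarize_text(text: str) -> str:
    # Placeholder summarizer; swap in an LLM call for real summaries.
    return text[:500]

# Wire the functions into a graph: scrape first, then summarize its output.
g = Graph(name="website-summarizer", start_node=scrape_website)
g.add_edge(scrape_website, summarize_text)
```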
### Deploying a Graph as a Remote API

When it’s time to consume your graph from other applications, you can serve it as an API. There are many ways to run the server in production, but here we run it on a laptop to show how it works.
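The quickstart launches everything with a single CLI invocation; the command below assumes the `server-dev-mode` subcommand of `indexify-cli` (check `indexify-cli --help` if your version differs):

```bash
# Start a local Indexify server plus an executor in development mode.
indexify-cli server-dev-mode
```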
Note: The `indexify-cli` command is part of the `indexify` Python package installed earlier.

This starts the following processes:

- Server: Orchestrates functions in the graph, stores execution state, and hosts Remote Graph APIs.
- Executor: Runs the individual functions in the graph.
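With the server running, the graph can be deployed and invoked from any Python process. A minimal sketch, assuming the `RemoteGraph` API from the `indexify` SDK; the server URL and example website are illustrative:

```python
from indexify import RemoteGraph

# Deploy the graph defined earlier (`g`) to the local server.
# The server URL is an assumption for a default local setup.
RemoteGraph.deploy(g, server_url="http://localhost:8900")

# Look the graph up by name and invoke it, blocking until the run completes.
graph = RemoteGraph.by_name("website-summarizer")
invocation_id = graph.run(block_until_done=True, url="https://example.com")

# Retrieve the output produced by the final function in the graph.
print(graph.output(invocation_id, "summarize_text"))
```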