Get Started with Data Pipelines
The Data Pipelines service lets you schedule or trigger data workflows built with tools such as Data Solaris Notebooks, Serverless-spark, Serverless-flow, Spark Scripts, Metaflow Projects, and other ETL/ELT technologies that run in a Python environment. Data Pipelines also provides a visual workflow editor, so developers can express complex workflow logic through a drag-and-drop interface. To create a new Data Pipeline, click the "New Pipeline" button.
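Conceptually, what the drag-and-drop editor builds is a dependency graph of tasks: each node is a step, and each edge says one step must finish before another starts. The sketch below illustrates that idea in plain Python using the standard library's `graphlib`. It is not the Data Pipelines API; the task names (`extract`, `transform`, `load`) and the `run_pipeline` helper are hypothetical, chosen only to show how a declared graph drives execution order.

```python
# Illustrative sketch only -- not the Data Pipelines API.
# Shows how a declared task graph (like one drawn in the visual editor)
# determines execution order.
from graphlib import TopologicalSorter

def extract():
    # Hypothetical source step: produce some rows.
    return [1, 2, 3]

def transform(rows):
    # Hypothetical transform step: depends on extract's output.
    return [r * 10 for r in rows]

def load(rows):
    # Hypothetical sink step: depends on transform's output.
    return rows

def run_pipeline():
    # Each task lists the tasks it depends on, just as edges connect
    # nodes in a drag-and-drop pipeline editor.
    graph = {"extract": set(), "transform": {"extract"}, "load": {"transform"}}
    tasks = {"extract": extract, "transform": transform, "load": load}
    results = {}
    # static_order() yields tasks in a valid topological order,
    # so every task runs only after its dependencies.
    for name in TopologicalSorter(graph).static_order():
        args = [results[dep] for dep in graph[name]]
        results[name] = tasks[name](*args)
    return results

if __name__ == "__main__":
    print(run_pipeline())
```

A real pipeline run works the same way at a high level: the service walks the graph you drew, launching each step (a notebook, a Spark script, and so on) once its upstream steps have completed.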