1.15.4.dev2+g3e3ce2426

flytekit.core.workflow

Directory

Classes

Class Description
Any Special type indicating an unconstrained type.
ClassStorageTaskResolver Stores tasks inside a class variable.
CompilationState Compilation state is used during the compilation of a workflow or task.
ConditionalSection ConditionalSection is used to denote a condition within a Workflow.
Description Full user description with formatting preserved.
Docstring None.
Documentation DescriptionEntity contains detailed description for the task/workflow/launch plan.
Enum Create a collection of name/value pairs.
ExecutionState This is the context that is active when executing a task or a local workflow.
FlyteContext This is an internal-facing context object, that most users will not have to deal with.
FlyteContextManager FlyteContextManager manages the execution context within Flytekit.
FlyteEntities This is a global object that tracks the tasks and workflows declared within a VM during the registration process.
FlyteError Special error type used in the failure node.
ImperativeWorkflow An imperative workflow is a programmatic analogue to the typical @workflow function-based workflow and is better suited to programmatic applications.
Interface A Python native interface object, like inspect.signature but simpler.
Node This class holds everything necessary to make an SdkNode, which is not created until registration-time details such as the ID are known.
NodeOutput None.
Options These are options that can be configured for a launchplan during registration or overridden during an execution.
ParamSpec Parameter specification.
Promise This object is a wrapper and exists for three main reasons.
PythonAutoContainerTask A Python AutoContainer task should be used as the base for all extensions that want the user’s code to be in the container and the container information to be captured automatically.
PythonFunctionWorkflow Please read :std:ref:flyte:divedeep-workflows first for a high-level understanding of what workflows are in Flyte.
PythonTask Base Class for all Tasks with a Python native Interface.
ReferenceEntity None.
ReferenceWorkflow A reference workflow is a pointer to a workflow that already exists on your Flyte installation.
Task The base of all Tasks in flytekit.
TypeEngine Core Extensible TypeEngine of Flytekit.
VoidPromise This object is returned for tasks that do not return any outputs (declared interface is empty).
WorkflowBase None.
WorkflowFailurePolicy Defines the behavior for a workflow execution in the case of an observed node execution failure.
WorkflowMetadata None.
WorkflowMetadataDefaults Like WorkflowMetadata, but holds the default metadata that is handed down to the workflow’s tasks.
WorkflowReference A reference object containing metadata that points to a remote workflow.

Errors

Error Description
FlyteFailureNodeInputMismatchException Raised when the inputs of the failure node do not match the workflow’s inputs.
FlyteValidationException Raised when validation of a Flyte entity fails.
FlyteValueException Inappropriate argument value (of correct type).

flytekit.core.workflow.Any

Special type indicating an unconstrained type.

  • Any is compatible with every type.
  • Any assumed to have all methods.
  • All values assumed to be instances of Any.

Note that all the above statements are true from the point of view of static type checkers. At runtime, Any should not be used with instance checks.

flytekit.core.workflow.ClassStorageTaskResolver

Stores tasks inside a class variable. The class must be inherited from at the point of usage because the task loading process basically relies on the same sequence of things happening.
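
For illustration, a minimal sketch of how this resolver might be used; the subclass MyResolver and task t1 are hypothetical:

.. code-block:: python

    from flytekit import task
    from flytekit.core.workflow import ClassStorageTaskResolver

    @task
    def t1(a: int) -> int:
        return a + 1

    # The resolver is meant to be subclassed at the point of use.
    class MyResolver(ClassStorageTaskResolver):
        pass

    resolver = MyResolver()
    resolver.add(t1)
    # get_all_tasks() returns the tasks stored on the resolver, in insertion order.
    assert resolver.get_all_tasks() == [t1]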

def ClassStorageTaskResolver(
    args,
    kwargs,
):
Parameter Type
args *args
kwargs **kwargs

Methods

Method Description
add() None
find_lhs() None
get_all_tasks() Future proof method
load_task() Given the set of identifier keys, should return one Python Task or raise an error if not found
loader_args() This is responsible for turning an instance of a task into args that the load_task function can reconstitute
name() None
task_name() Overridable function that can optionally return a custom name for a given task

add()

def add(
    t: flytekit.core.python_auto_container.PythonAutoContainerTask,
):
Parameter Type
t flytekit.core.python_auto_container.PythonAutoContainerTask

find_lhs()

def find_lhs()

get_all_tasks()

def get_all_tasks()

Future-proofing method. Makes it easy to access all tasks (not required today, as tasks are auto-registered).

load_task()

def load_task(
    loader_args: typing.List[str],
):

Given the set of identifier keys, should return one Python Task or raise an error if not found

Parameter Type
loader_args typing.List[str]

loader_args()

def loader_args(
    settings: flytekit.configuration.SerializationSettings,
    t: flytekit.core.python_auto_container.PythonAutoContainerTask,
):

This is responsible for turning an instance of a task into args that the load_task function can reconstitute.

Parameter Type
settings flytekit.configuration.SerializationSettings
t flytekit.core.python_auto_container.PythonAutoContainerTask

name()

def name()

task_name()

def task_name(
    t: flytekit.core.base_task.Task,
):

Overridable function that can optionally return a custom name for a given task

Parameter Type
t flytekit.core.base_task.Task

Properties

Property Type Description
instantiated_in
lhs
location

flytekit.core.workflow.CompilationState

Compilation state is used during the compilation of a workflow or task. It stores the nodes that were created when walking through the workflow graph.

Attributes:

  • prefix (str): This is because we may one day want to be able to have subworkflows inside other workflows. If users choose to not specify their node names, then we can end up with multiple “n0"s. This prefix allows us to give those nested nodes a distinct name, as well as properly identify them in the workflow.
  • mode (int): Refer to :py:class:flytekit.extend.ExecutionState.Mode.
  • task_resolver (Optional[TaskResolverMixin]): Please see :py:class:flytekit.extend.TaskResolverMixin.
  • nodes (Optional[List]): Stores the nodes compiled so far.

def CompilationState(
    prefix: str,
    mode: int,
    task_resolver: Optional[TaskResolverMixin],
    nodes: List,
):
Parameter Type
prefix str
mode int
task_resolver Optional[TaskResolverMixin]
nodes List

Methods

Method Description
add_node() None
with_params() Create a new CompilationState where the mode and task resolver default to the current object unless explicitly provided

add_node()

def add_node(
    n: Node,
):
Parameter Type
n Node

with_params()

def with_params(
    prefix: str,
    mode: Optional[int],
    resolver: Optional[TaskResolverMixin],
    nodes: Optional[List],
):

Create a new CompilationState where the mode and task resolver are defaulted to the current object, but they and all other args are taken if explicitly provided as an argument.

Usage: s.with_params(“p”, nodes=[])

Parameter Type
prefix str
mode Optional[int]
resolver Optional[TaskResolverMixin]
nodes Optional[List]

flytekit.core.workflow.ConditionalSection

ConditionalSection is used to denote a condition within a Workflow. This default conditional section only works for Compilation mode. It is advised to derive the class and re-implement the start_branch and end_branch methods to override the compilation behavior

.. note::

Conditions can only be used within a workflow context.

Usage:

.. code-block:: python

    v = conditional("fractions").if_((my_input > 0.1) & (my_input < 1.0)).then(...)...
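
A more complete, hedged sketch of the same pattern inside a workflow; the tasks double and square are made up for illustration:

.. code-block:: python

    from flytekit import conditional, task, workflow

    @task
    def double(a: float) -> float:
        return a * 2

    @task
    def square(a: float) -> float:
        return a * a

    @workflow
    def multiplier(my_input: float) -> float:
        # Each branch produces a Promise; the selected branch's output
        # becomes the workflow output.
        return (
            conditional("fractions")
            .if_((my_input > 0.1) & (my_input < 1.0))
            .then(double(a=my_input))
            .else_()
            .then(square(a=my_input))
        )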

def ConditionalSection(
    name: str,
):
Parameter Type
name str

Methods

Method Description
compute_output_vars() Computes and returns the minimum set of outputs for this conditional block, based on all the cases that have been registered
end_branch() This should be invoked after every branch has been visited
if_() None
start_branch() At the start of an execution of every branch this method should be called

compute_output_vars()

def compute_output_vars()

Computes and returns the minimum set of outputs for this conditional block, based on all the cases that have been registered

end_branch()

def end_branch()

This should be invoked after every branch has been visited. If this is not a local workflow execution, we should check whether this is the last case; if so, return the promise, otherwise return the condition.

if_()

def if_(
    expr: Union[ComparisonExpression, ConjunctionExpression],
):
Parameter Type
expr Union[ComparisonExpression, ConjunctionExpression]

start_branch()

def start_branch(
    c: Case,
    last_case: bool,
):

At the start of an execution of every branch this method should be called.

Parameter Type
c Case
last_case bool

Properties

Property Type Description
cases
name

flytekit.core.workflow.Description

Full user description with formatting preserved. This can be rendered by clients, such as the console or command line tools with in-tact formatting.

def Description(
    value: typing.Optional[str],
    uri: typing.Optional[str],
    icon_link: typing.Optional[str],
    format: <enum 'DescriptionFormat'>,
):
Parameter Type
value typing.Optional[str]
uri typing.Optional[str]
icon_link typing.Optional[str]
format <enum 'DescriptionFormat'>

Methods

Method Description
from_flyte_idl() None
serialize_to_string() None
short_string()
to_flyte_idl() None
verbose_string()

from_flyte_idl()

def from_flyte_idl(
    pb2_object: flyteidl.admin.description_entity_pb2.Description,
):
Parameter Type
pb2_object flyteidl.admin.description_entity_pb2.Description

serialize_to_string()

def serialize_to_string()

short_string()

def short_string()

to_flyte_idl()

def to_flyte_idl()

verbose_string()

def verbose_string()

Properties

Property Type Description
is_empty

flytekit.core.workflow.Docstring

def Docstring(
    docstring: typing.Optional[str],
    callable_: typing.Optional[typing.Callable],
):
Parameter Type
docstring typing.Optional[str]
callable_ typing.Optional[typing.Callable]

Properties

Property Type Description
input_descriptions
long_description
output_descriptions
short_description

flytekit.core.workflow.Documentation

DescriptionEntity contains detailed description for the task/workflow/launch plan. Documentation could provide insight into the algorithms, business use case, etc.

def Documentation(
    short_description: typing.Optional[str],
    long_description: typing.Optional[flytekit.models.documentation.Description],
    source_code: typing.Optional[flytekit.models.documentation.SourceCode],
):
Parameter Type
short_description typing.Optional[str]
long_description typing.Optional[flytekit.models.documentation.Description]
source_code typing.Optional[flytekit.models.documentation.SourceCode]

Methods

Method Description
from_flyte_idl() None
serialize_to_string() None
short_string()
to_flyte_idl() None
verbose_string()

from_flyte_idl()

def from_flyte_idl(
    pb2_object: flyteidl.admin.description_entity_pb2.DescriptionEntity,
):
Parameter Type
pb2_object flyteidl.admin.description_entity_pb2.DescriptionEntity

serialize_to_string()

def serialize_to_string()

short_string()

def short_string()

to_flyte_idl()

def to_flyte_idl()

verbose_string()

def verbose_string()

Properties

Property Type Description
is_empty

flytekit.core.workflow.Enum

Create a collection of name/value pairs.

Example enumeration:

    >>> class Color(Enum):
    ...     RED = 1
    ...     BLUE = 2
    ...     GREEN = 3

Access them by:

  • attribute access:

    >>> Color.RED
    <Color.RED: 1>

  • value lookup:

    >>> Color(1)
    <Color.RED: 1>

  • name lookup:

    >>> Color['RED']
    <Color.RED: 1>

Enumerations can be iterated over, and know how many members they have:

    >>> len(Color)
    3

    >>> list(Color)
    [<Color.RED: 1>, <Color.BLUE: 2>, <Color.GREEN: 3>]

Methods can be added to enumerations, and members can have their own attributes – see the documentation for details.

flytekit.core.workflow.ExecutionState

This is the context that is active when executing a task or a local workflow. It carries the state necessary to execute, such as temporary directories and the ExecutionParameters that are passed to the user.

Attributes:

  • mode (ExecutionState.Mode): Defines the context in which the task is executed (local, hosted, etc.).
  • working_dir (os.PathLike): Specifies the remote, external directory where inputs, outputs and other protobufs are uploaded.
  • engine_dir (os.PathLike):
  • branch_eval_mode (Optional[BranchEvalMode]): Used to determine whether a branch node should execute.
  • user_space_params (Optional[ExecutionParameters]): Provides run-time, user-centric context such as a statsd handler, a logging handler, the current execution id and a working directory.

def ExecutionState(
    working_dir: Union[os.PathLike, str],
    mode: Optional[ExecutionState.Mode],
    engine_dir: Optional[Union[os.PathLike, str]],
    branch_eval_mode: Optional[BranchEvalMode],
    user_space_params: Optional[ExecutionParameters],
):
Parameter Type
working_dir Union[os.PathLike, str]
mode Optional[ExecutionState.Mode]
engine_dir Optional[Union[os.PathLike, str]]
branch_eval_mode Optional[BranchEvalMode]
user_space_params Optional[ExecutionParameters]

Methods

Method Description
branch_complete() Indicates that we are within a conditional / ifelse block and the active branch is not done
is_local_execution() None
take_branch() Indicates that we are within an if-else block and the current branch has evaluated to true
with_params() Produces a copy of the current execution state and overrides the copy’s parameters with passed parameter values

branch_complete()

def branch_complete()

Indicates that we are within a conditional / ifelse block and the active branch is not done. Defaults to SKIPPED.

is_local_execution()

def is_local_execution()

take_branch()

def take_branch()

Indicates that we are within an if-else block and the current branch has evaluated to true. Useful only in local execution mode

with_params()

def with_params(
    working_dir: Optional[os.PathLike],
    mode: Optional[Mode],
    engine_dir: Optional[os.PathLike],
    branch_eval_mode: Optional[BranchEvalMode],
    user_space_params: Optional[ExecutionParameters],
):

Produces a copy of the current execution state and overrides the copy’s parameters with passed parameter values.

Parameter Type
working_dir Optional[os.PathLike]
mode Optional[Mode]
engine_dir Optional[os.PathLike]
branch_eval_mode Optional[BranchEvalMode]
user_space_params Optional[ExecutionParameters]

flytekit.core.workflow.FlyteContext

This is an internal-facing context object, that most users will not have to deal with. It’s essentially a globally available grab bag of settings and objects that allows flytekit to do things like convert complex types, run and compile workflows, serialize Flyte entities, etc.

Even though this object has a current_context function on it, it should not be called directly. Please use the :py:class:flytekit.FlyteContextManager object instead.

Please do not confuse this object with the :py:class:flytekit.ExecutionParameters object.

def FlyteContext(
    file_access: FileAccessProvider,
    level: int,
    flyte_client: Optional['friendly_client.SynchronousFlyteClient'],
    compilation_state: Optional[CompilationState],
    execution_state: Optional[ExecutionState],
    serialization_settings: Optional[SerializationSettings],
    in_a_condition: bool,
    origin_stackframe: Optional[traceback.FrameSummary],
    output_metadata_tracker: Optional[OutputMetadataTracker],
    worker_queue: Optional[Controller],
):
Parameter Type
file_access FileAccessProvider
level int
flyte_client Optional['friendly_client.SynchronousFlyteClient']
compilation_state Optional[CompilationState]
execution_state Optional[ExecutionState]
serialization_settings Optional[SerializationSettings]
in_a_condition bool
origin_stackframe Optional[traceback.FrameSummary]
output_metadata_tracker Optional[OutputMetadataTracker]
worker_queue Optional[Controller]

Methods

Method Description
current_context() This method exists only to maintain backwards compatibility
enter_conditional_section() None
get_deck() Returns the deck that was created as part of the last execution
get_origin_stackframe_repr() None
new_builder() None
new_compilation_state() Creates and returns a default compilation state
new_execution_state() Creates and returns a new default execution state
set_stackframe() None
with_client() None
with_compilation_state() None
with_execution_state() None
with_file_access() None
with_new_compilation_state() None
with_output_metadata_tracker() None
with_serialization_settings() None
with_worker_queue() None

current_context()

def current_context()

This method exists only to maintain backwards compatibility. Please use FlyteContextManager.current_context() instead.

Users of flytekit should be wary not to confuse the object returned from this function with :py:func:flytekit.current_context

enter_conditional_section()

def enter_conditional_section()

get_deck()

def get_deck()

Returns the deck that was created as part of the last execution.

The return value depends on the execution environment. In a notebook, the return value is compatible with IPython.display and should be rendered in the notebook.

.. code-block:: python

    with flytekit.new_context() as ctx:
        my_task(...)
    ctx.get_deck()

OR if you wish to explicitly display

.. code-block:: python

    from IPython import display
    display(ctx.get_deck())

get_origin_stackframe_repr()

def get_origin_stackframe_repr()

new_builder()

def new_builder()

new_compilation_state()

def new_compilation_state(
    prefix: str,
):

Creates and returns a default compilation state. For most of the code, this should be the entrypoint of compilation; otherwise, the code should always use with_compilation_state.

Parameter Type
prefix str

new_execution_state()

def new_execution_state(
    working_dir: Optional[os.PathLike],
):

Creates and returns a new default execution state. This should be used at the entrypoint of execution; in all other cases it is preferable to use with_execution_state.

Parameter Type
working_dir Optional[os.PathLike]

set_stackframe()

def set_stackframe(
    s: traceback.FrameSummary,
):
Parameter Type
s traceback.FrameSummary

with_client()

def with_client(
    c: SynchronousFlyteClient,
):
Parameter Type
c SynchronousFlyteClient

with_compilation_state()

def with_compilation_state(
    c: CompilationState,
):
Parameter Type
c CompilationState

with_execution_state()

def with_execution_state(
    es: ExecutionState,
):
Parameter Type
es ExecutionState

with_file_access()

def with_file_access(
    fa: FileAccessProvider,
):
Parameter Type
fa FileAccessProvider

with_new_compilation_state()

def with_new_compilation_state()

with_output_metadata_tracker()

def with_output_metadata_tracker(
    t: OutputMetadataTracker,
):
Parameter Type
t OutputMetadataTracker

with_serialization_settings()

def with_serialization_settings(
    ss: SerializationSettings,
):
Parameter Type
ss SerializationSettings

with_worker_queue()

def with_worker_queue(
    wq: Controller,
):
Parameter Type
wq Controller

Properties

Property Type Description
user_space_params

flytekit.core.workflow.FlyteContextManager

FlyteContextManager manages the execution context within Flytekit. It holds global state for either compilation or execution. It is not thread-safe and currently can only be run as a single-threaded application. Contexts within Flytekit are useful for managing compilation state and execution state; refer to CompilationState and ExecutionState for more information. FlyteContextManager provides a singleton stack to manage these contexts.

Typical usage is

.. code-block:: python

    FlyteContextManager.initialize()
    with FlyteContextManager.with_context(o) as ctx:
        pass

If required (though not recommended), you can use FlyteContextManager.push_context(), but a corresponding FlyteContextManager.pop_context() must then be called.
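
For instance, a hedged sketch of entering a child context that carries a fresh compilation state (the empty prefix is illustrative):

.. code-block:: python

    from flytekit.core.context_manager import FlyteContextManager

    ctx = FlyteContextManager.current_context()
    builder = ctx.with_compilation_state(ctx.new_compilation_state(prefix=""))
    with FlyteContextManager.with_context(builder) as child_ctx:
        # The child context is current here and carries a compilation state.
        assert child_ctx.compilation_state is not None
    # On exit, the previous context becomes current again.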

Methods

Method Description
add_signal_handler() None
current_context() None
get_origin_stackframe() None
initialize() Re-initializes the context and erases the entire context
pop_context() None
push_context() None
size() None
with_context() None

add_signal_handler()

def add_signal_handler(
    handler: typing.Callable[[int, FrameType], typing.Any],
):
Parameter Type
handler typing.Callable[[int, FrameType], typing.Any]

current_context()

def current_context()

get_origin_stackframe()

def get_origin_stackframe(
    limit,
):
Parameter Type
limit

initialize()

def initialize()

Re-initializes the context and erases the entire context

pop_context()

def pop_context()

push_context()

def push_context(
    ctx: FlyteContext,
    f: Optional[traceback.FrameSummary],
):
Parameter Type
ctx FlyteContext
f Optional[traceback.FrameSummary]

size()

def size()

with_context()

def with_context(
    b: FlyteContext.Builder,
):
Parameter Type
b FlyteContext.Builder

flytekit.core.workflow.FlyteEntities

This is a global Object that tracks various tasks and workflows that are declared within a VM during the registration process

flytekit.core.workflow.FlyteError

Special error type that is used in the failure node. Propeller will pass this error to the failure task, so users have to add an input of this type to the failure task.
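
A hedged sketch of wiring a failure task that accepts a FlyteError input; the task and workflow names here are illustrative only:

.. code-block:: python

    import typing

    from flytekit import task, workflow
    from flytekit.types.error import FlyteError

    @task
    def clean_up(a: int, err: typing.Optional[FlyteError] = None):
        # err carries the failed node id and error message filled in by Propeller.
        if err:
            print(f"Node {err.failed_node_id} failed: {err.message}")

    @task
    def might_fail(a: int) -> int:
        if a < 0:
            raise ValueError("negative input")
        return a

    @workflow(on_failure=clean_up)
    def wf(a: int) -> int:
        return might_fail(a=a)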

def FlyteError(
    message: str,
    failed_node_id: str,
):
Parameter Type
message str
failed_node_id str

Methods

Method Description
from_dict() None
from_json() None
to_dict() None
to_json() None

from_dict()

def from_dict(
    d,
    dialect,
):
Parameter Type
d
dialect

from_json()

def from_json(
    data: typing.Union[str, bytes, bytearray],
    decoder: collections.abc.Callable[[typing.Union[str, bytes, bytearray]], dict[typing.Any, typing.Any]],
    from_dict_kwargs: typing.Any,
):
Parameter Type
data typing.Union[str, bytes, bytearray]
decoder collections.abc.Callable[[typing.Union[str, bytes, bytearray]], dict[typing.Any, typing.Any]]
from_dict_kwargs typing.Any

to_dict()

def to_dict()

to_json()

def to_json(
    encoder: collections.abc.Callable[[typing.Any], typing.Union[str, bytes, bytearray]],
    to_dict_kwargs: typing.Any,
):
Parameter Type
encoder collections.abc.Callable[[typing.Any], typing.Union[str, bytes, bytearray]]
to_dict_kwargs typing.Any

flytekit.core.workflow.FlyteFailureNodeInputMismatchException

Raised when the inputs of the failure node do not match the workflow’s inputs.

def FlyteFailureNodeInputMismatchException(
    failure_node_node: typing.Union[ForwardRef('WorkflowBase'), ForwardRef('Task')],
    workflow: WorkflowBase,
):
Parameter Type
failure_node_node typing.Union[ForwardRef('WorkflowBase'), ForwardRef('Task')]
workflow WorkflowBase

Properties

Property Type Description
timestamp

flytekit.core.workflow.FlyteValidationException

Raised when validation of a Flyte entity fails.

def FlyteValidationException(
    args,
    timestamp: typing.Optional[float],
):
Parameter Type
args *args
timestamp typing.Optional[float]

Properties

Property Type Description
timestamp

flytekit.core.workflow.FlyteValueException

Inappropriate argument value (of correct type).

def FlyteValueException(
    received_value,
    error_message,
):
Parameter Type
received_value
error_message

Properties

Property Type Description
timestamp

flytekit.core.workflow.ImperativeWorkflow

An imperative workflow is a programmatic analogue to the typical @workflow function-based workflow and is better suited to programmatic applications.

Assuming you have some tasks like so

.. literalinclude:: ../../../tests/flytekit/unit/core/test_imperative.py :start-after: # docs_tasks_start :end-before: # docs_tasks_end :language: python :dedent: 4

You could create a workflow imperatively like so

.. literalinclude:: ../../../tests/flytekit/unit/core/test_imperative.py :start-after: # docs_start :end-before: # docs_end :language: python :dedent: 4

This workflow would be identical on the back-end to

.. literalinclude:: ../../../tests/flytekit/unit/core/test_imperative.py :start-after: # docs_equivalent_start :end-before: # docs_equivalent_end :language: python :dedent: 4

Note that the only reason we need the NamedTuple is so we can name the output the same thing as in the imperative example. The imperative paradigm makes the naming of workflow outputs easier, but this isn’t a big deal in function-workflows because names tend to not be necessary.
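
Since the literalinclude snippets above are not rendered here, a hedged, self-contained sketch of the same idea (the task t1 and all names are made up):

.. code-block:: python

    from flytekit import Workflow, task

    @task
    def t1(a: int) -> int:
        return a + 1

    wb = Workflow(name="my.imperative.workflow.example")
    wb.add_workflow_input("in1", int)
    node = wb.add_entity(t1, a=wb.inputs["in1"])
    wb.add_workflow_output("out1", node.outputs["o0"])

    # Like any other workflow, it can be executed locally.
    assert wb(in1=1) == 2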

def ImperativeWorkflow(
    name: str,
    failure_policy: Optional[WorkflowFailurePolicy],
    interruptible: bool,
):
Parameter Type
name str
failure_policy Optional[WorkflowFailurePolicy]
interruptible bool

Methods

Method Description
add_entity() Anytime you add an entity, all the inputs to the entity must be bound
add_launch_plan() None
add_on_failure_handler() A special function that mimics add_entity, but is only used to add the failure node
add_subwf() None
add_task() None
add_workflow_input() Adds an input to the workflow
add_workflow_output() Add an output with the given name from the given node output
compile() None
construct_node_metadata() None
create_conditional() None
execute() Called by local_execute
local_execute() None
local_execution_mode() None
ready() Returns whether the workflow is in a ready state: it has at least one node and all workflow inputs are bound

add_entity()

def add_entity(
    entity: Union[PythonTask, _annotated_launch_plan.LaunchPlan, WorkflowBase],
    kwargs,
):

Anytime you add an entity, all the inputs to the entity must be bound.

Parameter Type
entity Union[PythonTask, _annotated_launch_plan.LaunchPlan, WorkflowBase]
kwargs **kwargs

add_launch_plan()

def add_launch_plan(
    launch_plan: _annotated_launch_plan.LaunchPlan,
    kwargs,
):
Parameter Type
launch_plan _annotated_launch_plan.LaunchPlan
kwargs **kwargs

add_on_failure_handler()

def add_on_failure_handler(
    entity,
):

This is a special function that mimics the add_entity function, but this is only used to add the failure node. Failure nodes are special because we don’t want them to be part of the main workflow.

Parameter Type
entity

add_subwf()

def add_subwf(
    sub_wf: WorkflowBase,
    kwargs,
):
Parameter Type
sub_wf WorkflowBase
kwargs **kwargs

add_task()

def add_task(
    task: PythonTask,
    kwargs,
):
Parameter Type
task PythonTask
kwargs **kwargs

add_workflow_input()

def add_workflow_input(
    input_name: str,
    python_type: Type,
):

Adds an input to the workflow.

Parameter Type
input_name str
python_type Type

add_workflow_output()

def add_workflow_output(
    output_name: str,
    p: Union[Promise, List[Promise], Dict[str, Promise]],
    python_type: Optional[Type],
):

Add an output with the given name from the given node output.

Parameter Type
output_name str
p Union[Promise, List[Promise], Dict[str, Promise]]
python_type Optional[Type]

compile()

def compile(
    kwargs,
):
Parameter Type
kwargs **kwargs

construct_node_metadata()

def construct_node_metadata()

create_conditional()

def create_conditional(
    name: str,
):
Parameter Type
name str

execute()

def execute(
    kwargs,
):

Called by local_execute. This function is how local execution for imperative workflows runs. Because when an entity is added using the add_entity function, all inputs to that entity should’ve been already declared, we can just iterate through the nodes in order and we shouldn’t run into any dependency issues. That is, we force the user to declare entities already in a topological sort. To keep track of outputs, we create a map to start things off, filled in only with the workflow inputs (if any). As things are run, their outputs are stored in this map. After all nodes are run, we fill in workflow level outputs the same way as any other previous node.

Parameter Type
kwargs **kwargs

local_execute()

def local_execute(
    ctx: FlyteContext,
    kwargs,
):
Parameter Type
ctx FlyteContext
kwargs **kwargs

local_execution_mode()

def local_execution_mode()

ready()

def ready()

This function returns whether or not the workflow is in a ready state, which means

  • Has at least one node
  • All workflow inputs are bound

These conditions assume that all nodes and workflow i/o changes were done with the functions above, which do additional checking.

Properties

Property Type Description
compilation_state
default_options
docs
failure_node
inputs
interface
name
nodes
on_failure
output_bindings
python_interface
short_name
workflow_metadata
workflow_metadata_defaults

flytekit.core.workflow.Interface

A Python native interface object, like inspect.signature but simpler.
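
A hedged sketch of building an Interface by hand; most users obtain one indirectly from a decorated function, and the variable names here are illustrative:

.. code-block:: python

    from flytekit.core.interface import Interface

    iface = Interface(inputs={"a": int, "b": str}, outputs={"o0": int})

    # with_inputs/with_outputs and remove_inputs return new Interface objects.
    extended = iface.with_inputs({"c": float})
    trimmed = extended.remove_inputs(["b"])
    assert list(trimmed.inputs.keys()) == ["a", "c"]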

def Interface(
    inputs: Union[Optional[Dict[str, Type]], Optional[Dict[str, Tuple[Type, Any]]]],
    outputs: Union[Optional[Dict[str, Type]], Optional[Dict[str, Optional[Type]]]],
    output_tuple_name: Optional[str],
    docstring: Optional[Docstring],
):
Parameter Type
inputs Union[Optional[Dict[str, Type]], Optional[Dict[str, Tuple[Type, Any]]]]
outputs Union[Optional[Dict[str, Type]], Optional[Dict[str, Optional[Type]]]]
output_tuple_name Optional[str]
docstring Optional[Docstring]

Methods

Method Description
remove_inputs() Removes some variables from the Flyte backend inputs specification, for implicit, local-only inputs or inputs supplied by the library at runtime
with_inputs() Use this to add additional inputs to the interface
with_outputs() Allows the addition of extra outputs that are expected from a task specification

remove_inputs()

def remove_inputs(
    vars: Optional[List[str]],
):

This method is useful for removing some variables from the Flyte backend inputs specification, as these are implicit, local-only inputs or will be supplied by the library at runtime (for example, a Spark session). It creates a new instance of the interface with the requested variables removed.

Parameter Type
vars Optional[List[str]]

with_inputs()

def with_inputs(
    extra_inputs: Dict[str, Type],
):

Use this to add additional inputs to the interface. This is useful for adding additional implicit inputs that are added without the user requesting them.

Parameter Type
extra_inputs Dict[str, Type]

with_outputs()

def with_outputs(
    extra_outputs: Dict[str, Type],
):

This method allows the addition of extra outputs that are expected from a task specification.

Parameter Type
extra_outputs Dict[str, Type]

Properties

Property Type Description
default_inputs_as_kwargs
docstring
inputs
inputs_with_defaults
output_names
output_tuple
output_tuple_name
outputs

flytekit.core.workflow.Node

This class will hold all the things necessary to make an SdkNode, but we won’t make one until we know things like the ID, which comes from the registration step.

def Node(
    id: str,
    metadata: _workflow_model.NodeMetadata,
    bindings: List[_literal_models.Binding],
    upstream_nodes: List[Node],
    flyte_entity: Any,
):
Parameter Type
id str
metadata _workflow_model.NodeMetadata
bindings List[_literal_models.Binding]
upstream_nodes List[Node]
flyte_entity Any

Methods

Method Description
runs_before() This is typically something we shouldn’t do
with_overrides() None

runs_before()

def runs_before(
    other: Node,
):

This is typically something we shouldn’t do. This modifies an attribute of the other instance rather than self. But it’s done so only because we wanted this English function to be the same as the shift function. That is, calling node_1.runs_before(node_2) and node_1 >> node_2 are the same. The shift operator going the other direction is not implemented, to further avoid confusion. Right shift was picked rather than left shift because that’s what most users are familiar with.

Parameter Type
other Node
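
A hedged sketch of how this reads inside a workflow, using hypothetical tasks t1 and t2 and flytekit.core.node_creation.create_node to obtain the nodes:

.. code-block:: python

    from flytekit import task, workflow
    from flytekit.core.node_creation import create_node

    @task
    def t1():
        print("first")

    @task
    def t2():
        print("second")

    @workflow
    def ordering_wf():
        n1 = create_node(t1)
        n2 = create_node(t2)
        n1 >> n2  # equivalent to n1.runs_before(n2)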

with_overrides()

def with_overrides(
    node_name: Optional[str],
    aliases: Optional[Dict[str, str]],
    requests: Optional[Resources],
    limits: Optional[Resources],
    timeout: Optional[Union[int, datetime.timedelta, object]],
    retries: Optional[int],
    interruptible: Optional[bool],
    name: Optional[str],
    task_config: Optional[Any],
    container_image: Optional[str],
    accelerator: Optional[BaseAccelerator],
    cache: Optional[bool],
    cache_version: Optional[str],
    cache_serialize: Optional[bool],
    shared_memory: Optional[Union[L[True], str]],
    pod_template: Optional[PodTemplate],
    resources: Optional[Resources],
    args,
    kwargs,
):
Parameter Type
node_name Optional[str]
aliases Optional[Dict[str, str]]
requests Optional[Resources]
limits Optional[Resources]
timeout Optional[Union[int, datetime.timedelta, object]]
retries Optional[int]
interruptible Optional[bool]
name Optional[str]
task_config Optional[Any]
container_image Optional[str]
accelerator Optional[BaseAccelerator]
cache Optional[bool]
cache_version Optional[str]
cache_serialize Optional[bool]
shared_memory Optional[Union[L[True], str]]
pod_template Optional[PodTemplate]
resources Optional[Resources]
args *args
kwargs **kwargs

Properties

Property Type Description
bindings
flyte_entity
id
metadata
name
outputs
run_entity
upstream_nodes

flytekit.core.workflow.NodeOutput

def NodeOutput(
    node: Node,
    var: str,
    attr_path: Optional[List[Union[str, int]]],
):
Parameter Type
node Node
var str
attr_path Optional[List[Union[str, int]]]

Methods

Method Description
deepcopy() None
from_flyte_idl()
serialize_to_string() None
short_string()
to_flyte_idl()
verbose_string()
with_attr() None

deepcopy()

def deepcopy()

from_flyte_idl()

def from_flyte_idl(
    pb2_object,
):
Parameter Type
pb2_object

serialize_to_string()

def serialize_to_string()

short_string()

def short_string()

to_flyte_idl()

def to_flyte_idl()

verbose_string()

def verbose_string()

with_attr()

def with_attr(
    key,
):
Parameter Type
key

Properties

Property Type Description
attr_path
is_empty
node
node_id
var

flytekit.core.workflow.Options

These are options that can be configured for a launchplan during registration or overridden during an execution. For instance two people may want to run the same workflow but have the offloaded data stored in two different buckets. Or you may want labels or annotations to be different. This object is used when launching an execution in a Flyte backend, and also when registering launch plans.
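
A hedged sketch of constructing Options for registration or for a single execution; the bucket, label, and annotation values are made up:

.. code-block:: python

    from flytekit.core.options import Options
    from flytekit.models.common import Annotations, Labels, RawOutputDataConfig

    options = Options(
        labels=Labels({"team": "research"}),
        annotations=Annotations({"owner": "someone@example.com"}),
        raw_output_data_config=RawOutputDataConfig(
            output_location_prefix="s3://my-bucket/offloaded-data"
        ),
        max_parallelism=10,
    )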

def Options(
    labels: typing.Optional[flytekit.models.common.Labels],
    annotations: typing.Optional[flytekit.models.common.Annotations],
    raw_output_data_config: typing.Optional[flytekit.models.common.RawOutputDataConfig],
    security_context: typing.Optional[flytekit.models.security.SecurityContext],
    max_parallelism: typing.Optional[int],
    notifications: typing.Optional[typing.List[flytekit.models.common.Notification]],
    disable_notifications: typing.Optional[bool],
    overwrite_cache: typing.Optional[bool],
):
Parameter Type
labels typing.Optional[flytekit.models.common.Labels]
annotations typing.Optional[flytekit.models.common.Annotations]
raw_output_data_config typing.Optional[flytekit.models.common.RawOutputDataConfig]
security_context typing.Optional[flytekit.models.security.SecurityContext]
max_parallelism typing.Optional[int]
notifications typing.Optional[typing.List[flytekit.models.common.Notification]]
disable_notifications typing.Optional[bool]
overwrite_cache typing.Optional[bool]

Methods

Method Description
default_from() None

default_from()

def default_from(
    k8s_service_account: typing.Optional[str],
    raw_data_prefix: typing.Optional[str],
):
Parameter Type
k8s_service_account typing.Optional[str]
raw_data_prefix typing.Optional[str]

flytekit.core.workflow.ParamSpec

Parameter specification.

flytekit.core.workflow.Promise

This object is a wrapper and exists for three main reasons. Let’s assume we’re dealing with a task like ::

    @task
    def t1() -> (int, str): ...

#. Handling the duality between compilation and local execution - when the task function is run in a local execution mode inside a workflow function, a Python integer and string are produced. When the task is being compiled as part of the workflow, the task call creates a Node instead, and the task returns two Promise objects that point to that Node.
#. One needs to be able to call ::

       x = t1().with_overrides(...)

   If the task returns an integer or a (int, str) tuple like t1 above, calling with_overrides on the result would throw an error. This Promise object adds that.
#. Assorted handling for conditionals.
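
A hedged sketch of reason 2 above: overriding node-level settings on the Promise returned by a task call (the task t1 and the resource values are illustrative):

.. code-block:: python

    from flytekit import Resources, task, workflow

    @task
    def t1(a: int) -> int:
        return a + 1

    @workflow
    def wf(a: int) -> int:
        # During compilation, t1(a=a) returns a Promise, so with_overrides
        # applies to the underlying node.
        return t1(a=a).with_overrides(requests=Resources(cpu="1", mem="500Mi"), retries=2)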

def Promise(
    var: str,
    val: Union[NodeOutput, _literals_models.Literal],
    type: typing.Optional[_type_models.LiteralType],
):
Parameter Type
var str
val Union[NodeOutput, _literals_models.Literal]
type typing.Optional[_type_models.LiteralType]

Methods

Method Description
deepcopy() None
eval() None
is_() None
is_false() None
is_none() None
is_true() None
with_overrides() None
with_var() None

deepcopy()

def deepcopy()

eval()

def eval()

is_()

def is_(
    v: bool,
):
Parameter Type
v bool

is_false()

def is_false()

is_none()

def is_none()

is_true()

def is_true()

with_overrides()

def with_overrides(
    node_name: Optional[str],
    aliases: Optional[Dict[str, str]],
    requests: Optional[Resources],
    limits: Optional[Resources],
    timeout: Optional[Union[int, datetime.timedelta, object]],
    retries: Optional[int],
    interruptible: Optional[bool],
    name: Optional[str],
    task_config: Optional[Any],
    container_image: Optional[str],
    accelerator: Optional[BaseAccelerator],
    cache: Optional[bool],
    cache_version: Optional[str],
    cache_serialize: Optional[bool],
    args,
    kwargs,
):
Parameter Type
node_name Optional[str]
aliases Optional[Dict[str, str]]
requests Optional[Resources]
limits Optional[Resources]
timeout Optional[Union[int, datetime.timedelta, object]]
retries Optional[int]
interruptible Optional[bool]
name Optional[str]
task_config Optional[Any]
container_image Optional[str]
accelerator Optional[BaseAccelerator]
cache Optional[bool]
cache_version Optional[str]
cache_serialize Optional[bool]
args *args
kwargs **kwargs

with_var()

def with_var(
    new_var: str,
):
Parameter Type
new_var str

Properties

Property Type Description
attr_path
is_ready
ref
val
var

flytekit.core.workflow.PythonAutoContainerTask

A Python AutoContainer task should be used as the base for all extensions that want the user’s code to be in the container and the container information to be automatically captured. This base will auto configure the image and image version to be used for all its derivatives.

If you are looking to extend, you might prefer to use PythonFunctionTask or PythonInstanceTask

def PythonAutoContainerTask(
    name: str,
    task_config: T,
    task_type,
    container_image: Optional[Union[str, ImageSpec]],
    requests: Optional[Resources],
    limits: Optional[Resources],
    environment: Optional[Dict[str, str]],
    task_resolver: Optional[TaskResolverMixin],
    secret_requests: Optional[List[Secret]],
    pod_template: Optional[PodTemplate],
    pod_template_name: Optional[str],
    accelerator: Optional[BaseAccelerator],
    shared_memory: Optional[Union[L[True], str]],
    resources: Optional[Resources],
    kwargs,
):
Parameter Type
name str
task_config T
task_type
container_image Optional[Union[str, ImageSpec]]
requests Optional[Resources]
limits Optional[Resources]
environment Optional[Dict[str, str]]
task_resolver Optional[TaskResolverMixin]
secret_requests Optional[List[Secret]]
pod_template Optional[PodTemplate]
pod_template_name Optional[str]
accelerator Optional[BaseAccelerator]
shared_memory Optional[Union[L[True], str]]
resources Optional[Resources]
kwargs **kwargs

Methods

Method Description
compile() Generates a node that encapsulates this task in a workflow definition
construct_node_metadata() Used when constructing the node that encapsulates this task as part of a broader workflow definition
dispatch_execute() This method translates Flyte’s Type system based input values and invokes the actual call to the executor
execute() This method will be invoked to execute the task
find_lhs() None
get_command() Returns the command which should be used in the container definition for the serialized version of this task
get_config() Returns the task config as a serializable dictionary
get_container() Returns the container definition (if any) that is used to run the task on hosted Flyte
get_custom() Return additional plugin-specific custom data (if any) as a serializable dictionary
get_default_command() Returns the default pyflyte-execute command used to run this on hosted Flyte platforms
get_extended_resources() Returns the extended resources to allocate to the task on hosted Flyte
get_image() Update image spec based on fast registration usage, and return string representing the image
get_input_types() Returns the names and python types as a dictionary for the inputs of this task
get_k8s_pod() Returns the kubernetes pod definition (if any) that is used to run the task on hosted Flyte
get_sql() Returns the Sql definition (if any) that is used to run the task on hosted Flyte
get_type_for_input_var() Returns the python type for an input variable by name
get_type_for_output_var() Returns the python type for the specified output variable by name
local_execute() This function is used only in the local execution path and is responsible for calling dispatch execute
local_execution_mode() None
post_execute() Called after the execution has completed, with the user_params; can be used to clean up or alter the outputs
pre_execute() Invoked directly before executing the task method and before all the inputs are converted
reset_command_fn() Resets the command which should be used in the container definition of this task to the default arguments
sandbox_execute() Call dispatch_execute, in the context of a local sandbox execution
set_command_fn() By default, the task will run on the Flyte platform using the pyflyte-execute command
set_resolver() By default, flytekit uses the DefaultTaskResolver to resolve the task

compile()

def compile(
    ctx: flytekit.core.context_manager.FlyteContext,
    args,
    kwargs,
):

Generates a node that encapsulates this task in a workflow definition.

Parameter Type
ctx flytekit.core.context_manager.FlyteContext
args *args
kwargs **kwargs

construct_node_metadata()

def construct_node_metadata()

Used when constructing the node that encapsulates this task as part of a broader workflow definition.

dispatch_execute()

def dispatch_execute(
    ctx: flytekit.core.context_manager.FlyteContext,
    input_literal_map: flytekit.models.literals.LiteralMap,
):

This method translates Flyte’s Type system based input values and invokes the actual call to the executor. This method is also invoked during runtime.

  • VoidPromise is returned when the task itself declares no outputs.
  • A LiteralMap is returned when the task declares one or more outputs. Individual outputs may be None.
  • DynamicJobSpec is returned when a dynamic workflow is executed.
Parameter Type
ctx flytekit.core.context_manager.FlyteContext
input_literal_map flytekit.models.literals.LiteralMap

execute()

def execute(
    kwargs,
):

This method will be invoked to execute the task.

Parameter Type
kwargs **kwargs

find_lhs()

def find_lhs()

get_command()

def get_command(
    settings: SerializationSettings,
):

Returns the command which should be used in the container definition for the serialized version of this task registered on a hosted Flyte platform.

Parameter Type
settings SerializationSettings

get_config()

def get_config(
    settings: SerializationSettings,
):

Returns the task config as a serializable dictionary. This task config consists of metadata about the custom configuration defined for this task.

Parameter Type
settings SerializationSettings

get_container()

def get_container(
    settings: SerializationSettings,
):

Returns the container definition (if any) that is used to run the task on hosted Flyte.

Parameter Type
settings SerializationSettings

get_custom()

def get_custom(
    settings: flytekit.configuration.SerializationSettings,
):

Return additional plugin-specific custom data (if any) as a serializable dictionary.

Parameter Type
settings flytekit.configuration.SerializationSettings

get_default_command()

def get_default_command(
    settings: SerializationSettings,
):

Returns the default pyflyte-execute command used to run this on hosted Flyte platforms.

Parameter Type
settings SerializationSettings

get_extended_resources()

def get_extended_resources(
    settings: SerializationSettings,
):

Returns the extended resources to allocate to the task on hosted Flyte.

Parameter Type
settings SerializationSettings

get_image()

def get_image(
    settings: SerializationSettings,
):

Update image spec based on fast registration usage, and return string representing the image

Parameter Type
settings SerializationSettings

get_input_types()

def get_input_types()

Returns the names and python types as a dictionary for the inputs of this task.

get_k8s_pod()

def get_k8s_pod(
    settings: SerializationSettings,
):

Returns the kubernetes pod definition (if any) that is used to run the task on hosted Flyte.

Parameter Type
settings SerializationSettings

get_sql()

def get_sql(
    settings: flytekit.configuration.SerializationSettings,
):

Returns the Sql definition (if any) that is used to run the task on hosted Flyte.

Parameter Type
settings flytekit.configuration.SerializationSettings

get_type_for_input_var()

def get_type_for_input_var(
    k: str,
    v: typing.Any,
):

Returns the python type for an input variable by name.

Parameter Type
k str
v typing.Any

get_type_for_output_var()

def get_type_for_output_var(
    k: str,
    v: typing.Any,
):

Returns the python type for the specified output variable by name.

Parameter Type
k str
v typing.Any

local_execute()

def local_execute(
    ctx: flytekit.core.context_manager.FlyteContext,
    kwargs,
):

This function is used only in the local execution path and is responsible for calling dispatch execute. Use this function when calling a task with native values (or Promises containing Flyte literals derived from Python native values).

Parameter Type
ctx flytekit.core.context_manager.FlyteContext
kwargs **kwargs

local_execution_mode()

def local_execution_mode()

post_execute()

def post_execute(
    user_params: typing.Optional[flytekit.core.context_manager.ExecutionParameters],
    rval: typing.Any,
):

Post execute is called after the execution has completed, with the user_params, and can be used to clean up or alter the outputs to match the intended task outputs. If not overridden, this function is a no-op.

Parameter Type
user_params typing.Optional[flytekit.core.context_manager.ExecutionParameters]
rval typing.Any

pre_execute()

def pre_execute(
    user_params: typing.Optional[flytekit.core.context_manager.ExecutionParameters],
):

This is the method that will be invoked directly before executing the task method and before all the inputs are converted. One particular case where this is useful is if the context is to be modified for the user process to get some user-space parameters. This also ensures that things like SparkSession are already correctly set up before the type transformers are called.

This should return either the same context or the mutated context.

Parameter Type
user_params typing.Optional[flytekit.core.context_manager.ExecutionParameters]

reset_command_fn()

def reset_command_fn()

Resets the command which should be used in the container definition of this task to the default arguments. This is useful when the command line is overridden at serialization time.

sandbox_execute()

def sandbox_execute(
    ctx: flytekit.core.context_manager.FlyteContext,
    input_literal_map: flytekit.models.literals.LiteralMap,
):

Call dispatch_execute, in the context of a local sandbox execution. Not invoked during runtime.

Parameter Type
ctx flytekit.core.context_manager.FlyteContext
input_literal_map flytekit.models.literals.LiteralMap

set_command_fn()

def set_command_fn(
    get_command_fn: Optional[Callable[[SerializationSettings], List[str]]],
):

By default, the task will run on the Flyte platform using the pyflyte-execute command. However, it can be useful to update the command with which the task is serialized for specific cases like running map tasks (“pyflyte-map-execute”) or for fast-executed tasks.

Parameter Type
get_command_fn Optional[Callable[[SerializationSettings], List[str]]]

set_resolver()

def set_resolver(
    resolver: TaskResolverMixin,
):

By default, flytekit uses the DefaultTaskResolver to resolve the task. This method allows the user to set a custom task resolver. It can be useful to override the task resolver for specific cases like running tasks in the jupyter notebook.

Parameter Type
resolver TaskResolverMixin

Properties

Property Type Description
container_image
deck_fields
disable_deck
docs
enable_deck
environment
instantiated_in
interface
lhs
location
metadata
name
python_interface
resources
security_context
task_config
task_resolver
task_type
task_type_version

flytekit.core.workflow.PythonFunctionWorkflow

Please read :std:ref:flyte:divedeep-workflows first for a high-level understanding of what workflows are in Flyte. This Python object represents a workflow defined by a function and decorated with the :py:func:@workflow <flytekit.workflow> decorator. Please see notes on that object for additional information.
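
A minimal sketch: the object created by the decorator is a PythonFunctionWorkflow, and calling it runs a local execution (the task t1 is made up):

.. code-block:: python

    from flytekit import task, workflow

    @task
    def t1(a: int) -> int:
        return a + 1

    @workflow
    def my_wf(a: int) -> int:
        return t1(a=a)

    # my_wf is a PythonFunctionWorkflow instance.
    print(my_wf(a=1))  # 2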

def PythonFunctionWorkflow(
    workflow_function: Callable,
    metadata: WorkflowMetadata,
    default_metadata: WorkflowMetadataDefaults,
    docstring: Optional[Docstring],
    on_failure: Optional[Union[WorkflowBase, Task]],
    docs: Optional[Documentation],
    pickle_untyped: bool,
    default_options: Optional[Options],
):
Parameter Type
workflow_function Callable
metadata WorkflowMetadata
default_metadata WorkflowMetadataDefaults
docstring Optional[Docstring]
on_failure Optional[Union[WorkflowBase, Task]]
docs Optional[Documentation]
pickle_untyped bool
default_options Optional[Options]

Methods

Method Description
add() None
compile() Supply static Python native values in the kwargs if you want them to be used in the compilation
construct_node_metadata() None
execute() This function is here only to try to streamline the pattern between workflows and tasks
find_lhs() None
get_all_tasks() Future proof method
load_task() Given the set of identifier keys, should return one Python Task or raise an error if not found
loader_args() This is responsible for turning an instance of a task into args that the load_task function can reconstitute
local_execute() None
local_execution_mode() None
task_name() Overridable function that can optionally return a custom name for a given task

add()

def add(
    t: flytekit.core.python_auto_container.PythonAutoContainerTask,
):
Parameter Type
t flytekit.core.python_auto_container.PythonAutoContainerTask

compile()

def compile(
    kwargs,
):

Supply static Python native values in the kwargs if you want them to be used in the compilation. This mimics a ‘closure’ in the traditional sense of the word.

Parameter Type
kwargs **kwargs

construct_node_metadata()

def construct_node_metadata()

execute()

def execute(
    kwargs,
):

This function is here only to try to streamline the pattern between workflows and tasks. Since tasks call execute from dispatch_execute which is in local_execute, workflows should also call an execute inside local_execute. This makes mocking cleaner.

Parameter Type
kwargs **kwargs

find_lhs()

def find_lhs()

get_all_tasks()

def get_all_tasks()

Future-proofing method. Makes it easy to access all tasks (not required today, as tasks are auto-registered).

load_task()

def load_task(
    loader_args: typing.List[str],
):

Given the set of identifier keys, should return one Python Task or raise an error if not found

Parameter Type
loader_args typing.List[str]

loader_args()

def loader_args(
    settings: flytekit.configuration.SerializationSettings,
    t: flytekit.core.python_auto_container.PythonAutoContainerTask,
):

This is responsible for turning an instance of a task into args that the load_task function can reconstitute.

Parameter Type
settings flytekit.configuration.SerializationSettings
t flytekit.core.python_auto_container.PythonAutoContainerTask

local_execute()

def local_execute(
    ctx: FlyteContext,
    kwargs,
):
Parameter Type
ctx FlyteContext
kwargs **kwargs

local_execution_mode()

def local_execution_mode()

task_name()

def task_name(
    t: PythonAutoContainerTask,
):

Overridable function that can optionally return a custom name for a given task

Parameter Type
t PythonAutoContainerTask

Properties

Property Type Description
default_options
docs
failure_node
function
instantiated_in
interface
lhs
location
name
nodes
on_failure
output_bindings
python_interface
short_name
workflow_metadata
workflow_metadata_defaults

flytekit.core.workflow.PythonTask

Base Class for all Tasks with a Python native Interface. This should be used directly for task types that do not have a Python function to be executed. Otherwise, refer to :py:class:flytekit.PythonFunctionTask.

def PythonTask(
    task_type: str,
    name: str,
    task_config: typing.Optional[~T],
    interface: typing.Optional[flytekit.core.interface.Interface],
    environment: typing.Optional[typing.Dict[str, str]],
    disable_deck: typing.Optional[bool],
    enable_deck: typing.Optional[bool],
    deck_fields: typing.Optional[typing.Tuple[flytekit.deck.deck.DeckField, ...]],
    kwargs,
):
Parameter Type
task_type str
name str
task_config typing.Optional[~T]
interface typing.Optional[flytekit.core.interface.Interface]
environment typing.Optional[typing.Dict[str, str]]
disable_deck typing.Optional[bool]
enable_deck typing.Optional[bool]
deck_fields typing.Optional[typing.Tuple[flytekit.deck.deck.DeckField, ...]]
kwargs **kwargs

Methods

Method Description
compile() Generates a node that encapsulates this task in a workflow definition
construct_node_metadata() Used when constructing the node that encapsulates this task as part of a broader workflow definition
dispatch_execute() This method translates Flyte’s Type system based input values and invokes the actual call to the executor
execute() This method will be invoked to execute the task
find_lhs() None
get_config() Returns the task config as a serializable dictionary
get_container() Returns the container definition (if any) that is used to run the task on hosted Flyte
get_custom() Return additional plugin-specific custom data (if any) as a serializable dictionary
get_extended_resources() Returns the extended resources to allocate to the task on hosted Flyte
get_input_types() Returns the names and python types as a dictionary for the inputs of this task
get_k8s_pod() Returns the kubernetes pod definition (if any) that is used to run the task on hosted Flyte
get_sql() Returns the Sql definition (if any) that is used to run the task on hosted Flyte
get_type_for_input_var() Returns the python type for an input variable by name
get_type_for_output_var() Returns the python type for the specified output variable by name
local_execute() This function is used only in the local execution path and is responsible for calling dispatch execute
local_execution_mode() None
post_execute() Called after the execution has completed, with the user_params; can be used to clean up or alter the outputs
pre_execute() Invoked directly before executing the task method and before all the inputs are converted
sandbox_execute() Call dispatch_execute, in the context of a local sandbox execution

compile()

def compile(
    ctx: flytekit.core.context_manager.FlyteContext,
    args,
    kwargs,
):

Generates a node that encapsulates this task in a workflow definition.

Parameter Type
ctx flytekit.core.context_manager.FlyteContext
args *args
kwargs **kwargs

construct_node_metadata()

def construct_node_metadata()

Used when constructing the node that encapsulates this task as part of a broader workflow definition.

dispatch_execute()

def dispatch_execute(
    ctx: flytekit.core.context_manager.FlyteContext,
    input_literal_map: flytekit.models.literals.LiteralMap,
):

This method translates Flyte’s Type system based input values and invokes the actual call to the executor. This method is also invoked during runtime.

  • VoidPromise is returned when the task itself declares no outputs.
  • A LiteralMap is returned when the task declares one or more outputs. Individual outputs may be None.
  • DynamicJobSpec is returned when a dynamic workflow is executed.
Parameter Type
ctx flytekit.core.context_manager.FlyteContext
input_literal_map flytekit.models.literals.LiteralMap

execute()

def execute(
    kwargs,
):

This method will be invoked to execute the task.

Parameter Type
kwargs **kwargs

find_lhs()

def find_lhs()

get_config()

def get_config(
    settings: flytekit.configuration.SerializationSettings,
):

Returns the task config as a serializable dictionary. This task config consists of metadata about the custom configuration defined for this task.

Parameter Type
settings flytekit.configuration.SerializationSettings

get_container()

def get_container(
    settings: flytekit.configuration.SerializationSettings,
):

Returns the container definition (if any) that is used to run the task on hosted Flyte.

Parameter Type
settings flytekit.configuration.SerializationSettings

get_custom()

def get_custom(
    settings: flytekit.configuration.SerializationSettings,
):

Return additional plugin-specific custom data (if any) as a serializable dictionary.

Parameter Type
settings flytekit.configuration.SerializationSettings

get_extended_resources()

def get_extended_resources(
    settings: flytekit.configuration.SerializationSettings,
):

Returns the extended resources to allocate to the task on hosted Flyte.

Parameter Type
settings flytekit.configuration.SerializationSettings

get_input_types()

def get_input_types()

Returns the names and python types as a dictionary for the inputs of this task.

get_k8s_pod()

def get_k8s_pod(
    settings: flytekit.configuration.SerializationSettings,
):

Returns the Kubernetes pod definition (if any) that is used to run the task on hosted Flyte.

Parameter Type
settings flytekit.configuration.SerializationSettings

get_sql()

def get_sql(
    settings: flytekit.configuration.SerializationSettings,
):

Returns the SQL definition (if any) that is used to run the task on hosted Flyte.

Parameter Type
settings flytekit.configuration.SerializationSettings

get_type_for_input_var()

def get_type_for_input_var(
    k: str,
    v: typing.Any,
):

Returns the python type for an input variable by name.

Parameter Type
k str
v typing.Any

get_type_for_output_var()

def get_type_for_output_var(
    k: str,
    v: typing.Any,
):

Returns the python type for the specified output variable by name.

Parameter Type
k str
v typing.Any

local_execute()

def local_execute(
    ctx: flytekit.core.context_manager.FlyteContext,
    kwargs,
):

This function is used only in the local execution path and is responsible for calling dispatch_execute. Use this function when calling a task with native values (or Promises containing Flyte literals derived from Python native values).

Parameter Type
ctx flytekit.core.context_manager.FlyteContext
kwargs **kwargs
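
Since local_execute is what powers local runs, a short hedged illustration may help: calling a @task-decorated function directly, outside of any Flyte execution, goes through this local execution path, converting the native inputs and returning native outputs. The add task below is purely hypothetical.

from flytekit import task

@task
def add(a: int, b: int) -> int:
    return a + b

# Invoking the task like a regular function exercises the local execution path
# (pre_execute -> input conversion -> execute -> post_execute) under the hood.
assert add(a=1, b=2) == 3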

local_execution_mode()

def local_execution_mode()

post_execute()

def post_execute(
    user_params: typing.Optional[flytekit.core.context_manager.ExecutionParameters],
    rval: typing.Any,
):

Post execute is called after the execution has completed, with the user_params, and can be used to clean up or to alter the outputs to match the intended task outputs. If not overridden, this function is a no-op.

Parameter Type
user_params typing.Optional[flytekit.core.context_manager.ExecutionParameters]
rval typing.Any

pre_execute()

def pre_execute(
    user_params: typing.Optional[flytekit.core.context_manager.ExecutionParameters],
):

This is the method that will be invoked directly before executing the task method and before all the inputs are converted. One particular case where this is useful is if the context is to be modified for the user process to get some user-space parameters. This also ensures that things like the SparkSession are already correctly set up before the type transformers are called.

This should return either the same context or the mutated context.

Parameter Type
user_params typing.Optional[flytekit.core.context_manager.ExecutionParameters]
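
For task-type plugin authors, here is a minimal, hedged sketch of overriding these two hooks. MyPluginTask is hypothetical and deliberately incomplete (it does not implement the rest of the task interface); only the pre_execute/post_execute signatures follow the descriptions above.

from typing import Any, Optional

from flytekit.core.base_task import PythonTask
from flytekit.core.context_manager import ExecutionParameters


class MyPluginTask(PythonTask):
    def pre_execute(self, user_params: Optional[ExecutionParameters]) -> Optional[ExecutionParameters]:
        # Adjust the user-space parameters (e.g. attach a session object) before
        # the inputs are converted by the type transformers.
        return user_params

    def post_execute(self, user_params: Optional[ExecutionParameters], rval: Any) -> Any:
        # Clean up resources or reshape the raw return value to match the
        # intended task outputs; returning rval unchanged is the default no-op.
        return rval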

sandbox_execute()

def sandbox_execute(
    ctx: flytekit.core.context_manager.FlyteContext,
    input_literal_map: flytekit.models.literals.LiteralMap,
):

Call dispatch_execute, in the context of a local sandbox execution. Not invoked during runtime.

Parameter Type
ctx flytekit.core.context_manager.FlyteContext
input_literal_map flytekit.models.literals.LiteralMap

Properties

Property Type Description
deck_fields
disable_deck
docs
enable_deck
environment
instantiated_in
interface
lhs
location
metadata
name
python_interface
security_context
task_config
task_type
task_type_version

flytekit.core.workflow.ReferenceEntity

def ReferenceEntity(
    reference: typing.Union[flytekit.core.reference_entity.WorkflowReference, flytekit.core.reference_entity.TaskReference, flytekit.core.reference_entity.LaunchPlanReference],
    inputs: typing.Dict[str, typing.Type],
    outputs: typing.Dict[str, typing.Type],
):
Parameter Type
reference typing.Union[flytekit.core.reference_entity.WorkflowReference, flytekit.core.reference_entity.TaskReference, flytekit.core.reference_entity.LaunchPlanReference]
inputs typing.Dict[str, typing.Type]
outputs typing.Dict[str, typing.Type]

Methods

Method Description
compile() None
construct_node_metadata() None
execute() None
local_execute() Please see the local_execute comments in the main task
local_execution_mode() None
unwrap_literal_map_and_execute() Please see the implementation of the dispatch_execute function in the real task

compile()

def compile(
    ctx: flytekit.core.context_manager.FlyteContext,
    args,
    kwargs,
):
Parameter Type
ctx flytekit.core.context_manager.FlyteContext
args *args
kwargs **kwargs

construct_node_metadata()

def construct_node_metadata()

execute()

def execute(
    kwargs,
):
Parameter Type
kwargs **kwargs

local_execute()

def local_execute(
    ctx: flytekit.core.context_manager.FlyteContext,
    kwargs,
):

Please see the local_execute comments in the main task.

Parameter Type
ctx flytekit.core.context_manager.FlyteContext
kwargs **kwargs

local_execution_mode()

def local_execution_mode()

unwrap_literal_map_and_execute()

def unwrap_literal_map_and_execute(
    ctx: flytekit.core.context_manager.FlyteContext,
    input_literal_map: flytekit.models.literals.LiteralMap,
):

Please see the implementation of the dispatch_execute function in the real task.

Parameter Type
ctx flytekit.core.context_manager.FlyteContext
input_literal_map flytekit.models.literals.LiteralMap

Properties

Property Type Description
id
interface
name
python_interface
reference

flytekit.core.workflow.ReferenceWorkflow

A reference workflow is a pointer to a workflow that already exists on your Flyte installation. This object will not initiate a network call to Admin, which is why the user is asked to provide the expected interface. If at registration time the interface provided causes an issue with compilation, an error will be returned.

def ReferenceWorkflow(
    project: str,
    domain: str,
    name: str,
    version: str,
    inputs: Dict[str, Type],
    outputs: Dict[str, Type],
):
Parameter Type
project str
domain str
name str
version str
inputs Dict[str, Type]
outputs Dict[str, Type]
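
A hedged example of declaring a pointer to an already-registered workflow; the project/domain/name/version identifiers below are hypothetical, and the inputs/outputs dictionaries must match the interface of the remote workflow.

from flytekit.core.workflow import ReferenceWorkflow

existing_wf = ReferenceWorkflow(
    project="flytesnacks",            # hypothetical project
    domain="development",             # hypothetical domain
    name="workflows.example.my_wf",   # hypothetical fully-qualified name
    version="v1",
    inputs={"a": int},
    outputs={"o": str},
)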

Methods

Method Description
add() None
compile() None
construct_node_metadata() None
execute() None
find_lhs() None
get_all_tasks() Future proof method
load_task() Given the set of identifier keys, should return one Python Task or raise an error if not found
loader_args() This is responsible for turning an instance of a task into args that the load_task function can reconstitute
local_execute() Please see the local_execute comments in the main task
local_execution_mode() None
task_name() Overridable function that can optionally return a custom name for a given task
unwrap_literal_map_and_execute() Please see the implementation of the dispatch_execute function in the real task

add()

def add(
    t: flytekit.core.python_auto_container.PythonAutoContainerTask,
):
Parameter Type
t flytekit.core.python_auto_container.PythonAutoContainerTask

compile()

def compile(
    ctx: flytekit.core.context_manager.FlyteContext,
    args,
    kwargs,
):
Parameter Type
ctx flytekit.core.context_manager.FlyteContext
args *args
kwargs **kwargs

construct_node_metadata()

def construct_node_metadata()

execute()

def execute(
    kwargs,
):
Parameter Type
kwargs **kwargs

find_lhs()

def find_lhs()

get_all_tasks()

def get_all_tasks()

Future-proof method that makes it easy to access all tasks (not required today, as they are auto-registered).

load_task()

def load_task(
    loader_args: typing.List[str],
):

Given the set of identifier keys, should return one Python Task or raise an error if not found

Parameter Type
loader_args typing.List[str]

loader_args()

def loader_args(
    settings: flytekit.configuration.SerializationSettings,
    t: flytekit.core.python_auto_container.PythonAutoContainerTask,
):

This is responsible for turning an instance of a task into args that the load_task function can reconstitute.

Parameter Type
settings flytekit.configuration.SerializationSettings
t flytekit.core.python_auto_container.PythonAutoContainerTask

local_execute()

def local_execute(
    ctx: flytekit.core.context_manager.FlyteContext,
    kwargs,
):

Please see the local_execute comments in the main task.

Parameter Type
ctx flytekit.core.context_manager.FlyteContext
kwargs **kwargs

local_execution_mode()

def local_execution_mode()

task_name()

def task_name(
    t: PythonAutoContainerTask,
):

Overridable function that can optionally return a custom name for a given task

Parameter Type
t PythonAutoContainerTask

unwrap_literal_map_and_execute()

def unwrap_literal_map_and_execute(
    ctx: flytekit.core.context_manager.FlyteContext,
    input_literal_map: flytekit.models.literals.LiteralMap,
):

Please see the implementation of the dispatch_execute function in the real task.

Parameter Type
ctx flytekit.core.context_manager.FlyteContext
input_literal_map flytekit.models.literals.LiteralMap

Properties

Property Type Description
default_options
docs
failure_node
function
id
instantiated_in
interface
lhs
location
name
nodes
on_failure
output_bindings
python_interface
reference
short_name
workflow_metadata
workflow_metadata_defaults

flytekit.core.workflow.Task

The base of all Tasks in flytekit. This task is closest to the FlyteIDL TaskTemplate: it captures the information in the FlyteIDL specification and does not have a Python-native interface associated with it. Refer to the derived classes for examples of how to extend this class.

def Task(
    task_type: str,
    name: str,
    interface: flytekit.models.interface.TypedInterface,
    metadata: typing.Optional[flytekit.core.base_task.TaskMetadata],
    task_type_version,
    security_ctx: typing.Optional[flytekit.models.security.SecurityContext],
    docs: typing.Optional[flytekit.models.documentation.Documentation],
    kwargs,
):
Parameter Type
task_type str
name str
interface flytekit.models.interface.TypedInterface
metadata typing.Optional[flytekit.core.base_task.TaskMetadata]
task_type_version
security_ctx typing.Optional[flytekit.models.security.SecurityContext]
docs typing.Optional[flytekit.models.documentation.Documentation]
kwargs **kwargs

Methods

Method Description
compile() None
dispatch_execute() This method translates Flyte’s Type system based input values and invokes the actual call to the executor
execute() This method will be invoked to execute the task
get_config() Returns the task config as a serializable dictionary
get_container() Returns the container definition (if any) that is used to run the task on hosted Flyte
get_custom() Return additional plugin-specific custom data (if any) as a serializable dictionary
get_extended_resources() Returns the extended resources to allocate to the task on hosted Flyte
get_input_types() Returns python native types for inputs
get_k8s_pod() Returns the Kubernetes pod definition (if any) that is used to run the task on hosted Flyte
get_sql() Returns the SQL definition (if any) that is used to run the task on hosted Flyte
get_type_for_input_var() Returns the python native type for the given input variable
get_type_for_output_var() Returns the python native type for the given output variable
local_execute() This function is used only in the local execution path and is responsible for calling dispatch execute
local_execution_mode() None
pre_execute() This is the method that will be invoked directly before executing the task method and before all the inputs are converted
sandbox_execute() Call dispatch_execute, in the context of a local sandbox execution

compile()

def compile(
    ctx: flytekit.core.context_manager.FlyteContext,
    args,
    kwargs,
):
Parameter Type
ctx flytekit.core.context_manager.FlyteContext
args *args
kwargs **kwargs

dispatch_execute()

def dispatch_execute(
    ctx: flytekit.core.context_manager.FlyteContext,
    input_literal_map: flytekit.models.literals.LiteralMap,
):

This method translates Flyte’s type-system-based input values and invokes the actual call to the executor. This method is also invoked during runtime.

Parameter Type
ctx flytekit.core.context_manager.FlyteContext
input_literal_map flytekit.models.literals.LiteralMap

execute()

def execute(
    kwargs,
):

This method will be invoked to execute the task.

Parameter Type
kwargs **kwargs

get_config()

def get_config(
    settings: flytekit.configuration.SerializationSettings,
):

Returns the task config as a serializable dictionary. This task config consists of metadata about the custom configuration defined for this task.

Parameter Type
settings flytekit.configuration.SerializationSettings

get_container()

def get_container(
    settings: flytekit.configuration.SerializationSettings,
):

Returns the container definition (if any) that is used to run the task on hosted Flyte.

Parameter Type
settings flytekit.configuration.SerializationSettings

get_custom()

def get_custom(
    settings: flytekit.configuration.SerializationSettings,
):

Return additional plugin-specific custom data (if any) as a serializable dictionary.

Parameter Type
settings flytekit.configuration.SerializationSettings

get_extended_resources()

def get_extended_resources(
    settings: flytekit.configuration.SerializationSettings,
):

Returns the extended resources to allocate to the task on hosted Flyte.

Parameter Type
settings flytekit.configuration.SerializationSettings

get_input_types()

def get_input_types()

Returns Python native types for inputs. Since this is not a Python-native task (it is the base class), this returns None. We could deduce the types from the literal types, but that is not a required exercise.

TODO: we could use the literal type to determine this

get_k8s_pod()

def get_k8s_pod(
    settings: flytekit.configuration.SerializationSettings,
):

Returns the Kubernetes pod definition (if any) that is used to run the task on hosted Flyte.

Parameter Type
settings flytekit.configuration.SerializationSettings

get_sql()

def get_sql(
    settings: flytekit.configuration.SerializationSettings,
):

Returns the SQL definition (if any) that is used to run the task on hosted Flyte.

Parameter Type
settings flytekit.configuration.SerializationSettings

get_type_for_input_var()

def get_type_for_input_var(
    k: str,
    v: typing.Any,
):

Returns the Python native type for the given input variable.

TODO: we could use the literal type to determine this

Parameter Type
k str
v typing.Any

get_type_for_output_var()

def get_type_for_output_var(
    k: str,
    v: typing.Any,
):

Returns the Python native type for the given output variable.

TODO: we could use the literal type to determine this

Parameter Type
k str
v typing.Any

local_execute()

def local_execute(
    ctx: flytekit.core.context_manager.FlyteContext,
    kwargs,
):

This function is used only in the local execution path and is responsible for calling dispatch_execute. Use this function when calling a task with native values (or Promises containing Flyte literals derived from Python native values).

Parameter Type
ctx flytekit.core.context_manager.FlyteContext
kwargs **kwargs

local_execution_mode()

def local_execution_mode()

pre_execute()

def pre_execute(
    user_params: flytekit.core.context_manager.ExecutionParameters,
):

This is the method that will be invoked directly before executing the task method and before all the inputs are converted. One particular case where this is useful is if the context is to be modified for the user process to get some user-space parameters. This also ensures that things like the SparkSession are already correctly set up before the type transformers are called.

This should return either the same context or the mutated context.

Parameter Type
user_params flytekit.core.context_manager.ExecutionParameters

sandbox_execute()

def sandbox_execute(
    ctx: flytekit.core.context_manager.FlyteContext,
    input_literal_map: flytekit.models.literals.LiteralMap,
):

Call dispatch_execute, in the context of a local sandbox execution. Not invoked during runtime.

Parameter Type
ctx flytekit.core.context_manager.FlyteContext
input_literal_map flytekit.models.literals.LiteralMap

Properties

Property Type Description
docs
interface
metadata
name
python_interface
security_context
task_type
task_type_version

flytekit.core.workflow.TypeEngine

Core Extensible TypeEngine of Flytekit. This should be used to extend the capabilities of Flytekit’s type system. Users can implement their own TypeTransformers and register them with the TypeEngine; this allows special handling of user-defined objects.
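
A short, hedged round trip through the engine, using only the methods documented below; the context returned by FlyteContextManager.current_context() is assumed to be sufficient for simple scalar types.

from flytekit.core.context_manager import FlyteContextManager
from flytekit.core.type_engine import TypeEngine

ctx = FlyteContextManager.current_context()

lt = TypeEngine.to_literal_type(int)               # python type  -> Flyte LiteralType
lit = TypeEngine.to_literal(ctx, 42, int, lt)      # python value -> Literal
val = TypeEngine.to_python_value(ctx, lit, int)    # Literal      -> python value
assert val == 42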

Methods

Method Description
async_to_literal() Converts a python value of a given type and expected LiteralType into a resolved Literal value
async_to_python_value() None
calculate_hash() None
dict_to_literal_map() None
dict_to_literal_map_pb() None
get_available_transformers() Returns all python types for which transformers are available
get_transformer() Implements a recursive search for the transformer
guess_python_type() Transforms a flyte-specific LiteralType into a regular python type
guess_python_types() Transforms a dictionary of flyte-specific Variable objects into a dictionary of regular python types
lazy_import_transformers() Only load the transformers if needed
literal_map_to_kwargs() None
named_tuple_to_variable_map() Converts a python-native NamedTuple to a flyte-specific VariableMap of named literals
register() This should be used for all types that respond with the right type annotation when you use the type(...) function
register_additional_type() None
register_restricted_type() None
to_html() None
to_literal() Converts a python value into a Literal; a synchronous entry point that may in turn invoke an async transformer
to_literal_checks() None
to_literal_type() Converts a python type into a flyte specific LiteralType
to_python_value() Converts a Literal value with an expected python type into a python value
unwrap_offloaded_literal() None

async_to_literal()

def async_to_literal(
    ctx: FlyteContext,
    python_val: typing.Any,
    python_type: Type[T],
    expected: LiteralType,
):

Converts a python value of a given type and expected LiteralType into a resolved Literal value.

Parameter Type
ctx FlyteContext
python_val typing.Any
python_type Type[T]
expected LiteralType

async_to_python_value()

def async_to_python_value(
    ctx: FlyteContext,
    lv: Literal,
    expected_python_type: Type,
):
Parameter Type
ctx FlyteContext
lv Literal
expected_python_type Type

calculate_hash()

def calculate_hash(
    python_val: typing.Any,
    python_type: Type[T],
):
Parameter Type
python_val typing.Any
python_type Type[T]

dict_to_literal_map()

def dict_to_literal_map(
    ctx: FlyteContext,
    d: typing.Dict[str, typing.Any],
    type_hints: Optional[typing.Dict[str, type]],
):
Parameter Type
ctx FlyteContext
d typing.Dict[str, typing.Any]
type_hints Optional[typing.Dict[str, type]]

dict_to_literal_map_pb()

def dict_to_literal_map_pb(
    ctx: FlyteContext,
    d: typing.Dict[str, typing.Any],
    type_hints: Optional[typing.Dict[str, type]],
):
Parameter Type
ctx FlyteContext
d typing.Dict[str, typing.Any]
type_hints Optional[typing.Dict[str, type]]

get_available_transformers()

def get_available_transformers()

Returns all python types for which transformers are available

get_transformer()

def get_transformer(
    python_type: Type,
):

Implements a recursive search for the transformer.

Parameter Type
python_type Type

guess_python_type()

def guess_python_type(
    flyte_type: LiteralType,
):

Transforms a flyte-specific LiteralType into a regular python type.

Parameter Type
flyte_type LiteralType
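
A small, hedged illustration using the simple integer type, assuming the default transformers are registered:

from flytekit.core.type_engine import TypeEngine

lt = TypeEngine.to_literal_type(int)        # python type -> LiteralType
assert TypeEngine.guess_python_type(lt) is int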

guess_python_types()

def guess_python_types(
    flyte_variable_dict: typing.Dict[str, _interface_models.Variable],
):

Transforms a dictionary of flyte-specific Variable objects into a dictionary of regular python types.

Parameter Type
flyte_variable_dict typing.Dict[str, _interface_models.Variable]

lazy_import_transformers()

def lazy_import_transformers()

Only load the transformers if needed.

literal_map_to_kwargs()

def literal_map_to_kwargs(
    ctx: FlyteContext,
    lm: LiteralMap,
    python_types: typing.Optional[typing.Dict[str, type]],
    literal_types: typing.Optional[typing.Dict[str, _interface_models.Variable]],
):
Parameter Type
ctx FlyteContext
lm LiteralMap
python_types typing.Optional[typing.Dict[str, type]]
literal_types typing.Optional[typing.Dict[str, _interface_models.Variable]]

named_tuple_to_variable_map()

def named_tuple_to_variable_map(
    t: typing.NamedTuple,
):

Converts a python-native NamedTuple to a flyte-specific VariableMap of named literals.

Parameter Type
t typing.NamedTuple

register()

def register(
    transformer: TypeTransformer,
    additional_types: Optional[typing.List[Type]],
):

This should be used for all types that respond with the right type annotation when you use the type(...) function.

Parameter Type
transformer TypeTransformer
additional_types Optional[typing.List[Type]]
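
A hedged sketch of the registration flow: MyPoint and MyPointTransformer are hypothetical, and the transformer simply round-trips the value through a string literal. The TypeTransformer base class and the literal/type models are assumed to be importable as shown.

from dataclasses import dataclass
from typing import Type

from flytekit.core.context_manager import FlyteContext
from flytekit.core.type_engine import TypeEngine, TypeTransformer
from flytekit.models.literals import Literal, Primitive, Scalar
from flytekit.models.types import LiteralType, SimpleType


@dataclass
class MyPoint:
    x: float
    y: float


class MyPointTransformer(TypeTransformer[MyPoint]):
    def __init__(self):
        super().__init__(name="my-point", t=MyPoint)

    def get_literal_type(self, t: Type[MyPoint]) -> LiteralType:
        # Represent points as plain strings on the Flyte side.
        return LiteralType(simple=SimpleType.STRING)

    def to_literal(self, ctx: FlyteContext, python_val: MyPoint, python_type: Type[MyPoint], expected: LiteralType) -> Literal:
        # Serialize the point into a "x,y" string literal.
        return Literal(scalar=Scalar(primitive=Primitive(string_value=f"{python_val.x},{python_val.y}")))

    def to_python_value(self, ctx: FlyteContext, lv: Literal, expected_python_type: Type[MyPoint]) -> MyPoint:
        # Parse the "x,y" string literal back into a MyPoint instance.
        x, y = lv.scalar.primitive.string_value.split(",")
        return MyPoint(x=float(x), y=float(y))


# MyPoint reports itself correctly via type(...), so plain register() is appropriate here.
TypeEngine.register(MyPointTransformer())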

register_additional_type()

def register_additional_type(
    transformer: TypeTransformer[T],
    additional_type: Type[T],
    override,
):
Parameter Type
transformer TypeTransformer[T]
additional_type Type[T]
override

register_restricted_type()

def register_restricted_type(
    name: str,
    type: Type[T],
):
Parameter Type
name str
type Type[T]

to_html()

def to_html(
    ctx: FlyteContext,
    python_val: typing.Any,
    expected_python_type: Type[typing.Any],
):
Parameter Type
ctx FlyteContext
python_val typing.Any
expected_python_type Type[typing.Any]

to_literal()

def to_literal(
    ctx: FlyteContext,
    python_val: typing.Any,
    python_type: Type[T],
    expected: LiteralType,
):

The current structure exists because we allow users to call this synchronous to_literal function from an async function, and we allow this to_literal function, in turn, to invoke yet another async function, namely an async transformer.

Parameter Type
ctx FlyteContext
python_val typing.Any
python_type Type[T]
expected LiteralType

to_literal_checks()

def to_literal_checks(
    python_val: typing.Any,
    python_type: Type[T],
    expected: LiteralType,
):
Parameter Type
python_val typing.Any
python_type Type[T]
expected LiteralType

to_literal_type()

def to_literal_type(
    python_type: Type[T],
):

Converts a python type into a flyte specific LiteralType

Parameter Type
python_type Type[T]

to_python_value()

def to_python_value(
    ctx: FlyteContext,
    lv: Literal,
    expected_python_type: Type,
):

Converts a Literal value with an expected python type into a python value.

Parameter Type
ctx FlyteContext
lv Literal
expected_python_type Type

unwrap_offloaded_literal()

def unwrap_offloaded_literal(
    ctx: FlyteContext,
    lv: Literal,
):
Parameter Type
ctx FlyteContext
lv Literal

flytekit.core.workflow.VoidPromise

This object is returned for tasks that do not return any outputs (the declared interface is empty). A VoidPromise cannot be interacted with and does not allow comparisons or any other operations.

def VoidPromise(
    task_name: str,
    ref: Optional[NodeOutput],
):
Parameter Type
task_name str
ref Optional[NodeOutput]

Methods

Method Description
runs_before() This is a placeholder and should do nothing
with_overrides() None

runs_before()

def runs_before(
    args,
    kwargs,
):

This is a placeholder and should do nothing. It is only here to enable local execution of workflows where a task returns nothing.

Parameter Type
args *args
kwargs **kwargs
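
A hedged example of how this is typically exercised: within a workflow body, the chaining operator (>>, assumed here) establishes an explicit ordering between tasks that exchange no data, and for tasks without declared outputs the operands are VoidPromises. The tasks below are purely illustrative.

from flytekit import task, workflow

@task
def start() -> None:
    print("start")

@task
def finish() -> None:
    print("finish")

@workflow
def wf() -> None:
    a = start()   # no declared outputs -> VoidPromise
    b = finish()
    a >> b        # ensure `start` runs before `finish`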

with_overrides()

def with_overrides(
    args,
    kwargs,
):
Parameter Type
args *args
kwargs **kwargs

Properties

Property Type Description
ref
task_name

flytekit.core.workflow.WorkflowBase

def WorkflowBase(
    name: str,
    workflow_metadata: WorkflowMetadata,
    workflow_metadata_defaults: WorkflowMetadataDefaults,
    python_interface: Interface,
    on_failure: Optional[Union[WorkflowBase, Task]],
    docs: Optional[Documentation],
    default_options: Optional[Options],
    kwargs,
):
Parameter Type
name str
workflow_metadata WorkflowMetadata
workflow_metadata_defaults WorkflowMetadataDefaults
python_interface Interface
on_failure Optional[Union[WorkflowBase, Task]]
docs Optional[Documentation]
default_options Optional[Options]
kwargs **kwargs

Methods

Method Description
compile() None
construct_node_metadata() None
execute() None
local_execute() None
local_execution_mode() None

compile()

def compile(
    kwargs,
):
Parameter Type
kwargs **kwargs

construct_node_metadata()

def construct_node_metadata()

execute()

def execute(
    kwargs,
):
Parameter Type
kwargs **kwargs

local_execute()

def local_execute(
    ctx: FlyteContext,
    kwargs,
):
Parameter Type
ctx FlyteContext
kwargs **kwargs

local_execution_mode()

def local_execution_mode()

Properties

Property Type Description
default_options
docs
failure_node
interface
name
nodes
on_failure
output_bindings
python_interface
short_name
workflow_metadata
workflow_metadata_defaults

flytekit.core.workflow.WorkflowFailurePolicy

Defines the behavior for a workflow execution in the case of an observed node execution failure. By default, a workflow execution will immediately enter a failed state if a component node fails.
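
A hedged example of opting into the non-default behavior; this assumes the standard @workflow decorator's failure_policy argument and the FAIL_AFTER_EXECUTABLE_NODES_COMPLETE member.

from flytekit import task, workflow
from flytekit.core.workflow import WorkflowFailurePolicy

@task
def step(x: int) -> int:
    return x + 1

@workflow(failure_policy=WorkflowFailurePolicy.FAIL_AFTER_EXECUTABLE_NODES_COMPLETE)
def wf(x: int) -> int:
    # With this policy, nodes that can still run are allowed to finish
    # even if some other node in the workflow fails.
    return step(x=x)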

flytekit.core.workflow.WorkflowMetadata

def WorkflowMetadata(
    on_failure: WorkflowFailurePolicy,
):
Parameter Type
on_failure WorkflowFailurePolicy

Methods

Method Description
to_flyte_model() None

to_flyte_model()

def to_flyte_model()

flytekit.core.workflow.WorkflowMetadataDefaults

This class is similarly named to the one above. Please see the IDL for more information, but essentially, this WorkflowMetadataDefaults class represents the defaults that are handed down to a workflow’s tasks, whereas WorkflowMetadata represents metadata about the workflow itself.

def WorkflowMetadataDefaults(
    interruptible: bool,
):
Parameter Type
interruptible bool
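
A hedged example of where this default comes from in user code: the @workflow decorator's interruptible flag (assumed here) is what populates WorkflowMetadataDefaults, and it is inherited by the workflow's tasks unless they override it.

from flytekit import task, workflow

@task
def step(x: int) -> int:
    return x * 2

@workflow(interruptible=True)
def wf(x: int) -> int:
    # `step` inherits interruptible=True from the workflow's metadata defaults.
    return step(x=x)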

Methods

Method Description
to_flyte_model() None

to_flyte_model()

def to_flyte_model()

flytekit.core.workflow.WorkflowReference

A reference object containing metadata that points to a remote workflow.

def WorkflowReference(
    project: str,
    domain: str,
    name: str,
    version: str,
):
Parameter Type
project str
domain str
name str
version str

Properties

Property Type Description
id
resource_type