azkaban-aplcache
Changes
docs/eventTrigger.rst 156(+156 -0)
docs/figures/createproject.png 0(+0 -0)
docs/figures/embedded-flow.png 0(+0 -0)
docs/figures/emptyprojectpage.png 0(+0 -0)
docs/figures/executeflowfailure.png 0(+0 -0)
docs/figures/executeflownotify.png 0(+0 -0)
docs/figures/executeflowpanel.png 0(+0 -0)
docs/figures/executingflowpage.png 0(+0 -0)
docs/figures/executingflowspage.png 0(+0 -0)
docs/figures/flexible-scheduling.png 0(+0 -0)
docs/figures/FlowExecuted.png 0(+0 -0)
docs/figures/flowview.png 0(+0 -0)
docs/figures/flowviewexecutions.png 0(+0 -0)
docs/figures/hdfsbrowser.png 0(+0 -0)
docs/figures/historypage.png 0(+0 -0)
docs/figures/jobedit.png 0(+0 -0)
docs/figures/jobhistorypage.png 0(+0 -0)
docs/figures/joblogs.png 0(+0 -0)
docs/figures/jobpage.png 0(+0 -0)
docs/figures/jobsummary.png 0(+0 -0)
docs/figures/login.png 0(+0 -0)
docs/figures/newprojectpage.png 0(+0 -0)
docs/figures/permission.png 0(+0 -0)
docs/figures/project-page.png 0(+0 -0)
docs/figures/proxy-user.png 0(+0 -0)
docs/figures/scheduleflowoptions.png 0(+0 -0)
docs/figures/schedulepage.png 0(+0 -0)
docs/figures/slapanel.png 0(+0 -0)
docs/figures/Success.png 0(+0 -0)
docs/figures/TriggerExample.png 0(+0 -0)
docs/figures/TriggerInfo.png 0(+0 -0)
docs/figures/TriggerList.png 0(+0 -0)
docs/figures/uploadprojects.png 0(+0 -0)
docs/figures/userpermission.png 0(+0 -0)
docs/index.rst 2(+2 -0)
docs/useAzkaban.rst 435(+435 -0)
Details
docs/eventTrigger.rst 156(+156 -0)
diff --git a/docs/eventTrigger.rst b/docs/eventTrigger.rst
new file mode 100644
index 0000000..b1a9015
--- /dev/null
+++ b/docs/eventTrigger.rst
@@ -0,0 +1,156 @@
+.. _EventBasedTrigger:
+
+
+Flow Trigger Dependency Plugin
+==================================
+
+*******************
+Event Based Trigger
+*******************
+
+
+..
+   Todo:: Link to the data trigger documentation if available
+
+Currently there are only a few ways to launch jobs in Azkaban, including schedules and the API. These are limited, however, because jobs sometimes need to be executed automatically, on demand. The event trigger is a new feature introduced by Azkaban. It defines a new paradigm of triggering flows: triggering a flow on event arrival. This concept enables users to define the events that a flow depends on. Once all of the dependencies become ready, the workflow is triggered.
+
+Apache Kafka is a publish-and-subscribe data streaming system. Utilizing Kafka, Azkaban performs regular-expression matching on the Kafka event payload. With this "contains" matching logic, a dependency is marked as satisfied only if the payload contains the regex pattern that the user predefines.
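
The "contains" matching described above can be sketched roughly as follows. This is a minimal illustration in Python, not the plugin's actual Java implementation; the payload and patterns are hypothetical examples:

```python
import re

def dependency_satisfied(payload: str, pattern: str) -> bool:
    """A dependency is satisfied when the regex pattern is found
    anywhere in the event payload ("contains" semantics)."""
    return re.search(pattern, payload) is not None

# Hypothetical Kafka event payload and user-defined patterns
payload = '{"name": "Charlie", "event": "MetastorePartitionAuditEvent"}'
print(dependency_satisfied(payload, r"Partition[A-Z]\w*Event"))  # True
print(dependency_satisfied(payload, r"hadoop?.*"))               # False
```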
+
+************************************************************
+Getting Started with Deploying the Event Trigger on Azkaban
+************************************************************
+
+Azkaban builds use Gradle (downloaded automatically when run via ``gradlew``, the Gradle wrapper) and require Java 8 or higher.
+
+Build
+########
+The following commands run on \*nix platforms such as Linux and OS X. To build the Flow Trigger Dependency Plugin, run the commands in the ``/az-flow-trigger-dependency-type/kafka-event-trigger`` directory.
+::
+
+   # Build Azkaban
+   ../../gradlew build
+
+   # Clean the build
+   ../../gradlew clean
+
+   # Build without running tests
+   ../../gradlew build -x test
+These are all standard Gradle commands. Please see the Gradle documentation for more information.
+
+
+
+Server Configuration
+####################
+The ``gradlew`` commands build the fat JAR. After that, you need to specify the plugin directory within ``conf``. Taking solo-server as an example, override the ``azkaban.dependency.plugin.dir`` runtime property inside the ``azkaban.properties`` file under the solo-server ``conf`` directory.
+This property must be set to the location where you put your event-trigger JAR file.
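
For instance, a solo-server ``azkaban.properties`` entry might look like this (the directory path is a placeholder):

```properties
# conf/azkaban.properties (solo-server) -- path shown is hypothetical
azkaban.dependency.plugin.dir=/opt/azkaban/plugins/triggers
```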
+
+Database Configuration (Optional)
+#################################
+The following four properties can be defined in ``conf/azkaban.private.properties`` for solo-server, depending on the use case.
+
++-----------------------------------------+
+| Properties |
++=========================================+
+| mysql.user |
++-----------------------------------------+
+| mysql.password |
++-----------------------------------------+
+| org.quartz.dataSource.quartzDS.user |
++-----------------------------------------+
+| org.quartz.dataSource.quartzDS.password |
++-----------------------------------------+
+
+****************************************
+Event Based Trigger Plugin Configuration
+****************************************
+
+Inside the Azkaban dependency plugin directory, there should be two items: the Event Based Trigger plugin JAR and the ``dependency.properties`` file.
+
+Required properties are:
+
+- **dependency.classpath** - Used by Azkaban to identify the plugin's classpath. It should be the JAR file's absolute path.
+
+- **dependency.class** - Used by the Azkaban flow trigger instance to integrate with this configuration file. For the event trigger, it should be ``trigger.kafka.KafkaDependencyCheck``.
+
+- **kafka.broker.url** - The URL and port number of your Kafka broker.
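
A complete ``dependency.properties`` might therefore look like this (the JAR path and broker address are placeholders; the ``dependency.class`` value is the one named above):

```properties
# dependency.properties -- JAR path and broker address are hypothetical
dependency.classpath=/opt/azkaban/plugins/triggers/kafka-event-trigger.jar
dependency.class=trigger.kafka.KafkaDependencyCheck
kafka.broker.url=localhost:9092
```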
+
+
+
+************************************
+Event Trigger Instance Configuration
+************************************
+An event trigger is part of the flow definition, and each flow can have at most one event trigger.
+Defining an event trigger is supported via the Hadoop DSL.
+The trigger must be configured within the flow file, inside the project zip that users upload.
+An event trigger is composed of a list of event dependencies, a maximum wait time, and a schedule.
+Take the following figure as an example:
+
+.. image:: figures/TriggerExample.png
+
+- **Max Wait Time**: How long the trigger will wait for all dependencies to become available before it is cancelled.
+- **Trigger.schedule**: The schedule on which to run this trigger, specified in cron format. In this example, a trigger (followed by the project workflow) is created every 2 minutes.
+
+- **Trigger dependencies**: The params specify which regex pattern must appear in events coming from a specific topic channel. The trigger kick-starts the flow once all of the predefined dependency conditions are met.
+
+Therefore, this trigger example will launch the flow once it detects Kafka events with any payload in ``AzEvent_Topic4``, a payload matching the ``.*Partition[A-Z]....Event`` pattern coming from ``AzEvent_Topic4``, and one matching ``hadoop?.*`` in ``AzEvent_Topic1``.
+
+The matching mechanism is implemented behind a generic interface, so it can be extended beyond regex matching.
+
+
+*******************************************
+Event Based Trigger Example With Azkaban UI
+*******************************************
+All scheduled data triggers will show up in the Azkaban Flow Trigger section. Project admins are also able to pause and resume a scheduled trigger when needed.
+
+
+**Trigger info page for a specific flow:**
+
+.. image:: figures/TriggerInfo.png
+
+**Current and historic triggers for a specific flow:**
+
+.. image:: figures/TriggerList.png
+
+
+
+Follow these steps to run an end-to-end local test:
+
+1. Start a Kafka broker locally:
+
+Follow the `Kafka Quickstart <https://kafka.apache.org/quickstart/>`_ to run these Kafka console scripts from the Kafka package.
+::
+
+   # Start ZooKeeper
+   bin/zookeeper-server-start.sh config/zookeeper.properties
+
+   # Start the Kafka server
+   bin/kafka-server-start.sh config/server.properties
+
+2. Send a JSON event to a topic, with ``AzEvent_Topic4`` as an example:
+::
+
+   bin/kafka-console-producer.sh --broker-list localhost:9092 --topic AzEvent_Topic4 < recordPartition.json
+
+Here is what ``recordPartition.json`` looks like:
+
+.. code-block:: json
+
+ {
+ "name":"Charlie",
+ "team": "Azkaban",
+ "event":"MetastorePartitionAuditEvent"
+ }
+
+Once this event arrives, Azkaban will mark this specific event dependency as satisfied.
+
+.. image:: figures/Success.png
+
+3. Send another event from the producer to launch the flow:
+::
+
+   bin/kafka-console-producer.sh --broker-list localhost:9092 --topic AzEvent_Topic4 < recordHadoop.json
+
+
+**The workflow is triggered once all of its dependencies are cleared:**
+
+.. image:: figures/FlowExecuted.png
+
+**********
+Limitation
+**********
+Since our design goal is to decouple the trigger condition from the action to take, there is currently a limitation on deserializing records. Although Kafka supports publishing and subscribing to streams of records with custom serializers and deserializers, what we have right now is limited to Kafka's built-in String deserializer. We are planning to let users upload a JAR with their own custom deserialization function in the near future.
+
+
diff --git a/docs/figures/azkaban2overviewdesign.png b/docs/figures/azkaban2overviewdesign.png
new file mode 100644
index 0000000..071bc37
Binary files /dev/null and b/docs/figures/azkaban2overviewdesign.png differ
docs/figures/createproject.png 0(+0 -0)
diff --git a/docs/figures/createproject.png b/docs/figures/createproject.png
new file mode 100644
index 0000000..12ca702
Binary files /dev/null and b/docs/figures/createproject.png differ
docs/figures/embedded-flow.png 0(+0 -0)
diff --git a/docs/figures/embedded-flow.png b/docs/figures/embedded-flow.png
new file mode 100644
index 0000000..66f7c5a
Binary files /dev/null and b/docs/figures/embedded-flow.png differ
docs/figures/emptyprojectpage.png 0(+0 -0)
diff --git a/docs/figures/emptyprojectpage.png b/docs/figures/emptyprojectpage.png
new file mode 100644
index 0000000..d2c44f1
Binary files /dev/null and b/docs/figures/emptyprojectpage.png differ
diff --git a/docs/figures/executeflowconcurrent.png b/docs/figures/executeflowconcurrent.png
new file mode 100644
index 0000000..b2ac4c1
Binary files /dev/null and b/docs/figures/executeflowconcurrent.png differ
docs/figures/executeflowfailure.png 0(+0 -0)
diff --git a/docs/figures/executeflowfailure.png b/docs/figures/executeflowfailure.png
new file mode 100644
index 0000000..a077da6
Binary files /dev/null and b/docs/figures/executeflowfailure.png differ
docs/figures/executeflownotify.png 0(+0 -0)
diff --git a/docs/figures/executeflownotify.png b/docs/figures/executeflownotify.png
new file mode 100644
index 0000000..cd64cc9
Binary files /dev/null and b/docs/figures/executeflownotify.png differ
docs/figures/executeflowpanel.png 0(+0 -0)
diff --git a/docs/figures/executeflowpanel.png b/docs/figures/executeflowpanel.png
new file mode 100644
index 0000000..255483d
Binary files /dev/null and b/docs/figures/executeflowpanel.png differ
diff --git a/docs/figures/executeflowparameters.png b/docs/figures/executeflowparameters.png
new file mode 100644
index 0000000..dab0971
Binary files /dev/null and b/docs/figures/executeflowparameters.png differ
docs/figures/executingflowpage.png 0(+0 -0)
diff --git a/docs/figures/executingflowpage.png b/docs/figures/executingflowpage.png
new file mode 100644
index 0000000..7a7efbf
Binary files /dev/null and b/docs/figures/executingflowpage.png differ
diff --git a/docs/figures/executingflowpagejobslist.png b/docs/figures/executingflowpagejobslist.png
new file mode 100644
index 0000000..5469516
Binary files /dev/null and b/docs/figures/executingflowpagejobslist.png differ
docs/figures/executingflowspage.png 0(+0 -0)
diff --git a/docs/figures/executingflowspage.png b/docs/figures/executingflowspage.png
new file mode 100644
index 0000000..d014578
Binary files /dev/null and b/docs/figures/executingflowspage.png differ
docs/figures/flexible-scheduling.png 0(+0 -0)
diff --git a/docs/figures/flexible-scheduling.png b/docs/figures/flexible-scheduling.png
new file mode 100644
index 0000000..910fb85
Binary files /dev/null and b/docs/figures/flexible-scheduling.png differ
docs/figures/FlowExecuted.png 0(+0 -0)
diff --git a/docs/figures/FlowExecuted.png b/docs/figures/FlowExecuted.png
new file mode 100644
index 0000000..1ed46d9
Binary files /dev/null and b/docs/figures/FlowExecuted.png differ
docs/figures/flowview.png 0(+0 -0)
diff --git a/docs/figures/flowview.png b/docs/figures/flowview.png
new file mode 100644
index 0000000..af7f737
Binary files /dev/null and b/docs/figures/flowview.png differ
docs/figures/flowviewexecutions.png 0(+0 -0)
diff --git a/docs/figures/flowviewexecutions.png b/docs/figures/flowviewexecutions.png
new file mode 100644
index 0000000..e68d26a
Binary files /dev/null and b/docs/figures/flowviewexecutions.png differ
docs/figures/hdfsbrowser.png 0(+0 -0)
diff --git a/docs/figures/hdfsbrowser.png b/docs/figures/hdfsbrowser.png
new file mode 100644
index 0000000..b3bfe2b
Binary files /dev/null and b/docs/figures/hdfsbrowser.png differ
docs/figures/historypage.png 0(+0 -0)
diff --git a/docs/figures/historypage.png b/docs/figures/historypage.png
new file mode 100644
index 0000000..4ad95df
Binary files /dev/null and b/docs/figures/historypage.png differ
docs/figures/jobedit.png 0(+0 -0)
diff --git a/docs/figures/jobedit.png b/docs/figures/jobedit.png
new file mode 100644
index 0000000..fd604a4
Binary files /dev/null and b/docs/figures/jobedit.png differ
docs/figures/jobhistorypage.png 0(+0 -0)
diff --git a/docs/figures/jobhistorypage.png b/docs/figures/jobhistorypage.png
new file mode 100644
index 0000000..4156dc7
Binary files /dev/null and b/docs/figures/jobhistorypage.png differ
docs/figures/joblogs.png 0(+0 -0)
diff --git a/docs/figures/joblogs.png b/docs/figures/joblogs.png
new file mode 100644
index 0000000..c9c0ca9
Binary files /dev/null and b/docs/figures/joblogs.png differ
docs/figures/jobpage.png 0(+0 -0)
diff --git a/docs/figures/jobpage.png b/docs/figures/jobpage.png
new file mode 100644
index 0000000..859dca1
Binary files /dev/null and b/docs/figures/jobpage.png differ
docs/figures/jobsummary.png 0(+0 -0)
diff --git a/docs/figures/jobsummary.png b/docs/figures/jobsummary.png
new file mode 100644
index 0000000..1b94f46
Binary files /dev/null and b/docs/figures/jobsummary.png differ
docs/figures/login.png 0(+0 -0)
diff --git a/docs/figures/login.png b/docs/figures/login.png
new file mode 100644
index 0000000..c717926
Binary files /dev/null and b/docs/figures/login.png differ
docs/figures/newprojectpage.png 0(+0 -0)
diff --git a/docs/figures/newprojectpage.png b/docs/figures/newprojectpage.png
new file mode 100644
index 0000000..e628c62
Binary files /dev/null and b/docs/figures/newprojectpage.png differ
docs/figures/permission.png 0(+0 -0)
diff --git a/docs/figures/permission.png b/docs/figures/permission.png
new file mode 100644
index 0000000..a8df8eb
Binary files /dev/null and b/docs/figures/permission.png differ
docs/figures/project-page.png 0(+0 -0)
diff --git a/docs/figures/project-page.png b/docs/figures/project-page.png
new file mode 100644
index 0000000..aef6bd7
Binary files /dev/null and b/docs/figures/project-page.png differ
docs/figures/proxy-user.png 0(+0 -0)
diff --git a/docs/figures/proxy-user.png b/docs/figures/proxy-user.png
new file mode 100644
index 0000000..e8e06b1
Binary files /dev/null and b/docs/figures/proxy-user.png differ
docs/figures/scheduleflowoptions.png 0(+0 -0)
diff --git a/docs/figures/scheduleflowoptions.png b/docs/figures/scheduleflowoptions.png
new file mode 100644
index 0000000..f8d1a3c
Binary files /dev/null and b/docs/figures/scheduleflowoptions.png differ
docs/figures/schedulepage.png 0(+0 -0)
diff --git a/docs/figures/schedulepage.png b/docs/figures/schedulepage.png
new file mode 100644
index 0000000..ff638dd
Binary files /dev/null and b/docs/figures/schedulepage.png differ
diff --git a/docs/figures/schedulepage_Deprecated.png b/docs/figures/schedulepage_Deprecated.png
new file mode 100644
index 0000000..5ae410e
Binary files /dev/null and b/docs/figures/schedulepage_Deprecated.png differ
docs/figures/slapanel.png 0(+0 -0)
diff --git a/docs/figures/slapanel.png b/docs/figures/slapanel.png
new file mode 100644
index 0000000..57b36af
Binary files /dev/null and b/docs/figures/slapanel.png differ
docs/figures/Success.png 0(+0 -0)
diff --git a/docs/figures/Success.png b/docs/figures/Success.png
new file mode 100644
index 0000000..169ad8a
Binary files /dev/null and b/docs/figures/Success.png differ
docs/figures/TriggerExample.png 0(+0 -0)
diff --git a/docs/figures/TriggerExample.png b/docs/figures/TriggerExample.png
new file mode 100644
index 0000000..5a9b36a
Binary files /dev/null and b/docs/figures/TriggerExample.png differ
docs/figures/TriggerInfo.png 0(+0 -0)
diff --git a/docs/figures/TriggerInfo.png b/docs/figures/TriggerInfo.png
new file mode 100644
index 0000000..c756f07
Binary files /dev/null and b/docs/figures/TriggerInfo.png differ
docs/figures/TriggerList.png 0(+0 -0)
diff --git a/docs/figures/TriggerList.png b/docs/figures/TriggerList.png
new file mode 100644
index 0000000..53ba0d7
Binary files /dev/null and b/docs/figures/TriggerList.png differ
docs/figures/uploadprojects.png 0(+0 -0)
diff --git a/docs/figures/uploadprojects.png b/docs/figures/uploadprojects.png
new file mode 100644
index 0000000..982a08e
Binary files /dev/null and b/docs/figures/uploadprojects.png differ
docs/figures/userpermission.png 0(+0 -0)
diff --git a/docs/figures/userpermission.png b/docs/figures/userpermission.png
new file mode 100644
index 0000000..ad93038
Binary files /dev/null and b/docs/figures/userpermission.png differ
docs/index.rst 2(+2 -0)
diff --git a/docs/index.rst b/docs/index.rst
index 68b3068..e8512c0 100644
--- a/docs/index.rst
+++ b/docs/index.rst
@@ -34,6 +34,8 @@ Features
getStarted
configuration
userGuide
+ useAzkaban
+ eventTrigger
docs/useAzkaban.rst 435(+435 -0)
diff --git a/docs/useAzkaban.rst b/docs/useAzkaban.rst
new file mode 100644
index 0000000..18d0b50
--- /dev/null
+++ b/docs/useAzkaban.rst
@@ -0,0 +1,435 @@
+.. _UsingAzkaban:
+
+Using Azkaban
+=============
+
+This section covers how to use the Azkaban Web UI to create, view and
+execute your flows.
+
+
+
+
+Create Projects
+---------------
+
+After logging into Azkaban, you will see the Projects page. This page
+will show you a list of all the projects that you have read
+permissions on. Projects on which you have only group permissions, or
+permissions through a role with READ or ADMIN, will not appear.
+
+.. image:: figures/emptyprojectpage.png
+
+If you are just starting out, the project page may be empty. However,
+you can see all the existing projects by clicking on **All Projects**.
+
+Clicking on **Create Projects** will pop open a dialog box. Enter a
+unique project name and a description of the project. The description
+can be changed in the future, but the project name cannot be. If you
+don't see this button, it is likely that the ability to create new
+projects has been locked down except for users with the proper
+permissions.
+
+.. image:: figures/createproject.png
+
+After creating your project, an empty project page will appear. You
+will automatically be given ADMIN status for this project. Add and
+remove permissions by clicking on the `Permissions Button <#project-permissions>`_.
+
+
+.. image:: figures/newprojectpage.png
+
+If you have the proper permissions (which you should if you created
+the project), you can delete the project, update the description,
+upload files and view the project logs from this page.
+
+
+
+
+
+Upload Projects
+---------------
+
+Click on the **Upload** button. You will see the following dialog.
+
+.. image:: figures/uploadprojects.png
+
+Select the archive file of your workflow files that you want to
+upload. Currently Azkaban only supports ``*.zip`` files. The zip should
+contain the ``*.job`` files and any files needed to run your jobs. Job
+names must be unique in a project.
+
+Azkaban will validate the contents of the zip to make sure that
+dependencies are met and that there's no cyclical dependencies
+detected. If it finds any invalid flows, the upload will fail.
+
+Uploads overwrite all files in the project. Any changes made to jobs
+will be wiped out after a new zip file is uploaded.
+
+After a successful upload, you should see all of your flows listed on
+the screen.
+
+
+
+
+
+
+Flow View
+---------
+
+By clicking on the flow link, you can go to the Flow View page. From
+here, you'll be shown a graph representation of the flow. The left
+panel contains a list of jobs in your flow.
+
+Right clicking on either the jobs in the right panel or the nodes in
+the graph will allow you to open individual jobs. You are also able to
+Schedule and Execute Flows from this page.
+
+.. image:: figures/flowview.png
+
+Clicking on the Executions tab will show you all the previous executions
+of this flow.
+
+.. image:: figures/flowviewexecutions.png
+
+
+
+
+
+Project Permissions
+-------------------
+
+When a project is created, the creator is automatically given an ADMIN
+status on the project. This allows the creator to view, upload, change
+jobs, run flows, delete and add user permissions to the project. An
+admin can remove other admins, but cannot remove themselves. This
+prevents projects from being admin-less except when admins are deleted
+by a user with an admin role.
+
+The permission page is accessible from the project page. On the
+permissions page, admins can add other users, groups or proxy users to
+the project.
+
+.. image:: figures/permission.png
+
++ Adding user permissions gives those users those specified
+ permissions on the project. Remove user permissions by unchecking all
+ of the permissions.
++ Group permissions allow everyone in a particular group the specified
+ permissions. Remove group permissions by unchecking all the group
+ permissions.
++ If proxy users are turned on, proxy users allow the project
+  workflows to run as those users. This is useful for locking down which
+  headless accounts jobs can proxy to. They are removed by clicking on
+  the 'Remove' button once added.
+
+
+Every user is validated through the UserManager to prevent invalid
+users from being added. Groups and proxy users are also checked to make
+sure they are valid and to see if the admin is allowed to add them to
+the project.
+
+The following permissions can be set for users and groups:
+
+
+
++------------+-------------------------------------------------------------+
+| Permission | Description |
++============+=============================================================+
+| ADMIN | Allows the user to do anything with this project, as well |
+| | as add permissions and delete the project. |
++------------+-------------------------------------------------------------+
+| READ | The user can view the job, the flows, the execution logs. |
+| | |
++------------+-------------------------------------------------------------+
+| WRITE | Project files can be uploaded, and the job files can |
+| | be modified. |
++------------+-------------------------------------------------------------+
+| EXECUTE | The user is allowed to execute, pause, cancel jobs. |
+| | |
++------------+-------------------------------------------------------------+
+| SCHEDULE | The user is allowed to add, modify and remove a flow from |
+| | the schedule. |
++------------+-------------------------------------------------------------+
+
+
+Executing Flow
+--------------
+
+From the `Flow View <#flow-view>`_ or the Project Page, you can trigger a job to be
+executed. You will see an executing panel pop-up.
+
+
+Executing Flow View
+~~~~~~~~~~~~~~~~~~~
+From the Flow View panel, you can right click on the graph and disable
+or enable jobs. Disabled jobs will be skipped during execution as if
+their dependencies have been met. Disabled jobs will appear
+translucent.
+
+.. image:: figures/executeflowpanel.png
+
+
+Notification Options
+~~~~~~~~~~~~~~~~~~~~
+
+The notification options allow users to change the flow's success or
+failure notification behavior.
+
+
+
+Notify on Failure
+``````````````````
+
+
++ First Failure - Send failure emails after the first failure is
+ detected.
++ Flow Finished - If the flow has a job that has failed, it will send
+ failure emails after all jobs in the flow have finished.
+
+
+
+
+Email overrides
+````````````````
+
+Azkaban will use the default notification emails set in the final job
+in the flow. If overridden, a user can change the email addresses
+where failure or success emails are sent. The list can be delimited by
+commas, whitespace or a semi-colon.
+
+.. image:: figures/executeflownotify.png
+
+
+Failure Options
+~~~~~~~~~~~~~~~
+
+When a job in a flow fails, you are able to control how the rest of
+the flow will proceed.
+
+
++ **Finish Current Running** will finish the jobs that are currently
+ running, but it will not start new jobs. The flow will be put in the
+ `FAILED FINISHING` state and be set to FAILED once everything
+ completes.
++ **Cancel All** will immediately kill all running jobs and set the state
+ of the executing flow to FAILED.
++ **Finish All Possible** will keep executing jobs in the flow as long as
+ its dependencies are met. The flow will be put in the ``FAILED
+ FINISHING`` state and be set to FAILED once everything completes.
+
+.. image:: figures/executeflowfailure.png
+
+
+Concurrent Options
+~~~~~~~~~~~~~~~~~~
+
+If the flow execution is invoked while the flow is concurrently
+executing, several options can be set.
+
+
++ **Skip Execution** option will not run the flow if it's already running.
++ **Run Concurrently** option will run the flow regardless of whether it's
+  already running. Executions are given different working directories.
++ **Pipeline** runs the flow in such a manner that the new execution will
+  not overrun the concurrent execution.
+
+  + Level 1: blocks executing **job A** until the previous flow's **job A**
+    has completed.
+  + Level 2: blocks executing **job A** until the children of the
+    previous flow's **job A** have completed. This is useful if you need to
+    run your flows a few steps behind an already executing flow.
+
+
+.. image:: figures/executeflowconcurrent.png
+
+Flow Parameters
+~~~~~~~~~~~~~~~
+
+Allows users to override flow parameters. The flow parameters override
+the global properties for a job, but not the properties of the job
+itself.
+
+.. image:: figures/executeflowparameters.png
+
+
+
+
+
+Executions
+----------
+
+
+
+Flow Execution Page
+~~~~~~~~~~~~~~~~~~~
+
+
+After `executing a flow <#executing-flow>`_ you will be presented the Executing Flow page.
+Alternatively, you can access these flows from the Flow View page
+under the Executions tab, the History page, or the Executing page.
+
+This page is similar to the Flow View page, except it shows status of
+running jobs.
+
+.. image:: figures/executingflowpage.png
+
+Selecting the Job List will give a timeline of job executions. You can
+access the jobs and job logs directly from this list.
+
+.. image:: figures/executingflowpagejobslist.png
+
+This page will auto update as long as the execution is not finished.
+
+Some options that you are able to do on execution flows include the
+following:
+
+
++ Cancel - kills all running jobs and fails the flow immediately. The
+ flow state will be KILLED.
++ Pause - prevents new jobs from running. Currently running jobs
+ proceed as usual.
++ Resume - resume a paused execution.
++ Retry Failed - only available when the flow is in a FAILED FINISHING
+ state. Retry will restart all FAILED jobs while the flow is still
+ active. Attempts will appear in the Jobs List page.
++ Prepare Execution - only available on a finished flow, regardless of
+ success or failures. This will auto disable successfully completed
+ jobs.
+
+
+
+
+Executing Page
+~~~~~~~~~~~~~~
+
+Clicking on the Executing Tab in the header will show the Execution
+page. This page will show currently running executions as well as
+recently finished flows.
+
+.. image:: figures/executingflowspage.png
+
+
+History Page
+~~~~~~~~~~~~
+
+Currently executing flows as well as completed executions will appear
+in the History page. Searching options are provided to find the
+execution you're looking for. Alternatively, you can view previous
+executions for a flow on the Flow View execution tab.
+
+.. image:: figures/historypage.png
+
+
+
+
+
+Schedule Flow
+-------------
+
+From the same panel that is used to `execute flow <#executing-flow>`_, flows can be
+scheduled by clicking on the *Schedule* button.
+
+.. image:: figures/flexible-scheduling.png
+
+Any flow options set will be preserved for the scheduled flow. For
+instance, if jobs are disabled, then the scheduled flow's jobs will
+also be disabled.
+
+With the new flexible scheduling feature in Azkaban 3.3, users are able
+to define a cron schedule following `Quartz syntax <http://www.quartz-scheduler.org/documentation/quartz-2.x/tutorials/crontrigger.html>`_. One important
+difference from Quartz or cron is that Azkaban functions at minute
+granularity at most. Therefore, the seconds field in the UI is labeled
+with a static "0". The `Flexible Schedule Wiki <https://github.com/azkaban/azkaban/wiki/New-Azkaban-Schedule-Introduction>`_ explains in detail how
+to use it.
+
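
As an illustration, a Quartz-style cron expression that fires every two minutes would look like this (the leading seconds field stays at "0", matching Azkaban's minute granularity):

```text
# Quartz field order: seconds minutes hours day-of-month month day-of-week
# Fires at second 0 of every 2nd minute, every day
0 0/2 * * * ?
```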
+After scheduling, it should appear on the schedule page, where you can
+remove the scheduled job or set the SLA options.
+
+.. image:: figures/schedulepage.png
+
+
+SLA
+~~~
+
+To add SLA notification or pre-emption, click on the SLA button. From
+here you can set the SLA alert emails. Rules can be added and applied
+to individual jobs or the flow itself. If the duration threshold is
+exceeded, then an alert email can be sent, or the flow or job can be
+auto-killed. If a job is killed due to missing the SLA, it will be
+retried based on the retry configuration of that job.
+
+.. image:: figures/slapanel.png
+
+
+
+
+
+Job Page
+--------
+
+Jobs make up individual tasks of a flow. To get to the jobs page, you
+can right click on a job in the Flow View, the Executing Flow view or
+the Project Page.
+
+.. image:: figures/jobpage.png
+
+From this page you can see the dependencies and dependents for a job
+as well as the global properties that the job will use.
+
+
+
+Job Edit
+~~~~~~~~
+
+Clicking on Job Edit will allow you to edit all the job properties
+except for certain reserved parameters, such as ``type``, and
+``dependencies``. The changes to the parameters will affect an executing
+flow only if the job hasn't started running yet. These overrides of
+job properties will be wiped out by the next project upload.
+
+.. image:: figures/jobedit.png
+
+
+Job History
+~~~~~~~~~~~
+
+
+Any retries of a job will show up as ``executionid.attempt`` numbers.
+
+.. image:: figures/jobhistorypage.png
+
+
+
+
+Job Details
+-----------
+
+From an execution page, after clicking "Job List" and then "Details"
+for one of the jobs, you will arrive at the job details page. This
+page contains tabs for the "Job Logs" and a "Summary".
+
+
+
+Job Logs
+~~~~~~~~
+
+The job logs are stored in the database. They contain all the stdout
+and stderr output of the job.
+
+.. image:: figures/joblogs.png
+
+
+Job Summary
+~~~~~~~~~~~
+The Job Summary tab contains a summary of the information in the job
+logs. This includes:
+
++ **Job Type** - the jobtype of the job
++ **Command Summary** - the command that launched the job process, with
+ fields such as the classpath and memory settings shown separately as
+ well
++ **Pig/Hive Job Summary** - custom stats specific to Pig and Hive jobs
++ **Map Reduce Jobs** - a list of job ids of Map-Reduce jobs that were
+ launched, linked to their job tracker pages
+
+.. image:: figures/jobsummary.png