GitHub Actions help you automate tasks within your software development life cycle. GitHub says:
"GitHub Actions are event-driven, meaning that you can run a series of commands after a specified event has occurred. For example, every time someone creates a pull request for a repository, you can automatically run a command that executes a software testing script."
They are a powerful system for automating many software development processes, and open source projects get unlimited compute hours for free! A sequence of steps in GitHub Actions is combined into a "workflow", and a workflow can contain multiple "jobs", which run on multiple machines, with (if needed) multiple operating systems, in parallel. For more information, read GitHub's Introduction to GitHub Actions.
However, workflows can be difficult to create and debug. With ghapi, you can write your workflows in Python and shell, and can do nearly all your development on your local machine. We'll start by importing from the library, along with fastcore:
from ghapi.all import *
from fastcore.utils import *
We're going to help you get started by building a simple workflow which will add a comment to all new pull requests, saying "thank you" to the contributor.
GhApi makes developing workflows easier, because your actual YAML file is created for you entirely automatically. To create the new workflow, run the create_workflow function, like so:
create_workflow(name='thankyou', event=Event.pull_request)
Now it's time to create our Python script. You'll find it in .github/scripts, named based on your create_workflow parameters. The initial skeleton just contains import statements.
It will be much easier for us to iterate on our script if we have a sample context like the one the GitHub workflow runner will provide to us. We can get one by calling example_payload:
example = example_payload(Event.pull_request)
list(example)
['action', 'number', 'pull_request', 'repository', 'sender']
For information about all the fields in a payload, use the GitHub webhook payload documentation. For instance, the docs tell us that the "action" field can be:
"opened, edited, closed, assigned, unassigned, review_requested, review_request_removed, ready_for_review, labeled, unlabeled, synchronize, locked, unlocked, or reopened"
Let's see what it contains for our example payload:
example.action
'opened'
For our script, we should ensure that it's only run for the opened action. Next up, we need to find out how to add a comment to a pull request. First, we'll need to create our GhApi object. On your own PC, your GitHub token should be in the GITHUB_TOKEN environment variable, whereas when run in a workflow it will be part of the github context. The github_token function handles this for you automatically, so we can say:
api = GhApi(owner='fastai', repo='ghapi-test', token=github_token())
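As a rough sketch of the local half of that behaviour (this is an assumption about what github_token does when run outside a workflow, not its actual source), reading the token from the environment looks like:

```python
import os

def token_from_env():
    # Local fallback only (a hypothetical helper): in a real workflow run,
    # github_token also knows how to pull the token from the Actions context.
    return os.environ.get('GITHUB_TOKEN')
```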
One way to find the correct operation to call is to search the full API reference. Operations are generally named as {verb}_{object}, so search for create_comment. Alternatively, you can jump to the section that you expect to contain your required operation -- in this case, it's important to know that GitHub considers a "pull request" to be a special kind of "issue". After some looking through the page, we found this operation:
api.issues.create_comment
issues.create_comment(issue_number, body): Create an issue comment
The hyperlink provided will take you to the GitHub docs for this operation, so take a look at that now. We need to provide an issue_number and the body of the comment. We can look inside the payload to find the issue number we need to use:
', '.join(example.pull_request)
'url, id, node_id, html_url, diff_url, patch_url, issue_url, number, state, locked, title, user, body, created_at, updated_at, closed_at, merged_at, merge_commit_sha, assignee, assignees, requested_reviewers, requested_teams, labels, milestone, commits_url, review_comments_url, review_comment_url, comments_url, statuses_url, head, base, _links, author_association, draft, merged, mergeable, rebaseable, mergeable_state, merged_by, comments, review_comments, maintainer_can_modify, commits, additions, deletions, changed_files'
Since the docs call the parameter issue_number, and there is a number field here, that's probably what we should use.
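As a quick plain-dict sketch of that lookup (using a hypothetical pared-down payload, since the real one is a fastcore AttrDict with many more fields):

```python
# A hypothetical, minimal pull_request payload; real ones carry dozens of fields.
payload = {'action': 'opened', 'number': 1, 'pull_request': {'number': 1}}

# The top-level 'number' and pull_request['number'] hold the same value,
# so either works as the issue_number for create_comment.
issue_number = payload['number']
```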
To test this out, we'll first create a PR or issue, and then create our comment:
com = api.issues.create_comment(issue_number=1, body='Thank you for your *valuable* contribution')
We can view the comment by visiting its URL:
com.url
'https://api.github.com/repos/fastai/ghapi-test/issues/comments/737393228'
...and then delete it since we were just testing:
com = api.issues.delete_comment(com.id)
The payload will be available in the github context, which is provided to you automatically through the context_github variable. When running locally, an example github context will be used instead. (This example github context will not, in general, have the same payload as the event you're using, so use example_payload for that.)
The event key contains the actual payload:
list(context_github.event)
['inputs', 'organization', 'ref', 'repository', 'sender', 'workflow']
The workflow that create_workflow created can also be triggered using workflow dispatch, which is more convenient for testing. In this case, our payload will not have the same information, so we should write our function in a way that can handle workflow dispatch as well.
dispatch = example_payload(Event.workflow_dispatch)
We can check what type of trigger we're responding to, by checking the keys of our payload:
'workflow' in dispatch
True
In that case, we'll use a fixed issue number, instead of using the actual payload number.
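That key check can be wrapped in a tiny helper if you find yourself repeating it (a sketch of our own; ghapi doesn't ship this function):

```python
def trigger_kind(payload):
    # workflow_dispatch payloads carry a 'workflow' key; pull_request payloads don't.
    return 'workflow_dispatch' if 'workflow' in payload else 'pull_request'
```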
We can now write our function.
def reply_thanks():
    api = GhApi(owner='fastai', repo='ghapi', token=github_token())
    payload = context_github.event
    if 'workflow' in payload: issue = 1
    else:
        if payload.action != 'opened': return
        issue = payload.number
    api.issues.create_comment(issue_number=issue, body='Thank you for your *valuable* contribution')
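Before committing, you can check the branching logic locally by factoring it into a pure function (a hypothetical refactor of the logic above, using plain dicts so no API calls are needed):

```python
def issue_to_thank(payload, fallback=1):
    # Mirrors reply_thanks: a fixed issue number for workflow_dispatch runs,
    # the payload's number for newly-opened PRs, and None (do nothing) otherwise.
    if 'workflow' in payload: return fallback
    if payload.get('action') != 'opened': return None
    return payload.get('number')
```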
Finally, copy this to the script in .github/scripts, along with a line to run it (reply_thanks()), and commit it to GitHub.
You can now test the workflow by going to GitHub, clicking the "Actions" tab, then clicking the workflow you just created, and then clicking the "Run workflow" button.
Alternatively, you can run it using GhApi, which we'll do now:
wf = api.actions.get_workflow('thankyou-pull_request.yml')
api.actions.create_workflow_dispatch(wf.id, ref='master')
To find out if the dispatch succeeded, go to the Actions tab on your repo and look for a yellow circle, which means it's running, or a green tick, which means it succeeded. If you get a red cross, it failed - you can click the item for details. You can also get the results of the latest run through the API (but be sure to wait at least 30 seconds for the workflow to kick off):
last_run = api.actions.list_workflow_runs(wf.id).workflow_runs[0]
last_run.conclusion
'success'
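If you'd rather poll from code than refresh the Actions tab, a small retry loop works; this is a sketch where fetch_latest stands in for any callable returning a run-like mapping (for instance, a lambda wrapping the list_workflow_runs call above):

```python
import time

def wait_for_conclusion(fetch_latest, timeout=120, interval=10):
    # Poll until the run reports a conclusion, or give up after `timeout` seconds.
    deadline = time.time() + timeout
    while time.time() < deadline:
        run = fetch_latest()
        conclusion = run.get('conclusion')
        if conclusion: return conclusion
        time.sleep(interval)
    return None
```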
If it doesn't work, you can add print(payload) after the first line, and when GitHub runs your workflow you'll see the whole payload in your "Actions" panel.
Once it's working OK using the manual workflow dispatch, try creating a pull request.
Sometimes, you need to join more than one job in a workflow. This isn't common, but one example where it's absolutely needed is when you want to build software or run tests on multiple platforms, and then have some step that runs before or after all of them. For instance, building and releasing software that needs to be built on each platform requires one job to create a release, and then a separate job to do the builds and add artifacts to the release, since GitHub Actions runs all the steps in a job on a single platform. Furthermore, we need to set up dependencies between jobs using needs. ghapi provides helpers to set up multi-job workflows with dependencies for you, which we'll show an example of here.
In this tutorial we're going to show how we created hugo-mathjax. The workflow in this project patches Hugo to add Mathjax support, then builds Mac, Ubuntu, and Windows versions of the software, creating a release which includes each of those builds as artifacts.
First, we'll create the workflow and script files:
create_workflow('create', Event.schedule, contexts='needs', opersys='macos, ubuntu, windows', prebuild=True)
This has some pieces we haven't seen before, so let's take a look at them now.
The prebuild flag tells ghapi to include a prebuild job (see create_workflow for the full workflow created). After you run the above, take a look at the YAML that was created, to see how it all works. We're going to need to modify the YAML to add the schedule we want to run it on, using a cron: line. You can see the final YAML file here.
The prebuild job calls a prebuild script using an Ubuntu runner. We'll create the prebuild script now.
The script needs to get the current Hugo release tag:
tag = GhApi().repos.get_latest_release('gohugoio', repo='hugo').name
tag
'v0.79.0'
...and check whether we have released that patched version:
api = GhApi(owner='fastai', repo='hugo-mathjax', token=github_token())
exists = True
try: api.repos.get_release_by_tag(tag)
except HTTP404NotFoundError: exists = False
exists
True
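That try/except pattern can be wrapped into a reusable check (a sketch of our own; here get_release stands in for api.repos.get_release_by_tag):

```python
def release_exists(get_release, tag):
    # Treat a failed lookup as "no release yet"; any result means it exists.
    # Real code should catch HTTP404NotFoundError specifically rather than
    # the broad Exception used in this sketch.
    try:
        get_release(tag)
        return True
    except Exception:
        return False
```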
If the release already exists, we'll just exit; otherwise, we'll create a new release:
rel = api.repos.create_release(tag, name=tag)
The build job will run in a separate runner, so we need to create an output telling it the tag of the release we've created. The actions_output function does that, printing a special format that GitHub Actions recognizes:
actions_output('tag', tag)
::set-output name=tag::v0.79.0
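The format is simple enough to reproduce by hand if you're curious (a sketch of what actions_output prints, not its actual source; note that GitHub Actions parses lines of this shape from a step's stdout to populate that step's outputs):

```python
def format_output(name, value):
    # Build the ::set-output line that GitHub Actions scans for on stdout.
    return f"::set-output name={name}::{value}"
```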
You can see the completed prebuild script here.
Now we can move on to creating the build script. First, we can get the outputs of the prebuild job as a dict by using context_needs, which contains information from any jobs in the needs section (which ghapi will automatically set up for you if you enable prebuild):
out = loads(nested_idx(context_needs, 'prebuild', 'outputs', 'out'))
To get the tag output of the job, we index into the dict:
tag = nested_idx(out, 'step1', 'outputs', 'tag')
tag
'v0.79.0'
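nested_idx comes from fastcore; a minimal stand-in for how it walks nested mappings might look like this (a sketch, not fastcore's implementation):

```python
def nested_get(d, *keys):
    # Walk down nested dicts, returning None as soon as a key is missing.
    for k in keys:
        if not isinstance(d, dict) or k not in d: return None
        d = d[k]
    return d
```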
We get the release details using the same approach as before, then download, untar, patch, and build it. We won't discuss those steps here since they're not specific to ghapi.
Once the software is built, we need to know the name of the created file, which depends on the operating system we're running on. We can use the following approach to check:
platform = dict(linux='linux', linux2='linux', win32='win', darwin='mac')[sys.platform]
ext_nm = 'hugo.exe' if platform=='win' else 'hugo'
ext_nm
'hugo'
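Wrapped as a function for reuse (same mapping as above; the helper name is ours):

```python
import sys

def built_binary(plat=None):
    # Map sys.platform to a short platform name; Windows builds get .exe.
    plat = plat or dict(linux='linux', linux2='linux', win32='win', darwin='mac').get(sys.platform)
    return 'hugo.exe' if plat == 'win' else 'hugo'
```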
Finally, we can upload our file as an artifact:
api = GhApi(owner='fastai', repo='hugo-mathjax', token=github_token())
rel = api.repos.get_release_by_tag(tag)
api.upload_file(rel, fn)
You can see the complete script here.
We've now successfully built a workflow that automatically patches, builds, and releases a mixed C/Go project, across multiple platforms, in parallel!