Developer Operations (DevOps) aims to shorten the distance between you as a developer and operations (those who distribute or run your software). Its practices save you and your target audience a lot of installation and usage headaches by continuously making sure the software works as intended, is installable, and tells you loudly if you accidentally break something.
It is totally fine to have a GitHub repository as a mirror of a project and handle all external pull requests and issues from there. Simply add another remote using git and remember to keep them synchronized.
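As a runnable sketch of the mirror setup, the commands below use a local bare repository to stand in for the GitHub mirror, so they work without network access; in practice you would use your real GitHub URL (e.g. `git@github.com:<user>/<project>.git`) instead of the placeholder path.

```shell
set -e
workdir=$(mktemp -d)

# a local bare repository stands in for the GitHub mirror in this sketch
git init --bare "$workdir/github-mirror.git"

git init "$workdir/project"
cd "$workdir/project"
git config user.email dev@example.com
git config user.name Dev
git commit --allow-empty -m "chore: initial commit"

# add the mirror as a second remote and keep it synchronized
git remote add github "$workdir/github-mirror.git"
git push github HEAD:main

# fetch regularly to pull in anything merged on the mirror side
git fetch github
```

A periodic `git push github` / `git fetch github` (or GitLab's built-in repository mirroring, if enabled) keeps the two in sync.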
Integration with GitLab continuous integration and deployment (CI/CD). Rules enforced by automation are more efficient than relying on developers to remember chores and manually walk through checklists.
Instead of repeating everything out of context, see the comments in the examples below.
Examples for other languages will follow. The tier of gitlab used by IRF does not support creating your own templates, so copy/paste them from here.
To deploy static websites to GitLab Pages, all you need is HTML in a
public directory and to notify GitLab of it. In your
.gitlab-ci.yml, add:
pages:
  # if you run a different default python image, you can omit the next line
  image: python:3.8.10
  # also add the stage "deploy" to the list of stages
  stage: deploy
  script:
    # There needs to be a script section even if it doesn't do anything
    # Any other script that generates static html should output to ./public/
    - echo "deploying pages"
  artifacts:
    paths:
      # sphinx has output the files to the public directory,
      # but the gitlab runner also needs to know where to get them
      - public

Let the GitLab CI/CD pipeline complete, then go to your project
Settings > Pages to activate Pages. By default, pages from repository
https://gitlab.irf.se/<user|group>/<project-slug>
are deployed to
https://<user|group>.developer.irf.se/<project-slug>.
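If the HTML is generated by Sphinx rather than committed to the repository, the pages job can build it in its script section. The following is a sketch, assuming your Sphinx sources live in docs/ and your documentation dependencies are listed in docs/requirements.txt (adjust both paths to your project):

```yaml
pages:
  image: python:3.8.10
  stage: deploy
  script:
    # install sphinx and friends, then build straight into ./public/
    - pip install -r docs/requirements.txt
    - sphinx-build -b html docs/ public/
  artifacts:
    paths:
      - public
```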
By following the Conventional Commits guidelines, with the de-facto standard Angular convention, we can use tooling to auto-generate a changelog based on commit messages. The tooling differs depending on source language, to keep build images small and simple.
The automated changelog is not meant to be final, but a tool to
collate everything that has happened since the last release, much like
git merge --squash helps you create a commit message. With
some manual editing of the resulting CHANGELOG.md, the
point is to make the changes readable for an audience that doesn't know
all the context.
The ci/cd pipeline picks the lines of the changelog for the current release and prints them in the associated release notes on gitlab.
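For illustration, Angular-style commit messages look like this (the scopes and descriptions below are made up):

```text
feat(orbit): add support for hyperbolic trajectories
fix(io): handle missing timestamps gracefully
docs(readme): clarify installation instructions

feat(api): rename the epoch argument

BREAKING CHANGE: epoch replaces the old t0 argument
```

With semantic-versioning tooling, fix commits typically map to a patch bump, feat to a minor bump, and a BREAKING CHANGE footer to a major bump.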
Radon is a package on PyPI that computes metrics from source code.
These metrics are very good to have as a quantitative measure of where e.g. refactoring efforts should be directed. To set up these metrics with GitLab CI you can use the job below as a template.
diagnostics:
  stage: test
  script:
    - radon cc -a pyorb/ > radon_cc_report
    - radon raw -s pyorb/ > radon_raw_report
    - radon mi -s pyorb/ > radon_mi_report
    - radon hal pyorb/ > radon_hal_report
    # note: coverage can help you find areas not covered by tests. However,
    # the point is not to reach 100% but to have good quality tests where they
    # really matter
    - coverage run -m pytest > pytest_report || true
    - coverage report -m > coverage_report
    - python -m flake8 --extend-exclude venv pyorb/ > flake8_report || true
  allow_failure: true
  artifacts:
    paths:
      - "*_report"

The results will be available as artifacts in GitLab. These artifacts can then be published alongside your documentation in e.g. Sphinx by literal includes:
diagnostics
===========
PyTest
-------
.. include:: ../../pytest_report
:literal:
Coverage
---------
.. include:: ../../coverage_report
:literal:
Cyclomatic Complexity
-----------------------
.. include:: ../../radon_cc_report
:literal:
... etc.

Suggested directory structure, based on good integration practices by pytest and the ideas outlined in "test as installed" by Paul Ganssle.
├── src
│ └── irfkp2017plot
│ ├── __init__.py
│ └── irfkp2017plot.py
├── tests
│ ├── conftest.py
│ └── test_canary.py
├── .gitlab-ci.yml
├── pyproject.toml
├── pytest.ini
├── pyenv.cfg
├── setup.cfg
└── setup.py
Your automated tests should test your package as it would be
installed on somebody else’s machine, to avoid “works on my machine”
issues. The above directory structure helps achieve this, as the library
is not included in the python path by default. This means that the
software must be installed with pip install -e . in your
virtual environment before any tests can run, testing that the
installation procedure works at the same time. The same install commands
are run in the ci/cd pipelines along with the unit tests, to prevent any
accidental broken releases.
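In GitLab CI this might look like the sketch below; the job name, image tag and stage are assumptions, so adjust them to your project:

```yaml
test:
  image: python:3.8.10
  stage: test
  script:
    # install the package exactly as a user would, then run the unit tests
    - pip install -e .
    - python -m pytest
```

Because `pip install -e .` runs first, a broken setup.cfg or missing dependency fails the pipeline before a release can go out.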
Intentionally left empty, similar to __init__.py. Tells
the test frameworks this is where to look for tests.
A “canary” test is just there to verify the test environment itself works and is removed when the first actual test passes.
# this will show import errors
# * if the package is not installed properly in your virtual environment
# * if the package or modules refer to each other in a way not reproducible when installed
from irfkp2017plot import compute_lead_time


def test_canary():
    compute_lead_time(1, 2)
    assert 1 != 2

For your build requirements:
[build-system]
requires = ["setuptools", "wheel"]
build-backend = "setuptools.build_meta"

Default options and paths for pytest, to reduce typing.
[pytest]
addopts = -v
testpaths = tests

Stay away from making this file complicated. Use setup.cfg instead.
import setuptools

setuptools.setup()

[metadata]
# whatever you need to tell the world about the project
[options]
# these are crucial to make the package installable from the src dir
packages = find:
package_dir =
    =src
install_requires =
pandas >= 1.3.4
numpy >= 1.21.3
# example of an inter-project dependency that uses git refs (tags, branches, hashes) for versioning.
irf-timeseries @ git+ssh://git@gitlab.irf.se/spaceweather/irf-timeseries-package@v1.1.0
# and so on
# also important for the setuptools to find the right dir
[options.packages.find]
where = src

See the Python packaging guidelines on using __init__.py within your projects. From the devops perspective, they help keep your package easy to use and install, but also to refactor, as you will break your users' import statements less frequently when you want to rearrange the subpackage structures.
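The re-export pattern can be demonstrated end to end. The runnable sketch below builds a tiny src-layout package on disk (reusing the irfkp2017plot names from the example above; the compute_lead_time body is made up for illustration) and shows that users import from the package root, never from the internal module:

```python
import sys
import tempfile
from pathlib import Path

pkg_root = Path(tempfile.mkdtemp())
pkg = pkg_root / "irfkp2017plot"
pkg.mkdir()

# the actual implementation lives in an internal module...
(pkg / "irfkp2017plot.py").write_text(
    "def compute_lead_time(start, end):\n    return end - start\n"
)

# ...but __init__.py re-exports it at the package root, so users write
# `from irfkp2017plot import compute_lead_time` and never depend on the
# internal file name. Renaming the internal module later breaks nobody.
(pkg / "__init__.py").write_text(
    "from .irfkp2017plot import compute_lead_time\n"
    '__all__ = ["compute_lead_time"]\n'
)

sys.path.insert(0, str(pkg_root))
from irfkp2017plot import compute_lead_time

print(compute_lead_time(2, 5))  # 3
```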