osmo-ci/scripts/obs/lib/git.py

scripts/obs: rewrite pushing source pkgs to OBS

Harald requested that the OBS scripts should not stop if building one specific source package fails; instead they should keep going and report a non-success exit code at the end. Given that the shell script code has historically grown and become hard to maintain, I decided to rewrite the scripts to implement this feature.

This rewrite solves additional problems:

* No full checkout of an OBS project like network:osmocom:latest anymore, with lots of packages that won't get updated (e.g. the uhd package has a uhd-images_3.14.1.1.tar.xz file that is 108 MB). With the old code, developers had to wait minutes during the checkout before the script reached the code currently being developed. Now single packages get checked out only right before they get updated.
* No need to clone git repositories over and over. With the new code, git repos only get cloned if needed (for latest it is not needed if the remote git tag is the same as the version in OBS). During development, the cloned git repositories are cached.
* Output from commands like "git tag -l" is not written to the log unless they failed. This makes the log more readable, which is especially important when a package fails to build: we keep going and need to spot the build error in the middle of the log later on.
* No more duplicated code for the nightly and latest scripts, which worked similarly but had slight differences. The list of packages is also no longer duplicated between nightly and latest; nightly uses all packages and latest uses the packages that have at least one git tag.
* Building source packages is decoupled from uploading them. A separate script build_srcpkg.py can be used to just build the deb + rpm spec source packages, without interacting with the OBS server.
* The scripts can optionally run in docker with a command-line switch, and this is used by jenkins. This way we don't need to install more dependencies on the host, such as rebar3 which is now needed for erlang/osmo_dia2gsup.
* Add erlang/osmo_dia2gsup and run its generate_build_dep.sh (SYS#6006)

I have done the new implementation in python to make use of argparse and to be able to use try/except and print a trace when building one package fails.

Example output:
* https://jenkins.osmocom.org/jenkins/job/Osmocom_OBS_nightly_obs.osmocom.org/48/console
* https://jenkins.osmocom.org/jenkins/job/Osmocom_OBS_latest_obs.osmocom.org/46/console

Change-Id: I45a555d05a9da808c0fe0145aae665f583cb80d9
2022-07-13 10:50:21 +00:00
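The "only clone if needed" point above works by asking the remote for its tag list instead of cloning, via `git ls-remote --tags`, and matching the tags against a per-project pattern (see `get_latest_tag_remote()` below). A minimal, self-contained sketch of that parsing step — the sample `ls-remote` output, hashes, and tag pattern here are invented for illustration:

```python
import re

# Fake "git ls-remote --tags --sort=-v:refname" output for the demo;
# annotated tags show up twice, once with a peeled "^{}" suffix.
SAMPLE_LS_REMOTE = """\
2f0c5b3a\trefs/tags/1.7.0
9a1d2e4b\trefs/tags/1.7.0^{}
5b6c7d8e\trefs/tags/1.6.2
1a2b3c4d\trefs/tags/debian/1.6.2-1
"""

def parse_latest_tag(ls_remote_output, pattern_str):
    pattern = re.compile(pattern_str)
    for line in ls_remote_output.split("\n"):
        # Tags are listed twice, skip the ones with ^{} at the end
        if "^{}" in line:
            continue
        if "refs/tags/" not in line:
            continue
        tag = line.rstrip().split("refs/tags/")[1]
        if pattern.match(tag):
            return tag
    # No match probably means the repository has no release tag yet
    return None

print(parse_latest_tag(SAMPLE_LS_REMOTE, r"^[0-9]+\.[0-9]+\.[0-9]+$"))  # 1.7.0
```

Because the output is already sorted newest-first by git, the first matching line is the latest release, so no local clone is required just to compare versions.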
#!/usr/bin/env python3
# SPDX-License-Identifier: GPL-2.0-or-later
# Copyright 2022 sysmocom - s.f.m.c. GmbH <info@sysmocom.de>
import os
import re
import lib.config

def get_repo_path(project):
    return f"{lib.config.path_cache}/{os.path.basename(project)}"


def get_repo_url(project):
    if project in lib.config.git_url_other:
        return lib.config.git_url_other[project]
    return f"{lib.config.git_url_default}/{project}"


def get_latest_tag_pattern(project):
    if project in lib.config.git_latest_tag_pattern_other:
        return lib.config.git_latest_tag_pattern_other[project]
    return lib.config.git_latest_tag_pattern_default


def clone(project, fetch=False):
    repo_path = get_repo_path(project)
    url = get_repo_url(project)

    if os.path.exists(repo_path):
        if fetch:
            print(f"{project}: cloning {url} (cached, fetching)")
            lib.run_cmd(["git", "fetch"], cwd=repo_path)
        else:
            print(f"{project}: cloning {url} (cached, not fetching)")
        return

    print(f"{project}: cloning {url}")
    os.makedirs(lib.config.path_cache, exist_ok=True)
    lib.run_cmd(["git", "clone", url, repo_path])
    lib.run_cmd(["git", "config", "user.name", "Osmocom OBS scripts"],
                cwd=repo_path)
    lib.run_cmd(["git", "config", "user.email", "info@osmocom.org"],
                cwd=repo_path)


def clean(project):
    repo_path = get_repo_path(project)
    lib.run_cmd(["git", "clean", "-ffxd"], cwd=repo_path)


def checkout(project, branch):
    repo_path = get_repo_path(project)
    print(f"{project}: checking out {branch}")
    lib.run_cmd(["git", "checkout", "-f", branch], cwd=repo_path)
    lib.run_cmd(["git", "reset", "--hard", branch], cwd=repo_path)


def checkout_default_branch(project):
    branch = lib.config.git_branch_default
    if project in lib.config.git_branch_other:
        branch = lib.config.git_branch_other[project]
    checkout(project, f"origin/{branch}")


def get_latest_tag(project):
    pattern_str = get_latest_tag_pattern(project)
    pattern = re.compile(pattern_str)
    repo_path = get_repo_path(project)

    git_tag_ret = lib.run_cmd(["git", "tag", "-l", "--sort=-v:refname"],
                              cwd=repo_path)
    for line in git_tag_ret.output.split('\n'):
        line = line.strip('\r')
        if pattern.match(line):
            return line

    lib.exit_error_cmd(git_tag_ret, f"couldn't find latest tag for {project},"
                       f" regex used on output: {pattern_str}")


def get_latest_tag_remote(project):
    pattern_str = get_latest_tag_pattern(project)
    pattern = re.compile(pattern_str)

    print(f"{project}: getting latest tag from git remote")
    ls_remote = lib.run_cmd(["git", "ls-remote", "--tags", "--sort=-v:refname",
                             get_repo_url(project)])
    for line in ls_remote.output.split('\n'):
        # Tags are listed twice, skip the ones with ^{} at the end
        if "^{}" in line:
            continue
        if "refs/tags/" not in line:
            continue
        line = line.rstrip().split("refs/tags/")[1]
        if pattern.match(line):
            return line

    # No tag found probably means the repository was just created and doesn't
    # have a release tag yet
    return None


def checkout_latest_tag(project):
    checkout(project, get_latest_tag(project))