Use safe get for large number of builds

For large projects (mine has 166 GB of build artifacts), listing all builds can fail with the following message:

# time ./gitlab-artifact-cleanup --project mygroup/myproject --min-age "6 months" --gitlab gitlab --dry-run
Would delete non-tagged artifacts older than 180 days, 0:00:00

mygroup / myproject
Traceback (most recent call last):
  File "./gitlab-artifact-cleanup", line 214, in <module>
    main()
  File "./gitlab-artifact-cleanup", line 206, in main
    cleanup.cleanup_project(proj)
  File "./gitlab-artifact-cleanup", line 45, in cleanup_project
    for build in proj.builds.list(all=True):
  File "/tmp/gitlab-artifact-cleanup/local/lib/python2.7/site-packages/gitlab/objects.py", line 122, in list
    return self.obj_cls.list(self.gitlab, **args)
  File "/tmp/gitlab-artifact-cleanup/local/lib/python2.7/site-packages/gitlab/objects.py", line 258, in list
    return gl.list(cls, **kwargs)
  File "/tmp/gitlab-artifact-cleanup/local/lib/python2.7/site-packages/gitlab/__init__.py", line 407, in list
    return self._raw_list(url, obj_class, **kwargs)
  File "/tmp/gitlab-artifact-cleanup/local/lib/python2.7/site-packages/gitlab/__init__.py", line 351, in _raw_list
    raise e
gitlab.exceptions.GitlabConnectionError: Can't connect to GitLab server (maximum recursion depth exceeded in cmp)

real    0m40.867s
user    0m1.160s
sys     0m0.052s

I didn't need this change for a smaller project (27 GB, for example).

The python-gitlab documentation on pagination (http://python-gitlab.readthedocs.io/en/stable/api-usage.html#pagination) mentions this case and suggests using safe_all instead of all.
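Concretely, the change in cleanup_project would be to call proj.builds.list(safe_all=True) instead of proj.builds.list(all=True). The toy sketch below (hypothetical helper names, not python-gitlab code) illustrates why the recursive pagination behind all=True can blow the interpreter's recursion limit on a project with thousands of pages of builds, while a page-by-page loop does not:

```python
# Toy illustration of recursive vs. iterative pagination.
# list_recursive mimics the old all=True behaviour: fetching one page
# recurses to fetch the next, so depth grows with the number of pages.
# list_safe mimics safe_all=True: a plain loop, constant stack depth.

def list_recursive(page, pages):
    items = list(pages[page])
    if page + 1 < len(pages):
        # One stack frame per page -> fails past sys.getrecursionlimit()
        items += list_recursive(page + 1, pages)
    return items

def list_safe(pages):
    items = []
    for p in pages:  # iterate pages without recursing
        items.extend(p)
    return items

# 5000 one-item pages, far beyond the default recursion limit (~1000)
pages = [[i] for i in range(5000)]

try:
    list_recursive(0, pages)
    recursion_failed = False
except RecursionError:
    recursion_failed = True

all_items = list_safe(pages)  # handles every page without error
```

This is only a model of the failure mode; the real fix is the one-parameter change in gitlab-artifact-cleanup described above.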
