Commit 5cc71ca7 authored by Achilleas Pipinellis, committed by GitLab Release Tools Bot

Merge branch 'docs/make-content-us-english' into 'master'

Use US English for content

See merge request gitlab-org/gitlab-ce!27154

(cherry picked from commit 8511a43a)

42fe253c Use US English for content
parent 06fbeef7
@@ -62,7 +62,7 @@ into more features:
| [ChatOps](chatops/README.md) | Trigger CI jobs from chat, with results sent back to the channel. |
| [Interactive web terminals](interactive_web_terminal/index.md) | Open an interactive web terminal to debug the running jobs. |
| [Review Apps](review_apps/index.md) | Configure GitLab CI/CD to preview code changes on a per-branch basis. |
| [Optimising GitLab for large repositories](large_repositories/index.md) | Useful tips on how to optimise GitLab and GitLab Runner for big repositories. |
| [Optimizing GitLab for large repositories](large_repositories/index.md) | Useful tips on how to optimize GitLab and GitLab Runner for big repositories. |
| [Deploy Boards](https://docs.gitlab.com/ee/user/project/deploy_boards.html) **[PREMIUM]** | Check the current health and status of each CI/CD environment running on Kubernetes. |
| [GitLab CI/CD for external repositories](https://docs.gitlab.com/ee/ci/ci_cd_for_external_repos/index.html) **[PREMIUM]** | Get the benefits of GitLab CI/CD combined with repositories in GitHub and Bitbucket Cloud. |
 
# Optimising GitLab for large repositories
# Optimizing GitLab for large repositories
 
Large repositories consisting of more than 50k files in a worktree
often require special consideration because of
the time required to clone and check out.
 
GitLab and GitLab Runner handle this scenario well
but require optimised configuration to efficiently perform their
but require optimized configuration to efficiently perform their
set of operations.
 
The general guidelines for handling big repositories are simple.
@@ -15,7 +15,7 @@ Each guideline is described in more detail in the sections below:
- Always use shallow clone to reduce data transfer. Be aware that this puts more burden
on the GitLab instance due to the higher CPU impact.
- Control the clone directory if you heavily use a fork-based workflow.
- Optimise `git clean` flags to ensure that you remove or keep data that might affect or speed up your build, as sketched below.
- Optimize `git clean` flags to ensure that you remove or keep data that might affect or speed up your build, as sketched below.
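
To make these guidelines concrete, here is a minimal `.gitlab-ci.yml` sketch covering the shallow-clone and `git clean` points; the depth of 10 and the `node_modules/` exclusion are illustrative assumptions, not recommended values:

```yaml
variables:
  # Shallow clone: fetch only the most recent commits to reduce data transfer.
  GIT_DEPTH: "10"
  # Tune `git clean` so data that speeds up the build (here, node_modules/)
  # is kept between jobs while everything else is removed.
  GIT_CLEAN_FLAGS: -ffdx -e node_modules/
```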
 
## Shallow cloning
 
@@ -76,7 +76,7 @@ done by GitLab, requiring you to do them.
This can have implications if you heavily use big repositories with a fork-based workflow.
 
From GitLab Runner's perspective, a fork is stored as a separate repository
with a separate worktree. That means that GitLab Runner cannot optimise the usage
with a separate worktree. That means that GitLab Runner cannot optimize the usage
of worktrees, and you might have to instruct GitLab Runner to reuse them.
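
For example, when forks keep the upstream project's name, the clone path can be keyed on the project name so that the upstream project and its forks reuse one worktree on a given runner. A minimal sketch using the `GIT_CLONE_PATH` variable (the exact directory layout is an assumption):

```yaml
variables:
  # Key the clone path on the project name rather than the full namespace path,
  # so a fork and its upstream project end up in the same worktree on the runner.
  # CI_CONCURRENT_ID keeps concurrently running jobs in separate directories.
  GIT_CLONE_PATH: $CI_BUILDS_DIR/$CI_CONCURRENT_ID/$CI_PROJECT_NAME
```

Depending on the executor, the runner configuration may also need to allow custom build directories for `GIT_CLONE_PATH` to take effect.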
 
In such cases, ideally you want the GitLab Runner executor to be used only
@@ -113,7 +113,7 @@ available parameters are dependent on Git version.
 
Following the guidelines above, let's imagine that we want to:
 
- Optimise for a big project (more than 50k files in a directory).
- Optimize for a big project (more than 50k files in a directory).
- Use a fork-based workflow for contributing.
- Reuse existing worktrees. Have preconfigured runners that are pre-cloned with repositories.
- Runner assigned only to the project and all of its forks (see the sketch below).
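
Putting those goals together, the resulting `.gitlab-ci.yml` might start out like the sketch below; the values and the `build` job are illustrative assumptions:

```yaml
variables:
  # Shallow clone the large repository to limit data transfer.
  GIT_DEPTH: "10"
  # Share one clone directory between the project and its forks so the
  # pre-cloned worktree on the dedicated runner is reused.
  GIT_CLONE_PATH: $CI_BUILDS_DIR/$CI_CONCURRENT_ID/$CI_PROJECT_NAME

build:
  # Placeholder job; all jobs inherit the variables defined above.
  script:
    - ls -al
```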