Commit fc1df8c8 authored by GitLab Bot

Add latest changes from gitlab-org/gitlab@master

parent c8df22c5
Showing 294 additions and 199 deletions
@@ -81,6 +81,8 @@ already reserved for category labels).
The descriptions on the [labels page](https://gitlab.com/groups/gitlab-org/-/labels)
explain what falls under each type label.
 
+The GitLab handbook documents [when something is a bug and when it is a feature request](https://about.gitlab.com/handbook/product/product-management/process/feature-or-bug.html).
+
### Facet labels
 
Sometimes it's useful to refine the type of an issue. In those cases, you can
---
redirect_to: '../../telemetry/backend.md'
---

This document was moved to [another location](../../telemetry/backend.md).

---
redirect_to: '../../telemetry/frontend.md'
---

This document was moved to [another location](../../telemetry/frontend.md).

---
redirect_to: '../../telemetry/index.md'
---

This document was moved to [another location](../../telemetry/index.md).
@@ -127,7 +127,7 @@ one major version. For example, it is safe to:
   - `9.5.5` -> `9.5.9`
   - `10.6.3` -> `10.6.6`
   - `11.11.1` -> `11.11.8`
-  - `12.0.4` -> `12.0.9`
+  - `12.0.4` -> `12.0.12`
 - Upgrade the minor version:
   - `8.9.4` -> `8.12.3`
   - `9.2.3` -> `9.5.5`
@@ -144,9 +144,10 @@ It's also important to ensure that any background migrations have been fully com
before upgrading to a new major version. To see the current size of the `background_migration` queue,
[Check for background migrations before upgrading](../update/README.md#checking-for-background-migrations-before-upgrading).
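
One way to verify the queue is empty is from a Rails console. This is a minimal sketch assuming an Omnibus installation; the linked update guide has the authoritative commands for each installation type:

```ruby
# Run inside `sudo gitlab-rails console`.
# Both values should be zero before attempting the next upgrade step.
Sidekiq::Queue.new('background_migration').size   # queued background migration jobs
Gitlab::BackgroundMigration.remaining             # jobs scheduled or awaiting retry
```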
 
-To ensure background migrations are successful, increment by one minor version during the version jump before installing newer releases.
+From version 12 onwards, an additional step is required. More significant migrations may occur during major release upgrades. To ensure these are successful, increment to the first minor version (`x.0.x`) during the major version jump. Then proceed with upgrading to a newer release.
 
-For example: `11.11.x` -> `12.0.x`
+For example: `11.11.x` -> `12.0.x` -> `12.8.x`
 
Please see the table below for some examples:
 
| Latest stable version | Your version | Recommended upgrade path | Note |
@@ -154,7 +155,8 @@ Please see the table below for some examples:
 | 9.4.5 | 8.13.4 | `8.13.4` -> `8.17.7` -> `9.4.5` | `8.17.7` is the last version in version `8` |
 | 10.1.4 | 8.13.4 | `8.13.4` -> `8.17.7` -> `9.5.10` -> `10.1.4` | `8.17.7` is the last version in version `8`, `9.5.10` is the last version in version `9` |
 | 11.3.4 | 8.13.4 | `8.13.4` -> `8.17.7` -> `9.5.10` -> `10.8.7` -> `11.3.4` | `8.17.7` is the last version in version `8`, `9.5.10` is the last version in version `9`, `10.8.7` is the last version in version `10` |
-| 12.5.8 | 11.3.4 | `11.3.4` -> `11.11.8` -> `12.0.9` -> `12.5.8` | `11.11.8` is the last version in version `11` |
+| 12.5.8 | 11.3.4 | `11.3.4` -> `11.11.8` -> `12.0.12` -> `12.5.8` | `11.11.8` is the last version in version `11`. `12.0.x` [is a required step](https://gitlab.com/gitlab-org/gitlab/-/merge_requests/23211#note_272842444). |
+| 12.8.5 | 9.2.6 | `9.2.6` -> `9.5.10` -> `10.8.7` -> `11.11.8` -> `12.0.12` -> `12.8.5` | Four intermediate versions required: the final `9.5`, `10.8`, `11.11` releases, plus `12.0` |
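
The stop-version rule that generates these paths can be sketched in a few lines of Ruby. This is a hypothetical illustration (`STOPS` and `upgrade_path` are not part of GitLab); the stop versions are the ones listed in the table above:

```ruby
# Hypothetical sketch: collect every mandatory stop release that lies
# strictly between the running version and the target version.
STOPS = %w[8.17.7 9.5.10 10.8.7 11.11.8 12.0.12].map { |v| Gem::Version.new(v) }

def upgrade_path(from, to)
  from = Gem::Version.new(from)
  to = Gem::Version.new(to)

  STOPS.select { |stop| stop > from && stop < to }.map(&:to_s) + [to.to_s]
end

upgrade_path('9.2.6', '12.8.5')
# => ["9.5.10", "10.8.7", "11.11.8", "12.0.12", "12.8.5"]
```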
 
More information about the release procedures can be found in our
[release documentation](https://gitlab.com/gitlab-org/release/docs). You may also want to read our
@@ -40,6 +40,7 @@ stop_review:
   environment:
     name: review/$CI_COMMIT_REF_NAME
     action: stop
+  dependencies: []
   when: manual
   allow_failure: true
   only:
@@ -17,9 +17,17 @@ module Gitlab
end
 
       def restore
-        @tree_hash = @group_hash || read_tree_hash
-        @group_members = @tree_hash.delete('members')
-        @children = @tree_hash.delete('children')
+        @relation_reader ||=
+          if @group_hash.present?
+            ImportExport::JSON::LegacyReader::User.new(@group_hash, reader.group_relation_names)
+          else
+            ImportExport::JSON::LegacyReader::File.new(@path, reader.group_relation_names)
+          end
+
+        @group_members = @relation_reader.consume_relation('members')
+        @children = @relation_reader.consume_attribute('children')
+        @relation_reader.consume_attribute('name')
+        @relation_reader.consume_attribute('path')
 
         if members_mapper.map && restorer.restore
           @children&.each do |group_hash|
@@ -45,21 +53,12 @@ module Gitlab
 
       private
 
-      def read_tree_hash
-        json = IO.read(@path)
-        ActiveSupport::JSON.decode(json)
-      rescue => e
-        @shared.error(e)
-        raise Gitlab::ImportExport::Error.new('Incorrect JSON format')
-      end
-
       def restorer
         @relation_tree_restorer ||= RelationTreeRestorer.new(
           user: @user,
           shared: @shared,
           importable: @group,
-          tree_hash: @tree_hash.except('name', 'path'),
+          relation_reader: @relation_reader,
           members_mapper: members_mapper,
           object_builder: object_builder,
           relation_factory: relation_factory,
# frozen_string_literal: true

module Gitlab
  module ImportExport
    module JSON
      class LegacyReader
        class File < LegacyReader
          def initialize(path, relation_names)
            @path = path
            super(relation_names)
          end

          def valid?
            ::File.exist?(@path)
          end

          private

          def tree_hash
            @tree_hash ||= read_hash
          end

          def read_hash
            ActiveSupport::JSON.decode(IO.read(@path))
          rescue => e
            Gitlab::ErrorTracking.log_exception(e)
            raise Gitlab::ImportExport::Error.new('Incorrect JSON format')
          end
        end

        class User < LegacyReader
          def initialize(tree_hash, relation_names)
            @tree_hash = tree_hash
            super(relation_names)
          end

          def valid?
            @tree_hash.present?
          end

          protected

          attr_reader :tree_hash
        end

        def initialize(relation_names)
          @relation_names = relation_names.map(&:to_s)
        end

        def valid?
          raise NotImplementedError
        end

        def legacy?
          true
        end

        def root_attributes(excluded_attributes = [])
          attributes.except(*excluded_attributes.map(&:to_s))
        end

        def consume_relation(key)
          value = relations.delete(key)

          return value unless block_given?
          return if value.nil?

          if value.is_a?(Array)
            value.each.with_index do |item, idx|
              yield(item, idx)
            end
          else
            yield(value, 0)
          end
        end

        def consume_attribute(key)
          attributes.delete(key)
        end

        def sort_ci_pipelines_by_id
          relations['ci_pipelines']&.sort_by! { |hash| hash['id'] }
        end

        private

        attr_reader :relation_names

        def tree_hash
          raise NotImplementedError
        end

        def attributes
          @attributes ||= tree_hash.slice!(*relation_names)
        end

        def relations
          @relations ||= tree_hash.extract!(*relation_names)
        end
      end
    end
  end
end
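
To make the reader's contract concrete, here is a minimal usage sketch. The path and relation names are hypothetical; in the restorers below they come from `shared.export_path` and `Reader#project_relation_names` / `#group_relation_names`:

```ruby
# Hypothetical usage of the legacy reader: relations are consumed (and removed
# from the in-memory tree) one item at a time; what remains are root attributes.
relation_reader = Gitlab::ImportExport::JSON::LegacyReader::File.new(
  '/tmp/export/project.json',        # hypothetical export location
  %w[project_members ci_pipelines]   # hypothetical relation names
)

relation_reader.consume_relation('project_members') do |member_hash, index|
  # process each member; the hash is no longer referenced by the tree afterwards
end

root_params = relation_reader.root_attributes(%w[name path]) # non-relation keys only
```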
# frozen_string_literal: true

module Gitlab
  module ImportExport
    module Project
      class TreeLoader
        def load(path, dedup_entries: false)
          tree_hash = ActiveSupport::JSON.decode(IO.read(path))

          if dedup_entries
            dedup_tree(tree_hash)
          else
            tree_hash
          end
        end

        private

        # This function removes duplicate entries from the given tree recursively
        # by caching nodes it encounters repeatedly. We only consider nodes for
        # which there can actually be multiple equivalent instances (e.g. strings,
        # hashes and arrays, but not `nil`s, numbers or booleans).
        #
        # The algorithm uses a recursive depth-first descent with 3 cases, starting
        # with a root node (the tree/hash itself):
        # - a node has already been cached; in this case we return it from the cache
        # - a node has not been cached yet but should be; descend into its children
        # - a node is neither cached nor qualifies for caching; this is a no-op
        def dedup_tree(node, nodes_seen = {})
          if nodes_seen.key?(node) && distinguishable?(node)
            yield nodes_seen[node]
          elsif should_dedup?(node)
            nodes_seen[node] = node

            case node
            when Array
              node.each_index do |idx|
                dedup_tree(node[idx], nodes_seen) do |cached_node|
                  node[idx] = cached_node
                end
              end
            when Hash
              node.each do |k, v|
                dedup_tree(v, nodes_seen) do |cached_node|
                  node[k] = cached_node
                end
              end
            end
          else
            node
          end
        end

        # We do not need to consider nodes for which there cannot be multiple instances
        def should_dedup?(node)
          node && !(node.is_a?(Numeric) || node.is_a?(TrueClass) || node.is_a?(FalseClass))
        end

        # We can only safely de-dup values that are distinguishable. True value objects
        # are always distinguishable by nature. Hashes however can represent entities,
        # which are identified by ID, not value. We therefore disallow de-duping hashes
        # that do not have an `id` field, since we might risk dropping entities that
        # have equal attributes yet different identities.
        def distinguishable?(node)
          if node.is_a?(Hash)
            node.key?('id')
          else
            true
          end
        end
      end
    end
  end
end
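
The caching trick in `dedup_tree` can be shown in isolation. A minimal standalone sketch (not the class above): equal values seen twice collapse into one shared object, which is what shrinks the memory footprint of a large decoded JSON tree:

```ruby
# Standalone sketch of the dedup idea: the second occurrence of an equal
# string is replaced by the instance cached from the first occurrence.
tree = { 'a' => 'bug', 'b' => 'bug' }
seen = {}

tree.each { |key, value| seen.key?(value) ? tree[key] = seen[value] : seen[value] = value }

tree['a'].equal?(tree['b']) # => true, one string instance instead of two
```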
@@ -4,8 +4,6 @@ module Gitlab
 module ImportExport
   module Project
     class TreeRestorer
-      LARGE_PROJECT_FILE_SIZE_BYTES = 500.megabyte
-
       attr_reader :user
       attr_reader :shared
       attr_reader :project
@@ -14,12 +12,12 @@ module Gitlab
         @user = user
         @shared = shared
         @project = project
-        @tree_loader = TreeLoader.new
       end
 
       def restore
-        @tree_hash = read_tree_hash
-        @project_members = @tree_hash.delete('project_members')
+        @relation_reader = ImportExport::JSON::LegacyReader::File.new(File.join(shared.export_path, 'project.json'), reader.project_relation_names)
+        @project_members = @relation_reader.consume_relation('project_members')
 
         if relation_tree_restorer.restore
           import_failure_service.with_retry(action: 'set_latest_merge_request_diff_ids!') do
@@ -37,24 +35,12 @@ module Gitlab
 
       private
 
-      def large_project?(path)
-        File.size(path) >= LARGE_PROJECT_FILE_SIZE_BYTES
-      end
-
-      def read_tree_hash
-        path = File.join(@shared.export_path, 'project.json')
-        @tree_loader.load(path, dedup_entries: large_project?(path))
-      rescue => e
-        Rails.logger.error("Import/Export error: #{e.message}") # rubocop:disable Gitlab/RailsLogger
-        raise Gitlab::ImportExport::Error.new('Incorrect JSON format')
-      end
-
       def relation_tree_restorer
         @relation_tree_restorer ||= RelationTreeRestorer.new(
           user: @user,
           shared: @shared,
           importable: @project,
-          tree_hash: @tree_hash,
+          relation_reader: @relation_reader,
           object_builder: object_builder,
           members_mapper: members_mapper,
           relation_factory: relation_factory,
@@ -17,10 +17,18 @@ module Gitlab
         tree_by_key(:project)
       end
 
+      def project_relation_names
+        attributes_finder.find_relations_tree(:project).keys
+      end
+
       def group_tree
         tree_by_key(:group)
       end
 
+      def group_relation_names
+        attributes_finder.find_relations_tree(:group).keys
+      end
+
       def group_members_tree
         tree_by_key(:group_members)
       end
@@ -9,13 +9,13 @@ module Gitlab
       attr_reader :user
       attr_reader :shared
       attr_reader :importable
-      attr_reader :tree_hash
+      attr_reader :relation_reader
 
-      def initialize(user:, shared:, importable:, tree_hash:, members_mapper:, object_builder:, relation_factory:, reader:)
+      def initialize(user:, shared:, importable:, relation_reader:, members_mapper:, object_builder:, relation_factory:, reader:)
         @user = user
         @shared = shared
         @importable = importable
-        @tree_hash = tree_hash
+        @relation_reader = relation_reader
         @members_mapper = members_mapper
         @object_builder = object_builder
         @relation_factory = relation_factory
@@ -30,7 +30,7 @@ module Gitlab
         bulk_inserts_enabled = @importable.class == ::Project &&
           Feature.enabled?(:import_bulk_inserts, @importable.group)
         BulkInsertableAssociations.with_bulk_insert(enabled: bulk_inserts_enabled) do
-          update_relation_hashes!
+          fix_ci_pipelines_not_sorted_on_legacy_project_json!
           create_relations!
         end
       end
@@ -57,18 +57,8 @@ module Gitlab
end
 
       def process_relation!(relation_key, relation_definition)
-        data_hashes = @tree_hash.delete(relation_key)
-        return unless data_hashes
-
-        # we do not care if we process array or hash
-        data_hashes = [data_hashes] unless data_hashes.is_a?(Array)
-
-        relation_index = 0
-
-        # consume and remove objects from memory
-        while data_hash = data_hashes.shift
+        @relation_reader.consume_relation(relation_key) do |data_hash, relation_index|
           process_relation_item!(relation_key, relation_definition, relation_index, data_hash)
-          relation_index += 1
         end
       end
 
@@ -103,10 +93,7 @@ module Gitlab
end
 
       def update_params!
-        params = @tree_hash.reject do |key, _|
-          relations.include?(key)
-        end
-
+        params = @relation_reader.root_attributes(relations.keys)
         params = params.merge(present_override_params)
 
# Cleaning all imported and overridden params
@@ -223,8 +210,13 @@ module Gitlab
}
end
 
-      def update_relation_hashes!
-        @tree_hash['ci_pipelines']&.sort_by! { |hash| hash['id'] }
+      # Temporary fix for https://gitlab.com/gitlab-org/gitlab/-/issues/27883 when importing from a legacy project.json.
+      # This should be removed once the legacy JSON format is deprecated.
+      # The ndjson export file will fix the order during project export.
+      def fix_ci_pipelines_not_sorted_on_legacy_project_json!
+        return unless relation_reader.legacy?
+
+        relation_reader.sort_ci_pipelines_by_id
       end
     end
   end
@@ -18,7 +18,7 @@ module Gitlab
       def save(tree, dir_path, filename)
         mkdir_p(dir_path)
 
-        tree_json = JSON.generate(tree)
+        tree_json = ::JSON.generate(tree)
 
         File.write(File.join(dir_path, filename), tree_json)
       end
{
"invalid" json
}
/* global Mousetrap */
// `mousetrap` uses AMD, which webpack understands but Jest does not.
// Thankfully it also writes to a global export, so we can ES6-ify it.
import 'mousetrap';

export default Mousetrap;
import Vuex from 'vuex';
import { shallowMount, createLocalVue } from '@vue/test-utils';
import { GlLoadingIcon } from '@gitlab/ui';
import MockAdapter from 'axios-mock-adapter';
import { TEST_HOST } from 'spec/test_constants';
import Mousetrap from 'mousetrap';
import App from '~/diffs/components/app.vue';
@@ -12,14 +13,17 @@ import CommitWidget from '~/diffs/components/commit_widget.vue';
 import TreeList from '~/diffs/components/tree_list.vue';
 import { INLINE_DIFF_VIEW_TYPE, PARALLEL_DIFF_VIEW_TYPE } from '~/diffs/constants';
 import createDiffsStore from '../create_diffs_store';
+import axios from '~/lib/utils/axios_utils';
 import diffsMockData from '../mock_data/merge_request_diffs';
 
 const mergeRequestDiff = { version_index: 1 };
+const TEST_ENDPOINT = `${TEST_HOST}/diff/endpoint`;
 
 describe('diffs/components/app', () => {
   const oldMrTabs = window.mrTabs;
   let store;
   let wrapper;
+  let mock;
 
   function createComponent(props = {}, extendStore = () => {}) {
     const localVue = createLocalVue();
@@ -34,7 +38,7 @@ describe('diffs/components/app', () => {
     wrapper = shallowMount(localVue.extend(App), {
       localVue,
       propsData: {
-        endpoint: `${TEST_HOST}/diff/endpoint`,
+        endpoint: TEST_ENDPOINT,
         endpointMetadata: `${TEST_HOST}/diff/endpointMetadata`,
         endpointBatch: `${TEST_HOST}/diff/endpointBatch`,
         projectPath: 'namespace/project',
@@ -61,8 +65,12 @@ describe('diffs/components/app', () => {
 
   beforeEach(() => {
     // setup globals (needed for component to mount :/)
-    window.mrTabs = jasmine.createSpyObj('mrTabs', ['resetViewContainer']);
-    window.mrTabs.expandViewContainer = jasmine.createSpy();
+    window.mrTabs = {
+      resetViewContainer: jest.fn(),
+    };
+    window.mrTabs.expandViewContainer = jest.fn();
+    mock = new MockAdapter(axios);
+    mock.onGet(TEST_ENDPOINT).reply(200, {});
   });
 
afterEach(() => {
@@ -71,6 +79,8 @@ describe('diffs/components/app', () => {
 
     // reset component
     wrapper.destroy();
+
+    mock.restore();
   });
 
describe('fetch diff methods', () => {
@@ -80,15 +90,15 @@ describe('diffs/components/app', () => {
       store.state.notes.discussions = 'test';
       return Promise.resolve({ real_size: 100 });
     };
-    spyOn(window, 'requestIdleCallback').and.callFake(fn => fn());
+    jest.spyOn(window, 'requestIdleCallback').mockImplementation(fn => fn());
     createComponent();
-    spyOn(wrapper.vm, 'fetchDiffFiles').and.callFake(fetchResolver);
-    spyOn(wrapper.vm, 'fetchDiffFilesMeta').and.callFake(fetchResolver);
-    spyOn(wrapper.vm, 'fetchDiffFilesBatch').and.callFake(fetchResolver);
-    spyOn(wrapper.vm, 'setDiscussions');
-    spyOn(wrapper.vm, 'startRenderDiffsQueue');
-    spyOn(wrapper.vm, 'unwatchDiscussions');
-    spyOn(wrapper.vm, 'unwatchRetrievingBatches');
+    jest.spyOn(wrapper.vm, 'fetchDiffFiles').mockImplementation(fetchResolver);
+    jest.spyOn(wrapper.vm, 'fetchDiffFilesMeta').mockImplementation(fetchResolver);
+    jest.spyOn(wrapper.vm, 'fetchDiffFilesBatch').mockImplementation(fetchResolver);
+    jest.spyOn(wrapper.vm, 'setDiscussions').mockImplementation(() => {});
+    jest.spyOn(wrapper.vm, 'startRenderDiffsQueue').mockImplementation(() => {});
+    jest.spyOn(wrapper.vm, 'unwatchDiscussions').mockImplementation(() => {});
+    jest.spyOn(wrapper.vm, 'unwatchRetrievingBatches').mockImplementation(() => {});
     store.state.diffs.retrievingBatches = true;
     store.state.diffs.diffFiles = [];
     wrapper.vm.$nextTick(done);
@@ -236,7 +246,7 @@ describe('diffs/components/app', () => {
       wrapper.vm.fetchData(false);
 
       expect(wrapper.vm.fetchDiffFiles).toHaveBeenCalled();
-      setTimeout(() => {
+      setImmediate(() => {
         expect(wrapper.vm.startRenderDiffsQueue).toHaveBeenCalled();
         expect(wrapper.vm.fetchDiffFilesMeta).not.toHaveBeenCalled();
         expect(wrapper.vm.fetchDiffFilesBatch).not.toHaveBeenCalled();
@@ -255,7 +265,7 @@ describe('diffs/components/app', () => {
       wrapper.vm.fetchData(false);
 
       expect(wrapper.vm.fetchDiffFiles).not.toHaveBeenCalled();
-      setTimeout(() => {
+      setImmediate(() => {
         expect(wrapper.vm.startRenderDiffsQueue).toHaveBeenCalled();
         expect(wrapper.vm.fetchDiffFilesMeta).toHaveBeenCalled();
         expect(wrapper.vm.fetchDiffFilesBatch).toHaveBeenCalled();
@@ -272,7 +282,7 @@ describe('diffs/components/app', () => {
       wrapper.vm.fetchData(false);
 
       expect(wrapper.vm.fetchDiffFiles).not.toHaveBeenCalled();
-      setTimeout(() => {
+      setImmediate(() => {
         expect(wrapper.vm.startRenderDiffsQueue).toHaveBeenCalled();
         expect(wrapper.vm.fetchDiffFilesMeta).toHaveBeenCalled();
         expect(wrapper.vm.fetchDiffFilesBatch).toHaveBeenCalled();
@@ -350,23 +360,21 @@ describe('diffs/components/app', () => {
});
 
       // Component uses $nextTick so we wait until that has finished
-      setTimeout(() => {
+      setImmediate(() => {
         expect(store.state.diffs.highlightedRow).toBe('ABC_123');
 
         done();
       });
});
 
-    it('marks current diff file based on currently highlighted row', done => {
+    it('marks current diff file based on currently highlighted row', () => {
       createComponent({
         shouldShow: true,
       });
 
       // Component uses $nextTick so we wait until that has finished
-      setTimeout(() => {
+      return wrapper.vm.$nextTick().then(() => {
         expect(store.state.diffs.currentDiffFileId).toBe('ABC');
-
-        done();
       });
     });
@@ -403,7 +411,7 @@ describe('diffs/components/app', () => {
});
 
       // Component uses $nextTick so we wait until that has finished
-      setTimeout(() => {
+      setImmediate(() => {
         expect(store.state.diffs.currentDiffFileId).toBe('ABC');
 
         done();
@@ -449,7 +457,7 @@ describe('diffs/components/app', () => {
 
     describe('visible app', () => {
       beforeEach(() => {
-        spy = jasmine.createSpy('spy');
+        spy = jest.fn();
 
         createComponent({
           shouldShow: true,
@@ -459,21 +467,18 @@ describe('diffs/components/app', () => {
});
});
 
-    it('calls `jumpToFile()` with correct parameter whenever pre-defined key is pressed', done => {
-      wrapper.vm
-        .$nextTick()
-        .then(() => {
-          Object.keys(mappings).forEach(function(key) {
-            Mousetrap.trigger(key);
-
-            expect(spy.calls.mostRecent().args).toEqual([mappings[key]]);
-          });
-
-          expect(spy.calls.count()).toEqual(Object.keys(mappings).length);
-        })
-        .then(done)
-        .catch(done.fail);
-    });
+    it.each(Object.keys(mappings))(
+      'calls `jumpToFile()` with correct parameter whenever pre-defined %s is pressed',
+      key => {
+        return wrapper.vm.$nextTick().then(() => {
+          expect(spy).not.toHaveBeenCalled();
+
+          Mousetrap.trigger(key);
+
+          expect(spy).toHaveBeenCalledWith(mappings[key]);
+        });
+      },
+    );
 
it('does not call `jumpToFile()` when unknown key is pressed', done => {
wrapper.vm
@@ -490,7 +495,7 @@ describe('diffs/components/app', () => {
 
     describe('hidden app', () => {
       beforeEach(() => {
-        spy = jasmine.createSpy('spy');
+        spy = jest.fn();
 
         createComponent({
           shouldShow: false,
@@ -504,7 +509,7 @@ describe('diffs/components/app', () => {
         wrapper.vm
           .$nextTick()
           .then(() => {
-            Object.keys(mappings).forEach(function(key) {
+            Object.keys(mappings).forEach(key => {
               Mousetrap.trigger(key);
 
               expect(spy).not.toHaveBeenCalled();
@@ -520,7 +525,7 @@ describe('diffs/components/app', () => {
       let spy;
 
       beforeEach(() => {
-        spy = jasmine.createSpy();
+        spy = jest.fn();
 
         createComponent({}, () => {
           store.state.diffs.diffFiles = [
@@ -545,15 +550,15 @@ describe('diffs/components/app', () => {
           .then(() => {
             wrapper.vm.jumpToFile(+1);
 
-            expect(spy.calls.mostRecent().args).toEqual(['222.js']);
+            expect(spy.mock.calls[spy.mock.calls.length - 1]).toEqual(['222.js']);
 
             store.state.diffs.currentDiffFileId = '222';
             wrapper.vm.jumpToFile(+1);
 
-            expect(spy.calls.mostRecent().args).toEqual(['333.js']);
+            expect(spy.mock.calls[spy.mock.calls.length - 1]).toEqual(['333.js']);
 
             store.state.diffs.currentDiffFileId = '333';
             wrapper.vm.jumpToFile(-1);
 
-            expect(spy.calls.mostRecent().args).toEqual(['222.js']);
+            expect(spy.mock.calls[spy.mock.calls.length - 1]).toEqual(['222.js']);
           })
           .then(done)
           .catch(done.fail);
@@ -602,7 +607,7 @@ describe('diffs/components/app', () => {
 
       expect(wrapper.contains(CompareVersions)).toBe(true);
       expect(wrapper.find(CompareVersions).props()).toEqual(
-        jasmine.objectContaining({
+        expect.objectContaining({
           targetBranch: {
             branchName: 'target-branch',
             versionIndex: -1,
@@ -625,7 +630,7 @@ describe('diffs/components/app', () => {
 
       expect(wrapper.contains(HiddenFilesWarning)).toBe(true);
       expect(wrapper.find(HiddenFilesWarning).props()).toEqual(
-        jasmine.objectContaining({
+        expect.objectContaining({
           total: '5',
           plainDiffPath: 'plain diff path',
           emailPatchPath: 'email patch path',
@@ -663,7 +668,7 @@ describe('diffs/components/app', () => {
       let toggleShowTreeList;
 
       beforeEach(() => {
-        toggleShowTreeList = jasmine.createSpy('toggleShowTreeList');
+        toggleShowTreeList = jest.fn();
       });
 
       afterEach(() => {
import Vue from 'vue';
import Vuex from 'vuex';
import diffsModule from '~/diffs/store/modules';
import notesModule from '~/notes/stores/modules';

Vue.use(Vuex);

export default function createDiffsStore() {
  return new Vuex.Store({
    modules: {
      diffs: diffsModule(),
      notes: notesModule(),
    },
  });
}
@@ -8,6 +8,7 @@ exports[`IDE pipelines list when loaded renders empty state when no latestPipeli
 <empty-state-stub
   cansetci="true"
   class="mb-auto mt-auto"
+  emptystatesvgpath="http://test.host"
   helppagepath="http://test.host"
 />
 import MockAdapter from 'axios-mock-adapter';
 import axios from '~/lib/utils/axios_utils';
-import * as iconUtils from '~/lib/utils/icon_utils';
+import { clearSvgIconPathContentCache, getSvgIconPathContent } from '~/lib/utils/icon_utils';
 
 describe('Icon utils', () => {
   describe('getSvgIconPathContent', () => {
     let spriteIcons;
+    let axiosMock;
+    const mockName = 'mockIconName';
+    const mockPath = 'mockPath';
+    const mockIcons = `<svg><symbol id="${mockName}"><path d="${mockPath}"/></symbol></svg>`;
 
     beforeAll(() => {
       spriteIcons = gon.sprite_icons;
@@ -15,45 +19,63 @@ describe('Icon utils', () => {
       gon.sprite_icons = spriteIcons;
     });
 
-    let axiosMock;
-    let mockEndpoint;
-    const mockName = 'mockIconName';
-    const mockPath = 'mockPath';
-    const getIcon = () => iconUtils.getSvgIconPathContent(mockName);
-
     beforeEach(() => {
       axiosMock = new MockAdapter(axios);
-      mockEndpoint = axiosMock.onGet(gon.sprite_icons);
     });
 
     afterEach(() => {
       axiosMock.restore();
+      clearSvgIconPathContentCache();
     });
 
-    it('extracts svg icon path content from sprite icons', () => {
-      mockEndpoint.replyOnce(
-        200,
-        `<svg><symbol id="${mockName}"><path d="${mockPath}"/></symbol></svg>`,
-      );
-
-      return getIcon().then(path => {
-        expect(path).toBe(mockPath);
+    describe('when the icons can be loaded', () => {
+      beforeEach(() => {
+        axiosMock.onGet(gon.sprite_icons).reply(200, mockIcons);
       });
-    });
 
-    it('returns null if icon path content does not exist', () => {
-      mockEndpoint.replyOnce(200, ``);
+      it('extracts svg icon path content from sprite icons', () => {
+        return getSvgIconPathContent(mockName).then(path => {
+          expect(path).toBe(mockPath);
+        });
+      });
 
-      return getIcon().then(path => {
-        expect(path).toBe(null);
+      it('returns null if icon path content does not exist', () => {
+        return getSvgIconPathContent('missing-icon').then(path => {
+          expect(path).toBe(null);
+        });
       });
     });
 
-    it('returns null if an http error occurs', () => {
-      mockEndpoint.replyOnce(500);
-
-      return getIcon().then(path => {
-        expect(path).toBe(null);
+    describe('when the icons cannot be loaded on the first 2 tries', () => {
+      beforeEach(() => {
+        axiosMock
+          .onGet(gon.sprite_icons)
+          .replyOnce(500)
+          .onGet(gon.sprite_icons)
+          .replyOnce(500)
+          .onGet(gon.sprite_icons)
+          .reply(200, mockIcons);
+      });
+
+      it('returns null', () => {
+        return getSvgIconPathContent(mockName).then(path => {
+          expect(path).toBe(null);
+        });
+      });
+
+      it('extracts svg icon path content, after 2 attempts', () => {
+        return getSvgIconPathContent(mockName)
+          .then(path1 => {
+            expect(path1).toBe(null);
+            return getSvgIconPathContent(mockName);
+          })
+          .then(path2 => {
+            expect(path2).toBe(null);
+            return getSvgIconPathContent(mockName);
+          })
+          .then(path3 => {
+            expect(path3).toBe(mockPath);
+          });
       });
     });
   });
 });
/* eslint-disable class-methods-use-this */
export default class TreeWorkerMock {
  addEventListener() {}

  terminate() {}

  postMessage() {}
}