mirror of https://github.com/ansible/awx.git
synced 2026-01-11 10:00:01 -03:30

Merge branch 'ansible:devel' into devel

This commit is contained in:
commit ab6b4bad03

.github/triage_replies.md: 20 changes (vendored)
@@ -29,12 +29,24 @@ In the future, sometimes starting a discussion on the development list prior to

Thank you once again for this and your interest in AWX!

### No Progress
### No Progress Issue

- Hi! \
\
Thank you very much for this issue. It means a lot to us that you have taken time to contribute by opening this report. \
\
On this issue, comments were added but it has been some time since then without a response. At this time we are closing this issue. If you get time to address the comments, we can reopen the issue; contact us by using any of the communication methods listed in the page below: \
\
https://github.com/ansible/awx/#get-involved \
\
Thank you once again for this and your interest in AWX!

### No Progress PR

- Hi! \
\
Thank you very much for your submission to AWX. It means a lot to us that you have taken time to contribute. \
\
On this PR, changes were requested but it has been some time since then. We think this PR has merit but without the requested changes we are unable to merge it. At this time we are closing you PR. If you get time to address the changes you are welcome to open another PR or we can reopen this PR upon request if you contact us by using any of the communication methods listed in the page below: \
On this PR, changes were requested but it has been some time since then. We think this PR has merit but without the requested changes we are unable to merge it. At this time we are closing your PR. If you get time to address the changes you are welcome to open another PR or we can reopen this PR upon request if you contact us by using any of the communication methods listed in the page below: \
\
https://github.com/ansible/awx/#get-involved \
\

@@ -51,6 +63,10 @@ Thank you once again for this and your interest in AWX!

### Code of Conduct

- Hello. Please keep in mind that Ansible adheres to a Code of Conduct in its community spaces. The spirit of the code of conduct is to be kind, and this is your friendly reminder to be so. Please see the full code of conduct here if you have questions: https://docs.ansible.com/ansible/latest/community/code_of_conduct.html

### EE Contents / Community General

- Hello. The awx-ee image contains the collections and dependencies needed for supported AWX features to function. Anything beyond that (like the community.general package) will require you to build your own EE. For information on how to do that, see https://ansible-builder.readthedocs.io/en/stable/ \
\
The Ansible Community is looking at building an EE that corresponds to all of the collections inside the ansible package. That may help you if and when it happens; see https://github.com/ansible-community/community-topics/issues/31 for details.
.github/workflows/ci.yml: 2 changes (vendored)

@@ -113,7 +113,7 @@ jobs:

      - name: Install playbook dependencies
        run: |
          python3 -m pip install docker
          python3 -m pip install docker setuptools_scm

      - name: Build AWX image
        working-directory: awx
.github/workflows/promote.yml: 4 changes (vendored)

@@ -21,7 +21,7 @@ jobs:

      - name: Install dependencies
        run: |
          python${{ env.py_version }} -m pip install wheel twine
          python${{ env.py_version }} -m pip install wheel twine setuptools-scm

      - name: Set official collection namespace
        run: echo collection_namespace=awx >> $GITHUB_ENV

@@ -70,4 +70,4 @@ jobs:

          docker tag ghcr.io/${{ github.repository }}:${{ github.event.release.tag_name }} quay.io/${{ github.repository }}:latest
          docker push quay.io/${{ github.repository }}:${{ github.event.release.tag_name }}
          docker push quay.io/${{ github.repository }}:latest
.github/workflows/stage.yml: 2 changes (vendored)

@@ -65,7 +65,7 @@ jobs:

      - name: Install playbook dependencies
        run: |
          python3 -m pip install docker
          python3 -m pip install docker setuptools_scm

      - name: Build and stage AWX
        working-directory: awx
Makefile: 18 changes

@@ -5,8 +5,8 @@ NPM_BIN ?= npm
CHROMIUM_BIN=/tmp/chrome-linux/chrome
GIT_BRANCH ?= $(shell git rev-parse --abbrev-ref HEAD)
MANAGEMENT_COMMAND ?= awx-manage
VERSION := $(shell $(PYTHON) setup.py --version)
COLLECTION_VERSION := $(shell $(PYTHON) setup.py --version | cut -d . -f 1-3)
VERSION := $(shell $(PYTHON) tools/scripts/scm_version.py)
COLLECTION_VERSION := $(shell $(PYTHON) tools/scripts/scm_version.py | cut -d . -f 1-3)

# NOTE: This defaults the container image version to the branch that's active
COMPOSE_TAG ?= $(GIT_BRANCH)

@@ -49,7 +49,7 @@ I18N_FLAG_FILE = .i18n_built
.PHONY: awx-link clean clean-tmp clean-venv requirements requirements_dev \
	develop refresh adduser migrate dbchange \
	receiver test test_unit test_coverage coverage_html \
	dev_build release_build sdist \
	sdist \
	ui-release ui-devel \
	VERSION PYTHON_VERSION docker-compose-sources \
	.git/hooks/pre-commit

@@ -273,7 +273,7 @@ api-lint:
	yamllint -s .

awx-link:
	[ -d "/awx_devel/awx.egg-info" ] || $(PYTHON) /awx_devel/setup.py egg_info_dev
	[ -d "/awx_devel/awx.egg-info" ] || $(PYTHON) /awx_devel/tools/scripts/egg_info_dev
	cp -f /tmp/awx.egg-link /var/lib/awx/venv/awx/lib/$(PYTHON)/site-packages/awx.egg-link

TEST_DIRS ?= awx/main/tests/unit awx/main/tests/functional awx/conf/tests awx/sso/tests

@@ -424,21 +424,13 @@ ui-test-general:
	$(NPM_BIN) run --prefix awx/ui pretest
	$(NPM_BIN) run --prefix awx/ui/ test-general --runInBand

# Build a pip-installable package into dist/ with a timestamped version number.
dev_build:
	$(PYTHON) setup.py dev_build

# Build a pip-installable package into dist/ with the release version number.
release_build:
	$(PYTHON) setup.py release_build

HEADLESS ?= no
ifeq ($(HEADLESS), yes)
dist/$(SDIST_TAR_FILE):
else
dist/$(SDIST_TAR_FILE): $(UI_BUILD_FLAG_FILE)
endif
	$(PYTHON) setup.py $(SDIST_COMMAND)
	$(PYTHON) -m build -s
	ln -sf $(SDIST_TAR_FILE) dist/awx.tar.gz

sdist: dist/$(SDIST_TAR_FILE)
@@ -6,9 +6,40 @@ import os
import sys
import warnings

from pkg_resources import get_distribution

__version__ = get_distribution('awx').version
def get_version():
    version_from_file = get_version_from_file()
    if version_from_file:
        return version_from_file
    else:
        from setuptools_scm import get_version

        version = get_version(root='..', relative_to=__file__)
        return version


def get_version_from_file():
    vf = version_file()
    if vf:
        with open(vf, 'r') as file:
            return file.read().strip()


def version_file():
    current_dir = os.path.dirname(os.path.abspath(__file__))
    version_file = os.path.join(current_dir, '..', 'VERSION')

    if os.path.exists(version_file):
        return version_file


try:
    import pkg_resources

    __version__ = pkg_resources.get_distribution('awx').version
except pkg_resources.DistributionNotFound:
    __version__ = get_version()

__all__ = ['__version__']

@@ -21,7 +52,6 @@ try:
except ImportError:  # pragma: no cover
    MODE = 'production'


import hashlib

try:
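The new version logic in `awx/__init__.py` prefers a checked-in VERSION file and only falls back to an SCM-derived version when the file is absent. A minimal standalone sketch of that resolution order (the helper name `resolve_version` and the stub fallback are illustrative, not part of the diff):

```python
import os
import tempfile


def resolve_version(package_dir, scm_fallback=lambda: "0.0.0.dev0"):
    """Return the contents of a VERSION file one level above package_dir,
    falling back to an SCM-derived version when the file is absent."""
    version_path = os.path.join(package_dir, '..', 'VERSION')
    if os.path.exists(version_path):
        with open(version_path) as f:
            return f.read().strip()
    return scm_fallback()


# Demonstrate both branches with a temporary directory layout.
with tempfile.TemporaryDirectory() as root:
    pkg = os.path.join(root, 'pkg')
    os.makedirs(pkg)
    print(resolve_version(pkg))  # no VERSION file -> "0.0.0.dev0"
    with open(os.path.join(root, 'VERSION'), 'w') as f:
        f.write("21.0.0\n")
    print(resolve_version(pkg))  # VERSION file wins -> "21.0.0"
```

Checking the file first lets release tarballs (which ship a VERSION file) avoid a hard runtime dependency on setuptools_scm and a git checkout.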
@@ -2073,7 +2073,7 @@ class InventorySourceSerializer(UnifiedJobTemplateSerializer, InventorySourceOpt

    class Meta:
        model = InventorySource
        fields = ('*', 'name', 'inventory', 'update_on_launch', 'update_cache_timeout', 'source_project', 'update_on_project_update') + (
        fields = ('*', 'name', 'inventory', 'update_on_launch', 'update_cache_timeout', 'source_project') + (
            'last_update_failed',
            'last_updated',
        )  # Backwards compatibility.

@@ -2136,11 +2136,6 @@ class InventorySourceSerializer(UnifiedJobTemplateSerializer, InventorySourceOpt
            raise serializers.ValidationError(_("Cannot use manual project for SCM-based inventory."))
        return value

    def validate_update_on_project_update(self, value):
        if value and self.instance and self.instance.schedules.exists():
            raise serializers.ValidationError(_("Setting not compatible with existing schedules."))
        return value

    def validate_inventory(self, value):
        if value and value.kind == 'smart':
            raise serializers.ValidationError({"detail": _("Cannot create Inventory Source for Smart Inventory")})

@@ -2191,7 +2186,7 @@ class InventorySourceSerializer(UnifiedJobTemplateSerializer, InventorySourceOpt
        if ('source' in attrs or 'source_project' in attrs) and get_field_from_model_or_attrs('source_project') is None:
            raise serializers.ValidationError({"source_project": _("Project required for scm type sources.")})
        else:
            redundant_scm_fields = list(filter(lambda x: attrs.get(x, None), ['source_project', 'source_path', 'update_on_project_update']))
            redundant_scm_fields = list(filter(lambda x: attrs.get(x, None), ['source_project', 'source_path']))
            if redundant_scm_fields:
                raise serializers.ValidationError({"detail": _("Cannot set %s if not SCM type." % ' '.join(redundant_scm_fields))})

@@ -4745,13 +4740,6 @@ class ScheduleSerializer(LaunchConfigurationBaseSerializer, SchedulePreviewSeria
            raise serializers.ValidationError(_('Inventory Source must be a cloud resource.'))
        elif type(value) == Project and value.scm_type == '':
            raise serializers.ValidationError(_('Manual Project cannot have a schedule set.'))
        elif type(value) == InventorySource and value.source == 'scm' and value.update_on_project_update:
            raise serializers.ValidationError(
                _(
                    'Inventory sources with `update_on_project_update` cannot be scheduled. '
                    'Schedule its source project `{}` instead.'.format(value.source_project.name)
                )
            )
        return value

    def validate(self, attrs):
@@ -115,7 +115,6 @@ from awx.api.metadata import RoleMetadata
from awx.main.constants import ACTIVE_STATES, SURVEY_TYPE_MAPPING
from awx.main.scheduler.dag_workflow import WorkflowDAG
from awx.api.views.mixin import (
    ControlledByScmMixin,
    InstanceGroupMembershipMixin,
    OrganizationCountsMixin,
    RelatedJobsPreventDeleteMixin,

@@ -1675,7 +1674,7 @@ class HostList(HostRelatedSearchMixin, ListCreateAPIView):
        return Response(dict(error=_(str(e))), status=status.HTTP_400_BAD_REQUEST)


class HostDetail(RelatedJobsPreventDeleteMixin, ControlledByScmMixin, RetrieveUpdateDestroyAPIView):
class HostDetail(RelatedJobsPreventDeleteMixin, RetrieveUpdateDestroyAPIView):

    always_allow_superuser = False
    model = models.Host

@@ -1709,7 +1708,7 @@ class InventoryHostsList(HostRelatedSearchMixin, SubListCreateAttachDetachAPIVie
        return qs


class HostGroupsList(ControlledByScmMixin, SubListCreateAttachDetachAPIView):
class HostGroupsList(SubListCreateAttachDetachAPIView):
    '''the list of groups a host is directly a member of'''

    model = models.Group

@@ -1825,7 +1824,7 @@ class EnforceParentRelationshipMixin(object):
        return super(EnforceParentRelationshipMixin, self).create(request, *args, **kwargs)


class GroupChildrenList(ControlledByScmMixin, EnforceParentRelationshipMixin, SubListCreateAttachDetachAPIView):
class GroupChildrenList(EnforceParentRelationshipMixin, SubListCreateAttachDetachAPIView):

    model = models.Group
    serializer_class = serializers.GroupSerializer

@@ -1871,7 +1870,7 @@ class GroupPotentialChildrenList(SubListAPIView):
        return qs.exclude(pk__in=except_pks)


class GroupHostsList(HostRelatedSearchMixin, ControlledByScmMixin, SubListCreateAttachDetachAPIView):
class GroupHostsList(HostRelatedSearchMixin, SubListCreateAttachDetachAPIView):
    '''the list of hosts directly below a group'''

    model = models.Host

@@ -1935,7 +1934,7 @@ class GroupActivityStreamList(SubListAPIView):
        return qs.filter(Q(group=parent) | Q(host__in=parent.hosts.all()))


class GroupDetail(RelatedJobsPreventDeleteMixin, ControlledByScmMixin, RetrieveUpdateDestroyAPIView):
class GroupDetail(RelatedJobsPreventDeleteMixin, RetrieveUpdateDestroyAPIView):

    model = models.Group
    serializer_class = serializers.GroupSerializer

@@ -41,7 +41,7 @@ from awx.api.serializers import (
    JobTemplateSerializer,
    LabelSerializer,
)
from awx.api.views.mixin import RelatedJobsPreventDeleteMixin, ControlledByScmMixin
from awx.api.views.mixin import RelatedJobsPreventDeleteMixin

from awx.api.pagination import UnifiedJobEventPagination

@@ -75,7 +75,7 @@ class InventoryList(ListCreateAPIView):
    serializer_class = InventorySerializer


class InventoryDetail(RelatedJobsPreventDeleteMixin, ControlledByScmMixin, RetrieveUpdateDestroyAPIView):
class InventoryDetail(RelatedJobsPreventDeleteMixin, RetrieveUpdateDestroyAPIView):

    model = Inventory
    serializer_class = InventorySerializer
@@ -10,13 +10,12 @@ from django.shortcuts import get_object_or_404
from django.utils.timezone import now
from django.utils.translation import gettext_lazy as _

from rest_framework.permissions import SAFE_METHODS
from rest_framework.exceptions import PermissionDenied
from rest_framework.response import Response
from rest_framework import status

from awx.main.constants import ACTIVE_STATES
from awx.main.utils import get_object_or_400, parse_yaml_or_json
from awx.main.utils import get_object_or_400
from awx.main.models.ha import Instance, InstanceGroup
from awx.main.models.organization import Team
from awx.main.models.projects import Project

@@ -186,35 +185,6 @@ class OrganizationCountsMixin(object):
        return full_context


class ControlledByScmMixin(object):
    """
    Special method to reset SCM inventory commit hash
    if anything that it manages changes.
    """

    def _reset_inv_src_rev(self, obj):
        if self.request.method in SAFE_METHODS or not obj:
            return
        project_following_sources = obj.inventory_sources.filter(update_on_project_update=True, source='scm')
        if project_following_sources:
            # Allow inventory changes unrelated to variables
            if self.model == Inventory and (
                not self.request or not self.request.data or parse_yaml_or_json(self.request.data.get('variables', '')) == parse_yaml_or_json(obj.variables)
            ):
                return
            project_following_sources.update(scm_last_revision='')

    def get_object(self):
        obj = super(ControlledByScmMixin, self).get_object()
        self._reset_inv_src_rev(obj)
        return obj

    def get_parent_object(self):
        obj = super(ControlledByScmMixin, self).get_parent_object()
        self._reset_inv_src_rev(obj)
        return obj


class NoTruncateMixin(object):
    def get_serializer_context(self):
        context = super().get_serializer_context()
@@ -129,7 +129,7 @@ def config(since, **kwargs):
    }


@register('counts', '1.1', description=_('Counts of objects such as organizations, inventories, and projects'))
@register('counts', '1.2', description=_('Counts of objects such as organizations, inventories, and projects'))
def counts(since, **kwargs):
    counts = {}
    for cls in (

@@ -172,6 +172,13 @@ def counts(since, **kwargs):
        .count()
    )
    counts['pending_jobs'] = models.UnifiedJob.objects.exclude(launch_type='sync').filter(status__in=('pending',)).count()
    if connection.vendor == 'postgresql':
        with connection.cursor() as cursor:
            cursor.execute(f"select count(*) from pg_stat_activity where datname=\'{connection.settings_dict['NAME']}\'")
            counts['database_connections'] = cursor.fetchone()[0]
    else:
        # We should be using postgresql, but if that ever changes we should update the value below
        counts['database_connections'] = 1
    return counts
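The new collector counts active connections from `pg_stat_activity`, which only exists on Postgres, hence the vendor check. A rough standalone illustration of the same pattern (vendor guard plus a single-row count fetch), using sqlite3 in place of Postgres and a parameterized placeholder rather than the diff's f-string; the function name and database name are made up for the example:

```python
import sqlite3


def database_connections(conn, vendor):
    # Only Postgres exposes pg_stat_activity; report a fixed value otherwise,
    # mirroring the fallback branch in the collector above.
    if vendor == 'postgresql':
        cur = conn.cursor()
        cur.execute("select count(*) from pg_stat_activity where datname = ?", ('awx',))
        return cur.fetchone()[0]
    return 1


conn = sqlite3.connect(':memory:')
print(database_connections(conn, 'sqlite'))  # -> 1
```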
@@ -126,6 +126,8 @@ def metrics():
    LICENSE_INSTANCE_TOTAL = Gauge('awx_license_instance_total', 'Total number of managed hosts provided by your license', registry=REGISTRY)
    LICENSE_INSTANCE_FREE = Gauge('awx_license_instance_free', 'Number of remaining managed hosts provided by your license', registry=REGISTRY)

    DATABASE_CONNECTIONS = Gauge('awx_database_connections_total', 'Number of connections to database', registry=REGISTRY)

    license_info = get_license()
    SYSTEM_INFO.info(
        {

@@ -163,6 +165,8 @@ def metrics():
    USER_SESSIONS.labels(type='user').set(current_counts['active_user_sessions'])
    USER_SESSIONS.labels(type='anonymous').set(current_counts['active_anonymous_sessions'])

    DATABASE_CONNECTIONS.set(current_counts['database_connections'])

    all_job_data = job_counts(None)
    statuses = all_job_data.get('status', {})
    for status, value in statuses.items():
@@ -10,6 +10,27 @@ from awx.main.models import Instance, UnifiedJob, WorkflowJob

logger = logging.getLogger('awx.main.dispatch')


def startup_reaping():
    """
    If this particular instance is starting, then we know that any running jobs are invalid,
    so we reap those jobs as a special action here
    """
    me = Instance.objects.me()
    jobs = UnifiedJob.objects.filter(status='running', controller_node=me.hostname)
    job_ids = []
    for j in jobs:
        job_ids.append(j.id)
        j.status = 'failed'
        j.start_args = ''
        j.job_explanation += 'Task was marked as running at system start up. The system must not have shut down properly, so it has been marked as failed.'
        j.save(update_fields=['status', 'start_args', 'job_explanation'])
        if hasattr(j, 'send_notification_templates'):
            j.send_notification_templates('failed')
        j.websocket_emit_status('failed')
    if job_ids:
        logger.error(f'Unified jobs {job_ids} were reaped on dispatch startup')


def reap_job(j, status):
    if UnifiedJob.objects.get(id=j.id).status not in ('running', 'waiting'):
        # just in case, don't reap jobs that aren't running
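Stripped of the ORM and notification plumbing, `startup_reaping` amounts to "mark every job this node recorded as running as failed." A toy sketch of that state transition over plain dicts (no Django; the helper name and field subset are illustrative, though the field names mirror the diff):

```python
def startup_reap(jobs, hostname):
    """Mark jobs recorded as running on `hostname` as failed; return reaped ids."""
    reaped = []
    for job in jobs:
        if job['status'] == 'running' and job['controller_node'] == hostname:
            job['status'] = 'failed'
            job['job_explanation'] = 'Task was marked as running at system start up.'
            reaped.append(job['id'])
    return reaped


jobs = [
    {'id': 1, 'status': 'running', 'controller_node': 'node1', 'job_explanation': ''},
    {'id': 2, 'status': 'successful', 'controller_node': 'node1', 'job_explanation': ''},
    {'id': 3, 'status': 'running', 'controller_node': 'node2', 'job_explanation': ''},
]
print(startup_reap(jobs, 'node1'))  # -> [1]
```

Filtering on `controller_node` is what keeps one node's restart from reaping jobs legitimately running elsewhere in the cluster.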
@@ -169,8 +169,9 @@ class AWXConsumerPG(AWXConsumerBase):
                    logger.exception(f"Error consuming new events from postgres, will retry for {self.pg_max_wait} s")
                    self.pg_down_time = time.time()
                    self.pg_is_down = True
                if time.time() - self.pg_down_time > self.pg_max_wait:
                    logger.warning(f"Postgres event consumer has not recovered in {self.pg_max_wait} s, exiting")
                current_downtime = time.time() - self.pg_down_time
                if current_downtime > self.pg_max_wait:
                    logger.exception(f"Postgres event consumer has not recovered in {current_downtime} s, exiting")
                    raise
                # Wait for a second before the next attempt, but still listen for any shutdown signals
                for i in range(10):

@@ -179,6 +180,10 @@ class AWXConsumerPG(AWXConsumerBase):
                    time.sleep(0.1)
                    for conn in db.connections.all():
                        conn.close_if_unusable_or_obsolete()
            except Exception:
                # Log unanticipated exception in addition to writing to stderr to get timestamps and other metadata
                logger.exception('Encountered unhandled error in dispatcher main loop')
                raise


class BaseWorker(object):
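The reworked consumer loop measures cumulative downtime against `pg_max_wait` instead of comparing a single timestamp. A condensed sketch of that retry policy with an injectable clock so the timing is deterministic (all names here are illustrative, not AWX APIs):

```python
def retry_until_recovered(attempt, max_wait, clock):
    """Call `attempt` until it succeeds or more than `max_wait` seconds of
    downtime accrue, then re-raise the last failure."""
    down_since = None
    while True:
        try:
            return attempt()
        except Exception:
            now = clock()
            if down_since is None:
                down_since = now          # first failure: start the downtime clock
            if now - down_since > max_wait:
                raise                     # give up: downtime exceeded max_wait


# A fake clock ticking one second per failure; the third attempt succeeds.
fake_time = iter(range(100))
calls = []
def flaky():
    calls.append(1)
    if len(calls) < 3:
        raise RuntimeError("db down")
    return "ok"

print(retry_until_recovered(flaky, max_wait=60, clock=lambda: next(fake_time)))  # -> ok
```

Tracking `down_since` rather than re-testing a stale timestamp is what lets the log line report the actual downtime at the moment the loop gives up.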
@@ -53,7 +53,7 @@ class Command(BaseCommand):
        # (like the node heartbeat)
        periodic.run_continuously()

        reaper.reap()
        reaper.startup_reaping()
        consumer = None

        try:
@@ -0,0 +1,40 @@
# Generated by Django 3.2.13 on 2022-06-21 21:29

from django.db import migrations
import logging

logger = logging.getLogger("awx")


def forwards(apps, schema_editor):
    InventorySource = apps.get_model('main', 'InventorySource')
    sources = InventorySource.objects.filter(update_on_project_update=True)
    for src in sources:
        if src.update_on_launch == False:
            src.update_on_launch = True
            src.save(update_fields=['update_on_launch'])
            logger.info(f"Setting update_on_launch to True for {src}")
        proj = src.source_project
        if proj and proj.scm_update_on_launch is False:
            proj.scm_update_on_launch = True
            proj.save(update_fields=['scm_update_on_launch'])
            logger.warning(f"Setting scm_update_on_launch to True for {proj}")


class Migration(migrations.Migration):

    dependencies = [
        ('main', '0163_convert_job_tags_to_textfield'),
    ]

    operations = [
        migrations.RunPython(forwards, migrations.RunPython.noop),
        migrations.RemoveField(
            model_name='inventorysource',
            name='scm_last_revision',
        ),
        migrations.RemoveField(
            model_name='inventorysource',
            name='update_on_project_update',
        ),
    ]
@@ -35,6 +35,7 @@ def gce(cred, env, private_data_dir):
    container_path = to_container_path(path, private_data_dir)
    env['GCE_CREDENTIALS_FILE_PATH'] = container_path
    env['GCP_SERVICE_ACCOUNT_FILE'] = container_path
    env['GOOGLE_APPLICATION_CREDENTIALS'] = container_path

    # Handle env variables for new module types.
    # This includes gcp_compute inventory plugin and
@@ -985,22 +985,11 @@ class InventorySource(UnifiedJobTemplate, InventorySourceOptions, CustomVirtualE
        default=None,
        null=True,
    )
    scm_last_revision = models.CharField(
        max_length=1024,
        blank=True,
        default='',
        editable=False,
    )
    update_on_project_update = models.BooleanField(
        default=False,
        help_text=_(
            'This field is deprecated and will be removed in a future release. '
            'In a future release, functionality will be migrated to the source project update_on_launch.'
        ),
    )

    update_on_launch = models.BooleanField(
        default=False,
    )

    update_cache_timeout = models.PositiveIntegerField(
        default=0,
    )

@@ -1038,14 +1027,6 @@ class InventorySource(UnifiedJobTemplate, InventorySourceOptions, CustomVirtualE
            self.name = 'inventory source (%s)' % replace_text
            if 'name' not in update_fields:
                update_fields.append('name')
        # Reset revision if SCM source has changed parameters
        if self.source == 'scm' and not is_new_instance:
            before_is = self.__class__.objects.get(pk=self.pk)
            if before_is.source_path != self.source_path or before_is.source_project_id != self.source_project_id:
                # Reset the scm_revision if file changed to force update
                self.scm_last_revision = ''
                if 'scm_last_revision' not in update_fields:
                    update_fields.append('scm_last_revision')

        # Do the actual save.
        super(InventorySource, self).save(*args, **kwargs)

@@ -1054,10 +1035,6 @@ class InventorySource(UnifiedJobTemplate, InventorySourceOptions, CustomVirtualE
        if replace_text in self.name:
            self.name = self.name.replace(replace_text, str(self.pk))
            super(InventorySource, self).save(update_fields=['name'])
        if self.source == 'scm' and is_new_instance and self.update_on_project_update:
            # Schedule a new Project update if one is not already queued
            if self.source_project and not self.source_project.project_updates.filter(status__in=['new', 'pending', 'waiting']).exists():
                self.update()
        if not getattr(_inventory_updates, 'is_updating', False):
            if self.inventory is not None:
                self.inventory.update_computed_fields()

@@ -1147,25 +1124,6 @@ class InventorySource(UnifiedJobTemplate, InventorySourceOptions, CustomVirtualE
        )
        return dict(error=list(error_notification_templates), started=list(started_notification_templates), success=list(success_notification_templates))

    def clean_update_on_project_update(self):
        if (
            self.update_on_project_update is True
            and self.source == 'scm'
            and InventorySource.objects.filter(Q(inventory=self.inventory, update_on_project_update=True, source='scm') & ~Q(id=self.id)).exists()
        ):
            raise ValidationError(_("More than one SCM-based inventory source with update on project update per-inventory not allowed."))
        return self.update_on_project_update

    def clean_update_on_launch(self):
        if self.update_on_project_update is True and self.source == 'scm' and self.update_on_launch is True:
            raise ValidationError(
                _(
                    "Cannot update SCM-based inventory source on launch if set to update on project update. "
                    "Instead, configure the corresponding source project to update on launch."
                )
            )
        return self.update_on_launch

    def clean_source_path(self):
        if self.source != 'scm' and self.source_path:
            raise ValidationError(_("Cannot set source_path if not SCM type."))

@@ -1301,13 +1259,6 @@ class InventoryUpdate(UnifiedJob, InventorySourceOptions, JobNotificationMixin,
            return self.global_instance_groups
        return selected_groups

    def cancel(self, job_explanation=None, is_chain=False):
        res = super(InventoryUpdate, self).cancel(job_explanation=job_explanation, is_chain=is_chain)
        if res:
            if self.launch_type != 'scm' and self.source_project_update:
                self.source_project_update.cancel(job_explanation=job_explanation)
        return res


class CustomInventoryScript(CommonModelNameNotUnique, ResourceMixin):
    class Meta:
@@ -743,6 +743,12 @@ class Job(UnifiedJob, JobOptions, SurveyJobMixin, JobNotificationMixin, TaskMana
            return "$hidden due to Ansible no_log flag$"
        return artifacts

    def get_effective_artifacts(self, **kwargs):
        """Return unified job artifacts (from set_stats) to pass downstream in workflows"""
        if isinstance(self.artifacts, dict):
            return self.artifacts
        return {}

    @property
    def is_container_group_task(self):
        return bool(self.instance_group and self.instance_group.is_container_group)
@@ -533,7 +533,7 @@ class UnifiedJob(
        ('workflow', _('Workflow')),  # Job was started from a workflow job.
        ('webhook', _('Webhook')),  # Job was started from a webhook event.
        ('sync', _('Sync')),  # Job was started from a project sync.
        ('scm', _('SCM Update')),  # Job was created as an Inventory SCM sync.
        ('scm', _('SCM Update')),  # (deprecated) Job was created as an Inventory SCM sync.
    ]

    PASSWORD_FIELDS = ('start_args',)

@@ -1204,6 +1204,10 @@ class UnifiedJob(
            pass
        return None

    def get_effective_artifacts(self, **kwargs):
        """Return unified job artifacts (from set_stats) to pass downstream in workflows"""
        return {}

    def get_passwords_needed_to_start(self):
        return []
@@ -318,8 +318,8 @@ class WorkflowJobNode(WorkflowNodeBase):
        for parent_node in self.get_parent_nodes():
            is_root_node = False
            aa_dict.update(parent_node.ancestor_artifacts)
            if parent_node.job and hasattr(parent_node.job, 'artifacts'):
                aa_dict.update(parent_node.job.artifacts)
            if parent_node.job:
                aa_dict.update(parent_node.job.get_effective_artifacts(parents_set=set([self.workflow_job_id])))
        if aa_dict and not is_root_node:
            self.ancestor_artifacts = aa_dict
            self.save(update_fields=['ancestor_artifacts'])

@@ -682,6 +682,27 @@ class WorkflowJob(UnifiedJob, WorkflowJobOptions, SurveyJobMixin, JobNotificatio
            wj = wj.get_workflow_job()
        return ancestors

    def get_effective_artifacts(self, **kwargs):
        """
        For downstream jobs of a workflow nested inside of a workflow,
        we send aggregated artifacts from the nodes inside of the nested workflow
        """
        artifacts = {}
        job_queryset = (
            UnifiedJob.objects.filter(unified_job_node__workflow_job=self)
            .defer('job_args', 'job_cwd', 'start_args', 'result_traceback')
            .order_by('finished', 'id')
            .filter(status__in=['successful', 'failed'])
            .iterator()
        )
        parents_set = kwargs.get('parents_set', set())
        new_parents_set = parents_set | {self.id}
        for job in job_queryset:
            if job.id in parents_set:
                continue
            artifacts.update(job.get_effective_artifacts(parents_set=new_parents_set))
        return artifacts

    def get_notification_templates(self):
        return self.workflow_job_template.notification_templates
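The `parents_set` threading in `WorkflowJob.get_effective_artifacts` is a cycle guard: a nested workflow recursing into its children must skip any workflow already on the current path. A self-contained sketch of that aggregation over a plain adjacency map (names and data structures are illustrative, not the ORM-backed originals):

```python
def effective_artifacts(node, graph, leaves, parents=frozenset()):
    """Aggregate artifacts across a (possibly nested) workflow graph.

    `parents` carries the workflow ids already on the path, mirroring the
    diff's parents_set guard against self-referencing workflows.
    """
    children = graph.get(node, [])
    if not children:                       # a leaf job contributes its own artifacts
        return dict(leaves.get(node, {}))
    artifacts = {}
    visited = parents | {node}
    for child in children:
        if child in parents:               # skip a workflow already on the path (cycle)
            continue
        artifacts.update(effective_artifacts(child, graph, leaves, visited))
    return artifacts


# A workflow containing a nested workflow with two finished jobs.
graph = {'wf': ['inner'], 'inner': ['job1', 'job2']}
leaves = {'job1': {'a': 1}, 'job2': {'b': 2}}
print(effective_artifacts('wf', graph, leaves))  # -> {'a': 1, 'b': 2}
```

Because later children overwrite earlier keys via `dict.update`, ordering children (the diff orders by `finished, id`) determines which job wins when two jobs set the same stat.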
@@ -248,11 +248,11 @@ class TaskManager:
            workflow_job.save(update_fields=update_fields)
            status_changed = True
        if status_changed:
            if workflow_job.spawned_by_workflow:
                schedule_task_manager()
            workflow_job.websocket_emit_status(workflow_job.status)
            # Operations whose queries rely on modifications made during the atomic scheduling session
            workflow_job.send_notification_templates('succeeded' if workflow_job.status == 'successful' else 'failed')
            if workflow_job.spawned_by_workflow:
                schedule_task_manager()
        return result

    @timeit
@ -16,6 +16,7 @@ from awx.main.redact import UriCleaner
|
||||
from awx.main.constants import MINIMAL_EVENTS, ANSIBLE_RUNNER_NEEDS_UPDATE_MESSAGE
|
||||
from awx.main.utils.update_model import update_model
|
||||
from awx.main.queue import CallbackQueueDispatcher
|
||||
from awx.main.tasks.signals import signal_callback
|
||||
|
||||
logger = logging.getLogger('awx.main.tasks.callback')
|
||||
|
||||
@ -179,7 +180,13 @@ class RunnerCallback:
|
||||
Ansible runner callback to tell the job when/if it is canceled
|
||||
"""
|
||||
unified_job_id = self.instance.pk
|
||||
self.instance = self.update_model(unified_job_id)
|
||||
if signal_callback():
|
||||
return True
|
||||
try:
|
||||
self.instance = self.update_model(unified_job_id)
|
||||
except Exception:
|
||||
logger.exception(f'Encountered error during cancel check for {unified_job_id}, canceling now')
|
||||
return True
|
||||
if not self.instance:
|
||||
logger.error('unified job {} was deleted while running, canceling'.format(unified_job_id))
|
||||
return True
|
||||
|
||||
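The cancel-check change above can be sketched standalone: either a received shutdown signal or a database cancel flag requests cancellation, and a failed status lookup cancels defensively rather than running blind. This is a simplified, hypothetical sketch of the pattern, not the actual `RunnerCallback` code.

```python
_shutdown = False  # stands in for awx.main.tasks.signals.signal_callback()


def signal_callback():
    return _shutdown


def cancel_callback(fetch_cancel_flag):
    """Return True if the running job should stop."""
    if signal_callback():  # a shutdown signal wins immediately
        return True
    try:
        cancel_flag = fetch_cancel_flag()
    except Exception:
        # If the flag cannot be checked (e.g. DB error), cancel rather than run blind.
        return True
    return bool(cancel_flag)
```

The key design choice mirrored here is that the signal check happens before any database access, so a worker shutting down never blocks on a query.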
@@ -19,7 +19,6 @@ from uuid import uuid4

# Django
from django.conf import settings
from django.db import transaction


# Runner
@@ -34,7 +33,6 @@ from gitdb.exc import BadName as BadGitName
from awx.main.dispatch.publish import task
from awx.main.dispatch import get_local_queuename
from awx.main.constants import (
    ACTIVE_STATES,
    PRIVILEGE_ESCALATION_METHODS,
    STANDARD_INVENTORY_UPDATE_ENV,
    JOB_FOLDER_PREFIX,
@@ -64,6 +62,7 @@ from awx.main.tasks.callback import (
    RunnerCallbackForProjectUpdate,
    RunnerCallbackForSystemJob,
)
from awx.main.tasks.signals import with_signal_handling, signal_callback
from awx.main.tasks.receptor import AWXReceptorJob
from awx.main.exceptions import AwxTaskError, PostRunError, ReceptorNodeNotFound
from awx.main.utils.ansible import read_ansible_config
@@ -394,6 +393,7 @@ class BaseTask(object):
            instance.save(update_fields=['ansible_version'])

    @with_path_cleanup
    @with_signal_handling
    def run(self, pk, **kwargs):
        """
        Run the job/task and capture its output.
@@ -425,7 +425,7 @@ class BaseTask(object):
            private_data_dir = self.build_private_data_dir(self.instance)
            self.pre_run_hook(self.instance, private_data_dir)
            self.instance.log_lifecycle("preparing_playbook")
            if self.instance.cancel_flag:
            if self.instance.cancel_flag or signal_callback():
                self.instance = self.update_model(self.instance.pk, status='canceled')
            if self.instance.status != 'running':
                # Stop the task chain and prevent starting the job if it has
@@ -547,6 +547,11 @@ class BaseTask(object):
                self.runner_callback.delay_update(skip_if_already_set=True, job_explanation=f"Job terminated due to {status}")
                if status == 'timeout':
                    status = 'failed'
            elif status == 'canceled':
                self.instance = self.update_model(pk)
                if (getattr(self.instance, 'cancel_flag', False) is False) and signal_callback():
                    self.runner_callback.delay_update(job_explanation="Task was canceled due to receiving a shutdown signal.")
                    status = 'failed'
        except ReceptorNodeNotFound as exc:
            self.runner_callback.delay_update(job_explanation=str(exc))
        except Exception:
@@ -1168,64 +1173,6 @@ class RunProjectUpdate(BaseTask):
        d[r'^Are you sure you want to continue connecting \(yes/no\)\?\s*?$'] = 'yes'
        return d

    def _update_dependent_inventories(self, project_update, dependent_inventory_sources):
        scm_revision = project_update.project.scm_revision
        inv_update_class = InventoryUpdate._get_task_class()
        for inv_src in dependent_inventory_sources:
            if not inv_src.update_on_project_update:
                continue
            if inv_src.scm_last_revision == scm_revision:
                logger.debug('Skipping SCM inventory update for `{}` because ' 'project has not changed.'.format(inv_src.name))
                continue
            logger.debug('Local dependent inventory update for `{}`.'.format(inv_src.name))
            with transaction.atomic():
                if InventoryUpdate.objects.filter(inventory_source=inv_src, status__in=ACTIVE_STATES).exists():
                    logger.debug('Skipping SCM inventory update for `{}` because ' 'another update is already active.'.format(inv_src.name))
                    continue

                if settings.IS_K8S:
                    instance_group = InventoryUpdate(inventory_source=inv_src).preferred_instance_groups[0]
                else:
                    instance_group = project_update.instance_group

                local_inv_update = inv_src.create_inventory_update(
                    _eager_fields=dict(
                        launch_type='scm',
                        status='running',
                        instance_group=instance_group,
                        execution_node=project_update.execution_node,
                        controller_node=project_update.execution_node,
                        source_project_update=project_update,
                        celery_task_id=project_update.celery_task_id,
                    )
                )
                local_inv_update.log_lifecycle("controller_node_chosen")
                local_inv_update.log_lifecycle("execution_node_chosen")
            try:
                create_partition(local_inv_update.event_class._meta.db_table, start=local_inv_update.created)
                inv_update_class().run(local_inv_update.id)
            except Exception:
                logger.exception('{} Unhandled exception updating dependent SCM inventory sources.'.format(project_update.log_format))

            try:
                project_update.refresh_from_db()
            except ProjectUpdate.DoesNotExist:
                logger.warning('Project update deleted during updates of dependent SCM inventory sources.')
                break
            try:
                local_inv_update.refresh_from_db()
            except InventoryUpdate.DoesNotExist:
                logger.warning('%s Dependent inventory update deleted during execution.', project_update.log_format)
                continue
            if project_update.cancel_flag:
                logger.info('Project update {} was canceled while updating dependent inventories.'.format(project_update.log_format))
                break
            if local_inv_update.cancel_flag:
                logger.info('Continuing to process project dependencies after {} was canceled'.format(local_inv_update.log_format))
            if local_inv_update.status == 'successful':
                inv_src.scm_last_revision = scm_revision
                inv_src.save(update_fields=['scm_last_revision'])

    def release_lock(self, instance):
        try:
            fcntl.lockf(self.lock_fd, fcntl.LOCK_UN)
@@ -1435,12 +1382,6 @@ class RunProjectUpdate(BaseTask):
                p.inventory_files = p.inventories
            p.save(update_fields=['scm_revision', 'playbook_files', 'inventory_files'])

        # Update any inventories that depend on this project
        dependent_inventory_sources = p.scm_inventory_sources.filter(update_on_project_update=True)
        if len(dependent_inventory_sources) > 0:
            if status == 'successful' and instance.launch_type != 'sync':
                self._update_dependent_inventories(instance, dependent_inventory_sources)

    def build_execution_environment_params(self, instance, private_data_dir):
        if settings.IS_K8S:
            return {}
@@ -1620,9 +1561,7 @@ class RunInventoryUpdate(BaseTask):
        source_project = None
        if inventory_update.inventory_source:
            source_project = inventory_update.inventory_source.source_project
        if (
            inventory_update.source == 'scm' and inventory_update.launch_type != 'scm' and source_project and source_project.scm_type
        ):  # never ever update manual projects
        if inventory_update.source == 'scm' and source_project and source_project.scm_type:  # never ever update manual projects

            # Check if the content cache exists, so that we do not unnecessarily re-download roles
            sync_needs = ['update_{}'.format(source_project.scm_type)]
@@ -1655,8 +1594,6 @@ class RunInventoryUpdate(BaseTask):
                sync_task = project_update_task(job_private_data_dir=private_data_dir)
                sync_task.run(local_project_sync.id)
                local_project_sync.refresh_from_db()
                inventory_update.inventory_source.scm_last_revision = local_project_sync.scm_revision
                inventory_update.inventory_source.save(update_fields=['scm_last_revision'])
            except Exception:
                inventory_update = self.update_model(
                    inventory_update.pk,
@@ -1667,9 +1604,6 @@ class RunInventoryUpdate(BaseTask):
                    ),
                )
                raise
        elif inventory_update.source == 'scm' and inventory_update.launch_type == 'scm' and source_project:
            # This follows update, not sync, so make copy here
            RunProjectUpdate.make_local_copy(source_project, private_data_dir)

    def post_run_hook(self, inventory_update, status):
        super(RunInventoryUpdate, self).post_run_hook(inventory_update, status)
63
awx/main/tasks/signals.py
Normal file
@@ -0,0 +1,63 @@
import signal
import functools
import logging


logger = logging.getLogger('awx.main.tasks.signals')


__all__ = ['with_signal_handling', 'signal_callback']


class SignalState:
    def reset(self):
        self.sigterm_flag = False
        self.is_active = False
        self.original_sigterm = None
        self.original_sigint = None

    def __init__(self):
        self.reset()

    def set_flag(self, *args):
        """Method to pass into the python signal.signal method to receive signals"""
        self.sigterm_flag = True

    def connect_signals(self):
        self.original_sigterm = signal.getsignal(signal.SIGTERM)
        self.original_sigint = signal.getsignal(signal.SIGINT)
        signal.signal(signal.SIGTERM, self.set_flag)
        signal.signal(signal.SIGINT, self.set_flag)
        self.is_active = True

    def restore_signals(self):
        signal.signal(signal.SIGTERM, self.original_sigterm)
        signal.signal(signal.SIGINT, self.original_sigint)
        self.reset()


signal_state = SignalState()


def signal_callback():
    return signal_state.sigterm_flag


def with_signal_handling(f):
    """
    Change signal handling to make signal_callback return True in event of SIGTERM or SIGINT.
    """

    @functools.wraps(f)
    def _wrapped(*args, **kwargs):
        try:
            this_is_outermost_caller = False
            if not signal_state.is_active:
                signal_state.connect_signals()
                this_is_outermost_caller = True
            return f(*args, **kwargs)
        finally:
            if this_is_outermost_caller:
                signal_state.restore_signals()

    return _wrapped
@@ -114,10 +114,6 @@ def inform_cluster_of_shutdown():
    try:
        this_inst = Instance.objects.get(hostname=settings.CLUSTER_HOST_ID)
        this_inst.mark_offline(update_last_seen=True, errors=_('Instance received normal shutdown signal'))
        try:
            reaper.reap(this_inst)
        except Exception:
            logger.exception('failed to reap jobs for {}'.format(this_inst.hostname))
        logger.warning('Normal shutdown signal for instance {}, ' 'removed self from capacity pool.'.format(this_inst.hostname))
    except Exception:
        logger.exception('Encountered problem with normal shutdown signal.')
@@ -2,8 +2,9 @@
        "ANSIBLE_JINJA2_NATIVE": "True",
        "ANSIBLE_TRANSFORM_INVALID_GROUP_CHARS": "never",
        "GCE_CREDENTIALS_FILE_PATH": "{{ file_reference }}",
        "GOOGLE_APPLICATION_CREDENTIALS": "{{ file_reference }}",
        "GCP_AUTH_KIND": "serviceaccount",
        "GCP_ENV_TYPE": "tower",
        "GCP_PROJECT": "fooo",
        "GCP_SERVICE_ACCOUNT_FILE": "{{ file_reference }}"
    }
}
@@ -26,6 +26,7 @@ def test_empty():
        "workflow_job_template": 0,
        "unified_job": 0,
        "pending_jobs": 0,
        "database_connections": 1,
    }


@@ -31,6 +31,7 @@ EXPECTED_VALUES = {
    'awx_license_instance_total': 0,
    'awx_license_instance_free': 0,
    'awx_pending_jobs_total': 0,
    'awx_database_connections_total': 1,
}
@@ -9,9 +9,7 @@ from awx.api.versioning import reverse
@pytest.fixture
def ec2_source(inventory, project):
    with mock.patch('awx.main.models.unified_jobs.UnifiedJobTemplate.update'):
        return inventory.inventory_sources.create(
            name='some_source', update_on_project_update=True, source='ec2', source_project=project, scm_last_revision=project.scm_revision
        )
        return inventory.inventory_sources.create(name='some_source', source='ec2', source_project=project)


@pytest.fixture

@@ -13,9 +13,7 @@ from awx.main.models import InventorySource, Inventory, ActivityStream
@pytest.fixture
def scm_inventory(inventory, project):
    with mock.patch('awx.main.models.unified_jobs.UnifiedJobTemplate.update'):
        inventory.inventory_sources.create(
            name='foobar', update_on_project_update=True, source='scm', source_project=project, scm_last_revision=project.scm_revision
        )
        inventory.inventory_sources.create(name='foobar', source='scm', source_project=project)
    return inventory

@@ -23,9 +21,7 @@ def scm_inventory(inventory, project):
def factory_scm_inventory(inventory, project):
    def fn(**kwargs):
        with mock.patch('awx.main.models.unified_jobs.UnifiedJobTemplate.update'):
            return inventory.inventory_sources.create(
                source_project=project, overwrite_vars=True, source='scm', scm_last_revision=project.scm_revision, **kwargs
            )
            return inventory.inventory_sources.create(source_project=project, overwrite_vars=True, source='scm', **kwargs)

    return fn

@@ -544,15 +540,12 @@ class TestControlledBySCM:
    def test_safe_method_works(self, get, options, scm_inventory, admin_user):
        get(scm_inventory.get_absolute_url(), admin_user, expect=200)
        options(scm_inventory.get_absolute_url(), admin_user, expect=200)
        assert InventorySource.objects.get(inventory=scm_inventory.pk).scm_last_revision != ''

    def test_vars_edit_reset(self, patch, scm_inventory, admin_user):
        patch(scm_inventory.get_absolute_url(), {'variables': 'hello: world'}, admin_user, expect=200)
        assert InventorySource.objects.get(inventory=scm_inventory.pk).scm_last_revision == ''

    def test_name_edit_allowed(self, patch, scm_inventory, admin_user):
        patch(scm_inventory.get_absolute_url(), {'variables': '---', 'name': 'newname'}, admin_user, expect=200)
        assert InventorySource.objects.get(inventory=scm_inventory.pk).scm_last_revision != ''

    def test_host_associations_reset(self, post, scm_inventory, admin_user):
        inv_src = scm_inventory.inventory_sources.first()
@@ -560,14 +553,12 @@ class TestControlledBySCM:
        g = inv_src.groups.create(name='fooland', inventory=scm_inventory)
        post(reverse('api:host_groups_list', kwargs={'pk': h.id}), {'id': g.id}, admin_user, expect=204)
        post(reverse('api:group_hosts_list', kwargs={'pk': g.id}), {'id': h.id}, admin_user, expect=204)
        assert InventorySource.objects.get(inventory=scm_inventory.pk).scm_last_revision == ''

    def test_group_group_associations_reset(self, post, scm_inventory, admin_user):
        inv_src = scm_inventory.inventory_sources.first()
        g1 = inv_src.groups.create(name='barland', inventory=scm_inventory)
        g2 = inv_src.groups.create(name='fooland', inventory=scm_inventory)
        post(reverse('api:group_children_list', kwargs={'pk': g1.id}), {'id': g2.id}, admin_user, expect=204)
        assert InventorySource.objects.get(inventory=scm_inventory.pk).scm_last_revision == ''

    def test_host_group_delete_reset(self, delete, scm_inventory, admin_user):
        inv_src = scm_inventory.inventory_sources.first()
@@ -575,7 +566,6 @@ class TestControlledBySCM:
        g = inv_src.groups.create(name='fooland', inventory=scm_inventory)
        delete(h.get_absolute_url(), admin_user, expect=204)
        delete(g.get_absolute_url(), admin_user, expect=204)
        assert InventorySource.objects.get(inventory=scm_inventory.pk).scm_last_revision == ''

    def test_remove_scm_inv_src(self, delete, scm_inventory, admin_user):
        inv_src = scm_inventory.inventory_sources.first()
@@ -588,7 +578,6 @@ class TestControlledBySCM:
            {
                'name': 'new inv src',
                'source_project': project.pk,
                'update_on_project_update': False,
                'source': 'scm',
                'overwrite_vars': True,
                'source_vars': 'plugin: a.b.c',
@@ -597,27 +586,6 @@ class TestControlledBySCM:
            expect=201,
        )

    def test_adding_inv_src_prohibited(self, post, scm_inventory, project, admin_user):
        post(
            reverse('api:inventory_inventory_sources_list', kwargs={'pk': scm_inventory.id}),
            {'name': 'new inv src', 'source_project': project.pk, 'update_on_project_update': True, 'source': 'scm', 'overwrite_vars': True},
            admin_user,
            expect=400,
        )

    def test_two_update_on_project_update_inv_src_prohibited(self, patch, scm_inventory, factory_scm_inventory, project, admin_user):
        scm_inventory2 = factory_scm_inventory(name="scm_inventory2")
        res = patch(
            reverse('api:inventory_source_detail', kwargs={'pk': scm_inventory2.id}),
            {
                'update_on_project_update': True,
            },
            admin_user,
            expect=400,
        )
        content = json.loads(res.content)
        assert content['update_on_project_update'] == ["More than one SCM-based inventory source with update on project update " "per-inventory not allowed."]

    def test_adding_inv_src_without_proj_access_prohibited(self, post, project, inventory, rando):
        inventory.admin_role.members.add(rando)
        post(
@@ -347,9 +347,7 @@ def scm_inventory_source(inventory, project):
        source_project=project,
        source='scm',
        source_path='inventory_file',
        update_on_project_update=True,
        inventory=inventory,
        scm_last_revision=project.scm_revision,
    )
    with mock.patch('awx.main.models.unified_jobs.UnifiedJobTemplate.update'):
        inv_src.save()
@@ -3,8 +3,6 @@
import pytest
from unittest import mock

from django.core.exceptions import ValidationError

# AWX
from awx.main.models import Host, Inventory, InventorySource, InventoryUpdate, CredentialType, Credential, Job
from awx.main.constants import CLOUD_PROVIDERS
@@ -123,19 +121,6 @@ class TestActiveCount:

@pytest.mark.django_db
class TestSCMUpdateFeatures:
    def test_automatic_project_update_on_create(self, inventory, project):
        inv_src = InventorySource(source_project=project, source_path='inventory_file', inventory=inventory, update_on_project_update=True, source='scm')
        with mock.patch.object(inv_src, 'update') as mck_update:
            inv_src.save()
            mck_update.assert_called_once_with()

    def test_reset_scm_revision(self, scm_inventory_source):
        starting_rev = scm_inventory_source.scm_last_revision
        assert starting_rev != ''
        scm_inventory_source.source_path = '/newfolder/newfile.ini'
        scm_inventory_source.save()
        assert scm_inventory_source.scm_last_revision == ''

    def test_source_location(self, scm_inventory_source):
        # Combines project directory with the inventory file specified
        inventory_update = InventoryUpdate(inventory_source=scm_inventory_source, source_path=scm_inventory_source.source_path)
@@ -167,22 +152,6 @@ class TestRelatedJobs:
        assert job.id in [jerb.id for jerb in group._get_related_jobs()]


@pytest.mark.django_db
class TestSCMClean:
    def test_clean_update_on_project_update_multiple(self, inventory):
        inv_src1 = InventorySource(inventory=inventory, update_on_project_update=True, source='scm')
        inv_src1.clean_update_on_project_update()
        inv_src1.save()

        inv_src1.source_vars = '---\nhello: world'
        inv_src1.clean_update_on_project_update()

        inv_src2 = InventorySource(inventory=inventory, update_on_project_update=True, source='scm')

        with pytest.raises(ValidationError):
            inv_src2.clean_update_on_project_update()


@pytest.mark.django_db
class TestInventorySourceInjectors:
    def test_extra_credentials(self, project, credential):
@@ -19,6 +19,7 @@ from awx.api.views import WorkflowJobTemplateNodeSuccessNodesList
# Django
from django.test import TransactionTestCase
from django.core.exceptions import ValidationError
from django.utils.timezone import now


class TestWorkflowDAGFunctional(TransactionTestCase):
@@ -381,3 +382,38 @@ def test_workflow_ancestors_recursion_prevention(organization):
    WorkflowJobNode.objects.create(workflow_job=wfj, unified_job_template=wfjt, job=wfj)  # well, this is a problem
    # mostly, we just care that this assertion finishes in finite time
    assert wfj.get_ancestor_workflows() == []


@pytest.mark.django_db
class TestCombinedArtifacts:
    @pytest.fixture
    def wfj_artifacts(self, job_template, organization):
        wfjt = WorkflowJobTemplate.objects.create(organization=organization, name='has_artifacts')
        wfj = WorkflowJob.objects.create(workflow_job_template=wfjt, launch_type='workflow')
        job = job_template.create_unified_job(_eager_fields=dict(artifacts={'foooo': 'bar'}, status='successful', finished=now()))
        WorkflowJobNode.objects.create(workflow_job=wfj, unified_job_template=job_template, job=job)
        return wfj

    def test_multiple_types(self, project, wfj_artifacts):
        project_update = project.create_unified_job()
        WorkflowJobNode.objects.create(workflow_job=wfj_artifacts, unified_job_template=project, job=project_update)

        assert wfj_artifacts.get_effective_artifacts() == {'foooo': 'bar'}

    def test_precedence_based_on_time(self, wfj_artifacts, job_template):
        later_job = job_template.create_unified_job(
            _eager_fields=dict(artifacts={'foooo': 'zoo'}, status='successful', finished=now())  # finished later, should win
        )
        WorkflowJobNode.objects.create(workflow_job=wfj_artifacts, unified_job_template=job_template, job=later_job)

        assert wfj_artifacts.get_effective_artifacts() == {'foooo': 'zoo'}

    def test_bad_data_with_artifacts(self, organization):
        # This is toxic database data, this tests that it doesn't create an infinite loop
        wfjt = WorkflowJobTemplate.objects.create(organization=organization, name='child')
        wfj = WorkflowJob.objects.create(workflow_job_template=wfjt, launch_type='workflow')
        WorkflowJobNode.objects.create(workflow_job=wfj, unified_job_template=wfjt, job=wfj)
        job = Job.objects.create(artifacts={'foo': 'bar'}, status='successful')
        WorkflowJobNode.objects.create(workflow_job=wfj, job=job)
        # mostly, we just care that this assertion finishes in finite time
        assert wfj.get_effective_artifacts() == {'foo': 'bar'}
@@ -4,9 +4,8 @@ import os
import tempfile
import shutil

from awx.main.tasks.jobs import RunProjectUpdate, RunInventoryUpdate
from awx.main.tasks.system import execution_node_health_check, _cleanup_images_and_files
from awx.main.models import ProjectUpdate, InventoryUpdate, InventorySource, Instance, Job
from awx.main.models import Instance, Job


@pytest.fixture
@@ -27,63 +26,6 @@ def test_no_worker_info_on_AWX_nodes(node_type):
    execution_node_health_check(hostname)


@pytest.mark.django_db
class TestDependentInventoryUpdate:
    def test_dependent_inventory_updates_is_called(self, scm_inventory_source, scm_revision_file, mock_me):
        task = RunProjectUpdate()
        task.revision_path = scm_revision_file
        proj_update = scm_inventory_source.source_project.create_project_update()
        with mock.patch.object(RunProjectUpdate, '_update_dependent_inventories') as inv_update_mck:
            with mock.patch.object(RunProjectUpdate, 'release_lock'):
                task.post_run_hook(proj_update, 'successful')
        inv_update_mck.assert_called_once_with(proj_update, mock.ANY)

    def test_no_unwanted_dependent_inventory_updates(self, project, scm_revision_file, mock_me):
        task = RunProjectUpdate()
        task.revision_path = scm_revision_file
        proj_update = project.create_project_update()
        with mock.patch.object(RunProjectUpdate, '_update_dependent_inventories') as inv_update_mck:
            with mock.patch.object(RunProjectUpdate, 'release_lock'):
                task.post_run_hook(proj_update, 'successful')
        assert not inv_update_mck.called

    def test_dependent_inventory_updates(self, scm_inventory_source, default_instance_group, mock_me):
        task = RunProjectUpdate()
        scm_inventory_source.scm_last_revision = ''
        proj_update = ProjectUpdate.objects.create(project=scm_inventory_source.source_project)
        with mock.patch.object(RunInventoryUpdate, 'run') as iu_run_mock:
            with mock.patch('awx.main.tasks.jobs.create_partition'):
                task._update_dependent_inventories(proj_update, [scm_inventory_source])
        assert InventoryUpdate.objects.count() == 1
        inv_update = InventoryUpdate.objects.first()
        iu_run_mock.assert_called_once_with(inv_update.id)
        assert inv_update.source_project_update_id == proj_update.pk

    def test_dependent_inventory_project_cancel(self, project, inventory, default_instance_group, mock_me):
        """
        Test that dependent inventory updates exhibit good behavior on cancel
        of the source project update
        """
        task = RunProjectUpdate()
        proj_update = ProjectUpdate.objects.create(project=project)

        kwargs = dict(source_project=project, source='scm', source_path='inventory_file', update_on_project_update=True, inventory=inventory)

        is1 = InventorySource.objects.create(name="test-scm-inv", **kwargs)
        is2 = InventorySource.objects.create(name="test-scm-inv2", **kwargs)

        def user_cancels_project(pk):
            ProjectUpdate.objects.all().update(cancel_flag=True)

        with mock.patch.object(RunInventoryUpdate, 'run') as iu_run_mock:
            with mock.patch('awx.main.tasks.jobs.create_partition'):
                iu_run_mock.side_effect = user_cancels_project
                task._update_dependent_inventories(proj_update, [is1, is2])
        # Verify that it bails after 1st update, detecting a cancel
        assert is2.inventory_updates.count() == 0
        iu_run_mock.assert_called_once()


@pytest.fixture
def mock_job_folder(request):
    pdd_path = tempfile.mkdtemp(prefix='awx_123_')
@@ -69,21 +69,21 @@ class TestJobTemplateLabelList:

class TestInventoryInventorySourcesUpdate:
    @pytest.mark.parametrize(
        "can_update, can_access, is_source, is_up_on_proj, expected",
        "can_update, can_access, is_source, expected",
        [
            (True, True, "ec2", False, [{'status': 'started', 'inventory_update': 1, 'inventory_source': 1}]),
            (False, True, "gce", False, [{'status': 'Could not start because `can_update` returned False', 'inventory_source': 1}]),
            (True, False, "scm", True, [{'status': 'started', 'inventory_update': 1, 'inventory_source': 1}]),
            (True, True, "ec2", [{'status': 'started', 'inventory_update': 1, 'inventory_source': 1}]),
            (False, True, "gce", [{'status': 'Could not start because `can_update` returned False', 'inventory_source': 1}]),
            (True, False, "scm", [{'status': 'started', 'inventory_update': 1, 'inventory_source': 1}]),
        ],
    )
    def test_post(self, mocker, can_update, can_access, is_source, is_up_on_proj, expected):
    def test_post(self, mocker, can_update, can_access, is_source, expected):
        class InventoryUpdate:
            id = 1

        class Project:
            name = 'project'

        InventorySource = namedtuple('InventorySource', ['source', 'update_on_project_update', 'pk', 'can_update', 'update', 'source_project'])
        InventorySource = namedtuple('InventorySource', ['source', 'pk', 'can_update', 'update', 'source_project'])

        class InventorySources(object):
            def all(self):
@@ -92,7 +92,6 @@ class TestInventoryInventorySourcesUpdate:
                    pk=1,
                    source=is_source,
                    source_project=Project,
                    update_on_project_update=is_up_on_proj,
                    can_update=can_update,
                    update=lambda: InventoryUpdate,
                )
@@ -1,28 +1,13 @@
import pytest
from unittest import mock

from django.core.exceptions import ValidationError

from awx.main.models import (
    UnifiedJob,
    InventoryUpdate,
    InventorySource,
)


def test_cancel(mocker):
    with mock.patch.object(UnifiedJob, 'cancel', return_value=True) as parent_cancel:
        iu = InventoryUpdate()

        iu.save = mocker.MagicMock()
        build_job_explanation_mock = mocker.MagicMock()
        iu._build_job_explanation = mocker.MagicMock(return_value=build_job_explanation_mock)

        iu.cancel()

        parent_cancel.assert_called_with(is_chain=False, job_explanation=None)


def test__build_job_explanation():
    iu = InventoryUpdate(id=3, name='I_am_an_Inventory_Update')

@@ -53,9 +38,3 @@ class TestControlledBySCM:

        with pytest.raises(ValidationError):
            inv_src.clean_source_path()

    def test_clean_update_on_launch_update_on_project_update(self):
        inv_src = InventorySource(update_on_project_update=True, update_on_launch=True, source='scm')

        with pytest.raises(ValidationError):
            inv_src.clean_update_on_launch()
@@ -1,7 +1,7 @@
from awx.main.tasks.callback import RunnerCallback
from awx.main.constants import ANSIBLE_RUNNER_NEEDS_UPDATE_MESSAGE

from django.utils.translation import ugettext_lazy as _
from django.utils.translation import gettext_lazy as _


def test_delay_update(mock_me):
50
awx/main/tests/unit/tasks/test_signals.py
Normal file
@@ -0,0 +1,50 @@
import signal

from awx.main.tasks.signals import signal_state, signal_callback, with_signal_handling


def test_outer_inner_signal_handling():
    """
    Even if the flag is set in the outer context, its value should persist in the inner context
    """

    @with_signal_handling
    def f2():
        assert signal_callback()

    @with_signal_handling
    def f1():
        assert signal_callback() is False
        signal_state.set_flag()
        assert signal_callback()
        f2()

    original_sigterm = signal.getsignal(signal.SIGTERM)
    assert signal_callback() is False
    f1()
    assert signal_callback() is False
    assert signal.getsignal(signal.SIGTERM) is original_sigterm


def test_inner_outer_signal_handling():
    """
    Even if the flag is set in the inner context, its value should persist in the outer context
    """

    @with_signal_handling
    def f2():
        assert signal_callback() is False
        signal_state.set_flag()
        assert signal_callback()

    @with_signal_handling
    def f1():
        assert signal_callback() is False
        f2()
        assert signal_callback()

    original_sigterm = signal.getsignal(signal.SIGTERM)
    assert signal_callback() is False
    f1()
    assert signal_callback() is False
    assert signal.getsignal(signal.SIGTERM) is original_sigterm
@@ -922,7 +922,8 @@ class TestJobCredentials(TestJobExecution):
         assert env['AWS_SECURITY_TOKEN'] == 'token'
         assert safe_env['AWS_SECRET_ACCESS_KEY'] == HIDDEN_PASSWORD
 
-    def test_gce_credentials(self, private_data_dir, job, mock_me):
+    @pytest.mark.parametrize("cred_env_var", ['GCE_CREDENTIALS_FILE_PATH', 'GOOGLE_APPLICATION_CREDENTIALS'])
+    def test_gce_credentials(self, cred_env_var, private_data_dir, job, mock_me):
         gce = CredentialType.defaults['gce']()
         credential = Credential(pk=1, credential_type=gce, inputs={'username': 'bob', 'project': 'some-project', 'ssh_key_data': self.EXAMPLE_PRIVATE_KEY})
         credential.inputs['ssh_key_data'] = encrypt_field(credential, 'ssh_key_data')
@@ -931,7 +932,7 @@ class TestJobCredentials(TestJobExecution):
         env = {}
         safe_env = {}
         credential.credential_type.inject_credential(credential, env, safe_env, [], private_data_dir)
-        runner_path = env['GCE_CREDENTIALS_FILE_PATH']
+        runner_path = env[cred_env_var]
         local_path = to_host_path(runner_path, private_data_dir)
         json_data = json.load(open(local_path, 'rb'))
         assert json_data['type'] == 'service_account'
@@ -1316,6 +1317,7 @@ class TestJobCredentials(TestJobExecution):
         assert env['AZURE_AD_USER'] == 'bob'
         assert env['AZURE_PASSWORD'] == 'secret'
 
+        # Because this is testing a mix of multiple cloud creds, we are not going to test the GOOGLE_APPLICATION_CREDENTIALS here
         path = to_host_path(env['GCE_CREDENTIALS_FILE_PATH'], private_data_dir)
         json_data = json.load(open(path, 'rb'))
         assert json_data['type'] == 'service_account'
@@ -1645,7 +1647,8 @@ class TestInventoryUpdateCredentials(TestJobExecution):
 
         assert safe_env['AZURE_PASSWORD'] == HIDDEN_PASSWORD
 
-    def test_gce_source(self, inventory_update, private_data_dir, mocker, mock_me):
+    @pytest.mark.parametrize("cred_env_var", ['GCE_CREDENTIALS_FILE_PATH', 'GOOGLE_APPLICATION_CREDENTIALS'])
+    def test_gce_source(self, cred_env_var, inventory_update, private_data_dir, mocker, mock_me):
         task = jobs.RunInventoryUpdate()
         task.instance = inventory_update
         gce = CredentialType.defaults['gce']()
@@ -1669,7 +1672,7 @@ class TestInventoryUpdateCredentials(TestJobExecution):
         credential.credential_type.inject_credential(credential, env, safe_env, [], private_data_dir)
 
         assert env['GCE_ZONE'] == expected_gce_zone
-        json_data = json.load(open(env['GCE_CREDENTIALS_FILE_PATH'], 'rb'))
+        json_data = json.load(open(env[cred_env_var], 'rb'))
         assert json_data['type'] == 'service_account'
         assert json_data['private_key'] == self.EXAMPLE_PRIVATE_KEY
         assert json_data['client_email'] == 'bob'
@@ -3,6 +3,8 @@ from django.db import transaction, DatabaseError, InterfaceError
 import logging
+import time
 
+from awx.main.tasks.signals import signal_callback
 
 
 logger = logging.getLogger('awx.main.tasks.utils')

@@ -37,7 +39,10 @@ def update_model(model, pk, _attempt=0, _max_attempts=5, select_for_update=False
         # Attempt to retry the update, assuming we haven't already
         # tried too many times.
         if _attempt < _max_attempts:
-            time.sleep(5)
+            for i in range(5):
+                time.sleep(1)
+                if signal_callback():
+                    raise RuntimeError(f'Could not fetch {pk} because of receiving abort signal')
             return update_model(model, pk, _attempt=_attempt + 1, _max_attempts=_max_attempts, **updates)
         else:
            logger.error('Failed to update %s after %d retries.', model._meta.object_name, _attempt)
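The hunk above replaces a single `time.sleep(5)` with five one-second sleeps, checking `signal_callback()` between slices so an abort signal interrupts the retry backoff quickly instead of waiting out the full five seconds. The pattern can be sketched standalone (the helper name here is hypothetical, not part of AWX):

```python
import time


def interruptible_sleep(total_seconds, should_abort, interval=1.0):
    """Sleep in short slices, bailing out early if an abort is requested.

    `should_abort` is any zero-argument callable that returns True once an
    abort signal has been received (analogous to signal_callback above).
    """
    slept = 0.0
    while slept < total_seconds:
        time.sleep(interval)
        slept += interval
        if should_abort():
            # abort detected mid-backoff: stop retrying immediately
            raise RuntimeError('received abort signal during retry backoff')
```

The trade-off is simply latency of abort detection (at most one `interval`) against the overhead of waking up repeatedly.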
391
awx/ui/SEARCH.md
@@ -2,26 +2,27 @@

## UX Considerations

Historically, the code that powers search in the AngularJS version of the AWX UI is very complex and prone to bugs. To reduce that complexity, we've made some UX decisions that help keep the code maintainable.

**ALL query params namespaced and in url bar**

This includes lists that aren't necessarily hyperlinked, like lookup lists. The reasoning is that we can then always treat the url bar as the source of truth for queries. Any params whose key AND value both appear in the defaultParams section of the qs config are stripped out of the search string (see "Encoding for UI vs. API" for more info on this point).

**Django fuzzy search (`?search=`) is not accessible outside of "advanced search"**

In the current smart search, typing a term with no key uses `?search=`, i.e. for a "foo" tag, `?search=foo` is sent. `?search=` looks at a static list of field name "guesses" (such as name, description, etc.), as well as specific fields defined for each endpoint (for example, the events endpoint also looks at a "stdout" field). Because a key will always be present on the left-hand side of simple search, it doesn't make sense to use `?search=` as the default.

We may allow passing `?search=` through our future advanced search interface. Some details gathered during planning about `?search=` that might be helpful in the future:

- `?search=` tags are OR'd together (the union is returned).
- `?search=foo&name=bar` returns items that have a name field of bar (case sensitive) AND some text field containing foo.
- `?search=foo&search=bar&name=baz` returns (foo in name OR foo in description OR ...) AND (bar in name OR bar in description OR ...) AND (baz in name).
- Similarly, `?related__search=` looks at the static list of "guesses" for models related to the endpoint. The specific fields are not "searched" for `?related__search=`.
- `?related__search=` is not currently used in the AWX UI.

**A note on clicking a tag to put it back into the search bar**

This was brought up as a nice-to-have when we discussed our initial implementation of search in the new application. Since there is no way to know whether the user created the tag from the simple or the advanced search interface, we wouldn't know where to put it back. This breaks our idea of using the query params as the exclusive source of truth, so we've decided against implementing it for now.

## Tasklist
@@ -50,171 +51,197 @@ This was brought up as a nice to have when we were discussing our initial implem
- DONE remove button for search tags with duplicate keys is broken, fix that

### TODO pre-holiday break

- Update COLUMNS to SORT_COLUMNS and SEARCH_COLUMNS
- Update to the new PF Toolbar component (currently an experimental component)
- Change the right-hand input based on the type of key selected on the left-hand side. In addition to text input, for our MVP we will support:
  - number input
  - select input (multiple-choice configured from UI or Options)
- Update the following lists to have the following keys:
**Jobs list** (signed off earlier in chat)

- Name (which is also the name of the job template) - search is ?name=jt
- Job ID - search is ?id=13
- Label name - search is ?labels__name=foo
- Job type (dropdown on right with the different types) - search is ?type=job
- Created by (username) - search is ?created_by__username=admin
- Status (dropdown on right with different statuses) - search is ?status=successful

Instances of jobs list include:

- Jobs list
- Host completed jobs list
- JT completed jobs list
**Organization list**

- Name - search is ?name=org
- ? Team name (of a team in the org) - search is ?teams__name=ansible
- ? Username (of a user in the org) - search is ?users__username=johndoe

Instances of orgs list include:

- Orgs list
- User orgs list
- Lookup on Project
- Lookup on Credential
- Lookup on Inventory
- User access add wizard list
- Team access add wizard list
**Instance Groups list**

- Name - search is ?name=ig
- ? is_container_group boolean choice (doesn't work right now in API but will soon) - search is ?is_container_group=true
- ? Credential name - search is ?credentials__name=kubey

Instances of instance groups lists include:

- Lookup on Org
- Lookup on JT
- Lookup on Inventory
**Users list**

- Username - search is ?username=johndoe
- First Name - search is ?first_name=John
- Last Name - search is ?last_name=Doe
- ? (if not superfluous, would not include on Team users list) Team Name - search is ?teams__name=team_of_john_does (note API issue: User has no field named "teams")
- ? (only for access or permissions list) Role Name - search is ?roles__name=Admin (note API issue: Role has no field "name")
- ? (if not superfluous, would not include on Organization users list) Org Name - search is ?organizations__name=org_of_john_does

Instances of user lists include:

- User list
- Org user list
- Access list for Org, JT, Project, Credential, Inventory, User and Team
- Access list for JT
- Access list for Project
- Access list for Credential
- Access list for Inventory
- Access list for User
- Access list for Team
- Team add users list
- Users list in access wizard (to add new roles for a particular list) for Org
- Users list in access wizard (to add new roles for a particular list) for JT
- Users list in access wizard (to add new roles for a particular list) for Project
- Users list in access wizard (to add new roles for a particular list) for Credential
- Users list in access wizard (to add new roles for a particular list) for Inventory
**Teams list**

- Name - search is ?name=teamname
- ? Username (of a user in the team) - search is ?users__username=johndoe
- ? (if not superfluous, would not include on Organization teams list) Org Name - search is ?organizations__name=org_of_john_does

Instances of team lists include:

- Team list
- Org team list
- User team list
- Team list in access wizard (to add new roles for a particular list) for Org
- Team list in access wizard (to add new roles for a particular list) for JT
- Team list in access wizard (to add new roles for a particular list) for Project
- Team list in access wizard (to add new roles for a particular list) for Credential
- Team list in access wizard (to add new roles for a particular list) for Inventory
**Credentials list**

- Name
- ? Type (dropdown on right with different types)
- ? Created by (username)
- ? Modified by (username)

Instances of credential lists include:

- Credential list
- Lookup for JT
- Lookup for Project
- User access add wizard list
- Team access add wizard list
**Projects list**

- Name - search is ?name=proj
- ? Type (dropdown on right with different types) - search is ?scm_type=git
- ? SCM URL - search is ?scm_url=github.com/ansible/test-playbooks
- ? Created by (username) - search is ?created_by__username=admin
- ? Modified by (username) - search is ?modified_by__username=admin

Instances of project lists include:

- Project list
- Lookup for JT
- User access add wizard list
- Team access add wizard list
**Templates list**

- Name - search is ?name=cleanup
- ? Type (dropdown on right with different types) - search is ?type=playbook_run
- ? Playbook name - search is ?job_template__playbook=debug.yml
- ? Created by (username) - search is ?created_by__username=admin
- ? Modified by (username) - search is ?modified_by__username=admin

Instances of template lists include:

- Template list
- Project Templates list
**Inventories list**

- Name - search is ?name=inv
- ? Created by (username) - search is ?created_by__username=admin
- ? Modified by (username) - search is ?modified_by__username=admin

Instances of inventory lists include:

- Inventory list
- Lookup for JT
- User access add wizard list
- Team access add wizard list
**Groups list**

- Name - search is ?name=group_name
- ? Created by (username) - search is ?created_by__username=admin
- ? Modified by (username) - search is ?modified_by__username=admin

Instances of group lists include:

- Group list
**Hosts list**

- Name - search is ?name=hostname
- ? Created by (username) - search is ?created_by__username=admin
- ? Modified by (username) - search is ?modified_by__username=admin

Instances of host lists include:

- Host list
**Notifications list**

- Name - search is ?name=notification_template_name
- ? Type (dropdown on right with different types) - search is ?type=slack
- ? Created by (username) - search is ?created_by__username=admin
- ? Modified by (username) - search is ?modified_by__username=admin

Instances of notification lists include:

- Org notification list
- JT notification list
- Project notification list
### TODO backlog

- Change the right-hand input based on the type of key selected on the left-hand side. We will eventually want to support:
  - lookup input (selection of particular resources, based on API list endpoints)
  - date picker input
- Update the following lists to have the following keys:
- Update all __name and __username related field search-based keys to be type-ahead lookup based searches
## Code Details

@@ -230,13 +257,13 @@ The component looks like this:
/>
```

**qsConfig** is used to get the namespace so that multiple lists can be on the page. When tags are modified they append the namespace to the query params. The qsConfig is also used to get the "type" of fields in order to correctly parse values as int or date while translating.

**columns** are passed as an array, as defined in the screen where the list is located. You pass a bool `isDefault` to indicate which key shows up in the left-hand dropdown by default in the UI. If you don't pass any columns, a default of `isDefault=true` will be added to a name column, which is nearly universally shared throughout the models of awx.

There is a type attribute that can be `'string'`, `'number'` or `'choice'` (and in the future, `'date'` and `'lookup'`), which will change the type of input on the right-hand side of the search bar. For a key that has a set number of choices, you pass a choices attribute, which is an array in the format `choices: [{label: 'Foo', value: 'foo'}]`.

**onSearch** calls the `mergeParams` qs util in order to add new tags to the queryset. mergeParams is used so that we can support duplicate keys (see mergeParams vs. replaceParams for more info).

### ListHeader component
@@ -253,15 +280,16 @@ All of these functions act on the react-router history using the `pushHistorySta

**a note on sort_columns and search_columns**

We have split out column configuration into separate search and sort column array props--these are passed to the search and sort components. Both accept an isDefault prop for one of the items in the array to be the default option selected when going to the page. Sort column items can pass an isNumeric boolean in order to change the iconography of the sort UI element. Search column items can pass type and, if applicable, choices, in order to configure the right-hand side of the search bar.

### FilterTags component

Similar to the way the list grabs data based on changes to the react-router params, the `FilterTags` component updates when new params are added. This component is a fairly straightforward map (only slightly complex, because it needs to do a nested map over any values with duplicate keys that are represented by an inner array). Both key and value are displayed for the tag.
### qs utility

The qs (queryset) utility is used to make the search speak the language of the REST API. The main functions of the utility are to:

- add, replace and remove filters
- translate filters between url params (for linking and maintaining state), an in-memory representation (as JS objects), and params that Django REST Framework understands
@@ -269,7 +297,7 @@ More info in the below sections:

#### Encoding for UI vs. API

For the UI url params, we only encode params that aren't defaults, since the default behavior was defined through configuration and we don't need these in the url as a source of truth. For the API, we need to pass these params so that they are taken into account when the response is built.
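The stripping rule can be sketched in a few lines. Python is used here purely for illustration (the real qs util is JavaScript in the AWX UI), and the helper and parameter names are hypothetical:

```python
def encode_ui_params(params, default_params):
    """Drop params whose key AND value match the configured defaults.

    The result is what goes in the url bar; the full `params` dict is still
    sent to the API so defaults are applied when the response is built.
    """
    return {k: v for k, v in params.items() if default_params.get(k) != v}
```

For example, with defaults `{'page': 1, 'order_by': 'name'}`, a current query of `{'page': 1, 'order_by': '-name', 'name__icontains': 'foo'}` would serialize only `order_by` and `name__icontains` into the url, since `page=1` matches its default in both key and value.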
#### mergeParams vs. replaceParams

@@ -283,13 +311,13 @@ From a UX perspective, we wanted to be able to support searching on the same key
}
```

Concatenating terms in this way gives you the intersection of both terms (i.e. foo must be "bar" and "baz"). This is helpful for the most common type of search, the substring (`__icontains`) search. This tightens filtering, allowing the user to drill down into the list as terms are added.

**replaceParams** is used to support sorting, setting page_size, etc. These params only allow one choice, and we need to replace a particular key's value if one is passed.
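The two merge strategies described above can be sketched side by side. Again, Python stands in for the UI's JavaScript qs util, and the function names are hypothetical:

```python
def merge_params(old, new):
    """mergeParams semantics: duplicate keys accumulate into a list,
    so repeated filters on the same key are AND'ed together."""
    merged = dict(old)
    for key, value in new.items():
        if key in merged:
            existing = merged[key] if isinstance(merged[key], list) else [merged[key]]
            merged[key] = existing + [value]
        else:
            merged[key] = value
    return merged


def replace_params(old, new):
    """replaceParams semantics: a new value for a key simply overwrites
    the old one (used for sorting, page_size, and other single-choice params)."""
    return {**old, **new}
```

So merging `{'foo__icontains': 'bar'}` with `{'foo__icontains': 'baz'}` yields `{'foo__icontains': ['bar', 'baz']}` (the intersection described above), while replacing `{'order_by': 'name'}` with `{'order_by': '-name'}` yields just `{'order_by': '-name'}`.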
#### Working with REST API

The REST API is coupled with the qs util through the `paramsSerializer`, because we need axios to support an array for duplicate key values in the object representation of the params passed to the get request. This is done where axios is configured in the Base.js file, so all requests and request types support our array syntax for duplicate keys automatically.
# Advanced Search - this section is a mess, update eventually

@@ -305,85 +333,84 @@ Current thinking is Advanced Search will be post-3.6, or at least late 3.6 after

That being said, we want to plan it out so we make sure the infrastructure of how we set up adding/removing tags, what shows up in the url bar, etc. doesn't have to be redone.

Users will get to advanced search with a button to the right of the search bar. When selected, a type-ahead key widget opens, the left dropdown of the search bar goes away, and an x is given to get back to regular search (this is in the mockups).

It is okay to only make this typing representation available initially (i.e. they start doing stuff with the type-ahead and the phases, no more typing in to make a query that way).

When you click through or type in the search bar, the various phases of crafting the query ("not", "related resource project", "related resource key name", "value foo") might be represented in the top bar as a series of tags that can be added and removed before submitting the tag.

We will try to form options data from a static file. Because options data is static, we may be able to generate and store it as a static file of some sort (that we can use for managing smart search). Alan had ideas around this. If we do this it will mean we don't have to make a ton of requests as we craft smart search filters. It sounds like the cli may start using something similar.
## Smart search flow
|
||||
|
||||
Smart search will be able to craft the tag through various states. Note that the phases don't necessarily need to be completed in sequential order.
|
||||
Smart search will be able to craft the tag through various states. Note that the phases don't necessarily need to be completed in sequential order.
|
||||
|
||||
PHASE 1: prefix operators

**TODO: Double check there's no reason we need to include `or__` and `chain__` and that we can just do `not__`**

- `not__`
- `or__`
- `chain__`

How these work:

To exclude results matching certain criteria, prefix the field parameter with `not__`:

```
?not__field=value
```

By default, all query string filters are AND'ed together, so only the results matching all filters will be returned. To combine results matching any one of multiple criteria, prefix each query string parameter with `or__`:

```
?or__field=value&or__field=othervalue
?or__not__field=value&or__field=othervalue
```

(Added in Ansible Tower 1.4.5) The default AND filtering applies all filters simultaneously to each related object being filtered across database relationships. The chain filter instead applies filters separately for each related object. To use, prefix the query string parameter with `chain__`:

```
?chain__related__field=value&chain__related__field2=othervalue
?chain__not__related__field=value&chain__related__field2=othervalue
```

If the first query above were written as `?related__field=value&related__field2=othervalue`, it would return only the primary objects where the same related object satisfied both conditions. As written using the chain filter, it would return the intersection of primary objects matching each condition.

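As a sketch of how these prefixes compose into a query string (a hypothetical helper, not AWX code; the standard library does the encoding):

```python
from urllib.parse import urlencode

def build_filter_query(filters):
    """Build an AWX-style filter query string from (key, value) pairs.

    Each key may carry prefix operators such as not__, or__, or chain__,
    exactly as in the query strings shown above. Passing a list of tuples
    to urlencode preserves repeated keys like or__field.
    """
    return "?" + urlencode(filters)

# AND of two filters (the default behavior):
q_and = build_filter_query([("name", "demo"), ("status", "successful")])

# OR of two filters on the same field:
q_or = build_filter_query([("or__field", "value"), ("or__field", "othervalue")])
```

Because the smart search UI builds the tag piece by piece, a helper like this would only run once all phases of a tag are complete.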
PHASE 2: related fields, given by array, where `__search` is appended to them, i.e.

```
"related_search_fields": [
    "credentials__search",
    "labels__search",
    "created_by__search",
    "modified_by__search",
    "notification_templates__search",
    "custom_inventory_scripts__search",
    "notification_templates_error__search",
    "notification_templates_success__search",
    "notification_templates_any__search",
    "teams__search",
    "projects__search",
    "inventories__search",
    "applications__search",
    "workflows__search",
    "instance_groups__search"
],
```

PHASE 3: keys, given by object key names for `data.actions.GET`

- type is given for each key, which we could use to help craft the value


PHASE 4: after the key, postfix operators can be:

**TODO: will need to figure out which ones we support**

- exact: Exact match (default lookup if not specified).
- iexact: Case-insensitive version of exact.
- contains: Field contains value.
- icontains: Case-insensitive version of contains.
- startswith: Field starts with value.
- istartswith: Case-insensitive version of startswith.
- endswith: Field ends with value.
- iendswith: Case-insensitive version of endswith.
- regex: Field matches the given regular expression.
- iregex: Case-insensitive version of regex.
- gt: Greater than comparison.
- gte: Greater than or equal to comparison.
- lt: Less than comparison.
- lte: Less than or equal to comparison.
- isnull: Check whether the given field or related object is null; expects a boolean value.
- in: Check whether the given field's value is present in the list provided; expects a list of items.

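A key plus a postfix lookup joins with double underscores and serializes as one query parameter; a quick illustration with the standard library (the field names here are assumptions):

```python
from urllib.parse import urlencode

# name + icontains lookup, and id + in lookup with a comma-separated list;
# urlencode percent-encodes the commas.
param = urlencode([("name__icontains", "demo"), ("id__in", "1,2,3")])
```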
PHASE 5: the value. Based on the options data, we can give hints or validation based on the type of the value (e.g. number fields don't accept "foo").

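A minimal sketch of such type-based validation (the type names are assumptions matching the OPTIONS types mentioned earlier; real parsing stays on the API side):

```python
def is_valid_value(field_type, raw):
    """Hedged sketch: check a candidate value against a field's declared
    type before it becomes part of a smart search tag."""
    if field_type == "integer":
        try:
            int(raw)
            return True
        except ValueError:
            return False
    if field_type == "boolean":
        return raw.lower() in ("true", "false")
    # strings, datetimes, etc. accept any text here; the API does the
    # authoritative parsing and will reject anything malformed.
    return True

# number fields don't accept "foo": is_valid_value("integer", "foo") is False
```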
awx/ui/package-lock.json (generated, 589 lines changed): lockfile regenerated for dependency bumps, including @lingui/react 3.13.3 → 3.14.0, @patternfly/patternfly 4.196.7 → 4.202.1, @patternfly/react-core ^4.201.0 → ^4.221.3, @patternfly/react-icons 4.49.19 → 4.75.1, @patternfly/react-table 4.83.1 → 4.93.1, axios 0.22.0 → 0.27.2, codemirror ^5.65.4 → ^6.0.1 (pulling in the new @codemirror/* and @lezer/* packages), and react-router-dom ^5.1.2 → ^5.3.3, plus transitive updates (focus-trap 6.2.2 → 6.9.2, follow-redirects, tabbable, tiny-invariant, and others).
|
||||
@ -24592,6 +24784,27 @@
|
||||
}
|
||||
}
|
||||
},
|
||||
"@lezer/common": {
|
||||
"version": "1.0.0",
|
||||
"resolved": "https://registry.npmjs.org/@lezer/common/-/common-1.0.0.tgz",
|
||||
"integrity": "sha512-ohydQe+Hb+w4oMDvXzs8uuJd2NoA3D8YDcLiuDsLqH+yflDTPEpgCsWI3/6rH5C3BAedtH1/R51dxENldQceEA=="
|
||||
},
|
||||
"@lezer/highlight": {
|
||||
"version": "1.0.0",
|
||||
"resolved": "https://registry.npmjs.org/@lezer/highlight/-/highlight-1.0.0.tgz",
|
||||
"integrity": "sha512-nsCnNtim90UKsB5YxoX65v3GEIw3iCHw9RM2DtdgkiqAbKh9pCdvi8AWNwkYf10Lu6fxNhXPpkpHbW6mihhvJA==",
|
||||
"requires": {
|
||||
"@lezer/common": "^1.0.0"
|
||||
}
|
||||
},
|
||||
"@lezer/lr": {
|
||||
"version": "1.1.0",
|
||||
"resolved": "https://registry.npmjs.org/@lezer/lr/-/lr-1.1.0.tgz",
|
||||
"integrity": "sha512-Iad04uVwk1PvSnj25mqj7zEEIRAsasbsTRmVzI0AUTs/+1Dz1//iYAaoLr7A+Xa7bZDfql5MKTxZmSlkYZD3Dg==",
|
||||
"requires": {
|
||||
"@lezer/common": "^1.0.0"
|
||||
}
|
||||
},
|
||||
"@lingui/babel-plugin-extract-messages": {
|
||||
"version": "3.8.10",
|
||||
"resolved": "https://registry.npmjs.org/@lingui/babel-plugin-extract-messages/-/babel-plugin-extract-messages-3.8.10.tgz",
|
||||
@ -24776,9 +24989,9 @@
|
||||
}
|
||||
},
|
||||
"@lingui/core": {
|
||||
"version": "3.13.3",
|
||||
"resolved": "https://registry.npmjs.org/@lingui/core/-/core-3.13.3.tgz",
|
||||
"integrity": "sha512-3rQDIC7PtPfUuZCSNfU0nziWNMlGk3JhpxENzGrlt1M8w5RHson89Mk1Ce/how+hWzFpumCQDWLDDhyRPpydbg==",
|
||||
"version": "3.14.0",
|
||||
"resolved": "https://registry.npmjs.org/@lingui/core/-/core-3.14.0.tgz",
|
||||
"integrity": "sha512-ertREq9oi9B/umxpd/pInm9uFO8FLK2/0FXfDmMqvH5ydswWn/c9nY5YO4W1h4/8LWO45mewypOIyjoue4De1w==",
|
||||
"requires": {
|
||||
"@babel/runtime": "^7.11.2",
|
||||
"make-plural": "^6.2.2",
|
||||
@ -24810,12 +25023,12 @@
|
||||
}
|
||||
},
|
||||
"@lingui/react": {
|
||||
"version": "3.13.3",
|
||||
"resolved": "https://registry.npmjs.org/@lingui/react/-/react-3.13.3.tgz",
|
||||
"integrity": "sha512-sCCI5xMcUY9b6w2lwbwy6iHpo1Fb9TDcjcHAD2KI5JueLH+WWQG66tIHiVAlSsQ+hmQ9Tt+f86H05JQEiDdIvg==",
|
||||
"version": "3.14.0",
|
||||
"resolved": "https://registry.npmjs.org/@lingui/react/-/react-3.14.0.tgz",
|
||||
"integrity": "sha512-ow9Mtru7f0T2S9AwnPWRejppcucCW0LmoDR3P4wqHjL+eH5f8a6nxd2doxGieC91/2i4qqW88y4K/zXJxwRSQw==",
|
||||
"requires": {
|
||||
"@babel/runtime": "^7.11.2",
|
||||
"@lingui/core": "^3.13.3"
|
||||
"@lingui/core": "^3.14.0"
|
||||
}
|
||||
},
|
||||
"@nodelib/fs.scandir": {
|
||||
@ -24851,30 +25064,24 @@
|
||||
"dev": true
|
||||
},
|
||||
"@patternfly/patternfly": {
|
||||
"version": "4.196.7",
|
||||
"resolved": "https://registry.npmjs.org/@patternfly/patternfly/-/patternfly-4.196.7.tgz",
|
||||
"integrity": "sha512-hA7Oww411e1p0/IXjC1I+4/1NNis9V+NVBxfUIpRwyuLbCIDHBdtMu2qAPLdKxXjuibV9EE6ZdlT7ra/kcFuJQ=="
|
||||
"version": "4.202.1",
|
||||
"resolved": "https://registry.npmjs.org/@patternfly/patternfly/-/patternfly-4.202.1.tgz",
|
||||
"integrity": "sha512-cQiiPqmwJOm9onuTfLPQNRlpAZwDIJ/zVfDQeaFqMQyPJtxtKn3lkphz5xErY5dPs9rR4X94ytQ1I9pkVzaPJQ=="
|
||||
},
|
||||
"@patternfly/react-core": {
|
||||
"version": "4.214.1",
|
||||
"resolved": "https://registry.npmjs.org/@patternfly/react-core/-/react-core-4.214.1.tgz",
|
||||
"integrity": "sha512-XHEqXpnBEDyLVdAEDOYlGqFHnN43eNLSD5HABB99xO6541JV9MRnbxs0+v9iYnfhcKh/8bhA9ITXnUi3f2PEvg==",
|
||||
"version": "4.224.1",
|
||||
"resolved": "https://registry.npmjs.org/@patternfly/react-core/-/react-core-4.224.1.tgz",
|
||||
"integrity": "sha512-v8wGGNoMGndAScAoE5jeOA5jVgymlLSwttPjQk/Idr0k7roSpOrsM39oXUR5DEgkZee45DW00WKTgmg50PP3FQ==",
|
||||
"requires": {
|
||||
"@patternfly/react-icons": "^4.65.1",
|
||||
"@patternfly/react-styles": "^4.64.1",
|
||||
"@patternfly/react-tokens": "^4.66.1",
|
||||
"focus-trap": "6.2.2",
|
||||
"@patternfly/react-icons": "4.75.1",
|
||||
"@patternfly/react-styles": "^4.74.1",
|
||||
"@patternfly/react-tokens": "^4.76.1",
|
||||
"focus-trap": "6.9.2",
|
||||
"react-dropzone": "9.0.0",
|
||||
"tippy.js": "5.1.2",
|
||||
"tslib": "^2.0.0"
|
||||
},
|
||||
"dependencies": {
|
||||
"@patternfly/react-icons": {
|
||||
"version": "4.65.1",
|
||||
"resolved": "https://registry.npmjs.org/@patternfly/react-icons/-/react-icons-4.65.1.tgz",
|
||||
"integrity": "sha512-CUYFRPztFkR7qrXq/0UAhLjeHd8FdjLe4jBjj8tfKc7OXwxDeZczqNFyRMATZpPaduTH7BU2r3OUjQrgAbquWg==",
|
||||
"requires": {}
|
||||
},
|
||||
"tslib": {
|
||||
"version": "2.3.1",
|
||||
"resolved": "https://registry.npmjs.org/tslib/-/tslib-2.3.1.tgz",
|
||||
@ -24883,35 +25090,29 @@
|
||||
}
|
||||
},
|
||||
"@patternfly/react-icons": {
|
||||
"version": "4.49.19",
|
||||
"resolved": "https://registry.npmjs.org/@patternfly/react-icons/-/react-icons-4.49.19.tgz",
|
||||
"integrity": "sha512-Pr6JDDKWOnWChkifXKWglKEPo3Q+1CgiUTUrvk4ZbnD7mhq5e/TFxxInB9CPzi278bvnc2YlPyTjpaAcCN0yGw==",
|
||||
"version": "4.75.1",
|
||||
"resolved": "https://registry.npmjs.org/@patternfly/react-icons/-/react-icons-4.75.1.tgz",
|
||||
"integrity": "sha512-1ly8SVi/kcc0zkiViOjUd8D5BEr7GeqWGmDPuDSBtD60l1dYf3hZc44IWFVkRM/oHZML/musdrJkLfh4MDqX9w==",
|
||||
"requires": {}
|
||||
},
|
||||
"@patternfly/react-styles": {
|
||||
"version": "4.64.1",
|
||||
"resolved": "https://registry.npmjs.org/@patternfly/react-styles/-/react-styles-4.64.1.tgz",
|
||||
"integrity": "sha512-+GxULkP2o5Vpr9w+J4NiGOGzhTfNniYzdPGEF/yC+oDoAXB6Q1HJyQnEj+kJH31xNvwmw3G3VFtwRLX4ZWr0oA=="
|
||||
"version": "4.74.1",
|
||||
"resolved": "https://registry.npmjs.org/@patternfly/react-styles/-/react-styles-4.74.1.tgz",
|
||||
"integrity": "sha512-9eWvKrjtrJ3qhJkhY2GQKyYA13u/J0mU1befH49SYbvxZtkbuHdpKmXBAeQoHmcx1hcOKtiYXeKb+dVoRRNx0A=="
|
||||
},
|
||||
"@patternfly/react-table": {
|
||||
"version": "4.83.1",
|
||||
"resolved": "https://registry.npmjs.org/@patternfly/react-table/-/react-table-4.83.1.tgz",
|
||||
"integrity": "sha512-mkq13x9funh+Nh2Uzj2ZQBOacNYc+a60yUAHZMXgNcljCJ3LTQUoYy6EonvYrqwSrpC7vj8nLt8+/XbDNc0Aig==",
|
||||
"version": "4.93.1",
|
||||
"resolved": "https://registry.npmjs.org/@patternfly/react-table/-/react-table-4.93.1.tgz",
|
||||
"integrity": "sha512-N/zHkNsY3X3yUXPg6COwdZKAFmTCbWm25qCY2aHjrXlIlE2OKWaYvVag0CcTwPiQhIuCumztr9Y2Uw9uvv0Fsw==",
|
||||
"requires": {
|
||||
"@patternfly/react-core": "^4.214.1",
|
||||
"@patternfly/react-icons": "^4.65.1",
|
||||
"@patternfly/react-styles": "^4.64.1",
|
||||
"@patternfly/react-tokens": "^4.66.1",
|
||||
"@patternfly/react-core": "^4.224.1",
|
||||
"@patternfly/react-icons": "4.75.1",
|
||||
"@patternfly/react-styles": "^4.74.1",
|
||||
"@patternfly/react-tokens": "^4.76.1",
|
||||
"lodash": "^4.17.19",
|
||||
"tslib": "^2.0.0"
|
||||
},
|
||||
"dependencies": {
|
||||
"@patternfly/react-icons": {
|
||||
"version": "4.65.1",
|
||||
"resolved": "https://registry.npmjs.org/@patternfly/react-icons/-/react-icons-4.65.1.tgz",
|
||||
"integrity": "sha512-CUYFRPztFkR7qrXq/0UAhLjeHd8FdjLe4jBjj8tfKc7OXwxDeZczqNFyRMATZpPaduTH7BU2r3OUjQrgAbquWg==",
|
||||
"requires": {}
|
||||
},
|
||||
"tslib": {
|
||||
"version": "2.4.0",
|
||||
"resolved": "https://registry.npmjs.org/tslib/-/tslib-2.4.0.tgz",
|
||||
@ -24920,9 +25121,9 @@
|
||||
}
|
||||
},
|
||||
"@patternfly/react-tokens": {
|
||||
"version": "4.66.1",
|
||||
"resolved": "https://registry.npmjs.org/@patternfly/react-tokens/-/react-tokens-4.66.1.tgz",
|
||||
"integrity": "sha512-k0IWqpufM6ezT+3gWlEamqQ7LW9yi8e8cBBlude5IU8eIEqIG6AccwR1WNBEK1wCVWGwVxakLMdf0XBLl4k52Q=="
|
||||
"version": "4.76.1",
|
||||
"resolved": "https://registry.npmjs.org/@patternfly/react-tokens/-/react-tokens-4.76.1.tgz",
|
||||
"integrity": "sha512-gLEezRSzQeflaPu3SCgYmWtuiqDIRtxNNFP1+ES7P2o56YHXJ5o1Pki7LpNCPk/VOzHy2+vRFE/7l+hBEweugw=="
|
||||
},
|
||||
"@pmmmwh/react-refresh-webpack-plugin": {
|
||||
"version": "0.5.4",
|
||||
@ -26395,8 +26596,7 @@
|
||||
"asynckit": {
|
||||
"version": "0.4.0",
|
||||
"resolved": "https://registry.npmjs.org/asynckit/-/asynckit-0.4.0.tgz",
|
||||
"integrity": "sha1-x57Zf380y48robyXkLzDZkdLS3k=",
|
||||
"dev": true
|
||||
"integrity": "sha1-x57Zf380y48robyXkLzDZkdLS3k="
|
||||
},
|
||||
"at-least-node": {
|
||||
"version": "1.0.0",
|
||||
@ -26439,11 +26639,24 @@
|
||||
"dev": true
|
||||
},
|
||||
"axios": {
|
||||
"version": "0.22.0",
|
||||
"resolved": "https://registry.npmjs.org/axios/-/axios-0.22.0.tgz",
|
||||
"integrity": "sha512-Z0U3uhqQeg1oNcihswf4ZD57O3NrR1+ZXhxaROaWpDmsDTx7T2HNBV2ulBtie2hwJptu8UvgnJoK+BIqdzh/1w==",
|
||||
"version": "0.27.2",
|
||||
"resolved": "https://registry.npmjs.org/axios/-/axios-0.27.2.tgz",
|
||||
"integrity": "sha512-t+yRIyySRTp/wua5xEr+z1q60QmLq8ABsS5O9Me1AsE5dfKqgnCFzwiCZZ/cGNd1lq4/7akDWMxdhVlucjmnOQ==",
|
||||
"requires": {
|
||||
"follow-redirects": "^1.14.4"
|
||||
"follow-redirects": "^1.14.9",
|
||||
"form-data": "^4.0.0"
|
||||
},
|
||||
"dependencies": {
|
||||
"form-data": {
|
||||
"version": "4.0.0",
|
||||
"resolved": "https://registry.npmjs.org/form-data/-/form-data-4.0.0.tgz",
|
||||
"integrity": "sha512-ETEklSGi5t0QMZuiXoA/Q6vcnxcLQP5vdugSpuAyi6SVGi2clPPp+xgEhuMaHC+zGgn31Kd235W35f7Hykkaww==",
|
||||
"requires": {
|
||||
"asynckit": "^0.4.0",
|
||||
"combined-stream": "^1.0.8",
|
||||
"mime-types": "^2.1.12"
|
||||
}
|
||||
}
|
||||
}
|
||||
},
|
||||
"axobject-query": {
|
||||
@ -27240,9 +27453,18 @@
|
||||
}
|
||||
},
|
||||
"codemirror": {
|
||||
"version": "5.65.4",
|
||||
"resolved": "https://registry.npmjs.org/codemirror/-/codemirror-5.65.4.tgz",
|
||||
"integrity": "sha512-tytrSm5Rh52b6j36cbDXN+FHwHCl9aroY4BrDZB2NFFL3Wjfq9nuYVLFFhaOYOczKAg3JXTr8BuT8LcE5QY4Iw=="
|
||||
"version": "6.0.1",
|
||||
"resolved": "https://registry.npmjs.org/codemirror/-/codemirror-6.0.1.tgz",
|
||||
"integrity": "sha512-J8j+nZ+CdWmIeFIGXEFbFPtpiYacFMDR8GlHK3IyHQJMCaVRfGx9NT+Hxivv1ckLWPvNdZqndbr/7lVhrf/Svg==",
|
||||
"requires": {
|
||||
"@codemirror/autocomplete": "^6.0.0",
|
||||
"@codemirror/commands": "^6.0.0",
|
||||
"@codemirror/language": "^6.0.0",
|
||||
"@codemirror/lint": "^6.0.0",
|
||||
"@codemirror/search": "^6.0.0",
|
||||
"@codemirror/state": "^6.0.0",
|
||||
"@codemirror/view": "^6.0.0"
|
||||
}
|
||||
},
|
||||
"collect-v8-coverage": {
|
||||
"version": "1.0.1",
|
||||
@ -27285,7 +27507,6 @@
|
||||
"version": "1.0.8",
|
||||
"resolved": "https://registry.npmjs.org/combined-stream/-/combined-stream-1.0.8.tgz",
|
||||
"integrity": "sha512-FQN4MRfuJeHf7cBbBMJFXhKSDq+2kAArBlmRBvcvFE5BB1HZKXtSFASDhdlz9zOYwxh8lDdnvmMOe/+5cdoEdg==",
|
||||
"dev": true,
|
||||
"requires": {
|
||||
"delayed-stream": "~1.0.0"
|
||||
}
|
||||
@ -27471,6 +27692,11 @@
|
||||
"integrity": "sha512-dcKFX3jn0MpIaXjisoRvexIJVEKzaq7z2rZKxf+MSr9TkdmHmsU4m2lcLojrj/FHl8mk5VxMmYA+ftRkP/3oKQ==",
|
||||
"dev": true
|
||||
},
|
||||
"crelt": {
|
||||
"version": "1.0.5",
|
||||
"resolved": "https://registry.npmjs.org/crelt/-/crelt-1.0.5.tgz",
|
||||
"integrity": "sha512-+BO9wPPi+DWTDcNYhr/W90myha8ptzftZT+LwcmUbbok0rcP/fequmFYCw8NMoH7pkAZQzU78b3kYrlua5a9eA=="
|
||||
},
|
||||
"cross-spawn": {
|
||||
"version": "7.0.3",
|
||||
"resolved": "https://registry.npmjs.org/cross-spawn/-/cross-spawn-7.0.3.tgz",
|
||||
@ -28220,8 +28446,7 @@
|
||||
"delayed-stream": {
|
||||
"version": "1.0.0",
|
||||
"resolved": "https://registry.npmjs.org/delayed-stream/-/delayed-stream-1.0.0.tgz",
|
||||
"integrity": "sha1-3zrhmayt+31ECqrgsp4icrJOxhk=",
|
||||
"dev": true
|
||||
"integrity": "sha1-3zrhmayt+31ECqrgsp4icrJOxhk="
|
||||
},
|
||||
"depd": {
|
||||
"version": "1.1.2",
|
||||
@ -29827,17 +30052,17 @@
|
||||
"dev": true
|
||||
},
|
||||
"focus-trap": {
|
||||
"version": "6.2.2",
|
||||
"resolved": "https://registry.npmjs.org/focus-trap/-/focus-trap-6.2.2.tgz",
|
||||
"integrity": "sha512-qWovH9+LGoKqREvJaTCzJyO0hphQYGz+ap5Hc4NqXHNhZBdxCi5uBPPcaOUw66fHmzXLVwvETLvFgpwPILqKpg==",
|
||||
"version": "6.9.2",
|
||||
"resolved": "https://registry.npmjs.org/focus-trap/-/focus-trap-6.9.2.tgz",
|
||||
"integrity": "sha512-gBEuXOPNOKPrLdZpMFUSTyIo1eT2NSZRrwZ9r/0Jqw5tmT3Yvxfmu8KBHw8xW2XQkw6E/JoG+OlEq7UDtSUNgw==",
|
||||
"requires": {
|
||||
"tabbable": "^5.1.4"
|
||||
"tabbable": "^5.3.2"
|
||||
}
|
||||
},
|
||||
"follow-redirects": {
|
||||
"version": "1.14.8",
|
||||
"resolved": "https://registry.npmjs.org/follow-redirects/-/follow-redirects-1.14.8.tgz",
|
||||
"integrity": "sha512-1x0S9UVJHsQprFcEC/qnNzBLcIxsjAV905f/UkQxbclCsoTWlacCNOpQa/anodLl2uaEKFhfWOvM2Qg77+15zA=="
|
||||
"version": "1.15.1",
|
||||
"resolved": "https://registry.npmjs.org/follow-redirects/-/follow-redirects-1.15.1.tgz",
|
||||
"integrity": "sha512-yLAMQs+k0b2m7cVxpS1VKJVvoz7SS9Td1zss3XRwXj+ZDH00RJgnuLx7E44wx02kQLrdM3aOOy+FpzS7+8OizA=="
|
||||
},
|
||||
"form-data": {
|
||||
"version": "3.0.1",
|
||||
@ -30916,7 +31141,7 @@
|
||||
"isarray": {
|
||||
"version": "0.0.1",
|
||||
"resolved": "https://registry.npmjs.org/isarray/-/isarray-0.0.1.tgz",
|
||||
"integrity": "sha1-ihis/Kmo9Bd+Cav8YDiTmwXR7t8="
|
||||
"integrity": "sha512-D2S+3GLxWH+uhrNEcoh/fnmYeP8E8/zHl644d/jdA0g2uyXvy3sb0qxotE+ne0LtccHknQzWwZEzhak7oJ0COQ=="
|
||||
},
|
||||
"isexe": {
|
||||
"version": "2.0.0",
|
||||
@ -34076,14 +34301,12 @@
|
||||
"mime-db": {
|
||||
"version": "1.51.0",
|
||||
"resolved": "https://registry.npmjs.org/mime-db/-/mime-db-1.51.0.tgz",
|
||||
"integrity": "sha512-5y8A56jg7XVQx2mbv1lu49NR4dokRnhZYTtL+KGfaa27uq4pSTXkwQkFJl4pkRMyNFz/EtYDSkiiEHx3F7UN6g==",
|
||||
"dev": true
|
||||
"integrity": "sha512-5y8A56jg7XVQx2mbv1lu49NR4dokRnhZYTtL+KGfaa27uq4pSTXkwQkFJl4pkRMyNFz/EtYDSkiiEHx3F7UN6g=="
|
||||
},
|
||||
"mime-types": {
|
||||
"version": "2.1.34",
|
||||
"resolved": "https://registry.npmjs.org/mime-types/-/mime-types-2.1.34.tgz",
|
||||
"integrity": "sha512-6cP692WwGIs9XXdOO4++N+7qjqv0rqxxVvJ3VHPh/Sc9mVZcQP+ZGhkKiTvWMQRr2tbHkJP/Yn7Y0npb3ZBs4A==",
|
||||
"dev": true,
|
||||
"requires": {
|
||||
"mime-db": "1.51.0"
|
||||
}
|
||||
@ -36113,11 +36336,11 @@
|
||||
"dev": true
|
||||
},
|
||||
"react-router": {
|
||||
"version": "5.2.0",
|
||||
"resolved": "https://registry.npmjs.org/react-router/-/react-router-5.2.0.tgz",
|
||||
"integrity": "sha512-smz1DUuFHRKdcJC0jobGo8cVbhO3x50tCL4icacOlcwDOEQPq4TMqwx3sY1TP+DvtTgz4nm3thuo7A+BK2U0Dw==",
|
||||
"version": "5.3.3",
|
||||
"resolved": "https://registry.npmjs.org/react-router/-/react-router-5.3.3.tgz",
|
||||
"integrity": "sha512-mzQGUvS3bM84TnbtMYR8ZjKnuPJ71IjSzR+DE6UkUqvN4czWIqEs17yLL8xkAycv4ev0AiN+IGrWu88vJs/p2w==",
|
||||
"requires": {
|
||||
"@babel/runtime": "^7.1.2",
|
||||
"@babel/runtime": "^7.12.13",
|
||||
"history": "^4.9.0",
|
||||
"hoist-non-react-statics": "^3.1.0",
|
||||
"loose-envify": "^1.3.1",
|
||||
@ -36130,15 +36353,15 @@
|
||||
}
|
||||
},
|
||||
"react-router-dom": {
|
||||
"version": "5.2.0",
|
||||
"resolved": "https://registry.npmjs.org/react-router-dom/-/react-router-dom-5.2.0.tgz",
|
||||
"integrity": "sha512-gxAmfylo2QUjcwxI63RhQ5G85Qqt4voZpUXSEqCwykV0baaOTQDR1f0PmY8AELqIyVc0NEZUj0Gov5lNGcXgsA==",
|
||||
"version": "5.3.3",
|
||||
"resolved": "https://registry.npmjs.org/react-router-dom/-/react-router-dom-5.3.3.tgz",
|
||||
"integrity": "sha512-Ov0tGPMBgqmbu5CDmN++tv2HQ9HlWDuWIIqn4b88gjlAN5IHI+4ZUZRcpz9Hl0azFIwihbLDYw1OiHGRo7ZIng==",
|
||||
"requires": {
|
||||
"@babel/runtime": "^7.1.2",
|
||||
"@babel/runtime": "^7.12.13",
|
||||
"history": "^4.9.0",
|
||||
"loose-envify": "^1.3.1",
|
||||
"prop-types": "^15.6.2",
|
||||
"react-router": "5.2.0",
|
||||
"react-router": "5.3.3",
|
||||
"tiny-invariant": "^1.0.2",
|
||||
"tiny-warning": "^1.0.0"
|
||||
}
|
||||
@ -37407,6 +37630,11 @@
|
||||
"dev": true,
|
||||
"requires": {}
|
||||
},
|
||||
"style-mod": {
|
||||
"version": "4.0.0",
|
||||
"resolved": "https://registry.npmjs.org/style-mod/-/style-mod-4.0.0.tgz",
|
||||
"integrity": "sha512-OPhtyEjyyN9x3nhPsu76f52yUGXiZcgvsrFVtvTkyGRQJ0XK+GPc6ov1z+lRpbeabka+MYEQxOYRnt5nF30aMw=="
|
||||
},
|
||||
"styled-components": {
|
||||
"version": "5.3.5",
|
||||
"resolved": "https://registry.npmjs.org/styled-components/-/styled-components-5.3.5.tgz",
|
||||
@ -37600,9 +37828,9 @@
|
||||
"dev": true
|
||||
},
|
||||
"tabbable": {
|
||||
"version": "5.2.0",
|
||||
"resolved": "https://registry.npmjs.org/tabbable/-/tabbable-5.2.0.tgz",
|
||||
"integrity": "sha512-0uyt8wbP0P3T4rrsfYg/5Rg3cIJ8Shl1RJ54QMqYxm1TLdWqJD1u6+RQjr2Lor3wmfT7JRHkirIwy99ydBsyPg=="
|
||||
"version": "5.3.3",
|
||||
"resolved": "https://registry.npmjs.org/tabbable/-/tabbable-5.3.3.tgz",
|
||||
"integrity": "sha512-QD9qKY3StfbZqWOPLp0++pOrAVb/HbUi5xCc8cUo4XjP19808oaMiDzn0leBY5mCespIBM0CIZePzZjgzR83kA=="
|
||||
},
|
||||
"tailwindcss": {
|
||||
"version": "3.0.15",
|
||||
@ -37873,9 +38101,9 @@
|
||||
"dev": true
|
||||
},
|
||||
"tiny-invariant": {
|
||||
"version": "1.1.0",
|
||||
"resolved": "https://registry.npmjs.org/tiny-invariant/-/tiny-invariant-1.1.0.tgz",
|
||||
"integrity": "sha512-ytxQvrb1cPc9WBEI/HSeYYoGD0kWnGEOR8RY6KomWLBVhqz0RgTwVO9dLrGz7dC+nN9llyI7OKAgRq8Vq4ZBSw=="
|
||||
"version": "1.2.0",
|
||||
"resolved": "https://registry.npmjs.org/tiny-invariant/-/tiny-invariant-1.2.0.tgz",
|
||||
"integrity": "sha512-1Uhn/aqw5C6RI4KejVeTg6mIS7IqxnLJ8Mv2tV5rTc0qWobay7pDUz6Wi392Cnc8ak1H0F2cjoRzb2/AW4+Fvg=="
|
||||
},
|
||||
"tiny-warning": {
|
||||
"version": "1.0.3",
|
||||
@ -38220,6 +38448,11 @@
|
||||
"browser-process-hrtime": "^1.0.0"
|
||||
}
|
||||
},
|
||||
"w3c-keyname": {
|
||||
"version": "2.2.4",
|
||||
"resolved": "https://registry.npmjs.org/w3c-keyname/-/w3c-keyname-2.2.4.tgz",
|
||||
"integrity": "sha512-tOhfEwEzFLJzf6d1ZPkYfGj+FWhIpBux9ppoP3rlclw3Z0BZv3N7b7030Z1kYth+6rDuAsXUFr+d0VE6Ed1ikw=="
|
||||
},
|
||||
"w3c-xmlserializer": {
|
||||
"version": "2.0.0",
|
||||
"resolved": "https://registry.npmjs.org/w3c-xmlserializer/-/w3c-xmlserializer-2.0.0.tgz",
|
||||
|
||||
@@ -6,15 +6,15 @@
"node": ">=16.13.1"
},
"dependencies": {
"@lingui/react": "3.13.3",
"@patternfly/patternfly": "4.196.7",
"@patternfly/react-core": "^4.201.0",
"@patternfly/react-icons": "4.49.19",
"@patternfly/react-table": "4.83.1",
"@lingui/react": "3.14.0",
"@patternfly/patternfly": "4.202.1",
"@patternfly/react-core": "^4.221.3",
"@patternfly/react-icons": "4.75.1",
"@patternfly/react-table": "4.93.1",
"ace-builds": "^1.6.0",
"ansi-to-html": "0.7.2",
"axios": "0.22.0",
"codemirror": "^5.65.4",
"axios": "0.27.2",
"codemirror": "^6.0.1",
"d3": "7.4.4",
"dagre": "^0.8.4",
"dompurify": "2.3.8",
@@ -28,7 +28,7 @@
"react-ace": "^10.1.0",
"react-dom": "17.0.2",
"react-error-boundary": "^3.1.4",
"react-router-dom": "^5.1.2",
"react-router-dom": "^5.3.3",
"react-virtualized": "^9.21.1",
"rrule": "2.7.0",
"styled-components": "5.3.5"

@ -26,13 +26,6 @@ function AdHocCommands({
|
||||
const [isWizardOpen, setIsWizardOpen] = useState(false);
|
||||
const { isKebabified, onKebabModalChange } = useContext(KebabifiedContext);
|
||||
|
||||
const verbosityOptions = [
|
||||
{ value: '0', key: '0', label: t`0 (Normal)` },
|
||||
{ value: '1', key: '1', label: t`1 (Verbose)` },
|
||||
{ value: '2', key: '2', label: t`2 (More Verbose)` },
|
||||
{ value: '3', key: '3', label: t`3 (Debug)` },
|
||||
{ value: '4', key: '4', label: t`4 (Connection Debug)` },
|
||||
];
|
||||
useEffect(() => {
|
||||
if (isKebabified) {
|
||||
onKebabModalChange(isWizardOpen);
|
||||
@ -159,7 +152,6 @@ function AdHocCommands({
|
||||
adHocItems={adHocItems}
|
||||
organizationId={organizationId}
|
||||
moduleOptions={moduleOptions}
|
||||
verbosityOptions={verbosityOptions}
|
||||
credentialTypeId={credentialTypeId}
|
||||
onCloseWizard={() => setIsWizardOpen(false)}
|
||||
onLaunch={handleSubmit}
|
||||
|
||||
@ -3,13 +3,13 @@ import { t } from '@lingui/macro';
|
||||
import { withFormik, useFormikContext } from 'formik';
|
||||
import PropTypes from 'prop-types';
|
||||
|
||||
import { VERBOSITY } from 'components/VerbositySelectField';
|
||||
import Wizard from '../Wizard';
|
||||
import useAdHocLaunchSteps from './useAdHocLaunchSteps';
|
||||
|
||||
function AdHocCommandsWizard({
|
||||
onLaunch,
|
||||
moduleOptions,
|
||||
verbosityOptions,
|
||||
onCloseWizard,
|
||||
credentialTypeId,
|
||||
organizationId,
|
||||
@ -18,7 +18,6 @@ function AdHocCommandsWizard({
|
||||
|
||||
const { steps, validateStep, visitStep, visitAllSteps } = useAdHocLaunchSteps(
|
||||
moduleOptions,
|
||||
verbosityOptions,
|
||||
organizationId,
|
||||
credentialTypeId
|
||||
);
|
||||
@ -57,13 +56,13 @@ function AdHocCommandsWizard({
|
||||
}
|
||||
|
||||
const FormikApp = withFormik({
|
||||
mapPropsToValues({ adHocItems, verbosityOptions }) {
|
||||
mapPropsToValues({ adHocItems }) {
|
||||
const adHocItemStrings = adHocItems.map((item) => item.name).join(', ');
|
||||
return {
|
||||
limit: adHocItemStrings || 'all',
|
||||
credentials: [],
|
||||
module_args: '',
|
||||
verbosity: verbosityOptions[0].value,
|
||||
verbosity: VERBOSITY()[0],
|
||||
forks: 0,
|
||||
diff_mode: false,
|
||||
become_enabled: '',
|
||||
@ -79,7 +78,6 @@ const FormikApp = withFormik({
|
||||
FormikApp.propTypes = {
|
||||
onLaunch: PropTypes.func.isRequired,
|
||||
moduleOptions: PropTypes.arrayOf(PropTypes.array).isRequired,
|
||||
verbosityOptions: PropTypes.arrayOf(PropTypes.object).isRequired,
|
||||
onCloseWizard: PropTypes.func.isRequired,
|
||||
credentialTypeId: PropTypes.number.isRequired,
|
||||
};
|
||||
|
||||
@ -13,13 +13,6 @@ jest.mock('../../api/models/Credentials');
|
||||
jest.mock('../../api/models/ExecutionEnvironments');
|
||||
jest.mock('../../api/models/Root');
|
||||
|
||||
const verbosityOptions = [
|
||||
{ value: '0', key: '0', label: '0 (Normal)' },
|
||||
{ value: '1', key: '1', label: '1 (Verbose)' },
|
||||
{ value: '2', key: '2', label: '2 (More Verbose)' },
|
||||
{ value: '3', key: '3', label: '3 (Debug)' },
|
||||
{ value: '4', key: '4', label: '4 (Connection Debug)' },
|
||||
];
|
||||
const moduleOptions = [
|
||||
['command', 'command'],
|
||||
['shell', 'shell'],
|
||||
@ -44,7 +37,6 @@ describe('<AdHocCommandsWizard/>', () => {
|
||||
adHocItems={adHocItems}
|
||||
onLaunch={onLaunch}
|
||||
moduleOptions={moduleOptions}
|
||||
verbosityOptions={verbosityOptions}
|
||||
onCloseWizard={() => {}}
|
||||
credentialTypeId={1}
|
||||
organizationId={1}
|
||||
|
||||
@ -7,6 +7,7 @@ import { Form, FormGroup, Switch, Checkbox } from '@patternfly/react-core';
|
||||
import styled from 'styled-components';
|
||||
import { required } from 'util/validators';
|
||||
import useBrandName from 'hooks/useBrandName';
|
||||
import { VerbositySelectField } from 'components/VerbositySelectField';
|
||||
import AnsibleSelect from '../AnsibleSelect';
|
||||
import FormField from '../FormField';
|
||||
import { VariablesField } from '../CodeEditor';
|
||||
@ -21,7 +22,7 @@ const TooltipWrapper = styled.div`
|
||||
text-align: left;
|
||||
`;
|
||||
|
||||
function AdHocDetailsStep({ verbosityOptions, moduleOptions }) {
|
||||
function AdHocDetailsStep({ moduleOptions }) {
|
||||
const brandName = useBrandName();
|
||||
const [moduleNameField, moduleNameMeta, moduleNameHelpers] = useField({
|
||||
name: 'module_name',
|
||||
@ -32,7 +33,7 @@ function AdHocDetailsStep({ verbosityOptions, moduleOptions }) {
|
||||
const [diffModeField, , diffModeHelpers] = useField('diff_mode');
|
||||
const [becomeEnabledField, , becomeEnabledHelpers] =
|
||||
useField('become_enabled');
|
||||
const [verbosityField, verbosityMeta, verbosityHelpers] = useField({
|
||||
const [, verbosityMeta] = useField({
|
||||
name: 'verbosity',
|
||||
validate: required(null),
|
||||
});
|
||||
@ -122,33 +123,16 @@ function AdHocDetailsStep({ verbosityOptions, moduleOptions }) {
|
||||
)
|
||||
}
|
||||
/>
|
||||
<FormGroup
|
||||
|
||||
<VerbositySelectField
|
||||
fieldId="verbosity"
|
||||
aria-label={t`select verbosity`}
|
||||
label={t`Verbosity`}
|
||||
isRequired
|
||||
validated={
|
||||
tooltip={t`These are the verbosity levels for standard out of the command run that are supported.`}
|
||||
isValid={
|
||||
!verbosityMeta.touched || !verbosityMeta.error
|
||||
? 'default'
|
||||
: 'error'
|
||||
}
|
||||
helperTextInvalid={verbosityMeta.error}
|
||||
labelIcon={
|
||||
<Popover
|
||||
content={t`These are the verbosity levels for standard out of the command run that are supported.`}
|
||||
/>
|
||||
}
|
||||
>
|
||||
<AnsibleSelect
|
||||
{...verbosityField}
|
||||
isValid={!verbosityMeta.touched || !verbosityMeta.error}
|
||||
id="verbosity"
|
||||
data={verbosityOptions || []}
|
||||
onChange={(event, value) => {
|
||||
verbosityHelpers.setValue(parseInt(value, 10));
|
||||
}}
|
||||
/>
|
||||
</FormGroup>
|
||||
/>
|
||||
<FormField
|
||||
id="limit"
|
||||
name="limit"
|
||||
@ -296,7 +280,6 @@ function AdHocDetailsStep({ verbosityOptions, moduleOptions }) {
|
||||
|
||||
AdHocDetailsStep.propTypes = {
|
||||
moduleOptions: PropTypes.arrayOf(PropTypes.array).isRequired,
|
||||
verbosityOptions: PropTypes.arrayOf(PropTypes.object).isRequired,
|
||||
};
|
||||
|
||||
export default AdHocDetailsStep;
|
||||
|
||||
@ -3,6 +3,7 @@ import { t } from '@lingui/macro';
|
||||
import { Tooltip } from '@patternfly/react-core';
|
||||
import { ExclamationCircleIcon as PFExclamationCircleIcon } from '@patternfly/react-icons';
|
||||
import styled from 'styled-components';
|
||||
import { VERBOSITY } from '../VerbositySelectField';
|
||||
import { toTitleCase } from '../../util/strings';
|
||||
import { VariablesDetail } from '../CodeEditor';
|
||||
import { jsonToYaml } from '../../util/yaml';
|
||||
@ -21,7 +22,7 @@ const ErrorMessageWrapper = styled.div`
|
||||
margin-bottom: 10px;
|
||||
`;
|
||||
function AdHocPreviewStep({ hasErrors, values }) {
|
||||
const { credential, execution_environment, extra_vars } = values;
|
||||
const { credential, execution_environment, extra_vars, verbosity } = values;
|
||||
|
||||
const items = Object.entries(values);
|
||||
return (
|
||||
@ -44,6 +45,7 @@ function AdHocPreviewStep({ hasErrors, values }) {
|
||||
key !== 'extra_vars' &&
|
||||
key !== 'execution_environment' &&
|
||||
key !== 'credentials' &&
|
||||
key !== 'verbosity' &&
|
||||
!key.startsWith('credential_passwords') && (
|
||||
<Detail key={key} label={toTitleCase(key)} value={value} />
|
||||
)
|
||||
@ -57,6 +59,9 @@ function AdHocPreviewStep({ hasErrors, values }) {
|
||||
value={execution_environment[0]?.name}
|
||||
/>
|
||||
)}
|
||||
{verbosity && (
|
||||
<Detail label={t`Verbosity`} value={VERBOSITY()[values.verbosity]} />
|
||||
)}
|
||||
{extra_vars && (
|
||||
<VariablesDetail
|
||||
value={jsonToYaml(JSON.stringify(extra_vars))}
|
||||
|
||||
@ -5,11 +5,7 @@ import StepName from '../LaunchPrompt/steps/StepName';
|
||||
import AdHocDetailsStep from './AdHocDetailsStep';
|
||||
|
||||
const STEP_ID = 'details';
|
||||
export default function useAdHocDetailsStep(
|
||||
visited,
|
||||
moduleOptions,
|
||||
verbosityOptions
|
||||
) {
|
||||
export default function useAdHocDetailsStep(visited, moduleOptions) {
|
||||
const { values, touched, setFieldError } = useFormikContext();
|
||||
|
||||
const hasError = () => {
|
||||
@ -39,12 +35,7 @@ export default function useAdHocDetailsStep(
|
||||
{t`Details`}
|
||||
</StepName>
|
||||
),
|
||||
component: (
|
||||
<AdHocDetailsStep
|
||||
moduleOptions={moduleOptions}
|
||||
verbosityOptions={verbosityOptions}
|
||||
/>
|
||||
),
|
||||
component: <AdHocDetailsStep moduleOptions={moduleOptions} />,
|
||||
enableNext: true,
|
||||
nextButtonText: t`Next`,
|
||||
},
|
||||
|
||||
@ -24,7 +24,6 @@ function showCredentialPasswordsStep(credential) {
|
||||
|
||||
export default function useAdHocLaunchSteps(
|
||||
moduleOptions,
|
||||
verbosityOptions,
|
||||
organizationId,
|
||||
credentialTypeId
|
||||
) {
|
||||
@ -32,7 +31,7 @@ export default function useAdHocLaunchSteps(
|
||||
|
||||
const [visited, setVisited] = useState({});
|
||||
const steps = [
|
||||
useAdHocDetailsStep(visited, moduleOptions, verbosityOptions),
|
||||
useAdHocDetailsStep(visited, moduleOptions),
|
||||
useAdHocExecutionEnvironmentStep(organizationId),
|
||||
useAdHocCredentialStep(visited, credentialTypeId),
|
||||
useCredentialPasswordsStep(
|
||||
|
||||
@ -46,7 +46,9 @@ function AnsibleSelect({
|
||||
value={option.value}
|
||||
label={option.label}
|
||||
isDisabled={option.isDisabled}
|
||||
/>
|
||||
>
|
||||
{option.label}
|
||||
</FormSelectOption>
|
||||
))}
|
||||
</FormSelect>
|
||||
);
|
||||
|
||||
@ -113,48 +113,6 @@ describe('LaunchButton', () => {
|
||||
expect(history.location.pathname).toEqual('/jobs/9000/output');
|
||||
});
|
||||
|
||||
test('should disable button to prevent duplicate clicks', async () => {
|
||||
WorkflowJobTemplatesAPI.readLaunch.mockResolvedValue({
|
||||
data: {
|
||||
can_start_without_user_input: true,
|
||||
},
|
||||
});
|
||||
const history = createMemoryHistory({
|
||||
initialEntries: ['/jobs/9000'],
|
||||
});
|
||||
WorkflowJobTemplatesAPI.launch.mockImplementation(async () => {
|
||||
// return asynchronously so isLaunching isn't set back to false in the
|
||||
// same tick
|
||||
await new Promise((resolve) => setTimeout(resolve, 10));
|
||||
return {
|
||||
data: {
|
||||
id: 9000,
|
||||
},
|
||||
};
|
||||
});
|
||||
const wrapper = mountWithContexts(
|
||||
<LaunchButton
|
||||
resource={{
|
||||
id: 1,
|
||||
type: 'workflow_job_template',
|
||||
}}
|
||||
>
|
||||
{({ handleLaunch, isLaunching }) => (
|
||||
<button type="submit" onClick={handleLaunch} disabled={isLaunching} />
|
||||
)}
|
||||
</LaunchButton>,
|
||||
{
|
||||
context: {
|
||||
router: { history },
|
||||
},
|
||||
}
|
||||
);
|
||||
const button = wrapper.find('button');
|
||||
await act(() => button.prop('onClick')());
|
||||
wrapper.update();
|
||||
expect(wrapper.find('button').prop('disabled')).toEqual(false);
|
||||
});
|
||||

    test('should relaunch job correctly', async () => {
      JobsAPI.readRelaunch.mockResolvedValue({
        data: {
@@ -9,6 +9,7 @@ import { TagMultiSelect } from '../../MultiSelect';
import AnsibleSelect from '../../AnsibleSelect';
import { VariablesField } from '../../CodeEditor';
import Popover from '../../Popover';
+import { VerbositySelectField } from '../../VerbositySelectField';

const FieldHeader = styled.div`
  display: flex;

@@ -57,7 +58,7 @@ function OtherPromptsStep({ launchConfig, variablesMode, onVarModeChange }) {
          aria-label={t`Job Tags`}
          tooltip={t`Tags are useful when you have a large
          playbook, and you want to run a specific part of a play or task.
-         Use commas to separate multiple tags. Refer to Ansible Tower
+         Use commas to separate multiple tags. Refer to Ansible Controller
          documentation for details on the usage of tags.`}
        />
      )}

@@ -69,7 +70,7 @@ function OtherPromptsStep({ launchConfig, variablesMode, onVarModeChange }) {
          aria-label={t`Skip Tags`}
          tooltip={t`Skip tags are useful when you have a large
          playbook, and you want to skip specific parts of a play or task.
-         Use commas to separate multiple tags. Refer to Ansible Tower
+         Use commas to separate multiple tags. Refer to Ansible Controller
          documentation for details on the usage of tags.`}
        />
      )}
@@ -129,36 +130,16 @@ function JobTypeField() {
}

function VerbosityField() {
-  const [field, meta, helpers] = useField('verbosity');
-  const options = [
-    { value: '0', key: '0', label: t`0 (Normal)` },
-    { value: '1', key: '1', label: t`1 (Verbose)` },
-    { value: '2', key: '2', label: t`2 (More Verbose)` },
-    { value: '3', key: '3', label: t`3 (Debug)` },
-    { value: '4', key: '4', label: t`4 (Connection Debug)` },
-  ];
+  const [, meta] = useField('verbosity');
  const isValid = !(meta.touched && meta.error);

  return (
-    <FormGroup
+    <VerbositySelectField
      fieldId="prompt-verbosity"
-      validated={isValid ? 'default' : 'error'}
-      label={t`Verbosity`}
-      labelIcon={
-        <Popover
-          content={t`Control the level of output ansible
+      tooltip={t`Control the level of output ansible
        will produce as the playbook executes.`}
-        />
-      }
-    >
-      <AnsibleSelect
-        id="prompt-verbosity"
-        data={options}
-        {...field}
-        onChange={(event, value) => helpers.setValue(value)}
-      />
-    </FormGroup>
+      isValid={isValid ? 'default' : 'error'}
+    />
  );
}
@@ -85,7 +85,7 @@ describe('OtherPromptsStep', () => {
    expect(wrapper.find('VerbosityField')).toHaveLength(1);
    expect(
      wrapper.find('VerbosityField AnsibleSelect').prop('data')
-   ).toHaveLength(5);
+   ).toHaveLength(6);
  });

  test('should render show changes toggle', async () => {
@@ -349,7 +349,7 @@ function HostFilterLookup({
          content={t`Populate the hosts for this inventory by using a search
          filter. Example: ansible_facts__ansible_distribution:"RedHat".
          Refer to the documentation for further syntax and
-         examples. Refer to the Ansible Tower documentation for further syntax and
+         examples. Refer to the Ansible Controller documentation for further syntax and
          examples.`}
        />
      }
@@ -14,6 +14,7 @@ import PromptProjectDetail from './PromptProjectDetail';
import PromptInventorySourceDetail from './PromptInventorySourceDetail';
import PromptJobTemplateDetail from './PromptJobTemplateDetail';
import PromptWFJobTemplateDetail from './PromptWFJobTemplateDetail';
+import { VERBOSITY } from '../VerbositySelectField';

const PromptTitle = styled(Title)`
  margin-top: var(--pf-global--spacer--xl);

@@ -93,14 +94,6 @@ function PromptDetail({
  overrides = {},
  workflowNode = false,
}) {
-  const VERBOSITY = {
-    0: t`0 (Normal)`,
-    1: t`1 (Verbose)`,
-    2: t`2 (More Verbose)`,
-    3: t`3 (Debug)`,
-    4: t`4 (Connection Debug)`,
-  };
-
  const details = omitOverrides(resource, overrides, launchConfig.defaults);
  details.type = overrides?.nodeType || details.type;
  const hasOverrides = Object.keys(overrides).length > 0;

@@ -226,7 +219,7 @@ function PromptDetail({
        launchConfig.ask_verbosity_on_launch ? (
          <Detail
            label={t`Verbosity`}
-           value={VERBOSITY[overrides.verbosity]}
+           value={VERBOSITY()[overrides.verbosity]}
          />
        ) : null}
        {launchConfig.ask_tags_on_launch && (
@@ -13,6 +13,7 @@ import { VariablesDetail } from '../CodeEditor';
import CredentialChip from '../CredentialChip';
import ChipGroup from '../ChipGroup';
import ExecutionEnvironmentDetail from '../ExecutionEnvironmentDetail';
+import { VERBOSITY } from '../VerbositySelectField';

function PromptInventorySourceDetail({ resource }) {
  const {

@@ -32,14 +33,6 @@ function PromptInventorySourceDetail({ resource }) {
    verbosity,
  } = resource;

-  const VERBOSITY = {
-    0: t`0 (Normal)`,
-    1: t`1 (Verbose)`,
-    2: t`2 (More Verbose)`,
-    3: t`3 (Debug)`,
-    4: t`4 (Connection Debug)`,
-  };
-
  let optionsList = '';
  if (
    overwrite ||

@@ -115,7 +108,7 @@ function PromptInventorySourceDetail({ resource }) {
        executionEnvironment={summary_fields?.execution_environment}
      />
      <Detail label={t`Inventory File`} value={source_path} />
-     <Detail label={t`Verbosity`} value={VERBOSITY[verbosity]} />
+     <Detail label={t`Verbosity`} value={VERBOSITY()[verbosity]} />
      <Detail
        label={t`Cache Timeout`}
        value={`${update_cache_timeout} ${t`Seconds`}`}
@@ -15,6 +15,7 @@ import Sparkline from '../Sparkline';
import { Detail, DeletedDetail } from '../DetailList';
import { VariablesDetail } from '../CodeEditor';
import ExecutionEnvironmentDetail from '../ExecutionEnvironmentDetail';
+import { VERBOSITY } from '../VerbositySelectField';

function PromptJobTemplateDetail({ resource }) {
  const {

@@ -42,14 +43,6 @@ function PromptJobTemplateDetail({ resource }) {
    custom_virtualenv,
  } = resource;

-  const VERBOSITY = {
-    0: t`0 (Normal)`,
-    1: t`1 (Verbose)`,
-    2: t`2 (More Verbose)`,
-    3: t`3 (Debug)`,
-    4: t`4 (Connection Debug)`,
-  };
-
  let optionsList = '';
  if (
    become_enabled ||

@@ -153,7 +146,7 @@ function PromptJobTemplateDetail({ resource }) {
      <Detail label={t`Playbook`} value={playbook} />
      <Detail label={t`Forks`} value={forks || '0'} />
      <Detail label={t`Limit`} value={limit} />
-     <Detail label={t`Verbosity`} value={VERBOSITY[verbosity]} />
+     <Detail label={t`Verbosity`} value={VERBOSITY()[verbosity]} />
      {typeof diff_mode === 'boolean' && (
        <Detail label={t`Show Changes`} value={diff_mode ? t`On` : t`Off`} />
      )}
@@ -11,6 +11,7 @@ import { formatDateString } from 'util/dates';
import useRequest, { useDismissableError } from 'hooks/useRequest';
import { JobTemplatesAPI, SchedulesAPI, WorkflowJobTemplatesAPI } from 'api';
import { parseVariableField, jsonToYaml } from 'util/yaml';
+import { useConfig } from 'contexts/Config';
import AlertModal from '../../AlertModal';
import { CardBody, CardActionsRow } from '../../Card';
import ContentError from '../../ContentError';

@@ -23,6 +24,8 @@ import DeleteButton from '../../DeleteButton';
import ErrorDetail from '../../ErrorDetail';
import ChipGroup from '../../ChipGroup';
import { VariablesDetail } from '../../CodeEditor';
+import { VERBOSITY } from '../../VerbositySelectField';
+import helpText from '../../../screens/Template/shared/JobTemplate.helptext';

const PromptDivider = styled(Divider)`
  margin-top: var(--pf-global--spacer--lg);

@@ -38,7 +41,6 @@ const PromptTitle = styled(Title)`
const PromptDetailList = styled(DetailList)`
  padding: 0px 20px;
`;

function ScheduleDetail({ hasDaysToKeepField, schedule, surveyConfig }) {
  const {
    id,

@@ -66,14 +68,7 @@ function ScheduleDetail({ hasDaysToKeepField, schedule, surveyConfig }) {
  const history = useHistory();
  const { pathname } = useLocation();
  const pathRoot = pathname.substr(0, pathname.indexOf('schedules'));

-  const VERBOSITY = {
-    0: t`0 (Normal)`,
-    1: t`1 (Verbose)`,
-    2: t`2 (More Verbose)`,
-    3: t`3 (Debug)`,
-    4: t`4 (Connection Debug)`,
-  };
+  const config = useConfig();

  const {
    request: deleteSchedule,

@@ -216,7 +211,7 @@ function ScheduleDetail({ hasDaysToKeepField, schedule, surveyConfig }) {
  const showLimitDetail = ask_limit_on_launch && limit;
  const showJobTypeDetail = ask_job_type_on_launch && job_type;
  const showSCMBranchDetail = ask_scm_branch_on_launch && scm_branch;
- const showVerbosityDetail = ask_verbosity_on_launch && VERBOSITY[verbosity];
+ const showVerbosityDetail = ask_verbosity_on_launch && VERBOSITY()[verbosity];

  const showPromptedFields =
    showCredentialsDetail ||

@@ -267,7 +262,11 @@ function ScheduleDetail({ hasDaysToKeepField, schedule, surveyConfig }) {
        value={formatDateString(next_run, timezone)}
      />
      <Detail label={t`Last Run`} value={formatDateString(dtend, timezone)} />
-     <Detail label={t`Local Time Zone`} value={timezone} />
+     <Detail
+       label={t`Local Time Zone`}
+       value={timezone}
+       helpText={helpText.localTimeZone(config)}
+     />
      <Detail label={t`Repeat Frequency`} value={repeatFrequency} />
      {hasDaysToKeepField ? (
        <Detail label={t`Days of Data to Keep`} value={daysToKeep} />

@@ -313,7 +312,7 @@ function ScheduleDetail({ hasDaysToKeepField, schedule, surveyConfig }) {
        />
      )}
      {ask_verbosity_on_launch && (
-       <Detail label={t`Verbosity`} value={VERBOSITY[verbosity]} />
+       <Detail label={t`Verbosity`} value={VERBOSITY()[verbosity]} />
      )}
      {ask_scm_branch_on_launch && (
        <Detail label={t`Source Control Branch`} value={scm_branch} />
@@ -164,6 +164,9 @@ describe('<ScheduleDetail />', () => {
    expect(
      wrapper.find('Detail[label="Local Time Zone"]').find('dd').text()
    ).toBe('America/New_York');
+   expect(
+     wrapper.find('Detail[label="Local Time Zone"]').prop('helpText')
+   ).toBeDefined();
    expect(wrapper.find('Detail[label="Repeat Frequency"]').length).toBe(1);
    expect(wrapper.find('Detail[label="Created"]').length).toBe(1);
    expect(wrapper.find('Detail[label="Last Modified"]').length).toBe(1);
@@ -14,12 +14,13 @@ import {
  // To be removed once UI completes complex schedules
  Alert,
} from '@patternfly/react-core';
-import { Config } from 'contexts/Config';
+import { Config, useConfig } from 'contexts/Config';
import { SchedulesAPI } from 'api';
import { dateToInputDateTime } from 'util/dates';
import useRequest from 'hooks/useRequest';
import { required } from 'util/validators';
import { parseVariableField } from 'util/yaml';
import Popover from '../../Popover';
import AnsibleSelect from '../../AnsibleSelect';
import ContentError from '../../ContentError';
import ContentLoading from '../../ContentLoading';

@@ -33,6 +34,7 @@ import FrequencyDetailSubform from './FrequencyDetailSubform';
import SchedulePromptableFields from './SchedulePromptableFields';
import DateTimePicker from './DateTimePicker';
import buildRuleObj from './buildRuleObj';
+import helpText from '../../../screens/Template/shared/JobTemplate.helptext';

const NUM_DAYS_PER_FREQUENCY = {
  week: 7,

@@ -118,6 +120,9 @@ function ScheduleFormFields({ hasDaysToKeepField, zoneOptions, zoneLinks }) {
  } else if (timezoneMessage) {
    timezoneValidatedStatus = 'warning';
  }

+ const config = useConfig();
+
  return (
    <>
      <FormField

@@ -147,6 +152,7 @@ function ScheduleFormFields({ hasDaysToKeepField, zoneOptions, zoneLinks }) {
        validated={timezoneValidatedStatus}
        label={t`Local time zone`}
        helperText={timezoneMessage}
+       labelIcon={<Popover content={helpText.localTimeZone(config)} />}
      >
        <AnsibleSelect
          id="schedule-timezone"
@@ -91,6 +91,9 @@ const defaultFieldsVisible = () => {
  expect(wrapper.find('FormGroup[label="Description"]').length).toBe(1);
  expect(wrapper.find('FormGroup[label="Start date/time"]').length).toBe(1);
  expect(wrapper.find('FormGroup[label="Local time zone"]').length).toBe(1);
+ expect(
+   wrapper.find('FormGroup[label="Local time zone"]').find('HelpIcon').length
+ ).toBe(1);
  expect(wrapper.find('FormGroup[label="Run frequency"]').length).toBe(1);
};
@@ -0,0 +1,58 @@
import React from 'react';
import { t } from '@lingui/macro';
import { useField } from 'formik';
import { FormGroup } from '@patternfly/react-core';
import Popover from 'components/Popover';
import AnsibleSelect from 'components/AnsibleSelect';
import FieldWithPrompt from 'components/FieldWithPrompt';

const VERBOSITY = () => ({
  0: t`0 (Normal)`,
  1: t`1 (Verbose)`,
  2: t`2 (More Verbose)`,
  3: t`3 (Debug)`,
  4: t`4 (Connection Debug)`,
  5: t`5 (WinRM Debug)`,
});

function VerbositySelectField({
  fieldId,
  promptId,
  promptName,
  tooltip,
  isValid,
}) {
  const VERBOSE_OPTIONS = Object.entries(VERBOSITY()).map(([k, v]) => ({
    key: `${k}`,
    value: `${k}`,
    label: v,
  }));
  const [verbosityField, , verbosityHelpers] = useField('verbosity');
  return promptId ? (
    <FieldWithPrompt
      fieldId={fieldId}
      label={t`Verbosity`}
      promptId={promptId}
      promptName={promptName}
      tooltip={tooltip}
    >
      <AnsibleSelect id={fieldId} data={VERBOSE_OPTIONS} {...verbosityField} />
    </FieldWithPrompt>
  ) : (
    <FormGroup
      fieldId={fieldId}
      validated={isValid ? 'default' : 'error'}
      label={t`Verbosity`}
      labelIcon={<Popover content={tooltip} />}
    >
      <AnsibleSelect
        id={fieldId}
        data={VERBOSE_OPTIONS}
        {...verbosityField}
        onChange={(event, value) => verbosityHelpers.setValue(value)}
      />
    </FormGroup>
  );
}

export { VerbositySelectField, VERBOSITY };

awx/ui/src/components/VerbositySelectField/index.js (new file)
@@ -0,0 +1 @@
export { VERBOSITY, VerbositySelectField } from './VerbositySelectField';
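Note that the new shared `VERBOSITY` export is a function returning an object, not a constant object, which is why every call site in this diff changes from `VERBOSITY[verbosity]` to `VERBOSITY()[verbosity]`. With lingui, a `t`-tagged template is resolved against the locale active when it is evaluated, so a module-level constant would freeze the labels in whatever locale was loaded first. A minimal sketch of the difference, using a stub translator (`catalogs`, `activeLocale`, and the simplified `t` are invented for illustration):

```javascript
// Stub translator: resolves a tagged template against the active catalog.
let activeLocale = 'en';
const catalogs = {
  en: { '0 (Normal)': '0 (Normal)' },
  fr: { '0 (Normal)': '0 (Normale)' },
};
const t = (strings) => catalogs[activeLocale][strings[0]] || strings[0];

// Eager: the label is translated once, at module load.
const VERBOSITY_EAGER = { 0: t`0 (Normal)` };

// Lazy (the pattern in the new file): translated at each call.
const VERBOSITY = () => ({ 0: t`0 (Normal)` });

activeLocale = 'fr';
// VERBOSITY_EAGER[0] still holds the English label,
// while VERBOSITY()[0] now resolves through the French catalog.
```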
File diff suppressed because it is too large
Load Diff
@@ -418,7 +418,7 @@ describe('<CredentialForm />', () => {
    ).toBe(false);
    expect(wrapper.find('FormGroup[label="Credential Type"]').length).toBe(1);
    expect(
-     wrapper.find('FormGroup[label="Ansible Tower Hostname"]').length
+     wrapper.find('FormGroup[label="Ansible Controller Hostname"]').length
    ).toBe(1);
    expect(wrapper.find('FormGroup[label="Username"]').length).toBe(1);
    expect(wrapper.find('FormGroup[label="Password"]').length).toBe(1);
@@ -61,7 +61,7 @@
  },
  "created": "2020-05-18T21:53:35.334813Z",
  "modified": "2020-05-18T21:54:05.424087Z",
- "name": "Ansible Tower",
+ "name": "Ansible Controller",
  "description": "",
  "kind": "cloud",
  "namespace": "tower",

@@ -70,9 +70,9 @@
  "fields": [
    {
      "id": "host",
-     "label": "Ansible Tower Hostname",
+     "label": "Ansible Controller Hostname",
      "type": "string",
-     "help_text": "The Ansible Tower base URL to authenticate with."
+     "help_text": "The Ansible Controller base URL to authenticate with."
    },
    {
      "id": "username",
@@ -3,7 +3,7 @@
  "type": "credential",
  "url": "/api/v2/credentials/4/",
  "related": {
-   "named_url": "/api/v2/credentials/Tower cred++Ansible Tower+cloud++/",
+   "named_url": "/api/v2/credentials/Tower cred++Ansible Controller+cloud++/",
    "created_by": "/api/v2/users/2/",
    "modified_by": "/api/v2/users/2/",
    "activity_stream": "/api/v2/credentials/4/activity_stream/",

@@ -19,7 +19,7 @@
  "summary_fields": {
    "credential_type": {
      "id": 16,
-     "name": "Ansible Tower",
+     "name": "Ansible Controller",
      "description": ""
    },
    "created_by": {
@@ -32,7 +32,7 @@ function CredentialTypeFormFields() {
      />
      <FormFullWidthLayout>
        <VariablesField
-         tooltip={t`Enter inputs using either JSON or YAML syntax. Refer to the Ansible Tower documentation for example syntax.`}
+         tooltip={t`Enter inputs using either JSON or YAML syntax. Refer to the Ansible Controller documentation for example syntax.`}
          id="credential-type-inputs-configuration"
          name="inputs"
          label={t`Input configuration`}

@@ -40,7 +40,7 @@ function CredentialTypeFormFields() {
      </FormFullWidthLayout>
      <FormFullWidthLayout>
        <VariablesField
-         tooltip={t`Enter injectors using either JSON or YAML syntax. Refer to the Ansible Tower documentation for example syntax.`}
+         tooltip={t`Enter injectors using either JSON or YAML syntax. Refer to the Ansible Controller documentation for example syntax.`}
          id="credential-type-injectors-configuration"
          name="injectors"
          label={t`Injector configuration`}
@@ -29,6 +29,7 @@ import { relatedResourceDeleteRequests } from 'util/getRelatedResourceDeleteDeta
import useIsMounted from 'hooks/useIsMounted';
import { formatDateString } from 'util/dates';
import Popover from 'components/Popover';
+import { VERBOSITY } from 'components/VerbositySelectField';
import InventorySourceSyncButton from '../shared/InventorySourceSyncButton';
import useWsInventorySourcesDetails from '../InventorySources/useWsInventorySourcesDetails';
import helpText from '../shared/Inventory.helptext';

@@ -111,12 +112,6 @@ function InventorySourceDetail({ inventorySource }) {
    inventorySource.id
  );

-  const VERBOSITY = {
-    0: t`0 (Warning)`,
-    1: t`1 (Info)`,
-    2: t`2 (Debug)`,
-  };
-
  let optionsList = '';
  if (
    overwrite ||

@@ -251,7 +246,7 @@ function InventorySourceDetail({ inventorySource }) {
      <Detail
        label={t`Verbosity`}
        helpText={helpText.subFormVerbosityFields}
-       value={VERBOSITY[verbosity]}
+       value={VERBOSITY()[verbosity]}
      />
      <Detail
        label={t`Cache timeout`}
@@ -93,7 +93,7 @@ describe('InventorySourceDetail', () => {
    assertDetail(wrapper, 'Organization', 'Mock Org');
    assertDetail(wrapper, 'Project', 'Mock Project');
    assertDetail(wrapper, 'Inventory file', 'foo');
-   assertDetail(wrapper, 'Verbosity', '2 (Debug)');
+   assertDetail(wrapper, 'Verbosity', '2 (More Verbose)');
    assertDetail(wrapper, 'Cache timeout', '2 seconds');
    const executionEnvironment = wrapper.find('ExecutionEnvironmentDetail');
    expect(executionEnvironment).toHaveLength(1);
@@ -93,7 +93,7 @@ const SmartInventoryFormFields = ({ inventory }) => {
        label={t`Variables`}
        tooltip={t`Enter inventory variables using either JSON or YAML syntax.
        Use the radio button to toggle between the two. Refer to the
-       Ansible Tower documentation for example syntax.`}
+       Ansible Controller documentation for example syntax.`}
      />
    </FormFullWidthLayout>
  </>
@@ -25,6 +25,7 @@ import { LaunchButton, ReLaunchDropDown } from 'components/LaunchButton';
import StatusLabel from 'components/StatusLabel';
import JobCancelButton from 'components/JobCancelButton';
import ExecutionEnvironmentDetail from 'components/ExecutionEnvironmentDetail';
+import { VERBOSITY } from 'components/VerbositySelectField';
import { getJobModel, isJobRunning } from 'util/jobs';
import { formatDateString } from 'util/dates';
import { Job } from 'types';

@@ -37,14 +38,6 @@ const StatusDetailValue = styled.div`
  grid-template-columns: auto auto;
`;

-const VERBOSITY = {
-  0: '0 (Normal)',
-  1: '1 (Verbose)',
-  2: '2 (More Verbose)',
-  3: '3 (Debug)',
-  4: '4 (Connection Debug)',
-};
-
function JobDetail({ job, inventorySourceLabels }) {
  const { me } = useConfig();
  const {

@@ -332,7 +325,7 @@ function JobDetail({ job, inventorySourceLabels }) {
        dataCy="job-verbosity"
        label={t`Verbosity`}
        helpText={jobHelpText.verbosity}
-       value={VERBOSITY[job.verbosity]}
+       value={VERBOSITY()[job.verbosity]}
      />
      {job.type !== 'workflow_job' && !isJobRunning(job.status) && (
        <ExecutionEnvironmentDetail
@@ -1,17 +1,29 @@
import React, { useEffect } from 'react';
import { Link, useParams } from 'react-router-dom';
import 'styled-components/macro';
import { t } from '@lingui/macro';
-import { SearchIcon } from '@patternfly/react-icons';
+import {
+  SearchIcon,
+  ExclamationCircleIcon as PFExclamationCircleIcon,
+} from '@patternfly/react-icons';
import ContentEmpty from 'components/ContentEmpty';

+import styled from 'styled-components';
+
+const ExclamationCircleIcon = styled(PFExclamationCircleIcon)`
+  color: var(--pf-global--danger-color--100);
+`;
+
export default function EmptyOutput({
  hasQueryParams,
  isJobRunning,
  onUnmount,
+ job,
}) {
  let title;
  let message;
  let icon;
+ const { typeSegment, id } = useParams();

  useEffect(() => onUnmount);

@@ -21,6 +33,21 @@ export default function EmptyOutput({
    icon = SearchIcon;
  } else if (isJobRunning) {
    title = t`Waiting for job output…`;
+ } else if (job.status === 'failed') {
+   title = t`This job failed and has no output.`;
+   message = (
+     <>
+       {t`Return to `}{' '}
+       <Link to={`/jobs/${typeSegment}/${id}/details`}>{t`details.`}</Link>
+       <br />
+       {job.job_explanation && (
+         <>
+           {t`Failure Explanation:`} {`${job.job_explanation}`}
+         </>
+       )}
+     </>
+   );
+   icon = ExclamationCircleIcon;
  } else {
    title = t`No output found for this job.`;
  }
@@ -687,6 +687,7 @@ function JobOutput({ job, eventRelatedSearchableKeys, eventSearchableKeys }) {
  ) {
    return (
      <EmptyOutput
+       job={job}
        hasQueryParams={location.search.length > 1}
        isJobRunning={isJobRunning(jobStatus)}
        onUnmount={() => {
@@ -134,4 +134,20 @@ describe('<JobOutput />', () => {
    });
    await waitForElement(wrapper, 'ContentError', (el) => el.length === 1);
  });
+ test('should show failed empty output screen', async () => {
+   JobsAPI.readEvents.mockResolvedValue({
+     data: {
+       count: 0,
+       next: null,
+       previous: null,
+       results: [],
+     },
+   });
+   await act(async () => {
+     wrapper = mountWithContexts(
+       <JobOutput job={{ ...mockJob, status: 'failed' }} />
+     );
+   });
+   await waitForElement(wrapper, 'EmptyOutput', (el) => el.length === 1);
+ });
});
@@ -93,7 +93,7 @@ function CustomMessagesSubForm({ defaultMessages, type }) {
            config
          )}/html/userguide/notifications.html#create-custom-notifications`}
        >
-         {t`Ansible Tower Documentation.`}
+         {t`Ansible Controller Documentation.`}
        </a>
      </small>
    </Text>
@@ -28,7 +28,7 @@ const helpText = {
  twilioDestinationNumbers: t`Use one phone number per line to specify where to
  route SMS messages. Phone numbers should be formatted +11231231234. For more information see Twilio documentation`,
  webhookHeaders: t`Specify HTTP Headers in JSON format. Refer to
- the Ansible Tower documentation for example syntax.`,
+ the Ansible Controller documentation for example syntax.`,
};

export default helpText;
@@ -35,6 +35,13 @@ function SubscriptionDetail() {
    },
  ];

+ const { automated_instances: automatedInstancesCount, automated_since } =
+   license_info;
+
+ const automatedInstancesSinceDateTime = automated_since
+   ? formatDateString(new Date(automated_since * 1000).toISOString())
+   : null;
+
  return (
    <>
      <RoutedTabs tabsArray={tabsArray} />

@@ -127,19 +134,23 @@ function SubscriptionDetail() {
        label={t`Hosts imported`}
        value={license_info.current_instances}
      />
-     <Detail
-       dataCy="subscription-hosts-automated"
-       label={t`Hosts automated`}
-       value={
-         <>
-           {license_info.automated_instances} <Trans>since</Trans>{' '}
-           {license_info.automated_since &&
-             formatDateString(
-               new Date(license_info.automated_since * 1000).toISOString()
-             )}
-         </>
-       }
-     />
+     {typeof automatedInstancesCount !== 'undefined' &&
+       automatedInstancesCount !== null && (
+         <Detail
+           dataCy="subscription-hosts-automated"
+           label={t`Hosts automated`}
+           value={
+             automated_since ? (
+               <Trans>
+                 {automatedInstancesCount} since{' '}
+                 {automatedInstancesSinceDateTime}
+               </Trans>
+             ) : (
+               automatedInstancesCount
+             )
+           }
+         />
+       )}
      <Detail
        dataCy="subscription-hosts-remaining"
        label={t`Hosts remaining`}
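The `automated_since` value handled above arrives from the subscription API as Unix epoch seconds, while JavaScript `Date` takes milliseconds, hence the `* 1000` before `toISOString()`. A minimal sketch of that conversion step in isolation (the helper name is ours, not from the codebase):

```javascript
// The API reports the timestamp in epoch seconds; Date wants milliseconds.
function epochSecondsToISO(epochSeconds) {
  return new Date(epochSeconds * 1000).toISOString();
}

// Example: the Unix epoch itself.
// epochSecondsToISO(0) yields '1970-01-01T00:00:00.000Z'
```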
@@ -82,4 +82,17 @@ describe('<SubscriptionDetail />', () => {

    expect(wrapper.find('Button[aria-label="edit"]').length).toBe(1);
  });

+ test('should not render Hosts Automated Detail if license_info.automated_instances is undefined', () => {
+   wrapper = mountWithContexts(<SubscriptionDetail />, {
+     context: {
+       config: {
+         ...config,
+         license_info: { ...config.license_info, automated_instances: null },
+       },
+     },
+   });
+
+   expect(wrapper.find(`Detail[label="Hosts automated"]`).length).toBe(0);
+ });
});
@@ -28,6 +28,7 @@ import DeleteButton from 'components/DeleteButton';
import ErrorDetail from 'components/ErrorDetail';
import { LaunchButton } from 'components/LaunchButton';
import { VariablesDetail } from 'components/CodeEditor';
+import { VERBOSITY } from 'components/VerbositySelectField';
import { JobTemplatesAPI } from 'api';
import useRequest, { useDismissableError } from 'hooks/useRequest';
import useBrandName from 'hooks/useBrandName';

@@ -104,17 +105,6 @@ function JobTemplateDetail({ template }) {
    relatedResourceDeleteRequests.template(template);
  const canLaunch =
    summary_fields.user_capabilities && summary_fields.user_capabilities.start;
- const verbosityOptions = [
-   { verbosity: 0, details: t`0 (Normal)` },
-   { verbosity: 1, details: t`1 (Verbose)` },
-   { verbosity: 2, details: t`2 (More Verbose)` },
-   { verbosity: 3, details: t`3 (Debug)` },
-   { verbosity: 4, details: t`4 (Connection Debug)` },
-   { verbosity: 5, details: t`5 (WinRM Debug)` },
- ];
- const verbosityDetails = verbosityOptions.filter(
-   (option) => option.verbosity === verbosity
- );
  const generateCallBackUrl = `${window.location.origin + url}callback/`;
  const renderOptionsField =
    become_enabled ||

@@ -272,7 +262,7 @@ function JobTemplateDetail({ template }) {
      />
      <Detail
        label={t`Verbosity`}
-       value={verbosityDetails[0].details}
+       value={VERBOSITY()[verbosity]}
        dataCy="jt-detail-verbosity"
        helpText={helpText.verbosity}
      />
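Beyond deduplication, the JobTemplateDetail change above swaps an array scan (`filter` plus `[0]`) for a direct keyed lookup on the shared map. A small sketch of the two styles side by side, with untranslated string stand-ins for the `t` template labels:

```javascript
// Old style: an options array scanned with filter, first match taken.
const verbosityOptions = [
  { verbosity: 0, details: '0 (Normal)' },
  { verbosity: 1, details: '1 (Verbose)' },
];
const byFilter = verbosityOptions.filter((o) => o.verbosity === 1)[0].details;

// New style: a plain map keyed by verbosity level, indexed directly.
const VERBOSITY = () => ({ 0: '0 (Normal)', 1: '1 (Verbose)' });
const byMap = VERBOSITY()[1];
```

The map lookup also degrades more gracefully: an unknown level yields `undefined` (and an empty Detail) rather than a `TypeError` from indexing `[0]` on an empty filter result.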
@@ -44,7 +44,7 @@ function AnswerTypeField() {
      labelIcon={
        <Popover
          content={t`Choose an answer type or format you want as the prompt for the user.
-         Refer to the Ansible Tower Documentation for more additional
+         Refer to the Ansible Controller Documentation for more additional
          information about each option.`}
        />
      }

@@ -266,8 +266,8 @@ function SurveyQuestionForm({
            target="_blank"
            rel="noreferrer"
          >
-           {t`documentation`}{' '}
-         </a>
+           {t`documentation`}
+         </a>{' '}
          {t`for more information.`}
        </>
      }
@@ -1,5 +1,6 @@
import React from 'react';
import { t } from '@lingui/macro';
+import getDocsBaseUrl from 'util/getDocsBaseUrl';

const jtHelpTextStrings = {
  jobType: t`For job templates, select run to execute the playbook. Select check to only check playbook syntax, test environment setup, and report problems without executing the playbook.`,

@@ -46,6 +47,19 @@ const jtHelpTextStrings = {
      {t`Refer to the Ansible documentation for details about the configuration file.`}
    </span>
  ),
+ localTimeZone: (config = '') => (
+   <span>
+     {t`Refer to the`}{' '}
+     <a
+       href={`${getDocsBaseUrl(config)}/html/userguide/scheduling.html`}
+       target="_blank"
+       rel="noreferrer"
+     >
+       {t`documentation`}
+     </a>{' '}
+     {t`for more information.`}
+   </span>
+ ),
};

export default jtHelpTextStrings;
@@ -43,6 +43,7 @@ import Popover from 'components/Popover';
import { JobTemplatesAPI } from 'api';
import useIsMounted from 'hooks/useIsMounted';
import LabelSelect from 'components/LabelSelect';
+import { VerbositySelectField } from 'components/VerbositySelectField';
import PlaybookSelect from './PlaybookSelect';
import WebhookSubForm from './WebhookSubForm';
import helpText from './JobTemplate.helptext';

@@ -85,7 +86,6 @@ function JobTemplateForm({
  const [credentialField, , credentialHelpers] = useField('credentials');
  const [labelsField, , labelsHelpers] = useField('labels');
  const [limitField, limitMeta, limitHelpers] = useField('limit');
- const [verbosityField] = useField('verbosity');
  const [diffModeField, , diffModeHelpers] = useField('diff_mode');
  const [instanceGroupsField, , instanceGroupsHelpers] =
    useField('instanceGroups');

@@ -215,13 +215,6 @@ function JobTemplateForm({
      isDisabled: false,
    },
  ];
- const verbosityOptions = [
-   { value: '0', key: '0', label: t`0 (Normal)` },
-   { value: '1', key: '1', label: t`1 (Verbose)` },
-   { value: '2', key: '2', label: t`2 (More Verbose)` },
-   { value: '3', key: '3', label: t`3 (Debug)` },
-   { value: '4', key: '4', label: t`4 (Connection Debug)` },
- ];
  let callbackUrl;
  if (template?.related) {
    const path = template.related.callback || `${template.url}callback`;

@@ -428,19 +421,12 @@ function JobTemplateForm({
          }}
        />
      </FieldWithPrompt>
-     <FieldWithPrompt
+     <VerbositySelectField
        fieldId="template-verbosity"
-       label={t`Verbosity`}
        promptId="template-ask-verbosity-on-launch"
        promptName="ask_verbosity_on_launch"
        tooltip={helpText.verbosity}
-     >
-       <AnsibleSelect
-         id="template-verbosity"
-         data={verbosityOptions}
-         {...verbosityField}
-       />
-     </FieldWithPrompt>
+     />
      <FormField
        id="template-job-slicing"
        name="job_slice_count"
@@ -11,7 +11,7 @@ const wfHelpTextStrings = {
   labels: t`Optional labels that describe this job template,
     such as 'dev' or 'test'. Labels can be used to group and filter
     job templates and completed jobs.`,
-  variables: t`Pass extra command line variables to the playbook. This is the -e or --extra-vars command line parameter for ansible-playbook. Provide key/value pairs using either YAML or JSON. Refer to the Ansible Tower documentation for example syntax.`,
+  variables: t`Pass extra command line variables to the playbook. This is the -e or --extra-vars command line parameter for ansible-playbook. Provide key/value pairs using either YAML or JSON. Refer to the Ansible Controller documentation for example syntax.`,
   enableWebhook: t`Enable Webhook for this workflow job template.`,
   enableConcurrentJobs: t`If enabled, simultaneous runs of this workflow job template will be allowed.`,
   webhookURL: t`Webhook services can launch jobs with this workflow job template by making a POST request to this URL.`,
@@ -105,9 +105,6 @@ options:
     description:
       - Project to use as source with scm option
     type: str
-  update_on_project_update:
-    description: Update this source when the related project updates if source is C(scm)
-    type: bool
   state:
     description:
       - Desired state of the resource.
@@ -181,7 +178,6 @@ def main():
         update_on_launch=dict(type='bool'),
         update_cache_timeout=dict(type='int'),
         source_project=dict(),
-        update_on_project_update=dict(type='bool'),
         notification_templates_started=dict(type="list", elements='str'),
         notification_templates_success=dict(type="list", elements='str'),
         notification_templates_error=dict(type="list", elements='str'),
@@ -273,7 +269,6 @@ def main():
         'verbosity',
         'update_on_launch',
         'update_cache_timeout',
-        'update_on_project_update',
         'enabled_var',
         'enabled_value',
         'host_filter',
@@ -105,7 +105,7 @@ options:
       - 5
   unified_job_template:
     description:
-      - Name of unified job template to schedule.
+      - Name of unified job template to schedule. Used to look up an already existing schedule.
     required: False
     type: str
   organization:
@@ -158,6 +162,12 @@ EXAMPLES = '''
         every: 1
         on_days: 'sunday'
         include: False
+
+- name: Delete 'my_schedule' schedule for my_workflow
+  schedule:
+    name: "my_schedule"
+    state: absent
+    unified_job_template: my_workflow
 '''

 from ..module_utils.controller_api import ControllerAPIModule
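The schedule examples above are driven by iCal-style strings such as `DTSTART:20191219T130551Z RRULE:FREQ=WEEKLY;INTERVAL=1;COUNT=1`. As a rough illustration of their shape, the string splits into a start timestamp and a set of `KEY=VALUE` rule parameters (this stdlib-only parser is a sketch for illustration, not what the module uses — AWX validates rrules server-side):

```python
from datetime import datetime, timezone

def parse_rrule(spec):
    """Split an AWX-style schedule string into a start time and rule params.

    `spec` looks like 'DTSTART:20191219T130551Z RRULE:FREQ=WEEKLY;INTERVAL=1;COUNT=1'.
    """
    dtstart_part, rrule_part = spec.split()
    # '20191219T130551Z' -> aware UTC datetime
    stamp = dtstart_part.split(':', 1)[1]
    start = datetime.strptime(stamp, '%Y%m%dT%H%M%SZ').replace(tzinfo=timezone.utc)
    # 'FREQ=WEEKLY;INTERVAL=1;COUNT=1' -> {'FREQ': 'WEEKLY', ...}
    params = dict(item.split('=') for item in rrule_part.split(':', 1)[1].split(';'))
    return start, params

start, params = parse_rrule('DTSTART:20191219T130551Z RRULE:FREQ=WEEKLY;INTERVAL=1;COUNT=1')
print(start.date(), params['FREQ'])  # -> 2019-12-19 WEEKLY
```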
@@ -214,14 +220,16 @@ def main():
     if inventory:
         inventory_id = module.resolve_name_to_id('inventories', inventory)
     search_fields = {}
+    sched_search_fields = {}
     if organization:
        search_fields['organization'] = module.resolve_name_to_id('organizations', organization)
     unified_job_template_id = None
     if unified_job_template:
         search_fields['name'] = unified_job_template
         unified_job_template_id = module.get_one('unified_job_templates', **{'data': search_fields})['id']
+        sched_search_fields['unified_job_template'] = unified_job_template_id
     # Attempt to look up an existing item based on the provided data
-    existing_item = module.get_one('schedules', name_or_id=name)
+    existing_item = module.get_one('schedules', name_or_id=name, **{'data': sched_search_fields})

     association_fields = {}

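The lookup change above exists because a schedule name alone is not unique: two different unified job templates can each own a schedule with the same name, so a name-only `get_one` can match the wrong one (or fail on multiple matches). A minimal, self-contained sketch of the idea — the in-memory records and `get_one` helper here are illustrative stand-ins, not the real `ControllerAPIModule` API:

```python
# Two unified job templates (ids 10 and 20) each own a 'Some Schedule'.
schedules = [
    {'id': 1, 'name': 'Some Schedule', 'unified_job_template': 10},
    {'id': 2, 'name': 'Some Schedule', 'unified_job_template': 20},
]

def get_one(name, **filters):
    """Return the single schedule matching `name` plus any extra filters."""
    matches = [s for s in schedules
               if s['name'] == name and all(s[k] == v for k, v in filters.items())]
    if len(matches) != 1:
        raise ValueError(f'expected exactly one match, got {len(matches)}')
    return matches[0]

# Name alone is ambiguous; scoping by unified_job_template resolves it.
print(get_one('Some Schedule', unified_job_template=20)['id'])  # -> 2
```

This is why the module now carries `sched_search_fields` into the existing-item lookup whenever `unified_job_template` is supplied.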
@@ -736,7 +736,7 @@ def main():

     webhook_credential = module.params.get('webhook_credential')
     if webhook_credential:
-        new_fields['webhook_credential'] = module.resolve_name_to_id('webhook_credential', webhook_credential)
+        new_fields['webhook_credential'] = module.resolve_name_to_id('credentials', webhook_credential)

     # Create the data that gets sent for create and update
     new_fields['name'] = new_name if new_name else (module.get_item_name(existing_item) if existing_item else name)
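The one-line fix above passes the actual endpoint name (`credentials`) rather than the field name (`webhook_credential`) to the name-to-id resolver; there is no `webhook_credential` endpoint to query, so the old call could never resolve. A toy sketch of why the first argument matters — the endpoint table and helper below are hypothetical, not awx.awx internals:

```python
# Hypothetical endpoint -> records table; credentials live under 'credentials'.
endpoints = {
    'credentials': [{'id': 7, 'name': 'My Webhook Cred'}],
}

def resolve_name_to_id(endpoint, name):
    """Resolve `name` on `endpoint` to its id (sketch of the lookup pattern)."""
    try:
        items = endpoints[endpoint]
    except KeyError:
        # Querying a non-existent endpoint can never succeed -- the old bug.
        raise LookupError(f'no such endpoint: {endpoint!r}')
    for item in items:
        if item['name'] == name:
            return item['id']
    raise LookupError(f'{name!r} not found on {endpoint!r}')

print(resolve_name_to_id('credentials', 'My Webhook Cred'))  # -> 7
```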
@@ -6,7 +6,7 @@ import pytest

 from ansible.errors import AnsibleError

-from awx.main.models import Schedule
+from awx.main.models import JobTemplate, Schedule
 from awx.api.serializers import SchedulePreviewSerializer


@@ -24,6 +24,19 @@ def test_create_schedule(run_module, job_template, admin_user):
     assert schedule.rrule == my_rrule


+@pytest.mark.django_db
+def test_delete_same_named_schedule(run_module, project, inventory, admin_user):
+    jt1 = JobTemplate.objects.create(name='jt1', project=project, inventory=inventory, playbook='helloworld.yml')
+    jt2 = JobTemplate.objects.create(name='jt2', project=project, inventory=inventory, playbook='helloworld2.yml')
+    Schedule.objects.create(name='Some Schedule', rrule='DTSTART:20300112T210000Z RRULE:FREQ=DAILY;INTERVAL=1', unified_job_template=jt1)
+    Schedule.objects.create(name='Some Schedule', rrule='DTSTART:20300112T210000Z RRULE:FREQ=DAILY;INTERVAL=1', unified_job_template=jt2)
+
+    result = run_module('schedule', {'name': 'Some Schedule', 'unified_job_template': 'jt1', 'state': 'absent'}, admin_user)
+    assert not result.get('failed', False), result.get('msg', result)
+
+    assert Schedule.objects.filter(name='Some Schedule').count() == 1
+
+
 @pytest.mark.parametrize(
     "freq, kwargs, expect",
     [
@@ -163,6 +163,7 @@
     - name: Disable a schedule
       schedule:
         name: "{{ sched1 }}"
+        unified_job_template: "{{ jt1 }}"
         state: present
         enabled: "false"
      register: result
@@ -188,6 +189,29 @@
         rrule: "DTSTART:20191219T130551Z RRULE:FREQ=WEEKLY;INTERVAL=1;COUNT=1"
       register: result

+    - name: Verify we can't find the schedule without the UJT lookup
+      schedule:
+        name: "{{ sched1 }}"
+        state: present
+        rrule: "DTSTART:20201219T130551Z RRULE:FREQ=WEEKLY;INTERVAL=1;COUNT=1"
+      register: result
+      ignore_errors: true
+
+    - assert:
+        that:
+          - result is failed
+
+    - name: Verify we can find the schedule with the UJT lookup and delete it
+      schedule:
+        name: "{{ sched1 }}"
+        state: absent
+        unified_job_template: "{{ jt2 }}"
+      register: result
+
+    - assert:
+        that:
+          - result is changed
+
   always:
     - name: Delete the schedule
       schedule:
File diff suppressed because it is too large
@@ -79,6 +79,7 @@ class ApiV2(base.Base):
             return None
         if post_fields is None:  # Deprecated endpoint or insufficient permissions
             log.error("Object export failed: %s", _page.endpoint)
+            self._has_error = True
             return None

         # Note: doing _page[key] automatically parses json blob strings, which can be a problem.
@@ -99,6 +100,7 @@ class ApiV2(base.Base):
                     pass
                 if resource is None:
                     log.error("Unable to infer endpoint for %r on %s.", key, _page.endpoint)
+                    self._has_error = True
                     continue
                 related = self._filtered_list(resource, _page.json[key]).results[0]
             else:
@@ -108,12 +110,14 @@ class ApiV2(base.Base):
                 if rel_endpoint is None:  # This foreign key is unreadable
                     if post_fields[key].get('required'):
                         log.error("Foreign key %r export failed for object %s.", key, _page.endpoint)
+                        self._has_error = True
                         return None
                     log.warning("Foreign key %r export failed for object %s, setting to null", key, _page.endpoint)
                     continue
                 rel_natural_key = rel_endpoint.get_natural_key(self._cache)
                 if rel_natural_key is None:
                     log.error("Unable to construct a natural key for foreign key %r of object %s.", key, _page.endpoint)
+                    self._has_error = True
                     return None  # This foreign key has unresolvable dependencies
                 fields[key] = rel_natural_key

@@ -158,6 +162,7 @@ class ApiV2(base.Base):
         natural_key = _page.get_natural_key(self._cache)
         if natural_key is None:
             log.error("Unable to construct a natural key for object %s.", _page.endpoint)
+            self._has_error = True
             return None
         fields['natural_key'] = natural_key

@@ -249,6 +254,7 @@ class ApiV2(base.Base):
             except (exc.Common, AssertionError) as e:
                 identifier = asset.get("name", None) or asset.get("username", None) or asset.get("hostname", None)
                 log.error(f"{endpoint} \"{identifier}\": {e}.")
+                self._has_error = True
                 log.debug("post_data: %r", post_data)
                 continue

@@ -283,6 +289,7 @@ class ApiV2(base.Base):
                 pass
             except exc.Common as e:
                 log.error("Role assignment failed: %s.", e)
+                self._has_error = True
                 log.debug("post_data: %r", {'id': role_page['id']})

     def _assign_membership(self):
@@ -313,17 +320,21 @@ class ApiV2(base.Base):
             for item in related_set:
                 rel_page = self._cache.get_by_natural_key(item)
                 if rel_page is None:
-                    continue  # FIXME
+                    log.error("Could not find matching object in Tower for imported relation, item: %r", item)
+                    self._has_error = True
+                    continue
                 if rel_page['id'] in existing:
                     continue
                 try:
                     post_data = {'id': rel_page['id']}
                     endpoint.post(post_data)
                     log.error("endpoint: %s, id: %s", endpoint.endpoint, rel_page['id'])
+                    self._has_error = True
                 except exc.NoContent:  # desired exception on successful (dis)association
                     pass
                 except exc.Common as e:
                     log.error("Object association failed: %s.", e)
+                    self._has_error = True
                     log.debug("post_data: %r", post_data)
             else:  # It is a create set
                 self._cache.get_page(endpoint)
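The `self._has_error = True` additions above all follow one pattern: log the failure, flag that something went wrong, and keep processing, so a single export/import run surfaces every problem instead of stopping at the first one. A standalone sketch of that pattern — the `Exporter` class here is illustrative, not awxkit's actual implementation:

```python
import logging

log = logging.getLogger('export')

class Exporter:
    """Collect-and-continue error handling: flag failures, never abort the loop."""

    def __init__(self):
        self._has_error = False

    def export(self, objects):
        results = []
        for obj in objects:
            try:
                results.append(self._export_one(obj))
            except ValueError as e:
                # Log, remember that the run is dirty, and keep exporting.
                log.error("Object export failed: %s", e)
                self._has_error = True
        return results

    def _export_one(self, obj):
        if 'name' not in obj:
            raise ValueError('missing natural key')
        return {'name': obj['name']}

exporter = Exporter()
out = exporter.export([{'name': 'a'}, {}, {'name': 'b'}])
print(len(out), exporter._has_error)  # -> 2 True
```

A caller can then check the flag once at the end (e.g. to set a non-zero exit code) while still getting a complete report of everything that failed.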
Some files were not shown because too many files have changed in this diff.