Mirror of https://github.com/ansible/awx.git (synced 2026-02-26 23:46:05 -03:30)

.gitignore (vendored): 3 changes
@@ -33,6 +33,7 @@ awx/ui_next/src/locales/
 awx/ui_next/coverage/
 awx/ui_next/build
 awx/ui_next/.env.local
 awx/ui_next/instrumented
 rsyslog.pid
 tools/prometheus/data
+tools/docker-compose/Dockerfile
@@ -146,3 +147,5 @@ use_dev_supervisor.txt
 *.unison.tmp
 *.#
 /tools/docker-compose/overrides/
+/awx/ui_next/.ui-built
+/Dockerfile
CHANGELOG.md: 18 changes
@@ -2,6 +2,22 @@
 
 This is a list of high-level changes for each release of AWX. A full list of commits can be found at `https://github.com/ansible/awx/releases/tag/<version>`.
 
+## 16.0.0 (December 10, 2020)
+- AWX now ships with a reimagined user interface. **Please read this before upgrading:** https://groups.google.com/g/awx-project/c/KuT5Ao92HWo
+- Removed support for syncing inventory from Red Hat CloudForms - https://github.com/ansible/awx/commit/0b701b3b2
+- Removed support for Mercurial-based project updates - https://github.com/ansible/awx/issues/7932
+- Upgraded NodeJS to actively maintained LTS 14.15.1 - https://github.com/ansible/awx/pull/8766
+- Added Git-LFS to the default image build - https://github.com/ansible/awx/pull/8700
+- Added the ability to specify `metadata.labels` in the podspec for container groups - https://github.com/ansible/awx/issues/8486
+- Added support for Kubernetes pod annotations - https://github.com/ansible/awx/pull/8434
+- Added the ability to label the web container in local Docker installs - https://github.com/ansible/awx/pull/8449
+- Added additional metadata (as an extra var) to playbook runs to report the SCM branch name - https://github.com/ansible/awx/pull/8433
+- Fixed a bug that caused k8s installations to fail due to an incorrect Helm repo - https://github.com/ansible/awx/issues/8715
+- Fixed a bug that prevented certain Workflow Approval resources from being deleted - https://github.com/ansible/awx/pull/8612
+- Fixed a bug that prevented the deletion of inventories stuck in "pending deletion" state - https://github.com/ansible/awx/issues/8525
+- Fixed a display bug in webhook notifications with certain unicode characters - https://github.com/ansible/awx/issues/7400
+- Improved support for exporting dependent objects (Inventory Hosts and Groups) in the `awx export` CLI tool - https://github.com/ansible/awx/commit/607bc0788
+
 ## 15.0.1 (October 20, 2020)
 - Added several optimizations to improve performance for a variety of high-load simultaneous job launch use cases - https://github.com/ansible/awx/pull/8403
 - Added the ability to source roles and collections from requirements.yaml files (not just requirements.yml) - https://github.com/ansible/awx/issues/4540
@@ -88,7 +104,7 @@ This is a list of high-level changes for each release of AWX. A full list of com
 - Fixed a bug that caused rsyslogd's configuration file to have world-readable file permissions, potentially leaking secrets (CVE-2020-10782)
 
 ## 12.0.0 (Jun 9, 2020)
 - Removed memcached as a dependency of AWX (https://github.com/ansible/awx/pull/7240)
 - Moved to a single container image build instead of separate awx_web and awx_task images. The container image is just `awx` (https://github.com/ansible/awx/pull/7228)
 - Official AWX container image builds now use a two-stage container build process that notably reduces the size of our published images (https://github.com/ansible/awx/pull/7017)
 - Removed support for HipChat notifications ([EoL announcement](https://www.atlassian.com/partnerships/slack/faq#faq-98b17ca3-247f-423b-9a78-70a91681eff0)); all previously-created HipChat notification templates will be deleted due to this removal.
@@ -60,7 +60,7 @@ Please note that deploying from `HEAD` (or the latest commit) is **not** stable,
 
 For more on how to clone the repo, view [git clone help](https://git-scm.com/docs/git-clone).
 
-Once you have a local copy, run commands within the root of the project tree.
+Once you have a local copy, run the commands in the following sections from the root of the project tree.
 
 ### AWX branding
 
@@ -83,7 +83,7 @@ Before you can run a deployment, you'll need the following installed in your loc
 - [GNU Make](https://www.gnu.org/software/make/)
 - [Git](https://git-scm.com/) Requires Version 1.8.4+
 - Python 3.6+
-- [Node 10.x LTS version](https://nodejs.org/en/download/)
+- [Node 14.x LTS version](https://nodejs.org/en/download/)
   + This is only required if you're [building your own container images](#official-vs-building-images) with `use_container_for_build=false`
 - [NPM 6.x LTS](https://docs.npmjs.com/)
   + This is only required if you're [building your own container images](#official-vs-building-images) with `use_container_for_build=false`
Makefile: 25 changes
@@ -462,19 +462,24 @@ endif
 
 # UI TASKS
 # --------------------------------------
-awx/ui_next/node_modules:
-	$(NPM_BIN) --prefix awx/ui_next install
+UI_BUILD_FLAG_FILE = awx/ui_next/.ui-built
 
 clean-ui:
 	rm -rf node_modules
 	rm -rf awx/ui_next/node_modules
 	rm -rf awx/ui_next/build
 	rm -rf awx/ui_next/src/locales/_build
+	rm -rf $(UI_BUILD_FLAG_FILE)
 	git checkout awx/ui_next/src/locales
 
-ui-release: ui-devel
-
-ui-devel: awx/ui_next/node_modules
-	$(NPM_BIN) --prefix awx/ui_next run extract-strings
-	$(NPM_BIN) --prefix awx/ui_next run compile-strings
-	$(NPM_BIN) --prefix awx/ui_next run build
+awx/ui_next/node_modules:
+	$(NPM_BIN) --prefix awx/ui_next --loglevel warn --ignore-scripts install
+
+$(UI_BUILD_FLAG_FILE):
+	$(NPM_BIN) --prefix awx/ui_next --loglevel warn run extract-strings
+	$(NPM_BIN) --prefix awx/ui_next --loglevel warn run compile-strings
+	$(NPM_BIN) --prefix awx/ui_next --loglevel warn run build
+	git checkout awx/ui_next/src/locales
+	mkdir -p awx/public/static/css
+	mkdir -p awx/public/static/js
@@ -482,6 +487,12 @@ ui-devel: awx/ui_next/node_modules
 	cp -r awx/ui_next/build/static/css/* awx/public/static/css
 	cp -r awx/ui_next/build/static/js/* awx/public/static/js
 	cp -r awx/ui_next/build/static/media/* awx/public/static/media
+	touch $@
 
+ui-release: awx/ui_next/node_modules $(UI_BUILD_FLAG_FILE)
+
+ui-devel: awx/ui_next/node_modules
+	@$(MAKE) -B $(UI_BUILD_FLAG_FILE)
+
 ui-zuul-lint-and-test:
 	$(NPM_BIN) --prefix awx/ui_next install
@@ -640,7 +640,7 @@ class EmptySerializer(serializers.Serializer):
 
 
 class UnifiedJobTemplateSerializer(BaseSerializer):
     # As a base serializer, the capabilities prefetch is not used directly,
     # instead they are derived from the Workflow Job Template Serializer and the Job Template Serializer, respectively.
     capabilities_prefetch = []
@@ -1748,7 +1748,7 @@ class HostSerializer(BaseSerializerWithVariables):
         attrs['variables'] = json.dumps(vars_dict)
         if Group.objects.filter(name=name, inventory=inventory).exists():
             raise serializers.ValidationError(_('A Group with that name already exists.'))
 
         return super(HostSerializer, self).validate(attrs)
 
     def to_representation(self, obj):
@@ -3945,12 +3945,12 @@ class ProjectUpdateEventSerializer(JobEventSerializer):
         return UriCleaner.remove_sensitive(obj.stdout)
 
     def get_event_data(self, obj):
-        # the project update playbook uses the git, hg, or svn modules
+        # the project update playbook uses the git or svn modules
         # to clone repositories, and those modules are prone to printing
         # raw SCM URLs in their stdout (which *could* contain passwords)
         # attempt to detect and filter HTTP basic auth passwords in the stdout
         # of these types of events
-        if obj.event_data.get('task_action') in ('git', 'hg', 'svn'):
+        if obj.event_data.get('task_action') in ('git', 'svn'):
             try:
                 return json.loads(
                     UriCleaner.remove_sensitive(
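The hunk above relies on `UriCleaner.remove_sensitive` to mask credentials that SCM modules may print in raw URLs. As an illustration only (this is not AWX's actual `UriCleaner`, whose rules are more thorough), the core idea can be sketched with a regex:

```python
import re

# Illustrative only: mask http(s) basic-auth passwords of the form
# https://user:secret@host/... that SCM modules can leak into stdout.
SENSITIVE_URI = re.compile(r'(https?://)([^@/\s:]+):([^@/\s]+)@')

def remove_sensitive(text):
    # Keep the scheme and username, replace the password with a marker.
    return SENSITIVE_URI.sub(r'\1\2:$encrypted$@', text)
```

The replacement deliberately keeps the username so log lines remain attributable while the secret itself never reaches the API consumer.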
@@ -4,7 +4,6 @@ The following lists the expected format and details of our rrules:
 * DTSTART is expected to be in UTC
 * INTERVAL is required
 * SECONDLY is not supported
 * TZID is not supported
 * RRULE must precede the rule statements
 * BYDAY is supported but not BYDAY with a numerical prefix
 * BYYEARDAY and BYWEEKNO are not supported
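The constraints listed above can be checked mechanically. A minimal, illustrative validator for a few of the listed rules (not AWX's actual schedule parser):

```python
import re

# Illustrative validator for some of the rrule constraints listed above.
def validate_rrule(text):
    errors = []
    if 'DTSTART' not in text:
        errors.append('DTSTART is required')
    rule = re.search(r'RRULE:(\S+)', text)
    if not rule:
        errors.append('RRULE is required')
        return errors
    # Split "FREQ=WEEKLY;INTERVAL=1;BYDAY=MO" into a dict of statements.
    parts = dict(p.split('=', 1) for p in rule.group(1).split(';') if '=' in p)
    if 'INTERVAL' not in parts:
        errors.append('INTERVAL is required')
    if parts.get('FREQ') == 'SECONDLY':
        errors.append('SECONDLY is not supported')
    for key in ('BYYEARDAY', 'BYWEEKNO'):
        if key in parts:
            errors.append('%s is not supported' % key)
    # BYDAY is fine, but a numerical prefix such as 2MO is not.
    if re.search(r'[+-]?\d+(MO|TU|WE|TH|FR|SA|SU)', parts.get('BYDAY', '')):
        errors.append('BYDAY with a numerical prefix is not supported')
    return errors
```

For example, `DTSTART:20201210T120000Z RRULE:FREQ=WEEKLY;INTERVAL=1;BYDAY=MO` passes, while dropping `INTERVAL` or using `FREQ=SECONDLY` produces an error.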
@@ -242,8 +242,6 @@ class DashboardView(APIView):
         git_failed_projects = git_projects.filter(last_job_failed=True)
         svn_projects = user_projects.filter(scm_type='svn')
         svn_failed_projects = svn_projects.filter(last_job_failed=True)
-        hg_projects = user_projects.filter(scm_type='hg')
-        hg_failed_projects = hg_projects.filter(last_job_failed=True)
         archive_projects = user_projects.filter(scm_type='archive')
         archive_failed_projects = archive_projects.filter(last_job_failed=True)
         data['scm_types'] = {}
@@ -257,11 +255,6 @@ class DashboardView(APIView):
                                    'failures_url': reverse('api:project_list', request=request) + "?scm_type=svn&last_job_failed=True",
                                    'total': svn_projects.count(),
                                    'failed': svn_failed_projects.count()}
-        data['scm_types']['hg'] = {'url': reverse('api:project_list', request=request) + "?scm_type=hg",
-                                   'label': 'Mercurial',
-                                   'failures_url': reverse('api:project_list', request=request) + "?scm_type=hg&last_job_failed=True",
-                                   'total': hg_projects.count(),
-                                   'failed': hg_failed_projects.count()}
         data['scm_types']['archive'] = {'url': reverse('api:project_list', request=request) + "?scm_type=archive",
                                         'label': 'Remote Archive',
                                         'failures_url': reverse('api:project_list', request=request) + "?scm_type=archive&last_job_failed=True",
@@ -333,14 +333,14 @@ class BaseAccess(object):
             report_violation(_("License has expired."))
 
         free_instances = validation_info.get('free_instances', 0)
-        available_instances = validation_info.get('available_instances', 0)
+        instance_count = validation_info.get('instance_count', 0)
 
         if add_host_name:
             host_exists = Host.objects.filter(name=add_host_name).exists()
             if not host_exists and free_instances == 0:
-                report_violation(_("License count of %s instances has been reached.") % available_instances)
+                report_violation(_("License count of %s instances has been reached.") % instance_count)
             elif not host_exists and free_instances < 0:
-                report_violation(_("License count of %s instances has been exceeded.") % available_instances)
+                report_violation(_("License count of %s instances has been exceeded.") % instance_count)
         elif not add_host_name and free_instances < 0:
             report_violation(_("Host count exceeds available instances."))
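The branching above encodes a simple policy: a host that already exists never consumes a new license slot, so violations only trigger for genuinely new hosts or an already-exceeded count. A standalone sketch of that policy (hypothetical helper, not AWX's `BaseAccess`):

```python
# Hypothetical, simplified version of the host-license check above.
def license_violations(free_instances, instance_count, adding_new_host):
    violations = []
    if adding_new_host and free_instances == 0:
        # Adding one more host would go past the licensed seat count.
        violations.append('License count of %s instances has been reached.' % instance_count)
    elif adding_new_host and free_instances < 0:
        violations.append('License count of %s instances has been exceeded.' % instance_count)
    elif not adding_new_host and free_instances < 0:
        # No new host, but the count is already over the licensed limit.
        violations.append('Host count exceeds available instances.')
    return violations
```

Note the swap from `available_instances` to `instance_count` in the diff: the message reports the total licensed seats, while `free_instances` drives the decision itself.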
@@ -280,14 +280,16 @@ def _copy_table(table, query, path):
     return file.file_list()
 
 
-@register('events_table', '1.1', format='csv', description=_('Automation task records'), expensive=True)
+@register('events_table', '1.2', format='csv', description=_('Automation task records'), expensive=True)
 def events_table(since, full_path, until, **kwargs):
     events_query = '''COPY (SELECT main_jobevent.id,
                              main_jobevent.created,
                              main_jobevent.modified,
                              main_jobevent.uuid,
                              main_jobevent.parent_uuid,
                              main_jobevent.event,
+                             main_jobevent.event_data::json->'task_action' AS task_action,
+                             (CASE WHEN event = 'playbook_on_stats' THEN event_data END) as playbook_on_stats,
                              main_jobevent.failed,
                              main_jobevent.changed,
                              main_jobevent.playbook,
@@ -68,7 +68,7 @@ def register(key, version, description=None, format='json', expensive=False):
 
     @register('projects_by_scm_type', 1)
     def projects_by_scm_type():
-        return {'git': 5, 'svn': 1, 'hg': 0}
+        return {'git': 5, 'svn': 1}
     """
 
     def decorate(f):
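The `register` helper whose docstring is updated above follows the usual decorator-registry pattern: decorating a function records it in a lookup table keyed by name, so the gather step can iterate over all collectors. A minimal sketch of that pattern (the registry and attribute names here are hypothetical, not AWX's own):

```python
# Hypothetical minimal registry; AWX's real `register` also tracks
# version, output format, and an `expensive` flag.
COLLECTORS = {}

def register(key, version, description=None, fmt='json', expensive=False):
    def decorate(f):
        # Stash metadata on the function and record it for later gathering.
        f._gather_key = key
        f._gather_version = version
        COLLECTORS[key] = f
        return f
    return decorate

@register('projects_by_scm_type', 1)
def projects_by_scm_type():
    return {'git': 5, 'svn': 1}
```

A gather pass can then simply loop over `COLLECTORS.items()` and call each function, without any central list to keep in sync.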
@@ -102,7 +102,7 @@ def gather(dest=None, module=None, subset = None, since = None, until = now(), c
 
     last_run = since or settings.AUTOMATION_ANALYTICS_LAST_GATHER or (now() - timedelta(weeks=4))
     logger.debug("Last analytics run was: {}".format(settings.AUTOMATION_ANALYTICS_LAST_GATHER))
 
     if _valid_license() is False:
         logger.exception("Invalid License provided, or No License Provided")
         return None
@@ -68,7 +68,7 @@ def metrics():
         'external_logger_type': getattr(settings, 'LOG_AGGREGATOR_TYPE', 'None')
     })
 
-    LICENSE_INSTANCE_TOTAL.set(str(license_info.get('available_instances', 0)))
+    LICENSE_INSTANCE_TOTAL.set(str(license_info.get('instance_count', 0)))
     LICENSE_INSTANCE_FREE.set(str(license_info.get('free_instances', 0)))
 
     current_counts = counts(None)
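The gauge updates above follow a set-on-scrape pattern: each call to `metrics()` overwrites the gauges with the latest license figures. A stdlib-only stand-in showing that pattern (the real code uses a metrics library, and the gauge names here are illustrative):

```python
# Minimal stand-in for prometheus-style gauges; illustrative only.
class Gauge:
    def __init__(self, name):
        self.name = name
        self.value = None

    def set(self, value):
        self.value = value

LICENSE_INSTANCE_TOTAL = Gauge('license_instance_total')
LICENSE_INSTANCE_FREE = Gauge('license_instance_free')

def metrics(license_info):
    # Mirrors the corrected field names above: total licensed seats come
    # from 'instance_count', remaining seats from 'free_instances'.
    LICENSE_INSTANCE_TOTAL.set(str(license_info.get('instance_count', 0)))
    LICENSE_INSTANCE_FREE.set(str(license_info.get('free_instances', 0)))
```

Because gauges are overwritten rather than incremented, a scrape always reflects the most recent license state regardless of how many times `metrics()` has run.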
@@ -75,7 +75,7 @@ class WebsocketSecretAuthHelper:
         nonce_diff = now - nonce_parsed
         if abs(nonce_diff) > nonce_tolerance:
             logger.warn(f"Potential replay attack or machine(s) time out of sync by {nonce_diff} seconds.")
-            raise ValueError("Potential replay attack or machine(s) time out of sync by {nonce_diff} seconds.")
+            raise ValueError(f"Potential replay attack or machine(s) time out of sync by {nonce_diff} seconds.")
 
         return True
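The change above fixes a missing `f` prefix, so the raised error now interpolates the actual skew. The check itself bounds clock drift between peers: a nonce timestamp outside the tolerance window is rejected as a possible replay. A standalone sketch of the same freshness test (the tolerance value here is made up, not AWX's setting):

```python
import time

# Illustrative nonce freshness check; the tolerance is a made-up value.
NONCE_TOLERANCE = 300  # seconds

def check_nonce(nonce_timestamp, now=None, tolerance=NONCE_TOLERANCE):
    now = time.time() if now is None else now
    diff = now - nonce_timestamp
    # abs() catches both stale nonces and peers whose clocks run ahead.
    if abs(diff) > tolerance:
        raise ValueError(
            f"Potential replay attack or machine(s) time out of sync by {diff} seconds."
        )
    return True
```

Using `abs()` matters: a peer with a fast clock produces a negative diff, which is just as suspicious as a stale (replayed) nonce.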
@@ -30,3 +30,10 @@ class _AwxTaskError():
 
 
 AwxTaskError = _AwxTaskError()
+
+
+class PostRunError(Exception):
+    def __init__(self, msg, status='failed', tb=''):
+        self.status = status
+        self.tb = tb
+        super(PostRunError, self).__init__(msg)
@@ -149,7 +149,6 @@ class IsolatedManager(object):
             # don't rsync source control metadata (it can be huge!)
             '- /project/.git',
             '- /project/.svn',
-            '- /project/.hg',
             # don't rsync job events that are in the process of being written
             '- /artifacts/job_events/*-partial.json.tmp',
             # don't rsync the ssh_key FIFO
@@ -21,7 +21,7 @@ from awx.main.signals import (
     disable_computed_fields
 )
 
-from awx.main.management.commands.deletion import AWXCollector, pre_delete
+from awx.main.utils.deletion import AWXCollector, pre_delete
 
 
 class Command(BaseCommand):
@@ -19,6 +19,9 @@ from django.core.management.base import BaseCommand, CommandError
 from django.db import connection, transaction
 from django.utils.encoding import smart_text
 
+# DRF error class to distinguish license exceptions
+from rest_framework.exceptions import PermissionDenied
+
 # AWX inventory imports
 from awx.main.models.inventory import (
     Inventory,
@@ -31,11 +34,12 @@ from awx.main.utils.safe_yaml import sanitize_jinja
 
 # other AWX imports
 from awx.main.models.rbac import batch_role_ancestor_rebuilding
+# TODO: remove proot utils once we move to running inv. updates in containers
 from awx.main.utils import (
     ignore_inventory_computed_fields,
     check_proot_installed,
     wrap_args_with_proot,
     build_proot_temp_dir,
     get_licenser
 )
 from awx.main.signals import disable_activity_stream
@@ -53,11 +57,11 @@ No license.
 See http://www.ansible.com/renew for license information.'''
 
 LICENSE_MESSAGE = '''\
-Number of licensed instances exceeded, would bring available instances to %(new_count)d, system is licensed for %(available_instances)d.
+Number of licensed instances exceeded, would bring available instances to %(new_count)d, system is licensed for %(instance_count)d.
 See http://www.ansible.com/renew for license extension information.'''
 
 DEMO_LICENSE_MESSAGE = '''\
-Demo mode free license count exceeded, would bring available instances to %(new_count)d, demo mode allows %(available_instances)d.
+Demo mode free license count exceeded, would bring available instances to %(new_count)d, demo mode allows %(instance_count)d.
 See http://www.ansible.com/renew for licensing information.'''
@@ -75,13 +79,11 @@ class AnsibleInventoryLoader(object):
     /usr/bin/ansible/ansible-inventory -i hosts --list
     '''
 
-    def __init__(self, source, is_custom=False, venv_path=None, verbosity=0):
+    def __init__(self, source, venv_path=None, verbosity=0):
         self.source = source
         self.source_dir = functioning_dir(self.source)
-        self.is_custom = is_custom
-        self.tmp_private_dir = None
         self.method = 'ansible-inventory'
         self.verbosity = verbosity
+        # TODO: remove once proot has been removed
+        self.tmp_private_dir = None
         if venv_path:
             self.venv_path = venv_path
         else:
@@ -134,35 +136,31 @@ class AnsibleInventoryLoader(object):
         # inside of /venv/ansible, so we override the specified interpreter
         # https://github.com/ansible/ansible/issues/50714
         bargs = ['python', ansible_inventory_path, '-i', self.source]
-        bargs.extend(['--playbook-dir', self.source_dir])
+        bargs.extend(['--playbook-dir', functioning_dir(self.source)])
         if self.verbosity:
             # INFO: -vvv, DEBUG: -vvvvv, for inventory, any more than 3 makes little difference
             bargs.append('-{}'.format('v' * min(5, self.verbosity * 2 + 1)))
         logger.debug('Using base command: {}'.format(' '.join(bargs)))
         return bargs
 
+    # TODO: Remove this once we move to running ansible-inventory in containers
+    # and don't need proot for process isolation anymore
     def get_proot_args(self, cmd, env):
         cwd = os.getcwd()
         if not check_proot_installed():
             raise RuntimeError("proot is not installed but is configured for use")
 
         kwargs = {}
-        if self.is_custom:
-            # use source's tmp dir for proot, task manager will delete folder
-            logger.debug("Using provided directory '{}' for isolation.".format(self.source_dir))
-            kwargs['proot_temp_dir'] = self.source_dir
-            cwd = self.source_dir
-        else:
-            # we cannot safely store tmp data in source dir or trust script contents
-            if env['AWX_PRIVATE_DATA_DIR']:
-                # If this is non-blank, file credentials are being used and we need access
-                private_data_dir = functioning_dir(env['AWX_PRIVATE_DATA_DIR'])
-                logger.debug("Using private credential data in '{}'.".format(private_data_dir))
-                kwargs['private_data_dir'] = private_data_dir
-            self.tmp_private_dir = build_proot_temp_dir()
-            logger.debug("Using fresh temporary directory '{}' for isolation.".format(self.tmp_private_dir))
-            kwargs['proot_temp_dir'] = self.tmp_private_dir
-            kwargs['proot_show_paths'] = [functioning_dir(self.source), settings.AWX_ANSIBLE_COLLECTIONS_PATHS]
+        # we cannot safely store tmp data in source dir or trust script contents
+        if env['AWX_PRIVATE_DATA_DIR']:
+            # If this is non-blank, file credentials are being used and we need access
+            private_data_dir = functioning_dir(env['AWX_PRIVATE_DATA_DIR'])
+            logger.debug("Using private credential data in '{}'.".format(private_data_dir))
+            kwargs['private_data_dir'] = private_data_dir
+        self.tmp_private_dir = build_proot_temp_dir()
+        logger.debug("Using fresh temporary directory '{}' for isolation.".format(self.tmp_private_dir))
+        kwargs['proot_temp_dir'] = self.tmp_private_dir
+        kwargs['proot_show_paths'] = [functioning_dir(self.source), settings.AWX_ANSIBLE_COLLECTIONS_PATHS]
         logger.debug("Running from `{}` working directory.".format(cwd))
 
         if self.venv_path != settings.ANSIBLE_VENV_PATH:
@@ -170,12 +168,14 @@ class AnsibleInventoryLoader(object):
 
         return wrap_args_with_proot(cmd, cwd, **kwargs)
 
     def command_to_json(self, cmd):
         data = {}
         stdout, stderr = '', ''
         env = self.build_env()
 
-        if ((self.is_custom or 'AWX_PRIVATE_DATA_DIR' in env) and
+        # TODO: remove proot args once inv. updates run in containers
+        if (('AWX_PRIVATE_DATA_DIR' in env) and
                 getattr(settings, 'AWX_PROOT_ENABLED', False)):
             cmd = self.get_proot_args(cmd, env)
@@ -184,11 +184,13 @@ class AnsibleInventoryLoader(object):
         stdout = smart_text(stdout)
         stderr = smart_text(stderr)
 
+        # TODO: can be removed when proot is removed
         if self.tmp_private_dir:
             shutil.rmtree(self.tmp_private_dir, True)
 
         if proc.returncode != 0:
             raise RuntimeError('%s failed (rc=%d) with stdout:\n%s\nstderr:\n%s' % (
-                self.method, proc.returncode, stdout, stderr))
+                'ansible-inventory', proc.returncode, stdout, stderr))
 
         for line in stderr.splitlines():
             logger.error(line)
@@ -231,9 +233,9 @@ class Command(BaseCommand):
                             action='store_true', default=False,
                             help='overwrite (rather than merge) variables')
         parser.add_argument('--keep-vars', dest='keep_vars', action='store_true', default=False,
-                            help='use database variables if set')
+                            help='DEPRECATED legacy option, has no effect')
         parser.add_argument('--custom', dest='custom', action='store_true', default=False,
-                            help='this is a custom inventory script')
+                            help='DEPRECATED indicates a custom inventory script, no longer used')
         parser.add_argument('--source', dest='source', type=str, default=None,
                             metavar='s', help='inventory directory, file, or script to load')
         parser.add_argument('--enabled-var', dest='enabled_var', type=str,
@@ -259,10 +261,10 @@ class Command(BaseCommand):
                             'specifies the unique, immutable instance ID, may be '
                             'specified as "foo.bar" to traverse nested dicts.')
 
-    def set_logging_level(self):
+    def set_logging_level(self, verbosity):
         log_levels = dict(enumerate([logging.WARNING, logging.INFO,
                                      logging.DEBUG, 0]))
-        logger.setLevel(log_levels.get(self.verbosity, 0))
+        logger.setLevel(log_levels.get(verbosity, 0))
 
     def _get_instance_id(self, variables, default=''):
         '''
@@ -322,7 +324,8 @@ class Command(BaseCommand):
         else:
             raise NotImplementedError('Value of enabled {} not understood.'.format(enabled))
 
-    def get_source_absolute_path(self, source):
+    @staticmethod
+    def get_source_absolute_path(source):
         if not os.path.exists(source):
             raise IOError('Source does not exist: %s' % source)
         source = os.path.join(os.getcwd(), os.path.dirname(source),
@@ -330,61 +333,6 @@
         source = os.path.normpath(os.path.abspath(source))
         return source
 
-    def load_inventory_from_database(self):
-        '''
-        Load inventory and related objects from the database.
-        '''
-        # Load inventory object based on name or ID.
-        if self.inventory_id:
-            q = dict(id=self.inventory_id)
-        else:
-            q = dict(name=self.inventory_name)
-        try:
-            self.inventory = Inventory.objects.get(**q)
-        except Inventory.DoesNotExist:
-            raise CommandError('Inventory with %s = %s cannot be found' % list(q.items())[0])
-        except Inventory.MultipleObjectsReturned:
-            raise CommandError('Inventory with %s = %s returned multiple results' % list(q.items())[0])
-        logger.info('Updating inventory %d: %s' % (self.inventory.pk,
-                                                   self.inventory.name))
-
-        # Load inventory source if specified via environment variable (when
-        # inventory_import is called from an InventoryUpdate task).
-        inventory_source_id = os.getenv('INVENTORY_SOURCE_ID', None)
-        inventory_update_id = os.getenv('INVENTORY_UPDATE_ID', None)
-        if inventory_source_id:
-            try:
-                self.inventory_source = InventorySource.objects.get(pk=inventory_source_id,
-                                                                    inventory=self.inventory)
-            except InventorySource.DoesNotExist:
-                raise CommandError('Inventory source with id=%s not found' %
-                                   inventory_source_id)
-            try:
-                self.inventory_update = InventoryUpdate.objects.get(pk=inventory_update_id)
-            except InventoryUpdate.DoesNotExist:
-                raise CommandError('Inventory update with id=%s not found' %
-                                   inventory_update_id)
-        # Otherwise, create a new inventory source to capture this invocation
-        # via command line.
-        else:
-            with ignore_inventory_computed_fields():
-                self.inventory_source, created = InventorySource.objects.get_or_create(
-                    inventory=self.inventory,
-                    source='file',
-                    source_path=os.path.abspath(self.source),
-                    overwrite=self.overwrite,
-                    overwrite_vars=self.overwrite_vars,
-                )
-                self.inventory_update = self.inventory_source.create_inventory_update(
-                    _eager_fields=dict(
-                        job_args=json.dumps(sys.argv),
-                        job_env=dict(os.environ.items()),
-                        job_cwd=os.getcwd())
-                )
-
-        # FIXME: Wait or raise error if inventory is being updated by another
-        # source.
-
     def _batch_add_m2m(self, related_manager, *objs, **kwargs):
         key = (related_manager.instance.pk, related_manager.through._meta.db_table)
         flush = bool(kwargs.get('flush', False))
@@ -894,9 +842,9 @@ class Command(BaseCommand):
         source_vars = self.all_group.variables
         remote_license_type = source_vars.get('tower_metadata', {}).get('license_type', None)
         if remote_license_type is None:
-            raise CommandError('Unexpected Error: Tower inventory plugin missing needed metadata!')
+            raise PermissionDenied('Unexpected Error: Tower inventory plugin missing needed metadata!')
         if local_license_type != remote_license_type:
-            raise CommandError('Tower server licenses must match: source: {} local: {}'.format(
+            raise PermissionDenied('Tower server licenses must match: source: {} local: {}'.format(
                 remote_license_type, local_license_type
             ))
@@ -905,10 +853,10 @@ class Command(BaseCommand):
         local_license_type = license_info.get('license_type', 'UNLICENSED')
         if local_license_type == 'UNLICENSED':
             logger.error(LICENSE_NON_EXISTANT_MESSAGE)
-            raise CommandError('No license found!')
+            raise PermissionDenied('No license found!')
         elif local_license_type == 'open':
             return
-        available_instances = license_info.get('available_instances', 0)
+        instance_count = license_info.get('instance_count', 0)
         free_instances = license_info.get('free_instances', 0)
         time_remaining = license_info.get('time_remaining', 0)
         hard_error = license_info.get('trial', False) is True or license_info['instance_count'] == 10
@@ -916,24 +864,24 @@ class Command(BaseCommand):
         if time_remaining <= 0:
             if hard_error:
                 logger.error(LICENSE_EXPIRED_MESSAGE)
-                raise CommandError("License has expired!")
+                raise PermissionDenied("License has expired!")
             else:
                 logger.warning(LICENSE_EXPIRED_MESSAGE)
         # special check for tower-type inventory sources
         # but only if running the plugin
         TOWER_SOURCE_FILES = ['tower.yml', 'tower.yaml']
-        if self.inventory_source.source == 'tower' and any(f in self.source for f in TOWER_SOURCE_FILES):
+        if self.inventory_source.source == 'tower' and any(f in self.inventory_source.source_path for f in TOWER_SOURCE_FILES):
             # only if this is the 2nd call to license check, we cannot compare before running plugin
             if hasattr(self, 'all_group'):
                 self.remote_tower_license_compare(local_license_type)
         if free_instances < 0:
             d = {
                 'new_count': new_count,
-                'available_instances': available_instances,
+                'instance_count': instance_count,
             }
             if hard_error:
                 logger.error(LICENSE_MESSAGE % d)
-                raise CommandError('License count exceeded!')
+                raise PermissionDenied('License count exceeded!')
             else:
                 logger.warning(LICENSE_MESSAGE % d)
@@ -948,7 +896,7 @@ class Command(BaseCommand):
 
         active_count = Host.objects.org_active_count(org.id)
         if active_count > org.max_hosts:
-            raise CommandError('Host limit for organization exceeded!')
+            raise PermissionDenied('Host limit for organization exceeded!')
 
     def mark_license_failure(self, save=True):
         self.inventory_update.license_error = True
@@ -959,16 +907,103 @@ class Command(BaseCommand):
|
||||
self.inventory_update.save(update_fields=['org_host_limit_error'])
|
||||
|
||||
def handle(self, *args, **options):
|
||||
self.verbosity = int(options.get('verbosity', 1))
|
||||
self.set_logging_level()
|
||||
self.inventory_name = options.get('inventory_name', None)
|
||||
self.inventory_id = options.get('inventory_id', None)
|
||||
venv_path = options.get('venv', None)
|
||||
# Load inventory and related objects from database.
|
||||
inventory_name = options.get('inventory_name', None)
|
||||
inventory_id = options.get('inventory_id', None)
|
||||
if inventory_name and inventory_id:
|
||||
raise CommandError('--inventory-name and --inventory-id are mutually exclusive')
elif not inventory_name and not inventory_id:
raise CommandError('--inventory-name or --inventory-id is required')

with advisory_lock('inventory_{}_import'.format(inventory_id)):
# Obtain rest of the options needed to run update
raw_source = options.get('source', None)
if not raw_source:
raise CommandError('--source is required')
verbosity = int(options.get('verbosity', 1))
self.set_logging_level(verbosity)
venv_path = options.get('venv', None)

# Load inventory object based on name or ID.
if inventory_id:
q = dict(id=inventory_id)
else:
q = dict(name=inventory_name)
try:
inventory = Inventory.objects.get(**q)
except Inventory.DoesNotExist:
raise CommandError('Inventory with %s = %s cannot be found' % list(q.items())[0])
except Inventory.MultipleObjectsReturned:
raise CommandError('Inventory with %s = %s returned multiple results' % list(q.items())[0])
logger.info('Updating inventory %d: %s' % (inventory.pk, inventory.name))

# Create ad-hoc inventory source and inventory update objects
with ignore_inventory_computed_fields():
source = Command.get_source_absolute_path(raw_source)

inventory_source, created = InventorySource.objects.get_or_create(
inventory=inventory,
source='file',
source_path=os.path.abspath(source),
overwrite=bool(options.get('overwrite', False)),
overwrite_vars=bool(options.get('overwrite_vars', False)),
)
inventory_update = inventory_source.create_inventory_update(
_eager_fields=dict(
job_args=json.dumps(sys.argv),
job_env=dict(os.environ.items()),
job_cwd=os.getcwd())
)

data = AnsibleInventoryLoader(
source=source, venv_path=venv_path, verbosity=verbosity
).load()

logger.debug('Finished loading from source: %s', source)

status, tb, exc = 'error', '', None
try:
self.perform_update(options, data, inventory_update)
status = 'successful'
except Exception as e:
exc = e
if isinstance(e, KeyboardInterrupt):
status = 'canceled'
else:
tb = traceback.format_exc()

with ignore_inventory_computed_fields():
inventory_update = InventoryUpdate.objects.get(pk=inventory_update.pk)
inventory_update.result_traceback = tb
inventory_update.status = status
inventory_update.save(update_fields=['status', 'result_traceback'])
inventory_source.status = status
inventory_source.save(update_fields=['status'])

if exc:
logger.error(str(exc))

if exc:
if isinstance(exc, CommandError):
sys.exit(1)
raise exc
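The status bookkeeping in `handle()` above is deliberately pessimistic: the status starts as `'error'`, flips to `'successful'` only after `perform_update` returns, and a traceback is captured for anything other than a cancel. A minimal standalone sketch of that pattern (a hypothetical helper, not part of the AWX API):

```python
import traceback

def run_update(perform_update):
    # Assume failure until the update completes; capture the traceback so the
    # caller can persist it alongside the final status.
    status, tb, exc = 'error', '', None
    try:
        perform_update()
        status = 'successful'
    except Exception as e:
        exc = e
        if isinstance(e, KeyboardInterrupt):
            status = 'canceled'
        else:
            tb = traceback.format_exc()
    return status, tb, exc
```

Note that under Python 3 `KeyboardInterrupt` does not subclass `Exception`, so the `canceled` branch here (as in the original) only fires if the interrupt is re-raised as a regular exception.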

def perform_update(self, options, data, inventory_update):
"""Shared method for both awx-manage CLI updates and inventory updates
from the tasks system.

This saves the inventory data to the database, calling load_into_database
but also wraps that method in a host of options processing
"""
# outside of normal options, these are needed as part of programmatic interface
self.inventory = inventory_update.inventory
self.inventory_source = inventory_update.inventory_source
self.inventory_update = inventory_update

# the update options, could be parser object or dict
self.overwrite = bool(options.get('overwrite', False))
self.overwrite_vars = bool(options.get('overwrite_vars', False))
self.keep_vars = bool(options.get('keep_vars', False))
self.is_custom = bool(options.get('custom', False))
self.source = options.get('source', None)
self.enabled_var = options.get('enabled_var', None)
self.enabled_value = options.get('enabled_value', None)
self.group_filter = options.get('group_filter', None) or r'^.+$'

@@ -976,17 +1011,6 @@ class Command(BaseCommand):
self.exclude_empty_groups = bool(options.get('exclude_empty_groups', False))
self.instance_id_var = options.get('instance_id_var', None)

self.invoked_from_dispatcher = False if os.getenv('INVENTORY_SOURCE_ID', None) is None else True

# Load inventory and related objects from database.
if self.inventory_name and self.inventory_id:
raise CommandError('--inventory-name and --inventory-id are mutually exclusive')
elif not self.inventory_name and not self.inventory_id:
raise CommandError('--inventory-name or --inventory-id is required')
if (self.overwrite or self.overwrite_vars) and self.keep_vars:
raise CommandError('--overwrite/--overwrite-vars and --keep-vars are mutually exclusive')
if not self.source:
raise CommandError('--source is required')
try:
self.group_filter_re = re.compile(self.group_filter)
except re.error:

@@ -997,146 +1021,115 @@ class Command(BaseCommand):
raise CommandError('invalid regular expression for --host-filter')

begin = time.time()
with advisory_lock('inventory_{}_update'.format(self.inventory_id)):
self.load_inventory_from_database()

# Since perform_update can be invoked either through the awx-manage CLI
# or from the task system, we need to create a new lock at this level
# (even though inventory_import.Command.handle -- which calls
# perform_update -- has its own lock, inventory_ID_import)
with advisory_lock('inventory_{}_perform_update'.format(self.inventory.id)):

try:
self.check_license()
except CommandError as e:
except PermissionDenied as e:
self.mark_license_failure(save=True)
raise e

try:
# Check the per-org host limits
self.check_org_host_limit()
except CommandError as e:
except PermissionDenied as e:
self.mark_org_limits_failure(save=True)
raise e

status, tb, exc = 'error', '', None
try:
if settings.SQL_DEBUG:
queries_before = len(connection.queries)
if settings.SQL_DEBUG:
queries_before = len(connection.queries)

# Update inventory update for this command line invocation.
with ignore_inventory_computed_fields():
iu = self.inventory_update
if iu.status != 'running':
with transaction.atomic():
self.inventory_update.status = 'running'
self.inventory_update.save()
# Update inventory update for this command line invocation.
with ignore_inventory_computed_fields():
# TODO: move this to before perform_update
iu = self.inventory_update
if iu.status != 'running':
with transaction.atomic():
self.inventory_update.status = 'running'
self.inventory_update.save()

source = self.get_source_absolute_path(self.source)
logger.info('Processing JSON output...')
inventory = MemInventory(
group_filter_re=self.group_filter_re, host_filter_re=self.host_filter_re)
inventory = dict_to_mem_data(data, inventory=inventory)

data = AnsibleInventoryLoader(source=source, is_custom=self.is_custom,
venv_path=venv_path, verbosity=self.verbosity).load()
logger.info('Loaded %d groups, %d hosts', len(inventory.all_group.all_groups),
len(inventory.all_group.all_hosts))

logger.debug('Finished loading from source: %s', source)
logger.info('Processing JSON output...')
inventory = MemInventory(
group_filter_re=self.group_filter_re, host_filter_re=self.host_filter_re)
inventory = dict_to_mem_data(data, inventory=inventory)
if self.exclude_empty_groups:
inventory.delete_empty_groups()

del data # forget dict from import, could be large
self.all_group = inventory.all_group

logger.info('Loaded %d groups, %d hosts', len(inventory.all_group.all_groups),
len(inventory.all_group.all_hosts))
if settings.DEBUG:
# depending on inventory source, this output can be
# *exceedingly* verbose - crawling a deeply nested
# inventory/group data structure and printing metadata about
# each host and its memberships
#
# it's easy for this scale of data to overwhelm pexpect,
# (and it's likely only useful for purposes of debugging the
# actual inventory import code), so only print it if we have to:
# https://github.com/ansible/ansible-tower/issues/7414#issuecomment-321615104
self.all_group.debug_tree()

if self.exclude_empty_groups:
inventory.delete_empty_groups()

self.all_group = inventory.all_group

if settings.DEBUG:
# depending on inventory source, this output can be
# *exceedingly* verbose - crawling a deeply nested
# inventory/group data structure and printing metadata about
# each host and its memberships
#
# it's easy for this scale of data to overwhelm pexpect,
# (and it's likely only useful for purposes of debugging the
# actual inventory import code), so only print it if we have to:
# https://github.com/ansible/ansible-tower/issues/7414#issuecomment-321615104
self.all_group.debug_tree()
with batch_role_ancestor_rebuilding():
# If using with transaction.atomic() with try ... catch,
# with transaction.atomic() must be inside the try section of the code as per Django docs
try:
# Ensure that this is managed as an atomic SQL transaction,
# and thus properly rolled back if there is an issue.
with transaction.atomic():
# Merge/overwrite inventory into database.
if settings.SQL_DEBUG:
logger.warning('loading into database...')
with ignore_inventory_computed_fields():
if getattr(settings, 'ACTIVITY_STREAM_ENABLED_FOR_INVENTORY_SYNC', True):
with batch_role_ancestor_rebuilding():
# If using with transaction.atomic() with try ... catch,
# with transaction.atomic() must be inside the try section of the code as per Django docs
try:
# Ensure that this is managed as an atomic SQL transaction,
# and thus properly rolled back if there is an issue.
with transaction.atomic():
# Merge/overwrite inventory into database.
if settings.SQL_DEBUG:
logger.warning('loading into database...')
with ignore_inventory_computed_fields():
if getattr(settings, 'ACTIVITY_STREAM_ENABLED_FOR_INVENTORY_SYNC', True):
self.load_into_database()
else:
with disable_activity_stream():
self.load_into_database()
else:
with disable_activity_stream():
self.load_into_database()
if settings.SQL_DEBUG:
queries_before2 = len(connection.queries)
self.inventory.update_computed_fields()
if settings.SQL_DEBUG:
logger.warning('update computed fields took %d queries',
len(connection.queries) - queries_before2)
# Check if the license is valid.
# If the license is not valid, a CommandError will be thrown,
# and inventory update will be marked as invalid.
# with transaction.atomic() will roll back the changes.
license_fail = True
self.check_license()
if settings.SQL_DEBUG:
queries_before2 = len(connection.queries)
self.inventory.update_computed_fields()
if settings.SQL_DEBUG:
logger.warning('update computed fields took %d queries',
len(connection.queries) - queries_before2)

# Check the per-org host limits
license_fail = False
self.check_org_host_limit()
except CommandError as e:
if license_fail:
self.mark_license_failure()
else:
self.mark_org_limits_failure()
raise e
# Check if the license is valid.
# If the license is not valid, a CommandError will be thrown,
# and inventory update will be marked as invalid.
# with transaction.atomic() will roll back the changes.
license_fail = True
self.check_license()

if settings.SQL_DEBUG:
logger.warning('Inventory import completed for %s in %0.1fs',
self.inventory_source.name, time.time() - begin)
# Check the per-org host limits
license_fail = False
self.check_org_host_limit()
except PermissionDenied as e:
if license_fail:
self.mark_license_failure(save=True)
else:
logger.info('Inventory import completed for %s in %0.1fs',
self.inventory_source.name, time.time() - begin)
status = 'successful'
self.mark_org_limits_failure(save=True)
raise e

# If we're in debug mode, then log the queries and time
# used to do the operation.
if settings.SQL_DEBUG:
queries_this_import = connection.queries[queries_before:]
sqltime = sum(float(x['time']) for x in queries_this_import)
logger.warning('Inventory import required %d queries '
'taking %0.3fs', len(queries_this_import),
sqltime)
except Exception as e:
if isinstance(e, KeyboardInterrupt):
status = 'canceled'
exc = e
elif isinstance(e, CommandError):
exc = e
logger.warning('Inventory import completed for %s in %0.1fs',
self.inventory_source.name, time.time() - begin)
else:
tb = traceback.format_exc()
exc = e
logger.info('Inventory import completed for %s in %0.1fs',
self.inventory_source.name, time.time() - begin)

if not self.invoked_from_dispatcher:
with ignore_inventory_computed_fields():
self.inventory_update = InventoryUpdate.objects.get(pk=self.inventory_update.pk)
self.inventory_update.result_traceback = tb
self.inventory_update.status = status
self.inventory_update.save(update_fields=['status', 'result_traceback'])
self.inventory_source.status = status
self.inventory_source.save(update_fields=['status'])

if exc:
logger.error(str(exc))

if exc:
if isinstance(exc, CommandError):
sys.exit(1)
raise exc
# If we're in debug mode, then log the queries and time
# used to do the operation.
if settings.SQL_DEBUG:
queries_this_import = connection.queries[queries_before:]
sqltime = sum(float(x['time']) for x in queries_this_import)
logger.warning('Inventory import required %d queries '
'taking %0.3fs', len(queries_this_import),
sqltime)
@@ -181,4 +181,4 @@ class MigrationRanCheckMiddleware(MiddlewareMixin):
plan = executor.migration_plan(executor.loader.graph.leaf_nodes())
if bool(plan) and \
getattr(resolve(request.path), 'url_name', '') != 'migrations_notran':
return redirect(reverse("ui:migrations_notran"))
return redirect(reverse("ui_next:migrations_notran"))
23 awx/main/migrations/0123_drop_hg_support.py Normal file
@@ -0,0 +1,23 @@
from django.db import migrations, models
from awx.main.migrations._hg_removal import delete_hg_scm


class Migration(migrations.Migration):

dependencies = [
('main', '0122_really_remove_cloudforms_inventory'),
]

operations = [
migrations.RunPython(delete_hg_scm),
migrations.AlterField(
model_name='project',
name='scm_type',
field=models.CharField(blank=True, choices=[('', 'Manual'), ('git', 'Git'), ('svn', 'Subversion'), ('insights', 'Red Hat Insights'), ('archive', 'Remote Archive')], default='', help_text='Specifies the source control system used to store the project.', max_length=8, verbose_name='SCM Type'),
),
migrations.AlterField(
model_name='projectupdate',
name='scm_type',
field=models.CharField(blank=True, choices=[('', 'Manual'), ('git', 'Git'), ('svn', 'Subversion'), ('insights', 'Red Hat Insights'), ('archive', 'Remote Archive')], default='', help_text='Specifies the source control system used to store the project.', max_length=8, verbose_name='SCM Type'),
),
]

19 awx/main/migrations/_hg_removal.py Normal file
@@ -0,0 +1,19 @@
import logging

from awx.main.utils.common import set_current_apps

logger = logging.getLogger('awx.main.migrations')


def delete_hg_scm(apps, schema_editor):
set_current_apps(apps)
Project = apps.get_model('main', 'Project')
ProjectUpdate = apps.get_model('main', 'ProjectUpdate')

ProjectUpdate.objects.filter(project__scm_type='hg').update(scm_type='')
update_ct = Project.objects.filter(scm_type='hg').update(scm_type='')

if update_ct:
logger.warn('Changed {} mercurial projects to manual, deprecation period ended'.format(
update_ct
))
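The `delete_hg_scm` data migration above blanks out `scm_type` for any Mercurial project and logs how many rows changed. A Django-free sketch of that same logic over plain dicts (hypothetical stand-in for the queryset `.update()` calls, useful for seeing why the migration is idempotent):

```python
def delete_hg_scm_sketch(projects):
    # Blank out scm_type on anything still set to mercurial and report
    # how many entries changed; a second pass changes nothing.
    update_ct = 0
    for project in projects:
        if project.get('scm_type') == 'hg':
            project['scm_type'] = ''
            update_ct += 1
    return update_ct
```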

@@ -1,9 +1,13 @@
import json
import re
import logging

from django.utils.translation import ugettext_lazy as _
from django.utils.encoding import iri_to_uri


FrozenInjectors = dict()
logger = logging.getLogger('awx.main.migrations')


class PluginFileInjector(object):

@@ -129,6 +133,7 @@ class azure_rm(PluginFileInjector):
ret['exclude_host_filters'].append("location not in {}".format(repr(python_regions)))
return ret


class ec2(PluginFileInjector):
plugin_name = 'aws_ec2'
namespace = 'amazon'

@@ -586,6 +591,7 @@ class openstack(PluginFileInjector):
ret['inventory_hostname'] = use_host_name_for_name(source_vars['use_hostnames'])
return ret


class rhv(PluginFileInjector):
"""ovirt uses the custom credential templating, and that is all
"""

@@ -52,7 +52,6 @@ class ProjectOptions(models.Model):
SCM_TYPE_CHOICES = [
('', _('Manual')),
('git', _('Git')),
('hg', _('Mercurial')),
('svn', _('Subversion')),
('insights', _('Red Hat Insights')),
('archive', _('Remote Archive')),
@@ -121,6 +121,27 @@ def sync_superuser_status_to_rbac(instance, **kwargs):
Role.singleton(ROLE_SINGLETON_SYSTEM_ADMINISTRATOR).members.remove(instance)


def sync_rbac_to_superuser_status(instance, sender, **kwargs):
'When the is_superuser flag is false but a user has the System Admin role, update the database to reflect that'
if kwargs['action'] in ['post_add', 'post_remove', 'post_clear']:
new_status_value = bool(kwargs['action'] == 'post_add')
if hasattr(instance, 'singleton_name'): # duck typing, role.members.add() vs user.roles.add()
role = instance
if role.singleton_name == ROLE_SINGLETON_SYSTEM_ADMINISTRATOR:
if kwargs['pk_set']:
kwargs['model'].objects.filter(pk__in=kwargs['pk_set']).update(is_superuser=new_status_value)
elif kwargs['action'] == 'post_clear':
kwargs['model'].objects.all().update(is_superuser=False)
else:
user = instance
if kwargs['action'] == 'post_clear':
user.is_superuser = False
user.save(update_fields=['is_superuser'])
elif kwargs['model'].objects.filter(pk__in=kwargs['pk_set'], singleton_name=ROLE_SINGLETON_SYSTEM_ADMINISTRATOR).exists():
user.is_superuser = new_status_value
user.save(update_fields=['is_superuser'])
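The new `sync_rbac_to_superuser_status` handler keys everything off the `m2m_changed` action: adding a System Administrator membership grants the flag, removing or clearing revokes it, and `pre_*` signals are ignored. That dispatch can be distilled into a tiny pure function (a hypothetical helper, not part of the AWX codebase):

```python
def superuser_flag_for_action(action):
    # post_add grants the flag; post_remove/post_clear revoke it;
    # anything else (e.g. pre_add) means "no change".
    if action == 'post_add':
        return True
    if action in ('post_remove', 'post_clear'):
        return False
    return None
```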

def rbac_activity_stream(instance, sender, **kwargs):
# Only if we are associating/disassociating
if kwargs['action'] in ['pre_add', 'pre_remove']:

@@ -197,6 +218,7 @@ m2m_changed.connect(rebuild_role_ancestor_list, Role.parents.through)
m2m_changed.connect(rbac_activity_stream, Role.members.through)
m2m_changed.connect(rbac_activity_stream, Role.parents.through)
post_save.connect(sync_superuser_status_to_rbac, sender=User)
m2m_changed.connect(sync_rbac_to_superuser_status, Role.members.through)
pre_delete.connect(cleanup_detached_labels_on_deleted_parent, sender=UnifiedJob)
pre_delete.connect(cleanup_detached_labels_on_deleted_parent, sender=UnifiedJobTemplate)
@@ -23,7 +23,6 @@ import fcntl
from pathlib import Path
from uuid import uuid4
import urllib.parse as urlparse
import shlex

# Django
from django.conf import settings

@@ -64,7 +63,7 @@ from awx.main.models import (
build_safe_env, enforce_bigint_pk_migration
)
from awx.main.constants import ACTIVE_STATES
from awx.main.exceptions import AwxTaskError
from awx.main.exceptions import AwxTaskError, PostRunError
from awx.main.queue import CallbackQueueDispatcher
from awx.main.isolated import manager as isolated_manager
from awx.main.dispatch.publish import task

@@ -79,6 +78,7 @@ from awx.main.utils.external_logging import reconfigure_rsyslog
from awx.main.utils.safe_yaml import safe_dump, sanitize_jinja
from awx.main.utils.reload import stop_local_services
from awx.main.utils.pglock import advisory_lock
from awx.main.utils.handlers import SpecialInventoryHandler
from awx.main.consumers import emit_channel_notification
from awx.main import analytics
from awx.conf import settings_registry

@@ -1225,6 +1225,13 @@ class BaseTask(object):
Ansible runner puts a parent_uuid on each event, no matter what the type.
AWX only saves the parent_uuid if the event is for a Job.
'''
# cache end_line locally for RunInventoryUpdate tasks
# which generate job events from two 'streams':
# ansible-inventory and the awx.main.commands.inventory_import
# logger
if isinstance(self, RunInventoryUpdate):
self.end_line = event_data['end_line']

if event_data.get(self.event_data_key, None):
if self.event_data_key != 'job_id':
event_data.pop('parent_uuid', None)

@@ -1253,7 +1260,7 @@ class BaseTask(object):
# so it *should* have a negligible performance impact
task = event_data.get('event_data', {}).get('task_action')
try:
if task in ('git', 'hg', 'svn'):
if task in ('git', 'svn'):
event_data_json = json.dumps(event_data)
event_data_json = UriCleaner.remove_sensitive(event_data_json)
event_data = json.loads(event_data_json)

@@ -1521,6 +1528,12 @@ class BaseTask(object):

try:
self.post_run_hook(self.instance, status)
except PostRunError as exc:
if status == 'successful':
status = exc.status
extra_update_fields['job_explanation'] = exc.args[0]
if exc.tb:
extra_update_fields['result_traceback'] = exc.tb
except Exception:
logger.exception('{} Post run hook errored.'.format(self.instance.log_format))
@@ -2141,7 +2154,7 @@ class RunProjectUpdate(BaseTask):
elif not scm_branch:
raise RuntimeError('Could not determine a revision to run from project.')
elif not scm_branch:
scm_branch = {'hg': 'tip'}.get(project_update.scm_type, 'HEAD')
scm_branch = 'HEAD'

galaxy_creds_are_defined = (
project_update.project.organization and

@@ -2150,7 +2163,7 @@ class RunProjectUpdate(BaseTask):
if not galaxy_creds_are_defined and (
settings.AWX_ROLES_ENABLED or settings.AWX_COLLECTIONS_ENABLED
):
logger.debug(
logger.warning(
'Galaxy role/collection syncing is enabled, but no '
f'credentials are configured for {project_update.project.organization}.'
)

@@ -2417,9 +2430,10 @@ class RunProjectUpdate(BaseTask):
shutil.rmtree(stage_path) # cannot trust content update produced

if self.job_private_data_dir:
# copy project folder before resetting to default branch
# because some git-tree-specific resources (like submodules) might matter
self.make_local_copy(instance, self.job_private_data_dir)
if status == 'successful':
# copy project folder before resetting to default branch
# because some git-tree-specific resources (like submodules) might matter
self.make_local_copy(instance, self.job_private_data_dir)
if self.original_branch:
# for git project syncs, non-default branches can be problems
# restore to branch the repo was on before this run

@@ -2461,6 +2475,14 @@ class RunInventoryUpdate(BaseTask):
event_model = InventoryUpdateEvent
event_data_key = 'inventory_update_id'

# TODO: remove once inv updates run in containers
def should_use_proot(self, inventory_update):
'''
Return whether this task should use proot.
'''
return getattr(settings, 'AWX_PROOT_ENABLED', False)

# TODO: remove once inv updates run in containers
@property
def proot_show_paths(self):
return [settings.AWX_ANSIBLE_COLLECTIONS_PATHS]
@@ -2485,15 +2507,11 @@ class RunInventoryUpdate(BaseTask):
return injector.build_private_data(inventory_update, private_data_dir)

def build_env(self, inventory_update, private_data_dir, isolated, private_data_files=None):
"""Build environment dictionary for inventory import.
"""Build environment dictionary for ansible-inventory.

This used to be the mechanism by which any data that needs to be passed
to the inventory update script is set up. In particular, this is how
inventory update is aware of its proper credentials.

Most environment injection is now accomplished by the credential
injectors. The primary purpose this still serves is to
still point to the inventory update INI or config file.
Most environment variables related to credentials or configuration
are accomplished by the inventory source injectors (in this method)
or custom credential type injectors (in main run method).
"""
env = super(RunInventoryUpdate, self).build_env(inventory_update,
private_data_dir,

@@ -2501,8 +2519,11 @@ class RunInventoryUpdate(BaseTask):
private_data_files=private_data_files)
if private_data_files is None:
private_data_files = {}
self.add_awx_venv(env)
# Pass inventory source ID to inventory script.
# TODO: remove once containers replace custom venvs
self.add_ansible_venv(inventory_update.ansible_virtualenv_path, env, isolated=isolated)

# Legacy environment variables, were used as signal to awx-manage command
# now they are provided in case some scripts may be relying on them
env['INVENTORY_SOURCE_ID'] = str(inventory_update.inventory_source_id)
env['INVENTORY_UPDATE_ID'] = str(inventory_update.pk)
env.update(STANDARD_INVENTORY_UPDATE_ENV)
@@ -2565,47 +2586,25 @@ class RunInventoryUpdate(BaseTask):
if inventory is None:
raise RuntimeError('Inventory Source is not associated with an Inventory.')

# Piece together the initial command to run via. the shell.
args = ['awx-manage', 'inventory_import']
args.extend(['--inventory-id', str(inventory.pk)])
args = ['ansible-inventory', '--list', '--export']

# Add appropriate arguments for overwrite if the inventory_update
# object calls for it.
if inventory_update.overwrite:
args.append('--overwrite')
if inventory_update.overwrite_vars:
args.append('--overwrite-vars')
# Add arguments for the source inventory file/script/thing
source_location = self.pseudo_build_inventory(inventory_update, private_data_dir)
args.append('-i')
args.append(source_location)

# Declare the virtualenv the management command should activate
# as it calls ansible-inventory
args.extend(['--venv', inventory_update.ansible_virtualenv_path])
args.append('--output')
args.append(os.path.join(private_data_dir, 'artifacts', 'output.json'))

src = inventory_update.source
if inventory_update.enabled_var:
args.extend(['--enabled-var', shlex.quote(inventory_update.enabled_var)])
args.extend(['--enabled-value', shlex.quote(inventory_update.enabled_value)])
if os.path.isdir(source_location):
playbook_dir = source_location
else:
if getattr(settings, '%s_ENABLED_VAR' % src.upper(), False):
args.extend(['--enabled-var',
getattr(settings, '%s_ENABLED_VAR' % src.upper())])
if getattr(settings, '%s_ENABLED_VALUE' % src.upper(), False):
args.extend(['--enabled-value',
getattr(settings, '%s_ENABLED_VALUE' % src.upper())])
if inventory_update.host_filter:
args.extend(['--host-filter', shlex.quote(inventory_update.host_filter)])
if getattr(settings, '%s_EXCLUDE_EMPTY_GROUPS' % src.upper()):
args.append('--exclude-empty-groups')
if getattr(settings, '%s_INSTANCE_ID_VAR' % src.upper(), False):
args.extend(['--instance-id-var',
"'{}'".format(getattr(settings, '%s_INSTANCE_ID_VAR' % src.upper())),])
# Add arguments for the source inventory script
args.append('--source')
args.append(self.pseudo_build_inventory(inventory_update, private_data_dir))
if src == 'custom':
args.append("--custom")
args.append('-v%d' % inventory_update.verbosity)
if settings.DEBUG:
args.append('--traceback')
playbook_dir = os.path.dirname(source_location)
args.extend(['--playbook-dir', playbook_dir])

if inventory_update.verbosity:
args.append('-' + 'v' * min(5, inventory_update.verbosity * 2 + 1))

return args
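The hunk above replaces the old `awx-manage inventory_import` invocation with a direct `ansible-inventory` call whose JSON goes to `artifacts/output.json`. A self-contained sketch of how that argument list comes together (paths and defaults here are illustrative, not AWX's actual values; flag ordering follows the diff):

```python
import os

def build_inventory_args(private_data_dir, source_location, verbosity=1):
    # Assemble an ansible-inventory command line: export the full inventory
    # as JSON into the private data dir's artifacts folder.
    args = ['ansible-inventory', '--list', '--export', '-i', source_location]
    args.append('--output')
    args.append(os.path.join(private_data_dir, 'artifacts', 'output.json'))
    # A directory source is its own playbook dir; a file source uses its parent.
    if os.path.isdir(source_location):
        playbook_dir = source_location
    else:
        playbook_dir = os.path.dirname(source_location)
    args.extend(['--playbook-dir', playbook_dir])
    if verbosity:
        # verbosity 1 -> -vvv, capped at -vvvvv, mirroring the diff
        args.append('-' + 'v' * min(5, verbosity * 2 + 1))
    return args
```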

def build_inventory(self, inventory_update, private_data_dir):

@@ -2645,11 +2644,9 @@ class RunInventoryUpdate(BaseTask):

def build_cwd(self, inventory_update, private_data_dir):
'''
There are two cases where the inventory "source" is in a different
There is one case where the inventory "source" is in a different
location from the private data:
- deprecated vendored inventory scripts in awx/plugins/inventory
- SCM, where source needs to live in the project folder
in these cases, the inventory does not exist in the standard tempdir
'''
src = inventory_update.source
if src == 'scm' and inventory_update.source_project_update:

@@ -2707,6 +2704,75 @@ class RunInventoryUpdate(BaseTask):
# This follows update, not sync, so make copy here
RunProjectUpdate.make_local_copy(source_project, private_data_dir)

def post_run_hook(self, inventory_update, status):
if status != 'successful':
return # nothing to save, step out of the way to allow error reporting

private_data_dir = inventory_update.job_env['AWX_PRIVATE_DATA_DIR']
expected_output = os.path.join(private_data_dir, 'artifacts', 'output.json')
with open(expected_output) as f:
data = json.load(f)

# build inventory save options
options = dict(
overwrite=inventory_update.overwrite,
overwrite_vars=inventory_update.overwrite_vars,
)
src = inventory_update.source

if inventory_update.enabled_var:
options['enabled_var'] = inventory_update.enabled_var
options['enabled_value'] = inventory_update.enabled_value
else:
if getattr(settings, '%s_ENABLED_VAR' % src.upper(), False):
options['enabled_var'] = getattr(settings, '%s_ENABLED_VAR' % src.upper())
if getattr(settings, '%s_ENABLED_VALUE' % src.upper(), False):
options['enabled_value'] = getattr(settings, '%s_ENABLED_VALUE' % src.upper())

if inventory_update.host_filter:
options['host_filter'] = inventory_update.host_filter

if getattr(settings, '%s_EXCLUDE_EMPTY_GROUPS' % src.upper()):
options['exclude_empty_groups'] = True
if getattr(settings, '%s_INSTANCE_ID_VAR' % src.upper(), False):
options['instance_id_var'] = getattr(settings, '%s_INSTANCE_ID_VAR' % src.upper())

# Verbosity is applied to saving process, as well as ansible-inventory CLI option
if inventory_update.verbosity:
options['verbosity'] = inventory_update.verbosity

handler = SpecialInventoryHandler(
self.event_handler, self.cancel_callback,
verbosity=inventory_update.verbosity,
job_timeout=self.get_instance_timeout(self.instance),
start_time=inventory_update.started,
counter=self.event_ct, initial_line=self.end_line
)
inv_logger = logging.getLogger('awx.main.commands.inventory_import')
formatter = inv_logger.handlers[0].formatter
formatter.job_start = inventory_update.started
handler.formatter = formatter
inv_logger.handlers[0] = handler

from awx.main.management.commands.inventory_import import Command as InventoryImportCommand
cmd = InventoryImportCommand()
try:
# save the inventory data to database.
# canceling exceptions will be handled in the global post_run_hook
cmd.perform_update(options, data, inventory_update)
except PermissionDenied as exc:
logger.exception('License error saving {} content'.format(inventory_update.log_format))
raise PostRunError(str(exc), status='error')
except PostRunError:
logger.exception('Error saving {} content, rolling back changes'.format(inventory_update.log_format))
raise
except Exception:
logger.exception('Exception saving {} content, rolling back changes.'.format(
inventory_update.log_format))
raise PostRunError(
'Error occured while saving inventory data, see traceback or server logs',
status='error', tb=traceback.format_exc())
|
||||
|
||||
|
||||
@task(queue=get_local_queuename)
|
||||
class RunAdHocCommand(BaseTask):
|
||||
|
||||
@@ -6,7 +6,7 @@ from collections import OrderedDict
from django.db.models.deletion import Collector, SET_NULL, CASCADE
from django.core.management import call_command

from awx.main.management.commands.deletion import AWXCollector
from awx.main.utils.deletion import AWXCollector
from awx.main.models import (
    JobTemplate, User, Job, JobEvent, Notification,
    WorkflowJobNode, JobHostSummary

@@ -9,6 +9,9 @@ import os

# Django
from django.core.management.base import CommandError

# for license errors
from rest_framework.exceptions import PermissionDenied

# AWX
from awx.main.management.commands import inventory_import
from awx.main.models import Inventory, Host, Group, InventorySource
@@ -83,7 +86,7 @@ class MockLoader:
        return self._data


    def mock_logging(self):
    def mock_logging(self, level):
        pass


@@ -322,6 +325,6 @@ def test_tower_version_compare():
            "version": "2.0.1-1068-g09684e2c41"
        }
    }
    with pytest.raises(CommandError):
    with pytest.raises(PermissionDenied):
        cmd.remote_tower_license_compare('very_supported')
        cmd.remote_tower_license_compare('open')
@@ -214,6 +214,9 @@ def test_inventory_update_injected_content(this_kind, inventory, fake_credential
        f"'{inventory_filename}' file not found in inventory update runtime files {content.keys()}"

    env.pop('ANSIBLE_COLLECTIONS_PATHS', None)  # collection paths not relevant to this test
    env.pop('PYTHONPATH')
    env.pop('VIRTUAL_ENV')
    env.pop('PROOT_TMP_DIR')
    base_dir = os.path.join(DATA, 'plugins')
    if not os.path.exists(base_dir):
        os.mkdir(base_dir)
@@ -4,7 +4,7 @@ from unittest import mock
from django.test import TransactionTestCase

from awx.main.access import UserAccess, RoleAccess, TeamAccess
from awx.main.models import User, Organization, Inventory
from awx.main.models import User, Organization, Inventory, Role


class TestSysAuditorTransactional(TransactionTestCase):
@@ -170,4 +170,34 @@ def test_org_admin_cannot_delete_member_attached_to_other_group(org_admin, org_m
    access = UserAccess(org_admin)
    other_org.member_role.members.add(org_member)
    assert not access.can_delete(org_member)


@pytest.mark.parametrize('reverse', (True, False))
@pytest.mark.django_db
def test_consistency_of_is_superuser_flag(reverse):
    users = [User.objects.create(username='rando_{}'.format(i)) for i in range(2)]
    for u in users:
        assert u.is_superuser is False

    system_admin = Role.singleton('system_administrator')
    if reverse:
        for u in users:
            u.roles.add(system_admin)
    else:
        system_admin.members.add(*[u.id for u in users])  # like .add(42, 54)

    for u in users:
        u.refresh_from_db()
        assert u.is_superuser is True

    users[0].roles.clear()
    for u in users:
        u.refresh_from_db()
    assert users[0].is_superuser is False
    assert users[1].is_superuser is True

    system_admin.members.clear()

    for u in users:
        u.refresh_from_db()
        assert u.is_superuser is False
@@ -33,32 +33,6 @@ class TestInvalidOptions:
        assert 'inventory-id' in str(err.value)
        assert 'exclusive' in str(err.value)

    def test_invalid_options_id_and_keep_vars(self):
        # You can't overwrite and keep_vars at the same time, that wouldn't make sense
        cmd = Command()
        with pytest.raises(CommandError) as err:
            cmd.handle(
                inventory_id=42, overwrite=True, keep_vars=True
            )
        assert 'overwrite-vars' in str(err.value)
        assert 'exclusive' in str(err.value)

    def test_invalid_options_id_but_no_source(self):
        # Need a source to import
        cmd = Command()
        with pytest.raises(CommandError) as err:
            cmd.handle(
                inventory_id=42, overwrite=True, keep_vars=True
            )
        assert 'overwrite-vars' in str(err.value)
        assert 'exclusive' in str(err.value)
        with pytest.raises(CommandError) as err:
            cmd.handle(
                inventory_id=42, overwrite_vars=True, keep_vars=True
            )
        assert 'overwrite-vars' in str(err.value)
        assert 'exclusive' in str(err.value)

    def test_invalid_options_missing_source(self):
        cmd = Command()
        with pytest.raises(CommandError) as err:
@@ -1909,19 +1909,16 @@ class TestProjectUpdateCredentials(TestJobExecution):
    parametrize = {
        'test_username_and_password_auth': [
            dict(scm_type='git'),
            dict(scm_type='hg'),
            dict(scm_type='svn'),
            dict(scm_type='archive'),
        ],
        'test_ssh_key_auth': [
            dict(scm_type='git'),
            dict(scm_type='hg'),
            dict(scm_type='svn'),
            dict(scm_type='archive'),
        ],
        'test_awx_task_env': [
            dict(scm_type='git'),
            dict(scm_type='hg'),
            dict(scm_type='svn'),
            dict(scm_type='archive'),
        ]
@@ -2061,8 +2058,8 @@ class TestInventoryUpdateCredentials(TestJobExecution):
            credential, env, {}, [], private_data_dir
        )

        assert '--custom' in ' '.join(args)
        script = args[args.index('--source') + 1]
        assert '-i' in ' '.join(args)
        script = args[args.index('-i') + 1]
        with open(script, 'r') as f:
            assert f.read() == inventory_update.source_script.script
        assert env['FOO'] == 'BAR'
@@ -222,9 +222,8 @@ def update_scm_url(scm_type, url, username=True, password=True,
    '''
    # Handle all of the URL formats supported by the SCM systems:
    # git: https://www.kernel.org/pub/software/scm/git/docs/git-clone.html#URLS
    # hg: http://www.selenic.com/mercurial/hg.1.html#url-paths
    # svn: http://svnbook.red-bean.com/en/1.7/svn-book.html#svn.advanced.reposurls
    if scm_type not in ('git', 'hg', 'svn', 'insights', 'archive'):
    if scm_type not in ('git', 'svn', 'insights', 'archive'):
        raise ValueError(_('Unsupported SCM type "%s"') % str(scm_type))
    if not url.strip():
        return ''
@@ -256,8 +255,8 @@ def update_scm_url(scm_type, url, username=True, password=True,
        # SCP style before passed to git module.
        parts = urllib.parse.urlsplit('git+ssh://%s' % modified_url)
    # Handle local paths specified without file scheme (e.g. /path/to/foo).
    # Only supported by git and hg.
    elif scm_type in ('git', 'hg'):
    # Only supported by git.
    elif scm_type == 'git':
        if not url.startswith('/'):
            parts = urllib.parse.urlsplit('file:///%s' % url)
        else:
@@ -268,7 +267,6 @@ def update_scm_url(scm_type, url, username=True, password=True,
    # Validate that scheme is valid for given scm_type.
    scm_type_schemes = {
        'git': ('ssh', 'git', 'git+ssh', 'http', 'https', 'ftp', 'ftps', 'file'),
        'hg': ('http', 'https', 'ssh', 'file'),
        'svn': ('http', 'https', 'svn', 'svn+ssh', 'file'),
        'insights': ('http', 'https'),
        'archive': ('http', 'https'),
@@ -300,12 +298,6 @@ def update_scm_url(scm_type, url, username=True, password=True,
    if scm_type == 'git' and parts.scheme.endswith('ssh') and parts.hostname in special_git_hosts and netloc_password:
        #raise ValueError('Password not allowed for SSH access to %s.' % parts.hostname)
        netloc_password = ''
    special_hg_hosts = ('bitbucket.org', 'altssh.bitbucket.org')
    if scm_type == 'hg' and parts.scheme == 'ssh' and parts.hostname in special_hg_hosts and netloc_username != 'hg':
        raise ValueError(_('Username must be "hg" for SSH access to %s.') % parts.hostname)
    if scm_type == 'hg' and parts.scheme == 'ssh' and netloc_password:
        #raise ValueError('Password not supported for SSH with Mercurial.')
        netloc_password = ''

    if netloc_username and parts.scheme != 'file' and scm_type not in ("insights", "archive"):
        netloc = u':'.join([urllib.parse.quote(x, safe='') for x in (netloc_username, netloc_password) if x])
@@ -9,6 +9,7 @@ import socket
from datetime import datetime

from dateutil.tz import tzutc
from django.utils.timezone import now
from django.core.serializers.json import DjangoJSONEncoder
from django.conf import settings

@@ -17,8 +18,15 @@ class TimeFormatter(logging.Formatter):
    '''
    Custom log formatter used for inventory imports
    '''
    def __init__(self, start_time=None, **kwargs):
        if start_time is None:
            self.job_start = now()
        else:
            self.job_start = start_time
        super(TimeFormatter, self).__init__(**kwargs)

    def format(self, record):
        record.relativeSeconds = record.relativeCreated / 1000.0
        record.relativeSeconds = (now() - self.job_start).total_seconds()
        return logging.Formatter.format(self, record)
@@ -7,6 +7,10 @@ import os.path

# Django
from django.conf import settings
from django.utils.timezone import now

# AWX
from awx.main.exceptions import PostRunError


class RSysLogHandler(logging.handlers.SysLogHandler):
@@ -40,6 +44,58 @@ class RSysLogHandler(logging.handlers.SysLogHandler):
        pass


class SpecialInventoryHandler(logging.Handler):
    """Logging handler used for the saving-to-database part of inventory updates
    run by the task system

    this dispatches events directly to be processed by the callback receiver,
    as opposed to ansible-runner
    """

    def __init__(self, event_handler, cancel_callback, job_timeout, verbosity,
                 start_time=None, counter=0, initial_line=0, **kwargs):
        self.event_handler = event_handler
        self.cancel_callback = cancel_callback
        self.job_timeout = job_timeout
        if start_time is None:
            self.job_start = now()
        else:
            self.job_start = start_time
        self.last_check = self.job_start
        self.counter = counter
        self.skip_level = [logging.WARNING, logging.INFO, logging.DEBUG, 0][verbosity]
        self._current_line = initial_line
        super(SpecialInventoryHandler, self).__init__(**kwargs)

    def emit(self, record):
        # check cancel and timeout status regardless of log level
        this_time = now()
        if (this_time - self.last_check).total_seconds() > 0.5:  # cancel callback is expensive
            self.last_check = this_time
            if self.cancel_callback():
                raise PostRunError('Inventory update has been canceled', status='canceled')
        if self.job_timeout and ((this_time - self.job_start).total_seconds() > self.job_timeout):
            raise PostRunError('Inventory update has timed out', status='canceled')

        # skip logging for low severity logs
        if record.levelno < self.skip_level:
            return

        self.counter += 1
        msg = self.format(record)
        n_lines = len(msg.strip().split('\n'))  # don't count line breaks at boundary of text
        dispatch_data = dict(
            created=now().isoformat(),
            event='verbose',
            counter=self.counter,
            stdout=msg,
            start_line=self._current_line,
            end_line=self._current_line + n_lines
        )
        self._current_line += n_lines

        self.event_handler(dispatch_data)


ColorHandler = logging.StreamHandler

if settings.COLOR_LOGS is True:
@@ -11,7 +11,7 @@ The Licenser class can do the following:

import base64
import configparser
from datetime import datetime
from datetime import datetime, timezone
import collections
import copy
import io
@@ -73,9 +73,12 @@ def validate_entitlement_manifest(data):

    buff.write(export)
    z = zipfile.ZipFile(buff)
    subs = []
    for f in z.filelist:
        if f.filename.startswith('export/entitlements') and f.filename.endswith('.json'):
            return json.loads(z.open(f).read())
            subs.append(json.loads(z.open(f).read()))
    if subs:
        return subs
    raise ValueError(_("Invalid manifest: manifest contains no subscriptions."))


@@ -131,21 +134,61 @@ class Licenser(object):


    def license_from_manifest(self, manifest):
        def is_appropriate_manifest_sub(sub):
            if sub['pool']['activeSubscription'] is False:
                return False
            now = datetime.now(timezone.utc)
            if parse_date(sub['startDate']) > now:
                return False
            if parse_date(sub['endDate']) < now:
                return False
            products = sub['pool']['providedProducts']
            if any(product.get('productId') == '480' for product in products):
                return True
            return False

        def _can_aggregate(sub, license):
            # We aggregate multiple subs into a larger meta-sub, if they match
            #
            # No current sub in aggregate
            if not license:
                return True
            # Same SKU type (SER vs MCT vs others)?
            if license['sku'][0:3] != sub['pool']['productId'][0:3]:
                return False
            return True

        # Parse output for subscription metadata to build config
        license = dict()
        license['sku'] = manifest['pool']['productId']
        try:
            license['instance_count'] = manifest['pool']['exported']
        except KeyError:
            license['instance_count'] = manifest['pool']['quantity']
        license['subscription_name'] = manifest['pool']['productName']
        license['pool_id'] = manifest['pool']['id']
        license['license_date'] = parse_date(manifest['endDate']).strftime('%s')
        license['product_name'] = manifest['pool']['productName']
        license['valid_key'] = True
        license['license_type'] = 'enterprise'
        license['satellite'] = False
        for sub in manifest:
            if not is_appropriate_manifest_sub(sub):
                logger.warning("Subscription %s (%s) in manifest is not active or for another product" %
                               (sub['pool']['productName'], sub['pool']['productId']))
                continue
            if not _can_aggregate(sub, license):
                logger.warning("Subscription %s (%s) in manifest does not match other manifest subscriptions" %
                               (sub['pool']['productName'], sub['pool']['productId']))
                continue

            license.setdefault('sku', sub['pool']['productId'])
            license.setdefault('subscription_name', sub['pool']['productName'])
            license.setdefault('pool_id', sub['pool']['id'])
            license.setdefault('product_name', sub['pool']['productName'])
            license.setdefault('valid_key', True)
            license.setdefault('license_type', 'enterprise')
            license.setdefault('satellite', False)
            # Use the nearest end date
            endDate = parse_date(sub['endDate'])
            currentEndDateStr = license.get('license_date', '4102462800')  # 2100-01-01
            currentEndDate = datetime.fromtimestamp(int(currentEndDateStr), timezone.utc)
            if endDate < currentEndDate:
                license['license_date'] = endDate.strftime('%s')
            instances = sub['quantity']
            license['instance_count'] = license.get('instance_count', 0) + instances
            license['subscription_name'] = re.sub(r'[\d]* Managed Nodes', '%d Managed Nodes' % license['instance_count'], license['subscription_name'])

        if not license:
            logger.error("No valid subscriptions found in manifest")
        self._attrs.update(license)
        settings.LICENSE = self._attrs
        return self._attrs
@@ -214,11 +257,15 @@ class Licenser(object):


    def get_satellite_subs(self, host, user, pw):
        port = None
        try:
            verify = str(self.config.get("rhsm", "repo_ca_cert"))
            port = str(self.config.get("server", "port"))
        except Exception as e:
            logger.exception('Unable to read rhsm config to get ca_cert location. {}'.format(str(e)))
            verify = getattr(settings, 'REDHAT_CANDLEPIN_VERIFY', True)
        if port:
            host = ':'.join([host, port])
        json = []
        try:
            orgs = requests.get(
@@ -272,7 +319,7 @@ class Licenser(object):
            return False
        # Products that contain Ansible Tower
        products = sub.get('providedProducts', [])
        if any(map(lambda product: product.get('productId', None) == "480", products)):
        if any(product.get('productId') == '480' for product in products):
            return True
        return False

@@ -373,10 +420,9 @@ class Licenser(object):
            current_instances = Host.objects.active_count()
        else:
            current_instances = 0
        available_instances = int(attrs.get('instance_count', None) or 0)
        instance_count = int(attrs.get('instance_count', 0))
        attrs['current_instances'] = current_instances
        attrs['available_instances'] = available_instances
        free_instances = (available_instances - current_instances)
        free_instances = (instance_count - current_instances)
        attrs['free_instances'] = max(0, free_instances)

        license_date = int(attrs.get('license_date', 0) or 0)
@@ -1,15 +0,0 @@
from __future__ import (absolute_import, division, print_function)
__metaclass__ = type

from ansible.plugins.action import ActionBase


class ActionModule(ActionBase):

    def run(self, tmp=None, task_vars=None):
        self._supports_check_mode = False
        result = super(ActionModule, self).run(tmp, task_vars)
        result['changed'] = result['failed'] = False
        result['msg'] = ''
        self._display.deprecated("Mercurial support is deprecated")
        return result

@@ -48,12 +48,6 @@
      tags:
        - update_git

  - block:
      - name: include hg tasks
        include_tasks: project_update_hg_tasks.yml
      tags:
        - update_hg

  - block:
      - name: update project using svn
        subversion:
@@ -150,7 +144,6 @@
        msg: "Repository Version {{ scm_version }}"
      tags:
        - update_git
        - update_hg
        - update_svn
        - update_insights
        - update_archive

@@ -1,20 +0,0 @@
---
- name: Mercurial support is deprecated.
  hg_deprecation:

- name: update project using hg
  hg:
    dest: "{{project_path|quote}}"
    repo: "{{scm_url|quote}}"
    revision: "{{scm_branch|quote}}"
    force: "{{scm_clean}}"
  register: hg_result

- name: Set the hg repository version
  set_fact:
    scm_version: "{{ hg_result['after'] }}"
  when: "'after' in hg_result"

- name: parse hg version string properly
  set_fact:
    scm_version: "{{scm_version|regex_replace('^([A-Za-z0-9]+).*$', '\\1')}}"
@@ -248,6 +248,7 @@ TEMPLATES = [
                'django.template.context_processors.static',
                'django.template.context_processors.tz',
                'django.contrib.messages.context_processors.messages',
                'awx.ui.context_processors.csp',
                'social_django.context_processors.backends',
                'social_django.context_processors.login_redirect',
            ],
@@ -661,7 +662,7 @@ INV_ENV_VARIABLE_BLOCKED = ("HOME", "USER", "_", "TERM")
# ----------------
EC2_ENABLED_VAR = 'ec2_state'
EC2_ENABLED_VALUE = 'running'
EC2_INSTANCE_ID_VAR = 'ec2_id'
EC2_INSTANCE_ID_VAR = 'instance_id'
EC2_EXCLUDE_EMPTY_GROUPS = True

# ------------

@@ -175,13 +175,6 @@ TEST_GIT_PUBLIC_HTTPS = 'https://github.com/ansible/ansible.github.com.git'
TEST_GIT_PRIVATE_HTTPS = 'https://github.com/ansible/product-docs.git'
TEST_GIT_PRIVATE_SSH = 'git@github.com:ansible/product-docs.git'

TEST_HG_USERNAME = ''
TEST_HG_PASSWORD = ''
TEST_HG_KEY_DATA = TEST_SSH_KEY_DATA
TEST_HG_PUBLIC_HTTPS = 'https://bitbucket.org/cchurch/django-hotrunner'
TEST_HG_PRIVATE_HTTPS = ''
TEST_HG_PRIVATE_SSH = ''

TEST_SVN_USERNAME = ''
TEST_SVN_PASSWORD = ''
TEST_SVN_PUBLIC_HTTPS = 'https://github.com/ansible/ansible.github.com'

@@ -38,7 +38,7 @@ if is_testing(sys.argv):
        },
    }
}


# AMQP configuration.
BROKER_URL = 'amqp://guest:guest@localhost:5672'

@@ -146,13 +146,6 @@ TEST_GIT_PUBLIC_HTTPS = 'https://github.com/ansible/ansible.github.com.git'
TEST_GIT_PRIVATE_HTTPS = 'https://github.com/ansible/product-docs.git'
TEST_GIT_PRIVATE_SSH = 'git@github.com:ansible/product-docs.git'

TEST_HG_USERNAME = ''
TEST_HG_PASSWORD = ''
TEST_HG_KEY_DATA = TEST_SSH_KEY_DATA
TEST_HG_PUBLIC_HTTPS = 'https://bitbucket.org/cchurch/django-hotrunner'
TEST_HG_PRIVATE_HTTPS = ''
TEST_HG_PRIVATE_SSH = ''

TEST_SVN_USERNAME = ''
TEST_SVN_PASSWORD = ''
TEST_SVN_PUBLIC_HTTPS = 'https://github.com/ansible/ansible.github.com'
@@ -445,7 +445,8 @@ class LDAPGroupTypeField(fields.ChoiceField, DependsOnMixin):

    default_error_messages = {
        'type_error': _('Expected an instance of LDAPGroupType but got {input_type} instead.'),
        'missing_parameters': _('Missing required parameters in {dependency}.')
        'missing_parameters': _('Missing required parameters in {dependency}.'),
        'invalid_parameters': _('Invalid group_type parameters. Expected instance of dict but got {parameters_type} instead.')
    }

    def __init__(self, choices=None, **kwargs):
@@ -465,7 +466,6 @@ class LDAPGroupTypeField(fields.ChoiceField, DependsOnMixin):
        if not data:
            return None

        params = self.get_depends_on() or {}
        cls = find_class_in_modules(data)
        if not cls:
            return None
@@ -475,8 +475,16 @@ class LDAPGroupTypeField(fields.ChoiceField, DependsOnMixin):
        # Backwards compatibility. Before AUTH_LDAP_GROUP_TYPE_PARAMS existed
        # MemberDNGroupType was the only group type, of the underlying lib, that
        # took a parameter.
        params = self.get_depends_on() or {}
        params_sanitized = dict()
        for attr in inspect.getargspec(cls.__init__).args[1:]:

        cls_args = inspect.getargspec(cls.__init__).args[1:]

        if cls_args:
            if not isinstance(params, dict):
                self.fail('invalid_parameters', parameters_type=type(params))

            for attr in cls_args:
                if attr in params:
                    params_sanitized[attr] = params[attr]


@@ -25,7 +25,7 @@ class BaseRedirectView(RedirectView):
    def get_redirect_url(self, *args, **kwargs):
        last_path = self.request.COOKIES.get('lastPath', '')
        last_path = urllib.parse.quote(urllib.parse.unquote(last_path).strip('"'))
        url = reverse('ui:index')
        url = reverse('ui_next:index')
        if last_path:
            return '%s#%s' % (url, last_path)
        else:

awx/ui/context_processors.py (new file, 8 lines)
@@ -0,0 +1,8 @@
import base64
import os


def csp(request):
    return {
        'csp_nonce': base64.encodebytes(os.urandom(32)).decode().rstrip(),
    }
@@ -6,4 +6,5 @@ coverage
build
node_modules
dist
images
instrumented

@@ -8,8 +8,8 @@
      "modules": true
    }
  },
  "plugins": ["react-hooks"],
  "extends": ["airbnb", "prettier", "prettier/react"],
  "plugins": ["react-hooks", "jsx-a11y"],
  "extends": ["airbnb", "prettier", "prettier/react", "plugin:jsx-a11y/strict"],
  "settings": {
    "react": {
      "version": "16.5.2"

@@ -1,4 +1,4 @@
FROM node:10
FROM node:14
ARG NPMRC_FILE=.npmrc
ENV NPMRC_FILE=${NPMRC_FILE}
ARG TARGET='https://awx:8043'

@@ -1,7 +1,7 @@
# AWX-PF

## Requirements
- node 10.x LTS, npm 6.x LTS, make, git
- node 14.x LTS, npm 6.x LTS, make, git

## Development
The API development server will need to be running. See [CONTRIBUTING.md](../../CONTRIBUTING.md).
@@ -15,6 +15,19 @@ npm --prefix=awx/ui_next install
npm --prefix=awx/ui_next start
```

### Build for the Development Containers
If you just want to build a ui for the container-based awx development
environment, use these make targets:

```shell
# The ui will be reachable at https://localhost:8043 or
# http://localhost:8013
make ui-devel

# clean up
make clean-ui
```

### Using an External Server
If you normally run awx on an external host/server (in this example, `awx.local`),
you'll need to use the `TARGET` environment variable when starting the ui development
awx/ui_next/docs/APP_ARCHITECTURE.md (new file, 27 lines)
@@ -0,0 +1,27 @@
# Application Architecture

## Local Storage Integration
The `useStorage` hook integrates with the browser's localStorage api.
It accepts a localStorage key as its only argument and returns a state
variable and setter function for that state variable. The hook enables
bidirectional data transfer between tabs via an event listener that
is registered with the Web Storage api.



The `useStorage` hook currently lives in the `AppContainer` component. It
can be relocated to a more general location should the need ever arise.
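The hook's core idea — a stored value whose writes notify every subscriber, much as a localStorage write notifies other tabs — can be sketched framework-free. This is an illustrative sketch only; `createStoredValue` and its shape are assumptions, not AWX's actual hook:

```javascript
// Framework-free sketch of the synchronization idea behind useStorage.
// In the real hook, React state stands in for `listeners`, and the
// browser's 'storage' event (fired in other tabs on localStorage
// writes) drives the cross-tab updates.
function createStoredValue(storage, key) {
  const listeners = new Set();
  return {
    get: () => storage.getItem(key),
    set: (val) => {
      storage.setItem(key, val);
      // Notify local subscribers, analogous to other tabs receiving
      // the Web Storage 'storage' event.
      listeners.forEach((fn) => fn(val));
    },
    subscribe: (fn) => {
      listeners.add(fn);
      return () => listeners.delete(fn); // unsubscribe function
    },
  };
}
```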
## Session Expiration
Session timeout state is communicated to the client in the HTTP(S)
response headers. Every HTTP(S) response is intercepted to read the
session expiration time before being passed into the rest of the
application. A timeout date is computed from the intercepted HTTP(S)
headers and is pushed into local storage, where it can be read using
standard Web Storage apis or other utilities, such as `useStorage`.
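The header-to-timestamp step itself is simple arithmetic; a hedged sketch (the function name is illustrative, not from the codebase):

```javascript
// Convert a 'session-timeout' response header (seconds remaining) into
// an absolute epoch-milliseconds expiration, as the response interceptor
// does before pushing it into local storage.
function sessionTimeoutDate(headers, nowMs) {
  const timeout = headers['session-timeout'];
  if (!timeout) {
    return null; // no header: leave the stored expiration untouched
  }
  return nowMs + Number(timeout) * 1000;
}
```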


BIN awx/ui_next/docs/images/sessionExpiration.png (new binary file, 70 KiB)
BIN awx/ui_next/docs/images/useStorage.png (new binary file, 57 KiB)
awx/ui_next/package-lock.json (generated, 8346 lines changed; diff suppressed because it is too large)
@@ -3,13 +3,14 @@
  "version": "0.1.0",
  "private": true,
  "engines": {
    "node": "10.x"
    "node": "14.x"
  },
  "dependencies": {
    "@lingui/react": "^2.9.1",
    "@patternfly/patternfly": "4.59.1",
    "@patternfly/react-core": "4.75.2",
    "@patternfly/react-icons": "4.7.16",
    "@patternfly/patternfly": "4.70.2",
    "@patternfly/react-core": "4.84.3",
    "@patternfly/react-icons": "4.7.22",
    "@patternfly/react-table": "^4.19.15",
    "ansi-to-html": "^0.6.11",
    "axios": "^0.18.1",
    "codemirror": "^5.47.0",
@@ -30,6 +31,7 @@
  },
  "devDependencies": {
    "@babel/polyfill": "^7.8.7",
    "@cypress/instrument-cra": "^1.4.0",
    "@lingui/cli": "^2.9.2",
    "@lingui/macro": "^2.9.1",
    "@nteract/mockument": "^1.0.4",
@@ -42,7 +44,7 @@
    "eslint-config-prettier": "^5.0.0",
    "eslint-import-resolver-webpack": "0.11.1",
    "eslint-plugin-import": "^2.14.0",
    "eslint-plugin-jsx-a11y": "^6.1.1",
    "eslint-plugin-jsx-a11y": "^6.4.1",
    "eslint-plugin-react": "^7.11.1",
    "eslint-plugin-react-hooks": "^2.2.0",
    "http-proxy-middleware": "^1.0.3",
@@ -53,7 +55,8 @@
  },
  "scripts": {
    "start": "PORT=3001 HTTPS=true DANGEROUSLY_DISABLE_HOST_CHECK=true react-scripts start",
    "build": "react-scripts build",
    "start-instrumented": "DEBUG=instrument-cra PORT=3001 HTTPS=true DANGEROUSLY_DISABLE_HOST_CHECK=true react-scripts -r @cypress/instrument-cra start",
    "build": "INLINE_RUNTIME_CHUNK=false react-scripts build",
    "test": "TZ='UTC' react-scripts test --coverage --watchAll=false",
    "test-watch": "TZ='UTC' react-scripts test",
    "eject": "react-scripts eject",

@@ -1,6 +1,15 @@
<!DOCTYPE html>
<html lang="en">
  <head>
    <% if (process.env.NODE_ENV === 'production') { %>
    <script nonce="{{ csp_nonce }}" type="text/javascript">
      window.NONCE_ID = '{{ csp_nonce }}';
    </script>
    <meta
      http-equiv="Content-Security-Policy"
      content="default-src 'self'; connect-src 'self' ws: wss:; style-src 'self' 'unsafe-inline'; script-src 'self' 'nonce-{{ csp_nonce }}' *.pendo.io; img-src 'self' *.pendo.io data:;"
    />
    <% } %>
    <meta charset="utf-8" />
    <meta name="viewport" content="width=device-width, initial-scale=1" />
    <meta name="theme-color" content="#000000" />
@@ -12,6 +21,10 @@
  </head>
  <body>
    <noscript>You need to enable JavaScript to run this app.</noscript>
    <div id="app" style="height: 100%"></div>
    <% if (process.env.NODE_ENV === 'production') { %>
    <style nonce="{{ csp_nonce }}">.app{height: 100%;}</style><div id="app" class="app"></div>
    <% } else { %>
    <div id="app" style="height: 100%"></div>
    <% } %>
  </body>
</html>

awx/ui_next/public/installing.html (new file, 25 lines)
@@ -0,0 +1,25 @@
<!DOCTYPE html>
<html lang="en-US">
  <head>
    <meta
      http-equiv="Content-Security-Policy"
      content="default-src 'self'; connect-src 'self' ws: wss:; style-src 'self' 'nonce-{{ csp_nonce }}'; script-src 'self' 'nonce-{{ csp_nonce }}' *.pendo.io; img-src 'self' *.pendo.io data:;"
    />
    <meta charset="utf-8">
    <meta http-equiv="X-UA-Compatible" content="IE=edge" />
    <meta name="viewport" content="width=device-width, initial-scale=1.0" />
    <script nonce="{{ csp_nonce }}">
      setInterval(function() {
        window.location = '/';
      }, 10000);
    </script>
  </head>
  <body>
    <div>
      <span>
        <p>AWX is installing.</p>
        <p>This page will refresh when complete.</p>
      </span>
    </div>
  </body>
</html>
@@ -30,7 +30,12 @@ const ProtectedRoute = ({ children, ...rest }) =>

function App() {
const catalogs = { en, ja };
const language = getLanguageWithoutRegionCode(navigator);
let language = getLanguageWithoutRegionCode(navigator);
if (!Object.keys(catalogs).includes(language)) {
// If there isn't a string catalog available for the browser's
// preferred language, default to one that has strings.
language = 'en';
}
const match = useRouteMatch();
const { hash, search, pathname } = useLocation();
@@ -1,5 +1,5 @@
import React from 'react';

import { act } from 'react-dom/test-utils';
import { mountWithContexts } from '../testUtils/enzymeHelpers';

import App from './App';
@@ -7,8 +7,11 @@ import App from './App';
jest.mock('./api');

describe('<App />', () => {
test('renders ok', () => {
const wrapper = mountWithContexts(<App />);
test('renders ok', async () => {
let wrapper;
await act(async () => {
wrapper = mountWithContexts(<App />);
});
expect(wrapper.length).toBe(1);
});
});
@@ -1,6 +1,13 @@
import axios from 'axios';

import { SESSION_TIMEOUT_KEY } from '../constants';
import { encodeQueryString } from '../util/qs';
import debounce from '../util/debounce';

const updateStorage = debounce((key, val) => {
window.localStorage.setItem(key, val);
window.dispatchEvent(new Event('storage'));
}, 500);

const defaultHttp = axios.create({
xsrfCookieName: 'csrftoken',
@@ -10,6 +17,15 @@ const defaultHttp = axios.create({
},
});

defaultHttp.interceptors.response.use(response => {
const timeout = response?.headers['session-timeout'];
if (timeout) {
const timeoutDate = new Date().getTime() + timeout * 1000;
updateStorage(SESSION_TIMEOUT_KEY, String(timeoutDate));
}
return response;
});

class Base {
constructor(http = defaultHttp, baseURL) {
this.http = http;
@@ -1,5 +1,6 @@
import AdHocCommands from './models/AdHocCommands';
import Applications from './models/Applications';
import Auth from './models/Auth';
import Config from './models/Config';
import CredentialInputSources from './models/CredentialInputSources';
import CredentialTypes from './models/CredentialTypes';
@@ -32,6 +33,7 @@ import Tokens from './models/Tokens';
import UnifiedJobTemplates from './models/UnifiedJobTemplates';
import UnifiedJobs from './models/UnifiedJobs';
import Users from './models/Users';
import WorkflowApprovals from './models/WorkflowApprovals';
import WorkflowApprovalTemplates from './models/WorkflowApprovalTemplates';
import WorkflowJobTemplateNodes from './models/WorkflowJobTemplateNodes';
import WorkflowJobTemplates from './models/WorkflowJobTemplates';
@@ -39,6 +41,7 @@ import WorkflowJobs from './models/WorkflowJobs';

const AdHocCommandsAPI = new AdHocCommands();
const ApplicationsAPI = new Applications();
const AuthAPI = new Auth();
const ConfigAPI = new Config();
const CredentialInputSourcesAPI = new CredentialInputSources();
const CredentialTypesAPI = new CredentialTypes();
@@ -71,6 +74,7 @@ const TokensAPI = new Tokens();
const UnifiedJobTemplatesAPI = new UnifiedJobTemplates();
const UnifiedJobsAPI = new UnifiedJobs();
const UsersAPI = new Users();
const WorkflowApprovalsAPI = new WorkflowApprovals();
const WorkflowApprovalTemplatesAPI = new WorkflowApprovalTemplates();
const WorkflowJobTemplateNodesAPI = new WorkflowJobTemplateNodes();
const WorkflowJobTemplatesAPI = new WorkflowJobTemplates();
@@ -79,6 +83,7 @@ const WorkflowJobsAPI = new WorkflowJobs();
export {
AdHocCommandsAPI,
ApplicationsAPI,
AuthAPI,
ConfigAPI,
CredentialInputSourcesAPI,
CredentialTypesAPI,
@@ -111,6 +116,7 @@ export {
UnifiedJobTemplatesAPI,
UnifiedJobsAPI,
UsersAPI,
WorkflowApprovalsAPI,
WorkflowApprovalTemplatesAPI,
WorkflowJobTemplateNodesAPI,
WorkflowJobTemplatesAPI,
10
awx/ui_next/src/api/models/Auth.js
Normal file
@@ -0,0 +1,10 @@
import Base from '../Base';

class Auth extends Base {
constructor(http) {
super(http);
this.baseUrl = '/api/v2/auth/';
}
}

export default Auth;
18
awx/ui_next/src/api/models/WorkflowApprovals.js
Normal file
@@ -0,0 +1,18 @@
import Base from '../Base';

class WorkflowApprovals extends Base {
constructor(http) {
super(http);
this.baseUrl = '/api/v2/workflow_approvals/';
}

approve(id) {
return this.http.post(`${this.baseUrl}${id}/approve/`);
}

deny(id) {
return this.http.post(`${this.baseUrl}${id}/deny/`);
}
}

export default WorkflowApprovals;
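As a quick sanity check of the new model above, the approve/deny URL building can be exercised against a stub transport. The `FakeHttp` recorder below is illustrative only (it is not part of the AWX codebase), and the model class is re-declared standalone under the assumption that `Base` contributes nothing beyond storing `http`:

```javascript
// Minimal sketch of the WorkflowApprovals endpoint paths, using a fake
// transport that records every POST instead of hitting the API.
class FakeHttp {
  constructor() {
    this.calls = []; // each entry: { url, data }
  }
  post(url, data) {
    this.calls.push({ url, data });
    return Promise.resolve({ data: {} });
  }
}

class WorkflowApprovals {
  constructor(http) {
    this.http = http;
    this.baseUrl = '/api/v2/workflow_approvals/';
  }
  approve(id) {
    return this.http.post(`${this.baseUrl}${id}/approve/`);
  }
  deny(id) {
    return this.http.post(`${this.baseUrl}${id}/deny/`);
  }
}

const http = new FakeHttp();
const api = new WorkflowApprovals(http);
api.approve(42);
api.deny(7);
console.log(http.calls.map(c => c.url));
```

The recorded URLs show the trailing-slash convention the AWX v2 API expects on detail actions.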
@@ -55,6 +55,19 @@ class WorkflowJobTemplateNodes extends Base {
readCredentials(id) {
return this.http.get(`${this.baseUrl}${id}/credentials/`);
}

associateCredentials(id, credentialId) {
return this.http.post(`${this.baseUrl}${id}/credentials/`, {
id: credentialId,
});
}

disassociateCredentials(id, credentialId) {
return this.http.post(`${this.baseUrl}${id}/credentials/`, {
id: credentialId,
disassociate: true,
});
}
}

export default WorkflowJobTemplateNodes;
@@ -12,8 +12,8 @@ import {
import { BrandName } from '../../variables';
import brandLogoImg from './brand-logo.svg';

class About extends React.Component {
static createSpeechBubble(version) {
function About({ ansible_version, version, isOpen, onClose, i18n }) {
const createSpeechBubble = () => {
let text = `${BrandName} ${version}`;
let top = '';
let bottom = '';
@@ -28,31 +28,22 @@ class About extends React.Component {
bottom = ` --${bottom}-- `;

return top + text + bottom;
}
};

constructor(props) {
super(props);
const speechBubble = createSpeechBubble();

this.createSpeechBubble = this.constructor.createSpeechBubble.bind(this);
}

render() {
const { ansible_version, version, isOpen, onClose, i18n } = this.props;

const speechBubble = this.createSpeechBubble(version);

return (
<AboutModal
isOpen={isOpen}
onClose={onClose}
productName={`Ansible ${BrandName}`}
trademark={i18n._(t`Copyright 2019 Red Hat, Inc.`)}
brandImageSrc={brandLogoImg}
brandImageAlt={i18n._(t`Brand Image`)}
>
<pre>
{speechBubble}
{`
return (
<AboutModal
isOpen={isOpen}
onClose={onClose}
productName={`Ansible ${BrandName}`}
trademark={i18n._(t`Copyright 2019 Red Hat, Inc.`)}
brandImageSrc={brandLogoImg}
brandImageAlt={i18n._(t`Brand Image`)}
>
<pre>
{speechBubble}
{`
\\
\\ ^__^
(oo)\\_______
@@ -60,18 +51,17 @@ class About extends React.Component {
||----w |
|| ||
`}
</pre>
<TextContent>
<TextList component="dl">
<TextListItem component="dt">
{i18n._(t`Ansible Version`)}
</TextListItem>
<TextListItem component="dd">{ansible_version}</TextListItem>
</TextList>
</TextContent>
</AboutModal>
);
}
</pre>
<TextContent>
<TextList component="dl">
<TextListItem component="dt">
{i18n._(t`Ansible Version`)}
</TextListItem>
<TextListItem component="dd">{ansible_version}</TextListItem>
</TextList>
</TextContent>
</AboutModal>
);
}

About.propTypes = {
@@ -1,6 +1,7 @@
import React, { useEffect, useState, useCallback } from 'react';
import React, { useEffect, useState, useCallback, useRef } from 'react';
import { useHistory, useLocation, withRouter } from 'react-router-dom';
import {
Button,
Nav,
NavList,
Page,
@@ -13,6 +14,8 @@ import styled from 'styled-components';

import { ConfigAPI, MeAPI, RootAPI } from '../../api';
import { ConfigProvider } from '../../contexts/Config';
import { SESSION_TIMEOUT_KEY } from '../../constants';
import { isAuthenticated } from '../../util/auth';
import About from '../About';
import AlertModal from '../AlertModal';
import ErrorDetail from '../ErrorDetail';
@@ -20,6 +23,17 @@ import BrandLogo from './BrandLogo';
import NavExpandableGroup from './NavExpandableGroup';
import PageHeaderToolbar from './PageHeaderToolbar';

// The maximum supported timeout for setTimeout(), in milliseconds,
// is the highest number you can represent as a signed 32bit
// integer (approximately 25 days)
const MAX_TIMEOUT = 2 ** (32 - 1) - 1;

// The number of seconds the session timeout warning is displayed
// before the user is logged out. Increasing this number (up to
// the total session time, which is 1800s by default) will cause
// the session timeout warning to display sooner.
const SESSION_WARNING_DURATION = 10;

const PageHeader = styled(PFPageHeader)`
& .pf-c-page__header-brand-link {
color: inherit;
@@ -30,6 +44,45 @@ const PageHeader = styled(PFPageHeader)`
}
`;

/**
* The useStorage hook integrates with the browser's localStorage api.
* It accepts a storage key as its only argument and returns a state
* variable and setter function for that state variable.
*
* This utility behaves much like the standard useState hook with some
* key differences:
* 1. You don't pass it an initial value. Instead, the provided key
* is used to retrieve the initial value from local storage. If
* the key doesn't exist in local storage, null is returned.
* 2. Behind the scenes, this hook registers an event listener with
* the Web Storage api to establish a two-way binding between the
* state variable and its corresponding local storage value. This
* means that updates to the state variable with the setter
* function will produce a corresponding update to the local
* storage value and vice-versa.
* 3. When local storage is shared across browser tabs, the data
* binding is also shared across browser tabs. This means that
* updates to the state variable using the setter function on
* one tab will also update the state variable on any other tab
* using this hook with the same key and vice-versa.
*/
function useStorage(key) {
const [storageVal, setStorageVal] = useState(
window.localStorage.getItem(key)
);
window.addEventListener('storage', () => {
const newVal = window.localStorage.getItem(key);
if (newVal !== storageVal) {
setStorageVal(newVal);
}
});
const setValue = val => {
window.localStorage.setItem(key, val);
setStorageVal(val);
};
return [storageVal, setValue];
}

function AppContainer({ i18n, navRouteConfig = [], children }) {
const history = useHistory();
const { pathname } = useLocation();
@@ -38,14 +91,51 @@ function AppContainer({ i18n, navRouteConfig = [], children }) {
const [isAboutModalOpen, setIsAboutModalOpen] = useState(false);
const [isReady, setIsReady] = useState(false);

const sessionTimeoutId = useRef();
const sessionIntervalId = useRef();
const [sessionTimeout, setSessionTimeout] = useStorage(SESSION_TIMEOUT_KEY);
const [timeoutWarning, setTimeoutWarning] = useState(false);
const [timeRemaining, setTimeRemaining] = useState(null);

const handleAboutModalOpen = () => setIsAboutModalOpen(true);
const handleAboutModalClose = () => setIsAboutModalOpen(false);
const handleConfigErrorClose = () => setConfigError(null);
const handleSessionTimeout = () => setTimeoutWarning(true);

const handleLogout = useCallback(async () => {
await RootAPI.logout();
history.replace('/login');
}, [history]);
setSessionTimeout(null);
}, [setSessionTimeout]);

const handleSessionContinue = () => {
MeAPI.read();
setTimeoutWarning(false);
};

useEffect(() => {
if (!isAuthenticated(document.cookie)) history.replace('/login');
const calcRemaining = () =>
parseInt(sessionTimeout, 10) - new Date().getTime();
const updateRemaining = () => setTimeRemaining(calcRemaining());
setTimeoutWarning(false);
clearTimeout(sessionTimeoutId.current);
clearInterval(sessionIntervalId.current);
sessionTimeoutId.current = setTimeout(
handleSessionTimeout,
Math.min(calcRemaining() - SESSION_WARNING_DURATION * 1000, MAX_TIMEOUT)
);
sessionIntervalId.current = setInterval(updateRemaining, 1000);
return () => {
clearTimeout(sessionTimeoutId.current);
clearInterval(sessionIntervalId.current);
};
}, [history, sessionTimeout]);

useEffect(() => {
if (timeRemaining !== null && timeRemaining <= 1) {
handleLogout();
}
}, [handleLogout, timeRemaining]);

useEffect(() => {
const loadConfig = async () => {
@@ -128,6 +218,31 @@
{i18n._(t`Failed to retrieve configuration.`)}
<ErrorDetail error={configError} />
</AlertModal>
<AlertModal
title={i18n._(t`Your session is about to expire`)}
isOpen={timeoutWarning && sessionTimeout > 0 && timeRemaining !== null}
onClose={handleLogout}
showClose={false}
variant="warning"
actions={[
<Button
key="confirm"
variant="primary"
onClick={handleSessionContinue}
>
{i18n._(t`Continue`)}
</Button>,
<Button key="logout" variant="secondary" onClick={handleLogout}>
{i18n._(t`Logout`)}
</Button>,
]}
>
{i18n._(
t`You will be logged out in ${Number(
Math.max(Math.floor(timeRemaining / 1000), 0)
)} seconds due to inactivity.`
)}
</AlertModal>
</>
);
}
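The session-timeout arithmetic in the AppContainer diff above (clamping the warning delay to `setTimeout`'s 32-bit limit, and never displaying negative seconds) can be checked in isolation. The sketch below re-declares the constants and pulls the three expressions out as plain functions; the function names and the fixed `now` value are illustrative, not part of the component code:

```javascript
// Standalone sketch of the session-timeout math used by AppContainer.
const MAX_TIMEOUT = 2 ** (32 - 1) - 1; // setTimeout's signed 32-bit ceiling (~24.8 days)
const SESSION_WARNING_DURATION = 10; // seconds of warning shown before logout

// Milliseconds until the session expires, given the expiry timestamp
// string read from localStorage and the current clock time.
function calcRemaining(sessionTimeout, now) {
  return parseInt(sessionTimeout, 10) - now;
}

// Delay before the warning modal fires, clamped so setTimeout never
// receives a value larger than it can represent.
function warningDelay(remainingMs) {
  return Math.min(remainingMs - SESSION_WARNING_DURATION * 1000, MAX_TIMEOUT);
}

// Whole seconds shown in the warning modal, floored at zero.
function displaySeconds(remainingMs) {
  return Math.max(Math.floor(remainingMs / 1000), 0);
}

const now = 1000000;
const remaining = calcRemaining(String(now + 30000), now); // expires in 30s
console.log(remaining, warningDelay(remaining), displaySeconds(remaining));
```

With a 30-second session remaining, the warning is scheduled 10 seconds before expiry, matching `SESSION_WARNING_DURATION`.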
@@ -1,4 +1,4 @@
import React, { Component } from 'react';
import React, { useState } from 'react';
import PropTypes from 'prop-types';
import { withI18n } from '@lingui/react';
import { t } from '@lingui/macro';
@@ -17,129 +17,100 @@ import { QuestionCircleIcon, UserIcon } from '@patternfly/react-icons';
const DOCLINK =
'https://docs.ansible.com/ansible-tower/latest/html/userguide/index.html';

class PageHeaderToolbar extends Component {
constructor(props) {
super(props);
this.state = {
isHelpOpen: false,
isUserOpen: false,
};
function PageHeaderToolbar({
isAboutDisabled,
onAboutClick,
onLogoutClick,
loggedInUser,
i18n,
}) {
const [isHelpOpen, setIsHelpOpen] = useState(false);
const [isUserOpen, setIsUserOpen] = useState(false);

this.handleHelpSelect = this.handleHelpSelect.bind(this);
this.handleHelpToggle = this.handleHelpToggle.bind(this);
this.handleUserSelect = this.handleUserSelect.bind(this);
this.handleUserToggle = this.handleUserToggle.bind(this);
}
const handleHelpSelect = () => {
setIsHelpOpen(!isHelpOpen);
};

handleHelpSelect() {
const { isHelpOpen } = this.state;
const handleUserSelect = () => {
setIsUserOpen(!isUserOpen);
};

this.setState({ isHelpOpen: !isHelpOpen });
}

handleUserSelect() {
const { isUserOpen } = this.state;

this.setState({ isUserOpen: !isUserOpen });
}

handleHelpToggle(isOpen) {
this.setState({ isHelpOpen: isOpen });
}

handleUserToggle(isOpen) {
this.setState({ isUserOpen: isOpen });
}

render() {
const { isHelpOpen, isUserOpen } = this.state;
const {
isAboutDisabled,
onAboutClick,
onLogoutClick,
loggedInUser,
i18n,
} = this.props;

return (
<PageHeaderTools>
<PageHeaderToolsGroup>
<Tooltip position="left" content={<div>{i18n._(t`Info`)}</div>}>
<PageHeaderToolsItem>
<Dropdown
isPlain
isOpen={isHelpOpen}
position={DropdownPosition.right}
onSelect={this.handleHelpSelect}
toggle={
<DropdownToggle
onToggle={this.handleHelpToggle}
aria-label={i18n._(t`Info`)}
>
<QuestionCircleIcon />
</DropdownToggle>
}
dropdownItems={[
<DropdownItem key="help" target="_blank" href={DOCLINK}>
{i18n._(t`Help`)}
</DropdownItem>,
<DropdownItem
key="about"
component="button"
isDisabled={isAboutDisabled}
onClick={onAboutClick}
>
{i18n._(t`About`)}
</DropdownItem>,
]}
/>
</PageHeaderToolsItem>
</Tooltip>
<Tooltip position="left" content={<div>{i18n._(t`User`)}</div>}>
<PageHeaderToolsItem>
<Dropdown
id="toolbar-user-dropdown"
isPlain
isOpen={isUserOpen}
position={DropdownPosition.right}
onSelect={this.handleUserSelect}
toggle={
<DropdownToggle onToggle={this.handleUserToggle}>
<UserIcon />
{loggedInUser && (
<span style={{ marginLeft: '10px' }}>
{loggedInUser.username}
</span>
)}
</DropdownToggle>
}
dropdownItems={[
<DropdownItem
key="user"
href={
loggedInUser
? `/users/${loggedInUser.id}/details`
: '/home'
}
>
{i18n._(t`User Details`)}
</DropdownItem>,
<DropdownItem
key="logout"
component="button"
onClick={onLogoutClick}
id="logout-button"
>
{i18n._(t`Logout`)}
</DropdownItem>,
]}
/>
</PageHeaderToolsItem>
</Tooltip>
</PageHeaderToolsGroup>
</PageHeaderTools>
);
}
return (
<PageHeaderTools>
<PageHeaderToolsGroup>
<Tooltip position="left" content={<div>{i18n._(t`Info`)}</div>}>
<PageHeaderToolsItem>
<Dropdown
isPlain
isOpen={isHelpOpen}
position={DropdownPosition.right}
onSelect={handleHelpSelect}
toggle={
<DropdownToggle
onToggle={setIsHelpOpen}
aria-label={i18n._(t`Info`)}
>
<QuestionCircleIcon />
</DropdownToggle>
}
dropdownItems={[
<DropdownItem key="help" target="_blank" href={DOCLINK}>
{i18n._(t`Help`)}
</DropdownItem>,
<DropdownItem
key="about"
component="button"
isDisabled={isAboutDisabled}
onClick={onAboutClick}
>
{i18n._(t`About`)}
</DropdownItem>,
]}
/>
</PageHeaderToolsItem>
</Tooltip>
<Tooltip position="left" content={<div>{i18n._(t`User`)}</div>}>
<PageHeaderToolsItem>
<Dropdown
id="toolbar-user-dropdown"
isPlain
isOpen={isUserOpen}
position={DropdownPosition.right}
onSelect={handleUserSelect}
toggle={
<DropdownToggle onToggle={setIsUserOpen}>
<UserIcon />
{loggedInUser && (
<span style={{ marginLeft: '10px' }}>
{loggedInUser.username}
</span>
)}
</DropdownToggle>
}
dropdownItems={[
<DropdownItem
key="user"
href={
loggedInUser ? `/users/${loggedInUser.id}/details` : '/home'
}
>
{i18n._(t`User Details`)}
</DropdownItem>,
<DropdownItem
key="logout"
component="button"
onClick={onLogoutClick}
id="logout-button"
>
{i18n._(t`Logout`)}
</DropdownItem>,
]}
/>
</PageHeaderToolsItem>
</Tooltip>
</PageHeaderToolsGroup>
</PageHeaderTools>
);
}

PageHeaderToolbar.propTypes = {
@@ -34,7 +34,7 @@ function CopyButton({
<>
<Tooltip content={helperText.tooltip} position="top">
<Button
isDisabled={isDisabled}
isDisabled={isLoading || isDisabled}
aria-label={i18n._(t`Copy`)}
variant="plain"
onClick={copyItemToAPI}
@@ -93,9 +93,11 @@ function DataListToolbar({
onRemove={onRemove}
/>
</ToolbarItem>
<ToolbarItem>
<Sort qsConfig={qsConfig} columns={sortColumns} onSort={onSort} />
</ToolbarItem>
{sortColumns && (
<ToolbarItem>
<Sort qsConfig={qsConfig} columns={sortColumns} onSort={onSort} />
</ToolbarItem>
)}
</ToolbarToggleGroup>
{showExpandCollapse && (
<ToolbarGroup>
@@ -157,7 +159,7 @@ DataListToolbar.propTypes = {
searchColumns: SearchColumns.isRequired,
searchableKeys: PropTypes.arrayOf(PropTypes.string),
relatedSearchableKeys: PropTypes.arrayOf(PropTypes.string),
sortColumns: SortColumns.isRequired,
sortColumns: SortColumns,
showSelectAll: PropTypes.bool,
isAllSelected: PropTypes.bool,
isCompact: PropTypes.bool,
@@ -174,6 +176,7 @@ DataListToolbar.defaultProps = {
itemCount: 0,
searchableKeys: [],
relatedSearchableKeys: [],
sortColumns: null,
clearAllFilters: null,
showSelectAll: false,
isAllSelected: false,
@@ -1,5 +1,5 @@
import React, { useState, useEffect, useContext } from 'react';
import { arrayOf, func, object, string } from 'prop-types';
import { arrayOf, func, shape, string, oneOfType, number } from 'prop-types';
import { withI18n } from '@lingui/react';
import { t } from '@lingui/macro';
import { Button, Tooltip, DropdownItem } from '@patternfly/react-core';
@@ -149,7 +149,20 @@ DisassociateButton.defaultProps = {
};

DisassociateButton.propTypes = {
itemsToDisassociate: arrayOf(object),
itemsToDisassociate: oneOfType([
arrayOf(
shape({
id: number.isRequired,
name: string.isRequired,
})
),
arrayOf(
shape({
id: number.isRequired,
hostname: string.isRequired,
})
),
]),
modalNote: string,
modalTitle: string,
onDisassociate: func.isRequired,
@@ -1,4 +1,4 @@
import React, { Component, Fragment } from 'react';
import React, { useState, Fragment } from 'react';
import PropTypes from 'prop-types';
import styled from 'styled-components';
import { withI18n } from '@lingui/react';
@@ -32,27 +32,15 @@ const Expandable = styled(PFExpandable)`
}
`;

class ErrorDetail extends Component {
constructor(props) {
super(props);
function ErrorDetail({ error, i18n }) {
const { response } = error;
const [isExpanded, setIsExpanded] = useState(false);

this.state = {
isExpanded: false,
};
const handleToggle = () => {
setIsExpanded(!isExpanded);
};

this.handleToggle = this.handleToggle.bind(this);
this.renderNetworkError = this.renderNetworkError.bind(this);
this.renderStack = this.renderStack.bind(this);
}

handleToggle() {
const { isExpanded } = this.state;
this.setState({ isExpanded: !isExpanded });
}

renderNetworkError() {
const { error } = this.props;
const { response } = error;
const renderNetworkError = () => {
const message = getErrorMessage(response);

return (
@@ -74,31 +62,25 @@ class ErrorDetail extends Component {
</CardBody>
</Fragment>
);
}
};

renderStack() {
const { error } = this.props;
const renderStack = () => {
return <CardBody>{error.stack}</CardBody>;
}
};

render() {
const { isExpanded } = this.state;
const { error, i18n } = this.props;

return (
<Expandable
toggleText={i18n._(t`Details`)}
onToggle={this.handleToggle}
isExpanded={isExpanded}
>
<Card>
{Object.prototype.hasOwnProperty.call(error, 'response')
? this.renderNetworkError()
: this.renderStack()}
</Card>
</Expandable>
);
}
return (
<Expandable
toggleText={i18n._(t`Details`)}
onToggle={handleToggle}
isExpanded={isExpanded}
>
<Card>
{Object.prototype.hasOwnProperty.call(error, 'response')
? renderNetworkError()
: renderStack()}
</Card>
</Expandable>
);
}

ErrorDetail.propTypes = {
@@ -1,4 +1,5 @@
import React from 'react';
import { act } from 'react-dom/test-utils';
import { mountWithContexts } from '../../../testUtils/enzymeHelpers';

import ErrorDetail from './ErrorDetail';
@@ -39,7 +40,7 @@ describe('ErrorDetail', () => {
}
/>
);
wrapper.find('ExpandableSection').prop('onToggle')();
act(() => wrapper.find('ExpandableSection').prop('onToggle')());
wrapper.update();
});
});
@@ -7,8 +7,15 @@ import { KebabifiedContext } from '../../contexts/Kebabified';
import AlertModal from '../AlertModal';
import { Job } from '../../types';

function cannotCancel(job) {
return !job.summary_fields.user_capabilities.start;
function cannotCancelBecausePermissions(job) {
return (
!job.summary_fields.user_capabilities.start &&
['pending', 'waiting', 'running'].includes(job.status)
);
}

function cannotCancelBecauseNotRunning(job) {
return !['pending', 'waiting', 'running'].includes(job.status);
}

function JobListCancelButton({ i18n, jobsToCancel, onCancel }) {
@@ -33,20 +40,40 @@ function JobListCancelButton({ i18n, jobsToCancel, onCancel }) {
}, [isKebabified, isModalOpen, onKebabModalChange]);

const renderTooltip = () => {
const jobsUnableToCancel = jobsToCancel
.filter(cannotCancel)
const cannotCancelPermissions = jobsToCancel
.filter(cannotCancelBecausePermissions)
.map(job => job.name);
const numJobsUnableToCancel = jobsUnableToCancel.length;
const cannotCancelNotRunning = jobsToCancel
.filter(cannotCancelBecauseNotRunning)
.map(job => job.name);
const numJobsUnableToCancel = cannotCancelPermissions.concat(
cannotCancelNotRunning
).length;
if (numJobsUnableToCancel > 0) {
return (
<div>
{i18n._(
'{numJobsUnableToCancel, plural, one {You do not have permission to cancel the following job:} other {You do not have permission to cancel the following jobs:}}',
{
numJobsUnableToCancel,
}
{cannotCancelPermissions.length > 0 && (
<div>
{i18n._(
'{numJobsUnableToCancel, plural, one {You do not have permission to cancel the following job:} other {You do not have permission to cancel the following jobs:}}',
{
numJobsUnableToCancel: cannotCancelPermissions.length,
}
)}
{' '.concat(cannotCancelPermissions.join(', '))}
</div>
)}
{cannotCancelNotRunning.length > 0 && (
<div>
{i18n._(
'{numJobsUnableToCancel, plural, one {You cannot cancel the following job because it is not running:} other {You cannot cancel the following jobs because they are not running:}}',
{
numJobsUnableToCancel: cannotCancelNotRunning.length,
}
)}
{' '.concat(cannotCancelNotRunning.join(', '))}
</div>
)}
{' '.concat(jobsUnableToCancel.join(', '))}
</div>
);
}
@@ -62,7 +89,9 @@ function JobListCancelButton({ i18n, jobsToCancel, onCancel }) {
};

const isDisabled =
jobsToCancel.length === 0 || jobsToCancel.some(cannotCancel);
jobsToCancel.length === 0 ||
jobsToCancel.some(cannotCancelBecausePermissions) ||
jobsToCancel.some(cannotCancelBecauseNotRunning);

const cancelJobText = i18n._(
'{zeroOrOneJobSelected, plural, one {Cancel job} other {Cancel jobs}}',
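The split of the old `cannotCancel` guard into two predicates above can be exercised standalone. The sample job objects below are minimal stand-ins for the API shape (only `status` and `summary_fields.user_capabilities.start` are populated):

```javascript
// Standalone copies of the two cancel guards from the diff above.
const ACTIVE_STATUSES = ['pending', 'waiting', 'running'];

function cannotCancelBecausePermissions(job) {
  // Only an active job can be blocked purely on permissions.
  return (
    !job.summary_fields.user_capabilities.start &&
    ACTIVE_STATUSES.includes(job.status)
  );
}

function cannotCancelBecauseNotRunning(job) {
  return !ACTIVE_STATUSES.includes(job.status);
}

const running = {
  status: 'running',
  summary_fields: { user_capabilities: { start: false } },
};
const finished = {
  status: 'successful',
  summary_fields: { user_capabilities: { start: true } },
};

console.log(
  cannotCancelBecausePermissions(running), // running but no start permission
  cannotCancelBecauseNotRunning(finished) // already finished
);
```

Keeping the two reasons separate is what lets the tooltip render a distinct message list for each group.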
@@ -30,6 +30,29 @@ describe('<JobListCancelButton />', () => {
start: false,
},
},
status: 'running',
},
]}
/>
);
expect(wrapper.find('JobListCancelButton button').props().disabled).toBe(
true
);
});
test('should be disabled when selected job is not running', () => {
wrapper = mountWithContexts(
<JobListCancelButton
jobsToCancel={[
{
id: 1,
name: 'some job',
summary_fields: {
user_capabilities: {
delete: false,
start: false,
},
},
status: 'successful',
},
]}
/>
@@ -51,6 +74,7 @@ describe('<JobListCancelButton />', () => {
start: true,
},
},
status: 'running',
},
]}
/>
@@ -73,6 +97,7 @@ describe('<JobListCancelButton />', () => {
start: true,
},
},
status: 'running',
},
]}
onCancel={onCancel}
@@ -39,7 +39,7 @@ function JobListItem({
project_update: i18n._(t`Source Control Update`),
inventory_update: i18n._(t`Inventory Sync`),
job: i18n._(t`Playbook Run`),
command: i18n._(t`Command`),
ad_hoc_command: i18n._(t`Command`),
management_job: i18n._(t`Management Job`),
workflow_job: i18n._(t`Workflow Job`),
};
@@ -44,7 +44,7 @@ describe('useWsJobs hook', () => {

test('should establish websocket connection', async () => {
global.document.cookie = 'csrftoken=abc123';
const mockServer = new WS('wss://localhost/websocket/');
const mockServer = new WS('ws://localhost/websocket/');

const jobs = [{ id: 1 }];
await act(async () => {
@@ -67,7 +67,7 @@ describe('useWsJobs hook', () => {

test('should update job status', async () => {
global.document.cookie = 'csrftoken=abc123';
const mockServer = new WS('wss://localhost/websocket/');
const mockServer = new WS('ws://localhost/websocket/');

const jobs = [{ id: 1, status: 'running' }];
await act(async () => {
@@ -105,7 +105,7 @@ describe('useWsJobs hook', () => {

test('should fetch new job', async () => {
global.document.cookie = 'csrftoken=abc123';
const mockServer = new WS('wss://localhost/websocket/');
const mockServer = new WS('ws://localhost/websocket/');
const jobs = [{ id: 1 }];
const fetch = jest.fn(() => []);
await act(async () => {

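The hunks above swap the mocked endpoint from `wss://` to `ws://`. As an illustration of why both schemes appear, here is a hedged sketch of how a client might derive the WebSocket scheme from the page protocol; the function name and `/websocket/` path are assumptions for this example, not AWX's actual implementation:

```javascript
// Hypothetical helper: derive a websocket URL from the page's location.
// https: pages map to wss:, plain http: maps to ws:.
function websocketURL(location) {
  const scheme = location.protocol === 'https:' ? 'wss:' : 'ws:';
  return `${scheme}//${location.host}/websocket/`;
}

console.log(websocketURL({ protocol: 'http:', host: 'localhost' }));
```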
@@ -44,6 +44,7 @@ class LaunchButton extends React.Component {
showLaunchPrompt: false,
launchConfig: null,
launchError: false,
surveyConfig: null,
};

this.handleLaunch = this.handleLaunch.bind(this);
@@ -67,15 +68,28 @@ class LaunchButton extends React.Component {
resource.type === 'workflow_job_template'
? WorkflowJobTemplatesAPI.readLaunch(resource.id)
: JobTemplatesAPI.readLaunch(resource.id);
const readSurvey =
resource.type === 'workflow_job_template'
? WorkflowJobTemplatesAPI.readSurvey(resource.id)
: JobTemplatesAPI.readSurvey(resource.id);
try {
const { data: launchConfig } = await readLaunch;

let surveyConfig = null;

if (launchConfig.survey_enabled) {
const { data } = await readSurvey;

surveyConfig = data;
}

if (canLaunchWithoutPrompt(launchConfig)) {
this.launchWithParams({});
} else {
this.setState({
showLaunchPrompt: true,
launchConfig,
surveyConfig,
});
}
} catch (err) {
@@ -151,7 +165,12 @@ class LaunchButton extends React.Component {
}

render() {
const { launchError, showLaunchPrompt, launchConfig } = this.state;
const {
launchError,
showLaunchPrompt,
launchConfig,
surveyConfig,
} = this.state;
const { resource, i18n, children } = this.props;
return (
<Fragment>
@@ -172,7 +191,8 @@ class LaunchButton extends React.Component {
)}
{showLaunchPrompt && (
<LaunchPrompt
config={launchConfig}
launchConfig={launchConfig}
surveyConfig={surveyConfig}
resource={resource}
onLaunch={this.launchWithParams}
onCancel={() => this.setState({ showLaunchPrompt: false })}

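The control flow above fetches the launch config (and the survey, when enabled) and only opens the prompt wizard when user input is required. A hedged sketch of the kind of check a `canLaunchWithoutPrompt` helper performs; the real AWX helper inspects a longer list of `ask_*` flags, so the flags below are illustrative only:

```javascript
// Illustrative sketch: a template can launch immediately only when the
// launch config requests no user input. Real AWX checks more ask_* flags.
function canLaunchWithoutPrompt(launchConfig) {
  return (
    !launchConfig.survey_enabled &&
    !launchConfig.ask_inventory_on_launch &&
    !launchConfig.ask_credential_on_launch &&
    !launchConfig.ask_variables_on_launch
  );
}
```

Under this sketch, a config with every flag false (or absent) launches directly; any single prompt flag forces the wizard.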
@@ -6,12 +6,19 @@ import { Formik, useFormikContext } from 'formik';
import ContentError from '../ContentError';
import ContentLoading from '../ContentLoading';
import { useDismissableError } from '../../util/useRequest';
import mergeExtraVars from './mergeExtraVars';
import mergeExtraVars from '../../util/prompt/mergeExtraVars';
import getSurveyValues from '../../util/prompt/getSurveyValues';
import useLaunchSteps from './useLaunchSteps';
import AlertModal from '../AlertModal';
import getSurveyValues from './getSurveyValues';

function PromptModalForm({ onSubmit, onCancel, i18n, config, resource }) {
function PromptModalForm({
launchConfig,
i18n,
onCancel,
onSubmit,
resource,
surveyConfig,
}) {
const { values, setTouched, validateForm } = useFormikContext();

const {
@@ -20,7 +27,7 @@ function PromptModalForm({ onSubmit, onCancel, i18n, config, resource }) {
visitStep,
visitAllSteps,
contentError,
} = useLaunchSteps(config, resource, i18n);
} = useLaunchSteps(launchConfig, surveyConfig, resource, i18n);

const handleSave = () => {
const postValues = {};
@@ -39,7 +46,7 @@ function PromptModalForm({ onSubmit, onCancel, i18n, config, resource }) {
setValue('limit', values.limit);
setValue('job_tags', values.job_tags);
setValue('skip_tags', values.skip_tags);
const extraVars = config.ask_variables_on_launch
const extraVars = launchConfig.ask_variables_on_launch
? values.extra_vars || '---'
: resource.extra_vars;
setValue('extra_vars', mergeExtraVars(extraVars, surveyValues));
@@ -103,28 +110,22 @@ function PromptModalForm({ onSubmit, onCancel, i18n, config, resource }) {
);
}

function LaunchPrompt({ config, resource = {}, onLaunch, onCancel, i18n }) {
function LaunchPrompt({
launchConfig,
i18n,
onCancel,
onLaunch,
resource = {},
surveyConfig,
}) {
return (
<Formik
initialValues={{
verbosity: resource.verbosity || 0,
inventory: resource.summary_fields?.inventory || null,
credentials: resource.summary_fields?.credentials || null,
diff_mode: resource.diff_mode || false,
extra_vars: resource.extra_vars || '---',
job_type: resource.job_type || '',
job_tags: resource.job_tags || '',
skip_tags: resource.skip_tags || '',
scm_branch: resource.scm_branch || '',
limit: resource.limit || '',
}}
onSubmit={values => onLaunch(values)}
>
<Formik initialValues={{}} onSubmit={values => onLaunch(values)}>
<PromptModalForm
onSubmit={values => onLaunch(values)}
onCancel={onCancel}
i18n={i18n}
config={config}
launchConfig={launchConfig}
surveyConfig={surveyConfig}
resource={resource}
/>
</Formik>

@@ -76,7 +76,7 @@ describe('LaunchPrompt', () => {
await act(async () => {
wrapper = mountWithContexts(
<LaunchPrompt
config={{
launchConfig={{
...config,
ask_inventory_on_launch: true,
ask_credential_on_launch: true,
@@ -86,6 +86,24 @@ describe('LaunchPrompt', () => {
resource={resource}
onLaunch={noop}
onCancel={noop}
surveyConfig={{
name: '',
description: '',
spec: [
{
choices: '',
default: '',
max: 1024,
min: 0,
new_question: false,
question_description: '',
question_name: 'foo',
required: true,
type: 'text',
variable: 'foo',
},
],
}}
/>
);
});
@@ -94,10 +112,10 @@ describe('LaunchPrompt', () => {

expect(steps).toHaveLength(5);
expect(steps[0].name.props.children).toEqual('Inventory');
expect(steps[1].name).toEqual('Credentials');
expect(steps[2].name).toEqual('Other Prompts');
expect(steps[1].name.props.children).toEqual('Credentials');
expect(steps[2].name.props.children).toEqual('Other prompts');
expect(steps[3].name.props.children).toEqual('Survey');
expect(steps[4].name).toEqual('Preview');
expect(steps[4].name.props.children).toEqual('Preview');
});

test('should add inventory step', async () => {
@@ -105,7 +123,7 @@ describe('LaunchPrompt', () => {
await act(async () => {
wrapper = mountWithContexts(
<LaunchPrompt
config={{
launchConfig={{
...config,
ask_inventory_on_launch: true,
}}
@@ -129,7 +147,7 @@ describe('LaunchPrompt', () => {
await act(async () => {
wrapper = mountWithContexts(
<LaunchPrompt
config={{
launchConfig={{
...config,
ask_credential_on_launch: true,
}}
@@ -143,7 +161,7 @@ describe('LaunchPrompt', () => {
const steps = wizard.prop('steps');

expect(steps).toHaveLength(2);
expect(steps[0].name).toEqual('Credentials');
expect(steps[0].name.props.children).toEqual('Credentials');
expect(isElementOfType(steps[0].component, CredentialsStep)).toEqual(true);
expect(isElementOfType(steps[1].component, PreviewStep)).toEqual(true);
});
@@ -153,7 +171,7 @@ describe('LaunchPrompt', () => {
await act(async () => {
wrapper = mountWithContexts(
<LaunchPrompt
config={{
launchConfig={{
...config,
ask_verbosity_on_launch: true,
}}
@@ -167,7 +185,7 @@ describe('LaunchPrompt', () => {
const steps = wizard.prop('steps');

expect(steps).toHaveLength(2);
expect(steps[0].name).toEqual('Other Prompts');
expect(steps[0].name.props.children).toEqual('Other prompts');
expect(isElementOfType(steps[0].component, OtherPromptsStep)).toEqual(true);
expect(isElementOfType(steps[1].component, PreviewStep)).toEqual(true);
});

@@ -20,11 +20,11 @@ const FieldHeader = styled.div`
}
`;

function OtherPromptsStep({ config, i18n }) {
function OtherPromptsStep({ launchConfig, i18n }) {
return (
<Form>
{config.ask_job_type_on_launch && <JobTypeField i18n={i18n} />}
{config.ask_limit_on_launch && (
{launchConfig.ask_job_type_on_launch && <JobTypeField i18n={i18n} />}
{launchConfig.ask_limit_on_launch && (
<FormField
id="prompt-limit"
name="limit"
@@ -35,7 +35,7 @@ function OtherPromptsStep({ config, i18n }) {
information and examples on patterns.`)}
/>
)}
{config.ask_scm_branch_on_launch && (
{launchConfig.ask_scm_branch_on_launch && (
<FormField
id="prompt-scm-branch"
name="scm_branch"
@@ -45,9 +45,11 @@ function OtherPromptsStep({ config, i18n }) {
)}
/>
)}
{config.ask_verbosity_on_launch && <VerbosityField i18n={i18n} />}
{config.ask_diff_mode_on_launch && <ShowChangesToggle i18n={i18n} />}
{config.ask_tags_on_launch && (
{launchConfig.ask_verbosity_on_launch && <VerbosityField i18n={i18n} />}
{launchConfig.ask_diff_mode_on_launch && (
<ShowChangesToggle i18n={i18n} />
)}
{launchConfig.ask_tags_on_launch && (
<TagField
id="prompt-job-tags"
name="job_tags"
@@ -59,7 +61,7 @@ function OtherPromptsStep({ config, i18n }) {
documentation for details on the usage of tags.`)}
/>
)}
{config.ask_skip_tags_on_launch && (
{launchConfig.ask_skip_tags_on_launch && (
<TagField
id="prompt-skip-tags"
name="skip_tags"
@@ -71,7 +73,7 @@ function OtherPromptsStep({ config, i18n }) {
documentation for details on the usage of tags.`)}
/>
)}
{config.ask_variables_on_launch && (
{launchConfig.ask_variables_on_launch && (
<VariablesField
id="prompt-variables"
name="extra_vars"

@@ -11,7 +11,7 @@ describe('OtherPromptsStep', () => {
wrapper = mountWithContexts(
<Formik initialValues={{ job_type: 'run' }}>
<OtherPromptsStep
config={{
launchConfig={{
ask_job_type_on_launch: true,
}}
/>
@@ -34,7 +34,7 @@ describe('OtherPromptsStep', () => {
wrapper = mountWithContexts(
<Formik>
<OtherPromptsStep
config={{
launchConfig={{
ask_limit_on_launch: true,
}}
/>
@@ -54,7 +54,7 @@ describe('OtherPromptsStep', () => {
wrapper = mountWithContexts(
<Formik>
<OtherPromptsStep
config={{
launchConfig={{
ask_scm_branch_on_launch: true,
}}
/>
@@ -74,7 +74,7 @@ describe('OtherPromptsStep', () => {
wrapper = mountWithContexts(
<Formik initialValues={{ verbosity: '' }}>
<OtherPromptsStep
config={{
launchConfig={{
ask_verbosity_on_launch: true,
}}
/>
@@ -94,7 +94,7 @@ describe('OtherPromptsStep', () => {
wrapper = mountWithContexts(
<Formik initialValues={{ diff_mode: true }}>
<OtherPromptsStep
config={{
launchConfig={{
ask_diff_mode_on_launch: true,
}}
/>

@@ -6,8 +6,10 @@ import { t } from '@lingui/macro';
import { useFormikContext } from 'formik';
import { withI18n } from '@lingui/react';
import yaml from 'js-yaml';
import mergeExtraVars, { maskPasswords } from '../mergeExtraVars';
import getSurveyValues from '../getSurveyValues';
import mergeExtraVars, {
maskPasswords,
} from '../../../util/prompt/mergeExtraVars';
import getSurveyValues from '../../../util/prompt/getSurveyValues';
import PromptDetail from '../../PromptDetail';

const ExclamationCircleIcon = styled(PFExclamationCircleIcon)`
@@ -23,18 +25,25 @@ const ErrorMessageWrapper = styled.div`
margin-bottom: 10px;
`;

function PreviewStep({ resource, config, survey, formErrors, i18n }) {
function PreviewStep({
resource,
launchConfig,
surveyConfig,
formErrors,
i18n,
}) {
const { values } = useFormikContext();
const surveyValues = getSurveyValues(values);

const overrides = { ...values };
const overrides = {
...values,
};

if (config.ask_variables_on_launch || config.survey_enabled) {
const initialExtraVars = config.ask_variables_on_launch
? values.extra_vars || '---'
: resource.extra_vars;
if (survey && survey.spec) {
const passwordFields = survey.spec
if (launchConfig.ask_variables_on_launch || launchConfig.survey_enabled) {
const initialExtraVars =
launchConfig.ask_variables_on_launch && (overrides.extra_vars || '---');
if (surveyConfig?.spec) {
const passwordFields = surveyConfig.spec
.filter(q => q.type === 'password')
.map(q => q.variable);
const masked = maskPasswords(surveyValues, passwordFields);
@@ -42,7 +51,9 @@ function PreviewStep({ resource, config, survey, formErrors, i18n }) {
mergeExtraVars(initialExtraVars, masked)
);
} else {
overrides.extra_vars = initialExtraVars;
overrides.extra_vars = yaml.safeDump(
mergeExtraVars(initialExtraVars, {})
);
}
}

@@ -62,7 +73,7 @@ function PreviewStep({ resource, config, survey, formErrors, i18n }) {
)}
<PromptDetail
resource={resource}
launchConfig={config}
launchConfig={launchConfig}
overrides={overrides}
/>
</Fragment>

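The preview step above masks password-type survey answers before rendering the merged extra vars. A minimal sketch of that masking contract, assuming (as `getSurveyValues` suggests) that answers are keyed by plain variable name; the `$encrypted$` placeholder string is an assumption for this example, not necessarily AWX's verbatim helper:

```javascript
// Sketch of masking password-type survey answers before they are previewed.
// Key shape and placeholder string are assumptions for illustration.
function maskPasswords(vars, passwordFields) {
  const masked = { ...vars };
  passwordFields.forEach(name => {
    if (typeof masked[name] !== 'undefined') {
      masked[name] = '$encrypted$';
    }
  });
  return masked;
}
```

Non-password keys pass through untouched, so the preview still shows every other prompted value.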
@@ -36,11 +36,11 @@ describe('PreviewStep', () => {
<Formik initialValues={{ limit: '4', survey_foo: 'abc' }}>
<PreviewStep
resource={resource}
config={{
launchConfig={{
ask_limit_on_launch: true,
survey_enabled: true,
}}
survey={survey}
surveyConfig={survey}
formErrors={formErrors}
/>
</Formik>
@@ -64,7 +64,7 @@ describe('PreviewStep', () => {
<Formik initialValues={{ limit: '4' }}>
<PreviewStep
resource={resource}
config={{
launchConfig={{
ask_limit_on_launch: true,
}}
formErrors={formErrors}
@@ -80,7 +80,32 @@ describe('PreviewStep', () => {
limit: '4',
});
});
test('should handle extra vars with survey', async () => {
let wrapper;
await act(async () => {
wrapper = mountWithContexts(
<Formik initialValues={{ extra_vars: 'one: 1', survey_foo: 'abc' }}>
<PreviewStep
resource={resource}
launchConfig={{
ask_variables_on_launch: true,
survey_enabled: true,
}}
surveyConfig={survey}
formErrors={formErrors}
/>
</Formik>
);
});

const detail = wrapper.find('PromptDetail');
expect(detail).toHaveLength(1);
expect(detail.prop('resource')).toEqual(resource);
expect(detail.prop('overrides')).toEqual({
extra_vars: 'one: 1\nfoo: abc\n',
survey_foo: 'abc',
});
});
test('should handle extra vars without survey', async () => {
let wrapper;
await act(async () => {
@@ -88,7 +113,7 @@ describe('PreviewStep', () => {
<Formik initialValues={{ extra_vars: 'one: 1' }}>
<PreviewStep
resource={resource}
config={{
launchConfig={{
ask_variables_on_launch: true,
}}
formErrors={formErrors}
@@ -101,10 +126,9 @@ describe('PreviewStep', () => {
expect(detail).toHaveLength(1);
expect(detail.prop('resource')).toEqual(resource);
expect(detail.prop('overrides')).toEqual({
extra_vars: 'one: 1',
extra_vars: 'one: 1\n',
});
});

test('should remove survey with empty array value', async () => {
let wrapper;
await act(async () => {
@@ -115,7 +139,7 @@ describe('PreviewStep', () => {
>
<PreviewStep
resource={resource}
config={{
launchConfig={{
ask_variables_on_launch: true,
}}
formErrors={formErrors}
@@ -128,7 +152,7 @@ describe('PreviewStep', () => {
expect(detail).toHaveLength(1);
expect(detail.prop('resource')).toEqual(resource);
expect(detail.prop('overrides')).toEqual({
extra_vars: 'one: 1',
extra_vars: 'one: 1\n',
});
});
});

@@ -14,13 +14,13 @@ const ExclamationCircleIcon = styled(PFExclamationCircleIcon)`
margin-left: 10px;
`;

function StepName({ hasErrors, children, i18n }) {
function StepName({ hasErrors, children, i18n, id }) {
if (!hasErrors) {
return children;
return <div id={id}>{children}</div>;
}
return (
<>
<AlertText>
<AlertText id={id}>
{children}
<Tooltip
position="right"

@@ -22,7 +22,7 @@ import {
} from '../../../util/validators';
import { Survey } from '../../../types';

function SurveyStep({ survey, i18n }) {
function SurveyStep({ surveyConfig, i18n }) {
const fieldTypes = {
text: TextField,
textarea: TextField,
@@ -34,7 +34,7 @@ function SurveyStep({ survey, i18n }) {
};
return (
<Form>
{survey.spec.map(question => {
{surveyConfig.spec.map(question => {
const Field = fieldTypes[question.type];
return (
<Field key={question.variable} question={question} i18n={i18n} />
@@ -44,7 +44,7 @@ function SurveyStep({ survey, i18n }) {
);
}
SurveyStep.propTypes = {
survey: Survey.isRequired,
surveyConfig: Survey.isRequired,
};

function TextField({ question, i18n }) {
@@ -130,7 +130,8 @@ function MultiSelectField({ question, i18n }) {
<FormGroup
fieldId={id}
helperTextInvalid={
meta.error || i18n._(t`Must select a value for this field.`)
meta.error ||
i18n._(t`At least one value must be selected for this field.`)
}
isRequired={question.required}
validated={isValid ? 'default' : 'error'}

@@ -1,12 +1,15 @@
import React from 'react';
import { t } from '@lingui/macro';
import CredentialsStep from './CredentialsStep';
import StepName from './StepName';

const STEP_ID = 'credentials';

export default function useCredentialsStep(config, i18n) {
export default function useCredentialsStep(launchConfig, resource, i18n) {
return {
step: getStep(config, i18n),
step: getStep(launchConfig, i18n),
initialValues: getInitialValues(launchConfig, resource),
validate: () => ({}),
isReady: true,
contentError: null,
formError: null,
@@ -18,13 +21,29 @@ export default function useCredentialsStep(config, i18n) {
};
}

function getStep(config, i18n) {
if (!config.ask_credential_on_launch) {
function getStep(launchConfig, i18n) {
if (!launchConfig.ask_credential_on_launch) {
return null;
}
return {
id: STEP_ID,
name: i18n._(t`Credentials`),
key: 4,
name: (
<StepName hasErrors={false} id="credentials-step">
{i18n._(t`Credentials`)}
</StepName>
),
component: <CredentialsStep i18n={i18n} />,
enableNext: true,
};
}

function getInitialValues(launchConfig, resource) {
if (!launchConfig.ask_credential_on_launch) {
return {};
}

return {
credentials: resource?.summary_fields?.credentials || [],
};
}

@@ -6,14 +6,22 @@ import StepName from './StepName';

const STEP_ID = 'inventory';

export default function useInventoryStep(config, visitedSteps, i18n) {
export default function useInventoryStep(
launchConfig,
resource,
i18n,
visitedSteps
) {
const [, meta] = useField('inventory');
const formError =
Object.keys(visitedSteps).includes(STEP_ID) && (!meta.value || meta.error);

return {
step: getStep(config, meta, i18n, visitedSteps),
step: getStep(launchConfig, i18n, formError),
initialValues: getInitialValues(launchConfig, resource),
isReady: true,
contentError: null,
formError: !meta.value,
formError: launchConfig.ask_inventory_on_launch && formError,
setTouched: setFieldsTouched => {
setFieldsTouched({
inventory: true,
@@ -21,20 +29,14 @@ export default function useInventoryStep(config, visitedSteps, i18n) {
},
};
}
function getStep(config, meta, i18n, visitedSteps) {
if (!config.ask_inventory_on_launch) {
function getStep(launchConfig, i18n, formError) {
if (!launchConfig.ask_inventory_on_launch) {
return null;
}
return {
id: STEP_ID,
key: 3,
name: (
<StepName
hasErrors={
Object.keys(visitedSteps).includes(STEP_ID) &&
(!meta.value || meta.error)
}
>
<StepName hasErrors={formError} id="inventory-step">
{i18n._(t`Inventory`)}
</StepName>
),
@@ -42,3 +44,13 @@ function getStep(config, meta, i18n, visitedSteps) {
enableNext: true,
};
}

function getInitialValues(launchConfig, resource) {
if (!launchConfig.ask_inventory_on_launch) {
return {};
}

return {
inventory: resource?.summary_fields?.inventory || null,
};
}

@@ -1,12 +1,25 @@
import React from 'react';
import { t } from '@lingui/macro';
import { jsonToYaml, parseVariableField } from '../../../util/yaml';
import OtherPromptsStep from './OtherPromptsStep';
import StepName from './StepName';

const STEP_ID = 'other';

export default function useOtherPrompt(config, i18n) {
const getVariablesData = resource => {
if (resource?.extra_data) {
return jsonToYaml(JSON.stringify(resource.extra_data));
}
if (resource?.extra_vars && resource?.extra_vars !== '---') {
return jsonToYaml(JSON.stringify(parseVariableField(resource.extra_vars)));
}
return '---';
};

export default function useOtherPromptsStep(launchConfig, resource, i18n) {
return {
step: getStep(config, i18n),
step: getStep(launchConfig, i18n),
initialValues: getInitialValues(launchConfig, resource),
isReady: true,
contentError: null,
formError: null,
@@ -24,26 +37,66 @@ export default function useOtherPrompt(config, i18n) {
};
}

function getStep(config, i18n) {
if (!shouldShowPrompt(config)) {
function getStep(launchConfig, i18n) {
if (!shouldShowPrompt(launchConfig)) {
return null;
}
return {
id: STEP_ID,
name: i18n._(t`Other Prompts`),
component: <OtherPromptsStep config={config} i18n={i18n} />,
key: 5,
name: (
<StepName hasErrors={false} id="other-prompts-step">
{i18n._(t`Other prompts`)}
</StepName>
),
component: <OtherPromptsStep launchConfig={launchConfig} i18n={i18n} />,
enableNext: true,
};
}

function shouldShowPrompt(config) {
function shouldShowPrompt(launchConfig) {
return (
config.ask_job_type_on_launch ||
config.ask_limit_on_launch ||
config.ask_verbosity_on_launch ||
config.ask_tags_on_launch ||
config.ask_skip_tags_on_launch ||
config.ask_variables_on_launch ||
config.ask_scm_branch_on_launch ||
config.ask_diff_mode_on_launch
launchConfig.ask_job_type_on_launch ||
launchConfig.ask_limit_on_launch ||
launchConfig.ask_verbosity_on_launch ||
launchConfig.ask_tags_on_launch ||
launchConfig.ask_skip_tags_on_launch ||
launchConfig.ask_variables_on_launch ||
launchConfig.ask_scm_branch_on_launch ||
launchConfig.ask_diff_mode_on_launch
);
}

function getInitialValues(launchConfig, resource) {
const initialValues = {};

if (!launchConfig) {
return initialValues;
}

if (launchConfig.ask_job_type_on_launch) {
initialValues.job_type = resource?.job_type || '';
}
if (launchConfig.ask_limit_on_launch) {
initialValues.limit = resource?.limit || '';
}
if (launchConfig.ask_verbosity_on_launch) {
initialValues.verbosity = resource?.verbosity || 0;
}
if (launchConfig.ask_tags_on_launch) {
initialValues.job_tags = resource?.job_tags || '';
}
if (launchConfig.ask_skip_tags_on_launch) {
initialValues.skip_tags = resource?.skip_tags || '';
}
if (launchConfig.ask_variables_on_launch) {
initialValues.extra_vars = getVariablesData(resource);
}
if (launchConfig.ask_scm_branch_on_launch) {
initialValues.scm_branch = resource?.scm_branch || '';
}
if (launchConfig.ask_diff_mode_on_launch) {
initialValues.diff_mode = resource?.diff_mode || false;
}
return initialValues;
}

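The flag-driven `getInitialValues` pattern above can be exercised standalone. This mirror covers only a subset of the `ask_*` flags and omits the `getVariablesData` YAML handling; it is an illustrative reduction, not the hook itself:

```javascript
// Standalone mirror of the flag-driven defaults shown above (subset only):
// each ask_* flag adds one field, seeded from the resource when available.
function otherPromptDefaults(launchConfig, resource = {}) {
  const values = {};
  if (launchConfig.ask_job_type_on_launch) {
    values.job_type = resource.job_type || '';
  }
  if (launchConfig.ask_limit_on_launch) {
    values.limit = resource.limit || '';
  }
  if (launchConfig.ask_diff_mode_on_launch) {
    values.diff_mode = resource.diff_mode || false;
  }
  return values;
}
```

A config with no flags set yields an empty object, which keeps un-prompted fields out of the wizard's Formik state entirely.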
@@ -1,53 +1,41 @@
import React from 'react';
import { useFormikContext } from 'formik';
import { t } from '@lingui/macro';
import PreviewStep from './PreviewStep';
import StepName from './StepName';

const STEP_ID = 'preview';

export default function usePreviewStep(
config,
launchConfig,
i18n,
resource,
survey,
surveyConfig,
hasErrors,
i18n
showStep
) {
const { values: formikValues, errors } = useFormikContext();

const formErrorsContent = [];
if (config.ask_inventory_on_launch && !formikValues.inventory) {
formErrorsContent.push({
inventory: true,
});
}
const hasSurveyError = Object.keys(errors).find(e => e.includes('survey'));
if (
config.survey_enabled &&
(config.variables_needed_to_start ||
config.variables_needed_to_start.length === 0) &&
hasSurveyError
) {
formErrorsContent.push({
survey: true,
});
}

return {
step: {
id: STEP_ID,
name: i18n._(t`Preview`),
component: (
<PreviewStep
config={config}
resource={resource}
survey={survey}
formErrors={hasErrors}
/>
),
enableNext: !hasErrors,
nextButtonText: i18n._(t`Launch`),
},
step: showStep
? {
id: STEP_ID,
name: (
<StepName hasErrors={false} id="preview-step">
{i18n._(t`Preview`)}
</StepName>
),
component: (
<PreviewStep
launchConfig={launchConfig}
resource={resource}
surveyConfig={surveyConfig}
formErrors={hasErrors}
/>
),
enableNext: !hasErrors,
nextButtonText: i18n._(t`Launch`),
}
: null,
initialValues: {},
validate: () => ({}),
isReady: true,
error: null,
setTouched: () => {},

@@ -1,39 +1,25 @@
import React, { useEffect, useCallback } from 'react';
import React from 'react';
import { t } from '@lingui/macro';
import { useFormikContext } from 'formik';
import useRequest from '../../../util/useRequest';
import { JobTemplatesAPI, WorkflowJobTemplatesAPI } from '../../../api';
import SurveyStep from './SurveyStep';
import StepName from './StepName';

const STEP_ID = 'survey';

export default function useSurveyStep(config, visitedSteps, i18n) {
export default function useSurveyStep(
launchConfig,
surveyConfig,
resource,
i18n,
visitedSteps
) {
const { values } = useFormikContext();
const { result: survey, request: fetchSurvey, isLoading, error } = useRequest(
useCallback(async () => {
if (!config.survey_enabled) {
return {};
}
const { data } = config?.workflow_job_template_data
? await WorkflowJobTemplatesAPI.readSurvey(
config?.workflow_job_template_data?.id
)
: await JobTemplatesAPI.readSurvey(config?.job_template_data?.id);
return data;
}, [config])
);

useEffect(() => {
fetchSurvey();
}, [fetchSurvey]);

const errors = {};
const validate = () => {
if (!config.survey_enabled || !survey || !survey.spec) {
if (!launchConfig.survey_enabled || !surveyConfig?.spec) {
return {};
}
const errors = {};
survey.spec.forEach(question => {
surveyConfig.spec.forEach(question => {
const errMessage = validateField(
question,
values[`survey_${question.variable}`],
@@ -47,18 +33,19 @@ export default function useSurveyStep(config, visitedSteps, i18n) {
};
const formError = Object.keys(validate()).length > 0;
return {
step: getStep(config, survey, formError, i18n, visitedSteps),
step: getStep(launchConfig, surveyConfig, validate, i18n, visitedSteps),
initialValues: getInitialValues(launchConfig, surveyConfig, resource),
validate,
surveyConfig,
isReady: true,
contentError: null,
formError,
initialValues: getInitialValues(config, survey),
survey,
isReady: !isLoading && !!survey,
contentError: error,
setTouched: setFieldsTouched => {
if (!survey || !survey.spec) {
if (!surveyConfig?.spec) {
return;
}
const fields = {};
survey.spec.forEach(question => {
surveyConfig.spec.forEach(question => {
fields[`survey_${question.variable}`] = true;
});
setFieldsTouched(fields);
@@ -84,49 +71,65 @@ function validateField(question, value, i18n) {
);
}
}
if (
question.required &&
((!value && value !== 0) || (Array.isArray(value) && value.length === 0))
) {
if (question.required && !value && value !== 0) {
return i18n._(t`This field must not be blank`);
}
return null;
}
function getStep(config, survey, hasErrors, i18n, visitedSteps) {
if (!config.survey_enabled) {
function getStep(launchConfig, surveyConfig, validate, i18n, visitedSteps) {
if (!launchConfig.survey_enabled) {
return null;
}

return {
id: STEP_ID,
key: 6,
name: (
<StepName
hasErrors={Object.keys(visitedSteps).includes(STEP_ID) && hasErrors}
hasErrors={
Object.keys(visitedSteps).includes(STEP_ID) &&
Object.keys(validate()).length
}
id="survey-step"
>
{i18n._(t`Survey`)}
</StepName>
),
component: <SurveyStep survey={survey} i18n={i18n} />,
component: <SurveyStep surveyConfig={surveyConfig} i18n={i18n} />,
enableNext: true,
};
}
function getInitialValues(config, survey) {
if (!config.survey_enabled || !survey) {

function getInitialValues(launchConfig, surveyConfig, resource) {
if (!launchConfig.survey_enabled || !surveyConfig) {
return {};
}
const surveyValues = {};
survey.spec.forEach(question => {
if (question.type === 'multiselect') {
if (question.default === '') {
surveyValues[`survey_${question.variable}`] = [];

const values = {};
if (surveyConfig?.spec) {
surveyConfig.spec.forEach(question => {
if (question.type === 'multiselect') {
values[`survey_${question.variable}`] = question.default
? question.default.split('\n')
: [];
} else if (question.type === 'multiplechoice') {
values[`survey_${question.variable}`] =
question.default || question.choices.split('\n')[0];
} else {
surveyValues[`survey_${question.variable}`] = question.default.split(
'\n'
);
values[`survey_${question.variable}`] = question.default || '';
}
} else {
surveyValues[`survey_${question.variable}`] = question.default;
}
});
return surveyValues;
if (resource?.extra_data) {
Object.entries(resource.extra_data).forEach(([key, value]) => {
if (key === question.variable) {
if (question.type === 'multiselect') {
values[`survey_${question.variable}`] = value;
} else {
values[`survey_${question.variable}`] = value;
}
}
});
}
});
}

return values;
}

@@ -6,42 +6,48 @@ import useOtherPromptsStep from './steps/useOtherPromptsStep';
 import useSurveyStep from './steps/useSurveyStep';
 import usePreviewStep from './steps/usePreviewStep';
 
-export default function useLaunchSteps(config, resource, i18n) {
+export default function useLaunchSteps(
+  launchConfig,
+  surveyConfig,
+  resource,
+  i18n
+) {
   const [visited, setVisited] = useState({});
+  const [isReady, setIsReady] = useState(false);
   const steps = [
-    useInventoryStep(config, visited, i18n),
-    useCredentialsStep(config, i18n),
-    useOtherPromptsStep(config, i18n),
-    useSurveyStep(config, visited, i18n),
+    useInventoryStep(launchConfig, resource, i18n, visited),
+    useCredentialsStep(launchConfig, resource, i18n),
+    useOtherPromptsStep(launchConfig, resource, i18n),
+    useSurveyStep(launchConfig, surveyConfig, resource, i18n, visited),
   ];
-  const { resetForm, values: formikValues } = useFormikContext();
+  const { resetForm } = useFormikContext();
   const hasErrors = steps.some(step => step.formError);
 
-  const surveyStepIndex = steps.findIndex(step => step.survey);
   steps.push(
-    usePreviewStep(
-      config,
-      resource,
-      steps[surveyStepIndex]?.survey,
-      hasErrors,
-      i18n
-    )
+    usePreviewStep(launchConfig, i18n, resource, surveyConfig, hasErrors, true)
   );
 
   const pfSteps = steps.map(s => s.step).filter(s => s != null);
-  const isReady = !steps.some(s => !s.isReady);
+  const stepsAreReady = !steps.some(s => !s.isReady);
 
   useEffect(() => {
-    if (surveyStepIndex > -1 && isReady) {
+    if (stepsAreReady) {
+      const initialValues = steps.reduce((acc, cur) => {
+        return {
+          ...acc,
+          ...cur.initialValues,
+        };
+      }, {});
       resetForm({
         values: {
-          ...formikValues,
-          ...steps[surveyStepIndex].initialValues,
+          ...initialValues,
         },
       });
 
+      setIsReady(true);
     }
     // eslint-disable-next-line react-hooks/exhaustive-deps
-  }, [isReady]);
+  }, [stepsAreReady]);
 
   const stepWithError = steps.find(s => s.contentError);
   const contentError = stepWithError ? stepWithError.contentError : null;

@@ -147,13 +147,14 @@ ListHeader.propTypes = {
   searchColumns: SearchColumns.isRequired,
   searchableKeys: PropTypes.arrayOf(PropTypes.string),
   relatedSearchableKeys: PropTypes.arrayOf(PropTypes.string),
-  sortColumns: SortColumns.isRequired,
+  sortColumns: SortColumns,
   renderToolbar: PropTypes.func,
 };
 
 ListHeader.defaultProps = {
   renderToolbar: props => <DataListToolbar {...props} />,
   searchableKeys: [],
+  sortColumns: null,
   relatedSearchableKeys: [],
 };

@@ -279,8 +279,8 @@ function HostFilterLookup({
           numChips={5}
           totalChips={chips[key]?.chips?.length || 0}
         >
-          {chips[key]?.chips?.map((chip, index) => (
-            <Chip key={index} isReadOnly>
+          {chips[key]?.chips?.map(chip => (
+            <Chip key={chip.key} isReadOnly>
               {chip.node}
             </Chip>
           ))}

@@ -1,10 +1,11 @@
 import React, { useCallback, useEffect } from 'react';
-import { arrayOf, string, func, object, bool } from 'prop-types';
+import { arrayOf, string, func, bool } from 'prop-types';
 import { withRouter } from 'react-router-dom';
 import { withI18n } from '@lingui/react';
 import { t } from '@lingui/macro';
 import { FormGroup } from '@patternfly/react-core';
 import { InstanceGroupsAPI } from '../../api';
+import { InstanceGroup } from '../../types';
 import { getQSConfig, parseQueryString } from '../../util/qs';
 import Popover from '../Popover';
 import OptionsList from '../OptionsList';
@@ -120,7 +121,7 @@ function InstanceGroupsLookup(props) {
 }
 
 InstanceGroupsLookup.propTypes = {
-  value: arrayOf(object).isRequired,
+  value: arrayOf(InstanceGroup).isRequired,
   tooltip: string,
   onChange: func.isRequired,
   className: string,

@@ -194,9 +194,10 @@ function MultiCredentialsLookup(props) {
       const hasSameVaultID = val =>
         val?.inputs?.vault_id !== undefined &&
         val?.inputs?.vault_id === item?.inputs?.vault_id;
-      const hasSameKind = val => val.kind === item.kind;
+      const hasSameCredentialType = val =>
+        val.credential_type === item.credential_type;
       const selectedItems = state.selectedItems.filter(i =>
-        isVault ? !hasSameVaultID(i) : !hasSameKind(i)
+        isVault ? !hasSameVaultID(i) : !hasSameCredentialType(i)
       );
       selectedItems.push(item);
       return dispatch({

@@ -13,11 +13,29 @@ describe('<MultiCredentialsLookup />', () => {
   let wrapper;
 
   const credentials = [
-    { id: 1, kind: 'cloud', name: 'Foo', url: 'www.google.com' },
-    { id: 2, kind: 'ssh', name: 'Alex', url: 'www.google.com' },
-    { name: 'Gatsby', id: 21, kind: 'vault', inputs: { vault_id: '1' } },
-    { name: 'Gatsby 2', id: 23, kind: 'vault' },
-    { name: 'Gatsby', id: 8, kind: 'Machine' },
+    {
+      id: 1,
+      credential_type: 1,
+      kind: 'gce',
+      name: 'Foo',
+      url: 'www.google.com',
+    },
+    {
+      id: 2,
+      credential_type: 2,
+      kind: 'ssh',
+      name: 'Alex',
+      url: 'www.google.com',
+    },
+    {
+      id: 21,
+      credential_type: 3,
+      kind: 'vault',
+      inputs: { vault_id: '1' },
+      name: 'Gatsby',
+    },
+    { id: 23, credential_type: 3, kind: 'vault', name: 'Gatsby 2' },
+    { id: 8, credential_type: 4, kind: 'Machine', name: 'Gatsby' },
   ];
 
   beforeEach(() => {
@@ -34,11 +52,41 @@ describe('<MultiCredentialsLookup />', () => {
     CredentialsAPI.read.mockResolvedValueOnce({
       data: {
         results: [
-          { id: 1, kind: 'cloud', name: 'Cred 1', url: 'www.google.com' },
-          { id: 2, kind: 'ssh', name: 'Cred 2', url: 'www.google.com' },
-          { id: 3, kind: 'Ansible', name: 'Cred 3', url: 'www.google.com' },
-          { id: 4, kind: 'Machine', name: 'Cred 4', url: 'www.google.com' },
-          { id: 5, kind: 'Machine', name: 'Cred 5', url: 'www.google.com' },
+          {
+            id: 1,
+            credential_type: 1,
+            kind: 'gc2',
+            name: 'Cred 1',
+            url: 'www.google.com',
+          },
+          {
+            id: 2,
+            credential_type: 2,
+            kind: 'ssh',
+            name: 'Cred 2',
+            url: 'www.google.com',
+          },
+          {
+            id: 3,
+            credential_type: 5,
+            kind: 'Ansible',
+            name: 'Cred 3',
+            url: 'www.google.com',
+          },
+          {
+            id: 4,
+            credential_type: 4,
+            kind: 'Machine',
+            name: 'Cred 4',
+            url: 'www.google.com',
+          },
+          {
+            id: 5,
+            credential_type: 4,
+            kind: 'Machine',
+            name: 'Cred 5',
+            url: 'www.google.com',
+          },
         ],
         count: 3,
       },
@@ -95,10 +143,22 @@ describe('<MultiCredentialsLookup />', () => {
       button.invoke('onClick')();
     });
     expect(onChange).toBeCalledWith([
-      { id: 1, kind: 'cloud', name: 'Foo', url: 'www.google.com' },
-      { id: 21, inputs: { vault_id: '1' }, kind: 'vault', name: 'Gatsby' },
-      { id: 23, kind: 'vault', name: 'Gatsby 2' },
-      { id: 8, kind: 'Machine', name: 'Gatsby' },
+      {
+        id: 1,
+        credential_type: 1,
+        kind: 'gce',
+        name: 'Foo',
+        url: 'www.google.com',
+      },
+      {
+        id: 21,
+        credential_type: 3,
+        kind: 'vault',
+        inputs: { vault_id: '1' },
+        name: 'Gatsby',
+      },
+      { id: 23, credential_type: 3, kind: 'vault', name: 'Gatsby 2' },
+      { id: 8, credential_type: 4, kind: 'Machine', name: 'Gatsby' },
     ]);
   });
 
@@ -165,6 +225,7 @@ describe('<MultiCredentialsLookup />', () => {
     act(() => {
       optionsList.invoke('selectItem')({
         id: 5,
+        credential_type: 4,
         kind: 'Machine',
         name: 'Cred 5',
         url: 'www.google.com',
@@ -175,11 +236,35 @@ describe('<MultiCredentialsLookup />', () => {
       wrapper.find('Button[variant="primary"]').invoke('onClick')();
     });
     expect(onChange).toBeCalledWith([
-      { id: 1, kind: 'cloud', name: 'Foo', url: 'www.google.com' },
-      { id: 2, kind: 'ssh', name: 'Alex', url: 'www.google.com' },
-      { id: 21, inputs: { vault_id: '1' }, kind: 'vault', name: 'Gatsby' },
-      { id: 23, kind: 'vault', name: 'Gatsby 2' },
-      { id: 5, kind: 'Machine', name: 'Cred 5', url: 'www.google.com' },
+      {
+        id: 1,
+        credential_type: 1,
+        kind: 'gce',
+        name: 'Foo',
+        url: 'www.google.com',
+      },
+      {
+        id: 2,
+        credential_type: 2,
+        kind: 'ssh',
+        name: 'Alex',
+        url: 'www.google.com',
+      },
+      {
+        id: 21,
+        credential_type: 3,
+        kind: 'vault',
+        inputs: { vault_id: '1' },
+        name: 'Gatsby',
+      },
+      { id: 23, credential_type: 3, kind: 'vault', name: 'Gatsby 2' },
+      {
+        id: 5,
+        credential_type: 4,
+        kind: 'Machine',
+        name: 'Cred 5',
+        url: 'www.google.com',
+      },
     ]);
   });
 
@@ -212,10 +297,10 @@ describe('<MultiCredentialsLookup />', () => {
     expect(optionsList.prop('multiple')).toEqual(true);
     act(() => {
       optionsList.invoke('selectItem')({
-        id: 5,
+        id: 11,
+        credential_type: 3,
         kind: 'vault',
-        name: 'Cred 5',
-        url: 'www.google.com',
+        name: 'Vault',
       });
     });
     wrapper.update();
@@ -223,12 +308,30 @@ describe('<MultiCredentialsLookup />', () => {
       wrapper.find('Button[variant="primary"]').invoke('onClick')();
     });
     expect(onChange).toBeCalledWith([
-      { id: 1, kind: 'cloud', name: 'Foo', url: 'www.google.com' },
-      { id: 2, kind: 'ssh', name: 'Alex', url: 'www.google.com' },
-      { id: 21, kind: 'vault', name: 'Gatsby', inputs: { vault_id: '1' } },
-      { id: 23, kind: 'vault', name: 'Gatsby 2' },
-      { id: 8, kind: 'Machine', name: 'Gatsby' },
-      { id: 5, kind: 'vault', name: 'Cred 5', url: 'www.google.com' },
+      {
+        id: 1,
+        credential_type: 1,
+        kind: 'gce',
+        name: 'Foo',
+        url: 'www.google.com',
+      },
+      {
+        id: 2,
+        credential_type: 2,
+        kind: 'ssh',
+        name: 'Alex',
+        url: 'www.google.com',
+      },
+      {
+        id: 21,
+        credential_type: 3,
+        kind: 'vault',
+        inputs: { vault_id: '1' },
+        name: 'Gatsby',
+      },
+      { id: 23, credential_type: 3, kind: 'vault', name: 'Gatsby 2' },
+      { id: 8, credential_type: 4, kind: 'Machine', name: 'Gatsby' },
+      { id: 11, credential_type: 3, kind: 'vault', name: 'Vault' },
     ]);
   });
 
@@ -261,11 +364,11 @@ describe('<MultiCredentialsLookup />', () => {
     expect(optionsList.prop('multiple')).toEqual(true);
     act(() => {
      optionsList.invoke('selectItem')({
-        id: 5,
+        id: 12,
+        credential_type: 3,
         kind: 'vault',
-        name: 'Cred 5',
-        url: 'www.google.com',
+        inputs: { vault_id: '2' },
+        name: 'Other Vault',
+        vault_id: '2',
       });
     });
     wrapper.update();
@@ -273,17 +376,35 @@ describe('<MultiCredentialsLookup />', () => {
      wrapper.find('Button[variant="primary"]').invoke('onClick')();
     });
     expect(onChange).toBeCalledWith([
-      { id: 1, kind: 'cloud', name: 'Foo', url: 'www.google.com' },
-      { id: 2, kind: 'ssh', name: 'Alex', url: 'www.google.com' },
-      { id: 21, kind: 'vault', name: 'Gatsby', inputs: { vault_id: '1' } },
-      { id: 23, kind: 'vault', name: 'Gatsby 2' },
-      { id: 8, kind: 'Machine', name: 'Gatsby' },
       {
-        id: 5,
-        kind: 'vault',
-        name: 'Cred 5',
+        id: 1,
+        credential_type: 1,
+        kind: 'gce',
+        name: 'Foo',
         url: 'www.google.com',
-        inputs: { vault_id: '2' },
       },
+      {
+        id: 2,
+        credential_type: 2,
+        kind: 'ssh',
+        name: 'Alex',
+        url: 'www.google.com',
+      },
+      {
+        id: 21,
+        credential_type: 3,
+        kind: 'vault',
+        inputs: { vault_id: '1' },
+        name: 'Gatsby',
+      },
+      { id: 23, credential_type: 3, kind: 'vault', name: 'Gatsby 2' },
+      { id: 8, credential_type: 4, kind: 'Machine', name: 'Gatsby' },
+      {
+        id: 12,
+        credential_type: 3,
+        kind: 'vault',
+        name: 'Other Vault',
+        vault_id: '2',
+      },
     ]);
   });
@@ -317,10 +438,10 @@ describe('<MultiCredentialsLookup />', () => {
     expect(optionsList.prop('multiple')).toEqual(true);
     act(() => {
       optionsList.invoke('selectItem')({
-        id: 24,
+        id: 13,
+        credential_type: 3,
         kind: 'vault',
-        name: 'Cred 5',
-        url: 'www.google.com',
+        name: 'Vault Cred with Same Vault Id',
+        inputs: { vault_id: '1' },
       });
     });
@@ -329,15 +450,27 @@ describe('<MultiCredentialsLookup />', () => {
       wrapper.find('Button[variant="primary"]').invoke('onClick')();
     });
     expect(onChange).toBeCalledWith([
-      { id: 1, kind: 'cloud', name: 'Foo', url: 'www.google.com' },
-      { id: 2, kind: 'ssh', name: 'Alex', url: 'www.google.com' },
-      { id: 23, kind: 'vault', name: 'Gatsby 2' },
-      { id: 8, kind: 'Machine', name: 'Gatsby' },
       {
-        id: 24,
-        kind: 'vault',
-        name: 'Cred 5',
+        id: 1,
+        credential_type: 1,
+        kind: 'gce',
+        name: 'Foo',
         url: 'www.google.com',
       },
+      {
+        id: 2,
+        credential_type: 2,
+        kind: 'ssh',
+        name: 'Alex',
+        url: 'www.google.com',
+      },
+      { id: 23, credential_type: 3, kind: 'vault', name: 'Gatsby 2' },
+      { id: 8, credential_type: 4, kind: 'Machine', name: 'Gatsby' },
+      {
+        id: 13,
+        credential_type: 3,
+        kind: 'vault',
+        name: 'Vault Cred with Same Vault Id',
+        inputs: { vault_id: '1' },
+      },
     ]);

@@ -108,7 +108,6 @@ function ProjectLookup({
           options: [
             [``, i18n._(t`Manual`)],
             [`git`, i18n._(t`Git`)],
-            [`hg`, i18n._(t`Mercurial`)],
             [`svn`, i18n._(t`Subversion`)],
             [`archive`, i18n._(t`Remote Archive`)],
             [`insights`, i18n._(t`Red Hat Insights`)],

@@ -183,8 +183,12 @@ function NotificationList({
       isDefault: true,
     },
     {
-      name: i18n._(t`Type`),
-      key: 'or__type',
+      name: i18n._(t`Description`),
+      key: 'description__icontains',
+    },
+    {
+      name: i18n._(t`Notification type`),
+      key: 'or__notification_type',
       options: [
         ['email', i18n._(t`Email`)],
         ['grafana', i18n._(t`Grafana`)],
@@ -212,6 +216,10 @@ function NotificationList({
       name: i18n._(t`Name`),
       key: 'name',
     },
+    {
+      name: i18n._(t`Type`),
+      key: 'notification_type',
+    },
   ]}
   toolbarSearchableKeys={searchableKeys}
   toolbarRelatedSearchableKeys={relatedSearchableKeys}

@@ -61,10 +61,6 @@ const ItemToDelete = shape({
   }).isRequired,
 });
 
-function cannotDelete(item) {
-  return !item.summary_fields.user_capabilities.delete;
-}
-
 function ToolbarDeleteButton({
   itemsToDelete,
   pluralizedItemName,
@@ -72,6 +68,7 @@ function ToolbarDeleteButton({
   onDelete,
   warningMessage,
   i18n,
+  cannotDelete,
 }) {
   const { isKebabified, onKebabModalChange } = useContext(KebabifiedContext);
   const [isModalOpen, setIsModalOpen] = useState(false);
@@ -100,7 +97,7 @@ function ToolbarDeleteButton({
     return (
       <div>
         {errorMessage.length > 0
-          ? errorMessage
+          ? `${errorMessage}: ${itemsUnableToDelete}`
           : i18n._(
               t`You do not have permission to delete ${pluralizedItemName}: ${itemsUnableToDelete}`
             )}
@@ -193,12 +190,14 @@ ToolbarDeleteButton.propTypes = {
   pluralizedItemName: string,
   errorMessage: string,
   warningMessage: node,
+  cannotDelete: func,
 };
 
 ToolbarDeleteButton.defaultProps = {
   pluralizedItemName: 'Items',
   errorMessage: '',
   warningMessage: null,
+  cannotDelete: item => !item.summary_fields.user_capabilities.delete,
 };
 
 export default withI18n()(ToolbarDeleteButton);

@@ -2,6 +2,7 @@
 
 exports[`<ToolbarDeleteButton /> should render button 1`] = `
 <ToolbarDeleteButton
+  cannotDelete={[Function]}
   errorMessage=""
   i18n={"/i18n/"}
   itemsToDelete={Array []}

21  awx/ui_next/src/components/PaginatedTable/ActionItem.jsx  Normal file
@@ -0,0 +1,21 @@
+import 'styled-components/macro';
+import React from 'react';
+import { Tooltip } from '@patternfly/react-core';
+
+export default function ActionItem({ column, tooltip, visible, children }) {
+  if (!visible) {
+    return null;
+  }
+
+  return (
+    <div
+      css={`
+        grid-column: ${column};
+      `}
+    >
+      <Tooltip content={tooltip} position="top">
+        {children}
+      </Tooltip>
+    </div>
+  );
+}
@@ -0,0 +1,29 @@
+import React from 'react';
+import { shallow } from 'enzyme';
+import ActionItem from './ActionItem';
+
+describe('<ActionItem />', () => {
+  test('should render child with tooltip', async () => {
+    const wrapper = shallow(
+      <ActionItem columns={1} tooltip="a tooltip" visible>
+        foo
+      </ActionItem>
+    );
+
+    const tooltip = wrapper.find('Tooltip');
+    expect(tooltip.prop('content')).toEqual('a tooltip');
+    expect(tooltip.prop('children')).toEqual('foo');
+  });
+
+  test('should render null if not visible', async () => {
+    const wrapper = shallow(
+      <ActionItem columns={1} tooltip="foo">
+        <div>foo</div>
+      </ActionItem>
+    );
+
+    expect(wrapper.find('Tooltip')).toHaveLength(0);
+    expect(wrapper.find('div')).toHaveLength(0);
+    expect(wrapper.text()).toEqual('');
+  });
+});
Some files were not shown because too many files have changed in this diff.