Mirror of https://github.com/ansible/awx.git, synced 2026-02-05 19:44:43 -03:30
Compare commits
135 Commits
128
CHANGELOG.md
Normal file
@@ -0,0 +1,128 @@
# Changelog

This is a list of high-level changes for each release of AWX. A full list of commits can be found at `https://github.com/ansible/awx/releases/tag/<version>`.

## 9.1.1 (Jan 14, 2020)

- Fixed a bug that caused database migrations on Kubernetes installs to hang (https://github.com/ansible/awx/pull/5579)
- Upgraded Python-level app dependencies in the AWX virtual environment (https://github.com/ansible/awx/pull/5407)
- Running jobs no longer block associated inventory updates (https://github.com/ansible/awx/pull/5519)
- Fixed an invalid_response SAML error (https://github.com/ansible/awx/pull/5577)
- Optimized the callback receiver to drastically improve the write speed of stdout for parallel jobs (https://github.com/ansible/awx/pull/5618)

## 9.1.0 (Dec 17, 2019)

- Added a command to generate a new SECRET_KEY and rekey the secrets in the database
- Removed project update locking when jobs using it are running
- Fixed slow queries for /api/v2/instances and /api/v2/instance_groups when smart inventories are used
- Fixed a partial password disclosure when special characters existed in the RabbitMQ password (CVE-2019-19342)
- Fixed a hang in error handling for source control checkouts
- Fixed an error on subsequent job runs that override the branch of a project on an instance that did not have a prior project checkout
- Fixed an issue where jobs launched in isolated or container groups would incorrectly time out
- Fixed an incorrect link to the instance groups documentation in the user interface
- Fixed editing of inventory on Workflow templates
- Fixed multiple issues with OAuth2 token cleanup system jobs
- Fixed a bug that broke email notifications for workflow approval/deny (https://github.com/ansible/awx/issues/5401)
- Updated the SAML implementation to automatically log in if authorization already exists
- Updated AngularJS to 1.7.9 for CVE-2019-10768

## 9.0.1 (Nov 4, 2019)

- Fixed a bug in the installer that broke certain types of k8s installs (https://github.com/ansible/awx/issues/5205)

## 9.0.0 (Oct 31, 2019)

- Updated AWX images to use centos:8 as the parent image.
- Updated to ansible-runner 1.4.4 to address various bugs.
- Added oc and kubectl to the AWX images to support the new container-based execution introduced in 8.0.0.
- Added some optimizations to speed up the deletion of large Inventory Groups.
- Fixed a bug that broke webhook launches for Job Templates that define a survey (https://github.com/ansible/awx/issues/5062).
- Fixed a bug in the CLI which incorrectly parsed launch-time arguments for `awx job_templates launch` and `awx workflow_job_templates launch` (https://github.com/ansible/awx/issues/5093).
- Fixed a bug that caused inventory updates using "sourced from a project" to stop working (https://github.com/ansible/awx/issues/4750).
- Fixed a bug that caused Slack notifications to sometimes show the wrong bot avatar (https://github.com/ansible/awx/pull/5125).
- Fixed a bug that prevented the use of digits in Tower's URL settings (https://github.com/ansible/awx/issues/5081).

## 8.0.0 (Oct 21, 2019)

- The Ansible Tower Ansible modules have been migrated to a new official Ansible AWX collection: https://galaxy.ansible.com/awx/AWX
  Please note that this functionality is only supported in Ansible 2.9+.
- AWX now supports the ability to launch jobs from external webhooks (GitHub and GitLab integrations are supported).
- AWX now supports Container Groups, a new feature that allows you to schedule and run playbooks on single-use Kubernetes pods on demand.
- AWX now supports sending notifications when Workflow steps are approved, denied, or time out.
- AWX now records the user who approved or denied Workflow steps.
- AWX now supports fetching Ansible Collections from private Galaxy servers.
- AWX now checks the user's ansible.cfg for paths where roles/collections may live when running project updates.
- AWX now uses PostgreSQL 10 by default.
- AWX now warns more loudly about underlying AMQP connectivity issues (https://github.com/ansible/awx/pull/4857).
- Added a few optimizations to drastically improve dashboard performance for larger AWX installs (installs with several hundred thousand jobs or more).
- Updated to the latest version of Ansible's VMware inventory script (which adds support for vmware_guest_facts).
- Deprecated /api/v2/inventory_scripts/ (this endpoint, and the Custom Inventory Script feature, will be removed in a future release of AWX).
- Fixed a bug which prevented Organization Admins from removing users from their own Organization (https://github.com/ansible/awx/issues/2979)
- Fixed a bug which sometimes caused cluster nodes to fail to re-join with a cryptic error, "No instance found with the current cluster host id" (https://github.com/ansible/awx/issues/4294)
- Fixed a bug that prevented the use of launch-time passphrases when using credential plugins (https://github.com/ansible/awx/pull/4807)
- Fixed a bug that caused notifications assigned at the Organization level not to take effect for Workflows in that Organization (https://github.com/ansible/awx/issues/4712)
- Fixed a bug which caused a notable amount of CPU overhead on RabbitMQ health checks (https://github.com/ansible/awx/pull/5009)
- Fixed a bug which sometimes caused the `<return>` key to stop functioning in `<textarea>` elements (https://github.com/ansible/awx/issues/4192)
- Fixed a bug which caused request contention when the same OAuth2.0 token was used in multiple simultaneous requests (https://github.com/ansible/awx/issues/4694)
- Fixed a bug related to parsing multiple-choice survey options (https://github.com/ansible/awx/issues/4452).
- Fixed a bug that caused single-sign-on icons on the login page to fail to render in certain Windows browsers (https://github.com/ansible/awx/issues/3924)
- Fixed a number of bugs that caused certain OAuth2 settings, such as REFRESH_TOKEN_EXPIRE_SECONDS, to not be properly respected.
- Fixed a number of bugs in the AWX CLI, including a bug which sometimes caused long lines of stdout output to be unexpectedly truncated.
- Fixed a number of bugs on the job details UI which sometimes caused auto-scrolling stdout to become stuck.
- Fixed a bug which caused LDAP authentication to fail if the TLD of the server URL contained digits (https://github.com/ansible/awx/issues/3646)
- Fixed a bug which broke HashiCorp Vault integration on older versions of HashiCorp Vault.

## 7.0.0 (Sept 4, 2019)

- AWX now detects and installs Ansible Collections defined in your project (note: this feature only works in Ansible 2.9+) (https://github.com/ansible/awx/issues/2534)
- AWX now includes an official command line client. Keep an eye out for a follow-up email on this mailing list for information on how to install it and try it out.
- Added the ability to provide a specific SCM branch on jobs (https://github.com/ansible/awx/issues/282)
- Added support for Workflow Approval Nodes, a new feature which allows you to add "pause and wait for approval" steps into your workflows (https://github.com/ansible/awx/issues/1206)
- Added the ability to specify a specific HTTP method for webhook notifications (POST vs. PUT) (https://github.com/ansible/awx/pull/4124)
- Added the ability to specify a username and password for HTTP Basic Authorization for webhook notifications (https://github.com/ansible/awx/pull/4124)
- Added support for customizing the text content of notifications (https://github.com/ansible/awx/issues/79)
- Added the ability to enable and disable hosts in dynamic inventory (https://github.com/ansible/awx/pull/4420)
- Added the description (if any) to the Job Template list (https://github.com/ansible/awx/issues/4359)
- Added new metrics for instance hostnames and pending jobs to the /api/v2/metrics/ endpoint (https://github.com/ansible/awx/pull/4375)
- Changed AWX's on/off toggle buttons to a non-text-based style to simplify internationalization (https://github.com/ansible/awx/pull/4425)
- Events emitted by Ansible for ad hoc commands are now sent to the external log aggregator (https://github.com/ansible/awx/issues/4545)
- Fixed a bug which allowed a user to make an organization credential in another organization without permissions to that organization (https://github.com/ansible/awx/pull/4483)
- Fixed a bug that caused `extra_vars` on workflows to break when edited (https://github.com/ansible/awx/issues/4293)
- Fixed a slow SQL query that caused performance issues when large numbers of groups exist (https://github.com/ansible/awx/issues/4461)
- Fixed a few minor bugs in survey field validation (https://github.com/ansible/awx/pull/4509) (https://github.com/ansible/awx/pull/4479)
- Fixed a bug that sometimes resulted in orphaned `ansible_runner_pi` directories in `/tmp` after playbook execution (https://github.com/ansible/awx/pull/4409)
- Fixed a bug that caused the `is_system_auditor` flag in LDAP configuration to not work (https://github.com/ansible/awx/pull/4396)
- Fixed a bug which caused schedules to disappear from the UI when toggled off (https://github.com/ansible/awx/pull/4378)
- Fixed a bug that sometimes caused stdout content to contain extraneous blank lines in newer versions of Ansible (https://github.com/ansible/awx/pull/4391)
- Updated to the latest Django security release, 2.2.4 (https://github.com/ansible/awx/pull/4410) (https://www.djangoproject.com/weblog/2019/aug/01/security-releases/)
- Updated the default version of git to a version that includes support for x509 certificates (https://github.com/ansible/awx/issues/4362)
- Removed the deprecated `credential` field from `/api/v2/workflow_job_templates/N/` (as part of the `/api/v1/` removal in prior AWX versions; https://github.com/ansible/awx/pull/4490).

## 6.1.0 (Jul 18, 2019)

- Updated AWX to use Django 2.2.2.
- Updated the provided openstacksdk version to support new functionality (such as Nova scheduler_hints)
- Added the ability to specify a custom cacert for the HashiCorp Vault credential plugin
- Fixed a number of bugs related to path lookups for the HashiCorp Vault credential plugin
- Fixed a bug which prevented signed SSH certificates from working, including the HashiCorp Vault Signed SSH backend
- Fixed a bug which prevented custom logos from displaying on the login page (as a result of a new Content Security Policy in 6.0.0)
- Fixed a bug which broke websocket connectivity in Apple Safari (as a result of a new Content Security Policy in 6.0.0)
- Fixed a bug on the job output page that occasionally caused the "up" and "down" buttons to not load additional output
- Fixed a bug on the job output page that caused quoted task names to display incorrectly

## 6.0.0 (Jul 1, 2019)

- Removed support for "Any" notification templates and their API endpoints, e.g., /api/v2/job_templates/N/notification_templates/any/ (https://github.com/ansible/awx/issues/4022)
- Fixed a bug which prevented credentials from properly being applied to inventory sources (https://github.com/ansible/awx/issues/4059)
- Fixed a bug which could cause the task dispatcher to hang indefinitely when external logging support (e.g., Splunk, Logstash) is enabled (https://github.com/ansible/awx/issues/4181)
- Fixed a bug which caused slow stdout display when running jobs against smart inventories (https://github.com/ansible/awx/issues/3106)
- Fixed a bug that caused SSL verification flags to fail to be respected for LDAP authentication in certain environments (https://github.com/ansible/awx/pull/4190)
- Added a simple Content Security Policy (https://developer.mozilla.org/en-US/docs/Web/HTTP/CSP) to restrict access to third-party resources in the browser (https://github.com/ansible/awx/pull/4167)
- Updated ovirt4 library dependencies to work with newer versions of oVirt (https://github.com/ansible/awx/issues/4138)

## 5.0.0 (Jun 21, 2019)

- Bumped Django REST Framework from 3.7.7 to 3.9.4
- Bumped setuptools / pip dependencies
- Fixed a bug where the Recent Notification list would not appear
- Added notifications on job start
- Defaulted to Ansible 2.8
49
Makefile
@@ -26,6 +26,9 @@ DEV_DOCKER_TAG_BASE ?= gcr.io/ansible-tower-engineering
# Python packages to install only from source (not from binary wheels)
# Comma separated list
SRC_ONLY_PKGS ?= cffi,pycparser,psycopg2,twilio
# These should be upgraded in the AWX and Ansible venv before attempting
# to install the actual requirements
VENV_BOOTSTRAP ?= pip==19.3.1 setuptools==41.6.0

# Determine appropriate shasum command
UNAME_S := $(shell uname -s)

@@ -130,16 +133,16 @@ guard-%:

virtualenv: virtualenv_ansible virtualenv_awx

# virtualenv_* targets do not use --system-site-packages to prevent bugs installing packages
# but Ansible venvs are expected to have this, so that must be done after venv creation
virtualenv_ansible:
if [ "$(VENV_BASE)" ]; then \
if [ ! -d "$(VENV_BASE)" ]; then \
mkdir $(VENV_BASE); \
fi; \
if [ ! -d "$(VENV_BASE)/ansible" ]; then \
virtualenv -p python --system-site-packages $(VENV_BASE)/ansible && \
$(VENV_BASE)/ansible/bin/pip install $(PIP_OPTIONS) --ignore-installed six packaging appdirs && \
$(VENV_BASE)/ansible/bin/pip install $(PIP_OPTIONS) --ignore-installed setuptools==36.0.1 && \
$(VENV_BASE)/ansible/bin/pip install $(PIP_OPTIONS) --ignore-installed pip==9.0.1; \
virtualenv -p python $(VENV_BASE)/ansible && \
$(VENV_BASE)/ansible/bin/pip install $(PIP_OPTIONS) $(VENV_BOOTSTRAP); \
fi; \
fi
@@ -149,36 +152,46 @@ virtualenv_ansible_py3:
mkdir $(VENV_BASE); \
fi; \
if [ ! -d "$(VENV_BASE)/ansible" ]; then \
$(PYTHON) -m venv --system-site-packages $(VENV_BASE)/ansible; \
virtualenv -p $(PYTHON) $(VENV_BASE)/ansible; \
$(VENV_BASE)/ansible/bin/pip install $(PIP_OPTIONS) $(VENV_BOOTSTRAP); \
fi; \
fi

# flit is needed for offline install of certain packages, specifically ptyprocess
# it is needed for setup, but not always recognized as a setup dependency
# similar to pip, setuptools, and wheel, these are all needed here as a bootstrapping issue
virtualenv_awx:
if [ "$(VENV_BASE)" ]; then \
if [ ! -d "$(VENV_BASE)" ]; then \
mkdir $(VENV_BASE); \
fi; \
if [ ! -d "$(VENV_BASE)/awx" ]; then \
$(PYTHON) -m venv --system-site-packages $(VENV_BASE)/awx; \
$(VENV_BASE)/awx/bin/pip install $(PIP_OPTIONS) --ignore-installed docutils==0.14; \
virtualenv -p $(PYTHON) $(VENV_BASE)/awx; \
$(VENV_BASE)/awx/bin/pip install $(PIP_OPTIONS) $(VENV_BOOTSTRAP) && \
$(VENV_BASE)/awx/bin/pip install $(PIP_OPTIONS) flit; \
fi; \
fi

# --ignore-installed flag is not used because *.txt files should specify exact versions
requirements_ansible: virtualenv_ansible
if [[ "$(PIP_OPTIONS)" == *"--no-index"* ]]; then \
cat requirements/requirements_ansible.txt requirements/requirements_ansible_local.txt | $(VENV_BASE)/ansible/bin/pip install $(PIP_OPTIONS) --ignore-installed -r /dev/stdin ; \
cat requirements/requirements_ansible.txt requirements/requirements_ansible_local.txt | $(VENV_BASE)/ansible/bin/pip install $(PIP_OPTIONS) -r /dev/stdin ; \
else \
cat requirements/requirements_ansible.txt requirements/requirements_ansible_git.txt | $(VENV_BASE)/ansible/bin/pip install $(PIP_OPTIONS) --no-binary $(SRC_ONLY_PKGS) --ignore-installed -r /dev/stdin ; \
cat requirements/requirements_ansible.txt requirements/requirements_ansible_git.txt | $(VENV_BASE)/ansible/bin/pip install $(PIP_OPTIONS) --no-binary $(SRC_ONLY_PKGS) -r /dev/stdin ; \
fi
$(VENV_BASE)/ansible/bin/pip uninstall --yes -r requirements/requirements_ansible_uninstall.txt
# Same effect as using --system-site-packages flag on venv creation
rm $(shell ls -d $(VENV_BASE)/ansible/lib/python* | head -n 1)/no-global-site-packages.txt

requirements_ansible_py3: virtualenv_ansible_py3
if [[ "$(PIP_OPTIONS)" == *"--no-index"* ]]; then \
cat requirements/requirements_ansible.txt requirements/requirements_ansible_local.txt | $(VENV_BASE)/ansible/bin/pip3 install $(PIP_OPTIONS) --ignore-installed -r /dev/stdin ; \
cat requirements/requirements_ansible.txt requirements/requirements_ansible_local.txt | $(VENV_BASE)/ansible/bin/pip3 install $(PIP_OPTIONS) -r /dev/stdin ; \
else \
cat requirements/requirements_ansible.txt requirements/requirements_ansible_git.txt | $(VENV_BASE)/ansible/bin/pip3 install $(PIP_OPTIONS) --no-binary $(SRC_ONLY_PKGS) --ignore-installed -r /dev/stdin ; \
cat requirements/requirements_ansible.txt requirements/requirements_ansible_git.txt | $(VENV_BASE)/ansible/bin/pip3 install $(PIP_OPTIONS) --no-binary $(SRC_ONLY_PKGS) -r /dev/stdin ; \
fi
$(VENV_BASE)/ansible/bin/pip3 uninstall --yes -r requirements/requirements_ansible_uninstall.txt
# Same effect as using --system-site-packages flag on venv creation
rm $(shell ls -d $(VENV_BASE)/ansible/lib/python* | head -n 1)/no-global-site-packages.txt

requirements_ansible_dev:
if [ "$(VENV_BASE)" ]; then \
@@ -186,13 +199,13 @@ requirements_ansible_dev:
fi

# Install third-party requirements needed for AWX's environment.
# this does not use system site packages intentionally
requirements_awx: virtualenv_awx
if [[ "$(PIP_OPTIONS)" == *"--no-index"* ]]; then \
cat requirements/requirements.txt requirements/requirements_local.txt | $(VENV_BASE)/awx/bin/pip install $(PIP_OPTIONS) --ignore-installed -r /dev/stdin ; \
cat requirements/requirements.txt requirements/requirements_local.txt | $(VENV_BASE)/awx/bin/pip install $(PIP_OPTIONS) -r /dev/stdin ; \
else \
cat requirements/requirements.txt requirements/requirements_git.txt | $(VENV_BASE)/awx/bin/pip install $(PIP_OPTIONS) --no-binary $(SRC_ONLY_PKGS) --ignore-installed -r /dev/stdin ; \
cat requirements/requirements.txt requirements/requirements_git.txt | $(VENV_BASE)/awx/bin/pip install $(PIP_OPTIONS) --no-binary $(SRC_ONLY_PKGS) -r /dev/stdin ; \
fi
echo "include-system-site-packages = true" >> $(VENV_BASE)/awx/lib/python$(PYTHON_VERSION)/pyvenv.cfg
$(VENV_BASE)/awx/bin/pip uninstall --yes -r requirements/requirements_tower_uninstall.txt
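The `echo "include-system-site-packages = true" >> .../pyvenv.cfg` step above (and the `rm .../no-global-site-packages.txt` step in the ansible targets) re-enable system site-packages after venv creation. A minimal Python sketch of the pyvenv.cfg variant, using throwaway paths rather than AWX's:

```python
import pathlib
import tempfile
import venv

# Create a venv without system site-packages, then flip the flag by editing
# pyvenv.cfg -- the same effect as the Makefile's `echo >> pyvenv.cfg` step.
env_dir = pathlib.Path(tempfile.mkdtemp()) / "awx-venv"
venv.EnvBuilder(system_site_packages=False, with_pip=False).create(env_dir)

cfg = env_dir / "pyvenv.cfg"
lines = [
    "include-system-site-packages = true"
    if line.startswith("include-system-site-packages")
    else line
    for line in cfg.read_text().splitlines()
]
cfg.write_text("\n".join(lines) + "\n")
```

Editing the config file is equivalent to passing `--system-site-packages` at creation time, but it can be applied after packages have already been installed into the isolated venv.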
requirements_awx_dev:

@@ -395,7 +408,7 @@ test_collection:
@if [ "$(VENV_BASE)" ]; then \
. $(VENV_BASE)/awx/bin/activate; \
fi; \
PYTHONPATH=$(COLLECTION_VENV):/awx_devel/awx_collection:$PYTHONPATH py.test $(COLLECTION_TEST_DIRS)
PYTHONPATH=$(COLLECTION_VENV):/awx_devel/awx_collection:$PYTHONPATH:/usr/lib/python3.6/site-packages py.test $(COLLECTION_TEST_DIRS)

flake8_collection:
flake8 awx_collection/ # Different settings, in main exclude list

@@ -411,7 +424,11 @@ test_collection_sanity:

build_collection:
ansible-playbook -i localhost, awx_collection/template_galaxy.yml -e collection_package=$(COLLECTION_PACKAGE) -e collection_namespace=$(COLLECTION_NAMESPACE) -e collection_version=$(VERSION)
ansible-galaxy collection build awx_collection --output-path=awx_collection
ansible-galaxy collection build awx_collection --force --output-path=awx_collection

install_collection: build_collection
rm -rf ~/.ansible/collections/ansible_collections/awx/awx
ansible-galaxy collection install awx_collection/awx-awx-$(VERSION).tar.gz

test_unit:
@if [ "$(VENV_BASE)" ]; then \
@@ -24,31 +24,18 @@ except ImportError: # pragma: no cover
import hashlib

try:
import django
from django.db.backends.base import schema
from django.db.backends.utils import names_digest
import django  # noqa: F401
HAS_DJANGO = True
except ImportError:
HAS_DJANGO = False
else:
from django.db.backends.base import schema
from django.db.backends.utils import names_digest


if HAS_DJANGO is True:
# This line exists to make sure we don't regress on FIPS support if we
# upgrade Django; if you're upgrading Django and see this error,
# update the version check below, and confirm that FIPS still works.
# If operating in a FIPS environment, `hashlib.md5()` will raise a `ValueError`,
# but will support the `usedforsecurity` keyword on RHEL and CentOS systems.

# Keep an eye on https://code.djangoproject.com/ticket/28401
target_version = '2.2.4'
if django.__version__ != target_version:
raise RuntimeError(
"Django version other than {target} detected: {current}. "
"Overriding `names_digest` is known to work for Django {target} "
"and may not work in other Django versions.".format(target=target_version,
current=django.__version__)
)

# See upgrade blocker note in requirements/README.md
try:
names_digest('foo', 'bar', 'baz', length=8)
except ValueError:
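The comments above explain why AWX pins Django and probes `names_digest`: on FIPS-enabled hosts a plain `hashlib.md5()` raises `ValueError`, while RHEL/CentOS Python builds accept `usedforsecurity=False` for non-cryptographic uses such as index naming. A hedged sketch of such an override (the helper name is illustrative; the actual patch replaces `django.db.backends.utils.names_digest`):

```python
import hashlib

def fips_friendly_names_digest(*args, length):
    """Mimic Django's names_digest while tolerating FIPS-enabled builds."""
    try:
        # Accepted on Python builds exposing the keyword (e.g. RHEL/CentOS);
        # marks the digest as non-security so FIPS mode allows MD5.
        h = hashlib.md5(usedforsecurity=False)
    except TypeError:
        # Older builds without the keyword: fall back to plain md5
        h = hashlib.md5()
    for arg in args:
        h.update(arg.encode())
    return h.hexdigest()[:length]
```

For identical inputs this produces the same truncated digest as Django's own implementation, so generated index names stay stable across the swap.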
@@ -67,6 +67,7 @@ register(
field_class=fields.CharField,
allow_blank=True,
required=False,
default='',
label=_('Login redirect override URL'),
help_text=_('URL to which unauthorized users will be redirected to log in. '
'If blank, users will be sent to the Tower login page.'),

@@ -141,6 +141,7 @@ SUMMARIZABLE_FK_FIELDS = {
'target_credential': DEFAULT_SUMMARY_FIELDS + ('kind', 'cloud', 'credential_type_id'),
'webhook_credential': DEFAULT_SUMMARY_FIELDS,
'approved_or_denied_by': ('id', 'username', 'first_name', 'last_name'),
'credential_type': DEFAULT_SUMMARY_FIELDS,
}
@@ -3873,26 +3874,6 @@ class JobEventSerializer(BaseSerializer):
return data


class JobEventWebSocketSerializer(JobEventSerializer):
created = serializers.SerializerMethodField()
modified = serializers.SerializerMethodField()
event_name = serializers.CharField(source='event')
group_name = serializers.SerializerMethodField()

class Meta:
model = JobEvent
fields = ('*', 'event_name', 'group_name',)

def get_created(self, obj):
return obj.created.isoformat()

def get_modified(self, obj):
return obj.modified.isoformat()

def get_group_name(self, obj):
return 'job_events'


class ProjectUpdateEventSerializer(JobEventSerializer):
stdout = serializers.SerializerMethodField()
event_data = serializers.SerializerMethodField()

@@ -3924,26 +3905,6 @@ class ProjectUpdateEventSerializer(JobEventSerializer):
return {}


class ProjectUpdateEventWebSocketSerializer(ProjectUpdateEventSerializer):
created = serializers.SerializerMethodField()
modified = serializers.SerializerMethodField()
event_name = serializers.CharField(source='event')
group_name = serializers.SerializerMethodField()

class Meta:
model = ProjectUpdateEvent
fields = ('*', 'event_name', 'group_name',)

def get_created(self, obj):
return obj.created.isoformat()

def get_modified(self, obj):
return obj.modified.isoformat()

def get_group_name(self, obj):
return 'project_update_events'


class AdHocCommandEventSerializer(BaseSerializer):

event_display = serializers.CharField(source='get_event_display', read_only=True)

@@ -3975,26 +3936,6 @@ class AdHocCommandEventSerializer(BaseSerializer):
return data


class AdHocCommandEventWebSocketSerializer(AdHocCommandEventSerializer):
created = serializers.SerializerMethodField()
modified = serializers.SerializerMethodField()
event_name = serializers.CharField(source='event')
group_name = serializers.SerializerMethodField()

class Meta:
model = AdHocCommandEvent
fields = ('*', 'event_name', 'group_name',)

def get_created(self, obj):
return obj.created.isoformat()

def get_modified(self, obj):
return obj.modified.isoformat()

def get_group_name(self, obj):
return 'ad_hoc_command_events'


class InventoryUpdateEventSerializer(AdHocCommandEventSerializer):

class Meta:
@@ -4010,26 +3951,6 @@ class InventoryUpdateEventSerializer(AdHocCommandEventSerializer):
return res


class InventoryUpdateEventWebSocketSerializer(InventoryUpdateEventSerializer):
created = serializers.SerializerMethodField()
modified = serializers.SerializerMethodField()
event_name = serializers.CharField(source='event')
group_name = serializers.SerializerMethodField()

class Meta:
model = InventoryUpdateEvent
fields = ('*', 'event_name', 'group_name',)

def get_created(self, obj):
return obj.created.isoformat()

def get_modified(self, obj):
return obj.modified.isoformat()

def get_group_name(self, obj):
return 'inventory_update_events'


class SystemJobEventSerializer(AdHocCommandEventSerializer):

class Meta:
@@ -4045,26 +3966,6 @@ class SystemJobEventSerializer(AdHocCommandEventSerializer):
return res


class SystemJobEventWebSocketSerializer(SystemJobEventSerializer):
created = serializers.SerializerMethodField()
modified = serializers.SerializerMethodField()
event_name = serializers.CharField(source='event')
group_name = serializers.SerializerMethodField()

class Meta:
model = SystemJobEvent
fields = ('*', 'event_name', 'group_name',)

def get_created(self, obj):
return obj.created.isoformat()

def get_modified(self, obj):
return obj.modified.isoformat()

def get_group_name(self, obj):
return 'system_job_events'
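The five `*WebSocketSerializer` classes removed above are near-identical: each renames `event` to `event_name`, attaches a channels group name, and serializes `created`/`modified` as ISO 8601. Their shared behavior can be sketched in plain Python (no DRF dependency; `to_websocket_payload` is an illustrative name, not AWX code):

```python
from datetime import datetime, timezone

def to_websocket_payload(event, group_name):
    # Rename `event` -> `event_name` and tag the channels group,
    # mirroring what the removed *WebSocketSerializer classes did.
    payload = dict(event)
    payload["event_name"] = payload.pop("event")
    payload["group_name"] = group_name
    # Emit ISO 8601 timestamps, as get_created/get_modified did.
    for key in ("created", "modified"):
        payload[key] = payload[key].isoformat()
    return payload

evt = {
    "event": "runner_on_ok",
    "created": datetime(2020, 1, 14, tzinfo=timezone.utc),
    "modified": datetime(2020, 1, 14, tzinfo=timezone.utc),
}
msg = to_websocket_payload(evt, "job_events")
```

Because the only per-class variation was the model and the group name string, removing the five subclasses trades a small amount of declarativeness for much less duplicated code.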

class JobLaunchSerializer(BaseSerializer):

# Representational fields
@@ -2549,7 +2549,7 @@ class JobTemplateSurveySpec(GenericAPIView):
|
||||
if not isinstance(val, allow_types):
|
||||
return Response(dict(error=_("'{field_name}' in survey question {idx} expected to be {type_label}.").format(
|
||||
field_name=field_name, type_label=type_label, **context
|
||||
)))
|
||||
)), status=status.HTTP_400_BAD_REQUEST)
|
||||
if survey_item['variable'] in variable_set:
|
||||
return Response(dict(error=_("'variable' '%(item)s' duplicated in survey question %(survey)s.") % {
|
||||
'item': survey_item['variable'], 'survey': str(idx)}), status=status.HTTP_400_BAD_REQUEST)
|
||||
@@ -2564,7 +2564,7 @@ class JobTemplateSurveySpec(GenericAPIView):
|
||||
"'{survey_item[type]}' in survey question {idx} is not one of '{allowed_types}' allowed question types."
|
||||
).format(
|
||||
allowed_types=', '.join(JobTemplateSurveySpec.ALLOWED_TYPES.keys()), **context
|
||||
)))
|
||||
)), status=status.HTTP_400_BAD_REQUEST)
|
||||
if 'default' in survey_item and survey_item['default'] != '':
|
||||
if not isinstance(survey_item['default'], JobTemplateSurveySpec.ALLOWED_TYPES[qtype]):
|
||||
type_label = 'string'
|
||||
@@ -2582,7 +2582,7 @@ class JobTemplateSurveySpec(GenericAPIView):
|
||||
if survey_item[key] is not None and (not isinstance(survey_item[key], int)):
|
||||
return Response(dict(error=_(
|
||||
"The {min_or_max} limit in survey question {idx} expected to be integer."
|
||||
).format(min_or_max=key, **context)))
|
||||
).format(min_or_max=key, **context)), status=status.HTTP_400_BAD_REQUEST)
|
||||
# if it's a multiselect or multiple choice, it must have coices listed
|
||||
# choices and defualts must come in as strings seperated by /n characters.
|
||||
if qtype == 'multiselect' or qtype == 'multiplechoice':
|
||||
@@ -2592,7 +2592,7 @@ class JobTemplateSurveySpec(GenericAPIView):
|
||||
else:
|
||||
return Response(dict(error=_(
|
||||
"Survey question {idx} of type {survey_item[type]} must specify choices.".format(**context)
|
||||
)))
|
||||
)), status=status.HTTP_400_BAD_REQUEST)
|
||||
# If there is a default string split it out removing extra /n characters.
|
||||
# Note: There can still be extra newline characters added in the API, these are sanitized out using .strip()
|
||||
if 'default' in survey_item:
|
||||
@@ -2606,11 +2606,11 @@ class JobTemplateSurveySpec(GenericAPIView):
|
||||
if len(list_of_defaults) > 1:
|
||||
return Response(dict(error=_(
|
||||
"Multiple Choice (Single Select) can only have one default value.".format(**context)
|
||||
)))
|
||||
)), status=status.HTTP_400_BAD_REQUEST)
|
||||
if any(item not in survey_item['choices'] for item in list_of_defaults):
|
||||
return Response(dict(error=_(
|
||||
"Default choice must be answered from the choices listed.".format(**context)
|
||||
)))
|
||||
)), status=status.HTTP_400_BAD_REQUEST)
|
||||
|
||||
# Process encryption substitution
|
||||
if ("default" in survey_item and isinstance(survey_item['default'], str) and
|
||||
|
||||
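The hunks above change the survey-spec validation errors to return HTTP 400. The choice/default rules they enforce can be sketched standalone; this stdlib-only sketch (the helper name and error strings are illustrative, not AWX's API) splits newline-separated choices and defaults, strips stray whitespace with `.strip()`, and applies the one-default rule for `multiplechoice` questions:

```python
def validate_choice_default(qtype, default, choices):
    """Return an error string for an invalid survey default, or None.

    Choices and defaults arrive as newline-separated strings; extra
    newline characters are sanitized out with .strip().
    """
    if not choices:
        return 'Survey question of type {} must specify choices.'.format(qtype)
    choice_list = [c.strip() for c in choices.splitlines() if c.strip()]
    defaults = [d.strip() for d in default.splitlines() if d.strip()]
    # Multiple Choice (Single Select) can only have one default value
    if qtype == 'multiplechoice' and len(defaults) > 1:
        return 'Multiple Choice (Single Select) can only have one default value.'
    # every default must be one of the listed choices
    if any(d not in choice_list for d in defaults):
        return 'Default choice must be answered from the choices listed.'
    return None
```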
@@ -28,6 +28,8 @@ from awx.conf import settings_registry
from awx.conf.models import Setting
from awx.conf.migrations._reencrypt import decrypt_field as old_decrypt_field

+import cachetools
+
# FIXME: Gracefully handle when settings are accessed before the database is
# ready (or during migrations).

@@ -136,6 +138,14 @@ def filter_sensitive(registry, key, value):
    return value


+# settings.__getattr__ is called *constantly*, and the LOG_AGGREGATOR_ ones are
+# so ubiquitous when external logging is enabled that they should be kept in memory
+# with a short TTL to avoid even having to contact memcached
+# the primary use case for this optimization is the callback receiver
+# when external logging is enabled
+LOGGING_SETTINGS_CACHE = cachetools.TTLCache(maxsize=50, ttl=1)
+
+
class EncryptedCacheProxy(object):

    def __init__(self, cache, registry, encrypter=None, decrypter=None):

@@ -437,11 +447,17 @@ class SettingsWrapper(UserSettingsHolder):
        return self._get_default('SETTINGS_MODULE')

    def __getattr__(self, name):
+        if name.startswith('LOG_AGGREGATOR_'):
+            cached = LOGGING_SETTINGS_CACHE.get(name)
+            if cached:
+                return cached
        value = empty
        if name in self.all_supported_settings:
            with _ctit_db_wrapper(trans_safe=True):
                value = self._get_local(name)
            if value is not empty:
+                if name.startswith('LOG_AGGREGATOR_'):
+                    LOGGING_SETTINGS_CACHE[name] = value
                return value
        value = self._get_default(name)
        # sometimes users specify RabbitMQ passwords that contain
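The commit above fronts the settings cache with `cachetools.TTLCache(maxsize=50, ttl=1)` so hot `LOG_AGGREGATOR_*` lookups are answered from process memory for up to a second. A rough stdlib-only stand-in for that behavior (the class is illustrative; the real `TTLCache` also maintains proper LRU and expiry ordering):

```python
import time


class TinyTTLCache:
    """Minimal sketch of TTLCache(maxsize=50, ttl=1): entries expire
    after `ttl` seconds, so frequently-read values are served from
    memory and re-fetched from the backing store at most ~once per ttl."""

    def __init__(self, maxsize=50, ttl=1.0):
        self.maxsize = maxsize
        self.ttl = ttl
        self._data = {}  # name -> (expires_at, value)

    def get(self, name, default=None):
        entry = self._data.get(name)
        if entry is None:
            return default
        expires_at, value = entry
        if time.monotonic() >= expires_at:
            # entry aged out; drop it and miss
            del self._data[name]
            return default
        return value

    def __setitem__(self, name, value):
        if name not in self._data and len(self._data) >= self.maxsize:
            # evict an arbitrary entry; cachetools evicts expired/LRU first
            self._data.pop(next(iter(self._data)))
        self._data[name] = (time.monotonic() + self.ttl, value)
```

With `ttl=1`, even a callback receiver hammering `settings.LOG_AGGREGATOR_ENABLED` touches the slow path at most once per second.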
@@ -52,7 +52,7 @@ def config(since):
        'tower_version': get_awx_version(),
        'ansible_version': get_ansible_version(),
        'license_type': license_info.get('license_type', 'UNLICENSED'),
-        'free_instances': license_info.get('free instances', 0),
+        'free_instances': license_info.get('free_instances', 0),
        'license_expiry': license_info.get('time_remaining', 0),
        'pendo_tracking': settings.PENDO_TRACKING_STATE,
        'authentication_backends': settings.AUTHENTICATION_BACKENDS,
@@ -1,17 +1,8 @@
from django.apps import AppConfig
-from django.db.models.signals import pre_migrate
from django.utils.translation import ugettext_lazy as _


-def raise_migration_flag(**kwargs):
-    from awx.main.tasks import set_migration_flag
-    set_migration_flag.delay()
-
-
class MainConfig(AppConfig):

    name = 'awx.main'
    verbose_name = _('Main')
-
-    def ready(self):
-        pre_migrate.connect(raise_migration_flag, sender=self)
@@ -132,7 +132,7 @@ class PoolWorker(object):
            # when this occurs, it's _fine_ to ignore this KeyError because
            # the purpose of self.managed_tasks is to just track internal
            # state of which events are *currently* being processed.
            pass
        logger.warn('Event UUID {} appears to be have been duplicated.'.format(uuid))

    @property
    def current_task(self):

@@ -277,7 +277,7 @@ class WorkerPool(object):
            logger.warn("could not write to queue %s" % preferred_queue)
            logger.warn("detail: {}".format(tb))
        write_attempt_order.append(preferred_queue)
-        logger.warn("could not write payload to any queue, attempted order: {}".format(write_attempt_order))
+        logger.error("could not write payload to any queue, attempted order: {}".format(write_attempt_order))
        return None

    def stop(self, signum):
@@ -119,6 +119,9 @@ class AWXConsumer(ConsumerMixin):

class BaseWorker(object):

+    def read(self, queue):
+        return queue.get(block=True, timeout=1)
+
    def work_loop(self, queue, finished, idx, *args):
        ppid = os.getppid()
        signal_handler = WorkerSignalHandler()
@@ -128,7 +131,7 @@ class BaseWorker(object):
            if os.getppid() != ppid:
                break
            try:
-                body = queue.get(block=True, timeout=1)
+                body = self.read(queue)
                if body == 'QUIT':
                    break
            except QueueEmpty:
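The `read()` hook added above lets subclasses change how a worker pulls messages; the callback worker (next diff) overrides it to turn a queue timeout into a `{'event': 'FLUSH'}` sentinel instead of propagating the exception. A minimal sketch of that pattern with the stdlib `queue` module (class names are illustrative):

```python
import queue


class BaseWorker:
    # default behavior: block up to a second waiting for the next message
    def read(self, q):
        return q.get(block=True, timeout=1)


class BufferedWorker(BaseWorker):
    # override: an empty queue is not an error, it is a signal to flush
    # buffered work, mirroring CallbackBrokerWorker.read()
    def read(self, q):
        try:
            return q.get(block=True, timeout=0.1)
        except queue.Empty:
            return {'event': 'FLUSH'}
```

The work loop itself never changes; it just calls `self.read(queue)` and handles whatever comes back, which is why the refactor in the hunk above is safe.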
@@ -1,19 +1,26 @@
import logging
import time
import traceback
+from queue import Empty as QueueEmpty

from django.utils.timezone import now as tz_now
from django.conf import settings
from django.db import DatabaseError, OperationalError, connection as django_connection
-from django.db.utils import InterfaceError, InternalError
+from django.db.utils import InterfaceError, InternalError, IntegrityError

from awx.main.consumers import emit_channel_notification
from awx.main.models import (JobEvent, AdHocCommandEvent, ProjectUpdateEvent,
                             InventoryUpdateEvent, SystemJobEvent, UnifiedJob)
+from awx.main.models.events import emit_event_detail

from .base import BaseWorker

logger = logging.getLogger('awx.main.commands.run_callback_receiver')

+# the number of seconds to buffer events in memory before flushing
+# using JobEvent.objects.bulk_create()
+BUFFER_SECONDS = .1
+

class CallbackBrokerWorker(BaseWorker):
    '''
@@ -26,89 +33,112 @@ class CallbackBrokerWorker(BaseWorker):

    MAX_RETRIES = 2

+    def __init__(self):
+        self.buff = {}
+
+    def read(self, queue):
+        try:
+            return queue.get(block=True, timeout=BUFFER_SECONDS)
+        except QueueEmpty:
+            return {'event': 'FLUSH'}
+
+    def flush(self, force=False):
+        now = tz_now()
+        if (
+            force or
+            any([len(events) >= 1000 for events in self.buff.values()])
+        ):
+            for cls, events in self.buff.items():
+                logger.debug(f'{cls.__name__}.objects.bulk_create({len(events)})')
+                for e in events:
+                    if not e.created:
+                        e.created = now
+                    e.modified = now
+                try:
+                    cls.objects.bulk_create(events)
+                except Exception as exc:
+                    # if an exception occurs, we should re-attempt to save the
+                    # events one-by-one, because something in the list is
+                    # broken/stale (e.g., an IntegrityError on a specific event)
+                    for e in events:
+                        try:
+                            if (
+                                isinstance(exc, IntegrityError) and
+                                getattr(e, 'host_id', '')
+                            ):
+                                # this is one potential IntegrityError we can
+                                # work around - if the host disappears before
+                                # the event can be processed
+                                e.host_id = None
+                            e.save()
+                        except Exception:
+                            logger.exception('Database Error Saving Job Event')
+                for e in events:
+                    emit_event_detail(e)
+            self.buff = {}

    def perform_work(self, body):
        try:
-            event_map = {
-                'job_id': JobEvent,
-                'ad_hoc_command_id': AdHocCommandEvent,
-                'project_update_id': ProjectUpdateEvent,
-                'inventory_update_id': InventoryUpdateEvent,
-                'system_job_id': SystemJobEvent,
-            }
+            flush = body.get('event') == 'FLUSH'
+            if not flush:
+                event_map = {
+                    'job_id': JobEvent,
+                    'ad_hoc_command_id': AdHocCommandEvent,
+                    'project_update_id': ProjectUpdateEvent,
+                    'inventory_update_id': InventoryUpdateEvent,
+                    'system_job_id': SystemJobEvent,
+                }

                if not any([key in body for key in event_map]):
                    raise Exception('Payload does not have a job identifier')

-            def _save_event_data():
-                job_identifier = 'unknown job'
-                for key, cls in event_map.items():
-                    if key in body:
-                        cls.create_from_data(**body)
-                        job_identifier = body[key]
-                        break
+                job_identifier = 'unknown job'
+                job_key = 'unknown'
+                for key in event_map.keys():
+                    if key in body:
+                        job_identifier = body[key]
+                        job_key = key
+                        break

                if settings.DEBUG:
                    from pygments import highlight
                    from pygments.lexers import PythonLexer
                    from pygments.formatters import Terminal256Formatter
                    from pprint import pformat
+                    if body.get('event') == 'EOF':
+                        event_thing = 'EOF event'
+                    else:
+                        event_thing = 'event {}'.format(body.get('counter', 'unknown'))
+                    logger.info('Callback worker received {} for {} {}'.format(
+                        event_thing, job_key[:-len('_id')], job_identifier
+                    ))
                    logger.debug('Body: {}'.format(
                        highlight(pformat(body, width=160), PythonLexer(), Terminal256Formatter(style='friendly'))
                    )[:1024 * 4])

-            if body.get('event') == 'EOF':
-                try:
-                    final_counter = body.get('final_counter', 0)
-                    logger.info('Event processing is finished for Job {}, sending notifications'.format(job_identifier))
-                    # EOF events are sent when stdout for the running task is
-                    # closed. don't actually persist them to the database; we
-                    # just use them to report `summary` websocket events as an
-                    # approximation for when a job is "done"
-                    emit_channel_notification(
-                        'jobs-summary',
-                        dict(group_name='jobs', unified_job_id=job_identifier, final_counter=final_counter)
-                    )
-                    # Additionally, when we've processed all events, we should
-                    # have all the data we need to send out success/failure
-                    # notification templates
-                    uj = UnifiedJob.objects.get(pk=job_identifier)
-                    if hasattr(uj, 'send_notification_templates'):
-                        retries = 0
-                        while retries < 5:
-                            if uj.finished:
-                                uj.send_notification_templates('succeeded' if uj.status == 'successful' else 'failed')
-                                break
-                            else:
-                                # wait a few seconds to avoid a race where the
-                                # events are persisted _before_ the UJ.status
-                                # changes from running -> successful
-                                retries += 1
-                                time.sleep(1)
-                                uj = UnifiedJob.objects.get(pk=job_identifier)
-                except Exception:
-                    logger.exception('Worker failed to emit notifications: Job {}'.format(job_identifier))
-                return

+                if body.get('event') == 'EOF':
+                    try:
+                        final_counter = body.get('final_counter', 0)
+                        logger.info('Event processing is finished for Job {}, sending notifications'.format(job_identifier))
+                        # EOF events are sent when stdout for the running task is
+                        # closed. don't actually persist them to the database; we
+                        # just use them to report `summary` websocket events as an
+                        # approximation for when a job is "done"
+                        emit_channel_notification(
+                            'jobs-summary',
+                            dict(group_name='jobs', unified_job_id=job_identifier, final_counter=final_counter)
+                        )
+                        # Additionally, when we've processed all events, we should
+                        # have all the data we need to send out success/failure
+                        # notification templates
+                        uj = UnifiedJob.objects.get(pk=job_identifier)
+                        if hasattr(uj, 'send_notification_templates'):
+                            retries = 0
+                            while retries < 5:
+                                if uj.finished:
+                                    uj.send_notification_templates('succeeded' if uj.status == 'successful' else 'failed')
+                                    break
+                                else:
+                                    # wait a few seconds to avoid a race where the
+                                    # events are persisted _before_ the UJ.status
+                                    # changes from running -> successful
+                                    retries += 1
+                                    time.sleep(1)
+                                    uj = UnifiedJob.objects.get(pk=job_identifier)
+                    except Exception:
+                        logger.exception('Worker failed to emit notifications: Job {}'.format(job_identifier))
+                    return
+
+                event = cls.create_from_data(**body)
+                self.buff.setdefault(cls, []).append(event)

            retries = 0
            while retries <= self.MAX_RETRIES:
                try:
-                    _save_event_data()
+                    self.flush(force=flush)
                    break
                except (OperationalError, InterfaceError, InternalError):
                    if retries >= self.MAX_RETRIES:
-                        logger.exception('Worker could not re-establish database connectivity, giving up on event for Job {}'.format(job_identifier))
+                        logger.exception('Worker could not re-establish database connectivity, giving up on one or more events.')
                        return
                    delay = 60 * retries
                    logger.exception('Database Error Saving Job Event, retry #{i} in {delay} seconds:'.format(
@@ -119,7 +149,7 @@ class CallbackBrokerWorker(BaseWorker):
                    time.sleep(delay)
                    retries += 1
                except DatabaseError:
-                    logger.exception('Database Error Saving Job Event for Job {}'.format(job_identifier))
+                    logger.exception('Database Error Saving Job Event')
                    break
                except Exception as exc:
                    tb = traceback.format_exc()
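The new `flush()` above batches events into per-model buckets and falls back to row-by-row saves when the bulk insert fails, so one bad record cannot sink the whole batch. A stdlib-only sketch of that strategy (all names are illustrative; `bulk_save` stands in for Django's `Model.objects.bulk_create`):

```python
class EventBuffer:
    """Buffer items per kind; write in bulk on flush, and if the bulk
    write raises, retry each item individually so only the genuinely
    broken records are dropped."""

    def __init__(self, bulk_save, single_save, threshold=1000):
        self.buff = {}
        self.bulk_save = bulk_save
        self.single_save = single_save
        self.threshold = threshold

    def add(self, kind, event):
        self.buff.setdefault(kind, []).append(event)

    def flush(self, force=False):
        # flush only when forced (timeout sentinel) or a bucket is large
        if not (force or any(len(v) >= self.threshold for v in self.buff.values())):
            return 0
        written = 0
        for kind, events in self.buff.items():
            try:
                self.bulk_save(events)
                written += len(events)
            except Exception:
                # something in the list is broken/stale; save one-by-one
                for e in events:
                    try:
                        self.single_save(e)
                        written += 1
                    except Exception:
                        pass  # log and drop, as the worker does
        self.buff = {}
        return written
```

The same trade-off as the commit: events sit in memory for up to `BUFFER_SECONDS`, in exchange for one `INSERT` per batch instead of one per event.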
37  awx/main/management/commands/callback_stats.py  Normal file
@@ -0,0 +1,37 @@
+import time
+import sys
+
+from django.db import connection
+from django.core.management.base import BaseCommand
+
+
+class Command(BaseCommand):
+
+    def handle(self, *args, **options):
+        with connection.cursor() as cursor:
+            clear = False
+            while True:
+                lines = []
+                for relation in (
+                    'main_jobevent', 'main_inventoryupdateevent',
+                    'main_projectupdateevent', 'main_adhoccommandevent'
+                ):
+                    lines.append(relation)
+                    for label, interval in (
+                        ('last minute: ', '1 minute'),
+                        ('last 5 minutes:', '5 minutes'),
+                        ('last hour: ', '1 hour'),
+                    ):
+                        cursor.execute(
+                            f"SELECT MAX(id) - MIN(id) FROM {relation} WHERE modified > now() - '{interval}'::interval;"
+                        )
+                        events = cursor.fetchone()[0] or 0
+                        lines.append(f'↳ {label} {events}')
+                    lines.append('')
+                if clear:
+                    for i in range(20):
+                        sys.stdout.write('\x1b[1A\x1b[2K')
+                for l in lines:
+                    print(l)
+                clear = True
+                time.sleep(.25)
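`callback_stats` approximates recent event throughput with `MAX(id) - MIN(id)` over rows modified inside a time window, which is cheap against an auto-increment primary key compared to a `COUNT(*)` scan. The same trick in miniature with stdlib `sqlite3` (the table and timestamps are made up for illustration):

```python
import sqlite3

# ids come from an auto-incrementing primary key, so over a recent window
# MAX(id) - MIN(id) approximates "rows inserted" without counting them;
# main_jobevent here is a toy stand-in for the real table
conn = sqlite3.connect(':memory:')
conn.execute('CREATE TABLE main_jobevent (id INTEGER PRIMARY KEY, modified REAL)')

now = 1_000_000.0
rows = [(i, now - 3600) for i in range(1, 51)]      # 50 events an hour ago
rows += [(i, now - 10) for i in range(51, 151)]     # 100 events just now
conn.executemany('INSERT INTO main_jobevent VALUES (?, ?)', rows)

cur = conn.execute(
    'SELECT MAX(id) - MIN(id) FROM main_jobevent WHERE modified > ?',
    (now - 60,),  # "last minute" window
)
recent = cur.fetchone()[0] or 0   # 150 - 51 = 99, close to the 100 inserts
```

The approximation undercounts by one per window and assumes ids are handed out in insertion order, which is acceptable for a live dashboard refreshed four times a second.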
@@ -1,6 +1,8 @@
# Copyright (c) 2015 Ansible, Inc.
# All Rights Reserved

+from uuid import uuid4
+
from awx.main.models import Instance
from django.conf import settings

@@ -22,6 +24,8 @@ class Command(BaseCommand):
    def add_arguments(self, parser):
        parser.add_argument('--hostname', dest='hostname', type=str,
                            help='Hostname used during provisioning')
+        parser.add_argument('--is-isolated', dest='is_isolated', action='store_true',
+                            help='Specify whether the instance is isolated')

    def _register_hostname(self, hostname):
        if not hostname:
@@ -37,7 +41,10 @@ class Command(BaseCommand):
    def handle(self, **options):
        if not options.get('hostname'):
            raise CommandError("Specify `--hostname` to use this command.")
-        self.uuid = settings.SYSTEM_UUID
+        if options['is_isolated']:
+            self.uuid = str(uuid4())
+        else:
+            self.uuid = settings.SYSTEM_UUID
        self.changed = False
        self._register_hostname(options.get('hostname'))
        if self.changed:
@@ -9,6 +9,7 @@ import random
from django.utils import timezone
from django.core.management.base import BaseCommand

+from awx.main.models.events import emit_event_detail
from awx.main.models import (
    UnifiedJob,
    Job,
@@ -17,14 +18,6 @@ from awx.main.models import (
    InventoryUpdate,
    SystemJob
)
-from awx.main.consumers import emit_channel_notification
-from awx.api.serializers import (
-    JobEventWebSocketSerializer,
-    AdHocCommandEventWebSocketSerializer,
-    ProjectUpdateEventWebSocketSerializer,
-    InventoryUpdateEventWebSocketSerializer,
-    SystemJobEventWebSocketSerializer
-)


class JobStatusLifeCycle():

@@ -96,21 +89,6 @@ class ReplayJobEvents(JobStatusLifeCycle):
            raise RuntimeError("No events for job id {}".format(job.id))
        return job_events, count

-    def get_serializer(self, job):
-        if type(job) is Job:
-            return JobEventWebSocketSerializer
-        elif type(job) is AdHocCommand:
-            return AdHocCommandEventWebSocketSerializer
-        elif type(job) is ProjectUpdate:
-            return ProjectUpdateEventWebSocketSerializer
-        elif type(job) is InventoryUpdate:
-            return InventoryUpdateEventWebSocketSerializer
-        elif type(job) is SystemJob:
-            return SystemJobEventWebSocketSerializer
-        else:
-            raise RuntimeError("Job is of type {} and replay is not yet supported.".format(type(job)))
-            sys.exit(1)

    def run(self, job_id, speed=1.0, verbosity=0, skip_range=[], random_seed=0, final_status_delay=0, debug=False):
        stats = {
            'events_ontime': {
@@ -136,7 +114,6 @@ class ReplayJobEvents(JobStatusLifeCycle):
        try:
            job = self.get_job(job_id)
            job_events, job_event_count = self.get_job_events(job)
-            serializer = self.get_serializer(job)
        except RuntimeError as e:
            print("{}".format(e.message))
            sys.exit(1)
@@ -162,8 +139,7 @@ class ReplayJobEvents(JobStatusLifeCycle):
        stats['replay_start'] = self.replay_start
        je_previous = je_current

-        je_serialized = serializer(je_current).data
-        emit_channel_notification('{}-{}'.format(je_serialized['group_name'], job.id), je_serialized)
+        emit_event_detail(je_current)

        replay_offset = self.replay_offset(je_previous.created, speed)
        recording_diff = (je_current.created - je_previous.created).total_seconds() * (1.0 / speed)
@@ -13,7 +13,8 @@ import urllib.parse
from django.conf import settings
from django.contrib.auth.models import User
from django.db.models.signals import post_save
-from django.db import IntegrityError
+from django.db.migrations.executor import MigrationExecutor
+from django.db import IntegrityError, connection
from django.utils.functional import curry
from django.shortcuts import get_object_or_404, redirect
from django.apps import apps
@@ -23,7 +24,6 @@ from django.urls import reverse, resolve

from awx.main.models import ActivityStream
from awx.main.utils.named_url_graph import generate_graph, GraphNode
-from awx.main.utils.db import migration_in_progress_check_or_relase
from awx.conf import fields, register


@@ -62,6 +62,17 @@ class TimingMiddleware(threading.local, MiddlewareMixin):
        with open(filepath, 'w') as f:
            f.write('%s %s\n' % (request.method, request.get_full_path()))
            pstats.Stats(self.prof, stream=f).sort_stats('cumulative').print_stats()
+
+        if settings.AWX_REQUEST_PROFILE_WITH_DOT:
+            from gprof2dot import main as generate_dot
+            raw = os.path.join(self.dest, filename) + '.raw'
+            pstats.Stats(self.prof).dump_stats(raw)
+            generate_dot([
+                '-n', '2.5', '-f', 'pstats', '-o',
+                os.path.join(self.dest, filename).replace('.pstats', '.dot'),
+                raw
+            ])
+            os.remove(raw)
        return filepath


@@ -213,7 +224,8 @@ class URLModificationMiddleware(MiddlewareMixin):
class MigrationRanCheckMiddleware(MiddlewareMixin):

    def process_request(self, request):
-        if migration_in_progress_check_or_relase():
-            if getattr(resolve(request.path), 'url_name', '') == 'migrations_notran':
-                return
+        executor = MigrationExecutor(connection)
+        plan = executor.migration_plan(executor.loader.graph.leaf_nodes())
+        if bool(plan) and \
+                getattr(resolve(request.path), 'url_name', '') != 'migrations_notran':
            return redirect(reverse("ui:migrations_notran"))
@@ -0,0 +1,24 @@
+# -*- coding: utf-8 -*-
+from uuid import uuid4
+
+from django.db import migrations
+
+from awx.main.models import Instance
+
+
+def _generate_new_uuid_for_iso_nodes(apps, schema_editor):
+    for instance in Instance.objects.all():
+        if instance.is_isolated():
+            instance.uuid = str(uuid4())
+            instance.save()
+
+
+class Migration(migrations.Migration):
+
+    dependencies = [
+        ('main', '0100_v370_projectupdate_job_tags'),
+    ]
+
+    operations = [
+        migrations.RunPython(_generate_new_uuid_for_iso_nodes)
+    ]
@@ -1,8 +1,9 @@
# -*- coding: utf-8 -*-

import datetime
import logging
+from collections import defaultdict

from django.conf import settings
from django.db import models, DatabaseError
from django.utils.dateparse import parse_datetime
from django.utils.text import Truncator
@@ -11,9 +12,10 @@ from django.utils.translation import ugettext_lazy as _
from django.utils.encoding import force_text

from awx.api.versioning import reverse
+from awx.main import consumers
from awx.main.fields import JSONField
from awx.main.models.base import CreatedModifiedModel
-from awx.main.utils import ignore_inventory_computed_fields
+from awx.main.utils import ignore_inventory_computed_fields, camelcase_to_underscore

analytics_logger = logging.getLogger('awx.analytics.job_events')

@@ -55,6 +57,51 @@ def create_host_status_counts(event_data):
    return dict(host_status_counts)


+def emit_event_detail(event):
+    cls = event.__class__
+    relation = {
+        JobEvent: 'job_id',
+        AdHocCommandEvent: 'ad_hoc_command_id',
+        ProjectUpdateEvent: 'project_update_id',
+        InventoryUpdateEvent: 'inventory_update_id',
+        SystemJobEvent: 'system_job_id',
+    }[cls]
+    url = ''
+    if isinstance(event, JobEvent):
+        url = '/api/v2/job_events/{}'.format(event.id)
+    if isinstance(event, AdHocCommandEvent):
+        url = '/api/v2/ad_hoc_command_events/{}'.format(event.id)
+    group = camelcase_to_underscore(cls.__name__) + 's'
+    timestamp = event.created.isoformat()
+    consumers.emit_channel_notification(
+        '-'.join([group, str(getattr(event, relation))]),
+        {
+            'id': event.id,
+            relation.replace('_id', ''): getattr(event, relation),
+            'created': timestamp,
+            'modified': timestamp,
+            'group_name': group,
+            'url': url,
+            'stdout': event.stdout,
+            'counter': event.counter,
+            'uuid': event.uuid,
+            'parent_uuid': getattr(event, 'parent_uuid', ''),
+            'start_line': event.start_line,
+            'end_line': event.end_line,
+            'event': event.event,
+            'event_data': getattr(event, 'event_data', {}),
+            'failed': event.failed,
+            'changed': event.changed,
+            'event_level': getattr(event, 'event_level', ''),
+            'play': getattr(event, 'play', ''),
+            'role': getattr(event, 'role', ''),
+            'task': getattr(event, 'task', ''),
+        }
+    )
+
+
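`emit_event_detail()` derives the websocket group name from the event class via `camelcase_to_underscore(cls.__name__) + 's'`, which is consistent with the hard-coded `'system_job_events'` returned by `get_group_name()` earlier in this diff. A sketch of the assumed name conversion (the regex illustrates what `awx.main.utils.camelcase_to_underscore` is expected to do, not its actual source):

```python
import re


def camelcase_to_underscore(name):
    # assumed behavior: insert '_' before each interior capital, then
    # lowercase, e.g. 'SystemJobEvent' -> 'system_job_event'
    return re.sub(r'(?<!^)(?=[A-Z])', '_', name).lower()


def group_name(cls_name):
    # the websocket group is the snake_case class name, pluralized
    return camelcase_to_underscore(cls_name) + 's'
```

Deriving the group from the class name keeps the five event models and their channel names in sync without another lookup table.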
class BasePlaybookEvent(CreatedModifiedModel):
|
||||
'''
|
||||
An event/message logged from a playbook callback for each host.
|
||||
@@ -63,7 +110,7 @@ class BasePlaybookEvent(CreatedModifiedModel):
|
||||
VALID_KEYS = [
|
||||
'event', 'event_data', 'playbook', 'play', 'role', 'task', 'created',
|
||||
'counter', 'uuid', 'stdout', 'parent_uuid', 'start_line', 'end_line',
|
||||
'verbosity'
|
||||
'host_id', 'host_name', 'verbosity',
|
||||
]
|
||||
|
||||
class Meta:
|
||||
@@ -271,37 +318,67 @@ class BasePlaybookEvent(CreatedModifiedModel):
|
||||
|
||||
def _update_from_event_data(self):
|
||||
# Update event model fields from event data.
|
||||
updated_fields = set()
|
||||
event_data = self.event_data
|
||||
res = event_data.get('res', None)
|
||||
if self.event in self.FAILED_EVENTS and not event_data.get('ignore_errors', False):
|
||||
self.failed = True
|
||||
updated_fields.add('failed')
|
||||
if isinstance(res, dict):
|
||||
if res.get('changed', False):
|
||||
self.changed = True
|
||||
updated_fields.add('changed')
|
||||
if self.event == 'playbook_on_stats':
|
||||
try:
|
||||
failures_dict = event_data.get('failures', {})
|
||||
dark_dict = event_data.get('dark', {})
|
||||
self.failed = bool(sum(failures_dict.values()) +
|
||||
sum(dark_dict.values()))
|
||||
updated_fields.add('failed')
|
||||
changed_dict = event_data.get('changed', {})
|
||||
self.changed = bool(sum(changed_dict.values()))
|
||||
updated_fields.add('changed')
|
||||
except (AttributeError, TypeError):
|
||||
pass
|
||||
|
||||
if isinstance(self, JobEvent):
|
||||
hostnames = self._hostnames()
|
||||
self._update_host_summary_from_stats(hostnames)
|
||||
if self.job.inventory:
|
||||
try:
|
||||
self.job.inventory.update_computed_fields()
|
||||
except DatabaseError:
|
||||
logger.exception('Computed fields database error saving event {}'.format(self.pk))
|
||||
|
||||
# find parent links and progagate changed=T and failed=T
|
||||
changed = self.job.job_events.filter(changed=True).exclude(parent_uuid=None).only('parent_uuid').values_list('parent_uuid', flat=True).distinct() # noqa
|
||||
failed = self.job.job_events.filter(failed=True).exclude(parent_uuid=None).only('parent_uuid').values_list('parent_uuid', flat=True).distinct() # noqa
|
||||
|
||||
JobEvent.objects.filter(
|
||||
job_id=self.job_id, uuid__in=changed
|
||||
).update(changed=True)
|
||||
JobEvent.objects.filter(
|
||||
job_id=self.job_id, uuid__in=failed
|
||||
).update(failed=True)
|
||||
|
||||
for field in ('playbook', 'play', 'task', 'role'):
|
||||
value = force_text(event_data.get(field, '')).strip()
|
||||
if value != getattr(self, field):
|
||||
setattr(self, field, value)
|
||||
updated_fields.add(field)
|
||||
return updated_fields
|
||||
if isinstance(self, JobEvent):
|
||||
analytics_logger.info(
|
||||
'Event data saved.',
|
||||
extra=dict(python_objects=dict(job_event=self))
|
||||
)
|
||||
|
||||
@classmethod
|
||||
def create_from_data(cls, **kwargs):
|
||||
#
|
||||
# ⚠️ D-D-D-DANGER ZONE ⚠️
|
||||
# This function is called by the callback receiver *once* for *every
|
||||
# event* emitted by Ansible as a playbook runs. That means that
|
||||
# changes to this function are _very_ susceptible to introducing
|
||||
# performance regressions (which the user will experience as "my
|
||||
# playbook stdout takes too long to show up"), *especially* code which
|
||||
# might invoke additional database queries per event.
|
||||
#
|
||||
# Proceed with caution!
|
||||
#
|
||||
pk = None
|
||||
for key in ('job_id', 'project_update_id'):
|
||||
if key in kwargs:
|
||||
@@ -325,74 +402,16 @@ class BasePlaybookEvent(CreatedModifiedModel):
|
||||
|
||||
sanitize_event_keys(kwargs, cls.VALID_KEYS)
|
||||
workflow_job_id = kwargs.pop('workflow_job_id', None)
|
||||
job_event = cls.objects.create(**kwargs)
|
||||
event = cls(**kwargs)
|
||||
if workflow_job_id:
|
||||
setattr(job_event, 'workflow_job_id', workflow_job_id)
|
||||
analytics_logger.info('Event data saved.', extra=dict(python_objects=dict(job_event=job_event)))
|
||||
return job_event
|
||||
setattr(event, 'workflow_job_id', workflow_job_id)
|
||||
event._update_from_event_data()
|
||||
return event
|
||||
|
||||
@property
|
||||
def job_verbosity(self):
|
||||
return 0
|
||||
|
||||
def save(self, *args, **kwargs):
|
||||
# If update_fields has been specified, add our field names to it,
|
||||
# if it hasn't been specified, then we're just doing a normal save.
|
||||
update_fields = kwargs.get('update_fields', [])
|
||||
# Update model fields and related objects unless we're only updating
|
||||
# failed/changed flags triggered from a child event.
|
||||
from_parent_update = kwargs.pop('from_parent_update', False)
|
||||
if not from_parent_update:
|
||||
# Update model fields from event data.
|
||||
updated_fields = self._update_from_event_data()
|
||||
for field in updated_fields:
|
||||
if field not in update_fields:
|
||||
update_fields.append(field)
|
||||
|
||||
# Update host related field from host_name.
|
||||
if hasattr(self, 'job') and not self.host_id and self.host_name:
|
||||
if self.job.inventory.kind == 'smart':
|
||||
# optimization to avoid calling inventory.hosts, which
|
||||
# can take a long time to run under some circumstances
|
||||
from awx.main.models.inventory import SmartInventoryMembership
|
||||
membership = SmartInventoryMembership.objects.filter(
|
||||
inventory=self.job.inventory, host__name=self.host_name
|
||||
).first()
|
||||
if membership:
|
||||
host_id = membership.host_id
|
||||
else:
|
||||
host_id = None
|
||||
else:
|
||||
host_qs = self.job.inventory.hosts.filter(name=self.host_name)
|
||||
host_id = host_qs.only('id').values_list('id', flat=True).first()
|
||||
if host_id != self.host_id:
|
||||
self.host_id = host_id
|
||||
if 'host_id' not in update_fields:
|
||||
update_fields.append('host_id')
|
||||
super(BasePlaybookEvent, self).save(*args, **kwargs)
|
||||
|
||||
# Update related objects after this event is saved.
|
||||
if hasattr(self, 'job') and not from_parent_update:
|
||||
if getattr(settings, 'CAPTURE_JOB_EVENT_HOSTS', False):
|
||||
self._update_hosts()
|
||||
if self.parent_uuid:
|
||||
kwargs = {}
|
||||
if self.changed is True:
|
||||
kwargs['changed'] = True
|
||||
if self.failed is True:
|
||||
kwargs['failed'] = True
|
||||
if kwargs:
|
||||
JobEvent.objects.filter(job_id=self.job_id, uuid=self.parent_uuid).update(**kwargs)
|
||||
|
||||
if self.event == 'playbook_on_stats':
|
||||
hostnames = self._hostnames()
|
||||
self._update_host_summary_from_stats(hostnames)
|
||||
try:
|
||||
self.job.inventory.update_computed_fields()
|
||||
except DatabaseError:
|
||||
logger.exception('Computed fields database error saving event {}'.format(self.pk))
|
||||
|
||||
|
||||
|
||||
class JobEvent(BasePlaybookEvent):
    '''
@@ -456,38 +475,6 @@ class JobEvent(BasePlaybookEvent):
    def __str__(self):
        return u'%s @ %s' % (self.get_event_display2(), self.created.isoformat())

    def _update_from_event_data(self):
        # Update job event hostname
        updated_fields = super(JobEvent, self)._update_from_event_data()
        value = force_text(self.event_data.get('host', '')).strip()
        if value != getattr(self, 'host_name'):
            setattr(self, 'host_name', value)
            updated_fields.add('host_name')
        return updated_fields

    def _update_hosts(self, extra_host_pks=None):
        # Update job event hosts m2m from host_name, propagate to parent events.
        extra_host_pks = set(extra_host_pks or [])
        hostnames = set()
        if self.host_name:
            hostnames.add(self.host_name)
        if self.event == 'playbook_on_stats':
            try:
                for v in self.event_data.values():
                    hostnames.update(v.keys())
            except AttributeError:  # In case event_data or v isn't a dict.
                pass
        qs = self.job.inventory.hosts.all()
        qs = qs.filter(models.Q(name__in=hostnames) | models.Q(pk__in=extra_host_pks))
        qs = qs.exclude(job_events__pk=self.id).only('id')
        for host in qs:
            self.hosts.add(host)
        if self.parent_uuid:
            parent = JobEvent.objects.filter(uuid=self.parent_uuid)
            if parent.exists():
                parent = parent[0]
                parent._update_hosts(qs.values_list('id', flat=True))

    def _hostnames(self):
        hostnames = set()
        try:
@@ -605,6 +592,17 @@ class BaseCommandEvent(CreatedModifiedModel):

    @classmethod
    def create_from_data(cls, **kwargs):
        #
        # ⚠️ D-D-D-DANGER ZONE ⚠️
        # This function is called by the callback receiver *once* for *every
        # event* emitted by Ansible as a playbook runs. That means that
        # changes to this function are _very_ susceptible to introducing
        # performance regressions (which the user will experience as "my
        # playbook stdout takes too long to show up"), *especially* code which
        # might invoke additional database queries per event.
        #
        # Proceed with caution!
        #
        # Convert the datetime for the event's creation
        # appropriately, and include a time zone for it.
        #
@@ -619,13 +617,8 @@ class BaseCommandEvent(CreatedModifiedModel):
        kwargs.pop('created', None)

        sanitize_event_keys(kwargs, cls.VALID_KEYS)
        kwargs.pop('workflow_job_id', None)
        event = cls.objects.create(**kwargs)
        if isinstance(event, AdHocCommandEvent):
            analytics_logger.info(
                'Event data saved.',
                extra=dict(python_objects=dict(job_event=event))
            )
        event = cls(**kwargs)
        event._update_from_event_data()
        return event

    def get_event_display(self):
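The hunk above swaps `cls.objects.create(**kwargs)` for `cls(**kwargs)` plus `_update_from_event_data()`, so `create_from_data` now returns an *unsaved* instance and the caller decides when to hit the database. A minimal, self-contained sketch of that construct-then-save pattern (the `Event` class and its fields are toy stand-ins for illustration, not AWX's actual models):

```python
class Event:
    # Keys the model accepts; anything else is stripped, as
    # sanitize_event_keys does in the real code.
    VALID_KEYS = ['job_id', 'event', 'created', 'stdout']

    def __init__(self, **kwargs):
        for k, v in kwargs.items():
            setattr(self, k, v)
        self.saved = False

    @classmethod
    def create_from_data(cls, **kwargs):
        # Drop unknown keys, then return an *unsaved* instance; deferring
        # the save lets the callback receiver batch many events at once.
        kwargs = {k: v for k, v in kwargs.items() if k in cls.VALID_KEYS}
        return cls(**kwargs)

    def save(self):
        self.saved = True
        return self


event = Event.create_from_data(job_id=123, event='runner_on_ok', extra_key='dropped')
assert not hasattr(event, 'extra_key')  # invalid key was stripped
assert event.save().saved is True       # persistence happens only on save()
```

This mirrors why the tests later in this diff change `create_from_data(...)` calls to `create_from_data(...).save()`.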
@@ -640,10 +633,15 @@ class BaseCommandEvent(CreatedModifiedModel):
    def get_host_status_counts(self):
        return create_host_status_counts(getattr(self, 'event_data', {}))

    def _update_from_event_data(self):
        pass


class AdHocCommandEvent(BaseCommandEvent):

    VALID_KEYS = BaseCommandEvent.VALID_KEYS + ['ad_hoc_command_id', 'event', 'workflow_job_id']
    VALID_KEYS = BaseCommandEvent.VALID_KEYS + [
        'ad_hoc_command_id', 'event', 'host_name', 'host_id', 'workflow_job_id'
    ]

    class Meta:
        app_label = 'main'
@@ -719,34 +717,18 @@ class AdHocCommandEvent(BaseCommandEvent):
    def get_absolute_url(self, request=None):
        return reverse('api:ad_hoc_command_event_detail', kwargs={'pk': self.pk}, request=request)

    def save(self, *args, **kwargs):
        # If update_fields has been specified, add our field names to it,
        # if it hasn't been specified, then we're just doing a normal save.
        update_fields = kwargs.get('update_fields', [])
    def _update_from_event_data(self):
        res = self.event_data.get('res', None)
        if self.event in self.FAILED_EVENTS:
            if not self.event_data.get('ignore_errors', False):
                self.failed = True
                if 'failed' not in update_fields:
                    update_fields.append('failed')
        if isinstance(res, dict) and res.get('changed', False):
            self.changed = True
            if 'changed' not in update_fields:
                update_fields.append('changed')
        self.host_name = self.event_data.get('host', '').strip()
        if 'host_name' not in update_fields:
            update_fields.append('host_name')
        if not self.host_id and self.host_name:
            host_qs = self.ad_hoc_command.inventory.hosts.filter(name=self.host_name)
            try:
                host_id = host_qs.only('id').values_list('id', flat=True)
                if host_id.exists():
                    self.host_id = host_id[0]
                    if 'host_id' not in update_fields:
                        update_fields.append('host_id')
            except (IndexError, AttributeError):
                pass
        super(AdHocCommandEvent, self).save(*args, **kwargs)

        analytics_logger.info(
            'Event data saved.',
            extra=dict(python_objects=dict(job_event=self))
        )


class InventoryUpdateEvent(BaseCommandEvent):
@@ -124,11 +124,6 @@ class OAuth2AccessToken(AbstractAccessToken):
    def is_valid(self, scopes=None):
        valid = super(OAuth2AccessToken, self).is_valid(scopes)
        if valid:
            try:
                self.validate_external_users()
            except oauth2.AccessDeniedError:
                logger.exception(f'Failed to authenticate {self.user.username}')
                return False
        self.last_used = now()

        def _update_last_used():
@@ -146,5 +141,6 @@ class OAuth2AccessToken(AbstractAccessToken):
            ).format(external_account))

    def save(self, *args, **kwargs):
        self.validate_external_users()
        if not self.pk:
            self.validate_external_users()
        super(OAuth2AccessToken, self).save(*args, **kwargs)
@@ -15,7 +15,6 @@ class DependencyGraph(object):
    INVENTORY_UPDATES = 'inventory_updates'

    JOB_TEMPLATE_JOBS = 'job_template_jobs'
    JOB_INVENTORY_IDS = 'job_inventory_ids'

    SYSTEM_JOB = 'system_job'
    INVENTORY_SOURCE_UPDATES = 'inventory_source_updates'
@@ -40,8 +39,6 @@ class DependencyGraph(object):
        Track runnable job related project and inventory to ensure updates
        don't run while a job needing those resources is running.
        '''
        # inventory_id -> True / False
        self.data[self.JOB_INVENTORY_IDS] = {}

        # inventory_source_id -> True / False
        self.data[self.INVENTORY_SOURCE_UPDATES] = {}
@@ -77,7 +74,6 @@ class DependencyGraph(object):
        self.data[self.INVENTORY_SOURCE_UPDATES][inventory_source_id] = False

    def mark_job_template_job(self, job):
        self.data[self.JOB_INVENTORY_IDS][job.inventory_id] = False
        self.data[self.JOB_TEMPLATE_JOBS][job.job_template_id] = False

    def mark_workflow_job(self, job):
@@ -87,8 +83,7 @@ class DependencyGraph(object):
        return self.data[self.PROJECT_UPDATES].get(job.project_id, True)

    def can_inventory_update_run(self, job):
        return self.data[self.JOB_INVENTORY_IDS].get(job.inventory_source.inventory_id, True) and \
            self.data[self.INVENTORY_SOURCE_UPDATES].get(job.inventory_source_id, True)
        return self.data[self.INVENTORY_SOURCE_UPDATES].get(job.inventory_source_id, True)

    def can_job_run(self, job):
        if self.data[self.PROJECT_UPDATES].get(job.project_id, True) is True and \
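The `DependencyGraph` hunks above drop the per-inventory gate (`JOB_INVENTORY_IDS`), so an inventory update is now blocked only by a concurrent update of the same inventory source. A toy sketch of the gating-dict pattern this class uses (names mirror the diff, but this is an illustrative stand-in, not AWX's class): a value of `False` means "a running job holds this resource", and `.get(..., True)` means "runnable unless something marked it".

```python
class MiniDependencyGraph:
    def __init__(self):
        # inventory_source_id -> False while an update of that source runs
        self.inventory_source_updates = {}

    def mark_inventory_source_update(self, inventory_source_id):
        self.inventory_source_updates[inventory_source_id] = False

    def can_inventory_update_run(self, inventory_source_id):
        # After this diff, only a concurrent update of the *same* source
        # blocks an inventory update; jobs using the inventory no longer do.
        return self.inventory_source_updates.get(inventory_source_id, True)


g = MiniDependencyGraph()
assert g.can_inventory_update_run(42)      # nothing marked: runnable
g.mark_inventory_source_update(42)
assert not g.can_inventory_update_run(42)  # same source already updating
assert g.can_inventory_update_run(7)       # other sources unaffected
```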
@@ -5,15 +5,11 @@ import logging
# AWX
from awx.main.scheduler import TaskManager
from awx.main.dispatch.publish import task
from awx.main.utils.db import migration_in_progress_check_or_relase

logger = logging.getLogger('awx.main.scheduler')


@task()
def run_task_manager():
    if migration_in_progress_check_or_relase():
        logger.debug("Not running task manager because migration is in progress.")
        return
    logger.debug("Running Tower task manager.")
    TaskManager().schedule()
@@ -30,12 +30,11 @@ from crum.signals import current_user_getter

# AWX
from awx.main.models import (
    ActivityStream, AdHocCommandEvent, Group, Host, InstanceGroup, Inventory,
    InventorySource, InventoryUpdateEvent, Job, JobEvent, JobHostSummary,
    JobTemplate, OAuth2AccessToken, Organization, Project, ProjectUpdateEvent,
    Role, SystemJob, SystemJobEvent, SystemJobTemplate, UnifiedJob,
    UnifiedJobTemplate, User, UserSessionMembership, WorkflowJobTemplateNode,
    WorkflowApproval, WorkflowApprovalTemplate, ROLE_SINGLETON_SYSTEM_ADMINISTRATOR
    ActivityStream, Group, Host, InstanceGroup, Inventory, InventorySource,
    Job, JobHostSummary, JobTemplate, OAuth2AccessToken, Organization, Project,
    Role, SystemJob, SystemJobTemplate, UnifiedJob, UnifiedJobTemplate, User,
    UserSessionMembership, WorkflowJobTemplateNode, WorkflowApproval,
    WorkflowApprovalTemplate, ROLE_SINGLETON_SYSTEM_ADMINISTRATOR
)
from awx.main.constants import CENSOR_VALUE
from awx.main.utils import model_instance_diff, model_to_dict, camelcase_to_underscore, get_current_apps
@@ -72,42 +71,6 @@ def get_current_user_or_none():
    return u


def emit_event_detail(serializer, relation, **kwargs):
    instance = kwargs['instance']
    created = kwargs['created']
    if created:
        event_serializer = serializer(instance)
        consumers.emit_channel_notification(
            '-'.join([event_serializer.get_group_name(instance), str(getattr(instance, relation))]),
            event_serializer.data
        )


def emit_job_event_detail(sender, **kwargs):
    from awx.api import serializers
    emit_event_detail(serializers.JobEventWebSocketSerializer, 'job_id', **kwargs)


def emit_ad_hoc_command_event_detail(sender, **kwargs):
    from awx.api import serializers
    emit_event_detail(serializers.AdHocCommandEventWebSocketSerializer, 'ad_hoc_command_id', **kwargs)


def emit_project_update_event_detail(sender, **kwargs):
    from awx.api import serializers
    emit_event_detail(serializers.ProjectUpdateEventWebSocketSerializer, 'project_update_id', **kwargs)


def emit_inventory_update_event_detail(sender, **kwargs):
    from awx.api import serializers
    emit_event_detail(serializers.InventoryUpdateEventWebSocketSerializer, 'inventory_update_id', **kwargs)


def emit_system_job_event_detail(sender, **kwargs):
    from awx.api import serializers
    emit_event_detail(serializers.SystemJobEventWebSocketSerializer, 'system_job_id', **kwargs)


def emit_update_inventory_computed_fields(sender, **kwargs):
    logger.debug("In update inventory computed fields")
    if getattr(_inventory_updates, 'is_updating', False):
@@ -258,11 +221,6 @@ connect_computed_field_signals()

post_save.connect(save_related_job_templates, sender=Project)
post_save.connect(save_related_job_templates, sender=Inventory)
post_save.connect(emit_job_event_detail, sender=JobEvent)
post_save.connect(emit_ad_hoc_command_event_detail, sender=AdHocCommandEvent)
post_save.connect(emit_project_update_event_detail, sender=ProjectUpdateEvent)
post_save.connect(emit_inventory_update_event_detail, sender=InventoryUpdateEvent)
post_save.connect(emit_system_job_event_detail, sender=SystemJobEvent)
m2m_changed.connect(rebuild_role_ancestor_list, Role.parents.through)
m2m_changed.connect(rbac_activity_stream, Role.members.through)
m2m_changed.connect(rbac_activity_stream, Role.parents.through)

@@ -263,12 +263,6 @@ def apply_cluster_membership_policies():
    logger.debug('Cluster policy computation finished in {} seconds'.format(time.time() - started_compute))


@task(queue='tower_broadcast_all', exchange_type='fanout')
def set_migration_flag():
    logger.debug('Received migration-in-progress signal, will serve redirect.')
    cache.set('migration_in_progress', True)


@task(queue='tower_broadcast_all', exchange_type='fanout')
def handle_setting_changes(setting_keys):
    orig_len = len(setting_keys)
@@ -709,6 +703,7 @@ class BaseTask(object):
    def __init__(self):
        self.cleanup_paths = []
        self.parent_workflow_job_id = None
        self.host_map = {}

    def update_model(self, pk, _attempt=0, **updates):
        """Reload the model instance from the database and update the
@@ -1007,11 +1002,17 @@ class BaseTask(object):
        return False

    def build_inventory(self, instance, private_data_dir):
        script_params = dict(hostvars=True)
        script_params = dict(hostvars=True, towervars=True)
        if hasattr(instance, 'job_slice_number'):
            script_params['slice_number'] = instance.job_slice_number
            script_params['slice_count'] = instance.job_slice_count
        script_data = instance.inventory.get_script_data(**script_params)
        # maintain a list of host_name --> host_id
        # so we can associate emitted events to Host objects
        self.host_map = {
            hostname: hv.pop('remote_tower_id', '')
            for hostname, hv in script_data.get('_meta', {}).get('hostvars', {}).items()
        }
        json_data = json.dumps(script_data)
        handle, path = tempfile.mkstemp(dir=private_data_dir)
        f = os.fdopen(handle, 'w')
@@ -1120,6 +1121,15 @@ class BaseTask(object):
        event_data.pop('parent_uuid', None)
        if self.parent_workflow_job_id:
            event_data['workflow_job_id'] = self.parent_workflow_job_id
        if self.host_map:
            host = event_data.get('event_data', {}).get('host', '').strip()
            if host:
                event_data['host_name'] = host
                if host in self.host_map:
                    event_data['host_id'] = self.host_map[host]
            else:
                event_data['host_name'] = ''
                event_data['host_id'] = ''
        should_write_event = False
        event_data.setdefault(self.event_data_key, self.instance.id)
        self.dispatcher.dispatch(event_data)
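The `build_inventory` and event-dispatch hunks above work together: each host in the inventory script's `_meta.hostvars` carries a `remote_tower_id`, which is popped into `self.host_map` and later stamped onto dispatched events as `host_id`. A hedged, self-contained sketch of that flow (the `script_data` shape follows the Ansible inventory-script format; hostnames and ids here are made up for illustration):

```python
# Inventory-script output, as get_script_data would return it.
script_data = {
    '_meta': {
        'hostvars': {
            'web1': {'remote_tower_id': 10, 'ansible_host': '10.0.0.1'},
            'db1': {'remote_tower_id': 11, 'ansible_host': '10.0.0.2'},
        }
    }
}

# Build host_name -> host_id, popping remote_tower_id so it never
# leaks into the inventory file written to disk.
host_map = {
    hostname: hv.pop('remote_tower_id', '')
    for hostname, hv in script_data.get('_meta', {}).get('hostvars', {}).items()
}


def annotate(event_data, host_map):
    # Mirror of the dispatch-side logic: stamp host_name/host_id per event.
    host = event_data.get('event_data', {}).get('host', '').strip()
    if host:
        event_data['host_name'] = host
        if host in host_map:
            event_data['host_id'] = host_map[host]
    else:
        event_data['host_name'] = ''
        event_data['host_id'] = ''
    return event_data


event = annotate({'event_data': {'host': 'web1'}}, host_map)
assert event['host_name'] == 'web1' and event['host_id'] == 10
assert 'remote_tower_id' not in script_data['_meta']['hostvars']['web1']
```

This is what lets the callback receiver associate events with `Host` rows without a per-event database query.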
@@ -15,7 +15,7 @@ def test_job_events_sublist_truncation(get, organization_factory, job_template_f
                                  inventory='test_inv', project='test_proj').job_template
    job = jt.create_unified_job()
    JobEvent.create_from_data(job_id=job.pk, uuid='abc123', event='runner_on_start',
                              stdout='a' * 1025)
                              stdout='a' * 1025).save()

    url = reverse('api:job_job_events_list', kwargs={'pk': job.pk})
    if not truncate:
@@ -35,7 +35,7 @@ def test_ad_hoc_events_sublist_truncation(get, organization_factory, job_templat
    adhoc = AdHocCommand()
    adhoc.save()
    AdHocCommandEvent.create_from_data(ad_hoc_command_id=adhoc.pk, uuid='abc123', event='runner_on_start',
                                       stdout='a' * 1025)
                                       stdout='a' * 1025).save()

    url = reverse('api:ad_hoc_command_ad_hoc_command_events_list', kwargs={'pk': adhoc.pk})
    if not truncate:
@@ -69,7 +69,7 @@ def test_token_creation_disabled_for_external_accounts(oauth_application, post,


@pytest.mark.django_db
def test_existing_token_disabled_for_external_accounts(oauth_application, get, post, admin):
def test_existing_token_enabled_for_external_accounts(oauth_application, get, post, admin):
    UserEnterpriseAuth(user=admin, provider='radius').save()
    url = drf_reverse('api:oauth_authorization_root_view') + 'token/'
    with override_settings(RADIUS_SERVER='example.org', ALLOW_OAUTH2_FOR_EXTERNAL_USERS=True):
@@ -98,9 +98,9 @@ def test_existing_token_disabled_for_external_accounts(oauth_application, get, p
    resp = get(
        drf_reverse('api:user_me_list', kwargs={'version': 'v2'}),
        HTTP_AUTHORIZATION='Bearer ' + token,
        status=401
        status=200
    )
    assert b'To establish a login session' in resp.content
    assert json.loads(resp.content)['results'][0]['username'] == 'admin'


@pytest.mark.django_db
@@ -5,6 +5,7 @@
# Python
import pytest
import os
import time

from django.conf import settings
from kombu.utils.url import parse_url
@@ -276,6 +277,7 @@ def test_logging_aggregrator_connection_test_valid(mocker, get, post, admin):
def test_logging_aggregrator_connection_test_with_masked_password(mocker, patch, post, admin):
    url = reverse('api:setting_singleton_detail', kwargs={'category_slug': 'logging'})
    patch(url, user=admin, data={'LOG_AGGREGATOR_PASSWORD': 'password123'}, expect=200)
    time.sleep(1)  # log settings are cached slightly

    with mock.patch.object(AWXProxyHandler, 'perform_test') as perform_test:
        url = reverse('api:setting_logging_test')
@@ -219,6 +219,30 @@ def test_survey_spec_passwords_with_default_required(job_template_factory, post,
    assert launch_value not in json.loads(job.extra_vars).values()


@pytest.mark.django_db
def test_survey_spec_default_not_allowed(job_template, post, admin_user):
    survey_input_data = {
        'description': 'A survey',
        'spec': [{
            'question_name': 'You must choose wisely',
            'variable': 'your_choice',
            'default': 'blue',
            'required': False,
            'type': 'multiplechoice',
            "choices": ["red", "green", "purple"]
        }],
        'name': 'my survey'
    }
    r = post(
        url=reverse(
            'api:job_template_survey_spec',
            kwargs={'pk': job_template.id}
        ),
        data=survey_input_data, user=admin_user, expect=400
    )
    assert r.data['error'] == 'Default choice must be answered from the choices listed.'


@pytest.mark.django_db
@pytest.mark.parametrize('default, status', [
    ('SUPERSECRET', 200),
@@ -37,26 +37,26 @@ class TestKeyRegeneration:

    def test_encrypted_setting_values(self):
        # test basic decryption
        settings.LOG_AGGREGATOR_PASSWORD = 'sensitive'
        s = Setting.objects.filter(key='LOG_AGGREGATOR_PASSWORD').first()
        settings.REDHAT_PASSWORD = 'sensitive'
        s = Setting.objects.filter(key='REDHAT_PASSWORD').first()
        assert s.value.startswith(PREFIX)
        assert settings.LOG_AGGREGATOR_PASSWORD == 'sensitive'
        assert settings.REDHAT_PASSWORD == 'sensitive'

        # re-key the setting value
        new_key = regenerate_secret_key.Command().handle()
        new_setting = Setting.objects.filter(key='LOG_AGGREGATOR_PASSWORD').first()
        new_setting = Setting.objects.filter(key='REDHAT_PASSWORD').first()
        assert s.value != new_setting.value

        # wipe out the local cache so the value is pulled from the DB again
        settings.cache.delete('LOG_AGGREGATOR_PASSWORD')
        settings.cache.delete('REDHAT_PASSWORD')

        # verify that the old SECRET_KEY doesn't work
        with pytest.raises(InvalidToken):
            settings.LOG_AGGREGATOR_PASSWORD
            settings.REDHAT_PASSWORD

        # verify that the new SECRET_KEY *does* work
        with override_settings(SECRET_KEY=new_key):
            assert settings.LOG_AGGREGATOR_PASSWORD == 'sensitive'
            assert settings.REDHAT_PASSWORD == 'sensitive'

    def test_encrypted_notification_secrets(self, notification_template_with_encrypt):
        # test basic decryption
@@ -296,3 +296,15 @@ def test_cluster_node_long_node_name(inventory, project):
    # node name is very long, we just want to make sure it does not error
    entry = ActivityStream.objects.filter(job=job).first()
    assert entry.action_node.startswith('ffffff')


@pytest.mark.django_db
def test_credential_defaults_idempotency():
    CredentialType.setup_tower_managed_defaults()
    old_inputs = CredentialType.objects.get(name='Ansible Tower', kind='cloud').inputs
    prior_count = ActivityStream.objects.count()
    # this is commonly re-run in migrations, and no changes should be shown
    # because inputs and injectors are not actually tracked in the database
    CredentialType.setup_tower_managed_defaults()
    assert CredentialType.objects.get(name='Ansible Tower', kind='cloud').inputs == old_inputs
    assert ActivityStream.objects.count() == prior_count
@@ -1,18 +1,15 @@
from unittest import mock
import pytest

from awx.main.models import (Job, JobEvent, ProjectUpdate, ProjectUpdateEvent,
                             AdHocCommand, AdHocCommandEvent, InventoryUpdate,
                             InventorySource, InventoryUpdateEvent, SystemJob,
                             SystemJobEvent)
from awx.main.models import Job, JobEvent


@pytest.mark.django_db
@mock.patch('awx.main.consumers.emit_channel_notification')
@mock.patch('awx.main.models.events.emit_event_detail')
def test_parent_changed(emit):
    j = Job()
    j.save()
    JobEvent.create_from_data(job_id=j.pk, uuid='abc123', event='playbook_on_task_start')
    JobEvent.create_from_data(job_id=j.pk, uuid='abc123', event='playbook_on_task_start').save()
    assert JobEvent.objects.count() == 1
    for e in JobEvent.objects.all():
        assert e.changed is False
@@ -24,19 +21,26 @@ def test_parent_changed(emit):
        event_data={
            'res': {'changed': ['localhost']}
        }
    )
    assert JobEvent.objects.count() == 2
    for e in JobEvent.objects.all():
    ).save()
    # the `playbook_on_stats` event is where we update the parent changed linkage
    JobEvent.create_from_data(
        job_id=j.pk,
        parent_uuid='abc123',
        event='playbook_on_stats'
    ).save()
    events = JobEvent.objects.filter(event__in=['playbook_on_task_start', 'runner_on_ok'])
    assert events.count() == 2
    for e in events.all():
        assert e.changed is True


@pytest.mark.django_db
@pytest.mark.parametrize('event', JobEvent.FAILED_EVENTS)
@mock.patch('awx.main.consumers.emit_channel_notification')
@mock.patch('awx.main.models.events.emit_event_detail')
def test_parent_failed(emit, event):
    j = Job()
    j.save()
    JobEvent.create_from_data(job_id=j.pk, uuid='abc123', event='playbook_on_task_start')
    JobEvent.create_from_data(job_id=j.pk, uuid='abc123', event='playbook_on_task_start').save()
    assert JobEvent.objects.count() == 1
    for e in JobEvent.objects.all():
        assert e.failed is False
@@ -45,69 +49,15 @@ def test_parent_failed(emit, event):
        job_id=j.pk,
        parent_uuid='abc123',
        event=event
    )
    assert JobEvent.objects.count() == 2
    for e in JobEvent.objects.all():
    ).save()

    # the `playbook_on_stats` event is where we update the parent failed linkage
    JobEvent.create_from_data(
        job_id=j.pk,
        parent_uuid='abc123',
        event='playbook_on_stats'
    ).save()
    events = JobEvent.objects.filter(event__in=['playbook_on_task_start', event])
    assert events.count() == 2
    for e in events.all():
        assert e.failed is True


@pytest.mark.django_db
@mock.patch('awx.main.consumers.emit_channel_notification')
def test_job_event_websocket_notifications(emit):
    j = Job(id=123)
    j.save()
    JobEvent.create_from_data(job_id=j.pk)
    assert len(emit.call_args_list) == 1
    topic, payload = emit.call_args_list[0][0]
    assert topic == 'job_events-123'
    assert payload['job'] == 123


@pytest.mark.django_db
@mock.patch('awx.main.consumers.emit_channel_notification')
def test_ad_hoc_event_websocket_notifications(emit):
    ahc = AdHocCommand(id=123)
    ahc.save()
    AdHocCommandEvent.create_from_data(ad_hoc_command_id=ahc.pk)
    assert len(emit.call_args_list) == 1
    topic, payload = emit.call_args_list[0][0]
    assert topic == 'ad_hoc_command_events-123'
    assert payload['ad_hoc_command'] == 123


@pytest.mark.django_db
@mock.patch('awx.main.consumers.emit_channel_notification')
def test_project_update_event_websocket_notifications(emit, project):
    pu = ProjectUpdate(id=123, project=project)
    pu.save()
    ProjectUpdateEvent.create_from_data(project_update_id=pu.pk)
    assert len(emit.call_args_list) == 1
    topic, payload = emit.call_args_list[0][0]
    assert topic == 'project_update_events-123'
    assert payload['project_update'] == 123


@pytest.mark.django_db
@mock.patch('awx.main.consumers.emit_channel_notification')
def test_inventory_update_event_websocket_notifications(emit, inventory):
    source = InventorySource()
    source.save()
    iu = InventoryUpdate(id=123, inventory_source=source)
    iu.save()
    InventoryUpdateEvent.create_from_data(inventory_update_id=iu.pk)
    assert len(emit.call_args_list) == 1
    topic, payload = emit.call_args_list[0][0]
    assert topic == 'inventory_update_events-123'
    assert payload['inventory_update'] == 123


@pytest.mark.django_db
@mock.patch('awx.main.consumers.emit_channel_notification')
def test_system_job_event_websocket_notifications(emit, inventory):
    j = SystemJob(id=123)
    j.save()
    SystemJobEvent.create_from_data(system_job_id=j.pk)
    assert len(emit.call_args_list) == 1
    topic, payload = emit.call_args_list[0][0]
    assert topic == 'system_job_events-123'
    assert payload['system_job'] == 123

@@ -353,3 +353,33 @@ def test_job_not_blocking_project_update(default_instance_group, job_template_fa
    dependency_graph = DependencyGraph(None)
    dependency_graph.add_job(job)
    assert not dependency_graph.is_job_blocked(project_update)


@pytest.mark.django_db
def test_job_not_blocking_inventory_update(default_instance_group, job_template_factory, inventory_source_factory):
    objects = job_template_factory('jt', organization='org1', project='proj',
                                   inventory='inv', credential='cred',
                                   jobs=["job"])
    job = objects.jobs["job"]
    job.instance_group = default_instance_group
    job.status = "running"
    job.save()

    with mock.patch("awx.main.scheduler.TaskManager.start_task"):
        task_manager = TaskManager()
        task_manager._schedule()

        inv = objects.inventory
        inv_source = inventory_source_factory("ec2")
        inv_source.source = "ec2"
        inv.inventory_sources.add(inv_source)
        inventory_update = inv_source.create_inventory_update()
        inventory_update.instance_group = default_instance_group
        inventory_update.status = "pending"
        inventory_update.save()

        assert not task_manager.is_job_blocked(inventory_update)

        dependency_graph = DependencyGraph(None)
        dependency_graph.add_job(job)
        assert not dependency_graph.is_job_blocked(inventory_update)
@@ -60,7 +60,7 @@ class TestReplayJobEvents():
        r.emit_job_status = lambda job, status: True
        return r

    @mock.patch('awx.main.management.commands.replay_job_events.emit_channel_notification', lambda *a, **kw: None)
    @mock.patch('awx.main.management.commands.replay_job_events.emit_event_detail', lambda *a, **kw: None)
    def test_sleep(self, mocker, replayer):
        replayer.run(3, 1)

@@ -74,7 +74,7 @@ class TestReplayJobEvents():
        mock.call(0.000001),
    ])

    @mock.patch('awx.main.management.commands.replay_job_events.emit_channel_notification', lambda *a, **kw: None)
    @mock.patch('awx.main.management.commands.replay_job_events.emit_event_detail', lambda *a, **kw: None)
    def test_speed(self, mocker, replayer):
        replayer.run(3, 2)

@@ -1,6 +1,5 @@
from datetime import datetime
from django.utils.timezone import utc
from unittest import mock
import pytest

from awx.main.models import (JobEvent, ProjectUpdateEvent, AdHocCommandEvent,
@@ -18,16 +17,11 @@ from awx.main.models import (JobEvent, ProjectUpdateEvent, AdHocCommandEvent,
    datetime(2018, 1, 1).isoformat(), datetime(2018, 1, 1)
])
def test_event_parse_created(job_identifier, cls, created):
    with mock.patch.object(cls, 'objects') as manager:
        cls.create_from_data(**{
            job_identifier: 123,
            'created': created
        })
        expected_created = datetime(2018, 1, 1).replace(tzinfo=utc)
        manager.create.assert_called_with(**{
            job_identifier: 123,
            'created': expected_created
        })
    event = cls.create_from_data(**{
        job_identifier: 123,
        'created': created
    })
    assert event.created == datetime(2018, 1, 1).replace(tzinfo=utc)


@pytest.mark.parametrize('job_identifier, cls', [
@@ -38,24 +32,20 @@ def test_event_parse_created(job_identifier, cls, created):
    ['system_job_id', SystemJobEvent],
])
def test_playbook_event_strip_invalid_keys(job_identifier, cls):
    with mock.patch.object(cls, 'objects') as manager:
        cls.create_from_data(**{
            job_identifier: 123,
            'extra_key': 'extra_value'
        })
        manager.create.assert_called_with(**{job_identifier: 123})
    event = cls.create_from_data(**{
        job_identifier: 123,
        'extra_key': 'extra_value'
    })
    assert getattr(event, job_identifier) == 123
    assert not hasattr(event, 'extra_key')


@pytest.mark.parametrize('field', [
    'play', 'role', 'task', 'playbook'
])
def test_really_long_event_fields(field):
    with mock.patch.object(JobEvent, 'objects') as manager:
        JobEvent.create_from_data(**{
            'job_id': 123,
            'event_data': {field: 'X' * 4096}
        })
        manager.create.assert_called_with(**{
            'job_id': 123,
            'event_data': {field: 'X' * 1023 + '…'}
        })
    event = JobEvent.create_from_data(**{
        'job_id': 123,
        'event_data': {field: 'X' * 4096}
    })
    assert event.event_data[field] == 'X' * 1023 + '…'

@@ -379,7 +379,12 @@ def get_allowed_fields(obj, serializer_mapping):
        'oauth2accesstoken': ['last_used'],
        'oauth2application': ['client_secret']
    }
    field_blacklist = ACTIVITY_STREAM_FIELD_EXCLUSIONS.get(obj._meta.model_name, [])
    model_name = obj._meta.model_name
    field_blacklist = ACTIVITY_STREAM_FIELD_EXCLUSIONS.get(model_name, [])
    # see definition of from_db for CredentialType
    # injection logic of any managed types is incompatible with the activity stream
    if model_name == 'credentialtype' and obj.managed_by_tower and obj.namespace:
        field_blacklist.extend(['inputs', 'injectors'])
    if field_blacklist:
        allowed_fields = [f for f in allowed_fields if f not in field_blacklist]
    return allowed_fields
@@ -1,16 +1,8 @@
# Copyright (c) 2017 Ansible by Red Hat
# All Rights Reserved.

import logging
from itertools import chain

from django.core.cache import cache
from django.db.migrations.executor import MigrationExecutor
from django.db import connection


logger = logging.getLogger('awx.main.utils.db')


def get_all_field_names(model):
    # Implements compatibility with _meta.get_all_field_names
@@ -22,21 +14,3 @@ def get_all_field_names(model):
    # GenericForeignKey from the results.
    if not (field.many_to_one and field.related_model is None)
)))


def migration_in_progress_check_or_relase():
    '''A memcache flag is raised (set to True) to inform the cluster
    that a migration is ongoing (see main.apps.MainConfig.ready).
    If the flag is True, it is removed on this instance once
    models-db consistency is observed.
    The effective value of the migration flag is returned.
    '''
    migration_in_progress = cache.get('migration_in_progress', False)
    if migration_in_progress:
        executor = MigrationExecutor(connection)
        plan = executor.migration_plan(executor.loader.graph.leaf_nodes())
        if not bool(plan):
            logger.info('Detected that migration finished, migration flag taken down.')
            cache.delete('migration_in_progress')
            migration_in_progress = False
    return migration_in_progress
@@ -98,5 +98,6 @@ def handle_csp_violation(request):
|
||||
logger.error(json.loads(request.body))
|
||||
return HttpResponse(content=None)
|
||||
|
||||
|
||||
def handle_login_redirect(request):
|
||||
return HttpResponseRedirect("/#/login")
|
||||
|
||||
@@ -310,6 +310,9 @@ REST_FRAMEWORK = {
|
||||
'VIEW_DESCRIPTION_FUNCTION': 'awx.api.generics.get_view_description',
|
||||
'NON_FIELD_ERRORS_KEY': '__all__',
|
||||
'DEFAULT_VERSION': 'v2',
|
||||
# For swagger schema generation
|
||||
# see https://github.com/encode/django-rest-framework/pull/6532
|
||||
'DEFAULT_SCHEMA_CLASS': 'rest_framework.schemas.AutoSchema',
|
||||
#'URL_FORMAT_OVERRIDE': None,
|
||||
}
|
||||
|
||||
@@ -375,7 +378,7 @@ AUTH_BASIC_ENABLED = True
|
||||
|
||||
# If set, specifies a URL that unauthenticated users will be redirected to
|
||||
# when trying to access a UI page that requires authentication.
|
||||
LOGIN_REDIRECT_OVERRIDE = None
|
||||
LOGIN_REDIRECT_OVERRIDE = ''
|
||||
|
||||
# If set, serve only minified JS for UI.
|
||||
USE_MINIFIED_JS = False
|
||||
@@ -573,9 +576,6 @@ ANSIBLE_INVENTORY_UNPARSED_FAILED = True
|
||||
# Additional environment variables to be passed to the ansible subprocesses
|
||||
AWX_TASK_ENV = {}
|
||||
|
||||
# Flag to enable/disable updating hosts M2M when saving job events.
|
||||
CAPTURE_JOB_EVENT_HOSTS = False
|
||||
|
||||
# Rebuild Host Smart Inventory memberships.
|
||||
AWX_REBUILD_SMART_MEMBERSHIP = False
|
||||
|
||||
@@ -1208,6 +1208,19 @@ SILENCED_SYSTEM_CHECKS = ['models.E006']
|
||||
# Use middleware to get request statistics
|
||||
AWX_REQUEST_PROFILE = False
|
||||
|
||||
#
|
||||
# Optionally, AWX can generate DOT graphs
|
||||
# (http://www.graphviz.org/doc/info/lang.html) for per-request profiling
|
||||
# via gprof2dot (https://github.com/jrfonseca/gprof2dot)
|
||||
#
|
||||
# If you set this to True, you must `/var/lib/awx/venv/awx/bin/pip install gprof2dot`
|
||||
# .dot files will be saved in `/var/log/tower/profile/` and can be converted e.g.,
|
||||
#
|
||||
# ~ yum install graphviz
|
||||
# ~ dot -o profile.png -Tpng /var/log/tower/profile/some-profile-data.dot
|
||||
#
|
||||
AWX_REQUEST_PROFILE_WITH_DOT = False
|
||||
|
||||
# Delete temporary directories created to store playbook run-time
|
||||
AWX_CLEANUP_PATHS = True
|
||||
|
||||
|
||||
@@ -20,17 +20,7 @@ class SocialAuthMiddleware(SocialAuthExceptionMiddleware):
|
||||
|
||||
def process_request(self, request):
|
||||
if request.path.startswith('/sso'):
|
||||
# django-social keeps a list of backends in memory that it gathers
|
||||
# based on the value of settings.AUTHENTICATION_BACKENDS *at import
|
||||
# time*:
|
||||
# https://github.com/python-social-auth/social-app-django/blob/c1e2795b00b753d58a81fa6a0261d8dae1d9c73d/social_django/utils.py#L13
|
||||
#
|
||||
# our settings.AUTHENTICATION_BACKENDS can *change*
|
||||
# dynamically as Tower settings are changed (i.e., if somebody
|
||||
# configures Github OAuth2 integration), so we need to
|
||||
# _overwrite_ this in-memory value at the top of every request so
|
||||
# that we have the latest version
|
||||
# see: https://github.com/ansible/tower/issues/1979
|
||||
# See upgrade blocker note in requirements/README.md
|
||||
utils.BACKENDS = settings.AUTHENTICATION_BACKENDS
|
||||
token_key = request.COOKIES.get('token', '')
|
||||
token_key = urllib.parse.quote(urllib.parse.unquote(token_key).strip('"'))
|
||||
|
||||
@@ -78,7 +78,7 @@ def _update_m2m_from_expression(user, related, expr, remove=True):
|
||||
related.remove(user)
|
||||
|
||||
|
||||
def _update_org_from_attr(user, related, attr, remove, remove_admins):
|
||||
def _update_org_from_attr(user, related, attr, remove, remove_admins, remove_auditors):
|
||||
from awx.main.models import Organization
|
||||
|
||||
org_ids = []
|
||||
@@ -97,6 +97,10 @@ def _update_org_from_attr(user, related, attr, remove, remove_admins):
|
||||
[o.admin_role.members.remove(user) for o in
|
||||
Organization.objects.filter(Q(admin_role__members=user) & ~Q(id__in=org_ids))]
|
||||
|
||||
if remove_auditors:
|
||||
[o.auditor_role.members.remove(user) for o in
|
||||
Organization.objects.filter(Q(auditor_role__members=user) & ~Q(id__in=org_ids))]
|
||||
|
||||
|
||||
def update_user_orgs(backend, details, user=None, *args, **kwargs):
|
||||
'''
|
||||
@@ -162,9 +166,9 @@ def update_user_orgs_by_saml_attr(backend, details, user=None, *args, **kwargs):
|
||||
attr_admin_values = kwargs.get('response', {}).get('attributes', {}).get(org_map.get('saml_admin_attr'), [])
|
||||
attr_auditor_values = kwargs.get('response', {}).get('attributes', {}).get(org_map.get('saml_auditor_attr'), [])
|
||||
|
||||
_update_org_from_attr(user, "member_role", attr_values, remove, False)
|
||||
_update_org_from_attr(user, "admin_role", attr_admin_values, False, remove_admins)
|
||||
_update_org_from_attr(user, "auditor_role", attr_auditor_values, False, remove_auditors)
|
||||
_update_org_from_attr(user, "member_role", attr_values, remove, False, False)
|
||||
_update_org_from_attr(user, "admin_role", attr_admin_values, False, remove_admins, False)
|
||||
_update_org_from_attr(user, "auditor_role", attr_auditor_values, False, False, remove_auditors)
|
||||
|
||||
|
||||
def update_user_teams_by_saml_attr(backend, details, user=None, *args, **kwargs):
|
||||
|
||||
@@ -26,11 +26,9 @@ function($scope, $rootScope, ProcessErrors, GetBasePath, generateList,
|
||||
let notAdminAlreadyParams = {};
|
||||
|
||||
if ($scope.addType === 'Administrators') {
|
||||
Rest.setUrl(GetBasePath('organizations') + `${$state.params.organization_id}/object_roles`);
|
||||
Rest.setUrl(GetBasePath('organizations') + `${$state.params.organization_id}`);
|
||||
Rest.get().then(({data}) => {
|
||||
notAdminAlreadyParams.not__roles = data.results
|
||||
.filter(({name}) => name === i18n._('Admin'))
|
||||
.map(({id}) => id)[0];
|
||||
notAdminAlreadyParams.not__roles = data.summary_fields.object_roles.admin_role.id;
|
||||
init();
|
||||
});
|
||||
} else {
|
||||
|
||||
@@ -112,7 +112,7 @@ afterEach(() => {
|
||||
...
|
||||
```
|
||||
|
||||
**Test Attributes** -
|
||||
**Test Attributes** -
|
||||
It should be noted that the `dataCy` prop, as well as its equivalent attribute `data-cy`, are used as flags for any UI test that wants to avoid relying on brittle CSS selectors such as `nth-of-type()`.
|
||||
|
||||
## Handling API Errors
|
||||
@@ -296,7 +296,7 @@ The lingui library provides various React helpers for dealing with both marking
|
||||
|
||||
**Note:** Variables that are put inside the t-marked template tag will not be translated. If you have a variable string with text that needs translating, you must wrap it in ```i18n._(t``)``` where it is defined.
|
||||
|
||||
**Note:** We do not use the `I18n` consumer, `i18nMark` function, or `<Trans>` component lingui gives us access to in this repo. i18nMark does not actually replace the string in the UI (leading to the potential for untranslated bugs), and the other helpers are redundant. Settling on a consistent, single pattern helps us ease the mental overhead of the need to understand the ins and outs of the lingui API.
|
||||
**Note:** We try to avoid the `I18n` consumer, `i18nMark` function, or `<Trans>` component lingui gives us access to in this repo. i18nMark does not actually replace the string in the UI (leading to the potential for untranslated bugs), and the other helpers are redundant. Settling on a consistent, single pattern helps us ease the mental overhead of the need to understand the ins and outs of the lingui API.
|
||||
|
||||
You can learn more about the ways lingui and its React helpers work at [this link](https://lingui.js.org/tutorials/react-patterns.html).
|
||||
|
||||
|
||||
@@ -1,6 +1,31 @@
|
||||
# Search Iteration 1 Requirements:
|
||||
# Simple Search
|
||||
|
||||
## DONE
|
||||
## UX Considerations
|
||||
|
||||
Historically, the code that powers search in the AngularJS version of the AWX/Tower UI is very complex and prone to bugs. In order to reduce that complexity, we've made some UX decisions to help make the code easier to maintain.
|
||||
|
||||
**ALL query params namespaced and in url bar**
|
||||
|
||||
This includes lists that aren't necessarily hyperlinked, like lookup lists. The reason behind this is so we can treat the url bar as the source of truth for queries always. Any params that have both a key AND value that is in the defaultParams section of the qs config are stripped out of the search string (see "Encoding for UI vs. API" for more info on this point)
|
||||
|
||||
**Django fuzzy search (`?search=`) is not accessible outside of "advanced search"**
|
||||
|
||||
In current smart search typing a term with no key utilizes `?search=` i.e. for "foo" tag, `?search=foo` is given. `?search=` looks on a static list of field name "guesses" (such as name, description, etc.), as well as specific fields as defined for each endpoint (for example, the events endpoint looks for a "stdout" field as well). Due to the fact a key will always be present on the left-hand of simple search, it doesn't make sense to use `?search=` as the default.
|
||||
|
||||
We may allow passing of `?search=` through our future advanced search interface. Some details that were gathered in planning phases about `?search=` that might be helpful in the future:
|
||||
- `?search=` tags are OR'd together (union is returned).
|
||||
- `?search=foo&name=bar` returns items that have a name field of bar (not case insensitive) AND some text field with foo on it
|
||||
- `?search=foo&search=bar&name=baz` returns (foo in name OR foo in description OR ...) AND (bar in name OR bar in description OR ...) AND (baz in name)
|
||||
- similarly `?related__search=` looks on the static list of "guesses" for models related to the endpoint. The specific fields are not "searched" for `?related__search=`.
|
||||
- `?related__search=` not currently used in awx ui
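
The duplicate-key behavior described above can be sketched with the standard `URLSearchParams` API (the values here are hypothetical, not from AWX):

```javascript
// Duplicate ?search= terms are AND'd together by the API, while each term is
// OR'd across the endpoint's guessed fields (name, description, etc.).
const params = new URLSearchParams();
params.append('search', 'foo');
params.append('search', 'bar');
params.append('name', 'baz');
console.log(params.toString()); // → search=foo&search=bar&name=baz
```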
|
||||
|
||||
**A note on clicking a tag to putting it back into the search bar**
|
||||
|
||||
This was brought up as a nice to have when we were discussing our initial implementation of search in the new application. Since there isn't a way we would be able to know if the user created the tag from the simple or advanced search interface, we wouldn't know where to put it back. This breaks our idea of using the query params as the exclusive source of truth, so we've decided against implementing it for now.
|
||||
|
||||
## Tasklist
|
||||
|
||||
### DONE
|
||||
|
||||
- DONE update handleSearch to follow handleSort param
|
||||
- DONE update qsConfig columns to utilize isSearchable bool (just like isSortable bool)
|
||||
@@ -24,93 +49,233 @@
|
||||
- DONE add search filter removal test for qs.
|
||||
- DONE remove button for search tags of duplicate keys are broken, fix that
|
||||
|
||||
## TODO later on in 3.6: stuff to be finished for search iteration 1 (I'll card up an issue to tackle this. I plan on doing this after I finish the awx project branch work)
|
||||
### TODO pre-holiday break
|
||||
- Update COLUMNS to SORT_COLUMNS and SEARCH_COLUMNS
|
||||
- Update to using new PF Toolbar component (currently an experimental component)
|
||||
- Change the right-hand input based on the type of key selected on the left-hand side. In addition to text input, for our MVP we will support:
|
||||
- number input
|
||||
- select input (multiple-choice configured from UI or Options)
|
||||
- Update the following lists to have the following keys:
|
||||
|
||||
- currently handleSearch in Search.jsx always appends the `__icontains` suffix to make the filtering UX work as expected. Once we start adding number-based params we will want to change this behavior.
|
||||
- utilize new defaultSearchKey prop instead of relying on sort key
|
||||
- make access have username as the default key?
|
||||
- make default params only accept page, page_size and order_by
|
||||
- support custom order_by being typed in the url bar
|
||||
- fix up which keys are displayed in the various lists (note this will also require non-string widgetry to the right of the search key dropdown, for integers, dates, etc.)
|
||||
- fix any spacing issues like collision with action buttons and overall width of the search bar
|
||||
**Jobs list** (signed off earlier in chat)
|
||||
- Name (which is also the name of the job template) - search is ?name=jt
|
||||
- Job ID - search is ?id=13
|
||||
- Label name - search is ?labels__name=foo
|
||||
- Job type (dropdown on right with the different types) ?type = job
|
||||
- Created by (username) - search is ?created_by__username=admin
|
||||
- Status - search (dropdown on right with different statuses) is ?status=successful
|
||||
|
||||
## Lists affected in 3.6 timeframe
|
||||
Instances of jobs list include:
|
||||
- Jobs list
|
||||
- Host completed jobs list
|
||||
- JT completed jobs list
|
||||
|
||||
We should update all places to use consistent handleSearch/handleSort with paginated data list pattern. This shouldn't be too difficult to get hooked up, as the lists all inherit from PaginatedDataList, where search is hooked up. We will need to make sure the queryset config for each list includes the searchable boolean on keys that will need to be searched for.
|
||||
**Organization list**
|
||||
- Name - search is ?name=org
|
||||
- ? Team name (of a team in the org) - search is ?teams__name=ansible
|
||||
- ? Username (of a user in the org) - search is ?users__username=johndoe
|
||||
|
||||
orgs stuff
|
||||
- org list
|
||||
- org add/edit instance groups lookup list
|
||||
- org access list
|
||||
- org user/teams list in wizard
|
||||
- org teams list
|
||||
- org notifications list
|
||||
jt stuff
|
||||
- jt list
|
||||
- jt add/edit inventory, project, credentials, instance groups lookups lists
|
||||
- jt access list
|
||||
- jt user/teams list in wizard
|
||||
- jt notifications list
|
||||
- jt schedules list
|
||||
- jt completed jobs list
|
||||
jobs stuff
|
||||
- jobs list
|
||||
Instances of orgs list include:
|
||||
- Orgs list
|
||||
- User orgs list
|
||||
- Lookup on Project
|
||||
- Lookup on Credential
|
||||
- Lookup on Inventory
|
||||
- User access add wizard list
|
||||
- Team access add wizard list
|
||||
|
||||
# Search code details
|
||||
**Instance Groups list**
|
||||
- Name - search is ?name=ig
|
||||
- ? is_containerized boolean choice (doesn't work right now in API but will soon) - search is ?is_containerized=true
|
||||
- ? credential name - search is ?credentials__name=kubey
|
||||
|
||||
## Search component
|
||||
Instance of instance groups list include:
|
||||
- Lookup on Org
|
||||
- Lookup on JT
|
||||
- Lookup on Inventory
|
||||
|
||||
Search is configured using the qsConfig in a similar way to sort. Columns are passed as an array, as defined in the screen where the list is located. You pass a bool isSearchable (an analog to isSortable) to mark that a certain key should show up in the left-hand dropdown of the search bar.
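
A hypothetical column configuration following this pattern (the prop names are illustrative, not necessarily the exact AWX props):

```javascript
// Columns as they might be defined on a list screen; isSearchable marks
// keys that should appear in the left-hand dropdown of the search bar.
const columns = [
  { name: 'Name', key: 'name', isSortable: true, isSearchable: true },
  { name: 'Modified', key: 'modified', isSortable: true, isSearchable: false },
];

// The search bar would derive its dropdown options like this:
const searchableKeys = columns.filter(c => c.isSearchable).map(c => c.key);
console.log(searchableKeys); // only 'name' is searchable here
```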
|
||||
**Users list**
|
||||
- Username - search is ?username=johndoe
|
||||
- First Name - search is ?first_name=John
|
||||
- Last Name - search is ?last_name=Doe
|
||||
- ? (if not superfluous, would not include on Team users list) Team Name - search is ?teams__name=team_of_john_does (note API issue: User has no field named "teams")
|
||||
- ? (only for access or permissions list) Role Name - search is ?roles__name=Admin (note API issue: Role has no field "name")
|
||||
- ? (if not superfluous, would not include on Organization users list) Org Name - search is ?organizations__name=org_of_john_does
|
||||
|
||||
If you don't pass any columns, a default of isSearchable true will be added to a name column, which is nearly universally shared throughout the models of awx.
|
||||
Instance of user lists include:
|
||||
- User list
|
||||
- Org user list
|
||||
- Access list for Org, JT, Project, Credential, Inventory, User and Team
|
||||
- Access list for JT
|
||||
- Access list Project
|
||||
- Access list for Credential
|
||||
- Access list for Inventory
|
||||
- Access list for User
|
||||
- Access list for Team
|
||||
- Team add users list
|
||||
- Users list in access wizard (to add new roles for a particular list) for Org
|
||||
- Users list in access wizard (to add new roles for a particular list) for JT
|
||||
- Users list in access wizard (to add new roles for a particular list) for Project
|
||||
- Users list in access wizard (to add new roles for a particular list) for Credential
|
||||
- Users list in access wizard (to add new roles for a particular list) for Inventory
|
||||
|
||||
**Teams list**
|
||||
- Name - search is ?name=teamname
|
||||
- ? Username (of a user in the team) - search is ?users__username=johndoe
|
||||
- ? (if not superfluous, would not include on Organizations teams list) Org Name - search is ?organizations__name=org_of_john_does
|
||||
|
||||
Instance of team lists include:
|
||||
- Team list
|
||||
- Org team list
|
||||
- User team list
|
||||
- Team list in access wizard (to add new roles for a particular list) for Org
|
||||
- Team list in access wizard (to add new roles for a particular list) for JT
|
||||
- Team list in access wizard (to add new roles for a particular list) for Project
|
||||
- Team list in access wizard (to add new roles for a particular list) for Credential
|
||||
- Team list in access wizard (to add new roles for a particular list) for Inventory
|
||||
|
||||
**Credentials list**
|
||||
- Name
|
||||
- ? Type (dropdown on right with different types)
|
||||
- ? Created by (username)
|
||||
- ? Modified by (username)
|
||||
|
||||
Instance of credential lists include:
|
||||
- Credential list
|
||||
- Lookup for JT
|
||||
- Lookup for Project
|
||||
- User access add wizard list
|
||||
- Team access add wizard list
|
||||
|
||||
**Projects list**
|
||||
- Name - search is ?name=proj
|
||||
- ? Type (dropdown on right with different types) - search is scm_type=git
|
||||
- ? SCM URL - search is ?scm_url=github.com/ansible/test-playbooks
|
||||
- ? Created by (username) - search is ?created_by__username=admin
|
||||
- ? Modified by (username) - search is ?modified_by__username=admin
|
||||
|
||||
Instance of project lists include:
|
||||
- Project list
|
||||
- Lookup for JT
|
||||
- User access add wizard list
|
||||
- Team access add wizard list
|
||||
|
||||
**Templates list**
|
||||
- Name - search is ?name=cleanup
|
||||
- ? Type (dropdown on right with different types) - search is ?type=playbook_run
|
||||
- ? Playbook name - search is ?job_template__playbook=debug.yml
|
||||
- ? Created by (username) - search is ?created_by__username=admin
|
||||
- ? Modified by (username) - search is ?modified_by__username=admin
|
||||
|
||||
Instance of template lists include:
|
||||
- Template list
|
||||
- Project Templates list
|
||||
|
||||
**Inventories list**
|
||||
- Name - search is ?name=inv
|
||||
- ? Created by (username) - search is ?created_by__username=admin
|
||||
- ? Modified by (username) - search is ?modified_by__username=admin
|
||||
|
||||
Instance of inventory lists include:
|
||||
- Inventory list
|
||||
- Lookup for JT
|
||||
- User access add wizard list
|
||||
- Team access add wizard list
|
||||
|
||||
**Groups list**
|
||||
- Name - search is ?name=group_name
|
||||
- ? Created by (username) - search is ?created_by__username=admin
|
||||
- ? Modified by (username) - search is ?modified_by__username=admin
|
||||
|
||||
Instance of group lists include:
|
||||
- Group list
|
||||
|
||||
**Hosts list**
|
||||
- Name - search is ?name=hostname
|
||||
- ? Created by (username) - search is ?created_by__username=admin
|
||||
- ? Modified by (username) - search is ?modified_by__username=admin
|
||||
|
||||
Instance of host lists include:
|
||||
- Host list
|
||||
|
||||
**Notifications list**
|
||||
- Name - search is ?name=notification_template_name
|
||||
- ? Type (dropdown on right with different types) - search is ?type=slack
|
||||
- ? Created by (username) - search is ?created_by__username=admin
|
||||
- ? Modified by (username) - search is ?modified_by__username=admin
|
||||
|
||||
Instance of notification lists include:
|
||||
- Org notification list
|
||||
- JT notification list
|
||||
- Project notification list
|
||||
|
||||
### TODO backlog
|
||||
- Change the right-hand input based on the type of key selected on the left-hand side. We will eventually want to support:
|
||||
- lookup input (selection of particular resources, based on API list endpoints)
|
||||
- date picker input
|
||||
- Update the following lists to have the following keys:
|
||||
- Update all __name and __username related field search-based keys to be type-ahead lookup based searches
|
||||
|
||||
## Code Details
|
||||
|
||||
### Search component
|
||||
|
||||
The component looks like this:
|
||||
|
||||
```
|
||||
<Search
|
||||
qsConfig={qsConfig} // used to get namespace (when tags are modified
|
||||
// they append namespace to query params)
|
||||
// also used to get "type" of fields (i.e. integer
|
||||
// fields should get number picker instead of text box)
|
||||
qsConfig={qsConfig}
|
||||
columns={columns}
|
||||
onSearch={onSearch}
|
||||
/>
|
||||
```
|
||||
|
||||
## ListHeader component
|
||||
**qsConfig** is used to get namespace so that multiple lists can be on the page. When tags are modified they append namespace to query params. The qsConfig is also used to get "type" of fields in order to correctly parse values as int or date as it is translating.
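
The namespacing idea can be sketched as follows (assumed helper, not the actual qs util):

```javascript
// Prefixing each list's params with its qsConfig namespace lets several
// lists coexist in one url bar without their params colliding.
function namespaceParams(namespace, params) {
  return Object.fromEntries(
    Object.entries(params).map(([key, value]) => [`${namespace}.${key}`, value])
  );
}

console.log(namespaceParams('org', { page: 2 })); // → { 'org.page': 2 }
```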
|
||||
|
||||
DataListToolbar, EmptyListControls, and FilterTags components were created/moved to a new sub-component of PaginatedDataList, ListHeader. This allowed us to consolidate the logic between both lists with data (which need to show search, sort, any search tags currently active, and actions) as well as empty lists (which need to show search tags currently active so they can be removed, potentially getting you back to a "list-has-data" state, as well as a subset of options still valid (such as "add")).
|
||||
**columns** are passed as an array, as defined in the screen where the list is located. You pass a bool `isDefault` to indicate that should be the key that shows up in the left-hand dropdown as default in the UI. If you don't pass any columns, a default of `isDefault=true` will be added to a name column, which is nearly universally shared throughout the models of awx.
|
||||
|
||||
search and sort are passed callbacks from functions defined in ListHeader. These will be the following.
|
||||
There is a type attribute that can be `'string'`, `'number'` or `'choice'` (and in the future, `'date'` and `'lookup'`), which will change the type of input on the right-hand side of the search bar. For a key that has a set number of choices, you will pass a choices attribute, which is an array in the format `choices: [{ label: 'Foo', value: 'foo' }]`.
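
A sketch of search-column entries using these attributes (the keys and choices are illustrative):

```javascript
// type drives which input widget renders on the right-hand side of the
// search bar; 'choice' columns additionally carry their allowed values.
const searchColumns = [
  { name: 'Name', key: 'name', type: 'string', isDefault: true },
  {
    name: 'Status',
    key: 'status',
    type: 'choice',
    choices: [
      { label: 'Successful', value: 'successful' },
      { label: 'Failed', value: 'failed' },
    ],
  },
];
```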
|
||||
|
||||
```
|
||||
handleSort (sortedColumnKey, sortOrder) {
|
||||
this.pushHistoryState({
|
||||
order_by: sortOrder === 'ascending' ? sortedColumnKey : `-${sortedColumnKey}`,
|
||||
page: null,
|
||||
});
|
||||
}
|
||||
**onSearch** calls the `mergeParams` qs util in order to add new tags to the queryset. mergeParams is used so that we can support duplicate keys (see mergeParams vs. replaceParams for more info).
|
||||
|
||||
handleSearch (key, value, remove) {
|
||||
this.pushHistoryState({
|
||||
// ... use key and value to push a new value to the param
|
||||
// if remove false you add a new tag w key value if remove true,
|
||||
// you are removing one
|
||||
});
|
||||
}
|
||||
```
|
||||
### ListHeader component
|
||||
|
||||
Similarly, there are handleRemove and handleRemoveAll functions. All of these functions act on the react-router history using the pushHistoryState function. This causes the query params in the url to update, which in turn triggers change handlers that will re-fetch data.
|
||||
`DataListToolbar`, `EmptyListControls`, and `FilterTags` components were created or moved to a new sub-component of `PaginatedDataList`, `ListHeader`. This allowed us to consolidate the logic between both lists with data (which need to show search, sort, any search tags currently active, and actions) as well as empty lists (which need to show search tags currently active so they can be removed, potentially getting you back to a "list-has-data" state, as well as a subset of options still valid, such as "add").
|
||||
|
||||
## FilterTags component
|
||||
The ability to search and remove filters, as well as sort the list is handled through callbacks which are passed from functions defined in `ListHeader`. These are the following:
|
||||
|
||||
Similar to the way the list grabs data based on changes to the react-router params, the FilterTags component updates when new params are added. This component is a fairly straight-forward map (only slightly complex, because it needed to do a nested map over any values with duplicate keys that were represented by an inner-array).
|
||||
- `handleSort(key, direction)` - use key and direction of sort to change the order_by value in the queryset
|
||||
- `handleSearch(key, value)` - use key and value to push a new value to the param
|
||||
- `handleRemove(key, value)` - use key and value to remove a value to the param
|
||||
- `handleRemoveAll()` - remove all non-default params
|
||||
|
||||
Currently the filter tags do not display the key, though that data is available and they could very easily do so.
|
||||
All of these functions act on the react-router history using the `pushHistoryState` function. This causes the query params in the url to update, which in turn triggers change handlers that will re-fetch data for the lists.
|
||||
|
||||
## QS Updates (and supporting duplicate keys)
|
||||
**a note on sort_columns and search_columns**
|
||||
|
||||
The logic that was updated to handle search tags can be found in the qs.js util file.
|
||||
We have split column configuration into separate search and sort column array props; these are passed to the Search and Sort components respectively. Both accept an isDefault prop for one of the items in the array to be the default option selected when going to the page. Sort column items can pass an isNumeric boolean in order to change the iconography of the sort UI element. Search column items can pass type (and, if applicable, choices) in order to configure the right-hand side of the search bar.
|
||||
|
||||
From a UX perspective, we wanted to be able to support searching on the same key multiple times (i.e. searching for things like ?foo=bar&foo=baz). We do this by creating an array of all values. i.e.:
|
||||
### FilterTags component
|
||||
|
||||
Similar to the way the list grabs data based on changes to the react-router params, the `FilterTags` component updates when new params are added. This component is a fairly straight-forward map (only slightly complex, because it needed to do a nested map over any values with duplicate keys that were represented by an inner-array). Both key and value are displayed for the tag.
|
||||
|
||||
### qs utility
|
||||
|
||||
The qs (queryset) utility is used to make the search speak the language of the REST API. The main functions of the utilities are to:
|
||||
- add, replace and remove filters
|
||||
- translate filters as url params (for linking and maintaining state), in-memory representation (as JS objects), and params that Django REST Framework understands.
|
||||
|
||||
More info in the below sections:
|
||||
|
||||
#### Encoding for UI vs. API
|
||||
|
||||
For the UI url params, we want to only encode those params that aren't defaults, as the default behavior was defined through configuration and we don't need these in the url as a source of truth. For the API, we need to pass these params so that they are taken into account when the response is built.
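
A minimal sketch of stripping defaults before writing the UI url (assumed names; the real qs util's encodeQueryString differs in detail):

```javascript
// Params whose key AND value match the qs config's defaultParams are
// omitted from the UI url; everything else is serialized normally.
const defaultParams = { page: 1, page_size: 5, order_by: 'name' };

function encodeNonDefaults(params) {
  return Object.entries(params)
    .filter(([key, value]) => defaultParams[key] !== value)
    .map(([key, value]) => `${key}=${encodeURIComponent(value)}`)
    .join('&');
}

// page_size=5 matches a default and is stripped; page=2 does not.
console.log(encodeNonDefaults({ page: 2, page_size: 5, name: 'foo' }));
// → page=2&name=foo
```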
|
||||
|
||||
#### mergeParams vs. replaceParams
|
||||
|
||||
**mergeParams** is used to support multiple values for the same key
|
||||
|
||||
From a UX perspective, we wanted to be able to support searching on the same key multiple times (i.e. searching for things like `?foo=bar&foo=baz`). We do this by creating an array of all values. i.e.:
|
||||
|
||||
```
|
||||
{
|
||||
@@ -118,36 +283,15 @@ From a UX perspective, we wanted to be able to support searching on the same key
|
||||
}
|
||||
```
|
||||
|
||||
Changes to encodeQueryString and parseQueryString were made to convert between a single value string representation and multiple value array representations. Test cases were also added to qs.test.js.
|
||||
Concatenating terms in this way gives you the intersection of both terms (i.e. foo must be "bar" and "baz"). This is helpful for the most-common type of searching, substring (`__icontains`) searches. This will increase filtering, allowing the user to drill-down into the list as terms are added.
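
The accumulation behavior can be sketched like this (an assumed implementation, not the actual qs util code):

```javascript
// mergeParams-style merge: a repeated key accumulates its values into an
// array, so ?foo=bar&foo=baz round-trips through the object representation.
function mergeParams(oldParams, newParams) {
  const merged = { ...oldParams };
  for (const [key, value] of Object.entries(newParams)) {
    if (merged[key] === undefined) {
      merged[key] = value;
    } else {
      merged[key] = [].concat(merged[key], value);
    }
  }
  return merged;
}

console.log(mergeParams({ foo: 'bar' }, { foo: 'baz' }));
// → { foo: [ 'bar', 'baz' ] }
```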
|
||||
|
||||
In addition, we needed to make sure any changes to the params that are not handled by search (page, page_size, and order_by) were updated by replacing the single value, rather than adding multiple values with the array representation. This additional piece of the specification was made in the newly created addParams and removeParams qs functions and a few test-cases were written to verify this.
|
||||
**replaceParams** is used to support sorting, setting page_size, etc. These params only allow one choice, and we need to replace a particular key's value if one is passed.
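
By contrast, the replace behavior is a straight overwrite (again an assumed sketch):

```javascript
// replaceParams-style merge: single-value params like page or order_by are
// overwritten rather than accumulated into an array.
function replaceParams(oldParams, newParams) {
  return { ...oldParams, ...newParams };
}

console.log(replaceParams({ page: 1, order_by: 'name' }, { page: 3 }));
// → { page: 3, order_by: 'name' }
```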
|
||||
|
||||
The api is coupled with the qs util through the paramsSerializer, due to the fact we need axios to support the array (for duplicate key values) in the object representation of the params passed to the get request. This is done where axios is configured in the Base.js file, so all requests and request types should support our array syntax for duplicate keys.
|
||||
#### Working with REST API
|
||||
|
||||
# UX considerations
|
||||
The REST API is coupled with the qs util through the `paramsSerializer`, due to the fact we need axios to support the array for duplicate key values in the object representation of the params to pass to the get request. This is done where axios is configured in the Base.js file, so all requests and request types should support our array syntax for duplicate keys automatically.
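
A hedged sketch of such a serializer (axios before 1.0 accepted a plain function for `paramsSerializer`; the implementation below is illustrative, not AWX's):

```javascript
// Expand array values into repeated keys so { foo: ['bar', 'baz'] }
// serializes as foo=bar&foo=baz, the form Django REST Framework expects.
function paramsSerializer(params) {
  return Object.entries(params)
    .flatMap(([key, value]) =>
      [].concat(value).map(v => `${key}=${encodeURIComponent(v)}`)
    )
    .join('&');
}

console.log(paramsSerializer({ foo: ['bar', 'baz'], page: 2 }));
// → foo=bar&foo=baz&page=2
```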
|
||||
|
||||
**UX should be more tags always equates to more filtering. (so "and" logic not "or")**
|
||||
|
||||
Also, for simple search, results should be returned that partially match the value (i.e. use the `__icontains` suffix)
|
||||
|
||||
**ALL query params namespaced and in url bar**
|
||||
|
||||
- this includes lists that aren't necessarily hyperlinked, like lookup lists.
|
||||
- the reason behind this is so we can treat the url bar as the source of truth for queries always
|
||||
- currently /#/organizations/add?lookup.name=bar -> will eventually be something like /#/organizations/add?ig_lookup.name=bar
|
||||
- any params that have both a key AND value that is in the defaultParams section of the qs config should be stripped out of the search string
|
||||
|
||||
**django fuzzy search (?search=) is not accessible outside of "advanced search"**
|
||||
|
||||
- How "search" query param works
|
||||
- in current smart search typing a term with no key utilizes search= i.e. for "foo" tag, ?search=foo is given
|
||||
- search= looks on a static list of field name "guesses" (such as name, description, etc.), as well as specific fields as defined for each endpoint (for example, the events endpoint looks for a "stdout" field as well)
|
||||
- note that search= tags are OR'd together
|
||||
- search=foo&name=bar returns items that have a name field of bar (not case insensitive) AND some text field with foo on it
|
||||
- search=foo&search=bar&name=baz returns (foo in name OR foo in description OR ...) AND (bar in name OR bar in description OR ...) AND (baz in name)
|
||||
- similarly, ?related__search= looks at the static list of "guesses" on models related to the endpoint
|
||||
- the specific fields are not "searched" for related__search
|
||||
- related__search is not currently used in the awx ui
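The combination logic above (each search= term OR'd across the guess fields, terms AND'd together, and field filters AND'd on top) can be illustrated with a toy client-side matcher — a hypothetical helper, not the actual Django implementation:

```javascript
// Hypothetical stand-in for the server's field-name "guesses".
const GUESS_FIELDS = ['name', 'description'];

// A search term matches if ANY guess field contains it (OR);
// ALL search terms and ALL field filters must match (AND).
function matches(item, searchTerms, fieldFilters) {
  const searchOk = searchTerms.every(term =>
    GUESS_FIELDS.some(field => (item[field] || '').includes(term))
  );
  const filtersOk = Object.entries(fieldFilters).every(
    ([field, value]) => item[field] === value
  );
  return searchOk && filtersOk;
}

// Models ?search=foo&search=bar&name=baz from the example above:
console.log(matches({ name: 'baz', description: 'foo bar' }, ['foo', 'bar'], { name: 'baz' }));
// → true: foo and bar each hit description, and name equals baz exactly
```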
|
||||
# Advanced Search - this section is a mess, update eventually
|
||||
|
||||
**a note on typing in a smart search query**
|
||||
|
||||
@@ -155,12 +299,6 @@ In order to not support a special "language" or "syntax" for crafting the query
|
||||
|
||||
Since all search bars are represented in the url, it is acceptable for users who want to filter results in a single step to type the filter directly into the url bar.
|
||||
|
||||
**a note on clicking a tag to put it back into the search bar**
|
||||
|
||||
This was brought up as a nice-to-have when we were discussing features. There isn't a way for us to know whether the user created a tag from the smart search or the simple search interface; that info is not traceable using the query params as the exclusive source of truth.
|
||||
|
||||
We have decided not to tackle this up front with our advanced search implementation, and may come back to it based on user feedback at a later time.
|
||||
|
||||
# Advanced search notes
|
||||
|
||||
Current thinking is that Advanced Search will be post-3.6, or at least late in 3.6, after awx features and "simple search" (with the left dropdown and right input) land for the phase 1 lists above.
|
||||
|
||||
@@ -1,6 +1,7 @@
|
||||
module.exports = {
|
||||
collectCoverageFrom: [
|
||||
'src/**/*.{js,jsx}'
|
||||
'src/**/*.{js,jsx}',
|
||||
'testUtils/**/*.{js,jsx}'
|
||||
],
|
||||
coveragePathIgnorePatterns: [
|
||||
'<rootDir>/src/locales',
|
||||
|
||||
173
awx/ui_next/package-lock.json
generated
@@ -1787,43 +1787,51 @@
|
||||
"dev": true
|
||||
},
|
||||
"@patternfly/patternfly": {
|
||||
"version": "2.40.2",
|
||||
"resolved": "https://registry.npmjs.org/@patternfly/patternfly/-/patternfly-2.40.2.tgz",
|
||||
"integrity": "sha512-KCPQ6EL39xJen/B67MGv56i3h6bU5l7FD6f5IYU30z+ed2gM8zAYI3mPKNV05TMJv6+EQfp6O7dqCM3PJ8Q1yw=="
|
||||
"version": "2.46.1",
|
||||
"resolved": "https://registry.npmjs.org/@patternfly/patternfly/-/patternfly-2.46.1.tgz",
|
||||
"integrity": "sha512-3lReQMQvedwEhKOcOw7rE3RPRXMtRit+Yj1IOO7fl5EHaZaNqA1/3w9mWNCpx52M+WD8scBkgqtVx74OU7Jemw=="
|
||||
},
|
||||
"@patternfly/react-core": {
|
||||
"version": "3.120.2",
|
||||
"resolved": "https://registry.npmjs.org/@patternfly/react-core/-/react-core-3.120.2.tgz",
|
||||
"integrity": "sha512-PgV5w+3NlXK7hKvu0YY1pjXgd56dLwbIWE4m72JstxJIp/vpRShB6bfiSYNQGVi2ZQUudQTSH5sVWaBqXUaquw==",
|
||||
"version": "3.129.3",
|
||||
"resolved": "https://registry.npmjs.org/@patternfly/react-core/-/react-core-3.129.3.tgz",
|
||||
"integrity": "sha512-QiTTUqA0y55YbDtzjlzKmZ6pGQqxyCF14TBQFH3rXI2RV8Z4C6HyyILm09BD/D/ITQIhT82dp+6nRY/mQOqlkw==",
|
||||
"requires": {
|
||||
"@patternfly/react-icons": "^3.14.15",
|
||||
"@patternfly/react-styles": "^3.6.2",
|
||||
"@patternfly/react-tokens": "^2.7.2",
|
||||
"@patternfly/react-icons": "^3.14.28",
|
||||
"@patternfly/react-styles": "^3.6.15",
|
||||
"@patternfly/react-tokens": "^2.7.14",
|
||||
"emotion": "^9.2.9",
|
||||
"exenv": "^1.2.2",
|
||||
"focus-trap-react": "^4.0.1",
|
||||
"tippy.js": "3.4.1"
|
||||
"tippy.js": "5.1.2"
|
||||
},
|
||||
"dependencies": {
|
||||
"@patternfly/react-icons": {
|
||||
"version": "3.14.28",
|
||||
"resolved": "https://registry.npmjs.org/@patternfly/react-icons/-/react-icons-3.14.28.tgz",
|
||||
"integrity": "sha512-xrmcaLaHvkixPdTuBfR+vPD2prUYxKq97TGs97lfo0K4g7Wi6lD30zMlmwzonWy1IuOHATiEwf3j7mXAqQXHlQ==",
|
||||
"requires": {
|
||||
"@fortawesome/free-brands-svg-icons": "^5.8.1"
|
||||
}
|
||||
},
|
||||
"@patternfly/react-tokens": {
|
||||
"version": "2.7.2",
|
||||
"resolved": "https://registry.npmjs.org/@patternfly/react-tokens/-/react-tokens-2.7.2.tgz",
|
||||
"integrity": "sha512-3QslQUErDLXGTzp2iGQNJD1UjZ+1NqwavOlsbxACUZ6LjXyJ7Y4TZbxDQrpgzPsD1SFPEVWufzpdjjtRBZ/b7g=="
|
||||
"version": "2.7.14",
|
||||
"resolved": "https://registry.npmjs.org/@patternfly/react-tokens/-/react-tokens-2.7.14.tgz",
|
||||
"integrity": "sha512-HVa1fe7H4NRRv6lmezpvW2TfIDF7bSbKvhMmCVqBk80Fd3wfLcPhacnWdt6PLWq7WX4dVx7dF7+v4sFh8RczSg=="
|
||||
}
|
||||
}
|
||||
},
|
||||
"@patternfly/react-icons": {
|
||||
"version": "3.14.15",
|
||||
"resolved": "https://registry.npmjs.org/@patternfly/react-icons/-/react-icons-3.14.15.tgz",
|
||||
"integrity": "sha512-7mIr1nzAXu6CdxKnhJGggIghx3DCaFXv6an+mfP/IwWifsLhcpE1c0iYkmVkvlI9X4cQAzeg9VfEGR7quhPOlA==",
|
||||
"version": "3.14.28",
|
||||
"resolved": "https://registry.npmjs.org/@patternfly/react-icons/-/react-icons-3.14.28.tgz",
|
||||
"integrity": "sha512-xrmcaLaHvkixPdTuBfR+vPD2prUYxKq97TGs97lfo0K4g7Wi6lD30zMlmwzonWy1IuOHATiEwf3j7mXAqQXHlQ==",
|
||||
"requires": {
|
||||
"@fortawesome/free-brands-svg-icons": "^5.8.1"
|
||||
}
|
||||
},
|
||||
"@patternfly/react-styles": {
|
||||
"version": "3.6.2",
|
||||
"resolved": "https://registry.npmjs.org/@patternfly/react-styles/-/react-styles-3.6.2.tgz",
|
||||
"integrity": "sha512-WRXPC1R/qL+i/ANnrA0nEe6CcLHLZJIKWzSJ4gS2h9VdHvKySEdIlk9EtAZ0dNkv3whANjaKlR/n2/uFuXlzyw==",
|
||||
"version": "3.6.15",
|
||||
"resolved": "https://registry.npmjs.org/@patternfly/react-styles/-/react-styles-3.6.15.tgz",
|
||||
"integrity": "sha512-9phudtz138QV82o60XvbNkeYPzLgz0DekEeu8cIX2A2yO1WzZbgXL5VPWB8bF/y+9EFyl+w8tu3ReQcvh7ULEw==",
|
||||
"requires": {
|
||||
"@babel/helper-plugin-utils": "^7.0.0-beta.48",
|
||||
"camel-case": "^3.0.0",
|
||||
@@ -1855,9 +1863,9 @@
|
||||
},
|
||||
"dependencies": {
|
||||
"acorn": {
|
||||
"version": "6.3.0",
|
||||
"resolved": "https://registry.npmjs.org/acorn/-/acorn-6.3.0.tgz",
|
||||
"integrity": "sha512-/czfa8BwS88b9gWQVhc8eknunSA2DoJpJyTQkhheIf5E48u1N0R4q/YxxsAeqRrmK9TQ/uYfgLDfZo91UlANIA=="
|
||||
"version": "6.4.0",
|
||||
"resolved": "https://registry.npmjs.org/acorn/-/acorn-6.4.0.tgz",
|
||||
"integrity": "sha512-gac8OEcQ2Li1dxIEWGZzsp2BitJxwkwcOm0zHAJLcPJaVvm58FRnk6RkuLRpU1EujipU2ZFODv2P9DLMfnV8mw=="
|
||||
}
|
||||
}
|
||||
},
|
||||
@@ -1879,9 +1887,9 @@
|
||||
"integrity": "sha1-/cpRzuYTOJXjyI1TXOSdv/YqRjM="
|
||||
},
|
||||
"jsdom": {
|
||||
"version": "15.2.0",
|
||||
"resolved": "https://registry.npmjs.org/jsdom/-/jsdom-15.2.0.tgz",
|
||||
"integrity": "sha512-+hRyEfjRPFwTYMmSQ3/f7U9nP8ZNZmbkmUek760ZpxnCPWJIhaaLRuUSvpJ36fZKCGENxLwxClzwpOpnXNfChQ==",
|
||||
"version": "15.2.1",
|
||||
"resolved": "https://registry.npmjs.org/jsdom/-/jsdom-15.2.1.tgz",
|
||||
"integrity": "sha512-fAl1W0/7T2G5vURSyxBzrJ1LSdQn6Tr5UX/xD4PXDx/PDgwygedfW6El/KIj3xJ7FU61TTYnc/l/B7P49Eqt6g==",
|
||||
"requires": {
|
||||
"abab": "^2.0.0",
|
||||
"acorn": "^7.1.0",
|
||||
@@ -1893,7 +1901,7 @@
|
||||
"domexception": "^1.0.1",
|
||||
"escodegen": "^1.11.1",
|
||||
"html-encoding-sniffer": "^1.0.2",
|
||||
"nwsapi": "^2.1.4",
|
||||
"nwsapi": "^2.2.0",
|
||||
"parse5": "5.1.0",
|
||||
"pn": "^1.1.0",
|
||||
"request": "^2.88.0",
|
||||
@@ -1912,9 +1920,9 @@
|
||||
},
|
||||
"dependencies": {
|
||||
"cssom": {
|
||||
"version": "0.4.1",
|
||||
"resolved": "https://registry.npmjs.org/cssom/-/cssom-0.4.1.tgz",
|
||||
"integrity": "sha512-6Aajq0XmukE7HdXUU6IoSWuH1H6gH9z6qmagsstTiN7cW2FNTsb+J2Chs+ufPgZCsV/yo8oaEudQLrb9dGxSVQ=="
|
||||
"version": "0.4.4",
|
||||
"resolved": "https://registry.npmjs.org/cssom/-/cssom-0.4.4.tgz",
|
||||
"integrity": "sha512-p3pvU7r1MyyqbTk+WbNJIgJjG2VmTIaB10rI93LzVPrmDJKkzKYMtxxyAvQXR/NS6otuzveI7+7BBq3SjBS2mw=="
|
||||
},
|
||||
"cssstyle": {
|
||||
"version": "2.0.0",
|
||||
@@ -1934,9 +1942,9 @@
|
||||
}
|
||||
},
|
||||
"nwsapi": {
|
||||
"version": "2.1.4",
|
||||
"resolved": "https://registry.npmjs.org/nwsapi/-/nwsapi-2.1.4.tgz",
|
||||
"integrity": "sha512-iGfd9Y6SFdTNldEy2L0GUhcarIutFmk+MPWIn9dmj8NMIup03G08uUF2KGbbmv/Ux4RT0VZJoP/sVbWA6d/VIw=="
|
||||
"version": "2.2.0",
|
||||
"resolved": "https://registry.npmjs.org/nwsapi/-/nwsapi-2.2.0.tgz",
|
||||
"integrity": "sha512-h2AatdwYH+JHiZpv7pt/gSX1XoRGb7L/qSIeuqA6GwYoF9w1vP1cw42TO0aI2pNyshRK5893hNSl+1//vHK7hQ=="
|
||||
},
|
||||
"parse5": {
|
||||
"version": "5.1.0",
|
||||
@@ -1944,19 +1952,19 @@
|
||||
"integrity": "sha512-fxNG2sQjHvlVAYmzBZS9YlDp6PTSSDwa98vkD4QgVDDCAo84z5X1t5XyJQ62ImdLXx5NdIIfihey6xpum9/gRQ=="
|
||||
},
|
||||
"request-promise-core": {
|
||||
"version": "1.1.2",
|
||||
"resolved": "https://registry.npmjs.org/request-promise-core/-/request-promise-core-1.1.2.tgz",
|
||||
"integrity": "sha512-UHYyq1MO8GsefGEt7EprS8UrXsm1TxEvFUX1IMTuSLU2Rh7fTIdFtl8xD7JiEYiWU2dl+NYAjCTksTehQUxPag==",
|
||||
"version": "1.1.3",
|
||||
"resolved": "https://registry.npmjs.org/request-promise-core/-/request-promise-core-1.1.3.tgz",
|
||||
"integrity": "sha512-QIs2+ArIGQVp5ZYbWD5ZLCY29D5CfWizP8eWnm8FoGD1TX61veauETVQbrV60662V0oFBkrDOuaBI8XgtuyYAQ==",
|
||||
"requires": {
|
||||
"lodash": "^4.17.11"
|
||||
"lodash": "^4.17.15"
|
||||
}
|
||||
},
|
||||
"request-promise-native": {
|
||||
"version": "1.0.7",
|
||||
"resolved": "https://registry.npmjs.org/request-promise-native/-/request-promise-native-1.0.7.tgz",
|
||||
"integrity": "sha512-rIMnbBdgNViL37nZ1b3L/VfPOpSi0TqVDQPAvO6U14lMzOLrt5nilxCQqtDKhZeDiW0/hkCXGoQjhgJd/tCh6w==",
|
||||
"version": "1.0.8",
|
||||
"resolved": "https://registry.npmjs.org/request-promise-native/-/request-promise-native-1.0.8.tgz",
|
||||
"integrity": "sha512-dapwLGqkHtwL5AEbfenuzjTYg35Jd6KPytsC2/TLkVMz8rm+tNt72MGUWT1RP/aYawMpN6HqbNGBQaRcBtjQMQ==",
|
||||
"requires": {
|
||||
"request-promise-core": "1.1.2",
|
||||
"request-promise-core": "1.1.3",
|
||||
"stealthy-require": "^1.1.1",
|
||||
"tough-cookie": "^2.3.3"
|
||||
},
|
||||
@@ -2003,19 +2011,16 @@
|
||||
}
|
||||
},
|
||||
"ws": {
|
||||
"version": "7.2.0",
|
||||
"resolved": "https://registry.npmjs.org/ws/-/ws-7.2.0.tgz",
|
||||
"integrity": "sha512-+SqNqFbwTm/0DC18KYzIsMTnEWpLwJsiasW/O17la4iDRRIO9uaHbvKiAS3AHgTiuuWerK/brj4O6MYZkei9xg==",
|
||||
"requires": {
|
||||
"async-limiter": "^1.0.0"
|
||||
}
|
||||
"version": "7.2.1",
|
||||
"resolved": "https://registry.npmjs.org/ws/-/ws-7.2.1.tgz",
|
||||
"integrity": "sha512-sucePNSafamSKoOqoNfBd8V0StlkzJKL2ZAhGQinCfNQ+oacw+Pk7lcdAElecBF2VkLNZRiIb5Oi1Q5lVUVt2A=="
|
||||
}
|
||||
}
|
||||
},
|
||||
"@patternfly/react-tokens": {
|
||||
"version": "2.6.31",
|
||||
"resolved": "https://registry.npmjs.org/@patternfly/react-tokens/-/react-tokens-2.6.31.tgz",
|
||||
"integrity": "sha512-K9semfLIdf2vECefAbheXPVwZqq8nXY0Hf/VkWh6OBCL6R4FekxajpSBgobeoTQUotmvz5boMngqhkUjE7yChA=="
|
||||
"version": "2.7.14",
|
||||
"resolved": "https://registry.npmjs.org/@patternfly/react-tokens/-/react-tokens-2.7.14.tgz",
|
||||
"integrity": "sha512-HVa1fe7H4NRRv6lmezpvW2TfIDF7bSbKvhMmCVqBk80Fd3wfLcPhacnWdt6PLWq7WX4dVx7dF7+v4sFh8RczSg=="
|
||||
},
|
||||
"@types/babel__core": {
|
||||
"version": "7.1.1",
|
||||
@@ -3107,7 +3112,8 @@
|
||||
"async-limiter": {
|
||||
"version": "1.0.0",
|
||||
"resolved": "https://registry.npmjs.org/async-limiter/-/async-limiter-1.0.0.tgz",
|
||||
"integrity": "sha512-jp/uFnooOiO+L211eZOoSyzpOITMXx1rBITauYykG3BRYPu8h0UcxsPNB04RR5vo4Tyz3+ay17tR6JVf9qzYWg=="
|
||||
"integrity": "sha512-jp/uFnooOiO+L211eZOoSyzpOITMXx1rBITauYykG3BRYPu8h0UcxsPNB04RR5vo4Tyz3+ay17tR6JVf9qzYWg==",
|
||||
"dev": true
|
||||
},
|
||||
"asynckit": {
|
||||
"version": "0.4.0",
|
||||
@@ -4997,7 +5003,7 @@
|
||||
},
|
||||
"readable-stream": {
|
||||
"version": "2.3.6",
|
||||
"resolved": "https://registry.npmjs.org/readable-stream/-/readable-stream-2.3.6.tgz",
|
||||
"resolved": "http://registry.npmjs.org/readable-stream/-/readable-stream-2.3.6.tgz",
|
||||
"integrity": "sha512-tQtKA9WIAhBF3+VLAseyMqZeBjW0AHJoxOtYqSUZNJxauErmLbVm2FW1y+J/YA9dUrAC39ITejlZWhVIwawkKw==",
|
||||
"dev": true,
|
||||
"requires": {
|
||||
@@ -5354,9 +5360,9 @@
|
||||
}
|
||||
},
|
||||
"csstype": {
|
||||
"version": "2.6.7",
|
||||
"resolved": "https://registry.npmjs.org/csstype/-/csstype-2.6.7.tgz",
|
||||
"integrity": "sha512-9Mcn9sFbGBAdmimWb2gLVDtFJzeKtDGIr76TUqmjZrw9LFXBMSU70lcs+C0/7fyCd6iBDqmksUcCOUIkisPHsQ=="
|
||||
"version": "2.6.8",
|
||||
"resolved": "https://registry.npmjs.org/csstype/-/csstype-2.6.8.tgz",
|
||||
"integrity": "sha512-msVS9qTuMT5zwAGCVm4mxfrZ18BNc6Csd0oJAtiFMZ1FAx1CCvy2+5MDmYoix63LM/6NDbNtodCiGYGmFgO0dA=="
|
||||
},
|
||||
"currently-unhandled": {
|
||||
"version": "0.4.1",
|
||||
@@ -6110,7 +6116,7 @@
|
||||
},
|
||||
"readable-stream": {
|
||||
"version": "2.3.6",
|
||||
"resolved": "https://registry.npmjs.org/readable-stream/-/readable-stream-2.3.6.tgz",
|
||||
"resolved": "http://registry.npmjs.org/readable-stream/-/readable-stream-2.3.6.tgz",
|
||||
"integrity": "sha512-tQtKA9WIAhBF3+VLAseyMqZeBjW0AHJoxOtYqSUZNJxauErmLbVm2FW1y+J/YA9dUrAC39ITejlZWhVIwawkKw==",
|
||||
"dev": true,
|
||||
"requires": {
|
||||
@@ -7552,7 +7558,7 @@
|
||||
},
|
||||
"readable-stream": {
|
||||
"version": "2.3.6",
|
||||
"resolved": "https://registry.npmjs.org/readable-stream/-/readable-stream-2.3.6.tgz",
|
||||
"resolved": "http://registry.npmjs.org/readable-stream/-/readable-stream-2.3.6.tgz",
|
||||
"integrity": "sha512-tQtKA9WIAhBF3+VLAseyMqZeBjW0AHJoxOtYqSUZNJxauErmLbVm2FW1y+J/YA9dUrAC39ITejlZWhVIwawkKw==",
|
||||
"dev": true,
|
||||
"requires": {
|
||||
@@ -7696,7 +7702,7 @@
|
||||
},
|
||||
"readable-stream": {
|
||||
"version": "2.3.6",
|
||||
"resolved": "https://registry.npmjs.org/readable-stream/-/readable-stream-2.3.6.tgz",
|
||||
"resolved": "http://registry.npmjs.org/readable-stream/-/readable-stream-2.3.6.tgz",
|
||||
"integrity": "sha512-tQtKA9WIAhBF3+VLAseyMqZeBjW0AHJoxOtYqSUZNJxauErmLbVm2FW1y+J/YA9dUrAC39ITejlZWhVIwawkKw==",
|
||||
"dev": true,
|
||||
"requires": {
|
||||
@@ -7768,8 +7774,7 @@
|
||||
"ansi-regex": {
|
||||
"version": "2.1.1",
|
||||
"bundled": true,
|
||||
"dev": true,
|
||||
"optional": true
|
||||
"dev": true
|
||||
},
|
||||
"aproba": {
|
||||
"version": "1.2.0",
|
||||
@@ -7790,14 +7795,12 @@
|
||||
"balanced-match": {
|
||||
"version": "1.0.0",
|
||||
"bundled": true,
|
||||
"dev": true,
|
||||
"optional": true
|
||||
"dev": true
|
||||
},
|
||||
"brace-expansion": {
|
||||
"version": "1.1.11",
|
||||
"bundled": true,
|
||||
"dev": true,
|
||||
"optional": true,
|
||||
"requires": {
|
||||
"balanced-match": "^1.0.0",
|
||||
"concat-map": "0.0.1"
|
||||
@@ -7812,20 +7815,17 @@
|
||||
"code-point-at": {
|
||||
"version": "1.1.0",
|
||||
"bundled": true,
|
||||
"dev": true,
|
||||
"optional": true
|
||||
"dev": true
|
||||
},
|
||||
"concat-map": {
|
||||
"version": "0.0.1",
|
||||
"bundled": true,
|
||||
"dev": true,
|
||||
"optional": true
|
||||
"dev": true
|
||||
},
|
||||
"console-control-strings": {
|
||||
"version": "1.1.0",
|
||||
"bundled": true,
|
||||
"dev": true,
|
||||
"optional": true
|
||||
"dev": true
|
||||
},
|
||||
"core-util-is": {
|
||||
"version": "1.0.2",
|
||||
@@ -7942,8 +7942,7 @@
|
||||
"inherits": {
|
||||
"version": "2.0.3",
|
||||
"bundled": true,
|
||||
"dev": true,
|
||||
"optional": true
|
||||
"dev": true
|
||||
},
|
||||
"ini": {
|
||||
"version": "1.3.5",
|
||||
@@ -7955,7 +7954,6 @@
|
||||
"version": "1.0.0",
|
||||
"bundled": true,
|
||||
"dev": true,
|
||||
"optional": true,
|
||||
"requires": {
|
||||
"number-is-nan": "^1.0.0"
|
||||
}
|
||||
@@ -7970,7 +7968,6 @@
|
||||
"version": "3.0.4",
|
||||
"bundled": true,
|
||||
"dev": true,
|
||||
"optional": true,
|
||||
"requires": {
|
||||
"brace-expansion": "^1.1.7"
|
||||
}
|
||||
@@ -7978,14 +7975,12 @@
|
||||
"minimist": {
|
||||
"version": "0.0.8",
|
||||
"bundled": true,
|
||||
"dev": true,
|
||||
"optional": true
|
||||
"dev": true
|
||||
},
|
||||
"minipass": {
|
||||
"version": "2.3.5",
|
||||
"bundled": true,
|
||||
"dev": true,
|
||||
"optional": true,
|
||||
"requires": {
|
||||
"safe-buffer": "^5.1.2",
|
||||
"yallist": "^3.0.0"
|
||||
@@ -8004,7 +7999,6 @@
|
||||
"version": "0.5.1",
|
||||
"bundled": true,
|
||||
"dev": true,
|
||||
"optional": true,
|
||||
"requires": {
|
||||
"minimist": "0.0.8"
|
||||
}
|
||||
@@ -8085,8 +8079,7 @@
|
||||
"number-is-nan": {
|
||||
"version": "1.0.1",
|
||||
"bundled": true,
|
||||
"dev": true,
|
||||
"optional": true
|
||||
"dev": true
|
||||
},
|
||||
"object-assign": {
|
||||
"version": "4.1.1",
|
||||
@@ -8098,7 +8091,6 @@
|
||||
"version": "1.4.0",
|
||||
"bundled": true,
|
||||
"dev": true,
|
||||
"optional": true,
|
||||
"requires": {
|
||||
"wrappy": "1"
|
||||
}
|
||||
@@ -8184,8 +8176,7 @@
|
||||
"safe-buffer": {
|
||||
"version": "5.1.2",
|
||||
"bundled": true,
|
||||
"dev": true,
|
||||
"optional": true
|
||||
"dev": true
|
||||
},
|
||||
"safer-buffer": {
|
||||
"version": "2.1.2",
|
||||
@@ -8221,7 +8212,6 @@
|
||||
"version": "1.0.2",
|
||||
"bundled": true,
|
||||
"dev": true,
|
||||
"optional": true,
|
||||
"requires": {
|
||||
"code-point-at": "^1.0.0",
|
||||
"is-fullwidth-code-point": "^1.0.0",
|
||||
@@ -8241,7 +8231,6 @@
|
||||
"version": "3.0.1",
|
||||
"bundled": true,
|
||||
"dev": true,
|
||||
"optional": true,
|
||||
"requires": {
|
||||
"ansi-regex": "^2.0.0"
|
||||
}
|
||||
@@ -8285,14 +8274,12 @@
|
||||
"wrappy": {
|
||||
"version": "1.0.2",
|
||||
"bundled": true,
|
||||
"dev": true,
|
||||
"optional": true
|
||||
"dev": true
|
||||
},
|
||||
"yallist": {
|
||||
"version": "3.0.3",
|
||||
"bundled": true,
|
||||
"dev": true,
|
||||
"optional": true
|
||||
"dev": true
|
||||
}
|
||||
}
|
||||
},
|
||||
@@ -12027,7 +12014,7 @@
|
||||
},
|
||||
"readable-stream": {
|
||||
"version": "2.3.6",
|
||||
"resolved": "https://registry.npmjs.org/readable-stream/-/readable-stream-2.3.6.tgz",
|
||||
"resolved": "http://registry.npmjs.org/readable-stream/-/readable-stream-2.3.6.tgz",
|
||||
"integrity": "sha512-tQtKA9WIAhBF3+VLAseyMqZeBjW0AHJoxOtYqSUZNJxauErmLbVm2FW1y+J/YA9dUrAC39ITejlZWhVIwawkKw==",
|
||||
"dev": true,
|
||||
"requires": {
|
||||
@@ -13078,7 +13065,7 @@
|
||||
},
|
||||
"readable-stream": {
|
||||
"version": "2.3.6",
|
||||
"resolved": "https://registry.npmjs.org/readable-stream/-/readable-stream-2.3.6.tgz",
|
||||
"resolved": "http://registry.npmjs.org/readable-stream/-/readable-stream-2.3.6.tgz",
|
||||
"integrity": "sha512-tQtKA9WIAhBF3+VLAseyMqZeBjW0AHJoxOtYqSUZNJxauErmLbVm2FW1y+J/YA9dUrAC39ITejlZWhVIwawkKw==",
|
||||
"dev": true,
|
||||
"requires": {
|
||||
@@ -16227,11 +16214,11 @@
|
||||
"integrity": "sha512-rru86D9CpQRLvsFG5XFdy0KdLAvjdQDyZCsRcuu60WtzFylDM3eAWSxEVz5kzL2Gp544XiUvPbVKtOA/txLi9Q=="
|
||||
},
|
||||
"tippy.js": {
|
||||
"version": "3.4.1",
|
||||
"resolved": "https://registry.npmjs.org/tippy.js/-/tippy.js-3.4.1.tgz",
|
||||
"integrity": "sha512-ZiyGP9WZyCCcjxKM4G88cm4U1r1ytjeMDGa5FSKPaPzwc/3yZJVZsb1ffcmqUMCpryRp5LNxRNGKLzbs11sb/Q==",
|
||||
"version": "5.1.2",
|
||||
"resolved": "https://registry.npmjs.org/tippy.js/-/tippy.js-5.1.2.tgz",
|
||||
"integrity": "sha512-Qtrv2wqbRbaKMUb6bWWBQWPayvcDKNrGlvihxtsyowhT7RLGEh1STWuy6EMXC6QLkfKPB2MLnf8W2mzql9VDAw==",
|
||||
"requires": {
|
||||
"popper.js": "^1.14.6"
|
||||
"popper.js": "^1.16.0"
|
||||
}
|
||||
},
|
||||
"tmp": {
|
||||
|
||||
@@ -58,10 +58,10 @@
|
||||
},
|
||||
"dependencies": {
|
||||
"@lingui/react": "^2.7.2",
|
||||
"@patternfly/patternfly": "^2.40.2",
|
||||
"@patternfly/react-core": "^3.120.2",
|
||||
"@patternfly/react-icons": "^3.14.15",
|
||||
"@patternfly/react-tokens": "^2.6.31",
|
||||
"@patternfly/patternfly": "^2.46.1",
|
||||
"@patternfly/react-core": "^3.129.3",
|
||||
"@patternfly/react-icons": "^3.14.28",
|
||||
"@patternfly/react-tokens": "^2.7.14",
|
||||
"ansi-to-html": "^0.6.11",
|
||||
"axios": "^0.18.1",
|
||||
"codemirror": "^5.47.0",
|
||||
|
||||
@@ -156,13 +156,6 @@
|
||||
// and bem style, as well as moved into component-based scss files
|
||||
//
|
||||
|
||||
.at-c-listCardBody {
|
||||
--pf-c-card__footer--PaddingX: 0;
|
||||
--pf-c-card__footer--PaddingY: 0;
|
||||
--pf-c-card__body--PaddingX: 0;
|
||||
--pf-c-card__body--PaddingY: 0;
|
||||
}
|
||||
|
||||
.awx-c-card {
|
||||
position: relative;
|
||||
}
|
||||
|
||||
@@ -142,21 +142,57 @@ class AddResourceRole extends React.Component {
|
||||
} = this.state;
|
||||
const { onClose, roles, i18n } = this.props;
|
||||
|
||||
const userColumns = [
|
||||
const userSearchColumns = [
|
||||
{
|
||||
name: i18n._(t`Username`),
|
||||
key: 'username',
|
||||
isSortable: true,
|
||||
isSearchable: true,
|
||||
isDefault: true,
|
||||
},
|
||||
{
|
||||
name: i18n._(t`First Name`),
|
||||
key: 'first_name',
|
||||
},
|
||||
{
|
||||
name: i18n._(t`Last Name`),
|
||||
key: 'last_name',
|
||||
},
|
||||
];
|
||||
|
||||
const teamColumns = [
|
||||
const userSortColumns = [
|
||||
{
|
||||
name: i18n._(t`Username`),
|
||||
key: 'username',
|
||||
},
|
||||
{
|
||||
name: i18n._(t`First Name`),
|
||||
key: 'first_name',
|
||||
},
|
||||
{
|
||||
name: i18n._(t`Last Name`),
|
||||
key: 'last_name',
|
||||
},
|
||||
];
|
||||
|
||||
const teamSearchColumns = [
|
||||
{
|
||||
name: i18n._(t`Name`),
|
||||
key: 'name',
|
||||
isDefault: true,
|
||||
},
|
||||
{
|
||||
name: i18n._(t`Created By (Username)`),
|
||||
key: 'created_by__username',
|
||||
},
|
||||
{
|
||||
name: i18n._(t`Modified By (Username)`),
|
||||
key: 'modified_by__username',
|
||||
},
|
||||
];
|
||||
|
||||
const teamSortColumns = [
|
||||
{
|
||||
name: i18n._(t`Name`),
|
||||
key: 'name',
|
||||
isSortable: true,
|
||||
isSearchable: true,
|
||||
},
|
||||
];
|
||||
|
||||
@@ -207,7 +243,8 @@ class AddResourceRole extends React.Component {
|
||||
<Fragment>
|
||||
{selectedResource === 'users' && (
|
||||
<SelectResourceStep
|
||||
columns={userColumns}
|
||||
searchColumns={userSearchColumns}
|
||||
sortColumns={userSortColumns}
|
||||
displayKey="username"
|
||||
onRowClick={this.handleResourceCheckboxClick}
|
||||
onSearch={readUsers}
|
||||
@@ -218,7 +255,8 @@ class AddResourceRole extends React.Component {
|
||||
)}
|
||||
{selectedResource === 'teams' && (
|
||||
<SelectResourceStep
|
||||
columns={teamColumns}
|
||||
searchColumns={teamSearchColumns}
|
||||
sortColumns={teamSortColumns}
|
||||
onRowClick={this.handleResourceCheckboxClick}
|
||||
onSearch={readTeams}
|
||||
selectedLabel={i18n._(t`Selected`)}
|
||||
|
||||
@@ -3,6 +3,7 @@ import PropTypes from 'prop-types';
|
||||
import { withRouter } from 'react-router-dom';
|
||||
import { withI18n } from '@lingui/react';
|
||||
import { t } from '@lingui/macro';
|
||||
import { SearchColumns, SortColumns } from '@types';
|
||||
import PaginatedDataList from '../PaginatedDataList';
|
||||
import DataListToolbar from '../DataListToolbar';
|
||||
import CheckboxListItem from '../CheckboxListItem';
|
||||
@@ -23,7 +24,11 @@ class SelectResourceStep extends React.Component {
|
||||
this.qsConfig = getQSConfig('resource', {
|
||||
page: 1,
|
||||
page_size: 5,
|
||||
order_by: props.sortedColumnKey,
|
||||
order_by: `${
|
||||
props.sortColumns.filter(col => col.key === 'name').length
|
||||
? 'name'
|
||||
: 'username'
|
||||
}`,
|
||||
});
|
||||
}
|
||||
|
||||
@@ -69,7 +74,8 @@ class SelectResourceStep extends React.Component {
|
||||
const { isInitialized, isLoading, count, error, resources } = this.state;
|
||||
|
||||
const {
|
||||
columns,
|
||||
searchColumns,
|
||||
sortColumns,
|
||||
displayKey,
|
||||
onRowClick,
|
||||
selectedLabel,
|
||||
@@ -99,7 +105,9 @@ class SelectResourceStep extends React.Component {
|
||||
items={resources}
|
||||
itemCount={count}
|
||||
qsConfig={this.qsConfig}
|
||||
toolbarColumns={columns}
|
||||
onRowClick={onRowClick}
|
||||
toolbarSearchColumns={searchColumns}
|
||||
toolbarSortColumns={sortColumns}
|
||||
renderItem={item => (
|
||||
<CheckboxListItem
|
||||
isSelected={selectedResourceRows.some(i => i.id === item.id)}
|
||||
@@ -122,21 +130,22 @@ class SelectResourceStep extends React.Component {
|
||||
}
|
||||
|
||||
SelectResourceStep.propTypes = {
|
||||
columns: PropTypes.arrayOf(PropTypes.object).isRequired,
|
||||
searchColumns: SearchColumns,
|
||||
sortColumns: SortColumns,
|
||||
displayKey: PropTypes.string,
|
||||
onRowClick: PropTypes.func,
|
||||
onSearch: PropTypes.func.isRequired,
|
||||
selectedLabel: PropTypes.string,
|
||||
selectedResourceRows: PropTypes.arrayOf(PropTypes.object),
|
||||
sortedColumnKey: PropTypes.string,
|
||||
};
|
||||
|
||||
SelectResourceStep.defaultProps = {
|
||||
searchColumns: null,
|
||||
sortColumns: null,
|
||||
displayKey: 'name',
|
||||
onRowClick: () => {},
|
||||
selectedLabel: null,
|
||||
selectedResourceRows: [],
|
||||
sortedColumnKey: 'name',
|
||||
};
|
||||
|
||||
export { SelectResourceStep as _SelectResourceStep };
|
||||
|
||||
@@ -6,8 +6,19 @@ import { sleep } from '../../../testUtils/testUtils';
|
||||
import SelectResourceStep from './SelectResourceStep';
|
||||
|
||||
describe('<SelectResourceStep />', () => {
|
||||
const columns = [
|
||||
{ name: 'Username', key: 'username', isSortable: true, isSearchable: true },
|
||||
const searchColumns = [
|
||||
{
|
||||
name: 'Username',
|
||||
key: 'username',
|
||||
isDefault: true,
|
||||
},
|
||||
];
|
||||
|
||||
const sortColumns = [
|
||||
{
|
||||
name: 'Username',
|
||||
key: 'username',
|
||||
},
|
||||
];
|
||||
afterEach(() => {
|
||||
jest.restoreAllMocks();
|
||||
@@ -15,11 +26,11 @@ describe('<SelectResourceStep />', () => {
|
||||
test('initially renders without crashing', () => {
|
||||
shallow(
|
||||
<SelectResourceStep
|
||||
columns={columns}
|
||||
searchColumns={searchColumns}
|
||||
sortColumns={sortColumns}
|
||||
displayKey="username"
|
||||
onRowClick={() => {}}
|
||||
onSearch={() => {}}
|
||||
sortedColumnKey="username"
|
||||
/>
|
||||
);
|
||||
});
|
||||
@@ -36,11 +47,11 @@ describe('<SelectResourceStep />', () => {
|
||||
});
|
||||
mountWithContexts(
|
||||
<SelectResourceStep
|
||||
columns={columns}
|
||||
searchColumns={searchColumns}
|
||||
sortColumns={sortColumns}
|
||||
displayKey="username"
|
||||
onRowClick={() => {}}
|
||||
onSearch={handleSearch}
|
||||
sortedColumnKey="username"
|
||||
/>
|
||||
);
|
||||
expect(handleSearch).toHaveBeenCalledWith({
|
||||
@@ -68,12 +79,12 @@ describe('<SelectResourceStep />', () => {
|
||||
});
|
||||
const wrapper = await mountWithContexts(
|
||||
<SelectResourceStep
|
||||
columns={columns}
|
||||
searchColumns={searchColumns}
|
||||
sortColumns={sortColumns}
|
||||
displayKey="username"
|
||||
onRowClick={() => {}}
|
||||
onSearch={handleSearch}
|
||||
selectedResourceRows={selectedResourceRows}
|
||||
sortedColumnKey="username"
|
||||
/>,
|
||||
{
|
||||
context: { router: { history, route: { location: history.location } } },
|
||||
@@ -102,12 +113,12 @@ describe('<SelectResourceStep />', () => {
|
||||
};
|
||||
const wrapper = mountWithContexts(
|
||||
<SelectResourceStep
|
||||
columns={columns}
|
||||
searchColumns={searchColumns}
|
||||
sortColumns={sortColumns}
|
||||
displayKey="username"
|
||||
onRowClick={handleRowClick}
|
||||
onSearch={() => ({ data })}
|
||||
selectedResourceRows={[]}
|
||||
sortedColumnKey="username"
|
||||
/>
|
||||
);
|
||||
await sleep(0);
|
||||
|
||||
23
awx/ui_next/src/components/Card/CardActionsRow.jsx
Normal file
@@ -0,0 +1,23 @@
|
||||
import React from 'react';
|
||||
import { CardActions } from '@patternfly/react-core';
|
||||
import styled from 'styled-components';
|
||||
|
||||
const CardActionsWrapper = styled.div`
|
||||
display: flex;
|
||||
justify-content: flex-end;
|
||||
margin-top: 20px;
|
||||
|
||||
& > .pf-c-card__actions > :not(:first-child) {
|
||||
margin-left: 0.5rem;
|
||||
}
|
||||
`;
|
||||
|
||||
function CardActionsRow({ children }) {
|
||||
return (
|
||||
<CardActionsWrapper>
|
||||
<CardActions>{children}</CardActions>
|
||||
</CardActionsWrapper>
|
||||
);
|
||||
}
|
||||
|
||||
export default CardActionsRow;
|
||||
9
awx/ui_next/src/components/Card/CardBody.jsx
Normal file
@@ -0,0 +1,9 @@
|
||||
import styled from 'styled-components';
|
||||
import { CardBody } from '@patternfly/react-core';
|
||||
|
||||
const TabbedCardBody = styled(CardBody)`
|
||||
padding-top: var(--pf-c-card--first-child--PaddingTop);
|
||||
`;
|
||||
CardBody.displayName = 'PFCardBody';
|
||||
|
||||
export default TabbedCardBody;
|
||||
13
awx/ui_next/src/components/Card/TabbedCardHeader.js
Normal file
@@ -0,0 +1,13 @@
|
||||
import styled from 'styled-components';
|
||||
import { CardHeader } from '@patternfly/react-core';
|
||||
|
||||
const TabbedCardHeader = styled(CardHeader)`
|
||||
--pf-c-card--first-child--PaddingTop: 0;
|
||||
--pf-c-card--child--PaddingLeft: 0;
|
||||
--pf-c-card--child--PaddingRight: 0;
|
||||
--pf-c-card__header--not-last-child--PaddingBottom: 24px;
|
||||
--pf-c-card__header--not-last-child--PaddingBottom: 0;
|
||||
position: relative;
|
||||
`;
|
||||
|
||||
export default TabbedCardHeader;
|
||||
3
awx/ui_next/src/components/Card/index.js
Normal file
@@ -0,0 +1,3 @@
|
||||
export { default as TabbedCardHeader } from './TabbedCardHeader';
|
||||
export { default as CardBody } from './CardBody';
|
||||
export { default as CardActionsRow } from './CardActionsRow';
|
||||
@@ -21,7 +21,11 @@ const CheckboxListItem = ({
|
||||
}) => {
|
||||
const CheckboxRadio = isRadio ? DataListRadio : DataListCheck;
|
||||
return (
|
||||
<DataListItem key={itemId} aria-labelledby={`check-action-item-${itemId}`}>
|
||||
<DataListItem
|
||||
key={itemId}
|
||||
aria-labelledby={`check-action-item-${itemId}`}
|
||||
id={`${itemId}`}
|
||||
>
|
||||
<DataListItemRow>
|
||||
<CheckboxRadio
|
||||
id={`selected-${itemId}`}
|
||||
|
||||
@@ -83,7 +83,7 @@ function CodeMirrorInput({
|
||||
}
|
||||
CodeMirrorInput.propTypes = {
|
||||
value: string.isRequired,
|
||||
onChange: func.isRequired,
|
||||
onChange: func,
|
||||
mode: oneOf(['javascript', 'yaml', 'jinja2']).isRequired,
|
||||
readOnly: bool,
|
||||
hasErrors: bool,
|
||||
@@ -91,6 +91,7 @@ CodeMirrorInput.propTypes = {
|
||||
};
|
||||
CodeMirrorInput.defaultProps = {
|
||||
readOnly: false,
|
||||
onChange: () => {},
|
||||
rows: 6,
|
||||
hasErrors: false,
|
||||
};
|
||||
|
||||
@@ -0,0 +1,91 @@
import React, { useState } from 'react';
import { string, number } from 'prop-types';
import { Split, SplitItem, TextListItemVariants } from '@patternfly/react-core';
import { DetailName, DetailValue } from '@components/DetailList';
import CodeMirrorInput from './CodeMirrorInput';
import YamlJsonToggle from './YamlJsonToggle';
import { yamlToJson, jsonToYaml, isJson } from '../../util/yaml';

const YAML_MODE = 'yaml';
const JSON_MODE = 'javascript';

function VariablesDetail({ value, label, rows }) {
  const [mode, setMode] = useState(isJson(value) ? JSON_MODE : YAML_MODE);
  const [currentValue, setCurrentValue] = useState(value);
  const [error, setError] = useState(null);

  if (!value) {
    return null;
  }

  return (
    <>
      <DetailName
        component={TextListItemVariants.dt}
        fullWidth
        css="grid-column: 1 / -1"
      >
        <Split gutter="sm">
          <SplitItem>
            <div className="pf-c-form__label">
              <span
                className="pf-c-form__label-text"
                css="font-weight: var(--pf-global--FontWeight--bold)"
              >
                {label}
              </span>
            </div>
          </SplitItem>
          <SplitItem>
            <YamlJsonToggle
              mode={mode}
              onChange={newMode => {
                try {
                  const newVal =
                    newMode === YAML_MODE
                      ? jsonToYaml(currentValue)
                      : yamlToJson(currentValue);
                  setCurrentValue(newVal);
                  setMode(newMode);
                } catch (err) {
                  setError(err);
                }
              }}
            />
          </SplitItem>
        </Split>
      </DetailName>
      <DetailValue
        component={TextListItemVariants.dd}
        fullWidth
        css="grid-column: 1 / -1; margin-top: -20px"
      >
        <CodeMirrorInput
          mode={mode}
          value={currentValue}
          readOnly
          rows={rows}
          css="margin-top: 10px"
        />
        {error && (
          <div
            css="color: var(--pf-global--danger-color--100);
              font-size: var(--pf-global--FontSize--sm)"
          >
            Error: {error.message}
          </div>
        )}
      </DetailValue>
    </>
  );
}
VariablesDetail.propTypes = {
  value: string.isRequired,
  label: string.isRequired,
  rows: number,
};
VariablesDetail.defaultProps = {
  rows: null,
};

export default VariablesDetail;
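The toggle above round-trips the variables text through `jsonToYaml` and `yamlToJson` from `util/yaml`. A minimal stand-in for those helpers, restricted to flat key/value mappings — the real implementation parses full YAML with a parser library, so only the function names here match the imports; the bodies are illustrative:

```javascript
// Simplified sketch: convert a flat "key: value" YAML mapping to
// pretty-printed JSON and back. Real YAML (nesting, lists, quoting)
// needs a proper parser.
function yamlToJson(yaml) {
  const obj = {};
  yaml
    .replace(/^---/, '') // drop the optional document marker
    .split('\n')
    .filter(line => line.includes(':'))
    .forEach(line => {
      const [key, val] = line.split(':');
      obj[key.trim()] = val.trim();
    });
  return JSON.stringify(obj, null, 2);
}

function jsonToYaml(json) {
  const obj = JSON.parse(json);
  return (
    Object.entries(obj)
      .map(([key, val]) => `${key}: ${val}`)
      .join('\n') + '\n'
  );
}
```

Switching modes is therefore lossy only in formatting, not content, which is why VariablesDetail can keep a single `currentValue` string and re-derive it on every toggle.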
@@ -0,0 +1,43 @@
import React from 'react';
import { shallow } from 'enzyme';
import VariablesDetail from './VariablesDetail';

jest.mock('@api');

describe('<VariablesDetail>', () => {
  test('should render readonly CodeMirrorInput', () => {
    const wrapper = shallow(
      <VariablesDetail value="---foo: bar" label="Variables" />
    );
    const input = wrapper.find('Styled(CodeMirrorInput)');
    expect(input).toHaveLength(1);
    expect(input.prop('mode')).toEqual('yaml');
    expect(input.prop('value')).toEqual('---foo: bar');
    expect(input.prop('readOnly')).toEqual(true);
  });

  test('should detect JSON', () => {
    const wrapper = shallow(
      <VariablesDetail value='{"foo": "bar"}' label="Variables" />
    );
    const input = wrapper.find('Styled(CodeMirrorInput)');
    expect(input).toHaveLength(1);
    expect(input.prop('mode')).toEqual('javascript');
    expect(input.prop('value')).toEqual('{"foo": "bar"}');
  });

  test('should convert between modes', () => {
    const wrapper = shallow(
      <VariablesDetail value="---foo: bar" label="Variables" />
    );
    wrapper.find('YamlJsonToggle').invoke('onChange')('javascript');
    const input = wrapper.find('Styled(CodeMirrorInput)');
    expect(input.prop('mode')).toEqual('javascript');
    expect(input.prop('value')).toEqual('{\n "foo": "bar"\n}');

    wrapper.find('YamlJsonToggle').invoke('onChange')('yaml');
    const input2 = wrapper.find('Styled(CodeMirrorInput)');
    expect(input2.prop('mode')).toEqual('yaml');
    expect(input2.prop('value')).toEqual('foo: bar\n');
  });
});
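The "should detect JSON" test relies on `isJson` from `util/yaml` choosing the initial editor mode. A sufficient heuristic for the inputs exercised above is a trimmed brace check — an assumption for illustration, the real utility may inspect the value differently:

```javascript
// Heuristic: treat the value as JSON when it is a braced object literal;
// anything else (e.g. a "---"-prefixed document) is assumed to be YAML.
function isJson(value) {
  const trimmed = value.trim();
  return trimmed.startsWith('{') && trimmed.endsWith('}');
}
```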
@@ -1,21 +1,15 @@
import React, { useState } from 'react';
import { string, bool } from 'prop-types';
import { Field } from 'formik';
import { Button, Split, SplitItem } from '@patternfly/react-core';
import styled from 'styled-components';
import ButtonGroup from '../ButtonGroup';
import { Split, SplitItem } from '@patternfly/react-core';
import CodeMirrorInput from './CodeMirrorInput';
import YamlJsonToggle from './YamlJsonToggle';
import { yamlToJson, jsonToYaml } from '../../util/yaml';

const YAML_MODE = 'yaml';
const JSON_MODE = 'javascript';

const SmallButton = styled(Button)`
  padding: 3px 8px;
  font-size: var(--pf-global--FontSize--xs);
`;

function VariablesField({ id, name, label, readOnly }) {
  // TODO: detect initial mode
  const [mode, setMode] = useState(YAML_MODE);

  return (
@@ -30,40 +24,21 @@ function VariablesField({ id, name, label, readOnly }) {
        </label>
      </SplitItem>
      <SplitItem>
        <ButtonGroup>
          <SmallButton
            onClick={() => {
              if (mode === YAML_MODE) {
                return;
              }
              try {
                form.setFieldValue(name, jsonToYaml(field.value));
                setMode(YAML_MODE);
              } catch (err) {
                form.setFieldError(name, err.message);
              }
            }}
            variant={mode === YAML_MODE ? 'primary' : 'secondary'}
          >
            YAML
          </SmallButton>
          <SmallButton
            onClick={() => {
              if (mode === JSON_MODE) {
                return;
              }
              try {
                form.setFieldValue(name, yamlToJson(field.value));
                setMode(JSON_MODE);
              } catch (err) {
                form.setFieldError(name, err.message);
              }
            }}
            variant={mode === JSON_MODE ? 'primary' : 'secondary'}
          >
            JSON
          </SmallButton>
        </ButtonGroup>
        <YamlJsonToggle
          mode={mode}
          onChange={newMode => {
            try {
              const newVal =
                newMode === YAML_MODE
                  ? jsonToYaml(field.value)
                  : yamlToJson(field.value);
              form.setFieldValue(name, newVal);
              setMode(newMode);
            } catch (err) {
              form.setFieldError(name, err.message);
            }
          }}
        />
      </SplitItem>
    </Split>
    <CodeMirrorInput

@@ -0,0 +1,44 @@
import React from 'react';
import { oneOf, func } from 'prop-types';
import styled from 'styled-components';
import { Button } from '@patternfly/react-core';
import ButtonGroup from '../ButtonGroup';

const SmallButton = styled(Button)`
  padding: 3px 8px;
  font-size: var(--pf-global--FontSize--xs);
`;

const YAML_MODE = 'yaml';
const JSON_MODE = 'javascript';

function YamlJsonToggle({ mode, onChange }) {
  const setMode = newMode => {
    if (mode !== newMode) {
      onChange(newMode);
    }
  };

  return (
    <ButtonGroup>
      <SmallButton
        onClick={() => setMode(YAML_MODE)}
        variant={mode === YAML_MODE ? 'primary' : 'secondary'}
      >
        YAML
      </SmallButton>
      <SmallButton
        onClick={() => setMode(JSON_MODE)}
        variant={mode === JSON_MODE ? 'primary' : 'secondary'}
      >
        JSON
      </SmallButton>
    </ButtonGroup>
  );
}
YamlJsonToggle.propTypes = {
  mode: oneOf([YAML_MODE, JSON_MODE]).isRequired,
  onChange: func.isRequired,
};

export default YamlJsonToggle;
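YamlJsonToggle only invokes `onChange` when the requested mode differs from the current one, so clicking the already-active button is a no-op and callers never see spurious conversions. The guard can be exercised outside React (the `makeSetMode` wrapper exists only for this sketch):

```javascript
// Stand-in for the component's internal setMode closure: fire the
// callback only on an actual mode change.
const YAML_MODE = 'yaml';
const JSON_MODE = 'javascript';

function makeSetMode(mode, onChange) {
  return newMode => {
    if (mode !== newMode) {
      onChange(newMode);
    }
  };
}

const calls = [];
const setMode = makeSetMode(YAML_MODE, newMode => calls.push(newMode));
setMode(YAML_MODE); // ignored: already in YAML mode
setMode(JSON_MODE); // fires onChange('javascript')
```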
@@ -1,5 +1,6 @@
import CodeMirrorInput from './CodeMirrorInput';

export default CodeMirrorInput;
export { default as VariablesDetail } from './VariablesDetail';
export { default as VariablesInput } from './VariablesInput';
export { default as VariablesField } from './VariablesField';

@@ -16,7 +16,7 @@ import ErrorDetail from '@components/ErrorDetail';

const EmptyState = styled(PFEmptyState)`
  width: var(--pf-c-empty-state--m-lg--MaxWidth);
  max-width: 100%;
  margin: 0 auto;
`;

async function logout() {

@@ -2,75 +2,21 @@ import React, { Fragment } from 'react';
import PropTypes from 'prop-types';
import { withI18n } from '@lingui/react';
import { t } from '@lingui/macro';
import {
  Checkbox,
  Toolbar as PFToolbar,
  ToolbarGroup as PFToolbarGroup,
  ToolbarItem,
} from '@patternfly/react-core';
import { Checkbox } from '@patternfly/react-core';
import styled from 'styled-components';
import { SearchIcon } from '@patternfly/react-icons';
import {
  DataToolbar,
  DataToolbarContent,
  DataToolbarGroup,
  DataToolbarToggleGroup,
  DataToolbarItem,
} from '@patternfly/react-core/dist/umd/experimental';
import ExpandCollapse from '../ExpandCollapse';
import Search from '../Search';
import Sort from '../Sort';
import VerticalSeparator from '../VerticalSeparator';

import { QSConfig } from '@types';

const AWXToolbar = styled.div`
  --awx-toolbar--BackgroundColor: var(--pf-global--BackgroundColor--light-100);
  --awx-toolbar--BorderColor: #ebebeb;
  --awx-toolbar--BorderWidth: var(--pf-global--BorderWidth--sm);

  --pf-global--target-size--MinHeight: 0;
  --pf-global--target-size--MinWidth: 0;
  --pf-global--FontSize--md: 14px;

  border-bottom: var(--awx-toolbar--BorderWidth) solid
    var(--awx-toolbar--BorderColor);
  background-color: var(--awx-toolbar--BackgroundColor);
  display: flex;
  min-height: 70px;
  flex-grow: 1;
`;

const Toolbar = styled(PFToolbar)`
  flex-grow: 1;
  margin-left: 20px;
  margin-right: 20px;
`;

const ToolbarGroup = styled(PFToolbarGroup)`
  &&& {
    margin: 0;
  }
`;

const ColumnLeft = styled.div`
  display: flex;
  flex-basis: ${props => (props.fillWidth ? 'auto' : '100%')};
  flex-grow: ${props => (props.fillWidth ? '1' : '0')};
  justify-content: flex-start;
  align-items: center;
  padding: 10px 0 8px 0;

  @media screen and (min-width: 980px) {
    flex-basis: ${props => (props.fillWidth ? 'auto' : '50%')};
  }
`;

const ColumnRight = styled.div`
  display: flex;
  flex-basis: ${props => (props.fillWidth ? 'auto' : '100%')};
  flex-grow: 0;
  justify-content: flex-start;
  align-items: center;
  padding: 8px 0 10px 0;

  @media screen and (min-width: 980px) {
    flex-basis: ${props => (props.fillWidth ? 'auto' : '50%')};
  }
`;
import { SearchColumns, SortColumns, QSConfig } from '@types';

const AdditionalControlsWrapper = styled.div`
  display: flex;
@@ -83,21 +29,34 @@ const AdditionalControlsWrapper = styled.div`
  }
`;

const AdditionalControlsDataToolbarGroup = styled(DataToolbarGroup)`
  margin-left: auto;
  margin-right: 0 !important;
`;

const DataToolbarSeparator = styled(DataToolbarItem)`
  width: 1px !important;
  height: 30px !important;
  margin-left: 3px !important;
  margin-right: 10px !important;
`;

class DataListToolbar extends React.Component {
  render() {
    const {
      columns,
      clearAllFilters,
      searchColumns,
      sortColumns,
      showSelectAll,
      isAllSelected,
      isCompact,
      fillWidth,
      onSort,
      onSearch,
      onReplaceSearch,
      onRemove,
      onCompact,
      onExpand,
      onSelectAll,
      sortOrder,
      sortedColumnKey,
      additionalControls,
      i18n,
      qsConfig,
@@ -105,93 +64,93 @@ class DataListToolbar extends React.Component {

    const showExpandCollapse = onCompact && onExpand;
    return (
      <AWXToolbar>
        <Toolbar css={fillWidth ? 'margin-right: 0; margin-left: 0' : ''}>
          <ColumnLeft fillWidth={fillWidth}>
            {showSelectAll && (
              <Fragment>
                <ToolbarItem>
                  <Checkbox
                    isChecked={isAllSelected}
                    onChange={onSelectAll}
                    aria-label={i18n._(t`Select all`)}
                    id="select-all"
                  />
                </ToolbarItem>
                <VerticalSeparator />
              </Fragment>
            )}
            <ToolbarItem css="flex-grow: 1;">
      <DataToolbar
        id={`${qsConfig.namespace}-list-toolbar`}
        clearAllFilters={clearAllFilters}
        collapseListedFiltersBreakpoint="xl"
      >
        <DataToolbarContent>
          {showSelectAll && (
            <DataToolbarGroup>
              <DataToolbarItem>
                <Checkbox
                  isChecked={isAllSelected}
                  onChange={onSelectAll}
                  aria-label={i18n._(t`Select all`)}
                  id="select-all"
                />
              </DataToolbarItem>
              <DataToolbarSeparator variant="separator" />
            </DataToolbarGroup>
          )}
          <DataToolbarToggleGroup toggleIcon={<SearchIcon />} breakpoint="xl">
            <DataToolbarItem>
              <Search
                qsConfig={qsConfig}
                columns={columns}
                columns={searchColumns}
                onSearch={onSearch}
                sortedColumnKey={sortedColumnKey}
                onReplaceSearch={onReplaceSearch}
                onRemove={onRemove}
              />
            </ToolbarItem>
            <VerticalSeparator />
          </ColumnLeft>
          <ColumnRight fillWidth={fillWidth}>
            <ToolbarItem>
              <Sort
                columns={columns}
                onSort={onSort}
                sortOrder={sortOrder}
                sortedColumnKey={sortedColumnKey}
              />
            </ToolbarItem>
            </DataToolbarItem>
            <DataToolbarItem>
              <Sort qsConfig={qsConfig} columns={sortColumns} onSort={onSort} />
            </DataToolbarItem>
          </DataToolbarToggleGroup>
          <DataToolbarGroup>
            {showExpandCollapse && (
              <Fragment>
                <VerticalSeparator />
                <ToolbarGroup>
                <DataToolbarItem>
                  <ExpandCollapse
                    isCompact={isCompact}
                    onCompact={onCompact}
                    onExpand={onExpand}
                  />
                </ToolbarGroup>
                {additionalControls && <VerticalSeparator />}
                </DataToolbarItem>
              </Fragment>
            )}
            <AdditionalControlsWrapper>
              {additionalControls}
            </AdditionalControlsWrapper>
          </ColumnRight>
        </Toolbar>
      </AWXToolbar>
          </DataToolbarGroup>
          <AdditionalControlsDataToolbarGroup>
            <DataToolbarItem>
              <AdditionalControlsWrapper>
                {additionalControls}
              </AdditionalControlsWrapper>
            </DataToolbarItem>
          </AdditionalControlsDataToolbarGroup>
        </DataToolbarContent>
      </DataToolbar>
    );
  }
}

DataListToolbar.propTypes = {
  clearAllFilters: PropTypes.func,
  qsConfig: QSConfig.isRequired,
  columns: PropTypes.arrayOf(PropTypes.object).isRequired,
  searchColumns: SearchColumns.isRequired,
  sortColumns: SortColumns.isRequired,
  showSelectAll: PropTypes.bool,
  isAllSelected: PropTypes.bool,
  isCompact: PropTypes.bool,
  fillWidth: PropTypes.bool,
  onCompact: PropTypes.func,
  onExpand: PropTypes.func,
  onSearch: PropTypes.func,
  onReplaceSearch: PropTypes.func,
  onSelectAll: PropTypes.func,
  onSort: PropTypes.func,
  sortOrder: PropTypes.string,
  sortedColumnKey: PropTypes.string,
  additionalControls: PropTypes.arrayOf(PropTypes.node),
};

DataListToolbar.defaultProps = {
  clearAllFilters: null,
  showSelectAll: false,
  isAllSelected: false,
  isCompact: false,
  fillWidth: false,
  onCompact: null,
  onExpand: null,
  onSearch: null,
  onReplaceSearch: null,
  onSelectAll: null,
  onSort: null,
  sortOrder: 'ascending',
  sortedColumnKey: 'name',
  additionalControls: [],
};

@@ -20,14 +20,13 @@ describe('<DataListToolbar />', () => {
  });

  const onSearch = jest.fn();
  const onReplaceSearch = jest.fn();
  const onSort = jest.fn();
  const onSelectAll = jest.fn();

  test('it triggers the expected callbacks', () => {
    const columns = [
      { name: 'Name', key: 'name', isSortable: true, isSearchable: true },
    ];

    const searchColumns = [{ name: 'Name', key: 'name', isDefault: true }];
    const sortColumns = [{ name: 'Name', key: 'name' }];
    const search = 'button[aria-label="Search submit button"]';
    const searchTextInput = 'input[aria-label="Search text input"]';
    const selectAll = 'input[aria-label="Select all"]';
@@ -38,10 +37,10 @@ describe('<DataListToolbar />', () => {
        qsConfig={QS_CONFIG}
        isAllSelected={false}
        showExpandCollapse
        sortedColumnKey="name"
        sortOrder="ascending"
        columns={columns}
        searchColumns={searchColumns}
        sortColumns={sortColumns}
        onSearch={onSearch}
        onReplaceSearch={onReplaceSearch}
        onSort={onSort}
        onSelectAll={onSelectAll}
        showSelectAll
@@ -74,19 +73,28 @@ describe('<DataListToolbar />', () => {
    const searchDropdownMenuItems =
      'DropdownMenu > ul[aria-labelledby="awx-search"]';

    const multipleColumns = [
      { name: 'Foo', key: 'foo', isSortable: true, isSearchable: true },
      { name: 'Bar', key: 'bar', isSortable: true, isSearchable: true },
      { name: 'Bakery', key: 'bakery', isSortable: true },
      { name: 'Baz', key: 'baz' },
    const NEW_QS_CONFIG = {
      namespace: 'organization',
      dateFields: ['modified', 'created'],
      defaultParams: { page: 1, page_size: 5, order_by: 'foo' },
      integerFields: ['page', 'page_size'],
    };

    const searchColumns = [
      { name: 'Foo', key: 'foo', isDefault: true },
      { name: 'Bar', key: 'bar' },
    ];
    const sortColumns = [
      { name: 'Foo', key: 'foo' },
      { name: 'Bar', key: 'bar' },
      { name: 'Bakery', key: 'Bakery' },
    ];

    toolbar = mountWithContexts(
      <DataListToolbar
        qsConfig={QS_CONFIG}
        sortedColumnKey="foo"
        sortOrder="ascending"
        columns={multipleColumns}
        qsConfig={NEW_QS_CONFIG}
        searchColumns={searchColumns}
        sortColumns={sortColumns}
        onSort={onSort}
      />
    );
@@ -106,10 +114,9 @@ describe('<DataListToolbar />', () => {
    searchDropdownItems.at(0).simulate('click', mockedSortEvent);
    toolbar = mountWithContexts(
      <DataListToolbar
        qsConfig={QS_CONFIG}
        sortedColumnKey="foo"
        sortOrder="descending"
        columns={multipleColumns}
        qsConfig={NEW_QS_CONFIG}
        searchColumns={searchColumns}
        sortColumns={sortColumns}
        onSort={onSort}
      />
    );
@@ -145,77 +152,104 @@ describe('<DataListToolbar />', () => {
  });

  test('it displays correct sort icon', () => {
    const downNumericIconSelector = 'SortNumericDownIcon';
    const upNumericIconSelector = 'SortNumericUpIcon';
    const downAlphaIconSelector = 'SortAlphaDownIcon';
    const upAlphaIconSelector = 'SortAlphaUpIcon';
    const NUM_QS_CONFIG = {
      namespace: 'organization',
      dateFields: ['modified', 'created'],
      defaultParams: { page: 1, page_size: 5, order_by: 'id' },
      integerFields: ['page', 'page_size', 'id'],
    };

    const numericColumns = [
      { name: 'ID', key: 'id', isSortable: true, isNumeric: true },
    ];
    const alphaColumns = [
      { name: 'Name', key: 'name', isSortable: true, isNumeric: false },
    const NUM_DESC_QS_CONFIG = {
      namespace: 'organization',
      dateFields: ['modified', 'created'],
      defaultParams: { page: 1, page_size: 5, order_by: '-id' },
      integerFields: ['page', 'page_size', 'id'],
    };

    const ALPH_QS_CONFIG = {
      namespace: 'organization',
      dateFields: ['modified', 'created'],
      defaultParams: { page: 1, page_size: 5, order_by: 'name' },
      integerFields: ['page', 'page_size', 'id'],
    };

    const ALPH_DESC_QS_CONFIG = {
      namespace: 'organization',
      dateFields: ['modified', 'created'],
      defaultParams: { page: 1, page_size: 5, order_by: '-name' },
      integerFields: ['page', 'page_size', 'id'],
    };

    const forwardNumericIconSelector = 'SortNumericDownIcon';
    const reverseNumericIconSelector = 'SortNumericDownAltIcon';
    const forwardAlphaIconSelector = 'SortAlphaDownIcon';
    const reverseAlphaIconSelector = 'SortAlphaDownAltIcon';

    const numericColumns = [{ name: 'ID', key: 'id' }];

    const alphaColumns = [{ name: 'Name', key: 'name' }];

    const searchColumns = [
      { name: 'Name', key: 'name', isDefault: true },
      { name: 'ID', key: 'id' },
    ];

    toolbar = mountWithContexts(
      <DataListToolbar
        qsConfig={QS_CONFIG}
        sortedColumnKey="id"
        sortOrder="descending"
        columns={numericColumns}
        qsConfig={NUM_DESC_QS_CONFIG}
        searchColumns={searchColumns}
        sortColumns={numericColumns}
      />
    );

    const downNumericIcon = toolbar.find(downNumericIconSelector);
    expect(downNumericIcon.length).toBe(1);
    const reverseNumericIcon = toolbar.find(reverseNumericIconSelector);
    expect(reverseNumericIcon.length).toBe(1);

    toolbar = mountWithContexts(
      <DataListToolbar
        qsConfig={QS_CONFIG}
        sortedColumnKey="id"
        sortOrder="ascending"
        columns={numericColumns}
        qsConfig={NUM_QS_CONFIG}
        searchColumns={searchColumns}
        sortColumns={numericColumns}
      />
    );

    const upNumericIcon = toolbar.find(upNumericIconSelector);
    expect(upNumericIcon.length).toBe(1);
    const forwardNumericIcon = toolbar.find(forwardNumericIconSelector);
    expect(forwardNumericIcon.length).toBe(1);

    toolbar = mountWithContexts(
      <DataListToolbar
        qsConfig={QS_CONFIG}
        sortedColumnKey="name"
        sortOrder="descending"
        columns={alphaColumns}
        qsConfig={ALPH_DESC_QS_CONFIG}
        searchColumns={searchColumns}
        sortColumns={alphaColumns}
      />
    );

    const downAlphaIcon = toolbar.find(downAlphaIconSelector);
    expect(downAlphaIcon.length).toBe(1);
    const reverseAlphaIcon = toolbar.find(reverseAlphaIconSelector);
    expect(reverseAlphaIcon.length).toBe(1);

    toolbar = mountWithContexts(
      <DataListToolbar
        qsConfig={QS_CONFIG}
        sortedColumnKey="name"
        sortOrder="ascending"
        columns={alphaColumns}
        qsConfig={ALPH_QS_CONFIG}
        searchColumns={searchColumns}
        sortColumns={alphaColumns}
      />
    );

    const upAlphaIcon = toolbar.find(upAlphaIconSelector);
    expect(upAlphaIcon.length).toBe(1);
    const forwardAlphaIcon = toolbar.find(forwardAlphaIconSelector);
    expect(forwardAlphaIcon.length).toBe(1);
  });

  test('should render additionalControls', () => {
    const columns = [
      { name: 'Name', key: 'name', isSortable: true, isSearchable: true },
    ];
    const searchColumns = [{ name: 'Name', key: 'name', isDefault: true }];
    const sortColumns = [{ name: 'Name', key: 'name' }];

    toolbar = mountWithContexts(
      <DataListToolbar
        qsConfig={QS_CONFIG}
        columns={columns}
        searchColumns={searchColumns}
        sortColumns={sortColumns}
        onSearch={onSearch}
        onReplaceSearch={onReplaceSearch}
        onSort={onSort}
        onSelectAll={onSelectAll}
        additionalControls={[
@@ -232,19 +266,17 @@ describe('<DataListToolbar />', () => {
  });

  test('it triggers the expected callbacks', () => {
    const columns = [
      { name: 'Name', key: 'name', isSortable: true, isSearchable: true },
    ];

    const searchColumns = [{ name: 'Name', key: 'name', isDefault: true }];
    const sortColumns = [{ name: 'Name', key: 'name' }];
    toolbar = mountWithContexts(
      <DataListToolbar
        qsConfig={QS_CONFIG}
        isAllSelected
        showExpandCollapse
        sortedColumnKey="name"
        sortOrder="ascending"
        columns={columns}
        searchColumns={searchColumns}
        sortColumns={sortColumns}
        onSearch={onSearch}
        onReplaceSearch={onReplaceSearch}
        onSort={onSort}
        onSelectAll={onSelectAll}
        showSelectAll
50 awx/ui_next/src/components/DeleteButton/DeleteButton.jsx Normal file
@@ -0,0 +1,50 @@
import React, { useState } from 'react';
import { withI18n } from '@lingui/react';
import { t } from '@lingui/macro';
import { Button } from '@patternfly/react-core';
import AlertModal from '@components/AlertModal';
import { CardActionsRow } from '@components/Card';

function DeleteButton({ onConfirm, modalTitle, name, i18n }) {
  const [isOpen, setIsOpen] = useState(false);

  return (
    <>
      <Button
        variant="danger"
        aria-label={i18n._(t`Delete`)}
        onClick={() => setIsOpen(true)}
      >
        {i18n._(t`Delete`)}
      </Button>
      <AlertModal
        isOpen={isOpen}
        title={modalTitle}
        variant="danger"
        onClose={() => setIsOpen(false)}
      >
        {i18n._(t`Are you sure you want to delete:`)}
        <br />
        <strong>{name}</strong>
        <CardActionsRow>
          <Button
            variant="secondary"
            aria-label={i18n._(t`Cancel`)}
            onClick={() => setIsOpen(false)}
          >
            {i18n._(t`Cancel`)}
          </Button>
          <Button
            variant="danger"
            aria-label={i18n._(t`Delete`)}
            onClick={onConfirm}
          >
            {i18n._(t`Delete`)}
          </Button>
        </CardActionsRow>
      </AlertModal>
    </>
  );
}

export default withI18n()(DeleteButton);
1 awx/ui_next/src/components/DeleteButton/index.js Normal file
@@ -0,0 +1 @@
export { default } from './DeleteButton';
36 awx/ui_next/src/components/DetailList/UserDateDetail.jsx Normal file
@@ -0,0 +1,36 @@
import React from 'react';
import { node, string } from 'prop-types';
import { Trans } from '@lingui/macro';
import { Link } from 'react-router-dom';
import { formatDateString } from '@util/dates';
import Detail from './Detail';
import { SummaryFieldUser } from '../../types';

function UserDateDetail({ label, date, user }) {
  const dateStr = formatDateString(date);
  const username = user ? user.username : '';
  return (
    <Detail
      label={label}
      value={
        user ? (
          <Trans>
            {dateStr} by <Link to={`/users/${user.id}`}>{username}</Link>
          </Trans>
        ) : (
          dateStr
        )
      }
    />
  );
}
UserDateDetail.propTypes = {
  label: node.isRequired,
  date: string.isRequired,
  user: SummaryFieldUser,
};
UserDateDetail.defaultProps = {
  user: null,
};

export default UserDateDetail;
@@ -1,2 +1,3 @@
export { default as DetailList } from './DetailList';
export { default as Detail, DetailName, DetailValue } from './Detail';
export { default as UserDateDetail } from './UserDateDetail';

@@ -1,102 +0,0 @@
import React from 'react';
import { withRouter } from 'react-router-dom';
import { withI18n } from '@lingui/react';
import { t } from '@lingui/macro';
import styled from 'styled-components';
import { Button } from '@patternfly/react-core';
import { parseQueryString } from '@util/qs';
import { ChipGroup as _ChipGroup, Chip } from '@components/Chip';
import VerticalSeparator from '@components/VerticalSeparator';

const FilterTagsRow = styled.div`
  display: flex;
  padding: 15px 20px;
  border-top: 1px solid #d2d2d2;
  font-size: 14px;
  align-items: center;
`;

const ResultCount = styled.span`
  font-weight: bold;
`;

const FilterLabel = styled.span`
  padding-right: 20px;
`;

const ChipGroup = styled(_ChipGroup)`
  li.pf-m-overflow {
    display: none;
  }
`;

// remove non-default query params so they don't show up as filter tags
const filterDefaultParams = (paramsArr, config) => {
  const defaultParamsKeys = Object.keys(config.defaultParams);
  return paramsArr.filter(key => defaultParamsKeys.indexOf(key) === -1);
};

const FilterTags = ({
  i18n,
  itemCount,
  qsConfig,
  location,
  onRemove,
  onRemoveAll,
}) => {
  const queryParams = parseQueryString(qsConfig, location.search);
  const queryParamsArr = [];
  const nonDefaultParams = filterDefaultParams(
    Object.keys(queryParams),
    qsConfig
  );
  nonDefaultParams.forEach(key => {
    const label = key
      .replace('__icontains', '')
      .split('_')
      .map(word => `${word.charAt(0).toUpperCase()}${word.slice(1)}`)
      .join(' ');

    if (Array.isArray(queryParams[key])) {
      queryParams[key].forEach(val =>
        queryParamsArr.push({ key, value: val, label })
      );
    } else {
      queryParamsArr.push({ key, value: queryParams[key], label });
    }
  });

  return (
    queryParamsArr.length > 0 && (
      <FilterTagsRow>
        <ResultCount>{i18n._(t`${itemCount} results`)}</ResultCount>
        <VerticalSeparator />
        <FilterLabel>{i18n._(t`Active Filters:`)}</FilterLabel>
        <ChipGroup defaultIsOpen>
          {queryParamsArr.map(({ key, label, value }) => (
            <Chip
              className="searchTagChip"
              key={`${key}__${value}`}
              isReadOnly={false}
              onClick={() => onRemove(key, value)}
            >
              <b>{label}:</b> {value}
            </Chip>
          ))}
          <div className="pf-c-chip pf-m-overflow">
            <Button
              variant="plain"
              type="button"
              aria-label={i18n._(t`Clear all search filters`)}
              onClick={onRemoveAll}
            >
              <span className="pf-c-chip__text">{i18n._(t`Clear all`)}</span>
            </Button>
          </div>
        </ChipGroup>
      </FilterTagsRow>
    )
  );
};

export default withI18n()(withRouter(FilterTags));
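The removed FilterTags component derived each tag's display label directly from the raw query key via the `.replace('__icontains', '')` chain above. Extracted as a standalone function for illustration:

```javascript
// Tag labels strip the `__icontains` lookup suffix and title-case the
// remaining underscore-separated words, so `job_type__icontains`
// renders as "Job Type".
function labelForKey(key) {
  return key
    .replace('__icontains', '')
    .split('_')
    .map(word => `${word.charAt(0).toUpperCase()}${word.slice(1)}`)
    .join(' ');
}
```

This matches the labels asserted in the component's test below ('Name:' and 'Job Type:').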
@@ -1,51 +0,0 @@
import React from 'react';
import { createMemoryHistory } from 'history';
import { mountWithContexts } from '../../../testUtils/enzymeHelpers';
import FilterTags from './FilterTags';

describe('<ExpandCollapse />', () => {
  const qsConfig = {
    namespace: 'item',
    defaultParams: { page: 1, page_size: 5, order_by: 'name' },
    integerFields: [],
  };
  const onRemoveFn = jest.fn();
  const onRemoveAllFn = jest.fn();

  test('initially renders without crashing', () => {
    const wrapper = mountWithContexts(
      <FilterTags
        qsConfig={qsConfig}
        onRemove={onRemoveFn}
        onRemoveAll={onRemoveAllFn}
      />
    );
    expect(wrapper.length).toBe(1);
    wrapper.unmount();
  });

  test('renders non-default param tags based on location history', () => {
    const history = createMemoryHistory({
      initialEntries: [
        '/foo?item.page=1&item.page_size=2&item.name__icontains=bar&item.job_type__icontains=project',
      ],
    });
    const wrapper = mountWithContexts(
      <FilterTags
        qsConfig={qsConfig}
        onRemove={onRemoveFn}
        onRemoveAll={onRemoveAllFn}
      />,
      {
        context: { router: { history, route: { location: history.location } } },
      }
    );
    const chips = wrapper.find('.pf-c-chip.searchTagChip');
    expect(chips.length).toBe(2);
    const chipLabels = wrapper.find('.pf-c-chip__text b');
    expect(chipLabels.length).toBe(2);
    expect(chipLabels.at(0).text()).toEqual('Name:');
    expect(chipLabels.at(1).text()).toEqual('Job Type:');
    wrapper.unmount();
  });
});
@@ -1 +0,0 @@
export { default } from './FilterTags';
@@ -1,10 +1,12 @@
import React, { Fragment } from 'react';
import PropTypes, { arrayOf, shape, string, bool } from 'prop-types';
import PropTypes from 'prop-types';
import { withRouter } from 'react-router-dom';
import styled from 'styled-components';

import {
  DataToolbar,
  DataToolbarContent,
} from '@patternfly/react-core/dist/umd/experimental';
import DataListToolbar from '@components/DataListToolbar';
import FilterTags from '@components/FilterTags';

import {
  encodeNonDefaultQueryString,
@@ -13,7 +15,7 @@ import {
  replaceParams,
  removeParams,
} from '@util/qs';
import { QSConfig } from '@types';
import { QSConfig, SearchColumns, SortColumns } from '@types';

const EmptyStateControlsWrapper = styled.div`
  display: flex;
@@ -31,29 +33,34 @@ class ListHeader extends React.Component {
    super(props);

    this.handleSearch = this.handleSearch.bind(this);
    this.handleReplaceSearch = this.handleReplaceSearch.bind(this);
    this.handleSort = this.handleSort.bind(this);
    this.handleRemove = this.handleRemove.bind(this);
    this.handleRemoveAll = this.handleRemoveAll.bind(this);
  }

  getSortOrder() {
    const { qsConfig, location } = this.props;
    const queryParams = parseQueryString(qsConfig, location.search);
    if (queryParams.order_by && queryParams.order_by.startsWith('-')) {
      return [queryParams.order_by.substr(1), 'descending'];
    }
    return [queryParams.order_by, 'ascending'];
  }

  handleSearch(key, value) {
    const { location, qsConfig } = this.props;
    let params = parseQueryString(qsConfig, location.search);
    params = mergeParams(params, { [key]: value });
    params = replaceParams(params, { page: 1 });
    this.pushHistoryState(params);
  }

  handleReplaceSearch(key, value) {
    const { location, qsConfig } = this.props;
    const oldParams = parseQueryString(qsConfig, location.search);
    this.pushHistoryState(mergeParams(oldParams, { [key]: value }));
    this.pushHistoryState(replaceParams(oldParams, { [key]: value }));
  }

  handleRemove(key, value) {
    const { location, qsConfig } = this.props;
    const oldParams = parseQueryString(qsConfig, location.search);
    let oldParams = parseQueryString(qsConfig, location.search);
    if (parseInt(value, 10)) {
      oldParams = removeParams(qsConfig, oldParams, {
        [key]: parseInt(value, 10),
      });
    }
    this.pushHistoryState(removeParams(qsConfig, oldParams, { [key]: value }));
  }

@@ -83,44 +90,40 @@ class ListHeader extends React.Component {
    const {
      emptyStateControls,
      itemCount,
      columns,
      searchColumns,
      sortColumns,
      renderToolbar,
      qsConfig,
      location,
    } = this.props;
    const [orderBy, sortOrder] = this.getSortOrder();
    const params = parseQueryString(qsConfig, location.search);
    const isEmpty = itemCount === 0 && Object.keys(params).length === 0;
    return (
      <Fragment>
        {isEmpty ? (
          <Fragment>
            <EmptyStateControlsWrapper>
              {emptyStateControls}
            </EmptyStateControlsWrapper>
            <FilterTags
              itemCount={itemCount}
              qsConfig={qsConfig}
              onRemove={this.handleRemove}
              onRemoveAll={this.handleRemoveAll}
            />
          </Fragment>
          <DataToolbar
            id={`${qsConfig.namespace}-list-toolbar`}
            clearAllFilters={this.handleRemoveAll}
            collapseListedFiltersBreakpoint="md"
          >
            <DataToolbarContent>
              <EmptyStateControlsWrapper>
                {emptyStateControls}
              </EmptyStateControlsWrapper>
            </DataToolbarContent>
          </DataToolbar>
        ) : (
          <Fragment>
            {renderToolbar({
              sortedColumnKey: orderBy,
              sortOrder,
              columns,
              searchColumns,
              sortColumns,
              onSearch: this.handleSearch,
              onReplaceSearch: this.handleReplaceSearch,
              onSort: this.handleSort,
              onRemove: this.handleRemove,
              clearAllFilters: this.handleRemoveAll,
              qsConfig,
            })}
            <FilterTags
              itemCount={itemCount}
              qsConfig={qsConfig}
              onRemove={this.handleRemove}
              onRemoveAll={this.handleRemoveAll}
            />
          </Fragment>
        )}
      </Fragment>
@@ -131,14 +134,8 @@ class ListHeader extends React.Component {
ListHeader.propTypes = {
  itemCount: PropTypes.number.isRequired,
  qsConfig: QSConfig.isRequired,
  columns: arrayOf(
    shape({
      name: string.isRequired,
      key: string.isRequired,
      isSortable: bool,
      isSearchable: bool,
    })
  ).isRequired,
  searchColumns: SearchColumns.isRequired,
  sortColumns: SortColumns.isRequired,
  renderToolbar: PropTypes.func,
};

@@ -1,13 +1,12 @@
import React from 'react';
import { createMemoryHistory } from 'history';
import { mountWithContexts } from '@testUtils/enzymeHelpers';
import { sleep } from '@testUtils/testUtils';
import ListHeader from './ListHeader';

describe('ListHeader', () => {
  const qsConfig = {
    namespace: 'item',
    defaultParams: { page: 1, page_size: 5, order_by: 'name' },
    defaultParams: { page: 1, page_size: 5, order_by: 'foo' },
    integerFields: [],
  };
  const renderToolbarFn = jest.fn();
@@ -17,9 +16,8 @@ describe('ListHeader', () => {
      <ListHeader
        itemCount={50}
        qsConfig={qsConfig}
        columns={[
          { name: 'foo', key: 'foo', isSearchable: true, isSortable: true },
        ]}
        searchColumns={[{ name: 'foo', key: 'foo', isDefault: true }]}
        sortColumns={[{ name: 'foo', key: 'foo' }]}
        renderToolbar={renderToolbarFn}
      />
    );
@@ -35,26 +33,16 @@ describe('ListHeader', () => {
      <ListHeader
        itemCount={7}
        qsConfig={qsConfig}
        columns={[
          { name: 'name', key: 'name', isSearchable: true, isSortable: true },
        ]}
        searchColumns={[{ name: 'foo', key: 'foo', isDefault: true }]}
        sortColumns={[{ name: 'foo', key: 'foo' }]}
      />,
      { context: { router: { history } } }
    );

    const toolbar = wrapper.find('DataListToolbar');
    expect(toolbar.prop('sortedColumnKey')).toEqual('name');
    expect(toolbar.prop('sortOrder')).toEqual('ascending');
    toolbar.prop('onSort')('name', 'descending');
    expect(history.location.search).toEqual('?item.order_by=-name');
    await sleep(0);
    wrapper.update();

    expect(toolbar.prop('sortedColumnKey')).toEqual('name');
    // TODO: this assertion required updating queryParams prop. Consider
    // fixing after #147 is done:
    // expect(toolbar.prop('sortOrder')).toEqual('descending');
    toolbar.prop('onSort')('name', 'ascending');
    toolbar.prop('onSort')('foo', 'descending');
    expect(history.location.search).toEqual('?item.order_by=-foo');
    toolbar.prop('onSort')('foo', 'ascending');
    // since order_by = name is the default, it should be stripped out of the search
    expect(history.location.search).toEqual('');
  });

@@ -2,6 +2,7 @@ import React, { useEffect, useState } from 'react';
import { bool, func, number, string, oneOfType } from 'prop-types';
import { withRouter } from 'react-router-dom';
import { withI18n } from '@lingui/react';
import { t } from '@lingui/macro';
import { CredentialsAPI } from '@api';
import { Credential } from '@types';
import { getQSConfig, parseQueryString, mergeParams } from '@util/qs';
@@ -26,6 +27,7 @@ function CredentialLookup({
  credentialTypeId,
  value,
  history,
  i18n,
}) {
  const [credentials, setCredentials] = useState([]);
  const [count, setCount] = useState(0);
@@ -48,6 +50,8 @@ function CredentialLookup({
    })();
  }, [credentialTypeId, history.location.search]);

  // TODO: replace credential type search with REST-based grabbing of cred types

  return (
    <FormGroup
      fieldId="credential"
@@ -71,6 +75,27 @@ function CredentialLookup({
        optionCount={count}
        header={label}
        qsConfig={QS_CONFIG}
        searchColumns={[
          {
            name: i18n._(t`Name`),
            key: 'name',
            isDefault: true,
          },
          {
            name: i18n._(t`Created By (Username)`),
            key: 'created_by__username',
          },
          {
            name: i18n._(t`Modified By (Username)`),
            key: 'modified_by__username',
          },
        ]}
        sortColumns={[
          {
            name: i18n._(t`Name`),
            key: 'name',
          },
        ]}
        readOnly={!canDelete}
        selectItem={item => dispatch({ type: 'SELECT_ITEM', item })}
        deselectItem={item => dispatch({ type: 'DESELECT_ITEM', item })}

@@ -64,24 +64,21 @@ function InstanceGroupsLookup(props) {
      value={state.selectedItems}
      options={instanceGroups}
      optionCount={count}
      columns={[
      searchColumns={[
        {
          name: i18n._(t`Name`),
          key: 'name',
          isSortable: true,
          isSearchable: true,
          isDefault: true,
        },
        {
          name: i18n._(t`Modified`),
          key: 'modified',
          isSortable: false,
          isNumeric: true,
          name: i18n._(t`Credential Name`),
          key: 'credential__name',
        },
      ]}
      sortColumns={[
        {
          name: i18n._(t`Created`),
          key: 'created',
          isSortable: false,
          isNumeric: true,
          name: i18n._(t`Name`),
          key: 'name',
        },
      ]}
      multiple={state.multiple}

@@ -68,19 +68,25 @@ function InventoryLookup({
      value={state.selectedItems}
      options={inventories}
      optionCount={count}
      columns={[
        { name: i18n._(t`Name`), key: 'name', isSortable: true },
      searchColumns={[
        {
          name: i18n._(t`Modified`),
          key: 'modified',
          isSortable: false,
          isNumeric: true,
          name: i18n._(t`Name`),
          key: 'name',
          isDefault: true,
        },
        {
          name: i18n._(t`Created`),
          key: 'created',
          isSortable: false,
          isNumeric: true,
          name: i18n._(t`Created By (Username)`),
          key: 'created_by__username',
        },
        {
          name: i18n._(t`Modified By (Username)`),
          key: 'modified_by__username',
        },
      ]}
      sortColumns={[
        {
          name: i18n._(t`Name`),
          key: 'name',
        },
      ]}
      multiple={state.multiple}

@@ -122,12 +122,25 @@ function MultiCredentialsLookup(props) {
      value={state.selectedItems}
      options={credentials}
      optionCount={credentialsCount}
      columns={[
      searchColumns={[
        {
          name: i18n._(t`Name`),
          key: 'name',
          isDefault: true,
        },
        {
          name: i18n._(t`Created By (Username)`),
          key: 'created_by__username',
        },
        {
          name: i18n._(t`Modified By (Username)`),
          key: 'modified_by__username',
        },
      ]}
      sortColumns={[
        {
          name: i18n._(t`Name`),
          key: 'name',
          isSortable: true,
          isSearchable: true,
        },
      ]}
      multiple={isMultiple}

@@ -70,6 +70,27 @@ function OrganizationLookup({
      header={i18n._(t`Organization`)}
      name="organization"
      qsConfig={QS_CONFIG}
      searchColumns={[
        {
          name: i18n._(t`Name`),
          key: 'name',
          isDefault: true,
        },
        {
          name: i18n._(t`Created By (Username)`),
          key: 'created_by__username',
        },
        {
          name: i18n._(t`Modified By (Username)`),
          key: 'modified_by__username',
        },
      ]}
      sortColumns={[
        {
          name: i18n._(t`Name`),
          key: 'name',
        },
      ]}
      readOnly={!canDelete}
      selectItem={item => dispatch({ type: 'SELECT_ITEM', item })}
      deselectItem={item => dispatch({ type: 'DESELECT_ITEM', item })}

@@ -70,6 +70,41 @@ function ProjectLookup({
      renderOptionsList={({ state, dispatch, canDelete }) => (
        <OptionsList
          value={state.selectedItems}
          searchColumns={[
            {
              name: i18n._(t`Name`),
              key: 'name',
              isDefault: true,
            },
            {
              name: i18n._(t`Type`),
              options: [
                [``, i18n._(t`Manual`)],
                [`git`, i18n._(t`Git`)],
                [`hg`, i18n._(t`Mercurial`)],
                [`svn`, i18n._(t`Subversion`)],
                [`insights`, i18n._(t`Red Hat Insights`)],
              ],
            },
            {
              name: i18n._(t`SCM URL`),
              key: 'scm_url',
            },
            {
              name: i18n._(t`Modified By (Username)`),
              key: 'modified_by__username',
            },
            {
              name: i18n._(t`Created By (Username)`),
              key: 'created_by__username',
            },
          ]}
          sortColumns={[
            {
              name: i18n._(t`Name`),
              key: 'name',
            },
          ]}
          options={projects}
          optionCount={count}
          multiple={state.multiple}

@@ -8,19 +8,27 @@ import {
  string,
  oneOfType,
} from 'prop-types';
import styled from 'styled-components';
import { withI18n } from '@lingui/react';
import { t } from '@lingui/macro';
import SelectedList from '../../SelectedList';
import PaginatedDataList from '../../PaginatedDataList';
import CheckboxListItem from '../../CheckboxListItem';
import DataListToolbar from '../../DataListToolbar';
import { QSConfig } from '@types';
import { QSConfig, SearchColumns, SortColumns } from '@types';

const ModalList = styled.div`
  .pf-c-data-toolbar__content {
    padding: 0 !important;
  }
`;

function OptionsList({
  value,
  options,
  optionCount,
  columns,
  searchColumns,
  sortColumns,
  multiple,
  header,
  name,
@@ -33,7 +41,7 @@ function OptionsList({
  i18n,
}) {
  return (
    <div>
    <ModalList>
      {value.length > 0 && (
        <SelectedList
          label={i18n._(t`Selected`)}
@@ -49,8 +57,10 @@ function OptionsList({
        itemCount={optionCount}
        pluralizedItemName={header}
        qsConfig={qsConfig}
        toolbarColumns={columns}
        toolbarSearchColumns={searchColumns}
        toolbarSortColumns={sortColumns}
        hasContentLoading={isLoading}
        onRowClick={selectItem}
        renderItem={item => (
          <CheckboxListItem
            key={item.id}
@@ -66,7 +76,7 @@ function OptionsList({
        renderToolbar={props => <DataListToolbar {...props} fillWidth />}
        showPageSizeOptions={false}
      />
    </div>
    </ModalList>
  );
}

@@ -79,7 +89,8 @@ OptionsList.propTypes = {
  value: arrayOf(Item).isRequired,
  options: arrayOf(Item).isRequired,
  optionCount: number.isRequired,
  columns: arrayOf(shape({})),
  searchColumns: SearchColumns,
  sortColumns: SortColumns,
  multiple: bool,
  qsConfig: QSConfig.isRequired,
  selectItem: func.isRequired,
@@ -89,7 +100,8 @@ OptionsList.propTypes = {
OptionsList.defaultProps = {
  multiple: false,
  renderItemChip: null,
  columns: [],
  searchColumns: [],
  sortColumns: [],
};

export default withI18n()(OptionsList);

@@ -3,7 +3,7 @@ import { mountWithContexts } from '@testUtils/enzymeHelpers';
import { getQSConfig } from '@util/qs';
import OptionsList from './OptionsList';

const qsConfig = getQSConfig('test', {});
const qsConfig = getQSConfig('test', { order_by: 'foo' });

describe('<OptionsList />', () => {
  it('should display list of options', () => {
@@ -17,7 +17,8 @@ describe('<OptionsList />', () => {
        value={[]}
        options={options}
        optionCount={3}
        columns={[]}
        searchColumns={[{ name: 'Foo', key: 'foo', isDefault: true }]}
        sortColumns={[{ name: 'Foo', key: 'foo' }]}
        qsConfig={qsConfig}
        selectItem={() => {}}
        deselectItem={() => {}}
@@ -39,7 +40,8 @@ describe('<OptionsList />', () => {
        value={[options[1]]}
        options={options}
        optionCount={3}
        columns={[]}
        searchColumns={[{ name: 'Foo', key: 'foo', isDefault: true }]}
        sortColumns={[{ name: 'Foo', key: 'foo' }]}
        qsConfig={qsConfig}
        selectItem={() => {}}
        deselectItem={() => {}}

@@ -35,7 +35,7 @@ describe('<MultiSelect />', () => {
      />
    );
    const component = wrapper.find('MultiSelect');
    const input = component.find('TextInput');
    const input = component.find('TextInputBase');
    input.invoke('onChange')('Flabadoo');
    input.simulate('keydown', { key: 'Enter' });

@@ -58,7 +58,7 @@ describe('<MultiSelect />', () => {
      />
    );

    const input = wrapper.find('TextInput');
    const input = wrapper.find('TextInputBase');
    input.simulate('focus');
    wrapper.update();
    const event = {

@@ -18,12 +18,6 @@ const QS_CONFIG = getQSConfig('notification', {
  order_by: 'name',
});

const COLUMNS = [
  { key: 'name', name: 'Name', isSortable: true, isSearchable: true },
  { key: 'modified', name: 'Modified', isSortable: true, isNumeric: true },
  { key: 'created', name: 'Created', isSortable: true, isNumeric: true },
];

class NotificationList extends Component {
  constructor(props) {
    super(props);
@@ -204,7 +198,43 @@ class NotificationList extends Component {
          itemCount={itemCount}
          pluralizedItemName={i18n._(t`Notifications`)}
          qsConfig={QS_CONFIG}
          toolbarColumns={COLUMNS}
          toolbarSearchColumns={[
            {
              name: i18n._(t`Name`),
              key: 'name',
              isDefault: true,
            },
            {
              name: i18n._(t`Type`),
              key: 'type',
              options: [
                ['email', i18n._(t`Email`)],
                ['grafana', i18n._(t`Grafana`)],
                ['hipchat', i18n._(t`Hipchat`)],
                ['irc', i18n._(t`IRC`)],
                ['mattermost', i18n._(t`Mattermost`)],
                ['pagerduty', i18n._(t`Pagerduty`)],
                ['rocketchat', i18n._(t`Rocket.Chat`)],
                ['slack', i18n._(t`Slack`)],
                ['twilio', i18n._(t`Twilio`)],
                ['webhook', i18n._(t`Webhook`)],
              ],
            },
            {
              name: i18n._(t`Created By (Username)`),
              key: 'created_by__username',
            },
            {
              name: i18n._(t`Modified By (Username)`),
              key: 'modified_by__username',
            },
          ]}
          toolbarSortColumns={[
            {
              name: i18n._(t`Name`),
              key: 'name',
            },
          ]}
          renderItem={notification => (
            <NotificationListItem
              key={notification.id}

@@ -41,6 +41,7 @@ function NotificationListItem(props) {
    <DataListItem
      aria-labelledby={`items-list-item-${notification.id}`}
      key={notification.id}
      id={`${notification.id}`}
    >
      <DataListItemRow>
        <DataListItemCells

@@ -24,11 +24,13 @@ exports[`<NotificationListItem canToggleNotifications /> initially renders succe
>
  <DataListItem
    aria-labelledby="items-list-item-9000"
    id="9000"
    key="9000"
  >
    <li
      aria-labelledby="items-list-item-9000"
      className="pf-c-data-list__item"
      id="9000"
    >
      <DataListItemRow
        key=".0"

@@ -1,5 +1,5 @@
import React, { Fragment } from 'react';
import PropTypes, { arrayOf, shape, string, bool } from 'prop-types';
import PropTypes from 'prop-types';
import { DataList } from '@patternfly/react-core';
import { withI18n } from '@lingui/react';
import { t } from '@lingui/macro';
@@ -18,7 +18,7 @@ import {
  replaceParams,
} from '@util/qs';

import { QSConfig } from '@types';
import { QSConfig, SearchColumns, SortColumns } from '@types';

import PaginatedDataListItem from './PaginatedDataListItem';

@@ -27,8 +27,15 @@ class PaginatedDataList extends React.Component {
    super(props);
    this.handleSetPage = this.handleSetPage.bind(this);
    this.handleSetPageSize = this.handleSetPageSize.bind(this);
    this.handleListItemSelect = this.handleListItemSelect.bind(this);
  }

  handleListItemSelect = (id = 0) => {
    const { items, onRowClick } = this.props;
    const match = items.find(item => item.id === Number(id));
    onRowClick(match);
  };

  handleSetPage(event, pageNumber) {
    const { history, qsConfig } = this.props;
    const { search } = history.location;
@@ -59,21 +66,29 @@ class PaginatedDataList extends React.Component {
      itemCount,
      qsConfig,
      renderItem,
      toolbarColumns,
      toolbarSearchColumns,
      toolbarSortColumns,
      pluralizedItemName,
      showPageSizeOptions,
      location,
      i18n,
      renderToolbar,
    } = this.props;
    const columns = toolbarColumns.length
      ? toolbarColumns
    const searchColumns = toolbarSearchColumns.length
      ? toolbarSearchColumns
      : [
          {
            name: i18n._(t`Name`),
            key: 'name',
            isDefault: true,
          },
        ];
    const sortColumns = toolbarSortColumns.length
      ? toolbarSortColumns
      : [
          {
            name: i18n._(t`Name`),
            key: 'name',
            isSortable: true,
            isSearchable: true,
          },
        ];
    const queryParams = parseQueryString(qsConfig, location.search);
@@ -95,7 +110,12 @@ class PaginatedDataList extends React.Component {
      );
    } else {
      Content = (
        <DataList aria-label={dataListLabel}>{items.map(renderItem)}</DataList>
        <DataList
          aria-label={dataListLabel}
          onSelectDataListItem={id => this.handleListItemSelect(id)}
        >
          {items.map(renderItem)}
        </DataList>
      );
    }

@@ -105,7 +125,8 @@ class PaginatedDataList extends React.Component {
        itemCount={itemCount}
        renderToolbar={renderToolbar}
        emptyStateControls={emptyStateControls}
        columns={columns}
        searchColumns={searchColumns}
        sortColumns={sortColumns}
        qsConfig={qsConfig}
      />
      {Content}
@@ -146,27 +167,25 @@ PaginatedDataList.propTypes = {
  pluralizedItemName: PropTypes.string,
  qsConfig: QSConfig.isRequired,
  renderItem: PropTypes.func,
  toolbarColumns: arrayOf(
    shape({
      name: string.isRequired,
      key: string.isRequired,
      isSortable: bool,
    })
  ),
  toolbarSearchColumns: SearchColumns,
  toolbarSortColumns: SortColumns,
  showPageSizeOptions: PropTypes.bool,
  renderToolbar: PropTypes.func,
  hasContentLoading: PropTypes.bool,
  contentError: PropTypes.shape(),
  onRowClick: PropTypes.func,
};

PaginatedDataList.defaultProps = {
  hasContentLoading: false,
  contentError: null,
  toolbarColumns: [],
  toolbarSearchColumns: [],
  toolbarSortColumns: [],
  pluralizedItemName: 'Items',
  showPageSizeOptions: true,
  renderItem: item => <PaginatedDataListItem key={item.id} item={item} />,
  renderToolbar: props => <DataListToolbar {...props} />,
  onRowClick: () => null,
};

export { PaginatedDataList as _PaginatedDataList };

@@ -19,7 +19,11 @@ const DetailWrapper = styled(TextContent)`

export default function PaginatedDataListItem({ item }) {
  return (
    <DataListItem aria-labelledby={`items-list-item-${item.id}`} key={item.id}>
    <DataListItem
      aria-labelledby={`items-list-item-${item.id}`}
      key={item.id}
      id={`${item.id}`}
    >
      <DataListItemRow>
        <DataListItemCells
          dataListCells={[

@@ -39,7 +39,6 @@ exports[`<ToolbarDeleteButton /> should render button 1`] = `
  zIndex={9999}
>
  <PopoverBase
    animateFill={false}
    appendTo={[Function]}
    aria="describedby"
    arrow={true}
@@ -80,7 +79,6 @@ exports[`<ToolbarDeleteButton /> should render button 1`] = `
    lazy={true}
    maxWidth="18.75rem"
    onCreate={[Function]}
    performance={true}
    placement="top"
    popperOptions={
      Object {

@@ -162,24 +162,33 @@ class ResourceAccessList extends React.Component {
          itemCount={itemCount}
          pluralizedItemName="Roles"
          qsConfig={QS_CONFIG}
          toolbarColumns={[
            {
              name: i18n._(t`First Name`),
              key: 'first_name',
              isSortable: true,
              isSearchable: true,
            },
          toolbarSearchColumns={[
            {
              name: i18n._(t`Username`),
              key: 'username',
              isSortable: true,
              isSearchable: true,
              isDefault: true,
            },
            {
              name: i18n._(t`First Name`),
              key: 'first_name',
            },
            {
              name: i18n._(t`Last Name`),
              key: 'last_name',
            },
          ]}
          toolbarSortColumns={[
            {
              name: i18n._(t`Username`),
              key: 'username',
            },
            {
              name: i18n._(t`First Name`),
              key: 'first_name',
            },
            {
              name: i18n._(t`Last Name`),
              key: 'last_name',
              isSortable: true,
              isSearchable: true,
            },
          ]}
          renderToolbar={props => (

@@ -72,7 +72,11 @@ class ResourceAccessListItem extends React.Component {
    const [teamRoles, userRoles] = this.getRoleLists();

    return (
      <DataListItem aria-labelledby="access-list-item" key={accessRecord.id}>
      <DataListItem
        aria-labelledby="access-list-item"
        key={accessRecord.id}
        id={`${accessRecord.id}`}
      >
        <DataListItemRow>
          <DataListItemCells
            dataListCells={[

@@ -34,11 +34,13 @@ exports[`<ResourceAccessListItem /> initially renders succesfully 1`] = `
>
  <DataListItem
    aria-labelledby="access-list-item"
    id="2"
    key="2"
  >
    <li
      aria-labelledby="access-list-item"
      className="pf-c-data-list__item"
      id="2"
    >
      <DataListItemRow
        key=".0"

@@ -1,98 +1,64 @@
import React from 'react';
import React, { Fragment } from 'react';
import PropTypes from 'prop-types';
import { withI18n } from '@lingui/react';
import { t } from '@lingui/macro';
import { withRouter } from 'react-router-dom';
import {
  Button as PFButton,
  Dropdown as PFDropdown,
  Button,
  ButtonVariant,
  Dropdown,
  DropdownPosition,
  DropdownToggle,
  DropdownItem,
  Form,
  FormGroup,
  TextInput as PFTextInput,
  InputGroup,
  Select,
  SelectOption,
  SelectVariant,
  TextInput,
} from '@patternfly/react-core';
import {
  DataToolbarGroup,
  DataToolbarItem,
  DataToolbarFilter,
} from '@patternfly/react-core/dist/umd/experimental';
import { SearchIcon } from '@patternfly/react-icons';

import { QSConfig } from '@types';

import { parseQueryString } from '@util/qs';
import { QSConfig, SearchColumns } from '@types';
import styled from 'styled-components';

const TextInput = styled(PFTextInput)`
  min-height: 0px;
  height: 30px;
  --pf-c-form-control--BorderTopColor: var(--pf-global--BorderColor--200);
  --pf-c-form-control--BorderLeftColor: var(--pf-global--BorderColor--200);
`;

const Button = styled(PFButton)`
  width: 34px;
  padding: 0px;
  ::after {
    border: var(--pf-c-button--BorderWidth) solid
      var(--pf-global--BorderColor--200);
  }
`;

const Dropdown = styled(PFDropdown)`
  &&& {
    /* Higher specificity required because we are selecting unclassed elements */
    > button {
      min-height: 30px;
      min-width: 70px;
      height: 30px;
      padding: 0 10px;
      margin: 0px;

      ::before {
        border-color: var(--pf-global--BorderColor--200);
        border-top-left-radius: 3px;
        border-bottom-left-radius: 3px;
      }

      > span {
        /* text element */
        width: auto;
      }

      > svg {
        /* caret icon */
        margin: 0px;
        padding-top: 3px;
        padding-left: 3px;
      }
    }
  }
`;

const NoOptionDropdown = styled.div`
  align-self: stretch;
  border: 1px solid var(--pf-global--BorderColor--200);
  border-top-left-radius: 3px;
  border-bottom-left-radius: 3px;
  padding: 3px 7px;
  border: 1px solid var(--pf-global--BorderColor--300);
  padding: 5px 15px;
  white-space: nowrap;
`;

const InputFormGroup = styled(FormGroup)`
  flex: 1;
  border-bottom-color: var(--pf-global--BorderColor--200);
`;

class Search extends React.Component {
  constructor(props) {
    super(props);

    const { sortedColumnKey } = this.props;
    const { columns } = this.props;

    this.state = {
      isSearchDropdownOpen: false,
      searchKey: sortedColumnKey,
      searchKey: columns.find(col => col.isDefault).key,
      searchValue: '',
      isFilterDropdownOpen: false,
    };

    this.handleSearchInputChange = this.handleSearchInputChange.bind(this);
    this.handleDropdownToggle = this.handleDropdownToggle.bind(this);
    this.handleDropdownSelect = this.handleDropdownSelect.bind(this);
    this.handleSearch = this.handleSearch.bind(this);
    this.handleTextKeyDown = this.handleTextKeyDown.bind(this);
    this.handleFilterDropdownToggle = this.handleFilterDropdownToggle.bind(
      this
    );
    this.handleFilterDropdownSelect = this.handleFilterDropdownSelect.bind(
      this
    );
    this.handleFilterBooleanSelect = this.handleFilterBooleanSelect.bind(this);
  }

  handleDropdownToggle(isSearchDropdownOpen) {
@@ -115,11 +81,9 @@ class Search extends React.Component {
    const { onSearch, qsConfig } = this.props;

    const isNonStringField =
      qsConfig.integerFields.filter(field => field === searchKey).length ||
      qsConfig.dateFields.filter(field => field === searchKey).length;
      qsConfig.integerFields.find(field => field === searchKey) ||
      qsConfig.dateFields.find(field => field === searchKey);

    // TODO: this will probably become more sophisticated, where date
    // fields and string fields are passed to a formatter
    const actualSearchKey = isNonStringField
      ? searchKey
      : `${searchKey}__icontains`;
@@ -133,95 +97,213 @@ class Search extends React.Component {
    this.setState({ searchValue });
  }

  handleTextKeyDown(e) {
    if (e.key && e.key === 'Enter') {
      this.handleSearch(e);
    }
  }

  handleFilterDropdownToggle(isFilterDropdownOpen) {
    this.setState({ isFilterDropdownOpen });
  }

  handleFilterDropdownSelect(key, event, actualValue) {
    const { onSearch, onRemove } = this.props;

    if (event.target.checked) {
      onSearch(`or__${key}`, actualValue);
    } else {
      onRemove(`or__${key}`, actualValue);
    }
  }

  handleFilterBooleanSelect(key, selection) {
    const { onReplaceSearch } = this.props;
    onReplaceSearch(key, selection);
  }

  render() {
    const { up } = DropdownPosition;
    const { columns, i18n } = this.props;
    const { isSearchDropdownOpen, searchKey, searchValue } = this.state;
    const { columns, i18n, onRemove, qsConfig, location } = this.props;
    const {
      isSearchDropdownOpen,
      searchKey,
      searchValue,
      isFilterDropdownOpen,
    } = this.state;
    const { name: searchColumnName } = columns.find(
      ({ key }) => key === searchKey
    );

    const searchDropdownItems = columns
      .filter(({ key, isSearchable }) => isSearchable && key !== searchKey)
      .filter(({ key }) => key !== searchKey)
      .map(({ key, name }) => (
        <DropdownItem key={key} component="button">
          {name}
        </DropdownItem>
      ));

    const filterDefaultParams = (paramsArr, config) => {
      const defaultParamsKeys = Object.keys(config.defaultParams || {});
      return paramsArr.filter(key => defaultParamsKeys.indexOf(key) === -1);
    };

    const getChipsByKey = () => {
      const queryParams = parseQueryString(qsConfig, location.search);

      const queryParamsByKey = {};
      columns.forEach(({ name, key }) => {
        queryParamsByKey[key] = { key, label: name, chips: [] };
      });
      const nonDefaultParams = filterDefaultParams(
        Object.keys(queryParams || {}),
        qsConfig
      );

      nonDefaultParams.forEach(key => {
        const columnKey = key.replace('__icontains', '').replace('or__', '');
        const label = columns.filter(
          ({ key: keyToCheck }) => columnKey === keyToCheck
        ).length
          ? columns.filter(({ key: keyToCheck }) => columnKey === keyToCheck)[0]
              .name
          : columnKey;

        queryParamsByKey[columnKey] = { key, label, chips: [] };

        if (Array.isArray(queryParams[key])) {
          queryParams[key].forEach(val =>
            queryParamsByKey[columnKey].chips.push(val.toString())
          );
        } else {
          queryParamsByKey[columnKey].chips.push(queryParams[key].toString());
        }
      });

      return queryParamsByKey;
    };

    const chipsByKey = getChipsByKey();

    return (
      <Form autoComplete="off">
        <div className="pf-c-input-group">
|
||||
<DataToolbarGroup variant="filter-group">
|
||||
<DataToolbarItem>
|
||||
{searchDropdownItems.length > 0 ? (
|
||||
<FormGroup
|
||||
fieldId="searchKeyDropdown"
|
||||
label={
|
||||
<span className="pf-screen-reader">
|
||||
{i18n._(t`Search key dropdown`)}
|
||||
</span>
|
||||
<Dropdown
|
||||
onToggle={this.handleDropdownToggle}
|
||||
onSelect={this.handleDropdownSelect}
|
||||
direction={up}
|
||||
toggle={
|
||||
<DropdownToggle
|
||||
id="awx-search"
|
||||
onToggle={this.handleDropdownToggle}
|
||||
style={{ width: '100%' }}
|
||||
>
|
||||
{searchColumnName}
|
||||
</DropdownToggle>
|
||||
}
|
||||
>
|
||||
<Dropdown
|
||||
onToggle={this.handleDropdownToggle}
|
||||
onSelect={this.handleDropdownSelect}
|
||||
direction={up}
|
||||
isOpen={isSearchDropdownOpen}
|
||||
toggle={
|
||||
<DropdownToggle
|
||||
id="awx-search"
|
||||
onToggle={this.handleDropdownToggle}
|
||||
>
|
||||
{searchColumnName}
|
||||
</DropdownToggle>
|
||||
}
|
||||
dropdownItems={searchDropdownItems}
|
||||
/>
|
||||
</FormGroup>
|
||||
isOpen={isSearchDropdownOpen}
|
||||
dropdownItems={searchDropdownItems}
|
||||
style={{ width: '100%' }}
|
||||
/>
|
||||
) : (
|
||||
<NoOptionDropdown>{searchColumnName}</NoOptionDropdown>
|
||||
)}
|
||||
<InputFormGroup
|
||||
fieldId="searchValueTextInput"
|
||||
label={
|
||||
<span className="pf-screen-reader">
|
||||
{i18n._(t`Search value text input`)}
|
||||
</span>
|
||||
}
|
||||
style={{ width: '100%' }}
|
||||
suppressClassNameWarning
|
||||
</DataToolbarItem>
|
||||
{columns.map(({ key, name, options, isBoolean }) => (
|
||||
<DataToolbarFilter
|
||||
chips={chipsByKey[key] ? chipsByKey[key].chips : []}
|
||||
deleteChip={(unusedKey, val) => {
|
||||
onRemove(chipsByKey[key].key, val);
|
||||
}}
|
||||
categoryName={chipsByKey[key] ? chipsByKey[key].label : key}
|
||||
key={key}
|
||||
showToolbarItem={searchKey === key}
|
||||
>
|
||||
<TextInput
|
||||
type="search"
|
||||
aria-label={i18n._(t`Search text input`)}
|
||||
value={searchValue}
|
||||
onChange={this.handleSearchInputChange}
|
||||
style={{ height: '30px' }}
|
||||
/>
|
||||
</InputFormGroup>
|
||||
<Button
|
||||
variant="tertiary"
|
||||
type="submit"
|
||||
aria-label={i18n._(t`Search submit button`)}
|
||||
onClick={this.handleSearch}
|
||||
>
|
||||
<SearchIcon />
|
||||
</Button>
|
||||
</div>
|
||||
</Form>
|
||||
{(options && (
|
||||
<Fragment>
|
||||
{/* TODO: update value to being object
|
||||
{ actualValue: optionKey, toString: () => label }
|
||||
currently a pf bug that makes the checked logic
|
||||
not work with object-based values */}
|
||||
<Select
|
||||
variant={SelectVariant.checkbox}
|
||||
aria-label={name}
|
||||
onToggle={this.handleFilterDropdownToggle}
|
||||
onSelect={(event, selection) =>
|
||||
this.handleFilterDropdownSelect(key, event, selection)
|
||||
}
|
||||
selections={chipsByKey[key].chips}
|
||||
isExpanded={isFilterDropdownOpen}
|
||||
placeholderText={`Filter by ${name.toLowerCase()}`}
|
||||
>
|
||||
{options.map(([optionKey]) => (
|
||||
<SelectOption key={optionKey} value={optionKey} />
|
||||
))}
|
||||
</Select>
|
||||
</Fragment>
|
||||
)) ||
|
||||
(isBoolean && (
|
||||
<Select
|
||||
aria-label={name}
|
||||
onToggle={this.handleFilterDropdownToggle}
|
||||
onSelect={(event, selection) =>
|
||||
this.handleFilterBooleanSelect(key, selection)
|
||||
}
|
||||
selections={chipsByKey[key].chips[0]}
|
||||
isExpanded={isFilterDropdownOpen}
|
||||
placeholderText={`Filter by ${name.toLowerCase()}`}
|
||||
>
|
||||
{/* TODO: update value to being object
|
||||
{ actualValue: optionKey, toString: () => label }
|
||||
currently a pf bug that makes the checked logic
|
||||
not work with object-based values */}
|
||||
<SelectOption key="true" value="true" />
|
||||
<SelectOption key="false" value="false" />
|
||||
</Select>
|
||||
)) || (
|
||||
<InputGroup>
|
||||
{/* TODO: add support for dates:
|
||||
qsConfig.dateFields.filter(field => field === key).length && "date" */}
|
||||
<TextInput
|
||||
type={
|
||||
(qsConfig.integerFields.find(
|
||||
field => field === searchKey
|
||||
) &&
|
||||
'number') ||
|
||||
'search'
|
||||
}
|
||||
aria-label={i18n._(t`Search text input`)}
|
||||
value={searchValue}
|
||||
onChange={this.handleSearchInputChange}
|
||||
onKeyDown={this.handleTextKeyDown}
|
||||
/>
|
||||
<Button
|
||||
variant={ButtonVariant.control}
|
||||
aria-label={i18n._(t`Search submit button`)}
|
||||
onClick={this.handleSearch}
|
||||
>
|
||||
<SearchIcon />
|
||||
</Button>
|
||||
</InputGroup>
|
||||
)}
|
||||
</DataToolbarFilter>
|
||||
))}
|
||||
</DataToolbarGroup>
|
||||
);
|
||||
}
|
||||
}
|
||||
|
||||
Search.propTypes = {
|
||||
qsConfig: QSConfig.isRequired,
|
||||
columns: PropTypes.arrayOf(PropTypes.object).isRequired,
|
||||
columns: SearchColumns.isRequired,
|
||||
onSearch: PropTypes.func,
|
||||
sortedColumnKey: PropTypes.string,
|
||||
onRemove: PropTypes.func,
|
||||
};
|
||||
|
||||
Search.defaultProps = {
|
||||
onSearch: null,
|
||||
sortedColumnKey: 'name',
|
||||
onRemove: null,
|
||||
};
|
||||
|
||||
export default withI18n()(Search);
|
||||
export default withI18n()(withRouter(Search));
|
||||
|
||||
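The `getChipsByKey` helper in the Search diff above groups active query params into per-column chip lists, stripping the `__icontains` and `or__` decorations and ignoring default pagination params. A minimal plain-JavaScript sketch of that grouping; the `columns`, `defaultParams`, and `queryParams` values here are hypothetical stand-ins for the component's props and `parseQueryString` output:

```javascript
// Sketch of the chip-grouping logic, outside React.
const columns = [
  { key: 'name', name: 'Name' },
  { key: 'status', name: 'Status' },
];
const defaultParams = { page: 1, page_size: 20, order_by: 'name' };
const queryParams = {
  page: 1,
  name__icontains: 'demo',
  or__status: ['failed', 'successful'],
};

// Drop keys that only reflect default pagination/ordering.
const nonDefault = Object.keys(queryParams).filter(k => !(k in defaultParams));

const chipsByKey = {};
columns.forEach(({ key, name }) => {
  chipsByKey[key] = { key, label: name, chips: [] };
});

nonDefault.forEach(key => {
  // Strip the qs decorations to recover the column key.
  const columnKey = key.replace('__icontains', '').replace('or__', '');
  const column = columns.find(({ key: k }) => k === columnKey);
  chipsByKey[columnKey] = {
    key,
    label: column ? column.name : columnKey,
    chips: [],
  };
  const value = queryParams[key];
  (Array.isArray(value) ? value : [value]).forEach(v =>
    chipsByKey[columnKey].chips.push(v.toString())
  );
});

console.log(chipsByKey.name.chips); // ['demo']
console.log(chipsByKey.status.chips); // ['failed', 'successful']
```

Each entry carries the original query-param key (`name__icontains`, `or__status`), which is what `deleteChip` passes back to `onRemove` in the component.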
@@ -1,4 +1,8 @@
import React from 'react';
import {
DataToolbar,
DataToolbarContent,
} from '@patternfly/react-core/dist/umd/experimental';
import { mountWithContexts } from '../../../testUtils/enzymeHelpers';
import Search from './Search';

@@ -19,9 +23,7 @@ describe('<Search />', () => {
});

test('it triggers the expected callbacks', () => {
const columns = [
{ name: 'Name', key: 'name', isSortable: true, isSearchable: true },
];
const columns = [{ name: 'Name', key: 'name', isDefault: true }];

const searchBtn = 'button[aria-label="Search submit button"]';
const searchTextInput = 'input[aria-label="Search text input"]';
@@ -29,12 +31,15 @@ describe('<Search />', () => {
const onSearch = jest.fn();

search = mountWithContexts(
<Search
qsConfig={QS_CONFIG}
sortedColumnKey="name"
columns={columns}
onSearch={onSearch}
/>
<DataToolbar
id={`${QS_CONFIG.namespace}-list-toolbar`}
clearAllFilters={() => {}}
collapseListedFiltersBreakpoint="md"
>
<DataToolbarContent>
<Search qsConfig={QS_CONFIG} columns={columns} onSearch={onSearch} />
</DataToolbarContent>
</DataToolbar>
);

search.find(searchTextInput).instance().value = 'test-321';
@@ -46,17 +51,18 @@ describe('<Search />', () => {
});

test('handleDropdownToggle properly updates state', async () => {
const columns = [
{ name: 'Name', key: 'name', isSortable: true, isSearchable: true },
];
const columns = [{ name: 'Name', key: 'name', isDefault: true }];
const onSearch = jest.fn();
const wrapper = mountWithContexts(
<Search
qsConfig={QS_CONFIG}
sortedColumnKey="name"
columns={columns}
onSearch={onSearch}
/>
<DataToolbar
id={`${QS_CONFIG.namespace}-list-toolbar`}
clearAllFilters={() => {}}
collapseListedFiltersBreakpoint="md"
>
<DataToolbarContent>
<Search qsConfig={QS_CONFIG} columns={columns} onSearch={onSearch} />
</DataToolbarContent>
</DataToolbar>
).find('Search');
expect(wrapper.state('isSearchDropdownOpen')).toEqual(false);
wrapper.instance().handleDropdownToggle(true);
@@ -65,22 +71,20 @@ describe('<Search />', () => {

test('handleDropdownSelect properly updates state', async () => {
const columns = [
{ name: 'Name', key: 'name', isSortable: true, isSearchable: true },
{
name: 'Description',
key: 'description',
isSortable: true,
isSearchable: true,
},
{ name: 'Name', key: 'name', isDefault: true },
{ name: 'Description', key: 'description' },
];
const onSearch = jest.fn();
const wrapper = mountWithContexts(
<Search
qsConfig={QS_CONFIG}
sortedColumnKey="name"
columns={columns}
onSearch={onSearch}
/>
<DataToolbar
id={`${QS_CONFIG.namespace}-list-toolbar`}
clearAllFilters={() => {}}
collapseListedFiltersBreakpoint="md"
>
<DataToolbarContent>
<Search qsConfig={QS_CONFIG} columns={columns} onSearch={onSearch} />
</DataToolbarContent>
</DataToolbar>
).find('Search');
expect(wrapper.state('searchKey')).toEqual('name');
wrapper
@@ -1,75 +1,65 @@
import React from 'react';
import React, { Fragment } from 'react';
import PropTypes from 'prop-types';
import { withI18n } from '@lingui/react';
import { withRouter } from 'react-router-dom';
import { t } from '@lingui/macro';
import {
Button,
Dropdown as PFDropdown,
ButtonVariant,
Dropdown,
DropdownPosition,
DropdownToggle,
DropdownItem,
Tooltip,
InputGroup,
} from '@patternfly/react-core';
import {
SortAlphaDownIcon,
SortAlphaUpIcon,
SortAlphaDownAltIcon,
SortNumericDownIcon,
SortNumericUpIcon,
SortNumericDownAltIcon,
} from '@patternfly/react-icons';

import { parseQueryString } from '@util/qs';
import { SortColumns, QSConfig } from '@types';
import styled from 'styled-components';

const Dropdown = styled(PFDropdown)`
&&& {
> button {
min-height: 30px;
min-width: 70px;
height: 30px;
padding: 0 10px;
margin: 0px;

> span {
/* text element within dropdown */
width: auto;
}

> svg {
/* caret icon */
margin: 0px;
padding-top: 3px;
padding-left: 3px;
}
}
}
`;

const IconWrapper = styled.span`
> svg {
font-size: 18px;
}
`;

const SortButton = styled(Button)`
padding: 5px 8px;
margin-top: 3px;

&:hover {
background-color: #0166cc;
color: white;
}
`;

const SortBy = styled.span`
margin-right: 15px;
font-size: var(--pf-global--FontSize--md);
const NoOptionDropdown = styled.div`
align-self: stretch;
border: 1px solid var(--pf-global--BorderColor--300);
padding: 5px 15px;
white-space: nowrap;
border-bottom-color: var(--pf-global--BorderColor--200);
`;

class Sort extends React.Component {
constructor(props) {
super(props);

let sortKey;
let sortOrder;
let isNumeric;

const { qsConfig, location } = this.props;
const queryParams = parseQueryString(qsConfig, location.search);
if (queryParams.order_by && queryParams.order_by.startsWith('-')) {
sortKey = queryParams.order_by.substr(1);
sortOrder = 'descending';
} else if (queryParams.order_by) {
sortKey = queryParams.order_by;
sortOrder = 'ascending';
}

if (qsConfig.integerFields.find(field => field === sortKey)) {
isNumeric = true;
} else {
isNumeric = false;
}

this.state = {
isSortDropdownOpen: false,
sortKey,
sortOrder,
isNumeric,
};

this.handleDropdownToggle = this.handleDropdownToggle.bind(this);
@@ -82,34 +72,42 @@ class Sort extends React.Component {
}

handleDropdownSelect({ target }) {
const { columns, onSort, sortOrder } = this.props;
const { columns, onSort, qsConfig } = this.props;
const { sortOrder } = this.state;
const { innerText } = target;

const [{ key: searchKey }] = columns.filter(
({ name }) => name === innerText
);
const [{ key: sortKey }] = columns.filter(({ name }) => name === innerText);

this.setState({ isSortDropdownOpen: false });
onSort(searchKey, sortOrder);
let isNumeric;

if (qsConfig.integerFields.find(field => field === sortKey)) {
isNumeric = true;
} else {
isNumeric = false;
}

this.setState({ isSortDropdownOpen: false, sortKey, isNumeric });
onSort(sortKey, sortOrder);
}

handleSort() {
const { onSort, sortedColumnKey, sortOrder } = this.props;
const { onSort } = this.props;
const { sortKey, sortOrder } = this.state;
const newSortOrder = sortOrder === 'ascending' ? 'descending' : 'ascending';

onSort(sortedColumnKey, newSortOrder);
this.setState({ sortOrder: newSortOrder });
onSort(sortKey, newSortOrder);
}

render() {
const { up } = DropdownPosition;
const { columns, sortedColumnKey, sortOrder, i18n } = this.props;
const { isSortDropdownOpen } = this.state;
const [{ name: sortedColumnName, isNumeric }] = columns.filter(
({ key }) => key === sortedColumnKey
const { columns, i18n } = this.props;
const { isSortDropdownOpen, sortKey, sortOrder, isNumeric } = this.state;
const [{ name: sortedColumnName }] = columns.filter(
({ key }) => key === sortKey
);

const sortDropdownItems = columns
.filter(({ key, isSortable }) => isSortable && key !== sortedColumnKey)
.filter(({ key }) => key !== sortKey)
.map(({ key, name }) => (
<DropdownItem key={key} component="button">
{name}
@@ -119,65 +117,57 @@ class Sort extends React.Component {
let SortIcon;
if (isNumeric) {
SortIcon =
sortOrder === 'ascending' ? SortNumericUpIcon : SortNumericDownIcon;
sortOrder === 'ascending'
? SortNumericDownIcon
: SortNumericDownAltIcon;
} else {
SortIcon =
sortOrder === 'ascending' ? SortAlphaUpIcon : SortAlphaDownIcon;
sortOrder === 'ascending' ? SortAlphaDownIcon : SortAlphaDownAltIcon;
}

return (
<React.Fragment>
{sortDropdownItems.length > 0 && (
<React.Fragment>
<SortBy>{i18n._(t`Sort By`)}</SortBy>
<Dropdown
style={{ marginRight: '10px' }}
onToggle={this.handleDropdownToggle}
onSelect={this.handleDropdownSelect}
direction={up}
isOpen={isSortDropdownOpen}
toggle={
<DropdownToggle
id="awx-sort"
onToggle={this.handleDropdownToggle}
>
{sortedColumnName}
</DropdownToggle>
}
dropdownItems={sortDropdownItems}
/>
</React.Fragment>
<Fragment>
{sortedColumnName && (
<InputGroup>
{(sortDropdownItems.length > 0 && (
<Dropdown
onToggle={this.handleDropdownToggle}
onSelect={this.handleDropdownSelect}
direction={up}
isOpen={isSortDropdownOpen}
toggle={
<DropdownToggle
id="awx-sort"
onToggle={this.handleDropdownToggle}
>
{sortedColumnName}
</DropdownToggle>
}
dropdownItems={sortDropdownItems}
/>
)) || <NoOptionDropdown>{sortedColumnName}</NoOptionDropdown>}
<Button
variant={ButtonVariant.control}
aria-label={i18n._(t`Sort`)}
onClick={this.handleSort}
>
<SortIcon />
</Button>
</InputGroup>
)}
<Tooltip
content={<div>{i18n._(t`Reverse Sort Order`)}</div>}
position="top"
>
<SortButton
onClick={this.handleSort}
variant="plain"
aria-label={i18n._(t`Sort`)}
>
<IconWrapper>
<SortIcon style={{ verticalAlign: '-0.225em' }} />
</IconWrapper>
</SortButton>
</Tooltip>
</React.Fragment>
</Fragment>
);
}
}

Sort.propTypes = {
columns: PropTypes.arrayOf(PropTypes.object).isRequired,
qsConfig: QSConfig.isRequired,
columns: SortColumns.isRequired,
onSort: PropTypes.func,
sortOrder: PropTypes.string,
sortedColumnKey: PropTypes.string,
};

Sort.defaultProps = {
onSort: null,
sortOrder: 'ascending',
sortedColumnKey: 'name',
};

export default withI18n()(Sort);
export default withI18n()(withRouter(Sort));
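The Sort constructor above derives `sortKey`, `sortOrder`, and `isNumeric` from the `order_by` query param: a leading `-` means descending, and membership in `qsConfig.integerFields` selects the numeric icon set. A small standalone sketch of that derivation (`sortStateFromOrderBy` is a hypothetical helper name, not part of the component):

```javascript
// Sketch of how Sort derives its initial state from order_by.
const qsConfig = {
  integerFields: ['page', 'page_size', 'id'], // assumed config shape
};

function sortStateFromOrderBy(orderBy, config) {
  // A leading '-' marks a descending sort, mirroring the constructor.
  const descending = orderBy.startsWith('-');
  const sortKey = descending ? orderBy.substr(1) : orderBy;
  return {
    sortKey,
    sortOrder: descending ? 'descending' : 'ascending',
    isNumeric: Boolean(config.integerFields.find(f => f === sortKey)),
  };
}

console.log(sortStateFromOrderBy('-id', qsConfig));
// { sortKey: 'id', sortOrder: 'descending', isNumeric: true }
console.log(sortStateFromOrderBy('name', qsConfig));
// { sortKey: 'name', sortOrder: 'ascending', isNumeric: false }
```

This is the same mapping the tests below exercise via `defaultParams.order_by` values like `'foo'` and `'-foo'`.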
@@ -12,8 +12,17 @@ describe('<Sort />', () => {
});

test('it triggers the expected callbacks', () => {
const qsConfig = {
namespace: 'item',
defaultParams: { page: 1, page_size: 5, order_by: 'name' },
integerFields: ['page', 'page_size'],
};

const columns = [
{ name: 'Name', key: 'name', isSortable: true, isSearchable: true },
{
name: 'Name',
key: 'name',
},
];

const sortBtn = 'button[aria-label="Sort"]';
@@ -21,12 +30,7 @@ describe('<Sort />', () => {
const onSort = jest.fn();

const wrapper = mountWithContexts(
<Sort
sortedColumnKey="name"
sortOrder="ascending"
columns={columns}
onSort={onSort}
/>
<Sort qsConfig={qsConfig} columns={columns} onSort={onSort} />
).find('Sort');

wrapper.find(sortBtn).simulate('click');
@@ -36,22 +40,31 @@ describe('<Sort />', () => {
});

test('onSort properly passes back descending when ascending was passed as prop', () => {
const multipleColumns = [
{ name: 'Foo', key: 'foo', isSortable: true },
{ name: 'Bar', key: 'bar', isSortable: true },
{ name: 'Bakery', key: 'bakery', isSortable: true },
{ name: 'Baz', key: 'baz' },
const qsConfig = {
namespace: 'item',
defaultParams: { page: 1, page_size: 5, order_by: 'foo' },
integerFields: ['page', 'page_size'],
};

const columns = [
{
name: 'Foo',
key: 'foo',
},
{
name: 'Bar',
key: 'bar',
},
{
name: 'Bakery',
key: 'bakery',
},
];

const onSort = jest.fn();

const wrapper = mountWithContexts(
<Sort
sortedColumnKey="foo"
sortOrder="ascending"
columns={multipleColumns}
onSort={onSort}
/>
<Sort qsConfig={qsConfig} columns={columns} onSort={onSort} />
).find('Sort');
const sortDropdownToggle = wrapper.find('Button');
expect(sortDropdownToggle.length).toBe(1);
@@ -60,22 +73,31 @@ describe('<Sort />', () => {
});

test('onSort properly passes back ascending when descending was passed as prop', () => {
const multipleColumns = [
{ name: 'Foo', key: 'foo', isSortable: true },
{ name: 'Bar', key: 'bar', isSortable: true },
{ name: 'Bakery', key: 'bakery', isSortable: true },
{ name: 'Baz', key: 'baz' },
const qsConfig = {
namespace: 'item',
defaultParams: { page: 1, page_size: 5, order_by: '-foo' },
integerFields: ['page', 'page_size'],
};

const columns = [
{
name: 'Foo',
key: 'foo',
},
{
name: 'Bar',
key: 'bar',
},
{
name: 'Bakery',
key: 'bakery',
},
];

const onSort = jest.fn();

const wrapper = mountWithContexts(
<Sort
sortedColumnKey="foo"
sortOrder="descending"
columns={multipleColumns}
onSort={onSort}
/>
<Sort qsConfig={qsConfig} columns={columns} onSort={onSort} />
).find('Sort');
const sortDropdownToggle = wrapper.find('Button');
expect(sortDropdownToggle.length).toBe(1);
@@ -84,22 +106,31 @@ describe('<Sort />', () => {
});

test('Changing dropdown correctly passes back new sort key', () => {
const multipleColumns = [
{ name: 'Foo', key: 'foo', isSortable: true },
{ name: 'Bar', key: 'bar', isSortable: true },
{ name: 'Bakery', key: 'bakery', isSortable: true },
{ name: 'Baz', key: 'baz' },
const qsConfig = {
namespace: 'item',
defaultParams: { page: 1, page_size: 5, order_by: 'foo' },
integerFields: ['page', 'page_size'],
};

const columns = [
{
name: 'Foo',
key: 'foo',
},
{
name: 'Bar',
key: 'bar',
},
{
name: 'Bakery',
key: 'bakery',
},
];

const onSort = jest.fn();

const wrapper = mountWithContexts(
<Sort
sortedColumnKey="foo"
sortOrder="ascending"
columns={multipleColumns}
onSort={onSort}
/>
<Sort qsConfig={qsConfig} columns={columns} onSort={onSort} />
).find('Sort');

wrapper.instance().handleDropdownSelect({ target: { innerText: 'Bar' } });
@@ -107,22 +138,31 @@ describe('<Sort />', () => {
});

test('Opening dropdown correctly updates state', () => {
const multipleColumns = [
{ name: 'Foo', key: 'foo', isSortable: true },
{ name: 'Bar', key: 'bar', isSortable: true },
{ name: 'Bakery', key: 'bakery', isSortable: true },
{ name: 'Baz', key: 'baz' },
const qsConfig = {
namespace: 'item',
defaultParams: { page: 1, page_size: 5, order_by: 'foo' },
integerFields: ['page', 'page_size'],
};

const columns = [
{
name: 'Foo',
key: 'foo',
},
{
name: 'Bar',
key: 'bar',
},
{
name: 'Bakery',
key: 'bakery',
},
];

const onSort = jest.fn();

const wrapper = mountWithContexts(
<Sort
sortedColumnKey="foo"
sortOrder="ascending"
columns={multipleColumns}
onSort={onSort}
/>
<Sort qsConfig={qsConfig} columns={columns} onSort={onSort} />
).find('Sort');
expect(wrapper.state('isSortDropdownOpen')).toEqual(false);
wrapper.instance().handleDropdownToggle(true);
@@ -130,65 +170,70 @@ describe('<Sort />', () => {
});

test('It displays correct sort icon', () => {
const downNumericIconSelector = 'SortNumericDownIcon';
const upNumericIconSelector = 'SortNumericUpIcon';
const downAlphaIconSelector = 'SortAlphaDownIcon';
const upAlphaIconSelector = 'SortAlphaUpIcon';
const forwardNumericIconSelector = 'SortNumericDownIcon';
const reverseNumericIconSelector = 'SortNumericDownAltIcon';
const forwardAlphaIconSelector = 'SortAlphaDownIcon';
const reverseAlphaIconSelector = 'SortAlphaDownAltIcon';

const numericColumns = [
{ name: 'ID', key: 'id', isSortable: true, isNumeric: true },
];
const alphaColumns = [
{ name: 'Name', key: 'name', isSortable: true, isNumeric: false },
];
const qsConfigNumDown = {
namespace: 'item',
defaultParams: { page: 1, page_size: 5, order_by: '-id' },
integerFields: ['page', 'page_size', 'id'],
};
const qsConfigNumUp = {
namespace: 'item',
defaultParams: { page: 1, page_size: 5, order_by: 'id' },
integerFields: ['page', 'page_size', 'id'],
};
const qsConfigAlphaDown = {
namespace: 'item',
defaultParams: { page: 1, page_size: 5, order_by: '-name' },
integerFields: ['page', 'page_size'],
};
const qsConfigAlphaUp = {
namespace: 'item',
defaultParams: { page: 1, page_size: 5, order_by: 'name' },
integerFields: ['page', 'page_size'],
};

const numericColumns = [{ name: 'ID', key: 'id' }];
const alphaColumns = [{ name: 'Name', key: 'name' }];
const onSort = jest.fn();

sort = mountWithContexts(
<Sort
sortedColumnKey="id"
sortOrder="descending"
qsConfig={qsConfigNumDown}
columns={numericColumns}
onSort={onSort}
/>
);

const downNumericIcon = sort.find(downNumericIconSelector);
expect(downNumericIcon.length).toBe(1);
const reverseNumericIcon = sort.find(reverseNumericIconSelector);
expect(reverseNumericIcon.length).toBe(1);

sort = mountWithContexts(
<Sort
sortedColumnKey="id"
sortOrder="ascending"
columns={numericColumns}
onSort={onSort}
/>
<Sort qsConfig={qsConfigNumUp} columns={numericColumns} onSort={onSort} />
);

const upNumericIcon = sort.find(upNumericIconSelector);
expect(upNumericIcon.length).toBe(1);
const forwardNumericIcon = sort.find(forwardNumericIconSelector);
expect(forwardNumericIcon.length).toBe(1);

sort = mountWithContexts(
<Sort
sortedColumnKey="name"
sortOrder="descending"
qsConfig={qsConfigAlphaDown}
columns={alphaColumns}
onSort={onSort}
/>
);

const downAlphaIcon = sort.find(downAlphaIconSelector);
expect(downAlphaIcon.length).toBe(1);
const reverseAlphaIcon = sort.find(reverseAlphaIconSelector);
expect(reverseAlphaIcon.length).toBe(1);

sort = mountWithContexts(
<Sort
sortedColumnKey="name"
sortOrder="ascending"
columns={alphaColumns}
onSort={onSort}
/>
<Sort qsConfig={qsConfigAlphaUp} columns={alphaColumns} onSort={onSort} />
);

const upAlphaIcon = sort.find(upAlphaIconSelector);
expect(upAlphaIcon.length).toBe(1);
const forwardAlphaIcon = sort.find(forwardAlphaIconSelector);
expect(forwardAlphaIcon.length).toBe(1);
});
});
@@ -5,7 +5,7 @@ const Separator = styled.span`
display: inline-block;
width: 1px;
height: 30px;
margin-right: 20px;
margin-right: 27px;
margin-left: 20px;
background-color: #d7d7d7;
vertical-align: middle;
@@ -0,0 +1,14 @@
import React from 'react';
import { Card, CardBody, PageSection } from '@patternfly/react-core';

function CredentialAdd() {
return (
<PageSection>
<Card>
<CardBody>Coming soon :)</CardBody>
</Card>
</PageSection>
);
}

export default CredentialAdd;
@@ -0,0 +1 @@
export { default } from './CredentialAdd';
@@ -0,0 +1,170 @@
import React, { useState, useEffect } from 'react';
import { useLocation } from 'react-router-dom';
import { withI18n } from '@lingui/react';
import { t } from '@lingui/macro';
import { CredentialsAPI } from '@api';
import { Card, PageSection } from '@patternfly/react-core';
import AlertModal from '@components/AlertModal';
import ErrorDetail from '@components/ErrorDetail';
import DataListToolbar from '@components/DataListToolbar';
import PaginatedDataList, {
ToolbarAddButton,
ToolbarDeleteButton,
} from '@components/PaginatedDataList';
import { getQSConfig, parseQueryString } from '@util/qs';
import { CredentialListItem } from '.';

const QS_CONFIG = getQSConfig('credential', {
page: 1,
page_size: 20,
order_by: 'name',
});

function CredentialList({ i18n }) {
const [actions, setActions] = useState(null);
const [contentError, setContentError] = useState(null);
const [credentialCount, setCredentialCount] = useState(0);
const [credentials, setCredentials] = useState([]);
const [deletionError, setDeletionError] = useState(null);
const [hasContentLoading, setHasContentLoading] = useState(true);
const [selected, setSelected] = useState([]);

const location = useLocation();

const loadCredentials = async ({ search }) => {
const params = parseQueryString(QS_CONFIG, search);
setContentError(null);
setHasContentLoading(true);
try {
const [
{
data: { count, results },
},
{
data: { actions: optionActions },
},
] = await Promise.all([
CredentialsAPI.read(params),
loadCredentialActions(),
]);

setCredentials(results);
setCredentialCount(count);
setActions(optionActions);
} catch (error) {
setContentError(error);
} finally {
setHasContentLoading(false);
}
};

useEffect(() => {
loadCredentials(location);
}, [location]); // eslint-disable-line react-hooks/exhaustive-deps

const loadCredentialActions = () => {
if (actions) {
return Promise.resolve({ data: { actions } });
}
return CredentialsAPI.readOptions();
};

const handleSelectAll = isSelected => {
setSelected(isSelected ? [...credentials] : []);
};

const handleSelect = row => {
if (selected.some(s => s.id === row.id)) {
setSelected(selected.filter(s => s.id !== row.id));
} else {
setSelected(selected.concat(row));
}
};

const handleDelete = async () => {
setHasContentLoading(true);

try {
await Promise.all(
selected.map(credential => CredentialsAPI.destroy(credential.id))
);
} catch (error) {
setDeletionError(error);
}

const params = parseQueryString(QS_CONFIG, location.search);
try {
const {
data: { count, results },
} = await CredentialsAPI.read(params);

setCredentials(results);
setCredentialCount(count);
setSelected([]);
} catch (error) {
setContentError(error);
}

setHasContentLoading(false);
};

const canAdd =
actions && Object.prototype.hasOwnProperty.call(actions, 'POST');
const isAllSelected =
selected.length > 0 && selected.length === credentials.length;

return (
<PageSection>
<Card>
<PaginatedDataList
contentError={contentError}
hasContentLoading={hasContentLoading}
items={credentials}
itemCount={credentialCount}
qsConfig={QS_CONFIG}
onRowClick={handleSelect}
renderItem={item => (
<CredentialListItem
key={item.id}
credential={item}
detailUrl={`/credentials/${item.id}/details`}
isSelected={selected.some(row => row.id === item.id)}
onSelect={() => handleSelect(item)}
/>
)}
renderToolbar={props => (
<DataListToolbar
{...props}
showSelectAll
isAllSelected={isAllSelected}
onSelectAll={handleSelectAll}
qsConfig={QS_CONFIG}
additionalControls={[
<ToolbarDeleteButton
key="delete"
onDelete={handleDelete}
itemsToDelete={selected}
pluralizedItemName={i18n._(t`Credentials`)}
/>,
canAdd && (
<ToolbarAddButton key="add" linkTo="/credentials/add" />
),
]}
/>
)}
/>
</Card>
<AlertModal
isOpen={deletionError}
variant="danger"
title={i18n._(t`Error!`)}
onClose={() => setDeletionError(null)}
>
{i18n._(t`Failed to delete one or more credentials.`)}
<ErrorDetail error={deletionError} />
</AlertModal>
</PageSection>
|
||||
);
|
||||
}
|
||||
|
||||
export default withI18n()(CredentialList);
|
||||
@@ -0,0 +1,139 @@
import React from 'react';
import { act } from 'react-dom/test-utils';
import { CredentialsAPI } from '@api';
import { mountWithContexts, waitForElement } from '@testUtils/enzymeHelpers';
import { CredentialList } from '.';
import mockCredentials from '../shared';

jest.mock('@api');

describe('<CredentialList />', () => {
  let wrapper;

  beforeEach(async () => {
    CredentialsAPI.read.mockResolvedValueOnce({ data: mockCredentials });
    CredentialsAPI.readOptions.mockResolvedValue({
      data: {
        actions: {
          GET: {},
          POST: {},
        },
      },
    });

    await act(async () => {
      wrapper = mountWithContexts(<CredentialList />);
    });

    await waitForElement(wrapper, 'ContentLoading', el => el.length === 0);
  });

  afterEach(() => {
    jest.clearAllMocks();
    wrapper.unmount();
  });

  test('initially renders successfully', () => {
    expect(wrapper.find('CredentialList').length).toBe(1);
  });

  test('should fetch credentials from api and render them in the list', () => {
    expect(CredentialsAPI.read).toHaveBeenCalled();
    expect(wrapper.find('CredentialListItem').length).toBe(5);
  });

  test('should show content error if credentials are not successfully fetched from api', async () => {
    CredentialsAPI.readOptions.mockImplementationOnce(() =>
      Promise.reject(new Error())
    );
    await act(async () => {
      wrapper = mountWithContexts(<CredentialList />);
    });
    await waitForElement(wrapper, 'ContentError', el => el.length === 1);
  });

  test('should check and uncheck the row item', async () => {
    expect(
      wrapper.find('PFDataListCheck[id="select-credential-1"]').props().checked
    ).toBe(false);
    await act(async () => {
      wrapper
        .find('PFDataListCheck[id="select-credential-1"]')
        .invoke('onChange')(true);
    });
    wrapper.update();
    expect(
      wrapper.find('PFDataListCheck[id="select-credential-1"]').props().checked
    ).toBe(true);
    await act(async () => {
      wrapper
        .find('PFDataListCheck[id="select-credential-1"]')
        .invoke('onChange')(false);
    });
    wrapper.update();
    expect(
      wrapper.find('PFDataListCheck[id="select-credential-1"]').props().checked
    ).toBe(false);
  });

  test('should check all row items when select all is checked', async () => {
    wrapper.find('PFDataListCheck').forEach(el => {
      expect(el.props().checked).toBe(false);
    });
    await act(async () => {
      wrapper.find('Checkbox#select-all').invoke('onChange')(true);
    });
    wrapper.update();
    wrapper.find('PFDataListCheck').forEach(el => {
      expect(el.props().checked).toBe(true);
    });
    await act(async () => {
      wrapper.find('Checkbox#select-all').invoke('onChange')(false);
    });
    wrapper.update();
    wrapper.find('PFDataListCheck').forEach(el => {
      expect(el.props().checked).toBe(false);
    });
  });

  test('should call api delete credentials for each selected credential', async () => {
    CredentialsAPI.read.mockResolvedValueOnce({ data: mockCredentials });
    CredentialsAPI.destroy = jest.fn();

    await act(async () => {
      wrapper
        .find('PFDataListCheck[id="select-credential-3"]')
        .invoke('onChange')();
    });
    wrapper.update();
    await act(async () => {
      wrapper.find('ToolbarDeleteButton').invoke('onDelete')();
    });
    wrapper.update();
    expect(CredentialsAPI.destroy).toHaveBeenCalledTimes(1);
  });

  test('should show error modal when credential is not successfully deleted from api', async () => {
    CredentialsAPI.destroy.mockImplementationOnce(() =>
      Promise.reject(new Error())
    );
    await act(async () => {
      wrapper
        .find('PFDataListCheck[id="select-credential-2"]')
        .invoke('onChange')();
    });
    wrapper.update();
    await act(async () => {
      wrapper.find('ToolbarDeleteButton').invoke('onDelete')();
    });
    await waitForElement(
      wrapper,
      'Modal',
      el => el.props().isOpen === true && el.props().title === 'Error!'
    );
    await act(async () => {
      wrapper.find('ModalBoxCloseButton').invoke('onClose')();
    });
    await waitForElement(wrapper, 'Modal', el => el.props().isOpen === false);
  });
});
|
||||