Compare commits

...

95 Commits

Author SHA1 Message Date
Sarah Akus
9df447fe75 Merge pull request #12778 from keithjgrant/12542-schedule-exceptions
Schedule exceptions
2022-09-15 16:28:56 -04:00
Keith J. Grant
7e7991bb63 adjust DetailList spacing when two appear in succession 2022-09-15 09:37:03 -07:00
Keith J. Grant
35e9d00beb improve frequency validation performance 2022-09-14 15:33:00 -07:00
Elijah DeLee
461b5221f3 Add graphs for job event processing to dashboard 2022-09-14 16:23:53 -04:00
Elijah DeLee
10d06f219d add alerting rule to grafana
This rule alerts if the redis queue is larger than the rolling average
event insertion rate per second multiplied by 120. In other words, it fires if
the redis queue holds more events than we appear to be able to process in two minutes.

It appears it has to meet this condition for 60 seconds to start firing.

Future commits will address how to configure contact points like slack.

Shout out to @jainnikhil30 and @rebeccahhh, who figured this out in a jam
session this morning.
2022-09-14 16:23:53 -04:00
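A minimal sketch of the alerting condition described in the commit above (illustrative Python only, not the actual Grafana/Prometheus rule added by the commit; the metric names are hypothetical):

def redis_backlog_alert(queue_depth, avg_insertions_per_sec, window_seconds=120):
    # Fire when the redis queue holds more events than the rolling average
    # insertion rate suggests we can process within the window (two minutes).
    return queue_depth > avg_insertions_per_sec * window_seconds

# Example: at ~500 insertions/s, a backlog of 70,000 events exceeds
# 500 * 120 = 60,000, so the condition is met; per the commit, it must then
# hold for 60 seconds before the alert starts firing.
print(redis_backlog_alert(70_000, 500))  # True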
Jessica Steurer
a227fea5ef Merge pull request #12868 from keithjgrant/12853-ws-event-duplication
Don't add ws events twice to job output
2022-09-14 16:02:07 -03:00
Jessica Steurer
3f4d0bc15d Merge pull request #12788 from AlexSCorey/5941-Translations
Ensures that strings in helpText files do not miss being translated
2022-09-14 12:02:51 -03:00
Rick Elrod
0812425671 [ui] Minor tweak to capitalize GPG properly (#12734)
"GPG Public Key", not "Gpg Public Key"

Signed-off-by: Rick Elrod <rick@elrod.me>
2022-09-14 01:37:09 +00:00
Alex Corey
94344c0214 Merge pull request #12859 from AlexSCorey/updateCanIUse-lite
updates CanIUseLite
2022-09-13 13:48:20 -04:00
Keith J. Grant
16da9b784a add schedule integration test locators 2022-09-12 16:30:46 -07:00
Keith J. Grant
1e952bab95 fix error message on new schedules with no instances 2022-09-12 12:58:25 -07:00
Jake Jackson
484db004db Update Kind Docs (#12865)
* update kind docs formatting and update some commands

* add tested on fedora update
2022-09-12 13:04:04 -04:00
Alex Corey
7465d7685f updates CanIUseLite 2022-09-09 11:17:54 -04:00
Seth Foster
f0c125efb3 Merge pull request #12762 from akira6592/fix-doc-link
fix link to Patternfly style guide
2022-09-09 09:52:00 -04:00
Keith J. Grant
2d39b81e12 don't add ws events twice to job output 2022-09-08 16:09:02 -07:00
Akira Yokochi
1044d34d98 fix link on doc 2022-09-08 22:49:11 +00:00
Rick Elrod
63567fcc52 [sig validation] better error for job template run (#12735)
When launching a job template, if the last project update failed due to
signature validation, show an error that actually says that.

Signed-off-by: Rick Elrod <rick@elrod.me>
2022-09-08 02:13:41 -05:00
Matthew Jones
cea8c16064 Merge pull request #12724 from mtward/issue-11605
Fix: with the preserve_existing_hosts flag in the awx.awx.group module, adding a new host to an inventory group retains only 25 existing hosts (related: #11605)
2022-09-07 20:23:58 -04:00
John Westcott IV
e7c97923a3 Merge pull request #12785 from jangel97/devel
Fix list_instances command
* Change from modified to last seen
2022-09-07 14:48:38 -04:00
Keith J. Grant
078c3ae6d8 add schedule form validation to ensure at least one occurrence 2022-09-07 10:33:16 -07:00
Rick Elrod
1ab3dba476 Add "cryptography" kind to CredentialType (#12842)
This was missed when we landed #12813. Adds cryptography
kind to the CredentialType allowed kinds list, which now
produces the proper error message when attempting to PUT
to modify the managed credential type.

Signed-off-by: Rick Elrod <rick@elrod.me>
2022-09-07 12:22:47 -05:00
Alan Rominger
15964dc395 Merge pull request #11745 from AlanCoding/cancel_rework_no_close
Close database connections while processing job output
2022-09-06 15:45:29 -04:00
Keith Grant
b83b65da16 clear output follow mode flag on search (#12791) 2022-09-06 15:15:06 -04:00
Alan Rominger
430f1986c7 Merge pull request #12830 from AlanCoding/dev_stuff
Fix LDAP volume conditional, better metrics interval
2022-09-06 11:51:51 -04:00
Alex Corey
c589f8776c Fixes possible missed translation 2022-09-06 11:26:41 -04:00
Jose Angel Morena
82679ce9a3 replace modified by last_seen in heartbeat 2022-09-06 17:14:19 +02:00
Lila Yasin
6d2e28bfb0 [collection] Add GPG key information to inputs and credential types in documentation. (#12817) 2022-09-06 10:05:36 -05:00
Luiz Costa
7a4da5a8fa Add GPG credential support to awxkit 2022-09-06 10:05:36 -05:00
Rick Elrod
c475a7b6c0 [ui] make signature cred. field be project-global (#12695)
Rather than only allowing the signature credential to be specified on
projects using git, allow it to be specified on any project at all.

This moves the field to always show, and moves it out of the git
subform.

Signed-off-by: Rick Elrod <rick@elrod.me>
2022-09-06 10:05:36 -05:00
Rick Elrod
32bb603554 Update action plugin to use ansible-sign library
Signed-off-by: Rick Elrod <rick@elrod.me>
2022-09-06 10:05:36 -05:00
Rick Elrod
8d71292d1a Integrity checking on project sync
Signed-off-by: Rick Elrod <rick@elrod.me>
2022-09-06 10:05:36 -05:00
Veda Periwal
e896dc1aa7 Add Content Signature Validation Credential field to Projects Form page and Projects Detail page 2022-09-06 10:05:36 -05:00
Hao Liu
f5a2246817 add new managed credential type for gpg pub key
add new managed credential type for gpg pub key
add migration file to setup managed credential types to add the new credential type

Signed-off-by: Hao Liu <haoli@redhat.com>
2022-09-06 10:05:36 -05:00
Hao Liu
c467b6ea13 add signature_validation_credential to Project
add new column to `main_project` table
- `signature_validation_credential`

update project module for awx_collection
- added input arg for `signature_validation_credential`

Co-Authored-By: Lila Yasin  <89486372+djyasin@users.noreply.github.com>
2022-09-06 10:05:36 -05:00
Alex Corey
1636f6b196 Merge pull request #12835 from ansible/dependabot/npm_and_yarn/awx/ui/devel/patternfly/patternfly-4.210.2
Bump @patternfly/patternfly from 4.202.1 to 4.210.2 in /awx/ui
2022-09-06 10:33:00 -04:00
Alex Corey
5da528ffbb Merge pull request #12834 from ansible/dependabot/npm_and_yarn/awx/ui/devel/ace-builds-1.10.1
Bump ace-builds from 1.8.1 to 1.10.1 in /awx/ui
2022-09-06 10:30:46 -04:00
Alex Corey
2e65ae49a5 Merge pull request #12806 from ansible/dependabot/npm_and_yarn/awx/ui/devel/luxon-3.0.3
Bump luxon from 3.0.1 to 3.0.3 in /awx/ui
2022-09-06 10:15:08 -04:00
Alex Corey
d06bc815f8 Merge pull request #12807 from ansible/dependabot/npm_and_yarn/awx/ui/devel/dompurify-2.4.0
Bump dompurify from 2.3.10 to 2.4.0 in /awx/ui
2022-09-06 10:14:28 -04:00
dependabot[bot]
0290784f9b Bump @patternfly/patternfly from 4.202.1 to 4.210.2 in /awx/ui
Bumps [@patternfly/patternfly](https://github.com/patternfly/patternfly) from 4.202.1 to 4.210.2.
- [Release notes](https://github.com/patternfly/patternfly/releases)
- [Changelog](https://github.com/patternfly/patternfly/blob/main/RELEASE-NOTES.md)
- [Commits](https://github.com/patternfly/patternfly/compare/prerelease-v4.202.1...prerelease-v4.210.2)

---
updated-dependencies:
- dependency-name: "@patternfly/patternfly"
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2022-09-06 14:13:52 +00:00
dependabot[bot]
1cc52afc42 Bump ace-builds from 1.8.1 to 1.10.1 in /awx/ui
Bumps [ace-builds](https://github.com/ajaxorg/ace-builds) from 1.8.1 to 1.10.1.
- [Release notes](https://github.com/ajaxorg/ace-builds/releases)
- [Changelog](https://github.com/ajaxorg/ace-builds/blob/master/CHANGELOG.md)
- [Commits](https://github.com/ajaxorg/ace-builds/compare/v1.8.1...v1.10.1)

---
updated-dependencies:
- dependency-name: ace-builds
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2022-09-06 14:13:17 +00:00
Alex Corey
88f7f987cd Merge pull request #12810 from ansible/dependabot/npm_and_yarn/awx/ui/devel/patternfly/react-table-4.100.8
Bump @patternfly/react-table from 4.93.1 to 4.100.8 in /awx/ui
2022-09-06 10:12:01 -04:00
Alan Rominger
f512971991 Add project sync to job cancel chain 2022-09-05 22:29:19 -04:00
Alan Rominger
53de245877 Fix LDAP volume conditional, better metrics interval 2022-09-04 22:33:12 -04:00
Shane McDonald
749622427c Merge pull request #12825 from shanemcd/extend-includes
Extend black excludes instead of overriding
2022-09-02 15:40:41 -04:00
Alan Rominger
725d6fa896 Merge pull request #12820 from AlanCoding/five_seconds
Make the metrics default sampling interval 5s
2022-09-02 15:21:57 -04:00
Shane McDonald
a107bb684c Extend black excludes instead of overriding
By default it will ignore things in .gitignore, which we want
2022-09-02 15:11:45 -04:00
Alan Rominger
ccbc8ce7de Make the metrics default sampling interval 5s 2022-09-02 13:38:49 -04:00
Shane McDonald
260e1d4f2d Make static asset location consistent across all deployments (#12819) 2022-09-02 17:12:06 +00:00
Shane McDonald
1afa49f3ff Merge pull request #12632 from TheRealHaoLiu/kind-k8s-devel
Add documentation for running development environment in kind
2022-09-02 12:12:01 -04:00
Rick Elrod
6f88ea1dc7 Common Inventory slicing method for job slices
- Extract how slicing is done from Inventory#get_script_data and pull it
  into a new method, Inventory#get_sliced_hosts
- Make use of this method in Inventory#get_script_data
- Make use of this method in Job#_get_inventory_hosts (used by
  Job#start_job_fact_cache and Job#finish_job_fact_cache).

This fixes an issue (namely in Tower 4.1) where job slicing with fact
caching enabled doesn't save facts for all hosts.

Signed-off-by: Rick Elrod <rick@elrod.me>
2022-09-01 16:15:07 -05:00
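To make the slicing arithmetic concrete, here is a minimal standalone sketch of the stride slicing that Inventory#get_sliced_hosts factors out (host names are made up; the real method operates on a Django queryset, as the models diff further below shows):

def get_sliced_hosts(hosts, slice_number, slice_count):
    # Slice N of M keeps every M-th host, starting at offset N-1.
    if slice_count > 1 and slice_number > 0:
        offset = slice_number - 1
        hosts = hosts[offset::slice_count]
    return hosts

hosts = ['host1', 'host2', 'host3', 'host4', 'host5']
print(get_sliced_hosts(hosts, 1, 3))  # ['host1', 'host4']
print(get_sliced_hosts(hosts, 2, 3))  # ['host2', 'host5']
print(get_sliced_hosts(hosts, 3, 3))  # ['host3']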
Alan Rominger
c59bbdecdb Refactor canceling to work through messaging and signals, not database
If a cancel was attempted before, still allow attempting another cancel;
in this case, attempt to send the SIGTERM signal again.
Keep clicking, you might help!

Replace other cancel_callbacks with sigterm watcher
  adapt special inventory mechanism for this too

Get rid of the cancel_watcher method with exception in main thread

Handle academic case of sigterm race condition

Process cancelation as control signal

Fully connect cancel method and run_dispatcher to control

Never transition workflows directly to canceled, add logs
2022-09-01 15:20:31 -04:00
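A minimal sketch of the signal-flag pattern this refactor moves to (illustrative only; AWX's actual implementation lives in awx.main.tasks.signals and the dispatcher control channel shown in later diffs):

import signal

_sigterm_received = False

def _note_sigterm(signum, frame):
    # The dispatcher sends SIGTERM to the worker running a canceled task.
    global _sigterm_received
    _sigterm_received = True

signal.signal(signal.SIGTERM, _note_sigterm)

def signal_callback():
    # Cheap cancel check handed to ansible-runner as cancel_callback:
    # no per-poll database lookup of cancel_flag.
    return _sigterm_received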
Matthew Jones
f9428c10b9 Merge pull request #12803 from matburt/fix_cleanup_schedules
Fix an issue where default cleanup schedules only run once
2022-09-01 10:40:11 -04:00
dependabot[bot]
1ca054f43d Bump @patternfly/react-table from 4.93.1 to 4.100.8 in /awx/ui
Bumps [@patternfly/react-table](https://github.com/patternfly/patternfly-react) from 4.93.1 to 4.100.8.
- [Release notes](https://github.com/patternfly/patternfly-react/releases)
- [Commits](https://github.com/patternfly/patternfly-react/compare/@patternfly/react-table@4.93.1...@patternfly/react-table@4.100.8)

---
updated-dependencies:
- dependency-name: "@patternfly/react-table"
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2022-09-01 08:09:44 +00:00
dependabot[bot]
374f76b527 Bump dompurify from 2.3.10 to 2.4.0 in /awx/ui
Bumps [dompurify](https://github.com/cure53/DOMPurify) from 2.3.10 to 2.4.0.
- [Release notes](https://github.com/cure53/DOMPurify/releases)
- [Commits](https://github.com/cure53/DOMPurify/compare/2.3.10...2.4.0)

---
updated-dependencies:
- dependency-name: dompurify
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2022-09-01 08:08:10 +00:00
dependabot[bot]
f9dd5e0f1c Bump luxon from 3.0.1 to 3.0.3 in /awx/ui
Bumps [luxon](https://github.com/moment/luxon) from 3.0.1 to 3.0.3.
- [Release notes](https://github.com/moment/luxon/releases)
- [Changelog](https://github.com/moment/luxon/blob/master/CHANGELOG.md)
- [Commits](https://github.com/moment/luxon/compare/3.0.1...3.0.3)

---
updated-dependencies:
- dependency-name: luxon
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2022-09-01 08:07:33 +00:00
Matthew Jones
bb7509498e Fix an issue where default cleanup schedules only run once
This looks like an oversight that has existed for a long time. We intend to run these on a pretty regular basis
2022-08-31 20:10:20 -04:00
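The behavioral difference is easy to see with python-dateutil (a sketch assuming dateutil is available; the DTSTART value is made up):

from datetime import datetime
from dateutil.rrule import rrulestr

dtstart = datetime(2022, 9, 1, 4, 0)
once = rrulestr('FREQ=WEEKLY;INTERVAL=1;COUNT=1', dtstart=dtstart)
weekly = rrulestr('FREQ=WEEKLY;INTERVAL=1', dtstart=dtstart)

print(list(once))   # a single occurrence; the cleanup schedule never runs again
print(weekly.between(datetime(2022, 9, 1), datetime(2022, 10, 1)))  # one run per week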
Keith Grant
8a06ffbe15 poll for events processing completion (#12689) 2022-08-31 16:03:35 -04:00
Hao Liu
8ad948f268 Merge pull request #12797 from TheRealHaoLiu/remove-helm-from-dockerfile
remove helm from dockerfile template
2022-08-31 14:18:25 -04:00
Hao Liu
73f808dee7 remove helm from dockerfile template
Signed-off-by: Hao Liu <haoli@redhat.com>
2022-08-31 13:48:30 -04:00
Shane McDonald
fecab52f86 Merge pull request #12796 from shanemcd/fix-tests
Prevent openldap from getting downgraded during build
2022-08-31 13:34:04 -04:00
Shane McDonald
609c67d85e Prevent openldap from getting downgraded during build
We noticed that openldap was getting downgraded during the build, which caused our test suite to blow up: https://github.com/ansible/awx/runs/8118323342?check_suite_focus=true
2022-08-31 13:09:29 -04:00
Keith J. Grant
0005d249c0 update tests 2022-08-30 15:44:52 -07:00
Hao Liu
8828ea706e add make target for building custom awx kube image (#12789) 2022-08-30 20:19:36 +00:00
Shane McDonald
4070ef3f33 Merge pull request #12787 from shanemcd/pre-build-ui
Speed up image build when UI is pre-built on host
2022-08-30 15:51:43 -04:00
Keith Grant
39f6e2fa32 fix TypeError when config is undefined (#12697) 2022-08-30 15:11:45 -04:00
Shane McDonald
1dfdff4a9e Speed up image build when UI is pre-built on host 2022-08-30 12:36:25 -04:00
Alan Rominger
310e354164 Merge pull request #12769 from AlanCoding/self_conn
Fix sanity check to use the relevant active connection
2022-08-29 20:36:48 -04:00
Keith J. Grant
dda2931e60 fix exception frequency placeholder text 2022-08-29 13:43:49 -07:00
Alan Rominger
6d207d2490 Merge pull request #12754 from kdelee/fix_metrics_consumed_capacity
calculate consumed capacity in same way in metrics
2022-08-29 16:37:53 -04:00
Alan Rominger
01037fa561 Fix sanity check to use the relevant active connection 2022-08-29 16:33:07 -04:00
Alan Rominger
61f3e5cbed Merge pull request #12702 from AlanCoding/poll_cancel
Check exit conditions in loop waiting for project flock
2022-08-29 16:29:39 -04:00
Alan Rominger
44995e944a Merge pull request #12766 from AlanCoding/lazy_no_more
Revert "Merge pull request #12584 from AlanCoding/lazy_workers"
2022-08-29 16:06:50 -04:00
Keith J. Grant
4a92fcfc62 add schedule exceptions to details 2022-08-29 11:55:32 -07:00
Elijah DeLee
d3f15f5784 Merge pull request #4 from AlanCoding/elijah_metrics
Minor changes to instance loop structure
2022-08-29 14:33:46 -04:00
Alan Rominger
2437a84b48 Minor changes to instance loop structure 2022-08-29 14:28:50 -04:00
Shane McDonald
696f099940 Merge pull request #12749 from shanemcd/not-so-aggressive
Make error handling less aggressive when checking status of dispatcher task
2022-08-29 11:50:56 -04:00
Shane McDonald
3f0f538c40 Merge pull request #12759 from shanemcd/auto-prom
Automate bootstrapping of Prometheus in the development environment
2022-08-29 11:25:13 -04:00
Shane McDonald
66529d0f70 Automate bootstrapping of Prometheus in the development environment 2022-08-29 09:39:44 -04:00
Alan Rominger
974f845059 Revert "Merge pull request #12584 from AlanCoding/lazy_workers"
This reverts commit 64157f7207, reversing
changes made to 9e8ba6ca09.
2022-08-28 23:04:13 -04:00
Keith J. Grant
f6b3413a11 add schedule exemptions to form 2022-08-26 16:00:08 -07:00
Shane McDonald
b4ef687b60 Merge pull request #12760 from shanemcd/another-domino-falls
Fix browsable API in development environment
2022-08-26 17:43:37 -04:00
Shane McDonald
2ef531b2dc Fix browsable API in development environment
Fallout from https://github.com/ansible/awx/pull/12722
2022-08-26 17:19:16 -04:00
Elijah DeLee
125801ec5b add panel to grafana dashboard for capacity
also reorganize so there are two columns of panels, not
just one long skinny set of panels
2022-08-26 15:42:40 -04:00
Shane McDonald
691d9d7dc4 Merge pull request #12755 from shanemcd/fix-dev-env-admin-pw
Fix auto-generated dev env admin password
2022-08-26 13:33:43 -04:00
Shane McDonald
5ca898541f Fix auto-generated dev env admin password
Fallout from https://github.com/ansible/awx/pull/12753
2022-08-26 13:07:46 -04:00
Shane McDonald
24821ff030 Merge pull request #12753 from shanemcd/custom-dev-env-admin-pw
Allow for setting custom admin password in dev environment
2022-08-26 11:55:17 -04:00
Elijah DeLee
99815f8962 calculate consumed capacity in same way in metrics
We should be consistent about this. This also takes us from doing as many
queries to the UnifiedJob table as we have instances to doing 1 query to the
UnifiedJob table (and both approaches do 1 query to the Instances table)
2022-08-26 11:40:36 -04:00
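An illustrative sketch of the consolidation described above (field names follow the instance_info diff later in this compare; this is not the exact AWX code):

from collections import defaultdict

def consumed_capacity_by_node(active_tasks):
    # One pass over a single UnifiedJob query (status in 'running'/'waiting'),
    # bucketing task_impact per node, instead of one UnifiedJob query per Instance.
    consumed = defaultdict(int)
    for task in active_tasks:
        node = task.execution_node or task.controller_node
        consumed[node] += task.task_impact
    return consumed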
Shane McDonald
d752e6ce6d Allow for setting custom admin password in dev environment 2022-08-26 11:29:11 -04:00
Shane McDonald
457dd890cb Make error handling less aggressive when checking status of dispatcher task 2022-08-26 09:05:38 -04:00
Christian Adams
4fbf5e9e2f Merge pull request #12731 from rooftopcellist/fix-messages-target
Fix make target for compiling api strings
2022-08-24 17:01:43 -04:00
Christian M. Adams
687b4ac71d Fix make target for compiling api strings 2022-08-24 16:36:25 -04:00
John Westcott IV
a1b364f80c Configuring Keycloak to also do OIDC (#12700) 2022-08-24 07:08:39 -04:00
mtward
271938c5fc Update group.py 2022-08-23 15:06:11 -04:00
Alan Rominger
88bf03c6bf Check exit conditions in loop waiting for project flock 2022-08-19 16:08:56 -04:00
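A minimal sketch of the pattern this commit adds (illustrative only; the real change checks cancel_flag and the SIGTERM flag inside RunProjectUpdate's lock-acquisition loop, as a later diff shows):

import time

def wait_for_lock(try_acquire, is_canceled, poll_interval=1.0):
    # Poll for the project file lock, but stop waiting if the job is
    # canceled instead of blocking until the lock frees up.
    while not try_acquire():
        if is_canceled():
            return False
        time.sleep(poll_interval)
    return True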
Hao Liu
13fc845bcc develop AWX on MacOS using K8S
Add instructions for AWX development on macOS using a Kind cluster

Signed-off-by: Hao Liu <haoli@redhat.com>
2022-08-15 22:48:23 -04:00
129 changed files with 3001 additions and 897 deletions

View File

@@ -1,3 +1,2 @@
awx/ui/node_modules
Dockerfile
.git

.gitignore vendored
View File

@@ -153,9 +153,6 @@ use_dev_supervisor.txt
/sanity/
/awx_collection_build/
# Setup for metrics gathering
tools/prometheus/prometheus.yml
.idea/*
*.unison.tmp
*.#

View File

@@ -381,7 +381,8 @@ clean-ui:
awx/ui/node_modules:
NODE_OPTIONS=--max-old-space-size=6144 $(NPM_BIN) --prefix awx/ui --loglevel warn ci
$(UI_BUILD_FLAG_FILE): awx/ui/node_modules
$(UI_BUILD_FLAG_FILE):
$(MAKE) awx/ui/node_modules
$(PYTHON) tools/scripts/compilemessages.py
$(NPM_BIN) --prefix awx/ui --loglevel warn run compile-strings
$(NPM_BIN) --prefix awx/ui --loglevel warn run build
@@ -452,6 +453,11 @@ COMPOSE_OPTS ?=
CONTROL_PLANE_NODE_COUNT ?= 1
EXECUTION_NODE_COUNT ?= 2
MINIKUBE_CONTAINER_GROUP ?= false
EXTRA_SOURCES_ANSIBLE_OPTS ?=
ifneq ($(ADMIN_PASSWORD),)
EXTRA_SOURCES_ANSIBLE_OPTS := -e admin_password=$(ADMIN_PASSWORD) $(EXTRA_SOURCES_ANSIBLE_OPTS)
endif
docker-compose-sources: .git/hooks/pre-commit
@if [ $(MINIKUBE_CONTAINER_GROUP) = true ]; then\
@@ -469,7 +475,8 @@ docker-compose-sources: .git/hooks/pre-commit
-e enable_ldap=$(LDAP) \
-e enable_splunk=$(SPLUNK) \
-e enable_prometheus=$(PROMETHEUS) \
-e enable_grafana=$(GRAFANA)
-e enable_grafana=$(GRAFANA) $(EXTRA_SOURCES_ANSIBLE_OPTS)
docker-compose: awx/projects docker-compose-sources
@@ -558,12 +565,20 @@ Dockerfile.kube-dev: tools/ansible/roles/dockerfile/templates/Dockerfile.j2
-e template_dest=_build_kube_dev \
-e receptor_image=$(RECEPTOR_IMAGE)
## Build awx_kube_devel image for development on local Kubernetes environment.
awx-kube-dev-build: Dockerfile.kube-dev
DOCKER_BUILDKIT=1 docker build -f Dockerfile.kube-dev \
--build-arg BUILDKIT_INLINE_CACHE=1 \
--cache-from=$(DEV_DOCKER_TAG_BASE)/awx_kube_devel:$(COMPOSE_TAG) \
-t $(DEV_DOCKER_TAG_BASE)/awx_kube_devel:$(COMPOSE_TAG) .
## Build awx image for deployment on Kubernetes environment.
awx-kube-build: Dockerfile
DOCKER_BUILDKIT=1 docker build -f Dockerfile \
--build-arg VERSION=$(VERSION) \
--build-arg SETUPTOOLS_SCM_PRETEND_VERSION=$(VERSION) \
--build-arg HEADLESS=$(HEADLESS) \
-t $(DEV_DOCKER_TAG_BASE)/awx:$(COMPOSE_TAG) .
# Translation TASKS
# --------------------------------------
@@ -576,7 +591,7 @@ pot: $(UI_BUILD_FLAG_FILE)
po: $(UI_BUILD_FLAG_FILE)
$(NPM_BIN) --prefix awx/ui --loglevel warn run extract-strings -- --clean
LANG = "en-us"
LANG = "en_us"
## generate API django .pot .po
messages:
@if [ "$(VENV_BASE)" ]; then \

View File

@@ -154,6 +154,7 @@ SUMMARIZABLE_FK_FIELDS = {
'source_project': DEFAULT_SUMMARY_FIELDS + ('status', 'scm_type'),
'project_update': DEFAULT_SUMMARY_FIELDS + ('status', 'failed'),
'credential': DEFAULT_SUMMARY_FIELDS + ('kind', 'cloud', 'kubernetes', 'credential_type_id'),
'signature_validation_credential': DEFAULT_SUMMARY_FIELDS + ('kind', 'credential_type_id'),
'job': DEFAULT_SUMMARY_FIELDS + ('status', 'failed', 'elapsed', 'type', 'canceled_on'),
'job_template': DEFAULT_SUMMARY_FIELDS,
'workflow_job_template': DEFAULT_SUMMARY_FIELDS,
@@ -1470,6 +1471,7 @@ class ProjectSerializer(UnifiedJobTemplateSerializer, ProjectOptionsSerializer):
'allow_override',
'custom_virtualenv',
'default_environment',
'signature_validation_credential',
) + (
'last_update_failed',
'last_updated',
@@ -4195,6 +4197,15 @@ class JobLaunchSerializer(BaseSerializer):
elif template.project.status in ('error', 'failed'):
errors['playbook'] = _("Missing a revision to run due to failed project update.")
latest_update = template.project.project_updates.last()
if latest_update is not None and latest_update.failed:
failed_validation_tasks = latest_update.project_update_events.filter(
event='runner_on_failed',
play="Perform project signature/checksum verification",
)
if failed_validation_tasks:
errors['playbook'] = _("Last project update failed due to signature validation failure.")
# cannot run a playbook without an inventory
if template.inventory and template.inventory.pending_deletion is True:
errors['inventory'] = _("The inventory associated with this Job Template is being deleted.")

View File

@@ -16,6 +16,7 @@ from awx.conf.license import get_license
from awx.main.utils import get_awx_version, camelcase_to_underscore, datetime_hook
from awx.main import models
from awx.main.analytics import register
from awx.main.scheduler.task_manager_models import TaskManagerInstances
"""
This module is used to define metrics collected by awx.main.analytics.gather()
@@ -235,25 +236,25 @@ def projects_by_scm_type(since, **kwargs):
@register('instance_info', '1.2', description=_('Cluster topology and capacity'))
def instance_info(since, include_hostnames=False, **kwargs):
info = {}
instances = models.Instance.objects.values_list('hostname').values(
'uuid', 'version', 'capacity', 'cpu', 'memory', 'managed_by_policy', 'hostname', 'enabled'
)
for instance in instances:
consumed_capacity = sum(x.task_impact for x in models.UnifiedJob.objects.filter(execution_node=instance['hostname'], status__in=('running', 'waiting')))
# Use same method that the TaskManager does to compute consumed capacity without querying all running jobs for each Instance
active_tasks = models.UnifiedJob.objects.filter(status__in=['running', 'waiting']).only('task_impact', 'controller_node', 'execution_node')
tm_instances = TaskManagerInstances(active_tasks, instance_fields=['uuid', 'version', 'capacity', 'cpu', 'memory', 'managed_by_policy', 'enabled'])
for tm_instance in tm_instances.instances_by_hostname.values():
instance = tm_instance.obj
instance_info = {
'uuid': instance['uuid'],
'version': instance['version'],
'capacity': instance['capacity'],
'cpu': instance['cpu'],
'memory': instance['memory'],
'managed_by_policy': instance['managed_by_policy'],
'enabled': instance['enabled'],
'consumed_capacity': consumed_capacity,
'remaining_capacity': instance['capacity'] - consumed_capacity,
'uuid': instance.uuid,
'version': instance.version,
'capacity': instance.capacity,
'cpu': instance.cpu,
'memory': instance.memory,
'managed_by_policy': instance.managed_by_policy,
'enabled': instance.enabled,
'consumed_capacity': tm_instance.consumed_capacity,
'remaining_capacity': instance.capacity - tm_instance.consumed_capacity,
}
if include_hostnames is True:
instance_info['hostname'] = instance['hostname']
info[instance['uuid']] = instance_info
instance_info['hostname'] = instance.hostname
info[instance.uuid] = instance_info
return info

View File

@@ -31,7 +31,7 @@ class PubSub(object):
cur.execute('SELECT pg_notify(%s, %s);', (channel, payload))
def events(self, select_timeout=5, yield_timeouts=False):
if not pg_connection.get_autocommit():
if not self.conn.autocommit:
raise RuntimeError('Listening for events can only be done in autocommit mode')
while True:

View File

@@ -37,18 +37,24 @@ class Control(object):
def running(self, *args, **kwargs):
return self.control_with_reply('running', *args, **kwargs)
def cancel(self, task_ids, *args, **kwargs):
return self.control_with_reply('cancel', *args, extra_data={'task_ids': task_ids}, **kwargs)
@classmethod
def generate_reply_queue_name(cls):
return f"reply_to_{str(uuid.uuid4()).replace('-','_')}"
def control_with_reply(self, command, timeout=5):
def control_with_reply(self, command, timeout=5, extra_data=None):
logger.warning('checking {} {} for {}'.format(self.service, command, self.queuename))
reply_queue = Control.generate_reply_queue_name()
self.result = None
with pg_bus_conn() as conn:
with pg_bus_conn(new_connection=True) as conn:
conn.listen(reply_queue)
conn.notify(self.queuename, json.dumps({'control': command, 'reply_to': reply_queue}))
send_data = {'control': command, 'reply_to': reply_queue}
if extra_data:
send_data.update(extra_data)
conn.notify(self.queuename, json.dumps(send_data))
for reply in conn.events(select_timeout=timeout, yield_timeouts=True):
if reply is None:

View File

@@ -72,11 +72,9 @@ class PoolWorker(object):
self.messages_finished = 0
self.managed_tasks = collections.OrderedDict()
self.finished = MPQueue(queue_size) if self.track_managed_tasks else NoOpResultQueue()
self.last_finished = None
self.queue = MPQueue(queue_size)
self.process = Process(target=target, args=(self.queue, self.finished) + args)
self.process.daemon = True
self.scale_down_in = settings.DISPATCHER_SCALE_DOWN_WAIT_TIME
def start(self):
self.process.start()
@@ -147,9 +145,6 @@ class PoolWorker(object):
# state of which events are *currently* being processed.
logger.warning('Event UUID {} appears to be have been duplicated.'.format(uuid))
if finished:
self.last_finished = time.time()
@property
def current_task(self):
if not self.track_managed_tasks:
@@ -195,14 +190,6 @@ class PoolWorker(object):
def idle(self):
return not self.busy
@property
def ready_to_scale_down(self):
if self.busy:
return False
if self.last_finished is None:
return True
return time.time() - self.last_finished > self.scale_down_in
class StatefulPoolWorker(PoolWorker):
@@ -263,7 +250,7 @@ class WorkerPool(object):
except Exception:
logger.exception('could not fork')
else:
logger.info(f'scaling up worker pid:{worker.pid} total:{len(self.workers)}')
logger.debug('scaling up worker pid:{}'.format(worker.pid))
return idx, worker
def debug(self, *args, **kwargs):
@@ -402,12 +389,12 @@ class AutoscalePool(WorkerPool):
logger.exception('failed to reap job UUID {}'.format(w.current_task['uuid']))
orphaned.extend(w.orphaned_tasks)
self.workers.remove(w)
elif (len(self.workers) > self.min_workers) and w.ready_to_scale_down:
elif w.idle and len(self.workers) > self.min_workers:
# the process has an empty queue (it's idle) and we have
# more processes in the pool than we need (> min)
# send this process a message so it will exit gracefully
# at the next opportunity
logger.info(f'scaling down worker pid:{w.pid} prior total:{len(self.workers)}')
logger.debug('scaling down worker pid:{}'.format(w.pid))
w.quit()
self.workers.remove(w)
if w.alive:

View File

@@ -63,7 +63,7 @@ class AWXConsumerBase(object):
def control(self, body):
logger.warning(f'Received control signal:\n{body}')
control = body.get('control')
if control in ('status', 'running'):
if control in ('status', 'running', 'cancel'):
reply_queue = body['reply_to']
if control == 'status':
msg = '\n'.join([self.listening_on, self.pool.debug()])
@@ -72,6 +72,17 @@ class AWXConsumerBase(object):
for worker in self.pool.workers:
worker.calculate_managed_tasks()
msg.extend(worker.managed_tasks.keys())
elif control == 'cancel':
msg = []
task_ids = set(body['task_ids'])
for worker in self.pool.workers:
task = worker.current_task
if task and task['uuid'] in task_ids:
logger.warn(f'Sending SIGTERM to task id={task["uuid"]}, task={task.get("task")}, args={task.get("args")}')
os.kill(worker.pid, signal.SIGTERM)
msg.append(task['uuid'])
if task_ids and not msg:
logger.info(f'Could not locate running tasks to cancel with ids={task_ids}')
with pg_bus_conn() as conn:
conn.notify(reply_queue, json.dumps(msg))

View File

@@ -54,7 +54,7 @@ class Command(BaseCommand):
capacity = f' capacity={x.capacity}' if x.node_type != 'hop' else ''
version = f" version={x.version or '?'}" if x.node_type != 'hop' else ''
heartbeat = f' heartbeat="{x.modified:%Y-%m-%d %H:%M:%S}"' if x.capacity or x.node_type == 'hop' else ''
heartbeat = f' heartbeat="{x.last_seen:%Y-%m-%d %H:%M:%S}"' if x.capacity or x.node_type == 'hop' else ''
print(f'\t{color}{x.hostname}{capacity} node_type={x.node_type}{version}{heartbeat}\033[0m')
print()

View File

@@ -1,6 +1,7 @@
# Copyright (c) 2015 Ansible, Inc.
# All Rights Reserved.
import logging
import yaml
from django.conf import settings
from django.core.cache import cache as django_cache
@@ -30,7 +31,16 @@ class Command(BaseCommand):
'--reload',
dest='reload',
action='store_true',
help=('cause the dispatcher to recycle all of its worker processes;' 'running jobs will run to completion first'),
help=('cause the dispatcher to recycle all of its worker processes; running jobs will run to completion first'),
)
parser.add_argument(
'--cancel',
dest='cancel',
help=(
'Cancel a particular task id. Takes either a single id string, or a JSON list of multiple ids. '
'Can take in output from the --running argument as input to cancel all tasks. '
'Only running tasks can be canceled, queued tasks must be started before they can be canceled.'
),
)
def handle(self, *arg, **options):
@@ -42,6 +52,16 @@ class Command(BaseCommand):
return
if options.get('reload'):
return Control('dispatcher').control({'control': 'reload'})
if options.get('cancel'):
cancel_str = options.get('cancel')
try:
cancel_data = yaml.safe_load(cancel_str)
except Exception:
cancel_data = [cancel_str]
if not isinstance(cancel_data, list):
cancel_data = [cancel_str]
print(Control('dispatcher').cancel(cancel_data))
return
# It's important to close these because we're _about_ to fork, and we
# don't want the forked processes to inherit the open sockets

View File

@@ -0,0 +1,57 @@
# Generated by Django 3.2.13 on 2022-08-24 14:02
from django.db import migrations, models
import django.db.models.deletion
from awx.main.models import CredentialType
from awx.main.utils.common import set_current_apps
def setup_tower_managed_defaults(apps, schema_editor):
set_current_apps(apps)
CredentialType.setup_tower_managed_defaults(apps)
class Migration(migrations.Migration):
dependencies = [
('main', '0166_alter_jobevent_host'),
]
operations = [
migrations.AddField(
model_name='project',
name='signature_validation_credential',
field=models.ForeignKey(
blank=True,
default=None,
null=True,
on_delete=django.db.models.deletion.SET_NULL,
related_name='projects_signature_validation',
to='main.credential',
help_text='An optional credential used for validating files in the project against unexpected changes.',
),
),
migrations.AlterField(
model_name='credentialtype',
name='kind',
field=models.CharField(
choices=[
('ssh', 'Machine'),
('vault', 'Vault'),
('net', 'Network'),
('scm', 'Source Control'),
('cloud', 'Cloud'),
('registry', 'Container Registry'),
('token', 'Personal Access Token'),
('insights', 'Insights'),
('external', 'External'),
('kubernetes', 'Kubernetes'),
('galaxy', 'Galaxy/Automation Hub'),
('cryptography', 'Cryptography'),
],
max_length=32,
),
),
migrations.RunPython(setup_tower_managed_defaults),
]

View File

@@ -36,7 +36,7 @@ def create_clearsessions_jt(apps, schema_editor):
if created:
sched = Schedule(
name='Cleanup Expired Sessions',
rrule='DTSTART:%s RRULE:FREQ=WEEKLY;INTERVAL=1;COUNT=1' % schedule_time,
rrule='DTSTART:%s RRULE:FREQ=WEEKLY;INTERVAL=1' % schedule_time,
description='Cleans out expired browser sessions',
enabled=True,
created=now_dt,
@@ -69,7 +69,7 @@ def create_cleartokens_jt(apps, schema_editor):
if created:
sched = Schedule(
name='Cleanup Expired OAuth 2 Tokens',
rrule='DTSTART:%s RRULE:FREQ=WEEKLY;INTERVAL=1;COUNT=1' % schedule_time,
rrule='DTSTART:%s RRULE:FREQ=WEEKLY;INTERVAL=1' % schedule_time,
description='Removes expired OAuth 2 access and refresh tokens',
enabled=True,
created=now_dt,

View File

@@ -336,6 +336,7 @@ class CredentialType(CommonModelNameNotUnique):
('external', _('External')),
('kubernetes', _('Kubernetes')),
('galaxy', _('Galaxy/Automation Hub')),
('cryptography', _('Cryptography')),
)
kind = models.CharField(max_length=32, choices=KIND_CHOICES)
@@ -1171,6 +1172,25 @@ ManagedCredentialType(
},
)
ManagedCredentialType(
namespace='gpg_public_key',
kind='cryptography',
name=gettext_noop('GPG Public Key'),
inputs={
'fields': [
{
'id': 'gpg_public_key',
'label': gettext_noop('GPG Public Key'),
'type': 'string',
'secret': True,
'multiline': True,
'help_text': gettext_noop('GPG Public Key used to validate content signatures.'),
},
],
'required': ['gpg_public_key'],
},
)
class CredentialInputSource(PrimordialModel):
class Meta:

View File

@@ -236,6 +236,12 @@ class Inventory(CommonModelNameNotUnique, ResourceMixin, RelatedJobsMixin):
raise ParseError(_('Slice number must be 1 or higher.'))
return (number, step)
def get_sliced_hosts(self, host_queryset, slice_number, slice_count):
if slice_count > 1 and slice_number > 0:
offset = slice_number - 1
host_queryset = host_queryset[offset::slice_count]
return host_queryset
def get_script_data(self, hostvars=False, towervars=False, show_all=False, slice_number=1, slice_count=1):
hosts_kw = dict()
if not show_all:
@@ -243,10 +249,8 @@ class Inventory(CommonModelNameNotUnique, ResourceMixin, RelatedJobsMixin):
fetch_fields = ['name', 'id', 'variables', 'inventory_id']
if towervars:
fetch_fields.append('enabled')
hosts = self.hosts.filter(**hosts_kw).order_by('name').only(*fetch_fields)
if slice_count > 1 and slice_number > 0:
offset = slice_number - 1
hosts = hosts[offset::slice_count]
host_queryset = self.hosts.filter(**hosts_kw).order_by('name').only(*fetch_fields)
hosts = self.get_sliced_hosts(host_queryset, slice_number, slice_count)
data = dict()
all_group = data.setdefault('all', dict())

View File

@@ -814,7 +814,8 @@ class Job(UnifiedJob, JobOptions, SurveyJobMixin, JobNotificationMixin, TaskMana
def _get_inventory_hosts(self, only=['name', 'ansible_facts', 'ansible_facts_modified', 'modified', 'inventory_id']):
if not self.inventory:
return []
return self.inventory.hosts.only(*only)
host_queryset = self.inventory.hosts.only(*only)
return self.inventory.get_sliced_hosts(host_queryset, self.job_slice_number, self.job_slice_count)
def start_job_fact_cache(self, destination, modification_times, timeout=None):
self.log_lifecycle("start_job_fact_cache")

View File

@@ -412,6 +412,11 @@ class TaskManagerJobMixin(TaskManagerUnifiedJobMixin):
class Meta:
abstract = True
def get_jobs_fail_chain(self):
if self.project_update_id:
return [self.project_update]
return []
class TaskManagerUpdateOnLaunchMixin(TaskManagerUnifiedJobMixin):
class Meta:

View File

@@ -284,6 +284,17 @@ class Project(UnifiedJobTemplate, ProjectOptions, ResourceMixin, CustomVirtualEn
help_text=_('Allow changing the SCM branch or revision in a job template ' 'that uses this project.'),
)
# credential (keys) used to validate content signature
signature_validation_credential = models.ForeignKey(
'Credential',
related_name='%(class)ss_signature_validation',
blank=True,
null=True,
default=None,
on_delete=models.SET_NULL,
help_text=_('An optional credential used for validating files in the project against unexpected changes.'),
)
scm_revision = models.CharField(
max_length=1024,
blank=True,
@@ -620,6 +631,10 @@ class ProjectUpdate(UnifiedJob, ProjectOptions, JobNotificationMixin, TaskManage
added_update_fields = []
if not self.job_tags:
job_tags = ['update_{}'.format(self.scm_type), 'install_roles', 'install_collections']
if self.project.signature_validation_credential is not None:
credential_type = self.project.signature_validation_credential.credential_type.namespace
job_tags.append(f'validation_{credential_type}')
job_tags.append('validation_checksum_manifest')
self.job_tags = ','.join(job_tags)
added_update_fields.append('job_tags')
if self.scm_delete_on_update and 'delete' not in self.job_tags and self.job_type == 'check':

View File

@@ -1395,22 +1395,6 @@ class UnifiedJob(
# Done!
return True
@property
def actually_running(self):
# returns True if the job is running in the appropriate dispatcher process
running = False
if all([self.status == 'running', self.celery_task_id, self.execution_node]):
# If the job is marked as running, but the dispatcher
# doesn't know about it (or the dispatcher doesn't reply),
# then cancel the job
timeout = 5
try:
running = self.celery_task_id in ControlDispatcher('dispatcher', self.controller_node or self.execution_node).running(timeout=timeout)
except (socket.timeout, RuntimeError):
logger.error('could not reach dispatcher on {} within {}s'.format(self.execution_node, timeout))
running = False
return running
@property
def can_cancel(self):
return bool(self.status in CAN_CANCEL)
@@ -1420,27 +1404,61 @@ class UnifiedJob(
return 'Previous Task Canceled: {"job_type": "%s", "job_name": "%s", "job_id": "%s"}' % (self.model_to_str(), self.name, self.id)
return None
def fallback_cancel(self):
if not self.celery_task_id:
self.refresh_from_db(fields=['celery_task_id'])
self.cancel_dispatcher_process()
def cancel_dispatcher_process(self):
"""Returns True if dispatcher running this job acknowledged request and sent SIGTERM"""
if not self.celery_task_id:
return
canceled = []
try:
# Use control and reply mechanism to cancel and obtain confirmation
timeout = 5
canceled = ControlDispatcher('dispatcher', self.controller_node).cancel([self.celery_task_id])
except socket.timeout:
logger.error(f'could not reach dispatcher on {self.controller_node} within {timeout}s')
except Exception:
logger.exception("error encountered when checking task status")
return bool(self.celery_task_id in canceled) # True or False, whether confirmation was obtained
def cancel(self, job_explanation=None, is_chain=False):
if self.can_cancel:
if not is_chain:
for x in self.get_jobs_fail_chain():
x.cancel(job_explanation=self._build_job_explanation(), is_chain=True)
cancel_fields = []
if not self.cancel_flag:
self.cancel_flag = True
self.start_args = '' # blank field to remove encrypted passwords
cancel_fields = ['cancel_flag', 'start_args']
if self.status in ('pending', 'waiting', 'new'):
self.status = 'canceled'
cancel_fields.append('status')
if self.status == 'running' and not self.actually_running:
self.status = 'canceled'
cancel_fields.append('status')
cancel_fields.extend(['cancel_flag', 'start_args'])
connection.on_commit(lambda: self.websocket_emit_status("canceled"))
if job_explanation is not None:
self.job_explanation = job_explanation
cancel_fields.append('job_explanation')
self.save(update_fields=cancel_fields)
self.websocket_emit_status("canceled")
controller_notified = False
if self.celery_task_id:
controller_notified = self.cancel_dispatcher_process()
else:
# Avoid race condition where we have stale model from pending state but job has already started,
# its checking signal but not cancel_flag, so re-send signal after this database commit
connection.on_commit(self.fallback_cancel)
# If a SIGTERM signal was sent to the control process, and acked by the dispatcher
# then we want to let its own cleanup change status, otherwise change status now
if not controller_notified:
if self.status != 'canceled':
self.status = 'canceled'
cancel_fields.append('status')
self.save(update_fields=cancel_fields)
return self.cancel_flag
@property

View File

@@ -723,11 +723,10 @@ class WorkflowJob(UnifiedJob, WorkflowJobOptions, SurveyJobMixin, JobNotificatio
def preferred_instance_groups(self):
return []
@property
def actually_running(self):
def cancel_dispatcher_process(self):
# WorkflowJobs don't _actually_ run anything in the dispatcher, so
# there's no point in asking the dispatcher if it knows about this task
return self.status == 'running'
return True
class WorkflowApprovalTemplate(UnifiedJobTemplate, RelatedJobsMixin):

View File

@@ -34,12 +34,10 @@ class TaskManagerInstance:
class TaskManagerInstances:
def __init__(self, active_tasks, instances=None):
def __init__(self, active_tasks, instances=None, instance_fields=('node_type', 'capacity', 'hostname', 'enabled')):
self.instances_by_hostname = dict()
if instances is None:
instances = (
Instance.objects.filter(hostname__isnull=False, enabled=True).exclude(node_type='hop').only('node_type', 'capacity', 'hostname', 'enabled')
)
instances = Instance.objects.filter(hostname__isnull=False, enabled=True).exclude(node_type='hop').only(*instance_fields)
for instance in instances:
self.instances_by_hostname[instance.hostname] = TaskManagerInstance(instance)

View File

@@ -6,17 +6,16 @@ import os
import stat
# Django
from django.utils.timezone import now
from django.conf import settings
from django_guid import get_guid
from django.utils.functional import cached_property
from django.db import connections
# AWX
from awx.main.redact import UriCleaner
from awx.main.constants import MINIMAL_EVENTS, ANSIBLE_RUNNER_NEEDS_UPDATE_MESSAGE
from awx.main.utils.update_model import update_model
from awx.main.queue import CallbackQueueDispatcher
from awx.main.tasks.signals import signal_callback
logger = logging.getLogger('awx.main.tasks.callback')
@@ -175,28 +174,6 @@ class RunnerCallback:
return False
def cancel_callback(self):
"""
Ansible runner callback to tell the job when/if it is canceled
"""
unified_job_id = self.instance.pk
if signal_callback():
return True
try:
self.instance = self.update_model(unified_job_id)
except Exception:
logger.exception(f'Encountered error during cancel check for {unified_job_id}, canceling now')
return True
if not self.instance:
logger.error('unified job {} was deleted while running, canceling'.format(unified_job_id))
return True
if self.instance.cancel_flag or self.instance.status == 'canceled':
cancel_wait = (now() - self.instance.modified).seconds if self.instance.modified else 0
if cancel_wait > 5:
logger.warning('Request to cancel {} took {} seconds to complete.'.format(self.instance.log_format, cancel_wait))
return True
return False
def finished_callback(self, runner_obj):
"""
Ansible runner callback triggered on finished run
@@ -227,6 +204,8 @@ class RunnerCallback:
with disable_activity_stream():
self.instance = self.update_model(self.instance.pk, job_args=json.dumps(runner_config.command), job_cwd=runner_config.cwd, job_env=job_env)
# We opened a connection just for that save, close it here now
connections.close_all()
elif status_data['status'] == 'failed':
# For encrypted ssh_key_data, ansible-runner worker will open and write the
# ssh_key_data to a named pipe. Then, once the podman container starts, ssh-agent will

View File

@@ -402,6 +402,10 @@ class BaseTask(object):
raise
else:
time.sleep(1.0)
self.instance.refresh_from_db(fields=['cancel_flag'])
if self.instance.cancel_flag or signal_callback():
logger.debug(f"Unified job {self.instance.id} was canceled while waiting for project file lock")
return
waiting_time = time.time() - start_time
if waiting_time > 1.0:
@@ -483,6 +487,7 @@ class BaseTask(object):
self.instance.log_lifecycle("preparing_playbook")
if self.instance.cancel_flag or signal_callback():
self.instance = self.update_model(self.instance.pk, status='canceled')
if self.instance.status != 'running':
# Stop the task chain and prevent starting the job if it has
# already been canceled.
@@ -585,7 +590,7 @@ class BaseTask(object):
event_handler=self.runner_callback.event_handler,
finished_callback=self.runner_callback.finished_callback,
status_handler=self.runner_callback.status_handler,
cancel_callback=self.runner_callback.cancel_callback,
cancel_callback=signal_callback,
**params,
)
else:
@@ -1266,6 +1271,10 @@ class RunProjectUpdate(BaseTask):
# for raw archive, prevent error moving files between volumes
extra_vars['ansible_remote_tmp'] = os.path.join(project_update.get_project_path(check_if_exists=False), '.ansible_awx', 'tmp')
if project_update.project.signature_validation_credential is not None:
pubkey = project_update.project.signature_validation_credential.get_input('gpg_public_key')
extra_vars['gpg_pubkey'] = pubkey
self._write_extra_vars_file(private_data_dir, extra_vars)
def build_playbook_path_relative_to_cwd(self, project_update, private_data_dir):
@@ -1288,10 +1297,6 @@ class RunProjectUpdate(BaseTask):
# re-create root project folder if a natural disaster has destroyed it
project_path = instance.project.get_project_path(check_if_exists=False)
instance.refresh_from_db(fields=['cancel_flag'])
if instance.cancel_flag:
logger.debug("ProjectUpdate({0}) was canceled".format(instance.pk))
return
if instance.launch_type != 'sync':
self.acquire_lock(instance.project, instance.id)
@@ -1622,7 +1627,7 @@ class RunInventoryUpdate(SourceControlMixin, BaseTask):
handler = SpecialInventoryHandler(
self.runner_callback.event_handler,
self.runner_callback.cancel_callback,
signal_callback,
verbosity=inventory_update.verbosity,
job_timeout=self.get_instance_timeout(self.instance),
start_time=inventory_update.started,

View File

@@ -12,6 +12,7 @@ import yaml
# Django
from django.conf import settings
from django.db import connections
# Runner
import ansible_runner
@@ -25,6 +26,7 @@ from awx.main.utils.common import (
cleanup_new_process,
)
from awx.main.constants import MAX_ISOLATED_PATH_COLON_DELIMITER
from awx.main.tasks.signals import signal_state, signal_callback, SignalExit
# Receptorctl
from receptorctl.socket_interface import ReceptorControl
@@ -335,24 +337,32 @@ class AWXReceptorJob:
shutil.rmtree(artifact_dir)
resultsock, resultfile = receptor_ctl.get_work_results(self.unit_id, return_socket=True, return_sockfile=True)
# Both "processor" and "cancel_watcher" are spawned in separate threads.
# We wait for the first one to return. If cancel_watcher returns first,
# we yank the socket out from underneath the processor, which will cause it
# to exit. A reference to the processor_future is passed into the cancel_watcher_future,
# Which exits if the job has finished normally. The context manager ensures we do not
# leave any threads laying around.
with concurrent.futures.ThreadPoolExecutor(max_workers=2) as executor:
processor_future = executor.submit(self.processor, resultfile)
cancel_watcher_future = executor.submit(self.cancel_watcher, processor_future)
futures = [processor_future, cancel_watcher_future]
first_future = concurrent.futures.wait(futures, return_when=concurrent.futures.FIRST_COMPLETED)
res = list(first_future.done)[0].result()
if res.status == 'canceled':
connections.close_all()
# "processor" and the main thread will be separate threads.
# If a cancel happens, the main thread will encounter an exception, in which case
# we yank the socket out from underneath the processor, which will cause it to exit.
# The ThreadPoolExecutor context manager ensures we do not leave any threads laying around.
with concurrent.futures.ThreadPoolExecutor(max_workers=1) as executor:
processor_future = executor.submit(self.processor, resultfile)
try:
signal_state.raise_exception = True
# address race condition where SIGTERM was issued after this dispatcher task started
if signal_callback():
raise SignalExit()
res = processor_future.result()
except SignalExit:
receptor_ctl.simple_command(f"work cancel {self.unit_id}")
resultsock.shutdown(socket.SHUT_RDWR)
resultfile.close()
elif res.status == 'error':
result = namedtuple('result', ['status', 'rc'])
res = result('canceled', 1)
finally:
signal_state.raise_exception = False
if res.status == 'error':
# If ansible-runner ran, but an error occured at runtime, the traceback information
# is saved via the status_handler passed in to the processor.
if 'result_traceback' in self.task.runner_callback.extra_update_fields:
@@ -446,18 +456,6 @@ class AWXReceptorJob:
return 'local'
return 'ansible-runner'
@cleanup_new_process
def cancel_watcher(self, processor_future):
while True:
if processor_future.done():
return processor_future.result()
if self.task.runner_callback.cancel_callback():
result = namedtuple('result', ['status', 'rc'])
return result('canceled', 1)
time.sleep(1)
@property
def pod_definition(self):
ee = self.task.instance.execution_environment

View File

@@ -9,12 +9,17 @@ logger = logging.getLogger('awx.main.tasks.signals')
__all__ = ['with_signal_handling', 'signal_callback']
class SignalExit(Exception):
pass
class SignalState:
def reset(self):
self.sigterm_flag = False
self.is_active = False
self.original_sigterm = None
self.original_sigint = None
self.raise_exception = False
def __init__(self):
self.reset()
@@ -22,6 +27,9 @@ class SignalState:
def set_flag(self, *args):
"""Method to pass into the python signal.signal method to receive signals"""
self.sigterm_flag = True
if self.raise_exception:
self.raise_exception = False # so it is not raised a second time in error handling
raise SignalExit()
def connect_signals(self):
self.original_sigterm = signal.getsignal(signal.SIGTERM)

View File

@@ -74,34 +74,37 @@ GLqbpJyX2r3p/Rmo6mLY71SqpA==
@pytest.mark.django_db
def test_default_cred_types():
assert sorted(CredentialType.defaults.keys()) == [
'aim',
'aws',
'azure_kv',
'azure_rm',
'centrify_vault_kv',
'conjur',
'controller',
'galaxy_api_token',
'gce',
'github_token',
'gitlab_token',
'hashivault_kv',
'hashivault_ssh',
'insights',
'kubernetes_bearer_token',
'net',
'openstack',
'registry',
'rhv',
'satellite6',
'scm',
'ssh',
'thycotic_dsv',
'thycotic_tss',
'vault',
'vmware',
]
assert sorted(CredentialType.defaults.keys()) == sorted(
[
'aim',
'aws',
'azure_kv',
'azure_rm',
'centrify_vault_kv',
'conjur',
'controller',
'galaxy_api_token',
'gce',
'github_token',
'gitlab_token',
'gpg_public_key',
'hashivault_kv',
'hashivault_ssh',
'insights',
'kubernetes_bearer_token',
'net',
'openstack',
'registry',
'rhv',
'satellite6',
'scm',
'ssh',
'thycotic_dsv',
'thycotic_tss',
'vault',
'vmware',
]
)
for type_ in CredentialType.defaults.values():
assert type_().managed is True

View File

@@ -244,7 +244,7 @@ class TestAutoScaling:
assert not self.pool.should_grow
alive_pid = self.pool.workers[1].pid
self.pool.workers[0].process.terminate()
time.sleep(1) # wait a moment for sigterm
time.sleep(2) # wait a moment for sigterm
# clean up and the dead worker
self.pool.cleanup()

View File

@@ -22,6 +22,10 @@ def test_unified_job_workflow_attributes():
assert job.workflow_job_id == 1
def mock_on_commit(f):
f()
@pytest.fixture
def unified_job(mocker):
mocker.patch.object(UnifiedJob, 'can_cancel', return_value=True)
@@ -30,12 +34,14 @@ def unified_job(mocker):
j.cancel_flag = None
j.save = mocker.MagicMock()
j.websocket_emit_status = mocker.MagicMock()
j.fallback_cancel = mocker.MagicMock()
return j
def test_cancel(unified_job):
unified_job.cancel()
with mock.patch('awx.main.models.unified_jobs.connection.on_commit', wraps=mock_on_commit):
unified_job.cancel()
assert unified_job.cancel_flag is True
assert unified_job.status == 'canceled'
@@ -50,10 +56,11 @@ def test_cancel(unified_job):
def test_cancel_job_explanation(unified_job):
job_explanation = 'giggity giggity'
unified_job.cancel(job_explanation=job_explanation)
with mock.patch('awx.main.models.unified_jobs.connection.on_commit'):
unified_job.cancel(job_explanation=job_explanation)
assert unified_job.job_explanation == job_explanation
unified_job.save.assert_called_with(update_fields=['cancel_flag', 'start_args', 'status', 'job_explanation'])
unified_job.save.assert_called_with(update_fields=['cancel_flag', 'start_args', 'job_explanation', 'status'])
def test_organization_copy_to_jobs():

View File

@@ -76,7 +76,7 @@ class SpecialInventoryHandler(logging.Handler):
def emit(self, record):
# check cancel and timeout status regardless of log level
this_time = now()
if (this_time - self.last_check).total_seconds() > 0.5: # cancel callback is expensive
if (this_time - self.last_check).total_seconds() > 0.1:
self.last_check = this_time
if self.cancel_callback():
raise PostRunError('Inventory update has been canceled', status='canceled')

View File

@@ -0,0 +1,115 @@
from __future__ import absolute_import, division, print_function
__metaclass__ = type
import gnupg
import os
import tempfile
from ansible.module_utils.basic import *
from ansible.plugins.action import ActionBase
from ansible.utils.display import Display
from ansible_sign.checksum import (
ChecksumFile,
ChecksumMismatch,
InvalidChecksumLine,
)
from ansible_sign.checksum.differ import DistlibManifestChecksumFileExistenceDiffer
from ansible_sign.signing import *
display = Display()
VALIDATION_TYPES = (
"checksum_manifest",
"gpg",
)
class ActionModule(ActionBase):
def run(self, tmp=None, task_vars=None):
self._supports_check_mode = False
super(ActionModule, self).run(tmp, task_vars)
self.params = self._task.args
self.project_path = self.params.get("project_path")
if self.project_path is None:
return {
"failed": True,
"msg": "No project path (project_path) was supplied.",
}
validation_type = self.params.get("validation_type")
if validation_type is None or validation_type not in VALIDATION_TYPES:
return {"failed": True, "msg": "validation_type must be one of: " + ', '.join(VALIDATION_TYPES)}
validation_method = getattr(self, f"validate_{validation_type}")
return validation_method()
def validate_gpg(self):
gpg_pubkey = self.params.get("gpg_pubkey")
if gpg_pubkey is None:
return {
"failed": True,
"msg": "No GPG public key (gpg_pubkey) was supplied.",
}
signature_file = os.path.join(self.project_path, ".ansible-sign", "sha256sum.txt.sig")
manifest_file = os.path.join(self.project_path, ".ansible-sign", "sha256sum.txt")
for path in (signature_file, manifest_file):
if not os.path.exists(path):
return {
"failed": True,
"msg": f"Expected file not found: {path}",
}
with tempfile.TemporaryDirectory() as gpg_home:
gpg = gnupg.GPG(gnupghome=gpg_home)
gpg.import_keys(gpg_pubkey)
verifier = GPGVerifier(
manifest_path=manifest_file,
detached_signature_path=signature_file,
gpg_home=gpg_home,
)
result = verifier.verify()
return {
"failed": not result.success,
"msg": result.summary,
"gpg_details": result.extra_information,
}
def validate_checksum_manifest(self):
checksum = ChecksumFile(self.project_path, differ=DistlibManifestChecksumFileExistenceDiffer)
manifest_file = os.path.join(self.project_path, ".ansible-sign", "sha256sum.txt")
if not os.path.exists(manifest_file):
return {
"failed": True,
"msg": f"Expected file not found: {path}",
}
checksum_file_contents = open(manifest_file, "r").read()
try:
manifest = checksum.parse(checksum_file_contents)
except InvalidChecksumLine as e:
return {
"failed": True,
"msg": f"Invalid line in checksum manifest: {e}",
}
try:
checksum.verify(manifest)
except ChecksumMismatch as e:
return {
"failed": True,
"msg": str(e),
}
return {
"failed": False,
"msg": "Checksum manifest is valid.",
}

View File

@@ -0,0 +1,65 @@
ANSIBLE_METADATA = {"metadata_version": "1.0", "status": ["stableinterface"], "supported_by": "community"}
DOCUMENTATION = """
---
module: playbook_integrity
short_description: verify that files within a project have not been tampered with.
description:
- Makes use of the 'ansible-sign' project as a library for ensuring that an
Ansible project has not been tampered with.
- There are multiple types of validation that this action plugin supports,
currently: GPG public/private key signing of a checksum manifest file, and
checking the checksum manifest file itself against the checksum of each file
that is being verified.
- In the future, other types of validation may be supported.
options:
project_path:
description:
- Directory of the project being verified. Expected to contain a
C(.ansible-sign) directory with a generated checksum manifest file and a
detached signature for it. These files are produced by the
C(ansible-sign) command-line utility.
required: true
validation_type:
description:
- Describes the kind of validation to perform on the project.
- I(validation_type=gpg) means that a GPG Public Key credential is being
used to verify the integrity of the checksum manifest (and therefore the
project).
- I(validation_type=checksum_manifest) means that every file listed by the
project's MANIFEST.in file is checked against the checksum manifest. Just
running this plugin with I(validation_type=checksum_manifest) is
typically B(NOT) enough. It should also be run with a I(validation_type)
that ensures that the manifest file itself has not changed, such as
I(validation_type=gpg).
required: true
choices:
- gpg
- checksum_manifest
gpg_pubkey:
description:
- The public key to validate a checksum manifest against. Must match the
detached signature in the project's C(.ansible-sign) directory.
- Required when I(validation_type=gpg).
author:
- Ansible AWX Team
"""
EXAMPLES = """
- name: Verify project content using GPG signature
playbook_integrity:
project_path: /srv/projects/example
validation_type: gpg
gpg_pubkey: |
-----BEGIN PGP PUBLIC KEY BLOCK-----
mWINAFXMtjsACADIf/zJS0V3UO3c+KAUcpVAcChpliM31ICDWydfIfF3dzMzLcCd
Cj2kk1mPWtP/JHfk1V5czcWWWWGC2Tw4g4IS+LokAAuwk7VKTlI34eeMl8SiZCAI
[...]
- name: Verify project content against checksum manifest
playbook_integrity:
project_path: /srv/projects/example
validation_type: checksum_manifest
"""

View File

@@ -18,6 +18,7 @@
# galaxy_task_env: environment variables to use specifically for ansible-galaxy commands
# awx_version: Current running version of the awx or tower as a string
# awx_license_type: "open" for AWX; else presume Tower
# gpg_pubkey: the GPG public key to use for validation, when enabled
- hosts: localhost
gather_facts: false
@@ -153,6 +154,28 @@
- update_insights
- update_archive
- hosts: localhost
gather_facts: false
connection: local
name: Perform project signature/checksum verification
tasks:
- name: Verify project content using GPG signature
playbook_integrity:
project_path: "{{ project_path | quote }}"
validation_type: gpg
gpg_pubkey: "{{ gpg_pubkey }}"
register: gpg_result
tags:
- validation_gpg_public_key
- name: Verify project content against checksum manifest
playbook_integrity:
project_path: "{{ project_path | quote }}"
validation_type: checksum_manifest
register: checksum_result
tags:
- validation_checksum_manifest
- hosts: localhost
gather_facts: false
connection: local

View File

@@ -444,10 +444,6 @@ EXECUTION_NODE_REMEDIATION_CHECKS = 60 * 30 # once every 30 minutes check if an
# Amount of time dispatcher will try to reconnect to database for jobs and consuming new work
DISPATCHER_DB_DOWNTOWN_TOLLERANCE = 40
# Minimum time to wait after last job finished before scaling down a worker
# A lower value frees memory more aggressively, but a higher value reduces worker forking
DISPATCHER_SCALE_DOWN_WAIT_TIME = 60
BROKER_URL = 'unix:///var/run/redis/redis.sock'
CELERYBEAT_SCHEDULE = {
'tower_scheduler': {'task': 'awx.main.tasks.system.awx_periodic_scheduler', 'schedule': timedelta(seconds=30), 'options': {'expires': 20}},

View File

@@ -108,7 +108,6 @@ AWX_DISABLE_TASK_MANAGERS = False
if 'sqlite3' not in DATABASES['default']['ENGINE']: # noqa
DATABASES['default'].setdefault('OPTIONS', dict()).setdefault('application_name', f'{CLUSTER_HOST_ID}-{os.getpid()}-{" ".join(sys.argv)}'[:63]) # noqa
# If any local_*.py files are present in awx/settings/, use them to override
# default settings for development. If not present, we can still run using
# only the defaults.

View File

@@ -44,7 +44,7 @@ Have questions about this document or anything not covered here? Feel free to re
- functions should adopt camelCase
- constructors/classes should adopt PascalCase
- constants to be exported should adopt UPPERCASE
- For strings, we adopt the `sentence capitalization` since it is a [Patternfly style guide](https://www.patternfly.org/v4/design-guidelines/content/grammar-and-terminology#capitalization).
- For strings, we adopt the `sentence capitalization` since it is a [Patternfly style guide](https://www.patternfly.org/v4/ux-writing/capitalization).
## Setting up your development environment

awx/ui/package-lock.json generated
View File

@@ -7,22 +7,22 @@
"name": "ui",
"dependencies": {
"@lingui/react": "3.14.0",
"@patternfly/patternfly": "4.202.1",
"@patternfly/patternfly": "4.210.2",
"@patternfly/react-core": "^4.221.3",
"@patternfly/react-icons": "4.75.1",
"@patternfly/react-table": "4.93.1",
"ace-builds": "^1.8.1",
"@patternfly/react-table": "4.100.8",
"ace-builds": "^1.10.1",
"ansi-to-html": "0.7.2",
"axios": "0.27.2",
"codemirror": "^6.0.1",
"d3": "7.4.4",
"dagre": "^0.8.4",
"dompurify": "2.3.10",
"dompurify": "2.4.0",
"formik": "2.2.9",
"has-ansi": "5.0.1",
"html-entities": "2.3.2",
"js-yaml": "4.1.0",
"luxon": "^3.0.1",
"luxon": "^3.0.3",
"prop-types": "^15.8.1",
"react": "17.0.2",
"react-ace": "^10.1.0",
@@ -3746,18 +3746,18 @@
"dev": true
},
"node_modules/@patternfly/patternfly": {
"version": "4.202.1",
"resolved": "https://registry.npmjs.org/@patternfly/patternfly/-/patternfly-4.202.1.tgz",
"integrity": "sha512-cQiiPqmwJOm9onuTfLPQNRlpAZwDIJ/zVfDQeaFqMQyPJtxtKn3lkphz5xErY5dPs9rR4X94ytQ1I9pkVzaPJQ=="
"version": "4.210.2",
"resolved": "https://registry.npmjs.org/@patternfly/patternfly/-/patternfly-4.210.2.tgz",
"integrity": "sha512-aZiW24Bxi6uVmk5RyNTp+6q6ThtlJZotNRJfWVeGuwu1UlbBuV4DFa1bpjA6jfTZpfEpX2YL5+R+4ZVSCFAVdw=="
},
"node_modules/@patternfly/react-core": {
"version": "4.224.1",
"resolved": "https://registry.npmjs.org/@patternfly/react-core/-/react-core-4.224.1.tgz",
"integrity": "sha512-v8wGGNoMGndAScAoE5jeOA5jVgymlLSwttPjQk/Idr0k7roSpOrsM39oXUR5DEgkZee45DW00WKTgmg50PP3FQ==",
"version": "4.231.8",
"resolved": "https://registry.npmjs.org/@patternfly/react-core/-/react-core-4.231.8.tgz",
"integrity": "sha512-2ClqlYCvSADppMfVfkUGIA/8XlO6jX8batoClXLxZDwqGoOfr61XyUgQ6SSlE4w60czoNeX4Nf6cfQKUH4RIKw==",
"dependencies": {
"@patternfly/react-icons": "4.75.1",
"@patternfly/react-styles": "^4.74.1",
"@patternfly/react-tokens": "^4.76.1",
"@patternfly/react-icons": "^4.82.8",
"@patternfly/react-styles": "^4.81.8",
"@patternfly/react-tokens": "^4.83.8",
"focus-trap": "6.9.2",
"react-dropzone": "9.0.0",
"tippy.js": "5.1.2",
@@ -3768,6 +3768,15 @@
"react-dom": "^16.8.0 || ^17.0.0"
}
},
"node_modules/@patternfly/react-core/node_modules/@patternfly/react-icons": {
"version": "4.82.8",
"resolved": "https://registry.npmjs.org/@patternfly/react-icons/-/react-icons-4.82.8.tgz",
"integrity": "sha512-cKixprTiMLZRe/+kmdZ5suvYb9ly9p1f/HjlcNiWBfsiA8ZDEPmxJnVdend/YsafelC8YC9QGcQf97ay5PNhcw==",
"peerDependencies": {
"react": "^16.8.0 || ^17.0.0",
"react-dom": "^16.8.0 || ^17.0.0"
}
},
"node_modules/@patternfly/react-core/node_modules/tslib": {
"version": "2.3.1",
"resolved": "https://registry.npmjs.org/tslib/-/tslib-2.3.1.tgz",
@@ -3783,19 +3792,19 @@
}
},
"node_modules/@patternfly/react-styles": {
"version": "4.74.1",
"resolved": "https://registry.npmjs.org/@patternfly/react-styles/-/react-styles-4.74.1.tgz",
"integrity": "sha512-9eWvKrjtrJ3qhJkhY2GQKyYA13u/J0mU1befH49SYbvxZtkbuHdpKmXBAeQoHmcx1hcOKtiYXeKb+dVoRRNx0A=="
"version": "4.81.8",
"resolved": "https://registry.npmjs.org/@patternfly/react-styles/-/react-styles-4.81.8.tgz",
"integrity": "sha512-Q5FiureSSCMIuz+KLMcEm1317TzbXcwmg2q5iNDRKyf/K+5CT6tJp0Wbtk3FlfRvzli4u/7YfXipahia5TL+tA=="
},
"node_modules/@patternfly/react-table": {
"version": "4.93.1",
"resolved": "https://registry.npmjs.org/@patternfly/react-table/-/react-table-4.93.1.tgz",
"integrity": "sha512-N/zHkNsY3X3yUXPg6COwdZKAFmTCbWm25qCY2aHjrXlIlE2OKWaYvVag0CcTwPiQhIuCumztr9Y2Uw9uvv0Fsw==",
"version": "4.100.8",
"resolved": "https://registry.npmjs.org/@patternfly/react-table/-/react-table-4.100.8.tgz",
"integrity": "sha512-80XZCZzoYN9gsoufNdXUB/dk33SuWF9lUnOJs7ilezD6noTSD7ARqO1h532eaEPIbPBp4uIVkEUdfGSHd0HJtg==",
"dependencies": {
"@patternfly/react-core": "^4.224.1",
"@patternfly/react-icons": "4.75.1",
"@patternfly/react-styles": "^4.74.1",
"@patternfly/react-tokens": "^4.76.1",
"@patternfly/react-core": "^4.231.8",
"@patternfly/react-icons": "^4.82.8",
"@patternfly/react-styles": "^4.81.8",
"@patternfly/react-tokens": "^4.83.8",
"lodash": "^4.17.19",
"tslib": "^2.0.0"
},
@@ -3804,15 +3813,24 @@
"react-dom": "^16.8.0 || ^17.0.0"
}
},
"node_modules/@patternfly/react-table/node_modules/@patternfly/react-icons": {
"version": "4.82.8",
"resolved": "https://registry.npmjs.org/@patternfly/react-icons/-/react-icons-4.82.8.tgz",
"integrity": "sha512-cKixprTiMLZRe/+kmdZ5suvYb9ly9p1f/HjlcNiWBfsiA8ZDEPmxJnVdend/YsafelC8YC9QGcQf97ay5PNhcw==",
"peerDependencies": {
"react": "^16.8.0 || ^17.0.0",
"react-dom": "^16.8.0 || ^17.0.0"
}
},
"node_modules/@patternfly/react-table/node_modules/tslib": {
"version": "2.4.0",
"resolved": "https://registry.npmjs.org/tslib/-/tslib-2.4.0.tgz",
"integrity": "sha512-d6xOpEDfsi2CZVlPQzGeux8XMwLT9hssAsaPYExaQMuYskwb+x1x7J371tWlbBdWHroy99KnVB6qIkUbs5X3UQ=="
},
"node_modules/@patternfly/react-tokens": {
"version": "4.76.1",
"resolved": "https://registry.npmjs.org/@patternfly/react-tokens/-/react-tokens-4.76.1.tgz",
"integrity": "sha512-gLEezRSzQeflaPu3SCgYmWtuiqDIRtxNNFP1+ES7P2o56YHXJ5o1Pki7LpNCPk/VOzHy2+vRFE/7l+hBEweugw=="
"version": "4.83.8",
"resolved": "https://registry.npmjs.org/@patternfly/react-tokens/-/react-tokens-4.83.8.tgz",
"integrity": "sha512-Z/MHXNY8PQOuBFGUar2yzPVbz3BNJuhB+Dnk5RJcc/iIn3S+VlSru7g6v5jqoV/+a5wLqZtLGEBp8uhCZ7Xkig=="
},
"node_modules/@pmmmwh/react-refresh-webpack-plugin": {
"version": "0.5.4",
@@ -5249,9 +5267,9 @@
}
},
"node_modules/ace-builds": {
"version": "1.8.1",
"resolved": "https://registry.npmjs.org/ace-builds/-/ace-builds-1.8.1.tgz",
"integrity": "sha512-wjEQ4khMQYg9FfdEDoOtqdoHwcwFL48H0VB3te5b5A3eqHwxsTw8IX6+xzfisgborIb8dYU+1y9tcmtGFrCPIg=="
"version": "1.10.1",
"resolved": "https://registry.npmjs.org/ace-builds/-/ace-builds-1.10.1.tgz",
"integrity": "sha512-w8Xj6lZUtOYAquVYvdpZhb0GxXrZ+qpVfgj5LP2FwUbXE8fPrCmfu86FjwOiSphx/8PMbXXVldFLD2+RIXayyA=="
},
"node_modules/acorn": {
"version": "7.4.1",
@@ -6448,14 +6466,20 @@
}
},
"node_modules/caniuse-lite": {
"version": "1.0.30001300",
"resolved": "https://registry.npmjs.org/caniuse-lite/-/caniuse-lite-1.0.30001300.tgz",
"integrity": "sha512-cVjiJHWGcNlJi8TZVKNMnvMid3Z3TTdDHmLDzlOdIiZq138Exvo0G+G0wTdVYolxKb4AYwC+38pxodiInVtJSA==",
"version": "1.0.30001393",
"resolved": "https://registry.npmjs.org/caniuse-lite/-/caniuse-lite-1.0.30001393.tgz",
"integrity": "sha512-N/od11RX+Gsk+1qY/jbPa0R6zJupEa0lxeBG598EbrtblxVCTJsQwbRBm6+V+rxpc5lHKdsXb9RY83cZIPLseA==",
"dev": true,
"funding": {
"type": "opencollective",
"url": "https://opencollective.com/browserslist"
}
"funding": [
{
"type": "opencollective",
"url": "https://opencollective.com/browserslist"
},
{
"type": "tidelift",
"url": "https://tidelift.com/funding/github/npm/caniuse-lite"
}
]
},
"node_modules/case-sensitive-paths-webpack-plugin": {
"version": "2.4.0",
@@ -8271,9 +8295,9 @@
}
},
"node_modules/dompurify": {
"version": "2.3.10",
"resolved": "https://registry.npmjs.org/dompurify/-/dompurify-2.3.10.tgz",
"integrity": "sha512-o7Fg/AgC7p/XpKjf/+RC3Ok6k4St5F7Q6q6+Nnm3p2zGWioAY6dh0CbbuwOhH2UcSzKsdniE/YnE2/92JcsA+g=="
"version": "2.4.0",
"resolved": "https://registry.npmjs.org/dompurify/-/dompurify-2.4.0.tgz",
"integrity": "sha512-Be9tbQMZds4a3C6xTmz68NlMfeONA//4dOavl/1rNw50E+/QO0KVpbcU0PcaW0nsQxurXls9ZocqFxk8R2mWEA=="
},
"node_modules/domutils": {
"version": "2.8.0",
@@ -15448,9 +15472,9 @@
}
},
"node_modules/luxon": {
"version": "3.0.1",
"resolved": "https://registry.npmjs.org/luxon/-/luxon-3.0.1.tgz",
"integrity": "sha512-hF3kv0e5gwHQZKz4wtm4c+inDtyc7elkanAsBq+fundaCdUBNJB1dHEGUZIM6SfSBUlbVFduPwEtNjFK8wLtcw==",
"version": "3.0.3",
"resolved": "https://registry.npmjs.org/luxon/-/luxon-3.0.3.tgz",
"integrity": "sha512-+EfHWnF+UT7GgTnq5zXg3ldnTKL2zdv7QJgsU5bjjpbH17E3qi/puMhQyJVYuCq+FRkogvB5WB6iVvUr+E4a7w==",
"engines": {
"node": ">=12"
}
@@ -25069,24 +25093,30 @@
"dev": true
},
"@patternfly/patternfly": {
"version": "4.202.1",
"resolved": "https://registry.npmjs.org/@patternfly/patternfly/-/patternfly-4.202.1.tgz",
"integrity": "sha512-cQiiPqmwJOm9onuTfLPQNRlpAZwDIJ/zVfDQeaFqMQyPJtxtKn3lkphz5xErY5dPs9rR4X94ytQ1I9pkVzaPJQ=="
"version": "4.210.2",
"resolved": "https://registry.npmjs.org/@patternfly/patternfly/-/patternfly-4.210.2.tgz",
"integrity": "sha512-aZiW24Bxi6uVmk5RyNTp+6q6ThtlJZotNRJfWVeGuwu1UlbBuV4DFa1bpjA6jfTZpfEpX2YL5+R+4ZVSCFAVdw=="
},
"@patternfly/react-core": {
"version": "4.224.1",
"resolved": "https://registry.npmjs.org/@patternfly/react-core/-/react-core-4.224.1.tgz",
"integrity": "sha512-v8wGGNoMGndAScAoE5jeOA5jVgymlLSwttPjQk/Idr0k7roSpOrsM39oXUR5DEgkZee45DW00WKTgmg50PP3FQ==",
"version": "4.231.8",
"resolved": "https://registry.npmjs.org/@patternfly/react-core/-/react-core-4.231.8.tgz",
"integrity": "sha512-2ClqlYCvSADppMfVfkUGIA/8XlO6jX8batoClXLxZDwqGoOfr61XyUgQ6SSlE4w60czoNeX4Nf6cfQKUH4RIKw==",
"requires": {
"@patternfly/react-icons": "4.75.1",
"@patternfly/react-styles": "^4.74.1",
"@patternfly/react-tokens": "^4.76.1",
"@patternfly/react-icons": "^4.82.8",
"@patternfly/react-styles": "^4.81.8",
"@patternfly/react-tokens": "^4.83.8",
"focus-trap": "6.9.2",
"react-dropzone": "9.0.0",
"tippy.js": "5.1.2",
"tslib": "^2.0.0"
},
"dependencies": {
"@patternfly/react-icons": {
"version": "4.82.8",
"resolved": "https://registry.npmjs.org/@patternfly/react-icons/-/react-icons-4.82.8.tgz",
"integrity": "sha512-cKixprTiMLZRe/+kmdZ5suvYb9ly9p1f/HjlcNiWBfsiA8ZDEPmxJnVdend/YsafelC8YC9QGcQf97ay5PNhcw==",
"requires": {}
},
"tslib": {
"version": "2.3.1",
"resolved": "https://registry.npmjs.org/tslib/-/tslib-2.3.1.tgz",
@@ -25101,23 +25131,29 @@
"requires": {}
},
"@patternfly/react-styles": {
"version": "4.74.1",
"resolved": "https://registry.npmjs.org/@patternfly/react-styles/-/react-styles-4.74.1.tgz",
"integrity": "sha512-9eWvKrjtrJ3qhJkhY2GQKyYA13u/J0mU1befH49SYbvxZtkbuHdpKmXBAeQoHmcx1hcOKtiYXeKb+dVoRRNx0A=="
"version": "4.81.8",
"resolved": "https://registry.npmjs.org/@patternfly/react-styles/-/react-styles-4.81.8.tgz",
"integrity": "sha512-Q5FiureSSCMIuz+KLMcEm1317TzbXcwmg2q5iNDRKyf/K+5CT6tJp0Wbtk3FlfRvzli4u/7YfXipahia5TL+tA=="
},
"@patternfly/react-table": {
"version": "4.93.1",
"resolved": "https://registry.npmjs.org/@patternfly/react-table/-/react-table-4.93.1.tgz",
"integrity": "sha512-N/zHkNsY3X3yUXPg6COwdZKAFmTCbWm25qCY2aHjrXlIlE2OKWaYvVag0CcTwPiQhIuCumztr9Y2Uw9uvv0Fsw==",
"version": "4.100.8",
"resolved": "https://registry.npmjs.org/@patternfly/react-table/-/react-table-4.100.8.tgz",
"integrity": "sha512-80XZCZzoYN9gsoufNdXUB/dk33SuWF9lUnOJs7ilezD6noTSD7ARqO1h532eaEPIbPBp4uIVkEUdfGSHd0HJtg==",
"requires": {
"@patternfly/react-core": "^4.224.1",
"@patternfly/react-icons": "4.75.1",
"@patternfly/react-styles": "^4.74.1",
"@patternfly/react-tokens": "^4.76.1",
"@patternfly/react-core": "^4.231.8",
"@patternfly/react-icons": "^4.82.8",
"@patternfly/react-styles": "^4.81.8",
"@patternfly/react-tokens": "^4.83.8",
"lodash": "^4.17.19",
"tslib": "^2.0.0"
},
"dependencies": {
"@patternfly/react-icons": {
"version": "4.82.8",
"resolved": "https://registry.npmjs.org/@patternfly/react-icons/-/react-icons-4.82.8.tgz",
"integrity": "sha512-cKixprTiMLZRe/+kmdZ5suvYb9ly9p1f/HjlcNiWBfsiA8ZDEPmxJnVdend/YsafelC8YC9QGcQf97ay5PNhcw==",
"requires": {}
},
"tslib": {
"version": "2.4.0",
"resolved": "https://registry.npmjs.org/tslib/-/tslib-2.4.0.tgz",
@@ -25126,9 +25162,9 @@
}
},
"@patternfly/react-tokens": {
"version": "4.76.1",
"resolved": "https://registry.npmjs.org/@patternfly/react-tokens/-/react-tokens-4.76.1.tgz",
"integrity": "sha512-gLEezRSzQeflaPu3SCgYmWtuiqDIRtxNNFP1+ES7P2o56YHXJ5o1Pki7LpNCPk/VOzHy2+vRFE/7l+hBEweugw=="
"version": "4.83.8",
"resolved": "https://registry.npmjs.org/@patternfly/react-tokens/-/react-tokens-4.83.8.tgz",
"integrity": "sha512-Z/MHXNY8PQOuBFGUar2yzPVbz3BNJuhB+Dnk5RJcc/iIn3S+VlSru7g6v5jqoV/+a5wLqZtLGEBp8uhCZ7Xkig=="
},
"@pmmmwh/react-refresh-webpack-plugin": {
"version": "0.5.4",
@@ -26307,9 +26343,9 @@
}
},
"ace-builds": {
"version": "1.8.1",
"resolved": "https://registry.npmjs.org/ace-builds/-/ace-builds-1.8.1.tgz",
"integrity": "sha512-wjEQ4khMQYg9FfdEDoOtqdoHwcwFL48H0VB3te5b5A3eqHwxsTw8IX6+xzfisgborIb8dYU+1y9tcmtGFrCPIg=="
"version": "1.10.1",
"resolved": "https://registry.npmjs.org/ace-builds/-/ace-builds-1.10.1.tgz",
"integrity": "sha512-w8Xj6lZUtOYAquVYvdpZhb0GxXrZ+qpVfgj5LP2FwUbXE8fPrCmfu86FjwOiSphx/8PMbXXVldFLD2+RIXayyA=="
},
"acorn": {
"version": "7.4.1",
@@ -27264,9 +27300,9 @@
}
},
"caniuse-lite": {
"version": "1.0.30001300",
"resolved": "https://registry.npmjs.org/caniuse-lite/-/caniuse-lite-1.0.30001300.tgz",
"integrity": "sha512-cVjiJHWGcNlJi8TZVKNMnvMid3Z3TTdDHmLDzlOdIiZq138Exvo0G+G0wTdVYolxKb4AYwC+38pxodiInVtJSA==",
"version": "1.0.30001393",
"resolved": "https://registry.npmjs.org/caniuse-lite/-/caniuse-lite-1.0.30001393.tgz",
"integrity": "sha512-N/od11RX+Gsk+1qY/jbPa0R6zJupEa0lxeBG598EbrtblxVCTJsQwbRBm6+V+rxpc5lHKdsXb9RY83cZIPLseA==",
"dev": true
},
"case-sensitive-paths-webpack-plugin": {
@@ -28661,9 +28697,9 @@
}
},
"dompurify": {
"version": "2.3.10",
"resolved": "https://registry.npmjs.org/dompurify/-/dompurify-2.3.10.tgz",
"integrity": "sha512-o7Fg/AgC7p/XpKjf/+RC3Ok6k4St5F7Q6q6+Nnm3p2zGWioAY6dh0CbbuwOhH2UcSzKsdniE/YnE2/92JcsA+g=="
"version": "2.4.0",
"resolved": "https://registry.npmjs.org/dompurify/-/dompurify-2.4.0.tgz",
"integrity": "sha512-Be9tbQMZds4a3C6xTmz68NlMfeONA//4dOavl/1rNw50E+/QO0KVpbcU0PcaW0nsQxurXls9ZocqFxk8R2mWEA=="
},
"domutils": {
"version": "2.8.0",
@@ -34183,9 +34219,9 @@
}
},
"luxon": {
"version": "3.0.1",
"resolved": "https://registry.npmjs.org/luxon/-/luxon-3.0.1.tgz",
"integrity": "sha512-hF3kv0e5gwHQZKz4wtm4c+inDtyc7elkanAsBq+fundaCdUBNJB1dHEGUZIM6SfSBUlbVFduPwEtNjFK8wLtcw=="
"version": "3.0.3",
"resolved": "https://registry.npmjs.org/luxon/-/luxon-3.0.3.tgz",
"integrity": "sha512-+EfHWnF+UT7GgTnq5zXg3ldnTKL2zdv7QJgsU5bjjpbH17E3qi/puMhQyJVYuCq+FRkogvB5WB6iVvUr+E4a7w=="
},
"lz-string": {
"version": "1.4.4",

View File

@@ -7,22 +7,22 @@
},
"dependencies": {
"@lingui/react": "3.14.0",
"@patternfly/patternfly": "4.202.1",
"@patternfly/patternfly": "4.210.2",
"@patternfly/react-core": "^4.221.3",
"@patternfly/react-icons": "4.75.1",
"@patternfly/react-table": "4.93.1",
"ace-builds": "^1.8.1",
"@patternfly/react-table": "4.100.8",
"ace-builds": "^1.10.1",
"ansi-to-html": "0.7.2",
"axios": "0.27.2",
"codemirror": "^6.0.1",
"d3": "7.4.4",
"dagre": "^0.8.4",
"dompurify": "2.3.10",
"dompurify": "2.4.0",
"formik": "2.2.9",
"has-ansi": "5.0.1",
"html-entities": "2.3.2",
"js-yaml": "4.1.0",
"luxon": "^3.0.1",
"luxon": "^3.0.3",
"prop-types": "^15.8.1",
"react": "17.0.2",
"react-ace": "^10.1.0",

View File

@@ -7,7 +7,15 @@ class CredentialTypes extends Base {
}
async loadAllTypes(
acceptableKinds = ['machine', 'cloud', 'net', 'ssh', 'vault', 'kubernetes']
acceptableKinds = [
'machine',
'cloud',
'net',
'ssh',
'vault',
'kubernetes',
'cryptography',
]
) {
const pageSize = 200;
// The number of credential types a user can have is unlimited. In practice, it is unlikely for

View File

@@ -9,6 +9,8 @@ function CredentialChip({ credential, ...props }) {
let type;
if (credential.cloud) {
type = t`Cloud`;
} else if (credential.kind === 'gpg_public_key') {
type = t`GPG Public Key`;
} else if (credential.kind === 'aws' || credential.kind === 'ssh') {
type = credential.kind.toUpperCase();
} else {

View File

@@ -29,4 +29,8 @@ export default styled(DetailList)`
--column-count: 3;
}
`}
& + & {
margin-top: 20px;
}
`;

View File

@@ -125,6 +125,21 @@ function PromptProjectDetail({ resource }) {
}
/>
)}
{summary_fields?.signature_validation_credential?.id && (
<Detail
label={t`Content Signature Validation Credential`}
dataCy={`${prefixCy}-content-signature-validation-credential`}
value={
<CredentialChip
key={resource.summary_fields.signature_validation_credential.id}
credential={
resource.summary_fields.signature_validation_credential
}
isReadOnly
/>
}
/>
)}
{optionsList && (
<Detail
label={t`Enabled Options`}

View File

@@ -10,7 +10,13 @@ const Label = styled.div`
font-weight: var(--pf-global--FontWeight--bold);
`;
export default function FrequencyDetails({ type, label, options, timezone }) {
export default function FrequencyDetails({
type,
label,
options,
timezone,
isException,
}) {
const getRunEveryLabel = () => {
const { interval } = options;
switch (type) {
@@ -77,11 +83,17 @@ export default function FrequencyDetails({ type, label, options, timezone }) {
6: t`Sunday`,
};
const prefix = isException ? `exception-${type}` : `frequency-${type}`;
return (
<div>
<Label>{label}</Label>
<DetailList gutter="sm">
<Detail label={t`Run every`} value={getRunEveryLabel()} />
<Detail
label={isException ? t`Skip every` : t`Run every`}
value={getRunEveryLabel()}
dataCy={`${prefix}-run-every`}
/>
{type === 'week' ? (
<Detail
label={t`On days`}
@@ -89,10 +101,15 @@ export default function FrequencyDetails({ type, label, options, timezone }) {
.sort(sortWeekday)
.map((d) => weekdays[d.weekday])
.join(', ')}
dataCy={`${prefix}-days-of-week`}
/>
) : null}
<RunOnDetail type={type} options={options} />
<Detail label={t`End`} value={getEndValue(type, options, timezone)} />
<RunOnDetail type={type} options={options} prefix={prefix} />
<Detail
label={t`End`}
value={getEndValue(type, options, timezone)}
dataCy={`${prefix}-end`}
/>
</DetailList>
</div>
);
@@ -104,11 +121,15 @@ function sortWeekday(a, b) {
return a.weekday - b.weekday;
}
function RunOnDetail({ type, options }) {
function RunOnDetail({ type, options, prefix }) {
if (type === 'month') {
if (options.runOn === 'day') {
return (
<Detail label={t`Run on`} value={t`Day ${options.runOnDayNumber}`} />
<Detail
label={t`Run on`}
value={t`Day ${options.runOnDayNumber}`}
dataCy={`${prefix}-run-on-day`}
/>
);
}
const dayOfWeek = options.runOnTheDay;
@@ -129,6 +150,7 @@ function RunOnDetail({ type, options }) {
/>
)
}
dataCy={`${prefix}-run-on-day`}
/>
);
}
@@ -152,6 +174,7 @@ function RunOnDetail({ type, options }) {
<Detail
label={t`Run on`}
value={`${months[options.runOnTheMonth]} ${options.runOnDayMonth}`}
dataCy={`${prefix}-run-on-day`}
/>
);
}
@@ -186,6 +209,7 @@ function RunOnDetail({ type, options }) {
/>
)
}
dataCy={`${prefix}-run-on-day`}
/>
);
}

View File

@@ -2,7 +2,6 @@ import 'styled-components/macro';
import React, { useCallback, useEffect } from 'react';
import { Link, useHistory, useLocation } from 'react-router-dom';
import styled from 'styled-components';
import { t } from '@lingui/macro';
import { Chip, Divider, Title, Button } from '@patternfly/react-core';
import { Schedule } from 'types';
@@ -26,7 +25,7 @@ import ErrorDetail from '../../ErrorDetail';
import ChipGroup from '../../ChipGroup';
import { VariablesDetail } from '../../CodeEditor';
import { VERBOSITY } from '../../VerbositySelectField';
import helpText from '../../../screens/Template/shared/JobTemplate.helptext';
import getHelpText from '../../../screens/Template/shared/JobTemplate.helptext';
const PromptDivider = styled(Divider)`
margin-top: var(--pf-global--spacer--lg);
@@ -60,6 +59,10 @@ const FrequencyDetailsContainer = styled.div`
padding-bottom: var(--pf-global--spacer--md);
border-bottom: 1px solid var(--pf-global--palette--black-300);
}
& + & {
margin-top: calc(var(--pf-global--spacer--lg) * -1);
}
`;
function ScheduleDetail({ hasDaysToKeepField, schedule, surveyConfig }) {
@@ -85,7 +88,7 @@ function ScheduleDetail({ hasDaysToKeepField, schedule, surveyConfig }) {
timezone,
verbosity,
} = schedule;
const helpText = getHelpText();
const history = useHistory();
const { pathname } = useLocation();
const pathRoot = pathname.substr(0, pathname.indexOf('schedules'));
@@ -161,10 +164,14 @@ function ScheduleDetail({ hasDaysToKeepField, schedule, surveyConfig }) {
month: t`Month`,
year: t`Year`,
};
const { frequency, frequencyOptions } = parseRuleObj(schedule);
const { frequency, frequencyOptions, exceptionFrequency, exceptionOptions } =
parseRuleObj(schedule);
const repeatFrequency = frequency.length
? frequency.map((f) => frequencies[f]).join(', ')
: t`None (Run Once)`;
const exceptionRepeatFrequency = exceptionFrequency.length
? exceptionFrequency.map((f) => frequencies[f]).join(', ')
: t`None (Run Once)`;
const {
ask_credential_on_launch,
@@ -271,43 +278,84 @@ function ScheduleDetail({ hasDaysToKeepField, schedule, surveyConfig }) {
isDisabled={isDisabled}
/>
<DetailList gutter="sm">
<Detail label={t`Name`} value={name} />
<Detail label={t`Description`} value={description} />
<Detail label={t`Name`} value={name} dataCy="schedule-name" />
<Detail
label={t`Description`}
value={description}
dataCy="schedule-description"
/>
<Detail
label={t`First Run`}
value={formatDateString(dtstart, timezone)}
dataCy="schedule-first-run"
/>
<Detail
label={t`Next Run`}
value={formatDateString(next_run, timezone)}
dataCy="schedule-next-run"
/>
<Detail label={t`Last Run`} value={formatDateString(dtend, timezone)} />
<Detail
label={t`Local Time Zone`}
value={timezone}
helpText={helpText.localTimeZone(config)}
dataCy="schedule-timezone"
/>
<Detail
label={t`Repeat Frequency`}
value={repeatFrequency}
dataCy="schedule-repeat-frequency"
/>
<Detail
label={t`Exception Frequency`}
value={exceptionRepeatFrequency}
dataCy="schedule-exception-frequency"
/>
<Detail label={t`Repeat Frequency`} value={repeatFrequency} />
</DetailList>
{frequency.length ? (
<FrequencyDetailsContainer>
<p>
<strong>{t`Frequency Details`}</strong>
</p>
{frequency.map((freq) => (
<FrequencyDetails
key={freq}
type={freq}
label={frequencies[freq]}
options={frequencyOptions[freq]}
timezone={timezone}
/>
))}
<div ouia-component-id="schedule-frequency-details">
<p>
<strong>{t`Frequency Details`}</strong>
</p>
{frequency.map((freq) => (
<FrequencyDetails
key={freq}
type={freq}
label={frequencies[freq]}
options={frequencyOptions[freq]}
timezone={timezone}
/>
))}
</div>
</FrequencyDetailsContainer>
) : null}
{exceptionFrequency.length ? (
<FrequencyDetailsContainer>
<div ouia-component-id="schedule-exception-details">
<p css="border-top: 0">
<strong>{t`Frequency Exception Details`}</strong>
</p>
{exceptionFrequency.map((freq) => (
<FrequencyDetails
key={freq}
type={freq}
label={frequencies[freq]}
options={exceptionOptions[freq]}
timezone={timezone}
isException
/>
))}
</div>
</FrequencyDetailsContainer>
) : null}
<DetailList gutter="sm">
{hasDaysToKeepField ? (
<Detail label={t`Days of Data to Keep`} value={daysToKeep} />
<Detail
label={t`Days of Data to Keep`}
value={daysToKeep}
dataCy="schedule-days-to-keep"
/>
) : null}
<ScheduleOccurrences preview={preview} tz={timezone} />
<UserDateDetail
@@ -327,7 +375,11 @@ function ScheduleDetail({ hasDaysToKeepField, schedule, surveyConfig }) {
<PromptDivider />
<PromptDetailList>
{ask_job_type_on_launch && (
<Detail label={t`Job Type`} value={job_type} />
<Detail
label={t`Job Type`}
value={job_type}
dataCy="shedule-job-type"
/>
)}
{showInventoryDetail && (
<Detail
@@ -347,19 +399,31 @@ function ScheduleDetail({ hasDaysToKeepField, schedule, surveyConfig }) {
' '
)
}
dataCy="schedule-inventory"
/>
)}
{ask_verbosity_on_launch && (
<Detail label={t`Verbosity`} value={VERBOSITY()[verbosity]} />
<Detail
label={t`Verbosity`}
value={VERBOSITY()[verbosity]}
dataCy="schedule-verbosity"
/>
)}
{ask_scm_branch_on_launch && (
<Detail label={t`Source Control Branch`} value={scm_branch} />
<Detail
label={t`Source Control Branch`}
value={scm_branch}
dataCy="schedule-scm-branch"
/>
)}
{ask_limit_on_launch && (
<Detail label={t`Limit`} value={limit} dataCy="schedule-limit" />
)}
{ask_limit_on_launch && <Detail label={t`Limit`} value={limit} />}
{showDiffModeDetail && (
<Detail
label={t`Show Changes`}
value={diff_mode ? t`On` : t`Off`}
dataCy="schedule-show-changes"
/>
)}
{showCredentialsDetail && (
@@ -382,6 +446,7 @@ function ScheduleDetail({ hasDaysToKeepField, schedule, surveyConfig }) {
))}
</ChipGroup>
}
dataCy="schedule-credentials"
/>
)}
{showTagsDetail && (
@@ -405,6 +470,7 @@ function ScheduleDetail({ hasDaysToKeepField, schedule, surveyConfig }) {
))}
</ChipGroup>
}
dataCy="schedule-job-tags"
/>
)}
{showSkipTagsDetail && (
@@ -428,6 +494,7 @@ function ScheduleDetail({ hasDaysToKeepField, schedule, surveyConfig }) {
))}
</ChipGroup>
}
dataCy="schedule-skip-tags"
/>
)}
{showVariablesDetail && (

View File

@@ -45,7 +45,7 @@ const Checkbox = styled(_Checkbox)`
}
`;
const FrequencyDetailSubform = ({ frequency, prefix }) => {
const FrequencyDetailSubform = ({ frequency, prefix, isException }) => {
const id = prefix.replace('.', '-');
const [runOnDayMonth] = useField({
name: `${prefix}.runOnDayMonth`,
@@ -220,7 +220,7 @@ const FrequencyDetailSubform = ({ frequency, prefix }) => {
validated={
!intervalMeta.touched || !intervalMeta.error ? 'default' : 'error'
}
label={t`Run every`}
label={isException ? t`Skip every` : t`Run every`}
>
<div css="display: flex">
<TextInput

View File

@@ -20,6 +20,7 @@ import ScheduleFormFields from './ScheduleFormFields';
import UnsupportedScheduleForm from './UnsupportedScheduleForm';
import parseRuleObj, { UnsupportedRRuleError } from './parseRuleObj';
import buildRuleObj from './buildRuleObj';
import buildRuleSet from './buildRuleSet';
const NUM_DAYS_PER_FREQUENCY = {
week: 7,
@@ -411,6 +412,10 @@ function ScheduleForm({
}
});
if (values.exceptionFrequency.length > 0 && !scheduleHasInstances(values)) {
errors.exceptionFrequency = t`This schedule has no occurrences due to the selected exceptions.`;
}
return errors;
};
@@ -518,3 +523,24 @@ ScheduleForm.defaultProps = {
};
export default ScheduleForm;
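// Sanity check used by the validator above: build the complete rule set (frequencies plus
// exceptions) with a UTC dtstart and confirm that at least one occurrence falls within the
// widest selected frequency window after the schedule's start date.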
function scheduleHasInstances(values) {
let rangeToCheck = 1;
values.frequency.forEach((freq) => {
if (NUM_DAYS_PER_FREQUENCY[freq] > rangeToCheck) {
rangeToCheck = NUM_DAYS_PER_FREQUENCY[freq];
}
});
const ruleSet = buildRuleSet(values, true);
const startDate = DateTime.fromISO(values.startDate);
const endDate = startDate.plus({ days: rangeToCheck });
const instances = ruleSet.between(
startDate.toJSDate(),
endDate.toJSDate(),
true,
(date, i) => i === 0
);
return instances.length > 0;
}

View File

@@ -86,7 +86,7 @@ const mockSchedule = {
let wrapper;
const defaultFieldsVisible = () => {
const defaultFieldsVisible = (isExceptionsVisible) => {
expect(wrapper.find('FormGroup[label="Name"]').length).toBe(1);
expect(wrapper.find('FormGroup[label="Description"]').length).toBe(1);
expect(wrapper.find('FormGroup[label="Start date/time"]').length).toBe(1);
@@ -94,7 +94,11 @@ const defaultFieldsVisible = () => {
expect(
wrapper.find('FormGroup[label="Local time zone"]').find('HelpIcon').length
).toBe(1);
expect(wrapper.find('FrequencySelect').length).toBe(1);
if (isExceptionsVisible) {
expect(wrapper.find('FrequencySelect').length).toBe(2);
} else {
expect(wrapper.find('FrequencySelect').length).toBe(1);
}
};
const nonRRuleValuesMatch = () => {
@@ -513,7 +517,7 @@ describe('<ScheduleForm />', () => {
runFrequencySelect.invoke('onChange')(['minute']);
});
wrapper.update();
defaultFieldsVisible();
defaultFieldsVisible(true);
expect(wrapper.find('FormGroup[label="Run every"]').length).toBe(1);
expect(wrapper.find('FormGroup[label="End"]').length).toBe(1);
expect(wrapper.find('FormGroup[label="On days"]').length).toBe(0);
@@ -547,7 +551,7 @@ describe('<ScheduleForm />', () => {
runFrequencySelect.invoke('onChange')(['hour']);
});
wrapper.update();
defaultFieldsVisible();
defaultFieldsVisible(true);
expect(wrapper.find('FormGroup[label="Run every"]').length).toBe(1);
expect(wrapper.find('FormGroup[label="End"]').length).toBe(1);
expect(wrapper.find('FormGroup[label="On days"]').length).toBe(0);
@@ -579,7 +583,7 @@ describe('<ScheduleForm />', () => {
runFrequencySelect.invoke('onChange')(['day']);
});
wrapper.update();
defaultFieldsVisible();
defaultFieldsVisible(true);
expect(wrapper.find('FormGroup[label="Run every"]').length).toBe(1);
expect(wrapper.find('FormGroup[label="End"]').length).toBe(1);
expect(wrapper.find('FormGroup[label="On days"]').length).toBe(0);
@@ -611,7 +615,7 @@ describe('<ScheduleForm />', () => {
runFrequencySelect.invoke('onChange')(['week']);
});
wrapper.update();
defaultFieldsVisible();
defaultFieldsVisible(true);
expect(wrapper.find('FormGroup[label="Run every"]').length).toBe(1);
expect(wrapper.find('FormGroup[label="End"]').length).toBe(1);
expect(wrapper.find('FormGroup[label="On days"]').length).toBe(1);
@@ -643,7 +647,7 @@ describe('<ScheduleForm />', () => {
runFrequencySelect.invoke('onChange')(['month']);
});
wrapper.update();
defaultFieldsVisible();
defaultFieldsVisible(true);
expect(wrapper.find('FormGroup[label="Run every"]').length).toBe(1);
expect(wrapper.find('FormGroup[label="End"]').length).toBe(1);
expect(wrapper.find('FormGroup[label="On days"]').length).toBe(0);
@@ -692,7 +696,7 @@ describe('<ScheduleForm />', () => {
runFrequencySelect.invoke('onChange')(['year']);
});
wrapper.update();
defaultFieldsVisible();
defaultFieldsVisible(true);
expect(wrapper.find('FormGroup[label="Run every"]').length).toBe(1);
expect(wrapper.find('FormGroup[label="End"]').length).toBe(1);
expect(wrapper.find('FormGroup[label="On days"]').length).toBe(0);
@@ -1058,7 +1062,7 @@ describe('<ScheduleForm />', () => {
wrapper.update();
expect(wrapper.find('ScheduleForm').length).toBe(1);
defaultFieldsVisible();
defaultFieldsVisible(true);
expect(wrapper.find('FormGroup[label="End"]').length).toBe(1);
expect(wrapper.find('FormGroup[label="Run every"]').length).toBe(1);
expect(wrapper.find('FormGroup[label="Occurrences"]').length).toBe(0);
@@ -1113,7 +1117,7 @@ describe('<ScheduleForm />', () => {
wrapper.update();
expect(wrapper.find('ScheduleForm').length).toBe(1);
defaultFieldsVisible();
defaultFieldsVisible(true);
expect(wrapper.find('FormGroup[label="End"]').length).toBe(1);
expect(wrapper.find('FormGroup[label="Run every"]').length).toBe(1);
expect(wrapper.find('FormGroup[label="Occurrences"]').length).toBe(1);
@@ -1171,7 +1175,7 @@ describe('<ScheduleForm />', () => {
wrapper.update();
expect(wrapper.find('ScheduleForm').length).toBe(1);
defaultFieldsVisible();
defaultFieldsVisible(true);
expect(wrapper.find('FormGroup[label="Run every"]').length).toBe(1);
expect(wrapper.find('FormGroup[label="End"]').length).toBe(1);
expect(wrapper.find('FormGroup[label="On days"]').length).toBe(0);
@@ -1224,7 +1228,7 @@ describe('<ScheduleForm />', () => {
wrapper.update();
expect(wrapper.find('ScheduleForm').length).toBe(1);
defaultFieldsVisible();
defaultFieldsVisible(true);
expect(wrapper.find('FormGroup[label="End"]').length).toBe(1);
expect(wrapper.find('FormGroup[label="Run every"]').length).toBe(1);
expect(wrapper.find('FormGroup[label="End date/time"]').length).toBe(1);
@@ -1318,10 +1322,7 @@ describe('<ScheduleForm />', () => {
wrapper.update();
expect(wrapper.find('ScheduleForm').length).toBe(1);
defaultFieldsVisible();
expect(wrapper.find('ScheduleForm').length).toBe(1);
defaultFieldsVisible();
defaultFieldsVisible(true);
expect(wrapper.find('FormGroup[label="End"]').length).toBe(1);
expect(wrapper.find('FormGroup[label="Run every"]').length).toBe(1);
expect(wrapper.find('FormGroup[label="Run on"]').length).toBe(1);
@@ -1394,7 +1395,7 @@ describe('<ScheduleForm />', () => {
wrapper.update();
expect(wrapper.find('ScheduleForm').length).toBe(1);
defaultFieldsVisible();
defaultFieldsVisible(true);
expect(wrapper.find('FormGroup[label="End"]').length).toBe(1);
expect(wrapper.find('FormGroup[label="Run every"]').length).toBe(1);
expect(wrapper.find('FormGroup[label="Run on"]').length).toBe(1);

View File

@@ -3,13 +3,14 @@ import { useField } from 'formik';
import { FormGroup, Title } from '@patternfly/react-core';
import { t } from '@lingui/macro';
import styled from 'styled-components';
import 'styled-components/macro';
import FormField from 'components/FormField';
import { required } from 'util/validators';
import { useConfig } from 'contexts/Config';
import Popover from '../../Popover';
import AnsibleSelect from '../../AnsibleSelect';
import FrequencySelect, { SelectOption } from './FrequencySelect';
import helpText from '../../../screens/Template/shared/JobTemplate.helptext';
import getHelpText from '../../../screens/Template/shared/JobTemplate.helptext';
import { SubFormLayout, FormColumnLayout } from '../../FormLayout';
import FrequencyDetailSubform from './FrequencyDetailSubform';
import DateTimePicker from './DateTimePicker';
@@ -26,6 +27,7 @@ export default function ScheduleFormFields({
zoneOptions,
zoneLinks,
}) {
const helpText = getHelpText();
const [timezone, timezoneMeta] = useField({
name: 'timezone',
validate: required(t`Select a value for this field`),
@@ -53,11 +55,11 @@ export default function ScheduleFormFields({
}
const config = useConfig();
// const [exceptionFrequency, exceptionFrequencyMeta, exceptionFrequencyHelper] =
// useField({
// name: 'exceptionFrequency',
// validate: required(t`Select a value for this field`),
// });
const [exceptionFrequency, exceptionFrequencyMeta, exceptionFrequencyHelper] =
useField({
name: 'exceptionFrequency',
validate: required(t`Select a value for this field`),
});
const updateFrequency = (setFrequency) => (values) => {
setFrequency(values.sort(sortFrequencies));
@@ -151,42 +153,53 @@ export default function ScheduleFormFields({
/>
</FormColumnLayout>
))}
{/* <Title size="md" headingLevel="h4">{t`Exceptions`}</Title>
<FormGroup
name="exceptions"
fieldId="exception-frequency"
helperTextInvalid={exceptionFrequencyMeta.error}
validated={
!exceptionFrequencyMeta.touched || !exceptionFrequencyMeta.error
? 'default'
: 'error'
}
label={t`Add exceptions`}
>
<FrequencySelect
variant={SelectVariant.checkbox}
onChange={exceptionFrequencyHelper.setValue}
value={exceptionFrequency.value}
placeholderText={t`None`}
onBlur={exceptionFrequencyHelper.setTouched}
<Title
size="md"
headingLevel="h4"
css="margin-top: var(--pf-c-card--child--PaddingRight)"
>{t`Exceptions`}</Title>
<FormColumnLayout stacked>
<FormGroup
name="exceptions"
fieldId="exception-frequency"
helperTextInvalid={exceptionFrequencyMeta.error}
validated={
!exceptionFrequencyMeta.touched || !exceptionFrequencyMeta.error
? 'default'
: 'error'
}
label={t`Add exceptions`}
>
<SelectClearOption value="none">{t`None`}</SelectClearOption>
<SelectOption value="minute">{t`Minute`}</SelectOption>
<SelectOption value="hour">{t`Hour`}</SelectOption>
<SelectOption value="day">{t`Day`}</SelectOption>
<SelectOption value="week">{t`Week`}</SelectOption>
<SelectOption value="month">{t`Month`}</SelectOption>
<SelectOption value="year">{t`Year`}</SelectOption>
</FrequencySelect>
</FormGroup>
<FrequencySelect
id="exception-frequency"
onChange={updateFrequency(exceptionFrequencyHelper.setValue)}
value={exceptionFrequency.value}
placeholderText={
exceptionFrequency.value.length
? t`Select frequency`
: t`None`
}
onBlur={exceptionFrequencyHelper.setTouched}
>
<SelectClearOption value="none">{t`None`}</SelectClearOption>
<SelectOption value="minute">{t`Minute`}</SelectOption>
<SelectOption value="hour">{t`Hour`}</SelectOption>
<SelectOption value="day">{t`Day`}</SelectOption>
<SelectOption value="week">{t`Week`}</SelectOption>
<SelectOption value="month">{t`Month`}</SelectOption>
<SelectOption value="year">{t`Year`}</SelectOption>
</FrequencySelect>
</FormGroup>
</FormColumnLayout>
{exceptionFrequency.value.map((val) => (
<FormColumnLayout key={val} stacked>
<FrequencyDetailSubform
frequency={val}
prefix={`exceptionOptions.${val}`}
isException
/>
</FormColumnLayout>
))} */}
))}
</SubFormLayout>
) : null}
</>

View File

@@ -36,11 +36,19 @@ function pad(num) {
return num < 10 ? `0${num}` : num;
}
export default function buildRuleObj(values) {
export default function buildRuleObj(values, includeStart) {
const ruleObj = {
interval: values.interval,
};
if (includeStart) {
ruleObj.dtstart = buildDateTime(
values.startDate,
values.startTime,
values.timezone
);
}
switch (values.frequency) {
case 'none':
ruleObj.count = 1;
@@ -91,16 +99,11 @@ export default function buildRuleObj(values) {
ruleObj.count = values.occurrences;
break;
case 'onDate': {
const [endHour, endMinute] = parseTime(values.endTime);
const localEndDate = DateTime.fromISO(`${values.endDate}T000000`, {
zone: values.timezone,
});
const localEndTime = localEndDate.set({
hour: endHour,
minute: endMinute,
second: 0,
});
ruleObj.until = localEndTime.toJSDate();
ruleObj.until = buildDateTime(
values.endDate,
values.endTime,
values.timezone
);
break;
}
default:
@@ -110,3 +113,16 @@ export default function buildRuleObj(values) {
return ruleObj;
}
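// Combine a YYYY-MM-DD date string and a 12-hour time string (e.g. "12:30 PM") into a
// JS Date anchored in the schedule's time zone.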
function buildDateTime(dateString, timeString, timezone) {
const localDate = DateTime.fromISO(`${dateString}T000000`, {
zone: timezone,
});
const [hour, minute] = parseTime(timeString);
const localTime = localDate.set({
hour,
minute,
second: 0,
});
return localTime.toJSDate();
}

View File

@@ -4,24 +4,29 @@ import buildRuleObj, { buildDtStartObj } from './buildRuleObj';
window.RRuleSet = RRuleSet;
const frequencies = ['minute', 'hour', 'day', 'week', 'month', 'year'];
export default function buildRuleSet(values) {
export default function buildRuleSet(values, useUTCStart) {
const set = new RRuleSet();
const startRule = buildDtStartObj({
startDate: values.startDate,
startTime: values.startTime,
timezone: values.timezone,
});
set.rrule(startRule);
if (values.frequency.length === 0) {
const rule = buildRuleObj({
if (!useUTCStart) {
const startRule = buildDtStartObj({
startDate: values.startDate,
startTime: values.startTime,
timezone: values.timezone,
frequency: 'none',
interval: 1,
});
set.rrule(startRule);
}
if (values.frequency.length === 0) {
const rule = buildRuleObj(
{
startDate: values.startDate,
startTime: values.startTime,
timezone: values.timezone,
frequency: 'none',
interval: 1,
},
useUTCStart
);
set.rrule(new RRule(rule));
}
@@ -29,17 +34,35 @@ export default function buildRuleSet(values) {
if (!values.frequency.includes(frequency)) {
return;
}
const rule = buildRuleObj({
startDate: values.startDate,
startTime: values.startTime,
timezone: values.timezone,
frequency,
...values.frequencyOptions[frequency],
});
const rule = buildRuleObj(
{
startDate: values.startDate,
startTime: values.startTime,
timezone: values.timezone,
frequency,
...values.frequencyOptions[frequency],
},
useUTCStart
);
set.rrule(new RRule(rule));
});
// TODO: exclusions
frequencies.forEach((frequency) => {
if (!values.exceptionFrequency?.includes(frequency)) {
return;
}
const rule = buildRuleObj(
{
startDate: values.startDate,
startTime: values.startTime,
timezone: values.timezone,
frequency,
...values.exceptionOptions[frequency],
},
useUTCStart
);
set.exrule(new RRule(rule));
});
return set;
}
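The exrule() calls added here follow standard iCalendar RRULE/EXRULE semantics: occurrences produced by an exception rule are subtracted from the set. A small illustration of that behavior, written with python-dateutil purely as an aside (the UI itself uses the JavaScript rrule package shown above):
from datetime import datetime
from dateutil.rrule import rrule, rruleset, MINUTELY, HOURLY

start = datetime(2022, 6, 13, 12, 30)
rules = rruleset()
rules.rrule(rrule(MINUTELY, interval=1, dtstart=start))  # run every minute
rules.exrule(rrule(HOURLY, interval=3, dtstart=start))   # exception: skip the occurrence every 3 hours

# Prints the 12:31, 12:32 and 12:33 occurrences; 12:30 is dropped because the
# exception rule also lands exactly on it.
print(rules.between(datetime(2022, 6, 13, 12, 29), datetime(2022, 6, 13, 12, 33), inc=True))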

View File

@@ -243,4 +243,257 @@ RRULE:INTERVAL=1;FREQ=MONTHLY;BYSETPOS=2;BYDAY=MO;UNTIL=20260602T170000Z`);
expect(ruleSet.toString()).toEqual(`DTSTART:20220613T123000Z
RRULE:INTERVAL=1;COUNT=1;FREQ=MINUTELY`);
});
test('should build minutely exception', () => {
const values = {
startDate: '2022-06-13',
startTime: '12:30 PM',
frequency: ['minute'],
frequencyOptions: {
minute: {
interval: 1,
end: 'never',
},
},
exceptionFrequency: ['minute'],
exceptionOptions: {
minute: {
interval: 3,
end: 'never',
},
},
};
const ruleSet = buildRuleSet(values);
expect(ruleSet.toString()).toEqual(
[
'DTSTART:20220613T123000Z',
'RRULE:INTERVAL=1;FREQ=MINUTELY',
'EXRULE:INTERVAL=3;FREQ=MINUTELY',
].join('\n')
);
});
test('should build hourly exception', () => {
const values = {
startDate: '2022-06-13',
startTime: '12:30 PM',
frequency: ['minute'],
frequencyOptions: {
minute: {
interval: 1,
end: 'never',
},
},
exceptionFrequency: ['hour'],
exceptionOptions: {
hour: {
interval: 3,
end: 'never',
},
},
};
const ruleSet = buildRuleSet(values);
expect(ruleSet.toString()).toEqual(
[
'DTSTART:20220613T123000Z',
'RRULE:INTERVAL=1;FREQ=MINUTELY',
'EXRULE:INTERVAL=3;FREQ=HOURLY',
].join('\n')
);
});
test('should build daily exception', () => {
const values = {
startDate: '2022-06-13',
startTime: '12:30 PM',
frequency: ['minute'],
frequencyOptions: {
minute: {
interval: 1,
end: 'never',
},
},
exceptionFrequency: ['day'],
exceptionOptions: {
day: {
interval: 3,
end: 'never',
},
},
};
const ruleSet = buildRuleSet(values);
expect(ruleSet.toString()).toEqual(
[
'DTSTART:20220613T123000Z',
'RRULE:INTERVAL=1;FREQ=MINUTELY',
'EXRULE:INTERVAL=3;FREQ=DAILY',
].join('\n')
);
});
test('should build weekly exception', () => {
const values = {
startDate: '2022-06-13',
startTime: '12:30 PM',
frequency: ['minute'],
frequencyOptions: {
minute: {
interval: 1,
end: 'never',
},
},
exceptionFrequency: ['week'],
exceptionOptions: {
week: {
interval: 3,
end: 'never',
daysOfWeek: [RRule.SU],
},
},
};
const ruleSet = buildRuleSet(values);
expect(ruleSet.toString()).toEqual(
[
'DTSTART:20220613T123000Z',
'RRULE:INTERVAL=1;FREQ=MINUTELY',
'EXRULE:INTERVAL=3;FREQ=WEEKLY;BYDAY=SU',
].join('\n')
);
});
test('should build monthly exception by day', () => {
const values = {
startDate: '2022-06-13',
startTime: '12:30 PM',
frequency: ['minute'],
frequencyOptions: {
minute: {
interval: 1,
end: 'never',
},
},
exceptionFrequency: ['month'],
exceptionOptions: {
month: {
interval: 3,
end: 'never',
runOn: 'day',
runOnDayNumber: 15,
},
},
};
const ruleSet = buildRuleSet(values);
expect(ruleSet.toString()).toEqual(
[
'DTSTART:20220613T123000Z',
'RRULE:INTERVAL=1;FREQ=MINUTELY',
'EXRULE:INTERVAL=3;FREQ=MONTHLY;BYMONTHDAY=15',
].join('\n')
);
});
test('should build monthly exception by weekday', () => {
const values = {
startDate: '2022-06-13',
startTime: '12:30 PM',
frequency: ['minute'],
frequencyOptions: {
minute: {
interval: 1,
end: 'never',
},
},
exceptionFrequency: ['month'],
exceptionOptions: {
month: {
interval: 3,
end: 'never',
runOn: 'the',
runOnTheOccurrence: 2,
runOnTheDay: 'monday',
},
},
};
const ruleSet = buildRuleSet(values);
expect(ruleSet.toString()).toEqual(
[
'DTSTART:20220613T123000Z',
'RRULE:INTERVAL=1;FREQ=MINUTELY',
'EXRULE:INTERVAL=3;FREQ=MONTHLY;BYSETPOS=2;BYDAY=MO',
].join('\n')
);
});
test('should build annual exception by day', () => {
const values = {
startDate: '2022-06-13',
startTime: '12:30 PM',
frequency: ['minute'],
frequencyOptions: {
minute: {
interval: 1,
end: 'never',
},
},
exceptionFrequency: ['year'],
exceptionOptions: {
year: {
interval: 1,
end: 'never',
runOn: 'day',
runOnDayMonth: 3,
runOnDayNumber: 15,
},
},
};
const ruleSet = buildRuleSet(values);
expect(ruleSet.toString()).toEqual(
[
'DTSTART:20220613T123000Z',
'RRULE:INTERVAL=1;FREQ=MINUTELY',
'EXRULE:INTERVAL=1;FREQ=YEARLY;BYMONTH=3;BYMONTHDAY=15',
].join('\n')
);
});
test('should build annual exception by weekday', () => {
const values = {
startDate: '2022-06-13',
startTime: '12:30 PM',
frequency: ['minute'],
frequencyOptions: {
minute: {
interval: 1,
end: 'never',
},
},
exceptionFrequency: ['year'],
exceptionOptions: {
year: {
interval: 1,
end: 'never',
runOn: 'the',
runOnTheOccurrence: 4,
runOnTheDay: 'monday',
runOnTheMonth: 6,
},
},
};
const ruleSet = buildRuleSet(values);
expect(ruleSet.toString()).toEqual(
[
'DTSTART:20220613T123000Z',
'RRULE:INTERVAL=1;FREQ=MINUTELY',
'EXRULE:INTERVAL=1;FREQ=YEARLY;BYSETPOS=4;BYDAY=MO;BYMONTH=6',
].join('\n')
);
});
});

View File

@@ -32,6 +32,9 @@ export default function parseRuleObj(schedule) {
case 'RRULE':
values = parseRrule(ruleString, schedule, values);
break;
case 'EXRULE':
values = parseExRule(ruleString, schedule, values);
break;
default:
throw new UnsupportedRRuleError(`Unsupported rrule type: ${type}`);
}
@@ -79,6 +82,54 @@ const frequencyTypes = {
};
function parseRrule(rruleString, schedule, values) {
const { frequency, options } = parseRule(
rruleString,
schedule,
values.exceptionFrequency
);
if (values.frequencyOptions[frequency]) {
throw new UnsupportedRRuleError(
'Duplicate frequency types not supported'
);
}
return {
...values,
frequency: [...values.frequency, frequency].sort(sortFrequencies),
frequencyOptions: {
...values.frequencyOptions,
[frequency]: options,
},
};
}
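// EXRULE lines describe occurrences to skip; they are collected separately from RRULE
// frequencies so the form can surface them as schedule exceptions.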
function parseExRule(exruleString, schedule, values) {
const { frequency, options } = parseRule(
exruleString,
schedule,
values.exceptionFrequency
);
if (values.exceptionOptions[frequency]) {
throw new UnsupportedRRuleError(
'Duplicate exception frequency types not supported'
);
}
return {
...values,
exceptionFrequency: [...values.exceptionFrequency, frequency].sort(
sortFrequencies
),
exceptionOptions: {
...values.exceptionOptions,
[frequency]: options,
},
};
}
function parseRule(ruleString, schedule, frequencies) {
const {
origOptions: {
bymonth,
@@ -90,7 +141,7 @@ function parseRrule(rruleString, schedule, values) {
interval,
until,
},
} = RRule.fromString(rruleString);
} = RRule.fromString(ruleString);
const now = DateTime.now();
const closestQuarterHour = DateTime.fromMillis(
@@ -127,7 +178,7 @@ function parseRrule(rruleString, schedule, values) {
throw new Error(`Unexpected rrule frequency: ${freq}`);
}
const frequency = frequencyTypes[freq];
if (values.frequency.includes(frequency)) {
if (frequencies.includes(frequency)) {
throw new Error(`Duplicate frequency types not supported (${frequency})`);
}
@@ -171,17 +222,9 @@ function parseRrule(rruleString, schedule, values) {
}
}
if (values.frequencyOptions.frequency) {
throw new UnsupportedRRuleError('Duplicate frequency types not supported');
}
return {
...values,
frequency: [...values.frequency, frequency].sort(sortFrequencies),
frequencyOptions: {
...values.frequencyOptions,
[frequency]: options,
},
frequency,
options,
};
}

View File

@@ -241,4 +241,51 @@ RRULE:INTERVAL=1;FREQ=MONTHLY;BYSETPOS=2;BYDAY=MO;UNTIL=20260602T170000Z`;
expect(parsed).toEqual(values);
});
test('should parse exceptions', () => {
const schedule = {
rrule: [
'DTSTART;TZID=US/Eastern:20220608T123000',
'RRULE:INTERVAL=1;FREQ=WEEKLY;BYDAY=MO',
'EXRULE:INTERVAL=1;FREQ=MONTHLY;BYSETPOS=1;BYDAY=MO',
].join(' '),
dtstart: '2022-06-13T16:30:00Z',
timezone: 'US/Eastern',
until: '',
dtend: null,
};
const parsed = parseRuleObj(schedule);
expect(parsed).toEqual({
startDate: '2022-06-13',
startTime: '12:30 PM',
timezone: 'US/Eastern',
frequency: ['week'],
frequencyOptions: {
week: {
interval: 1,
end: 'never',
occurrences: 1,
endDate: '2022-06-02',
endTime: '1:00 PM',
daysOfWeek: [RRule.MO],
},
},
exceptionFrequency: ['month'],
exceptionOptions: {
month: {
interval: 1,
end: 'never',
endDate: '2022-06-02',
endTime: '1:00 PM',
occurrences: 1,
runOn: 'the',
runOnDayNumber: 1,
runOnTheOccurrence: 1,
runOnTheDay: 'monday',
},
},
});
});
});

View File

@@ -11,13 +11,14 @@ import { Detail, DetailList, UserDateDetail } from 'components/DetailList';
import { ApplicationsAPI } from 'api';
import DeleteButton from 'components/DeleteButton';
import ErrorDetail from 'components/ErrorDetail';
import applicationHelpTextStrings from '../shared/Application.helptext';
import getApplicationHelpTextStrings from '../shared/Application.helptext';
function ApplicationDetails({
application,
authorizationOptions,
clientTypeOptions,
}) {
const applicationHelpTextStrings = getApplicationHelpTextStrings();
const history = useHistory();
const {
isLoading: deleteLoading,

View File

@@ -1,9 +1,9 @@
import { t } from '@lingui/macro';
const applicationHelpTextStrings = {
const applicationHelpTextStrings = () => ({
authorizationGrantType: t`The Grant type the user must use to acquire tokens for this application`,
clientType: t`Set to Public or Confidential depending on how secure the client device is.`,
redirectURIS: t`Allowed URIs list, space separated`,
};
});
export default applicationHelpTextStrings;

View File

@@ -13,13 +13,14 @@ import FormActionGroup from 'components/FormActionGroup/FormActionGroup';
import OrganizationLookup from 'components/Lookup/OrganizationLookup';
import AnsibleSelect from 'components/AnsibleSelect';
import Popover from 'components/Popover';
import applicationHelpTextStrings from './Application.helptext';
import getApplicationHelpTextStrings from './Application.helptext';
function ApplicationFormFields({
application,
authorizationOptions,
clientTypeOptions,
}) {
const applicationHelpTextStrings = getApplicationHelpTextStrings();
const match = useRouteMatch();
const { setFieldValue, setFieldTouched } = useFormikContext();
const [organizationField, organizationMeta, organizationHelpers] =

View File

@@ -12,9 +12,10 @@ import useRequest, { useDismissableError } from 'hooks/useRequest';
import { toTitleCase } from 'util/strings';
import { ExecutionEnvironmentsAPI } from 'api';
import { relatedResourceDeleteRequests } from 'util/getRelatedResourceDeleteDetails';
import helpText from '../shared/ExecutionEnvironment.helptext';
import getHelpText from '../shared/ExecutionEnvironment.helptext';
function ExecutionEnvironmentDetails({ executionEnvironment }) {
const helpText = getHelpText();
const history = useHistory();
const {
id,

View File

@@ -1,7 +1,7 @@
import React from 'react';
import { t } from '@lingui/macro';
const executionEnvironmentHelpTextStrings = {
const executionEnvironmentHelpTextStrings = () => ({
image: (
<span>
{t`The full image location, including the container registry, image name, and version tag.`}
@@ -19,6 +19,6 @@ const executionEnvironmentHelpTextStrings = {
</span>
),
registryCredential: t`Credential to authenticate with a protected container registry.`,
};
});
export default executionEnvironmentHelpTextStrings;

View File

@@ -14,7 +14,7 @@ import ContentError from 'components/ContentError';
import ContentLoading from 'components/ContentLoading';
import { required } from 'util/validators';
import useRequest from 'hooks/useRequest';
import helpText from './ExecutionEnvironment.helptext';
import getHelpText from './ExecutionEnvironment.helptext';
function ExecutionEnvironmentFormFields({
me,
@@ -22,6 +22,7 @@ function ExecutionEnvironmentFormFields({
executionEnvironment,
isOrgLookupDisabled,
}) {
const helpText = getHelpText();
const [credentialField, credentialMeta, credentialHelpers] =
useField('credential');
const [organizationField, organizationMeta, organizationHelpers] =

View File

@@ -16,10 +16,11 @@ import { InventoriesAPI } from 'api';
import useRequest, { useDismissableError } from 'hooks/useRequest';
import { Inventory } from 'types';
import { relatedResourceDeleteRequests } from 'util/getRelatedResourceDeleteDetails';
import helpText from '../shared/Inventory.helptext';
import getHelpText from '../shared/Inventory.helptext';
function InventoryDetail({ inventory }) {
const history = useHistory();
const helpText = getHelpText();
const {
result: instanceGroups,
isLoading,

View File

@@ -32,9 +32,10 @@ import Popover from 'components/Popover';
import { VERBOSITY } from 'components/VerbositySelectField';
import InventorySourceSyncButton from '../shared/InventorySourceSyncButton';
import useWsInventorySourcesDetails from '../InventorySources/useWsInventorySourcesDetails';
import helpText from '../shared/Inventory.helptext';
import getHelpText from '../shared/Inventory.helptext';
function InventorySourceDetail({ inventorySource }) {
const helpText = getHelpText();
const {
created,
custom_virtualenv,

View File

@@ -21,7 +21,7 @@ const ansibleDocUrls = {
'https://docs.ansible.com/ansible/latest/collections/community/vmware/vmware_vm_inventory_inventory.html',
};
const getInventoryHelpTextStrings = {
const getInventoryHelpTextStrings = () => ({
labels: t`Optional labels that describe this inventory,
such as 'dev' or 'test'. Labels can be used to group and filter
inventories and completed jobs.`,
@@ -191,6 +191,6 @@ const getInventoryHelpTextStrings = {
sourcePath: t`The inventory file
to be synced by this source. You can select from
the dropdown or enter a file within the input.`,
};
});
export default getInventoryHelpTextStrings;

View File

@@ -13,9 +13,10 @@ import InstanceGroupsLookup from 'components/Lookup/InstanceGroupsLookup';
import OrganizationLookup from 'components/Lookup/OrganizationLookup';
import ContentError from 'components/ContentError';
import { FormColumnLayout, FormFullWidthLayout } from 'components/FormLayout';
import helpText from './Inventory.helptext';
import getHelpText from './Inventory.helptext';
function InventoryFormFields({ inventory }) {
const helpText = getHelpText();
const [contentError, setContentError] = useState(false);
const { setFieldValue, setFieldTouched } = useFormikContext();
const [organizationField, organizationMeta, organizationHelpers] =

View File

@@ -13,9 +13,10 @@ import {
EnabledValueField,
HostFilterField,
} from './SharedFields';
import helpText from '../Inventory.helptext';
import getHelpText from '../Inventory.helptext';
const AzureSubForm = ({ autoPopulateCredential }) => {
const helpText = getHelpText();
const { setFieldValue, setFieldTouched } = useFormikContext();
const [credentialField, credentialMeta, credentialHelpers] =
useField('credential');

View File

@@ -14,9 +14,10 @@ import {
HostFilterField,
SourceVarsField,
} from './SharedFields';
import helpText from '../Inventory.helptext';
import getHelpText from '../Inventory.helptext';
const ControllerSubForm = ({ autoPopulateCredential }) => {
const helpText = getHelpText();
const { setFieldValue, setFieldTouched } = useFormikContext();
const [credentialField, credentialMeta, credentialHelpers] =
useField('credential');

View File

@@ -12,9 +12,10 @@ import {
EnabledValueField,
HostFilterField,
} from './SharedFields';
import helpText from '../Inventory.helptext';
import getHelpText from '../Inventory.helptext';
const EC2SubForm = () => {
const helpText = getHelpText();
const { setFieldValue, setFieldTouched } = useFormikContext();
const [credentialField, credentialMeta] = useField('credential');
const config = useConfig();


@@ -13,9 +13,10 @@ import {
HostFilterField,
SourceVarsField,
} from './SharedFields';
import helpText from '../Inventory.helptext';
import getHelpText from '../Inventory.helptext';
const GCESubForm = ({ autoPopulateCredential }) => {
const helpText = getHelpText();
const { setFieldValue, setFieldTouched } = useFormikContext();
const [credentialField, credentialMeta, credentialHelpers] =
useField('credential');


@@ -14,9 +14,10 @@ import {
HostFilterField,
SourceVarsField,
} from './SharedFields';
import helpText from '../Inventory.helptext';
import getHelpText from '../Inventory.helptext';
const InsightsSubForm = ({ autoPopulateCredential }) => {
const helpText = getHelpText();
const { setFieldValue, setFieldTouched } = useFormikContext();
const [credentialField, credentialMeta, credentialHelpers] =
useField('credential');


@@ -13,9 +13,10 @@ import {
EnabledValueField,
HostFilterField,
} from './SharedFields';
import helpText from '../Inventory.helptext';
import getHelpText from '../Inventory.helptext';
const OpenStackSubForm = ({ autoPopulateCredential }) => {
const helpText = getHelpText();
const { setFieldValue, setFieldTouched } = useFormikContext();
const [credentialField, credentialMeta, credentialHelpers] =
useField('credential');


@@ -21,9 +21,10 @@ import {
EnabledValueField,
HostFilterField,
} from './SharedFields';
import helpText from '../Inventory.helptext';
import getHelpText from '../Inventory.helptext';
const SCMSubForm = ({ autoPopulateProject }) => {
const helpText = getHelpText();
const [isOpen, setIsOpen] = useState(false);
const [sourcePath, setSourcePath] = useState([]);
const { setFieldValue, setFieldTouched } = useFormikContext();


@@ -13,9 +13,10 @@ import {
EnabledValueField,
HostFilterField,
} from './SharedFields';
import helpText from '../Inventory.helptext';
import getHelpText from '../Inventory.helptext';
const SatelliteSubForm = ({ autoPopulateCredential }) => {
const helpText = getHelpText();
const { setFieldValue, setFieldTouched } = useFormikContext();
const [credentialField, credentialMeta, credentialHelpers] =
useField('credential');


@@ -9,25 +9,29 @@ import { VariablesField } from 'components/CodeEditor';
import FormField, { CheckboxField } from 'components/FormField';
import { FormFullWidthLayout, FormCheckboxLayout } from 'components/FormLayout';
import Popover from 'components/Popover';
import helpText from '../Inventory.helptext';
import getHelpText from '../Inventory.helptext';
export const SourceVarsField = ({ popoverContent }) => (
<FormFullWidthLayout>
<VariablesField
id="source_vars"
name="source_vars"
label={t`Source variables`}
tooltip={
<>
{popoverContent}
{helpText.variables()}
</>
}
/>
</FormFullWidthLayout>
);
export const SourceVarsField = ({ popoverContent }) => {
const helpText = getHelpText();
return (
<FormFullWidthLayout>
<VariablesField
id="source_vars"
name="source_vars"
label={t`Source variables`}
tooltip={
<>
{popoverContent}
{helpText.variables()}
</>
}
/>
</FormFullWidthLayout>
);
};
export const VerbosityField = () => {
const helpText = getHelpText();
const [field, meta, helpers] = useField('verbosity');
const isValid = !(meta.touched && meta.error);
const options = [
@@ -54,6 +58,7 @@ export const VerbosityField = () => {
};
export const OptionsField = () => {
const helpText = getHelpText();
const [updateOnLaunchField] = useField('update_on_launch');
const [, , updateCacheTimeoutHelper] = useField('update_cache_timeout');
const [projectField] = useField('source_project');
@@ -106,33 +111,42 @@ export const OptionsField = () => {
);
};
export const EnabledVarField = () => (
<FormField
id="inventory-enabled-var"
label={t`Enabled Variable`}
tooltip={helpText.enabledVariableField}
name="enabled_var"
type="text"
/>
);
export const EnabledVarField = () => {
const helpText = getHelpText();
return (
<FormField
id="inventory-enabled-var"
label={t`Enabled Variable`}
tooltip={helpText.enabledVariableField}
name="enabled_var"
type="text"
/>
);
};
export const EnabledValueField = () => (
<FormField
id="inventory-enabled-value"
label={t`Enabled Value`}
tooltip={helpText.enabledValue}
name="enabled_value"
type="text"
/>
);
export const EnabledValueField = () => {
const helpText = getHelpText();
return (
<FormField
id="inventory-enabled-value"
label={t`Enabled Value`}
tooltip={helpText.enabledValue}
name="enabled_value"
type="text"
/>
);
};
export const HostFilterField = () => (
<FormField
id="host-filter"
label={t`Host Filter`}
tooltip={helpText.hostFilter}
name="host_filter"
type="text"
validate={regExp()}
/>
);
export const HostFilterField = () => {
const helpText = getHelpText();
return (
<FormField
id="host-filter"
label={t`Host Filter`}
tooltip={helpText.hostFilter}
name="host_filter"
type="text"
validate={regExp()}
/>
);
};


@@ -13,9 +13,10 @@ import {
EnabledValueField,
HostFilterField,
} from './SharedFields';
import helpText from '../Inventory.helptext';
import getHelpText from '../Inventory.helptext';
const VMwareSubForm = ({ autoPopulateCredential }) => {
const helpText = getHelpText();
const { setFieldValue, setFieldTouched } = useFormikContext();
const [credentialField, credentialMeta, credentialHelpers] =
useField('credential');


@@ -13,9 +13,10 @@ import {
HostFilterField,
SourceVarsField,
} from './SharedFields';
import helpText from '../Inventory.helptext';
import getHelpText from '../Inventory.helptext';
const VirtualizationSubForm = ({ autoPopulateCredential }) => {
const helpText = getHelpText();
const { setFieldValue, setFieldTouched } = useFormikContext();
const [credentialField, credentialMeta, credentialHelpers] =
useField('credential');


@@ -1,7 +1,7 @@
import React from 'react';
import { t } from '@lingui/macro';
const jobHelpText = {
const jobHelpText = () => ({
jobType: t`For job templates, select run to execute the playbook. Select check to only check playbook syntax, test environment setup, and report problems without executing the playbook.`,
inventory: t`Select the inventory containing the hosts you want this job to manage.`,
project: t`Select the project containing the playbook you want this job to execute.`,
@@ -41,6 +41,6 @@ const jobHelpText = {
) : (
t`These arguments are used with the specified module.`
),
};
});
export default jobHelpText;


@@ -29,7 +29,7 @@ import { VERBOSITY } from 'components/VerbositySelectField';
import { getJobModel, isJobRunning } from 'util/jobs';
import { formatDateString } from 'util/dates';
import { Job } from 'types';
import jobHelpText from '../Job.helptext';
import getJobHelpText from '../Job.helptext';
const StatusDetailValue = styled.div`
align-items: center;
@@ -39,6 +39,7 @@ const StatusDetailValue = styled.div`
`;
function JobDetail({ job, inventorySourceLabels }) {
const jobHelpText = getJobHelpText();
const { me } = useConfig();
const {
created_by,


@@ -10,7 +10,7 @@ import {
InfiniteLoader,
List,
} from 'react-virtualized';
import { Button } from '@patternfly/react-core';
import { Button, Alert } from '@patternfly/react-core';
import AlertModal from 'components/AlertModal';
import { CardBody as _CardBody } from 'components/Card';
@@ -99,6 +99,7 @@ function JobOutput({ job, eventRelatedSearchableKeys, eventSearchableKeys }) {
const scrollHeight = useRef(0);
const history = useHistory();
const eventByUuidRequests = useRef([]);
const eventsProcessedDelay = useRef(250);
const fetchEventByUuid = async (uuid) => {
let promise = eventByUuidRequests.current[uuid];
@@ -156,6 +157,7 @@ function JobOutput({ job, eventRelatedSearchableKeys, eventSearchableKeys }) {
);
const [isMonitoringWebsocket, setIsMonitoringWebsocket] = useState(false);
const [lastScrollPosition, setLastScrollPosition] = useState(0);
const [showEventsRefresh, setShowEventsRefresh] = useState(false);
useEffect(() => {
if (!isTreeReady || !onReadyEvents.length) {
@@ -185,6 +187,7 @@ function JobOutput({ job, eventRelatedSearchableKeys, eventSearchableKeys }) {
useEffect(() => {
const pendingRequests = Object.values(eventByUuidRequests.current || {});
setHasContentLoading(true); // prevents "no content found" screen from flashing
setIsFollowModeEnabled(false);
Promise.allSettled(pendingRequests).then(() => {
setRemoteRowCount(0);
clearLoadedEvents();
@@ -196,51 +199,71 @@ function JobOutput({ job, eventRelatedSearchableKeys, eventSearchableKeys }) {
rebuildEventsTree();
}, [isFlatMode]); // eslint-disable-line react-hooks/exhaustive-deps
const pollForEventsProcessed = useCallback(async () => {
const {
data: { event_processing_finished },
} = await getJobModel(job.type).readDetail(job.id);
if (event_processing_finished) {
setShowEventsRefresh(true);
return;
}
const fiveMinutes = 1000 * 60 * 5;
if (eventsProcessedDelay.current >= fiveMinutes) {
return;
}
setTimeout(pollForEventsProcessed, eventsProcessedDelay.current);
eventsProcessedDelay.current *= 2;
// eslint-disable-next-line react-hooks/exhaustive-deps
}, [job.id, job.type, lastScrollPosition]);
useEffect(() => {
if (!isJobRunning(jobStatus)) {
setTimeout(() => {
loadJobEvents().then(() => {
setWsEvents([]);
scrollToRow(lastScrollPosition);
});
}, 500);
if (wsEvents.length) {
pollForEventsProcessed();
}
return;
}
let batchTimeout;
let batchedEvents = [];
connectJobSocket(job, (data) => {
const addBatchedEvents = () => {
let min;
let max;
let newCssMap;
batchedEvents.forEach((event) => {
if (!min || event.counter < min) {
min = event.counter;
}
if (!max || event.counter > max) {
max = event.counter;
}
const { lineCssMap } = getLineTextHtml(event);
newCssMap = {
...newCssMap,
...lineCssMap,
};
});
setWsEvents((oldWsEvents) => {
const updated = oldWsEvents.concat(batchedEvents);
jobSocketCounter.current = updated.length;
return updated.sort((a, b) => a.counter - b.counter);
});
setCssMap((prevCssMap) => ({
...prevCssMap,
...newCssMap,
}));
if (max > jobSocketCounter.current) {
jobSocketCounter.current = max;
}
batchedEvents = [];
};
const addBatchedEvents = () => {
let min;
let max;
let newCssMap;
batchedEvents.forEach((event) => {
if (!min || event.counter < min) {
min = event.counter;
}
if (!max || event.counter > max) {
max = event.counter;
}
const { lineCssMap } = getLineTextHtml(event);
newCssMap = {
...newCssMap,
...lineCssMap,
};
});
setWsEvents((oldWsEvents) => {
const newEvents = [];
batchedEvents.forEach((event) => {
if (!oldWsEvents.find((e) => e.id === event.id)) {
newEvents.push(event);
}
});
const updated = oldWsEvents.concat(newEvents);
jobSocketCounter.current = updated.length;
return updated.sort((a, b) => a.counter - b.counter);
});
setCssMap((prevCssMap) => ({
...prevCssMap,
...newCssMap,
}));
if (max > jobSocketCounter.current) {
jobSocketCounter.current = max;
}
batchedEvents = [];
};
connectJobSocket(job, (data) => {
if (data.group_name === `${job.type}_events`) {
batchedEvents.push(data);
clearTimeout(batchTimeout);
@@ -268,7 +291,8 @@ function JobOutput({ job, eventRelatedSearchableKeys, eventSearchableKeys }) {
setIsMonitoringWebsocket(false);
isMounted.current = false;
};
}, [isJobRunning(jobStatus)]); // eslint-disable-line react-hooks/exhaustive-deps
// eslint-disable-next-line react-hooks/exhaustive-deps
}, [isJobRunning(jobStatus), pollForEventsProcessed]);
useEffect(() => {
if (isFollowModeEnabled) {
@@ -681,6 +705,26 @@ function JobOutput({ job, eventRelatedSearchableKeys, eventSearchableKeys }) {
isFollowModeEnabled={isFollowModeEnabled}
setIsFollowModeEnabled={setIsFollowModeEnabled}
/>
{showEventsRefresh ? (
<Alert
variant="default"
title={
<>
{t`Events processing complete.`}{' '}
<Button
variant="link"
isInline
onClick={() => {
loadJobEvents().then(() => {
setWsEvents([]);
});
setShowEventsRefresh(false);
}}
>{t`Reload output`}</Button>
</>
}
/>
) : null}
<PageControls
onScrollFirst={handleScrollFirst}
onScrollLast={handleScrollLast}

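Two behaviors are added in the JobOutput hunks above: incoming websocket events are de-duplicated by id before being appended, and once the job stops running the component polls the job detail endpoint with a doubling delay until `event_processing_finished` is true (giving up after the delay passes five minutes), at which point the "Events processing complete" alert offers a reload. A standalone sketch of that backoff loop, where `fetchDetail` and `onFinished` are stand-ins for `getJobModel(job.type).readDetail(job.id)` and the `setShowEventsRefresh(true)` call:

// Sketch: poll with exponential backoff until event processing finishes.
async function pollUntilEventsProcessed(fetchDetail, onFinished) {
  let delay = 250; // mirrors eventsProcessedDelay.current above
  const fiveMinutes = 1000 * 60 * 5;
  const poll = async () => {
    const {
      data: { event_processing_finished },
    } = await fetchDetail();
    if (event_processing_finished) {
      onFinished();
      return;
    }
    if (delay >= fiveMinutes) {
      return; // stop retrying once the delay has grown past the cap
    }
    setTimeout(poll, delay); // schedule the next attempt
    delay *= 2; // double the wait before the attempt after that
  };
  await poll();
}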

@@ -25,12 +25,13 @@ import useRequest, { useDismissableError } from 'hooks/useRequest';
import StatusLabel from 'components/StatusLabel';
import hasCustomMessages from '../shared/hasCustomMessages';
import { NOTIFICATION_TYPES } from '../constants';
import helpText from '../shared/Notifications.helptext';
import getHelpText from '../shared/Notifications.helptext';
const NUM_RETRIES = 25;
const RETRY_TIMEOUT = 5000;
function NotificationTemplateDetail({ template, defaultMessages }) {
const helpText = getHelpText();
const history = useHistory();
const [testStatus, setTestStatus] = useState(
template.summary_fields?.recent_notifications[0]?.status ?? undefined


@@ -1,7 +1,7 @@
import React from 'react';
import { t } from '@lingui/macro';
const helpText = {
const helpText = () => ({
emailRecepients: t`Use one email address per line to create a recipient list for this type of notification.`,
emailTimeout: t`The amount of time (in seconds) before the email
notification stops trying to reach the host and times out. Ranges
@@ -40,6 +40,6 @@ const helpText = {
<span>{t`for more information.`}</span>
</>
),
};
});
export default helpText;


@@ -26,7 +26,7 @@ import {
} from 'util/validators';
import { NotificationType } from 'types';
import Popover from '../../../components/Popover/Popover';
import helpText from './Notifications.helptext';
import getHelpText from './Notifications.helptext';
const TypeFields = {
email: EmailFields,
@@ -59,6 +59,7 @@ TypeInputsSubForm.propTypes = {
export default TypeInputsSubForm;
function EmailFields() {
const helpText = getHelpText();
return (
<>
<FormField
@@ -142,6 +143,7 @@ function EmailFields() {
}
function GrafanaFields() {
const helpText = getHelpText();
return (
<>
<FormField
@@ -190,6 +192,8 @@ function GrafanaFields() {
}
function IRCFields() {
const helpText = getHelpText();
return (
<>
<PasswordField
@@ -351,6 +355,8 @@ function RocketChatFields() {
}
function SlackFields() {
const helpText = getHelpText();
return (
<>
<ArrayTextField
@@ -381,6 +387,8 @@ function SlackFields() {
}
function TwilioFields() {
const helpText = getHelpText();
return (
<>
<PasswordField
@@ -421,6 +429,8 @@ function TwilioFields() {
}
function WebhookFields() {
const helpText = getHelpText();
const [methodField, methodMeta] = useField({
name: 'notification_configuration.http_method',
validate: required(t`Select a value for this field`),


@@ -18,10 +18,16 @@ function ProjectAdd() {
// the API might throw an unexpected error if our creation request
// has a zero-length string as its credential field. As a work-around,
// normalize falsey credential fields by deleting them.
delete values.credential;
} else {
values.credential = null;
} else if (typeof values.credential.id === 'number') {
values.credential = values.credential.id;
}
if (!values.signature_validation_credential) {
values.signature_validation_credential = null;
} else if (typeof values.signature_validation_credential.id === 'number') {
values.signature_validation_credential =
values.signature_validation_credential.id;
}
setFormSubmitError(null);
try {
const {

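The added lines above normalize both credential fields before ProjectAdd submits: a falsey value becomes an explicit null, and a selected `{ id, ... }` object is flattened to its numeric id. The same pattern as a standalone helper (the helper name is illustrative, not part of the diff):

// Sketch: normalize a lookup value held as '', null, or { id, name }.
function normalizeLookupField(values, key) {
  if (!values[key]) {
    values[key] = null; // send an explicit null instead of '' or undefined
  } else if (typeof values[key].id === 'number') {
    values[key] = values[key].id; // send only the id the API expects
  }
}

// normalizeLookupField(values, 'credential');
// normalizeLookupField(values, 'signature_validation_credential');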

@@ -20,6 +20,7 @@ describe('<ProjectAdd />', () => {
scm_clean: true,
scm_track_submodules: false,
credential: 100,
signature_validation_credential: 200,
local_path: '',
organization: { id: 2, name: 'Bar' },
scm_update_on_launch: true,
@@ -73,16 +74,32 @@ describe('<ProjectAdd />', () => {
},
};
const cryptographyCredentialResolve = {
data: {
results: [
{
id: 6,
name: 'GPG Public Key',
kind: 'cryptography',
},
],
count: 1,
},
};
beforeEach(async () => {
await ProjectsAPI.readOptions.mockImplementation(
() => projectOptionsResolve
);
await CredentialTypesAPI.read.mockImplementationOnce(
await CredentialTypesAPI.read.mockImplementation(
() => scmCredentialResolve
);
await CredentialTypesAPI.read.mockImplementationOnce(
await CredentialTypesAPI.read.mockImplementation(
() => insightsCredentialResolve
);
await CredentialTypesAPI.read.mockImplementation(
() => cryptographyCredentialResolve
);
});
afterEach(() => {
@@ -110,6 +127,7 @@ describe('<ProjectAdd />', () => {
...projectData,
organization: 2,
default_environment: 1,
signature_validation_credential: 200,
});
});

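The test setup above swaps `mockImplementationOnce` for `mockImplementation` on the credential-type reads while adding a third resolver for the cryptography type. The Jest semantics in play, as a small sketch: one-shot implementations are consumed in call order, whereas calling `mockImplementation` repeatedly replaces the default, so the last registration answers every call that has no queued one-shot.

// Sketch of the Jest mock behavior the setup relies on.
const read = jest.fn();
read.mockImplementationOnce(() => 'scm'); // answers the 1st call only
read.mockImplementationOnce(() => 'insights'); // answers the 2nd call only
read.mockImplementation(() => 'cryptography'); // default for all other calls

// read() -> 'scm'
// read() -> 'insights'
// read() -> 'cryptography' (and again on every later call)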

@@ -31,7 +31,7 @@ import { formatDateString } from 'util/dates';
import Popover from 'components/Popover';
import getDocsBaseUrl from 'util/getDocsBaseUrl';
import ProjectSyncButton from '../shared/ProjectSyncButton';
import projectHelpText from '../shared/Project.helptext';
import getProjectHelpText from '../shared/Project.helptext';
import useWsProject from './useWsProject';
const Label = styled.span`
@@ -39,6 +39,7 @@ const Label = styled.span`
`;
function ProjectDetail({ project }) {
const projectHelpText = getProjectHelpText();
const {
allow_override,
created,
@@ -124,7 +125,6 @@ function ProjectDetail({ project }) {
</TextList>
);
}
const generateLastJobTooltip = (job) => (
<>
<div>{t`MOST RECENT SYNC`}</div>
@@ -149,6 +149,7 @@ function ProjectDetail({ project }) {
} else if (summary_fields?.last_job) {
job = summary_fields.last_job;
}
const getSourceControlUrlHelpText = () =>
scm_type === 'git'
? projectHelpText.githubSourceControlUrl
@@ -234,6 +235,22 @@ function ProjectDetail({ project }) {
label={t`Source Control Refspec`}
value={scm_refspec}
/>
{summary_fields.signature_validation_credential && (
<Detail
label={t`Content Signature Validation Credential`}
helpText={projectHelpText.signatureValidation}
value={
<CredentialChip
key={summary_fields.signature_validation_credential.id}
credential={summary_fields.signature_validation_credential}
isReadOnly
/>
}
isEmpty={
summary_fields.signature_validation_credential.length === 0
}
/>
)}
{summary_fields.credential && (
<Detail
label={t`Source Control Credential`}
@@ -244,6 +261,7 @@ function ProjectDetail({ project }) {
isReadOnly
/>
}
isEmpty={summary_fields.credential.length === 0}
/>
)}
<Detail


@@ -46,6 +46,11 @@ describe('<ProjectDetail />', () => {
name: 'qux',
kind: 'scm',
},
signature_validation_credential: {
id: 2000,
name: 'svc',
kind: 'cryptography',
},
last_job: {
id: 9000,
status: 'successful',
@@ -78,6 +83,7 @@ describe('<ProjectDetail />', () => {
scm_delete_on_update: true,
scm_track_submodules: true,
credential: 100,
signature_validation_credential: 200,
status: 'successful',
organization: 10,
scm_update_on_launch: true,
@@ -108,6 +114,10 @@ describe('<ProjectDetail />', () => {
'Source Control Credential',
`Scm: ${mockProject.summary_fields.credential.name}`
);
assertDetail(
'Content Signature Validation Credential',
`Cryptography: ${mockProject.summary_fields.signature_validation_credential.name}`
);
assertDetail(
'Cache Timeout',
`${mockProject.scm_update_cache_timeout} Seconds`


@@ -18,10 +18,17 @@ function ProjectEdit({ project }) {
// the API might throw an unexpected error if our creation request
// has a zero-length string as its credential field. As a work-around,
// normalize falsey credential fields by deleting them.
delete values.credential;
} else {
values.credential = null;
} else if (typeof values.credential.id === 'number') {
values.credential = values.credential.id;
}
if (!values.signature_validation_credential) {
values.signature_validation_credential = null;
} else if (typeof values.signature_validation_credential.id === 'number') {
values.signature_validation_credential =
values.signature_validation_credential.id;
}
try {
const {
data: { id },


@@ -21,6 +21,7 @@ describe('<ProjectEdit />', () => {
scm_clean: true,
scm_track_submodules: false,
credential: 100,
signature_validation_credential: 200,
local_path: 'bar',
organization: 2,
scm_update_on_launch: true,
@@ -33,6 +34,12 @@ describe('<ProjectEdit />', () => {
credential_type_id: 5,
kind: 'insights',
},
signature_validation_credential: {
id: 200,
credential_type_id: 6,
kind: 'cryptography',
name: 'foo',
},
organization: {
id: 2,
name: 'Default',
@@ -60,6 +67,7 @@ describe('<ProjectEdit />', () => {
const scmCredentialResolve = {
data: {
count: 1,
results: [
{
id: 4,
@@ -72,6 +80,7 @@ describe('<ProjectEdit />', () => {
const insightsCredentialResolve = {
data: {
count: 1,
results: [
{
id: 5,
@@ -82,6 +91,19 @@ describe('<ProjectEdit />', () => {
},
};
const cryptographyCredentialResolve = {
data: {
count: 1,
results: [
{
id: 6,
name: 'GPG Public Key',
kind: 'cryptography',
},
],
},
};
beforeEach(async () => {
RootAPI.readAssetVariables.mockResolvedValue({
data: {
@@ -91,12 +113,15 @@ describe('<ProjectEdit />', () => {
await ProjectsAPI.readOptions.mockImplementation(
() => projectOptionsResolve
);
await CredentialTypesAPI.read.mockImplementationOnce(
await CredentialTypesAPI.read.mockImplementation(
() => scmCredentialResolve
);
await CredentialTypesAPI.read.mockImplementationOnce(
await CredentialTypesAPI.read.mockImplementation(
() => insightsCredentialResolve
);
await CredentialTypesAPI.read.mockImplementation(
() => cryptographyCredentialResolve
);
});
afterEach(() => {


@@ -1,7 +1,7 @@
import React from 'react';
import { t } from '@lingui/macro';
const projectHelpTextStrings = {
const projectHelpTextStrings = () => ({
executionEnvironment: t`The execution environment that will be used for jobs that use this project. This will be used as fallback when an execution environment has not been explicitly assigned at the job template or workflow level.`,
projectBasePath: (brandName = '') => (
<span>
@@ -105,6 +105,10 @@ const projectHelpTextStrings = {
you can input tags, commit hashes, and arbitrary refs. Some
commit hashes and refs may not be available unless you also
provide a custom refspec.`,
signatureValidation: t`Enable content signing to verify that the content
has remained secure when a project is synced.
If the content has been tampered with, the
job will not run.`,
options: {
clean: t`Remove any local modifications prior to performing an update.`,
delete: t`Delete the local repository in its entirety prior to
@@ -128,6 +132,6 @@ const projectHelpTextStrings = {
considered current, and a new project update will be
performed.`,
},
};
});
export default projectHelpTextStrings;


@@ -9,6 +9,7 @@ import { useConfig } from 'contexts/Config';
import AnsibleSelect from 'components/AnsibleSelect';
import ContentError from 'components/ContentError';
import ContentLoading from 'components/ContentLoading';
import CredentialLookup from 'components/Lookup/CredentialLookup';
import FormActionGroup from 'components/FormActionGroup/FormActionGroup';
import FormField, { FormSubmitError } from 'components/FormField';
import OrganizationLookup from 'components/Lookup/OrganizationLookup';
@@ -16,7 +17,7 @@ import ExecutionEnvironmentLookup from 'components/Lookup/ExecutionEnvironmentLo
import { CredentialTypesAPI, ProjectsAPI } from 'api';
import { required } from 'util/validators';
import { FormColumnLayout, SubFormLayout } from 'components/FormLayout';
import projectHelpText from './Project.helptext';
import getProjectHelpText from './Project.helptext';
import {
GitSubForm,
SvnSubForm,
@@ -37,15 +38,22 @@ const fetchCredentials = async (credential) => {
results: [insightsCredentialType],
},
},
{
data: {
results: [cryptographyCredentialType],
},
},
] = await Promise.all([
CredentialTypesAPI.read({ kind: 'scm' }),
CredentialTypesAPI.read({ name: 'Insights' }),
CredentialTypesAPI.read({ kind: 'cryptography' }),
]);
if (!credential) {
return {
scm: { typeId: scmCredentialType.id },
insights: { typeId: insightsCredentialType.id },
cryptography: { typeId: cryptographyCredentialType.id },
};
}
@@ -60,6 +68,13 @@ const fetchCredentials = async (credential) => {
value:
credential_type_id === insightsCredentialType.id ? credential : null,
},
cryptography: {
typeId: cryptographyCredentialType.id,
value:
credential_type_id === cryptographyCredentialType.id
? credential
: null,
},
};
};
@@ -69,16 +84,20 @@ function ProjectFormFields({
project_local_paths,
formik,
setCredentials,
setSignatureValidationCredentials,
credentials,
signatureValidationCredentials,
scmTypeOptions,
setScmSubFormState,
scmSubFormState,
}) {
const projectHelpText = getProjectHelpText();
const scmFormFields = {
scm_url: '',
scm_branch: '',
scm_refspec: '',
credential: '',
signature_validation_credential: '',
scm_clean: false,
scm_delete_on_update: false,
scm_track_submodules: false,
@@ -86,7 +105,6 @@ function ProjectFormFields({
allow_override: false,
scm_update_cache_timeout: 0,
};
const { setFieldValue, setFieldTouched } = useFormikContext();
const [scmTypeField, scmTypeMeta, scmTypeHelpers] = useField({
@@ -147,6 +165,32 @@ function ProjectFormFields({
[credentials, setCredentials]
);
const handleSignatureValidationCredentialSelection = useCallback(
(type, value) => {
setSignatureValidationCredentials({
...signatureValidationCredentials,
[type]: {
...signatureValidationCredentials[type],
value,
},
});
},
[signatureValidationCredentials, setSignatureValidationCredentials]
);
const handleSignatureValidationCredentialChange = useCallback(
(value) => {
handleSignatureValidationCredentialSelection('cryptography', value);
setFieldValue('signature_validation_credential', value);
setFieldTouched('signature_validation_credential', true, false);
},
[
handleSignatureValidationCredentialSelection,
setFieldValue,
setFieldTouched,
]
);
const handleOrganizationUpdate = useCallback(
(value) => {
setFieldValue('organization', value);
@@ -241,6 +285,13 @@ function ProjectFormFields({
}}
/>
</FormGroup>
<CredentialLookup
credentialTypeId={signatureValidationCredentials.cryptography.typeId}
label={t`Content Signature Validation Credential`}
onChange={handleSignatureValidationCredentialChange}
value={signatureValidationCredentials.cryptography.value}
tooltip={projectHelpText.signatureValidation}
/>
{formik.values.scm_type !== '' && (
<SubFormLayout>
<Title size="md" headingLevel="h4">
@@ -295,7 +346,6 @@ function ProjectFormFields({
</>
);
}
function ProjectForm({ project, submitError, ...props }) {
const { handleCancel, handleSubmit } = props;
const { summary_fields = {} } = project;
@@ -307,6 +357,7 @@ function ProjectForm({ project, submitError, ...props }) {
scm_branch: '',
scm_refspec: '',
credential: '',
signature_validation_credential: '',
scm_clean: false,
scm_delete_on_update: false,
scm_track_submodules: false,
@@ -318,12 +369,22 @@ function ProjectForm({ project, submitError, ...props }) {
const [credentials, setCredentials] = useState({
scm: { typeId: null, value: null },
insights: { typeId: null, value: null },
cryptography: { typeId: null, value: null },
});
const [signatureValidationCredentials, setSignatureValidationCredentials] =
useState({
scm: { typeId: null, value: null },
insights: { typeId: null, value: null },
cryptography: { typeId: null, value: null },
});
useEffect(() => {
async function fetchData() {
try {
const credentialResponse = fetchCredentials(summary_fields.credential);
const signatureValidationCredentialResponse = fetchCredentials(
summary_fields.signature_validation_credential
);
const {
data: {
actions: {
@@ -335,6 +396,9 @@ function ProjectForm({ project, submitError, ...props }) {
} = await ProjectsAPI.readOptions();
setCredentials(await credentialResponse);
setSignatureValidationCredentials(
await signatureValidationCredentialResponse
);
setScmTypeOptions(choices);
} catch (error) {
setContentError(error);
@@ -344,7 +408,10 @@ function ProjectForm({ project, submitError, ...props }) {
}
fetchData();
}, [summary_fields.credential]);
}, [
summary_fields.credential,
summary_fields.signature_validation_credential,
]);
if (isLoading) {
return <ContentLoading />;
@@ -378,6 +445,8 @@ function ProjectForm({ project, submitError, ...props }) {
scm_update_cache_timeout: project.scm_update_cache_timeout || 0,
scm_update_on_launch: project.scm_update_on_launch || false,
scm_url: project.scm_url || '',
signature_validation_credential:
project.signature_validation_credential || '',
default_environment:
project.summary_fields?.default_environment || null,
}}
@@ -392,7 +461,11 @@ function ProjectForm({ project, submitError, ...props }) {
project_local_paths={project_local_paths}
formik={formik}
setCredentials={setCredentials}
setSignatureValidationCredentials={
setSignatureValidationCredentials
}
credentials={credentials}
signatureValidationCredentials={signatureValidationCredentials}
scmTypeOptions={scmTypeOptions}
setScmSubFormState={setScmSubFormState}
scmSubFormState={scmSubFormState}

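ProjectForm now resolves three credential types up front so the new signature-validation lookup can filter on the cryptography type id. A minimal sketch of the parallel lookup, assuming `CredentialTypesAPI.read` resolves to `{ data: { results: [...] } }` as the hunks above show, and leaving out the branch that matches an existing credential to its type:

// Sketch: look up the scm, Insights, and cryptography type ids in parallel.
import { CredentialTypesAPI } from 'api';

async function fetchCredentialTypeIds() {
  const [scm, insights, cryptography] = await Promise.all([
    CredentialTypesAPI.read({ kind: 'scm' }),
    CredentialTypesAPI.read({ name: 'Insights' }),
    CredentialTypesAPI.read({ kind: 'cryptography' }),
  ]);
  return {
    scm: { typeId: scm.data.results[0].id },
    insights: { typeId: insights.data.results[0].id },
    cryptography: { typeId: cryptography.data.results[0].id },
  };
}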

@@ -19,6 +19,7 @@ describe('<ProjectForm />', () => {
scm_clean: true,
scm_track_submodules: false,
credential: 100,
signature_validation_credential: 200,
organization: 2,
scm_update_on_launch: true,
scm_update_cache_timeout: 3,
@@ -35,6 +36,12 @@ describe('<ProjectForm />', () => {
id: 2,
name: 'Default',
},
signature_validation_credential: {
id: 200,
credential_type_id: 6,
kind: 'cryptography',
name: 'Svc',
},
},
};
@@ -58,6 +65,7 @@ describe('<ProjectForm />', () => {
const scmCredentialResolve = {
data: {
count: 1,
results: [
{
id: 4,
@@ -70,6 +78,7 @@ describe('<ProjectForm />', () => {
const insightsCredentialResolve = {
data: {
count: 1,
results: [
{
id: 5,
@@ -80,6 +89,19 @@ describe('<ProjectForm />', () => {
},
};
const cryptographyCredentialResolve = {
data: {
count: 1,
results: [
{
id: 6,
name: 'GPG Public Key',
kind: 'cryptography',
},
],
},
};
beforeEach(async () => {
RootAPI.readAssetVariables.mockResolvedValue({
data: {
@@ -89,12 +111,15 @@ describe('<ProjectForm />', () => {
await ProjectsAPI.readOptions.mockImplementation(
() => projectOptionsResolve
);
await CredentialTypesAPI.read.mockImplementationOnce(
await CredentialTypesAPI.read.mockImplementation(
() => scmCredentialResolve
);
await CredentialTypesAPI.read.mockImplementationOnce(
await CredentialTypesAPI.read.mockImplementation(
() => insightsCredentialResolve
);
await CredentialTypesAPI.read.mockImplementation(
() => cryptographyCredentialResolve
);
});
afterEach(() => {
@@ -153,9 +178,17 @@ describe('<ProjectForm />', () => {
expect(
wrapper.find('FormGroup[label="Source Control Refspec"]').length
).toBe(1);
expect(
wrapper.find('FormGroup[label="Content Signature Validation Credential"]')
.length
).toBe(1);
expect(
wrapper.find('FormGroup[label="Source Control Credential"]').length
).toBe(1);
expect(
wrapper.find('FormGroup[label="Content Signature Validation Credential"]')
.length
).toBe(1);
expect(wrapper.find('FormGroup[label="Options"]').length).toBe(1);
});
@@ -177,21 +210,52 @@ describe('<ProjectForm />', () => {
id: 1,
name: 'organization',
});
wrapper.find('CredentialLookup').invoke('onBlur')();
wrapper.find('CredentialLookup').invoke('onChange')({
wrapper
.find('CredentialLookup[label="Source Control Credential"]')
.invoke('onBlur')();
wrapper
.find('CredentialLookup[label="Source Control Credential"]')
.invoke('onChange')({
id: 10,
name: 'credential',
});
wrapper
.find(
'CredentialLookup[label="Content Signature Validation Credential"]'
)
.invoke('onBlur')();
wrapper
.find(
'CredentialLookup[label="Content Signature Validation Credential"]'
)
.invoke('onChange')({
id: 20,
name: 'signature_validation_credential',
});
});
wrapper.update();
expect(wrapper.find('OrganizationLookup').prop('value')).toEqual({
id: 1,
name: 'organization',
});
expect(wrapper.find('CredentialLookup').prop('value')).toEqual({
expect(
wrapper
.find('CredentialLookup[label="Source Control Credential"]')
.prop('value')
).toEqual({
id: 10,
name: 'credential',
});
expect(
wrapper
.find(
'CredentialLookup[label="Content Signature Validation Credential"]'
)
.prop('value')
).toEqual({
id: 20,
name: 'signature_validation_credential',
});
});
test('should display insights credential lookup when source control type is "insights"', async () => {
@@ -212,14 +276,22 @@ describe('<ProjectForm />', () => {
1
);
await act(async () => {
wrapper.find('CredentialLookup').invoke('onBlur')();
wrapper.find('CredentialLookup').invoke('onChange')({
wrapper
.find('CredentialLookup[label="Insights Credential"]')
.invoke('onBlur')();
wrapper
.find('CredentialLookup[label="Insights Credential"]')
.invoke('onChange')({
id: 123,
name: 'credential',
});
});
wrapper.update();
expect(wrapper.find('CredentialLookup').prop('value')).toEqual({
expect(
wrapper
.find('CredentialLookup[label="Insights Credential"]')
.prop('value')
).toEqual({
id: 123,
name: 'credential',
});
@@ -358,7 +430,9 @@ describe('<ProjectForm />', () => {
});
test('should display ContentError on throw', async () => {
CredentialTypesAPI.read = () => Promise.reject(new Error());
CredentialTypesAPI.read.mockImplementationOnce(() =>
Promise.reject(new Error())
);
await act(async () => {
wrapper = mountWithContexts(
<ProjectForm handleSubmit={jest.fn()} handleCancel={jest.fn()} />


@@ -1,6 +1,6 @@
import 'styled-components/macro';
import React from 'react';
import projectHelpText from '../Project.helptext';
import getProjectHelpText from '../Project.helptext';
import {
UrlFormField,
@@ -12,15 +12,18 @@ const ArchiveSubForm = ({
credential,
onCredentialSelection,
scmUpdateOnLaunch,
}) => (
<>
<UrlFormField tooltip={projectHelpText.archiveUrl} />
<ScmCredentialFormField
credential={credential}
onCredentialSelection={onCredentialSelection}
/>
<ScmTypeOptions scmUpdateOnLaunch={scmUpdateOnLaunch} />
</>
);
}) => {
const projectHelpText = getProjectHelpText();
return (
<>
<UrlFormField tooltip={projectHelpText.archiveUrl} />
<ScmCredentialFormField
credential={credential}
onCredentialSelection={onCredentialSelection}
/>
<ScmTypeOptions scmUpdateOnLaunch={scmUpdateOnLaunch} />
</>
);
};
export default ArchiveSubForm;


@@ -11,8 +11,7 @@ import {
ScmCredentialFormField,
ScmTypeOptions,
} from './SharedFields';
import projectHelpStrings from '../Project.helptext';
import getProjectHelpStrings from '../Project.helptext';
const GitSubForm = ({
credential,
@@ -22,6 +21,7 @@ const GitSubForm = ({
const docsURL = `${getDocsBaseUrl(
useConfig()
)}/html/userguide/projects.html#manage-playbooks-using-source-control`;
const projectHelpStrings = getProjectHelpStrings();
return (
<>


@@ -8,13 +8,14 @@ import AnsibleSelect from 'components/AnsibleSelect';
import FormField from 'components/FormField';
import Popover from 'components/Popover';
import useBrandName from 'hooks/useBrandName';
import projectHelpStrings from '../Project.helptext';
import getProjectHelpStrings from '../Project.helptext';
const ManualSubForm = ({
localPath,
project_base_dir,
project_local_paths,
}) => {
const projectHelpStrings = getProjectHelpStrings();
const brandName = useBrandName();
const localPaths = [...new Set([...project_local_paths, localPath])];
const options = [


@@ -7,7 +7,7 @@ import CredentialLookup from 'components/Lookup/CredentialLookup';
import FormField, { CheckboxField } from 'components/FormField';
import { required } from 'util/validators';
import { FormCheckboxLayout, FormFullWidthLayout } from 'components/FormLayout';
import projectHelpStrings from '../Project.helptext';
import getProjectHelpStrings from '../Project.helptext';
export const UrlFormField = ({ tooltip }) => (
<FormField
@@ -22,15 +22,18 @@ export const UrlFormField = ({ tooltip }) => (
/>
);
export const BranchFormField = ({ label }) => (
<FormField
id="project-scm-branch"
name="scm_branch"
type="text"
label={label}
tooltip={projectHelpStrings.branchFormField}
/>
);
export const BranchFormField = ({ label }) => {
const projectHelpStrings = getProjectHelpStrings();
return (
<FormField
id="project-scm-branch"
name="scm_branch"
type="text"
label={label}
tooltip={projectHelpStrings.branchFormField}
/>
);
};
export const ScmCredentialFormField = ({
credential,
@@ -59,6 +62,7 @@ export const ScmCredentialFormField = ({
export const ScmTypeOptions = ({ scmUpdateOnLaunch, hideAllowOverride }) => {
const { values } = useFormikContext();
const projectHelpStrings = getProjectHelpStrings();
return (
<FormFullWidthLayout>


@@ -1,7 +1,7 @@
import 'styled-components/macro';
import React from 'react';
import { t } from '@lingui/macro';
import projectHelpStrings from '../Project.helptext';
import getProjectHelpStrings from '../Project.helptext';
import {
UrlFormField,
@@ -14,16 +14,19 @@ const SvnSubForm = ({
credential,
onCredentialSelection,
scmUpdateOnLaunch,
}) => (
<>
<UrlFormField tooltip={projectHelpStrings.svnSourceControlUrl} />
<BranchFormField label={t`Revision #`} />
<ScmCredentialFormField
credential={credential}
onCredentialSelection={onCredentialSelection}
/>
<ScmTypeOptions scmUpdateOnLaunch={scmUpdateOnLaunch} />
</>
);
}) => {
const projectHelpStrings = getProjectHelpStrings();
return (
<>
<UrlFormField tooltip={projectHelpStrings.svnSourceControlUrl} />
<BranchFormField label={t`Revision #`} />
<ScmCredentialFormField
credential={credential}
onCredentialSelection={onCredentialSelection}
/>
<ScmTypeOptions scmUpdateOnLaunch={scmUpdateOnLaunch} />
</>
);
};
export default SvnSubForm;


@@ -11,9 +11,10 @@ import useRequest, { useDismissableError } from 'hooks/useRequest';
import AlertModal from 'components/AlertModal';
import ErrorDetail from 'components/ErrorDetail';
import { ProjectsAPI } from 'api';
import projectHelpStrings from './Project.helptext';
import getProjectHelpStrings from './Project.helptext';
function ProjectSyncButton({ projectId, lastJobStatus = null }) {
const projectHelpStrings = getProjectHelpStrings();
const match = useRouteMatch();
const { request: handleSync, error: syncError } = useRequest(


@@ -440,8 +440,9 @@ const ObjectField = ({ name, config, revertValue, isRequired = false }) => {
const [field, meta, helpers] = useField({ name, validate });
const isValid = !(meta.touched && meta.error);
const defaultRevertValue =
config?.default !== null ? JSON.stringify(config.default, null, 2) : null;
const defaultRevertValue = config?.default
? JSON.stringify(config.default, null, 2)
: null;
return config ? (
<FormFullWidthLayout>

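The ObjectField change above relaxes the strict `!== null` comparison to a truthy check, so an undefined or otherwise falsey default also falls back to null instead of reaching `JSON.stringify`. A small sketch of the resulting guard with hypothetical inputs:

// Sketch of the revert-value guard after the change (inputs are hypothetical).
const revertValueFor = (config) =>
  config?.default ? JSON.stringify(config.default, null, 2) : null;

// revertValueFor({ default: { LOG_LEVEL: 'debug' } }) -> pretty-printed JSON
// revertValueFor({ default: null }) -> null
// revertValueFor(undefined) -> null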

@@ -34,7 +34,7 @@ import useRequest, { useDismissableError } from 'hooks/useRequest';
import useBrandName from 'hooks/useBrandName';
import ExecutionEnvironmentDetail from 'components/ExecutionEnvironmentDetail';
import { relatedResourceDeleteRequests } from 'util/getRelatedResourceDeleteDetails';
import helpText from '../shared/JobTemplate.helptext';
import getHelpText from '../shared/JobTemplate.helptext';
function JobTemplateDetail({ template }) {
const {
@@ -68,7 +68,7 @@ function JobTemplateDetail({ template }) {
const { id: templateId } = useParams();
const history = useHistory();
const brandName = useBrandName();
const helpText = getHelpText();
const {
isLoading: isLoadingInstanceGroups,
request: fetchInstanceGroups,


@@ -25,7 +25,7 @@ import Sparkline from 'components/Sparkline';
import { toTitleCase } from 'util/strings';
import { relatedResourceDeleteRequests } from 'util/getRelatedResourceDeleteDetails';
import useRequest, { useDismissableError } from 'hooks/useRequest';
import helpText from '../shared/WorkflowJobTemplate.helptext';
import getHelpText from '../shared/WorkflowJobTemplate.helptext';
function WorkflowJobTemplateDetail({ template }) {
const {
@@ -44,7 +44,7 @@ function WorkflowJobTemplateDetail({ template }) {
scm_branch: scmBranch,
limit,
} = template;
const helpText = getHelpText();
const urlOrigin = window.location.origin;
const history = useHistory();


@@ -2,7 +2,7 @@ import React from 'react';
import { t } from '@lingui/macro';
import getDocsBaseUrl from 'util/getDocsBaseUrl';
const jtHelpTextStrings = {
const jtHelpTextStrings = () => ({
jobType: t`For job templates, select run to execute the playbook. Select check to only check playbook syntax, test environment setup, and report problems without executing the playbook.`,
inventory: t`Select the inventory containing the hosts you want this job to manage.`,
project: t`Select the project containing the playbook you want this job to execute.`,
@@ -60,6 +60,6 @@ const jtHelpTextStrings = {
{t`for more information.`}
</span>
),
};
});
export default jtHelpTextStrings;


@@ -46,7 +46,7 @@ import LabelSelect from 'components/LabelSelect';
import { VerbositySelectField } from 'components/VerbositySelectField';
import PlaybookSelect from './PlaybookSelect';
import WebhookSubForm from './WebhookSubForm';
import helpText from './JobTemplate.helptext';
import getHelpText from './JobTemplate.helptext';
const { origin } = document.location;
@@ -60,6 +60,7 @@ function JobTemplateForm({
validateField,
isOverrideDisabledLookup, // TODO: this is a confusing variable name
}) {
const helpText = getHelpText();
const [contentError, setContentError] = useState(false);
const [allowCallbacks, setAllowCallbacks] = useState(
Boolean(template?.host_config_key)


@@ -22,9 +22,10 @@ import {
WorkflowJobTemplatesAPI,
CredentialTypesAPI,
} from 'api';
import helpText from './WorkflowJobTemplate.helptext';
import getHelpText from './WorkflowJobTemplate.helptext';
function WebhookSubForm({ templateType }) {
const helpText = getHelpText();
const { setFieldValue } = useFormikContext();
const { id } = useParams();
const { pathname } = useLocation();

Some files were not shown because too many files have changed in this diff.