Compare commits


160 Commits
8.0.0 ... 9.0.0

Author SHA1 Message Date
Shane McDonald
d645d0894a Merge pull request #5196 from ansible/9.0.0-for-real
Bump VERSION to 9.0.0
2019-10-31 13:48:01 -04:00
Shane McDonald
4575cae458 Bump VERSION to 9.0.0 2019-10-31 13:39:42 -04:00
softwarefactory-project-zuul[bot]
6982a8aee7 Merge pull request #5176 from AlanCoding/collection_project_create
Fix and test for warning when creating project

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2019-10-31 16:50:59 +00:00
softwarefactory-project-zuul[bot]
fa1091d089 Merge pull request #5175 from AlanCoding/multi_cred_launch
Add test coverage for launch with multiple prompted creds

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2019-10-31 16:46:31 +00:00
softwarefactory-project-zuul[bot]
c605705b39 Merge pull request #5182 from wenottingham/for-this-to-actually-be-useful-we-would-need-a-sendmail.cf-and-lol-not-doing-that
[RFC] Remove admin alerts, there are better mechanisms for this

Reviewed-by: Ryan Petrello
             https://github.com/ryanpetrello
2019-10-31 14:52:37 +00:00
softwarefactory-project-zuul[bot]
ccc2a616c1 Merge pull request #5186 from wenottingham/another-driveby
Remove extraneous leftover conditional import

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2019-10-31 01:21:34 +00:00
softwarefactory-project-zuul[bot]
51184ba20d Merge pull request #5129 from mabashian/resource-access-tabs
Configures access tabs for job template, project, inventory and smart inventory details

Reviewed-by: Michael Abashian
             https://github.com/mabashian
2019-10-31 00:42:37 +00:00
AlanCoding
db33c0e4fa Add test coverage for launch with multiple prompted creds 2019-10-30 20:41:14 -04:00
mabashian
e9728f2a78 Update snapshot after rebase 2019-10-30 20:11:11 -04:00
Bill Nottingham
5cdf2f88da Remove admin alerts, there are better mechanisms for this 2019-10-30 19:35:45 -04:00
Bill Nottingham
93e940adfc Remove extraneous leftover conditional import 2019-10-30 19:21:02 -04:00
mabashian
64776f97cf Prettier formatting 2019-10-30 19:13:35 -04:00
mabashian
fc080732d4 Add breadcrumb for template access tab 2019-10-30 19:13:35 -04:00
mabashian
d02364a833 Configures access tabs for job template, project, inventory and smart inventory details views. 2019-10-30 19:13:34 -04:00
softwarefactory-project-zuul[bot]
176da040d9 Merge pull request #5185 from ansible/jakemcdermott-patch-wfjt-webhook-wording
Make WFJT webhook credential help text more helpful

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2019-10-30 22:51:33 +00:00
softwarefactory-project-zuul[bot]
f2b4d87152 Merge pull request #5184 from shanemcd/remove-dead-code
Deleting unused unit-tests directory

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2019-10-30 22:38:00 +00:00
softwarefactory-project-zuul[bot]
17798edbc4 Merge pull request #5174 from shanemcd/centos-8-upstream
Update AWX images to CentOS 8

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2019-10-30 22:31:26 +00:00
Jake McDermott
5e6ee4a371 Improve WFJT webhook credential wording 2019-10-30 17:52:30 -04:00
Shane McDonald
288fea8960 Deleting unused unit-tests directory
Same as https://github.com/ansible/awx/pull/5179 except I won't accidentally
close it.
2019-10-30 17:35:19 -04:00
softwarefactory-project-zuul[bot]
dca9daf719 Merge pull request #5178 from rebeccahhh/devel
Test policy_instance_ variable validation in instance group

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2019-10-30 21:31:40 +00:00
softwarefactory-project-zuul[bot]
634504c7a1 Merge pull request #5131 from mabashian/5010-close-button
Removes close button from footer of host details modal

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2019-10-30 21:04:04 +00:00
Shane McDonald
c019d873b9 Update AWX images to CentOS 8 2019-10-30 16:43:23 -04:00
Rebeccah
e4a21b67c7 remove u markers in assertion statements; they are unnecessary in Python 3 2019-10-30 15:52:14 -04:00
Rebeccah
2e6c484a50 added in testing for updating (or not allowing updates of) policy_instance variables in instance and container groups 2019-10-30 15:51:55 -04:00
AlanCoding
f8b64f2222 Fix and test for warning when creating project 2019-10-30 15:40:49 -04:00
softwarefactory-project-zuul[bot]
6060b62acd Merge pull request #5172 from wenottingham/sweet-but-psycopg
Re-add psycopg2 for bootstrap_development.sh

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2019-10-30 19:35:23 +00:00
Ryan Petrello
0dcf6a2b1f Merge pull request #5156 from ryanpetrello/cli-launch-args
properly parse CLI arguments for launch endpoints
2019-10-30 14:52:32 -04:00
Bill Nottingham
452c1b53f7 Re-add psycopg2 for bootstrap_development.sh 2019-10-30 14:23:33 -04:00
softwarefactory-project-zuul[bot]
42d2f72683 Merge pull request #5159 from wenottingham/delete-delete-delete
Trim the list of things installed during build of the dev environment

Reviewed-by: Shane McDonald <me@shanemcd.com>
             https://github.com/shanemcd
2019-10-30 18:11:39 +00:00
softwarefactory-project-zuul[bot]
57e8ba7f3c Merge pull request #5168 from AlanCoding/more_sane
Remove sanity exceptions no longer needed

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2019-10-30 18:09:01 +00:00
Ryan Petrello
c882cda586 properly parse CLI arguments for launch endpoints
see: https://github.com/ansible/awx/issues/5093
2019-10-30 13:49:37 -04:00
softwarefactory-project-zuul[bot]
784d18705c Merge pull request #5164 from shanemcd/fix-container-groups-upstream
Fix container groups in AWX image

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2019-10-30 17:44:38 +00:00
Bill Nottingham
36996584f9 Re-add dependencies needed by UI tests to the dev env 2019-10-30 13:06:48 -04:00
AlanCoding
0160dbe8bc Remove sanity exceptions no longer needed 2019-10-30 12:56:36 -04:00
Shane McDonald
28994d4b0b Install oc and kubectl in upstream task image 2019-10-30 12:15:51 -04:00
softwarefactory-project-zuul[bot]
9b09344bae Merge pull request #5015 from AlanCoding/awx_awx_cp1
tower_credential: Missing 'kind' attribute (#61324)

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2019-10-30 13:16:32 +00:00
Bill Nottingham
84ba383199 Trim the list of things installed during build
Swap git & vim for more minimal installs.
2019-10-29 23:19:00 -04:00
softwarefactory-project-zuul[bot]
6dcd87afec Merge pull request #5148 from keithjgrant/pf-upgrade
Reduce test warnings

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2019-10-29 23:56:02 +00:00
softwarefactory-project-zuul[bot]
243ab58902 Merge pull request #5152 from shanemcd/centos-8-dev-env
Update dev env to centos:8

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2019-10-29 22:09:00 +00:00
Shane McDonald
6c877a15e3 Update dev env to centos:8 2019-10-29 17:09:45 -04:00
softwarefactory-project-zuul[bot]
2ccf0a0004 Merge pull request #5143 from AlanCoding/getinline
Apply username ordering to more views

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2019-10-29 20:42:49 +00:00
softwarefactory-project-zuul[bot]
c69db02762 Merge pull request #5102 from mabashian/edit-buttons
Adds edit buttons to Templates, Inventories, Organizations, and Projects list items

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2019-10-29 20:42:45 +00:00
AlanCoding
59e1c6d492 Add collection test coverage for creating vault credential 2019-10-29 15:34:32 -04:00
mabashian
35c27c8b16 Tweak ActionButtonCell definition and export 2019-10-29 14:45:05 -04:00
Keith Grant
91edac0d84 remove prop type warnings 2019-10-29 11:41:14 -07:00
Mathieu Mallet
ae1bd9d1e9 tower_credential: Missing 'kind' attribute (#61324)
In the 'tower_credential' module, when the credential 'kind' is set to
'vault', the code expects the other parameter 'vault_id' to be set.
Unfortunately, in the module's 'credential_type_for_v1_kind' method, the
'kind' parameter is popped, i.e. removed from the module's dict of
parameters, leading to the following error:

> Parameter 'vault_id' is only valid if parameter 'kind' is specified as
'vault'

Fixes: #45644, #61324

Testing Done: Manually create a playbook with a task as follows:
  - name: Create vault with ID 'bar' exists
    tower_credential:
      name: Foobar vault
      organization: Foobar
      kind: vault
      vault_id: bar
      vault_password: foobar
2019-10-29 14:21:21 -04:00
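The failure mode described in the commit above is easy to reproduce in isolation. The following is a schematic in plain Python (not the tower_credential source), showing why popping 'kind' breaks the later vault_id check:

    # Schematic of the reported bug, not the module source: popping 'kind'
    # mutates the shared parameter dict, so a later check that only accepts
    # vault_id when kind == 'vault' can no longer see the key.
    module_params = {'name': 'Foobar vault', 'kind': 'vault', 'vault_id': 'bar'}
    kind = module_params.pop('kind')   # credential_type_for_v1_kind removed the key here
    if module_params.get('vault_id') and module_params.get('kind') != 'vault':
        # the error users hit even though kind *was* 'vault'
        print("Parameter 'vault_id' is only valid if parameter 'kind' is specified as 'vault'")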
AlanCoding
cf168b27d2 apply username ordering to more views 2019-10-29 14:20:33 -04:00
softwarefactory-project-zuul[bot]
8cb7b388dc Merge pull request #5140 from ryanpetrello/downstream-hardening
merge a variety of downstream bug fixes

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2019-10-29 18:19:10 +00:00
Ryan Petrello
171f0d6340 Merge branch 'downstream' into devel 2019-10-29 13:02:17 -04:00
Jeff Bradberry
aff31ac02f Add the no_truncate parameter to the job and adhoc event sublist views
which are the ones that the CLI actually uses.
2019-10-29 11:24:17 -04:00
Jake McDermott
a23754897e Improve accuracy of code comment 2019-10-29 11:24:17 -04:00
Ryan Petrello
3094b67664 work around a bug in the k8s client that leaves trash in /tmp 2019-10-29 11:24:17 -04:00
Alan Rominger
98d3f3dc8a Add tests for AWX collection credential fixes (#3893) 2019-10-29 11:24:16 -04:00
Jake McDermott
6f2a07a7df Scrape tag input state from dom and put it in vm
The tag input state lives somewhere in the associated select2 widgetry
and isn't directly tied to the vm like it is for the other inputs.
2019-10-29 11:24:16 -04:00
Rebeccah
54ac1905b3 pinning pytest-mock to version 1.11.1 2019-10-29 11:24:16 -04:00
AlanCoding
1bdae2d1f7 Fully rely on error ignoring for sanity rel imports 2019-10-29 11:24:16 -04:00
AlanCoding
2bc2e26cc7 Ignore import errors due to bugs in Ansible core 2019-10-29 11:24:16 -04:00
AlanCoding
5010602e6b add release note 2019-10-29 11:24:16 -04:00
AlanCoding
c103a813bf declare types in Ansible Tower module options 2019-10-29 11:24:16 -04:00
AlanCoding
e097bc61c8 New target for sanity testing of the collection
Do not run in Zuul
2019-10-29 11:24:15 -04:00
Ryan Petrello
2ea63eeca0 pin to runner==1.4.4 2019-10-29 11:24:15 -04:00
Ryan Petrello
52336c0fe8 fix a syntax error
whoopsie
2019-10-29 11:24:15 -04:00
Rebeccah
220354241b added a check to see whether the current check has an instance or not, to prevent NoneType errors 2019-10-29 11:24:15 -04:00
Rebeccah
1ae8fdc15c moved filtering out policy instance values in the API browser input box into the InstanceGroupDetail class, where I overrode the update_raw_data function to parse out the unneeded data. Additionally added the fix for checking the value in the serializer. 2019-10-29 11:24:15 -04:00
Rebeccah
4bbdce3478 removed policy_instance variables from container groups default values in the API put/patch view 2019-10-29 11:24:15 -04:00
Rebeccah
d25e6249fd Added in validation for each of the 3 fields that should not be changed if the instance is a container group; defaults in the textarea persist with these 3 options 2019-10-29 11:24:15 -04:00
Jim Ladd
71d7bac261 Rename job_summary_dict to job_metadata
* Clarifies purpose of notification template variable
2019-10-29 11:24:14 -04:00
Alan Rominger
acba5306c6 Fix bug where SCM inventory did not have a collections destination (#3795)
* update inventory path to be in tmp project clone

* copy project folder for inventory scm launch type

* Optionally accept inventory collection paths from ansible.cfg
2019-10-29 11:24:14 -04:00
Jim Ladd
fca9245536 Update unit tests 2019-10-29 11:24:14 -04:00
Jim Ladd
47031da65b Return full webhook dict when serializing notif. 2019-10-29 11:24:14 -04:00
Jim Ladd
b024d91c66 Use correct notif. bodies when sending test notifs
* Notification backends now handle the body of notifications differently
  depending on their type
* webhook, email, and pagerduty are currently the only three notification
  types that use body
* email and pagerduty expect a string
* webhook expects a dict in string format
2019-10-29 11:24:14 -04:00
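A minimal sketch of the three body shapes listed in the commit above (plain Python with a hypothetical helper, not the AWX notification code; the webhook case matches the NotificationTemplateTest hunk later in this diff):

    import json

    def test_notification_body(notification_type, msg):
        # email and pagerduty expect a plain string
        if notification_type in ('email', 'pagerduty'):
            return msg
        # webhook expects a dict serialized as a JSON string
        if notification_type == 'webhook':
            return json.dumps({'body': msg})
        # other backends keep receiving a dict
        return {'body': msg}

    print(test_notification_body('webhook', 'Ansible Tower Test Notification 1'))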
Jim Ladd
da7002cf0c Don't use i18n for NT body string 2019-10-29 11:24:14 -04:00
Keith Grant
f4f1762805 fix lint errors 2019-10-29 11:24:13 -04:00
Keith Grant
ad5857e06b Add notification custom message fields for workflow pause/approval 2019-10-29 11:24:13 -04:00
Jim Ladd
12d735ec8f NotificationSerializer should gracefully handle webhook/pagerduty bodies 2019-10-29 11:24:13 -04:00
Jim Ladd
1e9173e8ef In awxkit, add support for wf approval notification templates 2019-10-29 11:24:13 -04:00
Jim Ladd
4809c40f3c Render WF approval notifications w/ custom templates 2019-10-29 11:24:13 -04:00
Jim Ladd
4e9ec271c5 Refactor notification backends to use CustomNotificationBase 2019-10-29 11:24:13 -04:00
Jim Ladd
6cd6a42e20 Render default notifications using Jinja templates 2019-10-29 11:24:13 -04:00
Jim Ladd
f234c0f771 Remove unused build_notification_message method 2019-10-29 11:24:12 -04:00
Alan Rominger
3f49d2c455 RBAC relaunch 403 updates (#3835)
* RBAC relaunch 403 updates

Addresses 2 things

1. If WFJ relaunch is attempted, and relaunch is denied
  because the WFJ had encrypted survey answers,
  a generic message was shown, this changes it to show
  a specific error message

2. Org admins are banned from relaunching a job
  if the job has encrypted survey answers

* update tests to raises access pattern

* catch PermissionDenied for user_capabilities
2019-10-29 11:24:12 -04:00
Alan Rominger
a0fb9bef3a Disable activity stream and speed up host group bulk deletion (#3817) 2019-10-29 11:24:12 -04:00
Ryan Petrello
ccaaee61f0 improve cleanup of anonymous kubeconfig files 2019-10-29 11:24:12 -04:00
Alan Rominger
70269d9a0d Add support for credential_type in tower_credential module (#3820)
* Add support for credential_type

* Finish up credential_type parameter with tests

* make inputs mutually exclusive with other params

* Test credential type with dict input
2019-10-29 11:24:12 -04:00
Ryan Petrello
ab6322a8f7 fix a bug that breaks webhook launches when a survey is in use
see: https://github.com/ansible/awx/issues/5062
2019-10-29 11:24:12 -04:00
Ryan Petrello
8bc6367e1e fix a bug introduced upstream with settings.LOG_AGGREGATOR_AUDIT 2019-10-29 11:24:12 -04:00
Alex Corey
b74bf9f266 Instance Groups Instances List styling fixes (#3846)
* Instance Groups Instances slider renders properly, and that list wraps properly.

* Instance Groups responds properly

* assorted container groups ui fixes
updated responsiveness of instance groups and instances list
fix layout of container group form
update help text for container group form elements
update text for tech preview top bar

* update container group doclink

* list styling updates based on feedback
2019-10-29 11:24:11 -04:00
Martin Juhl
321aa3b01d Update handlers.py
The setFormatter tries to create the external.log file, so we should check if LOG_AGGREGATOR_AUDIT is active here as well
2019-10-29 11:24:11 -04:00
Ryan Petrello
7f1096f711 reap k8s-based jobs when the dispatcher restarts 2019-10-29 11:24:11 -04:00
Marliana Lara
2b6cfd7b3d Handle undefined schedule value in job detail component 2019-10-29 11:24:11 -04:00
Graham Mainwaring
b2b33605cc Add UI toggle to disable public Galaxy (#3867) 2019-10-29 11:24:11 -04:00
mabashian
d06b0de74b Revert 6282b5bacb 2019-10-29 11:24:11 -04:00
Ryan Petrello
6dfc714c75 when isolated or container jobs fail to launch, set job status to error
a status of error makes more sense, because failed generally points to
an issue with the playbook itself, while error is more generally used
for reporting issues internal to Tower

see: https://github.com/ansible/awx/issues/4909
2019-10-29 11:24:10 -04:00
Jake McDermott
cf5d3d55f0 Set omitted runner event line lengths to 0
runner_on_start events have zero-length strings for their stdout
fields. We don't want to display these in the ui so we omit them.
Although the stdout field is an empty string, it still has a recorded
line length of 1 that we must account for. Since we're not rendering
the blank line, we must also go back and set the event record's line
length to 0 in order to avoid deleting too many lines when we pop or
shift events off of the view while scrolling.
2019-10-29 11:24:10 -04:00
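The bookkeeping problem described in the commit above can be sketched in a few lines (plain Python rather than the UI JavaScript, with made-up event records):

    # Hypothetical event records: the runner_on_start event has empty stdout
    # and is never rendered, but it originally still carried a line length of 1.
    events = [{'stdout': 'TASK [ping]', 'line_length': 1},
              {'stdout': '', 'line_length': 1},        # omitted from the view
              {'stdout': 'ok: [host1]', 'line_length': 1}]
    rendered_lines = [e['stdout'] for e in events if e['stdout']]
    assert sum(e['line_length'] for e in events) == 3   # but only 2 lines are shown
    for e in events:
        if not e['stdout']:
            e['line_length'] = 0                        # the fix described above
    assert sum(e['line_length'] for e in events) == len(rendered_lines)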
Jake McDermott
e91d383165 Fix off-by-one errors 2019-10-29 11:24:10 -04:00
mabashian
72d19b93a0 Prettier formatting 2019-10-29 11:01:04 -04:00
softwarefactory-project-zuul[bot]
ff1c96b0e0 Merge pull request #5132 from mabashian/4980-job-details-delete
Hide delete button on job details from users without proper permissions

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2019-10-29 01:18:50 +00:00
softwarefactory-project-zuul[bot]
6aaf906594 Merge pull request #5130 from mabashian/4948-empty-list
Fix org teams empty list text

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2019-10-29 01:13:08 +00:00
Keith Grant
da7baced50 upgrade patternfly to latest, update tests 2019-10-28 15:59:47 -07:00
softwarefactory-project-zuul[bot]
2b10c0f3f2 Merge pull request #5042 from craph/devel
Improve usage of ssl_certificate in local_docker

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2019-10-28 22:09:20 +00:00
mabashian
01788263e2 Hide delete button on job details from users without proper permissions 2019-10-28 17:50:32 -04:00
mabashian
8daceabd26 Removes close button from footer of host details modal 2019-10-28 17:38:28 -04:00
Raphaël COMBEAU
712b07c136 Improve usage of ssl_certificate in local_docker
Remove nginx.conf from container

Move nginx outside ssl_certificate block
2019-10-28 17:37:14 -04:00
mabashian
8fbfed5c55 Fix org teams empty list text 2019-10-28 17:21:14 -04:00
softwarefactory-project-zuul[bot]
c4a3c0aac1 Merge pull request #5128 from fosterseth/fix-5081-towerbaseurl400
Fix URLField to allow numbers in top level domain

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2019-10-28 21:09:26 +00:00
softwarefactory-project-zuul[bot]
365f897059 Merge pull request #5103 from mabashian/proj-notifs
Hook up notifications tab on projects

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2019-10-28 19:38:15 +00:00
mabashian
7b1158ee8e Fix failing unit tests due to missing scm_revision key 2019-10-28 15:31:03 -04:00
mabashian
d8814b7162 Add displayName so that ActionButtonCell can be referenced in tests 2019-10-28 15:11:58 -04:00
mabashian
9af3fa557b Fix merge conflict fallout. Remove stale edit click handler. 2019-10-28 15:11:58 -04:00
mabashian
e0d8d35090 Adds edit buttons to Templates, Inventories, Organizations, and Projects list items when the user has edit capabilities. 2019-10-28 15:04:02 -04:00
Seth Foster
7e83ddc968 Fix URLField to allow numbers in top level domain
Add a custom regex to URLField that allows numbers to be present in the
top level domain, e.g. https://towerhost.org42

Set by variable allow_numbers_in_top_level_domain in URLField __init__,
and is set to True by default. If set to False, it will use the regex
specified in the built-in django URLValidator class.

This solution was originally implemented in LDAPServerURIField, but is
now implemented in URLField to support this behavior more generally. The
changes in LDAPServerURIField are no longer needed and have been removed in
this commit.

Adds unit testing to make sure URLField changes handle regex input
and settings correctly.
2019-10-28 13:47:01 -04:00
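A rough usage sketch of the new flag, assuming the import path used by the TestURLField diff further down and a configured AWX/Django environment (as in the unit tests); this is an illustration, not part of the commit:

    from rest_framework.fields import ValidationError
    from awx.conf.fields import URLField

    lenient = URLField()   # allow_numbers_in_top_level_domain defaults to True
    lenient.run_validators('https://towerhost.org42')   # accepted

    strict = URLField(allow_numbers_in_top_level_domain=False)
    try:
        strict.run_validators('https://towerhost.org42')
    except ValidationError:
        print('rejected by the stock Django URLValidator regex')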
softwarefactory-project-zuul[bot]
bbbacd62ae Merge pull request #5125 from gmarsay/bugfix-slack-notification
Bugfix - Slack notification with name and avatar

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2019-10-28 15:37:39 +00:00
softwarefactory-project-zuul[bot]
a6fd3d0c09 Merge pull request #5115 from AlanCoding/string_explosion
Fix bug: WFJT-type node YAML vars broke task manager

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2019-10-28 15:15:06 +00:00
softwarefactory-project-zuul[bot]
edf0d4bf85 Merge pull request #5120 from AlanCoding/var_lib
Move development PROJECTS_ROOT

Reviewed-by: Matthew Jones <mat@matburt.net>
             https://github.com/matburt
2019-10-28 15:08:00 +00:00
softwarefactory-project-zuul[bot]
5ab09686c9 Merge pull request #5043 from EStork09/devel
Added custom_venv_dir to local docker install,

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2019-10-27 23:42:35 +00:00
softwarefactory-project-zuul[bot]
4ed4d85b91 Merge pull request #5123 from r-daneel/devel
Add quote filter to shell variables

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2019-10-27 15:35:47 +00:00
softwarefactory-project-zuul[bot]
e066b688fc Merge pull request #5117 from ryanpetrello/runner-1-4-4
pin to runner==1.4.4

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2019-10-27 14:03:25 +00:00
Ryan Petrello
15111dd24a pin to runner==1.4.4 2019-10-27 09:17:10 -04:00
Guillaume Marsay
31a96d20ab Update slack_backend.py 2019-10-26 22:07:31 +02:00
softwarefactory-project-zuul[bot]
9a70ac88c0 Merge pull request #5075 from AlexSCorey/credentialsLookUp
Credentials look up

Reviewed-by: Alex Corey <Alex.swansboro@gmail.com>
             https://github.com/AlexSCorey
2019-10-25 20:51:13 +00:00
Ahmed RAHAL
2ec5dda1d8 Add quotes to shell variables with user input
The last update of this file added default values for passwords
but removed the 'quote' filter.
This is extremely problematic for database passwords that should always
be complex and contain special characters that the shell may interpret
wrongly.
As a sanity measure, adding the quote filter to all fields.
2019-10-25 16:44:59 -04:00
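To illustrate the point of the commit above in isolation (hypothetical template line and password, not the actual installer template; Ansible's 'quote' filter behaves like shlex.quote):

    from shlex import quote   # the behavior behind the Jinja2 'quote' filter
    password = 'p@ss;word$(x)'
    print('PGPASSWORD=' + password)         # ';' and '$(...)' get interpreted by the shell
    print('PGPASSWORD=' + quote(password))  # PGPASSWORD='p@ss;word$(x)' -- passed literally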
Alex Corey
dab80fb842 Adds Proptypes 2019-10-25 16:16:04 -04:00
AlanCoding
a6404bdd0d Move development PROJECTS_ROOT 2019-10-25 15:48:07 -04:00
softwarefactory-project-zuul[bot]
ee5199f77a Merge pull request #5090 from lopf/5089-fix-psql-user-specification
Fix psql: local user with ID 1001 does not exist

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2019-10-25 19:32:13 +00:00
Alex Corey
7f409c6487 Moves JT CredentialsList Manipulation Back to CredentialsLookup
Rename CredentialsLookup to MultiCredentialLookup
Removes unnecessary functions in Lookup.
Puts CredentialsList manipulation on CredsLookup and removes that work from JTForm.
Updates tests for CredentialsLookup and JTForm to reflect changes above.
2019-10-25 15:13:11 -04:00
softwarefactory-project-zuul[bot]
491e4c709e Merge pull request #5116 from jakemcdermott/remove-jdetails-close
Remove close button from job details

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2019-10-25 16:43:21 +00:00
softwarefactory-project-zuul[bot]
480c8516ab Merge pull request #5086 from keithjgrant/4614-fix-template-edit-border
Fix border/padding while loading jt edit form

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2019-10-25 16:31:53 +00:00
softwarefactory-project-zuul[bot]
9eda4efb74 Merge pull request #5114 from jakemcdermott/hide-revision-copy-button-sometimes
Hide revision copy button when there's no revision

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2019-10-25 16:21:14 +00:00
Jake McDermott
a517b15c26 Remove close button from job details 2019-10-25 11:59:40 -04:00
AlanCoding
609528e8a3 Fix bug: WFJT-type node YAML vars broke task manager 2019-10-25 11:41:58 -04:00
Jake McDermott
e17ee4b58f Hide revision copy button when there's no revision 2019-10-25 11:40:06 -04:00
softwarefactory-project-zuul[bot]
3dc8a10e85 Merge pull request #5104 from mabashian/4985-close-button-details
Remove close button from Project and Job Template details views

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2019-10-25 14:43:07 +00:00
softwarefactory-project-zuul[bot]
e893017e00 Merge pull request #5105 from mabashian/4956-template-details-links
Link to project and inventory from job template details

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2019-10-25 14:21:14 +00:00
softwarefactory-project-zuul[bot]
4a1c121792 Merge pull request #5084 from fosterseth/fix-4147-schedule500error
Fix 500 error when creating a job schedule

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2019-10-25 13:35:57 +00:00
softwarefactory-project-zuul[bot]
d39ad9d9ce Merge pull request #5085 from mabashian/5054-revision-column
Adds revision to project list row items

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2019-10-25 13:05:17 +00:00
mabashian
07a5e17284 Link to project and inventory from job template details 2019-10-24 16:36:19 -04:00
mabashian
583d1390d2 Remove close button from Project and Job Template details views 2019-10-24 16:17:37 -04:00
mabashian
638f8eae21 Hook up notifications tab on projects 2019-10-24 15:48:12 -04:00
Keith Grant
1d7bd835e6 remove unused imports 2019-10-24 10:29:04 -07:00
Keith Grant
4f90406e91 fix border/padding while loading jt edit form 2019-10-24 10:29:03 -07:00
Alex Corey
53b4dd5dbf Fixes linting errors 2019-10-24 12:35:30 -04:00
Alex Corey
491f4824b0 Addresses PR Issues
Improves credential ID variable in JT model.
Removes unused prop from Lookup ComponentDidMount.
Removed unused function call from Credentials ComponentDidMount.
Streamlines toggleCredential function and moves it to JobTemplateForm.  This was done because the
JobTemplateForm should handle the credential values passed to the CredentialsLookup.
Adds tests for JobTemplateForm to ensure toggleCredentialSelection function is putting proper values
in state.
Removed withRouter wrapper on CredentialsLookup export.
Improved CredentialsLookup test to ensure that onChange is called when user removes
a credential from the input.
2019-10-24 12:32:50 -04:00
Alex Corey
91721e09df Adds tests 2019-10-24 12:32:50 -04:00
Alex Corey
2828d31141 Adds CredentialLookUp to JT Form 2019-10-24 12:32:50 -04:00
Alex Corey
d10e727b3c Adds CredentialLookUp to JT Form 2019-10-24 12:32:50 -04:00
softwarefactory-project-zuul[bot]
f57cf03f4b Merge pull request #5017 from kimausloos/update-docs-openshift-scc
[docs] Update OpenShift doc section to clarify #3116

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2019-10-24 15:47:21 +00:00
mabashian
b319f47048 Adds revision to project list row items. Adds ClipboardCopyButton component to allow the user to copy the full revision to the clipboard. 2019-10-24 09:11:05 -04:00
lopf
432daa6139 fix 5089 2019-10-24 14:44:36 +02:00
Kim Ausloos
835c26f6cb [docs] Update OpenShift doc section to clarify #3116
Signed-off-by: Kim Ausloos <kim.ausloos@cegeka.be>
2019-10-24 08:55:40 +02:00
softwarefactory-project-zuul[bot]
f1c2a95f0d Merge pull request #5053 from fosterseth/fix-4797-copycredential
Fix secret lookup links when credentials are copied

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2019-10-23 23:46:52 +00:00
Seth Foster
58e84a40e5 Fix 500 error when creating a job schedule
- 500 error occurs when a non-admin user attempts to add an invalid
  credential during schedule creation
- This change checks that the user can add the object to
  serializer.validated_data, instead of serializer.initial_data
- The invalid credential field is purged in .validated_data, so the
  request passes through cleanly
- Fix for awx issue #4147
2019-10-23 14:22:07 -04:00
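Schematically (plain Python, not the AWX serializer code), the fix in the commit above swaps which view of the request data feeds the permission check, matching the SubListCreateAPIView hunk later in this diff:

    # serializer.initial_data is the raw request payload; serializer.validated_data
    # only contains what survived validation, so the invalid credential is gone.
    initial_data = {'name': 'nightly backup', 'credential': 999}   # raw payload
    validated_data = {'name': 'nightly backup'}                    # credential purged

    def can_access_add(user_is_admin, data):
        # hypothetical stand-in for request.user.can_access(model, 'add', data)
        return user_is_admin or 'credential' not in data

    assert not can_access_add(False, initial_data)   # old check -> surfaced as a 500
    assert can_access_add(False, validated_data)     # new check -> request proceeds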
Seth Foster
9c04e08b4d Fix secret lookup links when credentials are copied
- When a credential that contains secret lookups (e.g. HashiCorp Vault
  Secret Lookup) is copied, the lookup fields are not properly copied
- This change adds the necessary fields to FIELDS_TO_PRESERVE_AT_COPY
  for both Credential and CredentialInputSource classes to ensure a
  proper copy
2019-10-22 17:49:10 -04:00
softwarefactory-project-zuul[bot]
bda1abab8d Merge pull request #5074 from shanemcd/devel
Downstream k8s installer changes

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2019-10-22 20:34:20 +00:00
Shane McDonald
8356327c2b Downstream k8s installer changes 2019-10-22 15:57:40 -04:00
softwarefactory-project-zuul[bot]
cafac2338d Merge pull request #5063 from HunterNyan/devel
Fixed bug with python check

Reviewed-by: Shane McDonald <me@shanemcd.com>
             https://github.com/shanemcd
2019-10-22 13:37:37 +00:00
Alice Hunter
e5dfc62dce Fixed bug with python check 2019-10-22 23:06:06 +11:00
softwarefactory-project-zuul[bot]
11edd43af3 Merge pull request #5058 from MrMEEE/patch-1
Only create /var/log/tower/external.log when LOG_AGGREGATOR_AUDIT is enabled

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2019-10-22 11:16:15 +00:00
Martin Juhl
27d0111a27 Update handlers.py 2019-10-22 01:25:27 +02:00
Martin Juhl
58367811a0 Update handlers.py
The setFormatter tries to create the external.log file, so we should check if LOG_AGGREGATOR_AUDIT is active here as well
2019-10-22 01:02:31 +02:00
Evan Stork
0c0e172caf Added custom_venv_dir to local docker install,
Signed-off-by: Evan Stork <estork@live.com>
2019-10-19 20:45:02 -04:00
215 changed files with 4035 additions and 1905 deletions

7
.gitignore vendored
View File

@@ -135,9 +135,10 @@ use_dev_supervisor.txt
# Ansible module tests
awx_collection_test_venv/
awx_collection/*.tar.gz
awx_collection/galaxy.yml
/awx_collection_test_venv/
/awx_collection/*.tar.gz
/awx_collection/galaxy.yml
/sanity/
.idea/*
*.unison.tmp

View File

@@ -120,6 +120,8 @@ If these variables are present then all deployments will use these hosted images
To complete a deployment to OpenShift, you will obviously need access to an OpenShift cluster. For demo and testing purposes, you can use [Minishift](https://github.com/minishift/minishift) to create a single node cluster running inside a virtual machine.
When using OpenShift for deploying AWX make sure you have correct privileges to add the security context 'privileged', otherwise the installation will fail. The privileged context is needed because of the use of [the bubblewrap tool](https://github.com/containers/bubblewrap) to add an additional layer of security when using containers.
You will also need to have the `oc` command in your PATH. The `install.yml` playbook will call out to `oc` when logging into, and creating objects on the cluster.
The default resource requests per-deployment requires:
@@ -456,6 +458,10 @@ Before starting the build process, review the [inventory](./installer/inventory)
> When using docker-compose, the `docker-compose.yml` file will be created there (default `/tmp/awxcompose`).
*custom_venv_dir*
> Adds the custom venv environments from the local host to be passed into the containers at install.
*ca_trust_dir*
> If you're using a non trusted CA, provide a path where the untrusted Certs are stored on your Host.

View File

@@ -196,7 +196,7 @@ requirements_awx_dev:
requirements: requirements_ansible requirements_awx
requirements_dev: requirements requirements_awx_dev requirements_ansible_dev
requirements_dev: requirements_awx requirements_ansible_py3 requirements_awx_dev requirements_ansible_dev
requirements_test: requirements
@@ -399,6 +399,13 @@ flake8_collection:
test_collection_all: prepare_collection_venv test_collection flake8_collection
test_collection_sanity:
rm -rf sanity
mkdir -p sanity/ansible_collections/awx
cp -Ra awx_collection sanity/ansible_collections/awx/awx # symlinks do not work
cd sanity/ansible_collections/awx/awx && git init && git add . # requires both this file structure and a git repo, so there you go
cd sanity/ansible_collections/awx/awx && ansible-test sanity --test validate-modules
build_collection:
ansible-playbook -i localhost, awx_collection/template_galaxy.yml -e collection_package=$(COLLECTION_PACKAGE) -e collection_namespace=$(COLLECTION_NAMESPACE) -e collection_version=$(VERSION)
ansible-galaxy collection build awx_collection --output-path=awx_collection

View File

@@ -1 +1 @@
8.0.0
9.0.0

View File

@@ -574,7 +574,7 @@ class SubListCreateAPIView(SubListAPIView, ListCreateAPIView):
status=status.HTTP_400_BAD_REQUEST)
# Verify we have permission to add the object as given.
if not request.user.can_access(self.model, 'add', serializer.initial_data):
if not request.user.can_access(self.model, 'add', serializer.validated_data):
raise PermissionDenied()
# save the object through the serializer, reload and returned the saved

View File

@@ -158,9 +158,16 @@ class Metadata(metadata.SimpleMetadata):
isinstance(field, JSONField) or
isinstance(model_field, JSONField) or
isinstance(field, DRFJSONField) or
isinstance(getattr(field, 'model_field', None), JSONField)
isinstance(getattr(field, 'model_field', None), JSONField) or
field.field_name == 'credential_passwords'
):
field_info['type'] = 'json'
elif (
isinstance(field, ManyRelatedField) and
field.field_name == 'credentials'
# launch-time credentials
):
field_info['type'] = 'list_of_ids'
elif isinstance(model_field, BooleanField):
field_info['type'] = 'boolean'

View File

@@ -4338,13 +4338,30 @@ class NotificationTemplateSerializer(BaseSerializer):
error_list = []
collected_messages = []
def check_messages(messages):
for message_type in messages:
if message_type not in ('message', 'body'):
error_list.append(_("Message type '{}' invalid, must be either 'message' or 'body'").format(message_type))
continue
message = messages[message_type]
if message is None:
continue
if not isinstance(message, str):
error_list.append(_("Expected string for '{}', found {}, ").format(message_type, type(message)))
continue
if message_type == 'message':
if '\n' in message:
error_list.append(_("Messages cannot contain newlines (found newline in {} event)".format(event)))
continue
collected_messages.append(message)
# Validate structure / content types
if not isinstance(messages, dict):
error_list.append(_("Expected dict for 'messages' field, found {}".format(type(messages))))
else:
for event in messages:
if event not in ['started', 'success', 'error']:
error_list.append(_("Event '{}' invalid, must be one of 'started', 'success', or 'error'").format(event))
if event not in ('started', 'success', 'error', 'workflow_approval'):
error_list.append(_("Event '{}' invalid, must be one of 'started', 'success', 'error', or 'workflow_approval'").format(event))
continue
event_messages = messages[event]
if event_messages is None:
@@ -4352,21 +4369,21 @@ class NotificationTemplateSerializer(BaseSerializer):
if not isinstance(event_messages, dict):
error_list.append(_("Expected dict for event '{}', found {}").format(event, type(event_messages)))
continue
for message_type in event_messages:
if message_type not in ['message', 'body']:
error_list.append(_("Message type '{}' invalid, must be either 'message' or 'body'").format(message_type))
continue
message = event_messages[message_type]
if message is None:
continue
if not isinstance(message, str):
error_list.append(_("Expected string for '{}', found {}, ").format(message_type, type(message)))
continue
if message_type == 'message':
if '\n' in message:
error_list.append(_("Messages cannot contain newlines (found newline in {} event)".format(event)))
if event == 'workflow_approval':
for subevent in event_messages:
if subevent not in ('running', 'approved', 'timed_out', 'denied'):
error_list.append(_("Workflow Approval event '{}' invalid, must be one of "
"'running', 'approved', 'timed_out', or 'denied'").format(subevent))
continue
collected_messages.append(message)
subevent_messages = event_messages[subevent]
if subevent_messages is None:
continue
if not isinstance(subevent_messages, dict):
error_list.append(_("Expected dict for workflow approval event '{}', found {}").format(subevent, type(subevent_messages)))
continue
check_messages(subevent_messages)
else:
check_messages(event_messages)
# Subclass to return name of undefined field
class DescriptiveUndefined(StrictUndefined):
@@ -4497,8 +4514,18 @@ class NotificationSerializer(BaseSerializer):
'notification_type', 'recipients', 'subject', 'body')
def get_body(self, obj):
if obj.notification_type == 'webhook' and 'body' in obj.body:
return obj.body['body']
if obj.notification_type in ('webhook', 'pagerduty'):
if isinstance(obj.body, dict):
if 'body' in obj.body:
return obj.body['body']
elif isinstance(obj.body, str):
# attempt to load json string
try:
potential_body = json.loads(obj.body)
if isinstance(potential_body, dict):
return potential_body
except json.JSONDecodeError:
pass
return obj.body
def get_related(self, obj):
@@ -4774,6 +4801,18 @@ class InstanceGroupSerializer(BaseSerializer):
raise serializers.ValidationError(_('Isolated instances may not be added or removed from instances groups via the API.'))
if self.instance and self.instance.controller_id is not None:
raise serializers.ValidationError(_('Isolated instance group membership may not be managed via the API.'))
if value and self.instance and self.instance.is_containerized:
raise serializers.ValidationError(_('Containerized instances may not be managed via the API'))
return value
def validate_policy_instance_percentage(self, value):
if value and self.instance and self.instance.is_containerized:
raise serializers.ValidationError(_('Containerized instances may not be managed via the API'))
return value
def validate_policy_instance_minimum(self, value):
if value and self.instance and self.instance.is_containerized:
raise serializers.ValidationError(_('Containerized instances may not be managed via the API'))
return value
def validate_name(self, value):

View File

@@ -102,7 +102,7 @@ from awx.main.scheduler.dag_workflow import WorkflowDAG
from awx.api.views.mixin import (
ControlledByScmMixin, InstanceGroupMembershipMixin,
OrganizationCountsMixin, RelatedJobsPreventDeleteMixin,
UnifiedJobDeletionMixin,
UnifiedJobDeletionMixin, NoTruncateMixin,
)
from awx.api.views.organization import ( # noqa
OrganizationList,
@@ -383,6 +383,13 @@ class InstanceGroupDetail(RelatedJobsPreventDeleteMixin, RetrieveUpdateDestroyAP
serializer_class = serializers.InstanceGroupSerializer
permission_classes = (InstanceGroupTowerPermission,)
def update_raw_data(self, data):
if self.get_object().is_containerized:
data.pop('policy_instance_percentage', None)
data.pop('policy_instance_minimum', None)
data.pop('policy_instance_list', None)
return super(InstanceGroupDetail, self).update_raw_data(data)
def destroy(self, request, *args, **kwargs):
instance = self.get_object()
if instance.controller is not None:
@@ -568,6 +575,7 @@ class TeamUsersList(BaseUsersList):
serializer_class = serializers.UserSerializer
parent_model = models.Team
relationship = 'member_role.members'
ordering = ('username',)
class TeamRolesList(SubListAttachDetachAPIView):
@@ -904,6 +912,7 @@ class UserList(ListCreateAPIView):
model = models.User
serializer_class = serializers.UserSerializer
permission_classes = (UserPermission,)
ordering = ('username',)
class UserMeList(ListAPIView):
@@ -911,6 +920,7 @@ class UserMeList(ListAPIView):
model = models.User
serializer_class = serializers.UserSerializer
name = _('Me')
ordering = ('username',)
def get_queryset(self):
return self.model.objects.filter(pk=self.request.user.pk)
@@ -1254,6 +1264,7 @@ class CredentialOwnerUsersList(SubListAPIView):
serializer_class = serializers.UserSerializer
parent_model = models.Credential
relationship = 'admin_role.members'
ordering = ('username',)
class CredentialOwnerTeamsList(SubListAPIView):
@@ -2136,12 +2147,21 @@ class InventorySourceHostsList(HostRelatedSearchMixin, SubListDestroyAPIView):
def perform_list_destroy(self, instance_list):
inv_source = self.get_parent_object()
with ignore_inventory_computed_fields():
# Activity stream doesn't record disassociation here anyway
# no signals-related reason to not bulk-delete
models.Host.groups.through.objects.filter(
host__inventory_sources=inv_source
).delete()
r = super(InventorySourceHostsList, self).perform_list_destroy(instance_list)
if not settings.ACTIVITY_STREAM_ENABLED_FOR_INVENTORY_SYNC:
from awx.main.signals import disable_activity_stream
with disable_activity_stream():
# job host summary deletion necessary to avoid deadlock
models.JobHostSummary.objects.filter(host__inventory_sources=inv_source).update(host=None)
models.Host.objects.filter(inventory_sources=inv_source).delete()
r = super(InventorySourceHostsList, self).perform_list_destroy([])
else:
# Advance delete of group-host memberships to prevent deadlock
# Activity stream doesn't record disassociation here anyway
# no signals-related reason to not bulk-delete
models.Host.groups.through.objects.filter(
host__inventory_sources=inv_source
).delete()
r = super(InventorySourceHostsList, self).perform_list_destroy(instance_list)
update_inventory_computed_fields.delay(inv_source.inventory_id, True)
return r
@@ -2157,11 +2177,18 @@ class InventorySourceGroupsList(SubListDestroyAPIView):
def perform_list_destroy(self, instance_list):
inv_source = self.get_parent_object()
with ignore_inventory_computed_fields():
# Same arguments for bulk delete as with host list
models.Group.hosts.through.objects.filter(
group__inventory_sources=inv_source
).delete()
r = super(InventorySourceGroupsList, self).perform_list_destroy(instance_list)
if not settings.ACTIVITY_STREAM_ENABLED_FOR_INVENTORY_SYNC:
from awx.main.signals import disable_activity_stream
with disable_activity_stream():
models.Group.objects.filter(inventory_sources=inv_source).delete()
r = super(InventorySourceGroupsList, self).perform_list_destroy([])
else:
# Advance delete of group-host memberships to prevent deadlock
# Same arguments for bulk delete as with host list
models.Group.hosts.through.objects.filter(
group__inventory_sources=inv_source
).delete()
r = super(InventorySourceGroupsList, self).perform_list_destroy(instance_list)
update_inventory_computed_fields.delay(inv_source.inventory_id, True)
return r
@@ -3762,18 +3789,12 @@ class JobHostSummaryDetail(RetrieveAPIView):
serializer_class = serializers.JobHostSummarySerializer
class JobEventList(ListAPIView):
class JobEventList(NoTruncateMixin, ListAPIView):
model = models.JobEvent
serializer_class = serializers.JobEventSerializer
search_fields = ('stdout',)
def get_serializer_context(self):
context = super().get_serializer_context()
if self.request.query_params.get('no_truncate'):
context.update(no_truncate=True)
return context
class JobEventDetail(RetrieveAPIView):
@@ -3786,7 +3807,7 @@ class JobEventDetail(RetrieveAPIView):
return context
class JobEventChildrenList(SubListAPIView):
class JobEventChildrenList(NoTruncateMixin, SubListAPIView):
model = models.JobEvent
serializer_class = serializers.JobEventSerializer
@@ -3811,7 +3832,7 @@ class JobEventHostsList(HostRelatedSearchMixin, SubListAPIView):
name = _('Job Event Hosts List')
class BaseJobEventsList(SubListAPIView):
class BaseJobEventsList(NoTruncateMixin, SubListAPIView):
model = models.JobEvent
serializer_class = serializers.JobEventSerializer
@@ -4007,18 +4028,12 @@ class AdHocCommandRelaunch(GenericAPIView):
return Response(data, status=status.HTTP_201_CREATED, headers=headers)
class AdHocCommandEventList(ListAPIView):
class AdHocCommandEventList(NoTruncateMixin, ListAPIView):
model = models.AdHocCommandEvent
serializer_class = serializers.AdHocCommandEventSerializer
search_fields = ('stdout',)
def get_serializer_context(self):
context = super().get_serializer_context()
if self.request.query_params.get('no_truncate'):
context.update(no_truncate=True)
return context
class AdHocCommandEventDetail(RetrieveAPIView):
@@ -4031,7 +4046,7 @@ class AdHocCommandEventDetail(RetrieveAPIView):
return context
class BaseAdHocCommandEventsList(SubListAPIView):
class BaseAdHocCommandEventsList(NoTruncateMixin, SubListAPIView):
model = models.AdHocCommandEvent
serializer_class = serializers.AdHocCommandEventSerializer
@@ -4297,8 +4312,15 @@ class NotificationTemplateTest(GenericAPIView):
def post(self, request, *args, **kwargs):
obj = self.get_object()
notification = obj.generate_notification("Tower Notification Test {} {}".format(obj.id, settings.TOWER_URL_BASE),
{"body": "Ansible Tower Test Notification {} {}".format(obj.id, settings.TOWER_URL_BASE)})
msg = "Tower Notification Test {} {}".format(obj.id, settings.TOWER_URL_BASE)
if obj.notification_type in ('email', 'pagerduty'):
body = "Ansible Tower Test Notification {} {}".format(obj.id, settings.TOWER_URL_BASE)
elif obj.notification_type == 'webhook':
body = '{{"body": "Ansible Tower Test Notification {} {}"}}'.format(obj.id, settings.TOWER_URL_BASE)
else:
body = {"body": "Ansible Tower Test Notification {} {}".format(obj.id, settings.TOWER_URL_BASE)}
notification = obj.generate_notification(msg, body)
if not notification:
return Response({}, status=status.HTTP_400_BAD_REQUEST)
else:

View File

@@ -270,3 +270,11 @@ class ControlledByScmMixin(object):
obj = super(ControlledByScmMixin, self).get_parent_object()
self._reset_inv_src_rev(obj)
return obj
class NoTruncateMixin(object):
def get_serializer_context(self):
context = super().get_serializer_context()
if self.request.query_params.get('no_truncate'):
context.update(no_truncate=True)
return context

View File

@@ -1,6 +1,5 @@
from hashlib import sha1
import hmac
import json
import logging
import urllib.parse
@@ -151,13 +150,13 @@ class WebhookReceiverBase(APIView):
'webhook_credential': obj.webhook_credential,
'webhook_guid': event_guid,
},
'extra_vars': json.dumps({
'extra_vars': {
'tower_webhook_event_type': event_type,
'tower_webhook_event_guid': event_guid,
'tower_webhook_event_ref': event_ref,
'tower_webhook_status_api': status_api,
'tower_webhook_payload': request.data,
})
}
}
new_job = obj.create_unified_job(**kwargs)

View File

@@ -1,11 +1,12 @@
# Python
import os
import re
import logging
import urllib.parse as urlparse
from collections import OrderedDict
# Django
from django.core.validators import URLValidator
from django.core.validators import URLValidator, _lazy_re_compile
from django.utils.translation import ugettext_lazy as _
# Django REST Framework
@@ -118,17 +119,42 @@ class StringListPathField(StringListField):
class URLField(CharField):
# these lines set up a custom regex that allow numbers in the
# top-level domain
tld_re = (
r'\.' # dot
r'(?!-)' # can't start with a dash
r'(?:[a-z' + URLValidator.ul + r'0-9' + '-]{2,63}' # domain label, this line was changed from the original URLValidator
r'|xn--[a-z0-9]{1,59})' # or punycode label
r'(?<!-)' # can't end with a dash
r'\.?' # may have a trailing dot
)
host_re = '(' + URLValidator.hostname_re + URLValidator.domain_re + tld_re + '|localhost)'
regex = _lazy_re_compile(
r'^(?:[a-z0-9\.\-\+]*)://' # scheme is validated separately
r'(?:[^\s:@/]+(?::[^\s:@/]*)?@)?' # user:pass authentication
r'(?:' + URLValidator.ipv4_re + '|' + URLValidator.ipv6_re + '|' + host_re + ')'
r'(?::\d{2,5})?' # port
r'(?:[/?#][^\s]*)?' # resource path
r'\Z', re.IGNORECASE)
def __init__(self, **kwargs):
schemes = kwargs.pop('schemes', None)
regex = kwargs.pop('regex', None)
self.allow_plain_hostname = kwargs.pop('allow_plain_hostname', False)
self.allow_numbers_in_top_level_domain = kwargs.pop('allow_numbers_in_top_level_domain', True)
super(URLField, self).__init__(**kwargs)
validator_kwargs = dict(message=_('Enter a valid URL'))
if schemes is not None:
validator_kwargs['schemes'] = schemes
if regex is not None:
validator_kwargs['regex'] = regex
if self.allow_numbers_in_top_level_domain and regex is None:
# default behavior is to allow numbers in the top level domain
# if a custom regex isn't provided
validator_kwargs['regex'] = URLField.regex
self.validators.append(URLValidator(**validator_kwargs))
def to_representation(self, value):

View File

@@ -1,7 +1,7 @@
import pytest
from rest_framework.fields import ValidationError
from awx.conf.fields import StringListBooleanField, StringListPathField, ListTuplesField
from awx.conf.fields import StringListBooleanField, StringListPathField, ListTuplesField, URLField
class TestStringListBooleanField():
@@ -62,7 +62,7 @@ class TestListTuplesField():
FIELD_VALUES = [
([('a', 'b'), ('abc', '123')], [("a", "b"), ("abc", "123")]),
]
FIELD_VALUES_INVALID = [
("abc", type("abc")),
([('a', 'b', 'c'), ('abc', '123', '456')], type(('a',))),
@@ -130,3 +130,25 @@ class TestStringListPathField():
field.to_internal_value([value])
assert e.value.detail[0] == "{} is not a valid path choice.".format(value)
class TestURLField():
regex = "^https://www.example.org$"
@pytest.mark.parametrize("url,schemes,regex, allow_numbers_in_top_level_domain, expect_no_error",[
("ldap://www.example.org42", "ldap", None, True, True),
("https://www.example.org42", "https", None, False, False),
("https://www.example.org", None, regex, None, True),
("https://www.example3.org", None, regex, None, False),
("ftp://www.example.org", "https", None, None, False)
])
def test_urls(self, url, schemes, regex, allow_numbers_in_top_level_domain, expect_no_error):
kwargs = {}
kwargs.setdefault("allow_numbers_in_top_level_domain", allow_numbers_in_top_level_domain)
kwargs.setdefault("schemes", schemes)
kwargs.setdefault("regex", regex)
field = URLField(**kwargs)
if expect_no_error:
field.run_validators(url)
else:
with pytest.raises(ValidationError):
field.run_validators(url)

View File

@@ -465,7 +465,7 @@ class BaseAccess(object):
else:
relationship = 'members'
return access_method(obj, parent_obj, relationship, skip_sub_obj_read_check=True, data={})
except (ParseError, ObjectDoesNotExist):
except (ParseError, ObjectDoesNotExist, PermissionDenied):
return False
return False
@@ -1660,26 +1660,19 @@ class JobAccess(BaseAccess):
except JobLaunchConfig.DoesNotExist:
config = None
if obj.job_template and (self.user not in obj.job_template.execute_role):
return False
# Check if JT execute access (and related prompts) is sufficient
if obj.job_template is not None:
if config is None:
prompts_access = False
elif not config.has_user_prompts(obj.job_template):
prompts_access = True
elif obj.created_by_id != self.user.pk and vars_are_encrypted(config.extra_data):
prompts_access = False
if self.save_messages:
self.messages['detail'] = _('Job was launched with secret prompts provided by another user.')
else:
prompts_access = (
JobLaunchConfigAccess(self.user).can_add({'reference_obj': config}) and
not config.has_unprompted(obj.job_template)
)
jt_access = self.user in obj.job_template.execute_role
if prompts_access and jt_access:
if config and obj.job_template:
if not config.has_user_prompts(obj.job_template):
return True
elif not jt_access:
return False
elif obj.created_by_id != self.user.pk and vars_are_encrypted(config.extra_data):
# never allowed, not even for org admins
raise PermissionDenied(_('Job was launched with secret prompts provided by another user.'))
elif not config.has_unprompted(obj.job_template):
if JobLaunchConfigAccess(self.user).can_add({'reference_obj': config}):
return True
org_access = bool(obj.inventory) and self.user in obj.inventory.organization.inventory_admin_role
project_access = obj.project is None or self.user in obj.project.admin_role
@@ -2098,23 +2091,20 @@ class WorkflowJobAccess(BaseAccess):
self.messages['detail'] = _('Workflow Job was launched with unknown prompts.')
return False
# execute permission to WFJT is mandatory for any relaunch
if self.user not in template.execute_role:
return False
# Check if access to prompts to prevent relaunch
if config.prompts_dict():
if obj.created_by_id != self.user.pk and vars_are_encrypted(config.extra_data):
if self.save_messages:
self.messages['detail'] = _('Job was launched with secret prompts provided by another user.')
return False
raise PermissionDenied(_("Job was launched with secret prompts provided by another user."))
if not JobLaunchConfigAccess(self.user).can_add({'reference_obj': config}):
if self.save_messages:
self.messages['detail'] = _('Job was launched with prompts you lack access to.')
return False
raise PermissionDenied(_('Job was launched with prompts you lack access to.'))
if config.has_unprompted(template):
if self.save_messages:
self.messages['detail'] = _('Job was launched with prompts no longer accepted.')
return False
raise PermissionDenied(_('Job was launched with prompts no longer accepted.'))
# execute permission to WFJT is mandatory for any relaunch
return (self.user in template.execute_role)
return True # passed config checks
def can_recreate(self, obj):
node_qs = obj.workflow_job_nodes.all().prefetch_related('inventory', 'credentials', 'unified_job_template')

View File

@@ -54,15 +54,6 @@ register(
category_slug='system',
)
register(
'TOWER_ADMIN_ALERTS',
field_class=fields.BooleanField,
label=_('Enable Administrator Alerts'),
help_text=_('Email Admin users for system events that may require attention.'),
category=_('System'),
category_slug='system',
)
register(
'TOWER_URL_BASE',
field_class=fields.URLField,
@@ -513,6 +504,16 @@ register(
category_slug='jobs'
)
register(
'PUBLIC_GALAXY_ENABLED',
field_class=fields.BooleanField,
default=True,
label=_('Allow Access to Public Galaxy'),
help_text=_('Allow or deny access to the public Ansible Galaxy during project updates.'),
category=_('Jobs'),
category_slug='jobs'
)
register(
'STDOUT_MAX_BYTES_DISPLAY',
field_class=fields.IntegerField,

View File

@@ -4,6 +4,7 @@ import importlib
import sys
import traceback
from kubernetes.config import kube_config
from awx.main.tasks import dispatch_startup, inform_cluster_of_shutdown
@@ -107,6 +108,14 @@ class TaskWorker(BaseWorker):
for callback in body.get('errbacks', []) or []:
callback['uuid'] = body['uuid']
self.perform_work(callback)
finally:
# It's frustrating that we have to do this, but the python k8s
# client leaves behind cacert files in /tmp, so we must clean up
# the tmpdir per-dispatcher process every time a new task comes in
try:
kube_config._cleanup_temp_files()
except Exception:
logger.exception('failed to cleanup k8s client tmp files')
for callback in body.get('callbacks', []) or []:
callback['uuid'] = body['uuid']

View File

@@ -6,6 +6,7 @@ import stat
import tempfile
import time
import logging
import yaml
from django.conf import settings
import ansible_runner
@@ -48,10 +49,17 @@ class IsolatedManager(object):
def build_inventory(self, hosts):
if self.instance and self.instance.is_containerized:
inventory = {'all': {'hosts': {}}}
fd, path = tempfile.mkstemp(
prefix='.kubeconfig', dir=self.private_data_dir
)
with open(path, 'wb') as temp:
temp.write(yaml.dump(self.pod_manager.kube_config).encode())
temp.flush()
os.chmod(temp.name, stat.S_IRUSR | stat.S_IWUSR | stat.S_IXUSR)
for host in hosts:
inventory['all']['hosts'][host] = {
"ansible_connection": "kubectl",
"ansible_kubectl_config": self.pod_manager.kube_config
"ansible_kubectl_config": path,
}
else:
inventory = '\n'.join([
@@ -143,6 +151,8 @@ class IsolatedManager(object):
'- /artifacts/job_events/*-partial.json.tmp',
# don't rsync the ssh_key FIFO
'- /env/ssh_key',
# don't rsync kube config files
'- .kubeconfig*'
]
for filename, data in (

View File

@@ -295,7 +295,10 @@ class PrimordialModel(HasEditsMixin, CreatedModifiedModel):
def __init__(self, *args, **kwargs):
r = super(PrimordialModel, self).__init__(*args, **kwargs)
self._prior_values_store = self._get_fields_snapshot()
if self.pk:
self._prior_values_store = self._get_fields_snapshot()
else:
self._prior_values_store = {}
return r
def save(self, *args, **kwargs):

View File

@@ -86,6 +86,7 @@ class Credential(PasswordFieldsModel, CommonModelNameNotUnique, ResourceMixin):
unique_together = (('organization', 'name', 'credential_type'))
PASSWORD_FIELDS = ['inputs']
FIELDS_TO_PRESERVE_AT_COPY = ['input_sources']
credential_type = models.ForeignKey(
'CredentialType',
@@ -1162,6 +1163,8 @@ class CredentialInputSource(PrimordialModel):
unique_together = (('target_credential', 'input_field_name'),)
ordering = ('target_credential', 'source_credential', 'input_field_name',)
FIELDS_TO_PRESERVE_AT_COPY = ['source_credential', 'metadata', 'input_field_name']
target_credential = models.ForeignKey(
'Credential',
related_name='input_sources',

View File

@@ -900,6 +900,9 @@ class LaunchTimeConfigBase(BaseModel):
data[prompt_name] = self.display_extra_vars()
else:
data[prompt_name] = self.extra_vars
# Depending on model, field type may save and return as string
if isinstance(data[prompt_name], str):
data[prompt_name] = parse_yaml_or_json(data[prompt_name])
if self.survey_passwords and not display:
data['survey_passwords'] = self.survey_passwords
else:

View File

@@ -73,7 +73,7 @@ class NotificationTemplate(CommonModelNameNotUnique):
notification_configuration = prevent_search(JSONField(blank=False))
def default_messages():
return {'started': None, 'success': None, 'error': None}
return {'started': None, 'success': None, 'error': None, 'workflow_approval': None}
messages = JSONField(
null=True,
@@ -92,25 +92,6 @@ class NotificationTemplate(CommonModelNameNotUnique):
def get_message(self, condition):
return self.messages.get(condition, {})
def build_notification_message(self, event_type, context):
env = sandbox.ImmutableSandboxedEnvironment()
templates = self.get_message(event_type)
msg_template = templates.get('message', {})
try:
notification_subject = env.from_string(msg_template).render(**context)
except (TemplateSyntaxError, UndefinedError, SecurityError):
notification_subject = ''
msg_body = templates.get('body', {})
try:
notification_body = env.from_string(msg_body).render(**context)
except (TemplateSyntaxError, UndefinedError, SecurityError):
notification_body = ''
return (notification_subject, notification_body)
def get_absolute_url(self, request=None):
return reverse('api:notification_template_detail', kwargs={'pk': self.pk}, request=request)
@@ -128,19 +109,34 @@ class NotificationTemplate(CommonModelNameNotUnique):
old_messages = old_nt.messages
new_messages = self.messages
def merge_messages(local_old_messages, local_new_messages, local_event):
if local_new_messages.get(local_event, {}) and local_old_messages.get(local_event, {}):
local_old_event_msgs = local_old_messages[local_event]
local_new_event_msgs = local_new_messages[local_event]
for msg_type in ['message', 'body']:
if msg_type not in local_new_event_msgs and local_old_event_msgs.get(msg_type, None):
local_new_event_msgs[msg_type] = local_old_event_msgs[msg_type]
if old_messages is not None and new_messages is not None:
for event in ['started', 'success', 'error']:
for event in ('started', 'success', 'error', 'workflow_approval'):
if not new_messages.get(event, {}) and old_messages.get(event, {}):
new_messages[event] = old_messages[event]
continue
if new_messages.get(event, {}) and old_messages.get(event, {}):
old_event_msgs = old_messages[event]
new_event_msgs = new_messages[event]
for msg_type in ['message', 'body']:
if msg_type not in new_event_msgs and old_event_msgs.get(msg_type, None):
new_event_msgs[msg_type] = old_event_msgs[msg_type]
if event == 'workflow_approval' and old_messages.get('workflow_approval', None):
new_messages.setdefault('workflow_approval', {})
for subevent in ('running', 'approved', 'timed_out', 'denied'):
old_wfa_messages = old_messages['workflow_approval']
new_wfa_messages = new_messages['workflow_approval']
if not new_wfa_messages.get(subevent, {}) and old_wfa_messages.get(subevent, {}):
new_wfa_messages[subevent] = old_wfa_messages[subevent]
continue
if old_wfa_messages:
merge_messages(old_wfa_messages, new_wfa_messages, subevent)
else:
merge_messages(old_messages, new_messages, event)
new_messages.setdefault(event, None)
for field in filter(lambda x: self.notification_class.init_parameters[x]['type'] == "password",
self.notification_class.init_parameters):
if self.notification_configuration[field].startswith("$encrypted$"):
@@ -169,12 +165,12 @@ class NotificationTemplate(CommonModelNameNotUnique):
def recipients(self):
return self.notification_configuration[self.notification_class.recipient_parameter]
def generate_notification(self, subject, message):
def generate_notification(self, msg, body):
notification = Notification(notification_template=self,
notification_type=self.notification_type,
recipients=smart_str(self.recipients),
subject=subject,
body=message)
subject=msg,
body=body)
notification.save()
return notification
@@ -370,7 +366,7 @@ class JobNotificationMixin(object):
'verbosity': 0},
'job_friendly_name': 'Job',
'url': 'https://towerhost/#/jobs/playbook/1010',
'job_summary_dict': """{'url': 'https://towerhost/$/jobs/playbook/13',
'job_metadata': """{'url': 'https://towerhost/$/jobs/playbook/13',
'traceback': '',
'status': 'running',
'started': '2019-08-07T21:46:38.362630+00:00',
@@ -389,14 +385,14 @@ class JobNotificationMixin(object):
return context
def context(self, serialized_job):
"""Returns a context that can be used for rendering notification messages.
Context contains whitelisted content retrieved from a serialized job object
"""Returns a dictionary that can be used for rendering notification messages.
The context will contain whitelisted content retrieved from a serialized job object
(see JobNotificationMixin.JOB_FIELDS_WHITELIST), the job's friendly name,
and a url to the job run."""
context = {'job': {},
'job_friendly_name': self.get_notification_friendly_name(),
'url': self.get_ui_url(),
'job_summary_dict': json.dumps(self.notification_data(), indent=4)}
'job_metadata': json.dumps(self.notification_data(), indent=4)}
def build_context(node, fields, whitelisted_fields):
for safe_field in whitelisted_fields:
@@ -434,32 +430,33 @@ class JobNotificationMixin(object):
context = self.context(job_serialization)
msg_template = body_template = None
msg = body = ''
# Use custom template if available
if nt.messages:
templates = nt.messages.get(self.STATUS_TO_TEMPLATE_TYPE[status], {}) or {}
msg_template = templates.get('message', {})
body_template = templates.get('body', {})
template = nt.messages.get(self.STATUS_TO_TEMPLATE_TYPE[status], {}) or {}
msg_template = template.get('message', None)
body_template = template.get('body', None)
# If custom template not provided, look up default template
default_template = nt.notification_class.default_messages[self.STATUS_TO_TEMPLATE_TYPE[status]]
if not msg_template:
msg_template = default_template.get('message', None)
if not body_template:
body_template = default_template.get('body', None)
if msg_template:
try:
notification_subject = env.from_string(msg_template).render(**context)
msg = env.from_string(msg_template).render(**context)
except (TemplateSyntaxError, UndefinedError, SecurityError):
notification_subject = ''
else:
notification_subject = u"{} #{} '{}' {}: {}".format(self.get_notification_friendly_name(),
self.id,
self.name,
status,
self.get_ui_url())
notification_body = self.notification_data()
notification_body['friendly_name'] = self.get_notification_friendly_name()
msg = ''
if body_template:
try:
notification_body['body'] = env.from_string(body_template).render(**context)
body = env.from_string(body_template).render(**context)
except (TemplateSyntaxError, UndefinedError, SecurityError):
notification_body['body'] = ''
body = ''
return (notification_subject, notification_body)
return (msg, body)
def send_notification_templates(self, status):
from awx.main.tasks import send_notifications # avoid circular import
@@ -475,16 +472,13 @@ class JobNotificationMixin(object):
return
for nt in set(notification_templates.get(self.STATUS_TO_TEMPLATE_TYPE[status], [])):
try:
(notification_subject, notification_body) = self.build_notification_message(nt, status)
except AttributeError:
raise NotImplementedError("build_notification_message() does not exist" % status)
(msg, body) = self.build_notification_message(nt, status)
# Use kwargs to force late-binding
# https://stackoverflow.com/a/3431699/10669572
def send_it(local_nt=nt, local_subject=notification_subject, local_body=notification_body):
def send_it(local_nt=nt, local_msg=msg, local_body=body):
def _func():
send_notifications.delay([local_nt.generate_notification(local_subject, local_body).id],
send_notifications.delay([local_nt.generate_notification(local_msg, local_body).id],
job_id=self.id)
return _func
connection.on_commit(send_it())
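To make the message-merge behavior in save() above concrete, here is a worked example with made-up message dicts (illustration only, not part of this change):
old_messages = {'started': {'message': 'old subject', 'body': 'old body'},
                'success': None,
                'error': None,
                'workflow_approval': {'running': {'message': 'needs review', 'body': None}}}
new_messages = {'started': {'message': 'new subject'},   # keeps its message, inherits 'old body'
                'success': None,
                'error': {'message': 'boom'},             # old 'error' is empty, nothing to inherit
                'workflow_approval': {}}                  # empty, so the whole old workflow_approval dict is carried over
# After the merge in save(), new_messages would be:
# {'started': {'message': 'new subject', 'body': 'old body'},
#  'success': None,
#  'error': {'message': 'boom'},
#  'workflow_approval': {'running': {'message': 'needs review', 'body': None}}}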

View File

@@ -2,6 +2,7 @@
# All Rights Reserved.
# Python
import json
import logging
from copy import copy
from urllib.parse import urljoin
@@ -16,6 +17,9 @@ from django.core.exceptions import ObjectDoesNotExist
# Django-CRUM
from crum import get_current_user
from jinja2 import sandbox
from jinja2.exceptions import TemplateSyntaxError, UndefinedError, SecurityError
# AWX
from awx.api.versioning import reverse
from awx.main.models import (prevent_search, accepts_json, UnifiedJobTemplate,
@@ -763,22 +767,45 @@ class WorkflowApproval(UnifiedJob, JobNotificationMixin):
connection.on_commit(send_it())
def build_approval_notification_message(self, nt, approval_status):
subject = []
workflow_url = urljoin(settings.TOWER_URL_BASE, '/#/workflows/{}'.format(self.workflow_job.id))
subject.append(('The approval node "{}"').format(self.workflow_approval_template.name))
if approval_status == 'running':
subject.append(('needs review. This node can be viewed at: {}').format(workflow_url))
if approval_status == 'approved':
subject.append(('was approved. {}').format(workflow_url))
if approval_status == 'timed_out':
subject.append(('has timed out. {}').format(workflow_url))
elif approval_status == 'denied':
subject.append(('was denied. {}').format(workflow_url))
subject = " ".join(subject)
body = self.notification_data()
body['body'] = subject
env = sandbox.ImmutableSandboxedEnvironment()
return subject, body
context = self.context(approval_status)
msg_template = body_template = None
msg = body = ''
# Use custom template if available
if nt.messages and nt.messages.get('workflow_approval', None):
template = nt.messages['workflow_approval'].get(approval_status, {})
msg_template = template.get('message', None)
body_template = template.get('body', None)
# If custom template not provided, look up default template
default_template = nt.notification_class.default_messages['workflow_approval'][approval_status]
if not msg_template:
msg_template = default_template.get('message', None)
if not body_template:
body_template = default_template.get('body', None)
if msg_template:
try:
msg = env.from_string(msg_template).render(**context)
except (TemplateSyntaxError, UndefinedError, SecurityError):
msg = ''
if body_template:
try:
body = env.from_string(body_template).render(**context)
except (TemplateSyntaxError, UndefinedError, SecurityError):
body = ''
return (msg, body)
def context(self, approval_status):
workflow_url = urljoin(settings.TOWER_URL_BASE, '/#/workflows/{}'.format(self.workflow_job.id))
return {'approval_status': approval_status,
'approval_node_name': self.workflow_approval_template.name,
'workflow_url': workflow_url,
'job_metadata': json.dumps(self.notification_data(), indent=4)}
@property
def workflow_job_template(self):

View File

@@ -1,21 +1,10 @@
# Copyright (c) 2016 Ansible, Inc.
# All Rights Reserved.
import json
from django.utils.encoding import smart_text
from django.core.mail.backends.base import BaseEmailBackend
from django.utils.translation import ugettext_lazy as _
class AWXBaseEmailBackend(BaseEmailBackend):
def format_body(self, body):
if "body" in body:
body_actual = body['body']
else:
body_actual = smart_text(_("{} #{} had status {}, view details at {}\n\n").format(
body['friendly_name'], body['id'], body['status'], body['url'])
)
body_actual += json.dumps(body, indent=4)
return body_actual
return body

View File

@@ -0,0 +1,20 @@
# Copyright (c) 2019 Ansible, Inc.
# All Rights Reserved.
class CustomNotificationBase(object):
DEFAULT_MSG = "{{ job_friendly_name }} #{{ job.id }} '{{ job.name }}' {{ job.status }}: {{ url }}"
DEFAULT_BODY = "{{ job_friendly_name }} #{{ job.id }} had status {{ job.status }}, view details at {{ url }}\n\n{{ job_metadata }}"
default_messages = {"started": {"message": DEFAULT_MSG, "body": None},
"success": {"message": DEFAULT_MSG, "body": None},
"error": {"message": DEFAULT_MSG, "body": None},
"workflow_approval": {"running": {"message": 'The approval node "{{ approval_node_name }}" needs review. '
'This node can be viewed at: {{ workflow_url }}',
"body": None},
"approved": {"message": 'The approval node "{{ approval_node_name }}" was approved. {{ workflow_url }}',
"body": None},
"timed_out": {"message": 'The approval node "{{ approval_node_name }}" has timed out. {{ workflow_url }}',
"body": None},
"denied": {"message": 'The approval node "{{ approval_node_name }}" was denied. {{ workflow_url }}',
"body": None}}}

View File

@@ -1,14 +1,15 @@
# Copyright (c) 2016 Ansible, Inc.
# All Rights Reserved.
import json
from django.utils.encoding import smart_text
from django.core.mail.backends.smtp import EmailBackend
from django.utils.translation import ugettext_lazy as _
from awx.main.notifications.custom_notification_base import CustomNotificationBase
DEFAULT_MSG = CustomNotificationBase.DEFAULT_MSG
DEFAULT_BODY = CustomNotificationBase.DEFAULT_BODY
class CustomEmailBackend(EmailBackend):
class CustomEmailBackend(EmailBackend, CustomNotificationBase):
init_parameters = {"host": {"label": "Host", "type": "string"},
"port": {"label": "Port", "type": "int"},
@@ -19,22 +20,17 @@ class CustomEmailBackend(EmailBackend):
"sender": {"label": "Sender Email", "type": "string"},
"recipients": {"label": "Recipient List", "type": "list"},
"timeout": {"label": "Timeout", "type": "int", "default": 30}}
DEFAULT_SUBJECT = "{{ job_friendly_name }} #{{ job.id }} '{{ job.name }}' {{ job.status }}: {{ url }}"
DEFAULT_BODY = smart_text(_("{{ job_friendly_name }} #{{ job.id }} had status {{ job.status }}, view details at {{ url }}\n\n{{ job_summary_dict }}"))
default_messages = {"started": {"message": DEFAULT_SUBJECT, "body": DEFAULT_BODY},
"success": {"message": DEFAULT_SUBJECT, "body": DEFAULT_BODY},
"error": {"message": DEFAULT_SUBJECT, "body": DEFAULT_BODY}}
recipient_parameter = "recipients"
sender_parameter = "sender"
default_messages = {"started": {"message": DEFAULT_MSG, "body": DEFAULT_BODY},
"success": {"message": DEFAULT_MSG, "body": DEFAULT_BODY},
"error": {"message": DEFAULT_MSG, "body": DEFAULT_BODY},
"workflow_approval": {"running": {"message": DEFAULT_MSG, "body": DEFAULT_BODY},
"approved": {"message": DEFAULT_MSG, "body": DEFAULT_BODY},
"timed_out": {"message": DEFAULT_MSG, "body": DEFAULT_BODY},
"denied": {"message": DEFAULT_MSG, "body": DEFAULT_BODY}}}
def format_body(self, body):
if "body" in body:
body_actual = body['body']
else:
body_actual = smart_text(_("{} #{} had status {}, view details at {}\n\n").format(
body['friendly_name'], body['id'], body['status'], body['url'])
)
body_actual += json.dumps(body, indent=4)
return body_actual
# leave body unchanged (expected to be a string)
return body

View File

@@ -8,24 +8,21 @@ import dateutil.parser as dp
from django.utils.encoding import smart_text
from django.utils.translation import ugettext_lazy as _
from awx.main.notifications.base import AWXBaseEmailBackend
from awx.main.notifications.custom_notification_base import CustomNotificationBase
logger = logging.getLogger('awx.main.notifications.grafana_backend')
class GrafanaBackend(AWXBaseEmailBackend):
class GrafanaBackend(AWXBaseEmailBackend, CustomNotificationBase):
init_parameters = {"grafana_url": {"label": "Grafana URL", "type": "string"},
"grafana_key": {"label": "Grafana API Key", "type": "password"}}
recipient_parameter = "grafana_url"
sender_parameter = None
DEFAULT_SUBJECT = "{{ job_friendly_name }} #{{ job.id }} '{{ job.name }}' {{ job.status }}: {{ url }}"
default_messages = {"started": {"message": DEFAULT_SUBJECT},
"success": {"message": DEFAULT_SUBJECT},
"error": {"message": DEFAULT_SUBJECT}}
def __init__(self, grafana_key, dashboardId=None, panelId=None, annotation_tags=None, grafana_no_verify_ssl=False, isRegion=True,
fail_silently=False, **kwargs):
super(GrafanaBackend, self).__init__(fail_silently=fail_silently)

View File

@@ -7,12 +7,14 @@ import requests
from django.utils.encoding import smart_text
from django.utils.translation import ugettext_lazy as _
from awx.main.notifications.base import AWXBaseEmailBackend
from awx.main.notifications.custom_notification_base import CustomNotificationBase
logger = logging.getLogger('awx.main.notifications.hipchat_backend')
class HipChatBackend(AWXBaseEmailBackend):
class HipChatBackend(AWXBaseEmailBackend, CustomNotificationBase):
init_parameters = {"token": {"label": "Token", "type": "password"},
"rooms": {"label": "Destination Rooms", "type": "list"},
@@ -23,11 +25,6 @@ class HipChatBackend(AWXBaseEmailBackend):
recipient_parameter = "rooms"
sender_parameter = "message_from"
DEFAULT_SUBJECT = "{{ job_friendly_name }} #{{ job.id }} '{{ job.name }}' {{ job.status }}: {{ url }}"
default_messages = {"started": {"message": DEFAULT_SUBJECT},
"success": {"message": DEFAULT_SUBJECT},
"error": {"message": DEFAULT_SUBJECT}}
def __init__(self, token, color, api_url, notify, fail_silently=False, **kwargs):
super(HipChatBackend, self).__init__(fail_silently=fail_silently)
self.token = token

View File

@@ -9,12 +9,14 @@ import irc.client
from django.utils.encoding import smart_text
from django.utils.translation import ugettext_lazy as _
from awx.main.notifications.base import AWXBaseEmailBackend
from awx.main.notifications.custom_notification_base import CustomNotificationBase
logger = logging.getLogger('awx.main.notifications.irc_backend')
class IrcBackend(AWXBaseEmailBackend):
class IrcBackend(AWXBaseEmailBackend, CustomNotificationBase):
init_parameters = {"server": {"label": "IRC Server Address", "type": "string"},
"port": {"label": "IRC Server Port", "type": "int"},
@@ -25,11 +27,6 @@ class IrcBackend(AWXBaseEmailBackend):
recipient_parameter = "targets"
sender_parameter = None
DEFAULT_SUBJECT = "{{ job_friendly_name }} #{{ job.id }} '{{ job.name }}' {{ job.status }}: {{ url }}"
default_messages = {"started": {"message": DEFAULT_SUBJECT},
"success": {"message": DEFAULT_SUBJECT},
"error": {"message": DEFAULT_SUBJECT}}
def __init__(self, server, port, nickname, password, use_ssl, fail_silently=False, **kwargs):
super(IrcBackend, self).__init__(fail_silently=fail_silently)
self.server = server

View File

@@ -7,23 +7,20 @@ import json
from django.utils.encoding import smart_text
from django.utils.translation import ugettext_lazy as _
from awx.main.notifications.base import AWXBaseEmailBackend
from awx.main.notifications.custom_notification_base import CustomNotificationBase
logger = logging.getLogger('awx.main.notifications.mattermost_backend')
class MattermostBackend(AWXBaseEmailBackend):
class MattermostBackend(AWXBaseEmailBackend, CustomNotificationBase):
init_parameters = {"mattermost_url": {"label": "Target URL", "type": "string"},
"mattermost_no_verify_ssl": {"label": "Verify SSL", "type": "bool"}}
recipient_parameter = "mattermost_url"
sender_parameter = None
DEFAULT_SUBJECT = "{{ job_friendly_name }} #{{ job.id }} '{{ job.name }}' {{ job.status }}: {{ url }}"
default_messages = {"started": {"message": DEFAULT_SUBJECT},
"success": {"message": DEFAULT_SUBJECT},
"error": {"message": DEFAULT_SUBJECT}}
def __init__(self, mattermost_no_verify_ssl=False, mattermost_channel=None, mattermost_username=None,
mattermost_icon_url=None, fail_silently=False, **kwargs):
super(MattermostBackend, self).__init__(fail_silently=fail_silently)

View File

@@ -1,17 +1,23 @@
# Copyright (c) 2016 Ansible, Inc.
# All Rights Reserved.
import json
import logging
import pygerduty
from django.utils.encoding import smart_text
from django.utils.translation import ugettext_lazy as _
from awx.main.notifications.base import AWXBaseEmailBackend
from awx.main.notifications.custom_notification_base import CustomNotificationBase
DEFAULT_BODY = CustomNotificationBase.DEFAULT_BODY
DEFAULT_MSG = CustomNotificationBase.DEFAULT_MSG
logger = logging.getLogger('awx.main.notifications.pagerduty_backend')
class PagerDutyBackend(AWXBaseEmailBackend):
class PagerDutyBackend(AWXBaseEmailBackend, CustomNotificationBase):
init_parameters = {"subdomain": {"label": "Pagerduty subdomain", "type": "string"},
"token": {"label": "API Token", "type": "password"},
@@ -20,11 +26,14 @@ class PagerDutyBackend(AWXBaseEmailBackend):
recipient_parameter = "service_key"
sender_parameter = "client_name"
DEFAULT_SUBJECT = "{{ job_friendly_name }} #{{ job.id }} '{{ job.name }}' {{ job.status }}: {{ url }}"
DEFAULT_BODY = "{{ job_summary_dict }}"
default_messages = {"started": { "message": DEFAULT_SUBJECT, "body": DEFAULT_BODY},
"success": { "message": DEFAULT_SUBJECT, "body": DEFAULT_BODY},
"error": { "message": DEFAULT_SUBJECT, "body": DEFAULT_BODY}}
DEFAULT_BODY = "{{ job_metadata }}"
default_messages = {"started": {"message": DEFAULT_MSG, "body": DEFAULT_BODY},
"success": {"message": DEFAULT_MSG, "body": DEFAULT_BODY},
"error": {"message": DEFAULT_MSG, "body": DEFAULT_BODY},
"workflow_approval": {"running": {"message": DEFAULT_MSG, "body": DEFAULT_BODY},
"approved": {"message": DEFAULT_MSG,"body": DEFAULT_BODY},
"timed_out": {"message": DEFAULT_MSG, "body": DEFAULT_BODY},
"denied": {"message": DEFAULT_MSG, "body": DEFAULT_BODY}}}
def __init__(self, subdomain, token, fail_silently=False, **kwargs):
super(PagerDutyBackend, self).__init__(fail_silently=fail_silently)
@@ -32,6 +41,16 @@ class PagerDutyBackend(AWXBaseEmailBackend):
self.token = token
def format_body(self, body):
# cast to dict if possible
# TODO: is it true that this can be a dict or str?
try:
potential_body = json.loads(body)
if isinstance(potential_body, dict):
body = potential_body
except json.JSONDecodeError:
pass
# but it's okay if this is also just a string
return body
def send_messages(self, messages):

View File

@@ -7,22 +7,20 @@ import json
from django.utils.encoding import smart_text
from django.utils.translation import ugettext_lazy as _
from awx.main.notifications.base import AWXBaseEmailBackend
from awx.main.notifications.custom_notification_base import CustomNotificationBase
logger = logging.getLogger('awx.main.notifications.rocketchat_backend')
class RocketChatBackend(AWXBaseEmailBackend):
class RocketChatBackend(AWXBaseEmailBackend, CustomNotificationBase):
init_parameters = {"rocketchat_url": {"label": "Target URL", "type": "string"},
"rocketchat_no_verify_ssl": {"label": "Verify SSL", "type": "bool"}}
recipient_parameter = "rocketchat_url"
sender_parameter = None
DEFAULT_SUBJECT = "{{ job_friendly_name }} #{{ job.id }} '{{ job.name }}' {{ job.status }}: {{ url }}"
default_messages = {"started": {"message": DEFAULT_SUBJECT},
"success": {"message": DEFAULT_SUBJECT},
"error": {"message": DEFAULT_SUBJECT}}
def __init__(self, rocketchat_no_verify_ssl=False, rocketchat_username=None, rocketchat_icon_url=None, fail_silently=False, **kwargs):
super(RocketChatBackend, self).__init__(fail_silently=fail_silently)

View File

@@ -6,24 +6,21 @@ from slackclient import SlackClient
from django.utils.encoding import smart_text
from django.utils.translation import ugettext_lazy as _
from awx.main.notifications.base import AWXBaseEmailBackend
from awx.main.notifications.custom_notification_base import CustomNotificationBase
logger = logging.getLogger('awx.main.notifications.slack_backend')
WEBSOCKET_TIMEOUT = 30
class SlackBackend(AWXBaseEmailBackend):
class SlackBackend(AWXBaseEmailBackend, CustomNotificationBase):
init_parameters = {"token": {"label": "Token", "type": "password"},
"channels": {"label": "Destination Channels", "type": "list"}}
recipient_parameter = "channels"
sender_parameter = None
DEFAULT_SUBJECT = "{{ job_friendly_name }} #{{ job.id }} '{{ job.name }}' {{ job.status }}: {{ url }}"
default_messages = {"started": {"message": DEFAULT_SUBJECT},
"success": {"message": DEFAULT_SUBJECT},
"error": {"message": DEFAULT_SUBJECT}}
def __init__(self, token, hex_color="", fail_silently=False, **kwargs):
super(SlackBackend, self).__init__(fail_silently=fail_silently)
self.token = token
@@ -50,6 +47,7 @@ class SlackBackend(AWXBaseEmailBackend):
else:
ret = connection.api_call("chat.postMessage",
channel=r,
as_user=True,
text=m.subject)
logger.debug(ret)
if ret['ok']:

View File

@@ -7,12 +7,14 @@ from twilio.rest import Client
from django.utils.encoding import smart_text
from django.utils.translation import ugettext_lazy as _
from awx.main.notifications.base import AWXBaseEmailBackend
from awx.main.notifications.custom_notification_base import CustomNotificationBase
logger = logging.getLogger('awx.main.notifications.twilio_backend')
class TwilioBackend(AWXBaseEmailBackend):
class TwilioBackend(AWXBaseEmailBackend, CustomNotificationBase):
init_parameters = {"account_sid": {"label": "Account SID", "type": "string"},
"account_token": {"label": "Account Token", "type": "password"},
@@ -21,11 +23,6 @@ class TwilioBackend(AWXBaseEmailBackend):
recipient_parameter = "to_numbers"
sender_parameter = "from_number"
DEFAULT_SUBJECT = "{{ job_friendly_name }} #{{ job.id }} '{{ job.name }}' {{ job.status }}: {{ url }}"
default_messages = {"started": {"message": DEFAULT_SUBJECT},
"success": {"message": DEFAULT_SUBJECT},
"error": {"message": DEFAULT_SUBJECT}}
def __init__(self, account_sid, account_token, fail_silently=False, **kwargs):
super(TwilioBackend, self).__init__(fail_silently=fail_silently)
self.account_sid = account_sid

View File

@@ -7,13 +7,15 @@ import requests
from django.utils.encoding import smart_text
from django.utils.translation import ugettext_lazy as _
from awx.main.notifications.base import AWXBaseEmailBackend
from awx.main.utils import get_awx_version
from awx.main.notifications.custom_notification_base import CustomNotificationBase
logger = logging.getLogger('awx.main.notifications.webhook_backend')
class WebhookBackend(AWXBaseEmailBackend):
class WebhookBackend(AWXBaseEmailBackend, CustomNotificationBase):
init_parameters = {"url": {"label": "Target URL", "type": "string"},
"http_method": {"label": "HTTP Method", "type": "string", "default": "POST"},
@@ -24,10 +26,16 @@ class WebhookBackend(AWXBaseEmailBackend):
recipient_parameter = "url"
sender_parameter = None
DEFAULT_BODY = "{{ job_summary_dict }}"
DEFAULT_BODY = "{{ job_metadata }}"
default_messages = {"started": {"body": DEFAULT_BODY},
"success": {"body": DEFAULT_BODY},
"error": {"body": DEFAULT_BODY}}
"error": {"body": DEFAULT_BODY},
"workflow_approval": {
"running": {"body": '{"body": "The approval node \\"{{ approval_node_name }}\\" needs review. '
'This node can be viewed at: {{ workflow_url }}"}'},
"approved": {"body": '{"body": "The approval node \\"{{ approval_node_name }}\\" was approved. {{ workflow_url }}"}'},
"timed_out": {"body": '{"body": "The approval node \\"{{ approval_node_name }}\\" has timed out. {{ workflow_url }}"}'},
"denied": {"body": '{"body": "The approval node \\"{{ approval_node_name }}\\" was denied. {{ workflow_url }}"}'}}}
def __init__(self, http_method, headers, disable_ssl_verification=False, fail_silently=False, username=None, password=None, **kwargs):
self.http_method = http_method
@@ -38,15 +46,13 @@ class WebhookBackend(AWXBaseEmailBackend):
super(WebhookBackend, self).__init__(fail_silently=fail_silently)
def format_body(self, body):
# If `body` has body field, attempt to use this as the main body,
# otherwise, leave it as a sub-field
if isinstance(body, dict) and 'body' in body and isinstance(body['body'], str):
try:
potential_body = json.loads(body['body'])
if isinstance(potential_body, dict):
body = potential_body
except json.JSONDecodeError:
pass
# expect body to be a string representing a dict
try:
potential_body = json.loads(body)
if isinstance(potential_body, dict):
body = potential_body
except json.JSONDecodeError:
body = {}
return body
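To show the intended behavior, a standalone mirror of the format_body logic above with invented inputs (a sketch, not part of the change):
import json

def format_body_sketch(body):
    # expect body to be a string representing a dict
    try:
        potential_body = json.loads(body)
        if isinstance(potential_body, dict):
            body = potential_body
    except json.JSONDecodeError:
        body = {}
    return body

format_body_sketch('{"friendly_name": "Job", "id": 42}')  # -> {'friendly_name': 'Job', 'id': 42}
format_body_sketch('plain text, not JSON')                 # -> {}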
def send_messages(self, messages):

View File

@@ -12,10 +12,12 @@ class UriCleaner(object):
@staticmethod
def remove_sensitive(cleartext):
# exclude_list contains the items that will _not_ be redacted
exclude_list = [settings.PUBLIC_GALAXY_SERVER['url']]
if settings.PRIMARY_GALAXY_URL:
exclude_list = [settings.PRIMARY_GALAXY_URL] + [server['url'] for server in settings.FALLBACK_GALAXY_SERVERS]
else:
exclude_list = [server['url'] for server in settings.FALLBACK_GALAXY_SERVERS]
exclude_list += [settings.PRIMARY_GALAXY_URL]
if settings.FALLBACK_GALAXY_SERVERS:
exclude_list += [server['url'] for server in settings.FALLBACK_GALAXY_SERVERS]
redactedtext = cleartext
text_index = 0
while True:

View File

@@ -1,9 +1,5 @@
import collections
import os
import stat
import time
import yaml
import tempfile
import logging
from base64 import b64encode
@@ -88,8 +84,17 @@ class PodManager(object):
@cached_property
def kube_api(self):
my_client = config.new_client_from_config(config_file=self.kube_config)
return client.CoreV1Api(api_client=my_client)
# this feels a little janky, but it's what k8s' own code does
# internally when it reads kube config files from disk:
# https://github.com/kubernetes-client/python-base/blob/0b208334ef0247aad9afcaae8003954423b61a0d/config/kube_config.py#L643
loader = config.kube_config.KubeConfigLoader(
config_dict=self.kube_config
)
cfg = type.__call__(client.Configuration)
loader.load_and_set(cfg)
return client.CoreV1Api(api_client=client.ApiClient(
configuration=cfg
))
@property
def pod_name(self):
@@ -174,10 +179,4 @@ def generate_tmp_kube_config(credential, namespace):
).decode() # decode the base64 data into a str
else:
config["clusters"][0]["cluster"]["insecure-skip-tls-verify"] = True
fd, path = tempfile.mkstemp(prefix='kubeconfig')
with open(path, 'wb') as temp:
temp.write(yaml.dump(config).encode())
temp.flush()
os.chmod(temp.name, stat.S_IRUSR | stat.S_IWUSR | stat.S_IXUSR)
return path
return config
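For reference, a standalone sketch of the in-memory kubeconfig pattern the comment above points at; the dict below is a made-up stand-in for what generate_tmp_kube_config() returns:
from kubernetes import client, config

kube_config = {  # hypothetical contents for illustration
    'apiVersion': 'v1',
    'kind': 'Config',
    'preferences': {},
    'clusters': [{'name': 'job-cluster',
                  'cluster': {'server': 'https://k8s.example.com',
                              'insecure-skip-tls-verify': True}}],
    'users': [{'name': 'job-user', 'user': {'token': 'REDACTED'}}],
    'contexts': [{'name': 'job-context',
                  'context': {'cluster': 'job-cluster', 'user': 'job-user'}}],
    'current-context': 'job-context',
}

loader = config.kube_config.KubeConfigLoader(config_dict=kube_config)
cfg = type.__call__(client.Configuration)  # bypass the client's shared default configuration
loader.load_and_set(cfg)
core_v1 = client.CoreV1Api(api_client=client.ApiClient(configuration=cfg))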

View File

@@ -252,19 +252,25 @@ class TaskManager():
logger.debug('Submitting isolated {} to queue {} controlled by {}.'.format(
task.log_format, task.execution_node, controller_node))
elif rampart_group.is_containerized:
# find one real, non-containerized instance with capacity to
# act as the controller for k8s API interaction
match = None
for group in InstanceGroup.objects.all():
if group.is_containerized or group.controller_id:
continue
match = group.find_largest_idle_instance()
if match:
break
task.instance_group = rampart_group
if not task.supports_isolation():
if task.supports_isolation():
task.controller_node = match.hostname
else:
# project updates and inventory updates don't *actually* run in pods,
# so just pick *any* non-isolated, non-containerized host and use it
for group in InstanceGroup.objects.all():
if group.is_containerized or group.controller_id:
continue
match = group.find_largest_idle_instance()
if match:
task.execution_node = match.hostname
logger.debug('Submitting containerized {} to queue {}.'.format(
task.log_format, task.execution_node))
break
# as the execution node
task.execution_node = match.hostname
logger.debug('Submitting containerized {} to queue {}.'.format(
task.log_format, task.execution_node))
else:
task.instance_group = rampart_group
if instance is not None:

View File

@@ -22,10 +22,6 @@ import yaml
import fcntl
from pathlib import Path
from uuid import uuid4
try:
import psutil
except Exception:
psutil = None
import urllib.parse as urlparse
# Django
@@ -34,7 +30,6 @@ from django.db import transaction, DatabaseError, IntegrityError
from django.db.models.fields.related import ForeignKey
from django.utils.timezone import now, timedelta
from django.utils.encoding import smart_str
from django.core.mail import send_mail
from django.contrib.auth.models import User
from django.utils.translation import ugettext_lazy as _
from django.core.cache import cache
@@ -72,7 +67,6 @@ from awx.main.isolated import manager as isolated_manager
from awx.main.dispatch.publish import task
from awx.main.dispatch import get_local_queuename, reaper
from awx.main.utils import (get_ssh_version, update_scm_url,
get_licenser,
ignore_inventory_computed_fields,
ignore_inventory_group_removal, extract_ansible_vars, schedule_task_manager,
get_awx_version)
@@ -92,7 +86,7 @@ from rest_framework.exceptions import PermissionDenied
__all__ = ['RunJob', 'RunSystemJob', 'RunProjectUpdate', 'RunInventoryUpdate',
'RunAdHocCommand', 'handle_work_error', 'handle_work_success', 'apply_cluster_membership_policies',
'update_inventory_computed_fields', 'update_host_smart_inventory_memberships',
'send_notifications', 'run_administrative_checks', 'purge_old_stdout_files']
'send_notifications', 'purge_old_stdout_files']
HIDDEN_PASSWORD = '**********'
@@ -356,28 +350,6 @@ def gather_analytics():
os.remove(tgz)
@task()
def run_administrative_checks():
logger.warn("Running administrative checks.")
if not settings.TOWER_ADMIN_ALERTS:
return
validation_info = get_licenser().validate()
if validation_info['license_type'] != 'open' and validation_info.get('instance_count', 0) < 1:
return
used_percentage = float(validation_info.get('current_instances', 0)) / float(validation_info.get('instance_count', 100))
tower_admin_emails = User.objects.filter(is_superuser=True).values_list('email', flat=True)
if (used_percentage * 100) > 90:
send_mail("Ansible Tower host usage over 90%",
_("Ansible Tower host usage over 90%"),
tower_admin_emails,
fail_silently=True)
if validation_info.get('date_warning', False):
send_mail("Ansible Tower license will expire soon",
_("Ansible Tower license will expire soon"),
tower_admin_emails,
fail_silently=True)
@task(queue=get_local_queuename)
def purge_old_stdout_files():
nowtime = time.time()
@@ -1423,7 +1395,6 @@ class BaseTask(object):
def deploy_container_group_pod(self, task):
from awx.main.scheduler.kubernetes import PodManager # Avoid circular import
pod_manager = PodManager(self.instance)
self.cleanup_paths.append(pod_manager.kube_config)
try:
log_name = task.log_format
logger.debug(f"Launching pod for {log_name}.")
@@ -1452,7 +1423,7 @@ class BaseTask(object):
self.update_model(task.pk, execution_node=pod_manager.pod_name)
return pod_manager
@@ -1959,9 +1930,15 @@ class RunProjectUpdate(BaseTask):
env['PROJECT_UPDATE_ID'] = str(project_update.pk)
env['ANSIBLE_CALLBACK_PLUGINS'] = self.get_path_to('..', 'plugins', 'callback')
env['ANSIBLE_GALAXY_IGNORE'] = True
# Set up the fallback server, which is the normal Ansible Galaxy by default
galaxy_servers = list(settings.FALLBACK_GALAXY_SERVERS)
# If private galaxy URL is non-blank, that means this feature is enabled
# Set up the public Galaxy server, if enabled
if settings.PUBLIC_GALAXY_ENABLED:
galaxy_servers = [settings.PUBLIC_GALAXY_SERVER]
else:
galaxy_servers = []
# Set up fallback Galaxy servers, if configured
if settings.FALLBACK_GALAXY_SERVERS:
galaxy_servers = settings.FALLBACK_GALAXY_SERVERS + galaxy_servers
# Set up the primary Galaxy server, if configured
if settings.PRIMARY_GALAXY_URL:
galaxy_servers = [{'id': 'primary_galaxy'}] + galaxy_servers
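A rough illustration of the resulting precedence (assumed settings values, not from this changeset): primary first, then fallbacks, then public Galaxy.
PUBLIC_GALAXY_ENABLED = True  # assumed values for illustration
PUBLIC_GALAXY_SERVER = {'id': 'galaxy', 'url': 'https://galaxy.ansible.com'}
FALLBACK_GALAXY_SERVERS = [{'id': 'mirror', 'url': 'https://galaxy-mirror.example.com'}]
PRIMARY_GALAXY_URL = 'https://private-galaxy.example.com'

galaxy_servers = [PUBLIC_GALAXY_SERVER] if PUBLIC_GALAXY_ENABLED else []
if FALLBACK_GALAXY_SERVERS:
    galaxy_servers = FALLBACK_GALAXY_SERVERS + galaxy_servers
if PRIMARY_GALAXY_URL:
    galaxy_servers = [{'id': 'primary_galaxy'}] + galaxy_servers
# ids in order: ['primary_galaxy', 'mirror', 'galaxy']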
for key in GALAXY_SERVER_FIELDS:
@@ -2354,6 +2331,27 @@ class RunInventoryUpdate(BaseTask):
env[str(env_k)] = str(inventory_update.source_vars_dict[env_k])
elif inventory_update.source == 'file':
raise NotImplementedError('Cannot update file sources through the task system.')
if inventory_update.source == 'scm' and inventory_update.source_project_update:
env_key = 'ANSIBLE_COLLECTIONS_PATHS'
config_setting = 'collections_paths'
folder = 'requirements_collections'
default = '~/.ansible/collections:/usr/share/ansible/collections'
config_values = read_ansible_config(os.path.join(private_data_dir, 'project'), [config_setting])
paths = default.split(':')
if env_key in env:
for path in env[env_key].split(':'):
if path not in paths:
paths = [env[env_key]] + paths
elif config_setting in config_values:
for path in config_values[config_setting].split(':'):
if path not in paths:
paths = [config_values[config_setting]] + paths
paths = [os.path.join(private_data_dir, folder)] + paths
env[env_key] = os.pathsep.join(paths)
return env
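As a simplified sketch of the collections path precedence this sets up (hypothetical directories; the job's requirements_collections folder always comes first):
import os

private_data_dir = '/tmp/awx_123'  # hypothetical job private data dir
existing = '/custom/collections'   # hypothetical pre-set ANSIBLE_COLLECTIONS_PATHS value
default = '~/.ansible/collections:/usr/share/ansible/collections'

paths = default.split(':')
if existing not in paths:
    paths = [existing] + paths
paths = [os.path.join(private_data_dir, 'requirements_collections')] + paths
print(os.pathsep.join(paths))  # ':' on Linux
# /tmp/awx_123/requirements_collections:/custom/collections:~/.ansible/collections:/usr/share/ansible/collections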
def write_args_file(self, private_data_dir, args):
@@ -2452,7 +2450,7 @@ class RunInventoryUpdate(BaseTask):
# Use the vendored script path
inventory_path = self.get_path_to('..', 'plugins', 'inventory', injector.script_name)
elif src == 'scm':
inventory_path = inventory_update.get_actual_source_path()
inventory_path = os.path.join(private_data_dir, 'project', inventory_update.source_path)
elif src == 'custom':
handle, inventory_path = tempfile.mkstemp(dir=private_data_dir)
f = os.fdopen(handle, 'w')
@@ -2473,7 +2471,7 @@ class RunInventoryUpdate(BaseTask):
'''
src = inventory_update.source
if src == 'scm' and inventory_update.source_project_update:
return inventory_update.source_project_update.get_project_path(check_if_exists=False)
return os.path.join(private_data_dir, 'project')
if src in CLOUD_PROVIDERS:
injector = None
if src in InventorySource.injectors:
@@ -2509,8 +2507,10 @@ class RunInventoryUpdate(BaseTask):
project_update_task = local_project_sync._get_task_class()
try:
project_update_task().run(local_project_sync.id)
inventory_update.inventory_source.scm_last_revision = local_project_sync.project.scm_revision
sync_task = project_update_task(job_private_data_dir=private_data_dir)
sync_task.run(local_project_sync.id)
local_project_sync.refresh_from_db()
inventory_update.inventory_source.scm_last_revision = local_project_sync.scm_revision
inventory_update.inventory_source.save(update_fields=['scm_last_revision'])
except Exception:
inventory_update = self.update_model(
@@ -2518,6 +2518,13 @@ class RunInventoryUpdate(BaseTask):
job_explanation=('Previous Task Failed: {"job_type": "%s", "job_name": "%s", "job_id": "%s"}' %
('project_update', local_project_sync.name, local_project_sync.id)))
raise
elif inventory_update.source == 'scm' and inventory_update.launch_type == 'scm' and source_project:
# This follows update, not sync, so make copy here
project_path = source_project.get_project_path(check_if_exists=False)
RunProjectUpdate.make_local_copy(
project_path, os.path.join(private_data_dir, 'project'),
source_project.scm_type, source_project.scm_revision
)
@task()

View File

@@ -0,0 +1,45 @@
import pytest
from awx.api.versioning import reverse
from awx.main.models import AdHocCommand, AdHocCommandEvent, JobEvent
@pytest.mark.django_db
@pytest.mark.parametrize('truncate, expected', [
(True, False),
(False, True),
])
def test_job_events_sublist_truncation(get, organization_factory, job_template_factory, truncate, expected):
objs = organization_factory("org", superusers=['admin'])
jt = job_template_factory("jt", organization=objs.organization,
inventory='test_inv', project='test_proj').job_template
job = jt.create_unified_job()
JobEvent.create_from_data(job_id=job.pk, uuid='abc123', event='runner_on_start',
stdout='a' * 1025)
url = reverse('api:job_job_events_list', kwargs={'pk': job.pk})
if not truncate:
url += '?no_truncate=1'
response = get(url, user=objs.superusers.admin, expect=200)
assert (len(response.data['results'][0]['stdout']) == 1025) == expected
@pytest.mark.django_db
@pytest.mark.parametrize('truncate, expected', [
(True, False),
(False, True),
])
def test_ad_hoc_events_sublist_truncation(get, organization_factory, job_template_factory, truncate, expected):
objs = organization_factory("org", superusers=['admin'])
adhoc = AdHocCommand()
adhoc.save()
AdHocCommandEvent.create_from_data(ad_hoc_command_id=adhoc.pk, uuid='abc123', event='runner_on_start',
stdout='a' * 1025)
url = reverse('api:ad_hoc_command_ad_hoc_command_events_list', kwargs={'pk': adhoc.pk})
if not truncate:
url += '?no_truncate=1'
response = get(url, user=objs.superusers.admin, expect=200)
assert (len(response.data['results'][0]['stdout']) == 1025) == expected

View File

@@ -117,3 +117,10 @@ def test_handle_content_type(post, admin):
admin,
content_type='text/html',
expect=415)
@pytest.mark.django_db
def test_basic_not_found(get, admin_user):
root_url = reverse('api:api_v2_root_view')
r = get(root_url + 'fooooooo', user=admin_user, expect=404)
assert r.data.get('detail') == 'The requested resource could not be found.'

View File

@@ -45,6 +45,14 @@ def isolated_instance_group(instance_group, instance):
return ig
@pytest.fixture
def containerized_instance_group(instance_group, kube_credential):
ig = InstanceGroup(name="container")
ig.credential = kube_credential
ig.save()
return ig
@pytest.fixture
def create_job_factory(job_factory, instance_group):
def fn(status='running'):
@@ -240,3 +248,29 @@ def test_instance_group_order_persistence(get, post, admin, source_model):
resp = get(url, admin)
assert resp.data['count'] == total
assert [ig['name'] for ig in resp.data['results']] == [ig.name for ig in before]
@pytest.mark.django_db
def test_instance_group_update_fields(patch, instance, instance_group, admin, containerized_instance_group):
# policy_instance_ variables can only be updated in instance groups that are NOT containerized
# instance group (not containerized)
ig_url = reverse("api:instance_group_detail", kwargs={'pk': instance_group.pk})
assert not instance_group.is_containerized
assert not containerized_instance_group.is_isolated
resp = patch(ig_url, {'policy_instance_percentage':15}, admin, expect=200)
assert 15 == resp.data['policy_instance_percentage']
resp = patch(ig_url, {'policy_instance_minimum':15}, admin, expect=200)
assert 15 == resp.data['policy_instance_minimum']
resp = patch(ig_url, {'policy_instance_list':[instance.hostname]}, admin)
assert [instance.hostname] == resp.data['policy_instance_list']
# containerized instance group
cg_url = reverse("api:instance_group_detail", kwargs={'pk': containerized_instance_group.pk})
assert containerized_instance_group.is_containerized
assert not containerized_instance_group.is_isolated
resp = patch(cg_url, {'policy_instance_percentage':15}, admin, expect=400)
assert ["Containerized instances may not be managed via the API"] == resp.data['policy_instance_percentage']
resp = patch(cg_url, {'policy_instance_minimum':15}, admin, expect=400)
assert ["Containerized instances may not be managed via the API"] == resp.data['policy_instance_minimum']
resp = patch(cg_url, {'policy_instance_list':[instance.hostname]}, admin)
assert ["Containerized instances may not be managed via the API"] == resp.data['policy_instance_list']

View File

@@ -8,6 +8,8 @@ from unittest.mock import PropertyMock
# Django
from django.urls import resolve
from django.http import Http404
from django.core.handlers.exception import response_for_exception
from django.contrib.auth.models import User
from django.core.serializers.json import DjangoJSONEncoder
from django.db.backends.sqlite3.base import SQLiteCursorWrapper
@@ -581,8 +583,12 @@ def _request(verb):
if 'format' not in kwargs and 'content_type' not in kwargs:
kwargs['format'] = 'json'
view, view_args, view_kwargs = resolve(urllib.parse.urlparse(url)[2])
request = getattr(APIRequestFactory(), verb)(url, **kwargs)
request_error = None
try:
view, view_args, view_kwargs = resolve(urllib.parse.urlparse(url)[2])
except Http404 as e:
request_error = e
if isinstance(kwargs.get('cookies', None), dict):
for key, value in kwargs['cookies'].items():
request.COOKIES[key] = value
@@ -591,7 +597,10 @@ def _request(verb):
if user:
force_authenticate(request, user=user)
response = view(request, *view_args, **view_kwargs)
if not request_error:
response = view(request, *view_args, **view_kwargs)
else:
response = response_for_exception(request, request_error)
if middleware:
middleware.process_response(request, response)
if expect:

View File

@@ -87,7 +87,7 @@ class TestJobNotificationMixin(object):
'use_fact_cache': bool,
'verbosity': int},
'job_friendly_name': str,
'job_summary_dict': str,
'job_metadata': str,
'url': str}
@@ -144,5 +144,3 @@ class TestJobNotificationMixin(object):
context_stub = JobNotificationMixin.context_stub()
check_structure_and_completeness(TestJobNotificationMixin.CONTEXT_STRUCTURE, context_stub)

View File

@@ -1,5 +1,4 @@
import subprocess
import yaml
import base64
from unittest import mock # noqa
@@ -51,6 +50,5 @@ def test_kubectl_ssl_verification(containerized_job):
cred.inputs['ssl_ca_cert'] = cert.stdout
cred.save()
pm = PodManager(containerized_job)
config = yaml.load(open(pm.kube_config), Loader=yaml.FullLoader)
ca_data = config['clusters'][0]['cluster']['certificate-authority-data']
ca_data = pm.kube_config['clusters'][0]['cluster']['certificate-authority-data']
assert cert.stdout == base64.b64decode(ca_data.encode())

View File

@@ -264,6 +264,7 @@ def test_inventory_update_injected_content(this_kind, script_or_plugin, inventor
assert envvars.pop('ANSIBLE_INVENTORY_ENABLED') == ('auto' if use_plugin else 'script')
set_files = bool(os.getenv("MAKE_INVENTORY_REFERENCE_FILES", 'false').lower()[0] not in ['f', '0'])
env, content = read_content(private_data_dir, envvars, inventory_update)
env.pop('ANSIBLE_COLLECTIONS_PATHS', None) # collection paths not relevant to this test
base_dir = os.path.join(DATA, script_or_plugin)
if not os.path.exists(base_dir):
os.mkdir(base_dir)

View File

@@ -43,7 +43,7 @@ def test_basic_parameterization(get, post, user, organization):
assert 'url' in response.data['notification_configuration']
assert 'headers' in response.data['notification_configuration']
assert 'messages' in response.data
assert response.data['messages'] == {'started': None, 'success': None, 'error': None}
assert response.data['messages'] == {'started': None, 'success': None, 'error': None, 'workflow_approval': None}
@pytest.mark.django_db

View File

@@ -19,6 +19,8 @@ from awx.main.models import (
Credential
)
from rest_framework.exceptions import PermissionDenied
from crum import impersonate
@@ -252,7 +254,8 @@ class TestJobRelaunchAccess:
assert 'job_var' in job.launch_config.extra_data
assert bob.can_access(Job, 'start', job, validate_license=False)
assert not alice.can_access(Job, 'start', job, validate_license=False)
with pytest.raises(PermissionDenied):
alice.can_access(Job, 'start', job, validate_license=False)
@pytest.mark.django_db

View File

@@ -7,6 +7,8 @@ from awx.main.access import (
# WorkflowJobNodeAccess
)
from rest_framework.exceptions import PermissionDenied
from awx.main.models import InventorySource, JobLaunchConfig
@@ -169,7 +171,8 @@ class TestWorkflowJobAccess:
wfjt.ask_inventory_on_launch = True
wfjt.save()
JobLaunchConfig.objects.create(job=workflow_job, inventory=inventory)
assert not WorkflowJobAccess(rando).can_start(workflow_job)
with pytest.raises(PermissionDenied):
WorkflowJobAccess(rando).can_start(workflow_job)
inventory.use_role.members.add(rando)
assert WorkflowJobAccess(rando).can_start(workflow_job)

View File

@@ -26,7 +26,7 @@ class TestNotificationTemplateSerializer():
{'started': {'message': '{{ job.id }}', 'body': '{{ job.status }}'},
'success': {'message': None, 'body': '{{ job_friendly_name }}'},
'error': {'message': '{{ url }}', 'body': None}},
{'started': {'body': '{{ job_summary_dict }}'}},
{'started': {'body': '{{ job_metadata }}'}},
{'started': {'body': '{{ job.summary_fields.inventory.total_hosts }}'}},
{'started': {'body': u'Iñtërnâtiônàlizætiøn'}}
])

View File

@@ -234,6 +234,14 @@ class TestWorkflowJobNodeJobKWARGS:
job_node_no_prompts.unified_job_template = project_unit
assert job_node_no_prompts.get_job_kwargs() == self.kwargs_base
def test_extra_vars_node_prompts(self, wfjt_node_no_prompts):
wfjt_node_no_prompts.extra_vars = {'foo': 'bar'}
assert wfjt_node_no_prompts.prompts_dict() == {'extra_vars': {'foo': 'bar'}}
def test_string_extra_vars_node_prompts(self, wfjt_node_no_prompts):
wfjt_node_no_prompts.extra_vars = '{"foo": "bar"}'
assert wfjt_node_no_prompts.prompts_dict() == {'extra_vars': {'foo': 'bar'}}
def test_get_ask_mapping_integrity():
assert list(WorkflowJobTemplate.get_ask_mapping().keys()) == ['extra_vars', 'inventory', 'limit', 'scm_branch']

View File

@@ -5,7 +5,6 @@ from datetime import timedelta
@pytest.mark.parametrize("job_name,function_path", [
('admin_checks', 'awx.main.tasks.run_administrative_checks'),
('tower_scheduler', 'awx.main.tasks.awx_periodic_scheduler'),
])
def test_CELERYBEAT_SCHEDULE(mocker, job_name, function_path):

View File

@@ -288,24 +288,30 @@ class AWXProxyHandler(logging.Handler):
'''
thread_local = threading.local()
_auditor = None
def __init__(self, **kwargs):
# TODO: process 'level' kwarg
super(AWXProxyHandler, self).__init__(**kwargs)
self._handler = None
self._old_kwargs = {}
self._auditor = logging.handlers.RotatingFileHandler(
filename='/var/log/tower/external.log',
maxBytes=1024 * 1024 * 50, # 50 MB
backupCount=5,
)
class WritableLogstashFormatter(LogstashFormatter):
@classmethod
def serialize(cls, message):
return json.dumps(message)
@property
def auditor(self):
if not self._auditor:
self._auditor = logging.handlers.RotatingFileHandler(
filename='/var/log/tower/external.log',
maxBytes=1024 * 1024 * 50, # 50 MB
backupCount=5,
)
self._auditor.setFormatter(WritableLogstashFormatter())
class WritableLogstashFormatter(LogstashFormatter):
@classmethod
def serialize(cls, message):
return json.dumps(message)
self._auditor.setFormatter(WritableLogstashFormatter())
return self._auditor
def get_handler_class(self, protocol):
return HANDLER_MAPPING.get(protocol, AWXNullHandler)
@@ -340,8 +346,8 @@ class AWXProxyHandler(logging.Handler):
if AWXProxyHandler.thread_local.enabled:
actual_handler = self.get_handler()
if settings.LOG_AGGREGATOR_AUDIT:
self._auditor.setLevel(settings.LOG_AGGREGATOR_LEVEL)
self._auditor.emit(record)
self.auditor.setLevel(settings.LOG_AGGREGATOR_LEVEL)
self.auditor.emit(record)
return actual_handler.emit(record)
def perform_test(self, custom_settings):

View File

@@ -49,12 +49,6 @@ else:
DEBUG = True
SQL_DEBUG = DEBUG
ADMINS = (
# ('Your Name', 'your_email@domain.com'),
)
MANAGERS = ADMINS
DATABASES = {
'default': {
'ENGINE': 'django.db.backends.sqlite3',
@@ -382,34 +376,6 @@ AUTH_BASIC_ENABLED = True
# If set, serve only minified JS for UI.
USE_MINIFIED_JS = False
# Email address that error messages come from.
SERVER_EMAIL = 'root@localhost'
# Default email address to use for various automated correspondence from
# the site managers.
DEFAULT_FROM_EMAIL = 'tower@localhost'
# Subject-line prefix for email messages sent with django.core.mail.mail_admins
# or ...mail_managers. Make sure to include the trailing space.
EMAIL_SUBJECT_PREFIX = '[Tower] '
# The email backend to use. For possible shortcuts see django.core.mail.
# The default is to use the SMTP backend.
# Third-party backends can be specified by providing a Python path
# to a module that defines an EmailBackend class.
EMAIL_BACKEND = 'django.core.mail.backends.smtp.EmailBackend'
# Host for sending email.
EMAIL_HOST = 'localhost'
# Port for sending email.
EMAIL_PORT = 25
# Optional SMTP authentication information for EMAIL_HOST.
EMAIL_HOST_USER = ''
EMAIL_HOST_PASSWORD = ''
EMAIL_USE_TLS = False
# Default to skipping isolated host key checking (the initial connection will
# hang on an interactive "The authenticity of host example.org can't be
# established" message)
@@ -457,10 +423,6 @@ CELERYBEAT_SCHEDULE = {
'schedule': timedelta(seconds=30),
'options': {'expires': 20,}
},
'admin_checks': {
'task': 'awx.main.tasks.run_administrative_checks',
'schedule': timedelta(days=30)
},
'cluster_heartbeat': {
'task': 'awx.main.tasks.cluster_node_heartbeat',
'schedule': timedelta(seconds=60),
@@ -635,16 +597,18 @@ PRIMARY_GALAXY_USERNAME = ''
PRIMARY_GALAXY_TOKEN = ''
PRIMARY_GALAXY_PASSWORD = ''
PRIMARY_GALAXY_AUTH_URL = ''
# Settings for the fallback galaxy server(s), normally this is the
# actual Ansible Galaxy site.
# server options: 'id', 'url', 'username', 'password', 'token', 'auth_url'
# To not use any fallback servers set this to []
FALLBACK_GALAXY_SERVERS = [
{
'id': 'galaxy',
'url': 'https://galaxy.ansible.com'
}
]
# Settings for the public galaxy server(s).
PUBLIC_GALAXY_ENABLED = True
PUBLIC_GALAXY_SERVER = {
'id': 'galaxy',
'url': 'https://galaxy.ansible.com'
}
# List of dicts of fallback (additional) Galaxy servers. If configured, these
# will be higher precedence than public Galaxy, but lower than primary Galaxy.
# Available options: 'id', 'url', 'username', 'password', 'token', 'auth_url'
FALLBACK_GALAXY_SERVERS = []
# Enable bubblewrap support for running jobs (playbook runs only).
# Note: This setting may be overridden by database settings.
@@ -978,9 +942,6 @@ FACT_CACHE_PORT = 6564
ORG_ADMINS_CAN_SEE_ALL_USERS = True
MANAGE_ORGANIZATION_AUTH = True
# Note: This setting may be overridden by database settings.
TOWER_ADMIN_ALERTS = True
# Note: This setting may be overridden by database settings.
TOWER_URL_BASE = "https://towerhost"
@@ -1061,11 +1022,6 @@ LOGGING = {
'formatter': 'json',
'filters': ['external_log_enabled', 'dynamic_level_filter'],
},
'mail_admins': {
'level': 'ERROR',
'filters': ['require_debug_false'],
'class': 'django.utils.log.AdminEmailHandler',
},
'tower_warnings': {
# don't define a level here, it's set by settings.LOG_AGGREGATOR_LEVEL
'class': 'logging.handlers.RotatingFileHandler',

View File

@@ -15,12 +15,6 @@ import os
import urllib.parse
import sys
ADMINS = (
# ('Your Name', 'your_email@domain.com'),
)
MANAGERS = ADMINS
# Enable the following lines and install the browser extension to use Django debug toolbar
# if your deployment method is not VMWare or Docker-for-Mac you may
# need a different IP address from request.META['REMOTE_ADDR']
@@ -69,7 +63,7 @@ CHANNEL_LAYERS = {
# Absolute filesystem path to the directory to host projects (with playbooks).
# This directory should NOT be web-accessible.
PROJECTS_ROOT = '/projects/'
PROJECTS_ROOT = '/var/lib/awx/projects/'
# Absolute filesystem path to the directory for job status stdout
# This directory should not be web-accessible
@@ -117,38 +111,6 @@ PROXY_IP_WHITELIST = []
# If set, use -vvv for project updates instead of -v for more output.
# PROJECT_UPDATE_VVV=True
###############################################################################
# EMAIL SETTINGS
###############################################################################
# Email address that error messages come from.
SERVER_EMAIL = 'root@localhost'
# The email backend to use. For possible shortcuts see django.core.mail.
# The default is to use the SMTP backend.
# Third-party backends can be specified by providing a Python path
# to a module that defines an EmailBackend class.
EMAIL_BACKEND = 'django.core.mail.backends.smtp.EmailBackend'
# Host for sending email.
EMAIL_HOST = 'localhost'
# Port for sending email.
EMAIL_PORT = 25
# Optional SMTP authentication information for EMAIL_HOST.
EMAIL_HOST_USER = ''
EMAIL_HOST_PASSWORD = ''
EMAIL_USE_TLS = False
# Default email address to use for various automated correspondence from
# the site managers.
DEFAULT_FROM_EMAIL = 'webmaster@localhost'
# Subject-line prefix for email messages sent with django.core.mail.mail_admins
# or ...mail_managers. Make sure to include the trailing space.
EMAIL_SUBJECT_PREFIX = '[AWX] '
###############################################################################
# LOGGING SETTINGS
###############################################################################

View File

@@ -12,12 +12,6 @@
# MISC PROJECT SETTINGS
###############################################################################
ADMINS = (
# ('Your Name', 'your_email@domain.com'),
)
MANAGERS = ADMINS
# Database settings to use PostgreSQL for development.
DATABASES = {
'default': {
@@ -97,38 +91,6 @@ PROXY_IP_WHITELIST = []
# If set, use -vvv for project updates instead of -v for more output.
# PROJECT_UPDATE_VVV=True
###############################################################################
# EMAIL SETTINGS
###############################################################################
# Email address that error messages come from.
SERVER_EMAIL = 'root@localhost'
# The email backend to use. For possible shortcuts see django.core.mail.
# The default is to use the SMTP backend.
# Third-party backends can be specified by providing a Python path
# to a module that defines an EmailBackend class.
EMAIL_BACKEND = 'django.core.mail.backends.smtp.EmailBackend'
# Host for sending email.
EMAIL_HOST = 'localhost'
# Port for sending email.
EMAIL_PORT = 25
# Optional SMTP authentication information for EMAIL_HOST.
EMAIL_HOST_USER = ''
EMAIL_HOST_PASSWORD = ''
EMAIL_USE_TLS = False
# Default email address to use for various automated correspondence from
# the site managers.
DEFAULT_FROM_EMAIL = 'webmaster@localhost'
# Subject-line prefix for email messages sent with django.core.mail.mail_admins
# or ...mail_managers. Make sure to include the trailing space.
EMAIL_SUBJECT_PREFIX = '[AWX] '
###############################################################################
# LOGGING SETTINGS
###############################################################################

View File

@@ -11,7 +11,6 @@ import awx
# Django
from django.utils import six
from django.utils.translation import ugettext_lazy as _
from django.core.validators import URLValidator, _lazy_re_compile
# Django Auth LDAP
import django_auth_ldap.config
@@ -234,34 +233,12 @@ class AuthenticationBackendsField(fields.StringListField):
class LDAPServerURIField(fields.URLField):
tld_re = (
r'\.' # dot
r'(?!-)' # can't start with a dash
r'(?:[a-z' + URLValidator.ul + r'0-9' + '-]{2,63}' # domain label, this line was changed from the original URLValidator
r'|xn--[a-z0-9]{1,59})' # or punycode label
r'(?<!-)' # can't end with a dash
r'\.?' # may have a trailing dot
)
host_re = '(' + URLValidator.hostname_re + URLValidator.domain_re + tld_re + '|localhost)'
regex = _lazy_re_compile(
r'^(?:[a-z0-9\.\-\+]*)://' # scheme is validated separately
r'(?:[^\s:@/]+(?::[^\s:@/]*)?@)?' # user:pass authentication
r'(?:' + URLValidator.ipv4_re + '|' + URLValidator.ipv6_re + '|' + host_re + ')'
r'(?::\d{2,5})?' # port
r'(?:[/?#][^\s]*)?' # resource path
r'\Z', re.IGNORECASE)
def __init__(self, **kwargs):
kwargs.setdefault('schemes', ('ldap', 'ldaps'))
kwargs.setdefault('allow_plain_hostname', True)
kwargs.setdefault('regex', LDAPServerURIField.regex)
super(LDAPServerURIField, self).__init__(**kwargs)
def run_validators(self, value):
for url in filter(None, re.split(r'[, ]', (value or ''))):
super(LDAPServerURIField, self).run_validators(url)
return value

View File

@@ -282,10 +282,12 @@ function getLaunchedByDetails () {
tooltip = strings.get('tooltips.SCHEDULE');
link = `/#/templates/job_template/${jobTemplate.id}/schedules/${schedule.id}`;
value = $filter('sanitize')(schedule.name);
} else {
} else if (schedule) {
tooltip = null;
link = null;
value = $filter('sanitize')(schedule.name);
} else {
return null;
}
return { label, link, tooltip, value };

View File

@@ -5,7 +5,7 @@
<!-- LEFT PANE HEADER ACTIONS -->
<div class="JobResults-panelHeaderButtonActions">
<!-- RELAUNCH ACTION -->
<at-relaunch job="vm.job"></at-relaunch>
<at-relaunch ng-if="vm.job" job="vm.job"></at-relaunch>
<!-- CANCEL ACTION -->
<button

View File

@@ -213,8 +213,8 @@ function JobRenderService ($q, $compile, $sce, $window) {
const record = this.createRecord(event, lines);
if (lines.length === 1 && lines[0] === '') {
// Some events, mainly runner_on_start events, have an actual line count of 1
// (stdout = '') and a claimed line count of 0 (end_line - start_line = 0).
// runner_on_start, runner_on_ok, and a few other events have an actual line count
// of 1 (stdout = '') and a claimed line count of 0 (end_line - start_line = 0).
// Since a zero-length string has an actual line count of 1, they'll still get
// rendered as blank lines unless we intercept them and add some special
// handling to remove them.

View File

@@ -208,6 +208,7 @@
max-width: none !important;
width: 100% !important;
padding-right: 0px !important;
margin-top: 10px;
}
.Form-formGroup--checkbox{

View File

@@ -15,7 +15,9 @@
title="{{ label || vm.strings.get('code_mirror.label.VARIABLES') }}"
tabindex="-1"
ng-if="tooltip">
<i class="fa fa-question-circle"></i>
<span class="at-Popover-icon" ng-class="{ 'at-Popover-icon--defaultCursor': popover.on === 'mouseenter' && !popover.click }">
<i class="fa fa-question-circle"></i>
</span>
</a>
<div class="atCodeMirror-toggleContainer FormToggle-container">
<div id="{{ name }}_parse_type" class="btn-group">

View File

@@ -202,6 +202,7 @@
.at-Row-toggle {
align-self: flex-start;
margin-right: @at-space-4x;
margin-left: 15px;
}
.at-Row-actions {
@@ -385,29 +386,3 @@
margin-right: @at-margin-right-list-row-item-inline-label;
}
}
@media screen and (max-width: @at-breakpoint-instances-wrap) {
.at-Row-items--instances {
margin-bottom: @at-padding-bottom-instances-wrap;
}
}
@media screen and (max-width: @at-breakpoint-compact-list) {
.at-Row-actions {
align-items: center;
}
.at-RowAction {
margin: @at-margin-list-row-action-mobile;
}
.at-RowItem--inline {
display: flex;
margin-right: inherit;
.at-RowItem-label {
width: @at-width-list-row-item-label;
margin-right: inherit;
}
}
}

View File

@@ -89,6 +89,9 @@ export default ['i18n', function(i18n) {
type: 'text',
reset: 'PRIMARY_GALAXY_AUTH_URL',
},
PUBLIC_GALAXY_ENABLED: {
type: 'toggleSwitch',
},
AWX_TASK_ENV: {
type: 'textarea',
reset: 'AWX_TASK_ENV',

View File

@@ -15,9 +15,6 @@ export default ['i18n', function(i18n) {
type: 'text',
reset: 'TOWER_URL_BASE',
},
TOWER_ADMIN_ALERTS: {
type: 'toggleSwitch',
},
ORG_ADMINS_CAN_SEE_ALL_USERS: {
type: 'toggleSwitch',
},

View File

@@ -1,9 +1,11 @@
.CapacityAdjuster {
margin-right: @at-space-4x;
margin-top: 15px;
margin-left: -10px;
position: relative;
&-valueLabel {
bottom: @at-space-5x;
top: -10px;
color: @at-color-body-text;
font-size: @at-font-size;
position: absolute;

View File

@@ -5,6 +5,8 @@ capacity-bar {
font-size: @at-font-size;
min-width: 100px;
white-space: nowrap;
margin-top: 5px;
margin-bottom: 5px;
.CapacityBar {
background-color: @default-bg;
@@ -42,12 +44,4 @@ capacity-bar {
text-align: right;
text-transform: uppercase;
}
.Capacity-details--percentage {
width: 40px;
}
&:only-child {
margin-right: 50px;
}
}

View File

@@ -12,6 +12,7 @@ function AddContainerGroupController(ToJSON, $scope, $state, models, strings, i1
vm.form = instanceGroup.createFormSchema('post');
vm.form.name.required = true;
delete vm.form.name.help_text;
vm.form.credential = {
type: 'field',
@@ -22,6 +23,7 @@ function AddContainerGroupController(ToJSON, $scope, $state, models, strings, i1
vm.form.credential._route = "instanceGroups.addContainerGroup.credentials";
vm.form.credential._model = credential;
vm.form.credential._placeholder = strings.get('container.CREDENTIAL_PLACEHOLDER');
vm.form.credential.help_text = strings.get('container.CREDENTIAL_HELP_TEXT');
vm.form.credential.required = true;
vm.form.extraVars = {
@@ -29,6 +31,7 @@ function AddContainerGroupController(ToJSON, $scope, $state, models, strings, i1
value: DataSet.data.actions.POST.pod_spec_override.default,
name: 'extraVars',
toggleLabel: strings.get('container.POD_SPEC_TOGGLE'),
tooltip: strings.get('container.EXTRA_VARS_HELP_TEXT')
};
vm.tab = {

View File

@@ -1,8 +1,8 @@
<div ui-view="credentials"></div>
<a class="containerGroups-messageBar-link" href="https://docs.ansible.com/ansible-tower/latest/html/administration/external_execution_envs.html#container-group-considerations" target="_blank" style="color: white">
<a class="containerGroups-messageBar-link" href="https://docs.ansible.com/ansible-tower/latest/html/administration/external_execution_envs.html#container-groups" target="_blank" style="color: white">
<div class="Section-messageBar">
<i class="Section-messageBar-warning fa fa-warning"></i>
<span class="Section-messageBar-text">This feature is tech preview, and is subject to change in a future release. Click here for documentation.</span>
<span class="Section-messageBar-text">This feature is currently in tech preview and is subject to change in a future release. Click here for documentation.</span>
</div>
</a>
<at-panel>
@@ -34,6 +34,7 @@
variables="vm.form.extraVars.value"
label="{{ vm.form.extraVars.label }}"
name="{{ vm.form.extraVars.name }}"
tooltip="{{ vm.form.extraVars.tooltip }}"
>
</at-code-mirror>
</div>

View File

@@ -27,6 +27,7 @@ function EditContainerGroupController($rootScope, $scope, $state, models, string
vm.switchDisabled = false;
vm.form.disabled = !instanceGroup.has('options', 'actions.PUT');
vm.form.name.required = true;
delete vm.form.name.help_text;
vm.form.credential = {
type: 'field',
label: i18n._('Credential'),
@@ -38,6 +39,7 @@ function EditContainerGroupController($rootScope, $scope, $state, models, string
vm.form.credential._displayValue = EditContainerGroupDataset.data.summary_fields.credential.name;
vm.form.credential.required = true;
vm.form.credential._value = EditContainerGroupDataset.data.summary_fields.credential.id;
vm.form.credential.help_text = strings.get('container.CREDENTIAL_HELP_TEXT');
vm.tab = {
details: {
@@ -59,7 +61,8 @@ function EditContainerGroupController($rootScope, $scope, $state, models, string
label: strings.get('container.POD_SPEC_LABEL'),
value: EditContainerGroupDataset.data.pod_spec_override || "---",
name: 'extraVars',
disabled: true
disabled: true,
tooltip: strings.get('container.EXTRA_VARS_HELP_TEXT')
};
vm.switchDisabled = true;
} else {
@@ -67,7 +70,8 @@ function EditContainerGroupController($rootScope, $scope, $state, models, string
label: strings.get('container.POD_SPEC_LABEL'),
value: EditContainerGroupDataset.data.pod_spec_override || instanceGroup.model.OPTIONS.actions.PUT.pod_spec_override.default,
name: 'extraVars',
toggleLabel: strings.get('container.POD_SPEC_TOGGLE')
toggleLabel: strings.get('container.POD_SPEC_TOGGLE'),
tooltip: strings.get('container.EXTRA_VARS_HELP_TEXT')
};
}

View File

@@ -1,135 +1,100 @@
.InstanceGroups {
.at-Row-actions{
justify-content: flex-start;
width: 300px;
& > capacity-bar:only-child{
margin-left: 0px;
margin-top: 5px
}
}
.at-RowAction{
margin: 0;
}
.at-Row-links{
justify-content: flex-start;
.at-Row--instances {
.at-Row-content {
flex-wrap: nowrap;
}
.BreadCrumb-menuLinkImage:hover {
color: @default-link;
.at-Row-toggle {
align-self: auto;
flex: initial;
}
.List-details {
align-self: flex-end;
color: @default-interface-txt;
.at-Row-itemGroup {
display: flex;
flex: 0 0 auto;
font-size: 12px;
margin-right:20px;
text-transform: uppercase;
flex: 1;
flex-wrap: wrap;
}
.Capacity-details {
.at-Row-items--instances {
display: flex;
margin-right: 20px;
flex-wrap: wrap;
align-items: center;
.Capacity-details--label {
color: @default-interface-txt;
margin: 0 10px 0 0;
width: 100px;
}
align-content: center;
flex: 1;
}
.RunningJobs-details {
align-items: center;
display: flex;
.RunningJobs-details--label {
margin: 0 10px 0 0;
}
.at-RowItem--isHeader {
min-width: 250px;
}
.List-tableCell--capacityColumn {
.at-Row-items--capacity {
display: flex;
height: 40px;
flex-wrap: wrap;
align-items: center;
}
.List-noItems {
margin-top: 20px;
}
.List-tableRow .List-titleBadge {
margin: 0 0 0 5px;
}
.Panel-docsLink {
cursor: pointer;
display: flex;
align-items: center;
justify-content: center;
padding: 7px;
background: @at-white;
border-radius: @at-border-radius;
height: 30px;
width: 30px;
margin: 0 20px 0 auto;
i {
font-size: @at-font-size-icon;
color: @at-gray-646972;
}
}
.Panel-docsLink:hover {
background-color: @at-blue;
i {
color: @at-white;
}
}
.at-Row-toggle{
margin-top: 20px;
padding-left: 15px;
}
.ContainerGroups-codeMirror{
margin-bottom: 10px;
}
.at-Row-container{
flex-wrap: wrap;
}
.containerGroups-messageBar-link:hover{
text-decoration: underline;
}
@media screen and (max-width: 1060px) and (min-width: 769px){
.at-Row-links {
justify-content: flex-start;
flex-wrap: wrap;
}
}
@media screen and (min-width: 1061px){
.at-Row-actions{
justify-content: flex-end;
& > capacity-bar:only-child {
margin-right: 30px;
}
}
.instanceGroupsList-details{
display: flex;
}
.at-Row-links {
justify-content: flex-end;
display: flex;
width: 445px;
}
.CapacityAdjuster {
padding-bottom: 15px;
}
}
.at-Row--instanceGroups {
.at-Row-content {
flex-wrap: nowrap;
}
.at-Row-itemGroup {
display: flex;
flex: 1;
flex-wrap: wrap;
}
.at-Row-items--instanceGroups {
display: flex;
flex-wrap: wrap;
align-items: center;
flex: 1;
max-width: 100%;
}
.at-Row-itemHeaderGroup {
min-width: 320px;
display: flex;
}
.at-Row-items--capacity {
display: flex;
flex-wrap: wrap;
align-items: center;
margin-right: 5px;
min-width: 215px;
}
.at-Row--instanceSpacer {
width: 140px;
}
.at-Row--capacitySpacer {
flex: .6;
}
.at-Row-actions {
min-width: 50px;
}
}
@media screen and (max-width: 1260px) {
.at-Row--instances .at-Row-items--capacity {
flex: 1
}
.at-Row--instances .CapacityAdjuster {
padding-bottom: 5px;
}
}
@media screen and (max-width: 600px) {
.at-Row--instanceGroups .at-Row-itemHeaderGroup,
.at-Row--instanceGroups .at-Row-itemGroup {
max-width: 270px;
}
}

View File

@@ -72,8 +72,9 @@ function InstanceGroupsStrings(BaseString) {
CREDENTIAL_PLACEHOLDER: t.s('SELECT A CREDENTIAL'),
POD_SPEC_LABEL: t.s('Pod Spec Override'),
BADGE_TEXT: t.s('Container Group'),
POD_SPEC_TOGGLE: t.s('Customize Pod Spec')
POD_SPEC_TOGGLE: t.s('Customize Pod Spec'),
CREDENTIAL_HELP_TEXT: t.s('Credential to authenticate with Kubernetes or OpenShift.  Must be of type \"Kubernetes/OpenShift API Bearer Token\".'),
EXTRA_VARS_HELP_TEXT: t.s('Field for passing a custom Kubernetes or OpenShift Pod specification.')
};
}

View File

@@ -43,35 +43,45 @@
</at-list-toolbar>
<at-list results='vm.instances'>
<at-row ng-repeat="instance in vm.instances"
ng-class="{'at-Row--active': (instance.id === vm.activeId)}">
ng-class="{'at-Row--active': (instance.id === vm.activeId)}"
class="at-Row--instances">
<div class="at-Row-toggle">
<at-switch on-toggle="vm.toggle(instance)" switch-on="instance.enabled" switch-disabled="vm.rowAction.toggle._disabled"></at-switch>
</div>
<div class="at-Row-items at-Row-items--instances">
<at-row-item
header-value="{{ instance.hostname }}"
header-tag="{{ instance.managed_by_policy ? '' : vm.strings.get('list.MANUAL') }}">
</at-row-item>
<at-row-item
label-value="{{:: vm.strings.get('list.ROW_ITEM_LABEL_RUNNING_JOBS') }}"
label-state="instanceGroups.instanceJobs({instance_group_id: {{vm.instance_group_id}}, instance_id: {{instance.id}}, job_search: {status__in: ['running,waiting']}})"
value="{{ instance.jobs_running }}"
inline="true"
badge="true">
</at-row-item>
<at-row-item
label-value="{{:: vm.strings.get('list.ROW_ITEM_LABEL_TOTAL_JOBS') }}"
label-state="instanceGroups.instanceJobs({instance_group_id: {{vm.instance_group_id}}, instance_id: {{instance.id}}})"
value="{{ instance.jobs_total }}"
inline="true"
badge="true">
</at-row-item>
</div>
<div class="at-Row-actions">
<capacity-adjuster state="instance" disabled="{{vm.rowAction.capacity_adjustment._disabled}}"></capacity-adjuster>
<capacity-bar label-value="{{:: vm.strings.get('list.ROW_ITEM_LABEL_USED_CAPACITY') }}" capacity="instance.consumed_capacity" total-capacity="instance.capacity"></capacity-bar>
<div class="at-Row-itemGroup">
<div class="at-Row-items at-Row-items--instances">
<at-row-item
header-value="{{ instance.hostname }}"
header-tag="{{ instance.managed_by_policy ? '' : vm.strings.get('list.MANUAL') }}">
</at-row-item>
<div class="at-Row-nonHeaderItems">
<at-row-item
label-value="{{:: vm.strings.get('list.ROW_ITEM_LABEL_RUNNING_JOBS') }}"
label-state="instanceGroups.instanceJobs({instance_group_id: {{vm.instance_group_id}}, instance_id: {{instance.id}}, job_search: {status__in: ['running,waiting']}})"
value="{{ instance.jobs_running }}"
inline="true"
badge="true">
</at-row-item>
<at-row-item
label-value="{{:: vm.strings.get('list.ROW_ITEM_LABEL_TOTAL_JOBS') }}"
label-state="instanceGroups.instanceJobs({instance_group_id: {{vm.instance_group_id}}, instance_id: {{instance.id}}})"
value="{{ instance.jobs_total }}"
inline="true"
badge="true">
</at-row-item>
</div>
</div>
<div class="at-Row-items--capacity">
<capacity-adjuster
state="instance"
disabled="{{vm.rowAction.capacity_adjustment._disabled}}">
</capacity-adjuster>
<capacity-bar
label-value="{{:: vm.strings.get('list.ROW_ITEM_LABEL_USED_CAPACITY') }}"
capacity="instance.consumed_capacity"
total-capacity="instance.capacity">
</capacity-bar>
</div>
</div>
</at-row>
</at-list>

View File

@@ -41,10 +41,11 @@
</at-list-toolbar>
<at-list results="instance_groups">
<at-row ng-repeat="instance_group in instance_groups"
ng-class="{'at-Row--active': (instance_group.id === vm.activeId)}" >
<div class="at-Row-items">
<div class="at-Row-container">
<div class="at-Row-content">
ng-class="{'at-Row--active': (instance_group.id === vm.activeId)}"
class="at-Row--instanceGroups">
<div class="at-Row-itemGroup">
<div class="at-Row-items at-Row-items--instanceGroups">
<div class="at-Row-itemHeaderGroup">
<at-row-item
ng-if="!instance_group.credential"
header-value="{{ instance_group.name }}"
@@ -67,23 +68,14 @@
</div>
</div>
<div class="at-RowItem--labels" ng-if="!instance_group.credential">
<div class="LabelList-tagContainer">
<div class="LabelList-tag" ng-class="{'LabelList-tag--deletable' : (showDelete && template.summary_fields.user_capabilities.edit)}">
<span class="LabelList-name">{{vm.strings.get('instance.BADGE_TEXT') }}</span>
</div>
<div class="LabelList-tagContainer">
<div class="LabelList-tag" ng-class="{'LabelList-tag--deletable' : (showDelete && template.summary_fields.user_capabilities.edit)}">
<span class="LabelList-name">{{vm.strings.get('instance.BADGE_TEXT') }}</span>
</div>
</div>
</div>
</div>
<div class="instanceGroupsList-details">
<div class="at-Row-links">
<at-row-item
ng-if="!instance_group.credential"
label-value="{{:: vm.strings.get('list.ROW_ITEM_LABEL_INSTANCES') }}"
label-state="instanceGroups.instances({instance_group_id: {{ instance_group.id }}})"
value="{{ instance_group.instances }}"
inline="true"
badge="true">
</at-row-item>
<div class="at-Row-nonHeaderItems">
<at-row-item
label-value="{{:: vm.strings.get('list.ROW_ITEM_LABEL_RUNNING_JOBS') }}"
label-state="instanceGroups.jobs({instance_group_id: {{ instance_group.id }}, job_search: {status__in: ['running,waiting']}})"
@@ -98,14 +90,38 @@
inline="true"
badge="true">
</at-row-item>
</div>
<div class="at-Row-actions" >
<capacity-bar ng-show="!instance_group.credential" label-value="{{:: vm.strings.get('list.ROW_ITEM_LABEL_USED_CAPACITY') }}" capacity="instance_group.consumed_capacity" total-capacity="instance_group.capacity"></capacity-bar>
<at-row-action icon="fa-trash" ng-click="vm.deleteInstanceGroup(instance_group)" ng-if="vm.rowAction.trash(instance_group)">
</at-row-action>
</div>
<at-row-item
ng-if="!instance_group.credential"
label-value="{{:: vm.strings.get('list.ROW_ITEM_LABEL_INSTANCES') }}"
label-state="instanceGroups.instances({instance_group_id: {{ instance_group.id }}})"
value="{{ instance_group.instances }}"
inline="true"
badge="true">
</at-row-item>
<div
ng-if="instance_group.credential"
class="at-Row--instanceSpacer">
</div>
</div>
</div>
<div class="at-Row-items--capacity" ng-if="!instance_group.credential">
<capacity-bar
label-value="{{:: vm.strings.get('list.ROW_ITEM_LABEL_USED_CAPACITY') }}"
capacity="instance_group.consumed_capacity"
total-capacity="instance_group.capacity">
</capacity-bar>
</div>
<div
ng-if="instance_group.credential"
class="at-Row--capacitySpacer">
</div>
</div>
<div class="at-Row-actions" >
<at-row-action
icon="fa-trash"
ng-click="vm.deleteInstanceGroup(instance_group)"
ng-if="vm.rowAction.trash(instance_group)">
</at-row-action>
</div>
</at-row>
</at-list>

View File

@@ -671,6 +671,98 @@ export default ['i18n', function(i18n) {
"|| notification_type.value == 'webhook')",
ngDisabled: '!(notification_template.summary_fields.user_capabilities.edit || canAdd)',
},
approved_message: {
label: i18n._('Workflow Approved Message'),
class: 'Form-formGroup--fullWidth',
type: 'syntax_highlight',
mode: 'jinja2',
default: '',
ngShow: "customize_messages && notification_type.value != 'webhook'",
rows: 2,
oneLine: 'true',
ngDisabled: '!(notification_template.summary_fields.user_capabilities.edit || canAdd)',
},
approved_body: {
label: i18n._('Workflow Approved Message Body'),
class: 'Form-formGroup--fullWidth',
type: 'syntax_highlight',
mode: 'jinja2',
default: '',
ngShow: "customize_messages && " +
"(notification_type.value == 'email' " +
"|| notification_type.value == 'pagerduty' " +
"|| notification_type.value == 'webhook')",
ngDisabled: '!(notification_template.summary_fields.user_capabilities.edit || canAdd)',
},
denied_message: {
label: i18n._('Workflow Denied Message'),
class: 'Form-formGroup--fullWidth',
type: 'syntax_highlight',
mode: 'jinja2',
default: '',
ngShow: "customize_messages && notification_type.value != 'webhook'",
rows: 2,
oneLine: 'true',
ngDisabled: '!(notification_template.summary_fields.user_capabilities.edit || canAdd)',
},
denied_body: {
label: i18n._('Workflow Denied Message Body'),
class: 'Form-formGroup--fullWidth',
type: 'syntax_highlight',
mode: 'jinja2',
default: '',
ngShow: "customize_messages && " +
"(notification_type.value == 'email' " +
"|| notification_type.value == 'pagerduty' " +
"|| notification_type.value == 'webhook')",
ngDisabled: '!(notification_template.summary_fields.user_capabilities.edit || canAdd)',
},
running_message: {
label: i18n._('Workflow Running Message'),
class: 'Form-formGroup--fullWidth',
type: 'syntax_highlight',
mode: 'jinja2',
default: '',
ngShow: "customize_messages && notification_type.value != 'webhook'",
rows: 2,
oneLine: 'true',
ngDisabled: '!(notification_template.summary_fields.user_capabilities.edit || canAdd)',
},
running_body: {
label: i18n._('Workflow Running Message Body'),
class: 'Form-formGroup--fullWidth',
type: 'syntax_highlight',
mode: 'jinja2',
default: '',
ngShow: "customize_messages && " +
"(notification_type.value == 'email' " +
"|| notification_type.value == 'pagerduty' " +
"|| notification_type.value == 'webhook')",
ngDisabled: '!(notification_template.summary_fields.user_capabilities.edit || canAdd)',
},
timed_out_message: {
label: i18n._('Workflow Timed Out Message'),
class: 'Form-formGroup--fullWidth',
type: 'syntax_highlight',
mode: 'jinja2',
default: '',
ngShow: "customize_messages && notification_type.value != 'webhook'",
rows: 2,
oneLine: 'true',
ngDisabled: '!(notification_template.summary_fields.user_capabilities.edit || canAdd)',
},
timed_out_body: {
label: i18n._('Workflow Timed Out Message Body'),
class: 'Form-formGroup--fullWidth',
type: 'syntax_highlight',
mode: 'jinja2',
default: '',
ngShow: "customize_messages && " +
"(notification_type.value == 'email' " +
"|| notification_type.value == 'pagerduty' " +
"|| notification_type.value == 'webhook')",
ngDisabled: '!(notification_template.summary_fields.user_capabilities.edit || canAdd)',
},
},
buttons: { //for now always generates <button> tags

View File

@@ -1,19 +1,20 @@
const emptyDefaults = {
started: {
message: '',
body: '',
},
success: {
message: '',
body: '',
},
error: {
message: '',
body: '',
},
started: { message: '', body: '' },
success: { message: '', body: '' },
error: { message: '', body: '' },
workflow_approval: {
approved: { message: '', body: '' },
denied: { message: '', body: '' },
running: { message: '', body: '' },
timed_out: { message: '', body: '' },
}
};
function getMessageIfUpdated(message, defaultValue) {
return message === defaultValue ? null : message;
}
export default [function() {
return {
getMessagesObj: function ($scope, defaultMessages) {
@@ -23,22 +24,34 @@ export default [function() {
const defaults = defaultMessages[$scope.notification_type.value] || {};
return {
started: {
message: $scope.started_message === defaults.started.message ?
null : $scope.started_message,
body: $scope.started_body === defaults.started.body ?
null : $scope.started_body,
message: getMessageIfUpdated($scope.started_message, defaults.started.message),
body: getMessageIfUpdated($scope.started_body, defaults.started.body),
},
success: {
message: $scope.success_message === defaults.success.message ?
null : $scope.success_message,
body: $scope.success_body === defaults.success.body ?
null : $scope.success_body,
message: getMessageIfUpdated($scope.success_message, defaults.success.message),
body: getMessageIfUpdated($scope.success_body, defaults.success.body),
},
error: {
message: $scope.error_message === defaults.error.message ?
null : $scope.error_message,
body: $scope.error_body === defaults.error.body ?
null : $scope.error_body,
message: getMessageIfUpdated($scope.error_message, defaults.error.message),
body: getMessageIfUpdated($scope.error_body, defaults.error.body),
},
workflow_approval: {
approved: {
message: getMessageIfUpdated($scope.approved_message, defaults.workflow_approval.approved.message),
body: getMessageIfUpdated($scope.approved_body, defaults.workflow_approval.approved.body),
},
denied: {
message: getMessageIfUpdated($scope.denied_message, defaults.workflow_approval.denied.message),
body: getMessageIfUpdated($scope.denied_body, defaults.workflow_approval.denied.body),
},
running: {
message: getMessageIfUpdated($scope.running_message, defaults.workflow_approval.running.message),
body: getMessageIfUpdated($scope.running_body, defaults.workflow_approval.running.body),
},
timed_out: {
message: getMessageIfUpdated($scope.timed_out_message, defaults.workflow_approval.timed_out.message),
body: getMessageIfUpdated($scope.timed_out_body, defaults.workflow_approval.timed_out.body),
},
}
};
},
@@ -56,6 +69,15 @@ export default [function() {
$scope.success_body = defaults.success.body;
$scope.error_message = defaults.error.message;
$scope.error_body = defaults.error.body;
$scope.approved_message = defaults.workflow_approval.approved.message;
$scope.approved_body = defaults.workflow_approval.approved.body;
$scope.denied_message = defaults.workflow_approval.denied.message;
$scope.denied_body = defaults.workflow_approval.denied.body;
$scope.running_message = defaults.workflow_approval.running.message;
$scope.running_body = defaults.workflow_approval.running.body;
$scope.timed_out_message = defaults.workflow_approval.timed_out.message;
$scope.timed_out_body = defaults.workflow_approval.timed_out.body;
if (!messages) {
return;
}
@@ -84,6 +106,48 @@ export default [function() {
isCustomized = true;
$scope.error_body = messages.error.body;
}
if (messages.workflow_approval) {
if (messages.workflow_approval.approved &&
messages.workflow_approval.approved.message) {
isCustomized = true;
$scope.approved_message = messages.workflow_approval.approved.message;
}
if (messages.workflow_approval.approved &&
messages.workflow_approval.approved.body) {
isCustomized = true;
$scope.approved_body = messages.workflow_approval.approved.body;
}
if (messages.workflow_approval.denied &&
messages.workflow_approval.denied.message) {
isCustomized = true;
$scope.denied_message = messages.workflow_approval.denied.message;
}
if (messages.workflow_approval.denied &&
messages.workflow_approval.denied.body) {
isCustomized = true;
$scope.denied_body = messages.workflow_approval.denied.body;
}
if (messages.workflow_approval.running &&
messages.workflow_approval.running.message) {
isCustomized = true;
$scope.running_message = messages.workflow_approval.running.message;
}
if (messages.workflow_approval.running &&
messages.workflow_approval.running.body) {
isCustomized = true;
$scope.running_body = messages.workflow_approval.running.body;
}
if (messages.workflow_approval.timed_out &&
messages.workflow_approval.timed_out.message) {
isCustomized = true;
$scope.timed_out_message = messages.workflow_approval.timed_out.message;
}
if (messages.workflow_approval.timed_out &&
messages.workflow_approval.timed_out.body) {
isCustomized = true;
$scope.timed_out_body = messages.workflow_approval.timed_out.body;
}
}
$scope.customize_messages = isCustomized;
},
@@ -110,6 +174,30 @@ export default [function() {
if ($scope.error_body === oldDefaults.error.body) {
$scope.error_body = newDefaults.error.body;
}
if ($scope.approved_message === oldDefaults.workflow_approval.approved.message) {
$scope.approved_message = newDefaults.workflow_approval.approved.message;
}
if ($scope.approved_body === oldDefaults.workflow_approval.approved.body) {
$scope.approved_body = newDefaults.workflow_approval.approved.body;
}
if ($scope.denied_message === oldDefaults.workflow_approval.denied.message) {
$scope.denied_message = newDefaults.workflow_approval.denied.message;
}
if ($scope.denied_body === oldDefaults.workflow_approval.denied.body) {
$scope.denied_body = newDefaults.workflow_approval.denied.body;
}
if ($scope.running_message === oldDefaults.workflow_approval.running.message) {
$scope.running_message = newDefaults.workflow_approval.running.message;
}
if ($scope.running_body === oldDefaults.workflow_approval.running.body) {
$scope.running_body = newDefaults.workflow_approval.running.body;
}
if ($scope.timed_out_message === oldDefaults.workflow_approval.timed_out.message) {
$scope.timed_out_message = newDefaults.workflow_approval.timed_out.message;
}
if ($scope.timed_out_body === oldDefaults.workflow_approval.timed_out.body) {
$scope.timed_out_body = newDefaults.workflow_approval.timed_out.body;
}
}
};
}];
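
For reference, the extracted getMessageIfUpdated helper simply nulls out any value that still matches its default, so only customized messages are sent to the API. A minimal sketch (the message strings are illustrative):

// Unchanged messages collapse to null...
getMessageIfUpdated('{{ job.name }} finished', '{{ job.name }} finished');
// => null
// ...while anything the user edited is kept as-is.
getMessageIfUpdated('Approval needed for {{ workflow.name }}', '{{ job.name }} finished');
// => 'Approval needed for {{ workflow.name }}'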

View File

@@ -233,6 +233,38 @@ export default [ 'ProcessErrors', 'CredentialTypeModel', 'TemplatesStrings', '$f
}, true);
};
function getSelectedTags(tagId) {
const selectedTags = [];
const choiceElements = $(tagId).siblings(".select2").first()
.find(".select2-selection__choice");
choiceElements.each((index, option) => {
selectedTags.push({
value: option.title,
name: option.title,
label: option.title
});
});
return selectedTags;
}
function consolidateTags (tags, otherTags) {
const seen = [];
const consolidated = [];
tags.forEach(tag => {
if (!seen.includes(tag.value)) {
seen.push(tag.value);
consolidated.push(tag);
}
});
otherTags.forEach(tag => {
if (!seen.includes(tag.value)) {
seen.push(tag.value);
consolidated.push(tag);
}
});
return consolidated;
}
vm.next = (currentTab) => {
if(_.has(vm, 'steps.other_prompts.tab._active') && vm.steps.other_prompts.tab._active === true){
try {
@@ -243,6 +275,22 @@ export default [ 'ProcessErrors', 'CredentialTypeModel', 'TemplatesStrings', '$f
event.preventDefault();
return;
}
// The current tag input state lives somewhere in the associated select2
// widgetry and isn't directly tied to the vm, so extract the tag values
// and update the vm to keep it in sync.
if (vm.promptDataClone.launchConf.ask_tags_on_launch) {
vm.promptDataClone.prompts.tags.value = consolidateTags(
angular.copy(vm.promptDataClone.prompts.tags.value),
getSelectedTags("#job_launch_job_tags")
);
}
if (vm.promptDataClone.launchConf.ask_skip_tags_on_launch) {
vm.promptDataClone.prompts.skipTags.value = consolidateTags(
angular.copy(vm.promptDataClone.prompts.skipTags.value),
getSelectedTags("#job_launch_skip_tags")
);
}
}
let nextStep;
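
A quick sketch of what the new getSelectedTags/consolidateTags pair does: getSelectedTags scrapes the current select2 selections out of the DOM, and consolidateTags merges them with the tags already on the vm, keeping the first entry seen for each value (the tag objects below are illustrative):

// Tags already stored on the vm take precedence; DOM-only selections are appended.
const vmTags = [{ value: 'deploy', name: 'deploy', label: 'deploy' }];
const domTags = [
  { value: 'deploy', name: 'deploy', label: 'deploy' },
  { value: 'web', name: 'web', label: 'web' },
];
consolidateTags(vmTags, domTags);
// => [{ value: 'deploy', ... }, { value: 'web', ... }]  (no duplicate 'deploy')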

View File

@@ -12,19 +12,6 @@ export default
let scope;
let consolidateTags = (tagModel, tagId) => {
let tags = angular.copy(tagModel);
$(tagId).siblings(".select2").first().find(".select2-selection__choice").each((optionIndex, option) => {
tags.push({
value: option.title,
name: option.title,
label: option.title
});
});
return [...tags.reduce((map, tag) => map.has(tag.value) ? map : map.set(tag.value, tag), new Map()).values()];
};
vm.init = (_scope_) => {
scope = _scope_;
@@ -35,14 +22,6 @@ export default
const surveyPasswords = {};
if (scope.promptData.launchConf.ask_tags_on_launch) {
scope.promptData.prompts.tags.value = consolidateTags(scope.promptData.prompts.tags.value, "#job_launch_job_tags");
}
if (scope.promptData.launchConf.ask_skip_tags_on_launch) {
scope.promptData.prompts.skipTags.value = consolidateTags(scope.promptData.prompts.skipTags.value, "#job_launch_skip_tags");
}
if (scope.promptData.launchConf.survey_enabled){
scope.promptData.extraVars = ToJSON(scope.parseType, scope.promptData.prompts.variables.value, false);
scope.promptData.surveyQuestions.forEach(surveyQuestion => {

View File

@@ -241,7 +241,7 @@ export default ['NotificationsList', 'i18n', function(NotificationsList, i18n) {
on-lookup-click="handleWebhookCredentialLookupClick"
on-tag-delete="handleWebhookCredentialTagDelete"
</webhook-credential-input>`,
awPopOver: "<p>" + i18n._("Select the credential to use with the webhook service.") + "</p>",
awPopOver: "<p>" + i18n._("Optionally, select the credential to use to send status updates back to the webhook service.") + "</p>",
dataTitle: i18n._('Webhook Credential'),
dataPlacement: 'right',
dataContainer: "body",

View File

@@ -1186,16 +1186,16 @@
"integrity": "sha512-rLu3wcBWH4P5q1CGoSSH/i9hrXs7SlbRLkoq9IGuoPYNGQvDJ3pt/wmOM+XgYjIDRMVIdkUWt0RsfzF50JfnCw=="
},
"@fortawesome/fontawesome-common-types": {
"version": "0.2.22",
"resolved": "https://registry.npmjs.org/@fortawesome/fontawesome-common-types/-/fontawesome-common-types-0.2.22.tgz",
"integrity": "sha512-QmEuZsipX5/cR9JOg0fsTN4Yr/9lieYWM8AQpmRa0eIfeOcl/HLYoEa366BCGRSrgNJEexuvOgbq9jnJ22IY5g=="
"version": "0.2.25",
"resolved": "https://registry.npmjs.org/@fortawesome/fontawesome-common-types/-/fontawesome-common-types-0.2.25.tgz",
"integrity": "sha512-3RuZPDuuPELd7RXtUqTCfed14fcny9UiPOkdr2i+cYxBoTOfQgxcDoq77fHiiHcgWuo1LoBUpvGxFF1H/y7s3Q=="
},
"@fortawesome/free-brands-svg-icons": {
"version": "5.10.2",
"resolved": "https://registry.npmjs.org/@fortawesome/free-brands-svg-icons/-/free-brands-svg-icons-5.10.2.tgz",
"integrity": "sha512-r5Dxr2h8f9bEI7F/gj/2v1OX9S6DMif9ZKR2VFQCSXHwahojLlOWnFILYsrjhzOISESkh6WDL9IOdkdbKM7KPw==",
"version": "5.11.2",
"resolved": "https://registry.npmjs.org/@fortawesome/free-brands-svg-icons/-/free-brands-svg-icons-5.11.2.tgz",
"integrity": "sha512-wKK5znpHiZ2S0VgOvbeAnYuzkk3H86rxWajD9PVpfBj3s/kySEWTFKh/uLPyxiTOx8Tsd0OGN4En/s9XudVHLQ==",
"requires": {
"@fortawesome/fontawesome-common-types": "^0.2.22"
"@fortawesome/fontawesome-common-types": "^0.2.25"
}
},
"@jest/console": {
@@ -1787,51 +1787,43 @@
"dev": true
},
"@patternfly/patternfly": {
"version": "2.27.0",
"resolved": "https://registry.npmjs.org/@patternfly/patternfly/-/patternfly-2.27.0.tgz",
"integrity": "sha512-sYSKUG3PL1KNKVw6bhijur0fS2pgfxWFmLCedxXaECt4KdKcg6rGvInzQnyGiQhWMVZBbWxFCWvBxBIr7L8ilA=="
"version": "2.40.2",
"resolved": "https://registry.npmjs.org/@patternfly/patternfly/-/patternfly-2.40.2.tgz",
"integrity": "sha512-KCPQ6EL39xJen/B67MGv56i3h6bU5l7FD6f5IYU30z+ed2gM8zAYI3mPKNV05TMJv6+EQfp6O7dqCM3PJ8Q1yw=="
},
"@patternfly/react-core": {
"version": "3.87.3",
"resolved": "https://registry.npmjs.org/@patternfly/react-core/-/react-core-3.87.3.tgz",
"integrity": "sha512-5kcuOIucqtnmGKjV13gRHEcU31AbNXNiNX3yhEhdeuG4gH00gFyUAxim3fkbYnR4OtGdU2MLvVOjoMfYj62rBQ==",
"version": "3.120.2",
"resolved": "https://registry.npmjs.org/@patternfly/react-core/-/react-core-3.120.2.tgz",
"integrity": "sha512-PgV5w+3NlXK7hKvu0YY1pjXgd56dLwbIWE4m72JstxJIp/vpRShB6bfiSYNQGVi2ZQUudQTSH5sVWaBqXUaquw==",
"requires": {
"@patternfly/react-icons": "^3.10.17",
"@patternfly/react-styles": "^3.5.13",
"@patternfly/react-tokens": "^2.6.16",
"@patternfly/react-icons": "^3.14.15",
"@patternfly/react-styles": "^3.6.2",
"@patternfly/react-tokens": "^2.7.2",
"emotion": "^9.2.9",
"exenv": "^1.2.2",
"focus-trap-react": "^4.0.1",
"tippy.js": "3.4.1"
},
"dependencies": {
"@patternfly/react-icons": {
"version": "3.12.0",
"resolved": "https://registry.npmjs.org/@patternfly/react-icons/-/react-icons-3.12.0.tgz",
"integrity": "sha512-45nP1m4La/LusrKVQNwkyZV3mqAJWhYgMce2+0VewAJ5ts4ygUJYzXR1vZQk09E8gTBCHMEdq0Af1xywwluHFg==",
"requires": {
"@fortawesome/free-brands-svg-icons": "^5.8.1"
}
},
"@patternfly/react-tokens": {
"version": "2.6.16",
"resolved": "https://registry.npmjs.org/@patternfly/react-tokens/-/react-tokens-2.6.16.tgz",
"integrity": "sha512-dhr4ne4thSmSKBr4anV07KSzUXEs6KpCMDxxNiwrgFdZwNHtyNcaPc+F9pQZ5A0n4qYmMLpCrprb7m5o/83riQ=="
"version": "2.7.2",
"resolved": "https://registry.npmjs.org/@patternfly/react-tokens/-/react-tokens-2.7.2.tgz",
"integrity": "sha512-3QslQUErDLXGTzp2iGQNJD1UjZ+1NqwavOlsbxACUZ6LjXyJ7Y4TZbxDQrpgzPsD1SFPEVWufzpdjjtRBZ/b7g=="
}
}
},
"@patternfly/react-icons": {
"version": "3.12.0",
"resolved": "https://registry.npmjs.org/@patternfly/react-icons/-/react-icons-3.12.0.tgz",
"integrity": "sha512-45nP1m4La/LusrKVQNwkyZV3mqAJWhYgMce2+0VewAJ5ts4ygUJYzXR1vZQk09E8gTBCHMEdq0Af1xywwluHFg==",
"version": "3.14.15",
"resolved": "https://registry.npmjs.org/@patternfly/react-icons/-/react-icons-3.14.15.tgz",
"integrity": "sha512-7mIr1nzAXu6CdxKnhJGggIghx3DCaFXv6an+mfP/IwWifsLhcpE1c0iYkmVkvlI9X4cQAzeg9VfEGR7quhPOlA==",
"requires": {
"@fortawesome/free-brands-svg-icons": "^5.8.1"
}
},
"@patternfly/react-styles": {
"version": "3.5.13",
"resolved": "https://registry.npmjs.org/@patternfly/react-styles/-/react-styles-3.5.13.tgz",
"integrity": "sha512-aiyOp/n4cMxWhNmokG9EAFt06YmWDi3EdGfa5gyjYRwABGLUhyHo2r7kBqT3xxw0bLcOYDTPU94SaH63uAaRag==",
"version": "3.6.2",
"resolved": "https://registry.npmjs.org/@patternfly/react-styles/-/react-styles-3.6.2.tgz",
"integrity": "sha512-WRXPC1R/qL+i/ANnrA0nEe6CcLHLZJIKWzSJ4gS2h9VdHvKySEdIlk9EtAZ0dNkv3whANjaKlR/n2/uFuXlzyw==",
"requires": {
"@babel/helper-plugin-utils": "^7.0.0-beta.48",
"camel-case": "^3.0.0",
@@ -1849,17 +1841,24 @@
},
"dependencies": {
"acorn": {
"version": "6.3.0",
"resolved": "https://registry.npmjs.org/acorn/-/acorn-6.3.0.tgz",
"integrity": "sha512-/czfa8BwS88b9gWQVhc8eknunSA2DoJpJyTQkhheIf5E48u1N0R4q/YxxsAeqRrmK9TQ/uYfgLDfZo91UlANIA=="
"version": "7.1.0",
"resolved": "https://registry.npmjs.org/acorn/-/acorn-7.1.0.tgz",
"integrity": "sha512-kL5CuoXA/dgxlBbVrflsflzQ3PAas7RYZB52NOm/6839iVYJgKMJ3cQJD+t2i5+qFa8h3MDpEOJiS64E8JLnSQ=="
},
"acorn-globals": {
"version": "4.3.3",
"resolved": "https://registry.npmjs.org/acorn-globals/-/acorn-globals-4.3.3.tgz",
"integrity": "sha512-vkR40VwS2SYO98AIeFvzWWh+xyc2qi9s7OoXSFEGIP/rOJKzjnhykaZJNnHdoq4BL2gGxI5EZOU16z896EYnOQ==",
"version": "4.3.4",
"resolved": "https://registry.npmjs.org/acorn-globals/-/acorn-globals-4.3.4.tgz",
"integrity": "sha512-clfQEh21R+D0leSbUdWf3OcfqyaCSAQ8Ryq00bofSekfr9W8u1jyYZo6ir0xu9Gtcf7BjcHJpnbZH7JOCpP60A==",
"requires": {
"acorn": "^6.0.1",
"acorn-walk": "^6.0.1"
},
"dependencies": {
"acorn": {
"version": "6.3.0",
"resolved": "https://registry.npmjs.org/acorn/-/acorn-6.3.0.tgz",
"integrity": "sha512-/czfa8BwS88b9gWQVhc8eknunSA2DoJpJyTQkhheIf5E48u1N0R4q/YxxsAeqRrmK9TQ/uYfgLDfZo91UlANIA=="
}
}
},
"escodegen": {
@@ -1880,16 +1879,16 @@
"integrity": "sha1-/cpRzuYTOJXjyI1TXOSdv/YqRjM="
},
"jsdom": {
"version": "15.1.1",
"resolved": "https://registry.npmjs.org/jsdom/-/jsdom-15.1.1.tgz",
"integrity": "sha512-cQZRBB33arrDAeCrAEWn1U3SvrvC8XysBua9Oqg1yWrsY/gYcusloJC3RZJXuY5eehSCmws8f2YeliCqGSkrtQ==",
"version": "15.2.0",
"resolved": "https://registry.npmjs.org/jsdom/-/jsdom-15.2.0.tgz",
"integrity": "sha512-+hRyEfjRPFwTYMmSQ3/f7U9nP8ZNZmbkmUek760ZpxnCPWJIhaaLRuUSvpJ36fZKCGENxLwxClzwpOpnXNfChQ==",
"requires": {
"abab": "^2.0.0",
"acorn": "^6.1.1",
"acorn": "^7.1.0",
"acorn-globals": "^4.3.2",
"array-equal": "^1.0.0",
"cssom": "^0.3.6",
"cssstyle": "^1.2.2",
"cssom": "^0.4.1",
"cssstyle": "^2.0.0",
"data-urls": "^1.1.0",
"domexception": "^1.0.1",
"escodegen": "^1.11.1",
@@ -1913,16 +1912,23 @@
},
"dependencies": {
"cssom": {
"version": "0.3.8",
"resolved": "https://registry.npmjs.org/cssom/-/cssom-0.3.8.tgz",
"integrity": "sha512-b0tGHbfegbhPJpxpiBPU2sCkigAqtM9O121le6bbOlgyV+NyGyCmVfJ6QW9eRjz8CpNfWEOYBIMIGRYkLwsIYg=="
"version": "0.4.1",
"resolved": "https://registry.npmjs.org/cssom/-/cssom-0.4.1.tgz",
"integrity": "sha512-6Aajq0XmukE7HdXUU6IoSWuH1H6gH9z6qmagsstTiN7cW2FNTsb+J2Chs+ufPgZCsV/yo8oaEudQLrb9dGxSVQ=="
},
"cssstyle": {
"version": "1.4.0",
"resolved": "https://registry.npmjs.org/cssstyle/-/cssstyle-1.4.0.tgz",
"integrity": "sha512-GBrLZYZ4X4x6/QEoBnIrqb8B/f5l4+8me2dkom/j1Gtbxy0kBv6OGzKuAsGM75bkGwGAFkt56Iwg28S3XTZgSA==",
"version": "2.0.0",
"resolved": "https://registry.npmjs.org/cssstyle/-/cssstyle-2.0.0.tgz",
"integrity": "sha512-QXSAu2WBsSRXCPjvI43Y40m6fMevvyRm8JVAuF9ksQz5jha4pWP1wpaK7Yu5oLFc6+XAY+hj8YhefyXcBB53gg==",
"requires": {
"cssom": "0.3.x"
"cssom": "~0.3.6"
},
"dependencies": {
"cssom": {
"version": "0.3.8",
"resolved": "https://registry.npmjs.org/cssom/-/cssom-0.3.8.tgz",
"integrity": "sha512-b0tGHbfegbhPJpxpiBPU2sCkigAqtM9O121le6bbOlgyV+NyGyCmVfJ6QW9eRjz8CpNfWEOYBIMIGRYkLwsIYg=="
}
}
}
}
@@ -1987,9 +1993,9 @@
"integrity": "sha512-M4yMwr6mAnQz76TbJm914+gPpB/nCwvZbJU28cUD6dR004SAxDLOOSUaB1JDRqLtaOV/vi0IC5lEAGFgrjGv/g=="
},
"whatwg-url": {
"version": "7.0.0",
"resolved": "https://registry.npmjs.org/whatwg-url/-/whatwg-url-7.0.0.tgz",
"integrity": "sha512-37GeVSIJ3kn1JgKyjiYNmSLP1yzbpb29jdmwBSgkD9h40/hyrR/OifpVUndji3tmwGgD8qpw7iQu3RSbCrBpsQ==",
"version": "7.1.0",
"resolved": "https://registry.npmjs.org/whatwg-url/-/whatwg-url-7.1.0.tgz",
"integrity": "sha512-WUu7Rg1DroM7oQvGWfOiAK21n74Gg+T4elXEQYkOhtyLeWiJFoOGLXPKI/9gzIie9CtwVLm8wtw6YJdKyxSjeg==",
"requires": {
"lodash.sortby": "^4.7.0",
"tr46": "^1.0.1",
@@ -1997,9 +2003,9 @@
}
},
"ws": {
"version": "7.1.2",
"resolved": "https://registry.npmjs.org/ws/-/ws-7.1.2.tgz",
"integrity": "sha512-gftXq3XI81cJCgkUiAVixA0raD9IVmXqsylCrjRygw4+UOOGzPoxnQ6r/CnVL9i+mDncJo94tSkyrtuuQVBmrg==",
"version": "7.2.0",
"resolved": "https://registry.npmjs.org/ws/-/ws-7.2.0.tgz",
"integrity": "sha512-+SqNqFbwTm/0DC18KYzIsMTnEWpLwJsiasW/O17la4iDRRIO9uaHbvKiAS3AHgTiuuWerK/brj4O6MYZkei9xg==",
"requires": {
"async-limiter": "^1.0.0"
}
@@ -2007,9 +2013,9 @@
}
},
"@patternfly/react-tokens": {
"version": "2.6.16",
"resolved": "https://registry.npmjs.org/@patternfly/react-tokens/-/react-tokens-2.6.16.tgz",
"integrity": "sha512-dhr4ne4thSmSKBr4anV07KSzUXEs6KpCMDxxNiwrgFdZwNHtyNcaPc+F9pQZ5A0n4qYmMLpCrprb7m5o/83riQ=="
"version": "2.6.31",
"resolved": "https://registry.npmjs.org/@patternfly/react-tokens/-/react-tokens-2.6.31.tgz",
"integrity": "sha512-K9semfLIdf2vECefAbheXPVwZqq8nXY0Hf/VkWh6OBCL6R4FekxajpSBgobeoTQUotmvz5boMngqhkUjE7yChA=="
},
"@types/babel__core": {
"version": "7.1.1",
@@ -5308,9 +5314,9 @@
}
},
"csstype": {
"version": "2.6.6",
"resolved": "https://registry.npmjs.org/csstype/-/csstype-2.6.6.tgz",
"integrity": "sha512-RpFbQGUE74iyPgvr46U9t1xoQBM8T4BL8SxrN66Le2xYAPSaDJJKeztV3awugusb3g3G9iL8StmkBBXhcbbXhg=="
"version": "2.6.7",
"resolved": "https://registry.npmjs.org/csstype/-/csstype-2.6.7.tgz",
"integrity": "sha512-9Mcn9sFbGBAdmimWb2gLVDtFJzeKtDGIr76TUqmjZrw9LFXBMSU70lcs+C0/7fyCd6iBDqmksUcCOUIkisPHsQ=="
},
"currently-unhandled": {
"version": "0.4.1",
@@ -12959,9 +12965,9 @@
"dev": true
},
"popper.js": {
"version": "1.15.0",
"resolved": "https://registry.npmjs.org/popper.js/-/popper.js-1.15.0.tgz",
"integrity": "sha512-w010cY1oCUmI+9KwwlWki+r5jxKfTFDVoadl7MSrIujHU5MJ5OR6HTDj6Xo8aoR/QsA56x8jKjA59qGH4ELtrA=="
"version": "1.16.0",
"resolved": "https://registry.npmjs.org/popper.js/-/popper.js-1.16.0.tgz",
"integrity": "sha512-+G+EkOPoE5S/zChTpmBSSDYmhXJ5PsW8eMhH8cP/CQHMFPBG/kC9Y5IIw6qNYgdJ+/COf0ddY2li28iHaZRSjw=="
},
"portfinder": {
"version": "1.0.20",
@@ -17421,9 +17427,9 @@
"integrity": "sha512-A5CUptxDsvxKJEU3yO6DuWBSJz/qizqzJKOMIfUJHETbBw/sFaDxgd6fxm1ewUaM0jZ444Fc5vC5ROYurg/4Pw=="
},
"xmlchars": {
"version": "2.1.1",
"resolved": "https://registry.npmjs.org/xmlchars/-/xmlchars-2.1.1.tgz",
"integrity": "sha512-7hew1RPJ1iIuje/Y01bGD/mXokXxegAgVS+e+E0wSi2ILHQkYAH1+JXARwTjZSM4Z4Z+c73aKspEcqj+zPPL/w=="
"version": "2.2.0",
"resolved": "https://registry.npmjs.org/xmlchars/-/xmlchars-2.2.0.tgz",
"integrity": "sha512-JZnDKK8B0RCDw84FNdDAIpZK+JuJw+s7Lz8nksI7SIuU3UXJJslUthsi+uWBUYOwPFwW7W7PRLRfUKpxjtjFCw=="
},
"xregexp": {
"version": "4.0.0",

View File

@@ -57,10 +57,10 @@
},
"dependencies": {
"@lingui/react": "^2.7.2",
"@patternfly/patternfly": "^2.27.0",
"@patternfly/react-core": "^3.87.3",
"@patternfly/react-icons": "^3.12.0",
"@patternfly/react-tokens": "^2.6.16",
"@patternfly/patternfly": "^2.40.2",
"@patternfly/react-core": "^3.120.2",
"@patternfly/react-icons": "^3.14.15",
"@patternfly/react-tokens": "^2.6.31",
"ansi-to-html": "^0.6.11",
"axios": "^0.18.1",
"codemirror": "^5.47.0",

View File

@@ -1,5 +1,7 @@
import AdHocCommands from './models/AdHocCommands';
import Config from './models/Config';
import CredentialTypes from './models/CredentialTypes';
import Credentials from './models/Credentials';
import InstanceGroups from './models/InstanceGroups';
import Inventories from './models/Inventories';
import InventorySources from './models/InventorySources';
@@ -23,6 +25,8 @@ import WorkflowJobTemplates from './models/WorkflowJobTemplates';
const AdHocCommandsAPI = new AdHocCommands();
const ConfigAPI = new Config();
const CredentialsAPI = new Credentials();
const CredentialTypesAPI = new CredentialTypes();
const InstanceGroupsAPI = new InstanceGroups();
const InventoriesAPI = new Inventories();
const InventorySourcesAPI = new InventorySources();
@@ -47,6 +51,8 @@ const WorkflowJobTemplatesAPI = new WorkflowJobTemplates();
export {
AdHocCommandsAPI,
ConfigAPI,
CredentialsAPI,
CredentialTypesAPI,
InstanceGroupsAPI,
InventoriesAPI,
InventorySourcesAPI,

View File

@@ -0,0 +1,10 @@
import Base from '../Base';
class CredentialTypes extends Base {
constructor(http) {
super(http);
this.baseUrl = '/api/v2/credential_types/';
}
}
export default CredentialTypes;

View File

@@ -0,0 +1,10 @@
import Base from '../Base';
class Credentials extends Base {
constructor(http) {
super(http);
this.baseUrl = '/api/v2/credentials/';
}
}
export default Credentials;
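
Both new models extend Base, so — assuming Base.read issues a GET against baseUrl with the given query params, as the other models in this directory do — fetching credentials filtered by type looks roughly like this (the 'ssh' lookup is illustrative):

import { CredentialsAPI, CredentialTypesAPI } from '@api';

// List credential types, then request only credentials of one type.
async function loadMachineCredentials() {
  const { data } = await CredentialTypesAPI.read();
  const machineType = data.results.find(type => type.kind === 'ssh');
  // -> GET /api/v2/credentials/?credential_type=<id>
  return CredentialsAPI.read({ credential_type: machineType.id });
}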

View File

@@ -4,6 +4,12 @@ class Inventories extends Base {
constructor(http) {
super(http);
this.baseUrl = '/api/v2/inventories/';
this.readAccessList = this.readAccessList.bind(this);
}
readAccessList(id, params) {
return this.http.get(`${this.baseUrl}${id}/access_list/`, { params });
}
}

View File

@@ -12,6 +12,7 @@ class JobTemplates extends InstanceGroupsMixin(NotificationsMixin(Base)) {
this.associateLabel = this.associateLabel.bind(this);
this.disassociateLabel = this.disassociateLabel.bind(this);
this.readCredentials = this.readCredentials.bind(this);
this.readAccessList = this.readAccessList.bind(this);
this.generateLabel = this.generateLabel.bind(this);
}
@@ -44,6 +45,23 @@ class JobTemplates extends InstanceGroupsMixin(NotificationsMixin(Base)) {
readCredentials(id, params) {
return this.http.get(`${this.baseUrl}${id}/credentials/`, { params });
}
associateCredentials(id, credentialId) {
return this.http.post(`${this.baseUrl}${id}/credentials/`, {
id: credentialId,
});
}
disassociateCredentials(id, credentialId) {
return this.http.post(`${this.baseUrl}${id}/credentials/`, {
id: credentialId,
disassociate: true,
});
}
readAccessList(id, params) {
return this.http.get(`${this.baseUrl}${id}/access_list/`, { params });
}
}
export default JobTemplates;
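
The new helpers follow the API's usual association pattern: POST the related id to the sub-resource, and include disassociate: true to remove the link. A minimal sketch, assuming JobTemplatesAPI is exported from @api like the other model singletons (ids are placeholders):

import { JobTemplatesAPI } from '@api';

async function swapCredential(templateId, oldCredentialId, newCredentialId) {
  // POST { id, disassociate: true } removes an associated credential...
  await JobTemplatesAPI.disassociateCredentials(templateId, oldCredentialId);
  // ...and POST { id } associates a new one.
  await JobTemplatesAPI.associateCredentials(templateId, newCredentialId);
  // readAccessList proxies GET /api/v2/job_templates/:id/access_list/
  return JobTemplatesAPI.readAccessList(templateId, { page_size: 5 });
}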

View File

@@ -1,16 +1,22 @@
import Base from '../Base';
import NotificationsMixin from '../mixins/Notifications.mixin';
import LaunchUpdateMixin from '../mixins/LaunchUpdate.mixin';
class Projects extends LaunchUpdateMixin(Base) {
class Projects extends LaunchUpdateMixin(NotificationsMixin(Base)) {
constructor(http) {
super(http);
this.baseUrl = '/api/v2/projects/';
this.readAccessList = this.readAccessList.bind(this);
this.readPlaybooks = this.readPlaybooks.bind(this);
this.readSync = this.readSync.bind(this);
this.sync = this.sync.bind(this);
}
readAccessList(id, params) {
return this.http.get(`${this.baseUrl}${id}/access_list/`, { params });
}
readPlaybooks(id) {
return this.http.get(`${this.baseUrl}${id}/playbooks/`);
}

View File

@@ -0,0 +1,10 @@
import DataListCell from '@components/DataListCell';
import styled from 'styled-components';
const ActionButtonCell = styled(DataListCell)`
& > :not(:first-child) {
margin-left: 20px;
}
`;
ActionButtonCell.displayName = 'ActionButtonCell';
export default ActionButtonCell;

View File

@@ -0,0 +1,10 @@
import React from 'react';
import { mount } from 'enzyme';
import ActionButtonCell from './ActionButtonCell';
describe('ActionButtonCell', () => {
test('renders the expected content', () => {
const wrapper = mount(<ActionButtonCell />);
expect(wrapper).toHaveLength(1);
});
});

View File

@@ -0,0 +1 @@
export { default } from './ActionButtonCell';

View File

@@ -0,0 +1,92 @@
import React from 'react';
import PropTypes from 'prop-types';
import { Button, Tooltip } from '@patternfly/react-core';
import { CopyIcon } from '@patternfly/react-icons';
import styled from 'styled-components';
const CopyButton = styled(Button)`
padding: 2px 4px;
margin-left: 8px;
border: none;
&:hover {
background-color: #0066cc;
color: white;
}
`;
export const clipboardCopyFunc = (event, text) => {
const clipboard = event.currentTarget.parentElement;
const el = document.createElement('input');
el.value = text;
clipboard.appendChild(el);
el.select();
document.execCommand('copy');
clipboard.removeChild(el);
};
class ClipboardCopyButton extends React.Component {
constructor(props) {
super(props);
this.state = {
copied: false,
};
this.handleCopyClick = this.handleCopyClick.bind(this);
}
handleCopyClick = event => {
const { stringToCopy, switchDelay } = this.props;
if (this.timer) {
window.clearTimeout(this.timer);
this.setState({ copied: false });
}
clipboardCopyFunc(event, stringToCopy);
this.setState({ copied: true }, () => {
this.timer = window.setTimeout(() => {
this.setState({ copied: false });
this.timer = null;
}, switchDelay);
});
};
render() {
const { clickTip, entryDelay, exitDelay, hoverTip } = this.props;
const { copied } = this.state;
return (
<Tooltip
entryDelay={entryDelay}
exitDelay={exitDelay}
trigger="mouseenter focus click"
content={copied ? clickTip : hoverTip}
>
<CopyButton
variant="plain"
onClick={this.handleCopyClick}
aria-label={hoverTip}
>
<CopyIcon />
</CopyButton>
</Tooltip>
);
}
}
ClipboardCopyButton.propTypes = {
clickTip: PropTypes.string.isRequired,
entryDelay: PropTypes.number,
exitDelay: PropTypes.number,
hoverTip: PropTypes.string.isRequired,
stringToCopy: PropTypes.string.isRequired,
switchDelay: PropTypes.number,
};
ClipboardCopyButton.defaultProps = {
entryDelay: 100,
exitDelay: 1600,
switchDelay: 2000,
};
export default ClipboardCopyButton;
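
Typical usage of the new button, based on the prop types above (the import path and strings are placeholders):

import React from 'react';
import ClipboardCopyButton from '@components/ClipboardCopyButton';

// hoverTip shows on hover, clickTip replaces it briefly after a successful
// copy, and switchDelay controls how long the confirmation stays visible.
const TokenCell = ({ token }) => (
  <span>
    {token}
    <ClipboardCopyButton
      stringToCopy={token}
      hoverTip="Copy token"
      clickTip="Copied!"
      switchDelay={2000}
    />
  </span>
);

export default TokenCell;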

View File

@@ -0,0 +1,36 @@
import React from 'react';
import { mountWithContexts } from '@testUtils/enzymeHelpers';
import ClipboardCopyButton from './ClipboardCopyButton';
document.execCommand = jest.fn();
jest.useFakeTimers();
describe('ClipboardCopyButton', () => {
test('renders the expected content', () => {
const wrapper = mountWithContexts(
<ClipboardCopyButton
clickTip="foo"
hoverTip="bar"
stringToCopy="foobar!"
/>
);
expect(wrapper).toHaveLength(1);
});
test('clicking button calls execCommand to copy to clipboard', () => {
const wrapper = mountWithContexts(
<ClipboardCopyButton
clickTip="foo"
hoverTip="bar"
stringToCopy="foobar!"
/>
).find('ClipboardCopyButton');
expect(wrapper.state('copied')).toBe(false);
wrapper.find('Button').simulate('click');
expect(document.execCommand).toBeCalledWith('copy');
expect(wrapper.state('copied')).toBe(true);
jest.runAllTimers();
wrapper.update();
expect(wrapper.state('copied')).toBe(false);
});
});

View File

@@ -0,0 +1 @@
export { default } from './ClipboardCopyButton';

View File

@@ -6,6 +6,9 @@ import { AngleRightIcon } from '@patternfly/react-icons';
import omitProps from '@util/omitProps';
import ExpandingContainer from './ExpandingContainer';
// Make button findable by tests
Button.displayName = 'Button';
const Toggle = styled.div`
display: flex;

View File

@@ -103,9 +103,10 @@ describe('<DataListToolbar />', () => {
let searchDropdownItems = toolbar.find(searchDropdownMenuItems).children();
expect(searchDropdownItems.length).toBe(1);
const mockedSortEvent = { target: { innerText: 'Bar' } };
sortDropdownItems.at(0).simulate('click', mockedSortEvent);
searchDropdownItems.at(0).simulate('click', mockedSortEvent);
toolbar = mountWithContexts(
<DataListToolbar
qsConfig={QS_CONFIG}
sortedColumnKey="foo"
sortOrder="descending"
columns={multipleColumns}
@@ -170,6 +171,7 @@ describe('<DataListToolbar />', () => {
toolbar = mountWithContexts(
<DataListToolbar
qsConfig={QS_CONFIG}
sortedColumnKey="id"
sortOrder="ascending"
columns={numericColumns}
@@ -236,6 +238,7 @@ describe('<DataListToolbar />', () => {
toolbar = mountWithContexts(
<DataListToolbar
qsConfig={QS_CONFIG}
isAllSelected
showExpandCollapse
sortedColumnKey="name"

View File

@@ -15,16 +15,19 @@ import {
ButtonVariant,
InputGroup as PFInputGroup,
Modal,
ToolbarItem,
} from '@patternfly/react-core';
import { withI18n } from '@lingui/react';
import { t } from '@lingui/macro';
import styled from 'styled-components';
import AnsibleSelect from '../AnsibleSelect';
import PaginatedDataList from '../PaginatedDataList';
import VerticalSeperator from '../VerticalSeparator';
import DataListToolbar from '../DataListToolbar';
import CheckboxListItem from '../CheckboxListItem';
import SelectedList from '../SelectedList';
import { ChipGroup, Chip } from '../Chip';
import { ChipGroup, Chip, CredentialChip } from '../Chip';
import { getQSConfig, parseQueryString } from '../../util/qs';
const SearchButton = styled(Button)`
@@ -83,14 +86,20 @@ class Lookup extends React.Component {
}
componentDidUpdate(prevProps) {
const { location } = this.props;
if (location !== prevProps.location) {
const { location, selectedCategory } = this.props;
if (
location !== prevProps.location ||
prevProps.selectedCategory !== selectedCategory
) {
this.getData();
}
}
assertCorrectValueType() {
const { multiple, value } = this.props;
const { multiple, value, selectCategoryOptions } = this.props;
if (selectCategoryOptions) {
return;
}
if (!multiple && Array.isArray(value)) {
throw new Error(
'Lookup value must not be an array unless `multiple` is set'
@@ -123,7 +132,13 @@ class Lookup extends React.Component {
}
toggleSelected(row) {
const { name, onLookupSave, multiple } = this.props;
const {
name,
onLookupSave,
multiple,
onToggleItem,
selectCategoryOptions,
} = this.props;
const {
lookupSelectedItems: updatedSelectedItems,
isModalOpen,
@@ -132,8 +147,10 @@ class Lookup extends React.Component {
const selectedIndex = updatedSelectedItems.findIndex(
selectedRow => selectedRow.id === row.id
);
if (multiple) {
if (selectCategoryOptions) {
onToggleItem(row, isModalOpen);
}
if (selectedIndex > -1) {
updatedSelectedItems.splice(selectedIndex, 1);
this.setState({ lookupSelectedItems: updatedSelectedItems });
@@ -156,7 +173,7 @@ class Lookup extends React.Component {
handleModalToggle() {
const { isModalOpen } = this.state;
const { value, multiple } = this.props;
const { value, multiple, selectCategory } = this.props;
// Resets the selected items from parent state whenever modal is opened
// This handles the case where the user closes/cancels the modal and
// opens it again
@@ -168,6 +185,9 @@ class Lookup extends React.Component {
this.setState({ lookupSelectedItems });
} else {
this.clearQSParams();
if (selectCategory) {
selectCategory(null, 'Machine');
}
}
this.setState(prevState => ({
isModalOpen: !prevState.isModalOpen,
@@ -180,8 +200,9 @@ class Lookup extends React.Component {
const value = multiple
? lookupSelectedItems
: lookupSelectedItems[0] || null;
onLookupSave(value, name);
this.handleModalToggle();
onLookupSave(value, name);
}
clearQSParams() {
@@ -201,6 +222,7 @@ class Lookup extends React.Component {
count,
} = this.state;
const {
form,
id,
lookupHeader,
value,
@@ -208,27 +230,40 @@ class Lookup extends React.Component {
multiple,
name,
onBlur,
selectCategory,
required,
i18n,
selectCategoryOptions,
selectedCategory,
} = this.props;
const header = lookupHeader || i18n._(t`Items`);
const canDelete = !required || (multiple && value.length > 1);
const chips = value ? (
<ChipGroup>
{(multiple ? value : [value]).map(chip => (
<Chip
key={chip.id}
onClick={() => this.toggleSelected(chip)}
isReadOnly={!canDelete}
>
{chip.name}
</Chip>
))}
</ChipGroup>
) : null;
const chips = () => {
return selectCategoryOptions && selectCategoryOptions.length > 0 ? (
<ChipGroup>
{(multiple ? value : [value]).map(chip => (
<CredentialChip
key={chip.id}
onClick={() => this.toggleSelected(chip)}
isReadOnly={!canDelete}
credential={chip}
/>
))}
</ChipGroup>
) : (
<ChipGroup>
{(multiple ? value : [value]).map(chip => (
<Chip
key={chip.id}
onClick={() => this.toggleSelected(chip)}
isReadOnly={!canDelete}
>
{chip.name}
</Chip>
))}
</ChipGroup>
);
};
return (
<Fragment>
<InputGroup onBlur={onBlur}>
@@ -240,7 +275,9 @@ class Lookup extends React.Component {
>
<SearchIcon />
</SearchButton>
<ChipHolder className="pf-c-form-control">{chips}</ChipHolder>
<ChipHolder className="pf-c-form-control">
{value ? chips(value) : null}
</ChipHolder>
</InputGroup>
<Modal
className="awx-c-modal"
@@ -265,6 +302,21 @@ class Lookup extends React.Component {
</Button>,
]}
>
{selectCategoryOptions && selectCategoryOptions.length > 0 && (
<ToolbarItem css=" display: flex; align-items: center;">
<span css="flex: 0 0 25%;">Selected Category</span>
<VerticalSeperator />
<AnsibleSelect
css="flex: 1 1 75%;"
id="multiCredentialsLookUp-select"
label="Selected Category"
data={selectCategoryOptions}
value={selectedCategory.label}
onChange={selectCategory}
form={form}
/>
</ToolbarItem>
)}
<PaginatedDataList
items={results}
itemCount={count}
@@ -277,9 +329,18 @@ class Lookup extends React.Component {
itemId={item.id}
name={multiple ? item.name : name}
label={item.name}
isSelected={lookupSelectedItems.some(i => i.id === item.id)}
isSelected={
selectCategoryOptions
? value.some(i => i.id === item.id)
: lookupSelectedItems.some(i => i.id === item.id)
}
onSelect={() => this.toggleSelected(item)}
isRadio={!multiple}
isRadio={
!multiple ||
(selectCategoryOptions &&
selectCategoryOptions.length &&
selectedCategory.value !== 'Vault')
}
/>
)}
renderToolbar={props => <DataListToolbar {...props} fillWidth />}
@@ -288,10 +349,13 @@ class Lookup extends React.Component {
{lookupSelectedItems.length > 0 && (
<SelectedList
label={i18n._(t`Selected`)}
selected={lookupSelectedItems}
selected={selectCategoryOptions ? value : lookupSelectedItems}
showOverflowAfter={5}
onRemove={this.toggleSelected}
isReadOnly={!canDelete}
isCredentialList={
selectCategoryOptions && selectCategoryOptions.length > 0
}
/>
)}
{error ? <div>error</div> : ''}

View File

@@ -0,0 +1,162 @@
import React from 'react';
import PropTypes from 'prop-types';
import { withI18n } from '@lingui/react';
import { t } from '@lingui/macro';
import { FormGroup, Tooltip } from '@patternfly/react-core';
import { QuestionCircleIcon as PFQuestionCircleIcon } from '@patternfly/react-icons';
import styled from 'styled-components';
import { CredentialsAPI, CredentialTypesAPI } from '@api';
import Lookup from '@components/Lookup';
const QuestionCircleIcon = styled(PFQuestionCircleIcon)`
margin-left: 10px;
`;
class MultiCredentialsLookup extends React.Component {
constructor(props) {
super(props);
this.state = {
selectedCredentialType: { label: 'Machine', id: 1, kind: 'ssh' },
credentialTypes: [],
};
this.loadCredentialTypes = this.loadCredentialTypes.bind(this);
this.handleCredentialTypeSelect = this.handleCredentialTypeSelect.bind(
this
);
this.loadCredentials = this.loadCredentials.bind(this);
this.toggleCredentialSelection = this.toggleCredentialSelection.bind(this);
}
componentDidMount() {
this.loadCredentialTypes();
}
async loadCredentialTypes() {
const { onError } = this.props;
try {
const { data } = await CredentialTypesAPI.read();
const acceptableTypes = ['machine', 'cloud', 'net', 'ssh', 'vault'];
const credentialTypes = [];
data.results.forEach(cred => {
acceptableTypes.forEach(aT => {
if (aT === cred.kind) {
            // Several values are duplicated on purpose because different
            // consumers of this credential type expect different field names.
cred = {
id: cred.id,
key: cred.id,
kind: cred.kind,
type: cred.namespace,
value: cred.name,
label: cred.name,
isDisabled: false,
};
credentialTypes.push(cred);
}
});
});
this.setState({ credentialTypes });
} catch (err) {
onError(err);
}
}
async loadCredentials(params) {
const { selectedCredentialType } = this.state;
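    // Fall back to the Machine credential type (id 1) when no category has been picked yet.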
params.credential_type = selectedCredentialType.id || 1;
return CredentialsAPI.read(params);
}
toggleCredentialSelection(newCredential) {
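    // Toggling an already-selected credential removes it. Otherwise the new
    // credential replaces any selected credential of the same kind; vault
    // credentials are exempt, so several vaults can be selected at once.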
const { onChange, credentials: credentialsToUpdate } = this.props;
let newCredentialsList;
const isSelectedCredentialInState =
credentialsToUpdate.filter(cred => cred.id === newCredential.id).length >
0;
if (isSelectedCredentialInState) {
newCredentialsList = credentialsToUpdate.filter(
cred => cred.id !== newCredential.id
);
} else {
newCredentialsList = credentialsToUpdate.filter(
credential =>
credential.kind === 'vault' || credential.kind !== newCredential.kind
);
newCredentialsList = [...newCredentialsList, newCredential];
}
onChange(newCredentialsList);
}
handleCredentialTypeSelect(value, type) {
const { credentialTypes } = this.state;
const selectedType = credentialTypes.filter(item => item.label === type);
this.setState({ selectedCredentialType: selectedType[0] });
}
render() {
const { selectedCredentialType, credentialTypes } = this.state;
const { tooltip, i18n, credentials } = this.props;
return (
<FormGroup label={i18n._(t`Credentials`)} fieldId="org-credentials">
{tooltip && (
<Tooltip position="right" content={tooltip}>
<QuestionCircleIcon />
</Tooltip>
)}
{credentialTypes && (
<Lookup
selectCategoryOptions={credentialTypes}
selectCategory={this.handleCredentialTypeSelect}
selectedCategory={selectedCredentialType}
onToggleItem={this.toggleCredentialSelection}
onloadCategories={this.loadCredentialTypes}
id="org-credentials"
lookupHeader={i18n._(t`Credentials`)}
name="credentials"
value={credentials}
multiple
onLookupSave={() => {}}
getItems={this.loadCredentials}
qsNamespace="credentials"
columns={[
{
name: i18n._(t`Name`),
key: 'name',
isSortable: true,
isSearchable: true,
},
]}
sortedColumnKey="name"
/>
)}
</FormGroup>
);
}
}
MultiCredentialsLookup.propTypes = {
tooltip: PropTypes.string,
credentials: PropTypes.arrayOf(
PropTypes.shape({
id: PropTypes.number,
name: PropTypes.string,
description: PropTypes.string,
kind: PropTypes.string,
      cloud: PropTypes.bool,
})
),
onChange: PropTypes.func.isRequired,
onError: PropTypes.func.isRequired,
};
MultiCredentialsLookup.defaultProps = {
tooltip: '',
credentials: [],
};
export { MultiCredentialsLookup as _MultiCredentialsLookup };
export default withI18n()(MultiCredentialsLookup);
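
For orientation only: the sketch below is not part of this changeset. A parent form is expected to own the list of selected credentials and feed it back through the credentials prop. The CredentialsField wrapper, its state shape, and the tooltip string are illustrative assumptions; only the prop names come from the propTypes above.

// Hypothetical parent usage of MultiCredentialsLookup (illustrative only).
import React from 'react';
import { MultiCredentialsLookup } from '@components/Lookup';

class CredentialsField extends React.Component {
  constructor(props) {
    super(props);
    this.state = { credentials: [], error: null };
  }

  render() {
    const { credentials } = this.state;
    return (
      <MultiCredentialsLookup
        credentials={credentials}
        onChange={newCredentials => this.setState({ credentials: newCredentials })}
        onError={error => this.setState({ error })}
        tooltip="Select credentials for this job template"
      />
    );
  }
}

export default CredentialsField;

The component itself loads the credential types and enforces the one-credential-per-type rule (vault excepted), so the parent only needs to persist whatever list onChange hands back.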


@@ -0,0 +1,113 @@
import React from 'react';
import { mountWithContexts } from '@testUtils/enzymeHelpers';
import MultiCredentialsLookup from './MultiCredentialsLookup';
import { CredentialsAPI, CredentialTypesAPI } from '@api';
jest.mock('@api');
describe('<MultiCredentialsLookup />', () => {
let wrapper;
let lookup;
let credLookup;
let onChange;
const credentials = [
{ id: 1, kind: 'cloud', name: 'Foo', url: 'www.google.com' },
{ id: 2, kind: 'ssh', name: 'Alex', url: 'www.google.com' },
{ name: 'Gatsby', id: 21, kind: 'vault' },
{ name: 'Gatsby', id: 8, kind: 'Machine' },
];
beforeEach(() => {
CredentialTypesAPI.read.mockResolvedValue({
data: {
results: [
{
id: 400,
kind: 'ssh',
namespace: 'biz',
name: 'Amazon Web Services',
},
{ id: 500, kind: 'vault', namespace: 'buzz', name: 'Vault' },
{ id: 600, kind: 'machine', namespace: 'fuzz', name: 'Machine' },
],
        count: 3,
},
});
CredentialsAPI.read.mockResolvedValueOnce({
data: {
results: [
{ id: 1, kind: 'cloud', name: 'Cred 1', url: 'www.google.com' },
{ id: 2, kind: 'ssh', name: 'Cred 2', url: 'www.google.com' },
{ id: 3, kind: 'Ansible', name: 'Cred 3', url: 'www.google.com' },
{ id: 4, kind: 'Machine', name: 'Cred 4', url: 'www.google.com' },
{ id: 5, kind: 'Machine', name: 'Cred 5', url: 'www.google.com' },
],
        count: 5,
},
});
onChange = jest.fn();
wrapper = mountWithContexts(
<MultiCredentialsLookup
onError={() => {}}
credentials={credentials}
onChange={onChange}
tooltip="This is credentials look up"
/>
);
lookup = wrapper.find('Lookup');
credLookup = wrapper.find('MultiCredentialsLookup');
});
afterEach(() => {
jest.clearAllMocks();
wrapper.unmount();
});
test('MultiCredentialsLookup renders properly', () => {
expect(wrapper.find('MultiCredentialsLookup')).toHaveLength(1);
expect(CredentialTypesAPI.read).toHaveBeenCalled();
});
test('onChange is called when you click to remove a credential from input', async () => {
const chip = wrapper.find('PFChip');
const button = chip.at(1).find('Button');
expect(chip).toHaveLength(4);
button.prop('onClick')();
expect(onChange).toBeCalledWith([
{ id: 1, kind: 'cloud', name: 'Foo', url: 'www.google.com' },
{ id: 21, kind: 'vault', name: 'Gatsby' },
{ id: 8, kind: 'Machine', name: 'Gatsby' },
]);
});
test('can change credential types', () => {
lookup.prop('selectCategory')({}, 'Vault');
expect(credLookup.state('selectedCredentialType')).toEqual({
id: 500,
key: 500,
kind: 'vault',
type: 'buzz',
value: 'Vault',
label: 'Vault',
isDisabled: false,
});
expect(CredentialsAPI.read).toHaveBeenCalled();
});
  test('Toggle credentials only adds 1 credential per credential type except vault (see below)', () => {
lookup.prop('onToggleItem')({ name: 'Party', id: 9, kind: 'Machine' });
expect(onChange).toBeCalledWith([
{ id: 1, kind: 'cloud', name: 'Foo', url: 'www.google.com' },
{ id: 2, kind: 'ssh', name: 'Alex', url: 'www.google.com' },
{ id: 21, kind: 'vault', name: 'Gatsby' },
{ id: 9, kind: 'Machine', name: 'Party' },
]);
});
  test('Toggle credentials allows multiple vault credentials to be selected', () => {
lookup.prop('onToggleItem')({ name: 'Party', id: 22, kind: 'vault' });
expect(onChange).toBeCalledWith([
...credentials,
{ name: 'Party', id: 22, kind: 'vault' },
]);
});
});

@@ -2,3 +2,4 @@ export { default } from './Lookup';
export { default as InstanceGroupsLookup } from './InstanceGroupsLookup';
export { default as InventoryLookup } from './InventoryLookup';
export { default as ProjectLookup } from './ProjectLookup';
export { default as MultiCredentialsLookup } from './MultiCredentialsLookup';

@@ -92,10 +92,9 @@ describe('<MultiSelect />', () => {
/>
);
wrapper
.find('Chip')
.at(1)
.invoke('onClick')();
const chips = wrapper.find('PFChip');
expect(chips).toHaveLength(2);
chips.at(1).invoke('onClick')();
expect(onRemoveItem).toHaveBeenCalledWith(value[1]);
const newVal = onChange.mock.calls[0][0];

@@ -14,7 +14,7 @@ const mockData = [
const qsConfig = {
namespace: 'item',
defaultParams: { page: 1, page_size: 5, order_by: 'name' },
integerFields: [],
integerFields: ['page', 'page_size'],
};
describe('<PaginatedDataList />', () => {

@@ -28,9 +28,13 @@ exports[`<ToolbarDeleteButton /> should render button 1`] = `
"bottom",
]
}
id=""
isAppLauncher={false}
isContentLeftAligned={false}
isVisible={false}
maxWidth="18.75rem"
position="top"
tippyProps={Object {}}
trigger="mouseenter focus"
zIndex={9999}
>
@@ -43,9 +47,12 @@ exports[`<ToolbarDeleteButton /> should render button 1`] = `
content={
<div
className=""
id=""
role="tooltip"
>
<TooltipContent>
<TooltipContent
isLeftAligned={false}
>
Select a row to delete
</TooltipContent>
</div>
@@ -69,6 +76,7 @@ exports[`<ToolbarDeleteButton /> should render button 1`] = `
"bottom",
]
}
isVisible={false}
lazy={true}
maxWidth="18.75rem"
onCreate={[Function]}
@@ -126,50 +134,84 @@ exports[`<ToolbarDeleteButton /> should render button 1`] = `
onClick={[Function]}
variant="plain"
>
<Button
<Component
aria-label="Delete"
className="ToolbarDeleteButton__DeleteButton-sc-1e3r0eg-0 bQjfFG"
isDisabled={true}
onClick={[Function]}
variant="plain"
>
<button
aria-disabled={null}
aria-label="Delete"
className="pf-c-button pf-m-plain pf-m-disabled ToolbarDeleteButton__DeleteButton-sc-1e3r0eg-0 bQjfFG"
disabled={true}
onClick={[Function]}
tabIndex={null}
type="button"
<ComponentWithOuia
component={[Function]}
componentProps={
Object {
"aria-label": "Delete",
"children": <TrashAltIcon
color="currentColor"
noVerticalAlign={false}
size="sm"
title={null}
/>,
"className": "ToolbarDeleteButton__DeleteButton-sc-1e3r0eg-0 bQjfFG",
"isDisabled": true,
"onClick": [Function],
"variant": "plain",
}
}
consumerContext={null}
>
<TrashAltIcon
color="currentColor"
noVerticalAlign={false}
size="sm"
title={null}
>
<svg
aria-hidden={true}
aria-labelledby={null}
fill="currentColor"
height="1em"
role="img"
style={
Object {
"verticalAlign": "-0.125em",
}
<Button
aria-label="Delete"
className="ToolbarDeleteButton__DeleteButton-sc-1e3r0eg-0 bQjfFG"
isDisabled={true}
onClick={[Function]}
ouiaContext={
Object {
"isOuia": false,
"ouiaId": null,
}
viewBox="0 0 448 512"
width="1em"
}
variant="plain"
>
<button
aria-disabled={null}
aria-label="Delete"
className="pf-c-button pf-m-plain ToolbarDeleteButton__DeleteButton-sc-1e3r0eg-0 bQjfFG"
disabled={true}
onClick={[Function]}
tabIndex={null}
type="button"
>
<path
d="M32 464a48 48 0 0 0 48 48h288a48 48 0 0 0 48-48V128H32zm272-256a16 16 0 0 1 32 0v224a16 16 0 0 1-32 0zm-96 0a16 16 0 0 1 32 0v224a16 16 0 0 1-32 0zm-96 0a16 16 0 0 1 32 0v224a16 16 0 0 1-32 0zM432 32H312l-9.4-18.7A24 24 0 0 0 281.1 0H166.8a23.72 23.72 0 0 0-21.4 13.3L136 32H16A16 16 0 0 0 0 48v32a16 16 0 0 0 16 16h416a16 16 0 0 0 16-16V48a16 16 0 0 0-16-16z"
transform=""
/>
</svg>
</TrashAltIcon>
</button>
</Button>
<TrashAltIcon
color="currentColor"
noVerticalAlign={false}
size="sm"
title={null}
>
<svg
aria-hidden={true}
aria-labelledby={null}
fill="currentColor"
height="1em"
role="img"
style={
Object {
"verticalAlign": "-0.125em",
}
}
viewBox="0 0 448 512"
width="1em"
>
<path
d="M32 464a48 48 0 0 0 48 48h288a48 48 0 0 0 48-48V128H32zm272-256a16 16 0 0 1 32 0v224a16 16 0 0 1-32 0zm-96 0a16 16 0 0 1 32 0v224a16 16 0 0 1-32 0zm-96 0a16 16 0 0 1 32 0v224a16 16 0 0 1-32 0zM432 32H312l-9.4-18.7A24 24 0 0 0 281.1 0H166.8a23.72 23.72 0 0 0-21.4 13.3L136 32H16A16 16 0 0 0 0 48v32a16 16 0 0 0 16 16h416a16 16 0 0 0 16-16V48a16 16 0 0 0-16-16z"
transform=""
/>
</svg>
</TrashAltIcon>
</button>
</Button>
</ComponentWithOuia>
</Component>
</StyledComponent>
</ToolbarDeleteButton__DeleteButton>
</div>
@@ -178,6 +220,7 @@ exports[`<ToolbarDeleteButton /> should render button 1`] = `
<div>
<div
class=""
id=""
role="tooltip"
>
<div
@@ -191,9 +234,12 @@ exports[`<ToolbarDeleteButton /> should render button 1`] = `
>
<div
className=""
id=""
role="tooltip"
>
<TooltipContent>
<TooltipContent
isLeftAligned={false}
>
<div
className="pf-c-tooltip__content"
>

Some files were not shown because too many files have changed in this diff.