Compare commits


485 Commits
6.1.0 ... 7.0.0

Author SHA1 Message Date
softwarefactory-project-zuul[bot]
4edfe7e5fc Merge pull request #4658 from ryanpetrello/7.0.0-release
Bump VERSION to 7.0.0

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2019-09-04 17:27:01 +00:00
Ryan Petrello
1fc210d002 Bump VERSION to 7.0.0 2019-09-04 12:48:52 -04:00
softwarefactory-project-zuul[bot]
4bcb941df9 Merge pull request #4655 from ryanpetrello/cli-py2-u-marker
cli: fix a -f human formatting bug in py2

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2019-09-04 16:18:58 +00:00
Ryan Petrello
dbfe85da53 cli: fix a -f human formatting bug in py2
if we encounter non-strings in JSON responses, attempt to represent them
as JSON, instead of stringify-ing them (in py2, stringify-ing adds `u`
markers, which is confusing to users)
2019-09-04 11:10:12 -04:00
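The fix described in the commit message above can be sketched roughly as follows. This is a hypothetical helper, not the actual AWX CLI code: instead of calling `str()` on non-string values — which in py2 produces reprs containing `u''` markers — non-strings are serialized as JSON.

```python
import json

def format_cell(value):
    # Hypothetical sketch of the fix: render non-string values as JSON
    # rather than str()-ing them (py2's str/repr adds u'' markers).
    if isinstance(value, str):
        return value
    return json.dumps(value)

# A list renders as JSON, not as a Python repr with u'' markers:
print(format_cell(["tag1", "tag2"]))  # → ["tag1", "tag2"]
```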
softwarefactory-project-zuul[bot]
66907151a0 Merge pull request #4651 from ryanpetrello/py2-argparse-alias
cli: add support for deprecated tower-cli aliases in py2

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2019-09-04 14:24:15 +00:00
softwarefactory-project-zuul[bot]
8a1a3918f1 Merge pull request #4641 from mabashian/upgrade-react
Ensure react 16.8 or greater is used

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2019-09-04 14:18:19 +00:00
softwarefactory-project-zuul[bot]
7418453d6f Merge pull request #4652 from ryanpetrello/cli-file-load-bug
cli: fix a bug introduced in @ file support

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2019-09-04 02:30:10 +00:00
Ryan Petrello
70989ca616 cli: fix a bug introduced in @ file support 2019-09-03 21:33:46 -04:00
Ryan Petrello
b888c4b75a cli: add support for deprecated tower-cli aliases in py2 2019-09-03 21:22:49 -04:00
softwarefactory-project-zuul[bot]
ba8b876dd3 Merge pull request #4640 from ryanpetrello/cli-lookup-by-name
cli: add ability to specify a name instead of primary key

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2019-09-03 22:09:10 +00:00
Ryan Petrello
4ec5e82023 cli: add ability to specify a name instead of primary key 2019-09-03 17:27:10 -04:00
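The idea behind this change can be illustrated with a small sketch (a hypothetical resolver, not AWX's actual implementation): an identifier that looks like an integer is treated as a primary key, and anything else triggers a lookup by name.

```python
def resolve_id(value, lookup_by_name):
    # Hypothetical sketch: accept either a primary key or a unique name.
    # `lookup_by_name` is an assumed callable mapping a name to its pk.
    if value.isdigit():
        return int(value)
    return lookup_by_name(value)

inventories = {"Demo Inventory": 1}
print(resolve_id("42", inventories.get))              # → 42
print(resolve_id("Demo Inventory", inventories.get))  # → 1
```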
softwarefactory-project-zuul[bot]
392fc803fa Merge pull request #4646 from rooftopcellist/rm_mk_dbshell
remove redundant dbshell make target

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2019-09-03 19:43:58 +00:00
softwarefactory-project-zuul[bot]
b62c0c57cb Merge pull request #4644 from ryanpetrello/sos-license
include license data/state in the sosreport

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2019-09-03 19:41:21 +00:00
Christian Adams
70f9f09fef remove redundant dbshell make target 2019-09-03 14:22:07 -04:00
Ryan Petrello
7a8234bb09 include license data/state in the sosreport 2019-09-03 13:56:02 -04:00
mabashian
120190eb82 Ensure react 16.8 or greater is used 2019-09-03 12:24:13 -04:00
softwarefactory-project-zuul[bot]
f21c6dc330 Merge pull request #4624 from ryanpetrello/cli-association
cli: implement support for credential and notification association

Reviewed-by: Elijah DeLee <kdelee@redhat.com>
             https://github.com/kdelee
2019-09-03 14:53:44 +00:00
softwarefactory-project-zuul[bot]
45f9457abe Merge pull request #4626 from ryanpetrello/more-cli-doc-examples
cli: add support for loading JSON/YAML w/ the ansible-like @ syntax

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2019-09-03 13:09:29 +00:00
softwarefactory-project-zuul[bot]
09c105e125 Merge pull request #4631 from wenottingham/stop-me-before-i-touch-javascript-again
Fix fetching of result traceback in job details.

Reviewed-by: Jake McDermott <yo@jakemcdermott.me>
             https://github.com/jakemcdermott
2019-08-30 22:01:38 +00:00
Jake McDermott
a7db4cf367 set result traceback state on sync and send it to subscribers 2019-08-30 17:05:33 -04:00
Bill Nottingham
a0671bd36a Fix fetching of result traceback in job details.
Add it to the list of things to subscribe to and fetch at the end.
2019-08-30 17:05:33 -04:00
softwarefactory-project-zuul[bot]
fd32eff281 Merge pull request #4632 from ryanpetrello/adhoc-event-log-agg
send adhoc command events to the external job_event logger

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2019-08-30 19:47:12 +00:00
Ryan Petrello
8d251c2f2e send adhoc command events to the external job_event logger
see: https://github.com/ansible/awx/issues/4545
2019-08-30 15:08:25 -04:00
Ryan Petrello
8e58a4a7de cli: add support for loading JSON/YAML w/ the ansible-like @ syntax 2019-08-30 00:51:46 -04:00
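The ansible-like `@` syntax referenced above means a value beginning with `@` is read from a file rather than used literally. A minimal sketch of the idea (hypothetical code; the real CLI also parses the file contents as JSON or YAML):

```python
import json
import tempfile

def load_value(raw):
    # Hypothetical sketch of ansible-style '@' handling: a leading '@'
    # means "read the value from this file path".
    if raw.startswith('@'):
        with open(raw[1:]) as f:
            return f.read()
    return raw

# Usage: write some JSON to a file, then reference it with '@<path>'.
with tempfile.NamedTemporaryFile('w', suffix='.json', delete=False) as f:
    json.dump({"limit": "localhost"}, f)
print(load_value('@' + f.name))        # → {"limit": "localhost"}
print(load_value('{"inline": true}'))  # no leading '@': passed through unchanged
```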
softwarefactory-project-zuul[bot]
8dc1737419 Merge pull request #4609 from mabashian/4429-deps
Update Patternfly and Axios deps

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2019-08-29 22:30:05 +00:00
softwarefactory-project-zuul[bot]
3c82785eb3 Merge pull request #4625 from beeankha/approval_timeout_websocket
Update Approval Node Count in the Event of a Timeout

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2019-08-29 21:15:06 +00:00
mabashian
7ae9e13321 A better fix for the tab bottom border 2019-08-29 16:19:54 -04:00
beeankha
2fc7e93c6a Emit websocket for approval node timeout
...and update timeout_message to be more translation-friendly.
2019-08-29 14:30:33 -04:00
Ryan Petrello
88dfcaa439 cli: implement support for credential and notification association 2019-08-29 13:11:02 -04:00
mabashian
7c81ec0df5 Linting cleanup. Also fixed error thrown to console around passing Link to the DropdownItem component. 2019-08-29 11:18:34 -04:00
softwarefactory-project-zuul[bot]
9571801e9f Merge pull request #4347 from AlanCoding/no_read_role
Kill off all model can_read access methods

Reviewed-by: Jake McDermott <yo@jakemcdermott.me>
             https://github.com/jakemcdermott
2019-08-29 14:38:44 +00:00
softwarefactory-project-zuul[bot]
33f2b0bf1a Merge pull request #4610 from ryanpetrello/cli-cleanup-injectors-language
fix a few minor CLI bugs

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2019-08-29 14:15:48 +00:00
softwarefactory-project-zuul[bot]
226d507013 Merge pull request #4619 from rooftopcellist/cloud_rh_settings
Add Settings for license & automation analytics creds

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2019-08-29 14:05:45 +00:00
Christian Adams
eacd356881 fix wording in settings for license & automation analytics creds 2019-08-29 09:32:50 -04:00
Ryan Petrello
a107a17bc9 fix a few minor CLI bugs
see: https://github.com/ansible/awx/issues/4608
2019-08-29 08:54:17 -04:00
softwarefactory-project-zuul[bot]
2918b6c927 Merge pull request #4264 from beeankha/workflow_pause_approve
Workflow Approval Nodes

Reviewed-by: Ryan Petrello
             https://github.com/ryanpetrello
2019-08-28 22:25:39 +00:00
John Mitchell
56c6944049 add ui fields to configure tower in tower for automation analytics fields 2019-08-28 13:54:15 -04:00
Christian Adams
1a78c16adf add settings for license & automation analytics creds 2019-08-28 12:39:28 -04:00
Bianca Henderson
97d9c264f9 Merge pull request #14 from mabashian/workflow_pause_approve_cleanup_5
Styles cleanup
2019-08-28 10:14:43 -04:00
mabashian
f229418ae2 Styles cleanup 2019-08-28 10:00:41 -04:00
beeankha
073f6dbf07 Fix flake8 error 2019-08-28 09:33:15 -04:00
softwarefactory-project-zuul[bot]
3d4cd1b575 Merge pull request #4511 from AlexSCorey/multiSelecBug
Allows user to hit enter to create label, fixes console errors.

Reviewed-by: Jake McDermott <yo@jakemcdermott.me>
             https://github.com/jakemcdermott
2019-08-27 22:43:16 +00:00
Jake McDermott
04f7218b4a also make labels work for add view 2019-08-27 17:41:54 -04:00
Alex Corey
fbe6abfb53 Allows user to hit enter to create label, fixes console errors. 2019-08-27 17:37:23 -04:00
softwarefactory-project-zuul[bot]
8be46e43b4 Merge pull request #4600 from ryanpetrello/cli-json-inputs
cli: improve parsing of JSON inputs

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2019-08-27 20:33:51 +00:00
mabashian
5f1f4bd109 Update Patternfly and Axios deps 2019-08-27 16:21:24 -04:00
Ryan Petrello
23f75cf74a fix a bug introduced in rebase 2019-08-27 15:59:16 -04:00
Ryan Petrello
b9f75ecad7 update migration numbering for WF approval 2019-08-27 15:42:49 -04:00
beeankha
2ac1c3d1e1 Update timeout info on AWX docs. 2019-08-27 15:38:19 -04:00
Ryan Petrello
1eeab7e0d5 add approval timeout to the summary fields for WorkflowJobTemplateNodes 2019-08-27 15:38:18 -04:00
beeankha
459012e879 Fix 500 error on workflow_approvals endpoint 2019-08-27 15:38:17 -04:00
beeankha
2e58a47118 Minor change to fix rebase conflict. 2019-08-27 15:38:16 -04:00
beeankha
b2819793df Set view's permission classes to be more explicit 2019-08-27 15:38:16 -04:00
beeankha
ea509f518e Addressing comments, updating tests, etc. 2019-08-27 15:38:15 -04:00
mabashian
9f0307404e Fix loading pending approval count on login 2019-08-27 15:38:14 -04:00
beeankha
703de8f3c0 Edit minor typo 2019-08-27 15:38:14 -04:00
beeankha
b5c0f58137 Add test for approve node denial 2019-08-27 15:38:13 -04:00
beeankha
8b23ff71b4 Update/add more functional tests 2019-08-27 15:38:12 -04:00
beeankha
582bbda9c4 Fix bug in Activity Stream, add tests. 2019-08-27 15:38:11 -04:00
mabashian
3fa9497e3c Various bug fixes and minor ux enhancements 2019-08-27 15:38:10 -04:00
mabashian
5fc3b2c3f5 Add timed out text to workflow job node. Change timeout to minutes and seconds.
Remove workflow template badge in approvals drawer.
2019-08-27 15:38:09 -04:00
beeankha
9bbc14c5a1 Update AWX docs to include info about wf approvals 2019-08-27 15:38:09 -04:00
beeankha
aab04bcbb1 Fix accidental deletions, update docstrings...
... and update migration file for rebase.
2019-08-27 15:38:08 -04:00
beeankha
667fce5012 Fix flake8 errors, update doc strings, ...
... and return full object details when doing a POST to create new approval nodes.
2019-08-27 15:37:22 -04:00
Ryan Petrello
dd89e46ee6 change up a few activity stream and approval drawer issues 2019-08-27 15:36:32 -04:00
mabashian
aac8c9fb04 Rename workflow approval migration. Add approval option back to workflow node form. 2019-08-27 15:36:32 -04:00
beeankha
cf436eea37 Update RBAC for adding approval nodes 2019-08-27 15:36:31 -04:00
beeankha
f7d6f4538c Emit approve/deny status for websockets, update doc string + a comment 2019-08-27 15:36:30 -04:00
Ryan Petrello
761dad060c allow org/WF admins to create approval templates 2019-08-27 15:36:30 -04:00
mabashian
73485b220e fix jshint errors 2019-08-27 15:36:29 -04:00
Elijah DeLee
bdf4defdbe Add approval node logic to awxkit
Co-authored-by: Apurva <bakshiapurva93@gmail.com>
2019-08-27 15:36:29 -04:00
mabashian
adf621d2cf Timeout, socket and activity stream changes for workflow pause approve 2019-08-27 15:36:28 -04:00
beeankha
9186cb23a6 Update summary field for activity stream 2019-08-27 15:36:27 -04:00
beeankha
f6f6e5883a Update websockets for pending approvals, change timeout expiration to 2019-08-27 15:36:27 -04:00
Ryan Petrello
7814592285 when copying workflows w/ pause nodes, copy the WorkflowApprovalTemplate 2019-08-27 15:36:26 -04:00
Ryan Petrello
4a75edf549 fix a few nits w/ workflow approval activity stream records 2019-08-27 15:36:25 -04:00
beeankha
d9f3fed06f Update UJ/UJT endpoints, update approval RBAC, update approval timeout 2019-08-27 15:36:25 -04:00
beeankha
544a5063f3 Update timeout implementation, placeholder code for possible websocket support 2019-08-27 15:36:24 -04:00
beeankha
8c17990750 Activity stream and timeout
Update activity stream to show approval node info, add meaningful log
message for expired approval nodes in the Task Manager timeout
function.
2019-08-27 15:36:24 -04:00
Ryan Petrello
0522d45ab0 fixed a few issues related to approval role RBAC for normal users 2019-08-27 15:36:23 -04:00
beeankha
28289e85c1 Add timeout for workflow approval nodes 2019-08-27 15:36:22 -04:00
beeankha
5f82754a3f Clean up RBAC code 2019-08-27 15:36:22 -04:00
beeankha
296b4e830b Add more RBAC for approval nodes 2019-08-27 15:36:21 -04:00
mabashian
630f428d77 Cleanup a few jshint errors 2019-08-27 15:36:20 -04:00
mabashian
013792f0f8 Prompt bug cleanup. Filter workflow_approval jobs out of jobs list. Add initial support for timeout. 2019-08-27 15:36:20 -04:00
beeankha
3357c96774 Enable deletion of orphaned approval nodes
Update serializer to include workflow approval for activity stream
2019-08-27 15:36:19 -04:00
beeankha
64c94d478d Add more RBAC, filter out AJT/AJs from unified jobs lists
Comment out placeholder in serializer
2019-08-27 15:36:17 -04:00
beeankha
453e142635 Fix UJT-related error, add notification placeholders 2019-08-27 15:35:43 -04:00
beeankha
24c5404c25 Fix error related to workflow_approval_templates/N endpoint 2019-08-27 15:30:50 -04:00
mabashian
4a801c60b9 Cleanup and changes to the way approval templates are created 2019-08-27 15:30:49 -04:00
beeankha
294d6551b9 Polishing up work on new endpoint 2019-08-27 15:30:48 -04:00
beeankha
320284267c Add new endpoint for creation of approval nodes 2019-08-27 15:30:47 -04:00
mabashian
83f9681941 Fix jshint errors 2019-08-27 15:30:47 -04:00
mabashian
e0cdc4ff80 Approval drawer cleanup and workflow node form UX cleanup 2019-08-27 15:30:46 -04:00
mabashian
1d814beca1 Fix linting error 2019-08-27 15:30:45 -04:00
mabashian
0720857022 Add initial support for workflow pause approve 2019-08-27 15:30:44 -04:00
beeankha
82e0b2121b Add approve/deny endpoints, fix some typos 2019-08-27 15:30:43 -04:00
beeankha
d76e9125e8 Clean up redundancies 2019-08-27 15:30:42 -04:00
beeankha
9024a514a6 Add API endpoints for workflow approvals 2019-08-27 15:30:39 -04:00
beeankha
72a65f74fd Add migration file 2019-08-27 15:29:34 -04:00
beeankha
b88b1111bd Add workflow pause/approve node 2019-08-27 15:29:30 -04:00
Ryan Petrello
f22adca6f7 improve parsing of JSON inputs
see: https://github.com/ansible/awx/issues/4573
see: https://github.com/ansible/awx/issues/2371
2019-08-27 12:47:27 -04:00
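The linked issues concern how the CLI interprets values typed on the command line. A hedged sketch of the general approach (not the actual AWX code): attempt to decode the argument as JSON, and fall back to treating it as a plain string when decoding fails.

```python
import json

def coerce_input(raw):
    # Hypothetical sketch: interpret a CLI argument as JSON if possible,
    # otherwise keep it as an ordinary string.
    try:
        return json.loads(raw)
    except ValueError:
        return raw

print(coerce_input('{"foo": "bar"}'))  # → {'foo': 'bar'}
print(coerce_input('hello'))           # → hello
```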
softwarefactory-project-zuul[bot]
534c4e776a Merge pull request #4291 from jladdjr/templated_messages
Templated notifications

Reviewed-by: Jim Ladd
             https://github.com/jladdjr
2019-08-27 16:29:21 +00:00
softwarefactory-project-zuul[bot]
187360a8ad Merge pull request #4605 from ryanpetrello/cli-inventory-alias
cli: add an alias for `awx inventories`

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2019-08-27 15:40:18 +00:00
softwarefactory-project-zuul[bot]
8ae93848db Merge pull request #4564 from rooftopcellist/manifest_destiny
add collection version tracker & query info

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2019-08-27 15:09:17 +00:00
Ryan Petrello
036a04c918 cli: add an alias for awx inventories 2019-08-27 10:38:28 -04:00
softwarefactory-project-zuul[bot]
92b9176455 Merge pull request #4517 from jakemcdermott/fix-akit-shell-init
fix akit shell init when no credential config is provided

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2019-08-27 14:36:34 +00:00
softwarefactory-project-zuul[bot]
1a01fdb02d Merge pull request #4604 from ryanpetrello/host-not-hosts
cli: fix an awx CLI alias typo

Reviewed-by: Christian Adams <rooftopcellist@gmail.com>
             https://github.com/rooftopcellist
2019-08-27 14:36:30 +00:00
Christian Adams
78c0d531bc Adds versions to analytics collectors and manifest file.
- adds 'query_info.json' to contain collection metadata
- adds 'manifest.json' to contain collection file version info
2019-08-27 10:14:14 -04:00
Ryan Petrello
5bd61823ab cli: fix an awx CLI alias typo
see: https://github.com/ansible/awx/issues/4603
2019-08-27 09:24:04 -04:00
softwarefactory-project-zuul[bot]
9e849ad3e6 Merge pull request #4596 from ryanpetrello/fix-cli-required-args
cli: fix a few bugs related to required OPTIONS

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2019-08-26 21:47:46 +00:00
softwarefactory-project-zuul[bot]
073c4322a3 Merge pull request #4582 from marshmalien/4233-playbook-field
Add playbook select and project field validation

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2019-08-26 21:20:13 +00:00
softwarefactory-project-zuul[bot]
38a5355574 Merge pull request #4591 from ryanpetrello/cli-command-help
cli: make --help work properly for custom commands

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2019-08-26 20:11:43 +00:00
softwarefactory-project-zuul[bot]
0bebc0febc Merge pull request #4594 from ryanpetrello/fix4565
cli: print a newline after HTTP JSON errors

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2019-08-26 20:05:07 +00:00
softwarefactory-project-zuul[bot]
c74f826e29 Merge pull request #4472 from rooftopcellist/collection_org_job_info
Send job & org data

Reviewed-by: Christian Adams <rooftopcellist@gmail.com>
             https://github.com/rooftopcellist
2019-08-26 19:44:08 +00:00
softwarefactory-project-zuul[bot]
5f02906b28 Merge pull request #4590 from ryanpetrello/workflow-job-creation
prevent POST on /api/v2/workflow_jobs/

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2019-08-26 19:44:04 +00:00
softwarefactory-project-zuul[bot]
0d8abba613 Merge pull request #4597 from rooftopcellist/fix_encryption_typo
fix typo in comment about encryption

Reviewed-by: awxbot
             https://github.com/awxbot
2019-08-26 19:37:22 +00:00
Ryan Petrello
ea36be3a0e cli: fix a few bugs related to required OPTIONS
see: https://github.com/ansible/awx/issues/4581
see: https://github.com/ansible/awx/issues/4583
see: https://github.com/ansible/awx/issues/4560
2019-08-26 15:25:28 -04:00
softwarefactory-project-zuul[bot]
7dd6306221 Merge pull request #4593 from ryanpetrello/fix-4567
cli: fix a bug when printing complex data structures w/ -f human

Reviewed-by: Elijah DeLee <kdelee@redhat.com>
             https://github.com/kdelee
2019-08-26 19:10:56 +00:00
softwarefactory-project-zuul[bot]
fbf19de993 Merge pull request #4592 from ryanpetrello/fix-4563
cli: remove --id flag from awx <resource> list

Reviewed-by: Elijah DeLee <kdelee@redhat.com>
             https://github.com/kdelee
2019-08-26 19:10:52 +00:00
Marliana Lara
b77160f575 Fix broken tests due to JobTemplateForm changes 2019-08-26 15:01:06 -04:00
Marliana Lara
156d03fa45 Add playbook select and project field validation 2019-08-26 15:00:59 -04:00
Ryan Petrello
6999d779a8 make --help work properly for custom commands
see: https://github.com/ansible/awx/issues/4559
2019-08-26 15:00:16 -04:00
Christian Adams
cf464c7cb1 fix typo in comment about encryption 2019-08-26 14:20:39 -04:00
Ryan Petrello
ce6905d54a cli: print a newline after HTTP JSON errors
see: https://github.com/ansible/awx/issues/4565
2019-08-26 12:44:00 -04:00
Ryan Petrello
1d2edc1d81 cli: fix a bug when printing complex data structures w/ -f human
see: https://github.com/ansible/awx/issues/4567
2019-08-26 12:41:35 -04:00
Ryan Petrello
f9230d9879 cli: remove --id flag from awx <resource> list
see: https://github.com/ansible/awx/issues/4563
2019-08-26 12:25:18 -04:00
Ryan Petrello
a89324defa prevent POST on /api/v2/workflow_jobs/ 2019-08-26 11:47:19 -04:00
softwarefactory-project-zuul[bot]
e19035079e Merge pull request #4557 from jlmitch5/fixPopoverClick
fix regression where clicking inside popover closed it

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2019-08-26 15:26:39 +00:00
Christian Adams
cd3645eb4d Send job & org data 2019-08-26 10:22:07 -04:00
softwarefactory-project-zuul[bot]
9baa9eee96 Merge pull request #4585 from ryanpetrello/system-job-creation
prevent POST on /api/v2/system_jobs/

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2019-08-26 13:50:45 +00:00
Keith Grant
901d41e261 show error for disallowed new lines in code mirror 2019-08-25 23:11:25 -07:00
Jim Ladd
a10ad58c75 Use custom webhook bodies as is (instead of as a sub-field in webhook) 2019-08-25 23:11:25 -07:00
Jim Ladd
774a310e10 Don't collect job_host_summaries if job is running 2019-08-25 23:11:25 -07:00
Jim Ladd
c8805cc55b No need to merge old/new notification messages if messages field is null 2019-08-25 23:11:25 -07:00
Jim Ladd
24a383c7c1 Set default messages (for each message type) to null 2019-08-25 23:11:25 -07:00
Jim Ladd
487276613f Fix issue where only one NT attached to UJT would be used to send notifications 2019-08-25 23:11:25 -07:00
Keith Grant
7a6e62c022 update e2e tests for disabled toggle switches 2019-08-25 23:11:24 -07:00
Jake McDermott
d068fef767 handle message validation errors 2019-08-25 23:11:24 -07:00
Jim Ladd
2b792573f8 set messages default 2019-08-25 23:11:24 -07:00
Jim Ladd
ec20081d74 bump migration 2019-08-25 23:11:24 -07:00
Jim Ladd
8158632344 render notification templates 2019-08-25 23:11:24 -07:00
Jim Ladd
1a1eab4dab create jinja context based on job serialization 2019-08-25 23:11:24 -07:00
Jim Ladd
13b9679496 save/validate messages 2019-08-25 23:11:24 -07:00
Jim Ladd
3bb0aa4eec serialize notification body 2019-08-25 23:11:23 -07:00
Jim Ladd
24c3903c30 add debug info for failed slack notification 2019-08-24 20:37:59 -07:00
Jim Ladd
7bf250ecfa show default messages in options 2019-08-24 20:37:59 -07:00
Jim Ladd
0ddc32a6dc sort notification_type 2019-08-24 20:37:58 -07:00
Jim Ladd
8ca79e3579 job notification data omits new host summary fields 2019-08-24 20:37:58 -07:00
Jim Ladd
ccdbd0510f Add support for grafana, rocketchat in awxkit 2019-08-24 20:37:58 -07:00
Jim Ladd
616db6bc51 Add support for messages field in awxkit 2019-08-24 20:37:58 -07:00
Jim Ladd
cb411cc3be Add messages field 2019-08-24 20:37:35 -07:00
Jim Ladd
efbaf46179 Docs update for notification templates 2019-08-23 17:43:20 -07:00
Keith Grant
5468624df5 fix ui lint errors 2019-08-23 17:43:20 -07:00
Keith Grant
15e6117472 fix webhook method default value 2019-08-23 17:43:20 -07:00
Keith Grant
62f31d6b3f fix console error on hidden syntax-highlight directive 2019-08-23 17:43:20 -07:00
Keith Grant
965dc79a0a update notifications UI for new default messages structure 2019-08-23 17:43:20 -07:00
Keith Grant
150de6a70b update notification messages for webhook/pagerduty 2019-08-23 17:43:20 -07:00
Keith Grant
56f04e0153 change custom notification message from checkbox to toggle 2019-08-23 17:43:20 -07:00
Keith Grant
1470fa61d5 open docs link in new tab 2019-08-23 17:43:20 -07:00
Keith Grant
1c79d21416 add custom notification message help text 2019-08-23 17:43:20 -07:00
Keith Grant
3c4862acfe preserve default notification messages for users with read-only access 2019-08-23 17:43:20 -07:00
Keith Grant
37b44fe77d fix template view for auditor/limited permissions 2019-08-23 17:43:20 -07:00
Keith Grant
191d18cec0 fix ui lint errors 2019-08-23 17:43:20 -07:00
Keith Grant
885c5050a0 re-init message templates on notification type change 2019-08-23 17:43:20 -07:00
Jim Ladd
03ebe44802 In UI, rename start to started 2019-08-23 17:43:20 -07:00
Keith Grant
0398ce0530 get default template messages from OPTIONS 2019-08-23 17:43:20 -07:00
Keith Grant
a56a6d7158 wire in custom template messages on edit form 2019-08-23 17:43:20 -07:00
Keith Grant
b80ca62072 add messages to Add Notification form payload 2019-08-23 17:43:20 -07:00
Keith Grant
fc4c9af86f fix empty template message after expanding 2019-08-23 17:43:20 -07:00
Keith Grant
0f19d98d84 set heights on syntax highlight inputs 2019-08-23 17:43:20 -07:00
Keith Grant
7b828d73be fix ids to support multiple syntax-highlights at once 2019-08-23 17:43:20 -07:00
Keith Grant
8a04cf0cb4 add syntax-highlight directive 2019-08-23 17:43:20 -07:00
Keith Grant
adf25c61a2 add custom notification message input fields 2019-08-23 17:43:20 -07:00
softwarefactory-project-zuul[bot]
2eaf62a62d Merge pull request #4558 from mabashian/4228-jobs-delete
Hook up delete on jobs list

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2019-08-23 21:14:55 +00:00
softwarefactory-project-zuul[bot]
4516e6400e Merge pull request #4525 from mabashian/4293-vars
Fixes issues with extra var prompting in workflow nodes

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2019-08-23 21:09:21 +00:00
softwarefactory-project-zuul[bot]
72df2ca3a3 Merge pull request #4572 from mabashian/4474-datalist-width
Reverts data list toolbar back to expected width on normal lists

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2019-08-23 20:08:20 +00:00
Ryan Petrello
a949cc33f1 prevent POST on /api/v2/system_jobs/
SystemJobs should only be created by launching a SystemJobTemplate
2019-08-23 15:10:26 -04:00
softwarefactory-project-zuul[bot]
6e6676adb3 Merge pull request #4578 from ryanpetrello/awx-cli-install
fix install instructions for the AWX CLI

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2019-08-23 17:02:09 +00:00
softwarefactory-project-zuul[bot]
49b840a996 Merge pull request #4577 from ansible/remove_job_status
Removing job_status from the docs because it doesn't exist.

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2019-08-23 16:35:34 +00:00
Ryan Petrello
150b3e6f6d fix install instructions for the AWX CLI 2019-08-23 11:34:55 -04:00
softwarefactory-project-zuul[bot]
90af9a9e33 Merge pull request #4576 from ryanpetrello/ssl-insecure-cli
suppress urllib3 insecure warnings in the CLI

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2019-08-23 15:22:22 +00:00
Rebeccah Hunter
be0c36540e Removing job_status from the docs because it doesn't exist. 2019-08-23 10:53:23 -04:00
Ryan Petrello
70ce074f5a suppress urllib3 insecure warnings in the CLI 2019-08-23 10:11:59 -04:00
softwarefactory-project-zuul[bot]
39a96a620e Merge pull request #4562 from ryanpetrello/fix-required-cli-args
fix a formatting bug re: required arguments in the CLI

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2019-08-22 22:03:44 +00:00
softwarefactory-project-zuul[bot]
e13274c73f Merge pull request #4571 from ryanpetrello/v2-test-cleanup
clean up old v2 versioning in API tests

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2019-08-22 21:57:44 +00:00
Ryan Petrello
2e8be41111 fix a formatting bug re: required arguments in the CLI 2019-08-22 17:05:53 -04:00
mabashian
3079b54d31 Reverts data list toolbar back to 50/50 width on normal lists but maintains full width on lookups. 2019-08-22 15:34:30 -04:00
Ryan Petrello
4e6b0e1580 clean up old v2 versioning in API tests 2019-08-22 15:14:06 -04:00
Jake McDermott
94d6fcbe39 set default credentials when cred file not provided 2019-08-22 14:47:23 -04:00
John Mitchell
36229d92ee remove inadvertent debugger 2019-08-22 12:55:54 -04:00
mabashian
5549dac17d Hook up delete on jobs list. Add more comprehensive error handling on delete in organization and template lists. 2019-08-22 11:22:46 -04:00
John Mitchell
605c5784c8 fix regression where clicking inside popover closed it 2019-08-22 11:16:07 -04:00
softwarefactory-project-zuul[bot]
92bc608af3 Merge pull request #4535 from AlanCoding/null_ip
Allow gce host and public IP hostvars to be null

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2019-08-22 15:07:12 +00:00
softwarefactory-project-zuul[bot]
8566c30557 Merge pull request #4537 from jbradberry/fix-project-test
Fix asserts in test_project.py to use the id directly off of the job template

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2019-08-22 14:52:35 +00:00
softwarefactory-project-zuul[bot]
045578ce22 Merge pull request #4551 from rebeccahhh/devel
Remove extra warning when using garbage credentials

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2019-08-22 14:18:30 +00:00
Rebeccah Hunter
fb71b2699f removed tabbing 2019-08-22 09:41:32 -04:00
Rebeccah Hunter
af6e035c3b removed tabbing 2019-08-22 09:39:59 -04:00
softwarefactory-project-zuul[bot]
4a45a7e9c3 Merge pull request #4548 from ryanpetrello/more-cli-tweaks
more CLI tweaks

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2019-08-21 21:56:00 +00:00
Rebeccah Hunter
017274e2aa Removed extraneous warning when using garbage credentials for ssh_key_data
Added logic to check for an existing error before validating the ssh_key_unlock form field entry; also added a test to ensure that garbage data triggers the error message only for the incorrect ssh_key_data, not for the incorrect ssh_key_unlock as well.
2019-08-21 17:01:51 -04:00
Ryan Petrello
44ff141c23 replace the (optional) tabulate dependency w/ a simple table printer 2019-08-21 15:54:47 -04:00
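Replacing the optional `tabulate` dependency with a simple table printer might look something like this minimal sketch (hypothetical; the actual implementation differs): compute per-column widths and left-pad each cell.

```python
def print_table(headers, rows):
    # Hypothetical minimal fixed-width table printer, in the spirit of
    # dropping the optional tabulate dependency.
    table = [headers] + [[str(c) for c in row] for row in rows]
    widths = [max(len(row[i]) for row in table) for i in range(len(headers))]
    lines = []
    for row in table:
        lines.append('  '.join(cell.ljust(w) for cell, w in zip(row, widths)).rstrip())
    return '\n'.join(lines)

print(print_table(['id', 'name'], [[1, 'Demo Job Template'], [2, 'Cleanup']]))
```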
Ryan Petrello
ec5d471640 add an ad_hoc resource alias to the new CLI 2019-08-21 15:22:08 -04:00
mabashian
531a7b2c05 Add support for processing extra vars that come in string or object form. Small bug fixes for extra var corner cases in workflow nodes. 2019-08-21 13:34:41 -04:00
softwarefactory-project-zuul[bot]
1d05c21af4 Merge pull request #4544 from AlanCoding/rm_credential
Remove deprecated credential logic from create factory

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2019-08-21 16:45:07 +00:00
AlanCoding
a4f04cd534 remove deprecated credential logic from create factory 2019-08-21 10:40:38 -04:00
softwarefactory-project-zuul[bot]
bccb54aec8 Merge pull request #4516 from ryanpetrello/py2
support the new CLI in py2 *and* py3

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2019-08-20 21:33:39 +00:00
Jeff Bradberry
ed1c667418 Fix asserts in test_project.py to use the id directly off of the job template
test_no_changing_overwrite_behavior_if_used, specifically.
2019-08-20 16:42:34 -04:00
softwarefactory-project-zuul[bot]
4bdbb88934 Merge pull request #4534 from ryanpetrello/nginx-server-version
hide nginx server version headers

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2019-08-20 19:20:37 +00:00
AlanCoding
85b351a0c8 Allow gce host and public IP hostvars to be null 2019-08-20 14:44:56 -04:00
softwarefactory-project-zuul[bot]
192fecad72 Merge pull request #4526 from jakemcdermott/update-gitignore
add akit config to gitignore

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2019-08-20 18:44:34 +00:00
Ryan Petrello
b82030b025 hide nginx server version headers 2019-08-20 14:34:04 -04:00
softwarefactory-project-zuul[bot]
8454adf8d4 Merge pull request #4490 from AlanCoding/wf_node_credential
Remove deprecated WFJT node credential field

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2019-08-20 18:23:49 +00:00
softwarefactory-project-zuul[bot]
e9df4ed800 Merge pull request #4435 from keithjgrant/4244-not-found-route
Add NotFound route handling

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2019-08-20 16:33:26 +00:00
Keith Grant
e1636b3ad4 add link back to dashboard from ContentError 2019-08-20 08:42:59 -07:00
Keith Grant
eeb86b3105 remove NotFoundError and use ContentError instead 2019-08-20 08:42:59 -07:00
Keith Grant
db1dddb95e fix redirect to login with expired session on org list & template list 2019-08-20 08:42:07 -07:00
Keith Grant
47357aea28 fix lint errors 2019-08-20 08:42:07 -07:00
Keith Grant
fe8df27811 add more meaningful 404 error screens 2019-08-20 08:42:07 -07:00
Keith Grant
256fc74676 add NotFound screen/route handling 2019-08-20 08:41:19 -07:00
Jake McDermott
2bda1db43e add akit config to gitignore 2019-08-20 11:10:08 -04:00
AlanCoding
4e99ad3e27 minor doc update 2019-08-20 10:37:41 -04:00
AlanCoding
f230da5437 update tests for credential removal 2019-08-20 10:37:41 -04:00
AlanCoding
b660800c5d remove deprecated WFJT node credential field 2019-08-20 10:37:41 -04:00
mabashian
4747be7014 Fixes bug in wf prompt modal by checking extra vars type before processing 2019-08-20 09:36:36 -04:00
mabashian
2de87dcef0 Fix prompt modal tab spacing when job launched from within jt form. 2019-08-20 09:35:19 -04:00
Ryan Petrello
80b4102aa9 support the new CLI in py2 *and* py3 2019-08-20 02:41:45 -04:00
softwarefactory-project-zuul[bot]
411667773a Merge pull request #4488 from jladdjr/docker_login_password_stdin
docker login: s/-p/--password-stdin/

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2019-08-19 23:22:05 +00:00
softwarefactory-project-zuul[bot]
ced5319ac9 Merge pull request #4512 from jlmitch5/projBranchFix
clear out branch/prompt on jt form when project changes

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2019-08-19 20:52:59 +00:00
John Mitchell
ef26f6a4c2 clear out branch/prompt on jt form when project changes 2019-08-19 14:16:01 -04:00
softwarefactory-project-zuul[bot]
b28655181d Merge pull request #4440 from mabashian/toggle-dynamics
Remove restriction on toggling dynamic hosts on/off from the host form view

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2019-08-19 16:42:17 +00:00
softwarefactory-project-zuul[bot]
2cdd007ed0 Merge pull request #4509 from saito-hideki/issue/tower/3679
Fixed form validation to JT survey minimum & maximum values

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2019-08-19 14:31:59 +00:00
softwarefactory-project-zuul[bot]
c963236a36 Merge pull request #4453 from ansible/e2e-cleanup
E2E websocket cleanup

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2019-08-19 14:09:31 +00:00
softwarefactory-project-zuul[bot]
2e762276bf Merge pull request #4507 from vrevelas/typo
Fix typo

Reviewed-by: Jake McDermott <yo@jakemcdermott.me>
             https://github.com/jakemcdermott
2019-08-19 13:07:26 +00:00
Hideki Saito
0f4de69e57 Fixed form validation to JT survey minimum & maximum values
- Fixed issue ansible/tower#3679

Signed-off-by: Hideki Saito <saito@fgrep.org>
2019-08-19 09:15:29 +00:00
vrevelas
1d1706665f Fix typo 2019-08-19 11:34:45 +03:00
softwarefactory-project-zuul[bot]
858f43fd2a Merge pull request #4489 from keithjgrant/4427-job-template-console-error
Fix default props for jt form

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2019-08-16 23:45:56 +00:00
Keith Grant
491287b1de fix default props for jt form 2019-08-16 19:11:07 -04:00
softwarefactory-project-zuul[bot]
de78d5d63b Merge pull request #4505 from marshmalien/awx-pf-jt-project-field
Add project single select input to job template form

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2019-08-16 21:57:24 +00:00
softwarefactory-project-zuul[bot]
ab45938d41 Merge pull request #4506 from rooftopcellist/fix_migrations
fix typo in migration name

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2019-08-16 18:24:41 +00:00
Christian Adams
a58a191071 fix typo in migration name 2019-08-16 13:46:41 -04:00
softwarefactory-project-zuul[bot]
2d7dc9aec7 Merge pull request #4493 from ryanpetrello/safeload
replace usage of FullLoader w/ safe_load

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2019-08-16 17:45:07 +00:00
Marliana Lara
45a69551f1 Change JT form project field into a single select input 2019-08-16 13:05:12 -04:00
softwarefactory-project-zuul[bot]
f7ea14107e Merge pull request #4503 from shanemcd/nit
Fix typo in migration filename

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2019-08-16 16:59:24 +00:00
softwarefactory-project-zuul[bot]
4dc97ac8d1 Merge pull request #3812 from skinlayers/devel
Add support for kubernetes nodeSelector, tolerations and affinity

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2019-08-16 16:36:56 +00:00
softwarefactory-project-zuul[bot]
165600b876 Merge pull request #4502 from rooftopcellist/token_description
Use consistent description types

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2019-08-16 16:27:41 +00:00
Shane McDonald
18a316646b Fix typo in migration filename 2019-08-16 10:29:34 -04:00
Ryan Petrello
39d0eb62e4 replace usage of FullLoader w/ safe_load 2019-08-16 10:13:27 -04:00
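The swap above trades PyYAML's FullLoader for the safer default loader; a minimal sketch of the difference (the document contents below are illustrative, not from AWX):

```python
# Sketch: yaml.safe_load constructs only plain Python types (dicts,
# lists, strings, numbers), whereas FullLoader resolves a wider set of
# YAML tags and can construct richer objects -- hence the safer default.
import yaml

doc = """
extra_vars:
  region: us-east-1
  count: 3
"""
data = yaml.safe_load(doc)
print(data["extra_vars"]["count"])
```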
AlanCoding
d302f134ac Kill off all can_read access methods 2019-08-16 10:12:46 -04:00
Christian Adams
52f8a8a6e5 Use consistent description types 2019-08-16 09:25:03 -04:00
softwarefactory-project-zuul[bot]
e08e70efb4 Merge pull request #4498 from ryanpetrello/awx-cli-help
prevent `awx -h` CLI command from printing a scary connection error

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2019-08-15 18:19:01 +00:00
softwarefactory-project-zuul[bot]
89c41a5931 Merge pull request #4494 from kdelee/awxkit_remove_dateutil
Remove this dependency that we don't need

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2019-08-15 18:03:53 +00:00
Ryan Petrello
94235f4736 prevent awx -h CLI command from printing a scary connection error 2019-08-15 13:38:37 -04:00
softwarefactory-project-zuul[bot]
099a7f6cde Merge pull request #4495 from ryanpetrello/docker-for-mac-sdb
fix a bug in the sdb-listen setup

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2019-08-15 16:19:34 +00:00
Ryan Petrello
57d60e5b97 fix a bug in the sdb-listen setup
Docker for Mac recently renamed itself to Docker Desktop
2019-08-15 11:50:28 -04:00
Elijah DeLee
8efa0fc397 Remove this dependency that we don't need 2019-08-15 11:26:02 -04:00
softwarefactory-project-zuul[bot]
dc44e68980 Merge pull request #4479 from saito-hideki/issue/tower/3639
Fixed "DEFAULT ANSWER" to be properly deleted for Integer and Float types

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2019-08-15 14:54:46 +00:00
softwarefactory-project-zuul[bot]
65e359bdcf Merge pull request #4491 from ryanpetrello/remove-cli-termcolor
replace the termcolor dependency w/ a simple function

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2019-08-15 14:25:39 +00:00
Hideki Saito
f1a69e9357 Fixed "DEFAULT ANSWER" to be properly deleted for Integer and Float types
- Fixed issue ansible/tower#3639

Signed-off-by: Hideki Saito <saito@fgrep.org>
2019-08-15 10:24:17 -04:00
Ryan Petrello
224750c0d6 replace the termcolor dependency w/ a simple function 2019-08-15 09:54:01 -04:00
softwarefactory-project-zuul[bot]
2f658a4e5d Merge pull request #4305 from catjones9/jobTemplateAddButton
Adds Job Template Add Button to TemplatesList with link to add form

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2019-08-14 22:52:38 +00:00
catjones9
38a7fa5558 Linting errors
Signed-off-by: catjones9 <catjones@redhat.com>
2019-08-14 16:52:30 -04:00
catjones9
e591305dfe Changes conditional canAdd statement based on PR feedback
Signed-off-by: catjones9 <catjones@redhat.com>
2019-08-14 16:52:30 -04:00
catjones9
9e0d113063 Conditional Add Button on Template List screen
Signed-off-by: catjones9 <catjones@redhat.com>
2019-08-14 16:52:30 -04:00
softwarefactory-project-zuul[bot]
ad17bdc559 Merge pull request #4487 from ryanpetrello/many-groups-slowness
optimize a slow query in inventory script generation

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2019-08-14 20:32:01 +00:00
softwarefactory-project-zuul[bot]
c35fbd6853 Merge pull request #4483 from ryanpetrello/multi-owner
fix bug where cred org permission was not checked

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2019-08-14 19:54:57 +00:00
Jim Ladd
74623a33a2 docker login: s/-p/--password-stdin/ 2019-08-14 12:32:26 -07:00
softwarefactory-project-zuul[bot]
8df70f5412 Merge pull request #4471 from jakemcdermott/multicred-template-loading
add related credential loading needed for multicredential select

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2019-08-14 19:31:34 +00:00
Ryan Petrello
98e7ae5f9f optimize a slow query in inventory script generation
see: https://github.com/ansible/awx/issues/4461
2019-08-14 15:03:53 -04:00
softwarefactory-project-zuul[bot]
26637499d1 Merge pull request #4484 from chrismeyersfsu/fix-notification_password
do not expose the notification secret fields

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2019-08-14 18:45:08 +00:00
Jake McDermott
55376bfd13 load related credentials when editing 2019-08-14 14:15:24 -04:00
Jake McDermott
a8511f967b build details url once 2019-08-14 14:15:24 -04:00
Jake McDermott
e3d6ee6f9e move label requests to function 2019-08-14 14:15:24 -04:00
Jake McDermott
d05c1bdd6e move function comment into function 2019-08-14 14:15:24 -04:00
Jake McDermott
c96dfd101c use alias for type import 2019-08-14 14:15:24 -04:00
chris meyers
9fa4dac847 do not expose the notification secret fields 2019-08-14 13:58:47 -04:00
softwarefactory-project-zuul[bot]
7374732d9b Merge pull request #4482 from ryanpetrello/prometheus_errors
fix a bug in the API metrics endpoint

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2019-08-14 17:14:25 +00:00
AlanCoding
4831cde39f fix bug where cred org permission was not checked 2019-08-14 12:07:28 -04:00
softwarefactory-project-zuul[bot]
43d816b6e4 Merge pull request #4265 from AlanCoding/branch_feature_phase_2
Allow JT specification and prompting for project branch

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2019-08-14 14:56:19 +00:00
Ryan Petrello
a45c93ed47 fix a bug in the API metrics endpoint
The metrics JSON renderer shouldn't try to parse data that isn't
a string (generally, this represents things like HTTP 403)
2019-08-14 10:40:21 -04:00
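The guard described in that commit message can be sketched as follows (a toy stand-in, not the actual AWX renderer code):

```python
# Hypothetical sketch of the metrics-renderer fix: only attempt to
# process the payload when it is a string; structured error bodies
# (e.g. the dict produced for an HTTP 403) pass through untouched.
def render_metrics(data):
    if not isinstance(data, str):
        return data  # error response such as {"detail": "..."}
    # stand-in for the real Prometheus-text -> JSON conversion
    return {"lines": data.splitlines()}

assert render_metrics({"detail": "forbidden"}) == {"detail": "forbidden"}
assert render_metrics("awx_sessions 4") == {"lines": ["awx_sessions 4"]}
```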
softwarefactory-project-zuul[bot]
31308e3795 Merge pull request #4383 from marshmalien/4236-output-toolbar
Job Output - Pagination and Static List

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2019-08-13 21:38:25 +00:00
Marliana Lara
748bf63d4e Move job event line styles into a shared dir
Set a field to avoid setState warnings

Fix lint errors
2019-08-13 17:06:24 -04:00
Jake McDermott
2a926fffd9 set default timezone to UTC for test runs 2019-08-13 17:05:48 -04:00
Marliana Lara
475645f604 Add JobOutput tests 2019-08-13 17:05:47 -04:00
Jake McDermott
b2922792bc add function for testing output lines 2019-08-13 17:05:47 -04:00
Marliana Lara
74ef0e7abf Refactor MenuControls as a functional component
* Fix lint errors
2019-08-13 17:05:47 -04:00
Marliana Lara
2aa38e84dd Add guard clause to loadMoreRows and style tweaks 2019-08-13 17:05:46 -04:00
Jake McDermott
033308de69 add missing event placeholders and recompute heights on load 2019-08-13 17:05:46 -04:00
Jake McDermott
0a3633113e ensure results are always indexed by counter when loading new rows 2019-08-13 17:05:46 -04:00
Marliana Lara
161c7706bc Add InfiniteLoader to fetch rows as needed 2019-08-13 17:05:46 -04:00
Jake McDermott
40560e962f compute row height on-the-fly 2019-08-13 17:05:45 -04:00
Jake McDermott
474a2a48bb add job event component and sanitized html building for output lines 2019-08-13 17:05:45 -04:00
Marliana Lara
da92889323 WIP - react virtualizer 2019-08-13 17:05:45 -04:00
Marliana Lara
859c364fbe Add MenuControl tests 2019-08-13 17:05:44 -04:00
Marliana Lara
408b38174a Add job output menu controls component 2019-08-13 17:05:44 -04:00
softwarefactory-project-zuul[bot]
e3c6e245d6 Merge pull request #3623 from wenottingham/hello-you-are-being-audited
Allow mapping org auditors where we map org admins.

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2019-08-13 19:07:31 +00:00
softwarefactory-project-zuul[bot]
8fef029bc3 Merge pull request #4442 from mabashian/4225-sparkline-templates
Add Sparkline to templates list

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2019-08-13 18:14:24 +00:00
softwarefactory-project-zuul[bot]
8fbda8a773 Merge pull request #4469 from jakemcdermott/lookup-tests
refactor lookup tests

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2019-08-13 17:46:02 +00:00
Jake McDermott
245252ed11 refactor lookup tests 2019-08-13 13:13:12 -04:00
Bill Nottingham
bbf28f50bd Allow mapping org auditors where we map org admins. 2019-08-13 11:32:35 -04:00
softwarefactory-project-zuul[bot]
0cc9199f23 Merge pull request #4468 from ryanpetrello/more-awxkit-dep-cleanup
simplify awxkit dependencies

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2019-08-13 13:07:18 +00:00
AlanCoding
be21a8bcb4 Fix logic for turning off override behavior 2019-08-12 22:17:19 -04:00
John Mitchell
3df476e3f6 remove inadvertent duplicate CreateSelect2 call from playbook on jt edit form 2019-08-12 22:17:19 -04:00
AlanCoding
2f3aafe1bb Add collection setting toggle to UI
Additional API housekeeping, removing unused code

Treat default branch as no branch provided
2019-08-12 22:16:04 -04:00
John Mitchell
79a1dbc5a0 fix issue with interior scope declaration eslint error 2019-08-12 22:16:03 -04:00
AlanCoding
dc5d696238 avoid unnecessary checkout, more docs content 2019-08-12 22:16:03 -04:00
John Mitchell
139e8cde70 more ui work for branch and refspec on project/jt
- add refspec field to project
- update refspec and branch help text on project form
- add refspec field to job detail
- adjust form gen and ProcessErrors to show api errors for checkbox_groups correctly
- consolidate showPromptButton conditionals and fix the add/edit workflow node one for showing prompt when only branch is promptable
2019-08-12 22:16:03 -04:00
John Mitchell
13751e73f9 working commit 2019-08-12 22:16:02 -04:00
AlanCoding
03d72dd18a JT-branch docs and code cleanup
bump migration

fine tune validation of project allow_override
  return highly custom error message

Restore branch after syncs to address bugs
  encountered after changing scm_refspec

remove unused code to determine scm_revision

Check Ansible version before project update and
  do not install collections if Ansible version too old

Add docs related to project branch override
  New file specific to branch override and refspec
Complete docs on collections to reflect current
  implementation and give a folder tree example
Update clustering docs related to project syncs

Fix bug where git depth was ignored during the
  local clone from project folder to run folder

Fix bug where submodules were not copied
2019-08-12 22:16:02 -04:00
chris meyers
d785145c59 force proj sync when collections/requirements.yml
* Similar to roles/requirements.yml sync optimization logic.
2019-08-12 22:16:02 -04:00
AlanCoding
270bd19dbd Fix bugs with discovery of collection requirements
Addresses some cases where
  collection requirements do not exist
  collection requirements cannot be evaluated

Consolidate logic for roles and collection installs
2019-08-12 22:14:32 -04:00
chris meyers
cc6413c44c use ansible nightly
* ansible:devel now has ansible-galaxy collection support
2019-08-12 22:14:31 -04:00
chris meyers
4be65a0879 collections/requirements.yml support
* just like we support ansible-galaxy role install, support
ansible-galaxy collection install
2019-08-12 22:14:30 -04:00
AlanCoding
f1f57e45de Add scm_refspec field
Update migration syntax to Django 2

fix status bug where canceled switched to error
2019-08-12 22:13:57 -04:00
softwarefactory-project-zuul[bot]
845e3867a6 Merge pull request #4467 from ansible/e2e-debugging-help
Known E2E Issues

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2019-08-12 23:34:06 +00:00
softwarefactory-project-zuul[bot]
7fcfc88c82 Merge pull request #4460 from jakemcdermott/normalize-e2e-urls
remove extra and trailing slashes from e2e url settings

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2019-08-12 23:27:27 +00:00
Jake McDermott
61ca4278c8 remove extra and trailing slashes from url 2019-08-12 18:59:02 -04:00
Ryan Petrello
299fa3b6b4 simplify awxkit dependencies
- remove flake8 as an install requirements (it's only used for tests)
- vendor toposort, which is Apache 2.0 licensed (and very small)
- change websocket-client to a setuptools optional dependency, which you
  can install via:

  pip install "./awxkit[websockets]"

- add `jq` and `tabulate` under an additional optional setuptools
  dependency:

  pip install "./awxkit[formatting]"

- remove `cryptography`, which is only used for random RSA generation
  (unused by the CLI)
2019-08-12 17:27:57 -04:00
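The optional dependency groups described above follow the standard setuptools `extras_require` pattern; a sketch (group contents are taken from the commit message, the `setup()` wiring shown in the comment is illustrative):

```python
# Sketch: setuptools "extras" map a group name to optional requirements.
# Installing with `pip install "./awxkit[formatting]"` pulls the extra
# group in on top of the base install_requires.
EXTRAS_REQUIRE = {
    "websockets": ["websocket-client"],
    "formatting": ["jq", "tabulate"],
}

# In setup.py this would be passed as:
#   setup(name="awxkit", ..., extras_require=EXTRAS_REQUIRE)
print(sorted(EXTRAS_REQUIRE))
```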
softwarefactory-project-zuul[bot]
0b112e5b8f Merge pull request #4465 from ryanpetrello/json-metrics
add support for Accept:application/json to /api/v2/metrics

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2019-08-12 20:26:16 +00:00
John Hill
121bc96108 Updating for the known MacOS High Sierra issue 2019-08-12 15:46:39 -04:00
Ryan Petrello
82f5072c7d add support for Accept:application/json to /api/v2/metrics
see: https://github.com/ansible/awx/issues/4144
2019-08-12 15:17:40 -04:00
softwarefactory-project-zuul[bot]
99357acf5d Merge pull request #4462 from AlanCoding/the_edge_of_unicode
Fix error due to randint inclusivity

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2019-08-12 19:02:04 +00:00
AlanCoding
3a5e609a11 Fix error due to randint inclusivity 2019-08-12 13:36:53 -04:00
softwarefactory-project-zuul[bot]
a776d0ba59 Merge pull request #4459 from ryanpetrello/cli-version
make awxkit have the same version as the AWX package

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2019-08-12 15:15:41 +00:00
John Mitchell
0c89c6c79e fix ui conditional for adding fields to jt edit save payload 2019-08-12 11:01:11 -04:00
AlanCoding
6baba10abe Add scm_revision to project updates and cleanup
Add validation around prompted scm_branch requiring
  project allow_override field to be true

Updated related process isolation docs

Fix invalid comparison in serializer

from PR review, clarify pre-check logging, minor docs additions
2019-08-12 11:01:10 -04:00
John Mitchell
76dcd57ac6 assorted UI work to support the new branch field
update project to have allow branch override checkbox
add new text input for branch field
adjust show/hide for branch and playbook jt fields
make playbook field allowed to add a new option not in the dropdown
update job results ui to show branch
update prompting to support new branch field
2019-08-12 11:01:10 -04:00
AlanCoding
ac86dc4fb9 Allow JTs to specify and prompt for SCM branch
Copy project folder each job run
  change cwd to private_data_dir, from proj
  do not add cwd to show_paths if it is
  a subdirectory of private_data_dir, which
  is already shown

Pass the job private_data_dir to the local
  project sync, and also add that directory
  to the project sync show paths

Add GitPython dep and use for job sync logic
  use this to manage shallow clone from desired
  commit, and to map branch to commit,
  and to assess necessity of project sync

Start on some validation change, but not all
  allow arbitrary playbooks with custom branch
2019-08-12 11:01:07 -04:00
Ryan Petrello
b90d1456b3 make awxkit have the same version as the AWX package 2019-08-12 09:42:04 -04:00
Gabriel Totusek
794808cd10 Fix compatibility with postgresql helm chart v6.0.0+ 2019-08-12 02:40:25 -07:00
Gabriel Totusek
d932a70eff Downgrade postgres helm chart to v5.3.13 2019-08-12 00:43:09 -07:00
Gabriel Totusek
90e5b0a12d Update postgres helm chart to v6.2.1 2019-08-11 23:16:43 -07:00
Gabriel Totusek
f705eba7ed Add support for kubernetes tolerations, nodeSelector, and affinity 2019-08-11 23:10:56 -07:00
John Hill
8341601c60 Adding Debugging section to e2e doc
First of many debugging, troubleshooting, and FAQ tips
2019-08-11 22:06:55 -04:00
softwarefactory-project-zuul[bot]
28e3625066 Merge pull request #4455 from ryanpetrello/stdout-is-not-missing
remove awxkit logic for working around an old stdout handling bug

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2019-08-09 20:43:19 +00:00
Ryan Petrello
d92753f20a remove awxkit logic for working around an old stdout handling bug
related: https://github.com/ansible/awx/issues/200
2019-08-09 14:44:52 -04:00
softwarefactory-project-zuul[bot]
76a10991ae Merge pull request #4451 from ryanpetrello/awxkit
open source awxkit

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2019-08-09 16:10:47 +00:00
Daniel Sami
2064309182 e2e cleanup websockets 2019-08-09 11:32:26 -04:00
Ryan Petrello
adaa4148c6 include awxkit CI in zuul runs
additionally, fix up some flake8 failures
2019-08-09 10:07:40 -04:00
Ryan Petrello
9616cc6f78 import awxkit
Co-authored-by: Christopher Wang <cwang@ansible.com>
Co-authored-by: Jake McDermott <jmcdermott@ansible.com>
Co-authored-by: Jim Ladd <jladd@redhat.com>
Co-authored-by: Elijah DeLee <kdelee@redhat.com>
Co-authored-by: Alan Rominger <arominge@redhat.com>
Co-authored-by: Yanis Guenane <yanis@guenane.org>
2019-08-08 22:12:31 -04:00
softwarefactory-project-zuul[bot]
9b836abf1f Merge pull request #4444 from elyezer/app-token-e2e
Add app token e2e

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2019-08-08 18:50:34 +00:00
softwarefactory-project-zuul[bot]
3441d0cb46 Merge pull request #4441 from ansible/noretry
[WIP] Parameterize E2E Suite retries

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2019-08-08 17:56:22 +00:00
mabashian
3d98d98d3c Moves tooltip and link logic out to the sparkline from the job status icon 2019-08-08 11:53:47 -04:00
softwarefactory-project-zuul[bot]
860d83d798 Merge pull request #4437 from rooftopcellist/correct_insights_collection_setting
Fix NoneType path error with analytics collection

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2019-08-08 15:17:57 +00:00
John Hill
ce37bc9897 remove static definition and move to pipelines 2019-08-08 10:44:56 -04:00
Christian Adams
c37bf5e9f5 Fix NoneType path error with analytics collection 2019-08-07 16:19:05 -04:00
mabashian
fba0da4c58 Fix linting 2019-08-07 16:16:57 -04:00
Elyézer Rezende
e7a15d478d Add app token e2e 2019-08-07 15:56:06 -04:00
softwarefactory-project-zuul[bot]
d7c15a782f Merge pull request #4425 from mabashian/toggles
Swap text-based on and off toggles to non-text based

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2019-08-07 19:55:16 +00:00
mabashian
6a7481a27c Prettier again 2019-08-07 14:52:40 -04:00
mabashian
a57f2ca2bf Run prettier 2019-08-07 14:46:41 -04:00
mabashian
37e5b6b134 Add sparkline tests 2019-08-07 14:44:16 -04:00
John Hill
d5dd1719b6 Parameterize E2E Suite retries 2019-08-07 11:48:12 -04:00
mabashian
0bd9d4abaf Change awxSwitch class prefix to atSwitch to match component name 2019-08-07 11:22:28 -04:00
mabashian
993855f70a Remove restriction on toggling dynamic hosts on/off from the host form view. 2019-08-07 11:15:42 -04:00
mabashian
c71068fa1c Create at-switch directive. Use it in all the places 2019-08-07 11:10:08 -04:00
mabashian
c4700998af Swap text-based on and off toggles to non-text based 2019-08-07 11:05:58 -04:00
mabashian
19d2c8c634 Adds sparkline to templates list 2019-08-06 15:51:27 -04:00
softwarefactory-project-zuul[bot]
4f3f87ebc7 Merge pull request #4433 from AlanCoding/we_dont_do_that
Remove setting not actually customizable

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2019-08-06 19:47:55 +00:00
AlanCoding
63978f7d10 remove setting not actually customizable 2019-08-06 13:40:52 -04:00
softwarefactory-project-zuul[bot]
5ed2a38e1d Merge pull request #4423 from saimonn/typo-INSTALL-md
INSTALL.md: fix #post-build-2 href fragment

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2019-08-06 14:31:18 +00:00
softwarefactory-project-zuul[bot]
dbcc3c5733 Merge pull request #4420 from wenottingham/be-an-enabler
[RFC] Allow enable/disable of hosts in dynamic inventory from the UI.

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2019-08-05 17:29:17 +00:00
Bill Nottingham
b4f272a575 Catch another area where this toggle is set. 2019-08-05 12:06:41 -04:00
Simon Séhier
c17ce49e2e fix #post-build-2 href fragment 2019-08-05 17:29:36 +02:00
Bill Nottingham
8e1e33735a Allow enable/disable of hosts in dynamic inventory from the UI.
The API lets you do it, so we shouldn't block it from the UI.
2019-08-05 11:18:56 -04:00
softwarefactory-project-zuul[bot]
50c0867156 Merge pull request #4417 from ansible/stability-e2e
Stability for e2e websockets

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2019-08-05 13:16:45 +00:00
softwarefactory-project-zuul[bot]
5984b6235a Merge pull request #4403 from mabashian/3654-inv-status-popover
Fix summary popover on inventory list

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2019-08-03 16:11:21 +00:00
softwarefactory-project-zuul[bot]
481df40764 Merge pull request #4411 from mabashian/workflow-start-node-width
Makes workflow start node width dynamic to account for languages other than English

Reviewed-by: Michael Abashian
             https://github.com/mabashian
2019-08-03 13:43:58 +00:00
mabashian
521ecc883b Fix jshint errors 2019-08-03 09:26:50 -04:00
mabashian
688f14a0ee Fix summary popover on inventory list 2019-08-03 09:12:34 -04:00
mabashian
b3002e0b9d Makes workflow start node width dynamic to account for languages other than English 2019-08-03 09:11:20 -04:00
softwarefactory-project-zuul[bot]
9ab920cdb9 Merge pull request #4413 from AlexSCorey/multiSelectPatternFly
Multi-Select AWX-PF

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2019-08-02 22:37:27 +00:00
Alex Corey
8b35642b08 Fixes failing test and addresses PR issues 2019-08-02 17:37:58 -04:00
Alex Corey
74a1ebff32 Adds tests and refines chip interaction in MultiSelect component 2019-08-02 16:42:22 -04:00
Alex Corey
a577be906e Adds Multiselect functionality to labels on JTs 2019-08-02 16:42:22 -04:00
Daniel Sami
934d09e0de Stability for e2e websockets 2019-08-02 15:33:09 -04:00
softwarefactory-project-zuul[bot]
bb2474f56f Merge pull request #4415 from ryanpetrello/bubblewrap_aaaaaaaaaaarg
fix a bug that breaks isolated task execution

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2019-08-02 15:12:47 +00:00
Ryan Petrello
1388dec4b0 fix a bug that breaks isolated task execution 2019-08-02 07:48:46 -04:00
softwarefactory-project-zuul[bot]
9d2549b4b1 Merge pull request #4368 from jlmitch5/uiNextSearch
initial implementation of search in ui_next

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2019-08-01 20:51:48 +00:00
John Mitchell
c0dcba91f5 Update SEARCH.md 2019-08-01 16:21:19 -04:00
John Mitchell
2e777368bf Update SEARCH.md 2019-08-01 16:21:19 -04:00
John Mitchell
0276a37e8d small updates to qs syntax based on feedback 2019-08-01 16:21:19 -04:00
John Mitchell
30253d21fc assorted ui_next search phase 1 pr feedback updates
- remove unnecessary displayAll prop from ChipGroup
- update notification api fn to be 2 with no boolean param
- fix params passing to api functions
2019-08-01 16:21:19 -04:00
John Mitchell
f0ff5b190a update qs addParams/removeParams fns to take param object not string 2019-08-01 16:21:19 -04:00
John Mitchell
bdfeb2cb9c updates based on pr feedback
run prettier
update hasContentError to contentError in all the places
function naming updates
2019-08-01 16:21:19 -04:00
John Mitchell
357887417c working commit 2019-08-01 16:21:19 -04:00
John Mitchell
a58468ffee initial implementation of search in UI_next 2019-08-01 16:21:19 -04:00
softwarefactory-project-zuul[bot]
602ee856fa Merge pull request #4405 from AlanCoding/gce_token
Always provide gce token_uri

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2019-08-01 19:20:52 +00:00
AlanCoding
4399d9287d Always provide gce token_uri 2019-08-01 14:18:44 -04:00
softwarefactory-project-zuul[bot]
2faf69e3b5 Merge pull request #4410 from jbradberry/upgrade-django-2.2.4
Upgrade to django 2.2.4

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2019-08-01 17:58:53 +00:00
softwarefactory-project-zuul[bot]
008ea30c5f Merge pull request #4409 from ryanpetrello/bubblewrap_aaaaaaaaaaarg
attempt to properly clean up orphaned runner ansible_pi directories

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2019-08-01 16:03:22 +00:00
Ryan Petrello
cfeedb158e attempt to properly clean up runner ansible_pi directories 2019-08-01 11:21:38 -04:00
Jeff Bradberry
10200fced0 Add patch to the list of system packages installed into the container
since it is a requirement of the new updater.sh Python requirements script.
2019-08-01 10:41:49 -04:00
Jeff Bradberry
2926d0198d Bump the version of Django to 2.2.4
This is a security release.
2019-08-01 10:41:36 -04:00
softwarefactory-project-zuul[bot]
c742700a01 Merge pull request #4379 from saito-hideki/pr/fix_ui-devel-languages-1
Document UI building process for development environment to cover I18N static contents

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2019-08-01 12:13:50 +00:00
softwarefactory-project-zuul[bot]
3d5e072169 Merge pull request #4407 from mabashian/3929-notif-options-bold
Make notification form options regular font weight

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2019-08-01 12:04:49 +00:00
softwarefactory-project-zuul[bot]
772d087b6e Merge pull request #4404 from mabashian/4113-grafana
Fix js error thrown preventing creation of grafana notification

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2019-08-01 00:43:50 +00:00
mabashian
213df70419 Make notification form options regular font weight 2019-07-31 20:06:22 -04:00
softwarefactory-project-zuul[bot]
6a6d55fe41 Merge pull request #4317 from wenottingham/pexpect-the-requirements-inquisition
Remove pexpect, etc, from the ansible venv, that's now runner's problem.

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2019-07-31 21:45:41 +00:00
softwarefactory-project-zuul[bot]
a36b436414 Merge pull request #4401 from AlanCoding/no_prompting
Update docs to reflect field removals

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2019-07-31 20:33:31 +00:00
mabashian
94b5bb8cf9 Ensure variable exists before calling toString on it to fix js error thrown when creating grafana notification 2019-07-31 14:15:13 -04:00
Hideki Saito
8362aa71db Update tooling and UI development documentation to cover I18N
- Document steps for adding I18N in builds
- Add "clean-language" target to remove *.mo files

Signed-off-by: Hideki Saito <saito@fgrep.org>
2019-07-31 09:44:29 -04:00
AlanCoding
b3651ecf30 Update docs to reflect field removals 2019-07-31 08:58:39 -04:00
softwarefactory-project-zuul[bot]
4a35df9a1c Merge pull request #4399 from mabashian/3676-notif-menu
Show notification menu to users with notification_admin team role

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2019-07-30 22:46:07 +00:00
softwarefactory-project-zuul[bot]
416b2ef37a Merge pull request #4398 from mabashian/3644-launch-outside-click
Prevent clicks outside of prompt modal from closing the modal without saving

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2019-07-30 22:18:54 +00:00
Bill Nottingham
ad28e11502 Remove pexpect, etc, from the ansible venv, that's now runner's problem. 2019-07-30 17:09:12 -04:00
softwarefactory-project-zuul[bot]
815823adc0 Merge pull request #4363 from jomach/feature/updateGitVersion
[4362] git version is old and does not work with x509 certificates

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2019-07-30 21:07:24 +00:00
mabashian
a6c50f6d20 Fix unit test endpoint to match notif admin request endpoint 2019-07-30 16:52:52 -04:00
mabashian
df177d6dc3 Removes close behavior when clicking outside of modal and dialog components 2019-07-30 16:47:17 -04:00
mabashian
1121a2b623 Show notification menu to users with notification_admin team role 2019-07-30 16:18:39 -04:00
softwarefactory-project-zuul[bot]
f02aa3528e Merge pull request #4311 from wenottingham/pair-of-mikos
Update paramiko to a version that can work with any python-gssapi.

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2019-07-30 19:35:48 +00:00
mabashian
c1cf7b79e3 Rogue console be gone 2019-07-30 15:25:42 -04:00
mabashian
47c59d5211 Prevent clicks outside of prompt modal from closing the modal without saving. User will now need to explicity hit the X or Cancel buttons to close the modal prematurely. 2019-07-30 15:22:01 -04:00
Bill Nottingham
20f1ed4533 Update source tarball. 2019-07-30 12:09:51 -04:00
Bill Nottingham
fafe9ce4ea Update paramiko to a version that can work with any python-gssapi. 2019-07-30 12:09:48 -04:00
softwarefactory-project-zuul[bot]
6499f2b233 Merge pull request #4380 from saito-hideki/issue/4359
Add description to template and project list view

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2019-07-30 15:49:32 +00:00
softwarefactory-project-zuul[bot]
8c4aac3b6c Merge pull request #4396 from ryanpetrello/ldap-audit
properly set `is_system_auditor` on initial LDAP login

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2019-07-30 15:28:23 +00:00
Ryan Petrello
a47a2d8567 properly set is_system_auditor on initial LDAP login
django-auth-ldap recently changed its behavior at login to *delay* the
user.save() call:

b777321fb4

our current process of discovering and setting up the system auditor
role at LDAP login *relies* on the user having a primary key, so this
code now manually calls .save() to enforce one
2019-07-30 10:05:39 -04:00
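The ordering problem this commit fixes can be sketched without Django: role assignment is keyed on a saved row (a primary key), so the login pipeline has to force a save first. A minimal illustration using a stand-in `User` class rather than the real django-auth-ldap models:

```python
class User:
    """Stand-in for a Django user: pk is assigned on first save()."""
    _next_pk = 1

    def __init__(self):
        self.pk = None

    def save(self):
        if self.pk is None:
            self.pk = User._next_pk
            User._next_pk += 1


def grant_system_auditor(user):
    # Role membership references the user's primary key, so an
    # unsaved user (pk is None) cannot be granted the role.
    if user.pk is None:
        user.save()  # the fix: enforce a pk before touching roles
    return {"role": "system_auditor", "user_pk": user.pk}


user = User()  # django-auth-ldap now delays user.save() at login
grant = grant_system_auditor(user)
```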
softwarefactory-project-zuul[bot]
5d6916f69e Merge pull request #4391 from ryanpetrello/skip-empty-stdout
skip events w/ empty stdout when generating stdout downloads

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2019-07-30 13:48:07 +00:00
Hideki Saito
329b791908 Add description to template and project list view
- Fixed issue #4359

Signed-off-by: Hideki Saito <saito@fgrep.org>
2019-07-30 20:18:54 +09:00
Jorge Machado
76933ed889 * upgrade from git on containers
* agreed with terms of DCO 1.1

Signed-off-by: Jorge Machado <jorge@jmachado.me>
2019-07-30 07:04:04 +02:00
softwarefactory-project-zuul[bot]
c7bb0f10e1 Merge pull request #4385 from chrismeyersfsu/fix-home_dir
fake it till you make it!

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2019-07-29 17:47:36 +00:00
softwarefactory-project-zuul[bot]
7afa35af17 Merge pull request #4367 from keithjgrant/4232-single-select-lookup
Single select lookup

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2019-07-29 17:42:12 +00:00
softwarefactory-project-zuul[bot]
2e48718746 Merge pull request #4393 from marshmalien/4392-org-inv-link
Add link to organization inventory list

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2019-07-29 17:25:21 +00:00
softwarefactory-project-zuul[bot]
26a7ec97fa Merge pull request #4387 from keithjgrant/3565-insights-translation
Insights translation

Reviewed-by: Keith Grant
             https://github.com/keithjgrant
2019-07-29 16:46:35 +00:00
Marliana Lara
3c96968ee0 Add link to organization inventory list 2019-07-29 12:36:30 -04:00
chris meyers
9236fd2a53 fake it till you make it!
* The user awx is passed to the launch of our dev docker container. The
docker system automagically creates that user for us and sets the home
dir to /tmp in /etc/passwd. Many methods of detecting the user's home dir
don't use that; instead, they use the HOME env var. This is a half-way
solution that solves the problem of python expanding the ~ dir.
* If other things break because they determine the user's home dir via the
/etc/passwd entry, then a more in-depth fix will be needed.
2019-07-29 09:58:47 -04:00
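The distinction the message draws — /etc/passwd versus the HOME environment variable — is easy to demonstrate: on POSIX systems, Python's `~` expansion consults HOME before falling back to the pwd database, so overriding the variable is enough to redirect it. A simplified illustration of the mechanism, not the container setup itself:

```python
import os
import posixpath

# posixpath.expanduser() checks the HOME env var before falling back
# to the pwd database (/etc/passwd), so setting HOME is enough to
# control where "~" points for most Python code.
os.environ["HOME"] = "/home/awx"
expanded = posixpath.expanduser("~/.ansible")
print(expanded)  # /home/awx/.ansible
```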
Ryan Petrello
79723cea21 skip events w/ empty stdout when generating stdout downloads
see: https://github.com/ansible/tower/issues/3677
2019-07-29 09:36:00 -04:00
softwarefactory-project-zuul[bot]
9cc23d5a71 Merge pull request #4388 from falcon78921/wip-awx-grammar
awx/ui: fixed minor grammar error in Survey form

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2019-07-29 12:49:47 +00:00
James McClune
bb92296478 awx/ui: fixed minor grammar error in Survey form
Signed-off-by: James McClune <jmcclune@mcclunetechnologies.net>
2019-07-27 15:24:24 -04:00
Keith Grant
3d3952c549 remove unnecessary scrollbar from Inventories Lookup 2019-07-26 16:24:48 -07:00
Keith Grant
276ed792a2 translate insights tooltip 2019-07-26 12:57:24 -07:00
softwarefactory-project-zuul[bot]
0ef97c497f Merge pull request #3851 from AlanCoding/ig_distinct
Remove duplicates from IG list

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2019-07-26 16:45:06 +00:00
Keith Grant
e903425785 mark button text for translation 2019-07-25 15:39:06 -07:00
softwarefactory-project-zuul[bot]
94e14ae6f8 Merge pull request #4378 from ryanpetrello/run-rabbit-run
don't filter out schedules that have a null `next_run`

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2019-07-25 21:34:03 +00:00
Ryan Petrello
e711d32ea2 don't filter out schedules that have a null next_run
when schedules are disabled, their `next_run` is unset; we should still
show them in this list view, just with an empty value in the `next_run`
column (they're disabled, so they'll never run)
2019-07-25 17:07:28 -04:00
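The fix boils down to not excluding rows whose `next_run` is unset. A plain-Python sketch of the before/after behavior, using hypothetical schedule dicts rather than the actual AWX queryset:

```python
schedules = [
    {"name": "nightly-sync", "enabled": True, "next_run": "2019-07-26T02:00:00Z"},
    {"name": "weekly-audit", "enabled": False, "next_run": None},  # disabled: unset
]

# Buggy behavior: filtering on next_run silently hides disabled schedules.
visible_before = [s for s in schedules if s["next_run"] is not None]

# Fixed behavior: show every schedule; render a null next_run as empty.
rows = [(s["name"], s["next_run"] or "") for s in schedules]
```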
softwarefactory-project-zuul[bot]
a5c5874e20 Merge pull request #4377 from ryanpetrello/fix-4376
fix a bug which can cause isolated artifact cleanup to fail

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2019-07-25 20:27:03 +00:00
Ryan Petrello
2608e8d47d fix a bug which can cause isolated artifact cleanup to fail
see: https://github.com/ansible/awx/issues/4376
2019-07-25 15:52:04 -04:00
softwarefactory-project-zuul[bot]
06260bdbaf Merge pull request #4374 from rooftopcellist/update_job_status_info
Update job status comments

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2019-07-25 19:36:43 +00:00
Christian Adams
670a184708 Update job status comments
- waiting and pending job descriptions were not accurate
2019-07-25 15:06:06 -04:00
softwarefactory-project-zuul[bot]
24b166aec9 Merge pull request #4375 from rooftopcellist/pending_jobs_metrics
add pending jobs and system level job status to metrics

Reviewed-by: Christian Adams <rooftopcellist@gmail.com>
             https://github.com/rooftopcellist
2019-07-25 18:56:23 +00:00
Christian Adams
11a6e98230 Add pending jobs and system level job status to metrics 2019-07-25 14:19:20 -04:00
AlanCoding
2c533edb3c remove duplicates from IG list 2019-07-25 10:20:25 -04:00
softwarefactory-project-zuul[bot]
128fa8947a Merge pull request #4124 from beeankha/webhook_enhancement
Webhook Custom HTTP Method + Basic Auth Support

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2019-07-24 21:06:23 +00:00
Jake McDermott
97f841057f fix method mapping for webhook notification add 2019-07-24 15:50:27 -04:00
softwarefactory-project-zuul[bot]
2ccb5ba4a7 Merge pull request #4372 from ryanpetrello/instance-metrics-hostname
include instance hostnames in metrics endpoint

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2019-07-24 18:18:06 +00:00
Ryan Petrello
f2996f1c89 include instance hostnames in metrics endpoint 2019-07-24 13:41:56 -04:00
beeankha
f7502eed2f Correct the comment in migration file 2019-07-24 08:59:32 -04:00
Jake McDermott
1fe18dc588 normalize http method choice values 2019-07-23 19:58:35 -04:00
Keith Grant
2c86d7400a remove duplicate type declaration; lint fixes 2019-07-23 12:22:44 -07:00
beeankha
7580491f1a Add migration file to define http_method explicitly 2019-07-23 14:52:26 -04:00
Keith Grant
2392e57d2f fix InventoriesLookup on new JT form; add DataListRadio tests 2019-07-23 10:49:28 -07:00
Keith Grant
bb5b255c28 updating job template tests 2019-07-23 10:49:28 -07:00
Keith Grant
5edc6deeae finish core InventoriesLookup core functionality 2019-07-23 10:49:28 -07:00
Keith Grant
c080346751 start on InventoriesLookup 2019-07-23 10:49:28 -07:00
softwarefactory-project-zuul[bot]
d9c2bd8ef3 Merge pull request #4364 from AlanCoding/azure_mo_data
Re-create lost data in Azure_rm imports

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2019-07-23 15:47:29 +00:00
Jake McDermott
37e73acb62 cleanup tooling 2019-07-23 11:47:19 -04:00
beeankha
04404c93db Enforce http_method restrictions via API 2019-07-23 11:47:19 -04:00
beeankha
6ef235dcd5 Enable auth header to send with just username field filled in 2019-07-23 11:47:19 -04:00
Jake McDermott
d66106d380 rename docker-notifications to docker-httpbin 2019-07-23 11:47:19 -04:00
beeankha
99737937cd No auth header sent if username/password fields are blank 2019-07-23 11:47:19 -04:00
beeankha
0a0b09b394 Update logic in send method to recognize password field in upgraded webhook notifications 2019-07-23 11:47:19 -04:00
Jake McDermott
2b74b6f9b6 add tooling for basic testing of notification webhooks 2019-07-23 11:47:19 -04:00
beeankha
6e9f74eb17 Updating tests, changing 'method' to 'http_method' 2019-07-23 11:47:19 -04:00
Jake McDermott
cc0310ccd4 add notification webhook fields 2019-07-23 11:47:19 -04:00
beeankha
52b01feafe Change init parameter name to 'http_method' to reduce ambiguity 2019-07-23 11:47:19 -04:00
beeankha
fbb3fd2799 Add custom HTTP method 2019-07-23 11:47:19 -04:00
beeankha
5071e1c75f Update webhook backend to take username/password 2019-07-23 11:47:19 -04:00
beeankha
6f030256f5 Add username and password fields to webhook backend 2019-07-23 11:47:19 -04:00
softwarefactory-project-zuul[bot]
0fff7465e8 Merge pull request #4360 from ryanpetrello/smart-inv-ignore-conflicts
replace the smart inventory membership lock with a new Django 2.2 flag

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2019-07-23 05:30:53 +00:00
AlanCoding
a0c7471110 Re-create lost data in Azure_rm imports 2019-07-22 15:32:24 -04:00
Ryan Petrello
1cedf244b7 replace the smart inventory membership lock with a new Django 2.2 flag 2019-07-22 11:11:36 -04:00
Ryan Petrello
f6c357659d Merge pull request #4348 from shanemcd/devel
Bump VERSION to 6.1.0
2019-07-18 14:26:18 -04:00
softwarefactory-project-zuul[bot]
8ccccfecf1 Merge pull request #4316 from keithjgrant/222-job-results-details
222 job results details

Reviewed-by: https://github.com/softwarefactory-project-zuul[bot]
2019-07-18 17:15:38 +00:00
Keith Grant
0d7500d349 use unified jobs api to redirect to canonical url from /jobs/:id 2019-07-17 16:12:00 -07:00
Keith Grant
2f9be4796a job detail style tweaks 2019-07-17 09:32:43 -07:00
Keith Grant
183bd4fa80 revert i18n on credential kind (API translates it) 2019-07-17 09:32:43 -07:00
Keith Grant
db4a964e64 run credential type through i18n 2019-07-17 09:32:43 -07:00
Keith Grant
761ed6dec0 prettier 2019-07-17 09:32:43 -07:00
Keith Grant
e3d67117e7 fix job detail breadcrumbs 2019-07-17 09:32:43 -07:00
Keith Grant
552164c25c flush out more type defs; JobDetail tests 2019-07-17 09:32:43 -07:00
Keith Grant
40f9b0dc7f add CredentialChip component 2019-07-17 09:32:43 -07:00
Keith Grant
eee1601528 job details: handle different job types 2019-07-17 09:32:43 -07:00
Keith Grant
da780c9d7c make VariablesInput detect whether value is JSON or YAML on init 2019-07-17 09:32:43 -07:00
Keith Grant
968cc8c79c add variables & artifacts to job detail 2019-07-17 09:32:43 -07:00
Keith Grant
4372e977f0 build basic job details 2019-07-17 09:32:43 -07:00
565 changed files with 36888 additions and 8418 deletions

.gitignore vendored

@@ -125,6 +125,7 @@ local/
requirements/vendor
.i18n_built
.idea/*
*credentials*.y*ml*
# AWX python libs populated by requirements.txt
awx/lib/.deps_built


@@ -134,6 +134,8 @@ Run the following to build the AWX UI:
```bash
(host) $ make ui-devel
```
See [the ui development documentation](awx/ui/README.md) for more information on using the frontend development, build, and test tooling.
### Running the environment
#### Start the containers


@@ -36,7 +36,7 @@ This document provides a guide for installing AWX.
- [PostgreSQL](#postgresql-1)
- [Proxy settings](#proxy-settings)
- [Start the build](#start-the-build-2)
- [Post build](#post-build-1)
- [Post build](#post-build-2)
- [Accessing AWX](#accessing-awx-2)
## Getting started
@@ -72,7 +72,7 @@ Before you can run a deployment, you'll need the following installed in your loc
The system that runs the AWX service will need to satisfy the following requirements
- At leasts 4GB of memory
- At least 4GB of memory
- At least 2 cpu cores
- At least 20GB of space
- Running Docker, Openshift, or Kubernetes


@@ -59,14 +59,14 @@ UI_RELEASE_FLAG_FILE = awx/ui/.release_built
I18N_FLAG_FILE = .i18n_built
.PHONY: awx-link clean clean-tmp clean-venv requirements requirements_dev \
develop refresh adduser migrate dbchange dbshell runserver \
develop refresh adduser migrate dbchange runserver \
receiver test test_unit test_coverage coverage_html \
dev_build release_build release_clean sdist \
ui-docker-machine ui-docker ui-release ui-devel \
ui-test ui-deps ui-test-ci VERSION
# remove ui build artifacts
clean-ui:
clean-ui: clean-languages
rm -rf awx/ui/static/
rm -rf awx/ui/node_modules/
rm -rf awx/ui/test/unit/reports/
@@ -94,6 +94,10 @@ clean-schema:
rm -rf schema.json
rm -rf reference-schema.json
clean-languages:
rm -f $(I18N_FLAG_FILE)
find . -type f -regex ".*\.mo$$" -delete
# Remove temporary build files, compiled Python files.
clean: clean-ui clean-dist
rm -rf awx/public
@@ -242,10 +246,6 @@ migrate:
dbchange:
$(MANAGEMENT_COMMAND) makemigrations
# access database shell, asks for password
dbshell:
sudo -u postgres psql -d awx-dev
server_noattach:
tmux new-session -d -s awx 'exec make uwsgi'
tmux rename-window 'AWX'
@@ -372,6 +372,7 @@ test:
. $(VENV_BASE)/awx/bin/activate; \
fi; \
PYTHONDONTWRITEBYTECODE=1 py.test -p no:cacheprovider -n auto $(TEST_DIRS)
cd awxkit && $(VENV_BASE)/awx/bin/tox -re py2,py3
awx-manage check_migrations --dry-run --check -n 'vNNN_missing_migration_file'
test_unit:
@@ -564,8 +565,8 @@ setup-bundle-build:
mkdir -p $@
docker-auth:
if [ "$(IMAGE_REPOSITORY_AUTH)" ]; then \
docker login -u oauth2accesstoken -p "$(IMAGE_REPOSITORY_AUTH)" $(IMAGE_REPOSITORY_BASE); \
@if [ "$(IMAGE_REPOSITORY_AUTH)" ]; then \
echo "$(IMAGE_REPOSITORY_AUTH)" | docker login -u oauth2accesstoken --password-stdin $(IMAGE_REPOSITORY_BASE); \
fi;
# Docker isolated rampart


@@ -1 +1 @@
6.1.0
7.0.0


@@ -40,7 +40,7 @@ if HAS_DJANGO is True:
# but will support the `usedforsecurity` keyword on RHEL and Centos systems.
# Keep an eye on https://code.djangoproject.com/ticket/28401
target_version = '2.2.2'
target_version = '2.2.4'
if django.__version__ != target_version:
raise RuntimeError(
"Django version other than {target} detected: {current}. "


@@ -34,7 +34,8 @@ from rest_framework.negotiation import DefaultContentNegotiation
# AWX
from awx.api.filters import FieldLookupBackend
from awx.main.models import (
UnifiedJob, UnifiedJobTemplate, User, Role, Credential
UnifiedJob, UnifiedJobTemplate, User, Role, Credential,
WorkflowJobTemplateNode, WorkflowApprovalTemplate
)
from awx.main.access import access_registry
from awx.main.utils import (
@@ -882,6 +883,21 @@ class CopyAPIView(GenericAPIView):
create_kwargs[field.name] = CopyAPIView._decrypt_model_field_if_needed(
obj, field.name, field_val
)
# WorkflowJobTemplateNodes that represent an approval are *special*;
# when we copy them, we actually want to *copy* the UJT they point at
# rather than share the template reference between nodes in disparate
# workflows
if (
isinstance(obj, WorkflowJobTemplateNode) and
isinstance(getattr(obj, 'unified_job_template'), WorkflowApprovalTemplate)
):
new_approval_template, sub_objs = CopyAPIView.copy_model_obj(
None, None, WorkflowApprovalTemplate,
obj.unified_job_template, creater
)
create_kwargs['unified_job_template'] = new_approval_template
new_obj = model.objects.create(**create_kwargs)
logger.debug('Deep copy: Created new object {}({})'.format(
new_obj, model
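The special case above exists because approval nodes must not share one WorkflowApprovalTemplate across copied workflows: acting on the template in one workflow would affect the other. A toy illustration of copy-versus-share, in plain Python rather than the AWX models:

```python
import copy


class ApprovalTemplate:
    def __init__(self, name, timeout):
        self.name = name
        self.timeout = timeout


class Node:
    def __init__(self, template):
        self.unified_job_template = template


original = Node(ApprovalTemplate("deploy sign-off", timeout=3600))

# Sharing the reference ties both workflows to one template object.
shared = Node(original.unified_job_template)

# Copying gives the new workflow an independent template, as the
# CopyAPIView special case does for approval nodes.
copied = Node(copy.copy(original.unified_job_template))
copied.unified_job_template.timeout = 60  # does not affect the original
```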

View File

@@ -5,6 +5,8 @@ from collections import OrderedDict
# Django
from django.core.exceptions import PermissionDenied
from django.db.models.fields import PositiveIntegerField, BooleanField
from django.db.models.fields.related import ForeignKey
from django.http import Http404
from django.utils.encoding import force_text, smart_text
from django.utils.translation import ugettext_lazy as _
@@ -14,9 +16,11 @@ from rest_framework import exceptions
from rest_framework import metadata
from rest_framework import serializers
from rest_framework.relations import RelatedField, ManyRelatedField
from rest_framework.fields import JSONField as DRFJSONField
from rest_framework.request import clone_request
# AWX
from awx.main.fields import JSONField
from awx.main.models import InventorySource, NotificationTemplate
@@ -68,6 +72,8 @@ class Metadata(metadata.SimpleMetadata):
else:
for model_field in serializer.Meta.model._meta.fields:
if field.field_name == model_field.name:
if getattr(model_field, '__accepts_json__', None):
field_info['type'] = 'json'
field_info['filterable'] = True
break
else:
@@ -114,15 +120,48 @@ class Metadata(metadata.SimpleMetadata):
for (notification_type_name, notification_tr_name, notification_type_class) in NotificationTemplate.NOTIFICATION_TYPES:
field_info[notification_type_name] = notification_type_class.init_parameters
# Special handling of notification messages where the required properties
# are conditional on the type selected.
try:
view_model = field.context['view'].model
except (AttributeError, KeyError):
view_model = None
if view_model == NotificationTemplate and field.field_name == 'messages':
for (notification_type_name, notification_tr_name, notification_type_class) in NotificationTemplate.NOTIFICATION_TYPES:
field_info[notification_type_name] = notification_type_class.default_messages
# Update type of fields returned...
model_field = None
if serializer and hasattr(serializer, 'Meta') and hasattr(serializer.Meta, 'model'):
try:
model_field = serializer.Meta.model._meta.get_field(field.field_name)
except Exception:
pass
if field.field_name == 'type':
field_info['type'] = 'choice'
elif field.field_name == 'url':
elif field.field_name in ('url', 'custom_virtualenv', 'token'):
field_info['type'] = 'string'
elif field.field_name in ('related', 'summary_fields'):
field_info['type'] = 'object'
elif isinstance(field, PositiveIntegerField):
field_info['type'] = 'integer'
elif field.field_name in ('created', 'modified'):
field_info['type'] = 'datetime'
elif (
RelatedField in field.__class__.__bases__ or
isinstance(model_field, ForeignKey)
):
field_info['type'] = 'id'
elif (
isinstance(field, JSONField) or
isinstance(model_field, JSONField) or
isinstance(field, DRFJSONField) or
isinstance(getattr(field, 'model_field', None), JSONField)
):
field_info['type'] = 'json'
elif isinstance(model_field, BooleanField):
field_info['type'] = 'boolean'
return field_info
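The branch ladder above maps serializer and model fields onto the type strings reported in OPTIONS metadata, and the dispatch order matters: name-based special cases win before field-class checks. A condensed, dependency-free sketch of that precedence, where `field_kind` stands in for the DRF/Django field classes:

```python
def infer_field_type(field_name, field_kind):
    """Simplified version of the Metadata type mapping: name-based
    special cases first, then checks on the field's kind."""
    if field_name == "type":
        return "choice"
    if field_name in ("url", "custom_virtualenv", "token"):
        return "string"
    if field_name in ("related", "summary_fields"):
        return "object"
    if field_name in ("created", "modified"):
        return "datetime"
    if field_kind in ("related", "foreign_key"):
        return "id"
    if field_kind == "json":
        return "json"
    if field_kind == "boolean":
        return "boolean"
    return "field"  # fall through to the default DRF type
```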


@@ -17,7 +17,7 @@ logger = logging.getLogger('awx.api.permissions')
__all__ = ['ModelAccessPermission', 'JobTemplateCallbackPermission', 'VariableDataPermission',
'TaskPermission', 'ProjectUpdatePermission', 'InventoryInventorySourcesUpdatePermission',
'UserPermission', 'IsSuperUser', 'InstanceGroupTowerPermission',]
'UserPermission', 'IsSuperUser', 'InstanceGroupTowerPermission', 'WorkflowApprovalPermission']
class ModelAccessPermission(permissions.BasePermission):
@@ -196,6 +196,17 @@ class TaskPermission(ModelAccessPermission):
return False
class WorkflowApprovalPermission(ModelAccessPermission):
'''
Permission check used by workflow `approval` and `deny` views to determine
who has access to approve and deny paused workflow nodes
'''
def check_post_permissions(self, request, view, obj=None):
approval = get_object_or_400(view.model, pk=view.kwargs['pk'])
return check_user_access(request.user, view.model, 'approve_or_deny', approval)
class ProjectUpdatePermission(ModelAccessPermission):
'''
Permission check used by ProjectUpdateView to determine who can update projects
@@ -238,4 +249,3 @@ class InstanceGroupTowerPermission(ModelAccessPermission):
if request.method == 'DELETE' and obj.name == "tower":
return False
return super(InstanceGroupTowerPermission, self).has_object_permission(request, view, obj)


@@ -2,6 +2,7 @@
# All Rights Reserved.
from django.utils.safestring import SafeText
from prometheus_client.parser import text_string_to_metric_families
# Django REST Framework
from rest_framework import renderers
@@ -103,3 +104,21 @@ class AnsiTextRenderer(PlainTextRenderer):
class AnsiDownloadRenderer(PlainTextRenderer):
format = "ansi_download"
class PrometheusJSONRenderer(renderers.JSONRenderer):
def render(self, data, accepted_media_type=None, renderer_context=None):
if isinstance(data, dict):
# HTTP errors are {'detail': ErrorDetail(string='...', code=...)}
return super(PrometheusJSONRenderer, self).render(
data, accepted_media_type, renderer_context
)
parsed_metrics = text_string_to_metric_families(data)
data = {}
for family in parsed_metrics:
for sample in family.samples:
data[sample[0]] = {"labels": sample[1], "value": sample[2]}
return super(PrometheusJSONRenderer, self).render(
data, accepted_media_type, renderer_context
)
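PrometheusJSONRenderer's transformation — Prometheus text exposition in, `{metric: {labels, value}}` out — can be shown with the sample tuples the parser yields. A stand-alone sketch using hand-built samples instead of `prometheus_client`, assuming the `(name, labels, value)` sample shape:

```python
def samples_to_json(families):
    """Mirror the renderer's loop: one dict entry per sample."""
    data = {}
    for family in families:
        for name, labels, value in family:
            data[name] = {"labels": labels, "value": value}
    return data


# Hypothetical parsed families; real ones come from
# text_string_to_metric_families() over the /api/v2/metrics body.
families = [
    [("awx_system_info", {"insights_analytics": "False"}, 1.0)],
    [("awx_pending_jobs_total", {}, 3.0)],
]
result = samples_to_json(families)
```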


@@ -13,6 +13,10 @@ from datetime import timedelta
from oauthlib import oauth2
from oauthlib.common import generate_token
# Jinja
from jinja2 import sandbox, StrictUndefined
from jinja2.exceptions import TemplateSyntaxError, UndefinedError, SecurityError
# Django
from django.conf import settings
from django.contrib.auth import update_session_auth_hash
@@ -50,12 +54,12 @@ from awx.main.models import (
CredentialType, CustomInventoryScript, Group, Host, Instance,
InstanceGroup, Inventory, InventorySource, InventoryUpdate,
InventoryUpdateEvent, Job, JobEvent, JobHostSummary, JobLaunchConfig,
JobTemplate, Label, Notification, NotificationTemplate,
JobNotificationMixin, JobTemplate, Label, Notification, NotificationTemplate,
OAuth2AccessToken, OAuth2Application, Organization, Project,
ProjectUpdate, ProjectUpdateEvent, RefreshToken, Role, Schedule,
SystemJob, SystemJobEvent, SystemJobTemplate, Team, UnifiedJob,
UnifiedJobTemplate, WorkflowJob, WorkflowJobNode,
WorkflowJobTemplate, WorkflowJobTemplateNode, StdoutMaxBytesExceeded
UnifiedJobTemplate, WorkflowApproval, WorkflowApprovalTemplate, WorkflowJob,
WorkflowJobNode, WorkflowJobTemplate, WorkflowJobTemplateNode, StdoutMaxBytesExceeded
)
from awx.main.models.base import VERBOSITY_CHOICES, NEW_JOB_TYPE_CHOICES
from awx.main.models.rbac import (
@@ -117,6 +121,8 @@ SUMMARIZABLE_FK_FIELDS = {
'job_template': DEFAULT_SUMMARY_FIELDS,
'workflow_job_template': DEFAULT_SUMMARY_FIELDS,
'workflow_job': DEFAULT_SUMMARY_FIELDS,
'workflow_approval_template': DEFAULT_SUMMARY_FIELDS + ('timeout',),
'workflow_approval': DEFAULT_SUMMARY_FIELDS + ('timeout',),
'schedule': DEFAULT_SUMMARY_FIELDS + ('next_run',),
'unified_job_template': DEFAULT_SUMMARY_FIELDS + ('unified_job_type',),
'last_job': DEFAULT_SUMMARY_FIELDS + ('finished', 'status', 'failed', 'license_error'),
@@ -677,6 +683,8 @@ class UnifiedJobTemplateSerializer(BaseSerializer):
serializer_class = SystemJobTemplateSerializer
elif isinstance(obj, WorkflowJobTemplate):
serializer_class = WorkflowJobTemplateSerializer
elif isinstance(obj, WorkflowApprovalTemplate):
serializer_class = WorkflowApprovalTemplateSerializer
return serializer_class
def to_representation(self, obj):
@@ -778,6 +786,8 @@ class UnifiedJobSerializer(BaseSerializer):
serializer_class = SystemJobSerializer
elif isinstance(obj, WorkflowJob):
serializer_class = WorkflowJobSerializer
elif isinstance(obj, WorkflowApproval):
serializer_class = WorkflowApprovalSerializer
return serializer_class
def to_representation(self, obj):
@@ -834,6 +844,8 @@ class UnifiedJobListSerializer(UnifiedJobSerializer):
serializer_class = SystemJobListSerializer
elif isinstance(obj, WorkflowJob):
serializer_class = WorkflowJobListSerializer
elif isinstance(obj, WorkflowApproval):
serializer_class = WorkflowApprovalListSerializer
return serializer_class
def to_representation(self, obj):
@@ -1285,8 +1297,8 @@ class OrganizationSerializer(BaseSerializer):
class ProjectOptionsSerializer(BaseSerializer):
class Meta:
fields = ('*', 'local_path', 'scm_type', 'scm_url', 'scm_branch',
'scm_clean', 'scm_delete_on_update', 'credential', 'timeout',)
fields = ('*', 'local_path', 'scm_type', 'scm_url', 'scm_branch', 'scm_refspec',
'scm_clean', 'scm_delete_on_update', 'credential', 'timeout', 'scm_revision')
def get_related(self, obj):
res = super(ProjectOptionsSerializer, self).get_related(obj)
@@ -1311,18 +1323,14 @@ class ProjectOptionsSerializer(BaseSerializer):
attrs.pop('local_path', None)
if 'local_path' in attrs and attrs['local_path'] not in valid_local_paths:
errors['local_path'] = _('This path is already being used by another manual project.')
if attrs.get('scm_refspec') and scm_type != 'git':
errors['scm_refspec'] = _('SCM refspec can only be used with git projects.')
if errors:
raise serializers.ValidationError(errors)
return super(ProjectOptionsSerializer, self).validate(attrs)
def to_representation(self, obj):
ret = super(ProjectOptionsSerializer, self).to_representation(obj)
if obj is not None and 'credential' in ret and not obj.credential:
ret['credential'] = None
return ret
class ProjectSerializer(UnifiedJobTemplateSerializer, ProjectOptionsSerializer):
@@ -1338,7 +1346,7 @@ class ProjectSerializer(UnifiedJobTemplateSerializer, ProjectOptionsSerializer):
class Meta:
model = Project
fields = ('*', 'organization', 'scm_update_on_launch',
'scm_update_cache_timeout', 'scm_revision', 'custom_virtualenv',) + \
'scm_update_cache_timeout', 'allow_override', 'custom_virtualenv',) + \
('last_update_failed', 'last_updated') # Backwards compatibility
def get_related(self, obj):
@@ -1388,6 +1396,21 @@ class ProjectSerializer(UnifiedJobTemplateSerializer, ProjectOptionsSerializer):
elif self.instance:
organization = self.instance.organization
if 'allow_override' in attrs and self.instance:
# case where user is turning off this project setting
if self.instance.allow_override and not attrs['allow_override']:
used_by = set(
JobTemplate.objects.filter(
models.Q(project=self.instance),
models.Q(ask_scm_branch_on_launch=True) | ~models.Q(scm_branch="")
).values_list('pk', flat=True)
)
if used_by:
raise serializers.ValidationError({
'allow_override': _('One or more job templates depend on branch override behavior for this project (ids: {}).').format(
' '.join([str(pk) for pk in used_by])
)})
view = self.context.get('view', None)
if not organization and not view.request.user.is_superuser:
# Only allow super users to create orgless projects
@@ -1979,6 +2002,9 @@ class InventorySourceSerializer(UnifiedJobTemplateSerializer, InventorySourceOpt
fields = ('*', 'name', 'inventory', 'update_on_launch', 'update_cache_timeout',
'source_project', 'update_on_project_update') + \
('last_update_failed', 'last_updated') # Backwards compatibility.
extra_kwargs = {
'inventory': {'required': True}
}
def get_related(self, obj):
res = super(InventorySourceSerializer, self).get_related(obj)
@@ -2701,7 +2727,7 @@ class LabelsListMixin(object):
class JobOptionsSerializer(LabelsListMixin, BaseSerializer):
class Meta:
fields = ('*', 'job_type', 'inventory', 'project', 'playbook',
fields = ('*', 'job_type', 'inventory', 'project', 'playbook', 'scm_branch',
'forks', 'limit', 'verbosity', 'extra_vars', 'job_tags',
'force_handlers', 'skip_tags', 'start_at_task', 'timeout',
'use_fact_cache',)
@@ -2748,16 +2774,28 @@ class JobOptionsSerializer(LabelsListMixin, BaseSerializer):
def validate(self, attrs):
if 'project' in self.fields and 'playbook' in self.fields:
project = attrs.get('project', self.instance and self.instance.project or None)
project = attrs.get('project', self.instance.project if self.instance else None)
playbook = attrs.get('playbook', self.instance and self.instance.playbook or '')
scm_branch = attrs.get('scm_branch', self.instance.scm_branch if self.instance else None)
ask_scm_branch_on_launch = attrs.get(
'ask_scm_branch_on_launch', self.instance.ask_scm_branch_on_launch if self.instance else None)
if not project:
raise serializers.ValidationError({'project': _('This field is required.')})
if project and project.scm_type and playbook and force_text(playbook) not in project.playbook_files:
raise serializers.ValidationError({'playbook': _('Playbook not found for project.')})
if project and not project.scm_type and playbook and force_text(playbook) not in project.playbooks:
playbook_not_found = bool(
(
project and project.scm_type and (not project.allow_override) and
playbook and force_text(playbook) not in project.playbook_files
) or
(project and not project.scm_type and playbook and force_text(playbook) not in project.playbooks) # manual
)
if playbook_not_found:
raise serializers.ValidationError({'playbook': _('Playbook not found for project.')})
if project and not playbook:
raise serializers.ValidationError({'playbook': _('Must select playbook for project.')})
if scm_branch and not project.allow_override:
raise serializers.ValidationError({'scm_branch': _('Project does not allow overriding branch.')})
if ask_scm_branch_on_launch and not project.allow_override:
raise serializers.ValidationError({'ask_scm_branch_on_launch': _('Project does not allow overriding branch.')})
ret = super(JobOptionsSerializer, self).validate(attrs)
return ret
@@ -2799,7 +2837,8 @@ class JobTemplateSerializer(JobTemplateMixin, UnifiedJobTemplateSerializer, JobO
class Meta:
model = JobTemplate
fields = ('*', 'host_config_key', 'ask_diff_mode_on_launch', 'ask_variables_on_launch', 'ask_limit_on_launch', 'ask_tags_on_launch',
fields = ('*', 'host_config_key', 'ask_scm_branch_on_launch', 'ask_diff_mode_on_launch', 'ask_variables_on_launch',
'ask_limit_on_launch', 'ask_tags_on_launch',
'ask_skip_tags_on_launch', 'ask_job_type_on_launch', 'ask_verbosity_on_launch', 'ask_inventory_on_launch',
'ask_credential_on_launch', 'survey_enabled', 'become_enabled', 'diff_mode',
'allow_simultaneous', 'custom_virtualenv', 'job_slice_count')
@@ -3364,7 +3403,78 @@ class WorkflowJobCancelSerializer(WorkflowJobSerializer):
fields = ('can_cancel',)
class WorkflowApprovalViewSerializer(UnifiedJobSerializer):
class Meta:
model = WorkflowApproval
fields = []
class WorkflowApprovalSerializer(UnifiedJobSerializer):
can_approve_or_deny = serializers.SerializerMethodField()
approval_expiration = serializers.SerializerMethodField()
timed_out = serializers.ReadOnlyField()
class Meta:
model = WorkflowApproval
fields = ('*', '-controller_node', '-execution_node', 'can_approve_or_deny', 'approval_expiration', 'timed_out',)
def get_approval_expiration(self, obj):
if obj.status != 'pending' or obj.timeout == 0:
return None
return obj.created + timedelta(seconds=obj.timeout)
def get_can_approve_or_deny(self, obj):
request = self.context.get('request', None)
allowed = request.user.can_access(WorkflowApproval, 'approve_or_deny', obj)
return allowed is True and obj.status == 'pending'
def get_related(self, obj):
res = super(WorkflowApprovalSerializer, self).get_related(obj)
if obj.workflow_approval_template:
res['workflow_approval_template'] = self.reverse('api:workflow_approval_template_detail',
kwargs={'pk': obj.workflow_approval_template.pk})
res['approve'] = self.reverse('api:workflow_approval_approve', kwargs={'pk': obj.pk})
res['deny'] = self.reverse('api:workflow_approval_deny', kwargs={'pk': obj.pk})
return res
class WorkflowApprovalActivityStreamSerializer(WorkflowApprovalSerializer):
"""
timed_out and status are usually read-only fields
However, when we generate an activity stream record, we *want* to record
these types of changes. This serializer allows us to do so.
"""
status = serializers.ChoiceField(choices=JobTemplate.JOB_TEMPLATE_STATUS_CHOICES)
timed_out = serializers.BooleanField()
class WorkflowApprovalListSerializer(WorkflowApprovalSerializer, UnifiedJobListSerializer):
class Meta:
fields = ('*', '-controller_node', '-execution_node', 'can_approve_or_deny', 'approval_expiration', 'timed_out',)
class WorkflowApprovalTemplateSerializer(UnifiedJobTemplateSerializer):
class Meta:
model = WorkflowApprovalTemplate
fields = ('*', 'timeout', 'name',)
def get_related(self, obj):
res = super(WorkflowApprovalTemplateSerializer, self).get_related(obj)
if 'last_job' in res:
del res['last_job']
res.update(dict(jobs = self.reverse('api:workflow_approval_template_jobs_list', kwargs={'pk': obj.pk}),))
return res
class LaunchConfigurationBaseSerializer(BaseSerializer):
scm_branch = serializers.CharField(allow_blank=True, allow_null=True, required=False, default=None)
job_type = serializers.ChoiceField(allow_blank=True, allow_null=True, required=False, default=None,
choices=NEW_JOB_TYPE_CHOICES)
job_tags = serializers.CharField(allow_blank=True, allow_null=True, required=False, default=None)
@@ -3377,7 +3487,7 @@ class LaunchConfigurationBaseSerializer(BaseSerializer):
class Meta:
fields = ('*', 'extra_data', 'inventory', # Saved launch-time config fields
'scm_branch', 'job_type', 'job_tags', 'skip_tags', 'limit', 'skip_tags', 'diff_mode', 'verbosity')
def get_related(self, obj):
res = super(LaunchConfigurationBaseSerializer, self).get_related(obj)
@@ -3409,12 +3519,6 @@ class LaunchConfigurationBaseSerializer(BaseSerializer):
ret['extra_data'] = obj.display_extra_vars()
return ret
def get_summary_fields(self, obj):
summary_fields = super(LaunchConfigurationBaseSerializer, self).get_summary_fields(obj)
# Credential would be an empty dictionary in this case
summary_fields.pop('credential', None)
return summary_fields
def validate(self, attrs):
db_extra_data = {}
if self.instance:
@@ -3427,6 +3531,10 @@ class LaunchConfigurationBaseSerializer(BaseSerializer):
ujt = attrs['unified_job_template']
elif self.instance:
ujt = self.instance.unified_job_template
if ujt is None:
if 'workflow_job_template' in attrs:
return {'workflow_job_template': attrs['workflow_job_template']}
return {}
# build additional field survey_passwords to track redacted variables
password_dict = {}
@@ -3496,7 +3604,6 @@ class LaunchConfigurationBaseSerializer(BaseSerializer):
class WorkflowJobTemplateNodeSerializer(LaunchConfigurationBaseSerializer):
credential = DeprecatedCredentialField()
success_nodes = serializers.PrimaryKeyRelatedField(many=True, read_only=True)
failure_nodes = serializers.PrimaryKeyRelatedField(many=True, read_only=True)
always_nodes = serializers.PrimaryKeyRelatedField(many=True, read_only=True)
@@ -3504,11 +3611,12 @@ class WorkflowJobTemplateNodeSerializer(LaunchConfigurationBaseSerializer):
class Meta:
model = WorkflowJobTemplateNode
fields = ('*', 'workflow_job_template', '-name', '-description', 'id', 'url', 'related',
'unified_job_template', 'success_nodes', 'failure_nodes', 'always_nodes',)
def get_related(self, obj):
res = super(WorkflowJobTemplateNodeSerializer, self).get_related(obj)
res['create_approval_template'] = self.reverse('api:workflow_job_template_node_create_approval', kwargs={'pk': obj.pk})
res['success_nodes'] = self.reverse('api:workflow_job_template_node_success_nodes_list', kwargs={'pk': obj.pk})
res['failure_nodes'] = self.reverse('api:workflow_job_template_node_failure_nodes_list', kwargs={'pk': obj.pk})
res['always_nodes'] = self.reverse('api:workflow_job_template_node_always_nodes_list', kwargs={'pk': obj.pk})
@@ -3520,14 +3628,6 @@ class WorkflowJobTemplateNodeSerializer(LaunchConfigurationBaseSerializer):
pass
return res
def build_field(self, field_name, info, model_class, nested_depth):
# have to special-case the field so that DRF will not automagically make it
# read-only because it's a property on the model.
if field_name == 'credential':
return self.build_standard_field(field_name,
self.credential)
return super(WorkflowJobTemplateNodeSerializer, self).build_field(field_name, info, model_class, nested_depth)
def build_relational_field(self, field_name, relation_info):
field_class, field_kwargs = super(WorkflowJobTemplateNodeSerializer, self).build_relational_field(field_name, relation_info)
# workflow_job_template is read-only unless creating a new node.
@@ -3536,65 +3636,21 @@ class WorkflowJobTemplateNodeSerializer(LaunchConfigurationBaseSerializer):
field_kwargs.pop('queryset', None)
return field_class, field_kwargs
def validate(self, attrs):
deprecated_fields = {}
if 'credential' in attrs: # TODO: remove when v2 API is deprecated
deprecated_fields['credential'] = attrs.pop('credential')
view = self.context.get('view')
attrs = super(WorkflowJobTemplateNodeSerializer, self).validate(attrs)
ujt_obj = None
if 'unified_job_template' in attrs:
ujt_obj = attrs['unified_job_template']
elif self.instance:
ujt_obj = self.instance.unified_job_template
if 'credential' in deprecated_fields: # TODO: remove when v2 API is deprecated
cred = deprecated_fields['credential']
attrs['credential'] = cred
if cred is not None:
if not ujt_obj.ask_credential_on_launch:
raise serializers.ValidationError({"credential": _(
"Related template is not configured to accept credentials on launch.")})
cred = Credential.objects.get(pk=cred)
view = self.context.get('view', None)
if (not view) or (not view.request) or (view.request.user not in cred.use_role):
raise PermissionDenied()
return attrs
def create(self, validated_data): # TODO: remove when v2 API is deprecated
deprecated_fields = {}
if 'credential' in validated_data:
deprecated_fields['credential'] = validated_data.pop('credential')
obj = super(WorkflowJobTemplateNodeSerializer, self).create(validated_data)
if 'credential' in deprecated_fields:
if deprecated_fields['credential']:
obj.credentials.add(deprecated_fields['credential'])
return obj
def update(self, obj, validated_data): # TODO: remove when v2 API is deprecated
deprecated_fields = {}
if 'credential' in validated_data:
deprecated_fields['credential'] = validated_data.pop('credential')
obj = super(WorkflowJobTemplateNodeSerializer, self).update(obj, validated_data)
if 'credential' in deprecated_fields:
existing = obj.credentials.filter(credential_type__kind='ssh')
new_cred = deprecated_fields['credential']
if new_cred not in existing:
for cred in existing:
obj.credentials.remove(cred)
if new_cred:
obj.credentials.add(new_cred)
return obj
def get_summary_fields(self, obj):
summary_fields = super(WorkflowJobTemplateNodeSerializer, self).get_summary_fields(obj)
if isinstance(obj.unified_job_template, WorkflowApprovalTemplate):
summary_fields['unified_job_template']['timeout'] = obj.unified_job_template.timeout
return summary_fields
class WorkflowJobNodeSerializer(LaunchConfigurationBaseSerializer):
credential = DeprecatedCredentialField()
success_nodes = serializers.PrimaryKeyRelatedField(many=True, read_only=True)
failure_nodes = serializers.PrimaryKeyRelatedField(many=True, read_only=True)
always_nodes = serializers.PrimaryKeyRelatedField(many=True, read_only=True)
class Meta:
model = WorkflowJobNode
fields = ('*', 'job', 'workflow_job', '-name', '-description', 'id', 'url', 'related',
'unified_job_template', 'success_nodes', 'failure_nodes', 'always_nodes',
'do_not_run',)
@@ -3611,6 +3667,12 @@ class WorkflowJobNodeSerializer(LaunchConfigurationBaseSerializer):
res['workflow_job'] = self.reverse('api:workflow_job_detail', kwargs={'pk': obj.workflow_job.pk})
return res
def get_summary_fields(self, obj):
summary_fields = super(WorkflowJobNodeSerializer, self).get_summary_fields(obj)
if isinstance(obj.job, WorkflowApproval):
summary_fields['job']['timed_out'] = obj.job.timed_out
return summary_fields
class WorkflowJobNodeListSerializer(WorkflowJobNodeSerializer):
pass
@@ -3636,6 +3698,16 @@ class WorkflowJobTemplateNodeDetailSerializer(WorkflowJobTemplateNodeSerializer)
return field_class, field_kwargs
class WorkflowJobTemplateNodeCreateApprovalSerializer(BaseSerializer):
class Meta:
model = WorkflowApprovalTemplate
fields = ('timeout', 'name', 'description',)
def to_representation(self, obj):
return {}
class JobListSerializer(JobSerializer, UnifiedJobListSerializer):
pass
@@ -3960,6 +4032,7 @@ class JobLaunchSerializer(BaseSerializer):
required=False, write_only=True
)
credential_passwords = VerbatimField(required=False, write_only=True)
scm_branch = serializers.CharField(required=False, write_only=True, allow_blank=True)
diff_mode = serializers.BooleanField(required=False, write_only=True)
job_tags = serializers.CharField(required=False, write_only=True, allow_blank=True)
job_type = serializers.ChoiceField(required=False, choices=NEW_JOB_TYPE_CHOICES, write_only=True)
@@ -3970,13 +4043,15 @@ class JobLaunchSerializer(BaseSerializer):
class Meta:
model = JobTemplate
fields = ('can_start_without_user_input', 'passwords_needed_to_start',
'extra_vars', 'inventory', 'scm_branch', 'limit', 'job_tags', 'skip_tags', 'job_type', 'verbosity', 'diff_mode',
'credentials', 'credential_passwords',
'ask_scm_branch_on_launch', 'ask_variables_on_launch', 'ask_tags_on_launch',
'ask_diff_mode_on_launch', 'ask_skip_tags_on_launch', 'ask_job_type_on_launch', 'ask_limit_on_launch',
'ask_verbosity_on_launch', 'ask_inventory_on_launch', 'ask_credential_on_launch',
'survey_enabled', 'variables_needed_to_start', 'credential_needed_to_start',
'inventory_needed_to_start', 'job_template_data', 'defaults', 'verbosity')
read_only_fields = (
'ask_scm_branch_on_launch',
'ask_diff_mode_on_launch', 'ask_variables_on_launch', 'ask_limit_on_launch', 'ask_tags_on_launch',
'ask_skip_tags_on_launch', 'ask_job_type_on_launch', 'ask_verbosity_on_launch',
'ask_inventory_on_launch', 'ask_credential_on_launch',)
@@ -4162,7 +4237,8 @@ class NotificationTemplateSerializer(BaseSerializer):
class Meta:
model = NotificationTemplate
fields = ('*', 'organization', 'notification_type', 'notification_configuration', 'messages')
type_map = {"string": (str,),
"int": (int,),
@@ -4196,6 +4272,96 @@ class NotificationTemplateSerializer(BaseSerializer):
d['recent_notifications'] = self._recent_notifications(obj)
return d
def validate_messages(self, messages):
if messages is None:
return None
error_list = []
collected_messages = []
# Validate structure / content types
if not isinstance(messages, dict):
error_list.append(_("Expected dict for 'messages' field, found {}".format(type(messages))))
else:
for event in messages:
if event not in ['started', 'success', 'error']:
error_list.append(_("Event '{}' invalid, must be one of 'started', 'success', or 'error'").format(event))
continue
event_messages = messages[event]
if event_messages is None:
continue
if not isinstance(event_messages, dict):
error_list.append(_("Expected dict for event '{}', found {}").format(event, type(event_messages)))
continue
for message_type in event_messages:
if message_type not in ['message', 'body']:
error_list.append(_("Message type '{}' invalid, must be either 'message' or 'body'").format(message_type))
continue
message = event_messages[message_type]
if message is None:
continue
if not isinstance(message, str):
error_list.append(_("Expected string for '{}', found {}, ").format(message_type, type(message)))
continue
if message_type == 'message':
if '\n' in message:
error_list.append(_("Messages cannot contain newlines (found newline in {} event)".format(event)))
continue
collected_messages.append(message)
# Subclass to return name of undefined field
class DescriptiveUndefined(StrictUndefined):
# The parent class prevents _accessing attributes_ of an object
# but will render undefined objects with 'Undefined'. This
# prevents their use entirely.
__repr__ = __str__ = StrictUndefined._fail_with_undefined_error
def __init__(self, *args, **kwargs):
super(DescriptiveUndefined, self).__init__(*args, **kwargs)
# When an undefined field is encountered, return the name
# of the undefined field in the exception message
# (StrictUndefined refers to the explicitly set exception
# message as the 'hint')
self._undefined_hint = self._undefined_name
# Ensure messages can be rendered
for msg in collected_messages:
env = sandbox.ImmutableSandboxedEnvironment(undefined=DescriptiveUndefined)
try:
env.from_string(msg).render(JobNotificationMixin.context_stub())
except TemplateSyntaxError as exc:
error_list.append(_("Unable to render message '{}': {}".format(msg, exc.message)))
except UndefinedError as exc:
error_list.append(_("Field '{}' unavailable".format(exc.message)))
except SecurityError as exc:
error_list.append(_("Security error due to field '{}'".format(exc.message)))
# Ensure that if a webhook body was provided, that it can be rendered as a dictionary
notification_type = ''
if self.instance:
notification_type = getattr(self.instance, 'notification_type', '')
else:
notification_type = self.initial_data.get('notification_type', '')
if notification_type == 'webhook':
for event in messages:
if not messages[event]:
continue
body = messages[event].get('body', {})
if body:
try:
potential_body = json.loads(body)
if not isinstance(potential_body, dict):
error_list.append(_("Webhook body for '{}' should be a json dictionary. Found type '{}'."
.format(event, type(potential_body).__name__)))
except json.JSONDecodeError as exc:
error_list.append(_("Webhook body for '{}' is not a valid json dictionary ({}).".format(event, exc)))
if error_list:
raise serializers.ValidationError(error_list)
return messages
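The webhook-body check at the tail of validate_messages boils down to "the body must parse as a JSON dictionary". A minimal standalone sketch of that rule (a hypothetical helper, not the serializer method):

```python
import json

def webhook_body_errors(event, body):
    # Return error strings for a webhook 'body' value, mirroring the
    # json.loads / isinstance(dict) checks in validate_messages.
    errors = []
    if not body:
        return errors
    try:
        parsed = json.loads(body)
        if not isinstance(parsed, dict):
            errors.append("Webhook body for '{}' should be a json dictionary. "
                          "Found type '{}'.".format(event, type(parsed).__name__))
    except json.JSONDecodeError as exc:
        errors.append("Webhook body for '{}' is not a valid json dictionary ({}).".format(event, exc))
    return errors

print(webhook_body_errors('started', '{"text": "job started"}'))  # []
print(webhook_body_errors('error', '[1, 2, 3]'))  # one "should be a json dictionary" error
```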
def validate(self, attrs):
from awx.api.views import NotificationTemplateDetail
@@ -4212,6 +4378,7 @@ class NotificationTemplateSerializer(BaseSerializer):
notification_class = NotificationTemplate.CLASS_FOR_NOTIFICATION_TYPE[notification_type]
missing_fields = []
incorrect_type_fields = []
password_fields_to_forward = []
error_list = []
if 'notification_configuration' not in attrs:
return attrs
@@ -4236,7 +4403,9 @@ class NotificationTemplateSerializer(BaseSerializer):
error_list.append(_("No values specified for field '{}'").format(field))
continue
if field_type == "password" and field_val == "$encrypted$" and object_actual is not None:
password_fields_to_forward.append(field)
if field == "http_method" and field_val.lower() not in ['put', 'post']:
error_list.append(_("HTTP method must be either 'POST' or 'PUT'."))
if missing_fields:
error_list.append(_("Missing required fields for Notification Configuration: {}.").format(missing_fields))
if incorrect_type_fields:
@@ -4245,15 +4414,31 @@ class NotificationTemplateSerializer(BaseSerializer):
type_field_error[1]))
if error_list:
raise serializers.ValidationError(error_list)
# Only pull the existing encrypted passwords from the existing objects
# to assign to the attribute and forward on the call stack IF AND ONLY IF
# we know an error will not be raised in the validation phase.
# Otherwise, the encrypted password will be exposed.
for field in password_fields_to_forward:
attrs['notification_configuration'][field] = object_actual.notification_configuration[field]
return super(NotificationTemplateSerializer, self).validate(attrs)
class NotificationSerializer(BaseSerializer):
body = serializers.SerializerMethodField(
help_text=_('Notification body')
)
class Meta:
model = Notification
fields = ('*', '-name', '-description', 'notification_template', 'error', 'status', 'notifications_sent',
'notification_type', 'recipients', 'subject', 'body')
def get_body(self, obj):
if obj.notification_type == 'webhook' and 'body' in obj.body:
return obj.body['body']
return obj.body
def get_related(self, obj):
res = super(NotificationSerializer, self).get_related(obj)
@@ -4262,6 +4447,15 @@ class NotificationSerializer(BaseSerializer):
))
return res
def to_representation(self, obj):
ret = super(NotificationSerializer, self).to_representation(obj)
if obj.notification_type == 'webhook':
ret.pop('subject')
if obj.notification_type not in ('email', 'webhook', 'pagerduty'):
ret.pop('body')
return ret
class LabelSerializer(BaseSerializer):
@@ -4574,7 +4768,8 @@ class ActivityStreamSerializer(BaseSerializer):
('o_auth2_access_token', ('id', 'user_id', 'description', 'application_id', 'scope')),
('o_auth2_application', ('id', 'name', 'description')),
('credential_type', ('id', 'name', 'description', 'kind', 'managed_by_tower')),
('ad_hoc_command', ('id', 'name', 'status', 'limit')),
('workflow_approval', ('id', 'name', 'unified_job_id')),
]
return field_list
@@ -4683,6 +4878,8 @@ class ActivityStreamSerializer(BaseSerializer):
def _summarize_parent_ujt(self, obj, fk, summary_fields):
summary_keys = {'job': 'job_template',
'workflow_job_template_node': 'workflow_job_template',
'workflow_approval_template': 'workflow_job_template',
'workflow_approval': 'workflow_job',
'schedule': 'unified_job_template'}
if fk not in summary_keys:
return

View File

@@ -71,6 +71,8 @@ from .instance import urls as instance_urls
from .instance_group import urls as instance_group_urls
from .oauth2 import urls as oauth2_urls
from .oauth2_root import urls as oauth2_root_urls
from .workflow_approval_template import urls as workflow_approval_template_urls
from .workflow_approval import urls as workflow_approval_urls
v2_urls = [
@@ -131,8 +133,11 @@ v2_urls = [
url(r'^unified_job_templates/$', UnifiedJobTemplateList.as_view(), name='unified_job_template_list'),
url(r'^unified_jobs/$', UnifiedJobList.as_view(), name='unified_job_list'),
url(r'^activity_stream/', include(activity_stream_urls)),
url(r'^workflow_approval_templates/', include(workflow_approval_template_urls)),
url(r'^workflow_approvals/', include(workflow_approval_urls)),
]
app_name = 'api'
urlpatterns = [
url(r'^$', ApiRootView.as_view(), name='api_root_view'),

View File

@@ -0,0 +1,21 @@
# Copyright (c) 2017 Ansible, Inc.
# All Rights Reserved.
from django.conf.urls import url
from awx.api.views import (
WorkflowApprovalList,
WorkflowApprovalDetail,
WorkflowApprovalApprove,
WorkflowApprovalDeny,
)
urls = [
url(r'^$', WorkflowApprovalList.as_view(), name='workflow_approval_list'),
url(r'^(?P<pk>[0-9]+)/$', WorkflowApprovalDetail.as_view(), name='workflow_approval_detail'),
url(r'^(?P<pk>[0-9]+)/approve/$', WorkflowApprovalApprove.as_view(), name='workflow_approval_approve'),
url(r'^(?P<pk>[0-9]+)/deny/$', WorkflowApprovalDeny.as_view(), name='workflow_approval_deny'),
]
__all__ = ['urls']
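The pk capture groups in the url patterns above are plain integer regexes; stripped of Django's resolver, the matching can be checked directly (illustrative, using Python's re module):

```python
import re

# Same pattern as the approve route above, minus Django's url() wrapper.
APPROVE_PATTERN = re.compile(r'^(?P<pk>[0-9]+)/approve/$')

match = APPROVE_PATTERN.match('42/approve/')
print(match.group('pk'))                      # '42'
print(APPROVE_PATTERN.match('foo/approve/'))  # None: pk must be digits
```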

View File

@@ -0,0 +1,17 @@
# Copyright (c) 2017 Ansible, Inc.
# All Rights Reserved.
from django.conf.urls import url
from awx.api.views import (
WorkflowApprovalTemplateDetail,
WorkflowApprovalTemplateJobsList,
)
urls = [
url(r'^(?P<pk>[0-9]+)/$', WorkflowApprovalTemplateDetail.as_view(), name='workflow_approval_template_detail'),
url(r'^(?P<pk>[0-9]+)/approvals/$', WorkflowApprovalTemplateJobsList.as_view(), name='workflow_approval_template_jobs_list'),
]
__all__ = ['urls']

View File

@@ -10,6 +10,7 @@ from awx.api.views import (
WorkflowJobTemplateNodeFailureNodesList,
WorkflowJobTemplateNodeAlwaysNodesList,
WorkflowJobTemplateNodeCredentialsList,
WorkflowJobTemplateNodeCreateApproval,
)
@@ -20,6 +21,7 @@ urls = [
url(r'^(?P<pk>[0-9]+)/failure_nodes/$', WorkflowJobTemplateNodeFailureNodesList.as_view(), name='workflow_job_template_node_failure_nodes_list'),
url(r'^(?P<pk>[0-9]+)/always_nodes/$', WorkflowJobTemplateNodeAlwaysNodesList.as_view(), name='workflow_job_template_node_always_nodes_list'),
url(r'^(?P<pk>[0-9]+)/credentials/$', WorkflowJobTemplateNodeCredentialsList.as_view(), name='workflow_job_template_node_credentials_list'),
url(r'^(?P<pk>[0-9]+)/create_approval_template/$', WorkflowJobTemplateNodeCreateApproval.as_view(), name='workflow_job_template_node_create_approval'),
]
__all__ = ['urls']

View File

@@ -91,7 +91,8 @@ from awx.main.redact import UriCleaner
from awx.api.permissions import (
JobTemplateCallbackPermission, TaskPermission, ProjectUpdatePermission,
InventoryInventorySourcesUpdatePermission, UserPermission,
InstanceGroupTowerPermission, VariableDataPermission,
WorkflowApprovalPermission
)
from awx.api import renderers
from awx.api import serializers
@@ -839,8 +840,6 @@ class SystemJobEventsList(SubListAPIView):
return super(SystemJobEventsList, self).finalize_response(request, response, *args, **kwargs)
class ProjectUpdateCancel(RetrieveAPIView):
model = models.ProjectUpdate
@@ -3013,6 +3012,34 @@ class WorkflowJobTemplateNodeChildrenBaseList(EnforceParentRelationshipMixin, Su
return None
class WorkflowJobTemplateNodeCreateApproval(RetrieveAPIView):
model = models.WorkflowJobTemplateNode
serializer_class = serializers.WorkflowJobTemplateNodeCreateApprovalSerializer
permission_classes = []
def post(self, request, *args, **kwargs):
obj = self.get_object()
serializer = self.get_serializer(instance=obj, data=request.data)
if not serializer.is_valid():
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
approval_template = obj.create_approval_template(**serializer.validated_data)
data = serializers.WorkflowApprovalTemplateSerializer(
approval_template,
context=self.get_serializer_context()
).data
return Response(data, status=status.HTTP_200_OK)
def check_permissions(self, request):
obj = self.get_object().workflow_job_template
if request.method == 'POST':
if not request.user.can_access(models.WorkflowJobTemplate, 'change', obj, request.data):
self.permission_denied(request)
else:
if not request.user.can_access(models.WorkflowJobTemplate, 'read', obj):
self.permission_denied(request)
class WorkflowJobTemplateNodeSuccessNodesList(WorkflowJobTemplateNodeChildrenBaseList):
relationship = 'success_nodes'
@@ -3287,7 +3314,7 @@ class WorkflowJobTemplateActivityStreamList(SubListAPIView):
Q(workflow_job_template_node__workflow_job_template=parent)).distinct()
class WorkflowJobList(ListAPIView):
model = models.WorkflowJob
serializer_class = serializers.WorkflowJobListSerializer
@@ -3975,7 +4002,7 @@ class AdHocCommandNotificationsList(SubListAPIView):
search_fields = ('subject', 'notification_type', 'body',)
class SystemJobList(ListAPIView):
model = models.SystemJob
serializer_class = serializers.SystemJobListSerializer
@@ -4405,3 +4432,63 @@ for attr, value in list(locals().items()):
name = camelcase_to_underscore(attr)
view = value.as_view()
setattr(this_module, name, view)
class WorkflowApprovalTemplateDetail(RelatedJobsPreventDeleteMixin, RetrieveUpdateDestroyAPIView):
model = models.WorkflowApprovalTemplate
serializer_class = serializers.WorkflowApprovalTemplateSerializer
class WorkflowApprovalTemplateJobsList(SubListAPIView):
model = models.WorkflowApproval
serializer_class = serializers.WorkflowApprovalListSerializer
parent_model = models.WorkflowApprovalTemplate
relationship = 'approvals'
parent_key = 'workflow_approval_template'
class WorkflowApprovalList(ListCreateAPIView):
model = models.WorkflowApproval
serializer_class = serializers.WorkflowApprovalListSerializer
def get(self, request, *args, **kwargs):
return super(WorkflowApprovalList, self).get(request, *args, **kwargs)
class WorkflowApprovalDetail(UnifiedJobDeletionMixin, RetrieveDestroyAPIView):
model = models.WorkflowApproval
serializer_class = serializers.WorkflowApprovalSerializer
class WorkflowApprovalApprove(RetrieveAPIView):
model = models.WorkflowApproval
serializer_class = serializers.WorkflowApprovalViewSerializer
permission_classes = (WorkflowApprovalPermission,)
def post(self, request, *args, **kwargs):
obj = self.get_object()
if not request.user.can_access(models.WorkflowApproval, 'approve_or_deny', obj):
raise PermissionDenied(detail=_("User does not have permission to approve or deny this workflow."))
if obj.status != 'pending':
return Response({"error": _("This workflow step has already been approved or denied.")}, status=status.HTTP_400_BAD_REQUEST)
obj.approve(request)
return Response(status=status.HTTP_204_NO_CONTENT)
class WorkflowApprovalDeny(RetrieveAPIView):
model = models.WorkflowApproval
serializer_class = serializers.WorkflowApprovalViewSerializer
permission_classes = (WorkflowApprovalPermission,)
def post(self, request, *args, **kwargs):
obj = self.get_object()
if not request.user.can_access(models.WorkflowApproval, 'approve_or_deny', obj):
raise PermissionDenied(detail=_("User does not have permission to approve or deny this workflow."))
if obj.status != 'pending':
return Response({"error": _("This workflow step has already been approved or denied.")}, status=status.HTTP_400_BAD_REQUEST)
obj.deny(request)
return Response(status=status.HTTP_204_NO_CONTENT)
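Both views share the same guard: only a pending approval may transition, and a second POST gets a 400. A toy state machine showing that contract (class and status names are illustrative, not the AWX API):

```python
class ApprovalState:
    # Toy model of the approve/deny guard shared by both views.
    def __init__(self):
        self.status = 'pending'

    def transition(self, action):
        # action is assumed to be 'approve' or 'deny'
        if self.status != 'pending':
            return (400, 'This workflow step has already been approved or denied.')
        self.status = 'successful' if action == 'approve' else 'failed'
        return (204, None)

step = ApprovalState()
print(step.transition('approve'))  # (204, None)
print(step.transition('deny'))     # 400: already decided
```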

View File

@@ -31,9 +31,10 @@ class MetricsView(APIView):
swagger_topic = 'Metrics'
renderer_classes = [renderers.PlainTextRenderer,
renderers.PrometheusJSONRenderer,
renderers.BrowsableAPIRenderer,]
def get(self, request):
''' Show Metrics Details '''
if (request.user.is_superuser or request.user.is_system_auditor):
return Response(metrics().decode('UTF-8'))

View File

@@ -124,6 +124,7 @@ class ApiVersionRootView(APIView):
data['activity_stream'] = reverse('api:activity_stream_list', request=request)
data['workflow_job_templates'] = reverse('api:workflow_job_template_list', request=request)
data['workflow_jobs'] = reverse('api:workflow_job_list', request=request)
data['workflow_approvals'] = reverse('api:workflow_approval_list', request=request)
data['workflow_job_template_nodes'] = reverse('api:workflow_job_template_node_list', request=request)
data['workflow_job_nodes'] = reverse('api:workflow_job_node_list', request=request)
return Response(data)

View File

@@ -37,6 +37,7 @@ from awx.main.models import (
ProjectUpdateEvent, Role, Schedule, SystemJob, SystemJobEvent,
SystemJobTemplate, Team, UnifiedJob, UnifiedJobTemplate, WorkflowJob,
WorkflowJobNode, WorkflowJobTemplate, WorkflowJobTemplateNode,
WorkflowApproval, WorkflowApprovalTemplate,
ROLE_SINGLETON_SYSTEM_ADMINISTRATOR, ROLE_SINGLETON_SYSTEM_AUDITOR
)
from awx.main.models.mixins import ResourceMixin
@@ -538,7 +539,7 @@ class InstanceGroupAccess(BaseAccess):
def filtered_queryset(self):
return InstanceGroup.objects.filter(
organization__in=Organization.accessible_pk_qs(self.user, 'admin_role')).distinct()
def can_add(self, data):
return self.user.is_superuser
@@ -833,10 +834,6 @@ class InventoryAccess(BaseAccess):
def filtered_queryset(self, allowed=None, ad_hoc=None):
return self.model.accessible_objects(self.user, 'read_role')
@check_superuser
def can_read(self, obj):
return self.user in obj.read_role
@check_superuser
def can_use(self, obj):
return self.user in obj.use_role
@@ -906,9 +903,6 @@ class HostAccess(BaseAccess):
def filtered_queryset(self):
return self.model.objects.filter(inventory__in=Inventory.accessible_pk_qs(self.user, 'read_role'))
def can_read(self, obj):
return obj and self.user in obj.inventory.read_role
def can_add(self, data):
if not data: # So the browseable API will work
return Inventory.accessible_objects(self.user, 'admin_role').exists()
@@ -970,9 +964,6 @@ class GroupAccess(BaseAccess):
def filtered_queryset(self):
return Group.objects.filter(inventory__in=Inventory.accessible_pk_qs(self.user, 'read_role'))
def can_read(self, obj):
return obj and self.user in obj.inventory.read_role
def can_add(self, data):
if not data or 'inventory' not in data:
return False
@@ -1016,12 +1007,6 @@ class InventorySourceAccess(NotificationAttachMixin, BaseAccess):
def filtered_queryset(self):
return self.model.objects.filter(inventory__in=Inventory.accessible_pk_qs(self.user, 'read_role'))
def can_read(self, obj):
if obj and obj.inventory:
return self.user.can_access(Inventory, 'read', obj.inventory)
else:
return False
def can_add(self, data):
if not data or 'inventory' not in data:
return Organization.accessible_objects(self.user, 'admin_role').exists()
@@ -1114,9 +1099,6 @@ class CredentialTypeAccess(BaseAccess):
model = CredentialType
prefetch_related = ('created_by', 'modified_by',)
def can_read(self, obj):
return True
def can_use(self, obj):
return True
@@ -1158,25 +1140,26 @@ class CredentialAccess(BaseAccess):
def filtered_queryset(self):
return self.model.accessible_objects(self.user, 'read_role')
@check_superuser
def can_read(self, obj):
return self.user in obj.read_role
@check_superuser
def can_add(self, data):
if not data: # So the browseable API will work
return True
if data and data.get('user', None):
user_obj = get_object_from_data('user', User, data)
if not bool(self.user == user_obj or UserAccess(self.user).can_admin(user_obj, None, check_setting=False)):
return False
if data and data.get('team', None):
team_obj = get_object_from_data('team', Team, data)
if not check_user_access(self.user, Team, 'change', team_obj, None):
return False
if data and data.get('organization', None):
organization_obj = get_object_from_data('organization', Organization, data)
if not any([check_user_access(self.user, Organization, 'change', organization_obj, None),
self.user in organization_obj.credential_admin_role]):
return False
if not any(data.get(key, None) for key in ('user', 'team', 'organization')):
return False # you have to provide 1 owner field
return True
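The rewritten can_add now requires every supplied owner field to pass its check and at least one of user/team/organization to be present. The control flow can be sketched as follows, with boolean parameters standing in for the real access checks (hypothetical names, not AWX code):

```python
def can_add_credential(data, user_ok=True, team_ok=True, org_ok=True):
    # Sketch of the reworked CredentialAccess.can_add flow: each
    # supplied owner field must pass, and at least one must be supplied.
    if not data:          # so the browsable API will work
        return True
    if data.get('user') and not user_ok:
        return False
    if data.get('team') and not team_ok:
        return False
    if data.get('organization') and not org_ok:
        return False
    if not any(data.get(key) for key in ('user', 'team', 'organization')):
        return False      # you have to provide 1 owner field
    return True

print(can_add_credential({'user': 1}))                            # True
print(can_add_credential({'user': 1, 'team': 2}, team_ok=False))  # False: team check fails
print(can_add_credential({'name': 'x'}))                          # False: no owner field
```

Under the old flow the first matching owner field decided the outcome alone; the new flow rejects a request if any provided owner fails.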
@check_superuser
def can_use(self, obj):
@@ -1219,10 +1202,6 @@ class CredentialInputSourceAccess(BaseAccess):
return CredentialInputSource.objects.filter(
target_credential__in=Credential.accessible_pk_qs(self.user, 'read_role'))
@check_superuser
def can_read(self, obj):
return self.user in obj.target_credential.read_role
@check_superuser
def can_add(self, data):
return (
@@ -1971,10 +1950,6 @@ class WorkflowJobTemplateAccess(NotificationAttachMixin, BaseAccess):
def filtered_queryset(self):
return self.model.accessible_objects(self.user, 'read_role')
@check_superuser
def can_read(self, obj):
return self.user in obj.read_role
@check_superuser
def can_add(self, data):
'''
@@ -2372,13 +2347,18 @@ class UnifiedJobTemplateAccess(BaseAccess):
return self.model.objects.filter(
Q(pk__in=self.model.accessible_pk_qs(self.user, 'read_role')) |
Q(inventorysource__inventory__id__in=Inventory._accessible_pk_qs(
Inventory, self.user, 'read_role'))
)
def can_start(self, obj, validate_license=True):
access_class = access_registry[obj.__class__]
access_instance = access_class(self.user)
return access_instance.can_start(obj, validate_license=validate_license)
def get_queryset(self):
return super(UnifiedJobTemplateAccess, self).get_queryset().filter(
workflowapprovaltemplate__isnull=True)
class UnifiedJobAccess(BaseAccess):
'''
@@ -2425,6 +2405,10 @@ class UnifiedJobAccess(BaseAccess):
)
return qs
def get_queryset(self):
return super(UnifiedJobAccess, self).get_queryset().filter(
workflowapproval__isnull=True)
class ScheduleAccess(BaseAccess):
'''
@@ -2486,14 +2470,6 @@ class NotificationTemplateAccess(BaseAccess):
Q(organization__in=self.user.auditor_of_organizations)
).distinct()
def can_read(self, obj):
if self.user.is_superuser or self.user.is_system_auditor:
return True
if obj.organization is not None:
if self.user in obj.organization.notification_admin_role or self.user in obj.organization.auditor_role:
return True
return False
@check_superuser
def can_add(self, data):
if not data:
@@ -2533,9 +2509,6 @@ class NotificationAccess(BaseAccess):
Q(notification_template__organization__in=self.user.auditor_of_organizations)
).distinct()
def can_read(self, obj):
return self.user.can_access(NotificationTemplate, 'read', obj.notification_template)
def can_delete(self, obj):
return self.user.can_access(NotificationTemplate, 'delete', obj.notification_template)
@@ -2550,10 +2523,6 @@ class LabelAccess(BaseAccess):
def filtered_queryset(self):
return self.model.objects.all()
@check_superuser
def can_read(self, obj):
return self.user in obj.organization.read_role
@check_superuser
def can_add(self, data):
if not data: # So the browseable API will work
@@ -2711,15 +2680,6 @@ class RoleAccess(BaseAccess):
result = result | super_qs
return result
def can_read(self, obj):
if not obj:
return False
if self.user.is_superuser or self.user.is_system_auditor:
return True
return Role.filter_visible_roles(
self.user, Role.objects.filter(pk=obj.id)).exists()
def can_add(self, obj, data):
# Unsupported for now
return False
@@ -2764,5 +2724,80 @@ class RoleAccess(BaseAccess):
return False
class WorkflowApprovalAccess(BaseAccess):
'''
A user can create a workflow approval if they are a superuser, an org admin
of the org connected to the workflow, or if they are assigned as admins to
the workflow.
A user can approve a workflow when they are:
- a superuser
- a workflow admin
- an organization admin
- any user who has explicitly been assigned the "approver" role
A user can see approvals if they have read access to the associated WorkflowJobTemplate.
'''
model = WorkflowApproval
prefetch_related = ('created_by', 'modified_by',)
def can_use(self, obj):
return True
def can_start(self, obj, validate_license=True):
return True
def filtered_queryset(self):
return self.model.objects.filter(
unified_job_node__workflow_job__unified_job_template__in=WorkflowJobTemplate.accessible_pk_qs(
self.user, 'read_role'))
def can_approve_or_deny(self, obj):
if (
(obj.workflow_job_template and self.user in obj.workflow_job_template.approval_role) or
self.user.is_superuser
):
return True
class WorkflowApprovalTemplateAccess(BaseAccess):
'''
A user can create a workflow approval if they are a superuser, an org admin
of the org connected to the workflow, or if they are assigned as admins to
the workflow.
A user can approve a workflow when they are:
- a superuser
- a workflow admin
- an organization admin
- any user who has explicitly been assigned the "approver" role at the workflow or organization level
A user can see approval templates if they have read access to the associated WorkflowJobTemplate.
'''
model = WorkflowApprovalTemplate
prefetch_related = ('created_by', 'modified_by',)
@check_superuser
def can_add(self, data):
if data is None: # Hide direct creation in API browser
return False
else:
return (self.check_related('workflow_approval_template', UnifiedJobTemplate, role_field='admin_role'))
def can_start(self, obj, validate_license=False):
# for copying WFJTs that contain approval nodes
if self.user.is_superuser:
return True
return self.user in obj.workflow_job_template.execute_role
def filtered_queryset(self):
return self.model.objects.filter(
workflowjobtemplatenodes__workflow_job_template__in=WorkflowJobTemplate.accessible_pk_qs(
self.user, 'read_role'))
for cls in BaseAccess.__subclasses__():
access_registry[cls.model] = cls


@@ -1 +1 @@
from .core import register, gather, ship # noqa
from .core import register, gather, ship, table_version # noqa


@@ -12,14 +12,14 @@ from awx.main.utils import (get_awx_version, get_ansible_version,
get_custom_venv_choices, camelcase_to_underscore)
from awx.main import models
from django.contrib.sessions.models import Session
from awx.main.analytics import register
from awx.main.analytics import register, table_version
'''
This module is used to define metrics collected by awx.main.analytics.gather()
Each function is decorated with a key name, and should return a data
structure that can be serialized to JSON
@register('something')
@register('something', '1.0')
def something(since):
# the generated archive will contain a `something.json` w/ this JSON
return {'some': 'json'}
@@ -31,7 +31,7 @@ data _since_ the last report date - i.e., new data in the last 24 hours)
'''
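The registration pattern described in the docstring above can be sketched standalone as follows (names mirror `awx.main.analytics`, but this is a simplified illustration: the real `gather()` discovers collectors via `inspect.getmembers` on the collectors module and writes each result to a JSON file in a tarball, while this version just returns a dict):

```python
import json


def register(key, version):
    """Attach a collection key and schema version to a collector function."""
    def decorate(f):
        f.__awx_analytics_key__ = key
        f.__awx_analytics_version__ = version
        return f
    return decorate


@register('something', '1.0')
def something(since):
    # the generated archive would contain a `something.json` with this data
    return {'some': 'json'}


def gather(collectors):
    """Serialize every registered collector's output, keyed by file name."""
    results = {}
    for func in collectors:
        if hasattr(func, '__awx_analytics_key__'):
            key = func.__awx_analytics_key__
            results['{}.json'.format(key)] = json.dumps(func(None))
    return results
```

Because the key and version live as attributes on the function object itself, the gatherer needs no central registry list; any decorated function found by introspection is collected automatically.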
@register('config')
@register('config', '1.0')
def config(since):
license_info = get_license(show_key=False)
install_type = 'traditional'
@@ -62,7 +62,7 @@ def config(since):
}
@register('counts')
@register('counts', '1.0')
def counts(since):
counts = {}
for cls in (models.Organization, models.Team, models.User,
@@ -93,10 +93,11 @@ def counts(since):
counts['active_user_sessions'] = active_user_sessions
counts['active_anonymous_sessions'] = active_anonymous_sessions
counts['running_jobs'] = models.UnifiedJob.objects.exclude(launch_type='sync').filter(status__in=('running', 'waiting',)).count()
counts['pending_jobs'] = models.UnifiedJob.objects.exclude(launch_type='sync').filter(status__in=('pending',)).count()
return counts
@register('org_counts')
@register('org_counts', '1.0')
def org_counts(since):
counts = {}
for org in models.Organization.objects.annotate(num_users=Count('member_role__members', distinct=True),
@@ -108,7 +109,7 @@ def org_counts(since):
return counts
@register('cred_type_counts')
@register('cred_type_counts', '1.0')
def cred_type_counts(since):
counts = {}
for cred_type in models.CredentialType.objects.annotate(num_credentials=Count(
@@ -120,7 +121,7 @@ def cred_type_counts(since):
return counts
@register('inventory_counts')
@register('inventory_counts', '1.0')
def inventory_counts(since):
counts = {}
for inv in models.Inventory.objects.filter(kind='').annotate(num_sources=Count('inventory_sources', distinct=True),
@@ -140,7 +141,7 @@ def inventory_counts(since):
return counts
@register('projects_by_scm_type')
@register('projects_by_scm_type', '1.0')
def projects_by_scm_type(since):
counts = dict(
(t[0] or 'manual', 0)
@@ -159,8 +160,8 @@ def _get_isolated_datetime(last_check):
return last_check
@register('instance_info')
def instance_info(since):
@register('instance_info', '1.0')
def instance_info(since, include_hostnames=False):
info = {}
instances = models.Instance.objects.values_list('hostname').values(
'uuid', 'version', 'capacity', 'cpu', 'memory', 'managed_by_policy', 'hostname', 'last_isolated_check', 'enabled')
@@ -175,11 +176,13 @@ def instance_info(since):
'last_isolated_check': _get_isolated_datetime(instance['last_isolated_check']),
'enabled': instance['enabled']
}
if include_hostnames is True:
instance_info['hostname'] = instance['hostname']
info[instance['uuid']] = instance_info
return info
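The `include_hostnames` switch above keeps hostnames out of the shipped analytics payload by default; they are only included for local consumers such as the Prometheus endpoint. A standalone sketch of that toggle (not the AWX code itself, which builds `instances` from a Django queryset):

```python
def instance_info(instances, include_hostnames=False):
    """Build a per-UUID info dict, optionally exposing hostnames."""
    info = {}
    for instance in instances:
        entry = {'capacity': instance['capacity']}
        if include_hostnames is True:
            entry['hostname'] = instance['hostname']
        info[instance['uuid']] = entry
    return info


rows = [{'uuid': 'abc-123', 'hostname': 'node1.example.com', 'capacity': 57}]
```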
@register('job_counts')
@register('job_counts', '1.0')
def job_counts(since):
counts = {}
counts['total_jobs'] = models.UnifiedJob.objects.exclude(launch_type='sync').count()
@@ -189,7 +192,7 @@ def job_counts(since):
return counts
@register('job_instance_counts')
@register('job_instance_counts', '1.0')
def job_instance_counts(since):
counts = {}
job_types = models.UnifiedJob.objects.exclude(launch_type='sync').values_list(
@@ -204,7 +207,19 @@ def job_instance_counts(since):
return counts
@register('query_info', '1.0')
def query_info(since, collection_type):
query_info = {}
query_info['last_run'] = str(since)
query_info['current_time'] = str(now())
query_info['collection_type'] = collection_type
return query_info
# Copies Job Events from db to a .csv to be shipped
@table_version('events_table.csv', '1.0')
@table_version('unified_jobs_table.csv', '1.0')
@table_version('unified_job_template_table.csv', '1.0')
def copy_tables(since, full_path):
def _copy_table(table, query, path):
file_path = os.path.join(path, table + '_table.csv')
@@ -233,10 +248,12 @@ def copy_tables(since, full_path):
WHERE main_jobevent.created > {}
ORDER BY main_jobevent.id ASC) TO STDOUT WITH CSV HEADER'''.format(since.strftime("'%Y-%m-%d %H:%M:%S'"))
_copy_table(table='events', query=events_query, path=full_path)
unified_job_query = '''COPY (SELECT main_unifiedjob.id,
unified_job_query = '''COPY (SELECT main_unifiedjob.id,
main_unifiedjob.polymorphic_ctype_id,
django_content_type.model,
main_project.organization_id,
main_organization.name as organization_name,
main_unifiedjob.created,
main_unifiedjob.name,
main_unifiedjob.unified_job_template_id,
@@ -252,13 +269,16 @@ def copy_tables(since, full_path):
main_unifiedjob.elapsed,
main_unifiedjob.job_explanation,
main_unifiedjob.instance_group_id
FROM main_unifiedjob, django_content_type
WHERE main_unifiedjob.created > {} AND
main_unifiedjob.polymorphic_ctype_id = django_content_type.id AND
main_unifiedjob.launch_type != 'sync'
FROM main_unifiedjob
JOIN main_job ON main_unifiedjob.id = main_job.unifiedjob_ptr_id
JOIN django_content_type ON main_unifiedjob.polymorphic_ctype_id = django_content_type.id
JOIN main_project ON main_project.unifiedjobtemplate_ptr_id = main_job.project_id
JOIN main_organization ON main_organization.id = main_project.organization_id
WHERE main_unifiedjob.created > {}
AND main_unifiedjob.launch_type != 'sync'
ORDER BY main_unifiedjob.id ASC) TO STDOUT WITH CSV HEADER'''.format(since.strftime("'%Y-%m-%d %H:%M:%S'"))
_copy_table(table='unified_jobs', query=unified_job_query, path=full_path)
unified_job_template_query = '''COPY (SELECT main_unifiedjobtemplate.id,
main_unifiedjobtemplate.polymorphic_ctype_id,
django_content_type.model,
@@ -279,4 +299,3 @@ def copy_tables(since, full_path):
ORDER BY main_unifiedjobtemplate.id ASC) TO STDOUT WITH CSV HEADER'''.format(since.strftime("'%Y-%m-%d %H:%M:%S'"))
_copy_table(table='unified_job_template', query=unified_job_template_query, path=full_path)
return
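The export queries above embed the `since` timestamp directly into the SQL string via `strftime` rather than using bound parameters, since the COPY statement is executed as a single literal command. A minimal sketch of that query-building step (illustrative only; the real code then streams the COPY output to a CSV file through a database cursor):

```python
import datetime


def build_events_query(since):
    """Format a COPY ... TO STDOUT query bounded by the last-run timestamp."""
    ts = since.strftime("'%Y-%m-%d %H:%M:%S'")
    return ('COPY (SELECT main_jobevent.id FROM main_jobevent '
            'WHERE main_jobevent.created > {} '
            'ORDER BY main_jobevent.id ASC) TO STDOUT WITH CSV HEADER').format(ts)
```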


@@ -18,11 +18,13 @@ from awx.main.access import access_registry
from awx.main.models.ha import TowerAnalyticsState
__all__ = ['register', 'gather', 'ship']
__all__ = ['register', 'gather', 'ship', 'table_version']
logger = logging.getLogger('awx.main.analytics')
manifest = dict()
def _valid_license():
try:
@@ -35,25 +37,37 @@ def _valid_license():
return True
def register(key):
def register(key, version):
"""
A decorator used to register a function as a metric collector.
Decorated functions should return JSON-serializable objects.
@register('projects_by_scm_type')
@register('projects_by_scm_type', 1)
def projects_by_scm_type():
return {'git': 5, 'svn': 1, 'hg': 0}
"""
def decorate(f):
f.__awx_analytics_key__ = key
f.__awx_analytics_version__ = version
return f
return decorate
def gather(dest=None, module=None):
def table_version(file_name, version):
global manifest
manifest[file_name] = version
def decorate(f):
return f
return decorate
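Note that `table_version` records the file's schema version in the shared manifest at decoration (import) time, before `gather()` ever runs; the decorator returns the function unchanged. A standalone sketch of that behavior:

```python
manifest = {}


def table_version(file_name, version):
    """Record a CSV file's schema version in the shared manifest."""
    manifest[file_name] = version

    def decorate(f):
        # the function itself is returned untouched
        return f
    return decorate


@table_version('events_table.csv', '1.0')
@table_version('unified_jobs_table.csv', '1.0')
def copy_tables(since, full_path):
    return None
```

Stacking the decorator registers one manifest entry per CSV file produced by the same function, which is why `copy_tables` in the diff carries three `@table_version` lines.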
def gather(dest=None, module=None, collection_type='scheduled'):
"""
Gather all defined metrics and write them as JSON files in a .tgz
@@ -84,18 +98,33 @@ def gather(dest=None, module=None):
from awx.main.analytics import collectors
module = collectors
dest = dest or tempfile.mkdtemp(prefix='awx_analytics')
for name, func in inspect.getmembers(module):
if inspect.isfunction(func) and hasattr(func, '__awx_analytics_key__'):
key = func.__awx_analytics_key__
manifest['{}.json'.format(key)] = func.__awx_analytics_version__
path = '{}.json'.format(os.path.join(dest, key))
with open(path, 'w', encoding='utf-8') as f:
try:
json.dump(func(last_run), f)
if func.__name__ == 'query_info':
json.dump(func(last_run, collection_type=collection_type), f)
else:
json.dump(func(last_run), f)
except Exception:
logger.exception("Could not generate metric {}.json".format(key))
f.close()
os.remove(f.name)
path = os.path.join(dest, 'manifest.json')
with open(path, 'w', encoding='utf-8') as f:
try:
json.dump(manifest, f)
except Exception:
logger.exception("Could not generate manifest.json")
f.close()
os.remove(f.name)
try:
collectors.copy_tables(since=last_run, full_path=dest)
except Exception:


@@ -15,6 +15,7 @@ from awx.main.analytics.collectors import (
counts,
instance_info,
job_instance_counts,
job_counts,
)
@@ -36,11 +37,13 @@ INV_SCRIPT_COUNT = Gauge('awx_inventory_scripts_total', 'Number of inventory scri
USER_SESSIONS = Gauge('awx_sessions_total', 'Number of sessions', ['type',])
CUSTOM_VENVS = Gauge('awx_custom_virtualenvs_total', 'Number of virtualenvs')
RUNNING_JOBS = Gauge('awx_running_jobs_total', 'Number of running jobs on the Tower system')
PENDING_JOBS = Gauge('awx_pending_jobs_total', 'Number of pending jobs on the Tower system')
STATUS = Gauge('awx_status_total', 'Status of Job launched', ['status',])
INSTANCE_CAPACITY = Gauge('awx_instance_capacity', 'Capacity of each node in a Tower system', ['instance_uuid',])
INSTANCE_CPU = Gauge('awx_instance_cpu', 'CPU cores on each node in a Tower system', ['instance_uuid',])
INSTANCE_MEMORY = Gauge('awx_instance_memory', 'RAM (Kb) on each node in a Tower system', ['instance_uuid',])
INSTANCE_INFO = Info('awx_instance', 'Info about each node in a Tower system', ['instance_uuid',])
INSTANCE_CAPACITY = Gauge('awx_instance_capacity', 'Capacity of each node in a Tower system', ['hostname', 'instance_uuid',])
INSTANCE_CPU = Gauge('awx_instance_cpu', 'CPU cores on each node in a Tower system', ['hostname', 'instance_uuid',])
INSTANCE_MEMORY = Gauge('awx_instance_memory', 'RAM (Kb) on each node in a Tower system', ['hostname', 'instance_uuid',])
INSTANCE_INFO = Info('awx_instance', 'Info about each node in a Tower system', ['hostname', 'instance_uuid',])
INSTANCE_LAUNCH_TYPE = Gauge('awx_instance_launch_type_total', 'Type of Job launched', ['node', 'launch_type',])
INSTANCE_STATUS = Gauge('awx_instance_status_total', 'Status of Job launched', ['node', 'status',])
@@ -87,15 +90,21 @@ def metrics():
USER_SESSIONS.labels(type='user').set(current_counts['active_user_sessions'])
USER_SESSIONS.labels(type='anonymous').set(current_counts['active_anonymous_sessions'])
all_job_data = job_counts(None)
statuses = all_job_data.get('status', {})
for status, value in statuses.items():
STATUS.labels(status=status).set(value)
RUNNING_JOBS.set(current_counts['running_jobs'])
PENDING_JOBS.set(current_counts['pending_jobs'])
instance_data = instance_info(None)
for uuid in instance_data:
INSTANCE_CAPACITY.labels(instance_uuid=uuid).set(instance_data[uuid]['capacity'])
INSTANCE_CPU.labels(instance_uuid=uuid).set(instance_data[uuid]['cpu'])
INSTANCE_MEMORY.labels(instance_uuid=uuid).set(instance_data[uuid]['memory'])
INSTANCE_INFO.labels(instance_uuid=uuid).info({
instance_data = instance_info(None, include_hostnames=True)
for uuid, info in instance_data.items():
hostname = info['hostname']
INSTANCE_CAPACITY.labels(hostname=hostname, instance_uuid=uuid).set(instance_data[uuid]['capacity'])
INSTANCE_CPU.labels(hostname=hostname, instance_uuid=uuid).set(instance_data[uuid]['cpu'])
INSTANCE_MEMORY.labels(hostname=hostname, instance_uuid=uuid).set(instance_data[uuid]['memory'])
INSTANCE_INFO.labels(hostname=hostname, instance_uuid=uuid).info({
'enabled': str(instance_data[uuid]['enabled']),
'last_isolated_check': getattr(instance_data[uuid], 'last_isolated_check', 'None'),
'managed_by_policy': str(instance_data[uuid]['managed_by_policy']),
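In Prometheus, each distinct combination of label values identifies its own time series, so adding `hostname` alongside `instance_uuid` changes the series key for every per-instance metric. A stdlib-only sketch of that labeled-gauge behavior (the real code uses the `prometheus_client` library's `Gauge`; this class is a hypothetical stand-in for illustration):

```python
class LabeledGauge:
    """Minimal stand-in for a Prometheus gauge with label dimensions."""

    def __init__(self, name, labelnames):
        self.name = name
        self.labelnames = tuple(labelnames)
        self.values = {}  # one entry per distinct label combination

    def labels(self, **labels):
        key = tuple(labels[name] for name in self.labelnames)
        outer = self

        class _Child:
            def set(self, value):
                outer.values[key] = value

        return _Child()


INSTANCE_CAPACITY = LabeledGauge('awx_instance_capacity',
                                 ['hostname', 'instance_uuid'])
INSTANCE_CAPACITY.labels(hostname='node1.example.com',
                         instance_uuid='abc-123').set(57)
```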


@@ -124,6 +124,44 @@ register(
category_slug='system',
)
register(
'REDHAT_USERNAME',
field_class=fields.CharField,
default='',
allow_blank=True,
encrypted=False,
read_only=False,
label=_('Red Hat customer username'),
help_text=_('This username is used to retrieve license information and to send Automation Analytics'), # noqa
category=_('System'),
category_slug='system',
)
register(
'REDHAT_PASSWORD',
field_class=fields.CharField,
default='',
allow_blank=True,
encrypted=True,
read_only=False,
label=_('Red Hat customer password'),
help_text=_('This password is used to retrieve license information and to send Automation Analytics'), # noqa
category=_('System'),
category_slug='system',
)
register(
'AUTOMATION_ANALYTICS_URL',
field_class=fields.URLField,
default='https://cloud.redhat.com',
schemes=('http', 'https'),
allow_plain_hostname=True, # Allow hostname only without TLD.
label=_('Automation Analytics upload URL.'),
help_text=_('This setting is used to configure data collection for the Automation Analytics dashboard'),
category=_('System'),
category_slug='system',
)
register(
'INSTALL_UUID',
field_class=fields.CharField,
@@ -328,6 +366,16 @@ register(
category_slug='jobs',
)
register(
'AWX_COLLECTIONS_ENABLED',
field_class=fields.BooleanField,
default=True,
label=_('Enable Collection(s) Download'),
help_text=_('Allows collections to be dynamically downloaded from a requirements.yml file for SCM projects.'),
category=_('Jobs'),
category_slug='jobs',
)
register(
'STDOUT_MAX_BYTES_DISPLAY',
field_class=fields.IntegerField,


@@ -164,7 +164,7 @@ def is_implicit_parent(parent_role, child_role):
# The only singleton implicit parent is the system admin being
# a parent of the system auditor role
return bool(
child_role.singleton_name == ROLE_SINGLETON_SYSTEM_AUDITOR and
child_role.singleton_name == ROLE_SINGLETON_SYSTEM_AUDITOR and
parent_role.singleton_name == ROLE_SINGLETON_SYSTEM_ADMINISTRATOR
)
# Get the list of implicit parents that were defined at the class level.
@@ -691,10 +691,12 @@ class CredentialInputField(JSONSchemaField):
if model_instance.has_encrypted_ssh_key_data and not value.get('ssh_key_unlock'):
errors['ssh_key_unlock'] = [_('must be set when SSH key is encrypted.')]
if all([
model_instance.inputs.get('ssh_key_data'),
value.get('ssh_key_unlock'),
not model_instance.has_encrypted_ssh_key_data
not model_instance.has_encrypted_ssh_key_data,
'ssh_key_data' not in errors
]):
errors['ssh_key_unlock'] = [_('should not be set when SSH key is not encrypted.')]


@@ -1,6 +1,8 @@
# Copyright (c) 2015 Ansible, Inc.
# All Rights Reserved
import json
from awx.main.utils import get_licenser
from django.core.management.base import BaseCommand
@@ -8,6 +10,15 @@ from django.core.management.base import BaseCommand
class Command(BaseCommand):
"""Returns license type, e.g., 'enterprise', 'open', 'none'"""
def add_arguments(self, parser):
parser.add_argument('--data', dest='data', action='store_true',
help='verbose, prints the actual (sanitized) license')
def handle(self, *args, **options):
super(Command, self).__init__()
return get_licenser().validate().get('license_type', 'none')
license = get_licenser().validate()
if options.get('data'):
if license.get('license_key', '') != 'UNLICENSED':
license['license_key'] = '********'
return json.dumps(license)
return license.get('license_type', 'none')


@@ -23,7 +23,7 @@ class Command(BaseCommand):
self.logger.propagate = False
def handle(self, *args, **options):
tgz = gather()
tgz = gather(collection_type='manual')
self.init_logging()
if tgz:
self.logger.debug(tgz)


@@ -0,0 +1,25 @@
# -*- coding: utf-8 -*-
from __future__ import unicode_literals
from django.db import migrations
def add_webhook_notification_template_fields(apps, schema_editor):
# loop over all existing webhook notification templates and make
# sure they have the new "http_method" field filled in with "POST"
NotificationTemplate = apps.get_model('main', 'notificationtemplate')
webhooks = NotificationTemplate.objects.filter(notification_type='webhook')
for w in webhooks:
w.notification_configuration['http_method'] = 'POST'
w.save()
class Migration(migrations.Migration):
dependencies = [
('main', '0081_v360_notify_on_start'),
]
operations = [
migrations.RunPython(add_webhook_notification_template_fields, migrations.RunPython.noop),
]


@@ -0,0 +1,60 @@
# -*- coding: utf-8 -*-
# Generated by Django 1.11.20 on 2019-06-14 15:08
from __future__ import unicode_literals
import awx.main.fields
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('main', '0082_v360_webhook_http_method'),
]
operations = [
# Add fields for user-provided project refspec
migrations.AddField(
model_name='project',
name='scm_refspec',
field=models.CharField(blank=True, default='', help_text='For git projects, an additional refspec to fetch.', max_length=1024, verbose_name='SCM refspec'),
),
migrations.AddField(
model_name='projectupdate',
name='scm_refspec',
field=models.CharField(blank=True, default='', help_text='For git projects, an additional refspec to fetch.', max_length=1024, verbose_name='SCM refspec'),
),
# Add fields for job specification of project branch
migrations.AddField(
model_name='job',
name='scm_branch',
field=models.CharField(blank=True, default='', help_text='Branch to use in job run. Project default used if blank. Only allowed if project allow_override field is set to true.', max_length=1024),
),
migrations.AddField(
model_name='jobtemplate',
name='ask_scm_branch_on_launch',
field=awx.main.fields.AskForField(blank=True, default=False),
),
migrations.AddField(
model_name='jobtemplate',
name='scm_branch',
field=models.CharField(blank=True, default='', help_text='Branch to use in job run. Project default used if blank. Only allowed if project allow_override field is set to true.', max_length=1024),
),
migrations.AddField(
model_name='project',
name='allow_override',
field=models.BooleanField(default=False, help_text='Allow changing the SCM branch or revision in a job template that uses this project.'),
),
# Fix typo in help_text
migrations.AlterField(
model_name='project',
name='scm_update_cache_timeout',
field=models.PositiveIntegerField(blank=True, default=0, help_text='The number of seconds after the last project update ran that a new project update will be launched as a job dependency.'),
),
# Start tracking the fetched revision on project update model
migrations.AddField(
model_name='projectupdate',
name='scm_revision',
field=models.CharField(blank=True, default='', editable=False, help_text='The SCM Revision discovered by this update for the given project and branch.', max_length=1024, verbose_name='SCM Revision'),
),
]


@@ -0,0 +1,19 @@
# Generated by Django 2.2.4 on 2019-08-16 13:22
from django.db import migrations, models
import awx
class Migration(migrations.Migration):
dependencies = [
('main', '0083_v360_job_branch_override'),
]
operations = [
migrations.AlterField(
model_name='oauth2accesstoken',
name='description',
field=models.TextField(blank=True, default=''),
),
]


@@ -0,0 +1,36 @@
# -*- coding: utf-8 -*-
# Generated by Django 1.11.20 on 2019-06-10 16:56
from __future__ import unicode_literals
from django.db import migrations, models
import awx.main.fields
import awx.main.models.notifications
class Migration(migrations.Migration):
dependencies = [
('main', '0084_v360_token_description'),
]
operations = [
migrations.AddField(
model_name='notificationtemplate',
name='messages',
field=awx.main.fields.JSONField(default=awx.main.models.notifications.NotificationTemplate.default_messages,
help_text='Optional custom messages for notification template.',
null=True,
blank=True),
),
migrations.AlterField(
model_name='notification',
name='notification_type',
field=models.CharField(choices=[('email', 'Email'), ('grafana', 'Grafana'), ('hipchat', 'HipChat'), ('irc', 'IRC'), ('mattermost', 'Mattermost'), ('pagerduty', 'Pagerduty'), ('rocketchat', 'Rocket.Chat'), ('slack', 'Slack'), ('twilio', 'Twilio'), ('webhook', 'Webhook')], max_length=32),
),
migrations.AlterField(
model_name='notificationtemplate',
name='notification_type',
field=models.CharField(choices=[('email', 'Email'), ('grafana', 'Grafana'), ('hipchat', 'HipChat'), ('irc', 'IRC'), ('mattermost', 'Mattermost'), ('pagerduty', 'Pagerduty'), ('rocketchat', 'Rocket.Chat'), ('slack', 'Slack'), ('twilio', 'Twilio'), ('webhook', 'Webhook')], max_length=32),
),
]


@@ -0,0 +1,83 @@
# Generated by Django 2.2.4 on 2019-08-02 17:51
import awx.main.fields
from django.db import migrations, models
import django.db.models.deletion
class Migration(migrations.Migration):
dependencies = [
('main', '0085_v360_add_notificationtemplate_messages'),
]
operations = [
migrations.CreateModel(
name='WorkflowApprovalTemplate',
fields=[
('unifiedjobtemplate_ptr', models.OneToOneField(auto_created=True, on_delete=django.db.models.deletion.CASCADE, parent_link=True, primary_key=True, serialize=False, to='main.UnifiedJobTemplate')),
('timeout', models.IntegerField(blank=True, default=0, help_text='The amount of time (in seconds) before the approval node expires and fails.')),
],
bases=('main.unifiedjobtemplate',),
),
migrations.AddField(
model_name='organization',
name='approval_role',
field=awx.main.fields.ImplicitRoleField(editable=False, null='True', on_delete=django.db.models.deletion.CASCADE, parent_role='admin_role', related_name='+', to='main.Role'),
preserve_default='True',
),
migrations.AddField(
model_name='workflowjobtemplate',
name='approval_role',
field=awx.main.fields.ImplicitRoleField(editable=False, null='True', on_delete=django.db.models.deletion.CASCADE, parent_role=['organization.approval_role', 'admin_role'], related_name='+', to='main.Role'),
preserve_default='True',
),
migrations.AlterField(
model_name='workflowjobnode',
name='unified_job_template',
field=models.ForeignKey(blank=True, default=None, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='workflowjobnodes', to='main.UnifiedJobTemplate'),
),
migrations.AlterField(
model_name='workflowjobtemplatenode',
name='unified_job_template',
field=models.ForeignKey(blank=True, default=None, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='workflowjobtemplatenodes', to='main.UnifiedJobTemplate'),
),
migrations.CreateModel(
name='WorkflowApproval',
fields=[
('unifiedjob_ptr', models.OneToOneField(auto_created=True, on_delete=django.db.models.deletion.CASCADE, parent_link=True, primary_key=True, serialize=False, to='main.UnifiedJob')),
('workflow_approval_template', models.ForeignKey(blank=True, default=None, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='approvals', to='main.WorkflowApprovalTemplate')),
],
bases=('main.unifiedjob',),
),
migrations.AddField(
model_name='activitystream',
name='workflow_approval',
field=models.ManyToManyField(blank=True, to='main.WorkflowApproval'),
),
migrations.AddField(
model_name='activitystream',
name='workflow_approval_template',
field=models.ManyToManyField(blank=True, to='main.WorkflowApprovalTemplate'),
),
migrations.AlterField(
model_name='organization',
name='read_role',
field=awx.main.fields.ImplicitRoleField(editable=False, null='True', on_delete=django.db.models.deletion.CASCADE, parent_role=['member_role', 'auditor_role', 'execute_role', 'project_admin_role', 'inventory_admin_role', 'workflow_admin_role', 'notification_admin_role', 'credential_admin_role', 'job_template_admin_role', 'approval_role'], related_name='+', to='main.Role'),
),
migrations.AlterField(
model_name='workflowjobtemplate',
name='read_role',
field=awx.main.fields.ImplicitRoleField(editable=False, null='True', on_delete=django.db.models.deletion.CASCADE, parent_role=['singleton:system_auditor', 'organization.auditor_role', 'execute_role', 'admin_role', 'approval_role'], related_name='+', to='main.Role'),
),
migrations.AddField(
model_name='workflowapproval',
name='timeout',
field=models.IntegerField(blank=True, default=0, help_text='The amount of time (in seconds) before the approval node expires and fails.'),
),
migrations.AddField(
model_name='workflowapproval',
name='timed_out',
field=models.BooleanField(default=False, help_text='Shows when an approval node (with a timeout assigned to it) has timed out.'),
),
]


@@ -0,0 +1,29 @@
# Generated by Django 2.2.4 on 2019-08-27 21:50
import awx.main.fields
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
('main', '0086_v360_workflow_approval'),
]
operations = [
migrations.AlterField(
model_name='credential',
name='inputs',
field=awx.main.fields.CredentialInputField(blank=True, default=dict, help_text='Enter inputs using either JSON or YAML syntax. Refer to the Ansible Tower documentation for example syntax.'),
),
migrations.AlterField(
model_name='credentialtype',
name='injectors',
field=awx.main.fields.CredentialTypeInjectorField(blank=True, default=dict, help_text='Enter injectors using either JSON or YAML syntax. Refer to the Ansible Tower documentation for example syntax.'),
),
migrations.AlterField(
model_name='credentialtype',
name='inputs',
field=awx.main.fields.CredentialTypeInputField(blank=True, default=dict, help_text='Enter inputs using either JSON or YAML syntax. Refer to the Ansible Tower documentation for example syntax.'),
),
]


@@ -7,7 +7,8 @@ from django.db.models.signals import pre_delete # noqa
# AWX
from awx.main.models.base import ( # noqa
BaseModel, PrimordialModel, prevent_search, CLOUD_INVENTORY_SOURCES, VERBOSITY_CHOICES
BaseModel, PrimordialModel, prevent_search, accepts_json,
CLOUD_INVENTORY_SOURCES, VERBOSITY_CHOICES
)
from awx.main.models.unified_jobs import ( # noqa
UnifiedJob, UnifiedJobTemplate, StdoutMaxBytesExceeded
@@ -48,11 +49,14 @@ from awx.main.models.mixins import ( # noqa
TaskManagerJobMixin, TaskManagerProjectUpdateMixin,
TaskManagerUnifiedJobMixin,
)
from awx.main.models.notifications import Notification, NotificationTemplate # noqa
from awx.main.models.notifications import ( # noqa
Notification, NotificationTemplate,
JobNotificationMixin
)
from awx.main.models.label import Label # noqa
from awx.main.models.workflow import ( # noqa
WorkflowJob, WorkflowJobNode, WorkflowJobOptions, WorkflowJobTemplate,
WorkflowJobTemplateNode,
WorkflowJobTemplateNode, WorkflowApproval, WorkflowApprovalTemplate,
)
from awx.main.models.channels import ChannelGroup # noqa
from awx.api.versioning import reverse
@@ -122,18 +126,22 @@ def user_is_system_auditor(user):
@user_is_system_auditor.setter
def user_is_system_auditor(user, tf):
if user.id:
if tf:
role = Role.singleton('system_auditor')
# must check if member to not duplicate activity stream
if user not in role.members.all():
role.members.add(user)
user._is_system_auditor = True
else:
role = Role.singleton('system_auditor')
if user in role.members.all():
role.members.remove(user)
user._is_system_auditor = False
if not user.id:
# If the user doesn't have a primary key yet (i.e., this is the *first*
# time they've logged in, and we've just created the new User in this
# request), we need one to set up the system auditor role
user.save()
if tf:
role = Role.singleton('system_auditor')
# must check if member to not duplicate activity stream
if user not in role.members.all():
role.members.add(user)
user._is_system_auditor = True
else:
role = Role.singleton('system_auditor')
if user in role.members.all():
role.members.remove(user)
user._is_system_auditor = False
User.add_to_class('is_system_auditor', user_is_system_auditor)
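The system-auditor flag above is attached to Django's `User` model via `add_to_class`; the underlying pattern is an ordinary Python property whose setter mutates role membership, with the duplicate-membership check preserved to avoid double activity-stream entries. A plain-Python sketch of that getter/setter pair (class names and the set standing in for the singleton role are hypothetical):

```python
class User:
    # stands in for Role.singleton('system_auditor').members
    _auditor_members = set()

    @property
    def is_system_auditor(self):
        return self in User._auditor_members

    @is_system_auditor.setter
    def is_system_auditor(self, tf):
        if tf:
            # must check membership first, mirroring the duplicate check above
            if self not in User._auditor_members:
                User._auditor_members.add(self)
        else:
            User._auditor_members.discard(self)


u = User()
u.is_system_auditor = True
```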
@@ -195,6 +203,8 @@ activity_stream_registrar.connect(User)
activity_stream_registrar.connect(WorkflowJobTemplate)
activity_stream_registrar.connect(WorkflowJobTemplateNode)
activity_stream_registrar.connect(WorkflowJob)
activity_stream_registrar.connect(WorkflowApproval)
activity_stream_registrar.connect(WorkflowApprovalTemplate)
activity_stream_registrar.connect(OAuth2Application)
activity_stream_registrar.connect(OAuth2AccessToken)
@@ -205,4 +215,3 @@ prevent_search(RefreshToken._meta.get_field('token'))
prevent_search(OAuth2Application._meta.get_field('client_secret'))
prevent_search(OAuth2Application._meta.get_field('client_id'))
prevent_search(Grant._meta.get_field('code'))


@@ -4,6 +4,7 @@
# Tower
from awx.api.versioning import reverse
from awx.main.fields import JSONField
from awx.main.models.base import accepts_json
# Django
from django.db import models
@@ -34,7 +35,7 @@ class ActivityStream(models.Model):
actor = models.ForeignKey('auth.User', null=True, on_delete=models.SET_NULL, related_name='activity_stream')
operation = models.CharField(max_length=13, choices=OPERATION_CHOICES)
timestamp = models.DateTimeField(auto_now_add=True)
changes = models.TextField(blank=True)
changes = accepts_json(models.TextField(blank=True))
deleted_actor = JSONField(null=True)
action_node = models.CharField(
blank=True,
@@ -66,6 +67,8 @@ class ActivityStream(models.Model):
workflow_job_node = models.ManyToManyField("WorkflowJobNode", blank=True)
workflow_job_template = models.ManyToManyField("WorkflowJobTemplate", blank=True)
workflow_job = models.ManyToManyField("WorkflowJob", blank=True)
workflow_approval_template = models.ManyToManyField("WorkflowApprovalTemplate", blank=True)
workflow_approval = models.ManyToManyField("WorkflowApproval", blank=True)
unified_job_template = models.ManyToManyField("UnifiedJobTemplate", blank=True, related_name='activity_stream_as_unified_job_template+')
unified_job = models.ManyToManyField("UnifiedJob", blank=True, related_name='activity_stream_as_unified_job+')
ad_hoc_command = models.ManyToManyField("AdHocCommand", blank=True)

View File

@@ -408,3 +408,14 @@ def prevent_search(relation):
"""
setattr(relation, '__prevent_search__', True)
return relation
def accepts_json(relation):
"""
Used to mark a model field as allowing JSON, e.g. JobTemplate.extra_vars.
This is *mostly* used as a way to provide type hints for certain fields
so that HTTP OPTIONS reports the type data we need for the CLI to allow
JSON/YAML input.
"""
setattr(relation, '__accepts_json__', True)
return relation
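
The `accepts_json` helper above only tags the field object with a marker attribute; serializer/OPTIONS code can inspect that marker later. A minimal standalone sketch of the same pattern, using a stand-in field class rather than the real Django `TextField`:

```python
class TextField:
    """Stand-in for django.db.models.TextField, for illustration only."""
    def __init__(self, blank=False, default=''):
        self.blank = blank
        self.default = default

def accepts_json(relation):
    # Same pattern as the helper in the diff: tag the field object in place
    # and hand it back, so it can wrap a field declaration inline.
    setattr(relation, '__accepts_json__', True)
    return relation

# Wrapping a field leaves it untouched except for the marker attribute:
variables = accepts_json(TextField(blank=True, default=''))
assert getattr(variables, '__accepts_json__', False) is True
# Fields that were never wrapped simply lack the marker:
assert getattr(TextField(), '__accepts_json__', False) is False
```

Because the marker is just an attribute on the field instance, it composes freely with other markers such as `prevent_search` (the diff nests them, e.g. `prevent_search(accepts_json(models.TextField(...)))`).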

View File

@@ -106,9 +106,8 @@ class Credential(PasswordFieldsModel, CommonModelNameNotUnique, ResourceMixin):
inputs = CredentialInputField(
blank=True,
default=dict,
help_text=_('Enter inputs using either JSON or YAML syntax. Use the '
'radio button to toggle between the two. Refer to the '
'Ansible Tower documentation for example syntax.')
help_text=_('Enter inputs using either JSON or YAML syntax. '
'Refer to the Ansible Tower documentation for example syntax.')
)
admin_role = ImplicitRoleField(
parent_role=[
@@ -344,16 +343,14 @@ class CredentialType(CommonModelNameNotUnique):
inputs = CredentialTypeInputField(
blank=True,
default=dict,
help_text=_('Enter inputs using either JSON or YAML syntax. Use the '
'radio button to toggle between the two. Refer to the '
'Ansible Tower documentation for example syntax.')
help_text=_('Enter inputs using either JSON or YAML syntax. '
'Refer to the Ansible Tower documentation for example syntax.')
)
injectors = CredentialTypeInjectorField(
blank=True,
default=dict,
help_text=_('Enter injectors using either JSON or YAML syntax. Use the '
'radio button to toggle between the two. Refer to the '
'Ansible Tower documentation for example syntax.')
help_text=_('Enter injectors using either JSON or YAML syntax. '
'Refer to the Ansible Tower documentation for example syntax.')
)
@classmethod

View File

@@ -28,10 +28,7 @@ def gce(cred, env, private_data_dir):
if 'INVENTORY_UPDATE_ID' not in env:
env['GCE_EMAIL'] = username
env['GCE_PROJECT'] = project
else:
# gcp_compute inventory plugin requires token_uri
# although it probably should not, since gce_modules do not
json_cred['token_uri'] = 'https://oauth2.googleapis.com/token'
json_cred['token_uri'] = 'https://oauth2.googleapis.com/token'
handle, path = tempfile.mkstemp(dir=private_data_dir)
f = os.fdopen(handle, 'w')

View File

@@ -614,7 +614,13 @@ class BaseCommandEvent(CreatedModifiedModel):
kwargs.pop('created', None)
sanitize_event_keys(kwargs, cls.VALID_KEYS)
return cls.objects.create(**kwargs)
event = cls.objects.create(**kwargs)
if isinstance(event, AdHocCommandEvent):
analytics_logger.info(
'Event data saved.',
extra=dict(python_objects=dict(job_event=event))
)
return event
def get_event_display(self):
'''
@@ -622,6 +628,9 @@ class BaseCommandEvent(CreatedModifiedModel):
'''
return self.event
def get_event_display2(self):
return self.get_event_display()
def get_host_status_counts(self):
return create_host_status_counts(getattr(self, 'event_data', {}))

View File

@@ -45,7 +45,7 @@ from awx.main.models.base import (
CommonModelNameNotUnique,
VarsDictProperty,
CLOUD_INVENTORY_SOURCES,
prevent_search
prevent_search, accepts_json
)
from awx.main.models.events import InventoryUpdateEvent
from awx.main.models.unified_jobs import UnifiedJob, UnifiedJobTemplate
@@ -93,11 +93,11 @@ class Inventory(CommonModelNameNotUnique, ResourceMixin, RelatedJobsMixin):
on_delete=models.SET_NULL,
null=True,
)
variables = models.TextField(
variables = accepts_json(models.TextField(
blank=True,
default='',
help_text=_('Inventory variables in JSON or YAML format.'),
)
))
has_active_failures = models.BooleanField(
default=False,
editable=False,
@@ -309,7 +309,7 @@ class Inventory(CommonModelNameNotUnique, ResourceMixin, RelatedJobsMixin):
# Now use in-memory maps to build up group info.
all_group_names = []
for group in self.groups.only('name', 'id', 'variables'):
for group in self.groups.only('name', 'id', 'variables', 'inventory_id'):
group_info = dict()
if group.id in group_hosts_map:
group_info['hosts'] = group_hosts_map[group.id]
@@ -608,11 +608,11 @@ class Host(CommonModelNameNotUnique, RelatedJobsMixin):
default='',
help_text=_('The value used by the remote inventory source to uniquely identify the host'),
)
variables = models.TextField(
variables = accepts_json(models.TextField(
blank=True,
default='',
help_text=_('Host variables in JSON or YAML format.'),
)
))
last_job = models.ForeignKey(
'Job',
related_name='hosts_as_last_job+',
@@ -796,11 +796,11 @@ class Group(CommonModelNameNotUnique, RelatedJobsMixin):
related_name='children',
blank=True,
)
variables = models.TextField(
variables = accepts_json(models.TextField(
blank=True,
default='',
help_text=_('Group variables in JSON or YAML format.'),
)
))
hosts = models.ManyToManyField(
'Host',
related_name='groups',
@@ -2048,8 +2048,10 @@ class azure_rm(PluginFileInjector):
'provisioning_state': 'provisioning_state | title',
'computer_name': 'name',
'type': 'resource_type',
'private_ip': 'private_ipv4_addresses[0]',
'public_ip': 'public_ipv4_addresses[0]',
'private_ip': 'private_ipv4_addresses[0] if private_ipv4_addresses else None',
'public_ip': 'public_ipv4_addresses[0] if public_ipv4_addresses else None',
'public_ip_name': 'public_ip_name if public_ip_name is defined else None',
'public_ip_id': 'public_ip_id if public_ip_id is defined else None',
'tags': 'tags if tags else None'
}
# Special functionality from script
@@ -2332,6 +2334,12 @@ class gce(PluginFileInjector):
ini_env_reference = 'GCE_INI_PATH'
base_injector = 'managed'
def get_plugin_env(self, *args, **kwargs):
ret = super(gce, self).get_plugin_env(*args, **kwargs)
# We need native jinja2 types so that ip addresses can give JSON null value
ret['ANSIBLE_JINJA2_NATIVE'] = str(True)
return ret
def get_script_env(self, inventory_update, private_data_dir, private_data_files):
env = super(gce, self).get_script_env(inventory_update, private_data_dir, private_data_files)
cred = inventory_update.get_cloud_credential()
@@ -2352,7 +2360,7 @@ class gce(PluginFileInjector):
'gce_name': 'name',
'gce_network': 'networkInterfaces[0].network.name',
'gce_private_ip': 'networkInterfaces[0].networkIP',
'gce_public_ip': 'networkInterfaces[0].accessConfigs[0].natIP',
'gce_public_ip': 'networkInterfaces[0].accessConfigs[0].natIP | default(None)',
'gce_status': 'status',
'gce_subnetwork': 'networkInterfaces[0].subnetwork.name',
'gce_tags': 'tags.get("items", [])',
@@ -2362,7 +2370,7 @@ class gce(PluginFileInjector):
'gce_image': 'image',
# We need this as long as hostnames is non-default, otherwise hosts
# will not be addressed correctly, was returned in script
'ansible_ssh_host': 'networkInterfaces[0].accessConfigs[0].natIP'
'ansible_ssh_host': 'networkInterfaces[0].accessConfigs[0].natIP | default(networkInterfaces[0].networkIP)'
}
def inventory_as_dict(self, inventory_update, private_data_dir):

View File

@@ -27,7 +27,7 @@ from rest_framework.exceptions import ParseError
from awx.api.versioning import reverse
from awx.main.models.base import (
BaseModel, CreatedModifiedModel,
prevent_search,
prevent_search, accepts_json,
JOB_TYPE_CHOICES, VERBOSITY_CHOICES,
VarsDictProperty
)
@@ -96,6 +96,13 @@ class JobOptions(BaseModel):
default='',
blank=True,
)
scm_branch = models.CharField(
max_length=1024,
default='',
blank=True,
help_text=_('Branch to use in job run. Project default used if blank. '
'Only allowed if project allow_override field is set to true.'),
)
forks = models.PositiveIntegerField(
blank=True,
default=0,
@@ -109,10 +116,10 @@ class JobOptions(BaseModel):
blank=True,
default=0,
)
extra_vars = prevent_search(models.TextField(
extra_vars = prevent_search(accepts_json(models.TextField(
blank=True,
default='',
))
)))
job_tags = models.CharField(
max_length=1024,
blank=True,
@@ -234,6 +241,11 @@ class JobTemplate(UnifiedJobTemplate, JobOptions, SurveyJobTemplateMixin, Resour
default=False,
allows_field='credentials'
)
ask_scm_branch_on_launch = AskForField(
blank=True,
default=False,
allows_field='scm_branch'
)
job_slice_count = models.PositiveIntegerField(
blank=True,
default=1,
@@ -387,7 +399,21 @@ class JobTemplate(UnifiedJobTemplate, JobOptions, SurveyJobTemplateMixin, Resour
# no-op case: Fields the same as template's value
# counted as neither accepted or ignored
continue
elif field_name == 'scm_branch' and old_value == '' and self.project and new_value == self.project.scm_branch:
# special case of "not provided" for branches
# job template does not provide branch, runs with default branch
continue
elif getattr(self, ask_field_name):
# Special case where prompts can be rejected based on project setting
if field_name == 'scm_branch':
if not self.project:
rejected_data[field_name] = new_value
errors_dict[field_name] = _('Project is missing.')
continue
if kwargs['scm_branch'] != self.project.scm_branch and not self.project.allow_override:
rejected_data[field_name] = new_value
errors_dict[field_name] = _('Project does not allow override of branch.')
continue
# accepted prompt
prompted_data[field_name] = new_value
else:
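
The branch-prompting rules in the hunk above can be sketched as one standalone function. The names below are hypothetical (this is not the actual `JobTemplate` API), but the decision order mirrors the diff: a value equal to the template's is a no-op, an empty template branch matching the project default is treated as "not provided", and an actual prompt requires both a project and `allow_override`:

```python
def check_scm_branch_prompt(template_branch, project_branch, allow_override,
                            has_project, ask_on_launch, requested):
    """Return ('accepted'|'ignored'|'rejected', reason) for a requested branch."""
    if requested == template_branch:
        # no-op case: same as the template's own value
        return ('ignored', 'same as template value')
    if template_branch == '' and has_project and requested == project_branch:
        # special case of "not provided": runs with the project default branch
        return ('ignored', 'not provided; project default used')
    if not ask_on_launch:
        return ('rejected', 'Field is not configured to prompt on launch.')
    if not has_project:
        return ('rejected', 'Project is missing.')
    if requested != project_branch and not allow_override:
        return ('rejected', 'Project does not allow override of branch.')
    return ('accepted', '')

# A template with no branch of its own, whose project forbids overrides:
print(check_scm_branch_prompt('', 'master', False, True, True, 'devel'))
# → ('rejected', 'Project does not allow override of branch.')
```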
@@ -396,7 +422,7 @@ class JobTemplate(UnifiedJobTemplate, JobOptions, SurveyJobTemplateMixin, Resour
# Not considered an error for manual launch, to support old
# behavior of putting them in ignored_fields and launching anyway
if 'prompts' not in exclude_errors:
errors_dict[field_name] = _('Field is not configured to prompt on launch.').format(field_name=field_name)
errors_dict[field_name] = _('Field is not configured to prompt on launch.')
if ('prompts' not in exclude_errors and
(not getattr(self, 'ask_credential_on_launch', False)) and
@@ -644,7 +670,7 @@ class Job(UnifiedJob, JobOptions, SurveyJobMixin, JobNotificationMixin, TaskMana
data = super(Job, self).notification_data()
all_hosts = {}
# NOTE: Probably related to job event slowness, remove at some point -matburt
if block:
if block and self.status != 'running':
summaries = self.job_host_summaries.all()
while block > 0 and not len(summaries):
time.sleep(1)
@@ -658,7 +684,7 @@ class Job(UnifiedJob, JobOptions, SurveyJobMixin, JobNotificationMixin, TaskMana
failures=h.failures,
ok=h.ok,
processed=h.processed,
skipped=h.skipped)
skipped=h.skipped) # TODO: update with rescued, ignored (see https://github.com/ansible/awx/issues/4394)
data.update(dict(inventory=self.inventory.name if self.inventory else None,
project=self.project.name if self.project else None,
playbook=self.playbook,
@@ -894,27 +920,6 @@ class LaunchTimeConfigBase(BaseModel):
def display_extra_data(self):
return self.display_extra_vars()
@property
def _credential(self):
'''
Only used for workflow nodes to support backward compatibility.
'''
try:
return [cred for cred in self.credentials.all() if cred.credential_type.kind == 'ssh'][0]
except IndexError:
return None
@property
def credential(self):
'''
Returns an integer so it can be used as IntegerField in serializer
'''
cred = self._credential
if cred is not None:
return cred.pk
else:
return None
class LaunchTimeConfig(LaunchTimeConfigBase):
'''

View File

@@ -483,4 +483,3 @@ class RelatedJobsMixin(object):
raise RuntimeError("Programmer error. Expected _get_active_jobs() to return a QuerySet.")
return [dict(id=t[0], type=mapping[t[1]]) for t in jobs.values_list('id', 'polymorphic_ctype_id')]

View File

@@ -2,7 +2,9 @@
# All Rights Reserved.
from copy import deepcopy
import datetime
import logging
import json
from django.db import models
from django.conf import settings
@@ -10,6 +12,8 @@ from django.core.mail.message import EmailMessage
from django.db import connection
from django.utils.translation import ugettext_lazy as _
from django.utils.encoding import smart_str, force_text
from jinja2 import sandbox
from jinja2.exceptions import TemplateSyntaxError, UndefinedError, SecurityError
# AWX
from awx.api.versioning import reverse
@@ -45,7 +49,7 @@ class NotificationTemplate(CommonModelNameNotUnique):
('mattermost', _('Mattermost'), MattermostBackend),
('rocketchat', _('Rocket.Chat'), RocketChatBackend),
('irc', _('IRC'), IrcBackend)]
NOTIFICATION_TYPE_CHOICES = [(x[0], x[1]) for x in NOTIFICATION_TYPES]
NOTIFICATION_TYPE_CHOICES = sorted([(x[0], x[1]) for x in NOTIFICATION_TYPES])
CLASS_FOR_NOTIFICATION_TYPE = dict([(x[0], x[2]) for x in NOTIFICATION_TYPES])
class Meta:
@@ -68,6 +72,45 @@ class NotificationTemplate(CommonModelNameNotUnique):
notification_configuration = JSONField(blank=False)
def default_messages():
return {'started': None, 'success': None, 'error': None}
messages = JSONField(
null=True,
blank=True,
default=default_messages,
help_text=_('Optional custom messages for notification template.'))
def has_message(self, condition):
potential_template = self.messages.get(condition, {})
if potential_template == {}:
return False
if potential_template.get('message', {}) == {}:
return False
return True
def get_message(self, condition):
return self.messages.get(condition, {})
def build_notification_message(self, event_type, context):
env = sandbox.ImmutableSandboxedEnvironment()
templates = self.get_message(event_type)
msg_template = templates.get('message', {})
try:
notification_subject = env.from_string(msg_template).render(**context)
except (TemplateSyntaxError, UndefinedError, SecurityError):
notification_subject = ''
msg_body = templates.get('body', {})
try:
notification_body = env.from_string(msg_body).render(**context)
except (TemplateSyntaxError, UndefinedError, SecurityError):
notification_body = ''
return (notification_subject, notification_body)
def get_absolute_url(self, request=None):
return reverse('api:notification_template_detail', kwargs={'pk': self.pk}, request=request)
@@ -78,6 +121,26 @@ class NotificationTemplate(CommonModelNameNotUnique):
def save(self, *args, **kwargs):
new_instance = not bool(self.pk)
update_fields = kwargs.get('update_fields', [])
# preserve existing notification messages if not overwritten by new messages
if not new_instance:
old_nt = NotificationTemplate.objects.get(pk=self.id)
old_messages = old_nt.messages
new_messages = self.messages
if old_messages is not None and new_messages is not None:
for event in ['started', 'success', 'error']:
if not new_messages.get(event, {}) and old_messages.get(event, {}):
new_messages[event] = old_messages[event]
continue
if new_messages.get(event, {}) and old_messages.get(event, {}):
old_event_msgs = old_messages[event]
new_event_msgs = new_messages[event]
for msg_type in ['message', 'body']:
if msg_type not in new_event_msgs and old_event_msgs.get(msg_type, None):
new_event_msgs[msg_type] = old_event_msgs[msg_type]
new_messages.setdefault(event, None)
for field in filter(lambda x: self.notification_class.init_parameters[x]['type'] == "password",
self.notification_class.init_parameters):
if self.notification_configuration[field].startswith("$encrypted$"):
@@ -118,9 +181,10 @@ class NotificationTemplate(CommonModelNameNotUnique):
def send(self, subject, body):
for field in filter(lambda x: self.notification_class.init_parameters[x]['type'] == "password",
self.notification_class.init_parameters):
self.notification_configuration[field] = decrypt_field(self,
'notification_configuration',
subfield=field)
if field in self.notification_configuration:
self.notification_configuration[field] = decrypt_field(self,
'notification_configuration',
subfield=field)
recipients = self.notification_configuration.pop(self.notification_class.recipient_parameter)
if not isinstance(recipients, list):
recipients = [recipients]
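
The message-preservation block added to `save()` above merges old per-event templates into the new value unless the caller overwrote them. The same merge, as a standalone sketch operating on plain dicts:

```python
def merge_messages(old, new):
    """Keep existing per-event notification messages unless overwritten (sketch)."""
    for event in ['started', 'success', 'error']:
        # event absent/empty in the new messages: carry the old one forward
        if not new.get(event) and old.get(event):
            new[event] = old[event]
            continue
        # both present: fill in only the missing message/body parts
        if new.get(event) and old.get(event):
            for msg_type in ['message', 'body']:
                if msg_type not in new[event] and old[event].get(msg_type):
                    new[event][msg_type] = old[event][msg_type]
        new.setdefault(event, None)
    return new

old = {'started': {'message': 'm1', 'body': 'b1'}, 'success': {'message': 'm2'}, 'error': None}
new = {'started': {'body': 'b9'}}
merged = merge_messages(old, new)
assert merged['started'] == {'body': 'b9', 'message': 'm1'}  # body kept, message inherited
assert merged['success'] == {'message': 'm2'}                # whole event inherited
assert merged['error'] is None                               # untouched events default to None
```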
@@ -200,56 +264,227 @@ class Notification(CreatedModifiedModel):
class JobNotificationMixin(object):
STATUS_TO_TEMPLATE_TYPE = {'succeeded': 'success',
'running': 'started',
'failed': 'error'}
# Tree of fields that can be safely referenced in a notification message
JOB_FIELDS_WHITELIST = ['id', 'type', 'url', 'created', 'modified', 'name', 'description', 'job_type', 'playbook',
'forks', 'limit', 'verbosity', 'job_tags', 'force_handlers', 'skip_tags', 'start_at_task',
'timeout', 'use_fact_cache', 'launch_type', 'status', 'failed', 'started', 'finished',
'elapsed', 'job_explanation', 'execution_node', 'controller_node', 'allow_simultaneous',
'scm_revision', 'diff_mode', 'job_slice_number', 'job_slice_count', 'custom_virtualenv',
{'host_status_counts': ['skipped', 'ok', 'changed', 'failures', 'dark']},
{'playbook_counts': ['play_count', 'task_count']},
{'summary_fields': [{'inventory': ['id', 'name', 'description', 'has_active_failures',
'total_hosts', 'hosts_with_active_failures', 'total_groups',
'groups_with_active_failures', 'has_inventory_sources',
'total_inventory_sources', 'inventory_sources_with_failures',
'organization_id', 'kind']},
{'project': ['id', 'name', 'description', 'status', 'scm_type']},
{'project_update': ['id', 'name', 'description', 'status', 'failed']},
{'job_template': ['id', 'name', 'description']},
{'unified_job_template': ['id', 'name', 'description', 'unified_job_type']},
{'instance_group': ['name', 'id']},
{'created_by': ['id', 'username', 'first_name', 'last_name']},
{'labels': ['count', 'results']},
{'source_workflow_job': ['description', 'elapsed', 'failed', 'id', 'name', 'status']}]}]
@classmethod
def context_stub(cls):
"""Returns a stub context that can be used for validating notification messages.
Context has the same structure as the context that will actually be used to render
a notification message."""
context = {'job': {'allow_simultaneous': False,
'controller_node': 'foo_controller',
'created': datetime.datetime(2018, 11, 13, 6, 4, 0, 0, tzinfo=datetime.timezone.utc),
'custom_virtualenv': 'my_venv',
'description': 'Sample job description',
'diff_mode': False,
'elapsed': 0.403018,
'execution_node': 'awx',
'failed': False,
'finished': False,
'force_handlers': False,
'forks': 0,
'host_status_counts': {'skipped': 1, 'ok': 5, 'changed': 3, 'failures': 0, 'dark': 0},
'id': 42,
'job_explanation': 'Sample job explanation',
'job_slice_count': 1,
'job_slice_number': 0,
'job_tags': '',
'job_type': 'run',
'launch_type': 'workflow',
'limit': 'bar_limit',
'modified': datetime.datetime(2018, 12, 13, 6, 4, 0, 0, tzinfo=datetime.timezone.utc),
'name': 'Stub JobTemplate',
'playbook_counts': {'play_count': 5, 'task_count': 10},
'playbook': 'ping.yml',
'scm_revision': '',
'skip_tags': '',
'start_at_task': '',
'started': '2019-07-29T17:38:14.137461Z',
'status': 'running',
'summary_fields': {'created_by': {'first_name': '',
'id': 1,
'last_name': '',
'username': 'admin'},
'instance_group': {'id': 1, 'name': 'tower'},
'inventory': {'description': 'Sample inventory description',
'groups_with_active_failures': 0,
'has_active_failures': False,
'has_inventory_sources': False,
'hosts_with_active_failures': 0,
'id': 17,
'inventory_sources_with_failures': 0,
'kind': '',
'name': 'Stub Inventory',
'organization_id': 121,
'total_groups': 0,
'total_hosts': 1,
'total_inventory_sources': 0},
'job_template': {'description': 'Sample job template description',
'id': 39,
'name': 'Stub JobTemplate'},
'labels': {'count': 0, 'results': []},
'project': {'description': 'Sample project description',
'id': 38,
'name': 'Stub project',
'scm_type': 'git',
'status': 'successful'},
'project_update': {'id': 5, 'name': 'Stub Project Update', 'description': 'Project Update',
'status': 'running', 'failed': False},
'unified_job_template': {'description': 'Sample unified job template description',
'id': 39,
'name': 'Stub Job Template',
'unified_job_type': 'job'},
'source_workflow_job': {'description': 'Sample workflow job description',
'elapsed': 0.000,
'failed': False,
'id': 88,
'name': 'Stub WorkflowJobTemplate',
'status': 'running'}},
'timeout': 0,
'type': 'job',
'url': '/api/v2/jobs/13/',
'use_fact_cache': False,
'verbosity': 0},
'job_friendly_name': 'Job',
'url': 'https://towerhost/#/jobs/playbook/1010',
'job_summary_dict': """{'url': 'https://towerhost/$/jobs/playbook/13',
'traceback': '',
'status': 'running',
'started': '2019-08-07T21:46:38.362630+00:00',
'project': 'Stub project',
'playbook': 'ping.yml',
'name': 'Stub Job Template',
'limit': '',
'inventory': 'Stub Inventory',
'id': 42,
'hosts': {},
'friendly_name': 'Job',
'finished': False,
'credential': 'Stub credential',
'created_by': 'admin'}"""}
return context
def context(self, serialized_job):
"""Returns a context that can be used for rendering notification messages.
Context contains whitelisted content retrieved from a serialized job object
(see JobNotificationMixin.JOB_FIELDS_WHITELIST), the job's friendly name,
and a url to the job run."""
context = {'job': {},
'job_friendly_name': self.get_notification_friendly_name(),
'url': self.get_ui_url(),
'job_summary_dict': json.dumps(self.notification_data(), indent=4)}
def build_context(node, fields, whitelisted_fields):
for safe_field in whitelisted_fields:
if type(safe_field) is dict:
field, whitelist_subnode = safe_field.copy().popitem()
# ensure content present in job serialization
if field not in fields:
continue
subnode = fields[field]
node[field] = {}
build_context(node[field], subnode, whitelist_subnode)
else:
# ensure content present in job serialization
if safe_field not in fields:
continue
node[safe_field] = fields[safe_field]
build_context(context['job'], serialized_job, self.JOB_FIELDS_WHITELIST)
return context
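
The recursive walk in `build_context` above copies only whitelisted keys out of the serialized job, recursing wherever the whitelist entry is a dict. The same pattern as a self-contained function over plain dicts:

```python
def build_context(node, fields, whitelist):
    """Copy only whitelisted keys from fields into node; dict entries recurse."""
    for safe_field in whitelist:
        if isinstance(safe_field, dict):
            field, sub_whitelist = safe_field.copy().popitem()
            # ensure content is present in the serialized input
            if field not in fields:
                continue
            node[field] = {}
            build_context(node[field], fields[field], sub_whitelist)
        else:
            if safe_field in fields:
                node[safe_field] = fields[safe_field]

serialized = {'id': 42, 'secret': 'x',
              'summary_fields': {'project': {'id': 1, 'token': 'y'}}}
whitelist = ['id', {'summary_fields': [{'project': ['id']}]}]
ctx = {}
build_context(ctx, serialized, whitelist)
# non-whitelisted keys ('secret', 'token') never reach the template context:
assert ctx == {'id': 42, 'summary_fields': {'project': {'id': 1}}}
```

Filtering at context-build time, rather than at render time, means a user-supplied template simply cannot reference anything outside `JOB_FIELDS_WHITELIST`.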
def get_notification_templates(self):
raise RuntimeError("Define me")
def get_notification_friendly_name(self):
raise RuntimeError("Define me")
def _build_notification_message(self, status_str):
def notification_data(self):
raise RuntimeError("Define me")
def build_notification_message(self, nt, status):
env = sandbox.ImmutableSandboxedEnvironment()
from awx.api.serializers import UnifiedJobSerializer
job_serialization = UnifiedJobSerializer(self).to_representation(self)
context = self.context(job_serialization)
msg_template = body_template = None
if nt.messages:
templates = nt.messages.get(self.STATUS_TO_TEMPLATE_TYPE[status], {}) or {}
msg_template = templates.get('message', {})
body_template = templates.get('body', {})
if msg_template:
try:
notification_subject = env.from_string(msg_template).render(**context)
except (TemplateSyntaxError, UndefinedError, SecurityError):
notification_subject = ''
else:
notification_subject = u"{} #{} '{}' {}: {}".format(self.get_notification_friendly_name(),
self.id,
self.name,
status,
self.get_ui_url())
notification_body = self.notification_data()
notification_subject = u"{} #{} '{}' {}: {}".format(self.get_notification_friendly_name(),
self.id,
self.name,
status_str,
notification_body['url'])
notification_body['friendly_name'] = self.get_notification_friendly_name()
if body_template:
try:
notification_body['body'] = env.from_string(body_template).render(**context)
except (TemplateSyntaxError, UndefinedError, SecurityError):
notification_body['body'] = ''
return (notification_subject, notification_body)
def build_notification_succeeded_message(self):
return self._build_notification_message('succeeded')
def build_notification_failed_message(self):
return self._build_notification_message('failed')
def build_notification_running_message(self):
return self._build_notification_message('running')
def send_notification_templates(self, status_str):
def send_notification_templates(self, status):
from awx.main.tasks import send_notifications # avoid circular import
if status_str not in ['succeeded', 'failed', 'running']:
raise ValueError(_("status_str must be either running, succeeded or failed"))
if status not in ['running', 'succeeded', 'failed']:
raise ValueError(_("status must be either running, succeeded or failed"))
try:
notification_templates = self.get_notification_templates()
except Exception:
logger.warning("No notification template defined for emitting notification")
notification_templates = None
if notification_templates:
if status_str == 'succeeded':
notification_template_type = 'success'
elif status_str == 'running':
notification_template_type = 'started'
else:
notification_template_type = 'error'
all_notification_templates = set(notification_templates.get(notification_template_type, []))
if len(all_notification_templates):
try:
(notification_subject, notification_body) = getattr(self, 'build_notification_%s_message' % status_str)()
except AttributeError:
raise NotImplementedError("build_notification_%s_message() does not exist" % status_str)
return
def send_it():
send_notifications.delay([n.generate_notification(notification_subject, notification_body).id
for n in all_notification_templates],
if not notification_templates:
return
for nt in set(notification_templates.get(self.STATUS_TO_TEMPLATE_TYPE[status], [])):
try:
(notification_subject, notification_body) = self.build_notification_message(nt, status)
except AttributeError:
raise NotImplementedError("build_notification_message() does not exist")
# Use kwargs to force late-binding
# https://stackoverflow.com/a/3431699/10669572
def send_it(local_nt=nt, local_subject=notification_subject, local_body=notification_body):
def _func():
send_notifications.delay([local_nt.generate_notification(local_subject, local_body).id],
job_id=self.id)
connection.on_commit(send_it)
return _func
connection.on_commit(send_it())
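
The kwargs trick flagged in the comment above ("Use kwargs to force late-binding") guards against Python's late-binding closures: a closure created in a loop sees the loop variable's final value unless it is captured as a default argument. A minimal illustration of both behaviors:

```python
# Without default-argument binding, every closure sees the *last* loop value:
late = []
for nt in ['a', 'b', 'c']:
    late.append(lambda: nt)
assert [f() for f in late] == ['c', 'c', 'c']

# Binding the loop variable as a default argument captures it per iteration,
# which is why send_it() above takes local_nt/local_subject/local_body kwargs:
bound = []
for nt in ['a', 'b', 'c']:
    bound.append(lambda local_nt=nt: local_nt)
assert [f() for f in bound] == ['a', 'b', 'c']
```

This matters here because `connection.on_commit` defers the callback until the transaction commits, i.e. well after the loop has finished mutating `nt`, `notification_subject`, and `notification_body`.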

View File

@@ -98,8 +98,7 @@ class OAuth2AccessToken(AbstractAccessToken):
related_name="%(app_label)s_%(class)s",
help_text=_('The user representing the token owner')
)
description = models.CharField(
max_length=200,
description = models.TextField(
default='',
blank=True,
)

View File

@@ -87,7 +87,10 @@ class Organization(CommonModel, NotificationFieldsModel, ResourceMixin, CustomVi
'execute_role', 'project_admin_role',
'inventory_admin_role', 'workflow_admin_role',
'notification_admin_role', 'credential_admin_role',
'job_template_admin_role',],
'job_template_admin_role', 'approval_role',],
)
approval_role = ImplicitRoleField(
parent_role='admin_role',
)

View File

@@ -106,6 +106,13 @@ class ProjectOptions(models.Model):
verbose_name=_('SCM Branch'),
help_text=_('Specific branch, tag or commit to checkout.'),
)
scm_refspec = models.CharField(
max_length=1024,
blank=True,
default='',
verbose_name=_('SCM refspec'),
help_text=_('For git projects, an additional refspec to fetch.'),
)
scm_clean = models.BooleanField(
default=False,
help_text=_('Discard any local changes before syncing the project.'),
@@ -241,7 +248,7 @@ class Project(UnifiedJobTemplate, ProjectOptions, ResourceMixin, CustomVirtualEn
SOFT_UNIQUE_TOGETHER = [('polymorphic_ctype', 'name', 'organization')]
FIELDS_TO_PRESERVE_AT_COPY = ['labels', 'instance_groups', 'credentials']
FIELDS_TO_DISCARD_AT_COPY = ['local_path']
FIELDS_TRIGGER_UPDATE = frozenset(['scm_url', 'scm_branch', 'scm_type'])
FIELDS_TRIGGER_UPDATE = frozenset(['scm_url', 'scm_branch', 'scm_type', 'scm_refspec'])
class Meta:
app_label = 'main'
@@ -261,9 +268,14 @@ class Project(UnifiedJobTemplate, ProjectOptions, ResourceMixin, CustomVirtualEn
scm_update_cache_timeout = models.PositiveIntegerField(
default=0,
blank=True,
help_text=_('The number of seconds after the last project update ran that a new'
help_text=_('The number of seconds after the last project update ran that a new '
'project update will be launched as a job dependency.'),
)
allow_override = models.BooleanField(
default=False,
help_text=_('Allow changing the SCM branch or revision in a job template '
'that uses this project.'),
)
scm_revision = models.CharField(
max_length=1024,
@@ -471,6 +483,14 @@ class ProjectUpdate(UnifiedJob, ProjectOptions, JobNotificationMixin, TaskManage
choices=PROJECT_UPDATE_JOB_TYPE_CHOICES,
default='check',
)
scm_revision = models.CharField(
max_length=1024,
blank=True,
default='',
editable=False,
verbose_name=_('SCM Revision'),
help_text=_('The SCM Revision discovered by this update for the given project and branch.'),
)
def _get_parent_field_name(self):
return 'project'

View File

@@ -48,6 +48,7 @@ role_names = {
'read_role': _('Read'),
'update_role': _('Update'),
'use_role': _('Use'),
'approval_role': _('Approve'),
}
role_descriptions = {
@@ -70,6 +71,7 @@ role_descriptions = {
'read_role': _('May view settings for the %s'),
'update_role': _('May update the %s'),
'use_role': _('Can use the %s in a job template'),
'approval_role': _('Can approve or deny a workflow approval node'),
}

View File

@@ -65,8 +65,8 @@ class UnifiedJobTemplate(PolymorphicModel, CommonModelNameNotUnique, Notificatio
# status inherits from related jobs. Thus, status must be able to be set to any status that a job status is settable to.
JOB_STATUS_CHOICES = [
('new', _('New')), # Job has been created, but not started.
('pending', _('Pending')), # Job has been queued, but is not yet running.
('waiting', _('Waiting')), # Job is waiting on an update/dependency.
('pending', _('Pending')), # Job is pending Task Manager processing (blocked by dependency req, capacity or a concurrent job)
('waiting', _('Waiting')), # Job has been assigned to run on a specific node (and is about to run).
('running', _('Running')), # Job is currently running.
('successful', _('Successful')), # Job completed successfully.
('failed', _('Failed')), # Job completed, but with failures.
@@ -1031,7 +1031,7 @@ class UnifiedJob(PolymorphicModel, PasswordFieldsModel, CommonModelNameNotUnique
fd.write = lambda s: _write(smart_text(s))
cursor.copy_expert(
"copy (select stdout from {} where {}={} order by start_line) to stdout".format(
"copy (select stdout from {} where {}={} and stdout != '' order by start_line) to stdout".format(
self._meta.db_table + 'event',
self.event_parent_key,
self.id
@@ -1173,7 +1173,7 @@ class UnifiedJob(PolymorphicModel, PasswordFieldsModel, CommonModelNameNotUnique
def websocket_emit_data(self):
''' Return extra data that should be included when submitting data to the browser over the websocket connection '''
websocket_data = dict()
websocket_data = dict(type=self.get_real_instance_class()._meta.verbose_name.replace(' ', '_'))
if self.spawned_by_workflow:
websocket_data.update(dict(workflow_job_id=self.workflow_job_id,
workflow_node_id=self.workflow_node_id))

View File

@@ -13,7 +13,8 @@ from django.core.exceptions import ObjectDoesNotExist
# AWX
from awx.api.versioning import reverse
from awx.main.models import prevent_search, UnifiedJobTemplate, UnifiedJob
from awx.main.models import (prevent_search, accepts_json, UnifiedJobTemplate,
UnifiedJob)
from awx.main.models.notifications import (
NotificationTemplate,
JobNotificationMixin
@@ -34,11 +35,14 @@ from awx.main.models.jobs import LaunchTimeConfigBase, LaunchTimeConfig, JobTemp
from awx.main.models.credential import Credential
from awx.main.redact import REPLACE_STR
from awx.main.fields import JSONField
from awx.main.utils import schedule_task_manager
from copy import copy
from urllib.parse import urljoin
__all__ = ['WorkflowJobTemplate', 'WorkflowJob', 'WorkflowJobOptions', 'WorkflowJobNode', 'WorkflowJobTemplateNode',]
__all__ = ['WorkflowJobTemplate', 'WorkflowJob', 'WorkflowJobOptions', 'WorkflowJobNode',
'WorkflowJobTemplateNode', 'WorkflowApprovalTemplate', 'WorkflowApproval']
logger = logging.getLogger('awx.main.models.workflow')
@@ -70,7 +74,7 @@ class WorkflowNodeBase(CreatedModifiedModel, LaunchTimeConfig):
unified_job_template = models.ForeignKey(
'UnifiedJobTemplate',
related_name='%(class)ss',
blank=False,
blank=True,
null=True,
default=None,
on_delete=models.SET_NULL,
@@ -160,6 +164,13 @@ class WorkflowJobTemplateNode(WorkflowNodeBase):
new_node.credentials.add(cred)
return new_node
def create_approval_template(self, **kwargs):
approval_template = WorkflowApprovalTemplate(**kwargs)
approval_template.save()
self.unified_job_template = approval_template
self.save()
return approval_template
class WorkflowJobNode(WorkflowNodeBase):
job = models.OneToOneField(
@@ -291,10 +302,10 @@ class WorkflowJobOptions(BaseModel):
class Meta:
abstract = True
extra_vars = prevent_search(models.TextField(
extra_vars = accepts_json(prevent_search(models.TextField(
blank=True,
default='',
))
)))
allow_simultaneous = models.BooleanField(
default=False
)
@@ -384,7 +395,11 @@ class WorkflowJobTemplate(UnifiedJobTemplate, WorkflowJobOptions, SurveyJobTempl
])
read_role = ImplicitRoleField(parent_role=[
'singleton:' + ROLE_SINGLETON_SYSTEM_AUDITOR,
'organization.auditor_role', 'execute_role', 'admin_role'
'organization.auditor_role', 'execute_role', 'admin_role',
'approval_role',
])
approval_role = ImplicitRoleField(parent_role=[
'organization.approval_role', 'admin_role',
])
@property
@@ -600,3 +615,92 @@ class WorkflowJob(UnifiedJob, WorkflowJobOptions, SurveyJobMixin, JobNotificatio
# WorkflowJobs don't _actually_ run anything in the dispatcher, so
# there's no point in asking the dispatcher if it knows about this task
return self.status == 'running'
class WorkflowApprovalTemplate(UnifiedJobTemplate):
FIELDS_TO_PRESERVE_AT_COPY = ['description', 'timeout',]
class Meta:
app_label = 'main'
timeout = models.IntegerField(
blank=True,
default=0,
help_text=_("The amount of time (in seconds) before the approval node expires and fails."),
)
@classmethod
def _get_unified_job_class(cls):
return WorkflowApproval
@classmethod
def _get_unified_job_field_names(cls):
return ['name', 'description', 'timeout']
def get_absolute_url(self, request=None):
return reverse('api:workflow_approval_template_detail', kwargs={'pk': self.pk}, request=request)
@property
def workflow_job_template(self):
return self.workflowjobtemplatenodes.first().workflow_job_template
class WorkflowApproval(UnifiedJob):
class Meta:
app_label = 'main'
workflow_approval_template = models.ForeignKey(
'WorkflowApprovalTemplate',
related_name='approvals',
blank=True,
null=True,
default=None,
on_delete=models.SET_NULL,
)
timeout = models.IntegerField(
blank=True,
default=0,
help_text=_("The amount of time (in seconds) before the approval node expires and fails."),
)
timed_out = models.BooleanField(
default=False,
help_text=_("Shows when an approval node (with a timeout assigned to it) has timed out.")
)
@classmethod
def _get_unified_job_template_class(cls):
return WorkflowApprovalTemplate
def get_absolute_url(self, request=None):
return reverse('api:workflow_approval_detail', kwargs={'pk': self.pk}, request=request)
@property
def event_class(self):
return None
def _get_parent_field_name(self):
return 'workflow_approval_template'
def approve(self, request=None):
self.status = 'successful'
self.save()
self.websocket_emit_status(self.status)
schedule_task_manager()
return reverse('api:workflow_approval_approve', kwargs={'pk': self.pk}, request=request)
def deny(self, request=None):
self.status = 'failed'
self.save()
self.websocket_emit_status(self.status)
schedule_task_manager()
return reverse('api:workflow_approval_deny', kwargs={'pk': self.pk}, request=request)
@property
def workflow_job_template(self):
return self.unified_job_node.workflow_job.unified_job_template
@property
def workflow_job(self):
return self.unified_job_node.workflow_job


@@ -19,6 +19,12 @@ class CustomEmailBackend(EmailBackend):
"sender": {"label": "Sender Email", "type": "string"},
"recipients": {"label": "Recipient List", "type": "list"},
"timeout": {"label": "Timeout", "type": "int", "default": 30}}
DEFAULT_SUBJECT = "{{ job_friendly_name }} #{{ job.id }} '{{ job.name }}' {{ job.status }}: {{ url }}"
DEFAULT_BODY = smart_text(_("{{ job_friendly_name }} #{{ job.id }} had status {{ job.status }}, view details at {{ url }}\n\n{{ job_summary_dict }}"))
default_messages = {"started": {"message": DEFAULT_SUBJECT, "body": DEFAULT_BODY},
"success": {"message": DEFAULT_SUBJECT, "body": DEFAULT_BODY},
"error": {"message": DEFAULT_SUBJECT, "body": DEFAULT_BODY}}
recipient_parameter = "recipients"
sender_parameter = "sender"


@@ -21,6 +21,11 @@ class GrafanaBackend(AWXBaseEmailBackend):
recipient_parameter = "grafana_url"
sender_parameter = None
DEFAULT_SUBJECT = "{{ job_friendly_name }} #{{ job.id }} '{{ job.name }}' {{ job.status }}: {{ url }}"
default_messages = {"started": {"message": DEFAULT_SUBJECT},
"success": {"message": DEFAULT_SUBJECT},
"error": {"message": DEFAULT_SUBJECT}}
def __init__(self, grafana_key,dashboardId=None, panelId=None, annotation_tags=None, grafana_no_verify_ssl=False, isRegion=True,
fail_silently=False, **kwargs):
super(GrafanaBackend, self).__init__(fail_silently=fail_silently)


@@ -23,6 +23,11 @@ class HipChatBackend(AWXBaseEmailBackend):
recipient_parameter = "rooms"
sender_parameter = "message_from"
DEFAULT_SUBJECT = "{{ job_friendly_name }} #{{ job.id }} '{{ job.name }}' {{ job.status }}: {{ url }}"
default_messages = {"started": {"message": DEFAULT_SUBJECT},
"success": {"message": DEFAULT_SUBJECT},
"error": {"message": DEFAULT_SUBJECT}}
def __init__(self, token, color, api_url, notify, fail_silently=False, **kwargs):
super(HipChatBackend, self).__init__(fail_silently=fail_silently)
self.token = token


@@ -25,6 +25,11 @@ class IrcBackend(AWXBaseEmailBackend):
recipient_parameter = "targets"
sender_parameter = None
DEFAULT_SUBJECT = "{{ job_friendly_name }} #{{ job.id }} '{{ job.name }}' {{ job.status }}: {{ url }}"
default_messages = {"started": {"message": DEFAULT_SUBJECT},
"success": {"message": DEFAULT_SUBJECT},
"error": {"message": DEFAULT_SUBJECT}}
def __init__(self, server, port, nickname, password, use_ssl, fail_silently=False, **kwargs):
super(IrcBackend, self).__init__(fail_silently=fail_silently)
self.server = server


@@ -19,6 +19,11 @@ class MattermostBackend(AWXBaseEmailBackend):
recipient_parameter = "mattermost_url"
sender_parameter = None
DEFAULT_SUBJECT = "{{ job_friendly_name }} #{{ job.id }} '{{ job.name }}' {{ job.status }}: {{ url }}"
default_messages = {"started": {"message": DEFAULT_SUBJECT},
"success": {"message": DEFAULT_SUBJECT},
"error": {"message": DEFAULT_SUBJECT}}
def __init__(self, mattermost_no_verify_ssl=False, mattermost_channel=None, mattermost_username=None,
mattermost_icon_url=None, fail_silently=False, **kwargs):
super(MattermostBackend, self).__init__(fail_silently=fail_silently)


@@ -20,6 +20,12 @@ class PagerDutyBackend(AWXBaseEmailBackend):
recipient_parameter = "service_key"
sender_parameter = "client_name"
DEFAULT_SUBJECT = "{{ job_friendly_name }} #{{ job.id }} '{{ job.name }}' {{ job.status }}: {{ url }}"
DEFAULT_BODY = "{{ job_summary_dict }}"
default_messages = {"started": { "message": DEFAULT_SUBJECT, "body": DEFAULT_BODY},
"success": { "message": DEFAULT_SUBJECT, "body": DEFAULT_BODY},
"error": { "message": DEFAULT_SUBJECT, "body": DEFAULT_BODY}}
def __init__(self, subdomain, token, fail_silently=False, **kwargs):
super(PagerDutyBackend, self).__init__(fail_silently=fail_silently)
self.subdomain = subdomain


@@ -19,6 +19,11 @@ class RocketChatBackend(AWXBaseEmailBackend):
recipient_parameter = "rocketchat_url"
sender_parameter = None
DEFAULT_SUBJECT = "{{ job_friendly_name }} #{{ job.id }} '{{ job.name }}' {{ job.status }}: {{ url }}"
default_messages = {"started": {"message": DEFAULT_SUBJECT},
"success": {"message": DEFAULT_SUBJECT},
"error": {"message": DEFAULT_SUBJECT}}
def __init__(self, rocketchat_no_verify_ssl=False, rocketchat_username=None, rocketchat_icon_url=None, fail_silently=False, **kwargs):
super(RocketChatBackend, self).__init__(fail_silently=fail_silently)
self.rocketchat_no_verify_ssl = rocketchat_no_verify_ssl


@@ -19,6 +19,11 @@ class SlackBackend(AWXBaseEmailBackend):
recipient_parameter = "channels"
sender_parameter = None
DEFAULT_SUBJECT = "{{ job_friendly_name }} #{{ job.id }} '{{ job.name }}' {{ job.status }}: {{ url }}"
default_messages = {"started": {"message": DEFAULT_SUBJECT},
"success": {"message": DEFAULT_SUBJECT},
"error": {"message": DEFAULT_SUBJECT}}
def __init__(self, token, hex_color="", fail_silently=False, **kwargs):
super(SlackBackend, self).__init__(fail_silently=fail_silently)
self.token = token
@@ -50,7 +55,7 @@ class SlackBackend(AWXBaseEmailBackend):
if ret['ok']:
sent_messages += 1
else:
raise RuntimeError("Slack Notification unable to send {}: {}".format(r, m.subject))
raise RuntimeError("Slack Notification unable to send {}: {} ({})".format(r, m.subject, ret['error']))
except Exception as e:
logger.error(smart_text(_("Exception sending messages: {}").format(e)))
if not self.fail_silently:


@@ -21,6 +21,11 @@ class TwilioBackend(AWXBaseEmailBackend):
recipient_parameter = "to_numbers"
sender_parameter = "from_number"
DEFAULT_SUBJECT = "{{ job_friendly_name }} #{{ job.id }} '{{ job.name }}' {{ job.status }}: {{ url }}"
default_messages = {"started": {"message": DEFAULT_SUBJECT},
"success": {"message": DEFAULT_SUBJECT},
"error": {"message": DEFAULT_SUBJECT}}
def __init__(self, account_sid, account_token, fail_silently=False, **kwargs):
super(TwilioBackend, self).__init__(fail_silently=fail_silently)
self.account_sid = account_sid


@@ -1,6 +1,7 @@
# Copyright (c) 2016 Ansible, Inc.
# All Rights Reserved.
import json
import logging
import requests
@@ -15,25 +16,52 @@ logger = logging.getLogger('awx.main.notifications.webhook_backend')
class WebhookBackend(AWXBaseEmailBackend):
init_parameters = {"url": {"label": "Target URL", "type": "string"},
"http_method": {"label": "HTTP Method", "type": "string", "default": "POST"},
"disable_ssl_verification": {"label": "Verify SSL", "type": "bool", "default": False},
"username": {"label": "Username", "type": "string", "default": ""},
"password": {"label": "Password", "type": "password", "default": ""},
"headers": {"label": "HTTP Headers", "type": "object"}}
recipient_parameter = "url"
sender_parameter = None
def __init__(self, headers, disable_ssl_verification=False, fail_silently=False, **kwargs):
DEFAULT_BODY = "{{ job_summary_dict }}"
default_messages = {"started": {"body": DEFAULT_BODY},
"success": {"body": DEFAULT_BODY},
"error": {"body": DEFAULT_BODY}}
def __init__(self, http_method, headers, disable_ssl_verification=False, fail_silently=False, username=None, password=None, **kwargs):
self.http_method = http_method
self.disable_ssl_verification = disable_ssl_verification
self.headers = headers
self.username = username
self.password = password
super(WebhookBackend, self).__init__(fail_silently=fail_silently)
def format_body(self, body):
# If `body` has body field, attempt to use this as the main body,
# otherwise, leave it as a sub-field
if isinstance(body, dict) and 'body' in body and isinstance(body['body'], str):
try:
potential_body = json.loads(body['body'])
if isinstance(potential_body, dict):
body = potential_body
except json.JSONDecodeError:
pass
return body
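The `format_body` hook above promotes a JSON-encoded `'body'` sub-field to become the whole payload. A minimal standalone sketch of that promotion logic (the function name `promote_json_body` is invented for illustration; only the behavior mirrors the method):

```python
import json

def promote_json_body(body):
    # If the notification body carries a 'body' sub-field that decodes
    # to a JSON object, use the decoded object as the whole payload;
    # otherwise leave the structure untouched (matching format_body above).
    if isinstance(body, dict) and isinstance(body.get('body'), str):
        try:
            decoded = json.loads(body['body'])
            if isinstance(decoded, dict):
                return decoded
        except json.JSONDecodeError:
            pass
    return body

print(promote_json_body({'body': '{"status": "successful"}'}))  # {'status': 'successful'}
```

Note that a decoded JSON array or scalar is deliberately not promoted; only an object replaces the payload.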
def send_messages(self, messages):
sent_messages = 0
if 'User-Agent' not in self.headers:
self.headers['User-Agent'] = "Tower {}".format(get_awx_version())
if self.http_method.lower() not in ['put','post']:
raise ValueError("HTTP method must be either 'POST' or 'PUT'.")
chosen_method = getattr(requests, self.http_method.lower(), None)
for m in messages:
r = requests.post("{}".format(m.recipients()[0]),
auth = None
if self.username or self.password:
auth = (self.username, self.password)
r = chosen_method("{}".format(m.recipients()[0]),
auth=auth,
json=m.body,
headers=self.headers,
verify=(not self.disable_ssl_verification))
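The updated `send_messages` restricts the webhook to POST or PUT and then resolves the matching `requests` function dynamically. The up-front validation can be sketched without any network access (`resolve_http_method` is a hypothetical name; the check itself mirrors the one above):

```python
def resolve_http_method(http_method):
    # Only POST and PUT are meaningful for webhook notifications;
    # anything else is rejected before any request is attempted,
    # as in WebhookBackend.send_messages above.
    method = http_method.lower()
    if method not in ('put', 'post'):
        raise ValueError("HTTP method must be either 'POST' or 'PUT'.")
    return method

print(resolve_http_method('POST'))  # post
```

In the backend itself the returned name feeds `getattr(requests, ...)`, so validating first guarantees the lookup cannot silently return `None`.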


@@ -23,6 +23,7 @@ from awx.main.models import (
Project,
ProjectUpdate,
SystemJob,
WorkflowApproval,
WorkflowJob,
WorkflowJobTemplate
)
@@ -518,6 +519,24 @@ class TaskManager():
if not found_acceptable_queue:
logger.debug("{} couldn't be scheduled on graph, waiting for next cycle".format(task.log_format))
def timeout_approval_node(self):
workflow_approvals = WorkflowApproval.objects.filter(status='pending')
now = tz_now()
for task in workflow_approvals:
approval_timeout_seconds = timedelta(seconds=task.timeout)
if task.timeout == 0:
continue
if (now - task.created) >= approval_timeout_seconds:
timeout_message = _(
"The approval node {name} ({pk}) has expired after {timeout} seconds."
).format(name=task.name, pk=task.pk, timeout=task.timeout)
logger.warn(timeout_message)
task.timed_out = True
task.status = 'failed'
task.websocket_emit_status(task.status)
task.job_explanation = timeout_message
task.save(update_fields=['status', 'job_explanation', 'timed_out'])
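The expiry arithmetic in `timeout_approval_node` is simple enough to exercise in isolation. A hedged sketch of the same check (`approval_timed_out` is an invented helper; the model fields it mimics are `created` and `timeout`):

```python
from datetime import datetime, timedelta, timezone

def approval_timed_out(created, timeout_seconds, now=None):
    # A timeout of 0 means "never expire", matching the `continue`
    # in timeout_approval_node above.
    if timeout_seconds == 0:
        return False
    now = now or datetime.now(timezone.utc)
    return (now - created) >= timedelta(seconds=timeout_seconds)

created = datetime(2019, 9, 4, 12, 0, tzinfo=timezone.utc)
print(approval_timed_out(created, 60, now=created + timedelta(seconds=61)))  # True
```

Because the comparison uses `>=`, an approval expires exactly when its age reaches the configured timeout, not one tick later.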
def calculate_capacity_consumed(self, tasks):
self.graph = InstanceGroup.objects.capacity_values(tasks=tasks, graph=self.graph)
@@ -573,6 +592,8 @@ class TaskManager():
self.spawn_workflow_graph_jobs(running_workflow_tasks)
self.timeout_approval_node()
self.process_tasks(all_sorted_tasks)
return finished_wfjs


@@ -34,8 +34,8 @@ from awx.main.models import (
InventorySource, InventoryUpdateEvent, Job, JobEvent, JobHostSummary,
JobTemplate, OAuth2AccessToken, Organization, Project, ProjectUpdateEvent,
Role, SystemJob, SystemJobEvent, SystemJobTemplate, UnifiedJob,
UnifiedJobTemplate, User, UserSessionMembership,
ROLE_SINGLETON_SYSTEM_ADMINISTRATOR
UnifiedJobTemplate, User, UserSessionMembership, WorkflowJobTemplateNode,
WorkflowApproval, WorkflowApprovalTemplate, ROLE_SINGLETON_SYSTEM_ADMINISTRATOR
)
from awx.main.constants import CENSOR_VALUE
from awx.main.utils import model_instance_diff, model_to_dict, camelcase_to_underscore, get_current_apps
@@ -355,6 +355,7 @@ def update_host_last_job_after_job_deleted(sender, **kwargs):
for host in Host.objects.filter(pk__in=hosts_pks):
_update_host_last_jhs(host)
# Set via ActivityStreamRegistrar to record activity stream events
@@ -429,6 +430,8 @@ def model_serializer_mapping():
models.Label: serializers.LabelSerializer,
models.WorkflowJobTemplate: serializers.WorkflowJobTemplateWithSpecSerializer,
models.WorkflowJobTemplateNode: serializers.WorkflowJobTemplateNodeSerializer,
models.WorkflowApproval: serializers.WorkflowApprovalActivityStreamSerializer,
models.WorkflowApprovalTemplate: serializers.WorkflowApprovalTemplateSerializer,
models.WorkflowJob: serializers.WorkflowJobSerializer,
models.OAuth2AccessToken: serializers.OAuth2TokenSerializer,
models.OAuth2Application: serializers.OAuth2ApplicationSerializer,
@@ -637,6 +640,30 @@ def delete_inventory_for_org(sender, instance, **kwargs):
logger.debug(e)
@receiver(pre_delete, sender=WorkflowJobTemplateNode)
def delete_approval_templates(sender, instance, **kwargs):
if type(instance.unified_job_template) is WorkflowApprovalTemplate:
instance.unified_job_template.delete()
@receiver(pre_save, sender=WorkflowJobTemplateNode)
def delete_approval_node_type_change(sender, instance, **kwargs):
try:
old = WorkflowJobTemplateNode.objects.get(id=instance.id)
except sender.DoesNotExist:
return
if old.unified_job_template == instance.unified_job_template:
return
if type(old.unified_job_template) is WorkflowApprovalTemplate:
old.unified_job_template.delete()
@receiver(pre_delete, sender=WorkflowApprovalTemplate)
def deny_orphaned_approvals(sender, instance, **kwargs):
for approval in WorkflowApproval.objects.filter(workflow_approval_template=instance, status='pending'):
approval.deny()
@receiver(post_save, sender=Session)
def save_user_session_membership(sender, **kwargs):
session = kwargs.get('instance', None)


@@ -20,6 +20,8 @@ from distutils.dir_util import copy_tree
from distutils.version import LooseVersion as Version
import yaml
import fcntl
from pathlib import Path
from uuid import uuid4
try:
import psutil
except Exception:
@@ -41,6 +43,10 @@ from django.core.exceptions import ObjectDoesNotExist
# Django-CRUM
from crum import impersonate
# GitPython
import git
from gitdb.exc import BadName as BadGitName
# Runner
import ansible_runner
@@ -67,7 +73,7 @@ from awx.main.utils import (get_ssh_version, update_scm_url,
ignore_inventory_computed_fields,
ignore_inventory_group_removal, extract_ansible_vars, schedule_task_manager,
get_awx_version)
from awx.main.utils.common import _get_ansible_version, get_custom_venv_choices
from awx.main.utils.common import get_ansible_version, _get_ansible_version, get_custom_venv_choices
from awx.main.utils.safe_yaml import safe_dump, sanitize_jinja
from awx.main.utils.reload import stop_local_services
from awx.main.utils.pglock import advisory_lock
@@ -330,10 +336,12 @@ def send_notifications(notification_list, job_id=None):
@task()
def gather_analytics():
if settings.PENDO_TRACKING_STATE == 'off':
if not settings.INSIGHTS_TRACKING_STATE:
return
try:
tgz = analytics.gather()
if not tgz:
return
logger.debug('gathered analytics: {}'.format(tgz))
analytics.ship(tgz)
finally:
@@ -602,26 +610,25 @@ def update_inventory_computed_fields(inventory_id, should_update_hosts=True):
def update_smart_memberships_for_inventory(smart_inventory):
with advisory_lock('update_smart_memberships_for_inventory-{}'.format(smart_inventory.id)):
current = set(SmartInventoryMembership.objects.filter(inventory=smart_inventory).values_list('host_id', flat=True))
new = set(smart_inventory.hosts.values_list('id', flat=True))
additions = new - current
removals = current - new
if additions or removals:
with transaction.atomic():
if removals:
SmartInventoryMembership.objects.filter(inventory=smart_inventory, host_id__in=removals).delete()
if additions:
add_for_inventory = [
SmartInventoryMembership(inventory_id=smart_inventory.id, host_id=host_id)
for host_id in additions
]
SmartInventoryMembership.objects.bulk_create(add_for_inventory)
logger.debug('Smart host membership cached for {}, {} additions, {} removals, {} total count.'.format(
smart_inventory.pk, len(additions), len(removals), len(new)
))
return True # changed
return False
current = set(SmartInventoryMembership.objects.filter(inventory=smart_inventory).values_list('host_id', flat=True))
new = set(smart_inventory.hosts.values_list('id', flat=True))
additions = new - current
removals = current - new
if additions or removals:
with transaction.atomic():
if removals:
SmartInventoryMembership.objects.filter(inventory=smart_inventory, host_id__in=removals).delete()
if additions:
add_for_inventory = [
SmartInventoryMembership(inventory_id=smart_inventory.id, host_id=host_id)
for host_id in additions
]
SmartInventoryMembership.objects.bulk_create(add_for_inventory, ignore_conflicts=True)
logger.debug('Smart host membership cached for {}, {} additions, {} removals, {} total count.'.format(
smart_inventory.pk, len(additions), len(removals), len(new)
))
return True # changed
return False
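The smart-inventory refresh above is a plain set reconciliation: compute what must be added and what must be removed, then touch the database only if either set is non-empty. The core computation on its own (the helper name is illustrative):

```python
def diff_memberships(current, desired):
    """Return (additions, removals) needed to turn `current` into `desired`."""
    current, desired = set(current), set(desired)
    # Hosts newly matching the filter are added; hosts that no longer
    # match are removed, mirroring the membership update above.
    return desired - current, current - desired

adds, removes = diff_memberships({1, 2, 3}, {2, 3, 4})
print(adds, removes)  # {4} {1}
```

Wrapping the two writes in one `transaction.atomic()` block, as the task does, keeps readers from ever observing a half-updated membership table.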
@task()
@@ -693,9 +700,11 @@ class BaseTask(object):
model = None
event_model = None
abstract = True
cleanup_paths = []
proot_show_paths = []
def __init__(self):
self.cleanup_paths = []
def update_model(self, pk, _attempt=0, **updates):
"""Reload the model instance from the database and update the
given fields.
@@ -768,9 +777,11 @@ class BaseTask(object):
os.chmod(path, stat.S_IRUSR | stat.S_IWUSR | stat.S_IXUSR)
if settings.AWX_CLEANUP_PATHS:
self.cleanup_paths.append(path)
# Ansible Runner requires that this directory exists.
# Specifically, when using process isolation
os.mkdir(os.path.join(path, 'project'))
runner_project_folder = os.path.join(path, 'project')
if not os.path.exists(runner_project_folder):
# Ansible Runner requires that this directory exists.
# Specifically, when using process isolation
os.mkdir(runner_project_folder)
return path
def build_private_data_files(self, instance, private_data_dir):
@@ -859,15 +870,28 @@ class BaseTask(object):
'''
process_isolation_params = dict()
if self.should_use_proot(instance):
show_paths = self.proot_show_paths + [private_data_dir, cwd] + \
local_paths = [private_data_dir]
if cwd != private_data_dir and Path(private_data_dir) not in Path(cwd).parents:
local_paths.append(cwd)
show_paths = self.proot_show_paths + local_paths + \
settings.AWX_PROOT_SHOW_PATHS
# Help the user out by including the collections path inside the bubblewrap environment
if getattr(settings, 'AWX_ANSIBLE_COLLECTIONS_PATHS', []):
show_paths.extend(settings.AWX_ANSIBLE_COLLECTIONS_PATHS)
pi_path = settings.AWX_PROOT_BASE_PATH
if not self.instance.is_isolated():
pi_path = tempfile.mkdtemp(
prefix='ansible_runner_pi_',
dir=settings.AWX_PROOT_BASE_PATH
)
os.chmod(pi_path, stat.S_IRUSR | stat.S_IWUSR | stat.S_IXUSR)
self.cleanup_paths.append(pi_path)
process_isolation_params = {
'process_isolation': True,
'process_isolation_path': settings.AWX_PROOT_BASE_PATH,
'process_isolation_path': pi_path,
'process_isolation_show_paths': show_paths,
'process_isolation_hide_paths': [
settings.AWX_PROOT_BASE_PATH,
@@ -1019,7 +1043,7 @@ class BaseTask(object):
expect_passwords[k] = passwords.get(v, '') or ''
return expect_passwords
def pre_run_hook(self, instance):
def pre_run_hook(self, instance, private_data_dir):
'''
Hook for any steps to run before the job/task starts
'''
@@ -1146,7 +1170,8 @@ class BaseTask(object):
try:
isolated = self.instance.is_isolated()
self.instance.send_notification_templates("running")
self.pre_run_hook(self.instance)
private_data_dir = self.build_private_data_dir(self.instance)
self.pre_run_hook(self.instance, private_data_dir)
if self.instance.cancel_flag:
self.instance = self.update_model(self.instance.pk, status='canceled')
if self.instance.status != 'running':
@@ -1162,7 +1187,6 @@ class BaseTask(object):
# store a record of the venv used at runtime
if hasattr(self.instance, 'custom_virtualenv'):
self.update_model(pk, custom_virtualenv=getattr(self.instance, 'ansible_virtualenv_path', settings.ANSIBLE_VENV_PATH))
private_data_dir = self.build_private_data_dir(self.instance)
# Fetch "cached" fact data from prior runs and put on the disk
# where ansible expects to find it
@@ -1245,9 +1269,6 @@ class BaseTask(object):
module_args = ansible_runner.utils.args2cmdline(
params.get('module_args'),
)
else:
# otherwise, it's a playbook, so copy the project dir
copy_tree(cwd, os.path.join(private_data_dir, 'project'))
shutil.move(
params.pop('inventory'),
os.path.join(private_data_dir, 'inventory')
@@ -1453,6 +1474,15 @@ class RunJob(BaseTask):
if authorize:
env['ANSIBLE_NET_AUTH_PASS'] = network_cred.get_input('authorize_password', default='')
for env_key, folder in (
('ANSIBLE_COLLECTIONS_PATHS', 'requirements_collections'),
('ANSIBLE_ROLES_PATH', 'requirements_roles')):
paths = []
if env_key in env:
paths.append(env[env_key])
paths.append(os.path.join(private_data_dir, folder))
env[env_key] = os.pathsep.join(paths)
return env
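The environment additions above append per-job `requirements_collections` and `requirements_roles` folders to any pre-existing search path rather than replacing it. A self-contained sketch of that path-building pattern (`append_search_path` is an invented name):

```python
import os

def append_search_path(env, key, extra_path):
    # Preserve any pre-existing value and append the per-job folder,
    # joined with os.pathsep, mirroring how ANSIBLE_COLLECTIONS_PATHS
    # and ANSIBLE_ROLES_PATH are built in build_env above.
    paths = []
    if key in env:
        paths.append(env[key])
    paths.append(extra_path)
    env[key] = os.pathsep.join(paths)
    return env

env = {'ANSIBLE_ROLES_PATH': '/etc/ansible/roles'}
append_search_path(env, 'ANSIBLE_ROLES_PATH', '/tmp/job/requirements_roles')
print(env['ANSIBLE_ROLES_PATH'])
```

Appending (rather than prepending) means roles already on the configured path win over the freshly downloaded requirements if names collide.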
def build_args(self, job, private_data_dir, passwords):
@@ -1521,15 +1551,10 @@ class RunJob(BaseTask):
return args
def build_cwd(self, job, private_data_dir):
cwd = job.project.get_project_path()
if not cwd:
root = settings.PROJECTS_ROOT
raise RuntimeError('project local_path %s cannot be found in %s' %
(job.project.local_path, root))
return cwd
return os.path.join(private_data_dir, 'project')
def build_playbook_path_relative_to_cwd(self, job, private_data_dir):
return os.path.join(job.playbook)
return job.playbook
def build_extra_vars_file(self, job, private_data_dir):
# Define special extra_vars for AWX, combine with job.extra_vars.
@@ -1576,39 +1601,86 @@ class RunJob(BaseTask):
'''
return getattr(settings, 'AWX_PROOT_ENABLED', False)
def pre_run_hook(self, job):
def pre_run_hook(self, job, private_data_dir):
if job.inventory is None:
error = _('Job could not start because it does not have a valid inventory.')
self.update_model(job.pk, status='failed', job_explanation=error)
raise RuntimeError(error)
if job.project and job.project.scm_type:
elif job.project is None:
error = _('Job could not start because it does not have a valid project.')
self.update_model(job.pk, status='failed', job_explanation=error)
raise RuntimeError(error)
elif job.project.status in ('error', 'failed'):
msg = _(
'The project revision for this job template is unknown due to a failed update.'
)
job = self.update_model(job.pk, status='failed', job_explanation=msg)
raise RuntimeError(msg)
project_path = job.project.get_project_path(check_if_exists=False)
job_revision = job.project.scm_revision
needs_sync = True
if not job.project.scm_type:
# manual projects are not synced, user has responsibility for that
needs_sync = False
elif not os.path.exists(project_path):
logger.debug('Performing fresh clone of {} on this instance.'.format(job.project))
elif not job.project.scm_revision:
logger.debug('Revision not known for {}, will sync with remote'.format(job.project))
elif job.project.scm_type == 'git':
git_repo = git.Repo(project_path)
try:
desired_revision = job.project.scm_revision
if job.scm_branch and job.scm_branch != job.project.scm_branch:
desired_revision = job.scm_branch # could be commit or not, but will try as commit
current_revision = git_repo.head.commit.hexsha
if desired_revision == current_revision:
job_revision = desired_revision
logger.info('Skipping project sync for {} because commit is locally available'.format(job.log_format))
needs_sync = False
except (ValueError, BadGitName):
logger.debug('Needed commit for {} not in local source tree, will sync with remote'.format(job.log_format))
# Galaxy requirements are not supported for manual projects
if not needs_sync and job.project.scm_type:
# see if we need a sync because of presence of roles
galaxy_req_path = os.path.join(project_path, 'roles', 'requirements.yml')
if os.path.exists(galaxy_req_path):
logger.debug('Running project sync for {} because of galaxy role requirements.'.format(job.log_format))
needs_sync = True
galaxy_collections_req_path = os.path.join(project_path, 'collections', 'requirements.yml')
if os.path.exists(galaxy_collections_req_path):
logger.debug('Running project sync for {} because of galaxy collections requirements.'.format(job.log_format))
needs_sync = True
if needs_sync:
pu_ig = job.instance_group
pu_en = job.execution_node
if job.is_isolated() is True:
pu_ig = pu_ig.controller
pu_en = settings.CLUSTER_HOST_ID
if job.project.status in ('error', 'failed'):
msg = _(
'The project revision for this job template is unknown due to a failed update.'
)
job = self.update_model(job.pk, status='failed', job_explanation=msg)
raise RuntimeError(msg)
local_project_sync = job.project.create_project_update(
_eager_fields=dict(
launch_type="sync",
job_type='run',
status='running',
instance_group = pu_ig,
execution_node=pu_en,
celery_task_id=job.celery_task_id))
sync_metafields = dict(
launch_type="sync",
job_type='run',
status='running',
instance_group = pu_ig,
execution_node=pu_en,
celery_task_id=job.celery_task_id
)
if job.scm_branch and job.scm_branch != job.project.scm_branch:
sync_metafields['scm_branch'] = job.scm_branch
local_project_sync = job.project.create_project_update(_eager_fields=sync_metafields)
# save the associated job before calling run() so that a
# cancel() call on the job can cancel the project update
job = self.update_model(job.pk, project_update=local_project_sync)
project_update_task = local_project_sync._get_task_class()
try:
project_update_task().run(local_project_sync.id)
job = self.update_model(job.pk, scm_revision=job.project.scm_revision)
# the job private_data_dir is passed so sync can download roles and collections there
sync_task = project_update_task(job_private_data_dir=private_data_dir)
sync_task.run(local_project_sync.id)
local_project_sync.refresh_from_db()
job = self.update_model(job.pk, scm_revision=local_project_sync.scm_revision)
except Exception:
local_project_sync.refresh_from_db()
if local_project_sync.status != 'canceled':
@@ -1616,6 +1688,38 @@ class RunJob(BaseTask):
job_explanation=('Previous Task Failed: {"job_type": "%s", "job_name": "%s", "job_id": "%s"}' %
('project_update', local_project_sync.name, local_project_sync.id)))
raise
job.refresh_from_db()
if job.cancel_flag:
return
else:
# Case where a local sync is not needed, meaning that local tree is
# up-to-date with project, job is running project current version
if job_revision:
job = self.update_model(job.pk, scm_revision=job_revision)
# copy the project directory
runner_project_folder = os.path.join(private_data_dir, 'project')
if job.project.scm_type == 'git':
git_repo = git.Repo(project_path)
if not os.path.exists(runner_project_folder):
os.mkdir(runner_project_folder)
tmp_branch_name = 'awx_internal/{}'.format(uuid4())
# always clone based on specific job revision
if not job.scm_revision:
raise RuntimeError('Unexpectedly could not determine a revision to run from project.')
source_branch = git_repo.create_head(tmp_branch_name, job.scm_revision)
# git clone must take file:// syntax for source repo or else options like depth will be ignored
source_as_uri = Path(project_path).as_uri()
git.Repo.clone_from(
source_as_uri, runner_project_folder, branch=source_branch,
depth=1, single_branch=True, # shallow, do not copy full history
recursive=True # include submodules
)
# force option is necessary because remote refs are not counted, although no information is lost
git_repo.delete_head(tmp_branch_name, force=True)
else:
copy_tree(project_path, runner_project_folder)
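The clone branch above converts the local project path to `file://` form because, as the inline comment notes, git only honors options like `depth` for transport URLs. The conversion itself is a one-liner with `pathlib` (the path below is a hypothetical example):

```python
from pathlib import Path

# git clone applies shallow/single-branch options only when given a
# transport URL, so a local checkout path is converted to file:// form
# before being handed to Repo.clone_from.
project_path = '/var/lib/awx/projects/_6__demo'
source_as_uri = Path(project_path).as_uri()
print(source_as_uri)  # file:///var/lib/awx/projects/_6__demo
```

The temporary `awx_internal/<uuid>` branch exists only so the clone can target the exact job revision; deleting it afterwards with `force=True` loses no history, since the commit remains reachable from the real branches.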
if job.inventory.kind == 'smart':
# cache smart inventory memberships so that the host_filter query is not
# ran inside of the event saving code
@@ -1652,7 +1756,24 @@ class RunProjectUpdate(BaseTask):
@property
def proot_show_paths(self):
return [settings.PROJECTS_ROOT]
show_paths = [settings.PROJECTS_ROOT]
if self.job_private_data_dir:
show_paths.append(self.job_private_data_dir)
return show_paths
def __init__(self, *args, job_private_data_dir=None, **kwargs):
super(RunProjectUpdate, self).__init__(*args, **kwargs)
self.playbook_new_revision = None
self.original_branch = None
self.job_private_data_dir = job_private_data_dir
def event_handler(self, event_data):
super(RunProjectUpdate, self).event_handler(event_data)
returned_data = event_data.get('event_data', {})
if returned_data.get('task_action', '') == 'set_fact':
returned_facts = returned_data.get('res', {}).get('ansible_facts', {})
if 'scm_version' in returned_facts:
self.playbook_new_revision = returned_facts['scm_version']
def build_private_data(self, project_update, private_data_dir):
'''
@@ -1667,14 +1788,17 @@ class RunProjectUpdate(BaseTask):
}
}
'''
handle, self.revision_path = tempfile.mkstemp(dir=settings.PROJECTS_ROOT)
if settings.AWX_CLEANUP_PATHS:
self.cleanup_paths.append(self.revision_path)
private_data = {'credentials': {}}
if project_update.credential:
credential = project_update.credential
if credential.has_input('ssh_key_data'):
private_data['credentials'][credential] = credential.get_input('ssh_key_data', default='')
# Create dir where collections will live for the job run
if project_update.job_type != 'check' and getattr(self, 'job_private_data_dir'):
for folder_name in ('requirements_collections', 'requirements_roles'):
folder_path = os.path.join(self.job_private_data_dir, folder_name)
os.mkdir(folder_path, stat.S_IREAD | stat.S_IWRITE | stat.S_IEXEC)
return private_data
def build_passwords(self, project_update, runtime_passwords):
@@ -1770,10 +1894,21 @@ class RunProjectUpdate(BaseTask):
scm_url, extra_vars_new = self._build_scm_url_extra_vars(project_update)
extra_vars.update(extra_vars_new)
if project_update.project.scm_revision and project_update.job_type == 'run':
scm_branch = project_update.scm_branch
branch_override = bool(project_update.scm_branch != project_update.project.scm_branch)
if project_update.job_type == 'run' and scm_branch and (not branch_override):
scm_branch = project_update.project.scm_revision
elif not scm_branch:
scm_branch = {'hg': 'tip'}.get(project_update.scm_type, 'HEAD')
if project_update.job_type == 'check':
roles_enabled = False
collections_enabled = False
else:
scm_branch = project_update.scm_branch or {'hg': 'tip'}.get(project_update.scm_type, 'HEAD')
roles_enabled = getattr(settings, 'AWX_ROLES_ENABLED', True)
collections_enabled = getattr(settings, 'AWX_COLLECTIONS_ENABLED', True)
# collections were introduced in Ansible version 2.8
if Version(get_ansible_version()) <= Version('2.8'):
collections_enabled = False
extra_vars.update({
'project_path': project_update.get_project_path(check_if_exists=False),
'insights_url': settings.INSIGHTS_URL_BASE,
@@ -1785,17 +1920,24 @@ class RunProjectUpdate(BaseTask):
'scm_clean': project_update.scm_clean,
'scm_delete_on_update': project_update.scm_delete_on_update if project_update.job_type == 'check' else False,
'scm_full_checkout': True if project_update.job_type == 'run' else False,
'scm_revision_output': self.revision_path,
'scm_revision': project_update.project.scm_revision,
'roles_enabled': getattr(settings, 'AWX_ROLES_ENABLED', True)
'roles_enabled': roles_enabled,
'collections_enabled': collections_enabled,
})
if project_update.job_type != 'check' and self.job_private_data_dir:
extra_vars['collections_destination'] = os.path.join(self.job_private_data_dir, 'requirements_collections')
extra_vars['roles_destination'] = os.path.join(self.job_private_data_dir, 'requirements_roles')
# apply custom refspec from user for PR refs and the like
if project_update.scm_refspec:
extra_vars['scm_refspec'] = project_update.scm_refspec
elif project_update.project.allow_override:
# If branch is override-able, do extra fetch for all branches
extra_vars['scm_refspec'] = 'refs/heads/*:refs/remotes/origin/*'
self._write_extra_vars_file(private_data_dir, extra_vars)
def build_cwd(self, project_update, private_data_dir):
return self.get_path_to('..', 'playbooks')
def build_playbook_path_relative_to_cwd(self, project_update, private_data_dir):
self.build_cwd(project_update, private_data_dir)
return os.path.join('project_update.yml')
def get_password_prompts(self, passwords={}):
@@ -1909,25 +2051,42 @@ class RunProjectUpdate(BaseTask):
'{} spent {} waiting to acquire lock for local source tree '
'for path {}.'.format(instance.log_format, waiting_time, lock_path))
def pre_run_hook(self, instance):
def pre_run_hook(self, instance, private_data_dir):
# re-create root project folder if a natural disaster has destroyed it
if not os.path.exists(settings.PROJECTS_ROOT):
os.mkdir(settings.PROJECTS_ROOT)
self.acquire_lock(instance)
self.original_branch = None
if (instance.scm_type == 'git' and instance.job_type == 'run' and instance.project and
instance.scm_branch != instance.project.scm_branch):
project_path = instance.project.get_project_path(check_if_exists=False)
if os.path.exists(project_path):
git_repo = git.Repo(project_path)
self.original_branch = git_repo.active_branch
def post_run_hook(self, instance, status):
if self.original_branch:
# for git project syncs, non-default branches can be problems
# restore to branch the repo was on before this run
try:
self.original_branch.checkout()
except Exception:
# this could have failed due to dirty tree, but difficult to predict all cases
logger.exception('Failed to restore project repo to prior state after {}'.format(instance.log_format))
self.release_lock(instance)
p = instance.project
if self.playbook_new_revision:
instance.scm_revision = self.playbook_new_revision
instance.save(update_fields=['scm_revision'])
if instance.job_type == 'check' and status not in ('failed', 'canceled',):
fd = open(self.revision_path, 'r')
lines = fd.readlines()
if lines:
p.scm_revision = lines[0].strip()
if self.playbook_new_revision:
p.scm_revision = self.playbook_new_revision
else:
logger.info("{} Could not find scm revision in check".format(instance.log_format))
if status == 'successful':
logger.error("{} Could not find scm revision in check".format(instance.log_format))
p.playbook_files = p.playbooks
p.inventory_files = p.inventories
p.save()
p.save(update_fields=['scm_revision', 'playbook_files', 'inventory_files'])
# Update any inventories that depend on this project
dependent_inventory_sources = p.scm_inventory_sources.filter(update_on_project_update=True)
@@ -2148,11 +2307,12 @@ class RunInventoryUpdate(BaseTask):
# All credentials not used by inventory source injector
return inventory_update.get_extra_credentials()
def pre_run_hook(self, inventory_update):
def pre_run_hook(self, inventory_update, private_data_dir):
source_project = None
if inventory_update.inventory_source:
source_project = inventory_update.inventory_source.source_project
if (inventory_update.source=='scm' and inventory_update.launch_type!='scm' and source_project):
# In project sync, pulling galaxy roles is not needed
local_project_sync = source_project.create_project_update(
_eager_fields=dict(
launch_type="sync",

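The `build_extra_vars` hunk above combines two small fallback patterns: a dict-based default SCM branch (`scm_branch or {'hg': 'tip'}.get(scm_type, 'HEAD')`) and a version gate that disables collections at or below Ansible 2.8. A minimal plain-Python sketch, with `parse_version` as a simplistic stand-in for the `Version` class the real code imports:

```python
# Sketch of the two fallback patterns in the build_extra_vars hunk above.
# parse_version is a simplified stand-in, not AWX's actual version parser.

def default_branch(scm_type, scm_branch=None):
    # Mirrors: scm_branch or {'hg': 'tip'}.get(scm_type, 'HEAD')
    return scm_branch or {'hg': 'tip'}.get(scm_type, 'HEAD')

def parse_version(v):
    # Naive dotted-version parse; good enough to show the comparison
    return tuple(int(part) for part in v.split('.') if part.isdigit())

def collections_enabled(ansible_version, setting=True):
    # Collections arrived in Ansible 2.8; the hunk disables them for
    # versions at or below 2.8 (so '2.8.1' passes, bare '2.8' does not)
    return setting and parse_version(ansible_version) > parse_version('2.8')

print(default_branch('hg'))          # tip
print(default_branch('git'))         # HEAD
print(collections_enabled('2.7.8'))  # False
print(collections_enabled('2.9.0'))  # True
```

Note the edge the comment in the hunk glosses over: because the check is `<= Version('2.8')`, an exact `2.8` report disables collections even though 2.8 supports them.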

@@ -8,9 +8,11 @@ fail_on_template_errors: false
hostvar_expressions:
ansible_host: private_ipv4_addresses[0]
computer_name: name
private_ip: private_ipv4_addresses[0]
private_ip: private_ipv4_addresses[0] if private_ipv4_addresses else None
provisioning_state: provisioning_state | title
public_ip: public_ipv4_addresses[0]
public_ip: public_ipv4_addresses[0] if public_ipv4_addresses else None
public_ip_id: public_ip_id if public_ip_id is defined else None
public_ip_name: public_ip_name if public_ip_name is defined else None
tags: tags if tags else None
type: resource_type
keyed_groups:

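The Azure hostvar hunk above replaces bare list indexing with a guarded Jinja2 conditional expression. A plain-Python sketch of why that matters — indexing an empty address list raises, while the conditional form degrades to `None` (the sample data below is invented for illustration):

```python
# Equivalent of the Jinja2 expression in the hunk above:
#   private_ipv4_addresses[0] if private_ipv4_addresses else None
def first_or_none(addresses):
    return addresses[0] if addresses else None

host_with_ip = {'private_ipv4_addresses': ['10.0.0.4']}
host_without_ip = {'private_ipv4_addresses': []}  # e.g. a deallocated VM

print(first_or_none(host_with_ip['private_ipv4_addresses']))     # 10.0.0.4
print(first_or_none(host_without_ip['private_ipv4_addresses']))  # None
```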

@@ -1,4 +1,5 @@
{
"ANSIBLE_JINJA2_NATIVE": "True",
"ANSIBLE_TRANSFORM_INVALID_GROUP_CHARS": "never",
"GCE_CREDENTIALS_FILE_PATH": "{{ file_reference }}",
"GCP_AUTH_KIND": "serviceaccount",


@@ -1,6 +1,6 @@
auth_kind: serviceaccount
compose:
ansible_ssh_host: networkInterfaces[0].accessConfigs[0].natIP
ansible_ssh_host: networkInterfaces[0].accessConfigs[0].natIP | default(networkInterfaces[0].networkIP)
gce_description: description if description else None
gce_id: id
gce_image: image
@@ -9,7 +9,7 @@ compose:
gce_name: name
gce_network: networkInterfaces[0].network.name
gce_private_ip: networkInterfaces[0].networkIP
gce_public_ip: networkInterfaces[0].accessConfigs[0].natIP
gce_public_ip: networkInterfaces[0].accessConfigs[0].natIP | default(None)
gce_status: status
gce_subnetwork: networkInterfaces[0].subnetwork.name
gce_tags: tags.get("items", [])

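The GCE compose hunk above uses Jinja2's `default()` filter so `ansible_ssh_host` falls back to the internal `networkIP` when an instance has no NAT'd external address. A plain-Python analogue of that fallback (the instance dicts are made-up samples, not real API payloads):

```python
# Sketch of: networkInterfaces[0].accessConfigs[0].natIP | default(networkInterfaces[0].networkIP)
def ssh_host(interface):
    configs = interface.get('accessConfigs') or []
    nat_ip = configs[0].get('natIP') if configs else None
    # Use the internal address when no NAT IP exists, like Jinja2's default()
    return nat_ip or interface['networkIP']

public_vm = {'accessConfigs': [{'natIP': '34.68.0.10'}], 'networkIP': '10.128.0.2'}
internal_vm = {'accessConfigs': [], 'networkIP': '10.128.0.3'}

print(ssh_host(public_vm))    # 34.68.0.10
print(ssh_host(internal_vm))  # 10.128.0.3
```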

@@ -117,7 +117,7 @@ def mk_credential(name, credential_type='ssh', persisted=True):
def mk_notification_template(name, notification_type='webhook', configuration=None, organization=None, persisted=True):
nt = NotificationTemplate(name=name)
nt.notification_type = notification_type
nt.notification_configuration = configuration or dict(url="http://localhost", headers={"Test": "Header"})
nt.notification_configuration = configuration or dict(url="http://localhost", username="", password="", headers={"Test": "Header"})
if organization is not None:
nt.organization = organization
@@ -216,7 +216,7 @@ def mk_workflow_job_template(name, extra_vars='', spec=None, organization=None,
def mk_workflow_job_template_node(workflow_job_template=None,
unified_job_template=None,
unified_job_template=None,
success_nodes=None,
failure_nodes=None,
always_nodes=None,
@@ -231,11 +231,11 @@ def mk_workflow_job_template_node(workflow_job_template=None,
return workflow_node
def mk_workflow_job_node(unified_job_template=None,
def mk_workflow_job_node(unified_job_template=None,
success_nodes=None,
failure_nodes=None,
always_nodes=None,
workflow_job=None,
workflow_job=None,
job=None,
persisted=True):
workflow_node = WorkflowJobNode(unified_job_template=unified_job_template,
@@ -247,4 +247,3 @@ def mk_workflow_job_node(unified_job_template=None,
if persisted:
workflow_node.save()
return workflow_node


@@ -9,17 +9,17 @@ from django.conf import settings
from awx.main.analytics import gather, register
@register('example')
@register('example', '1.0')
def example(since):
return {'awx': 123}
@register('bad_json')
@register('bad_json', '1.0')
def bad_json(since):
return set()
@register('throws_error')
@register('throws_error', '1.0')
def throws_error(since):
raise ValueError()

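The analytics hunk above adds a version argument to the `@register` decorator. A minimal sketch of what such a versioned collector registry could look like — the registry dict and attribute names here are assumptions for illustration, not AWX's actual implementation:

```python
# Hypothetical versioned collector registry, sketching the @register('key', '1.0') shape
COLLECTORS = {}

def register(key, version):
    def decorate(func):
        # Attribute names are invented; AWX's real decorator may differ
        func._analytics_key = key
        func._analytics_version = version
        COLLECTORS[key] = func
        return func
    return decorate

@register('example', '1.0')
def example(since):
    return {'awx': 123}

print(COLLECTORS['example']._analytics_version)  # 1.0
print(example(None))  # {'awx': 123}
```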

@@ -26,7 +26,8 @@ def test_empty():
"team": 0,
"user": 0,
"workflow_job_template": 0,
"unified_job": 0
"unified_job": 0,
"pending_jobs": 0
}


@@ -30,6 +30,7 @@ EXPECTED_VALUES = {
'awx_instance_info':1.0,
'awx_license_instance_total':0,
'awx_license_instance_free':0,
'awx_pending_jobs_total':0,
}

File diff suppressed because it is too large


@@ -6,10 +6,7 @@ from awx.api.versioning import reverse
@pytest.mark.django_db
def test_associate_credential_input_source(get, post, delete, admin, vault_credential, external_credential):
list_url = reverse(
'api:credential_input_source_list',
kwargs={'version': 'v2'}
)
list_url = reverse('api:credential_input_source_list')
# attach
params = {
@@ -35,7 +32,7 @@ def test_associate_credential_input_source(get, post, delete, admin, vault_crede
response = delete(
reverse(
'api:credential_input_source_detail',
kwargs={'version': 'v2', 'pk': detail.data['id']}
kwargs={'pk': detail.data['id']}
),
admin
)
@@ -55,10 +52,7 @@ def test_associate_credential_input_source(get, post, delete, admin, vault_crede
{'extraneous': 'foo'}, # invalid parameter
])
def test_associate_credential_input_source_with_invalid_metadata(get, post, admin, vault_credential, external_credential, metadata):
list_url = reverse(
'api:credential_input_source_list',
kwargs={'version': 'v2'},
)
list_url = reverse('api:credential_input_source_list')
params = {
'target_credential': vault_credential.pk,
@@ -81,7 +75,6 @@ def test_create_from_list(get, post, admin, vault_credential, external_credentia
}
assert post(reverse(
'api:credential_input_source_list',
kwargs={'version': 'v2'}
), params, admin).status_code == 201
assert CredentialInputSource.objects.count() == 1
@@ -90,7 +83,6 @@ def test_create_from_list(get, post, admin, vault_credential, external_credentia
def test_create_credential_input_source_with_external_target_returns_400(post, admin, external_credential, other_external_credential):
list_url = reverse(
'api:credential_input_source_list',
kwargs={'version': 'v2'}
)
params = {
'target_credential': other_external_credential.pk,
@@ -107,7 +99,6 @@ def test_create_credential_input_source_with_external_target_returns_400(post, a
def test_input_source_rbac_associate(get, post, delete, alice, vault_credential, external_credential):
list_url = reverse(
'api:credential_input_source_list',
kwargs={'version': 'v2'}
)
params = {
'target_credential': vault_credential.pk,
@@ -142,7 +133,7 @@ def test_input_source_rbac_associate(get, post, delete, alice, vault_credential,
# alice can't admin the target (so she can't remove the input source)
delete_url = reverse(
'api:credential_input_source_detail',
kwargs={'version': 'v2', 'pk': detail.data['id']}
kwargs={'pk': detail.data['id']}
)
response = delete(delete_url, alice)
assert response.status_code == 403
@@ -159,7 +150,7 @@ def test_input_source_detail_rbac(get, post, patch, delete, admin, alice,
other_external_credential):
sublist_url = reverse(
'api:credential_input_source_sublist',
kwargs={'version': 'v2', 'pk': vault_credential.pk}
kwargs={'pk': vault_credential.pk}
)
params = {
'source_credential': external_credential.pk,
@@ -213,10 +204,7 @@ def test_input_source_detail_rbac(get, post, patch, delete, admin, alice,
def test_input_source_create_rbac(get, post, patch, delete, alice,
vault_credential, external_credential,
other_external_credential):
list_url = reverse(
'api:credential_input_source_list',
kwargs={'version': 'v2'}
)
list_url = reverse('api:credential_input_source_list')
params = {
'target_credential': vault_credential.pk,
'source_credential': external_credential.pk,
@@ -248,10 +236,7 @@ def test_input_source_rbac_swap_target_credential(get, post, put, patch, admin,
# you have to have admin role on the *original* credential (so you can
# remove the relationship) *and* on the *new* credential (so you can apply the
# new relationship)
list_url = reverse(
'api:credential_input_source_list',
kwargs={'version': 'v2'}
)
list_url = reverse('api:credential_input_source_list')
params = {
'target_credential': vault_credential.pk,
'source_credential': external_credential.pk,
@@ -292,10 +277,7 @@ def test_input_source_rbac_change_metadata(get, post, put, patch, admin, alice,
machine_credential, external_credential):
# To change an input source, a user must have admin permissions on the
# target credential and use permissions on the source credential.
list_url = reverse(
'api:credential_input_source_list',
kwargs={'version': 'v2'}
)
list_url = reverse('api:credential_input_source_list')
params = {
'target_credential': machine_credential.pk,
'source_credential': external_credential.pk,
@@ -328,10 +310,7 @@ def test_input_source_rbac_change_metadata(get, post, put, patch, admin, alice,
@pytest.mark.django_db
def test_create_credential_input_source_with_non_external_source_returns_400(post, admin, credential, vault_credential):
list_url = reverse(
'api:credential_input_source_list',
kwargs={'version': 'v2'}
)
list_url = reverse('api:credential_input_source_list')
params = {
'target_credential': vault_credential.pk,
'source_credential': credential.pk,
@@ -344,10 +323,7 @@ def test_create_credential_input_source_with_non_external_source_returns_400(pos
@pytest.mark.django_db
def test_create_credential_input_source_with_undefined_input_returns_400(post, admin, vault_credential, external_credential):
list_url = reverse(
'api:credential_input_source_list',
kwargs={'version': 'v2'}
)
list_url = reverse('api:credential_input_source_list')
params = {
'target_credential': vault_credential.pk,
'source_credential': external_credential.pk,
@@ -361,10 +337,7 @@ def test_create_credential_input_source_with_undefined_input_returns_400(post, a
@pytest.mark.django_db
def test_create_credential_input_source_with_already_used_input_returns_400(post, admin, vault_credential, external_credential, other_external_credential):
list_url = reverse(
'api:credential_input_source_list',
kwargs={'version': 'v2'}
)
list_url = reverse('api:credential_input_source_list')
all_params = [{
'target_credential': vault_credential.pk,
'source_credential': external_credential.pk,

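The test hunks above repeatedly drop the `version` kwarg from `reverse()` calls, since the URL names no longer take a version parameter. A toy sketch of why the call sites get simpler when the version segment is baked into the pattern itself (the pattern table below is invented, not AWX's real urlconf):

```python
# Toy URL reversing: the /v2/ segment lives in the pattern, so callers
# only pass object-specific kwargs such as pk.
URLPATTERNS = {
    'api:credential_input_source_list': '/api/v2/credential_input_sources/',
    'api:credential_input_source_detail': '/api/v2/credential_input_sources/{pk}/',
}

def reverse(name, kwargs=None):
    return URLPATTERNS[name].format(**(kwargs or {}))

print(reverse('api:credential_input_source_list'))
# /api/v2/credential_input_sources/
print(reverse('api:credential_input_source_detail', kwargs={'pk': 42}))
# /api/v2/credential_input_sources/42/
```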

@@ -32,7 +32,7 @@ def test_extra_credentials_filtering(get, job_template, admin,
job_template.credentials.add(credential)
url = reverse(
'api:job_template_extra_credentials_list',
kwargs={'version': 'v2', 'pk': job_template.pk}
kwargs={'pk': job_template.pk}
)
resp = get(url, admin, expect=200)
assert resp.data['count'] == 1
@@ -45,7 +45,7 @@ def test_extra_credentials_requires_cloud_or_net(get, post, job_template, admin,
net_credential):
url = reverse(
'api:job_template_extra_credentials_list',
kwargs={'version': 'v2', 'pk': job_template.pk}
kwargs={'pk': job_template.pk}
)
for cred in (machine_credential, vault_credential):
@@ -63,7 +63,7 @@ def test_extra_credentials_requires_cloud_or_net(get, post, job_template, admin,
def test_prevent_multiple_machine_creds(get, post, job_template, admin, machine_credential):
url = reverse(
'api:job_template_credentials_list',
kwargs={'version': 'v2', 'pk': job_template.pk}
kwargs={'pk': job_template.pk}
)
def _new_cred(name):
@@ -120,7 +120,7 @@ def test_extra_credentials_unique_by_kind(get, post, job_template, admin,
credentialtype_aws):
url = reverse(
'api:job_template_extra_credentials_list',
kwargs={'version': 'v2', 'pk': job_template.pk}
kwargs={'pk': job_template.pk}
)
def _new_cred(name):


@@ -70,7 +70,7 @@ class TestDeleteViews:
delete(
reverse(
'api:inventory_source_hosts_list',
kwargs={'version': 'v2', 'pk': inventory_source.pk}
kwargs={'pk': inventory_source.pk}
), user=rando, expect=403
)
@@ -80,7 +80,7 @@ class TestDeleteViews:
delete(
reverse(
'api:inventory_source_hosts_list',
kwargs={'version': 'v2', 'pk': inventory_source.pk}
kwargs={'pk': inventory_source.pk}
), user=rando, expect=204
)
assert inventory_source.hosts.count() == 0


@@ -600,7 +600,7 @@ class TestControlledBySCM:
assert scm_inventory.inventory_sources.count() == 0
def test_adding_inv_src_ok(self, post, scm_inventory, admin_user):
post(reverse('api:inventory_inventory_sources_list', kwargs={'version': 'v2', 'pk': scm_inventory.id}),
post(reverse('api:inventory_inventory_sources_list', kwargs={'pk': scm_inventory.id}),
{'name': 'new inv src', 'update_on_project_update': False, 'source': 'scm', 'overwrite_vars': True},
admin_user, expect=201)
@@ -611,7 +611,7 @@ class TestControlledBySCM:
def test_two_update_on_project_update_inv_src_prohibited(self, patch, scm_inventory, factory_scm_inventory, project, admin_user):
scm_inventory2 = factory_scm_inventory(name="scm_inventory2")
res = patch(reverse('api:inventory_source_detail', kwargs={'version': 'v2', 'pk': scm_inventory2.id}),
res = patch(reverse('api:inventory_source_detail', kwargs={'pk': scm_inventory2.id}),
{'update_on_project_update': True,},
admin_user, expect=400)
content = json.loads(res.content)


@@ -31,7 +31,7 @@ def test_extra_credentials(get, organization_factory, job_template_factory, cred
jt.save()
job = jt.create_unified_job()
url = reverse('api:job_extra_credentials_list', kwargs={'version': 'v2', 'pk': job.pk})
url = reverse('api:job_extra_credentials_list', kwargs={'pk': job.pk})
response = get(url, user=objs.superusers.admin)
assert response.data.get('count') == 1
@@ -225,19 +225,19 @@ def test_disallowed_http_update_methods(put, patch, post, inventory, project, ad
)
job = jt.create_unified_job()
post(
url=reverse('api:job_detail', kwargs={'pk': job.pk, 'version': 'v2'}),
url=reverse('api:job_detail', kwargs={'pk': job.pk}),
data={},
user=admin_user,
expect=405
)
put(
url=reverse('api:job_detail', kwargs={'pk': job.pk, 'version': 'v2'}),
url=reverse('api:job_detail', kwargs={'pk': job.pk}),
data={},
user=admin_user,
expect=405
)
patch(
url=reverse('api:job_detail', kwargs={'pk': job.pk, 'version': 'v2'}),
url=reverse('api:job_detail', kwargs={'pk': job.pk}),
data={},
user=admin_user,
expect=405


@@ -516,6 +516,25 @@ def test_job_launch_JT_with_credentials(machine_credential, credential, net_cred
assert machine_credential in creds
@pytest.mark.django_db
def test_job_branch_rejected_and_accepted(deploy_jobtemplate):
deploy_jobtemplate.ask_scm_branch_on_launch = True
deploy_jobtemplate.save()
prompted_fields, ignored_fields, errors = deploy_jobtemplate._accept_or_ignore_job_kwargs(
scm_branch='foobar'
)
assert 'scm_branch' in ignored_fields
assert 'does not allow override of branch' in errors['scm_branch']
deploy_jobtemplate.project.allow_override = True
deploy_jobtemplate.project.save()
prompted_fields, ignored_fields, errors = deploy_jobtemplate._accept_or_ignore_job_kwargs(
scm_branch='foobar'
)
assert not ignored_fields
assert prompted_fields['scm_branch'] == 'foobar'
@pytest.mark.django_db
@pytest.mark.job_runtime_vars
def test_job_launch_unprompted_vars_with_survey(mocker, survey_spec_factory, job_template_prompts, post, admin_user):


@@ -46,7 +46,7 @@ def test_extra_credential_creation(get, post, organization_factory, job_template
jt = job_template_factory("jt", organization=objs.organization,
inventory='test_inv', project='test_proj').job_template
url = reverse('api:job_template_extra_credentials_list', kwargs={'version': 'v2', 'pk': jt.pk})
url = reverse('api:job_template_extra_credentials_list', kwargs={'pk': jt.pk})
response = post(url, {
'name': 'My Cred',
'credential_type': credentialtype_aws.pk,
@@ -68,7 +68,7 @@ def test_invalid_credential_kind_xfail(get, post, organization_factory, job_temp
jt = job_template_factory("jt", organization=objs.organization,
inventory='test_inv', project='test_proj').job_template
url = reverse('api:job_template_credentials_list', kwargs={'version': 'v2', 'pk': jt.pk})
url = reverse('api:job_template_credentials_list', kwargs={'pk': jt.pk})
cred_type = CredentialType.defaults[kind]()
cred_type.save()
response = post(url, {
@@ -88,7 +88,7 @@ def test_extra_credential_unique_type_xfail(get, post, organization_factory, job
jt = job_template_factory("jt", organization=objs.organization,
inventory='test_inv', project='test_proj').job_template
url = reverse('api:job_template_extra_credentials_list', kwargs={'version': 'v2', 'pk': jt.pk})
url = reverse('api:job_template_extra_credentials_list', kwargs={'pk': jt.pk})
response = post(url, {
'name': 'My Cred',
'credential_type': credentialtype_aws.pk,
@@ -124,7 +124,7 @@ def test_attach_extra_credential(get, post, organization_factory, job_template_f
jt = job_template_factory("jt", organization=objs.organization,
inventory='test_inv', project='test_proj').job_template
url = reverse('api:job_template_extra_credentials_list', kwargs={'version': 'v2', 'pk': jt.pk})
url = reverse('api:job_template_extra_credentials_list', kwargs={'pk': jt.pk})
response = post(url, {
'associate': True,
'id': credential.id,
@@ -143,7 +143,7 @@ def test_detach_extra_credential(get, post, organization_factory, job_template_f
jt.credentials.add(credential)
jt.save()
url = reverse('api:job_template_extra_credentials_list', kwargs={'version': 'v2', 'pk': jt.pk})
url = reverse('api:job_template_extra_credentials_list', kwargs={'pk': jt.pk})
response = post(url, {
'disassociate': True,
'id': credential.id,
@@ -161,7 +161,7 @@ def test_attach_extra_credential_wrong_kind_xfail(get, post, organization_factor
jt = job_template_factory("jt", organization=objs.organization,
inventory='test_inv', project='test_proj').job_template
url = reverse('api:job_template_extra_credentials_list', kwargs={'version': 'v2', 'pk': jt.pk})
url = reverse('api:job_template_extra_credentials_list', kwargs={'pk': jt.pk})
response = post(url, {
'associate': True,
'id': machine_credential.id,
@@ -505,3 +505,37 @@ def test_callback_disallowed_null_inventory(project):
with pytest.raises(ValidationError) as exc:
serializer.validate({'host_config_key': 'asdfbasecfeee'})
assert 'Cannot enable provisioning callback without an inventory set' in str(exc)
@pytest.mark.django_db
def test_job_template_branch_error(project, inventory, post, admin_user):
r = post(
url=reverse('api:job_template_list'),
data={
"name": "fooo",
"inventory": inventory.pk,
"project": project.pk,
"playbook": "helloworld.yml",
"scm_branch": "foobar"
},
user=admin_user,
expect=400
)
assert 'Project does not allow overriding branch' in str(r.data['scm_branch'])
@pytest.mark.django_db
def test_job_template_branch_prompt_error(project, inventory, post, admin_user):
r = post(
url=reverse('api:job_template_list'),
data={
"name": "fooo",
"inventory": inventory.pk,
"project": project.pk,
"playbook": "helloworld.yml",
"ask_scm_branch_on_launch": True
},
user=admin_user,
expect=400
)
assert 'Project does not allow overriding branch' in str(r.data['ask_scm_branch_on_launch'])


@@ -51,7 +51,7 @@ def test_pagination_cap_page_size(get, admin, inventory):
def host_list_url(params):
request_qs = '?' + urlencode(params)
return reverse('api:host_list', kwargs={'version': 'v2'}) + request_qs
return reverse('api:host_list') + request_qs
with patch('awx.api.pagination.Pagination.max_page_size', 5):
resp = get(host_list_url({'page': '2', 'page_size': '10'}), user=admin)


@@ -5,17 +5,18 @@ from django.conf import settings
import pytest
from awx.api.versioning import reverse
from awx.main.models import Project, JobTemplate
@pytest.mark.django_db
class TestInsightsCredential:
def test_insights_credential(self, patch, insights_project, admin_user, insights_credential):
patch(insights_project.get_absolute_url(),
patch(insights_project.get_absolute_url(),
{'credential': insights_credential.id}, admin_user,
expect=200)
def test_non_insights_credential(self, patch, insights_project, admin_user, scm_credential):
patch(insights_project.get_absolute_url(),
patch(insights_project.get_absolute_url(),
{'credential': scm_credential.id}, admin_user,
expect=400)
@@ -44,3 +45,53 @@ def test_project_unset_custom_virtualenv(get, patch, project, admin, value):
url = reverse('api:project_detail', kwargs={'pk': project.id})
resp = patch(url, {'custom_virtualenv': value}, user=admin, expect=200)
assert resp.data['custom_virtualenv'] is None
@pytest.mark.django_db
def test_no_changing_overwrite_behavior_if_used(post, patch, organization, admin_user):
r1 = post(
url=reverse('api:project_list'),
data={
'name': 'fooo',
'organization': organization.id,
'allow_override': True
},
user=admin_user,
expect=201
)
jt = JobTemplate.objects.create(
name='provides branch', project_id=r1.data['id'],
playbook='helloworld.yml',
scm_branch='foobar'
)
r2 = patch(
url=reverse('api:project_detail', kwargs={'pk': r1.data['id']}),
data={'allow_override': False},
user=admin_user,
expect=400
)
p = Project.objects.get(pk=r1.data['id'])
assert 'job templates depend on branch override behavior for this project' in str(r2.data['allow_override'])
assert 'ids: {}'.format(jt.id) in str(r2.data['allow_override'])
assert p.allow_override is True
@pytest.mark.django_db
def test_changing_overwrite_behavior_okay_if_not_used(post, patch, organization, admin_user):
r1 = post(
url=reverse('api:project_list'),
data={
'name': 'fooo',
'organization': organization.id,
'allow_override': True
},
user=admin_user,
expect=201
)
patch(
url=reverse('api:project_detail', kwargs={'pk': r1.data['id']}),
data={'allow_override': False},
user=admin_user,
expect=200
)
assert Project.objects.get(pk=r1.data['id']).allow_override is False


@@ -11,7 +11,7 @@ def test_empty_inventory(post, get, admin_user, organization, group_factory):
kind='',
organization=organization)
inventory.save()
resp = get(reverse('api:inventory_script_view', kwargs={'version': 'v2', 'pk': inventory.pk}), admin_user)
resp = get(reverse('api:inventory_script_view', kwargs={'pk': inventory.pk}), admin_user)
jdata = json.loads(resp.content)
jdata.pop('all')
@@ -27,7 +27,7 @@ def test_ungrouped_hosts(post, get, admin_user, organization, group_factory):
inventory.save()
Host.objects.create(name='first_host', inventory=inventory)
Host.objects.create(name='second_host', inventory=inventory)
resp = get(reverse('api:inventory_script_view', kwargs={'version': 'v2', 'pk': inventory.pk}), admin_user)
resp = get(reverse('api:inventory_script_view', kwargs={'pk': inventory.pk}), admin_user)
jdata = json.loads(resp.content)
assert inventory.hosts.count() == 2
assert len(jdata['all']['hosts']) == 2


@@ -3,9 +3,17 @@ import json
from awx.api.versioning import reverse
from awx.main.models.activity_stream import ActivityStream
from awx.main.models.jobs import JobTemplate
from awx.main.models.workflow import WorkflowJobTemplateNode
from awx.main.models.workflow import (
WorkflowApproval,
WorkflowApprovalTemplate,
WorkflowJob,
WorkflowJobTemplate,
WorkflowJobTemplateNode,
)
from awx.main.models.credential import Credential
from awx.main.scheduler import TaskManager
@pytest.fixture
@@ -19,31 +27,18 @@ def job_template(inventory, project):
@pytest.fixture
def node(workflow_job_template, post, admin_user, job_template):
def node(workflow_job_template, admin_user, job_template):
return WorkflowJobTemplateNode.objects.create(
workflow_job_template=workflow_job_template,
unified_job_template=job_template
)
@pytest.mark.django_db
def test_blank_UJT_unallowed(workflow_job_template, post, admin_user):
url = reverse('api:workflow_job_template_workflow_nodes_list',
kwargs={'pk': workflow_job_template.pk})
r = post(url, {}, user=admin_user, expect=400)
assert 'unified_job_template' in r.data
@pytest.mark.django_db
def test_cannot_remove_UJT(node, patch, admin_user):
r = patch(
node.get_absolute_url(),
data={'unified_job_template': None},
user=admin_user,
expect=400
@pytest.fixture
def approval_node(workflow_job_template, admin_user):
return WorkflowJobTemplateNode.objects.create(
workflow_job_template=workflow_job_template
)
assert 'unified_job_template' in r.data
@pytest.mark.django_db
@@ -55,7 +50,7 @@ def test_node_rejects_unprompted_fields(inventory, project, workflow_job_templat
ask_limit_on_launch = False
)
url = reverse('api:workflow_job_template_workflow_nodes_list',
kwargs={'pk': workflow_job_template.pk, 'version': 'v2'})
kwargs={'pk': workflow_job_template.pk})
r = post(url, {'unified_job_template': job_template.pk, 'limit': 'webservers'},
user=admin_user, expect=400)
assert 'limit' in r.data
@@ -71,11 +66,196 @@ def test_node_accepts_prompted_fields(inventory, project, workflow_job_template,
ask_limit_on_launch = True
)
url = reverse('api:workflow_job_template_workflow_nodes_list',
kwargs={'pk': workflow_job_template.pk, 'version': 'v2'})
kwargs={'pk': workflow_job_template.pk})
post(url, {'unified_job_template': job_template.pk, 'limit': 'webservers'},
user=admin_user, expect=201)
@pytest.mark.django_db
class TestApprovalNodes():
def test_approval_node_creation(self, post, approval_node, admin_user):
url = reverse('api:workflow_job_template_node_create_approval',
kwargs={'pk': approval_node.pk, 'version': 'v2'})
post(url, {'name': 'Test', 'description': 'Approval Node', 'timeout': 0},
user=admin_user, expect=200)
approval_node = WorkflowJobTemplateNode.objects.get(pk=approval_node.pk)
assert isinstance(approval_node.unified_job_template, WorkflowApprovalTemplate)
assert approval_node.unified_job_template.name=='Test'
assert approval_node.unified_job_template.description=='Approval Node'
assert approval_node.unified_job_template.timeout==0
def test_approval_node_creation_failure(self, post, approval_node, admin_user):
# This test leaves off a required param to assert that user will get a 400.
url = reverse('api:workflow_job_template_node_create_approval',
kwargs={'pk': approval_node.pk, 'version': 'v2'})
r = post(url, {'name': '', 'description': 'Approval Node', 'timeout': 0},
user=admin_user, expect=400)
approval_node = WorkflowJobTemplateNode.objects.get(pk=approval_node.pk)
assert isinstance(approval_node.unified_job_template, WorkflowApprovalTemplate) is False
assert {'name': ['This field may not be blank.']} == json.loads(r.content)
@pytest.mark.parametrize("is_admin, is_org_admin, status", [
[True, False, 200], # if they're a WFJT admin, they get a 200
[False, False, 403], # if they're not a WFJT *nor* org admin, they get a 403
[False, True, 200], # if they're an organization admin, they get a 200
])
def test_approval_node_creation_rbac(self, post, approval_node, alice, is_admin, is_org_admin, status):
url = reverse('api:workflow_job_template_node_create_approval',
kwargs={'pk': approval_node.pk, 'version': 'v2'})
if is_admin is True:
approval_node.workflow_job_template.admin_role.members.add(alice)
if is_org_admin is True:
approval_node.workflow_job_template.organization.admin_role.members.add(alice)
post(url, {'name': 'Test', 'description': 'Approval Node', 'timeout': 0},
user=alice, expect=status)
@pytest.mark.django_db
def test_approval_node_exists(self, post, admin_user, get):
workflow_job_template = WorkflowJobTemplate.objects.create()
approval_node = WorkflowJobTemplateNode.objects.create(
workflow_job_template=workflow_job_template
)
url = reverse('api:workflow_job_template_node_create_approval',
kwargs={'pk': approval_node.pk, 'version': 'v2'})
post(url, {'name': 'URL Test', 'description': 'An approval', 'timeout': 0},
user=admin_user)
get(url, admin_user, expect=200)
@pytest.mark.django_db
def test_activity_stream_create_wf_approval(self, post, admin_user, workflow_job_template):
wfjn = WorkflowJobTemplateNode.objects.create(workflow_job_template=workflow_job_template)
url = reverse('api:workflow_job_template_node_create_approval',
kwargs={'pk': wfjn.pk, 'version': 'v2'})
post(url, {'name': 'Activity Stream Test', 'description': 'Approval Node', 'timeout': 0},
user=admin_user)
qs1 = ActivityStream.objects.filter(organization__isnull=False)
assert qs1.count() == 1
assert qs1[0].operation == 'create'
qs2 = ActivityStream.objects.filter(organization__isnull=True)
assert qs2.count() == 5
assert list(qs2.values_list('operation', 'object1')) == [('create', 'user'),
('create', 'workflow_job_template'),
('create', 'workflow_job_template_node'),
('create', 'workflow_approval_template'),
('update', 'workflow_job_template_node'),
]
@pytest.mark.django_db
def test_approval_node_approve(self, post, admin_user, job_template):
# This test ensures that a user (with permissions to do so) can APPROVE
# workflow approvals. Also asserts that trying to APPROVE approvals
# that have already been dealt with will throw an error.
wfjt = WorkflowJobTemplate.objects.create(name='foobar')
node = wfjt.workflow_nodes.create(unified_job_template=job_template)
url = reverse('api:workflow_job_template_node_create_approval',
kwargs={'pk': node.pk, 'version': 'v2'})
post(url, {'name': 'Approve Test', 'description': '', 'timeout': 0},
user=admin_user, expect=200)
post(reverse('api:workflow_job_template_launch', kwargs={'pk': wfjt.pk}),
user=admin_user, expect=201)
wf_job = WorkflowJob.objects.first()
TaskManager().schedule()
TaskManager().schedule()
wfj_node = wf_job.workflow_nodes.first()
approval = wfj_node.job
assert approval.name == 'Approve Test'
post(reverse('api:workflow_approval_approve', kwargs={'pk': approval.pk}),
user=admin_user, expect=204)
# Test that there is an activity stream entry that was created for the "approve" action.
qs = ActivityStream.objects.order_by('-timestamp').first()
assert qs.object1 == 'workflow_approval'
assert qs.changes == '{"status": ["pending", "successful"]}'
assert WorkflowApproval.objects.get(pk=approval.pk).status == 'successful'
assert qs.operation == 'update'
post(reverse('api:workflow_approval_approve', kwargs={'pk': approval.pk}),
user=admin_user, expect=400)
@pytest.mark.django_db
def test_approval_node_deny(self, post, admin_user, job_template):
# This test ensures that a user (with permissions to do so) can DENY
# workflow approvals. Also asserts that trying to DENY approvals
# that have already been dealt with will throw an error.
wfjt = WorkflowJobTemplate.objects.create(name='foobar')
node = wfjt.workflow_nodes.create(unified_job_template=job_template)
url = reverse('api:workflow_job_template_node_create_approval',
kwargs={'pk': node.pk, 'version': 'v2'})
post(url, {'name': 'Deny Test', 'description': '', 'timeout': 0},
user=admin_user, expect=200)
post(reverse('api:workflow_job_template_launch', kwargs={'pk': wfjt.pk}),
user=admin_user, expect=201)
wf_job = WorkflowJob.objects.first()
TaskManager().schedule()
TaskManager().schedule()
wfj_node = wf_job.workflow_nodes.first()
approval = wfj_node.job
assert approval.name == 'Deny Test'
post(reverse('api:workflow_approval_deny', kwargs={'pk': approval.pk}),
user=admin_user, expect=204)
# Test that there is an activity stream entry that was created for the "deny" action.
qs = ActivityStream.objects.order_by('-timestamp').first()
assert qs.object1 == 'workflow_approval'
assert qs.changes == '{"status": ["pending", "failed"]}'
assert WorkflowApproval.objects.get(pk=approval.pk).status == 'failed'
assert qs.operation == 'update'
post(reverse('api:workflow_approval_deny', kwargs={'pk': approval.pk}),
user=admin_user, expect=400)
def test_approval_node_cleanup(self, post, approval_node, admin_user, get):
workflow_job_template = WorkflowJobTemplate.objects.create()
approval_node = WorkflowJobTemplateNode.objects.create(
workflow_job_template=workflow_job_template
)
url = reverse('api:workflow_job_template_node_create_approval',
kwargs={'pk': approval_node.pk, 'version': 'v2'})
post(url, {'name': 'URL Test', 'description': 'An approval', 'timeout': 0},
user=admin_user)
assert WorkflowApprovalTemplate.objects.count() == 1
workflow_job_template.delete()
assert WorkflowApprovalTemplate.objects.count() == 0
get(url, admin_user, expect=404)
def test_changed_approval_deletion(self, post, approval_node, admin_user, workflow_job_template, job_template):
# This test verifies that when an approval node changes into something else
# (in this case, a job template), then the previously-set WorkflowApprovalTemplate
# is automatically deleted.
workflow_job_template = WorkflowJobTemplate.objects.create()
approval_node = WorkflowJobTemplateNode.objects.create(
workflow_job_template=workflow_job_template
)
url = reverse('api:workflow_job_template_node_create_approval',
kwargs={'pk': approval_node.pk, 'version': 'v2'})
post(url, {'name': 'URL Test', 'description': 'An approval', 'timeout': 0},
user=admin_user)
assert WorkflowApprovalTemplate.objects.count() == 1
approval_node.unified_job_template = job_template
approval_node.save()
assert WorkflowApprovalTemplate.objects.count() == 0
def test_deleted_approval_denial(self, post, approval_node, admin_user, workflow_job_template):
# Verifying that when a WorkflowApprovalTemplate is deleted, any/all of
# its pending approvals are auto-denied (vs left in 'pending' state).
workflow_job_template = WorkflowJobTemplate.objects.create()
approval_node = WorkflowJobTemplateNode.objects.create(
workflow_job_template=workflow_job_template
)
url = reverse('api:workflow_job_template_node_create_approval',
kwargs={'pk': approval_node.pk, 'version': 'v2'})
post(url, {'name': 'URL Test', 'description': 'An approval', 'timeout': 0},
user=admin_user)
assert WorkflowApprovalTemplate.objects.count() == 1
approval_template = WorkflowApprovalTemplate.objects.first()
approval = approval_template.create_unified_job()
approval.status = 'pending'
approval.save()
approval_template.delete()
approval.refresh_from_db()
assert approval.status == 'failed'
@pytest.mark.django_db
class TestExclusiveRelationshipEnforcement():
@pytest.fixture
@@ -129,6 +309,12 @@ class TestNodeCredentials:
under the "credentials" key - WFJT nodes have a many-to-many relationship
corresponding to this, and it must follow rules consistent with other prompts
'''
@pytest.fixture
def job_template_ask(self, job_template):
job_template.ask_credential_on_launch = True
job_template.save()
return job_template
def test_not_allows_non_job_models(self, post, admin_user, workflow_job_template,
project, machine_credential):
node = WorkflowJobTemplateNode.objects.create(
@@ -146,21 +332,6 @@ class TestNodeCredentials:
)
assert 'cannot accept credentials on launch' in str(r.data['msg'])
@pytest.mark.django_db
class TestOldCredentialField:
'''
The field `credential` on JTs & WFJT nodes is deprecated, but still supported
TODO: remove tests when JT vault_credential / credential / other stuff
is removed
'''
@pytest.fixture
def job_template_ask(self, job_template):
job_template.ask_credential_on_launch = True
job_template.save()
return job_template
def test_credential_accepted_create(self, workflow_job_template, post, admin_user,
job_template_ask, machine_credential):
r = post(
@@ -168,16 +339,16 @@ class TestOldCredentialField:
'api:workflow_job_template_workflow_nodes_list',
kwargs = {'pk': workflow_job_template.pk}
),
- data = {'credential': machine_credential.pk, 'unified_job_template': job_template_ask.pk},
+ data = {'unified_job_template': job_template_ask.pk},
user = admin_user,
expect = 201
)
- assert r.data['credential'] == machine_credential.pk
+ node = WorkflowJobTemplateNode.objects.get(pk=r.data['id'])
+ post(url=r.data['related']['credentials'], data={'id': machine_credential.pk}, user=admin_user, expect=204)
+ assert list(node.credentials.all()) == [machine_credential]
@pytest.mark.parametrize('role,code', [
- ['use_role', 201],
+ ['use_role', 204],
['read_role', 403]
])
def test_credential_rbac(self, role, code, workflow_job_template, post, rando,
@@ -186,39 +357,41 @@ class TestOldCredentialField:
role_obj.members.add(rando)
job_template_ask.execute_role.members.add(rando)
workflow_job_template.admin_role.members.add(rando)
- post(
+ r = post(
reverse(
'api:workflow_job_template_workflow_nodes_list',
kwargs = {'pk': workflow_job_template.pk}
),
- data = {'credential': machine_credential.pk, 'unified_job_template': job_template_ask.pk},
+ data = {'unified_job_template': job_template_ask.pk},
user = rando,
- expect = code
+ expect = 201
)
creds_url = r.data['related']['credentials']
post(url=creds_url, data={'id': machine_credential.pk}, user=rando, expect=code)
- def test_credential_add_remove(self, node, patch, machine_credential, admin_user):
+ def test_credential_add_remove(self, node, get, post, machine_credential, admin_user):
node.unified_job_template.ask_credential_on_launch = True
node.unified_job_template.save()
url = node.get_absolute_url()
- patch(
- url,
- data = {'credential': machine_credential.pk},
+ r = get(url=url, user=admin_user, expect=200)
+ post(
+ url = r.data['related']['credentials'],
+ data = {'id': machine_credential.pk},
user = admin_user,
- expect = 200
+ expect = 204
)
node.refresh_from_db()
- assert node.credential == machine_credential.pk
- patch(
- url,
- data = {'credential': None},
+ post(
+ url = r.data['related']['credentials'],
+ data = {'id': machine_credential.pk, 'disassociate': True},
user = admin_user,
- expect = 200
+ expect = 204
)
node.refresh_from_db()
assert list(node.credentials.values_list('pk', flat=True)) == []
- def test_credential_replace(self, node, patch, credentialtype_ssh, admin_user):
+ def test_credential_replace(self, node, get, post, credentialtype_ssh, admin_user):
node.unified_job_template.ask_credential_on_launch = True
node.unified_job_template.save()
cred1 = Credential.objects.create(
@@ -230,12 +403,14 @@ class TestOldCredentialField:
name='machine-cred2',
inputs={'username': 'test_user', 'password': 'pas4word'})
node.credentials.add(cred1)
- assert node.credential == cred1.pk
url = node.get_absolute_url()
- patch(
- url,
- data = {'credential': cred2.pk},
- user = admin_user,
- expect = 200
- )
- assert node.credential == cred2.pk
r = get(url=url, user=admin_user, expect=200)
creds_url = r.data['related']['credentials']
# cannot do it this way
r2 = post(url=creds_url, data={'id': cred2.pk}, user=admin_user, expect=400)
assert 'This launch configuration already provides a Machine credential' in r2.data['msg']
# guess I will remove that existing one
post(url=creds_url, data={'id': cred1.pk, 'disassociate': True}, user=admin_user, expect=204)
# okay, now I will add the new one
post(url=creds_url, data={'id': cred2.pk}, user=admin_user, expect=204)
assert list(node.credentials.values_list('id', flat=True)) == [cred2.pk]


@@ -385,7 +385,9 @@ def notification_template(organization):
organization=organization,
notification_type="webhook",
notification_configuration=dict(url="http://localhost",
- headers={"Test": "Header"}))
+ username="",
+ password="",
+ headers={"Test": "Header",}))
@pytest.fixture


@@ -0,0 +1,148 @@
# -*- coding: utf-8 -*-
from copy import deepcopy
import datetime
import pytest
#from awx.main.models import NotificationTemplates, Notifications, JobNotificationMixin
from awx.main.models import (AdHocCommand, InventoryUpdate, Job, JobNotificationMixin, ProjectUpdate,
SystemJob, WorkflowJob)
from awx.api.serializers import UnifiedJobSerializer
class TestJobNotificationMixin(object):
CONTEXT_STRUCTURE = {'job': {'allow_simultaneous': bool,
'custom_virtualenv': str,
'controller_node': str,
'created': datetime.datetime,
'description': str,
'diff_mode': bool,
'elapsed': float,
'execution_node': str,
'failed': bool,
'finished': bool,
'force_handlers': bool,
'forks': int,
'host_status_counts': {'skipped': int, 'ok': int, 'changed': int,
'failures': int, 'dark': int},
'id': int,
'job_explanation': str,
'job_slice_count': int,
'job_slice_number': int,
'job_tags': str,
'job_type': str,
'launch_type': str,
'limit': str,
'modified': datetime.datetime,
'name': str,
'playbook': str,
'playbook_counts': {'play_count': int, 'task_count': int},
'scm_revision': str,
'skip_tags': str,
'start_at_task': str,
'started': str,
'status': str,
'summary_fields': {'created_by': {'first_name': str,
'id': int,
'last_name': str,
'username': str},
'instance_group': {'id': int, 'name': str},
'inventory': {'description': str,
'groups_with_active_failures': int,
'has_active_failures': bool,
'has_inventory_sources': bool,
'hosts_with_active_failures': int,
'id': int,
'inventory_sources_with_failures': int,
'kind': str,
'name': str,
'organization_id': int,
'total_groups': int,
'total_hosts': int,
'total_inventory_sources': int},
'job_template': {'description': str,
'id': int,
'name': str},
'labels': {'count': int, 'results': list},
'project': {'description': str,
'id': int,
'name': str,
'scm_type': str,
'status': str},
'project_update': {'id': int, 'name': str, 'description': str, 'status': str, 'failed': bool},
'unified_job_template': {'description': str,
'id': int,
'name': str,
'unified_job_type': str},
'source_workflow_job': {'description': str,
'elapsed': float,
'failed': bool,
'id': int,
'name': str,
'status': str}},
'timeout': int,
'type': str,
'url': str,
'use_fact_cache': bool,
'verbosity': int},
'job_friendly_name': str,
'job_summary_dict': str,
'url': str}
@pytest.mark.django_db
@pytest.mark.parametrize('JobClass', [AdHocCommand, InventoryUpdate, Job, ProjectUpdate, SystemJob, WorkflowJob])
def test_context(self, JobClass, sqlite_copy_expert, project, inventory_source):
"""The Jinja context defines all of the fields that can be used by a template. Ensure that the context generated
for each job type has the expected structure."""
def check_structure(expected_structure, obj):
if isinstance(expected_structure, dict):
assert isinstance(obj, dict)
for key in obj:
assert key in expected_structure
if obj[key] is None:
continue
if isinstance(expected_structure[key], dict):
assert isinstance(obj[key], dict)
check_structure(expected_structure[key], obj[key])
else:
assert isinstance(obj[key], expected_structure[key])
kwargs = {}
if JobClass is InventoryUpdate:
kwargs['inventory_source'] = inventory_source
elif JobClass is ProjectUpdate:
kwargs['project'] = project
job = JobClass.objects.create(name='foo', **kwargs)
job_serialization = UnifiedJobSerializer(job).to_representation(job)
context = job.context(job_serialization)
check_structure(TestJobNotificationMixin.CONTEXT_STRUCTURE, context)
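The recursive check_structure helper above can be lifted out and exercised on its own; a minimal standalone sketch (the expected-structure dict here is a toy stand-in, not the real CONTEXT_STRUCTURE):

```python
# Standalone version of the recursive structure check used above:
# every key present in the object must be declared in the expected
# structure, None leaves are skipped, nested dicts recurse, and other
# leaves are type-checked against the declared type.
def check_structure(expected, obj):
    if isinstance(expected, dict):
        assert isinstance(obj, dict)
        for key in obj:
            assert key in expected
            if obj[key] is None:
                continue
            if isinstance(expected[key], dict):
                check_structure(expected[key], obj[key])
            else:
                assert isinstance(obj[key], expected[key])

# illustrative structure standing in for CONTEXT_STRUCTURE
expected = {'job': {'id': int, 'name': str, 'counts': {'ok': int}}}
check_structure(expected, {'job': {'id': 1, 'name': 'demo', 'counts': None}})
```

An unexpected key or a wrongly-typed leaf raises AssertionError, which is exactly what surfaces as a failure in the test above.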
def test_context_stub(self):
"""The context stub is a fake context used to validate custom notification messages. Ensure that
this also has the expected structure. Furthermore, ensure that the stub context contains
*all* fields that could possibly be included in a context."""
def check_structure_and_completeness(expected_structure, obj):
expected_structure = deepcopy(expected_structure)
if isinstance(expected_structure, dict):
assert isinstance(obj, dict)
for key in obj:
assert key in expected_structure
# Context stub should not have any undefined fields
assert obj[key] is not None
if isinstance(expected_structure[key], dict):
assert isinstance(obj[key], dict)
check_structure_and_completeness(expected_structure[key], obj[key])
expected_structure.pop(key)
else:
assert isinstance(obj[key], expected_structure[key])
expected_structure.pop(key)
# Ensure all items in expected structure were present
assert not len(expected_structure)
context_stub = JobNotificationMixin.context_stub()
check_structure_and_completeness(TestJobNotificationMixin.CONTEXT_STRUCTURE, context_stub)


@@ -10,7 +10,7 @@ from django.contrib.contenttypes.models import ContentType
# AWX
from awx.main.models import (
UnifiedJobTemplate, Job, JobTemplate, WorkflowJobTemplate,
- Project, WorkflowJob, Schedule,
+ WorkflowApprovalTemplate, Project, WorkflowJob, Schedule,
Credential
)
@@ -20,7 +20,9 @@ def test_subclass_types(rando):
assert set(UnifiedJobTemplate._submodels_with_roles()) == set([
ContentType.objects.get_for_model(JobTemplate).id,
ContentType.objects.get_for_model(Project).id,
- ContentType.objects.get_for_model(WorkflowJobTemplate).id
+ ContentType.objects.get_for_model(WorkflowJobTemplate).id,
+ ContentType.objects.get_for_model(WorkflowApprovalTemplate).id
])


@@ -3,7 +3,9 @@ from unittest import mock
from awx.api.versioning import reverse
from awx.main.utils import decrypt_field
- from awx.main.models.workflow import WorkflowJobTemplateNode
+ from awx.main.models.workflow import (
+ WorkflowJobTemplate, WorkflowJobTemplateNode, WorkflowApprovalTemplate
+ )
from awx.main.models.jobs import JobTemplate
from awx.main.tasks import deep_copy_model_obj
@@ -175,6 +177,76 @@ def test_workflow_job_template_copy(workflow_job_template, post, get, admin, org
assert copied_node_list[4] in copied_node_list[3].failure_nodes.all()
@pytest.mark.django_db
def test_workflow_approval_node_copy(workflow_job_template, post, get, admin, organization):
workflow_job_template.organization = organization
workflow_job_template.save()
ajts = [
WorkflowApprovalTemplate.objects.create(
name='test-approval-{}'.format(i),
description='description-{}'.format(i),
timeout=30
)
for i in range(0, 5)
]
nodes = [
WorkflowJobTemplateNode.objects.create(
workflow_job_template=workflow_job_template, unified_job_template=ajts[i]
) for i in range(0, 5)
]
nodes[0].success_nodes.add(nodes[1])
nodes[1].success_nodes.add(nodes[2])
nodes[0].failure_nodes.add(nodes[3])
nodes[3].failure_nodes.add(nodes[4])
assert WorkflowJobTemplate.objects.count() == 1
assert WorkflowJobTemplateNode.objects.count() == 5
assert WorkflowApprovalTemplate.objects.count() == 5
with mock.patch('awx.api.generics.trigger_delayed_deep_copy') as deep_copy_mock:
wfjt_copy_id = post(
reverse('api:workflow_job_template_copy', kwargs={'pk': workflow_job_template.pk}),
{'name': 'new wfjt name'}, admin, expect=201
).data['id']
wfjt_copy = type(workflow_job_template).objects.get(pk=wfjt_copy_id)
args, kwargs = deep_copy_mock.call_args
deep_copy_model_obj(*args, **kwargs)
assert wfjt_copy.organization == organization
assert wfjt_copy.created_by == admin
assert wfjt_copy.name == 'new wfjt name'
assert WorkflowJobTemplate.objects.count() == 2
assert WorkflowJobTemplateNode.objects.count() == 10
assert WorkflowApprovalTemplate.objects.count() == 10
original_templates = [
x.unified_job_template for x in workflow_job_template.workflow_job_template_nodes.all()
]
copied_templates = [
x.unified_job_template for x in wfjt_copy.workflow_job_template_nodes.all()
]
# make sure shallow fields like `timeout` are copied properly
for i, t in enumerate(original_templates):
assert t.timeout == 30
assert t.description == 'description-{}'.format(i)
for i, t in enumerate(copied_templates):
assert t.timeout == 30
assert t.description == 'description-{}'.format(i)
# the Approval Template IDs on the *original* WFJT should not match *any*
# of the Approval Template IDs on the *copied* WFJT
assert not set([x.id for x in original_templates]).intersection(
set([x.id for x in copied_templates])
)
# if you remove the " copy" suffix from the copied template names, they
# should match the original templates
assert (
set([x.name for x in original_templates]) ==
set([x.name.replace(' copy', '') for x in copied_templates])
)
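The two closing assertions reduce to simple set logic; a small sketch with made-up names and primary keys (the values are illustrative, not from the test):

```python
# Copied approval templates must get fresh primary keys but reuse the
# original names with a " copy" suffix appended.
originals = {1: 'test-approval-0', 2: 'test-approval-1'}
copies = {11: 'test-approval-0 copy', 12: 'test-approval-1 copy'}

# no primary key may be shared between originals and copies
assert not set(originals).intersection(copies)
# stripping the " copy" suffix recovers the original names exactly
assert set(originals.values()) == {n.replace(' copy', '') for n in copies.values()}
```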
@pytest.mark.django_db
def test_credential_copy(post, get, machine_credential, credentialtype_ssh, admin):
assert get(


@@ -42,6 +42,8 @@ def test_basic_parameterization(get, post, user, organization):
assert 'notification_configuration' in response.data
assert 'url' in response.data['notification_configuration']
assert 'headers' in response.data['notification_configuration']
assert 'messages' in response.data
assert response.data['messages'] == {'started': None, 'success': None, 'error': None}
@pytest.mark.django_db
@@ -92,7 +94,7 @@ def test_inherited_notification_templates(get, post, user, organization, project
isrc.save()
jt = JobTemplate.objects.create(name='test', inventory=i, project=project, playbook='debug.yml')
jt.save()
@pytest.mark.django_db
def test_notification_template_simple_patch(patch, notification_template, admin):
@@ -124,7 +126,7 @@ def test_custom_environment_injection(post, user, organization):
organization=organization.id,
notification_type="webhook",
notification_configuration=dict(url="https://example.org", disable_ssl_verification=False,
- headers={"Test": "Header"})),
+ http_method="POST", headers={"Test": "Header"})),
u)
assert response.status_code == 201
template = NotificationTemplate.objects.get(pk=response.data['id'])


@@ -74,6 +74,19 @@ def test_org_credential_access_admin(role_name, alice, org_credential):
'organization': org_credential.organization.pk})
@pytest.mark.django_db
def test_org_and_user_credential_access(alice, organization):
"""Address specific bug where any user could make an org credential
in another org without any permissions to that org
"""
# Owner is both user and org, but org permission should still be checked
assert not CredentialAccess(alice).can_add({
'name': 'New credential.',
'user': alice.pk,
'organization': organization.pk
})
@pytest.mark.django_db
def test_org_credential_access_member(alice, org_credential):
org_credential.admin_role.members.add(alice)


@@ -6,6 +6,7 @@ from awx.main.access import (
InventoryAccess,
JobTemplateAccess,
)
from awx.main.models import Organization
@pytest.mark.django_db
@@ -36,6 +37,16 @@ def test_ig_normal_user_associability(organization, default_instance_group, user
assert not access.can_attach(organization, default_instance_group, 'instance_groups', None)
@pytest.mark.django_db
def test_access_via_two_organizations(rando, default_instance_group):
for org_name in ['org1', 'org2']:
org = Organization.objects.create(name=org_name)
org.instance_groups.add(default_instance_group)
org.admin_role.members.add(rando)
access = InstanceGroupAccess(rando)
assert list(access.get_queryset()) == [default_instance_group]
@pytest.mark.django_db
def test_ig_associability(organization, default_instance_group, admin, system_auditor, org_admin, org_member, job_template_factory):
admin_access = OrganizationAccess(admin)
@@ -53,7 +64,7 @@ def test_ig_associability(organization, default_instance_group, admin, system_au
assert not oadmin_access.can_unattach(organization, default_instance_group, 'instance_groups', None)
assert not auditor_access.can_unattach(organization, default_instance_group, 'instance_groups', None)
assert not omember_access.can_unattach(organization, default_instance_group, 'instance_groups', None)
objects = job_template_factory('jt', organization=organization, project='p',
inventory='i', credential='c')
admin_access = InventoryAccess(admin)
@@ -75,5 +86,3 @@ def test_ig_associability(organization, default_instance_group, admin, system_au
assert oadmin_access.can_attach(objects.job_template, default_instance_group, 'instance_groups', None)
assert not auditor_access.can_attach(objects.job_template, default_instance_group, 'instance_groups', None)
assert not omember_access.can_attach(objects.job_template, default_instance_group, 'instance_groups', None)


@@ -106,7 +106,7 @@ def test_job_template_extra_credentials_prompts_access(
jt.credentials.add(machine_credential)
jt.execute_role.members.add(rando)
r = post(
- reverse('api:job_template_launch', kwargs={'version': 'v2', 'pk': jt.id}),
+ reverse('api:job_template_launch', kwargs={'pk': jt.id}),
{'credentials': [machine_credential.pk, vault_credential.pk]}, rando
)
assert r.status_code == 403


@@ -22,7 +22,7 @@ def test_label_get_queryset_su(label, user):
@pytest.mark.django_db
def test_label_access(label, user):
access = LabelAccess(user('user', False))
- assert not access.can_read(label)
+ assert access.can_read(label)
@pytest.mark.django_db


@@ -87,7 +87,7 @@ def test_notification_template_access_admin(role, organization_factory, notifica
assert access.can_change(notification_template, {'organization': present_org.id})
assert access.can_delete(notification_template)
- nf = notification_template_factory("test-orphaned")
+ nf = notification_template_factory("test-orphaned").notification_template
assert not access.can_read(nf)
assert not access.can_change(nf, None)
assert not access.can_delete(nf)


@@ -0,0 +1,59 @@
# -*- coding: utf-8 -*-
import pytest
from rest_framework.serializers import ValidationError
# AWX
from awx.api.serializers import NotificationTemplateSerializer
class StubNotificationTemplate():
notification_type = 'email'
class TestNotificationTemplateSerializer():
@pytest.mark.parametrize('valid_messages',
[None,
{'started': None},
{'started': {'message': None}},
{'started': {'message': 'valid'}},
{'started': {'body': 'valid'}},
{'started': {'message': 'valid', 'body': 'valid'}},
{'started': None, 'success': None, 'error': None},
{'started': {'message': None, 'body': None},
'success': {'message': None, 'body': None},
'error': {'message': None, 'body': None}},
{'started': {'message': '{{ job.id }}', 'body': '{{ job.status }}'},
'success': {'message': None, 'body': '{{ job_friendly_name }}'},
'error': {'message': '{{ url }}', 'body': None}},
{'started': {'body': '{{ job_summary_dict }}'}},
{'started': {'body': '{{ job.summary_fields.inventory.total_hosts }}'}},
{'started': {'body': u'Iñtërnâtiônàlizætiøn'}}
])
def test_valid_messages(self, valid_messages):
serializer = NotificationTemplateSerializer()
serializer.instance = StubNotificationTemplate()
serializer.validate_messages(valid_messages)
@pytest.mark.parametrize('invalid_messages',
[1,
[],
'',
{'invalid_event': ''},
{'started': 'should_be_dict'},
{'started': {'bad_message_type': ''}},
{'started': {'message': 1}},
{'started': {'message': []}},
{'started': {'message': {}}},
{'started': {'message': '{{ unclosed_braces'}},
{'started': {'message': '{{ undefined }}'}},
{'started': {'message': '{{ job.undefined }}'}},
{'started': {'message': '{{ job.id | bad_filter }}'}},
{'started': {'message': '{{ job.__class__ }}'}},
{'started': {'message': 'Newlines \n not allowed\n'}},
])
def test_invalid_messages(self, invalid_messages):
serializer = NotificationTemplateSerializer()
serializer.instance = StubNotificationTemplate()
with pytest.raises(ValidationError):
serializer.validate_messages(invalid_messages)
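The shape the serializer accepts can be sketched as a plain structural validator: messages must be None or a dict of known events, each event None or a dict with only 'message'/'body' string values. This is a rough stdlib-only approximation — the real validate_messages also renders each template through Jinja to catch undefined variables, bad filters, and syntax errors, which is not reproduced here:

```python
# Structural sketch of the messages validation exercised above.
# ALLOWED_EVENTS is an assumption matching the events used in the tests.
ALLOWED_EVENTS = {'started', 'success', 'error'}

def validate_messages(messages):
    if messages is None:
        return
    if not isinstance(messages, dict):
        raise ValueError('messages must be a dict or None')
    for event, fields in messages.items():
        if event not in ALLOWED_EVENTS:
            raise ValueError('unknown event: {}'.format(event))
        if fields is None:
            continue
        if not isinstance(fields, dict):
            raise ValueError('event value must be a dict or None')
        for name, value in fields.items():
            if name not in ('message', 'body'):
                raise ValueError('unknown field: {}'.format(name))
            if value is not None and not isinstance(value, str):
                raise ValueError('{} must be a string'.format(name))

validate_messages({'started': {'message': 'valid', 'body': None}})  # accepted
```

The real serializer raises DRF's ValidationError rather than ValueError, and performs template rendering on top of this structural check.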


@@ -256,7 +256,7 @@ class TestExtraVarSanitation(TestJobExecution):
def test_vars_unsafe_by_default(self, job, private_data_dir):
job.created_by = User(pk=123, username='angry-spud')
job.inventory = Inventory(pk=123, name='example-inv')
task = tasks.RunJob()
task.build_extra_vars_file(job, private_data_dir)
@@ -361,15 +361,16 @@ class TestExtraVarSanitation(TestJobExecution):
class TestGenericRun():
def test_generic_failure(self, patch_Job):
- job = Job(status='running', inventory=Inventory())
+ job = Job(status='running', inventory=Inventory(), project=Project())
job.websocket_emit_status = mock.Mock()
task = tasks.RunJob()
task.update_model = mock.Mock(return_value=job)
task.build_private_data_files = mock.Mock(side_effect=OSError())
- with pytest.raises(Exception):
-     task.run(1)
+ with mock.patch('awx.main.tasks.copy_tree'):
+     with pytest.raises(Exception):
+         task.run(1)
update_model_call = task.update_model.call_args[1]
assert 'OSError' in update_model_call['result_traceback']
@@ -386,8 +387,9 @@ class TestGenericRun():
task.update_model = mock.Mock(wraps=update_model_wrapper)
task.build_private_data_files = mock.Mock()
- with pytest.raises(Exception):
-     task.run(1)
+ with mock.patch('awx.main.tasks.copy_tree'):
+     with pytest.raises(Exception):
+         task.run(1)
for c in [
mock.call(1, status='running', start_args=''),
@@ -434,6 +436,7 @@ class TestGenericRun():
job = Job(project=Project(), inventory=Inventory())
task = tasks.RunJob()
task.should_use_proot = lambda instance: True
task.instance = job
private_data_dir = '/foo'
cwd = '/bar'
@@ -445,7 +448,7 @@ class TestGenericRun():
process_isolation_params = task.build_params_process_isolation(job, private_data_dir, cwd)
assert True is process_isolation_params['process_isolation']
- assert settings.AWX_PROOT_BASE_PATH == process_isolation_params['process_isolation_path'], \
+ assert process_isolation_params['process_isolation_path'].startswith(settings.AWX_PROOT_BASE_PATH), \
"Directory where a temp directory will be created for the remapping to take place"
assert private_data_dir in process_isolation_params['process_isolation_show_paths'], \
"The per-job private data dir should be in the list of directories the user can see."
@@ -523,7 +526,10 @@ class TestGenericRun():
with mock.patch('awx.main.tasks.settings.AWX_ANSIBLE_COLLECTIONS_PATHS', ['/AWX_COLLECTION_PATH']):
with mock.patch('awx.main.tasks.settings.AWX_TASK_ENV', {'ANSIBLE_COLLECTIONS_PATHS': '/MY_COLLECTION1:/MY_COLLECTION2'}):
env = task.build_env(job, private_data_dir)
- assert env['ANSIBLE_COLLECTIONS_PATHS'] == '/MY_COLLECTION1:/MY_COLLECTION2:/AWX_COLLECTION_PATH'
+ used_paths = env['ANSIBLE_COLLECTIONS_PATHS'].split(':')
+ assert used_paths[-1].endswith('/requirements_collections')
+ used_paths.pop()
+ assert used_paths == ['/MY_COLLECTION1', '/MY_COLLECTION2', '/AWX_COLLECTION_PATH']
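The collections-path assertion boils down to colon-joined ordering: user-supplied paths come first, then the AWX-configured path, with the per-job requirements directory appended last. A tiny sketch (the per-job path name is illustrative):

```python
# Build an ANSIBLE_COLLECTIONS_PATHS value the way the assertion expects:
# task-env paths, then AWX paths, then the per-job directory appended last.
task_env = '/MY_COLLECTION1:/MY_COLLECTION2'
awx_paths = ['/AWX_COLLECTION_PATH']
per_job = '/tmp/private_data/requirements_collections'  # hypothetical job dir

env_value = ':'.join(task_env.split(':') + awx_paths + [per_job])

parts = env_value.split(':')
assert parts[-1].endswith('/requirements_collections')
assert parts[:-1] == ['/MY_COLLECTION1', '/MY_COLLECTION2', '/AWX_COLLECTION_PATH']
```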
def test_valid_custom_virtualenv(self, patch_Job, private_data_dir):
job = Job(project=Project(), inventory=Inventory())
@@ -1701,6 +1707,7 @@ class TestProjectUpdateCredentials(TestJobExecution):
def test_process_isolation_exposes_projects_root(self, private_data_dir, project_update):
task = tasks.RunProjectUpdate()
task.revision_path = 'foobar'
task.instance = project_update
ssh = CredentialType.defaults['ssh']()
project_update.scm_type = 'git'
project_update.credential = Credential(
@@ -1718,8 +1725,6 @@ class TestProjectUpdateCredentials(TestJobExecution):
call_args, _ = task._write_extra_vars_file.call_args_list[0]
_, extra_vars = call_args
assert extra_vars["scm_revision_output"] == 'foobar'
def test_username_and_password_auth(self, project_update, scm_type):
task = tasks.RunProjectUpdate()
ssh = CredentialType.defaults['ssh']()


@@ -11,9 +11,9 @@
# scm_username: username (only for svn/insights)
# scm_password: password (only for svn/insights)
# scm_accept_hostkey: true/false (only for git, set automatically)
# scm_revision: current revision in tower
# scm_revision_output: where to store gathered revision (temporary file)
# scm_refspec: a refspec to fetch in addition to obtaining version
# roles_enabled: Allow us to pull roles from a requirements.yml file
# roles_destination: Path to save roles from galaxy to
# awx_version: Current running version of the awx or tower as a string
# awx_license_type: "open" for AWX; else presume Tower
@@ -29,27 +29,12 @@
delegate_to: localhost
- block:
- name: check repo using git
git:
dest: "{{project_path|quote}}"
repo: "{{scm_url}}"
version: "{{scm_branch|quote}}"
force: "{{scm_clean}}"
update: false
clone: false
register: repo_check
when: scm_full_checkout|default('')
ignore_errors: true
- name: break if already checked out
meta: end_play
when: scm_full_checkout|default('') and repo_check is succeeded and repo_check.before == scm_branch
- name: update project using git
git:
dest: "{{project_path|quote}}"
repo: "{{scm_url}}"
version: "{{scm_branch|quote}}"
refspec: "{{scm_refspec|default(omit)}}"
force: "{{scm_clean}}"
accept_hostkey: "{{scm_accept_hostkey|default(omit)}}"
register: git_result
@@ -131,13 +116,6 @@
debug: msg="Repository Version {{ scm_version }}"
when: scm_version is defined
- name: Write Repository Version
copy:
dest: "{{ scm_revision_output }}"
content: "{{ scm_version }}"
when: scm_version is defined and scm_revision_output is defined
delegate_to: localhost
- hosts: all
gather_facts: false
tasks:
@@ -148,18 +126,28 @@
register: doesRequirementsExist
- name: fetch galaxy roles from requirements.yml
- command: ansible-galaxy install -r requirements.yml -p {{project_path|quote}}/roles/
+ command: ansible-galaxy install -r requirements.yml -p {{roles_destination|quote}}
args:
chdir: "{{project_path|quote}}/roles"
register: galaxy_result
- when: doesRequirementsExist.stat.exists and (scm_version is undefined or (git_result is not skipped and git_result['before'] == git_result['after']))
+ when: doesRequirementsExist.stat.exists
changed_when: "'was installed successfully' in galaxy_result.stdout"
- - name: fetch galaxy roles from requirements.yml (forced update)
-   command: ansible-galaxy install -r requirements.yml -p {{project_path|quote}}/roles/ --force
-   args:
-     chdir: "{{project_path|quote}}/roles"
-   when: doesRequirementsExist.stat.exists and galaxy_result is skipped
+ when: roles_enabled|bool
delegate_to: localhost
- block:
- name: detect collections/requirements.yml
stat: path={{project_path|quote}}/collections/requirements.yml
register: doesCollectionRequirementsExist
- name: fetch galaxy collections from collections/requirements.yml
command: ansible-galaxy collection install -r requirements.yml -p {{collections_destination|quote}}
args:
chdir: "{{project_path|quote}}/collections"
register: galaxy_collection_result
when: doesCollectionRequirementsExist.stat.exists
changed_when: "'Installing ' in galaxy_collection_result.stdout"
when: collections_enabled|bool
delegate_to: localhost


@@ -51,9 +51,11 @@ def main():
try:
re_match = re.match(r'\/tmp\/awx_\d+_.+', path)
if re_match is not None:
- if subprocess.check_call(['ansible-runner', 'is-alive', path]) == 0:
-     continue
- else:
+ try:
+     if subprocess.check_call(['ansible-runner', 'is-alive', path]) == 0:
+         continue
+ except subprocess.CalledProcessError:
# the job isn't running anymore, clean up this path
module.debug('Deleting path {} because its job has completed.'.format(path))
except (ValueError, IndexError):
continue
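The bug this hunk fixes: `subprocess.check_call` never returns a nonzero status. It returns 0 on success and raises `CalledProcessError` otherwise, so the old `else:` cleanup branch was unreachable. A standalone demonstration (assumes a POSIX system with the `true` and `false` utilities):

```python
import subprocess

# On success, check_call returns 0:
assert subprocess.check_call(['true']) == 0

# On a nonzero exit status it raises rather than returning the status,
# so `if check_call(...) == 0: ... else: ...` can never take the else path.
try:
    subprocess.check_call(['false'])
except subprocess.CalledProcessError as exc:
    print('child exited with status', exc.returncode)
```

Wrapping the call in `try/except CalledProcessError`, as the new code does, is the idiomatic way to act on a dead process.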


@@ -604,6 +604,11 @@ ALLOW_JINJA_IN_EXTRA_VARS = 'template'
 # Note: This setting may be overridden by database settings.
 AWX_ROLES_ENABLED = True
+# Enable dynamically pulling collections from a requirement.yml file
+# when updating SCM projects
+# Note: This setting may be overridden by database settings.
+AWX_COLLECTIONS_ENABLED = True
 # Enable bubblewrap support for running jobs (playbook runs only).
 # Note: This setting may be overridden by database settings.
 AWX_PROOT_ENABLED = True
@@ -619,9 +624,6 @@ AWX_PROOT_HIDE_PATHS = []
 # Note: This setting may be overridden by database settings.
 AWX_PROOT_SHOW_PATHS = []
-# Number of jobs to show as part of the job template history
-AWX_JOB_TEMPLATE_HISTORY = 10
 # The directory in which Tower will create new temporary directories for job
 # execution and isolation (such as credential files and custom
 # inventory scripts).
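Both feature flags carry the comment "may be overridden by database settings". A hedged sketch of that precedence, assuming a file-level default that a database-backed value supersedes when present; the names and lookup function are illustrative, not AWX's actual settings machinery:

```python
# File-level defaults, as in the settings module above:
FILE_DEFAULTS = {
    'AWX_ROLES_ENABLED': True,
    'AWX_COLLECTIONS_ENABLED': True,
}


def get_setting(name, db_overrides):
    """Database value wins when one is set; otherwise use the file default."""
    if name in db_overrides:
        return db_overrides[name]
    return FILE_DEFAULTS[name]


# With no DB override, the file default applies:
assert get_setting('AWX_COLLECTIONS_ENABLED', {}) is True
# A DB-backed setting takes precedence over the file:
assert get_setting('AWX_COLLECTIONS_ENABLED', {'AWX_COLLECTIONS_ENABLED': False}) is False
```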


@@ -171,7 +171,7 @@ CELERYBEAT_SCHEDULE.update({ # noqa
 CLUSTER_HOST_ID = socket.gethostname()
-if 'Docker for Mac' in os.getenv('OS', ''):
+if 'Docker Desktop' in os.getenv('OS', ''):
     os.environ['SDB_NOTIFY_HOST'] = 'docker.for.mac.host.internal'
 else:
     os.environ['SDB_NOTIFY_HOST'] = os.popen('ip route').read().split(' ')[2]


@@ -369,6 +369,10 @@ def on_populate_user(sender, **kwargs):
     remove_admins = bool(org_opts.get('remove_admins', remove))
     _update_m2m_from_groups(user, ldap_user, org.admin_role.members, admins_opts,
                             remove_admins)
+    auditors_opts = org_opts.get('auditors', None)
+    remove_auditors = bool(org_opts.get('remove_auditors', remove))
+    _update_m2m_from_groups(user, ldap_user, org.auditor_role.members, auditors_opts,
+                            remove_auditors)
     users_opts = org_opts.get('users', None)
     remove_users = bool(org_opts.get('remove_users', remove))
     _update_m2m_from_groups(user, ldap_user, org.member_role.members, users_opts,
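The hunk above extends the existing admin/member group sync to the auditor role: each role takes an opts value (a group match spec or boolean) and a `remove_*` flag controlling whether users who no longer match are dropped. A simplified, set-based sketch of that remove semantics; this is illustrative only, not the actual `_update_m2m_from_groups` implementation:

```python
def sync_role_members(current, should_have, user, remove):
    """Return the updated membership set for one role and one user.

    current: users currently holding the role
    should_have: whether this user's LDAP groups grant the role
    remove: drop the user when they no longer match
    """
    members = set(current)
    if should_have:
        members.add(user)
    elif remove:
        members.discard(user)
    return members


# A matching user is added to the role:
assert sync_role_members({'alice'}, True, 'bob', remove=True) == {'alice', 'bob'}
# A user who stops matching is removed only when the remove flag is set:
assert sync_role_members({'alice', 'bob'}, False, 'bob', remove=True) == {'alice'}
assert sync_role_members({'alice', 'bob'}, False, 'bob', remove=False) == {'alice', 'bob'}
```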


@@ -53,6 +53,7 @@ SOCIAL_AUTH_ORGANIZATION_MAP_PLACEHOLDER = collections.OrderedDict([
     ])),
     ('Test Org', collections.OrderedDict([
         ('admins', ['admin@example.com']),
+        ('auditors', ['auditor@example.com']),
         ('users', True),
     ])),
     ('Test Org 2', collections.OrderedDict([
@@ -379,6 +380,7 @@ def _register_ldap(append=None):
     placeholder=collections.OrderedDict([
         ('Test Org', collections.OrderedDict([
             ('admins', 'CN=Domain Admins,CN=Users,DC=example,DC=com'),
+            ('auditors', 'CN=Domain Auditors,CN=Users,DC=example,DC=com'),
             ('users', ['CN=Domain Users,CN=Users,DC=example,DC=com']),
             ('remove_users', True),
             ('remove_admins', True),
@@ -1170,8 +1172,10 @@ register(
     placeholder=collections.OrderedDict([
         ('saml_attr', 'organization'),
         ('saml_admin_attr', 'organization_admin'),
+        ('saml_auditor_attr', 'organization_auditor'),
         ('remove', True),
         ('remove_admins', True),
+        ('remove_auditors', True),
     ]),
 )

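Taken together, the SAML organization map now recognizes an auditor attribute alongside the admin one. A plain dict mirroring the registered placeholder above; the setting name and the attribute values are examples (the hunk does not show the setting's name, so `SOCIAL_AUTH_SAML_ORGANIZATION_ATTR` is an assumption):

```python
# Hypothetical config fragment -- keys come from the placeholder in the diff:
SOCIAL_AUTH_SAML_ORGANIZATION_ATTR = {
    'saml_attr': 'organization',                  # attribute naming member orgs
    'saml_admin_attr': 'organization_admin',      # attribute naming admin orgs
    'saml_auditor_attr': 'organization_auditor',  # new in this change
    'remove': True,
    'remove_admins': True,
    'remove_auditors': True,                      # new in this change
}
```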

@@ -532,8 +532,10 @@ class LDAPSingleOrganizationMapField(HybridDictField):
     admins = LDAPDNMapField(allow_null=True, required=False)
     users = LDAPDNMapField(allow_null=True, required=False)
+    auditors = LDAPDNMapField(allow_null=True, required=False)
     remove_admins = fields.BooleanField(required=False)
     remove_users = fields.BooleanField(required=False)
+    remove_auditors = fields.BooleanField(required=False)
     child = _Forbidden()
@@ -729,6 +731,8 @@ class SAMLOrgAttrField(HybridDictField):
     saml_attr = fields.CharField(required=False, allow_null=True)
     remove_admins = fields.BooleanField(required=False)
     saml_admin_attr = fields.CharField(required=False, allow_null=True)
+    remove_auditors = fields.BooleanField(required=False)
+    saml_auditor_attr = fields.CharField(required=False, allow_null=True)
     child = _Forbidden()
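These serializer fields whitelist the new `auditors`/`remove_auditors` keys; any key not declared falls through to the `_Forbidden` child and is rejected. A plain-Python sketch of that shape check, illustrative only (the real validation is done by the DRF-style field classes above):

```python
# Keys the LDAP single-organization map field now accepts, per the diff:
ALLOWED_LDAP_ORG_KEYS = {
    'admins', 'users', 'auditors',
    'remove_admins', 'remove_users', 'remove_auditors',
}


def validate_ldap_org_map(mapping):
    """Reject any key the whitelist does not allow, like the _Forbidden child."""
    unknown = set(mapping) - ALLOWED_LDAP_ORG_KEYS
    if unknown:
        raise ValueError('forbidden keys: {}'.format(sorted(unknown)))
    return mapping


# The new auditor keys pass:
validate_ldap_org_map({'auditors': ['CN=Domain Auditors,DC=example,DC=com'],
                       'remove_auditors': True})
# A misspelled key is rejected:
try:
    validate_ldap_org_map({'auditorz': True})
except ValueError as exc:
    print(exc)
```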

Some files were not shown because too many files have changed in this diff.